Saturday, October 30, 2010

LibQUAL


What is LibQUAL+®?
LibQUAL+® is a suite of services that libraries use to solicit, track, understand, and act upon users' opinions of service quality. These services are offered to the library community by the Association of Research Libraries (ARL). The program's centerpiece is a rigorously tested Web-based survey bundled with training that helps libraries assess and improve library services, change organizational culture, and market the library. The goals of LibQUAL+® are to:
  • Foster a culture of excellence in providing library service
  • Help libraries better understand user perceptions of library service quality
  • Collect and interpret library user feedback systematically over time
  • Provide libraries with comparable assessment information from peer institutions
  • Identify best practices in library service
  • Enhance library staff members' analytical skills for interpreting and acting on data
More than 1,000 libraries have participated in LibQUAL+®, including colleges and university libraries, community college libraries, health sciences libraries, academic law libraries, and public libraries -- some through various consortia, others as independent participants. LibQUAL+® has expanded internationally, with participating institutions in Africa, Asia, Australia, and Europe. The growing LibQUAL+® community of participants and its extensive dataset are rich resources for improving library services.

How will LibQUAL+® Benefit your Library?
Library administrators have successfully used LibQUAL+® survey data to identify best practices, analyze deficits, and effectively allocate resources. Benefits to participating institutions include:
  • Institutional data and reports that enable you to assess whether your library services are meeting user expectations
  • Aggregate data and reports that allow you to compare your library's performance with that of peer institutions
  • Workshops designed specifically for LibQUAL+® participants
  • Access to an online library of LibQUAL+® research articles
  • The opportunity to become part of a community interested in developing excellence in library services

How Does LibQUAL+® Benefit Your Library Users?
LibQUAL+® gives your library users a chance to tell you where your services need improvement so you can respond to and better manage their expectations. You can develop services that better meet your users' expectations by comparing your library's data with that of peer institutions and examining the practices of those libraries that are evaluated highly by their users.

How Is the LibQUAL+® Survey Conducted?
Conducting the LibQUAL+® survey requires little technical expertise on your part. Use our online Management Center to set up and track the progress of your survey. You invite your users to take the survey, distributing the URL for your library's Web form via e-mail or posting a link on the library's Web site. Respondents complete the survey form and their answers are sent to the LibQUAL+® database. The data are analyzed and presented to you in reports describing your users' desired, perceived, and minimum expectations of service.

What Are the Origins of the LibQUAL+® Survey?
The LibQUAL+® survey evolved from a conceptual model based on the SERVQUAL instrument, a popular tool for assessing service quality in the private sector. The Texas A&M University Libraries and other libraries used modified SERVQUAL instruments for several years; those applications revealed the need for a newly adapted tool that would serve the particular requirements of libraries. ARL, representing the largest research libraries in North America, partnered with Texas A&M University Libraries to develop, test, and refine LibQUAL+®. This effort was supported in part by a three-year grant from the U.S. Department of Education's Fund for the Improvement of Post-Secondary Education (FIPSE).


Dalhousie Libraries Survey
January 28 - February 18, 2005




What Is the LibQUAL Survey?
LibQUAL is the first total market survey developed specifically for academic research libraries. From January 28th until February 18th, the Dalhousie Libraries will be running the LibQUAL survey campus-wide to determine how Dal faculty, students, and staff feel about the quality of the resources and services we offer. We need your feedback, and we welcome your comments and suggestions!


How Does the Survey Work?
The LibQUAL survey will be available online to the entire Dalhousie community. It features questions which cover three vital facets of academic libraries: "Affect of Service," "Library As Place," and "Information Control." The survey should take approximately 10 minutes to complete.

LibQUAL+ Survey 2010

In 2010, the University of Toronto Libraries will again participate in the LibQUAL+ survey, a national initiative designed to measure library service quality and identify best practices on an ongoing basis, led by the Canadian Association of Research Libraries.
The survey is important because it will:
  • Help us better understand how our students rate library services
  • Allow us to benchmark our results against other libraries to determine best practices
  • Let us know where we can concentrate space and service improvements for University of Toronto Libraries users
This year’s survey marks the second time the University of Toronto Libraries (UTL) will be participating. In March of 2007, UTL implemented the LibQUAL+ survey as part of a consortium of 62 Canadian institutions and 217 institutions worldwide, including college and university libraries, health sciences libraries, community college libraries and law libraries. Survey respondents will be asked about their perceptions and expectations of library service quality on three dimensions:
  • Affect of Service: Customer services provided by library staff
  • Information Control: Library resources, collections and access to resources
  • Library as Place: Library spaces, facilities and amenities (for study, meeting, etc.)
We are pleased to report that the feedback we received from our community of students, faculty and staff through the 2007 survey helped us to identify and act on a number of priority areas for improvement. Here is a sample of what you have helped us to accomplish as a result of the 2007 survey:

AFFECT OF SERVICE:

INFORMATION CONTROL:

LIBRARY AS PLACE:
* To view the U of T Mississauga Library's LibQUAL+ Survey Action Plan Update Report, click here.
Input from our community does make a difference and we are hoping for a good response rate to the survey again this year to help identify future service priorities. For questions about this year’s LibQUAL+ survey, please view our FAQ or contact Lari Langford at lari.langford@utoronto.ca with any additional questions not addressed in the FAQ.

Wednesday, October 13, 2010

Banned Books: Libraries’ rebellious side


For the second year in a row, University Libraries teamed up with the Progressive Librarians Guild to bring exhibits and events together for Banned Books Week.
Banned Books Week was created by the American Library Association and "highlights the benefits of free and open access to information while drawing attention to the harms of censorship by spotlighting actual or attempted bannings of books across the United States," according to the American Library Association's website.
"It's really a celebration of our freedom to read," said Rebecca Blakiston, UA instructional services librarian.
This year there will be two exhibits, one in the Main Library and one in the Science-Engineering Library.
"It's showing other people that these are examples of books that have been banned and it's mainly getting the word out that we still ban books for simple issues," said Andrea Goodrich, president of the UA's chapter of the Progressive Librarians Guild. "It's part of what we stand for as progressive librarians."
Goodrich says that as progressive librarians they tackle issues other librarians don't and look at the community at large.
The Main Library exhibit has more than 50 books on display, with a special focus on Harper Lee's "To Kill a Mockingbird."
"That's one of the most challenged books over time," Blakiston said.
They chose "To Kill a Mockingbird" not only because it is one of the most challenged but also because this year marks the 50th anniversary of its publication.
At the exhibit a list of the most frequently challenged books in the last year is available.
Also on display is an exhibit created by the Tucson chapter of REFORMA, the national association to promote library and information services to Latinos and the Spanish-speaking.
The display examines H.B. 2281, the ethnic studies bill.
"TUSD is a few blocks away from campus so it's part of the community, and there is also a really big tie between the ethnic studies program at UA and the Mexican studies program at TUSD," said Jeffery Cruz, REMORA president, Progressive Librarians Guild member and information and library science graduate student.
Cruz said the exhibit examines the text of the bill and a letter Tom Horne wrote in 2007 that specifically names three textbooks.

"We're not necessarily trying to take a stand on the issue. My personal hope for the UA, even though I have a stance and REFORMA has a stance, is just to provide information and for people to be able to access that information and form their own opinions," Cruz said.
The science library exhibit is new this year and explores scientists who have been banned, such as Galileo Galilei.
Like many books throughout the ages, the works of scientists that have stirred up controversy have also been banned.
In addition to the exhibits, there will be two events for Banned Books Week.
The first event, today, is a free screening of "To Kill A Mockingbird" in the Gallagher Theater; the second is a documentary screening and read-out/speak-out.
The read-out will involve a panel discussion about access to a diversity of ideas and the freedom to read. In addition to the panel, people will have the opportunity to read aloud from banned books.
Attendees can read from the selection of banned books available or bring their own banned book.
"There's a lot of censorship that's still going on," Blakiston said. "It is an issue that is worth bringing up to awareness and let them have an open discussion about censorship."If you go:
Banned Books Week: Free Movie Screening of "To Kill a Mockingbird"

When: Tuesday, Sept. 28, at 6:30 p.m.
Where: Gallagher Theater in the Student Union Memorial Center

If you go:
Banned Books Read Out/Speak Out: Modern forms of censorship read-out and panel discussion

When: Thursday, Sept. 30, from 6:00 p.m. – 7:30 p.m.
Where: UA Main Library
Panelists:
Richard DiRusso, Head of Collection Development, Pima County Public Library,
Agnes Griffen, former Director, Tucson-Pima Public Library (now Pima County Public Library)
Kay Mathiesen, Assistant Professor, School of Information Resources and Library Science
David Robinson, Associate Professor, Department of English

Sunday, October 3, 2010

WEB 2.0


The term Web 2.0 is commonly associated with web applications that facilitate interactive information sharing, interoperability, user-centered design,[1] and collaboration on the World Wide Web. A Web 2.0 site gives its users the free choice to interact or collaborate with each other in a social media dialogue as creators (prosumer) of user-generated content in a virtual community, in contrast to websites where users (consumer) are limited to the passive viewing of content that was created for them. Examples of Web 2.0 include social-networking sites, blogs, wikis, video-sharing sites, hosted services, web applications, mashups and folksonomies.
The term is closely associated with Tim O'Reilly because of the O'Reilly Media Web 2.0 conference in 2004.[2][3] Although the term suggests a new version of the World Wide Web, it does not refer to an update to any technical specifications, but rather to cumulative changes in the ways software developers and end-users use the Web. Whether Web 2.0 is qualitatively different from prior web technologies has been challenged by World Wide Web inventor Tim Berners-Lee, who called the term a "piece of jargon"[4], precisely because he intended the Web in his vision as "a collaborative medium, a place where we [could] all meet and read and write". He called it the 'Read/Write Web'.[5]


History

The term "Web 2.0" was coined in 1999 by Darcy DiNucci, a consultant on electronic information design (information architecture). In her article, "Fragmented Future", DiNucci writes:[6][7]
The Web we know now, which loads into a browser window in essentially static screenfulls, is only an embryo of the Web to come. The first glimmerings of Web 2.0 are beginning to appear, and we are just starting to see how that embryo might develop. The Web will be understood not as screenfulls of text and graphics but as a transport mechanism, the ether through which interactivity happens. It will [...] appear on your computer screen, [...] on your TV set [...] your car dashboard [...] your cell phone [...] hand-held game machines [...] maybe even your microwave oven.
Her use of the term deals mainly with Web design, aesthetics, and the interconnection of everyday objects with the Internet; she argues that the Web is "fragmenting" due to the widespread use of portable Web-ready devices. Her article is aimed at designers, reminding them to code for an ever-increasing variety of hardware. As such, her use of the term hints at, but does not directly relate to, the current uses of the term.
The term Web 2.0 did not resurface until 2003.[8][9][10][11] These authors focus on the concepts currently associated with the term where, as Scott Dietzen puts it, "the Web becomes a universal, standards-based integration platform".[10] John Robb wrote: "What is Web 2.0? It is a system that breaks with the old model of centralized Web sites and moves the power of the Web/Internet to the desktop."[11]
In 2004, the term began its rise in popularity when O'Reilly Media and MediaLive hosted the first Web 2.0 conference. In their opening remarks, John Battelle and Tim O'Reilly outlined their definition of the "Web as Platform", where software applications are built upon the Web as opposed to upon the desktop. The unique aspect of this migration, they argued, is that "customers are building your business for you".[12] They argued that the activities of users generating content (in the form of ideas, text, videos, or pictures) could be "harnessed" to create value. O'Reilly and Battelle contrasted Web 2.0 with what they called "Web 1.0". They associated Web 1.0 with the business models of Netscape and the Encyclopædia Britannica Online. For example,
Netscape framed "the web as platform" in terms of the old software paradigm: their flagship product was the web browser, a desktop application, and their strategy was to use their dominance in the browser market to establish a market for high-priced server products. Control over standards for displaying content and applications in the browser would, in theory, give Netscape the kind of market power enjoyed by Microsoft in the PC market. Much like the "horseless carriage" framed the automobile as an extension of the familiar, Netscape promoted a "webtop" to replace the desktop, and planned to populate that webtop with information updates and applets pushed to the webtop by information providers who would purchase Netscape servers.[13]
In short, Netscape focused on creating software, updating it on occasion, and distributing it to the end users. O'Reilly contrasted this with Google, a company which did not at the time focus on producing software, such as a browser, but instead focused on providing a service based on data such as the links Web page authors make between sites. Google exploits this user-generated content to offer Web search based on reputation through its "page rank" algorithm. Unlike software, which undergoes scheduled releases, such services are constantly updated, a process called "the perpetual beta". A similar difference can be seen between the Encyclopædia Britannica Online and Wikipedia: while the Britannica relies upon experts to create articles and releases them periodically in publications, Wikipedia relies on trust in anonymous users to constantly and quickly build content. Wikipedia is not based on expertise but rather an adaptation of the open source software adage "given enough eyeballs, all bugs are shallow", and it produces and updates articles constantly. O'Reilly's Web 2.0 conferences have been held every year since 2004, attracting entrepreneurs, large companies, and technology reporters.
In terms of the lay public, the term Web 2.0 was largely championed by bloggers and by technology journalists, culminating in the 2006 TIME magazine Person of The Year (You).[14] That is, TIME selected the masses of users who were participating in content creation on social networks, blogs, wikis, and media sharing sites. The cover story author Lev Grossman explains:
It's a story about community and collaboration on a scale never seen before. It's about the cosmic compendium of knowledge Wikipedia and the million-channel people's network YouTube and the online metropolis MySpace. It's about the many wresting power from the few and helping one another for nothing and how that will not only change the world, but also change the way the world changes.
Since that time, Web 2.0 has found a place in the lexicon; in 2009 Global Language Monitor declared it to be the one-millionth English word.[15]

Characteristics


Flickr, a Web 2.0 web site that allows its users to upload and share photos
Web 2.0 websites allow users to do more than just retrieve information. By increasing what was already possible in "Web 1.0", they provide the user with more user-interface, software and storage facilities, all through their browser. This has been called "Network as platform" computing.[3] Users can provide the data that is on a Web 2.0 site and exercise some control over that data.[3][16] These sites may have an "Architecture of participation" that encourages users to add value to the application as they use it.[2][3]
The concept of Web-as-participation-platform captures many of these characteristics. Bart Decrem, a founder and former CEO of Flock, calls Web 2.0 the "participatory Web"[17] and regards the Web-as-information-source as Web 1.0.
Because a site cannot exclude members who do not contribute from sharing in the benefits produced by those who do, rational members may prefer to withhold their own effort and free-ride on the contributions of others.[18] Relying on users to contribute anyway requires what is sometimes called Radical Trust on the part of the website's management. According to Best,[19] the characteristics of Web 2.0 are: rich user experience, user participation, dynamic content, metadata, web standards and scalability. Further characteristics, such as openness, freedom[20] and collective intelligence[21] by way of user participation, can also be viewed as essential attributes of Web 2.0.
  
Technologies
The client-side/web browser technologies typically used in Web 2.0 development are Asynchronous JavaScript and XML (Ajax), Adobe Flash and the Adobe Flex framework, and JavaScript/Ajax frameworks such as Yahoo! UI Library, Dojo Toolkit, MooTools, and jQuery. Ajax programming uses JavaScript to upload and download new data from the web server without undergoing a full page reload.
To permit the user to continue interacting with the page, communications such as data requests to the server are separated from the data returning to the page (asynchronously). Otherwise, the user would have to wait for the data to come back before doing anything else on that page, just as a user has to wait for a full page reload to complete. This also improves the overall responsiveness of the site, since requests can be sent without blocking on, or queueing behind, the data being returned to the client.
The data fetched by an Ajax request is typically formatted in XML or JSON (JavaScript Object Notation) format, two widely used structured data formats. Since both of these formats are natively understood by JavaScript, a programmer can easily use them to transmit structured data in their web application. When this data is received via Ajax, the JavaScript program then uses the Document Object Model (DOM) to dynamically update the web page based on the new data, allowing for a rapid and interactive user experience. In short, using these techniques, Web designers can make their pages function like desktop applications. For example, Google Docs uses this technique to create a Web-based word processor.
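To make this concrete, here is a minimal sketch of the pattern in TypeScript. It is an illustration only: the /api/updates endpoint, its response shape, and the #updates element are hypothetical, and it uses the modern fetch API rather than the XMLHttpRequest object that early Ajax code relied on.

```typescript
// Minimal Ajax-style sketch: fetch JSON asynchronously and update the DOM.
// The endpoint "/api/updates" and the "#updates" element are hypothetical.
interface UpdateItem {
  title: string;
  url: string;
}

async function loadUpdates(): Promise<void> {
  // The request runs asynchronously, so the user can keep interacting
  // with the page while the data is in flight.
  const response = await fetch("/api/updates");
  const items: UpdateItem[] = await response.json();

  const list = document.querySelector("#updates");
  if (!list) return;

  // Use the DOM to insert the new data without reloading the page.
  for (const item of items) {
    const li = document.createElement("li");
    const link = document.createElement("a");
    link.href = item.url;
    link.textContent = item.title;
    li.appendChild(link);
    list.appendChild(li);
  }
}

loadUpdates().catch(console.error);
```

Because the request is asynchronous, the rest of the page stays responsive while the data travels to and from the server, which is exactly the effect described above.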
Adobe Flex is another technology often used in Web 2.0 applications. Compared to JavaScript libraries like jQuery, Flex makes it easier for programmers to populate large data grids, charts, and other heavy user interactions.[22] Applications programmed in Flex are compiled and displayed as Flash within the browser. As a widely available plugin independent of W3C (World Wide Web Consortium, the governing body of web standards and protocols) standards, Flash is capable of doing many things which are not currently possible in HTML, the language used to construct web pages. Of Flash's many capabilities, the most commonly used in Web 2.0 is its ability to play audio and video files. This has allowed for the creation of Web 2.0 sites where video media is seamlessly integrated with standard HTML.
In addition to Flash and Ajax, JavaScript/Ajax frameworks have recently become a very popular means of creating Web 2.0 sites. At their core, these frameworks do not use technology any different from JavaScript, Ajax, and the DOM. What frameworks do is smooth over inconsistencies between web browsers and extend the functionality available to developers. Many of them also come with customizable, prefabricated 'widgets' that accomplish such common tasks as picking a date from a calendar, displaying a data chart, or making a tabbed panel.
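As a rough sketch of what such a framework buys the developer, the snippet below uses jQuery and the jQuery UI date-picker widget. It is a hedged example: the element IDs and the /api/updates endpoint are hypothetical, and jQuery and jQuery UI are assumed to be loaded on the page.

```typescript
// Sketch assuming jQuery and jQuery UI are already loaded on the page;
// "declare" lets the snippet type-check without installing @types/jquery.
declare const $: any;

$(function () {
  // Cross-browser event handling through one API.
  $("#refresh").on("click", () => {
    // $.getJSON wraps the same Ajax machinery described above.
    $.getJSON("/api/updates", (items: Array<{ title: string }>) => {
      $("#updates").empty();
      items.forEach((item) => $("<li>").text(item.title).appendTo("#updates"));
    });
  });

  // A prefabricated widget: jQuery UI turns a plain text input into a calendar.
  $("#date").datepicker();
});
```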
On the server side, Web 2.0 uses many of the same technologies as Web 1.0. Languages such as PHP, Ruby, ColdFusion, Perl, Python, JSP and ASP are used by developers to dynamically output data using information from files and databases. What has begun to change in Web 2.0 is the way this data is formatted. In the early days of the Internet, there was little need for different websites to communicate with each other and share data. In the new "participatory web", however, sharing data between sites has become an essential capability. To share its data with other sites, a web site must be able to generate output in machine-readable formats such as XML, RSS, and JSON. When a site's data is available in one of these formats, another website can use it to integrate a portion of that site's functionality into itself, linking the two together. When this design pattern is implemented, it ultimately leads to data that is both easier to find and more thoroughly categorized, a hallmark of the philosophy behind the Web 2.0 movement.
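Here is a minimal server-side sketch of that idea, written in TypeScript on Node.js rather than one of the languages listed above; the /api/records route and the sample records are invented for illustration.

```typescript
// Minimal Node.js sketch: expose site data in a machine-readable format (JSON)
// so that other sites can reuse it. The route and records are hypothetical.
import { createServer } from "http";

const records = [
  { title: "To Kill a Mockingbird", author: "Harper Lee", year: 1960 },
  { title: "Fahrenheit 451", author: "Ray Bradbury", year: 1953 },
];

const server = createServer((req, res) => {
  if (req.url === "/api/records") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(records));
  } else {
    res.writeHead(404, { "Content-Type": "text/plain" });
    res.end("Not found");
  }
});

server.listen(8080);
```

Another site could now fetch /api/records and fold those records into its own pages, which is the kind of site-to-site data sharing the paragraph above describes.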


Concepts

Web 2.0 draws together the capabilities of client- and server-side software, content syndication and the use of network protocols. Standards-oriented web browsers may use plug-ins and software extensions to handle the content and the user interactions. Web 2.0 sites provide users with information storage, creation, and dissemination capabilities that were not possible in the environment now known as "Web 1.0".
Web 2.0 websites typically include some of the following features and techniques. Andrew McAfee used the acronym SLATES to refer to them:[23]
Search
Finding information through keyword search.
Links
Connects information together into a meaningful information ecosystem using the model of the Web, and provides low-barrier social tools.
Authoring
The ability to create and update content leads to the collaborative work of many rather than just a few web authors. In wikis, users may extend, undo and redo each other's work. In blogs, posts and the comments of individuals build up over time.
Tags
Categorization of content by users adding "tags" (short, usually one-word descriptions) to facilitate searching, without dependence on pre-made categories. Collections of tags created by many users within a single system may be referred to as "folksonomies" (i.e., folk taxonomies); a small data-structure sketch of this idea follows the list.
Extensions
Software that makes the Web an application platform as well as a document server. These include software like Adobe Reader, Adobe Flash player, Microsoft Silverlight, ActiveX, Oracle Java, Quicktime, Windows Media, etc.
Signals
The use of syndication technology such as RSS to notify users of content changes.
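The tagging model described under "Tags" above reduces to a very simple data structure. The following sketch (item names, users, and tags are invented) builds a folksonomy index from user-supplied tags, so that items can be looked up by tag and tag popularity can be counted, for example to render a tag cloud.

```typescript
// Folksonomy sketch: users attach free-form tags to items, and the system
// builds an index from tag to items plus per-tag counts (e.g. for a tag cloud).
// Item names, users, and tags below are invented for illustration.
type Tagging = { item: string; user: string; tag: string };

const taggings: Tagging[] = [
  { item: "photo-123", user: "alice", tag: "sunset" },
  { item: "photo-123", user: "bob", tag: "beach" },
  { item: "photo-456", user: "carol", tag: "sunset" },
];

// Tag -> set of items carrying that tag.
const index = new Map<string, Set<string>>();
// Tag -> how many times users applied it (a rough popularity measure).
const counts = new Map<string, number>();

for (const { item, tag } of taggings) {
  if (!index.has(tag)) index.set(tag, new Set());
  index.get(tag)!.add(item);
  counts.set(tag, (counts.get(tag) ?? 0) + 1);
}

console.log(Array.from(index.get("sunset") ?? [])); // ["photo-123", "photo-456"]
console.log(counts.get("sunset"));                  // 2
```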
While SLATES forms the basic framework of Enterprise 2.0, it does not contradict all of the higher level Web 2.0 design patterns and business models. In this way, a new Web 2.0 report from O'Reilly is quite effective and diligent in interweaving the story of Web 2.0 with the specific aspects of Enterprise 2.0. It includes discussions of self-service IT, the long tail of enterprise IT demand, and many other consequences of the Web 2.0 era in the enterprise. The report also makes many sensible recommendations around starting small with pilot projects and measuring results, among a fairly long list.[24]


Usage

The popularity of the term Web 2.0, along with the increasing use of blogs, wikis, and social networking technologies, has led many in academia and business to coin a flurry of 2.0s,[25] including Library 2.0,[26] Social Work 2.0,[27] Enterprise 2.0, PR 2.0,[28] Classroom 2.0[29], Publishing 2.0 [30], Medicine 2.0[31], Telco 2.0, Travel 2.0, Government 2.0,[32] and even Porn 2.0.[33] Many of these 2.0s refer to Web 2.0 technologies as the source of the new version in their respective disciplines and areas. For example, in the Talis white paper "Library 2.0: The Challenge of Disruptive Innovation", Paul Miller argues
Blogs, wikis and RSS are often held up as exemplary manifestations of Web 2.0. A reader of a blog or a wiki is provided with tools to add a comment or even, in the case of the wiki, to edit the content. This is what we call the Read/Write web. Talis believes that Library 2.0 means harnessing this type of participation so that libraries can benefit from increasingly rich collaborative cataloguing efforts, such as including contributions from partner libraries as well as adding rich enhancements, such as book jackets or movie files, to records from publishers and others.[34]
Here, Miller links Web 2.0 technologies and the culture of participation that they engender to the field of library science, supporting his claim that there is now a "Library 2.0". Many of the other proponents of new 2.0s mentioned here use similar methods.

Web-based applications and desktops

Ajax has prompted the development of websites that mimic desktop applications, such as word processors, spreadsheets, and slide-show presentations. In 2006 Google, Inc. acquired one of the best-known sites of this broad class, Writely.[35] WYSIWYG wiki and blogging sites replicate many features of PC authoring applications.
Several browser-based "operating systems" have emerged, including EyeOS[36] and YouOS.[37] Despite the name, many of these services function less like a traditional operating system and more like an application platform. They mimic the user experience of desktop operating systems, offering features and applications similar to a PC environment, and are able to run within any modern browser. However, these operating systems do not directly control the hardware on the client's computer.
Numerous web-based application services appeared during the dot-com bubble of 1997–2001 and then vanished, having failed to gain a critical mass of customers. In 2005, WebEx acquired one of the better-known of these, Intranets.com, for $45 million.[38]

Internet applications

Rich Internet Applications (RIAs) are web 2.0 applications that have many of the characteristics of desktop applications and are typically delivered via a browser.

Hyper distribution of hypermedia

XML and RSS

Many regard syndication of site content as a Web 2.0 feature. Syndication uses standardized protocols to permit end-users to make use of a site's data in another context (such as another website, a browser plugin, or a separate desktop application). Protocols which permit syndication include RSS (really simple syndication, also known as web syndication), RDF (as in RSS 1.1), and Atom, all of them XML-based formats. Observers have started to refer to these technologies as web feeds.
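As an illustration of what a web feed contains, the sketch below assembles a minimal RSS 2.0 document. The channel and item data are invented, and a production feed would also escape special characters and include dates, GUIDs, and other metadata.

```typescript
// Sketch: build a minimal RSS 2.0 feed so that feed readers and other sites
// can reuse this site's content. The channel and items below are invented.
type FeedItem = { title: string; link: string; description: string };

function buildRss(channelTitle: string, channelLink: string, items: FeedItem[]): string {
  const entries = items
    .map(
      (i) => `    <item>
      <title>${i.title}</title>
      <link>${i.link}</link>
      <description>${i.description}</description>
    </item>`
    )
    .join("\n");

  return `<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>${channelTitle}</title>
    <link>${channelLink}</link>
${entries}
  </channel>
</rss>`;
}

console.log(
  buildRss("Library News", "https://example.org", [
    {
      title: "New exhibit opens",
      link: "https://example.org/exhibit",
      description: "Banned Books Week display in the Main Library.",
    },
  ])
);
```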
Specialized protocols such as FOAF and XFN (both for social networking) extend the functionality of sites or permit end-users to interact without centralized websites.

Web APIs

Web 2.0 often uses machine-based interactions such as REST and SOAP. Servers often expose proprietary APIs (Application Programming Interfaces), but standard APIs (for example, for posting to a blog or notifying a blog update) have also come into use. Most communications through APIs involve XML or JSON payloads.
REST APIs, through their use of self-descriptive messages and hypermedia as the engine of application state, should be self-describing once an entry URI is known. Web Services Description Language (WSDL) is the standard way of publishing a SOAP API, and there are a range of web service specifications. EMML, or Enterprise Mashup Markup Language by the Open Mashup Alliance, is an XML markup language for creating enterprise mashups.
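A hedged sketch of the kind of API interaction described above: posting a new entry to a REST-style blog API with a JSON payload. The https://example.org/api/posts endpoint and its response shape are hypothetical.

```typescript
// Sketch of a REST-style API call with a JSON payload.
// The endpoint "https://example.org/api/posts" and its response are hypothetical.
interface NewPost {
  title: string;
  body: string;
}

async function createPost(post: NewPost): Promise<void> {
  const response = await fetch("https://example.org/api/posts", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(post),
  });

  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }

  // Many REST APIs echo back the created resource, often with its URI.
  const created = await response.json();
  console.log("Created:", created);
}

createPost({ title: "Web 2.0 and libraries", body: "Notes from the survey." }).catch(console.error);
```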

Criticism

Critics of the term claim that "Web 2.0" does not represent a new version of the World Wide Web at all, but merely continues to use so-called "Web 1.0" technologies and concepts. First, techniques such as AJAX do not replace underlying protocols like HTTP, but add an additional layer of abstraction on top of them. Second, many of the ideas of Web 2.0 had already been featured in implementations on networked systems well before the term "Web 2.0" emerged. Amazon.com, for instance, has allowed users to write reviews and consumer guides since its launch in 1995, in a form of self-publishing. Amazon also opened its API to outside developers in 2002.[39] Previous developments also came from research in computer-supported collaborative learning and computer-supported cooperative work and from established products like Lotus Notes and Lotus Domino, all phenomena which precede Web 2.0.
But perhaps the most common criticism is that the term is unclear or simply a buzzword. For example, in a podcast interview,[4] Tim Berners-Lee described the term "Web 2.0" as a "piece of jargon":
"Nobody really knows what it means...If Web 2.0 for you is blogs and wikis, then that is people to people. But that was what the Web was supposed to be all along."[4]
Other critics labeled Web 2.0 "a second bubble" (referring to the Dot-com bubble of circa 1995–2001), suggesting that too many Web 2.0 companies attempt to develop the same product with a lack of business models. For example, The Economist has dubbed the mid- to late-2000s focus on Web companies "Bubble 2.0".[40] Venture capitalist Josh Kopelman noted that Web 2.0 had excited only 53,651 people (the number of subscribers at that time to TechCrunch, a Weblog covering Web 2.0 startups and technology news), too few users to make them an economically viable target for consumer applications.[41] Although Bruce Sterling reports he's a fan of Web 2.0, he thinks it is now dead as a rallying concept.[42]
Critics have cited the language used to describe the hype cycle of Web 2.0[43] as an example of Techno-utopianist rhetoric.[44]
In terms of Web 2.0's social impact, critics such as Andrew Keen argue that Web 2.0 has created a cult of digital narcissism and amateurism, which undermines the notion of expertise by allowing anybody, anywhere to share and place undue value upon their own opinions about any subject and post any kind of content, regardless of their particular talents, knowledge, credentials, biases or possible hidden agendas. Keen's 2007 book, Cult of the Amateur, argues that the core assumption of Web 2.0, that all opinions and user-generated content are equally valuable and relevant, is misguided. Additionally, Sunday Times reviewer John Flintoff has characterized Web 2.0 as "creating an endless digital forest of mediocrity: uninformed political commentary, unseemly home videos, embarrassingly amateurish music, unreadable poems, essays and novels", and also asserted that Wikipedia is full of "mistakes, half truths and misunderstandings".[45]

Trademark

In November 2004, CMP Media applied to the USPTO for a service mark on the use of the term "WEB 2.0" for live events.[46] On the basis of this application, CMP Media sent a cease-and-desist demand to the Irish non-profit organization IT@Cork on May 24, 2006,[47] but retracted it two days later.[48] The "WEB 2.0" service mark registration passed final PTO Examining Attorney review on May 10, 2006, and was registered on June 27, 2006.[46] The European Union application (application number 004972212, which would confer unambiguous status in Ireland) was refused on May 23, 2007.

Web 3.0

Definitions of Web 3.0 vary greatly. Amit Agarwal states that Web 3.0 is, among other things, about the Semantic Web and personalization.[49] Focusing on the computer elements, Conrad Wolfram has argued that Web 3.0 is where "the computer is generating new information", rather than humans.[50]
Andrew Keen, author of The Cult of the Amateur, considers the Semantic Web an "unrealisable abstraction" and sees Web 3.0 as the return of experts and authorities to the Web. For example, he points to Bertelsmann's deal with the German Wikipedia to produce an edited print version of that encyclopedia. CNN Money's Jessi Hempel expects Web 3.0 to emerge from new and innovative Web 2.0 services with a profitable business model.[51] Still others, such as Manoj Sharma, an organization strategist, propose in the keynote "A Brave New World Of Web 3.0" that Web 3.0 will be a "Totally Integrated World" - a cradle-to-grave experience of being always plugged into the net.[52]
Futurist John Smart, lead author of the Metaverse Roadmap[53] echoes Sharma's perspective, defining Web 3.0 as the first-generation Metaverse (convergence of the virtual and physical world), a web development layer that includes TV-quality open video, 3D simulations, augmented reality, human-constructed semantic standards, and pervasive broadband, wireless, and sensors. Web 3.0's early geosocial (Foursquare, etc.) and augmented reality (Layar, etc.) webs are an extension of Web 2.0's participatory technologies and social networks (Facebook, etc.) into 3D space. Of all its metaverse-like developments, Smart suggests Web 3.0's most defining characteristic will be the mass diffusion of NTSC-or-better quality open video to TVs, laptops, tablets, and mobile devices, a time when "the internet swallows the television."[54] Smart considers Web 4.0 to be the Semantic Web and in particular, the rise of statistical, machine-constructed semantic tags and algorithms, driven by broad collective use of conversational interfaces, perhaps circa 2020.[55] David Siegel's perspective in Pull: The Power of the Semantic Web, 2009, is consonant with this, proposing that the growth of human-constructed semantic standards and data will be a slow, industry-specific incremental process for years to come, perhaps unlikely to tip into broad social utility until after 2020.