but

demolished the non-local nature of the early Internet. It has

also brought it into the remit of existing national laws.

Moreover, governments throughout the world have become more

assertive in exercising territorial jurisdiction over the

hitherto ostensibly extraterritorial Net. A French court has

prohibited Yahoo! from making certain content on its Web sites

available to French citizens. An American court advised Yahoo!

to ignore this decision. A Russian programmer was arrested by

the FBI for offering decryption software for sale in Russia

(where it is perfectly legal). Governments from China to Saudi

Arabia filter Web content regularly. Following the September

11 attacks, restrictive anti-terrorist legislation the world

over targeted cyberspace.

But the real territorialization of the Internet - the

redrawing of its internal contours and the withdrawal of its

libertarian foundations - is more pernicious, all-pervasive,

quotidian, and surreptitiously gradual. This is not the

outcome of legal revolutions and court-driven evolution. It is

piecemeal, quiet, unnoticed, often inadvertent and unintended.

It is an “afterthought” rather than a premeditated “plot”. It

happens e-tailer by e-tailer, one Web site after another,

like the spread of a virus.

Consider these two - by no means exhaustive - examples.

Amazon and Geocities (now Yahoo!Geocities) are two Internet

establishments, two gigantic communities of users that,

between them, represent a sizable chunk of all the activity on

the Internet.

It has long been impossible for a non-US publisher to sell its

wares (books, for instance) through Amazon or to Amazon

directly. Amazon works exclusively with US publishers and

distributors. To collaborate with Amazon - one of the members

of a duopoly as far as B2C e-commerce goes - a non-US

publisher (no matter how substantial) has to work with a US

distributor and thus forgo a large portion of its revenues

(payable to the distributor as commissions). Moreover, said

publisher cannot even open a ZShop (Amazon’s version of a

mom-and-pop store). One has to be a US resident to do so. Amazon

is closed to the outside world, despite its (false) global

image. It sells all over the world - but it only buys

American.

This discriminatory behaviour is partly profit-motivated. It

is logistically easier and cheaper to deal only with US

businesses. But Barnes and Noble works directly with foreign

publishers - and it preceded Amazon in the book business by

decades.

 

Yahoo!Geocities has lately instituted a new policy. It limits

the size of downloads from the free home pages of members of

its community. If the downloaded content from a given home

page exceeds 3 GB (extrapolated from hourly usage) - the

“offending” member’s page is shut down for an hour. The member

is then prompted to pay a monthly subscription fee for a

Premium Service in order to avoid a recurrence of this

unfortunate event. This “marketing drive” is intended to

compensate Yahoo!Geocities for a precipitous drop in online

advertising revenues.
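
As an aside, the mechanics of such a cap are simple to sketch. The
snippet below is a hypothetical Python illustration - not
Yahoo!Geocities’ actual code - of how a host might project one
hour’s traffic onto a month and suspend any page whose projection
exceeds a 3 GB quota:

    # Hypothetical sketch, not Yahoo!Geocities' actual implementation:
    # project the last hour's traffic onto a whole month and suspend
    # any page whose projection exceeds the 3 GB quota.

    HOURS_PER_MONTH = 30 * 24            # rough month used for the projection
    MONTHLY_CAP_BYTES = 3 * 1024 ** 3    # the 3 GB cap mentioned above

    def exceeds_quota(bytes_served_last_hour: int) -> bool:
        """True if the last hour's traffic, projected over a month,
        would exceed the monthly cap."""
        projected = bytes_served_last_hour * HOURS_PER_MONTH
        return projected > MONTHLY_CAP_BYTES

    # A page that served 5 MB in the past hour projects to roughly
    # 3.5 GB a month and would be shut down under such a policy.
    if exceeds_quota(5 * 1024 ** 2):
        print("Page temporarily disabled; upgrade to Premium to lift the cap.")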

The “Premium” package includes “Premium Mail”. But only US

citizens or residents can subscribe to it. And, you guessed it -

without the Premium Mail component, one cannot complete

the subscription process. Though not stated explicitly

anywhere, the Premium services are closed to the outside world

and are the exclusive reserve of Americans. One can get around

this virtual ethnic cleansing by providing false data while

registering, but that is beside the point.

The Internet is a reflection of the outside world. As

economies contract, unemployment soars, personal safety

vanishes, the social fabric disintegrates, and consumption

slumps - countries tend to isolate themselves politically,

react aggressively, and protect their national economies.

Protectionism, unilateralism, and isolationism are scourges

the Internet was supposed to be immune to. Little did we know.

 

The InCredible Web

By: Sam Vaknin

http://www.webcredibility.org/

 

People are conditioned to trust written words, not to mention

images. “I read it in the paper” or “As seen on TV” are

worn-out but still effective clichés. The Internet combines both

the written and the seen. It is both a textual and a visual

(and audio) medium. Do people trust Internet content? Is the

incredible Internet - credible?

In the “brick and mortar” world, credibility is associated

with brands. A brand, in effect, guarantees the quality and

specifications of a product (think McDonald’s hamburgers), its

performance (think Palm), level of service and commitment to

customer care (Amazon), variety, or price (Wal-Mart). Brands

are sustained and enhanced by advertising campaigns. The

content or sales pitch of specific ads is often less

important than the message conveyed by the very existence of a

campaign: “This company is rich enough (read: stable,

reliable, trustworthy, here to stay) to spend millions on

advertising”.

The Internet has very few brands (Yahoo!, Amazon) - and some

of them are tarnished. Some “old media” brands have entered

the fray (Barnes and Noble, The Wall Street Journal, the

Britannica) - hitherto without much success. The overwhelming

bulk of Web content is created or disseminated by small-time

entrepreneurs and monomaniacs.

So, how does one establish or acquire credibility in such a

diffuse and anarchic medium?

Enter Stanford University’s “Web Credibility Project”.

They define themselves thus:

“Our goal is to understand what leads people to believe what

they find on the Web. We hope this knowledge will enhance Web

site design and promote future research on Web credibility. As

part of this ongoing project we are:

* Performing quantitative research on Web credibility.

* Collecting all public information on Web credibility.

* Acting as a clearinghouse for this information.

* Facilitating research and discussion about Web

credibility.

* Helping designers create credible Web sites.”

 

Examples of current projects:

 

Timeliness: How does having out-of-date content affect the

credibility of a Web site?

 

Interaction: How does having a personalized interaction with

a Web site affect its credibility?

 

Negative Content: How does displaying negative content

associated with a branded web site affect the credibility of

the brand?

It is useful to confine ourselves to this definition of trust:

“The subjective belief, perception, or conviction that

information provided is true, factual, and objective, and that

commitments undertaken, explicitly, or implicitly, will be

honoured fully and in a timely manner”.

Such a perception, belief, or conviction is based on:

* Past experience in general (with spam, with merchants or

providers, with a similar product category, with the same

type of content, etc.) and personal proclivity to trust

or to distrust

* Experience with the specific merchant or provider

(whether personal or gleaned from other people’s feedback

- reviews, complaints, and opinions)

There is little that a merchant can do about the former. The

latter is, expectedly, influenced by the following factors (a

rough scoring sketch follows the list):

* Professionalism (as evident in Web site design, e-commerce facilities, user-friendliness, navigability,

links to other relevant Web pages, links from other Web

sites, ease and speed of download, updated content,

proofreading, a domain name that matches the company’s

name, availability, multilingualism, etc.)

* Trustworthiness (lack of bias, good intentions,

truthfulness, thoroughness, objectivity, expertise and

author credentials, knowledgeable sources and treatment,

citations and bibliography), and what the authors of the

research call “Real World Feel” (physical address,

phone/fax numbers, non-Web e-mail address, photos of

facilities and staff, audio recording, ownership by a

not-for-profit organization, a URL ending in .org).

* Commercial Web sites are less trusted. Cluttered ads,

paid subscriptions, e-commerce enabled forms - all reduce

the site’s credibility! This is especially true if the

entire site is one big ad and when it is hard to

distinguish ads from content.

* Track record (how long the merchant has been in business,

past financial performance, credit history, brand name recognition,

lists of customers, etc.)

* Selection (how many products are carried, how often

inventory is refreshed, etc.)

* Advertising (is the company’s business sufficiently

lucrative to support a campaign?)

* Service (good service - feedback forms, live support, etc. -

indicates a reassuring readiness to sacrifice the bottom line

to cater to the customer’s legitimate concerns)

* Full disclosure of rates, prices, privacy policy,

security issues, etc.

* Feedback from other users (opinions, reviews, comments,

FAQs, support groups, etc.)

* Site rating and certification by trustworthy agencies

(like the Better Business Bureau - BBB, VeriSign, TRUSTe)

- or awards won (from credible and reputable

organizations). Links from other, well-known and

believable Web sites.
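
To make the interplay of these factors concrete, here is a minimal,
purely illustrative Python sketch of how one might fold a handful of
them into a single score. The signals and weights are invented for
the example - they are not taken from the Stanford research:

    # Illustrative only: a toy weighted checklist built from a few of the
    # credibility signals listed above. Signals and weights are invented
    # for the example, not drawn from the Web Credibility Project.

    CREDIBILITY_WEIGHTS = {
        "physical_address_listed": 2.0,      # "real world feel"
        "author_credentials_shown": 2.0,     # trustworthiness
        "content_recently_updated": 1.5,     # professionalism
        "broken_links_present": -2.0,        # sloppiness penalised
        "ads_indistinguishable_from_content": -2.5,
    }

    def credibility_score(signals: dict) -> float:
        """Sum the weights of whichever signals a site exhibits."""
        return sum(
            weight
            for name, weight in CREDIBILITY_WEIGHTS.items()
            if signals.get(name, False)
        )

    # Example: a site with an address and credentials but some broken links.
    print(credibility_score({
        "physical_address_listed": True,
        "author_credentials_shown": True,
        "broken_links_present": True,
    }))  # 2.0 + 2.0 - 2.0 = 2.0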

The Web Credibility Project discovered that trust in e-commerce is

also influenced by idiosyncratic factors. Certain domain names

(.org) are more trusted than others (.com). Too many ads, broken

links, typos, outdated or old content - all diminish trust. In

the absence of proven markers and behavioral guidelines,

people seem to resort to extrapolation (“if they can’t

maintain their own Web site …”) and stereotypes (e.g., NGOs

are more trustworthy than corporations).

As Web sites proliferate (Google now indexes well over 3 billion

pages) and Web authoring becomes a routine task - the ratio of

noise to signal, of garbage to useful information, is bound to

worsen. Search engines already incorporate crude measures

of credibility in their rankings (e.g., the number of links

from external Web sites). But, to remain useful, search

engines (and Web directories) would do well to rate Web

content more comprehensively and thoroughly. They should rank

Web sites by authoritativeness, reliability, and objectivity,

for instance.
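
A toy version of the “crude measure” mentioned above - ranking pages
by how many distinct external hosts link to them - might look like
the following sketch (an illustration only, not how any actual
engine ranks):

    # Toy illustration of the crudest credibility signal mentioned above:
    # rank pages by the number of distinct external hosts linking to them.
    from collections import defaultdict
    from urllib.parse import urlparse

    def rank_by_inbound_links(links):
        """links: list of (source_url, target_url) pairs.
        Returns (target, count) pairs sorted by distinct external hosts."""
        inbound_hosts = defaultdict(set)
        for source, target in links:
            src_host = urlparse(source).netloc
            tgt_host = urlparse(target).netloc
            if src_host != tgt_host:          # ignore a site linking to itself
                inbound_hosts[target].add(src_host)
        return sorted(
            ((target, len(hosts)) for target, hosts in inbound_hosts.items()),
            key=lambda pair: pair[1],
            reverse=True,
        )

    # Two external sites link to example.org, one to example.com.
    print(rank_by_inbound_links([
        ("http://a.com/page", "http://example.org/"),
        ("http://b.net/page", "http://example.org/"),
        ("http://a.com/page", "http://example.com/"),
    ]))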

Research shows that 75% of all respondents resort to the

Internet as a primary information provider. The inundation of

irrelevant material caused most surfers to confine their

surfing to 10 Web sites (the equivalent of “anchors” in

shopping malls), which they deem reliable, timely, accurate,

objective, authoritative, and credible. The rest of the

Internet gets the leftovers. This worrying trend can be

reversed only through the emergence of independent and

commercially-viable rating agencies. Web sites (at least the

business ones) should be willing to pay for a credible rating to

enhance their stickiness and attract monetizable “eyeballs”.

In the absence of such third-party accreditation, the Internet

risks both irrelevance and disrepute.

 

Does Free Content - Sell?

By: Sam Vaknin

 

The answer is: no one knows. Many self-styled “gurus” and

“pundits” - authors of voluminous tomes they sell to the

gullible - pretend to know. But their “expertise” is an

admixture of guesswork, superstitions, anecdotal “evidence”

and hearsay. The sad truth is that no methodical, long term,

and systematic research has been attempted in the nascent

field of e-publishing and, more broadly, digital content on

the Web. So, no one can say for sure whether free content

sells, when, or how.

There are two schools - apparently equally informed by the

dearth of hard data. One is the “viral school”. Its vocal

proponents claim that the dissemination of free content fuels

sales by creating “buzz” (word of mouth marketing driven by

influential communicators). The “intellectual property” school

roughly says that free content cannibalizes paid content

mainly because it conditions potential consumers to expect

free information. Free content also often serves as a

substitute (imperfect but sufficient) for paid content.

Experience - though patchy - confusingly seems to point both

ways. Views and prejudices tend to converge around this

consensus: whether free content sells or not depends on a few

variables. They are:

(1) The nature of the information. People are generally

willing to pay for specific or customized information,

tailored to their idiosyncratic needs, provided in a timely

manner, and by authorities in the field. The more general and

“featureless” the information, the more reluctant people are

to dip into their pockets (probably because there are many

free substitutes).

(2) The nature of the audience. The more targeted the

information, the more it caters to the needs of a unique or

specific group, the more often it has to be updated

(“maintained”), the less indiscriminately applicable it is,

and especially if it deals with money, health, sex, or

relationships - the more valuable it is and the more people

are willing to pay for it. Less computer-savvy users -

unable to find free alternatives - are more willing to pay.

(3) Time dependent parameters. The more the content is linked

to “hot” topics, “burning” issues, trends, fads, buzzwords,

and “developments” - the more likely it is to sell regardless

of the availability of free alternatives.

(4) The “U” curve.
