The information is everywhere, a constant feed in our hands, in our pockets, on our desktops, in our cars, even in the cloud. The data stream can't be shut off. It pours into our lives a rising tide of words, facts, jokes, GIFs, gossip, and commentary that threatens to drown us. Perhaps it is this fear of submersion that lies behind our insistence that we've seen, we've read, we know. It's a none-too-convincing assertion that we are still afloat. So here we are, desperately paddling, making observations about pop-culture memes, because to admit that we've fallen behind, that we don't know what anyone is talking about, that we have nothing to say about each passing blip on the screen, is to be dead.

Faking Cultural Literacy

How thoroughly and how radically Google has already transformed the information economy has not been well understood. The merchandise of the information economy is not information; it is attention. These commodities have an inverse relationship. When information is cheap, attention becomes expensive. Attention is what we, the users, give to Google, and our attention is what Google sells—concentrated, focused, and crystallized.

Google’s business is not search but advertising. More than 96 percent of its $29 billion in revenue last year came directly from advertising, and most of the rest came from advertising-related services. Google makes more from advertising than all the nation’s newspapers combined. Siva Vaidhyanathan, a media scholar at the University of Virginia, puts it this way: “We are not Google’s customers: we are its product. We—our fancies, fetishes, predilections, and preferences—are what Google sells to advertisers.”

How Google Dominates Us, by James Gleick

(CNN) — With little notice or fanfare, the digital world is fundamentally changing. What was once an anonymous medium where anyone could be anyone — where, in the words of the famous New Yorker cartoon, nobody knows you’re a dog — is now a tool for soliciting and analyzing our personal data.

According to one Wall Street Journal study, the top 50 Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons each. Search for a word like “depression” on Dictionary.com, and the site installs up to 223 tracking cookies and beacons on your computer so that other Web sites can target you with antidepressants.

The new Internet doesn’t just know you’re a dog; it knows your breed and wants to sell you a bowl of premium kibble.

The race to know as much as possible about you has become the central battle of the era for Internet giants like Google, Facebook, Apple, and Microsoft. As Chris Palmer of the Electronic Frontier Foundation explained to me, “You’re getting a free service, and the cost is information about you. And Google and Facebook translate that pretty directly into money.”

While Gmail and Facebook may be helpful, free tools, they are also extremely effective and voracious extraction engines into which we pour the most intimate details of our lives.

As a business strategy, the Internet giants’ formula is simple: The more personally relevant their information offerings are, the more ads they can sell, and the more likely you are to buy the products they’re offering.

And the formula works. Amazon sells billions of dollars worth of merchandise by predicting what each customer is interested in and putting it in the front of the virtual store.

What the Internet is hiding from you

Up to 60% of Netflix’s rentals come from the personalized guesses it can make about each customer’s movie preferences — and at this point, Netflix can predict how much you’ll like a given movie within about half a star. Personalization is a core strategy for the top five sites on the Internet — Yahoo, Google, Facebook, YouTube, and Microsoft Live — as well as countless others.

It would be one thing if all this customization were just about targeted advertising. But personalization isn’t just shaping what we buy. For a quickly rising percentage of us, personalized news feeds like Facebook are becoming a primary news source. Thirty-six percent of Americans under 30 get their news through social networking sites.

And personalization is shaping how information flows far beyond Facebook, as websites from Yahoo News to the New York Times-funded startup News.me cater their headlines to our particular interests and desires. It’s influencing what videos we watch on YouTube and a dozen smaller competitors, and what blog posts we see.

It’s affecting whose e-mails we get, which potential mates we run into on OkCupid, and which restaurants are recommended to us on Yelp — which means that personalization could easily have a hand not only in who goes on a date with whom, but in where they go and what they talk about. The algorithms that orchestrate our ads are starting to orchestrate our lives.

The basic code at the heart of the new Internet is pretty simple. The new generation of Internet filters looks at the things you seem to like — the actual things you’ve done, or the things people similar to you like — and tries to extrapolate. Together, these engines create a unique universe of information for each of us — what I’ve come to call a filter bubble — which fundamentally alters the way we encounter ideas and information.
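The logic the passage describes — look at what you (or people like you) have liked, score everything else by similarity, and surface only the top matches — can be sketched in a few lines. This is a toy illustration of that idea, not any company's actual algorithm; the items, users, and similarity measure here are all hypothetical.

```python
def jaccard(a, b):
    """Overlap between two sets of users who liked each item."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# item -> users who signaled interest in it (toy data)
likes = {
    "golf_tips":  {"ann", "bob", "cara"},
    "golf_gear":  {"ann", "bob"},
    "metallurgy": {"dan"},
    "cookbooks":  {"ann", "cara", "dan"},
}

def personalized_feed(user, likes, already_seen):
    """Rank unseen items by similarity to items the user already liked."""
    liked = [item for item, users in likes.items() if user in users]
    if not liked:
        return []  # nothing to extrapolate from yet
    scores = {}
    for item, users in likes.items():
        if item in already_seen or user in users:
            continue  # only recommend things the user hasn't engaged with
        scores[item] = max(jaccard(users, likes[l]) for l in liked)
    return sorted(scores, key=scores.get, reverse=True)

# Bob likes golf content; items liked by similar users rank first.
print(personalized_feed("bob", likes, already_seen=set()))
```

Note the bubble-forming property: every item is scored relative to what the user already liked, so dissimilar material (here, "metallurgy" for a golf fan) sinks to the bottom of the feed and, in a real system that truncates the ranking, silently disappears.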

Of course, to some extent we’ve always consumed media that appealed to our interests and avocations and ignored much of the rest. But the filter bubble introduces three dynamics we’ve never dealt with before:

First, you’re alone in it. A cable channel that caters to a narrow interest (say, golf) has other viewers, with whom you share a frame of reference. But you’re the only person in your bubble. In an age when shared information is the bedrock of shared experience, the filter bubble is a centrifugal force, pulling us apart.

Second, the filter bubble is invisible. Most viewers of conservative or liberal news sources know when they’re going to a station curated to serve a particular political viewpoint. But Google’s agenda is opaque. Google doesn’t tell you who it thinks you are, or why it’s showing you the results you’re seeing. You don’t know if its assumptions about you are right or wrong — and you might not even know it’s making assumptions about you in the first place.

Finally, you don’t choose to enter the bubble. When you turn on Fox News or read The Nation, you’re making a decision about what kind of filter to use to make sense of the world. It’s an active process, and just as you would if you put on tinted glasses, you can guess how the editors’ leaning shapes your perception. You don’t make the same kind of choice with personalized filters. They come to you — and because they drive up profits for the websites that use them, they’ll become harder and harder to avoid.

The consequences of living in a bubble are becoming clear. Left to their own devices, personalization filters serve up a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar, and leaving us oblivious to the dangers lurking in the dark territory of the unknown.

In the filter bubble, there’s less room for the chance encounters that bring insight and learning. Creativity is often sparked by the collision of ideas from different disciplines and cultures. Combine an understanding of cooking and physics, and you get the nonstick pan and the induction stovetop. But if Amazon thinks I’m interested in cookbooks, it’s not very likely to show me books about metallurgy. It’s not just serendipity that’s at risk.

By definition, a world constructed from the familiar is a world in which there’s nothing to learn. If personalization is too acute, it could prevent us from coming into contact with the mind-blowing, preconception-shattering experiences and ideas that change how we think about the world and ourselves.

It’s not too late to make sure that personalization avoids these traps. But to shift its course, we need more people to become educated about how and why the Web is being edited for them, and we need the companies doing this filtering to show us not just what we’ll click most, but what we need to know. Otherwise, we could each find ourselves trapped in a bubble for one.

A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.

Mark Zuckerberg, Facebook