
The Filter Bubble: What the Internet Is Hiding from You
Audiobook, 7 hours


Written by Eli Pariser

Narrated by Kirby Heyborne

Rating: 4 out of 5 stars


About this audiobook

In December 2009, Google began customizing its search results for each user. Instead of giving you the most broadly popular result, Google now tries to predict what you are most likely to click on. According to MoveOn.org board president Eli Pariser, Google's change in policy is symptomatic of the most significant shift to take place on the Web in recent years: the rise of personalization. In this groundbreaking investigation of the new hidden Web, Pariser uncovers how this growing trend threatens to control how we consume and share information as a society, and reveals what we can do about it.

Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook, the primary news source for an increasing number of Americans, prioritizes the links it believes will appeal to you, so that if you are a liberal, you can expect to see only progressive links. Even an old-media bastion like The Washington Post devotes the top of its home page to a news feed with the links your Facebook friends are sharing. Behind the scenes, a burgeoning industry of data companies is tracking your personal information to sell to advertisers, from your political leanings to the color you painted your living room to the hiking boots you just browsed on Zappos.

In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirming of our beliefs; and because these filters are invisible, we won't know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas.

While we all worry that the Internet is eroding privacy or shrinking our attention spans, Pariser uncovers a more pernicious and far-reaching trend and shows how we can, and must, change course. With vivid detail and remarkable scope, The Filter Bubble reveals how personalization undermines the Internet's original purpose as an open platform for the spread of ideas and could leave us all in an isolated, echoing world.
Language: English
Release date: May 12, 2011
ISBN: 9781452671819


Reviews for The Filter Bubble

Rating: 3.93 out of 5 stars

15 ratings, 10 reviews


  • Rating: 3 out of 5 stars
    3/5
    The central message in The Filter Bubble is that the search algorithms used by websites like Amazon, Netflix, Facebook, and (most perniciously, Pariser argues) Google are incredibly good at showing us content that is similar to content we’ve already looked at. The cumulative effect of all this, Pariser argues, is that if we do nothing we wind up living in a tightly circumscribed online world filled with information, ideas, and outlooks already familiar to us: the “filter bubble” of the title. Pariser also has reservations about the ways in which companies like Google and Facebook gather, store, and use information about us: the raw material their algorithms use to decide what we want to see. Such privacy concerns, though, rest on ground already mapped by other writers, going back to Vance Packard and The Naked Society in 1964. Pariser’s central – and far more novel – theme is the perniciousness of the filter bubble itself. The internet shows us what we want to see, not what we need to see, and that deeply frustrates him. What frustrated me, for virtually the entire length of the book, is that Pariser seems far more concerned with warning readers that they’re on the road that leads to filter-bubble Hell than with asking why that particular route might have seemed – or might still seem – more attractive than the other routes available. He never stops, for example, to consider why filters feel like essential tools when exploring even a narrow, bounded world like Facebook (much less the web as a whole): a hyper-abundance of information, a horrific signal-to-noise ratio, and users with limited time and shaky information-literacy skills.
    Filtered search results and tailored news feeds have flourished, in part, because people find them useful and efficient. Pariser, who wants them to return a higher proportion of results that aren’t just what the user would expect (and thus want), is in the odd position of arguing that search engines would be improved if they were – in the eyes of most users – made less efficient. Arguing that efficiency isn’t an absolute virtue is far from absurd (it works for hand-dipped milkshakes, artisan bread, and craft-brewed beer), but it’s hard to see it being used to sell lifeboat bilge pumps or body armor. Or search engines. Some things you just want to be boringly efficient. The premise underlying Pariser’s case for less-tightly-filtered (and thus seemingly less-efficient) search engines and news feeds isn’t absurd, either. It’s that “efficiency” in search isn’t giving the user the information they want; it’s giving them the information they need – the information that will make them better informed, better able to think, and thus better able to deal with the world. It’s far from clear, however, that an internet search engine programmed (by others) to give them that is any more desirable than one programmed (by others) to give them just what they want. It’s also far from clear that most people, if presented with that broader range of information, would not – using their own homegrown filters – immediately weed out (as “irrelevant,” “biased,” “uninteresting,” or simply “wrong”) precisely the information that Pariser is so determined to provide them with.
  • Rating: 4 out of 5 stars
    4/5
    Really informative. But seemed too dry. Glad I read it.
  • Rating: 4 out of 5 stars
    4/5
    An interesting look at how the web is changing, with the push for 'personalization' leading the way; a push that is reducing the 'World' in 'World Wide Web' to 'access'. We can all access the web, but we don't all see the same content even if we are all on the same page! The author argues for the need to pressure Google, Facebook, and the other big internet players on transparency; we need to know exactly how what we see is being 'personalized', just what personal data they're holding on us, and who else has access to all or part of that data. Whether we'll get transparency or not is another matter. All corporations 'tend to their own good'; any public good they generate is a collateral benefit and, like collateral damage, is 'just business' – although in this case (and maybe in all cases) IT'S PERSONAL. The information collected by the various sites is our personal data, and it is being selected and stored based on the personal decisions of a small handful of people. Decisions that impact literally billions of us every day, yet most of us are completely unaware that the same Google search will return different results for different people. Or that Facebook news feeds ignore many a friend's updates. One thing the book does make abundantly clear: everything we do on the Web and, increasingly, off the web, is being tracked, stored, and used by someone to their gain and, potentially, our loss – not just of personal privacy but of a collective civil society. If all we see on the web is our own reflection, what a small, dull world it will be.
  • Rating: 3 out of 5 stars
    3/5
    Eli Pariser (of MoveOn.org, which I dimly recall was a relevant site when Bush II was on the throne, but seems to have lost some luster since Obama turned out to be Bush III) writes about the way the internet is becoming personalized, and how that distorts our view of reality. He says, “In polls, a huge majority of us assume search engines are unbiased. But that may be just because they’re increasingly biased to share our own views. More and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click” (p. 3). Not only do vendors like Amazon and Netflix use your purchase history to try to predict what you might want to buy next (which seems legitimate, and no more than an on-the-ball shopkeeper in the bricks-and-mortar days would do), but increasingly, info you don’t even know is being collected about you is in play in ways you don’t realize.

    In commerce this info is gathered by “data companies like BlueKai and Acxiom, [which has] accumulated an average of 1,500 pieces of data on each person on its database—which includes 96 percent of Americans” (p. 7). Credit card companies have been profiling us based on what we buy for decades; now cellphones can report on where you go, too. This has some unpleasant implications in terms of direct marketing, and some possibly scary implications in terms of surveillance. But Pariser believes it also has a negative impact on social bonds and even on epistemology.

    The filter bubble is Pariser’s name for the feedback loop of information that surrounds us, as even Google searches we believe to be unbiased are increasingly tailored to our personal profiles. Although he admits we’ve always selected media that appeals to our interests and preconceptions (hence the echo-chamber cable news channels), Pariser says the filter bubble is different in three ways. “First, you’re alone in it.” Each personal news feed or search is tailored specifically to you, so you’re no longer even part of a narrow affinity group. Second, it’s invisible. “Google’s agenda is opaque. Google doesn’t tell you who it thinks you are or why it’s showing you the results you’re seeing. You don’t know if its assumptions about you are right or wrong—and you might not even know it’s making assumptions about you in the first place” (p. 10). Finally, Pariser says while the viewers of politically slanted media are (presumably) aware there are a range of options and that they’ve chosen one of them, as the filter bubble gives us more and more seemingly objective positive reinforcement in our preferences and prejudices, we begin to believe the world is like us.

    Because even web searches from commercial sites like Google are increasingly tailored to each user’s profile, Pariser says we are less likely to be exposed to a rich variety of ideas. Politically, this would tend to make us even more obnoxiously American than we already are. But he also claims that it will hinder creativity by promoting a more passive style of info gathering, and by narrowing what he calls “the solution horizon, the limit of the conceptual area that we’re operating in” (p. 95). It’s hard to be outside the box, Pariser believes, if the box is narrow and invisible. If innovation comes from the juxtaposition of unrelated ideas and from the type of creative cross-pollination that happens when people expose themselves to unfamiliar stimuli, then we could be headed toward a generation of accountants. And this change has been noticed even by some techies: “The shift from exploration and discovery to the intent-based search of today was inconceivable,” Pariser quotes an unnamed Yahoo editor lamenting (p. 103).

    Some of the comparisons Pariser makes between Google (which profiles you based on your click history) and Facebook (which profiles you based on what you share) are less weighty than his argument above. You might even call them trivial. But the point is still worth remembering: “If you’re not paying for something, you’re not the customer; you’re the product being sold” (p. 21). Overall, though, I think Pariser overstates the danger of the filter bubble because, just like the techno-evangelists he criticizes, he overestimates the importance of the technology. The box isn’t invisible – the box is the commercial internet. Creative people have no trouble seeing that. The problem, which Pariser gets close to and then misses, is that we’re training a generation of people not to be creative. The average American spends 36 hours a week watching TV. Switch that to surfing the web and you’ve still got the same problem.

    It’s a good book and a quick read. Pariser asks some provocative questions, but he doesn’t offer a lot of solutions. A government regulatory agency that supervises these data collectors does not sound like a good idea to me. If there’s anyone I want to have my personal info even less than salesmen, it’s bureaucrats. Pariser mentions the movie Minority Report – I’m thinking Enemy of the State. RFIDs cost about a nickel apiece, and it’s been nearly 15 years since I sat in a presentation by a semiconductor manufacturer’s rep (I think from National Semi) who talked about all the ways they were thinking of deploying them. So what are some ways of getting out of the filter bubble?

    First, limit the amount of info you’re giving away. Assume you’re always being watched, and act accordingly. Don’t carry a smartphone everywhere. Use cash. Search on something other than Google. Use Tor or some other anonymizing web service. Get off Facebook. Remember that everything you post to any website you don’t personally own probably becomes someone else’s property, and that the stuff you post on your own site can be copied and saved by anybody. Forever. And from the network perspective, it’s never been easier for regular people to communicate, and it doesn’t have to be through the commercial web. WiMAX base stations are cheap and can connect entire towns and cities into networks that don’t depend on the AT&Ts and Time Warner Cables of the world. Those networks won’t have Netflix or YouTube on them (or much porn, either), but if that’s all we’re really looking for, then it’s already too late.
  • Rating: 3 out of 5 stars
    3/5
    The book seriously wandered from its original premise, taking in philosophical discussions and LOTS of ink spent on targeting people for ads. I did learn a few things, though, so I'd consider it worth the time spent reading.
  • Rating: 4 out of 5 stars
    4/5
    In a world of warm and fuzzy internet giants such as Google, Facebook, Netflix, Amazon, etc., all may not be well, according to the author of The Filter Bubble. While we theoretically have access to more information than ever, what we actually access, algorithmically derived from our own previous click history, is limited in scope. Due to “embedded filtering,” our own “personalized internet” purportedly limits one’s ability to encounter new ideas, serendipitous discoveries, and opportunities to learn. Where the internet initially was rooted in anonymity, the filter bubble creates a homogenous online environment where privacy is but an illusion. In an apparent twist on selling our souls to the devil, these free internet-based services use personal histories, preferences, and data without our explicit knowledge or approval to extend their power, control, and profit margin. Certainly alarmist, this book is food for thought for creatures of habit, though technological determinists are likely to see it as a false alarm.
  • Rating: 4 out of 5 stars
    4/5
    The subtitle is what caught my attention: What the Internet Is Hiding from You. I use the Internet every day, but I don’t really understand how it works. Pariser, board president of MoveOn.org, makes it all clear – and frightening. His basic argument is that large companies such as Google and Facebook are prying personal information from us and then selling it to advertisers and websites eager to get our attention. Because of this, we get out of the Internet only what we put into it – trapping us in a filter bubble that blocks us from new ideas, innovation, and a global perspective. It’s hard to be informed in the filter bubble. Going online to news sites isn’t enough. Seeing what your friends are up to won’t pop the bubble. Reading Pariser’s book will. One of the quotes Pariser uses is by John Dewey. Dewey used the term “bars,” but I’ve replaced it with “filters” to show how relevant his concern is to today’s society: “Everything which FILTERS freedom and fullness of communication sets up barriers that divide human beings into sets and cliques, into antagonistic sects and factions, and thereby undermines the democratic way of life.”
  • Rating: 3 out of 5 stars
    3/5
    I might be reading too many of these “the internet is bad for you” books, as I seem to be finding a pattern: propose a theory, even a very thin one without definite proof, and upon that idea build your own personal beliefs about how people don’t read enough, how the internet is making us stupid, and now, how the internet is filtering information from you. I was, of course, fascinated with Eli Pariser’s TED Talk. While perusing his Facebook and googling information, he noticed that certain friends and pieces of information were disappearing. The news feeds from his conservative Facebook friends were vanishing: since he clicked on the links of his liberal friends more often, the conservative ones went away. He then asked some of his friends to search for Egypt. The results for one friend were about the revolution; the other just received basic information about Egypt. Because one of his friends had searched and clicked on Egyptian revolution links, his search results showed more. Based on these two examples, Pariser delves into the problems with the algorithmic functions that are the source of Google’s and Facebook’s success. Using Google’s PageRank and Facebook’s EdgeRank, Pariser describes how this filtering leads to an unchallenged, self-fulfilling view of the world. He believes that viewpoint is anti-democratic and stifles innovation. He argues that too much homogeneity can lead to a weakened culture and that this overpersonalization, if placed into the wrong hands, could lead to some science fiction dystopia similar to the movie Gattaca: your future determined by where you were born, who your parents were, your friends, and your interests. A company or government can predict and track your whole life – a scary prospect for anyone. However, I think the problem with this book might be too much paranoia. As a staunch defender of democracy, Pariser strongly resists any attempts at manipulation.
    His chronicle of how we have historically received news and been marketed to and manipulated by those who seek to profit, either monetarily or politically, is interesting but not exactly new information. It’s this viewpoint that leads Pariser down a different corridor. He spends a little too much time on this history, and I think that weakens his point. Yes, we have always been manipulated by the media, the government, and big business (even Taco Bell doesn’t have to put real beef in its tacos, and the government definition of what beef is would lead anyone to question who is in control of things), so this filter bubble isn't anything new. In fact, it might be a weaker substitute for what is already going on around us. When I did the same test, searching for Egypt, I received the same basic information, but I know all about the Arab Spring and the Egyptian revolutions. I also found out about the death of Osama bin Laden through the same network, Twitter. Pariser says that using Twitter over Facebook might be a better source of information, as Twitter doesn’t track you using these techniques; therefore, your search results aren’t altered. It’s also difficult to tell whether Pariser is being paranoid, since much of this, by his claim, is invisible. It’s enough to make one paranoid, but he at least provides solutions on how to prevent falling into a filter bubble. The way to avoid being filtered is: don’t be a mouse. Don’t go back to the same websites over and over again; diversify your interests. Use Twitter, not Facebook, to get information, since you are then not subject to EdgeRank.
    He also suggests that the internet adopt the 1973 Department of Health, Education, and Welfare recommendations: you should know who has your personal data, what data they have, and how it’s used; you should be able to prevent information collected about you for one purpose from being used for others; you should be able to correct inaccurate information about you; and your data should be secure. Overall, Pariser's call for skepticism and diversity in thought is important, but whether the science fiction dystopia he fears will come true seems doubtful, and the fear distracts from his main points. A fascinating read and history, but the concept is too weak and the conclusion a little melodramatic. It did make me more conscious of my searching and online habits, and it is good to be aware of these consequences.
  • Rating: 5 out of 5 stars
    5/5
    Eli Pariser offers not just a diagnosis of a problem but also some ideas for making a start at addressing it. I especially liked the suggestions for individuals he includes in his final chapter along with the suggestions for businesses and governments. This book will change the way you interact with Facebook, Google, and the internet as a whole.
  • Rating: 4 out of 5 stars
    4/5
    If you’re not paying for something, you’re not the customer; you’re the product being sold. Attributed to a commenter on the Internet and oft-quoted by LT’s Tim Spalding, that line lays the foundation of half of this book (the thematic half, not the structural): virtually every website you visit collects, compiles, and integrates your personal data and then uses or sells it for commercial purposes. Google and Facebook are particularly vilified: Google captures your searches and result-link clicks; Facebook doesn’t have to capture data, because users provide it voluminously. The other half of the book is the stuff of the title – that the Internet increasingly uses that collected data to tailor itself to you, creating an online space that is your own little filtered bubble. It used to be that you could Google something and tell someone to click the third link on the first page – those “Page rankings” (named after Google’s Larry Page) having been based on what was most relevant to the whole of Internet users. But since 2009, Google searches are individualized and ranked according to what’s most relevant to you, i.e., what you’re most likely to click on. Google yourself and you won’t get the results I would get searching for you. Google a controversial topic and you’ll get results in line with what you already know; the same goes for the prioritization of your Facebook feeds. Online (and increasingly offline), you are what you click on the web; you are what you share there or link to. Did you know that your “real life” credit-worthiness is affected by the credit-worthiness of your Facebook friends? Pariser acknowledges that media has always been filtered (network news and newspapers) and that, at least with the Internet, you can go find what you don’t know – as long as you know you don’t know it; stumbling on “unknown unknowns” is harder.
    But he rails against the lack of transparency in data collection and cautions that Internet filtering puts techies in charge of the dialogue and discourse that create society. That reminds me of Edward Tufte’s caution in The Visual Display of Quantitative Information: “Allowing artist-illustrators to control the design and content of statistical graphics is almost like allowing typographers to control the content, style, and editing of prose.” If I recall correctly (I listened via audiobook), Pariser suggests there is little solution other than regulation; he offers a couple of suggestions to mitigate the problem, for example, setting your web browser to delete cookies and history each time it closes. The book is revelatory. It, or another book on the topic, is required reading for every person who goes online.