"The Filter Bubble"
by Eli Pariser
I called these posts "speed reads" because I really zipped through the book, skimming and scanning and synthesizing my own understandings into the statements below. It's not impossible that I completely missed the point or got off on a tangent that doesn't reflect the author's intent. I trust that readers will use the comment block below to correct me and extend the conversation.
The Race for Relevance:
This first chapter chronicles the internet giants' pursuit of mechanisms for parsing meaning from who clicks where, when, and under what circumstances. It no longer surprises anyone that free online services like Facebook are in the business of selling audience to advertisers. Not just any audience, though: Facebook sells a specially selected segment of the total audience, one more likely to acquire the advertised item.
The User is the Content:
Users see media platforms, like Facebook and Google, as the product through which they consume media content. From the other side of the screen, to the internet companies themselves, the media platform is the product through which users contribute demographic information which is, in turn, sold to advertisers. Social media sites create a window into individual lives, experiences, likes, thoughts, and consumer patterns. Users of social media are the content.
The Adderall Society:
Adderall is a drug prescribed to treat attention deficit disorder, allowing users to focus more intently and attend for longer periods of time. Research suggests associative creativity is affected, resulting in a more clinical and calculated engagement with the world. The filters through which we view the world, ground by our own experiences, beliefs, and goals, influence the way we engage with it. We understand creativity to arise from serendipitous connections amongst trains of thought and random memories or experiences. Programmers are experimenting with computational creativity, building algorithms that bring peripheral or random content to the core of a concept.
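The "computational creativity" idea mentioned above could be sketched as a recommender that deliberately reserves a few slots in an otherwise relevance-ranked list for random, peripheral picks. This is purely my own illustration, not anything from the book; the function name, the `serendipity` parameter, and the 80/20 split are all invented for the sketch.

```python
import random

def recommend(scored_items, k=10, serendipity=0.2, seed=None):
    """Return k items: mostly top-scored, plus a few random 'peripheral' picks.

    scored_items: list of (item, relevance_score) pairs.
    serendipity: fraction of the k slots reserved for random picks
                 drawn from outside the relevance core.
    """
    rng = random.Random(seed)
    ranked = sorted(scored_items, key=lambda pair: -pair[1])
    n_random = max(1, int(k * serendipity))
    # The "core": the items a pure relevance engine would return.
    core = [item for item, _ in ranked[:k - n_random]]
    # The "periphery": everything the pure engine would have filtered out.
    periphery = [item for item, _ in ranked[k - n_random:]]
    core += rng.sample(periphery, min(n_random, len(periphery)))
    return core

# 30 hypothetical articles with descending relevance scores.
items = [(f"article-{i}", 100 - i) for i in range(30)]
picks = recommend(items, k=10, serendipity=0.2, seed=1)
```

With `serendipity=0.2`, two of the ten slots go to articles the relevance ranking alone would never have surfaced.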
Search engines and social media craft responses to queries that mirror our demographic and consumer profile. This means we are less likely to encounter peripheral and random content as the engine feeds us content that fits our existing schema; in the process, our exposure to opposing points of view or alternative approaches is filtered out.
The You Loop:
Users click links that signal their interests. The site analyzes those selections and refines its profile of the user. The profile determines which results are returned to the user, who is in turn influenced by that content. This feedback loop cuts an interest groove that is "narrow and deep."
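The loop above can be sketched as a toy simulation. Everything here is my own illustration, not the book's: a tiny three-topic catalog, a `rank` function that favors whatever the profile has clicked before, and a simulated user who clicks from whatever is shown.

```python
import random
from collections import Counter

# Toy catalog: 50 hypothetical articles in each of three topics.
CATALOG = ([("politics", i) for i in range(50)]
           + [("sports", i) for i in range(50)]
           + [("science", i) for i in range(50)])

def rank(profile, k=5):
    """Return the k articles whose topics best match the click profile."""
    return sorted(CATALOG, key=lambda article: -profile[article[0]])[:k]

def simulate(rounds=20, seed=0):
    rng = random.Random(seed)
    profile = Counter()  # starts with no known interests
    for _ in range(rounds):
        # No profile yet? Show a random sample; otherwise show ranked results.
        results = rank(profile) if profile else rng.sample(CATALOG, 5)
        clicked = rng.choice(results)   # the user clicks one result
        profile[clicked[0]] += 1        # the site refines its profile
    return profile

profile = simulate()
# After the very first click, one topic dominates every subsequent page of
# results, so the profile only ever deepens along that single groove.
```

The striking part is how little it takes: a single early click tips the ranking, and the loop then feeds the user nothing but that topic.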
The Public is Irrelevant:
As marketing interests seek to define who we are, and we respond to that definition, we become segmented and isolated in information pockets: filter bubbles. It is increasingly difficult to get information from opposing points of view. The Public, as a community of people with common core values, is at risk of fragmentation.
Software engineers' lines of code create a virtual space within which people are living, creating, interacting, producing, and consuming. Simple frameworks, like a wiki space, a blog forum, or the algorithm for returning search results, influence the nature of those interactions, intentionally or unintentionally. Asking whether coders have an ethical responsibility to anticipate the larger social and political implications of the online environments and tools they create seems a tall order. Nevertheless, the data gathered from these spaces leaves the public open to exploitation, and internet companies in a position of great power.
What You Want, Whether You Want It or Not:
Internet information-based companies harvest a tremendous amount of data to personalize the products they deliver, but the profit is in the consumers they deliver to advertisers. We are led to believe that search results are unbiased and represent a fair sampling of what is available, if not ALL of what is available. The suspicion that results and information are skewed to generate profit requires no great leap of logic. Google's famous motto, "Don't be evil," is at first comforting to end users and then, upon reflection, seems more a matter of faith. Pariser shares this quote from a Google insider: "We try really hard not to be evil, but if we wanted to, man, could we ever!" We are exposed to whatever messages, results, opinions, reviews, advertisements, products, beliefs, and spin the Google algorithms believe we want. The coders of those algorithms have the power to influence the world on a massive scale.
Escape From the City of Ghettos:
While Google's search and Facebook's "Top Stories" feed serve up their understanding of our personal needs, individuals have other sources of information: crowd-sourced social media like Twitter, Reddit, and Digg, where users themselves "form part of its operational matrix." Broadening the ways we engage and the content we explore can lift us out of the deep and narrow ruts cut for us by the social-profiling algorithms.
Pariser sums it up: "In this book, I've argued that the rise of pervasive, embedded filtering is changing the way we experience the Internet and ultimately the world. At the center of this transformation is the fact that for the first time it's possible for a medium to figure out who you are, what you like, and what you want. Even if the personalizing code isn't always spot-on, it's accurate enough to be profitable, not just by delivering better ads but also by adjusting the substance of what we read, see, and hear."
Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, May 2011), ISBN 978-1-59420-300-8.