Thursday, July 21, 2011

The Filter Bubble

Recently, I've been reading Eli Pariser's new book, The Filter Bubble: What the Internet Is Hiding From You. The title is a bit awkward; the book's topic is personalization on the Internet and its effects. Personalization is the technique by which web servers keep track of who their users are and what information is available about them, then use that information to customize their responses based on who is making the request.

As Pariser points out, personalization is endemic and ubiquitous:

When you search for the flight, Kayak places a cookie on your computer -- a small file that's basically like putting a sticky note on your forehead saying "Tell me about cheap bicoastal fares." Kayak can then sell that piece of data to a company like Acxiom or its rival Blue-Kai, which auctions it off to the company with the highest bid -- in this case, probably a major airline like United. Once it knows what kind of trip you're interested in, United can show you ads for relevant flights -- not just on Kayak's site, but on literally almost any Web site you visit across the Internet. This whole process -- from the collection of your data to the sale to United -- takes under a second.


Why is this something you should worry about?

Well, it's not easy to explain, but Pariser tries, presenting several interesting perspectives on the downsides of having your computers conform too subserviently to your desires.

First, there is the fact that "some of the most important creative breakthroughs are spurred by the introduction of the entirely random ideas that filters are designed to rule out":

in moments of major change, when our whole way of looking at the world shifts and recalibrates, serendipity is often at work. "Blind discovery is a necessary condition for scientific revolution," they write, for a simple reason: The Einsteins and Copernicuses and Pasteurs of the world often have no idea what they're looking for. The biggest breakthroughs are sometimes the ones that we least expect.


Another important reason is that it's not clear who "we" are, since human beings don't have a simple, easily classified identity:

The personality traits that serve us well when we're at dinner with our family might get in the way when we're in a dispute with a passenger on the train or trying to finish a report at work. The plasticity of the self allows for social situations that would be impossible or intolerable if we always behaved exactly the same way.

...

Personalization doesn't capture the balance between your work self and your play self, and it can also mess with the tension between your aspirational and your current self. How we behave is a balancing act between our future and our present selves. In the future we want to be fit, but in the present, we want the candy bar. In the future, we want to be a well-rounded, well-informed intellectual virtuoso, but right now we want to watch Jersey Shore.


Pariser is not claiming that he can solve these problems; at this point, all he really wants to do is to raise awareness about what is going on:

We live in an increasingly algorithmic society, where our public functions, from police databases to energy grids to schools, run on code. We need to recognize that societal values about justice, freedom, and opportunity are embedded in how code is written and what it solves for. Once we understand that, we can begin to figure out which variables we care about and imagine how we might solve for something different.


Pariser's book covers a number of important issues of the moment. This is not to say it's a perfect book, unfortunately. Like all too many popular non-fiction books, it stretches its page count with wordy text and filler; I felt this was a great 30-page essay that became a 200-page book. Still, Pariser is a skilled writer and the book reads well; you'll move through it in a few days. You'll learn a lot, and you'll look at the Internet with a new eye -- good outcomes, and good reasons to spend your time with this book.

1 comment:

  1. A long time ago, before I got into networking and storage, I dabbled a bit in AI. One of the thoughts I had was similar to Pariser's point about filtering out the connections we need to make. It's hill-climbing, basically. If the solution to your problem is on a far hill, a too-narrow focus on always going uphill - i.e. what past experience says is most likely to be relevant - can leave you stranded on a near hill nowhere near that solution. Sometimes you have to do a random leap, not forgetting your objective but ignoring the usual pathfinding algorithm for a moment. In fact, ISTRC a paper in just the last couple of years about the effectiveness of such a "search nearby for a while, leap, search nearby..." kind of approach to searches (possibly for food but I don't really remember).

    My idea based on this was to make computers dream like we do, associating "thoughts" that had no particular logical relationship but had happened to occur close together in time. Think Archimedes in a bathtub, or Newton under an apple tree. I never really got to implement the idea, but I still think it would be an interesting use of otherwise-idle periods like after the researchers have gone home. ;) Maybe we need to do more of the same as well sometimes, ignoring our left-brain demand for proven relevance and indulging in a little right-brain free association once in a while. Something like an "I'm Feeling Lucky" button on steroids, combining terms from recent searches just to see what pops out.
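    The hill-climbing intuition in this comment is easy to sketch in code. The following is a minimal, hypothetical illustration (the landscape function, parameter values, and names are mine, not from the comment): a one-dimensional terrain with a low hill near x=2 and a taller hill near x=8. A climber started on the low hill and restricted to uphill moves gets stranded there; giving it an occasional random leap lets it eventually find the far, taller hill.

    ```python
    import random

    def landscape(x):
        """Two hills: a low one at x=2 (height 3) and a taller one at x=8 (height 5)."""
        return max(0.0, 3 - (x - 2) ** 2) + max(0.0, 5 - (x - 8) ** 2)

    def climb(f, x, steps=2000, step_size=0.1, leap_prob=0.0, seed=42):
        """Hill-climb from x; with probability leap_prob, take a random leap instead."""
        rng = random.Random(seed)
        best = f(x)
        for _ in range(steps):
            if rng.random() < leap_prob:
                x = rng.uniform(0.0, 10.0)           # random leap: ignore the usual pathfinding
            else:
                candidate = x + rng.uniform(-step_size, step_size)
                if f(candidate) > f(x):              # ordinary move: uphill only
                    x = candidate
            best = max(best, f(x))                   # remember the objective, as the comment says
        return best

    # Starting on the low hill, pure hill-climbing is stuck at height 3;
    # occasional leaps let the search reach the taller hill near height 5.
    print(climb(landscape, 2.0))                     # stays at 3.0
    print(climb(landscape, 2.0, leap_prob=0.05))
    ```

    The "search nearby for a while, leap, search nearby" strategy the commenter recalls corresponds to the leap_prob branch above: most steps are local, but a small fraction jump far away, trading short-term progress for a chance at the distant solution.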
