In my first post I talked about echo chambers and how it is easy to find yourself hearing the same things or related stories over and over, without getting other perspectives or other types of stories. In my most recent post, I talked about the ethics of children’s behavioral advertising, and more widely, behavioral advertising as a whole.

In class, two other girls and I did a group media analysis on the ethics of behavioral advertising. We had a great group discussion, ranging from the rights and wrongs of those lengthy terms-and-conditions pages to bigger questions: What is public, and what is private? How much gathered information is too much, and where do we draw that line? How should behavioral advertising be regulated, and would regulation even be effective?

Today, I’m combining those two topics in a way, talking about the implications of personalizing the internet. An article I found on Mashable argues that tailoring advertisements to specific individuals creates a universally uninformed society.

Eli Pariser, former executive director of MoveOn.org, recently noticed that his conservative friends on Facebook stopped showing up in his news feed as often as they used to. He realized this was because he so frequently clicked on liberal posts and liberal advertisements.

Pariser says, “this invisible algorithmic editing of the web moves us to a world where the Internet shows us what it thinks we need to see, but not what we should see.”

Even beyond social media sites like Facebook, Pariser noticed that his friends’ Google results differed. When one friend Googled “Egypt,” the recent protests came up, but when another Googled the same thing, travel links came up.

It is hard to escape our “filter bubble,” the personal bubble that keeps us returning to the same perspectives and the same news rather than venturing into points of view we aren’t used to reading about. This is particularly important in political news. Yes, people have specific party affiliations, some stronger than others, but if you always tune in to a liberal news station or a conservative news station, you’re not making yourself a well-informed citizen. It is important to seek out information that doesn’t always agree with your beliefs.

If individuals don’t seek out this information, they become very one-sided and, arguably, closed-minded.

Is it our own civic duty to seek out all sides of a story, or is it the media’s civic duty to provide us with all sides, no matter what our online behavior shows?

Pariser says we should have controls over what gets through our filters and what doesn’t, which brings us full circle back to the ethics of behavioral advertising. Should it be opt-in or opt-out? In other words, should individuals have to say that they want to be tracked behaviorally, or should they have to say specifically that they don’t?

It will be a long time before anything related to this subject is effectively addressed, but as Pariser said, I’d like to see a world where we get “a bit of Justin Bieber and a bit of Afghanistan.”
