Algorithms in the Mist: How researchers try to decode Facebook’s newsfeed

The algorithms used by Facebook and others are murky at best. Researchers are working to understand them. (Photo: Andrew Harrer/Bloomberg)


by Alex Dalenberg

The UpTake: Algorithms are a fact of digital life, but the inner workings of the likes of Facebook’s newsfeed and Google Search are famously opaque. These researchers are working to change that.

The algorithms that power applications like Facebook’s newsfeed and Google Search shape the contours of our digital lives, but only a handful of engineers know how they actually work.

That was the upshot of a luncheon forum this week at Harvard’s Berkman Center for Internet and Society, in which three researchers described their efforts to decipher Facebook’s newsfeed from the outside.

Even the most technically savvy can only make educated guesses at how these tools function. Silicon Valley companies zealously guard their algorithms as proprietary secrets. They’re also constantly changing (here’s an interactive showing how Google Search worked in 2007).

For most Web users they remain out of sight, out of mind. A recent study of Facebook users suggests that a lot of us aren’t even aware of their existence.

When Karrie Karahalios, an associate professor in computer science at the University of Illinois, and her colleagues set out to test how Facebook chooses what to display in its newsfeed, the majority of their 40 study participants (about 62 percent) had no idea that Facebook was making any choices at all about what they did or didn’t see in their newsfeeds.

To run the test, Karahalios and her team built an app using Facebook’s API that showed everything happening in a user’s network, with no filtering, placed alongside the regular, filtered newsfeed users see when they log into Facebook.

In short, it was the first time many of the participants realized they weren’t getting updates from certain friends, family members or other pages because of choices made by Facebook algorithms.
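At its core, the comparison is simple set arithmetic: gather every story the user’s network produced, gather the stories the filtered feed actually displayed, and diff the two. Below is a minimal Python sketch of that idea against the Graph API of that era; the endpoints (/me/home for the newsfeed, /&lt;id&gt;/posts for friends’ stories), the API version, and the required permissions are all assumptions for illustration, not the researchers’ actual code, and Facebook has since restricted this kind of access.

```python
import requests

# Graph API base URL from roughly that era; the version is an assumption.
GRAPH = "https://graph.facebook.com/v2.0"

def get_paged(path, token, limit=100):
    """Fetch every page of a Graph API collection (simplified, no error handling)."""
    url = f"{GRAPH}/{path}"
    params = {"access_token": token, "limit": limit}
    items = []
    while url:
        payload = requests.get(url, params=params).json()
        items.extend(payload.get("data", []))
        # The "next" URL already embeds the query string, so drop our params.
        url = payload.get("paging", {}).get("next")
        params = {}
    return items

def hidden_story_ids(token):
    """Diff everything friends posted against what the newsfeed actually showed.

    Assumes the old /me/home newsfeed endpoint and per-friend /<id>/posts reads,
    both of which required permissions Facebook has since locked down.
    """
    friends = get_paged("me/friends", token)
    everything = {post["id"]
                  for friend in friends
                  for post in get_paged(f"{friend['id']}/posts", token)}
    shown = {post["id"] for post in get_paged("me/home", token)}
    return everything - shown  # stories the filtering algorithm never surfaced
```

The interesting part isn’t the code but the leftover set: every ID remaining after the subtraction is a story a friend published that the ranking algorithm silently dropped, which is exactly what surprised the study’s participants.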

The MIT Center for Civic Media has a nice play-by-play breakdown of the entire presentation here. The Berkman Center website hasn’t posted the video of it yet, but when they do, it should show up here.

There’s a lot to unpack in this presentation (far more than fits in a short post), but I thought it was worth highlighting as part of a growing trend of academics, journalists and other advocates calling for what might be described as algorithmic transparency. For example, Nick Diakopoulos, a Tow Fellow at the Columbia Journalism School, has argued that poking and prodding at algorithms should be its own journalism beat.

Why is it important? For one thing, algorithms determine what gets an airing in the marketplace of ideas. The companies that build these algorithms also have their own prerogatives, and those don’t necessarily line up with users’ when it comes to what they see and when. Christian Sandvig of the University of Michigan, who also presented at the Berkman Center, writes about this in detail here.

Recall the firestorm over the news that Facebook experimented on users, making newsfeed changes to test their impact on people’s emotions, as well as the company’s tepid response. This conversation isn’t going away.

 
