I just got back from a Girl Geek Dinner here in Oslo. As always after these events, my brain is full of ideas and thoughts about the internet, and my phone has been low on battery for a while. So after discussing Klout, Fanbooster, and statistics in social media for a few hours, I rushed home to my laptop to write a blog post/note of some of the thoughts that were going through my head. Here goes:
One of our discussions was about content filtering algorithms – like Facebook’s system for choosing what shows up in your news feed or Google’s algorithm for search results. On the one hand, these algorithms are tools to help us deal with the impossible amounts of info that our friends, ex-classmates and brands we once liked throw at us on a daily basis. On the other hand, they might trap us in a filter bubble, where we only see views we already agree with and only interact with people who are like us.
Some discussions, including to a certain extent the one the Girl Geeks (with a few Guy Guests) just had, make these two factors seem like a complete either/or dichotomy: an unfiltered stream of all information or a carefully curated selection. To me the ideal is fairly simple: I want a filtered internet, but I want to filter it myself.
I love being able to use digital media to filter the world, to view it the way I want to. I want to get recommendations from Twitter users I follow, bloggers I read, Facebook friends, even mass e-mails from my co-workers. I love that I can get my own mix of views on the world from sources I enjoy and trust, and that I have more options than one news network.
Tools that let me filter my own content are great, and I wish there were more of them in online news sources. I would love to be able to check a little box that said “No sports news please” and another one that said “I don’t know who reality stars are, so I won’t click on stories about their sex lives no matter how hard you try.”
The difference between these options and algorithms is that the first are options. There is a huge difference between “You told us you like this, so we’re giving you more of it” and “For reasons we will never tell you, you will never see this content”.
Compared to pre-internet information segmentation, algorithm-based filtering is becoming more invisible, harder to break out of, and more difficult for actual word-reading, picture-viewing, link-clicking people to understand.
Many people decide for themselves that they have no interest in political views different from their own. But when they search the internet, and don’t see any of these offensive views, they should know why. It should be because they decided to filter something out of their lives, not because a company tricked them into thinking that something doesn’t exist.
- Facebook vs. Facebook users – One reason I don’t trust Facebook that much
- Back to the weblog – About regaining control over my own content
Facebook illustration by Sean MacEntree, Creative Commons
If Girl Geek Dinners sounds interesting to you, see if there are events or other ways to get involved with this network in your area. If your area is Oslo, here’s the local network website – and you can join our Twitter discussions with #GGDO