Escape your Search Engine’s Filter Bubble
by Joakim Larsen
When we search the Internet, different results are shown to different people. Based on your search history and your click history, results are tailored to who you are.
Since you often click on things you agree with, you keep getting more and more of what you already agree with. This means other things are demoted (effectively filtered).
This raises the question: what are you missing? In other words, you are living in a filter bubble that promotes the things it thinks you like and demotes the rest. This can limit your exposure to opposing information.
Unfortunately it is not easy to pop your filter bubble because the technology is used so much across the Internet, even when signed out.
Could one search engine really be a safe alternative to Google, Yahoo, Ask, Bing, Webcrawler and the numerous others?
DuckDuckGo claims to neither store your personal information online, nor sell it for commercial purposes. Combined with Firefox Web Browser, this may currently be among the safest ways to make a private search.
Myths About Private Browsing
You may have tried your browser’s “private” mode, but have you read the fine print? Things like cookies and history are deleted when you close out. But there are plenty of other ways you’re being tracked, like through your IP Address and logins.
Search engines can still record your searches. Your Internet provider can still see where you go. So, even though YOU can’t see your past activity, the rest of the Internet can.
This means advertisers can still build profiles based on your activity, which they use to show targeted ads that follow you around the web. All of this information can also be requested by the government, which makes private browsing pretty ineffective.
Private browsing or not, your online activity is still monitored unless you use services that don't track you by default, like DuckDuckGo for search.
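Why private mode cannot hide your searches can be sketched in a few lines: deleting local cookies and history does nothing about what the server records. The log format below is an invented illustration, not any real search engine's logging code:

```python
# Sketch of server-side logging: regardless of the client's private
# mode, every request delivers the user's IP address and query to the
# server, which is free to write them to a log.
from datetime import datetime, timezone

def server_log_line(ip, query):
    # What a search server can record for each incoming request.
    ts = datetime.now(timezone.utc).isoformat()
    return f"{ts} ip={ip} q={query!r}"

print(server_log_line("203.0.113.7", "filter bubble"))
```

Private mode only cleans up the client's side of this exchange; the server's log line is untouched.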
All over the Web, invisible algorithmic editing is taking place right in front of you. And you may not even be aware of it.
Facebook does it. And on Google your search is affected by which type of computer you are using, what kind of browser you are using, and where you are located. These are used to edit your query results. There is no standard Google anymore. Yahoo News, the biggest news site on the Internet, is now personalised. Different people get exposed to different things.
This brings us to a world where the Internet moves us to what it thinks we want to see, but not necessarily to what we need to see.
If you bring all these algorithms together, you get what can be called a filter bubble. Your filter bubble is your own unique universe of information that you live in online. And what is inside your filter bubble depends on who you are and on what you do. But you don't get to decide what gets in. And more importantly, you don't actually see what gets edited out.
The challenge with the algorithmic filters is that they mainly prioritize what you click on first. So instead of a balanced information diet, you can end up with information junk food.
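The click-first feedback loop described above can be sketched as a toy ranker. The topics, weights, and update rule below are illustrative assumptions, not any real engine's algorithm:

```python
# Toy sketch of a filter bubble forming: results are ranked by topic
# weights learned from clicks, so each click on a topic promotes it
# further and effectively demotes everything else.
from collections import Counter

ARTICLES = [
    ("sports", "Local team wins final"),
    ("sports", "Transfer rumour roundup"),
    ("finance", "Bank stress tests explained"),
    ("politics", "Parliament debates new bill"),
]

def rank(articles, weights):
    # Higher learned weight -> higher position; ties keep input order.
    return sorted(articles, key=lambda a: -weights[a[0]])

def simulate(clicked_topic, rounds=5):
    # The user always clicks their favourite topic, so the weights
    # drift toward it with every round.
    weights = Counter({topic: 1 for topic, _ in ARTICLES})
    for _ in range(rounds):
        rank(ARTICLES, weights)       # results shown this round
        weights[clicked_topic] += 1   # the click reinforces the topic
    return rank(ARTICLES, weights)

print([topic for topic, _ in simulate("sports")])
# -> ['sports', 'sports', 'finance', 'politics']
```

After a handful of rounds the favoured topic dominates the top of the results, even though the pool of articles never changed: the junk-food diet is a property of the update rule, not of the content.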
In a broadcast society, the gatekeepers, the editors, were swept out of the way by the Internet. But in the Internet society we see the torch passed from the human gatekeepers to the algorithmic ones. And the algorithms don't yet have the kind of embedded ethics that the gatekeepers of the broadcast society did. So if algorithms are going to curate the world for us, we have to make sure that they don't just key on relevance, but that they also show us things that are challenging, important, uncomfortable and represent other points of view.
We have been here before as a society. In 1915, newspapers were not sweating a lot about their civic responsibilities. But then people realised that the news providers were doing something really important: you couldn't have a well-functioning society if citizens didn't get a good flow of information. Newspapers were critical because they were acting as the filter. So journalistic ethics developed. It wasn't perfect, but it got us through the last century.
And now with the Web we are back in 1915. We need the people who write the code to build these things into the algorithms. The filtering needs to have an inbuilt sense of public life, a code of civic responsibility. The algorithms must be transparent enough for us to see what passes through the filters - and what doesn't. And we, the users, need some kind of control, so we can decide what gets through and what doesn't.
We need the Internet to introduce us to new ideas, to new people, and to different perspectives. And it is not going to do that if it leaves us isolated in a Web of one.
- Eli Pariser, Beware online “filter bubbles”
Take time to read this again.
And consider the implications.
Where is the boundary?
Your content is being filtered by Facebook to give you a more "safe" experience. Safe from whom? On top of this, you have to allow information gathering BOTH on AND outside Facebook. So where does Facebook's surveillance end?
When it comes to trust, there are two central questions: Who are the people behind the software you use - the browser, the search engine, the social media? And how can you trust them?
- Joakim Larsen, VenturePress.dk
Facebook *is* biasing the news — but not in the way you think
Over the last three days we saw a really clear example of the new world of news that Facebook is creating, whether you like it or not.
On Thursday, Mark Zuckerberg defended his decision to show a video of Philando Castile — a black man who was shot by the police at a routine traffic stop in Minnesota — on the Facebook Live video platform. It shows Castile covered in blood, dying next to his distraught girlfriend and her baby daughter. Facebook sometimes takes down videos that are overly graphic. But not in this case, Zuckerberg said:
The images we've seen this week are graphic and heartbreaking, and they shine a light on the fear that millions of members of our community live with every day. While I hope we never have to see another video like Diamond's, it reminds us why coming together to build a more open and connected world is so important -- and how far we still have to go.
Clearly, Zuckerberg believes it is important for everyone to see how the police behave when they draw their weapons on civilians.
The video — along with a stream of others showing white police officers killing unarmed or unresisting black Americans — has animated a widespread movement against police brutality in the US. It spurred a peaceful march in Dallas Thursday night that ended in bloodshed when a gang of maniacs began shooting at police officers, killing five.
It is not true to say that the Castile Facebook Live video caused the march that became a target for the Dallas snipers. But the extra exposure that Facebook gave that video certainly contributed support for the march, and the anger behind it. That's a good thing — people should protest when they see injustice.
The problem for news publishers — and anyone who wants their news served with a reasonable effort to rank stories by importance, not format — is that Facebook's preference for video is biasing the news in favor of stories with dramatic video and against those that do not have moving pictures.
This is not a neutral bias.
There are plenty of incredibly important stories that Facebook is downplaying in your news feed because the algorithm that controls it favours video over text or photos. These are some examples of stories where the available information is largely or exclusively in text-only format:
There is no video of the collapse of the Italian banking system.
There is no video of the collapse of the commercial property investment market in the UK.
There is no video of China's national debt, the largest lump of questionable non-transparent credit ever assembled in human history.
Each of these stories has potentially much further-reaching consequences than a police shooting. After all, the 2007-2008 global financial crisis was triggered by these same issues a decade ago, and they're (maybe) happening again now.
But you can't see them on Facebook Live.
Facebook has paid 140 publishers a total of $50 million to produce a guaranteed number of Facebook Live video posts, according to the Wall Street Journal. (Disclosure: Business Insider is one of the publishers.) In order to speed up adoption of Facebook Live, Facebook has altered its news feed ranking to downplay the reach of text and photo posts, and to extend the reach of native video and Facebook Live events.
That is why your news feed might suddenly feel like it's full of video.
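The effect of a format preference on ranking can be sketched in a few lines. The boost factors below are invented purely for illustration; Facebook's real ranking signals are not public:

```python
# Hypothetical sketch of a format-biased feed ranker: the final score
# multiplies engagement by a per-format boost, so a live video can
# outrank a text post that actually drew more engagement.
FORMAT_BOOST = {"live_video": 3.0, "video": 2.0, "photo": 1.2, "text": 1.0}

def feed_rank(posts):
    # posts: list of (format, engagement_score, title); best first.
    scored = sorted(posts, key=lambda p: -(p[1] * FORMAT_BOOST[p[0]]))
    return [title for _, _, title in scored]

posts = [
    ("text", 100, "Italian bank crisis analysis"),
    ("live_video", 40, "Live stream from a protest"),
]
print(feed_rank(posts))
# -> ['Live stream from a protest', 'Italian bank crisis analysis']
```

With these assumed boosts, the live video scores 40 × 3.0 = 120 and beats the text post's 100 × 1.0 = 100: the bias is applied to the format, before any question of content or importance.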
Nicola Mendelsohn, Facebook's vice president for Europe, the Middle East and Africa, recently said she expected to see an historic "decline" of text as a medium for delivering information:
"We’re seeing a year-on-year decline of text … If I was having a bet I’d say: video, video, video. ... The best way to tell stories in this world— where so much information is coming at us—actually is video. It commands so much information in a much quicker period so actually the trend helps us digest more of the information in a quicker way."
Facebook's daily video views have gone from 1 billion to 8 billion over the course of a year. Text posts, meanwhile, are declining year-on-year, Mendelsohn said.
Katharine Viner, the editor of The Guardian, said she believes the news is becoming distorted by algorithm decisions at companies like Facebook:
"Social media companies have become overwhelmingly powerful in determining what we read and whether publishers make any money," she said. "The idea of challenging the wide-open worldwide web has been replaced by platforms and publishers who maximize the amount of time you spend with them and find clever ways to stop you leaving. That may be great news for advertisers and the platforms themselves, but it’s a real concern for the news industry."
On the one hand, these are merely the complaints of publishers who once enjoyed a firehose of free traffic pouring off Facebook. That firehose has been turned down in recent months as Facebook has shifted its emphasis toward posts from friends — and video. So take the complaints of media moguls with a pinch of salt.
Facebook was recently accused of biasing its "trending" section in favour of liberal news sources, and playing down news of interest to conservatives. That accusation has largely turned out to be false.
The video bias, however, is true. If Facebook continues to keep its thumb on the scale in favor of moving pictures you can expect to see a lot more visceral, live, compelling, and maybe even grueling human dramas in your news feed. There is no doubt that bloody mayhem goes viral more quickly than an analysis of central bank interest rate policy. And some of it — like the Castile shooting — will be incredibly important, because that video tells a story of police violence far more compelling than any description in words ever could.
But just remember: every live video in your feed is pushing down a non-video item simply because that item is not a video. Facebook's pro-video bias is real and deliberate, based on format not content. You won't know what you're missing — which is the worrying part.
- Jim Edwards, Business Insider, Jul. 09, 2016