Apr 14th 2017

What Google and Facebook must do about one of their biggest problems

Google could lose as much as $750 million because of a boycott by advertisers, according to Nomura Research. Companies are protesting the placement of their ads next to extremist and hateful content. An even worse offender is Facebook, which has enabled the propagation of fake news that may have influenced the outcome of the U.S. elections. The two companies have reaped massive profits from the spread of misinformation, yet they have claimed both ignorance of how their technology is misused and an inability to control it.

It wasn’t supposed to be this way. Social media was developed with the promise of spreading democracy, community and freedom, not ignorance, bigotry and hatred. By connecting billions of people and allowing them to share knowledge and ideas, it could have enabled them to achieve equality and justice, to expose what is wrong and to crowd-solve global problems. Instead, it has become a tool that technology companies use to mine data and sell it to marketers, politicians and special-interest groups, who in turn use it to spread disinformation. It has created echo chambers in which people with similar views reinforce their ignorance and biases. And the loss of control over user data now affects not just the economic lives of Americans but also the political messages they receive on platforms such as Facebook.

Part of the problem is that a handful of large technology companies have formed an oligopoly in connectivity and information; they are reaping incredible profits but forsaking the responsibilities that come with the power they have gained. Facebook, for example, has become a media company with more power and influence than The Washington Post and The New York Times. More than 65 percent of its users (44 percent of U.S. adults) get their news through its platform. Yet it claims not to be a publisher, and it disclaims responsibility for what appears on its platform or for how its marketing data are used.

In light of the backlash, Facebook and Google have acknowledged the problem and pledged to do something about it. In a blog post, Facebook chief executive Mark Zuckerberg detailed plans to build a safe, informed, civically engaged and inclusive community that fulfills the benevolent promise of social media. But he says that Facebook can’t possibly review the billions of posts made on its platform every day, and that solving the problem will require artificial intelligence, which is “technically difficult” and will take years of research and development.

We can’t wait years. And it isn’t that this industry is powerless to control the misuse of its platforms; there is insufficient motivation. Go back a few years, for example, to when our mailboxes were being flooded with spam: tech companies created filters, blacklists and many other defenses, and virtually eliminated it. When marketers learned to game Google’s page-ranking system by creating multitudes of websites that linked to one another, Google updated its algorithms to penalize the offenders. When it comes to making money, the tech industry always seems to find a way, and it doesn’t take years.

Trolling is also a common problem on Twitter, where millions of automated bots are available for hire. You can openly purchase fake accounts and fake followers, and have other accounts spread marketing content as well as misinformation and hate. The company has the technology to disable these accounts, but it doesn’t, possibly because doing so would hurt its stock price.

There is no way to turn back technology; what we need is for the owners of these platforms to steer them in a more positive direction. The problems of fake news and the spread of disinformation can be remedied by opening up social networks and vetting news more effectively; by using technology and imagination to solve the problems that technology has created through a lack of imagination.

In his book, “Whose Global Village?,” Ramesh Srinivasan explains that digital technologies are not neutral but socially constructed: created by people within organizations who approach the design process with a particular set of values and presumptions. The platforms that have come to dominate our experience of the Internet, Google and Facebook, are for-profit companies, not democratic institutions. As they become the face of journalism and public information, they must be held accountable for their effects.

Srinivasan points out that invisible algorithms determine the content that social-media networks curate and present to us; they decide what is important. These algorithms take input from the people we associate with on social media, which is what produces echo chambers, but much more happens in secret. What we do know is that they tend to confirm our existing biases, and those of our existing networks. Yet as users we know almost nothing about the choices that go into these personalization algorithms, and we are not given much of an alternative.

Srinivasan argues for a few important choices:

First, we can ask social-media companies to make transparent and comprehensible the filters and choices that go into the most important algorithms shaping interactivity. This does not mean having to publish proprietary software code, but rather giving users an explanation of how the content they view is selected. Facebook can explain whether content is chosen because of location, the number of friends in common or similarity of posts. Google can tell us what factors lead to the results we see in a search and provide a method to change their order.

Second, we must give users the opportunity to choose among different types of information, whether that means news shared by people beyond their social networks or filters on their feeds. Such filters would allow users to determine which parts of the world they’d like to see information from and the range of political opinions they are exposed to.

Third, we can return to a practice that long characterized the Web: open-ended browsing and surfing. Social-media companies can develop tools that allow news credibility to be visualized, enabling users to browse content within and beyond their immediate social networks. Facebook could surface posts from people outside a user’s friend network, or provide tools to browse the networks of others, with their permission. It could even develop interfaces that let users look across posts on a given topic from multiple perspectives, places and cultures.

The bigger issue is that we need to develop political literacy in our educational and social systems. This entails treating no piece of information, whether presented on social media or through a traditional news outlet, as infallible; instead, we must learn to scrutinize a story’s framing, the agenda it serves, and the integrity and transparency of its sources. In other words, as a society, we need to up our own game.
