The Conversation

Written by Joseph B. Walther, Professor of Communication; Director, Center for Information Technology and Society, University of California, Santa Barbara
I'm safe, but you should be more careful online. Rapeepat Pornsipak/Shutterstock.com

A number of prominent figures have called for some sort of regulation of Facebook – including one of the company’s co-founders and a venture capitalist who was one of Facebook’s early backers.

Much of the criticism of Facebook relates to how the company’s algorithms target users with advertising, and the “echo chambers” that show users ideologically slanted content.

Despite the public criticism, the company has posted record profits. And billions of people – including more than two-thirds of American adults – continue to use the unregulated version of Facebook that exists now.

I have been studying the social dynamics of the internet for 30 years, and I suspect something psychological lies behind these apparent contradictions. People know about Facebook’s problems, but each person assumes he or she is largely immune – even while imagining that everyone else is highly susceptible to influence. That paradox helps explain why people keep using the site – which still boasts more than 2 billion monthly active users. And, ironically, it also helps explain the pressure to regulate the social media giant.

It’s not me, it’s them

The psychological tendency at work here is called “the third person effect,” the belief that media don’t fool me, and maybe don’t fool you, but all those other people are sitting ducks for media effects.

Ironically, this dynamic can encourage people to support restrictions on media consumption – by others. When people use, say, a social media site and feel immune to its negative influences, a related psychological phenomenon called the “influence of presumed influence” kicks in: they worry that everyone else falls victim, and they support efforts to protect others, even though they believe they themselves don’t need the protection.

This could be why so many Facebook users complain about the danger Facebook poses to others, yet continue using it themselves.

Even the Facebook-funding venture capitalist Roger McNamee, who wrote a book about how bad Facebook has become, may have fallen prey to this psychological irony. As the Washington Post reports, “despite … his disgust with the worst crimes of social media platforms … McNamee not only still owns Facebook shares … he also still counts himself among the behemoth’s more than 2 billion users. After all, McNamee acknowledges with a shrug and a smile, ‘I’ve got a book to promote.’”

Not everyone can be above average

McNamee may think he’s immune to the echo chambers and other online influences that, he warns, affect the average Facebook user. What if average Facebook users think they’re not the average Facebook user, and therefore also believe that they are immune to Facebook’s pernicious influences?

I explored this possibility in a survey of 515 adults in the U.S. who had used Facebook at least once in the previous week. Participants were recruited by Qualtrics, a company that administered my survey questions, and resided in all 50 states. Their average age was 39, and they reported spending an average of just under 10 hours per week on Facebook – an amount they estimated was similar to that of most other Facebook users.

The survey asked the respondents three groups of questions. One group was about how strongly they believe that Facebook affects them on a number of important social and political topics, including building a wall on the U.S.-Mexico border, expanding or repealing the Affordable Care Act, whether President Trump is doing a good job and other major national issues.

The second group of questions asked how much each respondent believes Facebook affects others’ perceptions of those same issues – how much social media affects their idea of “the average person.”

The third group of questions asked how strongly each respondent supported regulating Facebook, through a variety of possible strategies that include rulings from the Federal Trade Commission or the Federal Communications Commission, breaking up Facebook using anti-trust laws, requiring Facebook to reveal its algorithms and other steps.

Eager to protect others

Respondents believed that Facebook affects other people’s perceptions much more strongly than it affects their own. The more they thought that others were more vulnerable than they were, the more they wanted to rein Facebook in.

A man misled by online information surrenders to police in Washington, D.C., after firing a rifle in a pizzeria. Sathi Soma via AP

People who thought they were far less affected than others, and who wanted Facebook regulated, also believed more strongly that the root of Facebook’s problems lies in the power of echo chambers to repeat, amplify and reinforce a user’s beliefs. That was true even though they, too, would be affected by any regulations.

Echo chambers do exist, and they do affect people’s perceptions – even leading one person to shoot up a pizza parlor alleged to be a front for child prostitution. But research has called into question the idea that echo chambers are extremely influential over most people’s views.

In my view, it’s more important to help people understand that they are just as much at risk from Facebook as everyone else, whatever the level of risk may actually be. Society may bear some responsibility, but so do individual Facebook users. Otherwise they’ll ignore recommendations about their own media consumption, while supporting calls for sweeping regulations that may be too broad and potentially misdirected. Ultimately, people need to save themselves more, and worry a little less about saving everyone else.

Joseph B. Walther receives funding from the US National Science Foundation, the National Institutes of Health, and donors to the UCSB Center for Information Technology and Society.


Read more http://theconversation.com/facebook-doesnt-fool-me-but-i-worry-about-how-it-affects-you-117296