Social Media

Facebook's own research reportedly showed that its algorithms divide people, but top executives killed or weakened proposed solutions.

Internal Facebook research found that the platform encouraged polarization, but Mark Zuckerberg and other senior executives nixed proposals to address the problem, The Wall Street Journal reported.
According to the Journal, an internal Facebook presentation concluded that the company's algorithms "exploit the human brain's attraction to divisiveness."
 
Those proposals were repeatedly rejected by Zuckerberg and Facebook's policy chief, Joel Kaplan, who feared appearing biased against conservatives or simply lost interest in solving the problem, the Journal reported.
Facebook responded in a blog post on Wednesday, saying it has taken steps to fight polarization, such as recalibrating its News Feed and banning harmful content.
Facebook has come under increasing pressure to tackle harmful content and misinformation on its platform during the coronavirus pandemic and in the run-up to the 2020 presidential election.
 
Facebook's own internal research showed that its algorithms encourage polarization and "exploit the human brain's attraction to divisiveness," but top executives, including CEO Mark Zuckerberg, killed or weakened proposed solutions, The Wall Street Journal reported on Tuesday.
 
The effort to better understand Facebook's effect on user behavior was a response to the Cambridge Analytica scandal, and the company's internal researchers found that, contrary to its mission of connecting the world, some of its products were driving users apart.
 
One 2016 report found that "64% of all extremist group joins are due to our recommendation tools," with most people joining through Facebook's "Groups You Should Join" and "Discover" algorithms. "Our recommendation systems grow the problem," the report said, according to the Journal.
 
Facebook teams pitched multiple fixes, the Journal reported, including limiting the spread of information from groups' hyperactive and hyperpartisan users, suggesting a wider variety of groups than users would normally encounter, and creating subgroups for heated debates to keep them from derailing entire groups.
 
But those proposals were often dismissed or watered down by Zuckerberg and Facebook's policy chief, Joel Kaplan, the newspaper reported. Zuckerberg reportedly lost interest in trying to address the polarization problem and was concerned that proposed changes could limit user growth.
 
In response to one pitch to limit the spread of hyperactive users' posts, Zuckerberg agreed to a diluted version and asked the team not to bring a similar proposal to him again, the Journal said.
 
Researchers also found that, because of the larger presence of far-right accounts and pages publishing content on Facebook, any changes, including apolitical tweaks such as suppressing clickbait, would have disproportionately affected conservatives.
 
That prospect worried Kaplan, who had earlier halted a project called "Common Ground" that aimed to encourage healthier political discourse on the platform.
 
Ultimately, many of the efforts were not incorporated into Facebook's products, and managers told employees in September 2018 that the company was pivoting away from societal good toward individual value, the Journal reported.
 
"We've learned a lot since 2016 and are not the same company today," a Facebook spokesperson told Business Insider. "We've built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve."
 
Facebook also pushed back on the Journal's reporting in a blog post published Wednesday by Guy Rosen, the company's vice president of integrity.
 
Rosen said Facebook has taken a number of steps to fight polarization, such as prioritizing content from friends and family in News Feed, not recommending groups that violate its terms, prohibiting hate speech and content that could cause real-world harm, and partnering with fact-checking organizations.
 
Facebook has repeatedly come under fire from critics who say the company is not doing enough to limit the spread of harmful content on its platform. That criticism has intensified as coronavirus-related misinformation runs rampant on social media and the 2020 presidential election approaches.