Facebook looks to improve the quality of posted information

Addressing the issue of misinformation on social media

By Sydney Purpora


The CEO of Facebook, Mark Zuckerberg, published a post addressing the issue of misinformation after numerous complaints following the presidential election were filed against the social media site.

Social media can be a fun, easy and quick way to interact with others anywhere at any time by sharing posts for the world to view, comment on or like. Unfortunately, this freedom has been doing more harm than good.

After the 2016 presidential election results came in, many Americans blamed social media sites for contributing to the spread of misinformation about each candidate. The site that drew the most attention in this controversy was Facebook.

According to an article from The New York Times, the CEO of Facebook, Mark Zuckerberg, wrote a detailed post on Nov. 19 in response to recent complaints concerning misinformation about the presidential election.

“The bottom line is: we take misinformation seriously,” Zuckerberg wrote. “Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information.”

This announcement is a significant step toward improving the quality of information displayed on social media. With so many different stories being posted, shared and commented on every day, it is hard for users to know what is real and what is fake.

According to Facebook statistics cited by Politico, the site generated roughly 4 billion likes, posts, comments and shares related to the 2016 presidential election. Because it is hard to distinguish fact from fiction at first glance, the vast spread of these false stories has been hard to control.

Zuckerberg’s post included a list of plans, already underway, to solve the issue:

- Strong detection: improving classification of misinformation (a toy illustration of such a classifier follows this list);
- Easy reporting: enabling others to report fake stories faster;
- Third-party verification: using respected fact-checking organizations;
- Warnings: improving labeling of false stories;
- Related articles quality: raising the standards for related articles;
- Listening: working with journalists and news organizations to better understand fact checking.
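Facebook has not published how its detection system works, so the following is only a loose sketch of what “improving classification of misinformation” can mean in practice: a minimal text classifier built with Python’s scikit-learn library. The headlines, labels and every detail below are invented illustrations, not Facebook’s actual method.

```python
# Toy sketch only: Facebook has not disclosed its detection system.
# This shows the general shape of "classifying misinformation" with
# scikit-learn; the training headlines and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples (1 = misleading, 0 = credible).
headlines = [
    "Candidate secretly endorsed by shadowy world government",
    "You won't believe what this candidate said next",
    "Senate passes budget resolution after overnight session",
    "Local election officials certify county vote totals",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: a minimal text classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(headlines, labels)

# Score a new headline; a production system would rely on far more
# data and many signals beyond the headline text alone.
score = model.predict_proba(["Shocking secret memo reveals rigged vote"])[0][1]
print(f"Estimated probability of being misleading: {score:.2f}")
```

Even this toy version makes the difficulty plain: with only the text as a signal, a classifier can just as easily flag a sensational-but-true headline as a fabricated one.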

Although these strategies sound effective in theory, they will be difficult to enforce at first, and it will take a while for users to get used to them. Zuckerberg alluded to the complexity of addressing this issue further in his post.

“We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible,” Zuckerberg wrote. “We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content.”

An additional problem contributing to this issue is the nature of the posts themselves. Paul Mihailidis, a professor of media literacy at Emerson College in Boston, said in a Washington Post article that many people sharing links on Facebook do not care whether those links are true.

“I don’t think a lot of people didn’t know; I think they didn’t care,” Mihailidis said. “…The more they could spread rumors, or could advocate for their value system or candidate, that took precedent over them not knowing.”

How can we begin to address the issue if the majority of users who add to it do not even care about how their actions affect others?

The impacts of misinformation are most visible at the user level, but more often than not, those who are writing the truth are affected as well, since their stories get overshadowed by the impostors.

The volume of phony news stories, false information and overdramatized events is so immense that it makes it harder for real journalists to get the truth out to the public. Because not all articles written by professional journalists are “juicy” or “entertaining” by everyone’s standards, important and factual stories often get overlooked.

Not only is Facebook itself taking action in response to the numerous complaints, but others who are passionate about the subject are implementing their own strategies as well.

The Washington Post article identified a group of college students who created a fact-checking and labeling program, making it easier for Facebook users to know which stories are misleading.

According to the article, Nabanita De, an international second-year student at the University of Massachusetts at Amherst, proposed the fact-checking idea to her three teammates at a hackathon hosted by Princeton University this past week, where they set out to develop an algorithm to authenticate what is real and what is fake on Facebook.

The article went on to explain that De and her teammates, Anant Goel, a freshman at Purdue University, and Mark Craft and Qinglin Chen, sophomores at the University of Illinois at Urbana-Champaign, built a Chrome browser extension that tags links and stories on Facebook feeds as verified or not verified by examining the sources’ credibility and cross-checking the content with other news stories.
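The article does not include the students’ code, so the sketch below is only a guess at the shape of such a checker: a small Python function that labels a link “verified” or “not verified” from the two signals the article names, the source’s credibility and corroboration by other outlets. The domain lists and the search_other_outlets helper are invented placeholders, not their actual implementation.

```python
# Rough sketch of the verification approach the article describes:
# score a link by (1) the source's credibility and (2) whether other
# outlets corroborate the story. All names below are hypothetical.
from urllib.parse import urlparse

CREDIBLE_DOMAINS = {"apnews.com", "reuters.com", "nytimes.com"}  # hypothetical list
FLAGGED_DOMAINS = {"totallyrealnews.example"}                    # hypothetical list


def search_other_outlets(headline):
    """Placeholder: a real extension would query a news-search API
    here and return matching stories from independent outlets."""
    return []


def verify(url, headline):
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    if domain in FLAGGED_DOMAINS:
        return "not verified"
    corroborating = search_other_outlets(headline)
    # Treat a story as verified if the source is known-credible or at
    # least two independent outlets report the same story.
    if domain in CREDIBLE_DOMAINS or len(corroborating) >= 2:
        return "verified"
    return "not verified"


print(verify("https://totallyrealnews.example/story", "Shocking claim"))
```

The appeal of this design is that it runs in the reader’s browser and labels stories in place, so users see the tag before they share, rather than after the story has already spread.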

This is just one example of how we can unite as a nation to make sure the truth gets told to the public. We as citizens of the United States have a right to know what is going on in the country, especially when it comes to presidential candidates during elections.

The push to improve the quality of information posted on Facebook is a great change that every social media site should adopt.

Users shouldn’t need to do additional research to prove whether an article shared on social media is fact or fiction. On the other hand, those posting articles should make sure that what they are sharing with the world is genuinely worth supporting.

Let’s work together to inform the country with real facts, one post at a time.