As highlighted by the BuzzFeed article on the dominance of fake news over real news in the last American election, or the Guardian piece on Robert Mercer, social platforms currently seem unwilling to be accountable for what they propagate, since their main incentive is how viral a piece of content can become.
Therefore, one could ask: how do we create an incentive for them to better control the quality of the information they propagate?
Two options seem available:
1) An external incentive: create a legally binding status for all providers of information, forcing them to check the information they propagate.
2) An internal incentive.
Interestingly, I just read an article on BuzzFeed (https://www.buzzfeed.com/craigsilverman/its-all-about-the-data-and-algorithms?utm_term=.qyVAkbR2k#.kh2jLekrL ) about the new tools Facebook is developing to stop the propagation of fake news. Facebook is currently building partnerships with human fact checkers, who are in charge of checking developing stories and deciding whether to label them as “fake news”. It is also adapting its algorithm to stop the propagation of these labelled fake news stories on the website. In the long run, Facebook plans to cut the advertising options of any public page that repeatedly shares fake news stories.
Facebook’s main argument for this curb is that it wants to provide the most relevant content, which implies a certain level of quality (as it has become the biggest news platform in the world, in terms of both users and content shared).