Research has shown that negative information is roughly three times more likely to be clicked on than positive information, so social media algorithms designed to maximize such engagement signals are bound to spread negative partisan content that undermines our government’s ability to craft policies drawing on good ideas from both parties. The 2016 election was one of the ugliest and most partisan we have seen, and 2020 may be uglier still, as each election seems more partisan than the last. Among the causes of this polarization is the “filter bubble” created by ever more personalized social media news feeds that reinforce negative information about the other side. We increasingly surround ourselves with news and information, both in the real world and online, that supports our existing worldview. This fosters what psychologists call “groupthink,” in which members compete to prove their standing in the group through increasingly extreme rhetoric, creating pressure from partisans on both sides to refuse to compromise.
Just as people are pulled apart by stories of partisan fighting, they can be brought together by stories of bipartisan cooperation. There is ample evidence in psychology that “extended contact,” in which people read about fellow in-group members forming cross-group friendships, has a positive impact on inter-group perceptions. In this project, we formally test and quantify the ability of such information, namely that cross-partisan friendship and cooperation is actually fairly common, to mitigate the partisan rancor that social media creates.
Our specific method is to randomly assign audiences to receive either our treatment (stories of people working together across partisan divisions) or our control (their regular newsfeed without our intervention), then use our measures of cross-partisan rancor to test whether negative feelings and stereotypes about the other side are indeed mitigated. Our hope is that if we can show that this intervention works, the finding will lead more groups to consciously create and promote such stories. Our ultimate hope is that the social media companies themselves will adjust their algorithms in light of our research, and we plan to leverage our existing contacts in this world to ensure that they, as well as the wider public, are aware of its outcome.
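The randomized design above can be sketched in a few lines of code. This is a minimal illustration only, with hypothetical function names and toy rancor scores; the actual study would rely on validated survey instruments and appropriate statistical inference rather than a raw difference in means.

```python
import random
import statistics

def assign_conditions(participant_ids, seed=0):
    """Randomly split participants into two equal-sized arms.

    Treatment: newsfeed augmented with cross-partisan cooperation stories.
    Control: the participant's regular newsfeed, unmodified.
    (Illustrative only; a real study would use a pre-registered scheme.)
    """
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"treatment": ids[:half], "control": ids[half:]}

def estimated_effect(treatment_scores, control_scores):
    """Difference in mean rancor scores (treatment minus control).

    A negative value would indicate that the cooperation stories
    reduced cross-partisan rancor relative to the regular newsfeed.
    """
    return statistics.mean(treatment_scores) - statistics.mean(control_scores)

# Toy example: ten participants, hypothetical post-study rancor scores.
arms = assign_conditions(range(10))
effect = estimated_effect([2, 3, 4], [4, 5, 6])  # -2.0: rancor lower under treatment
```

In practice the comparison would use a regression or a t-test with covariates rather than this bare difference, but the core logic, random assignment followed by an outcome comparison across arms, is as shown.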