Abstract

We restrict the propagation of misinformation in a social-media-like environment while preserving the spread of correct information. We model the environment as a random network of users in which each news item propagates in consecutive cascades. Existing studies suggest that the cascade behaviors of misinformation and correct information are affected differently by user polarization and reflexivity. We show that this difference can be exploited to alter network dynamics in a way that selectively hinders the spread of misinformation. To implement these alterations, we introduce an optimization-based probabilistic dropout method that randomly removes connections between users so as to minimize the propagation of misinformation. We use disciplined convex programming to optimize the removal probabilities over a reduced space of possible network alterations. We test the algorithm's effectiveness on simulated social networks, using both synthetic network structures based on stochastic block models and natural network structures generated by randomly sampling a dataset collected from Twitter. The results show that, on average, the algorithm decreases the cascade size of misinformation content by up to $70\%$ in the synthetic network tests and up to $45\%$ in the natural network tests while maintaining a branching ratio of at least $1.5$ for correct information.
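To make the optimization step concrete, the sketch below sets up a dropout problem of this flavor with cvxpy, a disciplined convex programming library. The edge-level transmission probabilities (`p_mis`, `p_true`), the linear branching-ratio proxy, and all numerical values are illustrative assumptions; the abstract does not specify the paper's exact formulation.

```python
# Minimal sketch of an optimization-based probabilistic dropout, assuming a
# simple linear model of expected transmissions per edge. Not the paper's
# actual formulation, which is not given in the abstract.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

n_users = 200   # users in the simulated network
n_edges = 1000  # candidate connections subject to dropout

# Placeholder per-edge transmission probabilities. Per the abstract,
# polarization and reflexivity make misinformation and correct information
# spread differently across edges; here they are just random draws.
p_mis = rng.uniform(0.0, 1.0, n_edges)
p_true = rng.uniform(0.0, 1.0, n_edges)

# keep[e] is the probability that edge e survives dropout; the removal
# probability the method optimizes is 1 - keep[e].
keep = cp.Variable(n_edges)

# Objective: minimize the expected number of misinformation transmissions.
objective = cp.Minimize(p_mis @ keep)

# Constraint: a crude linear proxy for the branching ratio of correct
# information (expected secondary transmissions per user) stays >= 1.5.
constraints = [
    keep >= 0,
    keep <= 1,
    (p_true @ keep) / n_users >= 1.5,
]

problem = cp.Problem(objective, constraints)
problem.solve()

drop_prob = 1.0 - keep.value
print(f"expected misinfo transmissions: {problem.value:.1f}")
print(f"mean dropout probability: {drop_prob.mean():.2f}")
```

Because the objective and every constraint are affine in `keep`, the problem passes cvxpy's DCP rules and solves as a linear program; restricting which edges are eligible for dropout would mirror the reduced space of network alterations mentioned above.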

