Research Workshop: "Do It Yourself Content and the Wisdom of the Crowds"
Dallas Amico-Korby (UCSD)
Friday, May 3rd, 2024 (12:10-2:00 PM), Business Building (3), Room 113
Abstract:
Many social media platforms enable (nearly) anyone to post (nearly) anything. One clear downside of this style of platform is that many people appear to be bad at determining whom to trust online. In this free market of ideas, hacks, quacks, climate change deniers, and election deniers have all gained massive followings, and many of those followers seem to genuinely trust them (as evidenced by belief in vaccine misinformation being negatively correlated with intent to vaccinate (Enders et al., 2020; Loomba et al., 2021; Latkin et al., 2023), and by followers showing up at the U.S. Capitol on January 6th). At the same time, there are many cases in which people seem to reliably determine whom to trust online. Consider, for example, Do It Yourself (DIY) content: content that teaches one how to play guitar, bake, fix one's plumbing, apply cosmetics, or repair one's car. For these topics, those who have the largest accounts and the most popular videos also possess significant expertise. That is, social media users seem to reliably pick out experts in these areas. We thus have a puzzle: why are social media users competent at identifying DIY experts but incompetent at recognizing climate science or vaccine experts?
We offer a solution to this puzzle. Specifically, we claim that social media platforms have enabled a novel wisdom-of-the-crowds phenomenon to emerge: the crowd, in combination with search and recommendation algorithms, reliably picks out DIY experts and thus fulfills one of the main functions of a credentialing institution. However, we also claim that a necessary condition for this wisdom-of-the-crowds phenomenon to get off the ground is that individuals are at least somewhat reliable evaluators of individual pieces of content. In what follows, we argue that this condition fails to hold for many types of non-DIY content, including content about climate science, vaccine safety, and election integrity. Thus, the crowd, in combination with search and recommendation algorithms, is unable to serve as a reliable credentialing institution for these content creators. And this is what explains the puzzle: individuals' evaluations of DIY content are reliable enough that, when aggregated, the crowd reliably picks out experts, but the same does not hold for non-DIY content.
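
To make the role of that necessary condition concrete, here is a minimal, illustrative simulation in the spirit of a Condorcet-style majority-vote model. This is our own sketch, not the speaker's model; the function name and parameter values are hypothetical. It shows why some individual reliability is necessary: if each user's independent judgment of a creator is even slightly better than chance, aggregation drives the crowd verdict toward certainty, while judgment slightly worse than chance drives it toward systematic error.

import random

def crowd_accuracy(p, n_users=301, trials=2000):
    """Fraction of trials in which a majority of n_users judges correctly,
    assuming each user judges correctly with independent probability p."""
    correct = 0
    for _ in range(trials):
        # Count how many of the n_users judge this creator correctly.
        votes = sum(random.random() < p for _ in range(n_users))
        if votes > n_users / 2:  # majority verdict is correct
            correct += 1
    return correct / trials

for p in (0.45, 0.50, 0.55, 0.60):
    print(f"individual accuracy {p:.2f} -> crowd accuracy {crowd_accuracy(p):.3f}")

In typical runs, individual accuracy of 0.55 or 0.60 yields a crowd verdict near 1.0, while 0.45 yields a verdict near 0.0: aggregation amplifies whatever reliability (or unreliability) individuals have, which is why the argument turns on whether users are at least somewhat reliable evaluators of individual pieces of content.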