The Dunning–Kruger effect is a cognitive bias in which people with limited competence in a particular domain overestimate their abilities. – Wikipedia
Above, Bill Maher gives more evidence that he was a casualty of the Hurricane of Bullshit that accompanied the Covid pandemic. In the conversation, he invokes all the standard tropes that we see climate deniers use, only applied to epidemiology. Seth MacFarlane pushes back fairly effectively.
Meanwhile, new research illustrates a key Dunning–Kruger mechanism.
Search algorithms, as we know, serve up a lot of nonsense. Moreover, if your search engine algorithm has profiled you as an idiot, it will dutifully serve your bias with ever more idiotic bullshit.
The personalization method makes it very easy to understand how the filter bubble is created. As certain results are bumped up and viewed more by individuals, other results not favored by them are relegated to obscurity. As this happens on a community-wide level, the community, consciously or not, comes to share a skewed perspective of events.[17] Filter bubbles have become more frequent in search results and are seen as disruptions to the flow of information online, particularly on social media.
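The feedback loop described above can be sketched in a toy simulation. This is a hypothetical illustration, not any real search engine's algorithm: a ranker adds a personalization boost based on how often the user has engaged with a source, the user clicks the top result, and that click increases the boost. The sources ("A" through "E"), the boost size, and the scoring formula are all invented for the sketch.

```python
import random

def rank(items, affinity):
    """Order items by base relevance plus a personalization boost
    proportional to the user's past engagement with each source."""
    return sorted(items,
                  key=lambda it: it["relevance"] + affinity[it["source"]],
                  reverse=True)

def simulate(rounds=50, seed=0):
    """Simulate repeated searches: each round the user clicks the top
    result, and that engagement feeds back into the ranking."""
    random.seed(seed)
    items = [{"source": s, "relevance": random.random()}
             for s in "ABCDE" for _ in range(3)]
    affinity = {s: 0.0 for s in "ABCDE"}
    for _ in range(rounds):
        top = rank(items, affinity)[0]   # user views the highest-ranked item
        affinity[top["source"]] += 0.1   # engagement boosts that source
    return affinity
```

Because the initially top-ranked source gains a boost each round, it only pulls further ahead: after a few iterations all engagement concentrates on one source while the rest are relegated to obscurity, which is the bubble-forming dynamic in miniature.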
Conventional wisdom suggests that searching online to evaluate the veracity of misinformation would reduce belief in it. But a new study by a team of researchers shows the opposite occurs: Searching to evaluate the truthfulness of false news articles actually increases the probability of believing misinformation.
The findings, which appear in the journal Nature, offer insights into the impact of search engines’ output on their users — a relatively under-studied area.
“Our study shows that the act of searching online to evaluate news increases belief in highly popular misinformation — and by notable amounts,” says Zeve Sanderson, founding executive director of New York University’s Center for Social Media and Politics (CSMaP) and one of the paper’s authors.
This outcome may be explained by the quality of search-engine outputs: in the study, the researchers found that the phenomenon is concentrated among individuals for whom search engines return lower-quality information.
“This points to the danger that ‘data voids’ — areas of the information ecosystem that are dominated by low quality, or even outright false, news and information — may be playing a consequential role in the online search process, leading to low return of credible information or, more alarming, the appearance of non-credible information at the top of search results,” observes lead author Kevin Aslett, an assistant professor at the University of Central Florida and a faculty research affiliate at CSMaP.
In the newly published Nature study, Aslett, Sanderson, and their colleagues studied the impact of using online search engines to evaluate false or misleading views — an approach encouraged by technology companies and government agencies, among others.
To do so, they recruited participants through both Qualtrics and Amazon's Mechanical Turk — tools frequently used in running behavioral science studies — for a series of five experiments aimed at gauging the impact of a common behavior: searching online to evaluate news (SOTEN).
