Echo Chambers Unveiled: The Impact Of Content Recommendation Algorithms On Ideological Pipelines
Abstract
Personalized content recommendation algorithms are sophisticated systems employed across digital platforms to optimize user experiences by delivering content tailored to individual preferences. Leveraging machine learning and artificial intelligence, these algorithms analyse a user's historical interactions, preferences, and behaviours to predict and suggest content that aligns with their interests. Through a combination of collaborative filtering, content-based filtering, and, in some cases, deep learning techniques, these algorithms strive to offer users a curated selection of articles, videos, products, or other digital content. The goal is to enhance user engagement and satisfaction by presenting relevant and appealing information, ultimately creating a more personalized and enjoyable online experience.
However, the potential societal impacts, such as the formation of echo chambers and ideological reinforcement, also merit careful consideration in the ongoing development and deployment of these recommendation systems. The constant influx of information from social media strongly affects our way of thinking and shapes our opinions on various topics. AI algorithms often present information based on our previous interactions and preferences, filtering out a large amount of other information, limiting exposure to diverse perspectives, and reinforcing pre-existing beliefs. This project aims to investigate whether the use of AI algorithms in content recommendation and dissemination contributes to the process of radicalization.
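The collaborative-filtering mechanism mentioned above can be illustrated with a minimal, self-contained sketch. All data, user names, and function names here are hypothetical, and real systems operate at far larger scale; the sketch only shows the core idea of recommending items a user has not yet seen, weighted by the engagement of similar users.

```python
from math import sqrt

# Hypothetical user-item interaction matrix: 1 = engaged, 0 = no interaction.
interactions = {
    "alice": {"video_a": 1, "video_b": 1, "video_c": 0},
    "bob":   {"video_a": 1, "video_b": 0, "video_c": 1},
    "carol": {"video_a": 0, "video_b": 1, "video_c": 0},
}

def cosine(u, v):
    """Cosine similarity between two users' interaction vectors."""
    items = set(u) | set(v)
    dot = sum(u.get(i, 0) * v.get(i, 0) for i in items)
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def recommend(target, interactions, k=1):
    """Score items the target has not engaged with, using
    similarity-weighted engagement of the other users."""
    scores = {}
    for other, vec in interactions.items():
        if other == target:
            continue
        sim = cosine(interactions[target], vec)
        for item, val in vec.items():
            if interactions[target].get(item, 0) == 0 and val:
                scores[item] = scores.get(item, 0.0) + sim * val
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice", interactions))  # prints ['video_c']
```

Note how the sketch already exhibits the feedback loop discussed in the abstract: each recommendation is derived from past engagement, so content unlike anything the user has interacted with is systematically scored lower.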
References
2. “Far-Right Trends in South Eastern Europe: The Influences of Russia, Croatia, Serbia and Albania” by Arlinda Rrustemi
3. “TikTok censorship” by Fergus Ryan, Audrey Fritz and Daria Impiombato
4. "Artificial Intelligence, Deepfakes, and Disinformation” written by Todd C.
5. “AI, Society, and Governance: An Introduction” by Peter Engelke
6. “How Internet Users Engage with Extremism Online” by Alexandra T. Evans and Heather J. Williams.
7. Internet Policy Review. https://policyreview.info/articles/analysis/recommender-systems-and-amplification-extremist-content
Agresti, A. (2013). Categorical Data Analysis. John Wiley & Sons.
8. Ayres, I., & Braithwaite, J. (1992). Responsive regulation: Transcending the deregulation debate. Oxford University Press.
10. Azeez, W. (2019, May 15). YouTube: We’ve Learnt Lessons from Christchurch Massacre Video. Yahoo Finance UK. https://uk.finance.yahoo.com/news/you-tube-weve-learnt-lessons-from-christchurch-massacre-video-163653027.html
11. Bakshy, E., Messing, S., & Adamic, L. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132. https://doi.org/10.1126/science.aaa1160
12. Baugut, P., & Neumann, K. (2020). Online propaganda use during Islamist radicalization. Information, Communication & Society, 23(11), 1570–1592. https://doi.org/10.1080/1369118X.2019.1594333