<p><em>By J Jehoson Jiresh</em></p><p>A casual late-night online search for running shoes can transform our digital landscape. Our social media feeds are soon filled with sportswear commercials, fitness influencers, and other fitness-related content. The search has confined us to a digital fitness bubble. It makes us wonder whether the algorithms understand who we are better than we do – an understanding that, perhaps, does not come easily even to family or friends.</p><p>Our digital lives unfold within a curated landscape shaped by unseen orchestrators known as algorithms. We inhabit an age in which algorithms curate the information we consume and shape our worldview. As we rely more on Artificial Intelligence to personalise our digital experiences, these algorithms have become channels for bias, constructing echo chambers in which we encounter only opinions that reinforce our existing ideologies and beliefs. The more crucial concern is that they shape our perceptions, spread misinformation, impede intellectual growth, propel impulsive consumerism, and foster prejudices that lead to hatred and bigotry. This is the ‘Algorithmic Biascope’, and it substantially threatens the pursuit of intellectual autonomy and understanding in the digital age.</p><p>The Covid-19 pandemic is a stark reminder of the dangers lurking within these algorithmic filters. Misinformation, exploiting pre-existing biases, spreads like wildfire through social media platforms fuelled by engagement-driven algorithms. The rapid spread of misinformation during the pandemic brought prominence to the term ‘infodemic’, popularised by the World Health Organisation (WHO): a surge of information about the pandemic that spread faster than the virus itself.
It is even more dangerous when the infodemic transforms into a ‘disinfodemic’, spreading ‘communal viruses’ in the process and leading to hatred, bigotry, and violence. News that appeals to emotional affiliations can polarise people. Navigating this post-truth world, dominated by AI and information overload, necessitates a shift in how we interact with information. We must transition from passive consumers to discerning detectives equipped with the skills to process information critically. Instead of succumbing to information anxiety, we should reclaim agency and dismantle this Algorithmic Biascope through several key strategies:</p><p>The first involves active critical engagement. There is an urgent need to cultivate healthy scepticism, question assumptions, and challenge biased narratives propagated in memes, news, and articles. This requires dissecting content beyond its surface appeal and paying close attention to emotionally charged messages that often exploit pre-existing biases. In this pursuit, it is crucial to recognise that suspicion is an investment the critical reader makes to reap the dividends of multilayered truths and an understanding of how they are constructed.</p><p>The second strategy is to seek diverse perspectives. Powered by algorithms, echo chambers isolate us within confirmation-bias bubbles. To break free, we must actively seek out perspectives that challenge our assumptions. This involves examining different viewpoints and sources and analysing the logical arguments presented. Beyond mere consumption, critical engagement demands a detective-like approach to information: like detectives gathering evidence, we must assess information from various angles, scrutinising the sources, evidence, and logical arguments presented in any content.
The key questions are: who benefits from the narrative, what agenda might it serve, and does it align with the facts? This active information processing empowers individuals to discern truth from manipulation and navigate the imbricate milieu of algorithmic bias. To ensure adequate information processing, it is pertinent to democratise access to digital and media literacy.</p><p><strong>Transparency is non-negotiable</strong></p><p>The next step in this deconstructive journey is to demand transparency. Algorithms often operate as black boxes, their inner workings shrouded in secrecy. To combat bias effectively, we must demand transparency and accountability, advocating for platforms to disclose how their algorithms operate, the data they use, and how they influence information curation. Such transparency empowers users to understand the forces shaping their digital experience and to hold platforms accountable for biased practices. This mission of demanding transparency and accountability is a vital cog in combating misinformation. In today’s information war, the demand for transparency should be paired with developing fact-checking skills: learning to trace information to its sources, weigh evidence objectively, and identify and counter misinformation and disinformation. We need to cultivate resources such as fact-checking websites and develop the ability to critically evaluate the credibility and purpose of information before sharing or acting upon it. Assessing the purpose of a message is as crucial as verifying its authenticity, because even an authentic event can be propagated with an agenda to spread hatred and disharmony. For this reason, it is also essential to address the societal vulnerabilities underlying the urge to consume and disseminate information.</p><p>In addition, promoting ethical AI development is the need of the hour. While algorithms present challenges, they also hold potential for solutions.
It is important to establish systems that develop AI models for truth mapping, which would visually represent complex arguments and evidence, fostering informed discourse and combating information silos. We must also advocate for participatory algorithm-design processes that centre human values and ethical considerations. When the boundaries between humans and technology are increasingly blurred, mitigating the biases embedded in both our human and non-human systems is apposite. Countering algorithmic bias in the service of humanity’s intellectual empowerment can thus draw on techno-realistic solutions. This search for multilayered narratives and their constructs requires individual responsibility, a commitment to diverse perspectives, and a relentless pursuit of factual understanding. Only then can the reels of the Algorithmic Biascope be deconstructed to ensure the triumph of critical inquiry – and of narratives of love, peace, social justice, democratic empowerment, and harmony – over manipulation.</p><p><em>(The writer is an Assistant Professor in the Department of English and Cultural Studies, Christ Deemed to be University)</em></p>