<p>A Google search for ‘the ugliest language in India’ yielded ‘Kannada’ as the answer late last week, causing widespread outrage.</p>
<p>Google has since apologised, saying the answer does not reflect its views, but questions remain about why this happened at all, and who drafted the answer.</p>
<p>“When artificial intelligence gets it wrong, things can go really wrong,” says tech entrepreneur Hari Prasad Nadig, who has worked on Kannada in free and open source software.</p>
<p>“Usually, you would expect Google to give an answer based on citations from multiple sources, and at least one or two credible sources. Google’s AI should be good enough not to draw answers from opinionated sources,” he says. Google shouldn’t even try to answer prejudiced questions like this in the first place, and the answer shows how flawed it is, he told <span class="italic">Metrolife</span>.</p>
<p><strong><span class="bold">Fallible process</span></strong></p>
<p>Pranesh Prakash of the Centre for Internet and Society, Bengaluru, says the incident exposes the fallibility of the process by which Google selects its “featured snippets”.</p>
<p>“It is not an opinion that Google or its employees or its algorithms have come up with, but rather an existing opinion that Google wrongly amplified,” he says.</p>
<p>It demonstrates that the snippets Google features as ‘facts’ aren’t necessarily based on facts, he says.</p>
<p><strong><span class="bold">Periodic checks</span></strong></p>
<p>Shweta Mohandas, researcher with the Centre for Internet and Society, says Google does not create content, but only surfaces content that is already available on the Internet.</p>
<p>“Hence, the biases come from the tags, which are then used to train the AI. There should be periodic checks on the data fed into the system,” she says.</p>
<p>Such blunders can be prevented if the tags and results are audited periodically, and a mechanism is put in place for people to report them, she says.</p>
<p><strong>Who was up to mischief?</strong></p>
<p>The answer was created on a financial services website whose owners aren’t revealing their names.</p>
<p>Pavanaja UB, CEO, Vishva Kannada Softech, says the answer was attributed to a website called debtconsolidationsquestions.com, but he was unable to find the post anywhere on the site.</p>
<p>“This is a website registered in Russia and it offers questions and answers on many topics. But this particular page could not be found. Maybe it was removed following the outrage,” he says.</p>
<p>Pavanaja believes this was a deliberate attempt to upset people. “The website lists no information about the owner and gives no contact details. Even if such a question did exist on the page before, how did it get to the top of the Google search results?” he wonders.</p>
<p>He suggests that someone planted the answer and kept searching for it until it reached the top. “But who would take so much effort?” he says.</p>
<p><strong>Furore and after</strong></p>
<p>‘Kannada’ came up as the answer to a Google query about ‘the ugliest language in India’.</p>
<p>Aravind Limbavali, minister for Kannada and Culture, demanded an apology from Google and threatened legal action against the company “for maligning the image of our beautiful language.”</p>
<p>Google removed the answer and issued a statement:</p>
<p>“We know this is not ideal, but we take swift corrective action when we are made aware of an issue and are continually working to improve our algorithms.
Naturally, these are not reflective of the opinions of Google, and we apologise for the misunderstanding and hurting any sentiments.”</p>