
As time passes, AI (artificial intelligence) is becoming more intelligent and responsive. From Apple Inc. (NASDAQ:AAPL)'s Siri to Microsoft Corporation (NASDAQ:MSFT)'s Cortana, there are many examples of how AI has become a part of day-to-day life.

Although the evolution of Siri, Cortana, and other such AI programs reflects the technology's progress, the challenges facing the companies developing them don't seem to end. In one such case, reports claim that Microsoft's Cortana has been sexually harassed many times since its inception. Though sexual harassment of an AI program may sound strange, Microsoft wants to take strong action against such behavior.

Insights into the Matter

According to Deborah Harrison, an editorial writer on Microsoft's Cortana team, as AI systems become more and more humanized, harassment of virtual assistants has turned into a disturbing fact of daily life. It has become tough for female-voiced virtual assistants to escape the disrespect of ill-intentioned users.

Discussing the core reason behind this rising issue, Harrison says the female voices of AI systems play a crucial role. From Apple's Siri to Microsoft's Cortana and Amazon.com, Inc. (NASDAQ:AMZN)'s Alexa, most AI virtual assistants present as female, which prompts some users to ask questions that would be unacceptable in everyday conversation. People have grown so comfortable with AI systems, treating them as part of their lives, that such inappropriate questions have become common.

Cortana was launched in 2014, and since then it has fielded many questions about its sex life. In Harrison's words, the AI team at Microsoft is working hard to handle this issue. Microsoft wants users to see Cortana as a 21st-century woman who doesn't take any crap.

Harrison and seven other writers have been tasked with coming up with appropriate responses to help Cortana handle such questions. Going forward, Microsoft will continue making regular changes to curb such user behavior efficiently.