02/03/2021 / By Virgilio Marin
Banks can now approve or deny loans by screening an applicant's face and voice with a newly developed artificial intelligence (AI) system designed to gauge trustworthiness.
Japanese tech company DeepScore recently unveiled its facial and voice recognition app, also named DeepScore, at the Consumer Electronics Show trade event in Las Vegas. The AI-enabled app analyzes muscular twitches in the face and changes in the voice to calculate a "True Score" while the customer answers a 10-item questionnaire, delivering its assessment in about a minute.
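DeepScore has not published how its model works, but the pipeline it describes, per-question facial and vocal measurements combined into a single score, can be sketched roughly in Python. Every name, feature and weight below is an invented illustration of that general idea, not the company's actual code or feature set.

# Hypothetical sketch of the kind of pipeline DeepScore describes:
# per-question facial and vocal readings combined into one score.
# All features and weights here are invented for illustration.

from dataclasses import dataclass
from typing import List

@dataclass
class QuestionReading:
    facial_twitch_rate: float    # e.g. micro-movements per second (assumed feature)
    voice_pitch_variance: float  # e.g. deviation from the speaker's baseline (assumed)

def question_stress_score(r: QuestionReading) -> float:
    """Toy linear model: more twitching and pitch variance -> more apparent 'stress'.
    The 0.6/0.4 weights are invented for illustration."""
    return 0.6 * r.facial_twitch_rate + 0.4 * r.voice_pitch_variance

def true_score(readings: List[QuestionReading]) -> float:
    """Average the per-question scores into a single 0-100 'True Score',
    where higher means less apparent stress (assumed scale)."""
    avg_stress = sum(question_stress_score(r) for r in readings) / len(readings)
    return max(0.0, 100.0 - 100.0 * avg_stress)

# Example: readings (normalized to 0-1) for the 10-item questionnaire
readings = [QuestionReading(0.1, 0.2) for _ in range(10)]
print(f"True Score: {true_score(readings):.1f}")  # -> True Score: 86.0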
The company touts DeepScore as the “next-generation scoring engine” for financial institutions. But tech experts are concerned that the app might be inaccurate and discriminatory against people with tics or anxiety. Others also raised privacy concerns.
DeepScore CEO Shirabe Ogino said that the app is based on more than 200 studies about the link between dishonesty, micro-movements and stress.
“When you tell a lie, you feel stress, your eye or mouth moves and your voice will skew,” Ogino said in a statement.
The firm claims that the app can detect lies with 70 percent accuracy and a 30 percent false-negative rate. If the app concludes that a customer lied in answering any of the questions, it alerts the company, which can then raise fees or conduct additional checks.
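Taken at face value, those two figures can be combined in a short back-of-the-envelope calculation. Assuming "accuracy" means overall accuracy, "false-negative rate" means the share of actual liars the app misses, and, as an invented figure, that 10 percent of applicants actually lie, the arithmetic implies that most customers flagged as liars would in fact be telling the truth:

# Back-of-the-envelope check on the quoted figures. Assumptions (not from
# the article): "accuracy" is overall accuracy, "false-negative rate" is
# FN / (FN + TP), and 10% of applicants actually lie (invented prevalence).

def flagged_lie_precision(prevalence: float,
                          accuracy: float = 0.70,
                          false_negative_rate: float = 0.30) -> float:
    sensitivity = 1.0 - false_negative_rate  # P(flagged | lying) = 0.70
    # Overall accuracy = sensitivity*p + specificity*(1-p); solve for specificity.
    specificity = (accuracy - sensitivity * prevalence) / (1.0 - prevalence)
    true_pos = sensitivity * prevalence                 # liars correctly flagged
    false_pos = (1.0 - specificity) * (1.0 - prevalence)  # truth-tellers wrongly flagged
    return true_pos / (true_pos + false_pos)            # P(lying | flagged)

print(f"{flagged_lie_precision(0.10):.0%}")  # -> 21%

Under those assumptions, only about one in five flagged customers would actually be lying; roughly four out of five would be telling the truth, the kind of base-rate problem that underlies the reservations below.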
But AI experts have reservations about the new product. Rumman Chowdhury, founder of algorithmic bias auditing platform Parity AI, thinks that the app may be biased against people with tics and anxiety. She said that while there are general behavior patterns when lying, these are not always true for everyone.
“I might touch my nose because it’s a nervous tic that I’ve developed. It doesn’t mean that I’m lying,” Chowdhury told Motherboard. She added that many of the studies cited by the firm addressed the correlation of facial movements with stress, not lying per se.
Amos Toh, an AI researcher for Human Rights Watch, raised similar concerns.
“The serious concern I have about this kind of technology is that there is simply no reliable science to indicate that people’s facial expressions or the inflections of their voice are proxies for their internal mental and emotional states,” Toh said.
Data security experts are also concerned about the privacy threats posed by the app. Ioannis Kouvakas, a legal officer for the U.K.-based Privacy International, said that use of the app would likely be unlawful in the European Union (EU) under the General Data Protection Regulation (GDPR).
Under the GDPR, biometric data such as facial images are considered "sensitive." Processing biometric data to identify a person is prohibited unless that person has given explicit consent.
But some countries, including Indonesia and Vietnam where DeepScore has active customers, lack comprehensive data protection laws. In the United States, facial recognition laws vary by state: Illinois and Washington have strict regulations in place, but many others do not. (Related: Surveillance state: 1 in 2 American adults is already in the FBI's facial recognition database.)
Ogino said that prospective borrowers can opt out of the service or find another financial institution to work with. But making those choices is often complicated by an "unfair balance of power."
“It’s very easy to claim that you rely on consent, but there is a very unfair balance of power,” Kouvakas said. “It’s really hard to say no to the person deciding whether you’re getting [your money] or not next month.”
Learn more about the privacy threats posed by the tech industry at PrivacyWatch.news.