Security

With the promise of accurately tracking mental states, emotional artificial intelligence (AI) and affective computing technologies have been heavily developed and commercialized by security companies around the world. One worrying example of emotional AI’s security applications, one that encapsulates the concerns over autonomy, privacy, and misuse, is ELSYS.

ELSYS is a Russian company that markets the security software VibraImage, which it claims can detect various emotional states by reading the micro-movements of a human head: “aggression, tension, balance, energy, inhibition, stress, suspiciousness, charm, self-regulation, neuroticism, extroversion, and stability, categorizing these automatically into positive and negative ‘emotions’”. While the actual ‘science’ behind the software is suspect, ELSYS’s commercial and governmental clients span the globe. VibraImage was deployed at the 2014 Sochi Olympics and the 2018 FIFA World Cup in Russia, and in nuclear stations in Japan and Russia. ELSYS’s legitimacy is bolstered by the fact that its products have often been sold through NEC, a leading Japanese electronics and security company.

In China, citizens are asked to install spyware apps on their phones, allowing the government and the police to easily mine their non-conscious body data and use AI algorithms to identify “suspicious persons”, or, more specifically, members of the Muslim-Turkic minorities currently residing in Western China. The algorithms are designed to flag “Muslim” or minority behaviors and to associate such behavior with suspicious acts. This is yet another example of the unmitigated power available to emotional AI, which can take advantage of an almost limitless amount of metadata for the purposes of racially contingent predictive policing.