A hot potato: The United Kingdom's independent privacy authority doesn't want companies or organizations using emotion analysis systems based on biometric traits. It's a nascent, untested technology that may never work reliably at all.

The UK's Information Commissioner's Office (ICO) recently issued a stark warning to companies and organizations looking to deploy AI-based emotion analysis technologies. These systems don't appear to work yet and may never function accurately. Deputy Commissioner Stephen Bonner said machine learning algorithms that identify and distinguish people's moods are "immature," and that the risks posed by this kind of technology outweigh the possible benefits.

"Emotional" AI is a concern for the ICO because there is currently no system developed in a way that satisfies data protection requirements, fairness, and transparency. Bonner suggested that the only sustainable biometric recognition technologies are those that are "fully functional, accountable, and backed by science," and emotion analysis algorithms are none of that.

Emotion analysis technologies process data such as gaze tracking, sentiment analysis, facial movements, gait analysis, heartbeats, facial expressions, and skin moisture. Such data can be used to monitor the physical health of workers, watch students during exams, and more. The ICO warned that an AI system designed to identify moods could exhibit systemic bias, inaccuracy, and even discrimination against particular traits and facial features.

Emotion analysis AI is usually paired with complex biometric systems, as it needs to manage a vast amount of personal information in addition to the facial images themselves. Beyond questions about the algorithms' usefulness, there is a further concern about how these systems record and store data: according to the ICO, collecting "subconscious behavioural or emotional responses" is far riskier than using traditional biometric technologies.

Since officials cannot yet trust emotion analysis technologies, the ICO warned that organizations deploying them "[pose] risks to vulnerable people" and will face investigation. The office advises companies to wait at least another year before deploying commercial emotion analysis systems.

In the meantime, the privacy watchdog is working on comprehensive "Biometric Guidance" covering how biometric data, including facial traits, fingerprints, and voice samples, should be handled in a proper, non-discriminatory way. The ICO expects to publish the guidance by spring 2023.