ARTICLE
28 October 2022

Warning About Use Of Emotional Analysis Technology

Preiskel & Co

Contributor

Preiskel & Co LLP is an English law firm independently recognised as a leader in the telecommunications, media and technology sectors. The firm's team of lawyers is truly international, and many are qualified in multiple jurisdictions. This international mindset has proved of considerable advantage to many clients, as the firm advises on matters in England and also coordinates advice across Europe and other continents. The firm also advises on issues concerning outer space and the virtual world.
UK Technology

On 7 October 2022, the Information Commissioner's Office (ICO) posted a news alert regarding the risks of using biometric technologies. The alert warns that the ICO views the use of certain emotional analysis technologies as riskier than traditional biometric technologies, because it does not consider the current algorithms sufficiently developed to detect emotional cues accurately. The regulator sees this as creating a risk of systemic bias, inaccuracy and even discrimination.

The emotional analysis technologies currently being flagged by the ICO are those that collect, store and process a range of personal data, including subconscious behavioural or emotional responses and, in some cases, special category data. Deputy Commissioner Stephen Bonner has been quoted as saying: [...] "While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination."

This warning sits alongside the heightened risk and data security requirements that already apply to biometric data, which is successfully used by various industries; for example, the financial sector uses facial recognition to verify identities by comparing a photo ID with a selfie. Biometric technologies are technologies that process biological or behavioural characteristics for the purposes of identification, verification, categorisation or profiling.

Biometric Guidance to be published in spring 2023

To assist companies in understanding what the ICO regards as fair use of biometric data and the associated compliance requirements, the ICO is expected to publish Biometric Guidance in spring 2023. There is a strong emphasis on the fact that biometric data is unique to an individual and is difficult or impossible to change should it ever be lost, stolen or inappropriately used. This uniqueness places a high threshold on companies using such information to embed a 'privacy by design' approach, thereby reducing the risk factors.

See the ICO's news alert for more detail.

We will continue to monitor these developments, along with any more specific recommendations and guidance released by the ICO.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
