July 26, 2024
New research warns about the risks of facial recognition and audits Zurich’s Azul regarding Down Syndrome individuals

Gender bias that predicts women’s ages as radically lower, along with bias in BMI prediction: this research puts numbers to CSOs’ concerns about facial recognition bias.

SPAIN, October 26, 2023 /EINPresswire.com/ — Conducted over a year, the comprehensive audit evaluated the intersection of facial recognition technology and disability. By employing a multi-faceted approach encompassing qualitative interviews and extensive experimental testing, the audit exposes glaring disparities and biases in age prediction, gender classification, and emotion recognition affecting disabled individuals.

For consumers, but without the consumers

Facial recognition technologies jeopardize consumers’ right to equal access to services, and they do so in the name of consumers’ benefit.

The report emphasizes the need to involve consumer protection authorities, especially on behalf of vulnerable groups such as individuals with Down Syndrome, whose interests, according to its findings, have been compromised.

Key Findings: Age Prediction Discrepancies and Gender Bias

Age Prediction Disparities: The audit uncovers a concerning 7.19% error rate in age prediction for individuals with Down Syndrome, compared to a 4.45% error rate for participants without Down Syndrome. These disparities underscore the pressing need for accurate and unbiased age prediction models for disabled individuals.

Gender Bias: The audit’s in-depth analysis reveals a stark gender divide in age estimation. Women consistently faced age underestimation, while men experienced overestimation, perpetuating systemic inequalities.
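The report itself does not publish its computation code, but the kind of group-wise comparison behind these two findings is straightforward to illustrate. The Python sketch below is a minimal, hypothetical example (all column names and values are invented, not taken from the audit): it computes the mean relative error per group, in the style of the 7.19% vs. 4.45% figures, and the mean signed error per gender, where a negative mean indicates systematic underestimation.

import pandas as pd

# Hypothetical audit records: true and predicted ages plus group labels.
# All column names and values are illustrative, not from the Eticas audit.
df = pd.DataFrame({
    "true_age":      [25, 31, 42, 29, 38, 47],
    "predicted_age": [22, 34, 38, 26, 41, 43],
    "down_syndrome": [True, True, True, False, False, False],
    "gender":        ["F", "M", "F", "F", "M", "M"],
})

# Mean absolute error as a percentage of true age, per group
# (the style of comparison behind the 7.19% vs. 4.45% finding).
rel_err = (df["predicted_age"] - df["true_age"]).abs() / df["true_age"] * 100
print(rel_err.groupby(df["down_syndrome"]).mean())

# Mean signed error per gender: negative = underestimation (reported for
# women), positive = overestimation (reported for men).
signed_err = df["predicted_age"] - df["true_age"]
print(signed_err.groupby(df["gender"]).mean())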

BMI Prediction Challenges

While Azul’s algorithm demonstrated moderate accuracy in predicting Body Mass Index (BMI) for individuals with Down Syndrome, it exhibited a tendency to overestimate BMI values, particularly in the case of women.

Further analysis revealed a stark gender disparity within Azul’s BMI predictions. Predicted BMIs for women consistently surpassed their actual values, while the algorithm’s projections for men varied widely. This disparity, combined with the technology’s unpredictable BMI forecasts for men, exposes a distressing gender bias within the algorithm.
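Again as an illustration only (the audit’s actual methodology is not published as code), a similar sketch can separate the two patterns described here: a systematic positive bias for women and a wide spread of errors for men. All values below are invented.

import pandas as pd

# Hypothetical BMI audit records; all numbers are invented for illustration.
df = pd.DataFrame({
    "true_bmi":      [22.0, 27.5, 24.3, 30.1, 21.8, 26.4],
    "predicted_bmi": [24.5, 29.8, 26.0, 29.0, 25.1, 23.2],
    "gender":        ["F", "F", "F", "M", "M", "M"],
})

err = df["predicted_bmi"] - df["true_bmi"]

# Mean signed error per gender: a consistently positive value for women
# would match the overestimation pattern the audit describes.
print(err.groupby(df["gender"]).mean())

# Standard deviation of errors per gender: a large value for men would
# match the "varied widely" pattern the audit describes.
print(err.groupby(df["gender"]).std())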

Recommendations

The report’s key recommendations emphasize the urgent need for a comprehensive re-evaluation of facial recognition technology’s suitability and fairness. It calls for the adoption of transparent bias mitigation strategies, universal design principles for accessibility, and collaboration with disability organizations to prioritize ethical AI development. Regular third-party audits are proposed to ensure accountability and transparency in AI systems.

About Eticas

Eticas is a pioneering organization that has carried out adversarial audits of systems used by YouTube, TikTok, Uber, and the Spanish Government, examining their impact on radicalization, the portrayal of and discrimination against migrants, workers’ rights, and the protection of victims of gender violence.

The world’s first algorithmic auditing company, Eticas has worked with major players in both the public and private sectors. International institutions such as the UN, the European Commission, the OECD, and the Inter-American Development Bank have already trusted Eticas’s audits and its oversight of the ethical and responsible use of their technology.

Founded by Dr. Gemma Galdon in 2012, Eticas has been concerned since its inception with protecting people and their rights in technological processes, from video surveillance and customs controls to the gender and age gap in digital environments. In 2021 it made its algorithmic audit methodology public as part of its mission to build better technology for a better world, consolidating its position as a world reference in algorithmic auditing.

Furthermore, through the Eticas Foundation, it takes part in and leads initiatives that raise awareness of the need to monitor and demand transparency in the use of algorithms and automated decision-making systems, such as the Observatory of Algorithms with Social Impact (OASI), a registry of algorithms that affect the lives of citizens around the world.

For further information: Sandra Montesinos | [email protected]



Article originally published on www.einpresswire.com as New research warns about the risks of facial recognition and audits Zurich’s Azul regarding Down Syndrome individuals
