Debias Data Solution

Remove bias from your data

Data-driven

illumr’s ROSA is a cutting-edge AI tool that removes bias from datasets.

It improves fairness in decision making whilst helping organisations comply with global regulations, such as the GDPR’s anti-discrimination provisions for automated decision-making systems and EU regulations on gender-neutral insurance pricing.

ROSA gives organisations the ability not only to explain the bias-removal process, but also to demonstrate to auditors, via a distributed ledger, how and when the de-biasing took place.


I want unbiased and objective data

Get your free trial

How Rosa works

Our unique, scalable solution fits into an organisation’s existing data analytics pipeline as a one-line API call and is fit-and-forget.

This is unlike other solutions, which are typically fitted after the AI algorithm has been trained.

In addition, our solution is fully automated and requires no direct intervention from a trained data scientist to manually alter the algorithm.
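
To illustrate, here is a minimal sketch of what a one-line, pre-training integration point might look like in Python. The endpoint, payload shape, file name and column names below are hypothetical placeholders, not ROSA’s actual API.

```python
# Sketch only: the endpoint, payload shape and column names are hypothetical.
import pandas as pd
import requests

# Existing pipeline step: load the raw training data.
applications = pd.read_csv("loan_applications.csv")

# Hypothetical one-line de-biasing call, inserted BEFORE model training
# so the model is never fitted on the biased records.
response = requests.post(
    "https://api.example.com/v1/debias",  # hypothetical endpoint
    json={"records": applications.to_dict(orient="records")},
    timeout=300,
)
response.raise_for_status()
debiased = pd.DataFrame(response.json()["records"])  # hypothetical response shape

# Training then proceeds exactly as before, now on de-biased data, e.g.:
# model.fit(debiased.drop(columns="defaulted"), debiased["defaulted"])
```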

See Rosa in action

Book a demo

Product Use Cases

Developing Responsible AI

Supporting organisations by ensuring that the data they use to train AI is unbiased, so the models they deploy do not reproduce human biases around sensitive attributes such as race, age and gender. For instance, ensuring that any threat-detection AI deployed by the Ministry of Defence does not reflect racial bias.

Ensuring Bias-Free Credit Risk Decisions

Helping organisations ensure that the data sets they derive insight from are free of bias, in a clear and traceable manner that complies with global regulations. For instance, removing data bias from credit card, mortgage and loan application data. This can prevent discrimination in AI screening processes, as seen with the gender-discrimination complaints against Apple’s credit card decisioning process.

Find out more with the Rosa datasheet

Download now

How Rosa has helped customers

How Rosa found and reduced racial bias in a criminal risk assessment tool

The challenge

The Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) is a 137-question assessment. Its purpose is to determine the likelihood that a convicted offender will reoffend.

It produces a risk score between 1 and 10, with 10 being the highest possible risk. However, there is no guidance on how this score translates into an actual likelihood of reoffending.

One indicator of bias is the difference between black and white defendants in the ratio of false negative rates to false positive rates. Even though overall predictive accuracy is similar across races, the algorithm makes mistakes in different ways depending on the race of the defendant.
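
As a minimal sketch of this kind of group-wise error-rate check, the following assumes a dataframe with binary prediction and outcome columns and a race column; the column names ("race", "reoffended", "predicted_high_risk") are assumptions, not the actual COMPAS schema.

```python
import pandas as pd

def error_rates_by_group(df: pd.DataFrame, group_col: str = "race",
                         pred_col: str = "predicted_high_risk",
                         actual_col: str = "reoffended") -> pd.DataFrame:
    """False positive and false negative rates per group (e.g. per race)."""
    rows = []
    for group, g in df.groupby(group_col):
        tp = ((g[pred_col] == 1) & (g[actual_col] == 1)).sum()
        fp = ((g[pred_col] == 1) & (g[actual_col] == 0)).sum()
        tn = ((g[pred_col] == 0) & (g[actual_col] == 0)).sum()
        fn = ((g[pred_col] == 0) & (g[actual_col] == 1)).sum()
        fpr = fp / (fp + tn) if (fp + tn) else float("nan")
        fnr = fn / (fn + tp) if (fn + tp) else float("nan")
        rows.append({group_col: group, "FPR": fpr, "FNR": fnr,
                     "FNR_to_FPR": fnr / fpr if fpr else float("nan")})
    return pd.DataFrame(rows)

# A large gap in these rates (or in the FNR-to-FPR ratio) between racial
# groups means the model errs differently depending on race, even when
# overall accuracy looks similar.
```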

The solution

COMPAS tends to overestimate the likelihood of reoffending for black defendants and to underestimate it for white defendants. This is clear evidence that the algorithm is biased with respect to race.

illumr was set the same task as COMPAS - to predict reoffending. We used Rosa to de-bias the data.

The result

While all tests still suggested significant racial bias, the bias in the Rosa model had a 30x smaller effect size than that of COMPAS. Its predictions were also more accurate.

It is possible to remove bias without seriously compromising predictive performance, avoiding the pitfalls of more common methodologies.
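
The case study does not state which effect-size measure was used. Purely as an illustration, the sketch below uses Cohen's d on risk scores between two groups, one common way such a bias effect size could be quantified and compared across models; the variable names are hypothetical.

```python
# Illustration only: the measure (Cohen's d) and variable names are assumptions.
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Standardised difference between two groups' mean risk scores."""
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
    return float((a.mean() - b.mean()) / pooled_sd)

# Hypothetical usage: compare each model's bias effect size between groups.
# d_compas = cohens_d(compas_scores_black, compas_scores_white)
# d_rosa   = cohens_d(rosa_scores_black, rosa_scores_white)
# A ratio d_compas / d_rosa of about 30 would correspond to the
# "30x smaller effect size" reported above.
```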

Why Rosa?

Rosa Parks was an iconic figure in the fight for equality and impartiality. We couldn’t have asked for a stronger name to try to live up to.

This tool is a step towards treating people fairly, today.