Human review of AI decisions needs legal protection, says IT industry body

A government consultation on personal data suggests that the right to appeal to a human against some automated decisions made by AI (which might include decisions on recruitment or loan eligibility) could be deemed unnecessary.

But because AI doesn’t always use personal data to make decisions about us, true protection of our right to a human review must consider wider regulation of AI, according to BCS, The Chartered Institute for IT.

The consultation, Data: A New Direction, launched by the Department for Digital, Culture, Media and Sport (DCMS), is looking to update the UK’s version of the GDPR.

DCMS is seeking further evidence before forming firm proposals on reform of the UK’s existing data legislation, including the possible removal of Article 22 of the GDPR, which focuses specifically on the right to a review of fully automated decisions.

Clarity needed

Dr Sam De Silva, Chair of BCS’ Law Specialist Group and a partner at law firm CMS, explained: “Article 22 is not an easy provision to interpret, and there is danger in interpreting it in isolation, as many have done.

“We still need clarity on the rights someone has in the scenario where there is fully automated decision making which could have a significant impact on that individual.

“We would also welcome clarity on whether Article 22(1) should be interpreted as a blanket prohibition of all automated data processing that fits the criteria, or as a more limited right to challenge a decision resulting from such processing.

“As the professional body for IT, BCS is not convinced that either retaining Article 22 in its current form or removing it achieves such clarity.

“We also need to consider that the protection of human review of fully automated decisions currently sits in a piece of legislation dealing with personal data. If no personal data is involved, the protection does not apply, but the decision could still have a life-changing impact on us.

“For example, say an algorithm is created to decide whether you should get a vaccine. The data you need to enter into the system is likely to be your date of birth, ethnicity and other factors, but not your name or anything else that could identify you.

“Based on that input, the decision could be that you’re not eligible for a vaccine. But any protections in the GDPR would not apply, as no personal data is involved.

“So, if we think the protection is important enough, arguably it should not sit in the GDPR at all. It begs the question: do we need to regulate AI generally, rather than through the ‘back door’ of the GDPR?

“It’s welcome that government is consulting carefully before making any changes to people’s right to appeal decisions made about them by algorithms and automated systems – but the technology is still in its infancy.”

