UK standards committee: ‘urgent need’ to strengthen regulation of public sector AI

https://ift.tt/3bKvkgq

Chair of the Committee on Standards in Public Life, Jonathan Evans, says the government is failing on openness regarding the use of AI in the public sector

A new report looking at artificial intelligence (AI) in the UK public sector says there’s an “urgent need for guidance and regulation.”

AI has the potential to “revolutionise” public service delivery, says the Committee on Standards in Public Life (CSPL), with machine learning in particular transforming how decisions are made in policing, health, welfare, social care and education.

But there are “notable deficiencies” in the UK’s regulatory framework for AI in the public sector, the report says. Transparency and data bias are two particular areas “in need of urgent attention in the form of new regulation and guidance.”

Data bias is a “serious concern” and risks “embedding and amplifying discrimination in everyday public sector practice,” the CSPL warns, and “further work is needed on measuring and mitigating the impact of bias.”

Hidden intelligence

Writing in The Times this week, Lord Evans, the committee’s chair, said the government is “failing” on openness.

“Public sector organisations are not sufficiently transparent about their use of AI and it is extremely difficult to find out where and how new technology is being used. We add our voice to that of the Law Society and the Bureau of Investigative Journalism in calling for better disclosure of algorithmic systems.”

The report also highlights the risk of AI systems obscuring who is ultimately responsible for key decisions made by public officials, but says it is “too early to judge if public sector bodies are successfully upholding accountability”.

While it doesn’t call for a dedicated AI regulator, the CSPL report says existing regulators must adapt to meet the challenges posed by the new technology and advises that the Centre for Data Ethics and Innovation (CDEI) should perform a regulatory assurance role. “The government should act swiftly to clarify the overall purpose of CDEI before setting it on an independent statutory footing,” it recommends.

There are a further 14 recommendations to government and frontline public sector services. These include clarifying how anti-discrimination law applies to AI; overhauling procurement processes; and requiring public sector organisations to publish a statement on how their use of AI complies with relevant laws and regulations before the technology is launched.

Growing regulation of social media

Meanwhile, the government is moving to tighten regulation of social media platforms. Culture secretary Nicky Morgan and home secretary Priti Patel said on Tuesday that they are “minded to appoint communications watchdog Ofcom as the regulator to enforce rules to make the internet a safer place” – enforcing a statutory duty of care requiring social media platforms to protect users from “harmful and illegal terrorist and child abuse content”.

And last week, the CDEI launched a report calling for stronger regulation of the online targeting systems used by global platforms such as Facebook and YouTube. “Online targeting systems too often operate without sufficient transparency and accountability,” it said, and current mechanisms to hold global platforms to account are “inadequate”.

“We do not propose any specific restrictions on online targeting. Instead we recommend that the regulatory regime is developed to promote responsibility and transparency and safeguard human rights by design.”
