AI in the Public Sector

Reading Time: 4 minutes
This graph, created for the article, shows the prevalence of AI in the UK’s government.

Last year, on 29 November, the UK Cabinet Office, sponsored by the Department for Work & Pensions (DWP) Permanent Secretary, published a policy paper stating that by 2025 the UK government will be transformed into a more efficient digital government delivering better public services, with a focus on cutting-edge technologies, most notably generative artificial intelligence (AI).

Trials and errors

Months ago, we reported on the use of AI in the Netherlands and the UK, concluding that governments have to build a solid legal framework for attending to the victims of wrong predictions and to incentivize collaboration with academia.

The AI system for which the UK was flagged over possible nationality biases is deployed and maintained by the DWP. However, the DWP's recent policy collaboration with the Cabinet Office makes no mention of reducing the system's errors among the department's responsibilities.

This system, called the Fraud Error and Debt Program (FEDP), has a long history of incidents dating back to 2011. A 2018 report by the National Audit Office (NAO) highlighted that it took three years to recognize the model's issue as systemic, with an estimated 70,000 people affected and the DWP owing £340 million in underpaid benefits. It was not until July 2017, six years after the errors began, that the department identified the affected people and issued a response.

Following this incident, the DWP agreed to foster deeper discussions to build a stronger grasp of the legal obligations and risks of the system developed by its statistics team. However, although the DWP links to the Fraud and Error Audit Framework as a resource in its Staff Guide for Fraud Investigations, the framework only serves as a manual of good practice, not as a solid legal basis.

How AI works in the public sector

The DWP has consistently claimed that more data will improve FEDP's performance. From what has been revealed in public papers, the system is fed real-time data on people's employment income, their banking data, and monthly personal and credit data from what is referred to as the "RTI database".

Furthermore, the program regularly uses information from third-party organizations, including credit reference agencies. However, there is no information on how biases and errors from these outside sources are regulated and detected. This is especially concerning, as most FEDP errors are statistical and strongly related to the accuracy of the data, and, as seen in 2011, they can take a significant time to detect.
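To make concrete why the accuracy of these outside sources matters, here is a minimal, hypothetical sketch in Python of a cross-source consistency check. The claimant records, field names and tolerance are invented for illustration and are not drawn from the DWP's actual pipeline or schema.

```python
import pandas as pd

# Hypothetical declared-earnings feed (e.g. real-time employment income).
rti = pd.DataFrame({
    "claimant_id": [1, 2, 3],
    "monthly_income_gbp": [950.0, 1200.0, 0.0],
})

# Hypothetical third-party extract (e.g. a credit reference agency).
third_party = pd.DataFrame({
    "claimant_id": [1, 2, 3],
    "estimated_income_gbp": [940.0, 2100.0, 0.0],
})

merged = rti.merge(third_party, on="claimant_id", how="left")
merged["gap_gbp"] = (merged["monthly_income_gbp"]
                     - merged["estimated_income_gbp"]).abs()

# Flag large gaps for human review rather than treating them as confirmed
# fraud: the gap may simply mean the third-party record is stale or wrong.
TOLERANCE_GBP = 250.0
merged["needs_review"] = merged["gap_gbp"] > TOLERANCE_GBP

print(merged[["claimant_id", "gap_gbp", "needs_review"]])
```

The point of such a check is not the arithmetic but the routing: a flagged discrepancy is only useful if there is a documented process for deciding whether the data or the claimant is wrong.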

In a response to Transparency International, the DWP revealed that it uses data in different ways across services, highlighting that artificial intelligence is deployed in FEDP and other projects held by the same unit, but has not yet reached complaints, referrals or cases, which are currently managed using statistics and more widely used analysis techniques.

In 2020, the UK Parliament recommended that the DWP evaluate the cost-effectiveness of its technology investments, monitor discrimination caused by AI, and give special consideration to the context of claims. In response, the DWP's Annual Report & Accounts 2021-22 asserted that crucial changes and improvements had been made. However, recent research shows that approximately 9,000 cases have suffered major setbacks from the FEDP system, that complaints go unassessed most of the time, and that there are limited channels to attend to the victims of wrong predictions.
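Monitoring discrimination of the kind Parliament recommended implies routinely comparing error rates across groups of claimants. The sketch below is a minimal, hypothetical Python example of such a check; the groups, figures and threshold are invented, since the DWP has not published FEDP outcomes at this level of detail.

```python
import pandas as pd

# Invented case outcomes: whether the model flagged a claim and whether fraud
# was later confirmed. Real monitoring would use the department's own records.
outcomes = pd.DataFrame({
    "group":           ["A"] * 4 + ["B"] * 4,
    "flagged":         [1, 0, 0, 1, 1, 1, 0, 1],
    "fraud_confirmed": [1, 0, 0, 0, 0, 1, 0, 0],
})

def false_positive_rate(df: pd.DataFrame) -> float:
    """Share of genuinely non-fraudulent cases that the model still flagged."""
    non_fraud = df[df["fraud_confirmed"] == 0]
    return non_fraud["flagged"].mean() if len(non_fraud) else float("nan")

fpr_by_group = outcomes.groupby("group").apply(false_positive_rate)
print(fpr_by_group)

# A persistent gap between groups is the kind of signal that a commitment to
# monitoring discrimination would require the department to act on.
ALERT_THRESHOLD = 0.10
if fpr_by_group.max() - fpr_by_group.min() > ALERT_THRESHOLD:
    print("Disparity exceeds threshold: escalate for review.")
```

The computation is trivial; what would matter in practice is publishing the chosen threshold, the groups monitored and the escalation route when a disparity is found.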

High AI interest, low preparedness?

In recent years, the DWP has shown increasing interest in expanding its use of AI. According to the Government Internal Audit Agency's (GIAA) Annual Report and Accounts, the DWP was the department with which the agency carried out the most transactions in the 2022-2023 period, and the GIAA has set the deployment of AI in multiple internal public sector procedures as a priority goal for the coming years.

A recent DWP survey on perceptions of AI in recruitment reflects a growing interest in expanding AI systems beyond the department's fraud detection services. 6% of DWP employees opted to "use the same amount" of AI in recruitment, which could indicate that implementation initiatives already exist in Human Resources, whilst 59% would prefer not to use AI in HR at all.

To date, the DWP has claimed to take responsibility for refining, training, deploying and operating the FEDP model. However, there is no public evidence of AI-focused departments or of initiatives to recruit AI specialists at the DWP.

Its Digital Group, made up solely of digital professions, is the largest unit, with 56 executives according to the latest Transparency Organization Chart Database. Its Central Analysis and Science Directorate is composed of interdisciplinary professionals from statistics and policy backgrounds. However, no vacancies are registered for Machine Learning, Deep Learning or AI specialists, and most of the DWP's team comes from IT, statistics or data analysis backgrounds.

Main takeaways

The Department for Work & Pensions has shown strong initiative in deploying new solutions to its systems and has recognized the existence of biases and errors in the FEDP system. The DWP can draw on the interdisciplinary policy and statistics expertise of many of its executives to build a solid digital services portfolio. However, although its Fraud and Error department has long worked with these tools, there is no guarantee of professional expertise in AI and Machine Learning.

Given the impact and likelihood of the latest events, the Department must prioritize tangible and measurable results on (1) timely detection of prediction errors, (2) quick remedial action, (3) stronger complaints channels and (4) an appropriate understanding of how to handle biases.

Rather than further manuals, papers and inquiries to higher organizational levels, it is quick action and assertive management that will lead the DWP to implement the National Audit Office's and Parliament's recommendations, putting an end to the ongoing trial-and-error cycle and preventing further consequences from debt and wrong predictions.

Written by Emily Ulloa
