AI tool used to analyse public opinion on botulinum toxin and fillers
Chris Vallance

Senior Technology Reporter


Artificial intelligence (AI) tools have been used to categorize comments about botulinum toxin and lip fillers submitted as part of a public consultation, in what officials say is the first use of its kind in the UK.

Officials set the tool the task of sifting through responses to a Scottish government consultation on regulating non-surgical cosmetic procedures.

They found it produced "almost the same" results as humans given the same task.

It is hoped the tool, known as Consult, will spare civil servants similarly time-consuming work in future and save taxpayers about £20 million.

Consult is one of a planned set of AI-powered government tools known collectively as "Humphrey", named after Sir Humphrey Appleby, the senior civil servant in the classic 1980s sitcom Yes, Minister, which frequently satirised an over-bureaucratic government.

In the trial, the AI tool examined 2,000 submissions. But public consultations, which gather the views of UK citizens on issues being considered by ministers, can generate many thousands of responses.

The tool is able to identify themes in the responses, then count and categorize the answers accordingly - with human experts checking its work at both stages.

Consult's findings were then examined to see how they compared with those of a team of human experts working in parallel.

Technology Secretary Peter Kyle said the initial success of the trial meant Consult would be used "soon" across government.

"Having proved such promising results, Humphrey will help us cut administration costs and make it easier to collect and comprehensively review what experts and the public are telling us on a range of critical issues," he said.

The government hopes the use of AI technology across the wider public sector will help save £45 billion.

"People in a loop"

The government said Consult is still at the trial stage, and that further evaluation will take place before any final decision is made to roll it out more widely.

It added that there would always be "humans in the loop" to check Consult's work.

Officials are also trying to address some ongoing concerns about AI systems.

One is that they sometimes invent information - a failure known as "hallucination".

Because the AI is only being asked to perform relatively limited tasks, officials say hallucination is not a major problem.

AI tools built using so-called "large language models" can also display bias, because they absorb the biases inherent in the human-generated data they are trained on.

Experts who worked alongside Consult found that overall it reduced bias, by removing the opportunity for individual analysts to "project their own preconceived ideas" onto the analysis.

Consult has also been tested to check that it can handle language containing spelling mistakes and other errors.

For the moment, however, it only works in English, so responses in other languages spoken in the UK, such as Welsh, will be translated into English first.
