
The Prime Minister of the United Kingdom, Keir Starmer, wants to make the country a world leader in artificial intelligence
PA Images/Alamy
Thousands of officials at the heart of the UK government, including those who work directly to support Prime Minister Keir Starmer, are using an in-house artificial intelligence chatbot to carry out their work, New Scientist can reveal. Officials have declined to reveal exactly how the tool is being used, whether the prime minister is receiving advice prepared using AI or how civil servants are mitigating the risks of inaccurate or biased outputs. Experts say the lack of disclosure raises concerns about government transparency and the accuracy of information being used in government.
After securing the world-first release of ChatGPT records under freedom of information (FOI) legislation, New Scientist asked 20 government departments for records of their interactions with Redbox, a generative AI tool developed in-house and trialled among UK government staff. The large language model chatbot lets users interrogate government documents and “generate first drafts of briefings”, according to the government’s description of the tool. Early trials saw one civil servant claim to have summarised 50 documents “in a matter of seconds”, rather than in a full day.
All of the departments contacted either said they did not use Redbox or refused to provide transcripts of interactions with the tool, claiming that New Scientist’s requests were “vexatious”, an official term used in response to FOI requests that the Information Commissioner’s Office defines as “likely to cause a disproportionate or unjustifiable level of distress, disruption or irritation”.
However, two departments provided some information about their use of Redbox. The Cabinet Office, which supports the prime minister, said that 3000 people in the department had taken part in a total of 30,000 chats with Redbox. It said that reviewing these chats to redact any sensitive information before releasing them under FOI would require more than a year of work. The Department for Business and Trade also declined, stating that it had “approximately 13,000 prompts and responses” and that reviewing them for release would not be feasible.
When asked follow-up questions about the use of Redbox, both departments referred New Scientist to the Department for Science, Innovation and Technology (DSIT), which oversees the tool. DSIT declined to answer specific questions about whether the prime minister or other cabinet ministers are receiving advice prepared using AI tools.
A DSIT spokesperson told New Scientist: “No one should spend time on something that AI can do better and more quickly. Built in Whitehall, Redbox is helping us harness the power of AI in a safe, secure and practical way, making it easier to summarise our work and freeing up officials to focus on policy and improving services, driving change in this country.”
But the use of generative AI tools concerns some experts. Large language models have well-documented problems with bias and accuracy that are difficult to mitigate, meaning there is no way of knowing whether Redbox is providing good-quality information. DSIT declined to answer specific questions about how Redbox users avoid inaccuracies or bias.
“My problem here is that the government is supposed to serve the public, and part of that service is that we, as taxpayers, as voters, as the electorate, should have a certain amount of access to understanding how decisions are made and what the processes behind them are,” says Catherine Flick at Staffordshire University, UK.
Because generative AI tools are black boxes, Flick is concerned that it isn’t easy to test or understand how a tool reaches a particular output, such as highlighting certain parts of a document over others.
That lack of transparency extends to a third government department, the Treasury. In response to an FOI request, the Treasury told New Scientist that its staff do not have access to Redbox, and that “internally deployed GPT tools within HM [His Majesty’s] Treasury do not retain a prompt history”. Exactly which GPT tool this refers to is unclear: although ChatGPT is the most famous example, other large language models are also known as GPTs. The response suggests that the Treasury is not keeping records of its use of such tools. The Treasury did not respond to New Scientist’s request for clarification.
“If the prompts used aren’t being retained, it is difficult to have any kind of insight into, or to replicate, the decision-making processes there,” says Flick.
Jon Baines at UK law firm Mishcon de Reya says that choosing not to record this information is unusual. “It seems surprising to me that the government says that it cannot retrieve prompts from its internal GPT systems.” Although the courts have ruled that public bodies do not have to retain records ahead of an FOI request, he says “good information governance would suggest that it may still be very important to retain records”, especially where it is possible to do so.
However, data protection expert Tim Turner says the Treasury is within its rights not to retain AI prompts under FOI laws: “I think that unless there is a specific legal or civil service rule about the nature of the data, they can do this.”