NTIA wants ‘the whole lifecycle of accountability’ to assess AI systems, agency head says


NTIA Administrator Alan Davidson said an agency report on artificial intelligence accountability “coming out this winter” will help outline an auditing process for the new tech.

The head of the National Telecommunications and Information Administration said his agency is looking at how to create an auditing process to hold artificial intelligence systems accountable, as part of an effort to promote safe and ethical uses of the emerging technologies.

During a Monday discussion at the Knight Foundation’s INFORMED 2024 conference, NTIA Administrator Alan Davidson said, “I think one of the things that we've seen is, like financial audits for the financial accounting system, there is going to be a role to play for audits in the AI ecosystem.”

NTIA released a request for comment in April 2023 soliciting public feedback on how to mitigate the harms of AI, noting that it would use the input to help “draft and issue a report on AI accountability policy development, focusing especially on the AI assurance ecosystem.” The agency said it received more than 1,400 comments in response to its request. 

Davidson teased the AI report’s release, saying, “it'll be coming out this winter, let's just say that.”

To answer some of the agency’s questions about holding AI systems accountable, Davidson said the report examines “how can we as policymakers help build this ecosystem, whether it’s through things that we fund [and] things that we press people to adopt?”

“Maybe there'll be things that should be mandated or that we think policymakers or legislators should look at, but the idea is to say, ‘how do we create this ecosystem of assessment standards and auditing so that we can actually put trustworthy AI out in the field and help people know that it really is trustworthy?’” he added.

Davidson said NTIA is “going to look at the whole lifecycle of accountability, starting with the transparency and information that you need to be able to hold models accountable.”

He added that this includes thinking about the broader AI ecosystem, including who is going to do these assessments and how to “create a market of third-party auditors the way that we've done in the financial auditing system.”

President Joe Biden’s October 2023 executive order on AI called, in part, for the Commerce Department — in coordination with other relevant federal agencies — to launch “an initiative to create guidance and benchmarks for evaluating and auditing AI capabilities, with a focus on capabilities through which AI could cause harm, such as in the areas of cybersecurity and biosecurity.”

While NTIA’s request for comment predated the release of Biden’s order, Davidson said the administration’s directive also tasked the agency with examining “the risks and challenges” around open source AI.

He called the issue “a really hard problem and an interesting problem” for NTIA to tackle, given concerns about open access to some advanced AI models. 

“If you're a risk-averse policymaker, you might say, ‘well, we should be really careful here,’” Davidson said. “But I think there's another piece to this as well, which is we also want to think about the opportunities for innovation and for competition. And I think we would have to be careful about creating a world where only a small set of players have access to the most important and powerful AI systems out there.”