Forms on Forms on Forms
Regulation
Sarah Chen stared at the blinking cursor on Form 27-B, subsection 4, paragraph 12: “Purpose of Proposed Artificial Intelligence Utilization.” The answer was simple. She wanted the AI to summarize a 400-page environmental impact report into a two-page brief for the county commissioners’ meeting on Thursday.
That was it. That was the whole thing.
She typed: “Document summarization for internal government use.”
The compliance portal immediately flagged the entry. A red banner unfurled across her screen: “ATTENTION: AI utilization for document processing requires completion of Federal AI Transparency Disclosure (FAITD-1), State Algorithmic Accountability Certification (SAAC-7), and Municipal AI Ethics Review (MAER-3). Estimated processing time: 6-8 weeks.”
Sarah checked her calendar. The commissioners’ meeting was in four days.
She pulled up the FAITD-1 first, because federal requirements superseded state and local, or at least they had until the California AI Sovereignty Act of 2028, which she vaguely remembered reading about in a memo she’d been meaning to summarize using, ironically, an AI. The federal form was seventeen pages. The first question asked her to certify that her intended use did not fall under any of forty-seven prohibited categories, which were listed in Appendix C, which was a separate 200-page document available only through the Federal AI Registry portal, which required a separate login she’d never created.
She created the login. It took eleven minutes because the password requirements mandated at least one emoji, which her government keyboard didn’t support, so she had to copy-paste a thumbs-up from a Unicode chart.
Appendix C revealed that document summarization was not a prohibited category. It was, however, a “monitored category,” which meant she needed to complete Supplemental Form FAITD-1-S, which was twenty-three pages longer than the original form and asked, among other things, whether the AI system she intended to use had been trained on any data generated by, for, or about citizens of the European Union.
Sarah had no idea. She didn’t even know which AI system she was going to use. The county had three approved vendors, but she hadn’t picked one yet. She looked up the vendor documentation for all three. One had been trained on EU data. One claimed it hadn’t but noted that it couldn’t guarantee this due to the “inherent unpredictability of web-scraped training corpora.” The third vendor’s documentation had originally been written in Korean and appeared to have been machine-translated into English by a different AI, which raised questions Sarah didn’t have time to think about.
She went back to the FAITD-1-S and selected “Unknown/Uncertain” from the dropdown menu. The form immediately grayed out and displayed a message: “Uncertain responses require completion of the AI Data Provenance Attestation (AIDPA-2). Please contact your Regional AI Compliance Officer for guidance.”
Sarah didn’t have a Regional AI Compliance Officer. She was, as far as she knew, the closest thing her department had to an AI Compliance Officer, a title that had been informally added to her job description after she’d made the mistake of successfully using ChatGPT to draft a press release in 2026, back when you could just do that without filing anything.
The state form was worse. The SAAC-7 required her to certify that her AI use complied with the EU AI Act, even though she worked for a county government in Ohio and had no obvious connection to the European Union. A footnote explained that any AI system capable of processing text might theoretically process text authored by EU citizens, and was therefore subject to extraterritorial jurisdiction under the Brussels Effect Harmonization Clause. There was a checkbox asking her to confirm that she had read and understood the Brussels Effect Harmonization Clause. She checked it without reading it. The checkbox unchecked itself and displayed a message: “Please scroll to the bottom of the Brussels Effect Harmonization Clause document before confirming.”
The document was 847 pages.
She scrolled. It took four minutes of continuous scrolling. When she reached the bottom, the checkbox finally stayed checked, but now there was a new field asking her to summarize, in her own words, the key provisions of the Clause.
Sarah laughed out loud, alone in her office, at 6:47 PM, surrounded by the fading light of a February evening. The form was asking her to summarize a document. The whole point of this exercise was that she wanted an AI to summarize a document for her. She was now three hours into the process of trying to get permission to save herself three hours of reading.
The municipal form was the shortest but somehow the most insane. The MAER-3 required approval from the Municipal AI Ethics Board, which met quarterly. The next meeting was in April. There was an expedited review process, but it required completion of Form MAER-3-EX, which in turn required attestation that the request could not have been anticipated more than thirty days in advance. Sarah had received the 400-page environmental report this morning.
The expedited form asked for the specific date she’d first learned of the need for AI assistance. She typed today’s date. A validation error appeared: “Expedited requests cannot be filed on the same date as the originating need. Please file your expedited request at least one business day after the originating need arises.”
The meeting was in four days. If she filed tomorrow, processing would take three business days minimum, which would land on Friday. The meeting was Thursday.
Sarah sat back in her chair. She looked at the stack of forms on her screen, twelve tabs now, each one spawning more tabs, an endless recursive loop of compliance breeding compliance.
Then she had a thought.
She opened a new browser window, navigated to one of the approved AI vendors, and typed: “Can you help me fill out government AI compliance forms?”
“I’d be happy to help you complete government AI compliance forms. Please upload the relevant documents and I’ll assist you in filling them out accurately.”
Her hands trembled slightly as she uploaded all twelve forms. The AI asked clarifying questions. She answered them. Three seconds later, every form was complete, internally consistent, and formatted according to federal, state, and local specifications.
Sarah downloaded the completed packet and stared at it. All she had to do now was use this same AI to summarize the environmental report.
She typed: “Now I need you to summarize a 400-page environmental impact report into a two-page brief.”
A new message appeared on screen: “Your organization has exhausted its allocated API credits for the current billing period. Credit allocation will reset on March 1st. For questions about your plan, please contact your account administrator.”
Sarah looked at the calendar. It was February 5th.