Lessons from FDA’s First Warning Letter on AI-Generated cGMP Documents

QUICK ANSWER

On April 2, 2026, FDA issued Warning Letter 320-26-58 to Purolea Cosmetics Lab — the first publicly released enforcement action that explicitly cites the use of AI agents to draft CGMP documentation. The agency’s position is unambiguous: AI may assist in creating specifications, procedures, and master production records, but every output must be reviewed and cleared by an authorized human in the Quality Unit. “The AI didn’t tell me it was required” is not a defense to a CGMP violation.


Generative AI has moved into pharmaceutical quality systems faster than most regulatory frameworks have adapted. SOPs, batch record templates, specifications, deviation reports, and CAPA write-ups are being drafted or partially drafted by ChatGPT, Claude, Copilot, and a growing list of domain-specific tools. For small peptide manufacturers and 503A compounders running lean QA teams, the productivity gains are real — and so is the temptation to ship the AI’s first draft.

FDA’s April 2, 2026 Warning Letter to Purolea Cosmetics Lab makes clear where that temptation leads. The letter is the first public enforcement action we are aware of in which the agency dedicates a numbered observation to AI-assisted document creation. Every peptide developer and distributor should read it carefully.

What Happened at Purolea

FDA inspected Purolea’s Livonia, Michigan, drug manufacturing facility from October 28 to 30, 2025. The resulting warning letter cites a familiar set of CGMP failures: insanitary manufacturing conditions, no microbiological release testing, no incoming-component identity testing, inadequate Quality Unit oversight, and the marketing of two products (“Dermveda Extra Strength Shingles Relief” and “Dermveda Extra Strength Ultra Genital Herpes Relief”) as unapproved new drugs treating serious conditions.

What sets this letter apart is a standalone observation titled “Inappropriate Use of Artificial Intelligence in Pharmaceutical Manufacturing.”

The AI Finding

According to the letter, Purolea told FDA investigators that it “utilized artificial intelligence (AI) agents to help your firm comply with FDA regulations,” using AI to “create drug product specifications, procedures, and master production or control records to be in compliance with FDA requirements.”

FDA’s response is worth quoting directly:

“If you use AI as an aid in document creation, you must review the AI generated documents to ensure they were accurate and actually compliant with CGMP. Your failure to do so is a violation of 21 CFR 211.22(c).”

The agency then describes a concrete consequence of skipping that review: Purolea had not conducted process validation prior to distribution, as required under 21 CFR 211.100. When confronted, the firm replied “that you were not aware of the legal requirement, as the AI agent you used… never told you it was required.”

FDA’s closing position on AI is the most important paragraph in the letter for the rest of the industry: “any output or recommendations from an AI agent must be reviewed and cleared by an authorized human representative of your firm’s QU in accordance with section 501(a)(2)(B) of the FD&C Act.”

Why Peptide Companies Should Pay Attention

The peptide market is structurally vulnerable to this exact failure mode. Many peptide developers and distributors are early-stage, run small quality teams, and operate under intense documentation pressure — new SOPs for new analytical methods, new specifications for new APIs, new master production records for each strength and presentation. AI tools collapse the time required to produce a first draft from days to minutes. The risk is not that teams use AI; the risk is that no one reads the output with regulatory eyes before it becomes a controlled document.

Three patterns we see most often in peptide quality systems are particularly exposed:

  • Specifications copied from generic templates. AI models are trained on public examples of finished-drug specifications. Those examples are rarely appropriate for synthetic peptides, recombinant peptides, or compounded preparations. Specification gaps for related substances, residual solvents, endotoxin, and counterion content are common.
  • SOPs that reference regulations that no longer exist. Models drift toward older, better-represented training data. Citations to superseded FDA guidance, withdrawn ICH versions, or pre-DSCSA distribution rules slip through routinely.
  • Procedures with missing steps. AI drafts read fluently, which masks omissions. Process validation is the canonical example Purolea fell into; equipment qualification, supplier qualification, and stability program design are similarly prone to quietly incomplete drafts.

The “AI Didn’t Tell Me” Defense Failed

The most useful single sentence in the warning letter, for everyone operating in this space, is the one that documents Purolea’s defense: the AI agent never told them process validation was required. FDA treated that defense as confirmation of the violation, not mitigation of it.

This aligns with how FDA has handled outsourced GMP activities for decades. The agency has long held that contractors are extensions of the manufacturer and that the marketing authorization holder owns the quality of the drug regardless of who performs the work. AI is now being treated the same way: it is a tool used by the firm, and the firm owns the output.

Practical Controls: Building an AI-Aware Quality System

Companies using AI in any part of their CGMP documentation lifecycle should formalize the practice before the next inspection, not after. At minimum:

  • Write an AI-use SOP. Document where AI may be used (drafting assistance, summarization, literature review), where it may not (final release decisions, signed approvals, deviation conclusions), and the QU review and sign-off required before any AI-assisted document becomes effective.
  • Train the QU on AI failure modes. Reviewers need to know what to look for: confident-but-wrong regulatory citations, missing steps, outdated guidance references, plausible-but-fabricated specifications.
  • Log AI assistance in the document control record. A simple field — “AI tool used: Yes / No — if yes, identify tool and date” — creates an audit trail that protects the firm and surfaces patterns over time.
  • Validate critical AI-assisted content against authoritative sources. USP monographs, current ICH guidelines, the latest FDA guidance documents, and your own validated test methods — not the model’s confidence.
  • Never accept an AI rationale for skipping a regulatory requirement. If the AI did not raise process validation, stability, supplier qualification, or any other CGMP requirement, the firm is still on the hook.
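To make the “log AI assistance” and “QU sign-off before effectiveness” controls concrete, here is a minimal sketch of what the document-control metadata might look like. The class and field names are illustrative assumptions, not a regulatory standard; any real implementation lives inside your validated document management system.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ControlledDocumentRecord:
    """Illustrative document-control metadata with an AI-assistance audit field.

    Field names are hypothetical; they mirror the suggested
    "AI tool used: Yes / No - if yes, identify tool and date" entry.
    """
    doc_id: str
    title: str
    effective_date: date
    ai_assisted: bool                  # "AI tool used: Yes / No"
    ai_tool: Optional[str] = None      # identify tool if ai_assisted
    ai_use_date: Optional[date] = None
    qu_reviewer: Optional[str] = None  # authorized QU sign-off
    qu_approved: bool = False

    def release_allowed(self) -> bool:
        # An AI-assisted document never becomes effective without a named,
        # approving QU reviewer; all documents still need QU approval.
        if self.ai_assisted and not (self.qu_reviewer and self.qu_approved):
            return False
        return self.qu_approved
```

The point of the `release_allowed` gate is that AI assistance adds a requirement (a named QU reviewer) rather than replacing any existing approval step.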

What to Do This Quarter

If your firm has used AI to draft any CGMP-relevant document in the past 18 months, conduct a focused retrospective review now. Pull the list, route each document through QU re-review against current regulations and current internal practices, and document the review. This is the single most effective action you can take to prevent a Purolea-style finding in your next inspection.
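The retrospective review above amounts to a simple filter over your document inventory: AI-assisted documents made effective within the lookback window that lack a documented QU re-review. A minimal sketch, assuming hypothetical record fields (`ai_assisted`, `effective_date`, `qu_rereview_date`):

```python
from datetime import date

def needs_retrospective_review(doc: dict, today: date,
                               window_days: int = 548) -> bool:
    """Flag AI-assisted documents effective within the lookback window
    (548 days ~ 18 months) that lack a documented QU re-review.
    Field names are illustrative, not from any particular system."""
    if not doc.get("ai_assisted"):
        return False
    if (today - doc["effective_date"]).days > window_days:
        return False
    return doc.get("qu_rereview_date") is None

# Hypothetical inventory entries
docs = [
    {"doc_id": "SOP-014", "ai_assisted": True,
     "effective_date": date(2025, 9, 1), "qu_rereview_date": None},
    {"doc_id": "SOP-002", "ai_assisted": False,
     "effective_date": date(2025, 3, 1), "qu_rereview_date": None},
]
flagged = [d["doc_id"] for d in docs
           if needs_retrospective_review(d, date(2026, 4, 15))]
# flagged -> ["SOP-014"]
```

Each flagged document then goes through QU re-review against current regulations, and the review itself gets documented.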

Frequently Asked Questions

Is FDA banning AI use in CGMP work?

No. The warning letter is explicit that AI may be used as an aid in document creation. The violation is releasing AI-generated content into the quality system without authorized QU review, and treating the AI’s silence on a requirement as a substitute for the firm’s own knowledge of applicable regulations.

Which regulation does AI-without-QU-review violate?

FDA cites 21 CFR 211.22(c), which requires that the Quality Control Unit have responsibility for approving or rejecting all procedures or specifications affecting drug identity, strength, quality, and purity. AI-generated procedures and specifications are not exempt from that review.

Does this apply to compounders and distributors, not just manufacturers?

503A and 503B compounders fall under their own regulatory frameworks (USP <795>, USP <797>, USP <800>, and 21 CFR Part 211 for 503B outsourcing facilities). Distributors are governed primarily by DSCSA and state wholesale distribution rules. The principle is the same: documents that govern your operations require human review by a qualified person, regardless of how they were drafted.

How should we document AI use during an FDA inspection?

Disclose proactively. Show the AI-use SOP, the QU review record for each AI-assisted document, the training records for QU staff on AI review, and any examples where QU rejected or modified an AI draft. That record is what separates “we use AI thoughtfully” from “we let AI write our quality system.”

Current Peptide Compliance helps peptide developers and distributors build AI-aware quality systems — including AI-use SOPs, QU review checklists, and retrospective document audits. Schedule a 30-minute discovery call to scope your assessment.
