The web-developer case for AI disclosure
The reflex from a lot of full-stack engineers and small agencies has been: "The EU AI Act is for OpenAI and Meta, not for me pushing a Next.js site to Vercel." Article 50 doesn't pattern-match on company size. It applies to providers and deployers of AI systems whose output is "made available to the public" — which catches AI-drafted landing copy on a client domain, a GPT support chatbot wired into your SaaS, AI-generated images on a product page, and AI-written docs. There is no employee-count threshold, no single-developer exemption, and no "just a side project" carve-out. The geographic gate is whether any visitor or user is in the EU, which for any public site indexed by Google is essentially always yes.
The headline penalty is €15 million or 3% of global turnover. A solo developer probably isn't the regulators' first stop, but practical exposure arrives well before any regulator does. Hosting providers (Vercel, Netlify, Cloudflare) are rolling out AI-attestation prompts through 2026. Agency clients are inserting AI-disclosure indemnity clauses into delivery contracts and increasingly checking that the shipped site already has the labels in place. Open-source maintainers downstream of AI-generated PRs are pushing for provenance norms. The cheapest insurance is a per-surface disclosure on the AI elements that actually appear in your build — nothing more, nothing less. This generator emits that disclosure in four formats simultaneously: HTML banner, README badge, footer block, and JSON-LD schema.
The six AI elements web developers actually ship
You don't need to label everything. The presets map to the six real cases:
- AI-drafted landing & marketing copy. The clearest in-scope case: AI-written headlines, hero subheads, feature blurbs, pricing copy. The generator emits a header banner, a footer line, and JSON-LD `CreativeWork` where the AI is `creator` and you are `reviewer`.
- Support & docs chatbots. The highest-disclosure case, because chatbots get their own Article 50 clause. The generator emits an opening-message string for the system prompt, a persistent UI footer, and a JSON-LD `ChatAction` block — the three surfaces regulators will check first.
- Copilot-, Cursor-, and Claude Code-generated code. Not Article 50 territory (source code isn't public output) but a strong OSS norm. The generator emits a shields.io README badge, an `AI-ATTRIBUTION.md` snippet, and an optional `X-AI-Assisted` response-header note for downstream auditors.
- AI-written docs and READMEs. Mintlify, Fern, GitBook, or Docusaurus prose drafted by AI from your spec. The generator emits an above-the-fold callout and JSON-LD `TechArticle` schema, plus a markdown `:::ai-disclosure` admonition for MDX-based docs.
- Client landing pages (agencies). Two-sided exposure: the client is the legal deployer, while the agency carries integration responsibility. The generator's "agency handoff" preset bundles the banner, the JSON-LD, and a delivery-contract snippet to attach to the handoff document.
- AI-generated images, OG cards, and logo variants. Article 50 covers synthetic images. The generator emits a corner-overlay snippet, an alt-text attribution pattern, and a JSON-LD `ImageObject` with the model identifier carried in `identifier`.
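To make the first case concrete, here is a minimal sketch of what the `CreativeWork` JSON-LD output might look like, assuming the field layout the article describes (AI tool in `creator`, human in `reviewer`, model identifier in `identifier`). The interface shape and the tool/reviewer names are illustrative placeholders, not the generator's actual API.

```typescript
// Hypothetical input shape — field names chosen for this sketch only.
interface AIDisclosure {
  tool: string;      // AI tool / model identifier, e.g. "gpt-4o"
  reviewer: string;  // the human or team that reviewed the output
  modified: string;  // ISO date of the last human review
}

// Build the CreativeWork JSON-LD string described above:
// AI tool as `creator`, human as `reviewer`, model id in `identifier`.
function creativeWorkJsonLd(d: AIDisclosure): string {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "CreativeWork",
    creator: { "@type": "SoftwareApplication", name: d.tool },
    reviewer: { "@type": "Person", name: d.reviewer },
    identifier: d.tool,
    dateModified: d.modified,
  });
}
```

The same skeleton swaps `CreativeWork` for `TechArticle` or `ImageObject` for the docs and image cases.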
Where to place the disclosure inside a typical Next.js or Astro build
The legal text says "clearly and distinguishably perceptible at the latest at the time of the first interaction or exposure." For a web build, that translates to three placements that survive how visitors actually consume the page:
- Visible header badge or above-the-fold callout. Above the first AI-generated block, not in the footer. A small badge plus "AI-drafted copy, reviewed by [team]" in plain text. This is the placement that satisfies regulators reading the page in a screenshot.
- JSON-LD schema in `<head>`. Drop it into `next/head`, Astro's `<Head>` component, SvelteKit's `svelte:head`, or a Webflow site-wide custom-code embed. The page becomes machine-readable as AI-generated, and the European Commission's AI Office has indicated it intends to parse this format. Both crawlers and auditors win.
- README badge for the repo. A shields.io-compatible markdown badge plus an `AI-ATTRIBUTION.md` file. Not strictly required by the Act but increasingly the OSS norm, and it travels with the project to GitHub, GitLab, and Codeberg without modification.
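The second placement is the same across every framework: a `<script type="application/ld+json">` tag in the document head. A framework-agnostic sketch of that wrapper, under the assumption that you inject it as a raw HTML string (via `next/head`, Astro's `set:html`, `{@html}` in Svelte, or a Webflow embed):

```typescript
// Wrap any JSON-LD payload in the script tag that belongs in <head>.
// Payload shape is illustrative; any schema.org object works.
function ldJsonScriptTag(payload: object): string {
  // Escape "</" so a string value inside the JSON can never
  // prematurely close the surrounding <script> tag.
  const json = JSON.stringify(payload).replace(/<\//g, "<\\/");
  return `<script type="application/ld+json">${json}</script>`;
}
```

The `</` escape matters whenever disclosure text is user- or AI-authored: without it, a stray `</script>` inside a string value would break the page.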
JSON-LD schema: machine-readable proof for crawlers, auditors, and clients
Big platforms negotiate compliance reporting with regulators directly. A small dev shop can't — which makes machine-readable schema the practical equivalent of an audit log for everyone else. The generator outputs schema with the AI tool named in the `creator` field, the human reviewer (you or the team) named in `editor` or `reviewer`, the model identifier carried in `identifier`, and a `dateModified` stamp. Search engines, the European Commission's AI Office, and modern attestation tooling all parse this format. If a client or platform ever asks you to demonstrate compliance for a particular page, the schema is your default answer — and unlike a screenshot, it travels with the page.
What this is not: code license, attribution, GDPR consent, and a SOC 2 attestation
An AI disclosure tells a visitor that an asset was AI-generated. A code license tells downstream users what they can do with the code. An attribution file tells contributors which AI tools were used. GDPR consent tells the user how their data is processed. A SOC 2 report tells an enterprise buyer what your security posture is. Five different documents.
Developers regularly try to roll these into one fine-print block or pretend an MIT license header is enough. Regulators have been explicit that this doesn't satisfy any of them. The W3C's emerging content-provenance work and the C2PA Content Credentials standard both treat AI labeling as a distinct artifact that should ship alongside license metadata, not merged into it. The AI label goes on the AI surface. The license goes in LICENSE. GDPR consent lives in your privacy banner. The SOC 2 attestation goes in your enterprise sales doc. The generator focuses on the first only — and does it well.
Compliance vs. theatre: what bad disclosure looks like on a web build
| Pattern | What it does | Status |
|---|---|---|
| "Built with AI" in the <meta name="generator"> tag only | Invisible to the user; perceptible only to crawlers | Non-compliant |
| One line buried in the privacy policy | Reader has to click through; not at first exposure | Non-compliant |
| Footer-only badge, no above-the-fold tag | Lost in screenshots and email previews | Borderline |
| Chatbot disclosure on the homepage but not in the bot itself | Doesn't satisfy the per-conversation rule | Aggravated risk |
| Header badge + JSON-LD in head | Visitor and crawler both see it | Compliant |
| Header badge + JSON-LD + chatbot opening message + README badge | Visitor, crawler, contributor, and auditor all see it | Best practice |
Workflow for a typical dev pipeline
You don't want to revisit the generator on every PR. Template it. Open the generator once, build your standard variants — "AI-drafted marketing copy," "AI-written docs," "AI support chatbot," "AI-generated images" — and check the outputs into the repo as a `/components/AIDisclosure` React/Svelte/Astro component, an `AI-ATTRIBUTION.md` at the repo root, and a JSON-LD partial in `/lib/seo`. On every new page or feature you import the component, pass the AI-tool prop, and ship. CI can lint that any route flagged as `aiAssisted: true` in your route manifest also imports the component. Start-to-shippable is roughly fifteen minutes for the initial wiring, then near-zero per page after that. The result is a build where AI disclosure becomes part of the type system, not a content-team afterthought.
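The CI lint step can be a few lines. A sketch under assumed conventions — a manifest that flags AI-assisted routes, page sources that must import an `AIDisclosure` component; neither the manifest shape nor the component name is a real Next.js API:

```typescript
// Hypothetical route-manifest entry: file path plus an AI flag.
interface RouteEntry {
  file: string;
  aiAssisted: boolean;
}

// Return one violation message per flagged route whose source
// never imports the AIDisclosure component.
function lintDisclosures(
  manifest: RouteEntry[],
  sources: Map<string, string>, // file path -> file contents
): string[] {
  const violations: string[] = [];
  for (const route of manifest) {
    if (!route.aiAssisted) continue;
    const src = sources.get(route.file) ?? "";
    if (!/import\s+.*AIDisclosure/.test(src)) {
      violations.push(`${route.file}: aiAssisted route missing <AIDisclosure />`);
    }
  }
  return violations;
}
```

In a real pipeline you would read the files from disk and fail the build when the returned array is non-empty; the in-memory `Map` keeps the sketch self-contained.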
Frequently asked questions
Do freelance web developers and dev agencies really need to disclose AI under the EU AI Act?
Yes, when the AI output is what visitors or clients receive. Article 50 covers AI-generated text, image, audio, and video published publicly — an AI-drafted landing page, a GPT-powered support chatbot, or AI-generated marketing copy all count. Copilot output in your backend isn't in scope; the user-facing surfaces are. The deadline is August 2, 2026, and the risk runs to €15 million fines plus agency-contract indemnity exposure.
If Copilot or Cursor wrote half my code, does the site need a label?
No for the code itself, yes if AI also wrote the visible page copy. Source code isn't Article 50 "public output." The rendered HTML, marketing copy, docs, and chatbot answers are. The generator emits a README badge (open-source norm) separate from the user-facing banner (Article 50 surface).
What about a GPT-powered support or docs chatbot?
Highest-disclosure case. Chatbots get their own Article 50 clause for "systems that interact with natural persons." The disclosure has to appear at the start of every conversation and the bot must not be presented as human. The generator outputs the opening message, persistent footer line, and JSON-LD ChatAction.
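A minimal sketch of those three chatbot surfaces together — the wording, function name, and field choices are illustrative, not the generator's actual output:

```typescript
// Build the three disclosure surfaces for a chatbot: an opening
// message for the first turn of every conversation, a persistent
// footer line, and the ChatAction JSON-LD block.
function chatbotDisclosure(botName: string, model: string) {
  return {
    openingMessage: `Hi, I'm ${botName}, an AI assistant powered by ${model}. I'm not a human.`,
    footerLine: `${botName} is an AI system. Answers are generated automatically and may be inaccurate.`,
    jsonLd: {
      "@context": "https://schema.org",
      "@type": "ChatAction",
      agent: { "@type": "SoftwareApplication", name: botName, identifier: model },
    },
  };
}
```

The opening message is the piece that satisfies the per-conversation rule: it has to be injected at the start of every session, not just rendered once on the page hosting the widget.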
I run a dev agency. Whose obligation is the disclosure?
Both, in different ways. The client is the legal "deployer." Agencies sit on the "provider" line for integration and carry contractual indemnity. Clean delivery: ship with the disclosure in place, document it in the handoff, add an "AI components disclosed" line to the delivery contract. The generator's agency-handoff output bundles all three.
Where exactly do I put the disclosure on a Next.js page?
A header badge above the first AI-generated block, JSON-LD in next/head, and (for any public repo) a README badge. Footer-only doesn't satisfy the "first interaction or exposure" rule.
Does this work with Next.js, Astro, SvelteKit, Webflow, Framer, and Vercel?
Yes. The HTML banner pastes into any JSX/TSX layout, Astro component, Svelte file, Webflow custom-code embed, or Framer code component. JSON-LD slots into next/head, Astro's <Head>, svelte:head, or Webflow's site-wide head injection. README markdown renders on GitHub, GitLab, Codeberg.