AI Deployer vs Developer Obligations

Most U.S. AI laws and federal frameworks distinguish between developers of AI systems and deployers of those systems. Knowing which role applies to your organization — sometimes both — is the first compliance question.

The role definitions

Developer

A developer creates, trains, or substantially modifies an AI system. Under the Colorado AI Act § 6-1-1701, a developer is generally a person doing business in Colorado that develops or substantially modifies a high-risk AI system. NIST AI RMF and ISO/IEC 42001 use parallel concepts.

Developer duties typically include:

  • Documentation: provide deployers with usage information, intended uses, harmful uses, training data summaries, performance evaluations, and mitigation measures
  • Disclosure to deployers: enable downstream deployers to fulfill their own impact-assessment and disclosure obligations
  • Bias mitigation in design: actively avoid training on data that produces algorithmic discrimination
  • AG notification (Colorado): notify the Attorney General when discrimination is identified in the system

Deployer

A deployer puts an AI system into use for purposes within the law's scope — typically consequential decisions affecting consumers. Deployer duties usually include:

  • Impact assessment: under Colorado AI Act § 6-1-1703(3), annual impact assessment with prescribed contents (purpose, intended outputs, performance, transparency, monitoring, discrimination risks)
  • Consumer disclosure: notify the affected consumer when AI is used in a consequential decision
  • Right to correct + right to appeal: where required by law, allow consumers to correct data and appeal adverse decisions
  • Post-deployment monitoring: ongoing performance, drift, and bias monitoring
  • AEDT bias audit (NYC LL 144): deployers in NYC employment context must run annual independent bias audits

Vendor

Vendors providing AI tools to other organizations sit upstream of deployers and downstream of developers. Specific contractual obligations depend on the relationship — many vendor agreements transfer parts of the developer-side documentation obligation to the vendor.

The split varies by law

Not every law uses the same role taxonomy:

Law                | Developer obligations                                            | Deployer obligations
Colorado AI Act    | Substantial — documentation, disclosure to deployers, AG notice  | Substantial — impact assessment, consumer disclosure, monitoring
Texas TRAIGA       | Moderate — no intentional discrimination, broad scope            | Moderate — disclosure when interacting with consumers
NYC Local Law 144  | None directly                                                    | Yes — bias audit, public summary, candidate notice
Illinois HB 3773   | None directly                                                    | Yes — anti-discrimination, notice
Utah AI Policy Act | None directly                                                    | Yes — disclosure of GenAI use
CA SB 942          | Yes — covered providers (1M+ users)                              | Limited
CA AB 2013         | Yes — public training data summary                               | None
CA SB 53           | Yes — frontier AI safety framework + transparency report         | None
NIST AI RMF        | Both — full GOVERN-MAP-MEASURE-MANAGE applies                    | Both
ISO/IEC 42001      | Both — Annex A controls apply                                    | Both

See the Developer obligations page and Deployer obligations page for the consolidated obligation matrix.

How to scope your role

  1. List your AI systems
  2. For each system, ask two questions. Did we create or substantially modify it? If yes → you are a developer for that system. Did we put it into operational use to make decisions or interact with consumers? If yes → you are a deployer for that system.
  3. Both is common: most enterprises that build any in-house AI also deploy AI bought from vendors. You will be a developer for some systems and a deployer for many more.
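The scoping steps above amount to a simple two-question classifier per system. As a minimal sketch (the system names, field names, and inventory below are illustrative, not drawn from any statute):

```python
from dataclasses import dataclass

@dataclass
class AISystem:
    name: str
    created_or_substantially_modified: bool  # did we build, train, or materially alter it?
    used_for_consequential_decisions: bool   # does it make or inform decisions affecting consumers?

def roles(system: AISystem) -> set[str]:
    """Return the compliance roles an organization holds for one AI system.

    The two flags mirror the scoping questions: creation/substantial
    modification implies developer; operational consumer-facing use
    implies deployer. Both can be true for the same system.
    """
    result = set()
    if system.created_or_substantially_modified:
        result.add("developer")
    if system.used_for_consequential_decisions:
        result.add("deployer")
    return result

# Hypothetical inventory: one in-house model, one vendor-procured tool.
inventory = [
    AISystem("in-house credit model",
             created_or_substantially_modified=True,
             used_for_consequential_decisions=True),
    AISystem("vendor resume screener",
             created_or_substantially_modified=False,
             used_for_consequential_decisions=True),
]

for system in inventory:
    print(system.name, "->", sorted(roles(system)))
```

Running this over a real inventory makes the "both is common" point concrete: the in-house system yields both roles, while the procured tool yields deployer only.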

Use the Compliance Checker to walk through this scoping with your operational facts.

Where the line gets fuzzy

  • Substantial modification — fine-tuning a foundation model, applying prompt engineering, or wrapping an API may or may not count. Colorado defines "substantially modify" to include changes that materially alter the system's risks or intended use.
  • Vendor-provided AI — if you procure an AI tool and embed it in your decision flow, you are typically a deployer; the vendor is typically a developer. Contractual allocation does not change statutory obligations but can shift indemnification.
  • Internal-only deployment — laws focus on consumer-affecting use. Internal productivity AI without consequential decisions is mostly out of scope.

What to do next

  • Map each AI system to a role in your inventory
  • For developer obligations: prioritize documentation and bias-testing baselines
  • For deployer obligations: prioritize impact assessment and consumer-disclosure workflows
  • Adopt a federal framework as your control baseline — NIST AI RMF or ISO/IEC 42001