# Colorado Just Passed the Nation's First AI Law. Here's Why You Should Care (Even If You're Not in Colorado)

**Date:** 2026-01-22
**Category:** AI Regulation
**Read Time:** 7 minutes

In May 2024, Colorado Governor Jared Polis signed SB 24-205, the Colorado AI Act, into law. Originally set to take effect February 1, 2026, the law's effective date was delayed to June 30, 2026, to give businesses more time to prepare and to allow for potential revisions. And this is a big deal.

Why? Because Colorado just became the first U.S. state to pass comprehensive AI regulation. And if you think this only matters for Colorado libraries... you're wrong.

## What Colorado Actually Did

The Colorado AI Act is basically the EU AI Act's American cousin. It focuses on "high-risk AI systems": those that make, or substantially assist with, "consequential decisions" about people.

What's a "consequential decision"? The law defines it as any decision that has a "material legal or similarly significant effect" on someone's:
- Education access or opportunity
- Employment
- Financial services
- Healthcare
- Housing
- Legal services

Notice what's on that list? **Education access**. That's libraries.

If your library uses AI to:
- Recommend resources to students or researchers
- Make decisions about who gets access to what
- Evaluate program participation or eligibility
- Automate any decision that affects patron services

You *might* be dealing with a high-risk system under this law, even if you're not in Colorado. Consult legal counsel to understand how the law applies to your specific systems and circumstances; this analysis is educational only and should not be relied on as legal advice.

## The "We're Not in Colorado" Fallacy

I hear this a lot: "We're in [state], so this doesn't apply to us."

Wrong for three reasons:

**1. Your vendors operate nationally.**
They're not building a Colorado version and a non-Colorado version of their software. Too expensive, too annoying. They'll build to Colorado standards and sell it everywhere. Just like they're doing with the EU AI Act.

**2. Other states followed Colorado's lead.**
California, New York, Connecticut, and Washington all passed similar legislation in 2025. Massachusetts and Illinois have bills pending. Colorado was the test case, and within 18 months, nearly a dozen states enacted AI regulations.

**3. Patron coverage matters.**
If you serve *any* Colorado residents (distance education, digital collections, inter-library loan), you're potentially in scope. That database vendor you both use? They're covered.

## What the Law Actually Requires

If you're a "deployer" of high-risk AI (that's probably you if you're using vendor AI tools), here's what Colorado requires:

### 1. Impact Assessments
The law may require completion of an impact assessment before deployment of high-risk AI systems. Consult legal counsel about whether this applies to your systems. If required, assessments typically include:
- What the AI does and how it works
- What data it uses
- Potential risks and how you're mitigating them
- Whether the AI has been tested for bias
- How you'll monitor it over time
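To make those assessment elements concrete, here's a minimal sketch of how a library might track them internally. This is an illustrative Python data structure, not a statutory template; every field name and the 365-day review threshold are my assumptions, not language from the law.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ImpactAssessment:
    """Hypothetical internal record mirroring the assessment elements above.
    Field names are illustrative, not drawn from the statute."""
    system_name: str
    purpose: str                 # what the AI does and how it works
    data_sources: list[str]      # what data it uses
    risks: dict[str, str]        # identified risk -> mitigation
    bias_tested: bool            # whether the AI has been tested for bias
    monitoring_plan: str         # how you'll monitor it over time
    last_reviewed: date = field(default_factory=date.today)

    def needs_annual_review(self, today: date) -> bool:
        # The law expects updates at least annually or on significant change;
        # this only checks the calendar, not whether the system changed.
        return (today - self.last_reviewed).days >= 365
```

Even a simple structure like this forces you to answer every element before deployment, which is most of the point of an impact assessment.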

This isn't a one-page form. It's a serious document. And you have to update it annually or whenever the system changes significantly.

### 2. Risk Management Program
The law may require policies and procedures for managing high-risk AI. Consult legal counsel for your specific obligations. Typical risk management areas include:
- Identifying and mitigating AI risks
- Testing AI systems before deployment
- Monitoring AI performance
- Handling AI failures or errors
- Regular audits

### 3. Disclosure Requirements
If applicable to your systems, the law may require disclosure when AI is used to make consequential decisions. This could include:
- Clear notice that AI is involved
- Explanation of what the AI does
- Information about how to appeal or challenge AI decisions

### 4. Human Oversight
High-risk AI systems, if subject to the law, typically require meaningful human review of decisions. Consult legal counsel about the scope and timing requirements for your specific systems.

### 5. Data Protection
Best practices for AI systems typically include:
- Use training data that's representative and tested for bias
- Protect sensitive data appropriately
- Document where your data comes from

## What This Means for Library Vendors

Here's where it gets interesting. The law creates obligations for both "developers" (vendors who build AI) and "deployers" (libraries who use it).

Vendors are scrambling to figure out compliance. Some are:
- Adding AI disclosure clauses to contracts
- Creating impact assessment templates for customers
- Building bias testing into their development processes
- Limiting what their AI can do to avoid "high-risk" classification

But here's an important note: **Contract language will likely allocate compliance responsibilities between vendor and customer.** Be aware of clauses that assign work to you, such as:
- "Customer is responsible for conducting impact assessments"
- "Customer shall ensure compliance with applicable AI laws"
- "Vendor provides tools as-is; compliance is customer's responsibility"

Before accepting any such allocation, consult with legal counsel about the appropriateness and enforceability for your specific situation.

## Real Example: Discovery Systems and Bias

Let's get concrete. Say you use an AI-powered discovery system that ranks search results. It learns from usage patterns to "improve" recommendations.

Whether this is high-risk under the Colorado AI Act is uncertain and depends on specific factors. If you have such a system, consult legal counsel about whether compliance obligations apply.

This analysis is educational. For specific guidance, consult qualified legal counsel familiar with the Colorado AI Act and your library's operations.

If compliance obligations do apply, libraries typically should consider:
1. Understanding what data the AI was trained on and whether bias testing occurred
2. Documenting how the AI system works
3. Determining whether and how to disclose AI use to patrons
4. Understanding appeal or opt-out mechanisms if available
5. Monitoring performance for unintended discriminatory outcomes
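On point 5, one widely used screening heuristic is the disparate impact ratio, borrowed from US employment law's "four-fifths rule." Here's a minimal sketch, assuming you can label outcomes as favorable or unfavorable per patron group; this is a monitoring signal only, not a compliance test defined by the Colorado AI Act.

```python
def selection_rate(outcomes: list[bool]) -> float:
    """Fraction of favorable outcomes (e.g., a useful resource surfaced,
    access granted) observed for one patron group."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a: list[bool], group_b: list[bool]) -> float:
    """Ratio of the lower group's favorable-outcome rate to the higher group's.
    Ratios below 0.8 are commonly treated as a flag for closer review
    (the "four-fifths rule"); a screening heuristic, not a legal test."""
    low, high = sorted([selection_rate(group_a), selection_rate(group_b)])
    return low / high if high > 0 else 1.0
```

A ratio well below 0.8 doesn't prove discrimination, and a ratio above it doesn't prove fairness; it tells you where to look harder and what to ask your vendor about.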

Colorado's enforcement approach is still emerging, and liability exposure varies based on specific circumstances.

## FTC Enforcement Actions You Need to Know About

Quick detour on regulatory enforcement: Throughout 2024-2025, the FTC has been increasingly aggressive about AI company practices:

- **Rite Aid (2023)**: FTC banned Rite Aid from using facial recognition for 5 years after the company's AI falsely flagged customers as shoplifters, disproportionately affecting people of color.
- **Amazon (2023-2024)**: FTC investigated Amazon's Alexa data retention practices and how voice data was used for AI training.
- **OpenAI (ongoing)**: FTC opened an investigation into OpenAI's data practices and whether ChatGPT violates consumer protection laws by generating false information about real people.

The pattern is clear: Regulators are scrutinizing AI companies for deceptive practices, inadequate safety testing, biased algorithms, and privacy violations.

Your library vendors are watching these enforcement actions closely. Some are proactively improving their practices. Others are hoping they won't be next. You need to know which camp your vendors are in.

## What You Should Do Right Now

If you're using AI tools (or considering them), here's my recommendation:

**Immediately:**
- Consult with legal counsel familiar with Colorado AI Act requirements for libraries
- Make a list of every system you use that might involve AI
- Contact vendors to understand their compliance position and timelines
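The inventory step can start as something very simple. Here's a sketch with invented system and vendor names; the "possibly high risk" flag is a triage heuristic for deciding what to bring to counsel, not a legal determination.

```python
# Hypothetical inventory: all system and vendor names below are invented examples.
inventory = [
    {"system": "discovery layer", "vendor": "ExampleVendor",
     "uses_ai": True, "affects_patron_decisions": True},
    {"system": "room booking", "vendor": "ExampleVendor",
     "uses_ai": False, "affects_patron_decisions": True},
    {"system": "chat reference bot", "vendor": "OtherVendor",
     "uses_ai": True, "affects_patron_decisions": False},
]

def possibly_high_risk(entry: dict) -> bool:
    # Triage rule, not a legal test: AI involvement combined with
    # an effect on patron services earns a spot on the counsel-review list.
    return entry["uses_ai"] and entry["affects_patron_decisions"]

review_queue = [e["system"] for e in inventory if possibly_high_risk(e)]
```

A spreadsheet works just as well; the point is that every system gets the same two questions asked of it, and the "yes/yes" answers get escalated.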

**Next 6 months:**
- Work with legal counsel to review vendor contracts for AI-related language and responsibility allocation
- Develop internal policies for AI tool evaluation and oversight (in consultation with counsel)
- Document your vendor communications regarding compliance plans

**Next year:**
- Update procurement processes based on legal counsel guidance
- Train staff on AI system capabilities and limitations
- Consider patron communication about AI use in library systems

## The Reality: Compliance Preparation Is Important

Most libraries lack in-house AI expertise and legal resources dedicated to regulatory compliance. Vendors are also navigating these requirements in real-time, with varying levels of preparation.

If you're using or considering AI tools that make decisions about patron services, you should:

1. **Consult with qualified legal counsel** about what the Colorado AI Act requires for your specific situation and systems.

2. **Engage with vendors** to understand their compliance approach, timelines, and responsibility allocation in contracts.

3. **Document your compliance efforts**: what you're monitoring, how you're evaluating tools, what questions you're asking vendors.

This article is for educational purposes and should not be relied on as legal advice. Compliance requirements and liability exposure depend on specific facts and circumstances that require professional legal analysis.

## Looking Forward

Enforcement of the Colorado AI Act is still developing. While early enforcement will likely focus on large technology companies, the landscape for AI regulation is clearly expanding: other states have passed similar legislation, and federal approaches are under development.

Libraries that proactively engage with legal counsel, understand their vendor agreements, and document their approach to AI governance will be better positioned as this regulatory environment evolves.

The opposite is also true: Libraries that delay engaging with these questions may face compressed timelines for compliance when regulatory expectations become clearer or enforcement intensifies.

This is educational analysis. For specific guidance about your library's compliance obligations and strategy, consult qualified legal counsel.

---

**Further Reading:**
- [Colorado AI Act (SB 24-205) - Full Text](https://leg.colorado.gov/bills/sb24-205)
- [National Conference of State Legislatures: AI Legislation Tracker](https://www.ncsl.org/technology-and-communication/artificial-intelligence-legislation)
- My previous post on the EU AI Act (because Colorado borrowed heavily from it)

**Need help figuring out if your AI tools are high-risk?** [Let's talk](#contact).