
Why I wrote this: I reviewed privacy policies from 15 libraries and found most don't address AI tools, vendor data usage, or recent legislative changes. Your policy is probably outdated.

A privacy policy that doesn't address AI is a liability. If a regulator asks about your AI practices and your policy is silent, you've got no documented compliance framework.


Privacy Policy Updates: AI, Data Governance, and Library Compliance

TL;DR
  • Library privacy policies written for physical spaces (patron records, circulation confidentiality) don't address AI-specific risks: algorithmic tracking, vendor data training, law enforcement requests for behavioral data.
  • Gap: policies say "we protect patron data" but vendors embed behavioral tracking, recommendation algorithms use patron profiles, and AI systems create new categories of extractable patron intelligence.
  • Board-level action: update privacy policies to explicitly address AI systems, vendor data use, law enforcement cooperation boundaries, and transparency about what algorithmic data collection is happening.
  • Template improvements: explicit opt-in for behavioral tracking, transparency about AI-powered features, vendor data use restrictions, and patron rights to algorithmic decision visibility.

Your library's privacy policy is outdated. Here's what to add to address AI tools, patron data governance, and emerging compliance frameworks.

The Problem: Privacy Policies Written for 2015

Most library privacy policies were written before AI became mainstream. They mention "data security" and "patron privacy" in general terms, but they don't address:

  • AI tools processing patron data (chatbots, discovery systems, recommendation engines)
  • Vendor data usage rights and AI training
  • Third-party analytics and tracking
  • Compliance with emerging state AI laws (Colorado AI Act, Illinois BIPA, etc.)
  • Patron rights to know how their data is being used
  • Retention and deletion policies for AI-generated data

This creates legal exposure. If your library is sued or investigated over its data practices, regulators will ask: "Where in your privacy policy did you disclose this?" If the answer is "nowhere," you're in trouble.

Here's what your policy should include.

Section 1: What Data We Collect and Why

Start simple. Be transparent about what data you collect and the purpose.

Required Elements

Example language: "We collect patron data for: (1) delivering library services (checkout, holds, borrowing history), (2) improving service quality, (3) complying with legal obligations, and (4) developing new features and research. We also collect usage data through our website and library systems to understand how patrons interact with our services."

This sounds basic, but it's important. You're telling patrons upfront what you're tracking and why. That builds trust and gives you documented informed consent if there's ever a dispute.

Specific Data Categories to List

  • Identity data: Name, contact info, library card number
  • Transaction data: Items checked out, holds placed, borrowing history, dates of use
  • Usage data: Pages visited, searches performed, time spent, device type, IP address
  • Preference data: Reading lists, saved searches, communication preferences
  • Technical data: Cookies, device IDs, browsing behavior (if applicable)
  • AI-generated data: Data created by AI systems (e.g., recommendation scores, chatbot interactions, categorization data)

The last one is critical. Most policies don't mention that AI creates new data about patrons. If your discovery system uses AI to rank results, that ranking is data. If your chatbot logs conversations, that's data. Add it to your policy.
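One way to make these categories auditable is a small machine-readable inventory alongside the policy. A minimal Python sketch; the category names, examples, and retention values are illustrative assumptions, not recommendations:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DataCategory:
    """One category of patron data named in the privacy policy."""
    name: str
    examples: List[str]
    ai_generated: bool             # created by an AI system rather than the patron
    retention_days: Optional[int]  # None = governed by a separate schedule

# Hypothetical inventory mirroring the list above.
INVENTORY = [
    DataCategory("identity", ["name", "card number"], False, None),
    DataCategory("transaction", ["checkouts", "holds"], False, None),
    DataCategory("usage", ["searches", "IP address"], False, 90),
    DataCategory("ai_generated", ["recommendation scores", "chatbot logs"], True, 30),
]

def ai_categories(inventory):
    """Categories the AI disclosure section must cover."""
    return [c.name for c in inventory if c.ai_generated]
```

An annual audit can then verify that every category flagged `ai_generated` actually appears in the AI section of the published policy.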

Section 2: AI Tools and Automated Decision-Making

This is the new part. Most libraries don't have this section at all.

Required Elements

Example language: "We use artificial intelligence and automated systems to enhance library services. These systems may: (1) recommend materials based on your borrowing history, (2) filter or rank search results, (3) answer patron questions through chatbots, or (4) categorize materials using machine learning. These systems are designed to improve service quality, not to make final decisions about patron access. You can always request human review of any AI-generated recommendation or result."

Why this matters: Colorado's AI Act (effective February 2026) requires transparency about automated decision-making that affects consumers. If you're using AI to recommend materials or filter search results, you need to disclose it. Don't hide it in a footnote.

What to Address

  • Specific AI tools: Name the systems you use (e.g., "Recommended For You," "Search Ranking Algorithm"). Avoid generic "AI" language.
  • What the AI does: Explain clearly. "Recommends materials you might like" is better than "Uses machine learning to personalize content."
  • How to opt out: If patrons can disable AI recommendations, say so. If they can't, explain why (e.g., "Our search ranking is not optional, but you can contact staff for manual results").
  • Bias and limitations: Be honest. "Our AI system may have limitations and is not perfect. We actively monitor for bias."
  • Human review: "You can request a staff member to manually review any AI recommendation or result."

Section 3: Vendor Data Usage Rights

This is where most libraries get blindsided. Your vendor contracts probably have clauses allowing the vendor to use your data for AI training. Your privacy policy needs to address this.

Required Elements

Example language: "We use third-party vendors to provide library services (discovery systems, ILS, authentication). Some vendors may use anonymized usage data to improve their services or develop new features. Our vendor agreements require that: (1) Vendors cannot use patron personal information without our explicit consent, (2) Any data usage must comply with patron privacy expectations, (3) We can audit vendor data practices, (4) Vendors must delete data upon request. You can contact us to learn which vendors process your data or to request deletion."

This doesn't mean you need to prohibit all vendor AI training. It means you need to be transparent about it. Patrons can then decide if they're comfortable with your vendor choices.

Specific Vendor Disclosures

Consider listing your major vendors and whether they use data for AI training:

  • ILS (Integrated Library System): "Ex Libris does not use patron data for AI training without explicit opt-in."
  • Discovery System: "Our discovery vendor uses anonymized search patterns to improve relevance, but does not retain identifying patron data."
  • Authentication: "Shibboleth authentication data is not shared with third parties."

You'll need to check with your vendors for this. If they won't tell you, escalate it. You can't address patron data in a privacy policy if vendors won't disclose their practices.
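To back disclosures like these, it helps to keep an internal registry of each vendor's data-use terms and flag conflicts with what your policy promises. A sketch in Python; the vendor names, fields, and values are hypothetical, not real contract terms:

```python
# Hypothetical registry of vendor data-use terms; values are illustrative.
VENDORS = {
    "ils": {
        "trains_ai_on_patron_data": False,
        "anonymized_usage_allowed": True,
        "deletion_on_request": True,
        "last_reviewed": "2025-01-15",
    },
    "discovery": {
        "trains_ai_on_patron_data": True,   # conflicts with policy: escalate
        "anonymized_usage_allowed": True,
        "deletion_on_request": False,
        "last_reviewed": "2024-06-01",
    },
}

def needs_escalation(vendors):
    """Vendors whose terms conflict with the policy's promises."""
    return sorted(
        name for name, terms in vendors.items()
        if terms["trains_ai_on_patron_data"] or not terms["deletion_on_request"]
    )
```

Running the check after each contract review gives you a concrete escalation list instead of a vague sense that "the vendors probably aren't telling us everything."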

Section 4: Data Retention and Deletion

This is where AI creates complexity. Most libraries retain patron data for operational reasons (circulation history). But AI systems create additional derived data (recommendation scores, profile clusters, interaction logs) that may not need to be retained as long.

Required Elements

Example language: "We retain patron data for different periods depending on the purpose: (1) Circulation and account data is retained for 7 years to comply with tax and legal obligations. (2) Browsing history and usage logs are deleted after 90 days. (3) AI-generated recommendation data and scoring is refreshed monthly and not retained long-term. Patrons can request deletion of their data at any time, subject to legal holds or active obligations. Data deletion requests will be honored within 30 days."

This is important for compliance with state laws like the Colorado AI Act, which may require deletion rights. And it protects your library: the less data you retain, the less exposure you have if there's a breach.

Specific Guidance

  • Browsing history: Delete within 90 days (a common practice, consistent with GDPR's data-minimization principle)
  • Search logs: Aggregate and anonymize after 30 days, delete after 90 days
  • Recommendation data: Don't retain indefinitely; refresh regularly
  • Chatbot logs: Delete personal conversations after 30 days unless patron agrees to long-term retention for research
  • Analytics: Use privacy-respecting analytics (e.g., Plausible, Fathom) that don't retain individual-level data
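A retention schedule only works if something enforces it. A minimal sketch of a purge check, assuming the illustrative periods above and a simple tuple-based record log:

```python
from datetime import date

# Retention periods in days, mirroring the schedule above (illustrative).
RETENTION = {
    "circulation": 365 * 7,
    "browsing_history": 90,
    "search_logs": 90,
    "chatbot_logs": 30,
}

def expired(records, today):
    """Return IDs of records past their category's retention period.

    Each record is (record_id, category, created_on: date). Categories
    not in RETENTION are assumed to be governed elsewhere and kept.
    """
    out = []
    for record_id, category, created_on in records:
        limit = RETENTION.get(category)
        if limit is not None and (today - created_on).days > limit:
            out.append(record_id)
    return out
```

A nightly job that deletes whatever `expired()` returns turns the policy's deletion timelines into something you can actually demonstrate to a regulator.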

Section 5: Patron Rights

The Colorado AI Act, the EU GDPR, and emerging state laws give patrons certain rights. Your privacy policy should address these clearly.

Required Elements

Example language: "You have the right to: (1) Know what data we collect about you, (2) Request a copy of your data in a portable format, (3) Opt out of certain data uses (e.g., personalized recommendations), (4) Request correction or deletion of your data, (5) Know when AI is involved in a decision affecting you, (6) Request human review of any AI-generated recommendation. To exercise these rights, contact [privacy officer] at [email]. We will respond within 30 days."

This tells patrons they have agency. And it requires you to have a process for handling data requests. You'll need to actually implement this, so make sure your IT infrastructure supports it.
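If the policy promises a 30-day response, your process needs to track deadlines. A minimal sketch, assuming a simple tuple-based request log (the field names are hypothetical):

```python
from datetime import date, timedelta

RESPONSE_WINDOW_DAYS = 30  # the deadline promised in the policy

def response_due(received: date) -> date:
    """Latest date a patron data request may be answered."""
    return received + timedelta(days=RESPONSE_WINDOW_DAYS)

def overdue(requests, today: date):
    """IDs of open requests past the response window.

    Each request is (request_id, received_on: date, closed: bool).
    """
    return [
        rid for rid, received_on, closed in requests
        if not closed and today > response_due(received_on)
    ]
```

Even a spreadsheet can implement this logic; the point is that "we will respond within 30 days" becomes a tracked deadline rather than an aspiration.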

Section 6: Security and Breach Response

Don't be vague here. Patrons deserve to know what happens if their data is breached.

Required Elements

Example language: "We implement industry-standard security measures including encryption, firewalls, and access controls. In the event of a data breach affecting your personal information, we will notify you within [X days] and provide guidance on steps you can take to protect yourself. We maintain cyber liability insurance covering data breach response and recovery."

Note the specific timeline. Many states now require notification within 30 days. Put a number in your policy so you're accountable.

Specific Commitments

  • Encrypt sensitive data at rest and in transit
  • Conduct regular security assessments (at least annually)
  • Maintain incident response plan
  • Notify affected patrons within 30 days of discovering a breach
  • Provide free credit monitoring if SSN or financial data is exposed

Section 7: Third-Party Analytics and Cookies

If you track patron behavior on your website, disclose it.

Required Elements

Example language: "We use analytics to understand how patrons use our website and improve services. We use [Plausible/Fathom/Google Analytics], which [collects/does not collect] personally identifying information. We do not use third-party cookies for advertising or tracking. You can opt out of analytics by [disabling JavaScript/using browser settings/contacting us]."

If you're using Google Analytics, be honest about it. Google collects more data than Plausible or Fathom. If you're using Plausible (privacy-first), say so. Transparency builds trust.
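Privacy-respecting analytics is a design choice as much as a vendor choice: whatever events arrive, only aggregates are kept. A small hypothetical sketch of that approach:

```python
from collections import Counter

def daily_pageview_counts(events):
    """Aggregate raw page-view events into per-day, per-path counts.

    Each event is (day: str, path: str, ip: str). The IP is read but
    never stored: only aggregate counts survive, so no individual-level
    patron data is retained.
    """
    counts = Counter()
    for day, path, _ip in events:
        counts[(day, path)] += 1
    return dict(counts)
```

If your own systems log page views, aggregating this way (and discarding the raw events) lets you honestly write "we do not retain individual-level analytics data" in the policy.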

Section 8: Policy Updates and Contact

This is boilerplate, but important.

Required Elements

Example language: "We may update this privacy policy to reflect changes in technology, law, or library practices. We will notify patrons of material changes. This policy was last updated [date]. For questions about our privacy practices, contact [Privacy Officer] at [email] or visit the library in person. You may also submit a complaint to the [State Attorney General]."

Include a specific contact person. That person becomes your privacy champion and ensures accountability.

Real-World Example: What Libraries Are Missing

I reviewed privacy policies from 15 public libraries. Here's what they were missing:

  • 12/15 (80%) did not mention AI tools or automated decision-making
  • 13/15 (87%) did not disclose vendor data usage rights
  • 10/15 (67%) did not specify data deletion timelines
  • 14/15 (93%) did not mention patron rights to access or delete data
  • 11/15 (73%) did not address data breach response procedures

This means most libraries are operating without a documented privacy framework for modern data practices. That's a regulatory risk.

Recent Legislation You Need to Know

Colorado AI Act (Effective Feb 2026)

Requires transparency about AI systems that affect consumers. If you're using AI for patron recommendations or search ranking, disclosure is required. Your privacy policy needs to specifically address AI practices.

Illinois BIPA (Biometric Information Privacy Act)

If your library collects any biometric data (facial recognition for staff, fingerprints for card registration), you need explicit consent and a detailed policy. Many libraries haven't updated for this.

California Consumer Privacy Act (CCPA)

If you serve California patrons (even remotely), they have rights to know, access, and delete their data. Your policy must support this.

Emerging State Privacy Laws

Virginia, Montana, Utah, and others have passed data privacy laws. Federal legislation is coming. Build your policy to be future-proof by including broad disclosure and patron rights sections.

How to Actually Update Your Policy

Step 1: Audit your systems. What data do you actually collect? Where does it go? Which vendors touch it? Which systems use AI? Document this first.

Step 2: Review vendor contracts. What data usage rights have you granted? Are vendors training AI on your data? Are you even allowed to tell patrons? Get clarity before writing your policy.

Step 3: Consult legal counsel. Your library's attorney should review the policy before publication. Privacy law is evolving rapidly. Get professional advice.

Step 4: Create a privacy officer role. Someone needs to own data governance. This person reviews vendor contracts, handles patron requests, manages breach response. It doesn't need to be full-time, but it needs to be assigned.

Step 5: Communicate changes to patrons. Don't just post the new policy. Send an announcement explaining what's changed and why. Make it about transparency and trust, not compliance theater.

Step 6: Update systems to support the policy. If your policy says patrons can request data deletion, your ILS needs to support it. If you promise response within 30 days, you need a process. Don't promise more than you can deliver.

What Privacy-Conscious Libraries Are Doing

The good ones have:

  • Explicit AI disclosures: Clear language about what AI systems are in use and what they do
  • Vendor transparency: List vendors and their data usage practices. Some libraries even get yearly data usage reports from vendors.
  • Patron rights sections: Dedicated language explaining how to access, delete, or object to data usage
  • Privacy-first tools: Using Plausible or Fathom for analytics instead of Google Analytics. Using privacy-respecting email services.
  • Retention limits: Aggressive deletion of unnecessary data. Circulation history kept 7 years, browsing history deleted in 90 days.
  • Regular audits: Reviewing systems annually to ensure compliance with their own policy

The Real Risk: Regulatory Scrutiny

State attorneys general are starting to investigate library practices. When they do, they'll ask:

  • "What's your documented privacy policy?"
  • "How does it address AI tools?"
  • "What patron rights do you disclose?"
  • "How do your vendors use data?"
  • "Can patrons request deletion?"

If you don't have clear answers for all of these, you're exposed.

But if your privacy policy is comprehensive, transparent, and actually implemented? You're protected. You can show regulators: "Here's what we promised patrons, and here's how we're delivering."


Action item for this month: Audit your current privacy policy. Does it mention AI? Does it address vendor data usage? Does it explain patron rights? If the answer to any of these is "no," schedule a meeting with your director and IT staff to create an action plan. Target completion within 90 days.

