Case Study: Full Engagement

Little Schitt Creek Regional Library: AI That Serves the Community Without Selling It Out

A small-town library with 8 FTE, 12,000 cardholders, and a server in a closet built privacy-first AI tools that increased circulation 34%, boosted digital resource usage 89%, and didn't cut a single staff position.

Library type: Small regional public library
Service population: 12,000 cardholders
Staff: 8 FTE
Engagement: Site design, AI implementation, staff training
Duration: 18 months

  • +34% circulation increase
  • +89% digital resource usage
  • 3 min average question resolution (down from 12 min)
  • 40+ new programs added
  • 0 staff positions cut
  • 0 patron queries logged

The Problem

In 2023, Little Schitt Creek Regional Library was in crisis, and nobody on staff realized it yet.

Patrons typed something into the catalog, got a wall of confusing results, and left empty-handed. Digital resources (databases, ebook platforms, streaming services) lived in seven different places with seven different logins. Seniors felt technology had left them behind. Parents couldn't find books reflecting their kids without already knowing exactly what to search for.

Staff were drowning. Questions that should have taken 30 seconds took 12 minutes. Reference librarians spent more time troubleshooting passwords than doing actual research. Programming, outreach, community engagement: there just wasn't time.

  • 67% of catalog searches ended without a checkout
  • 12 min average time to resolve a "simple" question
  • 23% of cardholders used any digital resource

They knew AI could help. But the big vendor solutions wanted patron data in the cloud. The "free" consumer tools trained on everything users typed. The enterprise products cost more than their entire technology budget. None of these options aligned with library values.

What We Did

The engagement started with a simple question: what if AI could run entirely on library property, governed by library values?

No cloud. No data leaving the building. No training on patron queries. No black boxes the director couldn't explain to a board member or a curious 10-year-old.

Listened first, built second

Before writing a single line of code, we held listening sessions with the people who actually use the library:

  • Seniors wanted technology that wouldn't make them feel stupid for asking questions
  • Parents wanted books reflecting their families, not just the same old recommendations
  • Small business owners needed research help they couldn't afford from consultants
  • Teens wanted creative tools without feeling surveilled
  • Staff were exhausted, skeptical, and worried about their jobs

Their input shaped everything, from the conversational AI assistant's personality (patient, warm, never condescending) to the recommendation engine that surfaces diverse voices by default.

Made privacy non-negotiable

Every AI tool follows the same rule: your session is wiped when you close the tab. No search logs. No patron profiles. No "improving the model" with reading history. Everything runs on a local server (a Dell PowerEdge R750 in the library's server closet) using open-source models. Patron data never leaves the library network.
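The "wiped when you close the tab" rule can be sketched as an in-memory session store: conversation state lives only in process memory, keyed by a random token, and is deleted the moment the session ends. This is a minimal illustration of the pattern, not the library's actual implementation; the class and method names are hypothetical.

```python
# Sketch of the "wipe on close" pattern (names hypothetical):
# nothing is written to disk, nothing is logged.

import secrets


class EphemeralSessions:
    """In-memory chat sessions with no persistence of any kind."""

    def __init__(self):
        self._sessions = {}  # token -> list of (role, text) turns

    def open(self) -> str:
        # A random, unguessable token identifies the browser tab.
        token = secrets.token_urlsafe(16)
        self._sessions[token] = []
        return token

    def add_turn(self, token: str, role: str, text: str) -> None:
        self._sessions[token].append((role, text))

    def close(self, token: str) -> None:
        # Called when the tab closes: the whole conversation is gone.
        self._sessions.pop(token, None)


sessions = EphemeralSessions()
t = sessions.open()
sessions.add_turn(t, "patron", "Books about Taiwan?")
sessions.close(t)
assert t not in sessions._sessions  # no trace remains
```

Because the store never touches disk and the token is discarded on close, there is simply nothing to subpoena, leak, or "improve the model" with.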

Built transparency into every feature

Every AI tool includes a "How This Works" disclosure in plain language. The library publishes its full AI Evaluation Scorecard (below) so patrons and other libraries can hold them accountable.

Trained staff to think, not just use

Training didn't just cover how to use the tools. It covered how to think about them: what they're good at, what they're bad at, when to trust them, when not to. Every tool includes a reminder that AI can make mistakes ("hallucinations") and that patrons should verify facts before relying on AI-generated content.

What We Built

The Digital Reference Desk includes:

  • Penny: Conversational search assistant (friendly, patient, talks like a neighbor who happens to know everything about the library)
  • Resume Polish: Grammar fixes and stronger action verbs for job seekers
  • The Jargon Buster: Translates medical and legal forms into plain English
  • Formal Emailer: Helps patrons draft professional messages
  • AI Business Consultant: Grant writing and business plan assistance
  • What Should I Read Next?: Recommendations that surface diverse voices by default

All running locally on a single server. Total hardware cost: approximately $12,000 (one-time), less than one year of a typical vendor AI contract.

Results

The numbers tell part of the story. The human impact tells the rest.

"I used to be afraid of the computers. Now I come in three times a week to chat with Penny. She helped me find books about my mother's village in Taiwan, books I didn't even know existed. And she never makes me feel slow."
Eleanor Chen, 74, retired teacher
"My son asked for books with Black astronauts. The old catalog gave us nothing. The recommendation tool gave us six titles, including one by a local author I'd never heard of. We've read them all twice."
Marcus Williams, parent
"I used the AI Business Consultant to write a grant application. I got $15,000 for my food truck. The library didn't charge me anything."
Sarah Nguyen, small business owner
"I was the biggest skeptic on staff. I thought AI would replace us. Instead, it handles the routine questions so I can actually do reference work. Last month I helped a genealogist trace her family back five generations. That's why I became a librarian."
Jamie Torres, Reference Librarian

Staff Impact: Augmentation, Not Replacement

Zero positions were cut. What changed was how staff spent their time.

Before AI: 60% of questions were routine

  • Password resets
  • "Where is the bathroom?"
  • "How do I print?"
  • Catalog navigation

After AI: routine queries dropped to 25%, and staff time shifted to

  • 40+ new community programs
  • Regular outreach visits
  • Deep reference work
  • Better collection development
Staff time freed from routine queries went to coding clubs, a seed library, "tech help, no judgment" sessions for seniors (now with a waitlist), regular visits to the senior center and food bank, genealogy research with the local archivist, and collection development that reflects the actual community.

The AI Evaluation Scorecard

Before any AI feature goes live, it must pass all eight criteria. No exceptions. The library publishes this rubric so patrons can hold them accountable, and other libraries can adapt it for their own use.

Every criterion must be answered "yes" before launch:

  • Privacy: Is the session wiped when the tab closes? Does all patron data stay on the library network?
  • Transparency: Can we explain how this works in plain language?
  • Local control: Does this run on library-owned hardware?
  • Equity: Does this work for patrons with disabilities, low bandwidth, and limited tech literacy?
  • Bias audit: Have we tested for demographic bias? Do results surface diverse voices?
  • Staff impact: Does this augment staff work rather than replace it? Have staff been trained and consulted?
  • Patron benefit: Is the primary beneficiary the patron, not library efficiency metrics?
  • Reversibility: Can we turn this off without losing data or breaking workflows?
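A rubric like this is easy to encode so that a proposed feature can be checked mechanically before launch. The sketch below is a hypothetical encoding (criterion names and function are ours, not the library's); the one rule it enforces is the scorecard's: every criterion must be a yes.

```python
# Hypothetical encoding of the eight-criterion scorecard.

CRITERIA = [
    "privacy", "transparency", "local_control", "equity",
    "bias_audit", "staff_impact", "patron_benefit", "reversibility",
]


def passes_scorecard(answers: dict) -> bool:
    """A feature goes live only if every criterion is answered True.

    A missing criterion counts as a failure, so an incomplete
    review can never slip a feature through.
    """
    return all(answers.get(c, False) for c in CRITERIA)


# A fully local tool passes; a cloud chatbot fails on two criteria.
penny = {c: True for c in CRITERIA}
cloud_chatbot = dict(penny, privacy=False, local_control=False)

assert passes_scorecard(penny)
assert not passes_scorecard(cloud_chatbot)
```

Treating a missing answer as "no" is the design choice that matters: the default is rejection, and a feature earns its way to launch.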

Hear the staff tell it themselves

In the library's first podcast episode, "AI at the Library: One Year Later," Director Patricia Chen, Reference Librarian Jamie Torres, Youth Services Coordinator Marcus Okonkwo, and Circulation Lead Denise Kowalski discuss what changed, what surprised them, and why they didn't lose a single position. Read the full transcript.

Why This Matters

Little Schitt Creek serves 12,000 cardholders in a town most people have only heard of because of that TV show. They don't have a big budget or a dedicated tech department. They have a server in a closet and a staff of 8 FTE who learned as they went.

If they can do this, other libraries can too.

AI is reshaping how communities find and use information. That's happening whether libraries participate or not. The question is: will libraries shape that future, or hand it to companies whose values don't align with ours?

This project proved that:

  • Privacy and AI are not mutually exclusive
  • Local control is possible with modest resources
  • Community input makes the technology better, not worse
  • Staff augmentation beats staff replacement
  • Transparency builds trust

Resources: Steal These for Your Library

Everything below is free to use, adapt, and share. No strings.

Conference Presentation Deck

22-slide outline with speaker notes. Ready to adapt for your own library's story.

View deck

AI Evaluation Scorecard

The 8-criterion rubric used to evaluate every AI feature before launch.

View scorecard

Podcast Transcript

"AI at the Library: One Year Later." Staff perspectives on the full implementation.

Read transcript

Thinking about something similar?

Every library's situation is different. The free resources on this site handle most situations. If you need someone in the room for a specific engagement, that's what consulting is for.

Let's talk