Last updated: 2026-02-04

Educational information only. Not legal advice. Consult counsel for contracts, compliance, or liability.

1) Summary

Write this last. Keep it readable. If it turns into a sales pitch, you lost.

Fill in

Decision requested: Approve (or decline) a pilot of [tool name] for [use case].

Why now: [One sentence. Reference a real problem: backlog, access, consistency, staff time.]

Patron impact: [What gets better for patrons. Not "innovation".]

Staff impact: [What changes for staff. Include training and failure-handling.]

Risk posture: [Low/medium/high] with [top 3 risks] and mitigations documented below.

Exit plan: [How you turn it off and walk away cleanly.]

2) Need and scope

Define what the tool is for. Also define what it is not for. Most scope creep starts the day after approval.

  • Use case: [Example: "patron self-service for basic directional questions"]
  • Out of scope: [Example: "legal advice, medical advice, HR decisions, discipline"]
  • Success metrics: [Pick 3. Example: time-to-answer, accessibility, patron satisfaction]
  • Failure definition: [What outcome means we stop the pilot]
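To make the failure definition operational rather than aspirational, the metrics and stop condition can be written as a simple check against agreed thresholds. This is a minimal sketch; the metric names and threshold numbers below are made-up placeholders, not recommendations:

```python
# Hypothetical pilot metrics vs. agreed stop thresholds.
# Names and numbers are placeholders -- substitute your own.
THRESHOLDS = {
    "avg_time_to_answer_sec": 120,   # stop if slower than this
    "patron_satisfaction_pct": 70,   # stop if lower than this
    "escalation_rate_pct": 25,       # stop if higher than this
}

def should_stop_pilot(metrics: dict) -> bool:
    """Return True if any measured metric crosses its stop threshold."""
    return (
        metrics["avg_time_to_answer_sec"] > THRESHOLDS["avg_time_to_answer_sec"]
        or metrics["patron_satisfaction_pct"] < THRESHOLDS["patron_satisfaction_pct"]
        or metrics["escalation_rate_pct"] > THRESHOLDS["escalation_rate_pct"]
    )

# Example weekly review: satisfaction and speed are fine,
# but escalations exceed the agreed limit, so the check fires.
week_3 = {
    "avg_time_to_answer_sec": 95,
    "patron_satisfaction_pct": 82,
    "escalation_rate_pct": 31,
}
print(should_stop_pilot(week_3))  # True
```

The point is that "failure" gets decided before the pilot starts, in writing, so nobody re-litigates the thresholds after the fact.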

Good scope

One workflow. One audience. One month. Measurable outcomes.

  • Limited hours or limited branch
  • Clear content boundaries
  • Human fallback

Bad scope

"We'll roll it out everywhere and see what happens."

  • No owner
  • No measurement
  • No stop button

3) Data and privacy

If you cannot describe the data flow in plain language, you do not understand the system. That is not a moral failing. It is a procurement problem.

Item            What we will do                                                Pass condition
Inputs          [What patrons/staff type, upload, or select]                   Written down
Storage         [What gets stored, for how long, where]                        Minimal retention
Training        [Whether vendor uses inputs for training or improvement]       No training on patron data
Subprocessors   [List any third parties. "We don't know" is not acceptable]    List provided

If this tool touches protected categories, minors, or anything that looks like surveillance, slow down and get counsel involved early.

4) Risk and mitigations

Document risks like you expect to be asked about them later. Because you will.

Hallucinations

Wrong answers with confident delivery.

  • Use-case limits
  • Sources/disclosures
  • Human fallback

Privacy spill

Data stored, shared, or trained without consent.

  • Retention limits
  • No-training clause
  • Audit rights

Vendor lock-in

You cannot leave without breaking everything.

  • Export format
  • Exit timeline
  • End-of-term support

5) Evaluation criteria

Use the AI Scorecard as a baseline. If a vendor cannot meet the basics, no amount of marketing makes that okay.

  • Privacy: session handling and training use are documented
  • Transparency: plain-language disclosure exists and is posted
  • Accessibility: testing results or statement provided
  • Reversibility: exit plan is real, not aspirational

6) Cost and surprises

Write down the pricing model and the list of "not included" items. Expensive surprises usually show up as implementation, support, usage tiers, or add-ons.

Fill in

Year 1 cost: $[ ] (licenses + implementation + training + optional add-ons)

Ongoing cost: $[ ] (renewal + support + usage)

Known extras: [SSO, SMS, analytics, custom integrations, API access]

Price change triggers: [users, usage, branches, feature flags]
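As a sanity check on the fill-ins above, the year-1 and ongoing totals can be summed from the individual line items so nothing quietly falls out of the budget. All line items and dollar amounts here are made-up placeholders; substitute the vendor's actual quote:

```python
# Placeholder numbers -- replace with the real quote.
year_1 = {
    "licenses": 6000,
    "implementation": 2500,
    "training": 1000,
    "add_ons": 1200,   # e.g. SSO, SMS, analytics
}
ongoing = {
    "renewal": 6000,
    "support": 800,
    "usage": 1500,
}

year_1_total = sum(year_1.values())
ongoing_total = sum(ongoing.values())
print(f"Year 1: ${year_1_total:,}")    # Year 1: $10,700
print(f"Ongoing: ${ongoing_total:,}")  # Ongoing: $8,300
```

Listing every item explicitly, rather than accepting a single bundled figure, makes the "known extras" and price change triggers visible when renewal time comes.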

7) Implementation and support

  • Owner: [name/role] is accountable for the pilot
  • Support: vendor support terms and escalation path are written down
  • Staff training: [what, who, when]
  • Patron comms: disclosure text, signage, web copy
  • Fallback: what happens when it fails

8) Exit plan

If you cannot leave cleanly, you do not have a pilot. You have a slow-motion adoption.

  • Export: What you can export and in what formats
  • Timeline: How long you have to get data out after termination
  • Deletion: What gets deleted and how you verify deletion
  • Fallback: What replaces it on day one after shutdown

Appendix: Vendor questions

Paste these into an email. Require answers in writing.

  1. What data do you collect? What do you store? For how long?
  2. Do you use any library data or patron inputs for training or product improvement?
  3. List all subprocessors and where data is processed.
  4. What are the accessibility standards you meet, and what testing has been done?
  5. What is the pricing model and what is not included?
  6. What are the exit terms: export, deletion, and post-termination access?