

Technology Strategy & Selection Framework

Systematically evaluate and choose technology that fits your library's actual needs, budget, and capacity. This framework guides you through selection, evaluation, and implementation planning.

Tech Fit Analyzer

Answer 10 quick questions about your library's context and priorities. Get personalized recommendations for technology stack, evaluation criteria, and implementation roadmap.

Complete Guide

Understanding Technology Fit

Technology selection is one of the most consequential decisions your library makes. Not because the technology itself matters most, but because a wrong choice locks you into a multi-year commitment that affects staff capacity, patron access, and your budget. Libraries typically sign 3-5 year contracts for these systems, and breaking a contract early costs $30,000-$60,000 in migration fees alone. This isn't a casual choice.

Here's what goes wrong most often: Libraries buy enterprise systems designed for institutions 10 times their size. A small public library with one IT person buys software that requires a dedicated team to manage. An academic library in a consortium chooses systems built for independent libraries, then struggles when 15 members have different needs. A school library picks the cheapest option, then spends thousands on workarounds because it doesn't integrate with their existing systems.

The core principle is simple: Align technology to your actual needs, staff capacity, and budget, not to vendor marketing or what you think you should want. A small library with one part-time IT person shouldn't even consider tools requiring ongoing customization. A consortium should never adopt a system that doesn't accommodate different member configurations. A library serving 40% non-English speakers shouldn't compromise on language access to save $15,000 upfront.

Real example: A 12-branch suburban library system spent $45,000 per year for 6 years on a proprietary ILS ($270,000 total). The system was over-featured for their needs, required expensive vendor support for basic changes, and locked them into vendor-specific data formats. Meanwhile, a similar-sized library 50 miles away chose Koha (open-source): $40,000 upfront implementation + $5,000/year support = $55,000 by year 3. When the first library finally upgraded after 6 years of frustration, switching costs were $50,000. The second library had choices; the first didn't.

This framework helps you evaluate technology through three critical lenses: (1) Does it fit what you actually do? Not what you think you should do, not what other libraries do, but what you do. (2) Can your staff manage it? Staff capacity matters more than system sophistication. (3) Does it align with your equity commitments? Will implementation require cutting services in neighborhoods that depend on you most?

The guide below walks you through these decisions step-by-step, with real numbers, concrete examples, and decision frameworks you can use immediately.

The Technology Selection Process

Technology selection isn't something you do once and never think about again. It's a disciplined process with 5 distinct steps. Taking time with each step prevents lock-in and regret later.

Step 1: Assess Your Current State

Before evaluating new technology, understand where you are now. Are you running legacy systems from 2010? A mixture of old and new tools held together with workarounds? Modern cloud-based systems? Your current state determines what you can realistically move toward.

Key questions: What are your current systems? What works well? What causes staff frustration? What integrations exist now that you depend on? How much technical debt exists? If your staff is spending 40% of their time on system maintenance and workarounds, you have a problem technology can fix. If they're spending 40% of their time on features your patrons don't use, new technology won't help; you need to simplify your current system first.

Step 2: Define Your Requirements

Split requirements into two categories: Functional (what the system must do) and Technical (how the system must operate).

Functional requirements: What processes must this system support? What patron-facing features are non-negotiable? What staff workflows are critical? Example functional requirements: "We must process renewals for cardholders with late fees" or "Spanish-language discovery must be as good as English discovery" or "We need mobile access to account information."

Technical requirements: How reliable must it be? How must it integrate with other systems? What security standards must it meet? Who will manage it? Example technical requirements: "99.5% uptime SLA" or "Must export data in MARC format" or "Must work with our Apache server and MySQL database" or "Must be manageable by one part-time IT person."

Step 3: Evaluate Options Systematically

Don't evaluate on gut feeling or vendor demos. Use an evaluation matrix with the 8 dimensions covered under Key Selection Criteria below. Create a scoring system, weight the criteria by importance to your library, and compare options apples-to-apples. Real example: An 8-library consortium spent 3 months evaluating 4 open-source ILS options. They created a 15-item evaluation matrix weighted by importance to their members. Evergreen scored highest on governance flexibility (critical for consortia), but required more implementation effort. Koha scored highest on ease of use. The decision: Evergreen, with first-year implementation support factored into cost.
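The weighted comparison described above can be sketched in a few lines of code. This is a minimal illustration, not a tool from this guide: the criteria names, weights, and 1-5 scores below are hypothetical stand-ins for your own evaluation matrix.

```python
# Weighted evaluation matrix sketch. Criteria, weights, and scores are
# illustrative placeholders; substitute your own dimensions and options.

def weighted_score(scores, weights):
    """Weighted total for one option, given 1-5 scores per criterion."""
    return sum(scores[criterion] * weight for criterion, weight in weights.items())

# Weights reflect importance to *your* library and should sum to 1.0.
weights = {
    "governance_flexibility": 0.40,  # weighted heavily for a consortium
    "ease_of_use": 0.20,
    "total_cost": 0.20,
    "integration": 0.20,
}

# Hypothetical 1-5 scores for two candidate systems.
options = {
    "Evergreen": {"governance_flexibility": 5, "ease_of_use": 3, "total_cost": 4, "integration": 4},
    "Koha":      {"governance_flexibility": 3, "ease_of_use": 5, "total_cost": 5, "integration": 4},
}

ranked = sorted(options, key=lambda name: weighted_score(options[name], weights), reverse=True)
for name in ranked:
    print(f"{name}: {weighted_score(options[name], weights):.2f}")
```

With these hypothetical weights, governance flexibility dominates and Evergreen comes out ahead; shift the weights toward ease of use and Koha wins instead, which is exactly the tradeoff the matrix is meant to surface.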

Step 4: Plan Implementation Realistically

This is where most libraries fail. They assume implementation takes 3 months. It takes 6-12 months. They assume staff can keep doing their current jobs while migrating to new systems. They can't. They underestimate training time and complexity.

Build a realistic implementation plan with these phases:

  • Planning (Months 1-2): Data migration planning, staff assignment to implementation team, training plan, timeline communication to stakeholders
  • Procurement (Months 2-3): Finalize contracts, negotiate SLAs, sign agreements, establish vendor relationship
  • Setup (Months 3-6): System configuration, data migration, testing, parallel running with old system, bug fixes
  • Training (Months 5-8): Staff training (assume 20 hours per person), patron communication, documentation creation
  • Launch (Months 8-10): Cutover to new system, intensive vendor support, troubleshooting, staff support team

Real example: A 12-library consortium agreed to a 9-month ILS migration with this timeline. They assigned one full-time staff person from each library to the implementation team (12 people). The team met weekly, coordinated data migration, created training materials, and tested the system with live data. On launch day, they had vendor support and staff ready for the inevitable problems. The parallel running (running both systems simultaneously for 2 months) cost extra but caught problems before they affected patrons. Was it expensive? Yes. Did it work? Yes. They've now had stable systems for 4 years with 8.2/10 member satisfaction.

Step 5: Manage the Transition

The first 6 months after launch are chaotic. Staff are learning new systems while supporting patrons. Bugs emerge that weren't caught in testing. Vendor support is critical. Your role is managing change, not disappearing.

Key tactics: Create a dedicated support team (email address, phone line) for staff questions. Plan daily standup meetings during first month. Document problems systematically so you know what to escalate to vendor. Celebrate wins publicly and normalize the chaos. After 6 months, do a retrospective: What worked? What would you change? This becomes your input for the next technology choice.

Key Selection Criteria

Evaluate technology across 8 interconnected dimensions. You won't score perfectly on all dimensions; no system does. But you'll understand the tradeoffs you're making.

1. Cost (Total Cost of Ownership, Not Just License Fees)

Compare 3-year costs, not annual costs. Upfront implementation often exceeds licensing. A library paid $40,000/year for a proprietary ILS. Over 6 years: $240,000 in licensing + $30,000 implementation + $20,000 training = $290,000. But when they switched, exit costs were $50,000 (data migration plus contract penalties). Meanwhile, another library chose open-source Koha: $40,000 upfront + $5,000/year support = $55,000 by year 3. By year 6, they had spent $70,000 and could switch anytime without penalties.

Hidden costs to include: Implementation and setup (often $10,000-$50,000), data migration, staff training (assume 15-20 hours per person), vendor support tiers, hardware/infrastructure upgrades, integration with existing systems, annual maintenance above base license.
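The comparison above is simple arithmetic, and putting it in a small function makes it easy to rerun with your own numbers. A minimal sketch using the figures from the example in this section:

```python
# Total cost of ownership over a contract horizon. Figures below come
# from the example in the text; exit cost counts only if you expect to
# switch systems at the end of the horizon.

def total_cost_of_ownership(upfront, annual_license, years, training=0, exit_cost=0):
    """Cumulative cost: implementation + recurring fees + training + exit fees."""
    return upfront + annual_license * years + training + exit_cost

# Proprietary ILS: $30k implementation, $40k/year license, $20k training,
# $50k exit costs when switching after 6 years.
proprietary = total_cost_of_ownership(30_000, 40_000, 6, training=20_000, exit_cost=50_000)

# Open-source Koha: $40k implementation, $5k/year support, no exit penalty.
koha_6yr = total_cost_of_ownership(40_000, 5_000, 6)
```

Running this with the text's figures puts the proprietary path at $340,000 all-in (the $290,000 above plus $50,000 exit costs) against $70,000 for the open-source path over the same 6 years.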

2. Ease of Use and Training Investment

This matters more than you think. Staff capability is your constraint, not system sophistication. Complex systems require ongoing training, create workarounds, and introduce errors. Simple systems require less training investment and fewer mistakes. If you have one part-time IT person, don't choose systems that require deep technical expertise.

Questions to ask: How many hours of training before staff are confident? How many hours after (refresher training, new feature training)? What does vendor documentation look like? Can staff intuitively figure things out, or do they need hand-holding? How much of the complexity is actually necessary vs. feature bloat?

3. Integration with Existing Systems

You're not replacing everything. You have existing payment systems, website platforms, discovery interfaces, and e-resource management systems. New technology must integrate with what's already there. If it doesn't, you're creating manual workarounds (where data gets entered twice, with errors) or separate systems (where patrons experience disconnected services).

Integration questions: Does it have APIs? Are they documented? What systems commonly integrate (ILS → discovery, ILS → payment, discovery → website)? Which integrations cost extra (many vendors charge for custom API work)? Can you extract your data in standard formats (MARC, XML), or are you locked into vendor-specific formats?

4. Vendor Stability and Support Quality

You're not just buying software; you're establishing a 3-5 year relationship. Vendor stability matters. If they go out of business, you're stuck with unsupported software. Support quality matters. When your system breaks, how quickly do they respond? What's the support response time for critical issues? Do they honor SLAs or just make promises?

Stability signals: Company financials (are they profitable or bleeding money?), development team stability (do they lose key people?), market position (growing or declining?), references from similar libraries (what do current customers say?). Prefer vendors with 5+ year track records in library space.

5. Accessibility and Inclusivity

Technology that excludes patrons contradicts library missions. Evaluate both staff and patron accessibility. WCAG 2.1 AA compliance matters. Can blind patrons use screen readers? Can patrons enlarge fonts or adjust colors? For libraries serving non-English speakers: Is language support built-in or machine-translated? Are the search algorithms accurate in each language? Is help documentation translated?

6. Security and Compliance

Who has access to patron data? How is it encrypted? How quickly does vendor notify you of breaches? Can you request complete deletion of patron data? What are the data retention policies? Who can vendor share data with? Does the contract limit how data is used? For AI features: Can vendor use your patron data to train AI without explicit opt-in?

7. Scalability for Growth

Will this system work if your library grows? If you add branches? If you join a consortium? Systems that work beautifully for one library sometimes break under consortium complexity. Systems that handle one million transactions/year sometimes struggle with two million.

8. Exit Strategy and Data Portability

Assume you'll eventually switch. How will you get your data out? Can you export in standard formats? What will migration cost? How many days will vendor give you to extract data? Does contract lock you in with massive penalties for early termination? Good contracts allow 30-day data export, permit termination for convenience (with notice), and charge minimal exit fees.

Consortium vs. Independent Technology

Consortium benefits: Cost sharing (single system for 10 libraries is cheaper per library than 10 separate systems), standardization (shared cataloging, patron mobility, reduced redundancy), leverage with vendors (10 libraries negotiate better terms than 1), centralized support (shared IT staff vs. hiring separately).

Consortium costs: Governance complexity (satisfying 10 different needs), slower decisions (consensus takes time), limited customization (your unique needs aren't accommodated), dependency on other members (if one member exits, finances shift to others), less control (decision made by group, not you).

When consortium makes sense: 10+ member libraries with similar needs, members willing to standardize processes, clear governance structure in place, technology with strong multi-tenant support (can handle different configurations). Real example: An 8-library Evergreen consortium spent 6 months establishing governance before buying anything. They wrote a decision matrix (voting rules, cost allocation, member exit procedures), negotiated rates across the 8 libraries (saving 25% vs. individual pricing), and implemented with clear authority. After 4 years: 8.2/10 member satisfaction. The success came from governance clarity, not from the technology itself.

When independent approach is better: Library with specialized needs, small library unable to influence large consortium decisions, library needing rapid change/customization, budget constraints requiring lowest-cost option. Independent systems let you choose exactly what you need without consortium compromises.

Governance is the critical success factor. Write down: Who decides major changes? What voting structure applies? How are costs shared (equal split, per-capita, usage-based)? How can members exit (and what are exit costs)? How are conflicts resolved? Libraries that succeed in consortia have clear written governance; libraries that fail treat governance as informal and hit conflict later.
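The three cost-sharing models named above can be sketched directly, which helps a governance committee see what each model means in dollars. The member libraries, populations, and amounts below are hypothetical examples, not figures from this guide:

```python
# Three cost-allocation models for a shared annual system cost.
# Member names and all figures are hypothetical placeholders.

def equal_split(total, members):
    """Each member pays the same share regardless of size."""
    return {m: total / len(members) for m in members}

def per_capita(total, population):
    """Shares proportional to each member's service-area population."""
    residents = sum(population.values())
    return {m: total * pop / residents for m, pop in population.items()}

def usage_based(total, checkouts):
    """Shares proportional to each member's annual circulation."""
    volume = sum(checkouts.values())
    return {m: total * c / volume for m, c in checkouts.items()}

# A hypothetical 3-member consortium splitting a $100,000 annual cost.
population = {"Maple": 12_000, "Cedar": 48_000, "Birch": 20_000}
shares = per_capita(100_000, population)  # Cedar's 60% of residents -> 60% of cost
```

The same totals produce very different member bills under each model, which is why the allocation formula belongs in the written governance agreement, not in a handshake.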

Resource Constraints & Realistic Planning

Be honest about staff capacity. What can one IT person actually do? Research shows the answer is: Keep systems running, perform basic maintenance, provide user support, respond to emergencies. They cannot: Redesign systems, manage complex integrations, conduct security audits, manage third-party relationships, build custom features, handle 24/7 support. If you have one part-time IT person and you're trying to implement three new systems, you have a staff problem, not a technology problem.

Real example: Iowa library with one IT director decided between proprietary ILS ($270,000 over 6 years) and open-source Koha ($55,000). The director's time was the real constraint. Open-source had a steeper learning curve initially (40 hours training) but predictable ongoing needs (4 hours/month). Proprietary required vendor support calls whenever anything changed (8 hours/month on average, unscheduled). They chose Koha because they could predict when they needed to work on it, rather than being reactive to vendor limitations.

Technology budgeting with real constraints: If you have $15,000/year to spend on technology, here's a realistic allocation: 60% licensing/vendor support ($9,000), 20% infrastructure/upgrades ($3,000), 15% training ($2,250), 5% reserves ($750). This means you can afford licensing on a few systems, not many. Be selective. Don't accumulate systems that compete with each other.
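The allocation above reduces to a fixed percentage split. A minimal sketch, with the percentages taken from the example and the budget figure adjustable to your own situation:

```python
# Annual technology budget split. Percentages follow the example in the
# text; adjust the shares (and keep them summing to 1.0) for your policy.

ALLOCATION = {
    "licensing_support": 0.60,
    "infrastructure": 0.20,
    "training": 0.15,
    "reserves": 0.05,
}

def allocate(budget):
    """Split an annual technology budget across the four lines above."""
    return {line: budget * share for line, share in ALLOCATION.items()}

plan = allocate(15_000)  # licensing 9,000; infrastructure 3,000; training 2,250; reserves 750
```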

Staffing as constraint: Before buying new technology, ask: Is the problem the technology or the staffing? If cataloging is backed up, is it because your system is slow or because you need another cataloger? If user support requests are overwhelming, is it because the interface is bad or because you don't have enough reference staff? Technology can help (a better interface might reduce user support calls by 20%), but it won't solve staffing shortages. Know the difference.

When to build internally vs. buy: Small libraries often buy too much complexity when simple custom tools would work. You don't need a $50,000 analytics system if your director needs a monthly report. A spreadsheet with automated data pulls might work fine. Conversely, don't try to build what vendors already do well. You're not a software company. Pick technology for your core business (serving patrons), build only what's unique to your library.

Outsourcing decisions: When is hiring a consultant worth it? Complex migrations (vendor says 3 months, you should plan 6), security audits (is your system actually secure?), major implementations (first time doing something complex), technology strategy (what should we be buying over next 5 years?). Consultant costs ($5,000-$15,000) often save time compared to staff learning through trial-and-error.

When to Hire a Consultant

Consultants add value when: You're solving a problem your staff has never solved before, timeline pressure means you can't afford a learning curve, stakes are high (a wrong decision locks you in for 5 years), complexity exceeds internal expertise, or you need external credibility (the board wants independent verification).

Real example of consultant ROI: A public library wanted to understand why their system was slow. The director spent 3 months learning to write database queries, trying to diagnose the problem. Finally they hired a consultant ($8,000, 2 weeks). The consultant found misaligned indexes, bad query design, and a hardware mismatch, then gave recommendations worth $30,000 in system improvements. The director's 3 months of learning accomplished nothing; the consultant's 2 weeks solved it completely.

When to DIY instead: System monitoring (you can learn this), basic troubleshooting (common problems with known solutions), user training (you know your staff and patrons), documentation creation (tedious but straightforward).

Common consultant mistakes: Hiring someone for general advice when you need specific expertise (get a librarian technology consultant, not a generic IT consultant). Keeping a consultant on after the problem is solved because they seem to make good decisions (they don't know your mission; they're outside it). Expecting a consultant to fix organizational problems with technology (consultants can recommend technology; they can't fix broken governance).

The Staffing vs. Technology Decision Framework: When you have a problem, ask three questions:

  1. Is this a staff capacity problem or a technology problem? Cataloging backlog: probably staffing (hire more catalogers). Discovery is too slow: probably technology. User requests unanswered: probably staffing.
  2. What's the cost of the solution? Hire one more cataloger: $50,000/year salary + benefits. Buy new system: $40,000 upfront + $8,000/year. Consulting study: $10,000 one-time. Calculate real cost, not just headline cost.
  3. What's the impact of NOT solving it? Patron wait time? Staff morale? Service quality? Revenue? Calculate the cost of the problem, not just the cost of solutions. Sometimes the cheapest solution is inaction.
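Question 2 above is a cost comparison over a planning horizon. A minimal sketch using the example figures from the list; the 3-year horizon and the "do nothing" estimate are hypothetical assumptions added for illustration:

```python
# Compare real costs of each option over a planning horizon, per
# question 2 above. Figures follow the examples in the text; the
# "do_nothing" line is a hypothetical estimate of what the unsolved
# problem costs per year (per question 3).

def cost_over(years, upfront=0, annual=0):
    """Total cost of an option over the planning horizon."""
    return upfront + annual * years

HORIZON = 3  # years; pick your own planning window

options = {
    "hire_cataloger": cost_over(HORIZON, annual=50_000),             # salary only; add benefits
    "new_system": cost_over(HORIZON, upfront=40_000, annual=8_000),
    "consulting_study": cost_over(HORIZON, upfront=10_000),          # one-time
    "do_nothing": cost_over(HORIZON, annual=20_000),                 # hypothetical problem cost
}
cheapest = min(options, key=options.get)  # lowest cost is not always the right answer
```

Note that the cheapest line item (a one-time study) solves a different problem than the others; the point of the arithmetic is to make the real costs, including the cost of inaction, visible side by side.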

The consultant question checklist:

  • Is this a new problem for us (consultant needed) or similar to problems we've solved (DIY)?
  • Do we have time to learn, or do we need results fast?
  • Are stakes high enough to justify consulting cost?
  • Do we need external credibility (consultant carries weight in board conversation)?
  • Will consultant teach us or just solve this problem (prefer the former)?

Ready to start? Use the Tech Fit Analyzer above to answer 10 questions about your library's context. You'll get personalized recommendations for technology stack, evaluation criteria, and implementation roadmap based on your actual situation. No generic advice, just what makes sense for your library.

Download Templates

Get started with our four downloadable templates designed to support your technology selection and implementation process.

Technology Evaluation Matrix

Compare and score multiple technology solutions side-by-side with our pre-built evaluation matrix. Includes pre-filled examples, scoring guidance, and cost analysis.

  • Pre-filled technology examples
  • 1-5 scoring guidance for each dimension
  • Cost and complexity analysis
  • Weighted scoring (optional)
Open in Google Sheets

Tip: Click "Make a copy" to create your own version you can edit and customize for your library.

Technology Selection RFI/RFP Template

Send a structured request to vendors with evaluation criteria built-in. Ensure you get comparable responses and can score proposals consistently.

  • Project overview and timeline
  • 15+ functional requirements
  • Technical specifications section
  • 100-point evaluation scoring matrix
Open in Google Docs

Tip: Customize the requirements section based on your library's specific needs before sending to vendors.

Implementation Timeline

Plan your technology implementation from selection through launch. Includes 5 phases, detailed task lists, and timelines for quick vs. complex implementations.

  • 5 phases: Planning, Procurement, Setup, Training, Launch
  • Detailed task breakdowns with duration estimates
  • Risk management section
  • Sample timelines (3 months to 12+ months)
Open in Google Docs

Tip: Adjust phase durations based on your library's size, complexity, and IT capacity.

Technology Evaluation Checklist

Systematically assess vendors with a 30-item evaluation checklist organized by key categories.

  • 30 evaluation items across 6 categories
  • Cost, functionality, support, integration, security, implementation
  • Green/yellow/red scoring system
  • Customization guidance by library type
Open in Google Docs

Tip: Use this checklist for each vendor you're seriously considering. Weight items based on your library's priorities.

Next Steps

This framework works best alongside our other decision tools:
