Evaluating new technology is often a daunting task. Break it down into its component parts to determine if it fulfills your needs. Always keep in mind how the new technology supports your mission, improves the services you provide, and enhances access to your collections and curated information.
All of us confront changes in hardware, software, and technology in our professional and personal lives, so let’s approach the changes head on and ask pertinent questions as we go along. Begin by asking, “Do the changes fit your needs, institution, department, and researchers?” This is where evaluation intersects with the mission statement. Evaluating changes in technology and software is particularly important to prevent someone else from making decisions without input from the information department. The end result of evaluation, and by extension of presenting your review and recommendation, is to determine how the technology helps you, as information professionals, and your researchers.
Approaching the new
Whether your first response is a sigh, a grimace, or a smile, after taking a deep breath, embrace the task and begin to evaluate by asking, “What does the change affect?” In the case of software and databases, does it affect the frontend, backend, or both? In the case of technology as a whole, how does the new interface affect the way you function and provide services?
When we approach new software programs and new databases or websites, we must evaluate them based on the needs of three user groups: ourselves (the information professionals), the users and researchers, and the staff members who input data. Evaluate the functionality of the different interfaces for the software or database. In many cases, the three user groups will interact with very different interfaces.
Three types of interfaces
- The backend is the interface through which the data is input. This interface includes all the data fields, menus, and cataloging components. Evaluate ease of inputting data, adaptability to local needs, and reporting functions, among other aspects of the software.
- The in-house interface used by the reference staff and information professionals should be evaluated on basic and advanced search features, including ability to modify, narrow, save, and update search results. Reporting functions include options for search result outputs, including printing, email, and sharing links with internal and external researchers.
- The frontend used by our researchers should make access to relevant data easy. Evaluate the search and retrieval functions in the various frontend interfaces. Pay careful attention to searching and retrieval functions when using in-house, web-based, mobile, and app versions of the website and databases. They tend to display and function very differently.
Evaluating features and functionality
As we begin to evaluate usability and flexibility, we need to test all the interfaces and the functionality of each. Throughout this process, we must constantly consider how the technology fits within our mission statement and serves the needs of our various user groups, particularly when the software or database has undergone a major upgrade.
While evaluating the technology from the perspective of all three user groups, use the same queries in each of the interfaces. Do you get the same results? Do the results display differently? Can you modify the search strategy or terms to retrieve more targeted or precise results?
Take a few minutes to evaluate the Help features. Important aspects include whether the instructions make sense, whether there are examples of how to input queries, and whether there are recommendations for narrowing your search.
Consider how the changes make it easier, better, faster, and more efficient to access the collections and data. With luck and a little practice, the new technology will allow the end user to research more efficiently and will include new features you always wished you had. In the end, does the improvement increase productivity and profitability?
Whose needs are more important?
When we evaluate technology based upon our mission, our first priority and most important user group is the researcher or end user. Consider how the end user’s access to information is facilitated by the new technology. What functionality is enhanced or limited by the changes in software or the configuration of the database?
When we put the end user or researcher’s needs first, what happens to the functionality of the input modules and the information professional’s ability to access data? How does designing a user-friendly front end affect the usability of the backend and advanced search interfaces?
Evaluating upgrades versus something new
In the first case, with an upgrade or tweak of existing software, evaluate whether you’ve lost any of the functionality of the earlier version. Determine whether glitches and problems have been fixed or whether you still have to use the same work-arounds. Review the complaints about functionality that you’ve documented over the years and the awkward work-arounds that you’ve adopted. Does the new version suit the needs of your researchers? Has the frontend / discovery layer changed dramatically, or is it just a tweak?
For new software, you need to test, test, and test again, evaluating functionality, ease of use, and search results. Document what’s changed. Study the new interfaces and identify the new, improved features in all of them. Evaluate how the new database helps your researchers retrieve information more efficiently and precisely.
Summing it up
Let’s face it, we all quiver when confronted with new software, hardware, and technology. Some of us quiver with excitement—the excitement of trying out something new, such as a new device or method for performing a complex task. When we evaluate new technology, we must keep in mind our mission and the different types of staff and researchers who access our data and collections. Ease of access and accurate, timely results are essential for retrieval and dissemination of information throughout our organizations. Focus on usability and functionality and you’ll soon embrace the new.
My next column will provide suggestions for reviewing and writing recommendations while evaluating new technology.
Miriam Kahn, MLS, PhD