Cornell University Library

SUMMARY

Cornell University Library (CUL) underwent a comprehensive redesign, led by the UX team. The team had the unique opportunity to start from scratch, beginning with understanding the true needs of CUL’s users and aligning those needs with the organization’s goals. The ultimate goal was to provide users with the best possible experience for accessing CUL’s vast resources. Additionally, the team aimed to enhance the Design User Experience (DUX) by implementing a scalable and user-friendly CMS for content editors. Overall, this redesign presented an excellent opportunity to improve CUL’s user experience on multiple fronts.

THE PROCESS

To begin our redesign process, we first gathered any existing user research that had been conducted in the past. While we did uncover some quantitative research in the form of surveys, we found that these were not particularly useful for answering our specific questions. Additionally, we reviewed the feedback that the library had been gathering from the website over the years, which mostly highlighted usability issues with the current site and catalog.

Given these limitations, we recognized the need to conduct our own user research from scratch in order to better understand our users’ needs. To test the waters, we developed a small survey that was conducted on our website, asking users why they had come to the site and what information they were looking for. However, the response rate was disappointingly low, and we didn’t gain many insights from this effort.

In light of these challenges, we turned to the web metrics that CUL had been collecting using Google Analytics (GA) and Matomo. We leveraged these tools to create more than 30 reports, using Google Data Studio to help us analyze the data. We also studied the web metrics for the 18 unit libraries that make up CUL, hoping to identify user patterns on those sites as well. These studies spanned the past three years and provided valuable insights that helped us plan the best UX strategy for our redesign.
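
As an aside, the kind of summary metrics described above can also be pulled programmatically from Matomo's Reporting API. The sketch below is only an illustration of that approach, not the reporting pipeline we actually used (our reports were built in Google Data Studio); the hostname, site ID, and auth token are placeholders, and the field names follow Matomo's VisitsSummary report.

```python
# Minimal sketch: pull monthly visit metrics from a Matomo Reporting API endpoint.
# The URL, site ID, and token are placeholders, not CUL's actual analytics setup.
import requests

MATOMO_URL = "https://analytics.example.edu/index.php"  # hypothetical Matomo instance
PARAMS = {
    "module": "API",
    "method": "VisitsSummary.get",  # overall visits, actions, and time on site
    "idSite": 1,                    # placeholder site ID
    "period": "month",
    "date": "last36",               # roughly the past three years, month by month
    "format": "JSON",
    "token_auth": "YOUR_TOKEN",     # placeholder auth token
}

response = requests.get(MATOMO_URL, params=PARAMS, timeout=30)
response.raise_for_status()
monthly = response.json()  # dict keyed by month, e.g. {"2022-01": {"nb_visits": ...}, ...}

for month, stats in sorted(monthly.items()):
    if isinstance(stats, dict):  # months with no traffic come back as empty lists
        print(month, stats.get("nb_visits"), stats.get("nb_actions"), stats.get("avg_time_on_site"))
```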

WEB METRIC INSIGHTS

  • The homepage had a very high exit rate. Most people use the homepage to reach the external catalog (https://newcatalog.library.cornell.edu/), which is separate from the CUL site, and users get confused once their search takes them outside CUL.
  • We had ≈ 2,400,000 visits, with an average visit length of about 5 minutes. Why do users spend so much time on the site when the homepage exit rate is so high?
  • Users take few actions on the CUL site: an average of 2.1 actions (page views, downloads, outlinks, and internal site searches) per visit. Do users find what they need in about two clicks, or are they simply not engaging with the content provided? (A sample calculation of these derived metrics follows this list.)
  • Users rarely search for internal content on the CUL site. Do they find what they need without using the internal search?
  • We found content behavior-flow issues, especially between the libraries pages and the homepage. Users looped back and forth between these pages, which suggests they were not finding what they needed.
  • And many more.
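
For reference, the exit-rate and actions-per-visit figures above are simple derived metrics. The snippet below is a rough illustration of how they could be computed from page-level and visit-level exports; the file names and columns (url, pageviews, exits, actions) are hypothetical, not our actual export schema.

```python
# Illustrative calculation of the derived metrics cited above, using pandas.
# File names and column names (url, pageviews, exits, actions) are hypothetical.
import pandas as pd

pages = pd.read_csv("pages_export.csv")    # one row per page
visits = pd.read_csv("visits_export.csv")  # one row per visit

# Exit rate of a page = exits from that page / pageviews of that page.
home = pages.loc[pages["url"] == "/", :]
exit_rate = home["exits"].sum() / home["pageviews"].sum()

# Actions per visit = total actions (page views, downloads, outlinks, searches) / visits.
actions_per_visit = visits["actions"].sum() / len(visits)

print(f"Homepage exit rate: {exit_rate:.1%}")
print(f"Actions per visit: {actions_per_visit:.1f}")
```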

Quantitative research tells us the what, not the why. While we were able to gain some insights by analyzing web metrics, they still didn’t answer the “why” behind user behavior.

To prepare for our project, we initially planned to interview multiple departments within the Library. However, before conducting any interviews, we decided to conduct some desktop research to see what other university libraries were doing. We examined dozens of libraries, looking at a range of factors, including how they introduced catalog search to their users, what type of navigation systems they used, how they integrated unit libraries within their main site, their aesthetics, and, most importantly, what tools they provided to their users. By doing so, we hoped to gain insights that could be used to enhance the user experience for CUL’s users.

TIME TO MEET USERS, STAKEHOLDERS, AND AUDIT CONTENT / INTERVIEWS

Although stakeholders and departmental users regularly utilized the library website, we recognized the importance of understanding the needs of our primary personas, our students (including undergrads, grads, and researchers). To achieve this goal, we held numerous sessions with our students, in addition to meeting with various departments (such as communications, collections, assessment, and others).

Through these sessions, we were able to collect over 500 observations and gather a wealth of insights. These insights were essential in helping us understand the unique needs and perspectives of our primary personas and informing our decisions regarding how best to improve the user experience for all users of the CUL website.

When trying to identify patterns and cluster observations, affinity maps proved to be an effective tool. We used this approach to analyze data from various sources, including interviews with our personas and feedback from a few departments. The resulting affinity map provides a sample of the insights we were able to gather and organize in a meaningful way.
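
To make the clustering step concrete, the toy sketch below shows how tagged observations can be rolled up into themes and counted; the observations and theme labels are invented examples, not our actual data.

```python
# Toy sketch of rolling tagged observations up into affinity-map themes.
# The notes and theme labels here are invented examples for illustration only.
from collections import Counter, defaultdict

observations = [
    ("I never know whether I'm searching the site or the catalog", "search confusion"),
    ("I just want to see which libraries are open right now", "hours & spaces"),
    ("Booking a study room takes too many steps", "hours & spaces"),
    ("I didn't know the library offered citation help", "awareness"),
]

themes = defaultdict(list)
for note, theme in observations:
    themes[theme].append(note)

# Theme sizes hint at which problem areas came up most often in the sessions.
for theme, count in Counter(t for _, t in observations).most_common():
    print(f"{theme}: {count} observation(s)")
```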

Based on this affinity mapping exercise, we identified several main areas where the user experience (UX) of the CUL website needs improvement. These include:

  • Search functionality, which is currently frustrating and overwhelming for users and causes confusion between the library catalog and the site. We must improve the search UX.
  • Navigation, which has caused issues for users trying to find what they need on the site.
  • Spaces and equipment, which must be implemented in a way that makes them easy to browse, reserve, and access within CUL.
  • User help, including identifying scenarios where users may require help from library resources.
  • Awareness, as users may not be fully aware of all that the library has to offer them.
  • Consolidating library services to make them easier to find and access, and helping users explore collections to better understand what research tools and services the library has available.

CONTENT AUDIT

To ensure that the information architecture of the CUL website was effectively organized and user-friendly, we conducted a thorough content analysis. This involved examining over 450 items, including pages, sections, content types, and forms, and classifying them according to author, department curation, number of views, and other relevant attributes.
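
As a rough illustration of this kind of audit, the sketch below summarizes a content inventory spreadsheet by content type, curating department, and traffic; the file name and column names (content_type, department, views) are hypothetical, not the actual audit schema.

```python
# Sketch of summarizing a content inventory spreadsheet; the file and column names
# (content_type, department, views) are hypothetical, not the actual audit schema.
import pandas as pd

inventory = pd.read_csv("content_inventory.csv")

# How many items of each content type, and which departments curate the most traffic?
print(inventory["content_type"].value_counts())
print(inventory.groupby("department")["views"].agg(["count", "sum"]).sort_values("sum", ascending=False))

# Flag low-traffic pages as candidates for consolidation or removal.
low_traffic = inventory[inventory["views"] < 50]
print(f"{len(low_traffic)} items with fewer than 50 views")
```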

We then conducted a series of card-sorting exercises with our users, utilizing open, closed, and hybrid approaches. These exercises allowed us to gather valuable insights about how users were organizing the content and helped us refine our information architecture. We used various tools, such as dendrograms, standardization grids, and similarity matrix tools, to analyze the data and draft our main sections accordingly.
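
To show the similarity-matrix idea in miniature: from open card-sort results, each pair of cards can be scored by the fraction of participants who grouped them together, and that matrix can then be clustered into a dendrogram. The cards and sorts below are invented examples, not our study data.

```python
# Sketch of building a similarity matrix and dendrogram from open card-sort data.
# The cards and participant groupings below are invented examples for illustration.
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram

cards = ["Hours", "Study rooms", "Ask a librarian", "Databases", "Course reserves"]

# Each participant's sort: lists of card indices grouped together.
sorts = [
    [[0, 1], [2], [3, 4]],
    [[0, 1, 2], [3, 4]],
    [[0], [1, 2], [3, 4]],
]

n = len(cards)
co = np.zeros((n, n))
for groups in sorts:
    for group in groups:
        for i in group:
            for j in group:
                co[i, j] += 1

similarity = co / len(sorts)   # fraction of participants who paired each pair of cards
distance = 1.0 - similarity    # convert to a distance matrix for clustering

# Condense the distance matrix (upper triangle) and cluster hierarchically.
condensed = distance[np.triu_indices(n, k=1)]
tree = linkage(condensed, method="average")
dendrogram(tree, labels=cards, no_plot=True)  # set no_plot=False to draw with matplotlib
```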

Through this content analysis process, we were able to ensure that the website’s information architecture was well-organized, making it easy for users to navigate and find the information they need quickly and easily. This, in turn, helped to improve the overall user experience and make CUL’s vast resources more accessible to all users.

INFORMATION ARCHITECTURE

Using the results of our card-sorting exercises, we developed an initial information architecture for the CUL website. The IA was organized into first, second, and third-level navigation, resulting in a streamlined structure that reduced the number of levels required in some sections, compared to the current site. This approach was aimed at improving the user experience and making the vast resources of CUL more accessible and user-friendly for all users of the website.
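
As a simple illustration of a three-level structure, the sketch below models a navigation tree as nested dictionaries and lists; the section names are placeholders, not the final CUL information architecture.

```python
# Toy three-level navigation tree; section names are hypothetical placeholders,
# not the final CUL information architecture.
ia = {
    "Research": {
        "Databases": ["A-Z list", "By subject"],
        "Course reserves": [],
    },
    "Services": {
        "Spaces & equipment": ["Study rooms", "Equipment lending"],
        "Ask a librarian": [],
    },
    "About": {
        "Libraries & hours": ["Unit libraries"],
        "Visit": [],
    },
}

def depth(tree):
    """Depth of the navigation tree: dict keys and list items each add one level."""
    if isinstance(tree, dict) and tree:
        return 1 + max(depth(child) for child in tree.values())
    if isinstance(tree, list) and tree:
        return 1
    return 0

print(depth(ia))  # -> 3
```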

To test the findability of our content, we ran a tree testing exercise using Treejack. In this exercise, we defined tasks for users to accomplish, allowing us to evaluate how effectively our information architecture helped users find what they need on the CUL website.

We had an ideal number of participants for this type of exercise, with over 30 individuals taking part. By testing the findability of our content in this way, we were able to identify issues and areas where improvements were needed, ensuring that users could quickly and easily find the information they need.

After conducting the tree testing exercise, we found that only 50% of participants were successful in completing the tasks we had defined. However, this provided us with valuable insights into how users were interacting with the site and allowed us to adjust our information architecture accordingly.

We analyzed the data gathered during the exercise, including task statistics, how users searched for information (both directly and indirectly), and where they struggled or failed. By carefully studying this data, we were able to identify areas where the IA needed further refinement to improve the user experience and increase the likelihood of success in completing the defined tasks.

Overall, the tree testing exercise allowed us to gain a deeper understanding of how users were interacting with the CUL website and helped us to make informed decisions about how best to refine the information architecture to better meet their needs.

As you can see in our path destination map for task 1 below, our analysis showed that of the 31 participants, 19 took the first of the correct paths provided, 9 took the second, and unfortunately 3 took a wrong path. This suggests that some refinement is needed to improve the effectiveness of our information architecture and help users find the information they need more quickly and easily.
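
The task 1 figures above amount to a simple tally of destinations. The sketch below reproduces that tally; the path labels are illustrative placeholders, while the counts (19, 9, and 3 of 31 participants) mirror the numbers quoted above.

```python
# Tallying tree-test results for task 1; the path labels are illustrative, and the
# counts mirror the figures quoted above (19 + 9 correct, 3 incorrect, of 31 participants).
from collections import Counter

# One entry per participant: (destination_path, reached_a_correct_destination)
task1_results = (
    [("Services > Spaces & equipment", True)] * 19
    + [("Research > Equipment lending", True)] * 9
    + [("About > Visit", False)] * 3
)

paths = Counter(path for path, _ in task1_results)
successes = sum(1 for _, ok in task1_results if ok)
success_rate = successes / len(task1_results)

print(paths)                                        # where participants ended up
print(f"Task 1 success rate: {success_rate:.0%}")   # -> 90%
```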

USER FLOWS/TASK FLOWS/WIREFRAMES

With our insights and research in hand, we proceeded to develop user and task flows, as well as low-fidelity wireframes. Our initial focus was on the homepage, some key pages, and major features of the CUL website. As we progressed, we created high-fidelity interactive wireframes, which allowed us to test with users and present to stakeholders, ensuring that the design was both user-friendly and met the needs of all CUL users. 

VISUAL DESIGN

Our initial focus for the CUL website redesign was on the homepage, where we began conducting visual explorations. Our goal was to develop an aesthetic that aligned with the core values of CUL, while adhering to the established Cornell branding guidelines. By taking a careful and thoughtful approach to the design process, we were able to create a visually engaging and cohesive design that accurately represented the CUL brand.

After creating more than a dozen variations, we chose three versions and wanted to test these visuals with our personas (students, professors, researchers, and staff) and stakeholders. We wanted to see what first impressions users would have of these three samples.

We decided to run a 5-second visual test using OptimalWorkshop.

Participant percentages:
Students: 52% / Faculty or Instructor: 4% / Staff: 13% / Library Staff or Librarian: 30% / Researcher or Post-doc: 0%

We gathered 138 observations and 73 tags from this test, which provided valuable insight into how users perceived the visual designs. Several participants offered specific comments on the individual designs, allowing us to make informed decisions about which visual elements best represented the CUL brand and resonated most strongly with our target users.
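
As a small illustration of how this tagged feedback can be summarized, the sketch below tallies tags by design version; the tags shown are invented examples, not the actual 73 tags collected.

```python
# Sketch of tallying 5-second-test tags by design version; the tags shown are
# invented examples, not the actual 73 tags collected.
from collections import Counter, defaultdict

# (design version, tag) pairs as they might appear in an exported spreadsheet
tagged_observations = [
    ("v1", "clean"), ("v1", "organized"), ("v1", "easy to scan"),
    ("v2", "crowded"), ("v2", "organized"),
    ("v3", "crowded"), ("v3", "hard to read"),
]

tags_by_version = defaultdict(Counter)
for version, tag in tagged_observations:
    tags_by_version[version][tag] += 1

# The most frequent tags per version give a quick read on first impressions.
for version, counts in sorted(tags_by_version.items()):
    print(version, counts.most_common(3))
```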

Insights

Based on feedback from participants, design version 1 emerged as the clear favorite due to its clean, organized, and navigable layout. Users appreciated the visual hierarchy and found it less crowded than the other versions. There was also a higher level of positive sentiment toward this design.

Overall, users expressed a desire to see library and research tools, and to be able to easily search and access all of the services that the Library provides. Participants also appreciated the library unit hours and demonstrated less interest in library PR from a patron’s perspective.

Using this feedback, we updated design version 1 to produce the final homepage for both mobile and desktop, along with a few key pages. By incorporating the insights gathered from participants, we were able to create a more user-friendly and visually appealing design that better met the needs of all CUL users. The final versions of the homepage on mobile and desktop, along with a few key pages, are shown below.