Data & Decision Making: Internet Measurements in Libraries

By Georgia Bullen and Kelsey Smith of Simply Secure

Over the past two years, teams from Simmons University, Measurement Lab, and Internet2 have worked together, and more recently with Simply Secure and Throneless Tech, to build a tool that puts the power of information in the hands of public libraries by providing data about the health of their internet connections.

The importance of internet measurement

The societal role of the library is ever expanding. While libraries continue to promote lifelong learning and reading for people of all ages, they also work on the front lines of a fraying social safety net. They are a resource for adults facing serious life challenges and stressors, such as housing obstacles, criminal justice proceedings, unemployment, and health crises. To serve their communities, libraries have embraced a pivot to digital, which brings an ongoing concern for adequate hardware, software, and internet quality.

Information is power. There is an information asymmetry between public institutions, like libraries, and the private telecommunications industry that serves them. While interviewing librarians, we heard about a hunger for data:

“I like data. I like anomalous data. Am I doing something similar to the other people around me?”

“This would be great data. If it can be measured it can be improved! We can identify trends.”

“Libraries are all about statistics.”

To deliver internet measurement data to participating libraries in an accessible format, our team from Simply Secure provided support in user research and design. We set out to design a visualization tool that would allow a wide range of users to access the data, explore it, and benefit from it. Some applications of the data visualization tool are already apparent; many more will be discovered in the future. The Measuring Library Broadband Networks (MLBN) project provides data that can be used in tandem with other tools for rich comparisons.

Use Cases

  1. Libraries can monitor their internet performance independent from their internet service provider.
  2. Librarians can use measurement data to determine when it’s necessary to upgrade their contracted internet service in order to serve their patrons effectively.
  3. Some libraries report annually on the state of their internet, so these measurements could save them valuable time when compiling data, allow for deeper analysis, and provide transparency to their taxpayers, benefactors, and governance structures (e.g., boards and senior leadership).

Measurement Mechanism: MLBN Murakami Devices

To measure internet performance consistently and automatically, participating libraries installed small measurement devices connected to their wired and wifi networks. These devices, named “Murakami”, run internet performance tests from M-Lab (NDT) and Ookla (Speedtest) at randomized intervals and transmit the results to the cloud. Once the measurement devices were installed in pilot locations and transmitting measurement data, the next step was building a visualization website where libraries could view their data and adapt the visualizations to their needs.
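For a concrete sense of the mechanism, here is a minimal sketch in Python of randomized test scheduling of this kind. It is illustrative only, not Murakami’s actual implementation; the client commands, flags, and interval bounds are our assumptions.

```python
import json
import random
import subprocess
import time

# Illustrative test commands; the real clients' names and flags may differ.
TESTS = {
    "ndt": ["ndt7-client", "-format=json"],    # M-Lab NDT client (assumed CLI)
    "ookla": ["speedtest", "--format=json"],   # Ookla CLI (assumed flags)
}

MIN_GAP_S = 8 * 3600    # wait at least 8 hours between rounds of tests...
MAX_GAP_S = 16 * 3600   # ...and at most 16, so runs don't cluster at fixed times

def run_test(name, cmd):
    """Run one measurement client and capture its raw output for upload."""
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return {"test": name, "ran_at": time.time(), "output": out.stdout}

while True:
    for name, cmd in TESTS.items():
        try:
            record = run_test(name, cmd)
            # A real device would forward this record to cloud storage here.
            print(json.dumps(record)[:120])
        except Exception as exc:
            print(f"{name} failed: {exc}")
    time.sleep(random.uniform(MIN_GAP_S, MAX_GAP_S))
```

Randomizing the gap between rounds matters: tests run at a fixed hour would only ever sample the network at that hour, hiding time-of-day patterns.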

Design Challenges

Every design project faces a unique set of challenges. A lack of challenges could mean the solution already exists or that the problem doesn’t necessitate one. Our primary design challenges, described below, had to be addressed repeatedly throughout the design process.

Various contexts

Library setups are wide-ranging. While there are national- and state-level associations and organizations dedicated to libraries and librarians, libraries are independent authorities, so no nationwide governance or organizational structure exists. Some libraries are independent, while others are integrated with their local government. Some states have library agencies that centrally support the libraries in their area, but not every state does. Some library systems span several counties; others occupy a single building in a remote town. This gives communities agency to set up the policies and tools that work for them, but it also presents challenges in designing a tool that can be useful in complex, varied environments.

Presentation of complex data

Visualizations give us a way to understand complex data. How could we design for comprehension with visualizations? There is a fine line between not enough information (it’s not useful) and too much information (it’s crowded and unreadable). How could we allow for comparison? How could we handle data that seems to refer to the same measure but carries complex technical nuance in its measurement methodology? That nuance is critical to many users.

Range of user capabilities and interest

Our users range from accomplished librarians to experienced IT professionals. How could we design for multiple types of users? What features are important for each group? Would tooltips distract IT personnel? Would advanced views be confusing to others?

Design Methodology

0 | Foundation: Double Diamond

Our research and design methodology was based on the design process model known as the Double Diamond.

 

The Double Diamond: Diverging and converging explorations allow designers the focus, freedom, and agility to land on great solutions.

  1. DISCOVER: Reading the diagram from left to right, the process starts with a period of expansion, where designers and researchers collect as much information as possible to discover unknown problems.
  2. DEFINE: To keep the project manageable, designers analyze insights to define their focus.
  3. DEVELOP: In a second round of expansion, designers develop a great number of solutions to explore selected problems in depth. Pushing preconceived notions to the extreme is a way to stimulate creative thinking.
  4. DELIVER: Designers converge by refining solutions. This process can repeat, sometimes in a cyclical fashion.

1 | Kickoff Research: Design Workshop

What: In-person, one-day participatory design convening

Who: M-Lab and 30 librarians from across the US 

Where: Chicago, IL

Why: To learn about the context and the design concerns of librarians

When: Fall 2018

At the workshop, we learned about the infrastructure, services, and providers at the libraries. With a clear understanding of the context, we could better understand needs and challenges and create a more useful tool. The workshop let us begin to understand the diversity of users and contexts, instead of relying on our assumptions about the “average” librarian or library context; had we built for the average, we might have made a tool that works for no one. We also ran our first round of user testing of data visualizations using open-source products. As a result, we decided to build our own visualization tool that would be more accessible to a wider range of users.

2 | Immersive Research: Site Visits

What: Interviews in the users’ environments

Who: M-Lab, Simmons University, and library personnel from the year one cohort

Where: 10 libraries located across the US

Why: To understand the role of libraries in communities and the work of librarians in context

When: Winter, Spring, Summer 2019

3 | Defining Focus: Analyzing User Research

What: Mapping insights, personas, and user flows, and selecting features for the version 1 tool

Who: M-Lab, Simply Secure

Why: We documented and referenced research insights to be in sync as a team.

When: Fall 2019, Winter 2020

With insights from user research, we made basic personas to get an overview of the people who would use the tool. The personas gave us clarity in discussions when making decisions, and they allowed us to map out the various account types (admin, editor, viewer), define system goals, and plan for features.

Personas (examples)

Librarian

Workplace: one library branch of a greater system

Technical abilities: low 

Goals: view data in the tool (but they aren’t concerned with the setup) and help patrons by getting a sense of the library’s internet speeds

Colleagues: They rely on the IT support at the library system headquarters. 

 

Library director

Workplace: independent library (single location) 

Technical abilities: medium

Goals: edit all their library information in the tool, help patrons, quickly confirm slow connections, view data over specified times, and present data to the library board

Colleagues: Few to no coworkers. They rely on outsourced IT and volunteers.

 

IT professional

Workplace: Headquarters of a library system with multiple locations

Technical abilities: high

Goals: edit all their library information in the tool, monitor branch problems, plan for the future of the library technology, improve their internet system, analyze data over specified times, and present findings to the executive director

Colleagues: Many; they need to be able to manage user accounts for colleagues.

 

MLBN program administrator

Workplace: M-Lab

Technical abilities: high

Goals: Understand MLBN status, manage all users/locations/devices, compare locations, export data, provide support, view nationwide trends over time

Colleagues: They need to be able to manage every user account of the tool.

 

Account types

  • Users should be able to view either one location, multiple locations, or all locations.
  • User permissions can vary depending on the user’s job (a sketch of how these roles could map to permission checks follows this list):
      • Viewer – just an overview of the important information
      • Editor – complete control over their library or library system setup
      • Program Admin – complete control over all libraries and library systems
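As a hedged sketch of how these roles could translate into code, the snippet below models permission checks scoped by location. The permission names and data shapes are our illustrative assumptions, not the tool’s actual implementation.

```python
from dataclasses import dataclass, field
from enum import Enum

class Role(Enum):
    VIEWER = "viewer"
    EDITOR = "editor"
    PROGRAM_ADMIN = "program_admin"

# Illustrative permission sets per role; the real tool's names may differ.
PERMISSIONS = {
    Role.VIEWER: {"view_charts"},
    Role.EDITOR: {"view_charts", "edit_library", "add_notes", "manage_users"},
    Role.PROGRAM_ADMIN: {"view_charts", "edit_library", "add_notes",
                         "manage_users", "manage_all_locations"},
}

@dataclass
class User:
    name: str
    role: Role
    locations: set = field(default_factory=set)  # locations this user may access

def can(user: User, action: str, location: str) -> bool:
    """Allow an action if the role grants it and the location is in scope.
    Program admins are in scope for every location."""
    if action not in PERMISSIONS[user.role]:
        return False
    return user.role is Role.PROGRAM_ADMIN or location in user.locations

librarian = User("branch librarian", Role.VIEWER, {"main-branch"})
assert can(librarian, "view_charts", "main-branch")       # viewers can view...
assert not can(librarian, "edit_library", "main-branch")  # ...but not edit
```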

System Goals

  • The user can interpret data.
  • The user can export data.
  • The user knows the measurement system is working.
  • The user understands the internet status.
  • The user can annotate issues.
  • The user can manage the library information, library users, and library devices.

General features

  • Login and logout
  • Status: to understand the current internet connection status
  • Charts: select date range, view by aggregate, view different tests and devices, export (a sketch of the aggregation idea follows this list)
  • Notes: to remember and communicate incidents
  • Compare: to compare two or more locations on charts
  • Locations: to view/edit a list of locations (that the user has access to)
  • Library information: to view the information related to the internet and measurement devices
  • Users: to view/edit other users of the tool (that the user has permission to access)
  • Account: to manage account information
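As a hedged illustration of the Charts feature above, this Python sketch re-aggregates exported measurement data by week or hour with pandas. The file name and column names are assumptions about the export format, not the tool’s actual schema.

```python
import pandas as pd

# Assumed export format: one row per test, with a timestamp and the
# measured download speed in Mbit/s. Column names are illustrative.
df = pd.read_csv("measurements.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp").sort_index()

# Restrict to a selected date range, as the chart's date picker would.
window = df.loc["2020-03-01":"2020-03-31"]

# "By week" or "By hour" are aggregations of all tests in each bucket
# (here, the median download speed), not shortcuts to shorter timeframes.
by_week = window["download_mbps"].resample("W").median()
by_hour = window["download_mbps"].resample("H").median()

# Export the aggregated view for use alongside other tools and systems.
by_week.to_csv("download_by_week.csv")
print(by_week.head())
```

This view also anticipates a usability finding reported later: “By week” is an alternative rollup of all the tests, not a one-week window.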

 

4 | Initial Ideation: Sketches

What: Hand drawings of screens, features, and ideas

Who: Simply Secure

Why: A picture is worth 1000 words.

When: Winter 2020

We sketched to communicate our ideas with the team and explore various options. With tangible examples, we were able to have richer discussions and make decisions faster. We evaluated layouts, eliminated concepts, and made comments for subsequent sketching (see green markups). After analyzing sketches, we identified the features outside the scope of version 1, such as an alert system, because they were not core to the functionality of the visualization tool. Below are some examples of our many sketches.

5 | Solidifying Structure: Wireframes

What: Grayscale mockups using the design tool Figma

Who: Simply Secure

Why: To focus on the structure and not worry about the color scheme or details

When: Spring 2020

Wireframing establishes the layout, features, hierarchy, and navigation of a site, without the distractions of styling, graphics, color, and pixel perfection. This allows the team to focus on what’s important: making key decisions that affect the organization of the website. User interface (UI) styling is layered on the established foundation at a later step. We made a wireframe for every screen of the visualization tool and considered the different types of users when planning for each account type. At this point, we also started drafting the content for key pages, like About, FAQ, and Glossary. With weekly feedback we reached a version that was ready for user testing. Due to their simplicity, wireframes are easy to change, so improvements can be integrated quickly.

Our inspiration board housed images of visualizations and interactions that we collected from various sources. We were able to identify effective patterns and referred to them often.

Two early wireframe designs of the homepage exemplify the progress that we made on a weekly basis.

We laid out the screens in a flow to understand the navigation.

After discussions and improvements, we agreed on final wireframes for version 1 testing. [Shown: viewing a library branch, comparing multiple branches of a library system]

6 | Testing Usability: Prototype

What: Observing participants use the prototype and taking notes

Who: Simply Secure, Throneless Tech (prototype development), 8 librarians

Why: To find bugs, improve usability, and discover features for version 2

When: Spring 2020

With wireframes as a guide, the development team built a prototype for testing. Getting user feedback is paramount in design. We initiated a round of usability testing for users to get hands-on experience with the prototype. Observing users gave us valuable feedback for improvements and validated the core features of the tool. Additionally, we used a portion of the interview time to discuss librarians’ concerns about their internet and to get an idea of how the tool would fit into their work.

Plan and Execution

The research plan was guided by our core research questions: what we wanted to understand better, our unanswered questions, and remaining uncertainties. From there, we developed an IRB-approved interview protocol guide and scheduled eight interviews with librarians who had MLBN devices installed in their libraries. We asked participants to “think aloud” while using the tool so that we could get their unfiltered feedback; it was important to ask open-ended questions and not influence participants with leading questions. We studied our interview notes, highlighted important points, and grouped and compared findings in a spreadsheet to understand which ideas were expressed frequently.

We added takeaways to a spreadsheet and color-coded them by participant to see what patterns emerged. Then we were able to solidify key findings.

Key Findings

Uses for data:

  • Most participants want lots of data. Their primary interest in the measurement devices seemed to be access to more data in general. They were excited to have charts and data exports so that they could combine and analyze the data with the other tools and systems they use regularly.
  • Most participants reported few to no connection problems since upgrading to fiber. They talked about leveraging measurement data to evaluate their ISP contract on a regular basis and to monitor how often they hit capacity and might need to consider upgrading speeds at their library.
  • Most participants reported recently upgrading to Meraki network hardware in their library. They are excited to have network control and detailed metrics.
  • Most participants do not check their internet speed frequently. They only do so if they notice a slowdown or when needed for reporting.

On the job:

  • Participants expressed that they generally know how to solve their most frequent problems. They don’t face unique problems often.
  • Security is important to IT professionals.
  • Participant responses were mixed about the documentation of technology issues and their processes. Some users said note-taking didn’t matter in their jobs. Bigger systems have a digital system, while smaller locations write on paper. In general, participants saw utility in having annotated details associated with their speed data.

MLBN visualization website:

  • Libraries benefit from peer support and communication, but the role of the forum page was confusing to participants and not vital, because libraries already have systems in place.
  • Editor and viewer roles are important. However, the viewer role should be simplified to accommodate for the needs of that user type.
  • The prototype version of the quick reference widget (recent status indicator at the top of the homepage) was confusing to all users.
  • The chart options, “All tests, By month, By week, By hour” were not clear. Participants assumed that these were shortcuts to specific timeframes, rather than alternative aggregations and views of the data.

Actionables

The findings, in combination with newly discovered usability issues, contributed to a list of actions ranked by priority. We prioritized changes that would be most impactful or easy for the development team to implement. Lower priority items either require complex overhauling or more research to find a suitable solution. Those ideas will be a starting point for the research and design of the next version.

Examples of actions taken after version 1 testing (not exhaustive)

  • Add axis labels to charts (Absolutely necessary for chart readability)
  • Change the label “Devices” to “Measurement Devices” (“Devices” is ambiguous)
  • Define terminology of NDT and Ookla with tooltips (it’s not common knowledge)
  • Label “logout” button with words, not an icon (an icon is not explicit)

Examples of future features and research (not exhaustive)

  • Show data from multiple connections (wifi, wired) on a single graph
  • On the library tab, add an informational table for the measurement devices
  • Place “view” options (by day, by hour) in the date picker instead of below the chart
  • Allow users to export images of visualizations, not just data
  • Integrate other measurement devices into the dashboard so that MLBN can be a data hub

Results

We are happy to announce that version 1 of the MLBN visualization website is live, and libraries in the pilot program can now view their internet performance data. Our goal was accomplished through a team effort across four organizations. It’s rewarding to give our users something that might improve not only their jobs, but also the communities they serve. We are excited to see how the data is used and to make improvements through future phases and research.

Lessons Learned

  • Whenever there is collaboration, especially remote collaboration, there needs to be a documentation system in place. Communication is vital, and design deliverables make communication more efficient; this is where personas, user flows, and sketches come in handy.
  • It’s hard to design for such a wide array of users, so our research allowed us to identify diverse needs. Research is vital when designing for anyone who is not you.
  • Leave your assumptions at the door. One of several surprising outcomes of the user interviews was discovering that most libraries have quality internet connections: most reported few to no issues, and satisfaction was high. The measurement data will be useful for other important purposes besides troubleshooting.
  • Finally, we learned that library workers are a wonderful group of people to work with and we look forward to future collaborations.

 

Credits

Lai Yi Ohlsen, M-Lab

Chris Ritzo, M-Lab

Roberto D’Auria, M-Lab

Georgia Bullen, Simply Secure

Kelsey Smith, Simply Secure

Josh King, Throneless Tech

Rae Gaines, Throneless Tech

Colin Rhinesmith, Simmons University

Jo Dutilloy, Simmons University

Susan Kennedy, Simmons University
