Learning Analytics and the Library
Using the Data You Collect to Address Student Learning Outcomes and Tell Your Story
Abstract In some way, every librarian is involved in improving student learning. Student learning outcomes have shaped the educational landscape for years, and the flipside of outcomes is the tricky business of measuring whether students are meeting them. This is where learning analytics comes in. Learning analytics is the practice of collecting and analyzing data about learners and their environments with the intention of understanding and improving learning outcomes. In this session, we look at learning analytics and collecting library data. We cover what learning analytics are, how you collect them, and how to use them to your advantage. We also touch on the issue of balancing metrics with patron privacy. Finally, because COVID changed how libraries are used, we discuss updating your metrics to track different data.
Librarians are very good at collecting data, and we collect data on everything from who uses the library to how library resources are used. Data can be collected informally, such as tally marks on a clipboard, or more systematically, via downloads from library systems such as OpenAthens or a library management system like Alma or Koha. Beyond collecting the data, the more important question is: how do you use the data you collect? Both authors of this article have participated in a program called CARLI Counts. As noted on CARLI’s Professional Development webpage, “CARLI Counts is a continuing education library leadership immersion program that prepares librarians to make effective use of research findings on the impact of academic libraries on student success for the twin purposes of service development and library advocacy” (CARLI, n.d.). After successfully completing the first round of CARLI Counts in 2020, Kris went on to mentor a CARLI Counts group in 2021-2022, while Yasmine participated in Prioritizing Privacy: Data Ethics Training for Library Professionals. From their experiences in CARLI Counts and Prioritizing Privacy, the authors have learned how to use the data they collect to tell their libraries’ stories and demonstrate how the library supports retention and increases graduation rates.
In the course of this essay, we will define what we mean by learning assessment and learning analytics and how the two differ. Both are important tools for illustrating your library’s value, and each poses its own set of challenges. Yasmine will discuss her work with student success metrics at Chicago Theological Seminary, while Kris will focus on the types of learning analytics, the goals and motivations for using them, and how they can help you achieve your library’s goals.
Historically, library metrics focused on student library interactions, such as circulation, gate counts, or access and use of electronic resources. Little consideration was given to how library outcomes aligned with wider institutional goals. ACRL’s landmark 2010 report, “Value of Academic Libraries,” considerably changed the conversation around library assessment. ACRL has led an extensive effort to help libraries tell administrators the story of all the ways student library interactions contribute to student learning and success.
It’s important to note that the term “learning analytics” is used in a couple of different ways. Librarians use the term to mean any library assessment using student data, while administrators in higher education may use it to mean an institution-wide system that collects individual-level learning data and centralizes that data with the goal of improving student success outcomes. There is an increasing body of research demonstrating the value of academic libraries to student success, but learning to speak the language of administrators and deploy learning analytics in ways that align with administrative goals remains a growing edge for library administrators. When administrators are exploring strategies for persistence, retention, and completion goals, are libraries inserting themselves into the conversation? Are there ways in which your library’s data can be integrated into institution-wide analytics systems that are focused on student learning?
In examining the emerging literature on learning analytics, one of the most cited definitions you’ll encounter is one developed and presented at the First International Conference on Learning Analytics and Knowledge, in Banff, Alberta, in 2011: “Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environment in which it occurs” (Conole et al. 2011, 3). In other words, learning analytics is about using data to improve learning contexts and help learners succeed. If we can determine the variables that support learning, then perhaps those variables can inform the pedagogical strategies needed to support student learning outcomes. For library administrators, it’s important to note that the insights gathered from learning analytics can help guide institutional decisions about how resources are spent and for what purpose.
Both learning assessment and learning analytic approaches seek to identify whether students are learning from their interactions with the library. Both try to answer the question, “What services and resources can librarians provide to increase their impact on student learning?” Still, there are key differences between the two (Oakleaf et al. 2019, 840-841). Assessment approaches tend to be episodic and limited to information collected within the library; for example, a user satisfaction survey provided after an instruction session is a learning assessment tool. By contrast, learning analytic approaches are more longitudinal and include institution-wide data that captures the whole student experience. Another way to distinguish the two is by the timeframe being evaluated: assessment approaches evaluate the past, while learning analytics tries to predict or anticipate the future. Library assessments are smaller in scope, limited in individual-level data, and built on data largely controlled by librarians. Learning analytics, on the other hand, is larger in scope, focused on individual data across the institution, and may rest on data over which librarians have no control or oversight. The point here is not to advocate for one approach over the other, but to explore how learning analytics can offer libraries an opportunity to expand their assessment toolbox and identify where they may contribute to ongoing cross-departmental conversations around assessment.
As mentioned above, librarians already collect data to use in library reports. The question here is whether the life of the data being collected should be extended for institution-wide learning analytics—and if so, how do we determine what data can be included? Already we know that library learning assessment data can demonstrate the value of libraries to administrators. Library learning assessment data can assist with:
- Assessing student learning
- Improving the design of library finding aids
- Improving resource usage by seeing how often students use different types of library resources, such as print books, e-books, online databases, and study spaces in the library
- Helping you understand your students better by learning about their contexts
- Providing significant data to your institution about the work of the library
While gathering and analyzing library data might sometimes seem like an extra chore, as Kris and Yasmine learned in the CARLI Counts program, analyzed data can tell an important story about the library and its users. Analyzing library data opens up a number of possibilities for the library to better serve students and faculty while telling a library story that non-librarians can understand.
There are three types of learning analytics: descriptive, diagnostic, and prescriptive. First, descriptive analytics describes what is happening in your library and what library users are doing. Several years ago, while working at the reference desk in a theological library, Kris marked down every time a student came to the reference desk with a question. Over time, additional categories were added to the check sheet to capture the type of question being asked (reference or informational), and eventually a category was added to track email reference questions. The data gathered was used to improve the student experience at the reference desk, for example by creating additional signage to help students answer informational questions such as where the copiers were located. In her current work, Kris uses data collected via OpenAthens to determine which databases students use the most, which helps with budgeting decisions. If the data shows students aren’t using a particular database, why keep it? Descriptive data can help librarians in many ways to offer better service and make smarter decisions about the types of resources their students want. Student satisfaction surveys are another descriptive analytic you can use to better understand how your students use their library.
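To make the descriptive example concrete, here is a minimal sketch in Python of the kind of tally work described above, assuming a hypothetical CSV export of resource usage. The file name and column name are illustrative placeholders, not an actual OpenAthens report format:

```python
import csv
from collections import Counter

# Tally how often each database appears in a usage export.
# "usage_export.csv" and its "resource" column are hypothetical
# placeholders, not an actual OpenAthens report layout.
counts = Counter()
with open("usage_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        counts[row["resource"]] += 1

# Rank databases from most to least used to inform renewal decisions.
for resource, uses in counts.most_common():
    print(f"{resource}: {uses} uses")
```

A ranked list like this answers the same question as the clipboard tallies: which resources are earning their keep, and which might be candidates for cancellation.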
A second type of learning analytics is diagnostic. Diagnostic analytics tries to identify what is facilitating or hindering student success. The goal is to figure out, or “diagnose,” what is going on by looking at collected data. For example, you can use your school’s learning management system, such as Canvas or Moodle, to embed a librarian in a class to help students with research and then analyze whether student papers improve as a result. Learning management systems are also a good place to set up self-directed learning for students, such as a class on how to use the library that contains quizzes to measure student comprehension. The quiz results can help you figure out whether students understand how to use the library or whether you need to develop more library instruction to improve their research skills. Often diagnostic analytics are used to further study the results of descriptive analytics. Finally, correlational studies are considered a form of diagnostic analytics. A correlational study looks at the relationships between variables without a researcher controlling or directing any of them. For example, you might ask what the relationship is between the number of research consultations a student signs up for and their academic success. As the librarian, you can track both variables and compare the results, as in the sketch below.
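As an illustration of the correlational study just described, this sketch computes a Pearson correlation between two variables a librarian might track; the numbers are invented for demonstration only:

```python
from statistics import correlation  # requires Python 3.10+

# Invented demonstration data: research consultations booked per
# student, paired with that student's term GPA.
consultations = [0, 1, 1, 2, 3, 3, 4, 5]
gpas = [2.6, 2.9, 3.1, 3.0, 3.4, 3.3, 3.6, 3.8]

# Pearson's r runs from -1 to +1; values near +1 suggest that more
# consultations go together with higher GPAs in this sample.
r = correlation(consultations, gpas)
print(f"Pearson r = {r:.2f}")
```

Even a strong r shows association, not causation: students who book consultations may differ from their peers in ways the data does not capture, which is exactly why a correlational study observes variables without controlling them.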
The third type of learning analytics is called prescriptive analytics. This third type is still a conceptual idea in higher education learning analytics. Prescriptive analytics is defined as “the ability to accurately predict future outcomes using learning data … [which] empowers stakeholders in the learning process (e.g., students, faculty, administrators, et al.) with intelligence on which they can act as a means to achieve more desirable final outcomes” (Oakleaf 2016, 472; ECAR-ANALYTICS Working Group 2015, 2). The idea is that institution-wide data can be used to predict whether an individual student is headed for success or failure. Libraries may be asked by their institutions to share data about individual students to help with these predictive analytical models. This marks a shift in how libraries have historically managed data about individual users.
While each of these types of learning analytics falls into its own category, there is overlap among the three. How you collect the data is specific to each librarian’s context but could include the following sources: a learning management system, the student information system, and web conferencing tools such as Zoom. The same data gathered for descriptive analytics can potentially be reused for both diagnostic and prescriptive analysis. Learning analytics can give the library another way to partner with other departments: to increase retention rates, improve student grades, identify at-risk students and refer them to support services, and even change institutional practices and policies.
These potential uses for learning analytics do not come without concerns. Important ethical questions emerge when academic libraries grapple with the use of learning analytics. Our profession’s traditional values around privacy, anonymity, confidentiality, and intellectual freedom can come into tension with practices that track individual-level data, particularly when libraries do not offer opt-in and opt-out choices for data collection. For example, when the Lapp Learning Commons at the Chicago Theological Seminary adopted OpenAthens, we discovered the ability to track individual students’ login patterns even though we may not necessarily see which items they have accessed. This discovery raises the question: just because we have access to data, does it mean that we should use it? Yet if the institution is leveraging data to support student success, is this not an opportunity for the library to demonstrate the value it adds? How do we as librarians reconcile student privacy rights and informed consent with the inclusion of this type of data?
Librarians have for decades embraced protocols that protect users from the recording and retention of personally identifiable information. To what extent does our engagement with institutional learning analytics compromise these values if individual data is not properly secured, or if there is no policy framework to govern the use of this data? These are questions that librarians are well positioned to ask and explore within their institutional context. Perhaps the value librarians add is their ability to facilitate robust discussion about what data needs to be collected before plunging into discussion about what we can do with data.
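If a library does decide to contribute individual-level data to an institution-wide system, one common protective measure is pseudonymization: replacing raw student IDs with keyed hashes before the data leaves the library. The sketch below is a minimal illustration under assumed names, not a complete privacy solution, and salted hashing alone does not prevent re-identification in small populations:

```python
import hashlib

# Illustrative only: a real deployment needs proper secret management
# and a governance policy covering who may re-link identifiers.
SALT = b"replace-with-a-secret-salt"

def pseudonymize(student_id: str) -> str:
    """Replace a raw student ID with a salted SHA-256 digest."""
    return hashlib.sha256(SALT + student_id.encode()).hexdigest()

# Hypothetical per-student loan counts, keyed by raw ID.
loans = {"S1001": 12, "S1002": 3}

# Only the pseudonymized keys leave the library.
shared = {pseudonymize(sid): count for sid, count in loans.items()}
print(shared)
```

Pairing a technical safeguard like this with a policy framework keeps the decision about what to share where it belongs: in governance, not in the export script.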
Absolute privacy does not exist, and not all learning analytics is harmful. So what practical guidance is there to help balance privacy with other values? The Privacy Sourcebook is a resource developed for the IMLS-funded course Prioritizing Privacy: Data Ethics Training for Library Professionals, taught by Lisa Janicke Hinchliffe from the iSchool at UIUC and Kyle Jones from the School of Informatics and Computing at IUPUI. It is a helpful guide for starting conversations at your school about learning analytics and privacy. Some of the elements the Privacy Sourcebook suggests libraries explore include the following:
- Conduct an environmental scan of learning analytics practices. Are there synergies between the library data you collect and broader learning analytics efforts at your school? There are many ways institutions can pursue learning analytics. What approach is used? What data is currently being captured? Does this data exist in departmental silos, or are there efforts to integrate it? For what purpose is the data used? Is it personalized learning support? Is it predictive, in that it tries to identify potential at-risk students? Who are the stakeholders? What campus policies or state/federal laws about data and data protections affect learning analytics activities (e.g., FERPA, GDPR, data governance, IT privacy policy, Library Circulation Records Confidentiality Act)? What kind of data does your school track, and can library data help complete the picture of student learning and success at your institution? What security or ethical concerns are there vis-à-vis individual data?
- Develop a philosophy statement about learning analytics for your library. Where does your library stand on the ethical, legal, and logistical issues related to learning analytics? Some questions to consider when developing a philosophy statement include: What should learning analytics be used for? What should it not be used for? How do you understand data privacy and the role of consent in learning analytics? How do you communicate your use of learning analytics to your students? What should your library consider when deciding whether and how to engage with learning analytics? Are existing guidelines from ALA, IFLA, and other library associations around patron privacy and other ethical issues sufficient in the era of big data?
- Develop talking points for different audiences as they relate to data ethics, data management, and trust. How would you convey your message or questions to students, faculty, the director of online learning, the academic dean, the dean of students, the director of student life, etc.? Each of these actors has different interests.
Libraries possess a considerable amount of identifiable student data. Any integration of library analytics data with wider institutional learning analytics objectives should weigh not only methodological and technical considerations but ethical and legal ones too. There is so much for librarians to share when they learn to tell their story.
Resources
Scholarly Organizations/Associations
Society for Learning Analytics Research (SOLAR): https://www.solaresearch.org
Learning Analytics Research Network (LEARN): https://steinhardt.nyu.edu/learn
Communities of Practice
ACRL – Learning Analytics Toolkit: https://acrl.libguides.com/val/latoolkit/sourceofdata
Deon – Ethics checklist for data scientists: https://deon.drivendata.org/#background-and-perspective
Proceedings of the International Conference on Learning Analytics and Knowledge: https://dl.acm.org/doi/proceedings/10.1145/3448139
Companies/Vendors
JISC – Code of Practice: https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics
Works Cited
Consortium of Academic and Research Libraries in Illinois. n.d. “Professional Development, CARLI Counts.” Accessed July 7, 2022. https://www.carli.illinois.edu/products-services/prof-devel/carli-counts.
Conole, Gráinne, Dragan Gasevic, Phillip Long, and George Siemens. 2011. “Message from the LAK 2011 General and Program Chairs.” Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada, February 27–March 1, 2011: 3-4. https://dl.acm.org/action/showFmPdf?doi=10.1145/2090116.
ECAR-ANALYTICS Working Group. 2015. “The Predictive Learning Analytics Revolution: Leveraging Learning Data for Student Success.” ECAR working group paper. Louisville, CO: ECAR.
Hinchliffe, Lisa Janicke, and Kyle M. L. Jones. 2020. “New Methods, New Needs: Preparing Academic Library Practitioners to Address Ethical Issues Associated with Learning Analytics.” ALISE 2020 Conference Proceedings (October): 184-194. http://hdl.handle.net/2142/108811.
———. 2022. “Privacy Sourcebook.” February 2022. https://osf.io/ga8ns/.
Oakleaf, Megan. 2016. “Getting Ready and Getting Started: Academic Librarian Involvement in Learning Analytics Initiatives.” Journal of Academic Librarianship 42, no. 4: 472-475.
Oakleaf, Megan, Malcolm Brown, Dean Hendrix, Joe Lucia, and Scott Walker. 2019. “When Roles Collide: Librarians as Educators and the Question of Learning Analytics.” Paper presented at the ACRL 2019 National Conference: Recasting the Narrative, Cleveland, Ohio, April 10–13, 2019. https://alair.ala.org/handle/11213/17706.