Workshops

Workshops will not be offered for the 2020 virtual conference. This decision was made in light of the difficulty of running interactive, participatory workshops without significant investments in technology and time, and with the recognition that it is likely that many potential attendees are already experiencing fatigue from extended online work amidst challenging work situations.

We received many wonderful submissions and highlight them below.

Rapid Space Assessment: Gathering Data for Unexpected Questions

Accepted as a 90-minute in-program session

Gina Petersen (Northwestern University Libraries), Katie Ediger (Harry S Truman College), Elizabeth Edwards (University of Chicago), and Gabrielle (Abby) Annala (Loyola University Chicago)

Imagine there are discussions about starting a new resource center on campus or in your community. Your library has been identified as a possible location. The director of the library is meeting with stakeholders in two weeks. To prepare for this meeting, the director needs information about who uses a particular space within the library, why they use it, and where those users would go if the XYZ study space is reimagined as this center. How would you tackle this scenario?

Much of the library literature around space assessment focuses either on needs assessments conducted in the lead-up to a renovation or on the application of ethnographic methods to open-ended user research. These methods and approaches can produce rich data that meaningfully informs decision-making, but they also take time and training to be done effectively.

In a perfect world, we would have the time, resources, and support to proactively gather information about space use and impact. However, the need for data often emerges with little notice, necessitating an urgent, agile, and creative response.

This session will focus on how to scope and conduct timely, library-specific space assessments that can be completed on a short timeline.

We’ll start by sharing case studies involving urgent, time-sensitive data requests about a range of library spaces. We will identify the timeline, stakeholders, methods used, data collected, and strategies for reporting. We will discuss the strengths, limitations, and lessons learned.

We will discuss how to mine and triangulate data, including using headcounts, observational data, logs of use, anonymous feedback (white boards, sticky notes), or quick polls. For those working in an academic context, we will briefly discuss the role of the campus institutional review board. We will touch upon bandwidth and how to know when ‘enough’ information has been collected.
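
To make the triangulation step concrete, here is a minimal sketch in Python (with invented data and a hypothetical zone name; not part of the workshop materials) that joins staff headcounts with gate-count logs so two views of the same space can be compared:

    import pandas as pd

    # Hypothetical hourly headcounts recorded by staff walking the space
    headcounts = pd.DataFrame({
        "hour": [10, 11, 12, 13],
        "zone": ["XYZ study space"] * 4,
        "count": [12, 18, 31, 27],
    })

    # Hypothetical gate-count log for the same building and day
    gate = pd.DataFrame({
        "hour": [10, 11, 12, 13],
        "entries": [40, 55, 90, 70],
    })

    # Triangulate: what share of people entering the building
    # is observed in the space under study each hour?
    merged = headcounts.merge(gate, on="hour")
    merged["share_of_entries"] = merged["count"] / merged["entries"]
    print(merged[["hour", "count", "entries", "share_of_entries"]])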

Following this presentation, participants will work together in groups to explore sample scenarios from a range of institutions with varying organizational structures and priorities. They will brainstorm types of data that could be collected and how those data could inform different areas of decision-making and advocacy. Each group will create a project outline to present to and discuss with the rest of the workshop.

Participants will leave the workshop with checklists, a project template, and other tools they can apply to space assessment needs at their institutions.

Learning Outcomes:

  1. Participants will develop a working definition of space assessment that can be used to clearly scope future projects at their institutions.
  2. Participants will articulate how to construct questions and projects when information is needed quickly.

Audience

Professionals with some assessment experience are the intended audience for this workshop.

Gina Petersen
Assessment Librarian
Northwestern University Libraries
gina.petersen@northwestern.edu

Gina Petersen has been the Assessment Librarian at Northwestern University Libraries since 2015. In this role, she’s conducting a series of cascading space assessments, each of which seeks to illuminate space use either within different spaces or by different user types. She particularly enjoys conducting meta-analyses of these projects. She has an MS in Library and Information Science from UIUC.

Katie Ediger
Library Chairperson
Harry S Truman College
kediger@ccc.edu

Katie Ediger has been the Library Chairperson at Harry S Truman College since 2019. She is an active member of Truman’s Assessment Committee and has been working on several space assessments. She has an MLIS from Dominican University.

Elizabeth Edwards
Assessment Librarian
University of Chicago
eee@uchicago.edu

Elizabeth Edwards has been responsible for assessment at the University of Chicago Library since 2012. Her recent projects have included the organization of a library-wide census and the administration of the Ithaka S+R student and faculty surveys. Elizabeth serves on the LibraryUX Chicago steering committee, and holds an MS and CAS in Library and Information Science from the University of Illinois at Urbana-Champaign.

Gabrielle (Abby) Annala
Assessment Librarian & Business Research Specialist
Loyola University Chicago
gannala@luc.edu

Abby Annala is the Assessment Librarian and Business Research Specialist at Loyola University Chicago. Previously she served as the Marketing and Community Engagement Coordinator at the Morton Grove Public Library. She studied library marketing at Dominican University and concentrated in marketing while earning her MBA at Loyola University Chicago. She spent three years planning the Library Marketing and Communications Conference and is the former convener of Academic Library Marketing. She now teaches Leadership, Marketing, and Strategic Communication to library science students at Dominican University, as well as Research Methods at Loyola University Chicago.

Advanced Data Visualization: Concepts & Techniques to Craft Powerful Assessment Narratives with Tableau

Accepted as a half-day workshop

Jen-chien Yu (University of Illinois at Urbana-Champaign) and Sarah Murphy (The Ohio State University)

The growing popularity of data visualization tools such as Tableau and Microsoft Power BI presents assessment professionals with both opportunities and challenges in their daily work. Assessment librarians in particular not only incorporate data visualization into their workflows but are often asked to teach visualization skills internally to library colleagues and externally to the greater campus community. While tools like Tableau make it easy to analyze data and create graphs, text tables, and dashboards, they do not guarantee the quality or effectiveness of visualized data. To successfully convey a story, data visualization must be informational, accessible, and visually engaging. Data visualization is thus both an art and a craft that requires knowledge and practice to master.

This workshop is designed to help library assessment professionals who have some experience with Tableau or a similar analytics/visualization tool deepen their knowledge of methods and principles of information design and then apply this knowledge to create effective visualizations using assessment data in a fun and engaging environment. Participants will review best practices for visualizing data, supported with examples of truly awesome representations of data (TARDs) and their hapless cousins, truly unfortunate representations of data (TURDs). Several hands-on activities will give participants the opportunity to practice concepts and begin to build more advanced, visually engaging, interactive data views using advanced analytics and visualization features in Tableau. Participants will leave the workshop armed with knowledge, references, and tools that will allow them to create dashboards for effective data storytelling.

Specifically, by the end of the workshop, participants will be able to:

  • Apply methods and information design principles to create effective visualizations using assessment data
  • Build advanced, visually engaging, interactive views using parameters, actions, level of detail calculations, analytics, and other techniques in Tableau
  • Create dashboards that incorporate the design principles required for effective data storytelling

The instructors will employ a number of strategies to help participants learn and apply the methods and information design principles as well as master advanced Tableau techniques. Participants are asked to bring a laptop with an installed version of Tableau Desktop or Tableau Academic to the workshop. All instructional activities will incorporate a variety of data related to libraries, such as COUNTER reports, library statistics, or basic library catalog data. Workshop materials will be provided through a Tableau Packaged Workbook, allowing participants to both reference and practice concepts in one centralized place and share the packaged workbook with colleagues when they return home.
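
For readers curious what a "level of detail" calculation does, here is a rough analogy in Python/pandas rather than Tableau itself (hypothetical data; the Tableau expression in the comment uses the standard FIXED LOD syntax):

    import pandas as pd

    # Hypothetical visit counts by branch and month
    visits = pd.DataFrame({
        "branch": ["Main", "Main", "Science", "Science"],
        "month": ["Jan", "Feb", "Jan", "Feb"],
        "visits": [1200, 950, 400, 520],
    })

    # Analogous to the Tableau LOD expression { FIXED [branch] : SUM([visits]) }:
    # attach a branch-level total to every row, independent of the view's detail
    visits["branch_total"] = visits.groupby("branch")["visits"].transform("sum")
    visits["share_of_branch"] = visits["visits"] / visits["branch_total"]
    print(visits)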

All participants will be invited to take part in an informal Iron Viz competition at the end of the workshop. What is an Iron Viz? Each year Tableau hosts a formal Iron Viz competition during the Tableau Conference. Modeled on an Iron Chef competition, but with the tagline “Win or learn — you can’t lose!”, it invites three Tableau users on stage to create a data visualization story from a common dataset. The workshop Iron Viz will be very low-key, but designed to give participants a fun environment in which to apply the techniques and methods they learned during the workshop and receive feedback from fellow participants.

The tentative workshop agenda is below. The facilitators are open to reworking the agenda for a half-day workshop.

Agenda

9:00–9:30 Welcome, introduction and overview

9:30–10:00 Principles and Methods I: Technical Considerations

10:00–10:30 Activity I

10:30–10:45 Break

10:45–11:15 Advanced Tableau I: Charts and Analytics Tools

11:15–11:45 Activity II

11:45–12:00 Review and Debrief

12:00–1:00 Lunch

1:00–1:30 Principles and Methods II: Narrative Design/Storytelling

1:30–1:45 Activity III

1:45–2:15 Advanced Tableau II: Parameters, Actions, LOD Calculations, and Other Techniques

2:15–2:45 Activity IV

2:45–3:00 Break

3:00–4:00 1st Annual Library Assessment Conference Iron Viz

4:00–4:30 Next Steps

Learning Outcomes

Participants will be able to:

  • Apply methods and information design principles to create effective visualizations using assessment data
  • Build advanced, visually engaging, interactive views using parameters, actions, level of detail calculations, analytics, and other techniques in Tableau
  • Create dashboards that incorporate the design principles required for effective data storytelling

Audience

This workshop is designed for library assessment professionals who have some experience with Tableau or a similar analytics/visualization tool and who want to deepen their knowledge of information design and apply it to assessment data.

Jen-chien Yu is the Director of Library Assessment at the University of Illinois at Urbana-Champaign. Jen coordinates library assessment programs and works closely with faculty and staff across the library and around the campus to design and implement assessment activities that support data-informed decision making related to library services, collections, technology, and facilities. Jen has taught data literacy and Tableau Public workshops to librarians, faculty, and students.

Sarah Murphy is a professor, the data literacy and visualization librarian, and former assessment coordinator at The Ohio State University Libraries. Murphy helps researchers gather, prepare, and clean data and communicate effectively using various analytics and visualization tools. She currently leads the Ohio State Tableau Users Group and continues to build student success narratives for academic libraries using analytic and visualization tools.

Walking the Magnificent Mile: From Strategic Planning to Implementation and Assessment

Accepted as a half-day workshop

Maurini Strub (University of Rochester) and Starr Hoffman (University of Nevada, Las Vegas)

Organizations invest a great deal of resources in developing a strategic plan, only for it frequently to land on a shelf, in a file cabinet, or in an electronic archive. We will look at best practices for creating, implementing, and managing a strategic plan that remains at the forefront of every staff member’s mind. Throughout the workshop, we will engage participants in a blend of topical lecture and hands-on activities, such as worksheets and small-group discussion, to put their new skills into practice.

Participants will examine how to walk their organization through the process of creating a strategic plan, including demonstrating some worksheets and exercises to jump-start the process. We will consider how to create and balance different kinds of goals (aspirational, strategic, operational, etc.), how to develop a shared vocabulary, and how to bake assessment in from the beginning. We will particularly focus on constructing a planning process that is inclusive and involves all levels of the organization, and how that can increase buy-in. The end goal is a living document, presented in a digestible format that is flexible and outcomes-based.

Next, the workshop will show how to apply an outcomes-based assessment framework to create a required (but flexible) structure that drives projects advancing strategic goals. We will present this framework by sharing project planning documents that can increase buy-in by creating a shared understanding of scope and success criteria while baking in accountability. We will also cover how to design an assessment and communication plan for implementation, one that keeps the strategic plan at the forefront of everyone’s minds and incorporates regular communication on progress, evaluation, and reflection. Finally, we will address challenges in using the tools, some ways participants might encounter organizational resistance, and strategies for mitigating common pitfalls.

Learning Outcomes

  • Construct a structure for creating and implementing a strategic plan.
  • Identify, manage, and realign cultural mismatches of operational vs strategic work.
  • Build in accountability in an implementation plan.

Audience level: Beginner or intermediate

This workshop is suited to anyone leading or involved in a strategic planning process. It will provide guidance for beginners as well as tips and best practices for those with more experience.

Maurini Strub
Director of Library Assessment
University of Rochester
maurini.strub@rochester.edu

As the Director of Library Assessment for the University of Rochester, Maurini Strub manages, leads, and collaborates on projects focused on gathering, analyzing, and using high-quality, actionable data to determine the value of library services, programs, learning spaces, and resources. She has a strong background in user-centered design, and her prior assessment work at the University of Louisville Libraries focused on spaces and services. Recently, she facilitated a session at the Canadian Libraries Assessment Workshop entitled Evaluating and Managing the Implementation of Your Strategic Plan and a section of Library Planning, Marketing, and Assessment at Syracuse University.

Starr Hoffman
Director of Planning & Assessment
University of Nevada, Las Vegas
starr.hoffman@unlv.edu

Starr Hoffman (MLS, MA, PhD) is Director of Planning & Assessment and a member of the senior administrative team for the University of Nevada, Las Vegas Libraries. Her work includes leading strategic planning and library assessment to improve services and support decision-making, as well as contributing to the accreditation process. She has led workshops on strategic planning and assessment at ACRL, LAC, and the International Conference on Library Performance Measurement. Dr. Hoffman’s scholarship includes many peer-reviewed articles and the chapter “Triangulating an Assessment Plan” in the recent book, Academic Libraries and the Academy: Strategies and Approaches to Demonstrate Your Value.

Libraries and Learning Analytics: Facts, Fallacies, and Future Forays

Accepted as a full-day workshop

Megan Oakleaf (Syracuse University), Ken Varnum (University of Michigan), Rebecca Croxton (University of North Carolina at Charlotte), and Anne Cooper Moore (University of North Carolina at Charlotte)

Learning analytics offers a new tool in the library assessment toolbox, one that closes gaps left by other assessment methods but also raises myriad questions for librarians. As higher education institutions expand their learning analytics initiatives, librarians need to prepare to participate in learning analytics as campus partners. Librarians should be aware of current definitions and common deployment models for institutional learning analytics, understand the purposes and pitfalls of learning analytics as an approach to support student learning, and develop a plan for learning analytics engagement that fits their values as well as institutional and library needs. To advance this work, librarians need to know the “facts,” avoid “fallacious” diversions, and align their ethics, priorities, and options to determine a future course for involvement in learning analytics.

This workshop will open with introductory content defining learning analytics from a higher education perspective, identifying misalignments between library use of the term and broader institutional uses, outlining the purposes of learning analytics as an assessment approach in support of student success, and describing six “false choices” that often derail discussions of learning analytics in the library community. Following this introduction, participants will move into a series of five hands-on activities:

  1. For the first activity, participants will work in small groups to sort “user stories” related to library integration into learning analytics into four categories ranging from low to high impact, with a separate category for user stories that generate significant concern. Afterwards, participants will prioritize high impact user stories and consider which might be relevant for their home institutions. They will also brainstorm ways to mitigate the concerns generated by potentially problematic user stories.
  2. For the second activity, participants will review lists of 1) questions that the inclusion of library data in institutional learning analytics might help answer and 2) actions that library integration in institutional learning analytics might help librarians take. Working in small groups, participants will cross out questions and actions that they perceive as not interesting, relevant, or impactful and circle those that may be interesting, relevant, or impactful to them or their home institutions. At the end of this exercise, participants will prioritize and rank the questions and actions they circled.
  3. For the third activity, participants will examine a list of obstacles to library integration in learning analytics (e.g., privacy concerns, data quality, data granularity, data access, organizational culture), adding to it as needed. Participants will vote on which obstacles to focus on, and small groups will then be formed to address individual obstacles. For each obstacle, groups will engage in problem solving using a four-step process: 1) list what is known about the obstacle, 2) list what is unknown and draft questions expressing those gaps, 3) describe what a successful mitigation or resolution of the obstacle would look like, and 4) outline a course of action or series of strategies for pursuing an acceptable solution.
  4. For the fourth activity, participants will engage with a list of library data points and sources as well as static and dynamic institutional data points. In small groups, participants will consider whether, or to what degree, connecting these data points might reveal information that would allow libraries to better support student learning or empower students to make more informed decisions about their interactions with libraries and then brainstorm research questions drawing on this data that, once answered, might promote better library or student decision-making and action-taking.
  5. For the fifth activity, participants will review lists of 1) possible roles librarians could play in learning analytics at their institutions and in their libraries and 2) possible next steps for library integration in institutional learning analytics. Working independently or in pairs, participants will cross out roles and steps that they perceive as not interesting, relevant, or impactful and circle those that may be interesting, relevant, or impactful to them or their home institutions. At the end of this exercise, participants will prioritize and rank the roles and steps they circled.

Following these activities, participants will engage in a reflective discussion to identify what they have learned about the library’s possible role in learning analytics and what they have yet to learn. Finally, they will craft action plans including steps to follow to continue their development in this area, resources they need to move forward, partnerships to initiate or develop, questions to ask, conversations that need to begin or continue, etc. The workshop will close with participants sharing one step they plan to take to move forward.

Learning Outcomes:

  • Participants will be able to define learning analytics in a higher education context in order to prepare for, ask questions about, and/or engage in learning analytics work at their institutions.
  • Participants will be able to identify, envision, and articulate possible contributions libraries may make to learning analytics initiatives in order to ensure librarian involvement in the ethical and responsible support of student success at their institutions.
  • Participants will be able to develop an action plan to extend their own learning and engagement in learning analytics that aligns with their personal and professional values and contributes to ongoing work to support the teaching and learning missions of their institutions.

Audience

This workshop can be adapted to the audience that registers.  It is designed for those new to learning analytics work.  Should more advanced participants register (which will be known in advance through the registration list and a pre-assessment prior to the conference), the activities can be adjusted or enriched to engage a more experienced group.

Megan Oakleaf is an Associate Professor of Library and Information Science in the iSchool at Syracuse University. She is the author of The Value of Academic Libraries: A Comprehensive Research Review and Report and Academic Library Value: The Impact Starter Kit. She is the Principal Investigator for the IMLS-funded Library Integration in Institutional Learning Analytics (LIILA) and Connecting Libraries and Learning Analytics for Student Success (CLLASS) grants.

Ken Varnum is the Senior Program Manager for Discovery, Delivery, and Library Analytics at the University of Michigan Library. Ken’s research and professional interests include discovery systems, library analytics, and technology in the library setting. He has written or edited six books; the two most recent, “Beyond Reality: Augmented, Virtual, and Mixed Reality in the Library” and the LITA Guide “New Top Technologies Every Librarian Needs to Know,” were published in 2019. He can be found on Twitter at @varnum.

Rebecca (Becky) Croxton is the Head of Assessment for J. Murrey Atkins Library at the University of North Carolina at Charlotte. She previously worked as a reference librarian at Johnson & Wales University’s Charlotte campus and at Central Piedmont Community College. She earned a PhD in educational studies with a doctoral minor in educational research methods, as well as her MLIS degree, from the University of North Carolina at Greensboro (UNCG). Her key research interests include quantifying the value of the academic library; the information-seeking needs, preferences, and motivation of undergraduate students; online learning; and professional identity development. She is an active member of the Association of College & Research Libraries’ Value of Academic Libraries Committee and the Library Leadership & Management Association (LLAMA) Assessment Community of Practice. Becky is also an adjunct professor at UNCG and frequently teaches courses in data visualization, media production services for library programs, and instructional technology.

Anne Cooper Moore joined UNC Charlotte as Dean of the J. Murrey Atkins Library in June 2015. Previously she served as Dean of Library Affairs at Southern Illinois University (SIU) Carbondale from 2012 to 2015 and as Dean of Libraries at the University of South Dakota (USD) in Vermillion from 2008 to 2012. She also worked in libraries at the University of Massachusetts Amherst, George Mason University, and the University of Arizona. She is the current president of the Library Leadership and Management Association of the American Library Association, chair of the UNC System University Library Advisory Council, and a board member for the Association of Southeastern Research Libraries. She holds a PhD in Educational Management and Development from New Mexico State University, an MSLIS from the University of North Carolina at Chapel Hill, and a BA from Duke University.

Anything with a Hash is for Tweets! Implement the READ Scale (Reference Effort Assessment Data) and Record Reference Value for Real

Accepted as a 90-minute in-program session

Bella Gerlich (Goucher College) and Cynthia L. Henry (Texas Tech University Libraries)

The READ Scale is a six-point (1–6) sliding scale tool created by Bella Gerlich that asks reference librarians to assign a number based on the effort, skills, knowledge, teaching moment, and techniques and tools used during a reference/research transaction, instead of recording a hash mark. The instrument also has practical applications, including staffing strategies, training and continuing education, establishing outreach needs, renewed personal and professional interest, updated reporting and statistics models, and demonstrating return on investment. The ACRL Board of Directors (2017) and the SAA Council (2018) approved the READ Scale as an ‘advanced statistical measure’ for recording reference transactions in the joint ‘Standardized Statistical Measures and Metrics for Public Services in Archival Repositories and Special Collections Libraries.’ A number of articles have been written about the Scale since it was introduced at ALA in 2010 following a national study, and it is taught in LIS courses. Approximately 350 libraries from all over the world use the Scale: public, health sciences, special, government, law, archives, and academic libraries, public and private, of all sizes.

The READ Scale has also been incorporated into Springshare Libstats at no additional cost to subscribers, making it an easy-to-use tool and a way to normalize services and compare with like institutions, but, more importantly, to recognize the effort of assisting people with research queries with tangible data. NOTE: You DO NOT have to use Springshare to be a part of this workshop; the Scale can be used with any data collection system. This will be covered in the workshop.
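
To make the contrast with hash marks concrete, here is a minimal sketch in Python (a hypothetical transaction log, not an official READ Scale tool) showing what recording a 1–6 effort value per transaction makes possible that a simple tally cannot:

    from collections import Counter

    # Hypothetical reference transactions, each scored 1-6 on the READ Scale
    transactions = [1, 1, 2, 4, 5, 2, 3, 6, 2, 4]

    # A hash mark only gives you the total...
    print("Total transactions:", len(transactions))

    # ...while READ values show how effort is distributed
    distribution = Counter(transactions)
    for level in range(1, 7):
        print(f"READ {level}: {distribution.get(level, 0)}")

    # Share of high-effort (4-6) transactions, useful for staffing decisions
    high_effort = sum(1 for t in transactions if t >= 4) / len(transactions)
    print(f"High-effort share: {high_effort:.0%}")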

  1. Introduction: The Benefits of Using the Tool and Engaging Stakeholders
    What is the READ Scale? Dr. Gerlich will share the inspiration for developing the READ Scale: interviews with reference librarians and administrators who agreed unanimously that the traditional counting method measured neither the effort nor the knowledge involved, leaving the service undervalued. Changing how we record reference statistics takes work, because it means changing something we have always done the same way. What are the benefits? Gerlich and Henry will address the benefits, how to engage stakeholders, and why adopting the Scale can be beneficial to them.
  2. Implementing the Scale
    In this section, Dr. Gerlich and Ms. Henry will discuss ways to implement the READ Scale based on library type: academic, health sciences, special libraries, and archives. Groups will work together to create sample implementation materials and exercises that enable a confident approach to introducing the Scale at their home institutions. Ways to record the statistics will be discussed and determined for each participant based on their library’s practice.
  3. Assessing Training and Planning Outcomes
    Dr. Gerlich and Ms. Henry will review training overall and home in on desired outcomes for using the READ Scale based on discussions. For this exercise, like institutions may be paired to enable a shared understanding of size, budget, and other factors that might affect the training options and outcomes desired.
  4. Using READ Scale Statistics for Reporting Purposes
    Dr. Gerlich and Ms. Henry will end the workshop with a discussion of how data gathered from READ Scale statistics can be, and have been, used in practical applications: creating reports that illustrate the activities of reference librarians to stakeholders and build support for services; training for students and staff; ROI; renewed professional interest; time estimates; and so on. Papers and presentations published by librarians who have used the READ Scale will be discussed, citing real-world examples of how using the Scale is better than a hash-anything.

A list of references will be provided to workshop attendees. It is suggested that participants bring laptops or tablets so they can create forms and other materials.

Audience

New librarians, managers of reference or public services, and anyone in between

Bella Karr Gerlich, PhD
Interim College Librarian
Goucher College
bkarrgerlich@yahoo.com

Bella Karr Gerlich, PhD, began working at Texas Tech University (TTU), a Tier One ARL institution and founding member of the Texas Digital Library, in 2015 as Special Projects Librarian. Dr. Gerlich draws on leadership expertise from over 25 years in academic libraries, including administrative appointments at Texas Tech, the University of Alaska Fairbanks, Dominican University, Georgia College & State University, and Carnegie Mellon University.

Dr. Gerlich’s research interests include assessment, advocacy, organizational strategy / planning and valuation of services; she has authored or co-authored numerous peer-reviewed publications, participated in panel sessions and has been invited to present at conferences in the United States and abroad on a variety of topics in information sciences and leadership.

Dr. Gerlich has been recognized for her contribution to librarianship, working to acknowledge the hidden work of library staff by developing a qualitative data-gathering methodology. She created the Reference Effort Assessment Data (READ) Scale, a 1–6 sliding scale tool for recording the effort, knowledge, skill, and teaching that occur during reference transactions. Over 350 libraries worldwide use the tool, which has been incorporated into commercial and open-source data analysis products.

Dr. Gerlich has a BFA from Virginia Commonwealth University, a Master of Public Management from Carnegie Mellon, and a PhD from the University of Pittsburgh. She recently completed the Carnegie Mellon Leadership and Negotiation Academy for Women at the Tepper School of Business and was the 2013 recipient of the University of Pittsburgh School of Information Sciences Professional Achievement Award.

Cynthia Henry
College of Human Sciences
Texas Tech University
(806) 834-0898
cynthia.henry@ttu.edu

Cynthia L. Henry has been a librarian since graduating from Texas Woman’s University in 2004. Hired as a subject librarian at Texas Tech University in 2005, she enjoys working on an academic campus. Meeting the research needs of faculty and students in her subject area, the College of Human Sciences, is a rewarding experience, as is serving as an instructor for the Essentials of Scholarly Research course.

How to Transform Library Spaces, Services, and Staffing to Create 21st-Century Learning Environments

Accepted as a half-day workshop

Martha Kyrillidou (QualityMetrics, LLC) and Elliot Felix (brightspotstrategy)

Learning Outcomes:

Participants will

  1. Walk through the 10 steps of library transformation through tailored discussions
  2. Explore the information needs of different library personas and empathize with their users
  3. Understand the importance of post-occupancy evaluation and its benefits, and work through a case study

Purpose:

This workshop explores assessment practices in library spaces, services, and staffing to create 21st-century learning environments. It focuses on the transformation of libraries through well-tested engagement methods that brightspotstrategy and QualityMetrics have been using to help libraries connect people, programs, and spaces and transform the student experience.

This workshop aims to interactively explore how to transform library spaces, services, and staffing to create 21st-century learning environments. We will review case studies tailored to the audience’s biggest planning challenges and walk through the 10 steps of library transformation in those tailored discussions:

  1. Conduct internal and external research
  2. Establish your vision for the future
  3. Forecast your space, service, and technology needs
  4. Create a playbook of ideas
  5. Update services and integrate partners
  6. Rationalize space/services across campus
  7. Rationalize space/services within the library
  8. Identify phases for implementation
  9. Identify and implement pilot projects
  10. Redesign and develop the organization

Case studies from a variety of institutions such as Georgia Tech, Temple University, and the University of Miami will examine issues of renewal, visioning and implementation, and new service models.

We will engage in understanding needs assessments and post-occupancy evaluation (POE) with hands-on exercises reflecting specific personas and specific use-case tools. Participants will explore how they can turn their user survey data into personas for academic or public library audiences. They will explore how to engage in POE; articulate the needs, the issues, the purpose, and the approaches; and see how the information can feed into library policies and procedures. Space planning does not end when a new building is built. It is an ongoing, iterative, engaging process of reinventing and rediscovering how we engage with each other and our world.

Discussion questions:

  • How might you assess, plan, and transform spaces in your library? Name one thing to keep, one thing to toss and one new thing to create
  • What are the needs of your undergraduate students and how do you know about them? Name your student, their personality, and their wants and needs
  • What are the needs of your graduate students and how do you know about them? Name your student, their personality, and their wants and needs
  • What are the needs of your faculty and how do you know about them? Step into the shoes of your faculty, their personality, and their wants and needs
  • What is the dominant characteristic of your library service ten years into the future? Dream the dream: the library of the future

Audience

Professionals new to space planning and assessment

Martha Kyrillidou is founder and CEO of QualityMetrics LLC, since 2016 working with academic, public, and government libraries and agencies and helping them with their transformation and innovation strategies. She serves as the chair of NISO Z39.7 and as an expert advisor to ISO and the IMLS Library Statistics Working Group. She is an experienced LIS evaluator and researcher, having designed and taught research methods courses at the University of Maryland and at Kent State University. She has a PhD from the iSchool at the University of Illinois. Martha led what is now known as the Research and Analytics capability at the Association of Research Libraries (ARL) from 1994 to 2015. She co-developed the well-known LibQUAL+ protocol and also established ClimateQUAL, MINES for Libraries, and many value-based assessment protocols that were readily adopted by libraries, having built an internal data analysis capability collecting the well-known ARL Statistics and the ARL Annual Salary Survey. She currently deploys UX and assessment methods, consulting on strategy, new spaces, and transformative library services.

Elliot Felix is the founder of brightspot, a strategy consultancy that is reimagining the higher education experience through integrated organizational, operational, and space planning projects that increase student, faculty, and staff engagement on campus and online.

Build a Live & Interactive Library Dashboard Using Google Data Studio

Accepted as a half-day workshop

Kineret Ben-Knaan (University of Miami Libraries) and Cameron Riopelle (University of Miami Libraries)

Abstract

Libraries are good at collecting and interpreting data but need to become better at sharing results. We need to make our analytics insights accessible—both internally and externally—so they do not exist in a vacuum but instead demonstrate value and inform decision making.

One step toward building a culture of assessment at your institution is the presentation of real-time data (from multiple sources) in a clear and visually engaging manner. Data collection can be messy, drawing from numerous service areas using different data tools and methods. There is information on expenditures, collections statistics, teaching support, technology use, and both physical and virtual traffic analytics. Stakeholders throughout a university’s libraries may have access to some sources relevant to them but be unaware of others. As such, creating a dashboard that consolidates different data streams, filtered and displayed in a single report, is important for getting stakeholders involved and putting your insights into practice. In addition, while proprietary software such as Tableau is an option for some institutions, the full version can be prohibitively expensive, and the free version makes all of your data public without access restrictions.

For the 2020 Library Assessment Conference, we planned to present a workshop to guide participants through the process of setting up a dashboard using Google Data Studio and connecting this dashboard to data streams. At the end of the session, participants were to have created a sample dashboard that they could enhance and easily share with colleagues and administrators.

Audience

This workshop was intended for anyone interested in creating data dashboards for their institution. There was no requirement for advanced technical skills, only knowledge of their institution’s different data collection products.

Topics Covered

We planned to begin by briefly describing the current dashboard/visualization/business intelligence landscape, with an emphasis on products and techniques relevant to standard library data streams. We then intended to demo the University of Miami Libraries dashboards and show examples of what data might be captured and what sort of questions might be answered with such a tool. This workshop was designed to walk participants through the process of setting up a Google Data Studio account, building data connectors, interacting with the data, and generating visualizations.
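
As an illustration of the data-shaping that precedes the dashboard itself (Data Studio is configured in the browser, so only the preparation step is shown; the data and file name are hypothetical), here is a short Python sketch consolidating two data streams into one tidy file that can be uploaded or pasted into a Google Sheet and connected as a Data Studio source:

    import pandas as pd

    # Hypothetical exports from two separate library systems
    gate = pd.DataFrame({"date": ["2020-01-01", "2020-01-02"],
                         "entries": [812, 945]})
    chat = pd.DataFrame({"date": ["2020-01-01", "2020-01-02"],
                         "chats": [17, 23]})

    # Consolidate into one long, tidy table: date, metric, value
    tidy = pd.concat([
        gate.melt(id_vars="date", var_name="metric", value_name="value"),
        chat.melt(id_vars="date", var_name="metric", value_name="value"),
    ], ignore_index=True)

    # A single tidy file keeps the Data Studio data source simple;
    # upload it or paste it into a Google Sheet and connect from there
    tidy.to_csv("library_dashboard_source.csv", index=False)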

Finally, we prepared to discuss how one can customize dashboards for different stakeholders’ needs, from librarians trying to improve their collection development decisions, to a department head trying to better manage resources, to an administrator or dean who wants a bird’s-eye view of operations.

Learning Outcomes

We believe that our method of using Google Data Studio provides an inexpensive way for libraries to present their information to stakeholders. At the end of the workshop, participants were to have created a sample dashboard that they could enhance and easily share with their colleagues.

Endnotes

The Google Data Studio workshop has been offered to University of Miami students and faculty.

References

Ben-Knaan, Kineret, and Andrew Darby. “Build a Live & Interactive Library Dashboard Using Google Data Studio.” In Data-Driven Decision-Making in the Library: Using Business Intelligence and Data Analytics Software Tools for Library Management. Lanham: Rowman & Littlefield (LITA Guide), In Press.

Ben-Knaan, Kineret, and Andrew Darby. “The Learning & Research Dashboard: Making Data-Driven Decisions with Google Data Studio.” Paper presented at 13th International Conference on Performance Measurement in Libraries (LibPMC), Aberystwyth, Wales, UK, July 24, 2019.

Ben-Knaan, Kineret. “Consolidate Your Library’s Data using Google Data Studio & Alma Analytics API Connector.” Paper presented at ELUNA 2019 Annual Meeting, Atlanta, Georgia, May 3, 2019.

Ben-Knaan, Kineret, and Andrew Darby. “Make Your Library’s Data Accessible and Usable: Create Live Dashboards with Google Data Studio.” Poster presented at the 2018 Library Assessment Conference — Building Effective, Sustainable, Practical Assessment, Houston, Texas, December 2018.

Kineret Ben-Knaan
Research & Assessment Librarian
University of Miami Libraries
kbenknaan@miami.edu

Kineret Ben-Knaan is the Research and Assessment Librarian at the University of Miami. In this role, Kineret provides support for the achievement of the University of Miami Libraries’ strategic goals through the development of an assessment program and data-driven assessment activities related to services, collections, technology, and physical spaces. She works collaboratively with members of the administrative leadership and management teams to understand, predict, and accommodate user needs as well as enhance organizational effectiveness.

Kineret holds a master’s degree in information science from Bar-Ilan University, Israel, and a Certificate of Business Analytics from Cornell University.

Cameron Riopelle
Head of Data Services
University of Miami Libraries
criopelle@miami.edu

Cameron Riopelle is the Head of Data Services and a Librarian Assistant Professor at the University of Miami. He received his PhD in Sociology and Master’s in Statistics from the University of Illinois. His research interests include quantitative and qualitative methods, theories of colonialism and the state, and the study of educational systems.

What Counts and What Can be Counted—Fundamentals of Electronic Resource Assessment

Accepted as a half-day workshop

Klara Maidenberg (University of Toronto) and Sabina Pagotto (Scholars Portal, Ontario Council of University Libraries)

This workshop will cover best practices in assessment of electronic resources by presenting a structured process for using evidence to make decisions around these collections. The workshop will also orient participants to the latest (5th) release of the COUNTER Code of Practice, the industry standard for counting and reporting usage of electronic content.

As electronic resources claim a growing proportion of academic libraries’ collections budgets, librarians outside the electronic resources team are increasingly asked to evaluate and make decisions about these resources, and assessment practitioners are often consulted in this work. The approaches required for evaluating electronic content can differ from those used in other types of assessment projects, and there is a scarcity of professional development opportunities in this area. Where expertise in assessing collections exists, it is often limited to a small number of expert staff. The goal of this workshop is to enhance the capacity and confidence of librarians by providing practical tools and approaches that they can adopt as they engage in decision-making around electronic collections. Assessment practitioners will walk away with a process flow and checklist that will help them support or lead collection assessment projects at their own institutions.

The workshop will include several hands-on exercises in which participants will be provided with materials to support an assessment of an e-resource. At the end of the session, participants will have an opportunity to work individually or in groups to apply their learning and reach a decision regarding the renewal of the product.
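
The arithmetic at the heart of many renewal decisions is simple enough to sketch. Below is a hypothetical cost-per-use calculation in Python using the Unique_Item_Requests metric from a COUNTER Release 5 TR_J1 report; all figures are invented for illustration:

    # Hypothetical annual figures for one journal package
    annual_cost = 25000.00          # license cost in dollars
    unique_item_requests = 8100     # from the COUNTER 5 TR_J1 report
    total_item_requests = 11400     # inflated by double-clicks and format switches

    # COUNTER 5's Unique_Item_Requests filters out repeat views within a
    # session, so it gives a more conservative cost-per-use than the total
    cost_per_use = annual_cost / unique_item_requests
    print(f"Cost per unique request: ${cost_per_use:.2f}")
    print(f"Cost per total request:  ${annual_cost / total_item_requests:.2f}")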

Learning Outcomes:

  1. Participants will learn about indicators of value: sources of qualitative and quantitative data to support electronic resource assessment.
  2. Participants will develop their capacity to critically analyze data related to their electronic resources and learn ways to avoid common erroneous conclusions.
  3. Participants will become familiar with COUNTER e-resource usage reports and understand their place within a comprehensive e-resources assessment project.
  4. Participants will learn about a five-step checklist that can be applied to a variety of collections assessment projects.

Please note:

A slightly different version of this workshop was delivered at the Canadian Library Assessment Workshop in 2019 and received very positive evaluations from attendees:

  • 60% rated the workshop as excellent. 33.33% rated it as good.
  • 87% said the workshop met or exceeded their expectations.

Audience

Assessment practitioners at all levels of experience and expertise (aside from those with deep expertise in Collections Assessment) would find this session useful.

Klara Maidenberg has been the Assessment Librarian at the University of Toronto Libraries since 2015. She holds a BA (Hons.) and a B.Ed from York University and an MISt from the University of Toronto. In her current role, she is tasked with helping library colleagues make evidence-based decisions in all areas of library activity, with a special focus on collections assessment. She has presented on assessment-related initiatives at conferences including LAC, QQML, ALA, and CLAW, among others.

Sabina Pagotto is the Client Services & Assessment Librarian at Scholars Portal, the digital infrastructure arm of the Ontario Council of University Libraries. As part of this role, she oversees the production of usage reports for the locally hosted e-book and e-journal platforms and manages compliance with COUNTER standards. Sabina previously worked on the Journal Value Analytics Tool at the Canadian Research Knowledge Network and completed her MLIS at the University of Western Ontario in 2013.

It's Critical: Tools and Strategies for Adopting an Equity-Driven Assessment Practice

Accepted as a half-day workshop

Ebony Magnus (Simon Fraser University), Maggie Faber (University of Washington Libraries), and Jackie Belanger (University of Washington Libraries)

Building on our previous work on critical assessment, we invite participants to join us for a half-day workshop in which they will engage with the structures of power and positionality inherent in library assessment in order to question assumptions, challenge objectivity, and come away with tools that encourage more mindful and equitable practice. This workshop will provide an opportunity for participants to reflect on their ongoing assessment work while developing an action plan for sustained critical practice. Throughout the workshop, participants will be asked to consider and reflect upon their identities and positionality and the ways in which their assessment work is consciously or unconsciously influenced by them. We will provide tools and prompts to facilitate this reflection, and discussion-based activities will feature heavily in this portion of the workshop.

Brief lecture-based portions of the workshop will introduce participants to key definitions (e.g., equity, anti-racism, whiteness) and critical methodologies, including feminist theory, Indigenous research methods, and critical data studies, among others.

Taking the assessment cycle as a frame for the lesson plan, we will use case studies to guide participants through each stage of the cycle with a critical lens. Case studies will facilitate a common dialogue around questions related to assessment design and motivation, methodological selection, sampling and recruitment, data analysis, and community engagement. Participants will work through a series of individual and group activities culminating in the development of a detailed critical assessment action plan, grounded in their institutional and individual context.

Activities will draw on Liberating Structures (1) & Participatory Design and may include:

  • Using a bias/privilege checklist, participants will individually reflect on identity and positionality with respect to their professional roles and institutions. Group discussion will follow, in which participants share their responses to questions such as “How do our own identities, institutional positions, and perspectives shape our work?”
  • 15% solutions: Focusing on questions of power-sharing and community engagement, participants will brainstorm (individually and in groups) actions they can implement during the design and recruitment stages of the assessment cycle to engage in more critical practice, given the resources and degrees of autonomy available to them.
  • Wicked questions: To examine tensions between stated professional values, participants will create lists of paradoxes or perceived conflicts that arise when bringing a critical lens to their daily assessment practice. As a group, participants will discuss the ideas generated and collectively identify the most compelling questions to discuss and explore.
  • Methodology mash-up: Groups of 4–5 will be assigned commonly used assessment methodologies. Together, they will identify hierarchies or sites of power embedded in each method and list assumptions about neutrality or objectivity associated with it. Groups will identify critical alternatives and/or suggest ways to bring a more critical approach to the method at hand.
  • The SQUID (Sequential Question & Insight Diagram): By alternating between collectively generated questions and answers, participants will draw on the perspectives of the group to help navigate uncertainty about implementing critical perspectives in their work and collectively discuss tools and strategies for implementation.

By the end of the workshop, participants will have an action plan that brings together the ideas and reflections of the day into a set of concrete steps for adopting critical and equity-driven practices in their assessment projects. We will also provide attendees with a set of questions and activities for ongoing reflection after the workshop ends.

Learning outcomes

Participants who attend this workshop will be able to:

  • Utilize models for reflective practice in assessment work
  • Identify methodological approaches that centre marginalized voices through grounding in critical theoretical models
  • Develop an action plan for sustainable critical assessment practice in their library, place of work, or institution

(1) Lipmanowicz, H., & McCandless, K. (2014). The Surprising Power of Liberating Structures: Simple Rules to Unleash a Culture of Innovation. Seattle, WA: Liberating Structures Press.

Audience

This workshop will be most valuable for participants with prior knowledge of library assessment trends and practices, and especially for those regularly engaged in assessment work at their institutions. While we are planning for a half-day workshop, we believe it can be scaled to a full-day schedule as well.

Ebony Magnus is Head of the Samuel and Frances Belzberg Library at Simon Fraser University in Vancouver, Canada. Ebony has over ten years of teaching experience as a graduate student and as a liaison librarian, in addition to experience facilitating library assessment workshops in person and via webinar in Canada and the US at venues including the Library Assessment Conference, the Canadian Association of Professional Academic Librarians conference, the Canadian Library Assessment Workshop, and the National Diversity in Libraries Conference. She has completed the Instructional Skills Workshop as well as a variety of local instructional training programs.

Maggie Faber is the Assessment & Data Visualization Librarian at the University of Washington. She has eight years of teaching experience in and out of academia. Recently, she facilitated workshops at the International Conference on Performance Measurement in Libraries (LibPMC), the Library Assessment Conference (LAC), the Canadian Association of Professional Academic Librarians (CAPAL), and the Canadian Library Assessment Workshop (CLAW).

Jackie Belanger is Director of Assessment and Planning at the University of Washington Libraries.

Institutional Repository Usage Statistics (IRUS): Using IRUS to Share and Compare COUNTER-Conformant Usage Data

Accepted as a half-day workshop

Hannah Rosen (LYRASIS), Jo Lambert (Jisc), and Jim Ottaviani (University of Michigan)

Description:

IRUS, created and managed by Jisc, a UK organization that provides digital solutions for education and research, is a service in which a piece of code added to institutional or data repositories transmits raw usage data to be processed into authoritative, standards-based statistics. These statistics help universities gain a better understanding of the breakdown and usage of their institution’s research, which they can share with key stakeholders. Currently, the majority of UK institutions use IRUS, enabling not only institutional assessment but also cross-institutional comparison.

The benefits of IRUS, particularly access to standards-compliant usage data that lets participating institutions run complex reports, make cross-institutional comparisons, and generally better visualize and benchmark their own usage statistics, have attracted interested users in the United States. In 2018, Jisc, in conjunction with CLIR/DLF, ran the IRUS-USA pilot project with eleven U.S. institutions, using a web portal with limited functionality. Based on feedback from those pilot institutions (presented at LAC in 2018), Jisc has established a partnership with LYRASIS, a non-profit dedicated to supporting enduring access to shared academic, scientific, and cultural heritage through leadership in open technologies, content services, and digital solutions, to build IRUS-US, a US-based community of IRUS users. IRUS-US users will have access to a newly updated user interface with full functionality and COUNTER 5 compliance.

The goal of this half-day workshop is to provide an immersive, hands-on introduction to IRUS-US. The workshop will include an overview by LYRASIS of the user portal, statistics, and download procedures, along with exercises designed to help participants become comfortable using IRUS. LYRASIS will also provide a brief overview of how to implement the tracker within various types of institutional repository systems. There will also be a live demonstration by subscribing U.S. institutional users, showing how they use IRUS for their advocacy and reporting needs. Jisc staff will conclude the workshop by demonstrating use cases for national and international benchmarking.
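
For a sense of what "standards-based" means in practice, COUNTER Release 5 reports can be harvested over the SUSHI API. The sketch below shows the general shape of such a request in Python; the endpoint and credentials are hypothetical placeholders, and IRUS’s own interface may differ:

    import requests

    # Hypothetical SUSHI service details; real values come from the provider
    BASE_URL = "https://example.org/sushi/reports/ir"
    params = {
        "customer_id": "YOUR_CUSTOMER_ID",
        "requestor_id": "YOUR_REQUESTOR_ID",
        "begin_date": "2020-01-01",
        "end_date": "2020-06-30",
    }

    # COUNTER 5 SUSHI services return reports as JSON
    response = requests.get(BASE_URL, params=params, timeout=30)
    response.raise_for_status()
    report = response.json()

    # Report_Items holds one entry per repository item with its usage
    print(len(report.get("Report_Items", [])), "items in the report")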

Learning Outcomes:

  • Participants will learn about IRUS features
  • Participants will learn to use IRUS functionalities
  • Participants will learn how IRUS statistics can support their institutional and cross-institutional usage data requirements
  • Participants will identify and discuss national and international use cases for IRUS statistics

Audience

The ideal audience for this workshop is institutional repository managers interested in learning about a new service that provides standards-based, cross-institutionally comparable statistics for IR usage. The workshop is open to all levels of professionals working with institutional repositories; technical expertise is not required.

Hannah Rosen is a Scholarly Communication Specialist and the IRUS-US Community Specialist at LYRASIS. She is responsible for managing the IRUS-US community, including creating educational material, hosting workshops, providing first-line support, and performing outreach on behalf of the service. She holds a bachelor’s degree in social and cultural history from Carnegie Mellon University and a master’s degree in library and information science from the University of Pittsburgh, with a specialization in archives, preservation, and records management.

Use Excel for Data-Wrangling, Analysis, and Visualization

Accepted as a half-day workshop

Jingjing Wu (Texas Tech University Libraries) and Kimberly K. Vardeman (Texas Tech University Libraries)

View Description

Library employees are dealing with more and more data in their routine work. Catalogers and metadata librarians receive data from various sources in different formats and convert it to a format accepted by their systems. Reference librarians summarize users’ evaluations of their library instruction or information literacy courses and share findings with counterparts at other colleges. Scholarly communication librarians help researchers, departments, or universities assess their research productivity. Almost all librarians need to evaluate the performance of their projects, services, or resources to some extent, using data from a variety of sources, such as web analytics, survey results, and circulation statistics, in order to make informed decisions. And with libraries’ growing involvement in research data management, librarians are increasingly expected to provide instruction on data management and data-wrangling skills.

Although many data science tools are now available, Excel spreadsheets remain an indispensable part of the analytics industry. Excel handles a variety of data formats, transforms raw data into other formats for further analysis, and includes important features for summarizing and visualizing data. This workshop aims to develop advanced Microsoft Excel skills for library employees’ daily work, using example datasets drawn from librarians’ routine tasks to demonstrate data input, data wrangling, and data visualization. The workshop covers advanced Excel features: PivotTable, PivotChart, Power Query, Power Pivot, and the data model.

This is the draft schedule for the half-day workshop.

00:00 MS Excel Table

  • Why using tables is important for data wrangling
  • How to create a table
  • How to name a table
  • How to format a table
  • Sort and filter
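For orientation, these table operations have close analogues outside Excel: an Excel table behaves much like a pandas DataFrame, a named, header-labeled rectangle that later steps can reference. A minimal sketch, with an invented file name and columns (the workshop itself uses Excel’s built-in tools):

    import pandas as pd

    # Load data into a table-like structure (file and column names invented).
    circulation = pd.read_csv("circulation.csv")

    # Sort and filter, as with an Excel table's column controls.
    by_date = circulation.sort_values("checkout_date", ascending=False)
    english_only = circulation[circulation["language"] == "eng"]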

00:15 PivotTable

How to create a PivotTable report

  • Use recommended PivotTable
  • Define the layout of your PivotTable report
  • Totals and subtotals in the PivotTable
  • Numeric formats
  • Sort and filter
  • Print PivotTable

00:45 PivotChart

  • How to choose a visualization
  • Redesign the default chart
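As a rough analogy for what the PivotTable and PivotChart steps above accomplish, the following pandas sketch performs the same summarize-then-plot sequence; the dataset and column names are invented, and the workshop itself works entirely within Excel.

    import pandas as pd

    # Invented circulation data.
    df = pd.DataFrame({
        "branch":    ["Main", "Main", "East", "East", "Main"],
        "format":    ["book", "ebook", "book", "ebook", "book"],
        "checkouts": [120, 45, 80, 30, 95],
    })

    # PivotTable equivalent: total checkouts by branch and format,
    # with grand totals (margins).
    pivot = df.pivot_table(index="branch", columns="format",
                           values="checkouts", aggfunc="sum", margins=True)
    print(pivot)

    # PivotChart equivalent: plot the summary (requires matplotlib).
    pivot.drop(index="All", columns="All").plot(kind="bar")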

01:00 Power Query

01:15 Import data from external sources

  • Import files in the format of text, csv, or spreadsheet
  • Import files from a folder
  • Filter out files that do not meet the requirements
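Power Query’s folder import can be pictured as a loop over files with a filter step. A hypothetical pandas equivalent, with an invented folder and file-naming convention:

    from pathlib import Path
    import pandas as pd

    folder = Path("monthly_reports")  # hypothetical folder of CSV exports

    frames = []
    for path in sorted(folder.glob("*.csv")):
        # Filter out files that do not meet the requirements,
        # e.g. keep only files whose names start with a four-digit year.
        if not path.stem[:4].isdigit():
            continue
        frame = pd.read_csv(path)
        frame["source_file"] = path.name  # keep provenance for later checks
        frames.append(frame)

    combined = pd.concat(frames, ignore_index=True)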

01:30 Split column

  • Split column by custom delimiters

01:45 Create new columns

  • Create new columns calculated from other columns
  • Create new columns on conditions
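The two column operations above, splitting by a custom delimiter and deriving calculated or conditional columns, map onto one-liners in pandas. A sketch with invented data:

    import numpy as np
    import pandas as pd

    df = pd.DataFrame({
        "call_slot": ["QA76.9|Main", "Z665.2|East"],
        "checkouts": [14, 3],
    })

    # Split a column by a custom delimiter ("|" here).
    df[["call_number", "branch"]] = df["call_slot"].str.split("|", expand=True)

    # New column calculated from another column.
    df["checkouts_per_week"] = df["checkouts"] / 52

    # New column created on a condition.
    df["usage_level"] = np.where(df["checkouts"] >= 10, "high", "low")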

02:00 Unpivot

  • What is proper data or tidy data?
  • Unpivot
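Unpivoting turns a wide, report-style table into tidy data: one row per observation. A sketch of the same transformation in pandas, which calls the operation melt (names and numbers invented):

    import pandas as pd

    # Wide, report-style table: one column per year.
    wide = pd.DataFrame({
        "database": ["JSTOR", "Scopus"],
        "2018": [1000, 800],
        "2019": [1200, 950],
    })

    # Unpivot into tidy form: one row per database-year observation.
    tidy = wide.melt(id_vars="database", var_name="year", value_name="searches")
    print(tidy)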

02:15 Merge tables

  • Merge tables by shared column
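Merging on a shared column is the spreadsheet counterpart of a database join. A pandas sketch of a left join (table and column names invented):

    import pandas as pd

    loans = pd.DataFrame({"barcode": ["b1", "b2"], "checkouts": [7, 2]})
    items = pd.DataFrame({"barcode": ["b1", "b2"], "title": ["Book A", "Book B"]})

    # Merge the tables on the shared "barcode" column (a left join here).
    merged = loans.merge(items, on="barcode", how="left")
    print(merged)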

02:45 M Language and Data Model

  • What is the M language?
  • How to reuse M language code
  • Data model and Power Pivot

03:00 Wrap-up

  • Other helpful resources for Excel

03:10 End of workshop

This session expects attendees to have at least basic familiarity with Excel. The methods for analyzing and visualizing data will be practical for attendees at the beginning or intermediate level. Attendees will:

  • Learn how to create a PivotTable report and use a PivotChart to visualize results
  • Learn how to use Excel Power Query to connect to data from various sources and transform it into a format ready for analysis

Audience

Beginning or intermediate level in Microsoft Excel

Jingjing Wu is Web Assistant Librarian at the Texas Tech University Libraries. Her research interests include web technologies, user experience in libraries, and data analysis. She has presented her work at the Library Assessment Conference, the Singapore Library Association Inaugural Webinar, Designing for Digital, the Conference of the Library Society of China, and the annual LITA Forum. In 2019, she led workshops on data visualization and data dashboards for faculty and students.

Kimberly Vardeman is User Experience Librarian at Texas Tech University Libraries. Prior to her current appointment, she served as a reference librarian and subject liaison to several academic areas. Her research interests include user experience, library instruction, and the impostor phenomenon. She has presented at the Texas Library Association Annual Conference, Library Assessment Conference, Texas Council of Academic Libraries Annual Conference, Designing for Digital, and Amigos Library Services Member Conference. Ms. Vardeman has led workshops on her campus for graduate students and faculty covering citation management software and Turnitin and iThenticate training. She served on the ACRL Instruction Section Pre-Conference Program Planning Committee for the 2015 ALA Annual Conference in San Francisco, which gave her valuable experience observing what goes into planning and teaching effective workshops.

Eating the Elephant One Bite at a Time: Fitting Assessment into Your Workday as a Practicing Librarian

Accepted as a 90-minute in-program session

Lisa Hinchliffe (University of Illinois at Urbana-Champaign)

View Description

Overview:

Library practitioners often struggle to find the time and focus necessary to undertake assessment projects and complete them in a timely manner. Even those who have time allocated for assessment are often not trained in assessment and so lack familiarity with approaches to structure and manage research activities. This leads to frustration, wasted effort, false starts, and even impostor syndrome.

This workshop will draw on the scholarship of project management, time management, task analysis, and the psychology of habits in order to enable practitioners to develop skills and strategies for fitting assessment into their professional workflow.

Participants in the workshop will:

  • Identify the phases and the component activities for a current or potential assessment project.
  • Develop a timeline for the assessment project and a “time budget” for each phase.
  • Select specific time and project management strategies that reflect their personal work style preferences.
  • Create an action plan that draws upon the psychology of habits.

Workshop Activities:

The workshop is designed to be interactive and pragmatic. Participants will leave the workshop with a plan for action developed through the following activities:

  • Discussion of key workshop concepts: project management, time management, task analysis, and the psychology of habits.
  • Self-reflection on personal work style preferences and existing tendencies.
  • Developing a “story of my assessment project” for small group presentation/peer feedback.
  • Role-play activity on preserving one’s allocated time for assessment in order to prioritize research and balance that priority with other demands.

Target Audience:

The workshop is for any practicing librarian who does assessment (or who would like to) and wants to improve the efficiency and effectiveness of their assessment work practices.

Lisa Janicke Hinchliffe is Professor/Coordinator for Information Literacy Services and Instruction in the University Library at the University of Illinois at Urbana-Champaign. She is also an affiliate faculty member in the University’s School of Information Sciences.

In addition to teaching MLS courses (including the newly developed “Evaluation and Assessment of Library Services” at the iSchool), Lisa has two decades of experience leading professional development workshops in professional conference and stand-alone settings.

Using Liberating Structures for Assessment

Accepted as a 90-minute in-program session

Chloe Riley (Simon Fraser University)

View Description

This half-day workshop will introduce Liberating Structures, a set of facilitation techniques that can be used or adapted for assessment practices in libraries. These hands-on techniques are designed to be inclusive and to disrupt stale or conventional practices of working in groups by promoting participation, engagement, and innovation. The session will explore how Liberating Structures can be useful for qualitative, collaborative, participatory assessment work.

Liberating Structures are a collection of 33 techniques (or microstructures) that can be employed in any situation that involves people working together, including meetings, presentations, classrooms, and team retreats. They can often be scaled up or down depending on the size of the group. Their flexibility makes them easily adaptable to a variety of assessment and evaluation practices.

In this session, I will facilitate a variety of Liberating Structure activities, posing questions and identifying topics related to assessment, so that participants have a chance to participate in the Liberating Structures and gain an understanding of how they work in practice. These activities are all highly participatory and interactive, and will require everyone to take part. After experiencing several Liberating Structures first-hand, LAC participants will feel confident facilitating them themselves after the workshop. In addition to these hands-on activities, participants will learn the essential design elements that make up Liberating Structures. They will also be prompted to reflect on the activities and consider how to apply these techniques to their own work. The following Liberating Structure activities may be included in the session:

  • Impromptu Networking: In this quick, opening activity, participants will share and focus their expectations for the session. This involves everyone in the room, and after this activity, LAC participants can employ this technique as a way to notice patterns and collectively determine an area of focus, as well as build connections among a group.
  • 25/10 Crowdsourcing: In this activity, participants can generate and share bold and innovative ideas in a low-stakes environment. Highest-rated ideas emerge through the activity, allowing the group to prioritize and identify possible actions. By participating in this highly inclusive and engaging activity, LAC participants will be able to understand and adapt it to situations where they want to elicit bold feedback and ideas from everyone in the room. This may be useful when moving forward with the results of an assessment, engaging in strategic planning, or other scenarios.
  • What, So What, Now What: Participants will engage in this activity to explore and assess a shared experience. The activity progresses in stages, acknowledging salient facts of the experience, making sense of the facts in context, and identifying logical next steps. After this activity, LAC participants will be able to use this framework in assessment scenarios that aim to encourage equal participation from everyone involved, promote reflection on a shared experience, and spur coordinated and collaborative action. For example, it may be useful for a team making sense of assessment results, or as a way to facilitate a focus group.
  • Ecocycle Planning: This activity involves a team or unit working together to assess a collection or portfolio of activities, allowing them to identify blockages and opportunities for renewal. By collaboratively placing projects, initiatives, and other work onto an “ecocycle” map, the group is able to understand the work as a whole. LAC participants may find this activity useful in team retreats or strategic planning, as it allows a group to understand the work as a whole and strategize about how to balance activities, set priorities, and free up resources.
  • 1-2-4-All: This activity is similar to “think-pair-share,” and enables participants to reflect, share, and hear from others. By inviting all participants to share how they might apply Liberating Structures to their own assessment work, participants will leave the workshop with a generated list of ideas and strategies to employ these activities in their own institutions.

Learning Outcomes

Participants who attend this workshop will be able to:

  • Understand the essential design elements that make up Liberating Structures techniques.
  • Facilitate a Liberating Structures activity, having now participated in a variety of them.
  • Identify opportunities to explore and apply Liberating Structures in their own assessment work.
Audience

The audience can include beginners and experts alike; no prior experience or knowledge is necessary to participate.

Chloe Riley is the Research Commons Librarian at Simon Fraser University Library in Burnaby, Canada. She has over seven years of teaching experience in academia, as a graduate student and a librarian. She is a member of the Instruction Interest Group Planning Team at SFU Library, where she helps to organize monthly events and to foster an instruction-based community of practice among library staff. Recently, she facilitated interactive workshops at the Canadian Library Assessment Workshop (CLAW), the Canadian Association of Professional Academic Librarians (CAPAL) Conference, and the British Columbia Academic Libraries Section (BCALS) Winter Meeting.

What's in a Name? Impact, Contribution, and Value in Library Assessment Contexts

Accepted as an in-program workshop

Jackie Belanger (University of Washington Libraries), Steve Borrelli (Pennsylvania State University Libraries), Megan Oakleaf (Syracuse University), and Craig Smith (University of Michigan)

View Description

Over the last decade, a great deal of professional effort has been devoted to studies focused on measuring the impact of library contributions to institutional communities, such as the ACRL Assessment in Action project and the ARL Impact Framework Pilot Projects. However, definitions of impact are often unclear, with a variety of terms—including impact, value, and contribution—being used interchangeably to indicate how a library may make a difference to its users. A lack of shared understanding of terms can create tension among library assessment practitioners as they seek to describe, interpret, and communicate the results of library impact studies. Exploring how such terms are defined—and how they might be measured—can be a generative approach for individual practitioners and the library assessment community as a whole. Specifically, defining and understanding the language around impact, contribution, and value can help library assessment practitioners design studies that use methods and data analysis strategies and present findings in ways that are compelling and methodologically sound. Disentangling the meanings of impact language can lead to greater clarity in assessment purpose and design.

To respond to this need within the library assessment community, this workshop will engage participants in discussion and activities to more clearly articulate the meaning of terms such as impact, contribution, and value in their institutional contexts. The workshop will enable practitioners to share how they understand and use terms like “impact,” “value,” and “contribution”; identify terminology that may resonate with their own institutions; and gain confidence in their use of shared language describing impact studies.

Workshop leaders will provide a brief framing of the challenge faced by assessment practitioners when trying to navigate the desire to show impact/contribution alongside the limitations of common study designs. Throughout the session, workshop leaders will frame discussions and activities with case studies and definitions from their own experiences, as well as from readings from a variety of disciplines. (1)

Topics addressed by the workshop leaders over the course of the session will include:

  • Small “i” impact and big “I” impact: what’s the difference and why does it matter?
  • Impact, value, contribution: gaining definitional clarity
  • Quantitative and qualitative impact: the value of user perception of impact
  • Causation/correlation
  • Designing capital “I” Impact studies

Participants will engage in a number of hands-on activities:

  1. Participants will generate a list of terms that lack shared understandings in library impact studies (e.g., impact, value, contribution, outcome, KPIs). This list will serve as a focusing exercise and can be extended into a draft glossary that participants can use as a crosswalk for understanding difficult terms within their own institutional context.
  2. Participants will reflect on how they use difficult terms in their own assessment practice, then participate in a think-pair-share exercise in small groups.
  3. Participants will explore a number of research studies and/or case studies that use difficult terms in a variety of ways, and discuss where terms are used similarly and/or dissimilarly across studies in small groups, then use their observations to fuel a whole group discussion.
  4. Expanding upon the first three activities, a whole group discussion will be used to surface multiple “right” ways of using difficult terms and to place difficult terms on a continuum of ways to describe and discuss levels of relationships and associations. During the discussion, the workshop leaders will share insights from their own training and experience, and will highlight definitions commonly used in a variety of disciplines.
  5. Participants will then brainstorm ways in which difficult terms may give rise to misunderstandings in a larger context; these will be listed and then assigned to small groups for a “broken method” exercise in which groups identify strategies for avoiding, mitigating, or communicating about each type of misunderstanding arising from difficult language. Such misunderstandings might include a lack of clear distinctions between lower case “i” and upper case “I” impact studies, perceptions and biases related to differences between qualitative and quantitative assessment approaches, or distinctions between correlation and causation.
  6. Finally, participants will return to their list of difficult terms and craft a personal action plan for honing their definitions and communicating about their assessments in ways that maximize clarity and minimize misunderstandings.

At the end of the workshop, participants will come away with a glossary of key terms generated during the course of the workshop, as well as a reading list of studies (from LIS and other disciplinary fields) discussing impact terminology and methods.

Learning outcomes:

  • Participants will be able to define terms relating to impact, contribution, and outcomes assessment in order to design effective studies for their needs and institutional contexts.
  • Participants will be able to articulate ways of discussing relationships and associations in assessment results that are on a correlation continuum.
  • Participants will be able to identify strategies for data gathering, analysis, and interpretation of results that are aligned with varying aims for demonstrating impact, contribution, and value.

(1) For example, Belcher, B., & Palenberg, M. (2018). Outcomes and impacts of development interventions: Toward conceptual clarity. American Journal of Evaluation, 39(4), 478–495. https://doi.org/10.1177/1098214018765698

Audience

This workshop will be of value to those who are new to assessment and would like to learn more about designing impact and contribution studies, as well as to those with experience of such studies who would like to strengthen their understanding of this work.

Jackie Belanger is Director of Assessment and Planning at the University of Washington Libraries.

Megan Oakleaf is an Associate Professor of Library and Information Science in the iSchool at Syracuse University. She is the author of The Value of Academic Libraries: A Comprehensive Research Review and Report and Academic Library Value: The Impact Starter Kit. She is the Principal Investigator for the IMLS-funded Library Integration in Institutional Learning Analytics (LIILA) and Connecting Libraries and Learning Analytics for Student Success (CLLASS) grants.

Craig Smith is the Assessment Specialist at the University of Michigan Library. Craig has a doctorate in Human Development and Psychology from the Harvard Graduate School of Education, and completed postdocs in the Harvard and University of Michigan psychology departments. In 2014 Craig moved into institutional research, and he became the Assessment Specialist in the library in 2018. Craig has facilitated workshops on a wide range of topics, including study design, methods, data visualization, and communicating results to diverse audiences.