News & Views


Using Excel to Manipulate Library of Congress Call Numbers

posted Mar 27, 2018, 5:09 PM by Greg Smith   [ updated Mar 27, 2018, 5:10 PM ]

This post shows how to use formulas to manipulate Library of Congress (LC) call numbers in Microsoft Excel. I demonstrated this technique as part of a presentation that I made at the Wrangling Library Data virtual conference last month.

An LC call number denotes the subject matter of a published resource, such as a book, as judged by a cataloger. In theory, by relying on call number data, libraries can measure the strengths and weaknesses of their collections, evaluate resource use, and assess the merits of prospective resource purchases. However, because call numbers often denote very specific topics, it can be challenging to convert them into categories that provide for insightful analysis.

I have created an Excel file that can quickly manipulate up to 20,000 call numbers, deriving multiple data points for each one. As illustrated in the image below, this utility file performs the following functions:

  • It extracts the one- and two-letter classes that form the core of the LC classification.
  • It generates two data points by way of numerical rounding: LC100 and LC Class + Number (Integer). Depending on the breadth of one’s data set, these derivatives provide for increasingly granular subject analysis.
  • It translates the one- and two-letter classes into their corresponding descriptions. (A formula sketch appears below.)
[Image: Call-Number-Manipulation]
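
As a minimal sketch of how the letter-class extraction and translation can work, assume a call number such as QA76.73 sits in cell A1 and that a two-column lookup table of class descriptions has been named ClassTable (the cell references and the table name are mine, not necessarily the utility file's):

  • =LEFT(A1,IF(ISNUMBER(VALUE(MID(A1,2,1))),1,2)) [Note: This returns the one- or two-letter class, here QA, by testing whether the second character is a digit.]
  • =VLOOKUP(B1,ClassTable,2,FALSE) [Note: Assuming the extracted class sits in B1, this translates it into its description.]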

Using Excel to Pre-Code Survey Comments

posted Mar 26, 2018, 3:18 PM by Greg Smith   [ updated Mar 26, 2018, 3:20 PM ]

This post shows how to use formulas to pre-code survey comments automatically in Microsoft Excel. I demonstrated this technique as part of a presentation that I made at the Wrangling Library Data virtual conference last month.

Libraries use surveys for at least two purposes: to elicit user feedback concerning specific programs and services, and to gauge overall satisfaction with library resources and services. Many surveys provide comment boxes where respondents can provide open-ended feedback. Comments can convey useful insights, but they can be time-consuming—and thus costly—to analyze. Automating the analysis process to some extent is an attractive proposition.

The creators of a survey can often make reasonable guesses as to the themes that respondents might address in their comments. For example, comments collected via a library customer satisfaction survey might speak to the value of resources, spaces, and services. If survey administrators can identify keywords that denote a given theme, Excel can easily enough determine if a comment contains one or more of those keywords.

In an academic library environment, the theme of library resources might be represented by keywords such as book, journal, article, and database, among others. Consider, then, the following survey comment:

Full online access to all journals is the largest roadblock. Older journal articles should be online by now (a scan is fine). This may be out of the hands of the library however.

Judging by the occurrence of forms of journal and article, an automated process can determine that this comment likely says something about the library’s resources.
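
To make the detection logic concrete, here is a minimal sketch using standard Excel functions, assuming the comment sits in cell A1 (the keyword list is illustrative, and the utility file scores each keyword in its own column):

  =IF(SUMPRODUCT(--ISNUMBER(SEARCH({"book","journal","article","database"},A1)))>0,1,0)

Because SEARCH is case-insensitive and matches substrings, journal also matches journals; the formula returns 1 if any keyword appears in the comment and 0 otherwise.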

I have devised an Excel file that can quickly process up to 2,500 survey comments, looking for up to seven keywords associated with each of four distinct themes. The image below shows how it visualizes the presence of keywords and summarizes the score for a particular theme.

[Image: Pre-coding of Survey Comments]

Of course, automated pre-coding isn’t perfect. A survey comment that describes the experience of booking a study room could be taken to refer to the resources theme, since book, one of the resource keywords, also matches booking. Once comments have been pre-coded, a human needs to read them to make sense of what has been written and, as necessary, reassign them to appropriate categories. Nevertheless, if you have a lot of data to analyze in a short time, it can be very helpful to have your data organized into “piles” where comments generally pertain to a given theme.

It seems appropriate to mention one more caveat: You may pre-code comments into particular categories, but as a responsible researcher, you need to be open to the fact that respondents may have emphasized themes that you didn’t anticipate. You should always go into the analysis phase with an open mind, ready to create categories based on what people actually said.

Using Excel to Manipulate Date and Time Stamps

posted Mar 25, 2018, 1:09 PM by Greg Smith   [ updated Mar 25, 2018, 6:04 PM ]

This post shows how to use formulas to manipulate date and time data in Microsoft Excel. I demonstrated this technique as part of a presentation that I made at the Wrangling Library Data virtual conference last month.

Systems typically log the activities that they automate, with each transaction getting a date and/or time stamp. In a library context, a date/time stamp might be associated with a DVD checkout, an article download, a website visit, or passage through a security gate. Below is an example of a date/time stamp:

Dec 3, 2014 11:22:40 AM

At first glance this might seem to consist simply of two pieces of data: a date and a time. Consider, though, that one can derive other units of information from these two pieces. The date consists of a month, day, and year; it also falls on a particular day of the week and occurs in a certain week of the year. As for the time, it consists of hours, minutes, seconds, and the ante meridiem designation.

Not all of these units of information are insightful in every situation, but the fact is that they’re contained or implied within the date/time stamp. So, how can one easily derive them from the raw date/time stamp? It’s simply a matter of creating formulas that use appropriate functions and reference the cell where the date/time stamp appears.

Let’s say that the date/time stamp is located in cell A1. We can derive other pieces of information by entering formulas such as the following in the adjacent cells:

  • =YEAR(A1)
  • =MONTH(A1)
  • =WEEKDAY(A1)
  • =VALUE(HOUR(A1)&":00") [Note: This formula displays the hour bracket in which the transaction occurred.]
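
The other components mentioned above can be derived along the same lines. For instance (a sketch using standard functions, not necessarily the formulas in my utility file):

  • =TEXT(A1,"dddd") [Note: This formula returns the day name, e.g., Wednesday.]
  • =WEEKNUM(A1) [Note: This formula returns the week of the year.]
  • =TEXT(A1,"AM/PM") [Note: This formula returns the ante meridiem designation.]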

When it comes to analyzing transactional data, it’s very helpful to be able to place dates or times into certain categories. For example, if you’re looking at gate traffic, you might want to be able to construct a grid that shows average traffic by the day of week and the hour of the day. To do this, you have to convert the raw dates and times into derivative data points, and you can easily do this with function-based formulas.
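
As a sketch of that last step, suppose the derived day-of-week labels sit in column B and the hour brackets in column C (hypothetical ranges), and that the grid lists day names down column F and hour brackets across row 1. A single grid cell could then count matching transactions with a formula like this:

  =COUNTIFS($B$2:$B$5001,$F2,$C$2:$C$5001,G$1)

Dividing such counts by the number of days observed turns them into averages.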

If you can see the potential in this, but don’t want to learn how to create formulas yourself, you’re in luck. As a part of my conference presentation, I shared an Excel file that performs date and time manipulation automatically; all you have to do is paste in a column of date/time stamps. The date manipulation results will look like the image below.

[Image: Date-manipulation]

Using Excel to Prepare Data for Analysis

posted Mar 23, 2018, 5:37 AM by Greg Smith   [ updated Mar 27, 2018, 5:24 PM ]

A month ago I presented a session at an online conference called Wrangling Library Data: Analytics, Dashboards, and Spreadsheets. My presentation was entitled "An Old Tool with Enduring Value: Using Excel to Prepare Data for Analysis."

The premises of my presentation were pretty simple:

  • Data sets need to be prepared for analysis. The data we’re called on to analyze typically need work done to them first. Highly specific data points need to be grouped into meaningful categories. Free text needs to be broken down into keywords. Dirty data need to be cleansed, and other irregularities need to be dealt with. In sum, signals need to be distinguished from noise.
  • Data manipulation tools more potent than Microsoft Excel are available, but I find that …
    • I’m already using Excel frequently.
    • I’m often far from exhausting its capabilities.
    • It’s easier for me to learn new tricks in a familiar environment than in an unfamiliar one.

With those thoughts in mind, I set out to share some of the data manipulation tricks that I’ve learned to use, mostly with library-oriented data, over the years. Specifically, I showed how I’ve used Excel formulas to perform the following tasks:

  • deriving time-series categories from date and time stamps
  • pre-coding survey comments based on keywords
  • dealing with messy data points such as call numbers and publisher names
  • removing extraneous punctuation and symbols from textual data
  • identifying keywords that are misspelled or on a stopword list (sketches of the last two tasks appear below)
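
As minimal sketches of the punctuation removal and the stopword check, assume the text or keyword sits in cell A1 and that a range of stopwords has been named StopList (a name of my own invention):

  • =TRIM(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(A1,"!",""),"?",""),",","")) [Note: This strips three punctuation marks; the actual utility handles a longer list of characters.]
  • =ISNUMBER(MATCH(A1,StopList,0)) [Note: This returns TRUE if the keyword appears in StopList.]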

I offered presentation viewers eight utility files—that is, spreadsheets where one can paste a column of data and see it transformed into something more conducive to insightful analysis. In the coming days, I’ll make posts featuring a few of these utilities. The files that I demonstrated, along with my slide deck, can be downloaded here.

Reflecting on Seven Years of Professional Reading

posted Sep 25, 2016, 8:20 PM by Greg Smith   [ updated Oct 17, 2016, 10:22 AM ]

Throughout my career I've pursued the vision of being a reflective professional: one who benefits from others' experiences and research findings, and who occasionally shares his own insights publicly for others' professional benefit. My efforts to assimilate new knowledge are perhaps best embodied in the bibliography of selected professional readings that I've maintained at citeulike.org/user/higher_ed for the past seven years. The sources listed there give evidence of continuous intellectual and professional growth, whether related to academic course work, publication and presentation, or the practical demands of managing in the higher education enterprise.

The bibliography that I've amassed now consists of just over 600 items--not a massive library by any stretch, but not a trivial collection either. The list is nowhere near a comprehensive record of everything I've read since mid-2009. Its scope is limited to those sources that I've found beneficial enough to justify establishing a record for future reference: the sort of thing that I might want to cite in a paper or presentation, or consult when facing particular professional challenges. Additionally, the scope is limited to those items most germane to my identity as a higher education leader; other interests of mine, such as biblical studies and Christian worldview, are poorly represented.

As I look back at the records of 600+ intellectual engagements--roughly 1.6 items per week--I can draw a handful of conclusions:
  • My intellectual interests are broad. I assigned 514 distinct subject tags to the sources that I indexed (see a continuously updated tag cloud here). The list of topics clearly illustrates that my thinking ranges far beyond the spectrum of librarianship.
  • Not all of the sources that I found beneficial are scholarly. Many, in fact, appeared in news publications.
  • The MBA degree that I pursued from 2012 to 2015 had a definite impact on my reading habits. Among the 25 tags that I assigned most frequently (see table below), eight were clearly influenced by my academic pursuits (e.g., BUSI642, operations_management, isomorphism).
  • Areas of persistent interest have included library management, higher education administration and reform, and organizational behavior (including communication and change).

Reflecting on the sources that have influenced my thinking in recent years calls to mind a quote from Francis Bacon that I encountered early in my career: "Some books are to be tasted, others to be swallowed, and some few to be chewed and digested: that is, some books are to be read only in parts, others to be read, but not curiously, and some few to be read wholly, and with diligence and attention." While Bacon could scarcely have imagined the information milieu of the early 21st century--notably, one dominated by media other than books--the spirit of his quote remains true. For better or for worse, as leaders we bear the imprint of what (and how much) we choose to read.

Rank  Tag                                 Count of Items
  1   academic libraries                        73
  2   teaching and learning                     60
  3   library assessment                        51
  4   higher education reform                   50
  5   higher education administration           46
  6   decision-making                           41
  7   library management                        40
  8   human resource management                 39
  9   organizational behavior                   34
 10   busi650                                   32
 11   busi642                                   30
 12   busi610                                   29
 13   higher education finance                  28
 14   operations management                     27
 15   assessment                                26
 16   busi601                                   26
 17   faculty                                   25
 18   isomorphism                               24
 19   organizational communication              24
 20   higher education                          22
 21   innovation                                22
 22   leadership                                21
 23   information technology management         20
 24   management                                20
 25   busi604                                   19
 26   organizational change                     19

A Library R&D Project

posted Aug 23, 2016, 8:41 PM by Greg Smith   [ updated Aug 24, 2016, 5:38 AM ]

Libraries aren’t the sort of environment that one most naturally associates with research and development. Nevertheless, over the past month I’ve had the chance to carry out a simple R&D project. The product was a research guide—a set of Web pages—designed to mediate access to business research resources.

LibGuides, a content management system, provided the technical infrastructure, so R&D wasn’t needed for that aspect of the project. Rather, research insights guided the selection and arrangement of the content of the guide. A snapshot appears directly below.

[Image: Business Research Guide]

The development of the business research guide was actually a redesign project. The predecessor guide had been used for a few years. Unfortunately, analysis of use data revealed that the old guide had failed to elicit desired research behaviors. Users rarely ventured beyond the old guide’s entry page, and even though the guide provided links to dozens of targets, a single business database attracted 75% of outbound clicks!

Use data provided motivation for redesigning the guide, but insights supporting design choices had to come from another source: records of research help provided by librarians and staff. The Jerry Falwell Library captures qualitative data describing research services provided through various means (email, chat, etc.). After examining records for 400+ business-oriented transactions, I had a much clearer understanding of the tasks that researchers were attempting to perform.

In the interest of keeping this post as short as possible, I’ll spare the details of how I went about developing and launching the new guide. However, I do want to point out some key design elements:

  • Creation of pages focused on research tasks (e.g., Companies & Industries, Countries); this strategy replaced the older guide's focus on resource formats (e.g., Databases, Books & E-books)
  • Relegation of resource descriptions to pop-up boxes that appear when one hovers over links; this allowed for a general de-cluttering of guide pages
  • Creation of sub-pages for various business specializations (economics, finance, marketing, etc.)

Will the new design be a success? It’s difficult to imagine that the new guide, being based on research insights, will fail to outperform the old guide. Nevertheless, a definitive answer to this question can only emerge as additional use data are accumulated. I’ll have to wait to see if my efforts bear fruit.

Conference Presentation in Development

posted Jul 24, 2016, 2:12 PM by Greg Smith   [ updated Mar 27, 2018, 5:16 PM ]

It's a Sunday afternoon, and I've spent the last few hours alternating my attention between not-so-interesting ESPN programming and developing components of an upcoming conference presentation. My proposal to present at this fall's Virginia Library Association Annual Conference was accepted in May, so I'm now working to translate the proposal into reality.

The title of my concurrent session is "Managing with Purpose: Integrating Assessment, Planning, and Budgeting." Time permitting, I intend to introduce attendees to a range of management tools and concepts that are relevant to libraries of all types:

  • The logic model
  • Six dimensions of assessment
  • The balanced scorecard
  • The Business Model Canvas
  • Opportunity cost
  • The principal-agent problem

My session is scheduled for Thursday, October 27, 4:15 PM. Additionally, I'll most likely share slides and/or handouts via a link from the Presentations page. A related resource that I posted some months ago is linked here: "Planning and the Future of the Academic Library: An Annotated Bibliography."

Business Model Canvas: Academic Library Edition

posted Nov 21, 2015, 5:39 PM by Greg Smith

I’ve been inattentive to posting news and views here over the past few years because I’ve devoted much of my time to pursuing the MBA degree. I posted my portfolio some months ago here on this site, and I’ve added to it as I’ve progressed toward degree completion. Now, with weeks left in my final course, I can finally see the light at the end of the tunnel.

The course I’m taking now focuses on business strategies and models. One of the tools we’ve been exposed to is the Business Model Canvas (BMC), as elaborated by Osterwalder and Pigneur (2010). A few weeks ago I took the time to interpret the operations of the Jerry Falwell Library, where I work, via the structure of the BMC. The results are pictured below.

[Image: Jerry Falwell Library Business Model Canvas]

I know that I’m not the first to apply the BMC to a library setting; nevertheless, thinking through this has been valuable to me. Getting acquainted with the BMC was enlightening, but it was particularly rewarding to apply it fruitfully to my own work context. The concepts and tools that I learned about in my MBA program were usually explained in function of a for-profit environment. It is refreshing whenever I find that a concept or tool has been applied, or can be applied, to the nonprofit world as well.

Osterwalder, A., & Pigneur, Y. (2010). Business model generation: A handbook for visionaries, game changers, and challengers. Hoboken, NJ: Wiley.

Encyclopedia Article Published

posted Aug 22, 2014, 12:25 PM by Greg Smith   [ updated Nov 21, 2015, 5:39 PM ]

I'm pleased to report that the Encyclopedia of Information Science and Technology (3rd ed.) is now available from IGI Global. I had the privilege of contributing an entry entitled "Academic Library Assessment." Like most of the entries in this 10-volume reference work, my contribution not only provides appropriate analysis of the topic, but highlights areas for future research. In addition, it contains an extensive bibliography of important articles, books, and other sources in the field. I believe that students, scholars, and practitioners of library assessment will benefit.

Opening a New Library, Assessing Its Spaces

posted Jan 15, 2014, 7:23 PM by Greg Smith   [ updated Jan 15, 2014, 7:24 PM ]

Today was a big day. After more than three years of planning and construction, the Jerry Falwell Library was formally dedicated and opened for student use. Along with quite a few others, I had a significant role in the planning and design process. As a remote worker, I was only able to watch the ceremony via streaming, but I had the blessing of touring the building a few times during the construction process, and I’ve even given a conference presentation about the data underlying the design.

This week I’ve been working on the library’s budget proposal for 2014-15, and also investigating how to assess the library’s performance in time to come. Having occupied the new building at significant cost, we clearly want it to make an impact on students and faculty. With that goal in mind, I’ve made an effort to update my understanding of space assessment in libraries and learning commons. Below are some fresh sources that I’ve found helpful:

In recent years I had mostly focused on sources that dealt with ascertaining user needs as a part of the facility design/renovation process. Sources that I found useful for that purpose included the following:
