I finally got back to the article "Improving the Model for Interactive Readers' Advisory Service" by Neil Hollands in the Spring 2006 issue of Reference & User Services Quarterly, pages 205-212. In this article, Hollands lists the problems with the traditional ways of providing readers' advisory service and describes a very personalized readers' advisory service provided by Williamsburg Regional Library in Virginia.
The central feature of the WRL model is the use of a questionnaire to collect readers' preferences. Readers complete questionnaires, and then librarians analyze them and create annotated book lists for the readers. Hollands reports that the WRL staff spent between two and six hours responding to each form in the early days of the program. With experience, that time has been reduced in most cases to between 30 minutes and two hours.
What a form! It's huge! Click here to see it. It reminds me of the questionnaires a patient fills out when seeing a doctor for the first time. Like doctors, the librarians at Williamsburg Regional Library want a lot of information from their clients before they create individualized reading lists.
My first reaction is that Williamsburg's program is not a workable model for small libraries. It asks for a lot of input from the readers and then for a lot of time from the staff - time that many staff in small libraries will not have.
On a second reading, I see that Hollands does say that his library was not swamped with completed forms, and the librarians were given a couple of weeks to produce the annotated reading lists. In two years, the library completed nearly 250 lists for readers. On a scale of 1 to 5, client satisfaction with the service was 4.79. The model looks a little more workable if the numbers are small and the clients are very patient.
I still have a major question. Would providing a service that requires many staff hours and directly serves only a few clients be a good use of library resources? It might be, if what the staff learned helped them improve both the collection and their face-to-face readers' advisory sessions.
How can a library start such a service? I think the first investigatory step is having the librarians who will be creating the annotated lists advise themselves. They should work through a form like Williamsburg's and see whether they can create annotated reading lists for themselves.
Second, the librarians should experimentally offer to create some reading lists for a few regular clients and see how they do. Only after this experiment should they consider putting together a form-based program for the public at large. Then the article in Reference & User Services Quarterly might help.
I commend Williamsburg Regional Library for its innovative service for readers. I'd like to hear more about it.