Posted by: Ali | December 13, 2009

MAXIM #7 Recruiting users for impact analysis

Explore effective methods to recruit library users to take part in Critical Incident Technique (CIT) questionnaire surveys and follow-up interviews for impact analysis studies. State the implications (e.g. cost, time, resources, etc.) of the method(s) chosen.

Type of library service – database training

List of methods to recruit library users to take part in CIT questionnaires and follow-up interviews.

Notes from Knowing your users: analysing impact suggest that CIT questionnaires can be distributed to attendees at the end of training sessions to assess the impact of training. This would allow assessment of the short-term impact of the session on the attendees. Alternatively, attendees could be asked to return the questionnaires after a set period of time, to establish the longer-term impact of the training. To complete the impact study, users who return their questionnaires can be selected for follow-up interviews.

I’m assuming here that attendance is not compulsory.

1. Survey attendees immediately after training – either via a survey handed out at the end of the session, or by emailing attendees immediately afterwards

– lower cost, as only attendees are surveyed, not all possible users of the service
– tells you nothing about those who don't attend the sessions
– may give a higher response rate: having just taken part in the session, attendees may feel more inclined to give feedback and to take part in later interviews

2. Survey all library customers to see who has taken part in database tutorials

– if all library users (i.e. the entire population) are surveyed, this means more expense in terms of the number of surveys produced (if a paper-based survey is used)
– if only a sample of library customers is used, a sampling frame would be needed – this could be time-consuming/costly to generate or access (depending on access to institutional records of staff/students/library users)
– surveying all customers could raise awareness of database training amongst non-users
– would allow you to find out more about non-users
– may produce a lower response rate from attendees than surveying immediately after the training – attendees may be less motivated to take part if some time has passed since the training

3. Survey library users opportunistically, e.g. as they come into the library, or put a link to a survey on your website. Then close the survey when you have reached the required number of responses.

– this is a convenience sample, so results can't be extrapolated to all users – but it would give an indication of opinions about impact
– surveying customers in this way could raise awareness of database training amongst non-users
– would allow you to find out more about non-users

For any of these methods, incentives could be used to help achieve a higher response rate. The incentives offered would depend on who the library users are and what budget is available.

As indicated in the briefing notes about this method, follow-up interviews would be conducted amongst those who completed the survey. Do we need to elaborate on the cost, time and resources for these? e.g.
– face-to-face interviewing is more expensive than questionnaire surveys
– it is more costly and time-consuming to analyse the data (e.g. content analysis of responses – especially if the questions are very open, she says from personal experience!)

Not sure how much detail we need to go into here – the course is not going into detail on surveying methodology at all!

