Posted by: Ali | September 23, 2010

Talk on e-research

Talk from Jane about what she learned on her Australian trip. Lots of focus on e-research “collections” – driven from the top down, i.e. federal government with funding, so there is buy-in because of that. Not a “library thing” – the library is a partner with IS and research services. E-research has been on our radar since earlier this year, especially with the ARL special issue of August 2009 on the role of liaison librarians and the article “The last mile: liaison roles in curating science and engineering research data”. Our take on it is that we aren’t the curators; we would be the people putting others in touch with the data and “spreading the word”.

Also notes from a lecture by Paula Callan, QUT Library, circulated by Jane

– eResearch involves large-scale data, often requiring high-performance computing

Learning outcomes: awareness raising – this is part of the “stuff we should be aware of as liaison librarians”, and we need to keep up with trends and what is happening.

I think this might fit under BOK 5 – organisation, retrieval, preservation and conservation of information. Specifically the aspect of understanding the principles of e-research, rather than the enabling of access or the designing of systems that the BOK also talks about.

Posted by: Ali | September 22, 2010

The dreaded revalidation journal journey …

I’m not doing very well at keeping this up-to-date, or contemplating the BOKs, or whatever the hell they are supposed to be, that things fit into.

One of my colleagues reported back a couple of useful tips from the weekend hui:

  • record activities on an ongoing basis in a simple manner
  • record your learning outcomes for all activities, as your learning outcome is the key to deciding the correct BOK

Ok, maybe I will try and post a few more things here as my record, and try to put the learning outcomes in. I guess if the learning outcomes don’t really rock my world, maybe the thing isn’t worth putting into my final journal …

    Posted by: Ali | August 10, 2010

    Collaborating with peer tutors

    Keynote from Jan Stewart, Manager, Student Learning Support Services, Victoria University of Wellington “Understanding and supporting the student experience”

    Out of one of the discussions during the session came the idea of collaborating with peer tutors/class reps to get feedback about the student experience in terms of things library.

    Currently reading

    Badger, J. (2008). Turning ‘cold sellers’ into ‘must haves’: marketing unsought library products. Paper presented at the 2008 ALIA Biennial Conference (Dreaming 08). Retrieved from http://hdl.handle.net/1959.3/39780

    She says:

    We should use personal selling techniques, not undifferentiated mass advertising. We need to collaborate with users and involve them in marketing. Using a ‘person like me’ in campaigns, as is done in the ‘puffin muffin’ advertisements for iSelect health insurance products targeted at young people, is a good strategy. Employing students to be peer advocates may break down barriers and help new students feel comfortable in the environment. Smith and Reynolds (2007, p. 150) found many students will seek out student workers before they ask a professional librarian.

    What can/should we be doing to increase our liaison with student reps/peer tutors to help gather feedback from students and spread the word about things library?

    Will raise this at the Telsig feedback session for colleagues next week – will be interesting to see what the reaction is.

    Posted by: Ali | February 8, 2010

    Treaty lecture: Biculturalism: are we there yet?

    Presentation by Spencer Lilley on biculturalism and the library profession in NZ. No clear cut answer. Asked a question about Maori staffing in libraries.

    Posted by: Ali | January 23, 2010

    m-Libraries: Planning and implementation seminar

    m-Libraries: Planning and Implementation, Massey University Library, Friday 22nd Jan 2010

    Massey University is pleased to host a symposium on effective implementation of mobile library services and m-library applications. The focus will be on preparing for the implementation of mobile service delivery – the rationale, persuading others of the benefits, setting up policies and processes to demonstrate best practice, and implementation and evaluation.

    Featuring International Speaker Prof. Mohamed Ally. Dr. Mohamed Ally is Professor in Distance Education and Director of the Centre for Distance Education at Athabasca University, Canada and a recognised expert in the field of mobile technologies and mobile learning.

    Prof Ally: “need to have a sense of urgency as things are changing so fast”
    move from print to online to mobile
    developing countries have bypassed desktops and gone straight to mobile devices
    has edited a book on mobile technology in libraries
    edited a book on mobile learning – can be downloaded for free from www.aupress.ca/index.php/books/120155
    need to use intelligent agents that will detect student mobiles and format content for that device

    comments from the session/general vibe

    – lack of decent broadband access for all in NZ is a major drawback
    – NZ has very high ownership of cellphones but very low ownership of smartphones (e.g. the iPod touch) because telecom companies don’t subsidise them here as they do overseas

    my thoughts

    – the persistent refrain of mobile, identified as one of the top trends at the recent ALA mid-winter meeting
    – I am beginning to think it would be way better for me to type this stuff once and use one of those netbook things!

    a) list the data collection methods that are used in your workplace

    1. Feedback slips for students to give comments

    2. Insync survey – a major customer satisfaction survey for staff and students, run every few years (the same survey instrument is used by other Australasian universities, which enables comparisons to be made, but probably reduces our flexibility as to the questions we can ask)

    3. Information from students obtained as part of surveys on specific topics, e.g. subject guides – usually as part of a wider project

    b) do you think that the data collection methods that are used in your library adequately assess impact? Remember that impact assessment is the evaluation of the effectiveness of outcomes, rather than the efficiency of process in information services

    – current methods are better at problem identification and/or establishing levels of satisfaction – so efficiency rather than effectiveness

    (efficiency = outputs / inputs – the outputs the library achieves for the inputs put in, e.g. human resources, materials)

    (effectiveness = outcomes / inputs – the greater the outcomes (useful actions by the user) in relation to the same inputs, the more effective the service is)
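
    To make the distinction concrete, here is a minimal sketch with made-up numbers – purely illustrative, not figures from any real service:

        # Illustrative only: invented figures to show the efficiency/effectiveness split.
        staff_hours = 200                 # inputs: staff time spent on training sessions
        sessions_delivered = 40           # outputs: what the library produced
        students_reporting_benefit = 120  # outcomes: useful actions/changes by users

        efficiency = sessions_delivered / staff_hours             # outputs per unit of input
        effectiveness = students_reporting_benefit / staff_hours  # outcomes per unit of input

        print(f"Efficiency:    {efficiency:.2f} sessions per staff hour")
        print(f"Effectiveness: {effectiveness:.2f} reported benefits per staff hour")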

    1. Feedback slips – generally the library user is asking a specific question related to some aspect of the service – very often it is about the physical environment or physical resources (water cooler, PC availability, power outlets). So if we look at impact in terms of “effect of service” then we can make some inferences – i.e. a lack of power outlets is having an impact on students’ ability to use library spaces effectively – but because we don’t specifically ask about direct impacts, we are missing out on that actual data.

    2. This survey (run every few years) asks respondents to rate each of 29 statements (considered critical to the success of the library) – firstly to measure the importance of each statement, and secondly to measure their impression of the library’s performance, e.g. books and articles requested are delivered promptly; library staff are approachable and helpful. The words “value” and “impact” aren’t used. The satisfaction survey gathers quantitative data (which can be compared with other institutions), and a large number of verbatim comments are also collected. So we get an idea of how satisfied students are with the library’s performance, but there isn’t really any attempt to get an idea of value or impact – some of this does come through in the verbatim comments, but again this data isn’t gathered specifically.

    3. Similar comments to those in #2

    c) how could the assessment of impact in your library or information service be improved?

    – ask those filling out feedback slips to detail what the impact of their interaction with the library service really means to them

    – run some pre- and post-training impact assessments for selected training courses

    – pre- and post-introduction impact assessment for selected new services

    – mine the verbatim comments from the Insync survey to look for specific comments about impact (a rough sketch of this idea follows the list)

    – incorporate some questions into the Insync survey that focus specifically on impact (this may be difficult, as it is a survey organised for several institutions)
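
    A minimal sketch of the comment-mining idea, assuming the comments arrive as plain text – the comments and keywords below are invented for illustration:

        # Crude first pass at flagging verbatim survey comments that mention impact.
        # All comments and keywords here are invented examples.
        comments = [
            "The new power outlets meant I could finally work all day in the library",
            "Opening hours are fine",
            "The database class helped me find sources for my thesis much faster",
        ]
        impact_keywords = ("helped", "meant i could", "enabled", "saved me", "faster")

        hits = [c for c in comments if any(k in c.lower() for k in impact_keywords)]
        for comment in hits:
            print("Possible impact comment:", comment)

    A real pass would still need a human read of the flagged comments – keyword matching only narrows the pile.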

    Definition – impact analysis – the process of identifying key user groups and then identifying and evaluating the impact of services on those users.

    Impact analysis studies can be used to

    1. assess the impact of current services upon users

    – allows them to assess whether or not services are having an impact on users and make service development decisions on basis of this information

    2. assess the impact of new or improved services that libraries intend to deliver in the future

    – possible to carry out a “pre-impact” study, i.e. carry out an impact study to assess the current impact of services upon users, and then undertake a follow-up impact study to see if new or improved services have had an effect on the impact on users (the example used was introducing an online catalogue – good grief!)

    – useful when providing training to users – users can be asked to take part in a “pre-training” study to assess their current knowledge and abilities. After training has been provided they can be asked to take part in an impact study. This helps with assessing impact by comparing pre- and post-training skill levels, as in the sketch below.
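
    A rough sketch of what that pre/post comparison could look like, assuming hypothetical self-rated skill scores on a 1–5 scale (all numbers invented):

        # Hypothetical pre/post self-rated database skill scores for the same
        # six attendees; purely illustrative, not real survey data.
        pre_scores  = [2, 3, 1, 2, 3, 2]
        post_scores = [4, 4, 3, 3, 5, 4]

        changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
        mean_change = sum(changes) / len(changes)   # crude but simple impact indicator
        improved = sum(1 for c in changes if c > 0)

        print(f"Mean change in self-rated skill: {mean_change:+.1f}")
        print(f"{improved} of {len(changes)} attendees reported improvement")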

    Posted by: Ali | December 15, 2009

    MAXIM #5 Importance of maximising impact

    (from earlier notes)

    How is impact maximised?
    can be done in several ways
    – a common way is to promote services using marketing – to increase awareness, increase usage and thereby the impact on users
    – impact can also be maximised by “adding value” to services: making services as effective and efficient as possible; encouraging friendliness and courtesy; introducing new or extending existing library services

    a) how has your library or information service attempted to maximise the impact of its services to users?

    1. Publicising our services and resources to users via our blog (utilising it as a marketing and communication channel)

    2. Reviewing the subject guides on our library website – undertaking a review that looks at the content and coverage of the guides, with a view to making them more effective, i.e. useful to users. This project is still underway, and has involved surveying both library staff and students for feedback.

    3. Producing online tutorials to assist students with using databases and other resources – this could be seen as adding value to an existing product (databases) by providing tools to help users use them effectively. The tutorials will also provide a way for students who can’t attend classes (either because their course doesn’t have a scheduled library component, or because they are extramural) to access library tutorials and advice on using resources.

    b) did these impact maximising activities have any effect on your library or information service?

    This is tricky to say; the projects in points 2 and 3 are still underway, so it’s not yet possible to analyse their impact. That said, I think it would be difficult to evaluate effectiveness anyway, as we don’t have any metrics that specifically measure impact. We can certainly get an idea of usage rates from web statistics etc., but that wouldn’t measure impact. Some feedback might also come through the customer feedback slips and student surveys that we run.

    Further

    While working on this I thought it might be interesting to use a model to undertake a gap analysis of the impact the library thinks it has versus the impact customers think it has.

    The model I was thinking of was this one from Yoo Seong Song (1):

    Library positioning model (Song, 2009)

    (1) Song, Y. S. (2009). Designing library services based on user needs: new opportunities to re-position the library. Paper presented at the World Library and Information Congress: 75th IFLA General Conference and Council. Retrieved from http://www.ifla.org/files/hq/papers/ifla75/202-song-en.pdf

    You could modify this to create quadrants representing value/impact, then ask customers (regular users/non-regular users/non-users) where they consider the library fits on the quadrant (maybe also where they think it should fit), and then plot where the library considers it sits. Presuming there will be a gap (!), you could then think of strategies for closing the gap, i.e. increasing impact.
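
    A minimal sketch of that plot, assuming made-up averages on two invented 1–10 axes (perceived value and perceived impact); every number and label here is hypothetical:

        # Hypothetical gap-analysis plot: library self-assessment vs customer
        # perception on a value/impact quadrant. All figures are invented.
        import matplotlib.pyplot as plt

        points = {
            "Library self-assessment": (8, 7),
            "Regular users": (6, 5),
            "Non-regular users": (4, 3),
        }

        fig, ax = plt.subplots()
        for label, (value, impact) in points.items():
            ax.scatter(value, impact)
            ax.annotate(label, (value, impact))

        # Quadrant dividers at the midpoint of the scale.
        ax.axhline(5.5, linestyle="--")
        ax.axvline(5.5, linestyle="--")
        ax.set_xlim(1, 10)
        ax.set_ylim(1, 10)
        ax.set_xlabel("Perceived value")
        ax.set_ylabel("Perceived impact")
        ax.set_title("Library positioning (hypothetical data)")
        plt.show()

    The distance between the library’s own point and each customer group’s point is the gap you would then try to close.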

    Posted by: Ali | December 13, 2009

    MAXIM #7 Recruiting users for impact analysis


    Explore effective methods to recruit library users to take part in Critical Incident Technique (CIT) questionnaire surveys and follow-up interviews for impact analysis studies. State the implications (e.g. cost, time, resources etc) of the method(s) chosen

    Type of library service – database training

    List of methods to recruit library users to take part in CIT questionnaires and follow-up interviews.

    Notes from Knowing your users: analysing impact suggest CIT questionnaires can be distributed to attendees at the end of training sessions to assess the impact of the training. This would allow assessment of the short-term impact of the session on attendees. Alternatively, attendees could be asked to return the questionnaires after a set period of time, to establish the longer-term impact. To complete the impact study, users who return their questionnaires can be selected for follow-up interviews.

    I’m assuming here that attendance is not compulsory.

    1. Survey attendees immediately after training – either via a survey handed out after the session, or by emailing the attendees immediately after it

    – lower cost, as only attendees are surveyed, not all possible users of the service
    – doesn’t tell you anything about those who don’t attend the sessions
    – may give a higher response rate: having taken part in the session, attendees may feel more inclined to give feedback and to take part in later interviews

    2. Survey all library customers to see who has taken part in database tutorials

    – if all library users (i.e. the entire population) are surveyed, this would mean more expense in terms of the number of surveys produced (if a paper-based survey is used)
    – if only a sample of all library customers is used, then a sampling frame would be needed – this could be time-consuming/costly to generate or access, depending on access to institutional records of staff/students/library users (a rough sketch of the sample-size arithmetic follows these lists)
    – surveying all customers could raise awareness of database training amongst non-users
    – would allow us to find out more about non-users
    – may have a lower response rate from attendees than surveying immediately after the training – attendees may be less motivated to take part if it has been some time since the session

    3. Survey library users opportunistically, i.e. as they come into the library, or put a link to a survey on your website, then stop the survey once you have the required number of responses.

    – a convenience sample, so results can’t be extrapolated to all users – but it would give an indication of opinions about impact
    – surveying customers in this way could raise awareness of database training amongst non-users
    – would allow us to find out more about non-users

    For any of these methods, incentives could be used to help achieve a higher response rate. What incentives are offered would depend on who the library users are and what budget is available.
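
    On the cost side of sampling (method 2), the standard sample-size formula for estimating a proportion gives a feel for how many responses are needed; a minimal sketch, with all figures illustrative:

        import math

        # Standard sample-size calculation for estimating a proportion,
        # with a finite-population correction. All figures are illustrative.
        def sample_size(population, margin=0.05, z=1.96, p=0.5):
            n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
            return math.ceil(n0 / (1 + (n0 - 1) / population))

        students = 20000  # hypothetical size of the user population
        needed = sample_size(students)      # responses required (~377 here)
        invites = math.ceil(needed / 0.25)  # scale up for an assumed 25% response rate
        print(f"Responses needed: {needed}; invitations to send: {invites}")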

    As indicated in the briefing notes about this method, follow-up interviews would be conducted amongst those who completed the survey. Do we need to elaborate on the cost, time and resources for these? e.g.
    – face-to-face interviewing is more expensive than questionnaire surveys
    – it is more costly and time-consuming to analyse the data (e.g. content analysis of responses – especially if the questions are very open, she says from personal experience!)

    Not sure how much detail we need to go into here – the course doesn’t go into survey methodology in any detail at all!

    Posted by: Ali | December 4, 2009

    Maxim: What is impact? (notes from overview pt 1)

    Markless and Streatfield (1) define impact as:

    any effect of the service (or an event or initiative) on an individual or group

    Could be +ve or -ve, and relate to quality of life, educational or other outcomes

    Value – often used to describe impact. Urquhart and Hepworth (2) define value as:

    the benefit the user obtains from the use of information and the effect of that use

    Examples of impact evaluation – several readings given. Streatfield and Markless appear to be the key authors, and have researched the evaluation of library services in the academic sector.

    How is impact maximised?
    can be done in several ways
    – common way is to promote services using marketing – to increase awareness, increase usage and thereby impact on users
    – impact can also be maximised by “adding value” to services: making services as effective and efficient as possible; encouraging friendliness and courtesy (this seems a bit of a stretch – isn’t this a basic service standard?); introducing new or extending existing library services

    How is impact evaluated – readings given

    (1)  Markless, S., & Streatfield, D. (2006). Evaluating the impact of your library. London: Facet.

    (2) Urquhart, C., & Hepworth, J. (1995). The value to clinical decision making of information supplied by NHS library and information services. British Library Research and Development Department.
