Measuring the impact of summer reading
Gathering evidence before, during, and after a summer reading initiative helps you understand what made a difference. You can also use that evidence to review, prioritise, and inform future initiatives.
Why measure the impact of your summer reading initiative?
An evidence-based approach gives your school rich information for measuring the impact of your summer reading initiative.
Decide your approach at the planning stage, along with your time frames for gathering evidence. This will enable you to:
- provide a rationale and motivation for summer reading initiatives
- come up with creative ideas for an engaging initiative
- set up goals and targets, and work out your priorities
- acknowledge the efforts and results of everyone involved
- provide information for reporting about outcomes
- use the evidence to help you reflect on and refine your summer reading initiative.
Plan a summer reading initiative in your school
We developed an evidence-based summer reading initiative, collecting student achievement data before and after summer. In December we tested 20 students reading just below their expected level. We communicated with parents, provided free books, issued library books and reading activity booklets, and opened our library for 6 mornings in January. When we re-tested in February, all but one student had maintained or improved their reading level.
— Participant in our Sail into Summer Reading Programme
Using an inquiry approach
If your school decides to use an inquiry model to plan an in-school initiative to support summer reading, you'll progress through the following steps:
1. Set an objective to maintain student reading levels in the school over summer.
2. Draft an initial plan for your programme.
3. Trial this with a group of students.
4. Modify your draft programme in line with the feedback and results of the trial, so that your initiative improves student achievement.
Planning successful user-centred change provides information on this Spiral of Inquiry approach.
3 dimensions of evidence-based practice
Dr Ross Todd, Director of Rutgers University’s Center for International Scholarship in School Libraries (CISSL) describes 3 dimensions of evidence-based practice:
Evidence for practice — examining and using existing national and international research to inform and inspire changes of practice in schools.
Evidence in practice — drawing on locally-generated evidence as a result of changes in practice.
Evidence of practice — outcomes, results, impact of initiatives from information collected in your school.
Evidence-based practice and why it matters
1. Evidence for practice
National and international research provides important information about the summer reading 'slide'. It also adds impetus and helps motivate your school community to address the issue.
Research on the summer slide and summer reading
2. Evidence in practice
This is about integrating research findings with professional expertise and local evidence so you can take a strategic approach to creating initiatives.
School data and teacher observations provide a local context for summer reading loss, what it means for teacher practice and how it impacts particular students.
Start by finding out the following:
Is your school aware of the 'summer slide' and its impact on student reading progress?
Are parents in your school community aware of this issue? How can you start a discussion about why it matters and what to do about it?
Is there any existing data or testing of reading levels for the end of year / beginning of year (such as STAR, BURT or PROBE reading assessments) that gives you quantitative data about student reading loss over the summer?
Next steps for gathering evidence in practice
Identify your school’s current summer holiday reading practices — use our summer reading Reflection on current practice questionnaire.
Discuss possible approaches to meet the needs of your school community.
Identify students your school would like to target.
Come up with ways to gather evidence of the impact of your proposed initiatives.
Develop a plan that involves everyone — classroom teachers, libraries and families.
Reflection on current practice (pdf, 194KB) questionnaire.
3. Evidence of practice
Your school can gather evidence of the impact of your summer reading initiatives in various formal and informal ways using:
Quantitative data — information you can report in numerical form, in graphs or tables.
Qualitative data — information about how people feel or think about things, for example, 'voices', quotes, stories, reports and written responses from questionnaires or surveys.
The focus is on student learning outcome measures (what students achieve) — such as reading attitudes, levels, confidence or behaviours — rather than on output measures (what schools and libraries do) — such as the number of books issued or the number attending summer reading programmes.
Here are some examples of results that measure student learning outcomes:
Students are reading more, and voluntary reading is becoming a personal habit.
Reading test scores show maintenance or improvement in reading levels achieved over summer.
The number of students who say they enjoy reading is increasing.
Children are having library books read to them at home.
Parents are able to talk about their child’s reading with teachers, library staff and others.
Positive attitudes develop towards reading and library use.
Non-library users have started using the library, voluntarily, for reading materials.
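If your before-and-after test results are recorded as numbers (for example, reading levels from one of the assessments mentioned above), a short script can turn them into an outcome measure. This is a minimal sketch with made-up student labels and levels — not real data — showing how many students maintained or improved their level over summer:

```python
# Illustrative before/after reading levels — the students and
# numbers here are invented, not real assessment data.
results = {
    # student: (level before summer, level after summer)
    "Student A": (17, 19),
    "Student B": (22, 22),
    "Student C": (15, 14),
    "Student D": (20, 23),
}

# An outcome measure: how many students held or lifted their level.
maintained_or_improved = sum(
    1 for before, after in results.values() if after >= before
)

percentage = 100 * maintained_or_improved / len(results)
print(f"{maintained_or_improved} of {len(results)} students "
      f"({percentage:.0f}%) maintained or improved their reading level.")
```

The same summary could just as easily be done in a spreadsheet; the point is to report a single, comparable figure each year.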
Opportunities for gathering evidence of practice
The main sources of qualitative and quantitative data that you can analyse to measure the impact of your summer reading initiative come from your:
school
school library
students
public library.
Having evidence from different perspectives creates a more convincing and reliable picture.
Your school
At school you can collect evidence from:
progress made towards achieving your school’s literacy goals and targets in the annual plan
any changes in reading levels and attitudes of targeted students
information from existing testing programmes before and after summer holidays — for example, running records, STAR, BURT, e-asTTLe, PM benchmarks or PROBE
anecdotal reports, conversations and observations from teachers
strategies used by different teachers/classes, and the results of these
student participation in public library summer reading programmes
feedback from families, questionnaires or conversations
the uptake by students and parents of your year-round promotion of reading for pleasure over each holiday break.
Your school library
Formal and informal surveys of students and their families who have used the library to get books for the holidays are a great source of data. Other options include:
evidence of change from library management system data, such as borrowing statistics
liaison with teachers about targeted students’ reading mileage
feedback from visitors to the library if it's been open during the holidays and other anecdotal reports, conversations and observations
observation of changes to students' reading habits, reading mileage and enjoyment — including their confidence in choosing books independently for pleasure and uptake of any summer reading challenges.
You could demonstrate the impact by:
taking photos of student readers
displaying the most popular books read over the summer, with a brief review of each.
Your students
A simple survey or questionnaire sent to students and their families before and after the summer break is a useful way of gathering evidence. Other methods include:
anecdotal feedback from families
asking students about their favourite reads after the holidays
checking reading logs
measuring the uptake of competitions, personal challenges or other reading goals.
Your local public library
The public library can also be a useful source of information about summer reading habits. You can look at the evidence in:
feedback from the public library about uptake by students of their summer reading programmes
students' participation in challenges or competitions.
Documenting other benefits of your initiatives
Your school might find it useful to document other spin-off benefits from your summer reading programme, such as how:
partnerships between home and school have been strengthened
students, especially those targeted by your summer reading programme, increasingly use and value the school library
you've developed stronger relationships with the public library
through developing home literacy practices, parents have become more informed and confident about helping their children with reading.
Tools and methodologies for gathering data
As you plan the processes around your data gathering, you'll discuss and make decisions on:
how library and teaching staff can coordinate their surveying of students
who will gather data, when and how — timing, frequency, level and sample size
the design of surveys for useful results and a variety of options for feedback — pen and paper, online and face-to-face.
Analysis and reporting
Once you've gathered your evidence and data, you'll need to analyse them, share the results, and decide how to apply what you've learned in practice.
Sharing the results will encourage successful practices to be embedded and developed further, or lead to refinements and improvements in the future.
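Qualitative questionnaire responses can also be summarised simply for sharing. This is a minimal sketch, assuming hypothetical answers to a question like 'Do you enjoy reading?' gathered before and after the holidays; it tallies each answer so you can compare the two surveys side by side:

```python
from collections import Counter

# Illustrative survey answers (not real data) to the question
# "Do you enjoy reading?", collected before and after summer.
before = ["yes", "no", "sometimes", "yes", "no", "sometimes", "yes"]
after = ["yes", "yes", "sometimes", "yes", "no", "yes", "yes"]

for label, answers in [("Before summer", before), ("After summer", after)]:
    counts = Counter(answers)       # tally each distinct answer
    total = len(answers)
    summary = ", ".join(
        f"{answer}: {count} ({100 * count / total:.0f}%)"
        for answer, count in counts.most_common()
    )
    print(f"{label}: {summary}")
```

Percentages make the before/after comparison fair even if the number of respondents differs between the two surveys.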
Student success stories can be a particularly potent way of communicating the impact of summer reading initiatives to colleagues, school managers and family/whānau.
Summer reading stories – stories from schools about how they planned, managed, and evaluated summer reading initiatives.