Response rates [7]
Active Lives survey years run from 16 November–15 November. Using the formula above, the response rate for the period 16 November 2023–15 November 2024 was 19.3% across the survey year, with 74.3% of all respondents taking part online and 25.7% filling in a paper questionnaire.
The response rate varied between individual waves, from a low of 18.2% in Wave 107 to a high of 20.6% in Wave 99.
Several possible explanations for this variation are outlined in the following section on reasons for variations in response rates.
Response rate by wave and mode
| Wave | Response rate by wave | Proportion of returns: online | Proportion of returns: postal |
|---|---|---|---|
| W97 | 19.00% | 76.79% | 23.21% |
| W98 | 20.47% | 73.90% | 26.10% |
| W99 | 20.59% | 76.65% | 23.35% |
| W100 | 20.13% | 76.68% | 23.32% |
| W101 | 19.51% | 73.78% | 26.22% |
| W102 | 19.36% | 74.49% | 25.51% |
| W103 | 19.05% | 72.84% | 27.16% |
| W104 | 18.67% | 73.59% | 26.41% |
| W105 | 19.21% | 72.34% | 27.66% |
| W106 | 18.62% | 73.42% | 26.58% |
| W107 | 18.16% | 74.19% | 25.81% |
| W108 | 18.20% | 73.82% | 26.18% |
Reasons for variations in response rate
It is important to note that a wide range of external factors beyond Ipsos’ control may account for variation in response rates across the year.
For instance, the day of the week a survey lands, and whether it lands during a holiday period or close to a bank holiday, may have influenced the number of households responding. In addition, although overall deadwood (ineligible sampled addresses) was assumed to be 8%, the true proportion may have varied from wave to wave.
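The role the 8% deadwood assumption plays in the calculation can be sketched as follows. This is a minimal illustration using a convention common in address-based surveys (deduct assumed-ineligible addresses from the issued sample before dividing), not necessarily the exact formula defined earlier in the report, and the wave figures are hypothetical:

```python
# Illustrative only: responses divided by the issued sample net of
# assumed deadwood. The exact Active Lives formula is defined earlier
# in the report; the numbers below are hypothetical.

def response_rate(returns: int, issued: int, deadwood: float = 0.08) -> float:
    """Return responses as a share of the issued sample after removing
    the assumed proportion of ineligible (deadwood) addresses."""
    eligible = issued * (1 - deadwood)
    return returns / eligible

# Hypothetical wave: 10,000 addresses issued, 1,776 completed returns.
rate = response_rate(1776, 10_000)
print(f"{rate:.1%}")  # 19.3% under the 8% deadwood assumption
```

If the true deadwood rate in a given wave was higher than the assumed 8%, the published rate for that wave would understate the rate among genuinely eligible households, which is one way the assumption can contribute to apparent wave-to-wave variation.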
The target number of responses for the survey year was 176,150. Target returns varied between 200 and 2,000 across the 296 local authorities in England; for 180 of the 296 local authorities, the target number of returns was 400.
To achieve these targets, the sample design varied between waves. For instance, if a response rate was low in a particular local authority, a greater number of households received invitations in following waves than originally planned, and vice versa.
The net effect was a larger issued sample in areas less likely to respond, which would have depressed the overall response rate. This is likely to account for some of the variation seen in the response rate across the survey year.
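The wave-to-wave reallocation described above might be sketched as follows. This is a hypothetical helper, not Ipsos’ actual allocation procedure, and the target and return figures are invented for illustration:

```python
# Hypothetical sketch of wave-to-wave sample reallocation: scale the
# next wave's issued addresses by how far a local authority fell short
# of (or exceeded) its per-wave target. Not Ipsos' actual procedure.

def next_wave_issue(planned_issue: int, target: int, achieved: int) -> int:
    """Scale the planned issue by the ratio of target to achieved returns."""
    if achieved <= 0:
        return planned_issue  # no information from this wave; keep the plan
    return round(planned_issue * target / achieved)

# Authority under-performing (25 returns against a per-wave target of 33):
# the next wave's issue is scaled up from 1,000 to 1,320 addresses.
print(next_wave_issue(1000, 33, 25))

# Authority over-performing (40 returns): the issue is scaled down to 825.
print(next_wave_issue(1000, 33, 40))
```

The design choice illustrated here is the one the text describes: low-responding areas receive more invitations in later waves, which protects the per-authority targets at the cost of pulling the overall response rate down.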
Other evidence for variation in response rate
The push-to-web method is comparatively new in the field of survey research and, as yet, relatively little is known about key determinants of push-to-web survey response rates.
However, it appears that they vary with numerous factors including incentive level, letter wording/layout, reminder regime, survey subject matter and, possibly, survey sponsor.
It is generally regarded as particularly important for push-to-web surveys to include an alternative response mode in later reminders.
It is known that a significant proportion of the population (around 7% of households in Great Britain [8]) do not have access to the internet, and some respondents are more willing to respond by paper questionnaire than online.
Furthermore, there is evidence from the US indicating that inclusion of a postal survey element in a push-to-web survey reduces non-response bias. In light of the above, a paper questionnaire is sent to respondents at the third reminder stage in the hope that it will increase response rate and reduce non-response bias.
However, it should be noted that overall response rates are generally lower in online surveys than in postal-only surveys, and available evidence indicates that this also applies to push-to-web surveys even when paper questionnaires are included in later reminders.
In Year 7 of the survey, an experiment was carried out to attempt to improve the response rate and push more people to the online survey through the adoption of QR codes on survey invitations.
The QR codes were found to increase response rates and were therefore implemented from Year 8 of the survey.
[7] NB – Figures in this section reflect all responses received, and may therefore differ slightly from the final data, which contains only cleaned responses and omits duplicates and responses received after the close of fieldwork.
[8] https://commonslibrary.parliament.uk/research-briefings/cdp-2023-0176/