
Behaviorally informed interventions can increase take-up of public employment services, but conversion remains challenging: insights from an RCT in British Columbia, Canada

Published online by Cambridge University Press:  08 September 2025

Christian H. Schimpf*
Affiliation:
Department of Political Science, The University of British Columbia - Vancouver Campus, Vancouver, BC, Canada
Vince Hopkins
Affiliation:
Department of Political Science, The University of British Columbia - Vancouver Campus, Vancouver, BC, Canada
Priscilla B. Fisher
Affiliation:
Vancouver School of Economics, The University of British Columbia - Vancouver Campus, Vancouver, BC, Canada
Jeff Dorion
Affiliation:
BC Public Service Agency, Government of British Columbia, Victoria, BC, Canada
*
Corresponding author: Christian H. Schimpf; Email: christian.schimpf@ubc.ca

Abstract

Low take-up of government services continues to challenge public investments in social services. Behaviorally informed interventions, so-called nudges, can overcome barriers that keep eligible individuals from accessing services. We report results from a pre-registered randomized controlled trial (RCT) testing email-based interventions to increase the take-up of publicly funded employment services in British Columbia, Canada. Our RCT design distinguishes between getting people ‘to-the-door’ (awareness and interest) and ‘through-the-door’ (enrollment). We find that emails with concise information that route individuals directly to online enrollment are most effective. The best-performing interventions more than doubled enrollment within 14 days, relative to a control group that received no communication. Using machine learning, we identify subgroups within the population who benefit most from our interventions. Yet despite these positive effects on take-up, we find that converting expressions of interest into enrollments remains a challenge. To increase take-up, policymakers must identify the nature of the challenge: getting people to-the-door or through-the-door. We also contribute to current debates about the quality of public service delivery.

Information

Type
Findings from the Field
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0), which permits non-commercial re-use, distribution, and reproduction in any medium, provided that no alterations are made and the original article is properly cited. The written permission of Cambridge University Press must be obtained prior to any commercial use and/or adaptation of the article.
Copyright
© The Author(s), 2025. Published by Cambridge University Press.

Introduction

Take-up of many government programs is low, especially when participation is voluntary (Currie, 2006; Bhargava and Manoli, 2015; Janssens and Van Mechelen, 2022; Daigneault, 2023). Low take-up may undermine the welfare benefits of public investments (Bearson and Sunstein, 2023; Fox et al., 2023; Ko and Moffitt, 2024). Active labor market policies (ALMPs) are a prominent example. ALMPs are widely used to improve the prospects of jobseekers, encompassing services such as job search assistance, training and skills programs, and subsidized jobs (Caliendo and Schmidl, 2016; Bonoli and Liechti, 2018; Card et al., 2018; Tübbicke and Schiele, 2024). In 2021, for example, OECD countries spent an average of 0.6% of their GDP on public employment measures (OECD, 2024). Yet despite this spending, and the many potential benefits for jobseekers, take-up remains as low as or lower than that of many other social programs (Crépon and van den Berg, 2016). In this paper, we ask: Why is take-up of publicly funded employment services low, and what can policymakers do about it?

We report on a pre-registered, large-scale randomized controlled trial (RCT) conducted in partnership with the Government of British Columbia, Canada. We test the effect of behaviorally informed interventions on take-up of public employment services (pre-registered analysis plan on OSF: https://bit.ly/3XGc2Tm). Prior research shows that take-up of the provincial labor market program, WorkBC, is low – in one study, enrollment was below 1% among applicants for unemployment insurance benefits (Hopkins and Dorion, 2024). We build on previous research on public employment service take-up (Heckman and Smith, 2004; Darling et al., 2017; Sanders and Kirkman, 2019; Tregebov et al., 2021; Mühlböck et al., 2022; Dhia and Mbih, 2023; Hopkins and Dorion, 2024; Lehner and Schwarz, 2024). Our study is mixed-method: prior to the RCT, we conducted interviews and original survey research. Based on this evidence, we expected that take-up of WorkBC is low because citizens are insufficiently motivated to start the application process (due to informational barriers) and highly sensitive to frictions once they are motivated or in the process of applying (delayed action; difficulties navigating enrollment processes).

Crucially, our experimental design distinguishes between two steps in the take-up process: getting jobseekers ‘to-the-door’ (awareness and interest) and ‘through-the-door’ (enrollment). Working with the BC government, we co-designed a 2 × 2 factorial with an additional, pure control group. Everyone in the four intervention groups received an email with the same concise information about WorkBC and a call to action. To get more people to-the-door – i.e., to the WorkBC enrollment website – we varied the call to action. We invited some participants to opt in by clicking on a single button (‘standard call’), while others were invited to explicitly choose between connecting with WorkBC or not (‘Active Choice’). In both cases, making the choice was voluntary. To get interested people through-the-door – i.e., enrolled in WorkBC – we varied the enrollment process. We routed some participants directly to the WorkBC enrollment website (‘cool handoff’) and others to an expression of interest form that first connected them with their local employment center (‘warm handoff’). The warm handoff was designed to support the WorkBC online enrollment process by connecting each jobseeker with an employment counselor.

Our study has three main takeaways. First, we find that the cool handoff increased take-up of WorkBC among the target population by a factor of 2.7, relative to the control group. This finding supports existing research showing that reaching out to jobseekers and providing them with concise information can increase take-up of public employment services (e.g., Lehner and Schwarz, 2024). Using machine learning, we find evidence of treatment effect heterogeneity. Participants aged 45+ and participants under 36 who had not previously been selected for contact through BC’s EI Targeting, Referral and Feedback system were more responsive to the Active Choice prompt, while the remaining groups were more responsive to the standard call.

Second, and counter-intuitively, we find no evidence that the warm handoff increased take-up relative to the control group. We connected prospective clients with employment counselors to support the task of applying to WorkBC. But rather than enroll clients as quickly as possible, counselors may have used the opportunity to manage their workload – selectively allocating attention toward other duties. Our evidence in this regard is only suggestive, but the explanation would be consistent with coping models of administrative discretion, in which high workload causes staff to ration services (e.g., Lipsky, 1980; Tummers and Bekkers, 2014; Kolstad, 2023; Bell and Jilke, 2024; Bell and Meyer, 2024).

Third, we find that a low percentage of people in the warm handoff conditions who express an interest in WorkBC actually enroll – only 8–16% are enrolled in WorkBC within 14 days. In other words, the main challenge is not getting people interested in WorkBC, it is converting those expressions of interest into enrollment. To increase take-up, policymakers should first understand whether participation is low because people are having trouble getting to-the-door or through-the-door. This distinction can help policymakers allocate marginal dollars. When people are not getting to the door, governments should invest in marketing and communications (e.g., better outreach). When people cannot get through the door, they should invest in user-friendly application procedures (e.g., better online forms).

Methods

Context and background research

Our study took place in Canada, where federal and provincial governments share responsibility for labor market policy, the implementation of which is often contracted to third-party service providers. This mix can foster administrative burdens that lead to low take-up (Herd et al., 2023: 2). In Canada, ALMPs were traditionally the federal government’s purview until 1996. Since then, Canada’s delivery of public employment services has been primarily decentralized, as is the case in more than half of federal OECD countries (OECD, 2023). To this end, the federal government has signed Labour Market Development Agreements with the provinces and territories, which manage on-the-ground services. In the province of British Columbia, public employment services are managed by the Ministry of Social Development and Poverty Reduction (SDPR). Services include, among others, a job search bank and funding to third-party service providers who run employment offices across 45 geographic ‘catchments’ in the province, each offering a range of services from job search assistance to skills training. Jointly, these voluntary labor market programs are called WorkBC.Footnote 1

In 2022–2023, we conducted background research on WorkBC. This research consisted of three components: (1) a literature review; (2) interviews with jobseekers, WorkBC clients and staff (N = 20); and (3) an online survey of adult British Columbians (N = 1,247). Our research revealed three behavioral barriers to take-up of WorkBC. First, we found low motivation to enroll at the to-the-door stage, largely due to informational barriers. Second, we found that individuals experienced financial difficulties. The resulting stress reduced individuals’ capacity to plan ahead, leading them to delay their enrollment. Third, we found high sensitivity to frictions at the through-the-door stage, largely due to confusion with WorkBC’s online enrollment form. For more details, see Supplementary Material I.

Randomized controlled trial

In partnership with the Government of British Columbia, we fielded an RCT with 9,877 participants between 11 September and 12 October 2023. The design of the RCT is a 2 × 2 factorial with an additional control group. Our government partner sampled the 9,877 participants across 44 participating catchments between 1 August and 4 September 2023 from a larger pool of unemployed jobseekers (age 16+) who had applied for job loss benefits, called Employment Insurance (EI).Footnote 2 The EI program in Canada provides temporary benefits to eligible workers who have paid into the program and have lost their jobs through no fault of their own. Funding for the program comes from mandatory employer and employee contributions.Footnote 3 Participants were randomly assigned to one of the five RCT groups.
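The assignment scheme – a 2 × 2 factorial plus a pure control group – can be sketched as simple random assignment across five arms. This is an illustrative sketch only; the group labels and equal-probability allocation are our assumptions, not the trial's actual allocation code:

```python
import numpy as np

# Hypothetical sketch of the 2 x 2 factorial + pure control assignment.
# Labels and equal arm probabilities are assumptions for illustration.
rng = np.random.default_rng(seed=42)

N = 9877  # participants sampled by the government partner
groups = [
    "control",                        # no communication
    "standard_call x cool_handoff",
    "standard_call x warm_handoff",
    "active_choice x cool_handoff",
    "active_choice x warm_handoff",
]

# Simple random assignment with equal probability across the five arms.
assignment = rng.choice(groups, size=N)

counts = {g: int((assignment == g).sum()) for g in groups}
print(counts)
```

With equal probabilities, each arm receives roughly N/5 ≈ 1,975 participants; block or stratified randomization (e.g., by catchment) would be a natural refinement in practice.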

Participants in the four treatment groups received an email on 11–12 September 2023. Each email contained the same concise information about WorkBC and a prominent call to action. First, we varied the call-to-action (standard call vs Active Choice). The standard call was a single button that read ‘Yes, I want to connect with WorkBC’. The Active Choice call displayed the same button next to another button reading ‘No, I don’t want to connect with WorkBC today’. Our Active Choice intervention follows research on similar interventions in contexts where choices are voluntary (e.g., Putnam-Farr and Riis, 2016). Next, we varied the enrollment process (cool handoff vs warm handoff). The cool handoff led those who clicked on the ‘Yes’ button to WorkBC’s standard online enrollment form. The warm handoff led to an online expression of interest form with pre-populated fields for name, email and a short message to the participant’s local WorkBC center. Participants who submitted the short form would eventually have to complete the standard online enrollment procedure on the WorkBC website, but this way the local office could assist where needed. For more details on the intervention design and mechanisms, see Supplementary Material I.

The email with concise information was designed to increase enrollment into WorkBC by reducing learning costs and ambiguity about the program. The Active Choice call-to-action was designed to increase enrollment by prompting individuals to make a decision and get more people to-the-door, while the warm handoff was designed to increase enrollment by helping people navigate the application process. Figure 1 summarizes the design and the measures collected by the government, which correspond to the two parts of take-up. Our primary outcome measure is enrollment into WorkBC, i.e., getting participants through-the-door. Our secondary measures capture interest (click-throughs and expression of interest submissions), i.e., getting to-the-door (Hopkins and Dorion, 2024). To contextualize our results and track the full participant journey, we also measure awareness using email opens. However, this is not an outcome of interest, as we did not expect any differences – email subject lines were identical across all treatment groups.Footnote 4 We estimate the intention-to-treat effect with a pre-treatment covariate-adjusted least squares model, using the Lin (2013) estimator. Based on the results, we estimate marginal means. For more details on the design, copies of the treatments, variable definitions and analysis plan, see Supplementary Material II. Replication materials are available at Harvard Dataverse (Schimpf et al., 2025). The RCT was conducted by our government partner in compliance with all relevant ethical obligations. The project was classified as a quality improvement initiative involving only minimal risk. Therefore, informed consent was neither required nor deemed appropriate. Secondary data analysis and all other components were approved by ethics review boards.Footnote 5
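The Lin (2013) adjustment amounts to an OLS regression of the outcome on treatment indicators, demeaned pre-treatment covariates and their interactions; the treatment coefficient is then the covariate-adjusted intention-to-treat estimate. A minimal single-treatment sketch on simulated data (all numbers below are illustrative, not the trial's):

```python
import numpy as np

def lin_ate(y, d, X):
    """ATE via Lin (2013): OLS of y on treatment, demeaned covariates,
    and treatment-by-demeaned-covariate interactions."""
    Xc = X - X.mean(axis=0)                       # demean covariates
    Z = np.column_stack([np.ones_like(d), d, Xc, d[:, None] * Xc])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta[1]                                # coefficient on d

# Simulated enrollment data (hypothetical effect sizes).
rng = np.random.default_rng(0)
n = 20_000
X = rng.normal(size=(n, 2))                       # pre-treatment covariates
d = rng.integers(0, 2, size=n).astype(float)      # random assignment
p = np.clip(0.10 + 0.05 * d + 0.02 * X[:, 0], 0.01, 0.99)
y = rng.binomial(1, p).astype(float)              # binary enrollment outcome

print(round(lin_ate(y, d, X), 4))                 # close to the true 0.05
```

Because the covariates are demeaned before interacting, the treatment coefficient retains its average-effect interpretation even when effects vary with covariates; in the paper, HC1 robust standard errors accompany these point estimates.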

Figure 1. RCT design.

Results

Did our interventions increase take-up of WorkBC?

We begin with our pre-registered primary expectation (H1) that at least one of our interventions would increase take-up relative to the control group (through-the-door). Figure 2 shows the estimated, i.e., covariate-adjusted, take-up of WorkBC across the control group and the four treatment conditions 14 days after the trial began. Take-up is lowest in the control group (i.e., no communication), at 0.68%.Footnote 6 Take-up is highest in the two cool handoff conditions: about 1.8% of participants who were routed directly to online enrollment signed up for WorkBC within 14 days of the trial’s start. This is about 2.7 times the control-group rate, and the increase is statistically significant. In contrast, the warm handoff groups, in which participants filled out an expression of interest form for their local WorkBC office after clicking the email, had lower enrollment rates that were not significantly different from the control group (Table 1). Based on these results, we reject the null hypothesis for our main expectation: at least one of the interventions increased take-up. Following our pre-registered secondary expectations, we also test whether the Active Choice × warm handoff condition has the largest effect on take-up (H2), whether there is a difference between the warm and cool handoff enrollment processes (H3) and whether there is a difference in take-up between the standard call to action and the Active Choice (H4). The only statistically significant difference is between the warm and cool handoff conditions, confirming what Figure 2 suggests: on average, the cool handoff led to greater enrollment in WorkBC, an increase of 0.80 percentage points (Supplementary Material III.3, Table A3.4).Footnote 7

Figure 2. Estimated average take-up of WorkBC after 14 days across the control group and treatment groups (95% confidence intervals). Note: Estimated marginal means based on pre-treatment covariate-adjusted linear regression model with Lin estimator and HC1 robust standard errors (Table 1); * indicates statistically significant difference in take-up of WorkBC relative to the control group.

Table 1. Primary regression results showing the effect of the RCT interventions on take-up of WorkBC

Note: Estimates with robust HC1 standard errors in parentheses; Holm adjustment for multiple comparisons (N = 9).

* p < 0.05.

Did our interventions work differently for different populations?

Next, we explore whether the interventions affected population subgroups differently, as per our pre-registration plan. The data included in our analysis are also typically available to employment centers before applicants enroll, making it possible to tailor outreach materials to individual characteristics. We find evidence of treatment effect heterogeneity by gender, education, age and prior engagement with the EI Targeting, Referral and Feedback (TRF) system.Footnote 8 We use machine learning – specifically, tree-based policy learning – to devise a decision rule that optimizes the assignment of our interventions across our two treatment variables (Sverdrup et al., 2020; Zhou et al., 2023). The results show that participants under 36 with a prior TRF referral are most likely to respond to the standard call × cool handoff, as are participants between the ages of 36 and 45. Participants under 36 without TRF engagement before the trial respond better to the Active Choice × cool handoff, as do participants aged 45 or older. Exploratory analyses suggest that the groups more responsive to the Active Choice conditions have larger hidden pockets of interest in WorkBC, resulting in a higher level of ‘nudgeability’ (de Ridder et al., 2022). The Active Choice conditions successfully convert this nudgeability into higher click-through and enrollment rates (see Supplementary Material III.6, Table A3.11). In summary, machine learning allows us to build optimal, personalized intervention strategies that target those most affected by our treatments, even in a case like ours where, on average, two interventions perform equally well (see Supplementary Material III.6 for the full analysis of treatment heterogeneity).
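Zhou et al.'s (2023) tree-based policy learning is considerably more involved (doubly robust scores, exhaustive tree search), but the core idea can be sketched as a depth-one policy tree: score each arm for each participant by inverse-probability weighting, then search covariate thresholds for the assignment rule that maximizes estimated policy value. Everything below – data, effect sizes, the age-45 split – is synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40_000

# Synthetic trial: two arms assigned 50/50; in this toy world, older
# jobseekers respond to arm 1 (Active Choice), younger ones to arm 0.
age = rng.integers(20, 61, size=n)
arm = rng.integers(0, 2, size=n)
p_enroll = np.where(age >= 45,
                    np.where(arm == 1, 0.15, 0.05),
                    np.where(arm == 1, 0.05, 0.15))
y = rng.binomial(1, p_enroll)

# Inverse-probability-weighted scores: Gamma[i, a] estimates the outcome
# unit i would produce under arm a (assignment probability is 0.5).
Gamma = np.zeros((n, 2))
Gamma[np.arange(n), arm] = y / 0.5

def policy_value(assign):
    """Estimated mean outcome if each unit received assign[i]."""
    return Gamma[np.arange(n), assign].mean()

# Depth-one policy tree: exhaustive search over age thresholds and the
# arm assigned on each side of the split.
best = max(
    ((policy_value(np.where(age < t, a_lo, a_hi)), t, a_lo, a_hi)
     for t in range(21, 61)
     for a_lo in (0, 1) for a_hi in (0, 1)),
    key=lambda r: r[0],
)
value, threshold, arm_young, arm_old = best
print(threshold, arm_young, arm_old)   # should recover the age-45 rule
```

The search includes constant policies (same arm on both sides), so the learned rule only splits when the split genuinely improves estimated value – the same logic, applied recursively with depth-two trees and doubly robust scores, underlies the analysis in Supplementary Material III.6.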

Did interest in WorkBC translate into take-up?

In this section, we conduct a series of exploratory analyses to understand whether interest in WorkBC translated into take-up. In Figure 3, we show the absolute and relative number of people in each of the four treatment groups who opened the email, clicked the link, submitted the expression of interest form (warm handoff conditions only) and ultimately enrolled in WorkBC within 14 days. On average, 48% of RCT participants across the four groups opened the email, with no discernible differences between the groups – as expected, given that the email subject lines were identical. Moreover, we find that interest in WorkBC is relatively high across all of the treatment groups. Between 11% (standard call × warm handoff) and 13% (Active Choice × cool handoff) of participants clicked on the hyperlinked button.Footnote 9 The click-through percentages are comparable to interest captured in our pre-trial survey.Footnote 10 We find no evidence that the Active Choice call-to-action increased click-throughs (Supplementary Material III.3, Table A3.5).Footnote 11 In the two warm handoff conditions, participation drops again at the expression of interest stage: about 68% of people who clicked on the hyperlink submitted the expression of interest form during the first 14 days. In other words, about one-in-ten RCT participants expressed an interest in WorkBC, yet only about one-in-one-hundred enrolled.

Figure 3. Take-up of WorkBC after 14 days in RCT from to-the-door (email open, click-through, short-form submissions) to through-the-door (enrollment). Note: Numbers are unadjusted numbers based on RCT data; 95% CIs; percentages are based on the total N of RCT participants within a treatment group.

Figure 3 also shows a low conversion rate, defined as the ratio of enrollments to click-throughs, illustrating that few jobseekers who are interested in WorkBC enroll in the program. Across the four treatment conditions, unadjusted enrollment rates range from 0.7% to 1.9%. We find conversion rates between 5.7% (Active Choice × warm handoff) and 14.9% (Active Choice × cool handoff). The average conversion rate across all conditions is 11.5%. In other words, only about one-in-ten jobseekers who expressed an interest in WorkBC enrolled. We find some evidence of treatment effects. On average, the conversion rate in the cool handoff conditions is 4.8 percentage points higher than in the warm handoff conditions (Supplementary Material III.4). To put these numbers into context, Amazon’s conversion rate (website click-to-buy ratio) on its U.S. platform in 2022 was 12% for sponsored products (Perpetua, 2024). In an RCT on take-up of the Supplemental Nutrition Assistance Program among elderly individuals in Pennsylvania, Finkelstein and Notowidigdo (2019: 1529) find a conversion rate of 45% over a nine-month period. In short, our data indicate that conversion is a challenge to take-up of ALMPs. Getting jobseekers through-the-door may be even more challenging than getting them to-the-door.
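The funnel arithmetic behind these figures is straightforward; a small sketch with hypothetical rates in the range the trial reports (not the study's exact numbers):

```python
# Conversion funnel with illustrative rates (hypothetical, not the
# trial's exact figures).
participants = 2_000                      # hypothetical treatment-group size

opens = participants * 0.48               # opened the email
clicks = participants * 0.13              # clicked the call-to-action button
enrollments = participants * 0.019        # enrolled within 14 days

# Conversion rate: enrollments as a share of click-throughs.
conversion = enrollments / clicks
print(f"opens={opens:.0f} clicks={clicks:.0f} "
      f"enrolled={enrollments:.0f} conversion={conversion:.1%}")
```

With these illustrative rates the conversion works out to roughly 14.6%, i.e., near the top of the 5.7–14.9% range reported above.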

Discussion and conclusion

The negative effects of unemployment are well known (e.g., Cooper, 2013; Kroft et al., 2013; Eriksson, 2014; Schmieder et al., 2014). As part of a growing ALMP expansion across many countries, public employment services can play a crucial role in helping people find stable employment and offset some of the potential costs associated with unemployment. However, take-up remains low.

Working with the Government of British Columbia, Canada, we sought to increase take-up of the province’s employment services program, WorkBC. After conducting background research, we distinguished two steps in the take-up process: getting jobseekers to-the-door (awareness and interest) and through-the-door (enrollment). In a 2 × 2 factorial RCT, we experimentally manipulated the call-to-action (standard call vs Active Choice) and the enrollment process (cool handoff vs warm handoff). Actively reaching out to jobseekers, here EI applicants, and providing concise information increases take-up of WorkBC within 14 days. Our best-performing intervention improved on the baseline control condition, and machine learning suggests treatment effect heterogeneity. Since WorkBC centers are already obligated to contact recently unemployed jobseekers, our best-performing emails offer a cost-effective alternative to help address the low take-up of WorkBC services.

Overall, the treatment effects we observe – an increase in WorkBC enrollment of 1.1 percentage points – are similar to trials run by nudge units in the USA. In a comprehensive meta-analysis, DellaVigna and Linos (2022) find that the average treatment effect of interventions tested by two nudge units in the USA across 126 RCTs (N = 23 million) is 1.4 percentage points. In RCTs focusing specifically on encouraging individuals to enroll in government programs, the average treatment effect is 0.89 percentage points. From this perspective, our results are on the higher end of studies to increase take-up. Yet despite these encouraging results, we find evidence of a more concerning pattern: interest in WorkBC does not easily translate into enrollment. While our observed conversion rates compare favorably to examples from e-commerce, they compare less favorably with prior research in the area of social policy (Finkelstein and Notowidigdo, 2019). Why might that be the case? First, a click-through in our context is likely a stronger expression of interest than a click on a product on an e-commerce site. Moreover, whereas e-commerce customers might be presented with alternatives while reviewing a product, the conversion funnel in our case is narrower – clicking through our emails led participants either to the WorkBC online enrollment form or to a short expression of interest form, which would then notify their local WorkBC center.

Second, research on administrative burdens and take-up suggests that providing assistance to potential clients and designing interventions beyond nudges can facilitate take-up (Bettinger et al., 2012; Herd and Moynihan, 2018; DeLuca et al., 2023; Castell et al., 2024). Assistance can be particularly effective in cases where compliance costs are high (Herd et al., 2023: 21), as is the case here, where enrolling in WorkBC involves navigating a lengthy online enrollment process. Our warm handoff interventions did not increase take-up relative to the control group. One possible explanation is that street-level bureaucrats faced a higher than usual number of interested individuals, which may have prompted greater bureaucratic discretion (Lipsky, 1980; Tummers and Bekkers, 2014). In some cases, this discretion can cause inequities in public program access (Kolstad, 2023; Bell and Meyer, 2024). Exploratory analyses of our RCT data suggest that among the participants in the warm handoff conditions who submitted the expression of interest form (N = 315), those with a college/university degree were 1.8 times more likely to enroll in WorkBC than those without one (Supplementary Material III.5). Results from a previous RCT with EI clients in BC also show that higher levels of education led to higher enrollment in WorkBC among participants who received a warm handoff, even in a context of lower baseline interest and overall enrollment in WorkBC (Hopkins and Dorion, 2024). These findings may be consistent with an argument that staff engage in discretionary behavior to manage workload, potentially resulting in inequity of access. These findings deserve more attention from a research perspective, as they relate to essential debates around equity in public service delivery (e.g., Einstein and Glick, 2017; Olsen et al., 2022; Hamel and Holliday, 2024). For policymakers, the main takeaway remains that converting interested clients presents a major challenge to take-up. As long as interest in a program exceeds the capacity to enroll individuals, even providing additional assistance to help prospective clients navigate the enrollment procedure might not be enough.

In summary, we learned that directly reaching out to eligible individuals, providing them with concise information about the program and implementing a prominent call to action can increase take-up of public employment services. Clearly separating the take-up process into two steps – to-the-door vs through-the-door – and incorporating indicators for both in the research process can help researchers provide clear policy recommendations regarding resource allocation and identify successful behavioral interventions.

Supplementary material

To view supplementary material for this article, please visit https://doi.org/10.1017/bpp.2025.10017.

Acknowledgements

This project was made possible thanks to funding from the Government of Canada and the Province of British Columbia as part of the Community and Employer Partnerships Research and Innovation funding. For comments, feedback and helpful suggestions along the way, we are grateful to Kirstin Appelt and Florian Foos. We also thank Kelly Foley, Eline de Rooij and Daniel Rubenson for being part of the blind jury that reviewed changes made to the pre-analysis plan. We are grateful for the fantastic research assistance we received from Parker Li, Victoria Martinez and Farkhonda Tahery. Lastly, we thank the editor and the anonymous reviewers for their helpful comments and suggestions.

Footnotes

1 For a comparison of ALMP designs between Canada and other OECD countries, see Fredriksson (2021) and Haapanala (2022).

2 The final N exceeds the minimum number of participants, N = 3,500, required to draw meaningful statistical conclusions from the data, based on the power simulations we conducted in advance of the RCT.

3 The benefits and eligibility vary based on income and local employment rates. In 2023, the year of the RCT, the maximum weekly EI benefit was $650 CAD. For successful EI applicants, participation in WorkBC is not mandatory, but it can help them fulfill their obligations to conduct job search activities that enhance their chances of finding employment.

4 In the context of the RCT, email opens are also a measure of treatment compliance.

5 Secondary analysis and study components were approved by ethics review boards at the University of Saskatchewan (ethics certificate numbers starting with Beh-REB) and the University of British Columbia (ethics certificate numbers starting with H23): interviews (Beh-REB 3327); secondary data analysis of RCT data (Beh-REB 4338 and H23-03231); survey (Beh-REB 3590); RCT pre-test (Beh-REB 4224 and H23-02282).

6 Compared to a similar RCT conducted during the early months of COVID-19, which measured WorkBC enrollment after 30 days, take-up in the control group increased by roughly 0.4 percentage points (Hopkins and Dorion, 2024: 12).

7 For alternative model specifications, including unadjusted models and estimates of local average treatment effects, see Supplementary Material III.2.

8 The TRF system is a cooperation between British Columbia’s Ministry of Social Development and Poverty Reduction (SDPR) and the federal government’s Employment and Social Development Canada (ESDC). The program supports EI applicants, both new and returning. WorkBC service providers set targeting criteria in their local catchments in coordination with SDPR (Targeting). When an EI applicant submits their application, they are referred to their local WorkBC center if they match the criteria (Referral). The service providers are required to attempt to contact these applicants and invite them to enroll in WorkBC services. They then send a report back to ESDC to inform them about the outcome of the referral (Feedback). In our trial, 18.6% of participants had been referred by TRF prior to the trial.

9 In general, our email open and click-through rates are higher than the industry average. The average open rate for government emails on the popular email service provider Mailchimp is about 41%, while the average click-through rate for government-sent emails is 4.6% (Mailchimp, 2024). This might indicate higher awareness (email opens) and interest (click-throughs) than for other programs that governments communicate about by email. In turn, this may limit the extent to which our intervention effects travel to other contexts.

10 Without additional information, 12.5% of working-age respondents (18–65) in our survey indicated they would turn to WorkBC if they wanted help finding a job, for example.

11 In the Active Choice conditions, 2.2% clicked on ‘No’. We analyze these responses in Supplementary Material III.7.

References

Bearson, D. F. and Sunstein, C. R. (2023), Take up, Behavioural Public Policy, 1–16. https://doi.org/10.1017/bpp.2023.21
Bell, E. and Jilke, S. (2024), Racial discrimination and administrative burden in access to public services, Scientific Reports, 14(1), 1071. https://doi.org/10.1038/s41598-023-50936-1
Bell, E. and Meyer, K. (2024), Does reducing street-level bureaucrats' workload enhance equity in program access? Evidence from burdensome college financial aid programs, Journal of Public Administration Research and Theory, 34(1), 16–38. https://doi.org/10.1093/jopart/muad018
Bettinger, E. P. et al. (2012), The role of application assistance and information in college decisions: results from the H&R Block FAFSA experiment, The Quarterly Journal of Economics, 127(3), 1205–1242. https://doi.org/10.1093/qje/qjs017
Bhargava, S. and Manoli, D. (2015), Psychological frictions and the incomplete take-up of social benefits: evidence from an IRS field experiment, American Economic Review, 105(11), 3489–3529. https://doi.org/10.1257/aer.20121493
Bonoli, G. and Liechti, F. (2018), Good intentions and Matthew effects: access biases in participation in active labour market policies, Journal of European Public Policy, 25(6), 894–911. https://doi.org/10.1080/13501763.2017.1401105
Caliendo, M. and Schmidl, R. (2016), Youth unemployment and active labor market policies in Europe, IZA Journal of Labor Policy, 5(1), 1–30. https://doi.org/10.1186/s40173-016-0057-x
Card, D., Kluve, J. and Weber, A. (2018), What works? A meta analysis of recent active labor market program evaluations, Journal of the European Economic Association, 16(3), 894–931. https://doi.org/10.1093/jeea/jvx028
Castell, L. et al. (2024), 'Take-up of social benefits: experimental evidence from France'. Available at: https://shs.hal.science/halshs-04720989/file/wp202430_.pdf (Accessed: 18 October 2024).
Cooper, D. H. (2013), 'The effect of unemployment duration on future earnings and other outcomes', Federal Reserve Bank of Boston Research Department Working Papers, No. 13–18. https://doi.org/10.2139/ssrn.2366542
Crépon, B. and van den Berg, G. J. (2016), Active labor market policies, Annual Review of Economics, 8, 521–546. https://doi.org/10.1146/annurev-economics-080614-115738
Currie, J. (2006), The take-up of social benefits, in Auerbach, A. J., Card, D. and Quigley, J. M. (eds), Public Policy and the Income Distribution, New York: Russell Sage Foundation, 80–148.
Daigneault, P.-M. (2023), Evaluation of the non-take-up of public services and social benefits, in Varone, F., Jacob, S. and Bundi, O. (eds), Handbook of Public Policy Evaluation, Cheltenham: Edward Elgar Publishing Limited, 408–424. https://doi.org/10.4337/9781800884892.00036
Darling, M. et al. (2017), 'Using behavioral insights to improve take-up of a reemployment program: trial design and findings'. Mathematica Policy Research. Available at: https://research.upjohn.org/externalpapers/73 (Accessed: 6 September 2024).
DellaVigna, S. and Linos, E. (2022), RCTs to scale: comprehensive evidence from two nudge units, Econometrica, 90(1), 81–116. https://doi.org/10.3982/ECTA18709
DeLuca, S., Katz, L. F. and Oppenheimer, S. C. (2023), "When someone cares about you, it's priceless": reducing administrative burdens and boosting housing search confidence to increase opportunity moves for voucher holders, RSF: The Russell Sage Foundation Journal of the Social Sciences, 9(5), 179–211. https://doi.org/10.7758/RSF.2023.9.5.08
de Ridder, D., Kroese, F. and Van Gestel, L. (2022), Nudgeability: mapping conditions of susceptibility to nudge influence, Perspectives on Psychological Science, 17(2), 346–359. https://doi.org/10.1177/1745691621995183
Dhia and Mbih (2023), Do information frictions affect enrollment in public-sponsored trainings? Results from an online experiment, Annals of Economics and Statistics, 152, 1–42. https://doi.org/10.2307/48754783
Einstein, K. L. and Glick, D. M. (2017), Does race affect access to government services? An experiment exploring street-level bureaucrats and access to public housing, American Journal of Political Science, 61(1), 100–116. https://doi.org/10.1111/ajps.12252
Eriksson, S. (2014), Do employers use unemployment as a sorting criterion when hiring? Evidence from a field experiment, American Economic Review, 104(3), 1014–1039. https://doi.org/10.1257/aer.104.3.1014
Finkelstein, A. and Notowidigdo, M. J. (2019), Take-up and targeting: experimental evidence from SNAP, The Quarterly Journal of Economics, 134(3), 1505–1556. https://doi.org/10.1093/qje/qjz013
Fox, A., Feng, W. and Reynolds, M. (2023), The effect of administrative burden on state safety-net participation: evidence from food assistance, cash assistance, and Medicaid, Public Administration Review, 83, 367–384. https://doi.org/10.1111/puar.13497
Fredriksson, D. (2021), Reducing unemployment? Examining the interplay between active labour market policies, Social Policy & Administration, 55(1), 1–17. https://doi.org/10.1111/spol.12606
Haapanala, H. (2022), Carrots or sticks? A multilevel analysis of active labour market policies and non-standard employment in Europe, Social Policy & Administration, 56(3), 360–377. https://doi.org/10.1111/spol.12770
Hamel, B. T. and Holliday, D. E. (2024), Unequal responsiveness in city service delivery: evidence from 42 million 311 calls, Quarterly Journal of Political Science, 19(3), 243–274. https://doi.org/10.1561/100.00022089
Heckman, J. J. and Smith, J. A. (2004), The determinants of participation in a social program: evidence from a prototypical job training program, Journal of Labor Economics, 22(2), 243–298. https://doi.org/10.1086/381250
Herd, P. et al. (2023), Introduction: administrative burden as a mechanism of inequality in policy implementation, RSF: The Russell Sage Foundation Journal of the Social Sciences, 9(5), 1–30. https://doi.org/10.7758/RSF.2023.9.5.01
Herd, P. and Moynihan, D. P. (2018), Administrative Burden: Policymaking by Other Means, New York: Russell Sage Foundation.
Hopkins, V. and Dorion, J. (2024), Nudging increases take-up of employment services: evidence from a large field experiment, Journal of Policy Analysis and Management, 43, 1209–1228. https://doi.org/10.1002/pam.22617
Janssens, J. and Van Mechelen, N. (2022), To take or not to take? An overview of the factors contributing to the non-take-up of public provisions, European Journal of Social Security, 24(2), 95–116. https://doi.org/10.1177/13882627221106800
Ko, W. and Moffitt, R. A. (2024), Take-up of social benefits, in Zimmermann, K. F. (ed.), Handbook of Labor, Human Resources and Population Economics, Cham: Springer, 1–42. https://link.springer.com/referencework/10.1007/978-3-319-57365-6
Kolstad, K. L. (2023), 'Overburdened bureaucrats: providing equal access to public services during COVID-19'. https://doi.org/10.31219/osf.io/zmt5y
Kroft, K., Lange, F. and Notowidigdo, M. J. (2013), Duration dependence and labor market conditions: evidence from a field experiment, The Quarterly Journal of Economics, 128(3), 1123–1167. https://doi.org/10.1093/qje/qjt015
Lehner, L. and Schwarz, A. (2024), 'Reframing active labor market policy: field experiments on barriers to program participation'. https://doi.org/10.2139/ssrn.5006946
Lin, W. (2013), Agnostic notes on regression adjustments to experimental data: reexamining Freedman's critique, The Annals of Applied Statistics, 7(1), 295–318. https://doi.org/10.1214/12-AOAS583
Lipsky, M. (1980), Street-Level Bureaucracy: Dilemmas of the Individual in Public Services, New York: Russell Sage Foundation.
Mailchimp (2024), Email Marketing Benchmarks and Metrics V3.2 Report. Available at: https://mailchimp.com/resources/email-marketing-benchmarks/success/ (Accessed: 31 August 2024).
Mühlböck, M. et al. (2022), Information, reflection, and successful job search: a labor market policy experiment, Social Policy & Administration, 56(1), 48–72. https://doi.org/10.1111/spol.12754
OECD (2023), "Who does what" for active labour market policies. Available at: https://www.oecd.org/en/publications/who-does-what-for-active-labour-market-policies_d8d6868d-en.html (Accessed: 6 March 2025).
Olsen, A. L., Kyhse-Andersen, J. H. and Moynihan, D. (2022), The unequal distribution of opportunity: a national audit study of bureaucratic discrimination in primary school access, American Journal of Political Science, 66(3), 587–603. https://doi.org/10.1111/ajps.12584
Putnam-Farr, E. and Riis, J. (2016), "Yes/no/not right now": yes/no response formats can increase response rates even in non-forced-choice settings, Journal of Marketing Research, 53(3), 424–432. https://doi.org/10.1509/jmr.14.0227
Sanders, M. and Kirkman, E. (2019), I've booked you a place, good luck: applying behavioral science to improve attendance at high-impact job recruitment events, Journal of Behavioral Public Administration, 2(1), 1–9. https://doi.org/10.30636/jbpa.21.24
Schimpf, Ch. H. et al. (2025), Replication data for 'Behaviourally informed interventions can increase take-up of public employment services, but conversion remains challenging: insights from an RCT in British Columbia, Canada', Harvard Dataverse, V1. https://doi.org/10.7910/DVN/3DIDQP
Schmieder, J. F., von Wachter, T. and Bender, S. (2014), 'The causal effect of unemployment duration on wages: evidence from unemployment insurance extensions', IZA Discussion Paper, No. 8700. https://doi.org/10.2139/ssrn.2543894
Sverdrup, E. et al. (2020), policytree: policy learning via doubly robust empirical welfare maximization over trees, Journal of Open Source Software, 5(50), 2232. https://doi.org/10.21105/joss.02232
Tregebov, S., Seusan, A. and Krieger, M. (2021), 'Applying behavioural insights to career guidance'. Future Skills Centre | Centre des Compétences futures.
Tübbicke, S. and Schiele, M. (2024), On the effects of active labour market policies among individuals reporting to have severe mental health problems, Social Policy & Administration, 58(3), 404–422. https://doi.org/10.1111/spol.12968
Tummers, L. and Bekkers, V. (2014), Policy implementation, street-level bureaucracy, and the importance of discretion, Public Management Review, 16(4), 527–547. https://doi.org/10.1080/14719037.2013.841978
Zhou, Z., Athey, S. and Wager, S. (2023), Offline multi-action policy learning: generalization and optimization, Operations Research, 71(1), 148–183. https://doi.org/10.1287/opre.2022.2271
Figure 1. RCT design.

Figure 2. Estimated average take-up of WorkBC after 14 days across the control group and treatment groups (95% confidence intervals). Note: Estimated marginal means based on a pre-treatment covariate-adjusted linear regression model with the Lin estimator and HC1 robust standard errors (Table 1); * indicates a statistically significant difference in take-up of WorkBC relative to the control group.
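The figure note describes a covariate-adjusted estimate in the style of Lin (2013): regress the outcome on treatment, mean-centered pre-treatment covariates, and their interactions, with heteroskedasticity-robust (HC1) standard errors. As an illustration only, here is a minimal Python sketch of that estimator on simulated data; the variable names (`treated`, `age`, `enrolled`) and effect sizes are invented and do not come from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a toy two-arm trial (NOT the study data): a binary treatment,
# one pre-treatment covariate, and a rare binary outcome ("enrollment").
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "age": rng.normal(40, 10, n),
})
# True enrollment probability: 2% in control, 5% under treatment.
df["enrolled"] = (rng.random(n) < 0.02 + 0.03 * df["treated"]).astype(int)

# Lin (2013) adjustment: center the covariate, then interact it with
# treatment so the coefficient on `treated` is the adjusted average effect.
df["age_c"] = df["age"] - df["age"].mean()
model = smf.ols("enrolled ~ treated * age_c", data=df).fit(cov_type="HC1")

print(model.params["treated"])  # covariate-adjusted treatment effect
print(model.bse["treated"])     # HC1 robust standard error
```

The interaction term lets the covariate slope differ by arm, which is what makes the estimator "agnostic": it cannot hurt asymptotic precision relative to the unadjusted difference in means.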

Table 1. Primary regression results showing the effect of the RCT interventions on take-up of WorkBC

Figure 3. Take-up of WorkBC after 14 days in the RCT, from to-the-door (email open, click-through, short-form submissions) to through-the-door (enrollment). Note: Numbers are unadjusted counts based on RCT data; 95% CIs; percentages are based on the total N of RCT participants within a treatment group.

Supplementary material: File
Schimpf et al. supplementary material (File, 1.1 MB)