Introduction
Take-up of many government programs is low, especially when participation is voluntary (Currie, 2006; Bhargava and Manoli, 2015; Janssens and Van Mechelen, 2022; Daigneault, 2023). Low take-up may undermine the welfare benefits of public investments (Bearson and Sunstein, 2023; Fox et al., 2023; Ko and Moffitt, 2024). Active labor market policies (ALMPs) are a prominent example. ALMPs are widely used to improve the prospects of jobseekers, encompassing services such as job search assistance, training and skills programs, and subsidized jobs (Caliendo and Schmidl, 2016; Bonoli and Liechti, 2018; Card et al., 2018; Tübbicke and Schiele, 2024). In 2021, for example, OECD countries spent an average of 0.6% of their GDP on public employment measures (OECD, 2024). Yet despite this spending, and the many potential benefits for jobseekers, take-up remains as low as or lower than that of many other social programs (Crépon and van den Berg, 2016). In this paper, we ask: Why is take-up of publicly funded employment services low, and what can policymakers do about it?
We report on a pre-registered, large-scale randomized controlled trial (RCT) conducted in partnership with the Government of British Columbia, Canada. We test the effect of behaviorally informed interventions on take-up of public employment services (pre-registered analysis plan on OSF: https://bit.ly/3XGc2Tm). Prior research shows that take-up of the provincial labor market program, WorkBC, is low – in one study, enrollment was below 1% among applicants for unemployment insurance benefits (Hopkins and Dorion, 2024). We build on previous research in the area of public employment service take-up (Heckman and Smith, 2004; Darling et al., 2017; Sanders and Kirkman, 2019; Tregebov et al., 2021; Mühlböck et al., 2022; Dhia and Mbih, 2023; Hopkins and Dorion, 2024; Lehner and Schwarz, 2024). Our study is mixed-methods: prior to the RCT, we conducted interviews and original survey research. Based on this evidence, we expected that take-up of WorkBC is low because citizens are insufficiently motivated to start the application process (due to informational barriers) and highly sensitive to frictions when they are motivated or in the process of applying (delayed action; difficulties navigating enrollment processes).
Crucially, our experimental design distinguishes between two steps in the take-up process: getting jobseekers ‘to-the-door’ (awareness and interest) and ‘through-the-door’ (enrollment). Working with the BC government, we co-designed a 2 × 2 factorial with an additional, pure control group. Everyone in the four intervention groups received an email with the same concise information about WorkBC and a call to action. To get more people to-the-door – i.e., to the WorkBC enrollment website – we varied the call to action. We invited some participants to opt in by clicking on a single button (‘standard call’), while others were invited to explicitly choose between connecting with WorkBC or not (‘Active Choice’). In both cases, making the choice was voluntary. To get interested people through-the-door – i.e., enrolled in WorkBC – we varied the enrollment process. We routed some participants directly to the WorkBC enrollment website (‘cool handoff’) and others to an expression of interest form that first connected them with their local employment center (‘warm handoff’). The warm handoff was designed to support the WorkBC online enrollment process by connecting each jobseeker with an employment counselor.
Our study has three main takeaways. First, we find that the cool handoff increased take-up of WorkBC among the target population by a factor of 2.7, relative to the control group. This finding supports existing research showing that reaching out to jobseekers and providing them with concise information can increase take-up of public employment services (e.g., Lehner and Schwarz, 2024). Using machine learning, we find evidence of treatment effect heterogeneity. Participants aged 45+ and participants under 36 who had not previously been selected for contact through BC’s Targeting, Referral and Feedback system were more responsive to the Active Choice prompt, while the remaining groups were more responsive to the standard call.
Second, and counter-intuitively, we find no evidence that the warm handoff increased take-up relative to the control group. We connected prospective clients with employment counselors to support the task of applying to WorkBC. But rather than enroll clients as quickly as possible, counselors may have used the opportunity to manage their workload – selectively allocating attention toward other duties. Our evidence in this regard is solely suggestive, although this explanation would be consistent with coping models of administrative discretion, where high workload causes staff to ration services (e.g., Lipsky, 1980; Tummers and Bekkers, 2014; Kolstad, 2023; Bell and Jilke, 2024; Bell and Meyer, 2024).
Third, we find that a low percentage of people in the warm handoff conditions who express an interest in WorkBC actually enroll – only 8–16% are enrolled in WorkBC within 14 days. In other words, the main challenge is not getting people interested in WorkBC, it is converting those expressions of interest into enrollment. To increase take-up, policymakers should first understand whether participation is low because people are having trouble getting to-the-door or through-the-door. This distinction can help policymakers allocate marginal dollars. When people are not getting to the door, governments should invest in marketing and communications (e.g., better outreach). When people cannot get through the door, they should invest in user-friendly application procedures (e.g., better online forms).
Methods
Context and background research
Our study took place in Canada, where federal and provincial governments share responsibility for labor market policy, implementation of which is often contracted to third-party service providers. This mix can foster administrative burdens leading to low take-up (Herd et al., 2023: 2). In Canada, ALMPs were traditionally the federal government’s purview until 1996. Since then, Canada’s delivery of public employment services has been primarily decentralized, as is the case for more than half of federal OECD countries (OECD, 2023). To this end, the federal government has signed Labour Market Development Agreements with the provinces and territories, which manage on-the-ground services. In the province of British Columbia, public employment services are managed by the Ministry of Social Development and Poverty Reduction (SDPR). Services include, among others, a job search bank and funding to third-party service providers who run employment offices across 45 geographic ‘catchments’ in the province, each offering a range of services from job search assistance to skills training. Jointly, these voluntary labor market programs are called WorkBC.Footnote 1
In 2022–2023, we conducted background research on WorkBC. This research consisted of three components: (1) a literature review; (2) interviews with jobseekers, WorkBC clients and staff (N = 20); and (3) an online survey of adult British Columbians (N = 1,247). Our research revealed three behavioral barriers to take-up of WorkBC. First, we found low motivation to enroll at the to-the-door stage, largely due to informational barriers. Second, we found that individuals experienced financial difficulties. The resulting stress reduced individuals’ capacity to plan ahead, leading them to delay their enrollment. Third, we found high sensitivity to frictions at the through-the-door stage, largely due to confusion with WorkBC’s online enrollment form. For more details, see Supplementary Material I.
Randomized controlled trial
In partnership with the Government of British Columbia, we fielded an RCT with 9,877 participants between 11 September and 12 October 2023. The design of the RCT is a 2 × 2 factorial with an additional control group. Our government partner sampled 9,877 participants across 44 participating catchments between 1 August and 4 September 2023 from a larger pool of unemployed jobseekers (age 16+) who had applied for job loss benefits, called Employment Insurance (EI).Footnote 2 The EI program in Canada provides temporary benefits to eligible workers who have paid into the program and have lost their jobs through no fault of their own. Funding for the program comes from mandatory employer and employee contributions.Footnote 3 Participants were randomly assigned to one of the five RCT groups.
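The five-arm assignment can be sketched as follows. This is an illustrative reconstruction in Python, not the government partner's actual randomization code; the arm labels and the random seed are ours.

```python
import numpy as np

# Illustrative sketch: randomize N = 9,877 sampled EI applicants into
# the four 2 x 2 factorial arms plus a pure control group.
rng = np.random.default_rng(2023)  # seed is arbitrary, for reproducibility

N = 9_877
arms = [
    "control",
    "standard call x cool handoff",
    "standard call x warm handoff",
    "Active Choice x cool handoff",
    "Active Choice x warm handoff",
]

ids = rng.permutation(N)                 # shuffle participant indices
groups = np.array_split(ids, len(arms))  # near-equal group sizes
assignment = {arm: g for arm, g in zip(arms, groups)}
sizes = {arm: len(g) for arm, g in assignment.items()}
```

With 9,877 participants and five arms, `np.array_split` yields two groups of 1,976 and three of 1,975, so arm sizes differ by at most one.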
Participants in the four treatment groups received an email on 11–12 September 2023. Each email contained the same concise information about WorkBC and a prominent call to action. First, we varied the call-to-action (standard call × Active Choice). The standard call was a single button that read ‘Yes, I want to connect with WorkBC’. The Active Choice call displayed the same button next to another button reading ‘No, I don’t want to connect with WorkBC today’. Our Active Choice intervention follows research on similar interventions in contexts where choices are voluntary (e.g., Putnam-Farr and Riis, 2016). Next, we varied the enrollment process (cool handoff × warm handoff). The cool handoff led those who clicked on the ‘Yes’ button to WorkBC’s standard online enrollment form. The warm handoff led to an online expression of interest form with pre-populated fields for name, email and a short message to the participant’s local WorkBC center. Participants who submitted the short form would eventually have to complete the standard online enrollment procedure on the WorkBC website, but this way the local office could assist where needed. For more details on the intervention design and mechanisms, see Supplementary Material I.
The email with concise information was designed to increase enrollment into WorkBC by reducing learning costs and ambiguity about the program. The Active Choice call-to-action was designed to increase enrollment by prompting individuals to make a decision and get more people to-the-door, while the warm handoff was designed to increase enrollment by helping people navigate the application process. Figure 1 summarizes the design and the measures collected by the government, which correspond to the two parts of take-up. Our primary outcome measure is enrollment into WorkBC, i.e., getting participants through-the-door. Our secondary measures capture interest (click-throughs and expression of interest submissions), i.e., getting to-the-door (Hopkins and Dorion, 2024). To contextualize our results and track the full participant journey, we also measure awareness using email opens. However, this is not an outcome of interest, as we did not expect any differences – email subject lines were identical across all treatment groups.Footnote 4 We estimate the intention-to-treat effect with a pre-treatment covariate-adjusted least squares model, using the Lin (2013) estimator. Based on the results, we estimate marginal means. For more details on the design, copies of the treatments, variable definitions and analysis plan, see Supplementary Material II. Replication materials are available at Harvard Dataverse (Schimpf et al., 2025). The RCT was conducted by our government partner in compliance with all relevant ethical obligations. The project was classified as a quality improvement initiative involving only minimal risk. Therefore, informed consent was neither required nor deemed appropriate. Secondary data analysis and all other components were approved by ethics review boards.Footnote 5
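For readers unfamiliar with the Lin (2013) estimator, a minimal sketch for a single binary treatment and one covariate is below. Our trial has multiple arms, several pre-treatment covariates and HC1 robust standard errors; the function, variable names and toy data here are purely illustrative.

```python
import numpy as np

def lin_estimator(y, d, x):
    """Lin (2013) covariate-adjusted ATE estimate for a binary
    treatment d: OLS of y on d, the mean-centered covariate, and
    their interaction. Returns the coefficient on d."""
    y, d, x = map(np.asarray, (y, d, x))
    xc = x - x.mean()  # center the covariate so the d coefficient is the ATE
    X = np.column_stack([np.ones_like(y), d, xc, d * xc])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]  # coefficient on the treatment indicator

# Tiny noiseless example: the true effect of d is 0.5, which OLS
# recovers exactly because the model is correctly specified.
x = np.array([0., 1., 2., 3., 4., 5., 6., 7.])
d = np.array([0., 0., 0., 0., 1., 1., 1., 1.])
y = 2.0 + 3.0 * x + 0.5 * d
tau = lin_estimator(y, d, x)
```

Centering the covariate and interacting it with treatment is what distinguishes the Lin estimator from a plain covariate-adjusted regression: it remains consistent even when treatment effects vary with the covariate.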

Figure 1. RCT design.
Results
Did our interventions increase take-up of WorkBC?
We begin with our pre-registered, primary expectation (H1) that at least one of our interventions would increase take-up relative to the control group (through-the-door). Figure 2 shows the estimated (i.e., covariate-adjusted) take-up of WorkBC across the control group and the four conditions 14 days after the trial began. Take-up is lowest in the control group (i.e., no communication), at 0.68%.Footnote 6 Take-up is highest in the two cool handoff conditions: about 1.8% of participants in the cool handoff groups, who were directed to enroll online, signed up for WorkBC within 14 days of the start of the trial. This was about 2.7 times higher than in the control group, and the increase is statistically significant. In contrast, the warm handoff groups, where participants filled out an expression of interest form for their local WorkBC office after clicking the email, had lower enrollment rates that were not significantly different from the control group (Table 1). Based on these results, we reject the null hypothesis of our main expectation, as we find that at least one of the interventions increased take-up. Following our pre-registered secondary expectations, we also test whether the Active Choice × warm handoff condition has the largest effect on take-up (H2), whether there is a difference between the warm and cool handoff enrollment processes (H3) and whether there is a difference in take-up between the standard call to action and the Active Choice (H4). The only statistically significant difference is between the warm and cool handoff conditions, confirming what Figure 2 suggests: on average, the cool handoff led to greater enrollment into WorkBC, with an increase of 0.80 percentage points (Supplementary Material III.3, Table A3.4).Footnote 7

Figure 2. Estimated average take-up of WorkBC after 14 days across the control group and treatment groups (95% confidence intervals). Note: Estimated marginal means based on pre-treatment covariate-adjusted linear regression model with Lin estimator and HC1 robust standard errors (Table 1); * indicates statistically significant difference in take-up of WorkBC relative to the control group.
Table 1. Primary regression results showing the effect of the RCT interventions on take-up of WorkBC

Note: Estimates with robust HC1 standard errors in parentheses; Holm adjustment for multiple (N = 9) comparisons.
* p < 0.05.
Did our interventions work differently for different populations?
Next, we explore whether the interventions affected population subgroups differently, as per our pre-registration plan. The data included in our analysis are also typically available to employment centers before applicants enroll, making it possible to tailor outreach materials to individual characteristics. We find evidence of treatment effect heterogeneity by gender, education, age and prior engagement with the EI Targeting, Referral and Feedback (TRF) system.Footnote 8 We use machine learning – specifically, tree-based policy learning – to devise a decision rule that optimizes the assignment of our interventions across our two treatment variables (Sverdrup et al., 2020; Zhou et al., 2023). The results show that participants under 36 with a prior TRF referral are most likely to respond to the standard call × cool handoff, as are participants between the ages of 36 and 45. Participants under 36 without TRF engagement before the trial respond better to the Active Choice × cool handoff, as do participants aged 45 or older. Exploratory analyses suggest that the groups more responsive to the Active Choice conditions have larger hidden pockets of interest in WorkBC, resulting in a higher level of ‘nudgeability’ (de Ridder et al., 2022). The Active Choice conditions successfully convert this nudgeability into higher click-through and enrollment rates (see Supplementary Material III.6, Table A3.11). In summary, machine learning allows us to build optimal, personalized intervention strategies that target those most affected by our treatments, even in a case like ours where, on average, two interventions perform equally well (see Supplementary Material III.6 for full analysis of treatment heterogeneity).
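The assignment rule implied by these results can be summarized as a simple lookup. This sketch paraphrases the text above rather than reproducing the fitted policy tree; the exact age boundary at 45 (which appears in two overlapping bands in the text) is our reading, and the function name is ours.

```python
def recommended_call_to_action(age: int, prior_trf: bool) -> str:
    """Sketch of the decision rule suggested by the policy-learning
    results: which call-to-action (paired with the cool handoff)
    to send a given jobseeker. Age thresholds follow the text;
    the >= 45 cutoff is an assumption where the bands overlap."""
    if age >= 45:
        return "Active Choice"
    if age < 36:
        # Prior TRF referral -> standard call; no prior TRF -> Active Choice
        return "standard call" if prior_trf else "Active Choice"
    # Ages 36-44: respond best to the standard call
    return "standard call"
```

A rule of this shape is what tree-based policy learning produces: a small set of interpretable splits on pre-treatment covariates that an employment center could apply before sending outreach emails.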
Did interest in WorkBC translate into take-up?
In this section, we conduct a series of exploratory analyses to understand whether interest in WorkBC translated into take-up. In Figure 3, we show the absolute and relative number of people in each of the four treatment groups who opened the email, clicked the link, submitted the expression of interest form (warm handoff conditions only) and ultimately enrolled in WorkBC within 14 days. On average, 48% of RCT participants across the four groups opened the email, with no discernible differences between the groups, as expected, given that the email subject lines were identical. Moreover, we find that interest in WorkBC is relatively high across all of the treatment groups. Between 11% (standard call × warm handoff) and 13% (Active Choice × cool handoff) of participants clicked on the hyperlinked button.Footnote 9 The click-through percentages are comparable to interest captured in our pre-trial survey.Footnote 10 We find no evidence that the Active Choice call-to-action increased click-throughs (Supplementary Material III.3, Table A3.5).Footnote 11 In the two warm handoff conditions, however, fewer participants moved on to the expression of interest stage: about 68% of people who clicked on the hyperlink submitted the expression of interest form during the first 14 days. In other words, about one-in-ten RCT participants expressed an interest in WorkBC, yet just one-in-one-hundred enrolled.

Figure 3. Take-up of WorkBC after 14 days in RCT from to-the-door (email open, click-through, short-form submissions) to through-the-door (enrollment). Note: Numbers are unadjusted numbers based on RCT data; 95% CIs; percentages are based on the total N of RCT participants within a treatment group.
Figure 3 also shows a low conversion rate, defined as the share of click-throughs that result in enrollment, illustrating that few jobseekers who are interested in WorkBC enroll in the program. Across the four treatment conditions, unadjusted enrollment rates range from 0.7% to 1.9%. We find conversion rates between 5.7% (Active Choice × warm handoff) and 14.9% (Active Choice × cool handoff). The average conversion rate across all conditions is 11.5%. In other words, only about one-in-ten jobseekers who expressed an interest in WorkBC enrolled. We find some evidence of treatment effects. On average, the conversion rate in the cool handoff conditions is 4.8 percentage points higher than in the warm handoff conditions (Supplementary Material III.4). To put these numbers into context, Amazon’s conversion rate (website click-to-buy ratio) on its U.S. platform in 2022 was 12% for sponsored products (Perpetua, 2024). In an RCT to understand take-up of the Supplemental Nutrition Assistance Program among elderly individuals in Pennsylvania, Finkelstein and Notowidigdo (2019) find a conversion rate of 45% over a nine-month period (Finkelstein and Notowidigdo, 2019: 1529). In short, our data indicate that conversion is a challenge to take-up of ALMPs. Getting jobseekers through-the-door may be even more challenging than getting them to-the-door.
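The funnel arithmetic behind these figures can be reproduced with the approximate rates quoted above. The numbers below are round, illustrative values rather than the exact trial counts.

```python
# Illustrative email-to-enrollment funnel using approximate rates
# reported in the text (not exact trial counts).
participants = 10_000      # round number for illustration
open_rate = 0.48           # ~48% opened the email
click_rate = 0.12          # ~11-13% of participants clicked through
conversion_rate = 0.115    # ~11.5% of clickers enrolled (average)

clicks = participants * click_rate
enrollments = clicks * conversion_rate
enrollment_rate = enrollments / participants  # share of all participants
```

Multiplying a ~12% click rate by a ~11.5% conversion rate yields an overall enrollment rate of roughly 1.4%, consistent with the 0.7–1.9% unadjusted range across conditions. The funnel makes clear why conversion, not initial interest, is the binding constraint.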
Discussion and conclusion
The negative effects of unemployment are well known (e.g., Cooper, 2013; Kroft et al., 2013; Eriksson, 2014; Schmieder et al., 2014). As part of a growing ALMP expansion across many countries, public employment services can play a crucial role in helping people find stable employment and offset some of the potential costs associated with unemployment. However, take-up remains low.
Working with the Government of British Columbia, Canada, we sought to increase take-up of the province’s employment services program, WorkBC. After conducting background research, we distinguished two steps in the take-up process: getting jobseekers to-the-door (awareness and interest) and through-the-door (enrollment). In a 2 × 2 factorial RCT, we experimentally manipulated the call-to-action (standard call vs Active Choice) and the enrollment process (cool handoff vs warm handoff). Actively reaching out to jobseekers, here EI applicants, and providing concise information increases take-up of WorkBC within 14 days. Our best-performing intervention improved on the baseline control condition, and machine learning suggests treatment effect heterogeneity. Since WorkBC centers are already obligated to contact recently unemployed jobseekers, our best-performing emails offer a cost-effective alternative to help address the low take-up of WorkBC services.
Overall, the treatment effects we observe – an increase in WorkBC enrollment of 1.1 percentage points – are similar to trials run by nudge units in the USA. In a comprehensive meta-analysis, DellaVigna and Linos (2022) find that the average treatment effect of interventions tested by two nudge units in the USA in 126 RCTs (N = 23 million) is 1.4 percentage points. In RCTs focusing specifically on encouraging individuals to enroll in government programs, the average treatment effect is 0.89 percentage points. From this perspective, our results are on the higher end of studies to increase take-up. Yet despite these encouraging results, we find evidence of a more concerning pattern: interest in WorkBC does not easily translate into enrollment. While our observed conversion rates compare favorably to examples from e-commerce, they compare less favorably against prior research in the area of social policy (Finkelstein and Notowidigdo, 2019). Why might that be the case? First, a click-through in our context is likely a stronger expression of interest than a click on a product on an e-commerce site. Moreover, whereas e-commerce customers might be presented with alternatives while reviewing a product, the conversion funnel in our case is narrower – clicking through our emails led participants either to the WorkBC online enrollment form or to a short expression of interest form, which would then notify their local WorkBC center.
Second, research on administrative burdens and take-up suggests that providing assistance to potential clients and designing interventions beyond nudges can facilitate take-up (Bettinger et al., 2012; Herd and Moynihan, 2018; DeLuca et al., 2023; Castell et al., 2024). Assistance can be particularly effective in cases where compliance costs are high (Herd et al., 2023: 21), as is the case here where enrolling into WorkBC involves navigating a lengthy online enrollment process. Our warm handoff interventions did not increase take-up relative to the control group. One possible explanation is that street-level bureaucrats faced a higher than usual number of interested individuals, which may have led to greater use of bureaucratic discretion (Lipsky, 1980; Tummers and Bekkers, 2014). In some cases, this discretion can cause inequities in public program access (Kolstad, 2023; Bell and Meyer, 2024). Exploratory analyses of our RCT data suggest that among the participants in the warm handoff conditions who submitted the expression of interest form (N = 315), those with a college/university degree were 1.8 times more likely to enroll in WorkBC than those without a college/university degree (Supplementary Material III.5). Results from a previous RCT with EI clients in BC also show that higher levels of education led to higher enrollment in WorkBC among participants who received a warm handoff, even in a context of lower levels of baseline interest and overall enrollment in WorkBC (Hopkins and Dorion, 2024). These findings may be consistent with an argument that staff engage in discretionary behavior to manage workload, potentially resulting in inequity of access.
These findings certainly deserve more attention from a research perspective as they relate to essential debates around equity in public service delivery (e.g., Einstein and Glick, 2017; Olsen et al., 2022; Hamel and Holliday, 2024). For policymakers, the main takeaway remains that conversion of interested clients presents a major challenge to take-up. As long as interest in a program is higher than the capacity to enroll individuals, even providing additional assistance to help prospective clients navigate the enrollment procedure might not be enough.
In summary, we learned that directly reaching out to eligible individuals, providing them with concise information about the program and implementing a prominent call to action can increase take-up of public employment services. Clearly separating the take-up process into two steps – to-the-door vs through-the-door – and incorporating indicators for both in the research process can help researchers provide clear policy recommendations regarding resource allocation and identify successful behavioral interventions.
Supplementary material
To view supplementary material for this article, please visit https://doi.org/10.1017/bpp.2025.10017.
Acknowledgements
This project was made possible thanks to funding from the Government of Canada and the Province of British Columbia as part of the Community and Employer Partnerships Research and Innovation funding. For comments, feedback and helpful suggestions along the way, we are grateful to Kirstin Appelt and Florian Foos. We also thank Kelly Foley, Eline de Rooij and Daniel Rubenson for being part of the blind jury that reviewed changes made to the pre-analysis plan. We are grateful for the fantastic research assistance we received from Parker Li, Victoria Martinez and Farkhonda Tahery. Lastly, we thank the editor and the anonymous reviewers for their helpful comments and suggestions.