
Appendix B - Supplementary Interview Methods

Published online by Cambridge University Press

Christoph Knill
Affiliation:
Ludwig-Maximilians-Universität München
Yves Steinebach
Affiliation:
University of Oslo
Dionys Zink
Affiliation:
Ludwig-Maximilians-Universität München

Information

Type: Chapter
In: Triage Bureaucracy: The Organizational Challenge of Implementing Growing Policy Stocks, pp. 303–310
Publisher: Cambridge University Press
Print publication year: 2026

Creative Commons
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC-BY-NC 4.0 https://creativecommons.org/cclicenses/

Appendix B Supplementary Interview Methods

Our study is based on 157 anonymous, semi-structured interviews conducted with members of environmental and social agencies across six countries: Denmark (24), Germany (38), Ireland (28), Italy (40), Portugal (15), and the UK (12). Appendix C provides a detailed breakdown of interviews by country, sector, and organization. This appendix offers additional insights into our data collection process, including our sampling approach, interview methodology, and the coding procedure used to analyze responses (following Bleich and Pekkanen, 2013).

To ensure ethical research standards, we obtained informed and voluntary consent from all interviewees. Their statements have been fully anonymized in accordance with data protection guidelines, as detailed in the following sections. The entire interview process was reviewed and approved by a relevant ethical board, ensuring compliance with best practices in research ethics.

A.1 Sampling Process

Our sampling strategy was purposive and guided by theoretical considerations. In each country, we targeted members of central environmental and social agencies involved in policy implementation. The sample frame included individuals whose contact details were publicly available online.

To recruit interviewees, we initially reached out via email. If no response was received, we followed up with a friendly reminder two weeks later or attempted contact by phone. The interviews were conducted remotely via phone or Zoom between April 21, 2021, and August 1, 2023. The duration of the interviews varied between 25 and 125 minutes, with an average length of approximately 50 minutes.

We conducted anonymous interviews for both research and ethical reasons. First, discussing triage and prioritization practices in the face of high implementation burdens is a sensitive topic for implementers. These practices often run counter to their professional ethos and may imply that, to some extent, they have “failed” to fully implement policies as intended. Ensuring anonymity increased the likelihood that interviewees would speak candidly about the difficult decisions they face.

Second, we anticipated that political principals could potentially use interview findings to target and blame specific individuals who admitted to being unable to meet the full expectations of their roles. This concern was validated during the interview process, as several participants sought additional reassurances that their statements would remain anonymous before proceeding. To further protect interviewees, we opted against providing full transcripts, as these could facilitate their identification.

We contacted potential interviewees via email, inviting them to participate in a study on “Policy Implementation in the European Union.” By framing the study in broad and neutral terms, we aimed to minimize self-selection bias. Had we explicitly stated that our research focused on the impact of increased workloads on policy implementation, it is likely that respondents who were particularly affected by this issue would have been more inclined to participate, skewing our sample.

At the same time, we ensured that our approach remained ethically sound, as our description was not misleading. The relationship between workload and implementation effectiveness is naturally encompassed within the broader theme of policy implementation in the European Union (EU), aligning with how respondents themselves might perceive their roles and challenges. This strategy allowed us to maintain methodological rigor while respecting ethical considerations in participant recruitment.

A.2 Interview Strategy

We conducted the interviews via telephone or video conference and recorded them, with the recordings securely stored on a protected university server. To ensure that interviewees felt comfortable and willing to share candid insights, we assured them of their anonymity before the interview. This approach was intended to foster openness, particularly when discussing potentially sensitive topics.

At the start of each interview, we introduced the study in broad terms to avoid priming respondents or steering their responses in a particular direction. Our initial questions were deliberately open-ended, allowing interviewees to describe their working conditions and challenges from their own perspective. More sensitive topics – such as coping mechanisms employed to manage excessive workloads – were introduced later in the conversation, after interviewees had become more comfortable. This sequencing helped build rapport and encouraged more forthright discussions on difficult aspects of their roles.

The open-ended questions we asked included the following:

  • What are your specific responsibilities within the organization?

  • What is your professional background? Is that a common background for your organization?

  • How long have you been working in the organization?

  • Did or do you face increases in your workload?

  • Where is this workload coming from?

  • When did you face the steepest increases of workload?

  • Can you give concrete examples of such increases?

  • How do you (try to) cope with them?

  • Do you think that these coping practices influence (your agency’s) goal attainment?

A.3 Coding Procedure and Coding Results

Seven researchers independently coded the interviews to ensure reliability and consistency. In cases where notable discrepancies emerged in the coding results, the researchers revisited the relevant interview sections to reassess their interpretations. If disagreements persisted, they engaged in discussions to reach a consensus on the most appropriate coding decisions.
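The chapter does not describe the tooling used to detect such discrepancies, so the following Python snippet is only a minimal sketch, assuming each researcher's judgements are stored as a mapping from interview and code name to the assigned value; all names in it are hypothetical, not the team's actual software.

from collections import defaultdict

def find_discrepancies(codings):
    """codings maps coder name -> {(interview_id, code_name): value}.
    Returns the (interview_id, code_name) pairs on which coders disagree,
    so those interview sections can be revisited and discussed."""
    votes = defaultdict(set)
    for judgements in codings.values():
        for key, value in judgements.items():
            votes[key].add(value)
    return sorted(key for key, values in votes.items() if len(values) > 1)

# Hypothetical example: two coders disagree on one code for interview 12.
codings = {
    "coder_1": {(12, "high_workload"): "yes", (12, "workload_increasing"): "yes"},
    "coder_2": {(12, "high_workload"): "yes", (12, "workload_increasing"): "no"},
}
print(find_discrepancies(codings))  # [(12, 'workload_increasing')]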

The interviews were coded for the following key aspects:

  • Whether interviewees experience a high workload (yes/no)

  • Whether workload has been increasing over time (yes/no)

  • Whether they can identify periods during which the workload increased considerably (open)

  • Why they think workload has increased (open)

  • How they try to cope with the increased workload (open)

  • Whether they prioritize tasks (yes/no; open)

  • Which logic they use to prioritize different tasks

  • Whether increased workload affects policy implementation and, by extension, goal attainment (yes/no; open)

  • Which factors helped them to effectively implement a policy (open)

  • Additional comments that illuminate the research situation
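For illustration only, the coding scheme above could be represented as one record per interview, as in the Python sketch below; the field names are hypothetical and do not reflect the custom software solution the research team actually used (see Table A1, item 27).

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class InterviewCoding:
    interview_id: int
    high_workload: Optional[bool] = None          # high workload experienced (yes/no)
    workload_increasing: Optional[bool] = None    # workload increasing over time (yes/no)
    increase_periods: List[str] = field(default_factory=list)    # periods of steep increases (open)
    increase_reasons: List[str] = field(default_factory=list)    # perceived causes (open)
    coping_strategies: List[str] = field(default_factory=list)   # coping practices (open)
    prioritizes_tasks: Optional[bool] = None                     # prioritization (yes/no; open)
    prioritization_logic: List[str] = field(default_factory=list)  # logic used to prioritize tasks
    affects_goal_attainment: Optional[bool] = None               # effect on implementation (yes/no; open)
    implementation_facilitators: List[str] = field(default_factory=list)  # factors aiding implementation (open)
    additional_comments: str = ""                                # remarks on the research situation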

Table A1 reports additional interview details using the COREQ criteria.

Table A1 Consolidated criteria for reporting qualitative studies (COREQ)[1]. Each entry gives the item number, the item, the COREQ guide question, and the corresponding information for this study.

Domain 1: Research team and reflexivity.

Personal characteristics:

1. Interviewer or facilitator. Which author(s) conducted the interview or focus group? All of them.

2. Credentials. What were the researchers' credentials (e.g., PhD, MD)? 6 doctoral students, 4 researchers with a PhD.

3. Occupation. What was their occupation at the time of the study? Doctoral student, post-doc, university professor.

4. Gender. Was the researcher male or female? Male or female.

5. Experience and training. What experience or training did the researcher have? All interviewers received prior interview training or had prior experience with interviews.

Relationship with participants:

6. Relationship established. Was a relationship established prior to study commencement? No.

7. Participant knowledge of the interviewer. What did the participants know about the researcher (e.g., personal goals, reasons for doing the research)? Participants received basic personal information from the interviewer at the start of the interview.

8. Interviewer characteristics. What characteristics were reported about the interviewer or facilitator (e.g., bias, assumptions, reasons and interests in the research topic)? Professional background, reasons and interest in the research topic, funding information.

Domain 2: Study design.

Theoretical framework:

9. Methodological orientation and theory. What methodological orientation was stated to underpin the study (e.g., grounded theory, discourse analysis, ethnography, phenomenology, content analysis)? Content analysis of interview statements.

Participant selection:

10. Sampling. How were participants selected (e.g., purposive, convenience, consecutive, snowball)? Purposive; potential interviewees were identified using the websites of pertinent environmental agencies or entities.

11. Method of approach. How were participants approached (e.g., face-to-face, telephone, mail, email)? Participants were contacted via email.

12. Sample size. How many participants were in the study? 157.

13. Non-participation. How many people refused to participate or dropped out? Reasons? 600; no explicit reasons were provided; potential interviewees simply did not respond to the email.

Setting:

14. Setting of data collection. Where was the data collected (e.g., home, clinic, workplace)? Via Zoom and telephone.

15. Presence of non-participants. Was anyone else present besides the participants and researchers? No.

16. Description of sample. What are the important characteristics of the sample (e.g., demographic data, date)? The interviewees from the six countries were responsible for a large variety of implementation-related activities within the areas of social and environmental policy. Interviews were conducted between April 21, 2021 and August 1, 2023.

Data collection:

17. Interview guide. Were questions, prompts, or guides provided by the authors? Was it pilot tested? Interviewees were asked the questions reported in Section A.2 above. Several pilot interviews, in different countries, were conducted.

18. Repeat interviews. Were repeat interviews carried out? If yes, how many? No.

19. Audio or visual recording. Did the research use audio or visual recording to collect the data? Audio recording was used.

20. Field notes. Were field notes made during and/or after the interview or focus group? N/A.

21. Duration. What was the duration of the interviews or focus group? 25 to 125 minutes.

22. Data saturation. Was data saturation discussed? Yes. Saturation was considered very high for all questions asked.

23. Transcripts returned. Were transcripts returned to participants for comment and/or correction? No.

Domain 3: Analysis and findings.

Data analysis:

24. Number of data coders. How many data coders coded the data? 7.

25. Description of the coding tree. Did authors provide a description of the coding tree? Yes.

26. Derivation of themes. Were themes identified in advance or derived from the data? In advance; the two themes that structure Table B1 above.

27. Software. What software, if applicable, was used to manage the data? A custom software solution developed by the research team.

28. Participant checking. Did participants provide feedback on the findings? No.

Reporting:

29. Quotations presented. Were participant quotations presented to illustrate the themes or findings? Was each quotation identified (e.g., by participant number)? Yes.

30. Data and findings consistent. Was there consistency between the data presented and the findings? Yes.

31. Clarity of major themes. Were major themes clearly presented in the findings? Yes.

32. Clarity of minor themes. Is there a description of diverse cases or discussion of minor themes? No.

Footnotes

[1] Tong, Allison, Peter Sainsbury, and Jonathan Craig. 2007. “Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups.” International Journal for Quality in Health Care 19(6), 349–357.
