
Reconciling data-enabled design and clinical trials: conceptual phases for eHealth development

Published online by Cambridge University Press:  27 August 2025

Hosana Cristina Morales Ornelas*
Affiliation:
Delft University of Technology, The Netherlands Tecnológico de Monterrey, Mexico
Maaike Kleinsmann
Affiliation:
Delft University of Technology, The Netherlands
Gerd Kortuem
Affiliation:
Delft University of Technology, The Netherlands
Arend W. van Deutekom
Affiliation:
Erasmus Medical Center - Sophia Children’s Hospital, The Netherlands

Abstract:

eHealth systems, such as digital care applications or remote monitoring devices, can improve health outcomes using user-centered design principles to create medical devices that adapt to users’ needs and contexts. Data-enabled design (DED) builds on these principles by leveraging user-generated data to iteratively refine systems based on real-world use, enabling adaptive and context-sensitive solutions. However, its exploratory and iterative nature conflicts with the rigid protocols required in clinical trials to evaluate safety and effectiveness. This study revises DED in alignment with clinical trial requirements, identifying four key challenges and proposing a four-phase Clinical Data-Enabled Design (C-DED) framework. This framework reconciles exploratory design with trial methodological demands, supporting the development of safe, effective, and user-centered eHealth medical devices.

Information

Type
Article
Creative Commons
CC BY-NC-ND 4.0
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Copyright
© The Author(s) 2025

1. Introduction

eHealth applies information and communication technologies to develop systems, such as digital care applications or remote monitoring devices (Silber, 2003). eHealth systems offer opportunities to create user-centered medical devices and services that improve health by engaging users more effectively. Data-enabled design (DED) is an exploratory approach that leverages user-generated data to develop smart systems (van Kollenburg & Bogers, 2019). Creating systems through DED enables continuous adaptation by digitalising the user’s context, behaviour, and experience. This process refines the data a system should monitor, allowing it to automate appropriate functions while iteratively aligning with user needs (Lovei, 2024). DED has shown potential in paediatric remote care, offering actionable health data and community insights (Jung, 2023; van Kollenburg & Bogers, 2019). It has also been used in bariatric post-surgery care to conceptualise user-centered coaching interventions and observe their real-world effects, such as improved patient engagement and health outcomes (Versteegden et al., 2022).

Despite its potential, DED is contested in regulated environments, such as the clinical trials required to establish eHealth systems’ safety and effectiveness (ISO, 2020). In eHealth, clinical trials systematically evaluate interventions, such as devices or software, by collecting and analysing standardised data on their effects (ISO, 2020). ISO 14155 (2020) provides international guidelines for devising clinical trials for eHealth medical devices (eMDs), emphasising predefined hypotheses and data collection (ISO, 2020). However, trials, including randomised controlled trials (RCTs), often conflict with the adaptability required to create context-sensitive eMDs (Ammenwerth & Rigby, 2016). RCTs rely on static study designs, generalisable data collection, and predefined outcomes, contrasting with DED’s iteration based on context-specific insights (van Kollenburg & Bogers, 2019). Nevertheless, trials and exploratory approaches are not inherently incompatible (ISO, 2020). Careful analysis of their methodological assumptions and their application can clarify tensions and inform their integration.

The methodological tension between trials and DED can be understood through two fundamental design theoretical paradigms: Simon’s (1988) problem-solving and Schön’s (1992) reflective practice (Morales Ornelas et al., 2023). Simon’s paradigm emphasises systematic, rational, goal-oriented decision-making, central to clinical trials’ static, predefined nature. In contrast, Schön’s paradigm focuses on iterative, reflective exploration and generating context-sensitive insights by engaging with the environment–central to DED’s process. This framing highlights the conflict between clinical trials’ confirmatory approach and DED’s exploratory nature. Researchers have proposed integrating community-generated data to enhance DED’s generalisability (Jung, 2023) and involving healthcare professionals (HCPs) earlier to align study objectives with clinical practices (Lovei, 2024; Noortman et al., 2022). However, these efforts do not fully resolve the methodological conflict. This unresolved tension limits DED’s potential to create safe, effective, and user-centered eMDs that meet clinical methodological requirements and underscores the need for further adaptation of DED to clinical trials.

Therefore, this research revises DED using the ISO 14155 (2020) standard focused on clinical trials of eMDs and identifies adaptations needed to align DED with clinical development and evaluation processes. Our research question is: What adaptations are necessary to reconcile data-enabled design’s exploratory approach with the methodological demands of clinical trials for developing safe and effective eHealth medical devices? We employ Jaakkola’s (2020) theory adaptation approach to examine tensions between DED and clinical trials. Through this approach, we identify four challenges and propose a revised conceptual framework reconciling DED with eMD clinical trials. Our revised Clinical Data-Enabled Design (C-DED) framework can guide designers and HCPs through four phases to devise evaluative DED studies aligned with clinical trial methodological requirements.

2. Methodology

This conceptual study revises and adapts DED to meet the methodological demands of clinical trials for developing safe and effective eMDs. Following ISO 14155 (2020), we define eMDs as apparatus, machines, and/or software designed for prevention, diagnosis, monitoring, treatment, or disease alleviation. We follow Jaakkola’s (2020) theory adaptation approach to achieve our goal. It provides guidance for modifying the scope or perspective of a ‘domain’ theory by using an alternative established frame of reference or ‘method’ theory to inform adaptations (Jaakkola, 2020). This approach fits our goal by offering a systematic process to reconcile conceptual inconsistencies between theories.

Jaakkola’s (2020) approach consists of three steps. Step one (i.e., Section 3) describes the domain theory–that is, DED–and articulates its core assumptions. Step two (i.e., Section 4) problematises these assumptions, in our case, using ISO 14155 (2020) as an established reference to identify challenges when applying DED in clinical trials. Step three (i.e., Section 5) addresses these challenges by adapting DED with the following method theories: the ISO 14155 (2020) standard, the SPIRIT guidelines (Calvert et al., 2021; Rivera et al., 2020), and health and design shared evidence practices and data strategies (Morales Ornelas et al., 2023; Pannunzio et al., 2024). These already address key concerns related to clinical trials and user-centered design in eHealth development. We describe our adaptation process and the method theories in Section 5, and in Section 6, we present the revised clinical data-enabled design (C-DED) conceptual framework.

3. Step one: describing data-enabled design (DED)

As a first step in Jaakkola’s (2020) process, we introduce our domain theory: data-enabled design (DED). We describe DED procedurally, outlining its phases, goals, activities, and core assumptions. DED is an iterative approach that connects two environments: the user’s everyday life–i.e., the top loop in Figure 1–and the design studio–i.e., the bottom loop in Figure 1 (van Kollenburg & Bogers, 2019). The process begins by situating a prototype in the user’s everyday life to collect and transmit data to the design studio, providing real-time insights into the user’s context, behaviour, and experience. In design synthesis, these insights lead to design opportunities that prompt designerly explorations with data as creative material to envision (remote) prototype adjustments. The continuous adjustments create data-oriented systems in the everyday loop that learn from and adapt to their users and facilitate continuous data-enabled communication between both environments. Overall, DED follows two design exploration phases: a research-oriented contextual exploration (see Figure 1–blue) and a design-oriented informed exploration (see Figure 1–orange) (Funk et al., 2024; van Kollenburg & Bogers, 2019).

Figure 1. DED activities per research (blue) and design (orange) phases, based on Funk et al. (2024)

3.1. DED phases and activities

The process begins with a research-oriented contextual exploration phase to understand contextual, behavioural, and experiential problems and opportunities in the user’s everyday life (Funk et al., 2024). This phase collects research data–i.e., data integrated into the system to answer (design) research questions (van Kollenburg & Bogers, 2019). Funk et al. (2024) list three activities (see Figure 1–blue arrows). In probe design and development, the team examines the context and selects data collection mechanisms that balance insight depth with participant acceptability. These can be personal (e.g., smartwatch), contextual (e.g., motion sensor in an object), and open (e.g., event button) data trackers and digital interactive interfaces (e.g., chatbot). During probe deployment, the team integrates probes into existing infrastructure, generates test data, and negotiates data collection concerns with participants. In data visualisation and data-enabled interviews, they create visualisations and use them in participant interviews to explore the collected data and annotate insightful events. This phase’s output includes a data strategy–i.e., what data should (not) be collected–user insights, and contextual assumptions.
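As an illustration, the probe types and the data strategy output of this phase can be sketched as a small data model. This is a minimal sketch for clarity only: the probe names, data points, and the `DataStrategy` structure are hypothetical assumptions, not drawn from the cited studies.

```python
from dataclasses import dataclass, field
from enum import Enum

class TrackerKind(Enum):
    PERSONAL = "personal"      # e.g., a smartwatch worn by the user
    CONTEXTUAL = "contextual"  # e.g., a motion sensor embedded in an object
    OPEN = "open"              # e.g., an event button the user presses freely

@dataclass
class Probe:
    name: str
    kind: TrackerKind
    data_points: list[str]

@dataclass
class DataStrategy:
    """The phase output: what data should (not) be collected."""
    collect: list[Probe] = field(default_factory=list)
    exclude: list[str] = field(default_factory=list)  # ruled out with participants

# Hypothetical strategy negotiated during probe deployment
strategy = DataStrategy(
    collect=[
        Probe("smartwatch", TrackerKind.PERSONAL, ["active_minutes", "heart_rate"]),
        Probe("couch_sensor", TrackerKind.CONTEXTUAL, ["sitting_events"]),
    ],
    exclude=["location"],  # e.g., participants declined continuous location tracking
)
```

Such a structure makes the negotiated collection/exclusion decisions explicit and portable into later phases.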

Subsequently, a design-oriented informed exploration phase applies the identified insights to the iterative design of an intervention (Funk et al., 2024). Designers explore if and how research data can become solution data–i.e., data that triggers system interactions or interacts directly with users (van Kollenburg & Bogers, 2019). Funk et al. (2024) list four activities (see Figure 1–orange arrows). In data plumbing, scaling first insights, the team synthesises research data into system functions and solution data by defining and automating data points while reducing data collection and maintaining quality. In probe adaptation and integration, the team implements the automation and system functions. In designing and validating data-driven interventions, they qualitatively and quantitatively evaluate hypotheses with a new participant group. Research data increases in quantity and depth, while solution data is tailored to intervention functionality. In designing intelligent ecosystems, the team gathers insights about the intervention’s contextual fit, new use patterns, and its ability to create behaviour change that improves the current situation. Qualitative insights support contextual sense-making, and quantitative insights bring a comparative perspective against a baseline or context. These insights are the phase’s output and can trigger additional design- and research-oriented phases in the same DED study (Funk et al., 2024).
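The data plumbing step–synthesising research data into solution data–can be illustrated with a minimal sketch: a raw research data point (daily active minutes) is reduced to a data point the system can act on directly (a coaching prompt). The activity goal, prompt labels, and function name are hypothetical assumptions for illustration, not the authors’ method.

```python
def to_solution_data(active_minutes_by_day: dict[str, int],
                     goal: int = 30) -> dict[str, str]:
    """Reduce raw research data (daily active minutes) to solution data:
    a per-day coaching prompt that can trigger a system interaction."""
    prompts = {}
    for day, minutes in active_minutes_by_day.items():
        if minutes >= goal:
            prompts[day] = "praise"        # goal reached: reinforce behaviour
        elif minutes >= goal // 2:
            prompts[day] = "encourage"     # partway there: nudge toward the goal
        else:
            prompts[day] = "check_in"      # far off: open a dialogue instead
    return prompts

research_data = {"mon": 45, "tue": 20, "wed": 5}
print(to_solution_data(research_data))
# {'mon': 'praise', 'tue': 'encourage', 'wed': 'check_in'}
```

The point of the sketch is the reduction itself: less data is collected and stored, yet the retained data point is directly actionable by the prototype.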

3.2. Articulating DED core methodological assumptions

To conclude, using Schön’s (1992) and Simon’s (1988) paradigms of design activity as lenses, we articulate the following core methodological assumptions underlying both DED phases.

Assumption one: DED builds situated understanding through iteration. Situated understanding refers to how users and their context shape knowledge generation and influence design outcomes. For example, sensors placed in objects based on user routines track specific behaviours that inform particular solution spaces. Iterative learning enables the team to gradually build an understanding of the user (context) in both phases: continuous user data and repeated engagement (e.g., using chatbots) support system refinement through ongoing feedback loops. This assumption echoes Schön’s notion of design as a reflective interaction with the environment, where each iteration provides insights that refine the understanding of the problem and the solution.

Assumption two: DED’s evaluative and explorative data roles interplay during the process. An evaluative role of data indicates that data values inform what happens in the user’s everyday life based on a predefined question. This role relates to research data, where data provides an understanding of the users’ context, behaviour, or experience. For instance, active minutes in a smartwatch can indicate users’ activity levels. Conversely, an explorative role of data indicates that the meaning of (research) data is reframed. This role begins when users reinterpret data visualisations in interviews, shaping how designers understand the (research) data collected. For example, morning TV usage may indicate TV habits (evaluative role) or lack of morning exercise if a user expresses it (explorative role). Both roles interplay as the phases progress by supporting designers in ideating and implementing system functions (solution data) and discovering the results of their implementation (research data). In this assumption, both paradigms are present. Simon’s clear goal definition underlies an evaluative role where research questions and data collection are predefined for knowledge generation. Meanwhile, Schön’s paradigm substantiates the explorative role during the context-sensitive re-interpretation of data with users.
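The interplay of the two data roles can also be sketched in code: the same data point carries a predefined (evaluative) value and accumulates (explorative) reinterpretations from data-enabled interviews. The class and the TV example values below are illustrative assumptions, not an implementation from the cited work.

```python
from dataclasses import dataclass, field

@dataclass
class DataPoint:
    """One collected observation and the two roles it can play."""
    name: str
    value: object                                          # evaluative role: predefined meaning
    annotations: list[str] = field(default_factory=list)   # explorative role: reframed meanings

    def reinterpret(self, insight: str) -> None:
        """Record a user's reframing from a data-enabled interview."""
        self.annotations.append(insight)

# Evaluative role: TV-on events answer a predefined question about TV habits.
tv = DataPoint("morning_tv_usage", value=5)  # five mornings with the TV on
# Explorative role: in an interview, the user reframes the same data.
tv.reinterpret("TV replaces my morning exercise routine")
```

The same stored value thus serves the predefined research question while remaining open to context-sensitive reinterpretation.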

4. Step two: problematisation of DED when applied in clinical trials

Following Jaakkola’s (2020) second step, we critically examine the limitations of DED when applied in trials for eMD development. This problematisation is based on a recent unrealised DED study in which the authors were denied medical ethical approval to conduct an eMD clinical trial (Noortman et al., 2022). We use ISO 14155 (2020), focused on trials of eMDs, as an established view to examine and challenge the DED assumptions formulated in Section 3.2. While ISO 14155 is not a legally binding regulation, it is an internationally recognised standard for Good Clinical Practice in eMD trials (ISO, 2020). Its adoption depends on national regulation, but it provides a widely accepted guideline for the design, conduct, and reporting of trials in compliance with scientific and ethical principles (ISO, 2020).

In eHealth development, clinical trials evaluate the safety and effectiveness of eMD interventions by systematically accumulating evidence on health outcomes before and after market approval (ISO, 2020). To ensure participant safety and ethical compliance, the development team must submit a clinical trial protocol to a Medical Ethical Committee (METC) before the trial (ISO, 2020). This protocol should elaborate on the research methodology and the intervention to allow the assessment of potential risks and benefits for participants (Rivera et al., 2020). However, in this regulatory context, a METC rejected a DED study protocol. Noortman et al. (2022) proposed an evaluative DED study to develop a smart coaching system to support bariatric patients and their families with lifestyle behaviour change after surgery. Despite collaboration with HCPs in designing the system and protocol, the METC rejected the study, citing concerns over DED’s clinical relevance, exploratory nature, and high participant burden (Noortman et al., 2022). While METC decisions vary across institutions and regulatory contexts, this rejection highlights a broader challenge: reconciling DED’s exploratory nature with clinical trials’ need for predefined protocols and measurable clinical relevance. To better understand this methodological challenge, we examine how ISO 14155 (2020) describes clinical relevance and the nature of clinical trials, and identify how this standard challenges DED’s core assumptions.

ISO 14155 (2020) describes clinical relevance as a trial’s ability to produce reliable data on an eMD’s clinical safety, performance, or effectiveness for its intended patient population. A scientifically and ethically sound protocol must anticipate patient benefits and risks and align objectives and hypotheses with data collection and analysis methods (ISO, 2020). Prior scientific studies and non-clinical evaluations should justify the eMD and study design to ensure clinically meaningful insights (ISO, 2020). ISO 14155 (2020) further describes trials’ varying nature and characteristics across development phases. Before market approval, a pilot phase may start as exploratory and progress to a confirmatory nature. A pivotal phase is solely confirmatory, while in post-market, the nature can be confirmatory or observational (ISO, 2020). In exploratory trials, eMDs are introduced to generate hypotheses, confirmatory trials test these hypotheses, and observational trials draw inferences without intervening in the context (ISO, 2020). To ensure consistency and scientific rigour, ISO 14155 (2020) requires trial protocols to outline systematic procedures and predefined measures. This requirement means that procedures, prototype modifications, and data collection points must be defined before the trial begins.

4.1. Identifying challenges with DED assumptions

Based on the ISO 14155 (2020) descriptions above, we identify four challenges related to the DED assumptions articulated in Section 3.2. We outline two challenges for assumption one. Challenge one: DED’s situated understanding does not incorporate the scientific grounding required in trials, which relies on generalisable insights from prior studies to justify the eMD’s design and clinical trial protocol. Challenge two: DED cannot rely solely on its exploratory nature and situated iterative approach but must also integrate a confirmatory trial nature into its development process to comply with regulations for market approval. For assumption two, we outline two additional challenges. Challenge three: DED’s multiple inquiry focus on users’ context, behaviour, and experience does not include the clinical trial’s need for rigorous data on clinical safety, performance, and effectiveness. Challenge four: DED’s evaluative and explorative data roles cannot interplay during trials, as clinical trials require predefined evaluation measures and static data meanings throughout the study.

5. Step three: adaptation of DED with method theories

As part of Jaakkola’s (2020) third methodology step, we adapt DED to address the four challenges in Section 4.1, revising key characteristics to reconcile DED with trial methodological demands. We first addressed challenges two and three, as they form the foundation for integrating DED into trials, followed by challenges one and four. Below, we outline our adaptation process and the method theories used.

5.1. Addressing challenges two and three

Our adaptation process began by integrating a confirmatory nature and a clinical inquiry focus into the original DED phases to address challenges two and three. We used ISO 14155 (2020) pilot and pivotal trial phases, given their transition guidance from exploratory to confirmatory eMD trials and their distribution of clinical objects of inquiry across phases. A pilot phase evaluates an eMD’s limitations and advantages with a target population sub-group. Outputs guide eMD modifications and inform future study parameters for the pivotal phase. Examples include ‘proof of concept trials’ for initial safety and performance assessment and ‘traditional feasibility trials’ for preliminary clinical performance, safety, and effectiveness evaluation. The pivotal phase confirms an eMD’s clinical performance, effectiveness, and safety with the target population for market approval. Pivotal trials are strictly confirmatory and typically use statistically justified population groups. Due to its non-interventional purpose, we left the initial DED phase, ‘research-oriented contextual exploration’, as the first phase. We added a health-related inquiry to it as a research goal to inform subsequent clinical inquiries required by ISO while keeping its original study population and outputs. Then, we extended the goal, study population, and output of the DED second phase, ‘design-oriented informed exploration’. We incorporated the research goals, population, and outputs of the ‘proof of concept trial’, ‘traditional feasibility trial’, and ‘pivotal trial’ from ISO 14155 (2020), given their common interventional purpose. This adaptation resulted in four DED clinical trial phases transitioning from exploratory to confirmatory nature: (1) research exploration, (2) design conceptual exploration, (3) design feasibility exploration, and (4) design pivotal confirmation. Each phase has updated research goals, study populations, and outputs.

5.2. Addressing challenge one

Once we established the nature and clinical inquiry of each new phase, we incorporated the scientific grounding needed per phase to address challenge one. We employed two variants of the SPIRIT trial protocol guidelines to inform this grounding. SPIRIT-AI (Rivera et al., 2020) provides guidance for smart eMDs, aligning with DED’s object of design, while SPIRIT-PRO (Calvert et al., 2021) focuses on patient-centered evaluation, reinforcing DED’s user-centeredness. To complement ISO 14155, we used both variants as they offer additional guidance specific to DED’s concerns, particularly for smart eMDs and patient-centered evaluation in trial protocol development. SPIRIT-AI requires trial protocols to justify clinical trials with prior studies and to detail the eMD’s intended use, target users, and artificial intelligence model (Rivera et al., 2020). SPIRIT-PRO mandates that trial protocols outline patient-centred research questions and evaluation measures, as well as relevant patient-centred intervention findings (Calvert et al., 2021). We used SPIRIT-AI and SPIRIT-PRO to add and inform each phase’s literature grounding for the eMD and study design, based on the phase’s research goal.

5.3. Addressing challenge four

To conclude, we enabled the interplay of evaluative and explorative data roles by dividing it across phases to address challenge four. We used shared evidence practices (Morales Ornelas et al., 2023) and shared data strategies (Pannunzio et al., 2024) to guide this division because both inform evaluative knowledge transitions in eHealth from an integrated health and design perspective. Shared evidence practices represent factors (e.g., problem, solution, effect) influencing how design and health communities generate and use evidence in eHealth development. We tailored these practices to DED using shared data strategies to inform evaluation measures across phases. Shared data strategies define joint purposes, rules, and processes for data collection in eHealth development. They include four categories of evaluation measures: service-system, usage and adherence, care, and health outcomes, with corresponding measurement instruments. Based on phase goals and outputs, we used problem-, solution-, and effect-driven evidence practices to guide knowledge transitions between phases. We then applied the shared data strategy categories to (pre)define evaluative data collection per transition and consolidate shared research data on patient context, behaviour, experience, clinical effects, and safety. We indicated the required knowledge transition as an ‘evidence reflection’ activity between phases.
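A minimal sketch of how the four shared data strategy categories might structure an evidence reflection: phase data points are grouped per category to (pre)define evaluative data collection for the next phase. The instrument names and the `plan_transition` helper are hypothetical illustrations, not part of the cited strategies.

```python
# The four evaluation measure categories from the shared data strategies,
# each with hypothetical example instruments for illustration only.
MEASURE_CATEGORIES = {
    "service_system": ["system_usability_scale"],
    "usage_adherence": ["daily_active_use", "dropout_rate"],
    "care": ["consultation_frequency"],
    "health_outcomes": ["weight_change", "quality_of_life_score"],
}

def plan_transition(data_points: dict[str, str]) -> dict[str, list[str]]:
    """Group a phase's data points by measure category to (pre)define
    evaluative data collection for the following phase."""
    plan: dict[str, list[str]] = {category: [] for category in MEASURE_CATEGORIES}
    for point, category in data_points.items():
        plan[category].append(point)
    return plan

# Hypothetical reflection output after a conceptual exploration phase
overview = plan_transition({
    "chatbot_sessions": "usage_adherence",
    "weight_change": "health_outcomes",
})
```

Making the grouping explicit helps the team spot categories with no predefined measures before the next phase begins.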

6. Results: the revised clinical data-enabled design framework

The Clinical Data-Enabled Design (C-DED) framework addresses the four identified challenges of applying DED in clinical trials (see Section 4.1) by introducing a process with four conceptual phases: research exploration, design conceptual exploration, design feasibility exploration, and design pivotal confirmation. The framework (in Figure 2) splits and extends the DED phases, integrating trial requirements for market approval, including the shift from exploration to confirmation (i.e., challenge two) and a clinical inquiry focus on safety, performance, and effectiveness (i.e., challenge three). ISO 14155 (2020) informed the framework’s structure, research goals, study populations, and outputs, while the SPIRIT guidelines (Calvert et al., 2021; Rivera et al., 2020) added literature grounding (i.e., challenge one) to justify the eMD rationale and study design, ensuring compliance with trial protocols. Finally, shared evidence practices (Morales Ornelas et al., 2023) and data strategies (Pannunzio et al., 2024) guided evidence reflection activities, enabling phase insights (i.e., explorative role) to transition into confirmatory evidence (i.e., evaluative role), addressing challenge four. Below, we introduce each C-DED phase and the evidence reflections required for effectively reconciling DED’s explorative approach with clinical trials’ methodological demands.

Figure 2. Clinical data-enabled design (C-DED) framework with phases and evidence reflections

6.1. Phase one: research exploration

The first C-DED phase is a research exploration (see Figure 2–number one). The research goal focuses on understanding health problems and opportunities in addition to contextual, behavioural, and experiential ones. This phase should be grounded in existing literature about the patient group’s concerns and related health outcomes of interest. Given its research goal and required motivation, this phase relies on a problem-solving theoretical paradigm, as design knowledge generation concerns the representation and further definition of known problem-solution spaces. Studies in this phase should be conducted with a small sub-group of the target patient population, using exploratory data collection with personal, contextual, and open data trackers and interactive interfaces. Ultimately, this phase will provide insights and assumptions for the intervention based on identified problems and opportunities, as well as acceptable data collection mechanisms for the data strategy.

After this phase, a first evidence reflection should guide the transition to a design conceptual exploration phase. The objective is to clarify the research data points that frame the current understanding of the problem from a shared perspective. This reflection should be based on the intervention’s rationale–i.e., the identified insights and assumptions about contextual, behavioural, experiential, and health problems and opportunities and the acceptable data collection mechanisms. The team should reflect on what data points could offer insight into problems with the service or system interaction and its usage and adherence, as well as problems concerning care and health. They should also reflect on how to collect this data in an acceptable manner, where possible, with established measurement instruments. The resulting research data overview can define data collection to assess the contextual factors related to clinical performance in phase two.

6.2. Phase two: design conceptual exploration

The second C-DED phase is a design conceptual exploration (see Figure 2–number two). The research goal focuses on assessing the initial safety, contextual factors related to the clinical performance of the intended use (i.e., user behaviour), and the emergent uses of the intervention. The study’s motivation and the eMD’s conceptualisation should be grounded in (un)published literature about existing interventions with similar intended uses and target users and their benefits and harms. While literature grounding complements previous insights to create the concept, the theoretical underpinning of this phase is a reflective paradigm. This is because the knowledge goal concerns identifying emergent changes in the user’s environment given the implementation of concept ideas. Studies in this phase should be conducted with a small sub-group of the target patient population. In addition, exploratory data collection with personal, contextual, and open data trackers and interactive interfaces will still be possible in this phase to identify emergent changes. This phase will result in a preliminary set of conceptual intervention features (i.e., hypotheses) and findings about adequate solution and research data for the data strategy.

After this phase, a second evidence reflection should guide the transition to a design feasibility exploration phase. The objective is to clarify the research data points that frame the current understanding of the solution concerning the problem it aims to solve from a shared perspective. This reflection should be based on the solution–i.e., the identified conceptual features with their adequate solution and research data. First, the team should reflect on the problems that the features tackle. Then, they should identify what data points from the previous overview relate to these problems from a service-system interaction, usage and adherence, and care and health outcomes perspective. Later, they should reflect on how to collect this data adequately, where possible, with established measurement instruments. The updated (research) data overview, with problem data identified based on the features and their solution data, will define data collection informing how to make ‘change’ observable in phase three.

6.3. Phase three: design feasibility exploration

The third C-DED phase is a design feasibility exploration (see Figure 2–number three). The research goal focuses on evaluating the preliminary clinical performance (i.e., user behaviour) and safety of the intervention, as well as its effectiveness measures–i.e., the data collection that makes health ‘change’ observable. This phase should be grounded in (un)published literature on existing (similar) interventions and their benefits and harms, as well as on the appropriateness of the chosen (non)established patient outcome measures for ‘change’ evaluation. The phase’s foundation lies in a reflective paradigm, given the narrower exploration goal concerning the effect(s) of the updated intervention and the appropriateness of the data points–i.e., a problem setting to observe a change in the user’s environment. Studies in this phase should be conducted with a group from the target patient population. Given the preliminary clinical performance evaluation goal, only personal and contextual data trackers and defined data collection in interactive interfaces will be possible in this phase. Ultimately, this phase will provide insights into the preliminary effectiveness of the intervention features and the corresponding appropriate solution and research data for the data strategy.

After this phase, a third evidence reflection should guide the transition to a design pivotal confirmation phase. The objective is to clarify, from a shared perspective, the research data points that frame the current understanding of the solution’s effect. This reflection should be based on the intervention’s observed effect(s)–that is, the appropriate intervention features with their solution data and the appropriate (behaviour and health) ‘change’ research data. First, the team should reflect on which (non)desired changes were produced by the solution. Then, they should identify which data points from the previous overview relate to these changes from a service-system interaction, usage and adherence, and care and health outcomes perspective. To conclude, they should reflect on how to collect these data appropriately, where possible with established measurement instruments. The updated (research) data overview with the identified ‘change’ data points will guide the definition of hypotheses for confirmation purposes and their related data collection in phase four.

6.4. Phase four: design pivotal confirmation

The fourth and final C-DED phase is a design pivotal confirmation (see Figure 2–number four). The research goal focuses on evaluating and establishing the clinical performance (i.e., user behaviour), effectiveness (i.e., health change), and safety of the final eMD intervention. This goal should be grounded in (un)published literature on existing (similar) interventions and their benefits and harms. The theoretical underpinning of this phase is a problem-solving paradigm, given the goal-directed confirmatory activity in which predefined hypotheses and a stable problem definition–framed with the data points from the previous phase–guide the evaluation. Studies in this phase should be conducted with a statistically justified sample of the target patient population. In addition, studies might involve comparative evaluation with a controlled study set-up (e.g., an RCT) comparing interventions, or an intervention against the standard of care. Given the confirmatory evaluative purpose of this phase, only personal and contextual data trackers and defined data collection in interactive interfaces will be possible in these studies. The results of this phase will provide evidence (i.e., research data) of the safety and effectiveness of the intervention and the necessary solution data for its functionality.

6.5. Implementation considerations

The C-DED framework guides the design of study protocols for each phase, ensuring compliance with clinical trial methodological requirements for eMD development and evaluation. All original DED activities are maintained across C-DED phases. However, the activities ‘probe design and development’ and ‘probe adaptation and integration’ have limited iteration, as prototype changes and iteration criteria must be predefined based on literature. This change is depicted in Figure 2, with a dotted line between the top and bottom loops. To implement the C-DED framework, interdisciplinary teams of designers and HCPs should ideally follow its structured phases, beginning with exploratory research and transitioning into confirmatory studies to reach market approval. For evidence reflections, we encourage a creative workshop set-up in which the team first clarifies the phase outcomes and then thinks creatively about the data points based on each reflection description. Close collaboration between designers and HCPs is crucial for developing clinical trial protocols informed by evidence reflections at each phase.

7. Discussion and conclusion

This study proposes the Clinical Data-Enabled Design (C-DED) framework, a conceptual adaptation of DED that reconciles its exploratory, user-centered approach with the methodological requirements of clinical trials. The C-DED framework provides a structured four-phase process, integrating established guidance. It includes ISO 14155’s (2020) evaluation phases and the SPIRIT guidelines (Calvert et al., 2021; Rivera et al., 2020) for scientific grounding, and shared evidence practices and data strategies (Morales Ornelas et al., 2023; Pannunzio et al., 2024) to facilitate systematic evidence reflections between phases. These adaptations enable the iterative clinical development of safe and effective eMDs.

For designers, the C-DED framework maintains the advantages of the original DED approach, such as iterative design and exploration, while embedding these practices within a methodological clinical context. For instance, integrating safety-related data collection into prototype iterations enables the development of features that enhance safety in addition to effectiveness. This fosters a responsible shift in DED, ensuring the design output meets user needs and clinical standards. In addition, C-DED (like the original DED) is applicable across a wide range of eMDs, including software-based (e.g., mobile health apps) and hardware-integrated solutions (e.g., clinical wearables). For HCPs, the framework provides guidance to align DED’s underlying user-centeredness with clinical trial evaluation. By introducing systematic evidence reflections, our framework creates knowledge transitions between phases that refine research data for user- and patient-centered evaluation of eMDs.

A central contribution of the C-DED framework is problem data, a concept describing how research data is dynamically (re)defined to align with evolving problem settings across C-DED phases. We define problem data as research data that reflects the observable setting of a design problem for evaluative purposes: the (research) data points selected to observe the impact of an intervention on an identified problem. Problem data arises in evidence reflections, which describe a meta-design process that reconsiders the current understanding of the problem from diverse angles as the phases progress–i.e., intervention rationale (phase one), solution features (phase two), and observed effects (phase three). This meta-design process operationalises the conceptual dimensions of meaning and collection identified by Morales Ornelas et al. (2024), which emphasise the need to iteratively (re)define evidence generation within its context. Evidence reflections reframe which data points are used to observe problems (i.e., the meaning of data points) and how they should be collected across phases. By systematically (re)defining research data into problem data, the framework ensures that evaluations are context-sensitive, enabling patient-centered evidence generation.

Previous efforts to adapt DED for clinical contexts stress aligning design with clinical priorities but lack formal guidance on clinical methodological demands and their integration within design activities. Noortman et al. (2022) suggested involving HCPs earlier in DED processes to align study objectives with clinical priorities. Jung (2023) proposed using broader online community data to enhance the generalisability of design output. However, these suggestions do not offer formal guidance on incorporating a clinical inquiry focus. To address this, the C-DED framework proposes a structured process incorporating clinical research goals, scientific grounding, and evidence reflections across phases. In addition, existing eHealth frameworks offer rich insights into user-centered development or evidence generation but lack a procedural and comprehensive integration with trial demands. The CeHRes framework (Van Gemert-Pijnen et al., 2011) aligns with C-DED’s iterative, user-centered approach but does not elaborate on how to manage trials’ methodological requirements. The NICE Evidence Standards Framework (Unsworth et al., 2021) highlights evidence requirements, while the NASSS framework (Greenhalgh et al., 2017) examines barriers to user adoption. Yet both describe what should be investigated rather than how to investigate it. C-DED addresses this by combining DED’s user-centered exploration with clinical evaluation to develop compliant, user-centered eMDs.

Despite its contributions, our C-DED framework introduces certain limitations and areas for further investigation. As the framework progresses from exploratory to confirmatory phases, DED’s original open-ended exploration becomes limited. Designers must predefine iteration criteria, prototype changes, and data collection, reducing reactive prototype modifications. These constraints, while necessary for clinical trial protocols, may hinder the adaptability that defines DED’s creative strength. As a conceptual study, the framework’s practical application remains untested. While the C-DED framework offers a strong theoretical foundation, future empirical research is needed to assess its feasibility and effectiveness. Pilot implementations could evaluate the framework’s results using measures such as usability or clinical outcomes, and interviews with project stakeholders could gather feedback on how well the framework balances creativity with clinical trial rigour and aligns design outputs with clinical requirements. Nonetheless, the C-DED framework highlights the importance of integrating design methodologies with clinical evaluation practices to develop safe, effective, and user-centered eMDs. By reconciling DED’s exploratory strengths with clinical trial demands, it offers actionable guidance for fostering collaboration between design and healthcare communities in future evaluation studies.

Acknowledgement

We thank the reviewers for their thoughtful suggestions. The first author thanks Hiram Rayo Torres for financing the publication of this research and Ana Garcia, Jesus Flores, Ofelia Reyes, and Enrique Torres for their support during the initial submission of this manuscript.

References

Ammenwerth, E., & Rigby, M. (2016). Evidence-based health informatics: Promoting safety and efficiency through scientific methods and ethical policy. IOS Press.
Calvert, M., King, M., Mercieca-Bebber, R., Aiyegbusi, O., Kyte, D., Slade, A., Chan, A.-W., et al. (2021). SPIRIT-PRO Extension explanation and elaboration: guidelines for inclusion of patient-reported outcomes in protocols of clinical trials. BMJ Open, 11(6), e045105. https://doi.org/10.1136/bmjopen-2020-045105
Funk, M., Lovei, P., & Noortman, R. (2024). Designing with Data, Data-Enabled and Data-Driven Design. In J. Vanderdonckt, P. Palanque, & M. Winckler (Eds.), Handbook of Human Computer Interaction. Springer, Cham. https://doi.org/10.1007/978-3-319-27648-9_40-1
Greenhalgh, T., Wherton, J., Papoutsi, C., Lynch, J., Hughes, G., A’Court, C., Hinder, S., et al. (2017). Beyond adoption: A new framework for theorizing and evaluating non-adoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. Journal of Medical Internet Research, 19(11), e8775. https://doi.org/10.2196/jmir.8775
ISO (2020). Clinical investigation of medical devices for human subjects — Good clinical practice (ISO 14155:2020). ISO. https://www.iso.org/obp/ui/en/#iso:std:iso:14155:ed-3:v1:en
Jaakkola, E. (2020). Designing conceptual articles: four approaches. AMS Review, 10(1–2), 18–26. https://doi.org/10.1007/s13162-020-00161-0
Jung, J. (2023). Developing Data-enabled Design in the Field of Digital Health [Doctoral thesis, Delft University of Technology]. TU Delft Repository. https://doi.org/10.4233/uuid:28c38358-a1ca-423c-97fb-841471138e56
Lovei, P. (2024). Personal Matters: Designing personalized care pathways using Data-enabled Design capabilities [Doctoral thesis, Eindhoven University of Technology]. TU/e Repository.
Morales Ornelas, H.C., Kleinsmann, M., & Kortuem, G. (2023). Exploring health and design evidence practices in eHealth systems’ development. In Proceedings of the Design Society, Volume 3: ICED23 (pp. 1795–1804). Cambridge University Press. https://doi.org/10.1017/pds.2023.180
Morales Ornelas, H.C., Kleinsmann, M.S., & Kortuem, G. (2024). Towards designing for health outcomes: implications for designers in eHealth design. In Proceedings of the Design Society, Volume 4: DESIGN2024 (pp. 1627–1636). Cambridge University Press. https://doi.org/10.1017/pds.2024.165
Pannunzio, V., Morales Ornelas, H. C., Gurung, P., van Kooten, R., Snelders, D., van Os, H., et al. (2024). Patient and Staff Experience of Remote Patient Monitoring—What to Measure and How: Systematic Review. Journal of Medical Internet Research, 26, e48463. https://doi.org/10.2196/48463
Rivera, S. C., Liu, X., Chan, A.-W., Denniston, A.K., Calvert, M.J., Ashrafian, H., Beam, A.L., et al. (2020). Guidelines for clinical trial protocols for interventions involving artificial intelligence: the SPIRIT-AI extension. The Lancet Digital Health, 2(10), e549–e560. https://doi.org/10.1016/s2589-7500(20)30219-3
Schön, D.A. (1992). Designing as reflective conversation with the materials of a design situation. Knowledge-Based Systems, 5(1), 3–14. https://doi.org/10.1016/0950-7051(92)90020-g
Silber, D. (2003). The case for eHealth. European Institute of Public Administration, Maastricht.
Simon, H.A. (1988). The Science of Design: Creating the Artificial. Design Issues, 4(1/2), 67–82. https://doi.org/10.2307/1511391
Unsworth, H., Dillon, B., Collinson, L., Powell, H., Salmon, M., Oladapo, T., Ayiku, L., Shield, G., et al. (2021). The NICE Evidence Standards Framework for digital health and care technologies – Developing and maintaining an innovative evidence framework with global impact. Digital Health, 7. https://doi.org/10.1177/20552076211018617
Van Gemert-Pijnen, J. E. W. C., Nijland, N., Van Limburg, M., Ossebaard, H.C., Kelders, S.M., et al. (2011). A holistic framework to improve the uptake and impact of eHealth technologies. Journal of Medical Internet Research, 13(4), e1672. https://doi.org/10.2196/jmir.1672
van Kollenburg, J., & Bogers, S. (2019). Data-enabled design: a situated design approach that uses data as creative material when designing for intelligent ecosystems [Doctoral thesis, Eindhoven University of Technology]. TU/e Repository.
Versteegden, D., van Himbeeck, M., Burghoorn, A.W., Lovei, P., Deckers, E., et al. (2022). The Value of Tracking Data on the Behavior of Patients Who Have Undergone Bariatric Surgery: Explorative Study. JMIR Formative Research, 6(5), e27389. https://doi.org/10.2196/27389
Figure 1. DED activities per research (blue) and design (orange) phases, based on Funk et al. (2024)

Figure 2. Clinical data-enabled design (C-DED) framework with phases and evidence reflections