
General Education Teachers’ Perceptions of Their Attitudes Towards Adopting Evidence-Based Practices and Its Relations With Individual and Contextual Factors

Published online by Cambridge University Press:  14 October 2025

Hari Jang*
Affiliation:
National Institute of Education, Nanyang Technological University, Singapore

Abstract

The implementation of evidence-based practices (EBPs) does not always lead to successful outcomes due to various contextual factors. The Evidence-Based Practice Attitude Scale (EBPAS; Aarons, 2004) assesses implementers’ attitudes towards adopting EBPs (ATE), helping to understand the discrepancy between planned and implemented EBPs. Despite the growing implementation of school-based EBPs, the EBPAS has seldom been applied to general education teachers. This study aimed to validate the EBPAS for primary school teachers in Singapore using content validity and confirmatory factor analysis and to examine how the contextual characteristics influence ATE. A total of 170 teachers from 10 schools participated anonymously in an online survey. Confirmatory factor analysis results supported the four-factor structure of the EBPAS. All subscales showed excellent to acceptable internal consistency, with Divergence being the lowest. Teachers with higher educational attainment were more likely to be open to adopting EBPs. Similarly, teachers’ perceived school leadership support was significantly associated with their ATE. However, neither years of teaching experience, years of supporting students with special educational needs, nor teacher efficacy in inclusive practices significantly predicted ATE. The study highlights the need for further refinement, particularly of the Divergence subscale, through the exploration of alternative constructs and validation with larger samples.

Information

Type
Original Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial licence (https://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original article is properly cited. The written permission of Cambridge University Press must be obtained prior to any commercial use.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of Australian Association of Special Education

In Singapore, approximately 80% of students with special educational needs (SEN) are enrolled in mainstream schools, with increasing efforts to provide inclusive classroom practices (Ministry of Education, 2023). The tiered system of support (Aljunied, 2021) has increased general education teachers’ involvement in implementing educational practices for students with SEN in inclusive environments. Additionally, the recently updated Character and Citizenship Education (Ministry of Education, 2021) syllabus guides school leaders and teachers in creating ‘a caring and enabling school environment’ for all students, including those with SEN.

Despite teachers’ increasing exposure to school-based programs supporting students with SEN, the systematic evaluation and monitoring of these programs in Singapore remain limited (Chong & Lee, 2021; Poon, 2019). School leaders may face challenges in reviewing their effectiveness (Chong & Lee, 2021), partly because the success of such programs is shaped by multiple contextual factors. Evidence-based practices (EBPs) do not always ensure success across different contexts (Moir, 2018). In particular, outcomes can vary depending on characteristics such as individual implementers’ traits (Aarons et al., 2010) and organisational factors (Odom et al., 2022). Among these, attitudes have received growing attention (Aarons, 2004; Aarons et al., 2010; Cook et al., 2018; Farahnak et al., 2020; Locke et al., 2019; Merle et al., 2023), as they may influence an implementer’s willingness to adopt new practices.

Originally developed for mental health providers, the Evidence-Based Practice Attitude Scale (EBPAS; Aarons, 2004) has been widely used to assess attitudes towards adopting EBPs (ATE). Results from the EBPAS indicate that such attitudes are a critical factor influencing implementation outcomes (e.g., Beidas et al., 2014; Reding et al., 2014). In the context of implementing EBPs in schools, however, there is no widely established measure specifically designed to assess teachers’ ATE (Merle et al., 2023). The absence of such a measure may have contributed to the limited research on general education teachers’ ATE despite the increasing emphasis on EBP implementation in school settings (Merle et al., 2023).

To date, two notable studies have addressed this gap by validating the EBPAS (Aarons, 2004) in school-based settings. Cook et al. (2018) demonstrated the relevance of the EBPAS beyond health care by focusing on behavioural health consultants working in schools. More recently, Merle et al. (2023) validated an adapted version of the EBPAS for general education teachers in primary schools, highlighting its applicability in educational contexts.

Although the adaptation for primary school teachers (e.g., Merle et al., 2023) is a significant step in contextualising the scale, further research is needed to establish how general education teachers’ ATE differs from that of other professionals. Therefore, validating a Singapore version of the EBPAS and examining its associations with individual and organisational characteristics are crucial. Doing so will support informed decision-making in school-based EBP implementation for students with SEN.

Evidence-Based Practice Attitude Scale and Its Predictors

The original EBPAS, developed by Aarons (2004), assesses mental health providers’ ATE. It comprises 15 items across four subscales: Requirements (three items, α = .90), Appeal (four items, α = .80), Openness (four items, α = .78), and Divergence (four items, α = .59). Higher scores on the Appeal subscale indicate a more positive attitude towards EBPs when they are perceived as appealing and adequately supported by training and colleagues (Aarons, 2004). The Requirements subscale reflects providers’ willingness to adopt a new practice when required by an authority (Aarons, 2004). The Openness subscale indicates the extent of providers’ openness to trying new practices (Aarons, 2004). The Divergence subscale is reverse-scored when computing the total score; a lower raw Divergence score signifies more positive perceptions of research-based interventions (Aarons, 2004).

Research has demonstrated that individual and organisational factors, such as educational attainment, leadership style, and years of experience, influence implementers’ ATE. Educational attainment and leadership are positively associated with ATE, whereas years of experience show a negative association. For instance, healthcare providers with higher educational attainment have shown increased openness to new practices (van Sonsbeek et al., 2015), especially when these practices are appealing (Aarons, 2004; Aarons et al., 2010).

Conversely, years of teaching experience appear to have the opposite influence: less experienced implementers tend to hold more positive attitudes (Aarons, 2004), whereas more experienced implementers typically score lower on both the total EBPAS (Egeland et al., 2016; Melas et al., 2012) and the Openness subscale (Aarons et al., 2010).

Similar to educational attainment, leadership style has been identified as a critical organisational factor influencing ATE. Studies have found that transformational (Aarons, 2006; Aarons & Sommerfeld, 2012; Farahnak et al., 2020), transactional (Aarons, 2006), and proactive leadership (Powell et al., 2017) are positively associated with ATE. Transactional leadership involves timely reinforcement based on individuals’ accomplishments, whereas transformational leadership is visionary, motivating individuals to exceed expectations (Aarons, 2006).

Such leadership styles may be shaped by cultural differences across nations and organisational contexts (Hofstede et al., 2010). According to Hofstede and his research team, Singapore has a Power Distance Index score of 74, placing it in the upper-medium range (Hofstede et al., 2010). This suggests that although people in Singapore are more likely to accept hierarchical structures, they also expect those in authority to fulfil their obligations (Hofstede et al., 2010, p. 80). These cultural norms around power and leadership may influence Singaporean general education teachers’ perceptions of school leadership and how those perceptions relate to their ATE.

Although previous studies have demonstrated the impact of individual and organisational characteristics (i.e., highest educational attainment, years of teaching experience, and organisational leadership) on healthcare providers’ ATE, their application to general education teachers’ ATE remains underexplored. Notably, Merle (2021) examined primary school teachers’ ATE using an adapted version of the EBPAS and found a weak negative association between years of teaching experience and scores on the school-adapted EBPAS. This finding is consistent with those reported in healthcare settings (Aarons et al., 2010; Egeland et al., 2016; Melas et al., 2012). Merle also highlighted the strong positive influence of implementation leadership (Aarons et al., 2014) on teachers’ ATE, aligning with similar findings among healthcare providers (Powell et al., 2017). Although Merle suggests that the influence of individual and organisational factors on ATE may be consistent across professions, including health care and education, further validation is needed before generalising these findings to general education teachers.

Additionally, factors such as years of supporting students with SEN and teacher efficacy may also influence teachers’ ATE. Similar to years of teaching experience, longer periods of supporting students with SEN may negatively affect ATE, potentially due to the challenges of managing disruptive behaviours (Yeo et al., 2016). General education teachers have reported decreased job satisfaction as the number of students with SEN in their classroom increases (Chen, 2024). This suggests that sustained exposure to the demands of supporting students with SEN may contribute to accumulated on-the-job stress (Chen, 2024) over time, which could negatively influence their ATE.

Teacher efficacy, defined as beliefs in one’s own capabilities to achieve given attainments (Ahsan & Malak, 2020), is another factor that could influence ATE. Teachers who believe in their capabilities are more likely to engage in new tasks due to repeated experiences of success (Bandura, 1977; Ghaith & Yaghi, 1997). While several studies have demonstrated positive relationships between teacher efficacy and attitudes towards inclusion (e.g., Yada et al., 2022), the impact of teacher efficacy on ATE has seldom been studied. Notably, two studies found that teachers with higher teaching efficacy were more likely to hold positive attitudes towards new instructional innovations (Ghaith & Yaghi, 1997; Guskey, 1988). Although these studies have enhanced our understanding, further research using the EBPAS is needed to determine whether teacher efficacy in inclusive practices influences ATE.

Hypotheses of the Current Study

Based on the review of previous studies, the following hypotheses are proposed:

  1. Years of teaching experience and years of supporting students with SEN are negatively associated with general education teachers’ attitudes towards adopting EBPs.

  2. Teacher efficacy in inclusive practices significantly predicts positive attitudes towards adopting EBPs.

  3. General education teachers’ openness to adopting EBPs is positively associated with their educational attainment.

  4. Teachers’ perceived implementation leadership is positively associated with their attitudes towards adopting EBPs.

Method

Participants and Procedure

The Institutional Review Board at Nanyang Technological University in Singapore approved this study (NTU-IRB-2022-837). Participants were recruited from 10 primary schools across Singapore in 2023. Stratified random sampling (Cochran, 1977) was initially used to ensure geographic representation from Singapore’s four official school clusters: north, south, east, and west. The author randomly selected schools within each cluster and contacted school leaders via email to request permission for teacher recruitment. Once permission was granted, teachers were emailed a link to an online survey, which remained accessible for approximately 3 weeks. Participation was anonymous and voluntary, with informed consent obtained from all participating teachers.

The data collection timeline varied across schools to minimise the burden on teachers. Initially, five schools were recruited, and additional schools were progressively contacted to ensure an adequate sample size. In total, 170 general education teachers from 10 public primary schools across various regions of Singapore (i.e., four in the east, two in the west, one in the south, and three in the north) participated in the online survey. Table 1 presents the demographic information of the participants.

Table 1. Demographic Information of Participants

Note. SEN = special educational needs; PD = professional development.

Measurements

Evidence-Based Practice Attitude Scale

The EBPAS (Aarons, 2004; Cook et al., 2018) was adapted to examine Singapore primary school teachers’ ATE in general education settings. A 5-point Likert scale was used, ranging from 1 (not at all) to 5 (to a very great extent). The adaptation process involved a systematic content validity analysis with six experts: two primary school leaders, two educational psychologists, one senior lecturer from a tertiary educational institution, and one SEN officer (SENO). The panel rated the relevance and clarity of each item using a 4-point Likert scale, ranging from 1 (not relevant/clear) to 4 (very relevant/clear). For items rated between 1 and 3, the experts provided qualitative feedback and suggestions for improvement. The item content validity index (I-CVI), the scale content validity index (S-CVI), and kappa statistics were calculated for both relevance and clarity.

Teacher Efficacy for Inclusive Practices

Teacher efficacy in inclusive practices was measured using an adapted version of the Teacher Efficacy for Inclusive Practices (TEIP; Sharma et al., 2012) tailored for general education teachers in Singapore (Jang & Tan, 2024). The scale comprises three subscales, each containing six items: Efficacy to Use Inclusive Instructions (use of strategies to promote inclusion), Efficacy in Collaboration (collaboration with parents and professionals), and Efficacy in Managing Behaviour (handling disruptive behaviours; Sharma et al., 2012). Responses were collected using a 6-point Likert scale, ranging from 1 (strongly disagree) to 6 (strongly agree).

Implementation Leadership Scale

The Implementation Leadership Scale (ILS; Aarons et al., 2014; Lyon et al., 2018) was adapted to measure Singapore general education teachers’ perceptions of school leadership in supporting EBP implementation. The scale includes four subscales, each containing three items: Proactive Leadership, Knowledgeable Leadership, Supportive Leadership, and Perseverant Leadership. According to Aarons et al. (2014), a higher score on the Proactive Leadership subscale indicates that leaders are proactive in anticipating and resolving implementation challenges. The Knowledgeable Leadership subscale reflects the extent to which leaders understand EBPs and related implementation issues. Supportive Leadership indicates the degree of support provided by leaders for EBP adoption and use. Lastly, the Perseverant Leadership subscale refers to whether leaders consistently address challenges associated with EBP implementation. Items are rated on a 5-point Likert scale, ranging from 0 (not at all) to 4 (very great extent).

Data Analysis

Content validation

The I-CVI was computed for each item as the proportion of experts who rated it as 3 (relevant) or 4 (very relevant). The S-CVI was then calculated as the mean of the I-CVIs across all 15 EBPAS items (Polit & Beck, 2006). Additionally, to assess the presence of random agreement among experts, kappa coefficients were computed using the formula by Shrotryia and Dhanda (2019). According to Cicchetti (1994), a kappa coefficient below .40 is considered poor; between .40 and .59, fair; between .60 and .74, good; and above .74, excellent. Items with I-CVI scores below the acceptable threshold of 0.78 (Lynn, 1986) or kappa coefficients below .40 (Cicchetti, 1994) were revised based on expert feedback due to their lack of alignment with the Singapore educational context. Changes were made minimally and only when necessary.
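To illustrate how these indices relate to one another, a minimal R sketch is shown below. This is not the study’s actual code: the `ratings` object, the item-by-expert layout, and the use of the commonly applied modified kappa with chance agreement C(N, A) × 0.5^N are assumptions for illustration only.

```r
# Illustrative sketch: I-CVI, S-CVI/Ave, and a modified kappa per item.
# `ratings` is assumed to be an items x experts matrix of 1-4 ratings.
content_validity <- function(ratings) {
  N <- ncol(ratings)                  # number of experts (six in this study)
  A <- rowSums(ratings >= 3)          # experts rating each item 3 or 4
  i_cvi <- A / N                      # item-level content validity index
  p_c   <- choose(N, A) * 0.5^N       # assumed probability of chance agreement
  kappa <- (i_cvi - p_c) / (1 - p_c)  # I-CVI adjusted for chance agreement
  list(I_CVI = i_cvi, S_CVI_Ave = mean(i_cvi), kappa = kappa)
}

# Hypothetical example: 15 items rated by 6 experts
set.seed(1)
ratings <- matrix(sample(2:4, 15 * 6, replace = TRUE), nrow = 15)
content_validity(ratings)
```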

Confirmatory factor analysis and reliability

Confirmatory factor analysis (CFA) was conducted without a preceding exploratory factor analysis because the survey items were only minimally modified from the original version (Aarons, 2004) and its adapted version for school settings (Cook et al., 2018). Using RStudio (Posit Team, 2024), CFA with a maximum likelihood robust (MLR) estimator was conducted to examine the first-order factor model (Aarons, 2004) and the higher-order factor model (Aarons et al., 2010). The MLR estimator was chosen due to data non-normality, confirmed by the Kolmogorov–Smirnov and Shapiro–Wilk tests (p < .001). Model fit was assessed using several fit indices: chi-square statistics, the comparative fit index (CFI), the Tucker–Lewis index (TLI), the root-mean-square error of approximation (RMSEA), and the standardised root-mean-square residual (SRMR). This study followed commonly recommended thresholds: CFI and TLI values > .95, RMSEA < .06, and SRMR < .08 (Hu & Bentler, 1999). Reliability for the overall scale and subscales was computed in RStudio (Posit Team, 2024). Additionally, subscale and total mean scores were computed for follow-up regression analysis using IBM SPSS Statistics (Version 28.0).
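A minimal sketch of the first-order specification in lavaan is shown below for illustration. The data object `dat` and the item names (AP1–AP4, RM1–RM3, OP1–OP4, DV1–DV4) are hypothetical and are not taken from the study materials.

```r
# Illustrative first-order four-factor CFA with a robust ML estimator.
library(lavaan)

ebpas_model <- '
  Appeal       =~ AP1 + AP2 + AP3 + AP4
  Requirements =~ RM1 + RM2 + RM3
  Openness     =~ OP1 + OP2 + OP3 + OP4
  Divergence   =~ DV1 + DV2 + DV3 + DV4
'

fit_first <- cfa(ebpas_model, data = dat, estimator = "MLR")  # MLR for non-normal data
summary(fit_first, fit.measures = TRUE, standardized = TRUE)
fitMeasures(fit_first, c("chisq.scaled", "df.scaled", "cfi.robust",
                         "tli.robust", "rmsea.robust", "srmr"))
```

Subscale reliabilities could then be obtained with, for example, psych::alpha() applied to each set of item columns.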

Regression analysis

Robust regression was conducted using the Modern Applied Statistics with S (MASS) package in RStudio (Posit Team, 2024) to examine the impact of individual teacher and organisational characteristics on the total EBPAS score and its four subscales (i.e., Appeal, Requirements, Openness, and Divergence). Robust regression was chosen due to violations of the normality assumption across all subscales and violations of homoscedasticity in the Openness and Appeal subscales, as it is less sensitive to these issues (The Pennsylvania State University, n.d.).

Five predictors were examined: (a) highest academic degree, (b) years of teaching experience, (c) years of supporting students with SEN, (d) the mean score of TEIP, and (e) the mean score of ILS. The highest academic degree was originally surveyed across five categories and later recategorised into diploma (n = 15), bachelor’s degree (n = 114), and graduate degrees (n = 41). This variable was treated as ordinal since the educational levels have a clear order. Years of teaching experience and supporting students with SEN were converted to decimal years for analysis. The Divergence subscale was reverse-scored so that higher scores represent more positive attitudes towards research-based interventions. These reversed scores were used in both the regression and the CFA.
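For illustration, a minimal R sketch of this step appears below. It is not the study’s code: the data frame `dat`, its column names, the ordinal coding shown, and the approximate t-based p-values are all assumptions.

```r
# Illustrative robust regression with MASS::rlm for one outcome (Openness).
library(MASS)

# Assumed coding: degree (1 = diploma, 2 = bachelor's, 3 = graduate degree),
# teach_years and sen_years as decimal years, teip_mean and ils_mean as scale means.
m_open <- rlm(openness ~ degree + teach_years + sen_years + teip_mean + ils_mean,
              data = dat)
summary(m_open)   # robust coefficients, standard errors, and t values

# One way to obtain approximate two-sided p-values from the robust t values
tv  <- summary(m_open)$coefficients[, "t value"]
dfr <- nrow(model.frame(m_open)) - length(coef(m_open))
round(2 * pt(abs(tv), df = dfr, lower.tail = FALSE), 3)
```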

Interfactor correlation analysis

Interfactor correlations were examined using Pearson’s correlation analysis via the ‘lavaan’ package in RStudio (Posit Team, 2024) to understand the extent to which these factors were correlated. The correlation matrix included the means of the total EBPAS and its subscales, as well as educational attainment, years of teaching experience, years of supporting students with SEN, the total TEIP score, and the total ILS score.
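A minimal sketch of this step is shown below; the variable names are hypothetical, and the base-R calls are one possible way to produce a Pearson matrix with accompanying p-values.

```r
# Illustrative Pearson correlation matrix for the EBPAS scores and predictors.
vars <- dat[, c("ebpas_total", "appeal", "requirements", "openness", "divergence",
                "degree", "teach_years", "sen_years", "teip_mean", "ils_mean")]

round(cor(vars, use = "pairwise.complete.obs", method = "pearson"), 2)
# lavaan::lavCor(vars) offers an alternative route to essentially the same
# Pearson correlations for continuous variables.

# p-value for a single pair (repeat for each pair of interest)
cor.test(vars$ils_mean, vars$openness, method = "pearson")
```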

Results

Descriptive Analysis

Table 2 presents the descriptive statistics for the EBPAS and its four subscales, including means, standard deviations, and internal reliability as measured by Cronbach’s alpha. Overall, teachers’ ATE were generally positive (M = 3.61, SD = .457) on the 5-point scale. Among the four subscales, the Divergence subscale had the highest mean score (M = 3.77, SD = .717), whereas the Requirements subscale had the lowest (M = 3.38, SD = .892). Cronbach’s alpha indicated good internal consistency for all subscales and the total EBPAS (α > .80), except for the Divergence subscale (α = .717).

Table 2. Descriptive Statistics for the Evidence-Based Practice Attitude Scale (EBPAS) and Its Four Subscales

Note. AP = Appeal; RM = Requirements; OP = Openness; DV = Divergence.

Content Validity

The I-CVI for relevance ranged from 0.33 to 1.00; for clarity, the I-CVI ranged from 0.67 to 1.00. The S-CVI achieved 0.86 for relevance and 0.88 for clarity, with both surpassing the acceptable threshold of 0.78 (Lynn, 1986). Items with lower relevance and clarity scores, particularly Requirements 3 and Divergence 1, were revised to reflect the Singapore context and improve word choices. Additionally, items in the Divergence subscale had low I-CVI and kappa coefficients due to experts’ disagreement with the content of the statements rather than their relevance to the target population. These items were retained, as they are crucial for assessing teachers’ perceptions of research-based interventions. Lastly, while Requirements 1 and 2 demonstrated excellent I-CVI and kappa coefficients, specific terms such as ‘administrator’ and ‘school’ were revised to better reflect Singapore educational terminology based on expert feedback. Table 3 presents the I-CVI, S-CVI, and kappa coefficients for relevance and clarity.

Table 3. Content Validity Results With I-CVI, S-CVI, and Kappa Coefficients

Note. I-CVI = item content validity index; S-CVI = scale content validity index; EBPAS = Evidence-Based Practice Attitude Scale; AP = Appeal; RM = Requirements; OP = Openness; DV = Divergence.

Construct Validity

The construct validity of the EBPAS for general education teachers was assessed in two stages. First, the first-order four-factor model identified by Aarons (2004) was tested. As shown in Figure 1, the CFA factor loadings for the Singapore version confirmed the factor structure demonstrated by Aarons (2004). The fit statistics were χ²(84) = 134.483, p < .001, CFI = .960, TLI = .950, RMSEA = .062, SRMR = .071. All standardised item factor loadings were significant at p < .001 and greater than .50.

Figure 1. First-Order Confirmatory Factor Analysis Model of the Evidence-Based Practice Attitude Scale (EBPAS).

Second, the higher-order factor CFA model was compared to the first-order model. The initial model did not converge during estimation because of a high correlation between RM1 (required by head of department/reporting officer) and RM2 (required by school leaders). After adjusting the model to account for this correlation, a moderately adequate fit was achieved, χ²(85) = 127.269, p = .002, CFI = .967, TLI = .953, RMSEA = .056, SRMR = .070, with most standardised item factor loadings significant at the p < .01 level (see Figure 2). However, the loadings of the four items under the Appeal subscale, as well as those of the Appeal (p = .19) and Divergence (p = .72) subscales on the higher-order factor, were not statistically significant. It should also be noted that the factor loading for the Appeal subscale exceeded 1 (λ = 2.14). Because the corresponding variance was positive and the model showed no issue with R-squared (R² = .819), this loading represents a regression coefficient rather than a correlation (Jöreskog, 1999). Nonetheless, the nonsignificant factor loadings in the higher-order model suggest that it did not outperform the first-order model.

Figure 2. Higher-Order Confirmatory Factor Analysis Model of the Evidence-Based Practice Attitude Scale (EBPAS).
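One plausible lavaan specification consistent with the adjusted model described above is sketched below. The item names and the data object `dat` are hypothetical, and the assumption that the adjustment took the form of a residual correlation between RM1 and RM2 is illustrative rather than taken from the study’s code.

```r
# Illustrative higher-order CFA: a general ATE factor above the four subscales,
# with an added residual correlation between the two 'required by...' items.
library(lavaan)

higher_model <- '
  Appeal       =~ AP1 + AP2 + AP3 + AP4
  Requirements =~ RM1 + RM2 + RM3
  Openness     =~ OP1 + OP2 + OP3 + OP4
  Divergence   =~ DV1 + DV2 + DV3 + DV4

  ATE =~ Appeal + Requirements + Openness + Divergence  # second-order factor

  RM1 ~~ RM2   # assumed adjustment for the highly correlated item pair
'
fit_higher <- cfa(higher_model, data = dat, estimator = "MLR")
fitMeasures(fit_higher, c("chisq.scaled", "df.scaled", "cfi.robust",
                          "tli.robust", "rmsea.robust", "srmr"))
```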

Interfactor Correlations

Table 4 presents a correlation matrix that includes the total EBPAS score, its subscales, and predictor variables, along with corresponding p-values. Using the conventional approach for interpreting correlation coefficients (Schober et al., 2018), the total EBPAS score exhibited a strong positive correlation with Appeal (r = .790), moderate positive correlations with Requirements (r = .677) and Openness (r = .672), and a weak positive correlation with Divergence (r = .365) at the p < .01 level. The Appeal subscale showed moderate positive correlations with both Openness (r = .487) and Requirements (r = .499), significant at the p < .01 level. Additionally, Requirements showed a weak but statistically significant correlation with Openness (r = .285) at the p < .01 level. In contrast, Divergence demonstrated weak negative correlations with Openness (r = −.072) and Requirements (r = −.064), although these were not statistically significant. Among the predictor variables, the total ILS score was positively correlated with the largest number of variables, including Openness (r = .295), Appeal (r = .377), Requirements (r = .213), the total EBPAS score (r = .317), and the total TEIP score (r = .203).

Table 4. Correlation Matrix of Predictors and the EBPAS

Note. EBPAS = Evidence-Based Practice Attitude Scale (Aarons, 2004; Cook et al., 2018); Edu-level = highest educational attainment level; Teach exp. = years of teaching experience; SEN exp. = years of supporting students with special educational needs; TEIP = Teacher Efficacy for Inclusive Practices (Sharma et al., 2012); ILS = Implementation Leadership Scale (Aarons et al., 2014).

**p < .01 level. *p < .05 level.

Regression Results

The results of the robust regression analyses indicated that the highest academic degree was significantly and positively associated with the Openness subscale, with a moderate effect size (Cohen, 1988), t(163) = 2.450, p = .012, R² = .140. According to Cohen (1988, pp. 413–414), an R² value between .13 and .26 indicates a moderate effect size. Additionally, the mean ILS score was a significant predictor of Appeal, t(163) = 4.553, p = .001; Openness, t(163) = 3.263, p = .010; Requirements, t(163) = 2.639, p = .042; and the total mean EBPAS score, t(163) = 3.210, p = .005. However, it was not a significant predictor of Divergence, t(163) = −.676, p = .502. None of the other predictors (years of teaching experience, years of supporting students with SEN, and teacher efficacy in inclusive practices) significantly predicted any of the EBPAS subscales or the total EBPAS score (see Table 5 for the full robust regression results).
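For readers who prefer Cohen’s f² metric, this R² converts as follows (a worked conversion using the standard formula f² = R²/(1 − R²); by Cohen’s conventions, f² ≥ .15 corresponds to a medium effect, consistent with the moderate interpretation above):

$$ f^2 = \frac{R^2}{1 - R^2} = \frac{.140}{1 - .140} \approx .16 $$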

Table 5. Robust Regression Results

Note. SE = standard error; CI = confidence interval; LB = lower bound; UB = upper bound; R² = R-squared; R²adj = adjusted R-squared; SEN = special educational needs; TEIP = Teacher Efficacy for Inclusive Practices (Sharma et al., 2012); ILS = Implementation Leadership Scale (Aarons et al., 2014; Lyon et al., 2018).

**p < .01 level. *p <.05 level.

Discussion and Implications

The current study validated the use of the EBPAS for assessing Singapore primary school teachers’ ATE. Content validity confirmed that the survey items were relevant and clear for this population. CFA supported the original first-order four-factor structure (Openness, Appeal, Requirements, and Divergence) proposed by Aarons (2004). Additionally, Cronbach’s alpha indicated good internal reliability. The interfactor correlations and robust regression results provided unique insights specific to general education teachers’ ATE.

Interfactor Correlations

The interfactor correlations revealed that, among the subscales, Appeal had the strongest correlations with Openness and Requirements. This suggests that teachers who find EBPs appealing are more likely to adopt them and comply with changes required by authorities during the implementation process (Aarons, 2004). It also aligns with Cook et al.’s (2018) finding that the appeal of EBPs is an important gateway to increasing providers’ openness.

In contrast, the Divergence subscale exhibited weaker correlations with the other subscales, consistent with previous studies (Cook et al., 2018; Merle et al., 2023). This suggests that teachers who value research-based interventions may not always be open to new practices, find them more appealing, or adopt them when required.

Individual Characteristics as Predictors of Teacher Attitudes Towards Adopting EBPs (Hypotheses 1 to 3)

The results indicate that years of teaching experience and years of supporting students with SEN did not significantly predict teachers’ ATE; the first hypothesis was therefore not supported. In other words, teachers with more experience in teaching and supporting students with SEN do not necessarily hold more negative ATE. This finding contrasts with Merle (2021), who reported a weak but negative correlation between years of teaching experience and ATE. The nonsignificant result may be explained by the significant positive correlations among the Requirements, Appeal, and Openness subscales. It is possible that Singapore teachers, regardless of their years of experience in teaching and supporting students with SEN, remain open to adopting new practices when those practices are required by authorities.

Similarly, teacher efficacy in inclusive practices did not predict willingness to implement EBPs, so the second hypothesis was not supported. This contrasts with previous studies that found positive relationships between teacher efficacy and the adoption of innovative practices (Ghaith & Yaghi, 1997; Guskey, 1988) or the inclusion of students with SEN (Yada et al., 2022). Notably, Yada et al. (2022) consolidated studies that examined teachers’ attitudes towards including students with SEN in mainstream schools. In contrast, the EBPAS in the current study focused on attitudes towards disseminating and implementing new practices to enhance students’ learning and behaviour (Aarons, 2004). In other words, although teacher efficacy in inclusive practices may enhance teachers’ attitudes towards inclusion (Yada et al., 2022), it may not directly influence their willingness to adopt EBPs.

The highest academic degree significantly predicted the Openness subscale, supporting the third hypothesis. This suggests that primary school teachers with higher educational attainment are more likely to be open to new practices. This finding supports van Sonsbeek et al. (2015) but differs from Aarons (2004) and Aarons et al. (2010), who found that educational attainment was more closely related to Appeal. These differences suggest that the impact of educational attainment on the EBPAS may vary across professions and contexts. In this study, teachers with higher academic degrees may be more open to new EBPs because of a greater likelihood of exposure to such practices at the postgraduate level. Conversely, the appeal of EBPs could play a more significant role for mental health providers (Aarons, 2004; Aarons et al., 2010), as they are typically more accustomed to EBPs.

This finding of a positive relationship between educational attainment and openness to adopting EBPs suggests that it is crucial to provide compulsory courses on understanding EBPs at the university level (Diery et al., 2021). As Diery and her colleagues (2021) pointed out, offering such courses to preservice teachers may help address the challenges of promoting EBP implementation among in-service teachers. Currently, Singapore’s tertiary education institution offers various elective courses aimed at enhancing preservice teachers’ understanding of SEN and teaching strategies (Nanyang Technological University/National Institute of Education, 2025). Making these courses, along with an additional course introducing EBPs for students with SEN, compulsory may increase teachers’ willingness to adopt EBPs in school settings.

Organisational Characteristics as a Predictor of Teacher Attitudes Towards Adopting EBPs (Hypothesis 4)

The results confirmed the fourth hypothesis, with the mean ILS score consistently predicting the EBPAS subscales and the total EBPAS score, except for the Divergence subscale. This finding suggests that teachers who perceive strong school leadership support for EBP implementation are more likely to be open to new practices, find them appealing, and adopt them when required. This aligns with Singapore’s ‘nuanced implementation of distributed leadership’ (Tan, 2024, p. 289). Despite the practice of distributed leadership in Singapore, the high power distance culture typical of Asia leads teachers to seek final approval from school leaders (Tan, 2024). Consequently, teachers tend to be more open and hold more positive ATE when they perceive clear leadership support. In other words, school leadership plays a crucial role in enhancing teachers’ positive ATE in schools.

This finding, along with the positive correlations between Appeal, Openness, and Requirements, suggests the importance of professional development for school leaders to support EBP implementation for students with SEN in mainstream schools. School leaders who are aware of EBPs are more likely to promote their implementation in school settings, which can increase teachers’ acceptance of and willingness to adopt these practices.

Limitations and Future Directions

Although this study demonstrates that the revised version of the EBPAS can evaluate Singapore general education teachers’ ATE, some limitations should be acknowledged. First, the study validated the EBPAS using content validity, CFA, and internal reliability with a relatively small sample (N = 170). While the first-order model demonstrated a good fit, the higher-order model could not be confirmed. According to Kyriazos (2018), a large sample size is crucial when the assumption of normality is violated. Despite the use of a robust estimator (i.e., MLR) to mitigate the influence of non-normality, the sample size might not have been sufficient to confirm the higher-order factor model. Future studies should therefore test the model with a larger sample.

Second, the Divergence subscale of the EBPAS may require adaptation to better reflect the professional and cultural context of general education teachers in Singapore. Although the current study retained the Divergence items because of their importance in assessing teachers’ attitudes towards research-based interventions, this subscale has been noted as problematic, showing lower internal reliability (Santesson et al., 2020), weak factor loadings (Baumann et al., 2022), and weak correlations with other subscales and the total score (Cook et al., 2018; Merle et al., 2023; van Sonsbeek et al., 2015). As a result, several studies have suggested adapting the EBPAS to align with specific national and professional contexts. For example, the Brazilian Portuguese version of the EBPAS dropped one item (‘Know better than researchers’) from the Divergence subscale (Baumann et al., 2022), and the Swedish version recommended replacing the same item (Santesson et al., 2020). Another example can be found in Merle et al. (2023), who identified a new subscale, Fit, and removed the Divergence subscale to assess ATE among US primary school teachers. Therefore, refining the Divergence subscale by identifying and validating constructs better suited to Singapore teachers, through exploratory factor analysis and CFA with a larger sample, may further enhance the applicability of the EBPAS for general education teachers in Singapore.

Third, the current study focused on Singapore general education teachers in primary schools and relied on self-report surveys, which might limit the generalisability and depth of understanding of teachers’ ATE. Understanding teachers’ ATE across different school settings and cultural contexts is particularly important, given the global commitment to inclusive education. For example, 80% of students with SEN are currently included in mainstream schools in Singapore (Ministry of Education, 2023), and various school-based programs have been introduced to support them (Landulfo et al., 2015; Teng, 2019). Future studies should explore the applicability of the adapted EBPAS across other educational settings, such as secondary schools, early childhood education, and postsecondary institutions, as well as in broader international contexts. Additionally, interviews or focus-group discussions could complement self-report surveys and provide a deeper understanding of teachers’ ATE.

Fourth, the TEIP (Sharma et al., 2012), which measures teacher efficacy in inclusive practices, might not be the best fit for understanding the relationship between teacher efficacy and ATE. The current study employed the TEIP because a specific measure of teacher efficacy for EBP implementation in schools has not yet been developed. Although the TEIP broadly covers various inclusive practices, such as managing disruptive behaviour, collaborating with colleagues and parents, and providing instructional strategies (Sharma et al., 2012), EBPs specifically focus on implementing interventions or strategies proven effective in improving students’ learning and behaviour (The IRIS Center, 2016). Hence, developing a survey to measure teacher efficacy in school-based EBP implementation should be prioritised in future research to accurately examine the relationship between teacher efficacy and ATE. Tucker et al. (2021) developed a measure of self-efficacy for EBP implementation for nurses in healthcare institutions, and this approach could serve as a starting point for contextualising teacher efficacy for EBP implementation in school settings.

Funding

This research was supported by a Nanyang Technological University Research Scholarship (NTU-RSS) and a John M. Elliott Memorial Research Grant from the Singapore Children’s Society. The funders had no role in the study design, data collection, analysis, or manuscript preparation.

Competing interests

The author of this study certifies that she has no affiliations with or involvement in any organisation or entity with any financial or non-financial interest in the subject matter or materials discussed in this manuscript.

References

Aarons, G. A. (2004). Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research, 6(2), 61–74. https://doi.org/10.1023/B:MHSR.0000024351.12294.65
Aarons, G. A. (2006). Transformational and transactional leadership: Association with attitudes toward evidence-based practice. Psychiatric Services, 57(8), 1162–1169. https://doi.org/10.1176/ps.2006.57.8.1162
Aarons, G. A., Ehrhart, M. G., & Farahnak, L. R. (2014). The Implementation Leadership Scale (ILS): Development of a brief measure of unit level implementation leadership. Implementation Science, 9, Article 45. https://doi.org/10.1186/1748-5908-9-45
Aarons, G. A., Glisson, C., Hoagwood, K., Kelleher, K., Landsverk, J., & Cafri, G. (2010). Psychometric properties and U.S. national norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychological Assessment, 22(2), 356–365. https://doi.org/10.1037/a0019188
Aarons, G. A., & Sommerfeld, D. H. (2012). Leadership, innovation climate, and attitudes toward evidence-based practice during a statewide implementation. Journal of the American Academy of Child & Adolescent Psychiatry, 51(4), 423–431. https://doi.org/10.1016/j.jaac.2012.01.018
Ahsan, M. T., & Malak, M. S. (2020, July 30). Teaching efficacy and inclusive practices in Asian countries. Oxford Research Encyclopedia of Education. https://doi.org/10.1093/acrefore/9780190264093.013.1227
Aljunied, M. (2021). Psychological services for children with special educational needs in mainstream schools. In M. E. Wong & L. Lim (Eds.), Special needs in Singapore: Trends and issues (pp. 149–167). World Scientific Publishing. https://doi.org/10.1142/9789814667142_0008
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191–215. https://doi.org/10.1037/0033-295X.84.2.191
Baumann, A. A., Vázquez, A. L., Macchione, A. C., Lima, A., Coelho, A. F., Juras, M., Ribeiro, M., Kohlsdorf, M., & Carothers, B. J. (2022). Translation and validation of the Evidence-Based Practice Attitude Scale (EBPAS-15) to Brazilian Portuguese: Examining providers’ perspective about evidence-based parent intervention. Children and Youth Services Review, 136, Article 106421. https://doi.org/10.1016/j.childyouth.2022.106421
Beidas, R. S., Edmunds, J., Ditty, M., Watkins, J., Walsh, L., Marcus, S., & Kendall, P. (2014). Are inner context factors related to implementation outcomes in cognitive-behavioral therapy for youth anxiety? Administration and Policy in Mental Health, 41(6), 788–799. https://doi.org/10.1007/s10488-013-0529-x
Chen, Y. (2024). Effects of the proportion of students with special educational needs on middle school teachers’ well-being. Frontiers in Education, 9, Article 1307709. https://doi.org/10.3389/feduc.2024.1307709
Chong, W. H., & Lee, B.-O. (2021). Understanding effective implementation of prevention education programmes: Perspective from Singapore schools. The Asia-Pacific Education Researcher, 30(1), 23–32. https://doi.org/10.1007/s40299-020-00511-3
Cicchetti, D. V. (1994). Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment, 6(4), 284–290. https://doi.org/10.1037/1040-3590.6.4.284
Cochran, W. G. (1977). Sampling techniques (3rd ed.). John Wiley & Sons.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Lawrence Erlbaum Associates.
Cook, C. R., Davis, C., Brown, E. C., Locke, J., Ehrhart, M. G., Aarons, G. A., Larson, M., & Lyon, A. R. (2018). Confirmatory factor analysis of the Evidence-Based Practice Attitudes Scale with school-based behavioral health consultants. Implementation Science, 13, Article 116. https://doi.org/10.1186/s13012-018-0804-z
Diery, A., Knogler, M., & Seidel, T. (2021). Supporting evidence-based practice through teacher education: A profile analysis of teacher educators’ perceived challenges and possible solutions. International Journal of Educational Research Open, 2, Article 100056. https://doi.org/10.1016/j.ijedro.2021.100056
Egeland, K. M., Ruud, T., Ogden, T., Lindstrøm, J. C., & Heiervang, K. S. (2016). Psychometric properties of the Norwegian version of the Evidence-Based Practice Attitude Scale (EBPAS): To measure implementation readiness. Health Research Policy and Systems, 14, Article 47. https://doi.org/10.1186/s12961-016-0114-3
Farahnak, L. R., Ehrhart, M. G., Torres, E. M., & Aarons, G. A. (2020). The influence of transformational leadership and leader attitudes on subordinate attitudes and implementation success. Journal of Leadership & Organizational Studies, 27(1), 98–111. https://doi.org/10.1177/1548051818824529
Ghaith, G., & Yaghi, H. (1997). Relationships among experience, teacher efficacy, and attitudes toward the implementation of instructional innovation. Teaching and Teacher Education, 13(4), 451–458. https://doi.org/10.1016/S0742-051X(96)00045-5
Guskey, T. R. (1988). Teacher efficacy, self-concept, and attitudes toward the implementation of instructional innovation. Teaching and Teacher Education, 4(1), 63–69. https://doi.org/10.1016/0742-051X(88)90025-X
Hofstede, G., Hofstede, G. J., & Minkov, M. (2010). Cultures and organizations: Software of the mind: Intercultural cooperation and its importance for survival (3rd ed.). McGraw-Hill.
Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. https://doi.org/10.1080/10705519909540118
The IRIS Center. (2016). Autism spectrum disorder (Part 2): Evidence-based practices. https://iris.peabody.vanderbilt.edu/module/asd2/
Jang, H., & Tan, P. C. (2024). Inclusive education in action: Singapore teachers’ efficacy in early childhood and primary school education. Education 3-13. Advance online publication. https://doi.org/10.1080/03004279.2024.2410948
Jöreskog, K. G. (1999). How large can a standardized coefficient be? Mplus. https://www.statmodel.com/download/Joreskog.pdf
Kyriazos, T. A. (2018). Applied psychometrics: Sample size and sample power considerations in factor analysis (EFA, CFA) and SEM in general. Psychology, 9(8), 2207–2230. https://doi.org/10.4236/psych.2018.98126
Landulfo, C., Chandy, C., & Wong, Z. Y. (2015). Expanding the provision for people with dyslexia in Singapore. Asia Pacific Journal of Developmental Differences, 2(2), 234–276. https://doi.org/10.3850/S2345734115000307
Locke, J., Lawson, G. M., Beidas, R. S., Aarons, G. A., Xie, M., Lyon, A. R., Stahmer, A., Seidman, M., Frederick, L., Oh, C., Spaulding, C., Dorsey, S., & Mandell, D. S. (2019). Individual and organizational factors that affect implementation of evidence-based practices for children with autism in public schools: A cross-sectional observational study. Implementation Science, 14, Article 29. https://doi.org/10.1186/s13012-019-0877-3
Lynn, M. R. (1986). Determination and quantification of content validity. Nursing Research, 35(6), 382–385. https://doi.org/10.1097/00006199-198611000-00017
Lyon, A. R., Cook, C. R., Brown, E. C., Locke, J., Davis, C., Ehrhart, M., & Aarons, G. A. (2018). Assessing organizational implementation context in the education sector: Confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implementation Science, 13, Article 5. https://doi.org/10.1186/s13012-017-0705-6
Melas, C. D., Zampetakis, L. A., Dimopoulou, A., & Moustakis, V. (2012). Evaluating the properties of the Evidence-Based Practice Attitude Scale (EBPAS) in health care. Psychological Assessment, 24(4), 867–876. https://doi.org/10.1037/a0027445
Merle, J. L. (2021). Teacher attitudes toward evidence-based practices: Confirmatory and predictive analyses of the school-adapted Evidence-Based Practice Attitude Scale [Doctoral dissertation, University of Minnesota]. ProQuest Dissertations & Theses Global.
Merle, J. L., Cook, C. R., Locke, J. J., Ehrhart, M. G., Brown, E. C., Davis, C. J., & Lyon, A. R. (2023). Teacher attitudes toward evidence-based practices: Exploratory and confirmatory analyses of the school-adapted Evidence-Based Practice Attitude Scale. Implementation Research & Practice, 4, 1–16. https://doi.org/10.1177/26334895221151026
Ministry of Education. (2021). Character and citizenship education (CCE) syllabus: Primary. https://www.moe.gov.sg/-/media/files/syllabus/2021-primary-character-and-citizenship-education.pdf
Ministry of Education. (2023, May 8). Support for students with special educational needs in mainstream schools who are not found suitable for government-funded SPED or cannot afford private education [Parliamentary replies]. https://www.moe.gov.sg/news/parliamentary-replies/20230509-support-for-students-with-special-educational-needs-in-mainstream-schools-who-are-not-found-suitable-for-government-funded-sped-or-cannot-afford-private-education
Moir, T. (2018). Why is implementation science important for intervention design and evaluation within educational settings? Frontiers in Education, 3, Article 61. https://doi.org/10.3389/feduc.2018.00061
Nanyang Technological University/National Institute of Education. (2025). Bachelor of Arts in an academic discipline and in education & Bachelor of Science in an academic discipline and in education. https://www.ntu.edu.sg/nie/about-us/programme-offices/office-of-teacher-education-and-undergraduate-programmes/programme-handbooks
Odom, S. L., Sam, A. M., & Tomaszewski, B. (2022). Factors associated with implementation of a school-based comprehensive program for students with autism. Autism, 26(3), 703–715. https://doi.org/10.1177/13623613211070340
The Pennsylvania State University. (n.d.). STAT 501: Regression methods. T.1.1 Robust regression methods. https://online.stat.psu.edu/stat501/lesson/t/t.1/t.1.1-robust-regression-methods
Polit, D. F., & Beck, C. T. (2006). The content validity index: Are you sure you know what’s being reported? Critique and recommendations. Research in Nursing & Health, 29(5), 489–497. https://doi.org/10.1002/nur.20147
Poon, K. K. (2019). Policies and initiatives for preschool children from disadvantaged environments and preschool children with disabilities in Singapore. In S. S. Teng, M. Manzon, & K. K. Poon (Eds.), Equity in excellence: Experiences of East Asian high-performing education systems (pp. 149–160). Springer. https://doi.org/10.1007/978-981-13-2975-3_10
Posit Team. (2024). RStudio: Integrated development environment for R. Posit Software. http://www.posit.co/
Powell, B. J., Mandell, D. S., Hadley, T. R., Rubin, R. M., Evans, A. C., Hurford, M. O., & Beidas, R. S. (2017). Are general and strategic measures of organizational context and leadership associated with knowledge and attitudes toward evidence-based practices in public behavioral health settings? A cross-sectional observational study. Implementation Science, 12, Article 64. https://doi.org/10.1186/s13012-017-0593-9
Reding, M. E. J., Chorpita, B. F., Lau, A. S., & Innes-Gomberg, D. (2014). Providers’ attitudes toward evidence-based practices: Is it just about providers, or do practices matter, too? Administration and Policy in Mental Health, 41(6), 767–776. https://doi.org/10.1007/s10488-013-0525-1
Santesson, A. H. E., Bäckström, M., Holmberg, R., Perrin, S., & Jarbin, H. (2020). Confirmatory factor analysis of the Evidence-Based Practice Attitude Scale (EBPAS) in a large and representative Swedish sample: Is the use of the total scale and subscale scores justified? BMC Medical Research Methodology, 20, Article 254. https://doi.org/10.1186/s12874-020-01126-4
Schober, P., Boer, C., & Schwarte, L. A. (2018). Correlation coefficients: Appropriate use and interpretation. Anesthesia & Analgesia, 126(5), 1763–1768. https://doi.org/10.1213/ANE.0000000000002864
Sharma, U., Loreman, T., & Forlin, C. (2012). Measuring teacher efficacy to implement inclusive practices. Journal of Research in Special Educational Needs, 12(1), 12–21. https://doi.org/10.1111/j.1471-3802.2011.01200.x
Shrotryia, V. K., & Dhanda, U. (2019). Content validity of assessment instrument for employee engagement. SAGE Open, 9(1), 1–7. https://doi.org/10.1177/2158244018821751
Tan, C. Y. (2024). Influence of cultural values on Singapore school leadership. Educational Management Administration & Leadership, 52(2), 280–303. https://doi.org/10.1177/17411432211073414
Teng, A. (2019, November 9). MOE to set up 3 new autism-focused schools; more peer support initiatives for special needs students. The Straits Times. https://www.straitstimes.com/singapore/education/more-peer-support-initiatives-for-special-needs-students-moe-to-set-up-3-new
Tucker, S., Zadvinskis, I. M., & Conner, L. (2021). Development of psychometric testing of the Implementation Self-Efficacy for EBP (ISE4EBP) scale. Western Journal of Nursing Research, 43(1), 45–52. https://doi.org/10.1177/0193945920925032
van Sonsbeek, M. A. M. S., Hutschemaekers, G. J. M., Veerman, J. W., Kleinjan, M., Aarons, G. A., & Tiemens, B. G. (2015). Psychometric properties of the Dutch version of the Evidence-Based Practice Attitude Scale (EBPAS). Health Research Policy and Systems, 13, Article 69. https://doi.org/10.1186/s12961-015-0058-z
Yada, A., Leskinen, M., Savolainen, H., & Schwab, S. (2022). Meta-analysis of the relationship between teachers’ self-efficacy and attitudes toward inclusive education. Teaching and Teacher Education, 109, Article 103521. https://doi.org/10.1016/j.tate.2021.103521
Yeo, L. S., Chong, W. H., Neihart, M. F., & Huan, V. S. (2016). Teachers’ experience with inclusive education in Singapore. Asia Pacific Journal of Education, 36(Suppl. 1), 69–83. https://doi.org/10.1080/02188791.2014.934781