
Course-Based Research and Mentorship: Results from a Multiterm Research Academy at a Minority-Serving Institution

Published online by Cambridge University Press:  20 October 2025

Marissa Brookes
Affiliation:
University of California, Riverside, USA
Kim Yi Dionne
Affiliation:
University of California, Riverside, USA
Jennifer L. Merolla
Affiliation:
University of California, Riverside, USA

Abstract

This article describes the creation of the Minority-Serving Institution Research Academy (MSIRA), a training and apprenticeship program for undergraduate political science students supported by the National Science Foundation and launched at the University of California, Riverside, in 2023. MSIRA is a course-based undergraduate research experience (CURE) in which students spend 10 weeks learning basic research methods through active-learning and team-based exercises. After the 10-week course, students provide 25 hours of research assistance for faculty mentors. We examine focus group and survey data from the first cohort of MSIRA fellows to describe its impact and to draw parallels and distinctions with other CUREs.

Information

Type
Article
Creative Commons
CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of American Political Science Association

In Spring 2023, with support from the National Science Foundation (NSF),Footnote 1 we launched the Minority-Serving Institution Research Academy (MSIRA), a two-term training and apprenticeship program for undergraduate students enrolled as political science majors at the University of California, Riverside (UCR), a Minority-Serving Institution (MSI)Footnote 2 with a diverse student body. Of the 20,919 undergraduate students enrolled in Spring 2023, 52% were first-generation college students, 47% received federal Pell Grants, and 43% identified as belonging to an underrepresented minority group.Footnote 3 Our student population, inclusive of students from equity-seeking groups that are underrepresented in the academy, presents a tremendous recruitment opportunity for graduate programs that seek to build more diverse and inclusive graduate-student cohorts. Moreover, it is an opportunity for our discipline, which has made only limited gains toward its diversity goals because institutional and structural barriers impede progress (McClain 2022; Sinclair-Chapman 2015).

This article describes and reports on the structure and impact of MSIRA, the latter measured through focus group discussions (FGDs) and a panel survey involving the first cohort of students.Footnote 4 We designed selection into MSIRA as a lottery to encourage a wide range of students to apply. Our pre- and post-intervention surveys asked questions about students’ efficacy, confidence, capabilities, and intentions to pursue graduate school. Students also participated in FGDs after the course and after completing their research apprenticeship. From these data, we learned that the program increased students’ perceptions of their skills, sense of efficacy, and confidence in pursuing future careers, but it did not change their intentions to enroll in graduate programs.


MSIRA’s creation drew from and built on previous studies about undergraduate research experiences (UREs) (Carpi et al. 2017; Linn et al. 2015; Russell, Hancock, and McCullough 2007), as well as efforts in political science serving equity-seeking groups in particular (Adida et al. 2020; Becker, Graham, and Zvobgo 2021; Duncan et al. 2023; Pérez 2023; Perry, Zuhlke, and Tormos-Aponte 2023; Tormos-Aponte and Velez-Serrano 2020). UREs generally are associated with improved student learning in political science and higher rates of admission to graduate and professional schools. However, the effect of UREs on graduate and professional school placement may be limited to students who already were high performers before participating (Ishiyama and Breuning 2003). Our study expands on these findings by examining the impacts of a URE for students with a wider range of abilities because MSIRA’s lottery-based selection process reduces self-selection bias. Our multiterm course-based undergraduate research experience (CURE) is similar to what Livny (2023) described, in that it combined curriculum in research methods and an apprenticeship but involved a larger group of undergraduate students. Our CURE’s distinction from many previous efforts in political science is in its implementation at an MSI, which faces unique challenges (e.g., resource availability) and opportunities (e.g., a diverse and inclusive student body). Our research also expands on Weinschenk’s (2020, 290–91) findings that undergraduate research labs have additional benefits for students, including enhancing their recognition of the importance of collaboration in political science research, increasing their understanding of the steps in the research process, developing their data analysis skills, and improving their abilities to critically assess social science research.

THE MINORITY-SERVING INSTITUTION RESEARCH ACADEMY

MSIRA begins with a 10-week course followed by an apprenticeship of at least 25 hours working as a research assistant for a faculty member. The courseFootnote 5 provided instruction in research methods with hands-on training in data collection, coding, and collaboration (the syllabus is in online appendix A). We also trained students to read political science articles closely enough to communicate research findings to the general public. All students were required to complete training in social and behavioral research methods involving human subjects.


One author served as the instructor of record for each academic term that the course was offered; the other two authors guest-taught in those weeks for which they had significant expertise. For this first cohort, Merolla served as the instructor of record; Dionne guest-taught on scientific communication, coding qualitative data, and field research; and Brookes guest-taught on qualitative methods for comparative case studies and within-case process tracing.

Compared to the large-enrollment courses typical in political science at UCR—the average class size is 75 students for upper-division and 300 for lower-division courses—the MSIRA course was intentionally intimate (Beattie and Thiele 2016), with 24 enrolled students. The course employed active-learning strategies including reading assignment quizzes, seminar-style discussions, interactive exercises, collective thought experiments, and student critiques of their peers’ in-class presentations. Interactive exercises included an in-class Boolean logic game, a whiteboard exercise on how to write survey questions, and an in-class challenge to write a research design memo studying turnout in student government elections. We assessed students’ learning through weekly writing assignments and team lab assignments. Writing assignments covered topics such as developing a research question, what to include in a literature review, and a self-reflection as a political science major. Weekly team assignments involved topics such as summarizing journal articles, coding data, producing a research design memo, demonstrating the logic of case selection for qualitative research, drafting interview protocols, and designing survey questions. It was important to include team assignments in the course for two reasons. First, the ability to work in groups is essential for students’ future careers; however, much of the undergraduate experience does not involve opportunities to improve social, organizational, and collaborative skills (Gade and Wallace 2023). Second, grading team assignments reduces the overall number of assignments that must be graded, which is important when there are frequent assignments and no grading assistance.Footnote 6

After the conclusion of the course, MSIRA students undertook a research apprenticeship with a faculty mentor in Summer 2023. We asked faculty members at UCR and local MSIs whether they were interested in mentoring one or two students who could support the faculty members’ research projects. Participating faculty members provided information about their projects to the class, and students then ranked those in which they were interested. We matched students to projects based on their preferences. To successfully complete the fellowship, students had to provide 25 hours of research assistance to their faculty mentor, after which they were given a modest participation stipend (i.e., $500). Faculty members participating in the program also received a modest honorarium (i.e., $250). The authors’ normative commitment to compensating students for their labor, together with resource constraints, shaped the decision to set the apprenticeship at 25 hours.Footnote 7 Students’ tasks during their apprenticeships, as well as the durations, varied. One student completed all of his hours in two weeks, with responsibilities centered primarily on scientific communication. Another student needed four weeks to complete her apprenticeship, during which she read and coded peer-reviewed journal articles according to themes preselected by her faculty mentors. Other apprenticeships involved conducting and transcribing in-person interviews, accessing archival data, assisting with report writing, systematically coding news articles, and presenting a poster at a convening of local policy makers. Whereas some tasks required new training, others (e.g., journal coding) had been assigned as lab exercises during the course component of the program.

METHODS

We advertised the program widely, posting flyers around campus, sharing information about the program via social media (i.e., Twitter and Instagram) and emails to political science majors, and making announcements during class visits. Considering the potential for self-selection out of high-impact opportunities among students from underrepresented groups, and similar to the Duncan et al. (2023) study, we also sent personalized emails to some students offering extra encouragement to apply. We also held two information sessions. The application was a brief Google form with 14 questions, one of which asked applicants to upload a current résumé (recruitment materials are included in online appendix B).

Six of the 95 students who applied to be MSIRA fellows were ineligible because they were either not political science majors or not in good academic standing. From the remaining 89 applicants, we drew a lottery to select 25 students to enroll in the Spring 2023 course (Merolla, Dionne, and Brookes 2025). Randomizing selection was important for ensuring the program’s broad impact, not only its impact on high-achieving students. To be sure, we cannot entirely remove self-selection effects because students who are willing to submit an application still may be different from those who are not. However, we made efforts to reduce self-selection effects by requiring only that applicants be political science majors in good academic standing (i.e., minimum 2.0 GPA). We neither required nor accepted recommendation or cover letters. Rather, a brief Google form that required applicants to upload a résumé served as the entire application. Moreover, we designed our recruitment methods to reduce selection bias—for example, posting flyers around campus, emailing all political science majors, hosting in-person and virtual information sessions, and making classroom announcements. We emphasized during recruitment that we would use a lottery for selection and that application quality would not be considered. These features are important to keep in mind when comparing MSIRA to UREs with stricter GPA rules and more onerous application requirements or targeted recruitment. All 25 students successfully completed the course and were assigned faculty mentors for the apprenticeship phase; however, only 22 students completed their apprenticeship in Summer 2023.Footnote 8
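A lottery of this kind is straightforward to reproduce. The following is a minimal Python sketch of an eligibility screen and random draw like the one described above; the file name, column names, and seed are hypothetical, and the article does not report which tool the authors used for their draw.

```python
# Illustrative lottery draw over eligible applicants. The file name, column
# names, and random seed are hypothetical; they are not taken from the article.
import pandas as pd

applicants = pd.read_csv("msira_applicants.csv")

# Eligibility screen: political science majors in good academic standing.
eligible = applicants[
    (applicants["major"] == "Political Science") & (applicants["gpa"] >= 2.0)
]

# Randomly select 25 fellows; fixing the seed makes the draw auditable.
fellows = eligible.sample(n=25, random_state=2023)
print(fellows["applicant_id"].tolist())
```

Because the draw ignores application quality by design, any reproducible source of randomness would serve equally well; the fixed seed here simply allows the selection to be verified after the fact.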

To measure MSIRA’s impact, we analyzed panel survey data from the first cohort (survey questions are in online appendix C). We surveyed students three times: pre-fellowship (March 2023), post-course (June 2023), and post-apprenticeship (August 2023). We used Qualtrics to administer the online surveys. The median time to complete each wave was as follows: Wave 1: Pre-Fellowship, 20 minutes; Wave 2: Post-Course, 17 minutes; and Wave 3: Post-Apprenticeship, 13 minutes. The shorter time for the last two waves resulted from not repeating the background questions. All three waves included questions about future career goals, including interest in graduate school, perceptions of academic skills, and measures of self-confidence. The Pre-Fellowship Wave also included questions about educational background, gender, racial and ethnic identification, age, immigration background, and social class indicators. Because our data include a small sample of students, we report descriptive statistics for each wave, including the 21 students who completed the program and all three survey waves.

We also conducted FGDs with 23 students at the end of the course (June 2023)Footnote 9 and 20 students at the end of the apprenticeship (August–October 2023). FGDs were mixed gender, with two to eight participants per FGD, and conversations lasted less than an hour. We conducted post-course FGDs in person and post-apprenticeship FGDs via Zoom. The authors facilitated these conversations using a question guideline (see online appendix D), and an experienced notetaker provided assistance. Questions covered the extent to which students had increased or decreased interest in postgraduate education, confidence in conducting research, feelings of accomplishment, and skill development. We took handwritten or typed notes for the post-course focus groups and, when possible and with consent, we audio-recorded the post-apprenticeship FGDs. For the latter, facilitators reviewed verbatim transcripts typed by research assistants. We analyzed the transcripts using deductive coding techniques, drawing primarily on the expectations that informed our question guideline. Because these transcripts were from the first cohort, however, we also used inductive coding, wherein we derived codes based on themes that emerged from our review.

RESULTS

Table 1 presents the first MSIRA cohort’s demographic profile and, when available, includes comparable data for all individuals majoring in political science at UCR as a reference group. Women outnumbered men in the program, but men were slightly overrepresented relative to their share of political science majors. Our first cohort was diverse and resembled the broader population of political science majors at UCR; however, no MSIRA students identified as Black or of Middle Eastern descent.Footnote 10

Table 1 Demographic Profile of MSIRA’s First Cohort and Political Science Majors at UC Riverside

Notes: Respondents could select all ethnoracial identities that apply; therefore, these are separate indicators. Other categories may not total 100% because some people did not respond. This table reports data for the 22 MSIRA students who completed the course and apprenticeship. The data for political science majors are from institutional research at UCR for Spring 2023. N/A signifies that there were no available data in that category.

Skills and Academic Confidence

We asked several survey questions adapted from the Survey of Undergraduate Research Experiences (SURE), which scholars have used to assess the effectiveness of various undergraduate research programs (Lopatto 2004, 2007).Footnote 11 MSIRA was designed to be rigorous, and we expected students to gain a broad skill set. To assess perceptions of their skills, we asked students a battery of questions in which they compared themselves to the average college student in rating their confidence about their math, writing, public speaking, social, computer, creative thinking, and critical thinking skills. Response options were on a five-point scale: (1) “I’m in the bottom 10%”; (2) “I’m below average but not in the bottom 10%”; (3) “I’m about average”; (4) “I’m above average but not in the top 10%”; and (5) “I’m in the top 10%.” Class assignments required students to use most of these skills, with the exception of math; therefore, we anticipated that we would observe an increase in perceptions in the Post-Course Wave of the survey. Table 2 shows mean perception by each skill among students who completed all three survey waves and answered the skills questions.Footnote 12 To test for significant differences between waves, we use paired-sample t-tests.
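For readers who want to reproduce this kind of wave-to-wave comparison, the following is a minimal sketch using scipy. The data values are invented for illustration, not drawn from the study, and the one-tailed p-value is obtained by halving the two-tailed result when the difference runs in the predicted direction.

```python
# Illustrative paired-sample t-test comparing the same students' self-rated
# skill (1-5 scale) across two survey waves. Values are made up for illustration.
import numpy as np
from scipy import stats

pre_fellowship = np.array([3, 4, 3, 2, 4, 3, 3, 4, 2, 3])
post_course = np.array([4, 4, 3, 3, 4, 4, 3, 4, 3, 3])

t_stat, p_two_tailed = stats.ttest_rel(post_course, pre_fellowship)

# One-tailed test of an increase from the Pre-Fellowship to the Post-Course wave.
p_one_tailed = p_two_tailed / 2 if t_stat > 0 else 1 - p_two_tailed / 2
print(f"t = {t_stat:.2f}, one-tailed p = {p_one_tailed:.3f}")
```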

Table 2 Mean Perception of Skills by Survey Wave

We observed a modest increase in skill perceptions on every item from the Pre-Fellowship Wave to the Post-Course Wave, although not all of these increases were statistically meaningful. One significant improvement was in social skills, which increased by 0.33 units between the Pre-Fellowship Wave and the Post-Course Wave (p=0.065, one-tailed). This likely was due to the weekly team assignments, which are rare for political science majors at our institution. Another substantial shift was in public speaking, which increased by 0.33 units between the two waves (p=0.016, one-tailed). Due to the small class size, students had more opportunities to participate in class discussions, they were expected to present on behalf of their group, and all of them presented their final project to the class. There also was a marginal increase in perceptions of math skills: a shift of 0.25 units between waves (p=0.102, one-tailed). This was somewhat surprising in that none of the assignments required math skills, but we had reviewed how to read regression output tables in academic papers.

We did not find statistically significant improvements in perceptions of writing, creative thinking, computer, or critical thinking skills. We were surprised not to observe shifts in perceptions of writing skills because students had weekly writing assignments, far more writing than they typically do in political science courses. Because we had discussed the strengths and limitations of each methodological approach, we also expected improvement in perceptions of critical thinking skills.

By the Post-Apprenticeship Wave of the survey, these perceptions remained statistically similar to the Post-Course Wave, with the exception of math skills, which declined slightly (p=0.102, one-tailed).

In the focus groups, students generally reported increased confidence in their academic and social skills, including knowing “how to read through a research article,”Footnote 13 writing in general,Footnote 14 “approaching professors,”Footnote 15 knowing “how to draft abstracts and make summaries,”Footnote 16 knowing “what kinds of questions to ask,”Footnote 17 knowing “how to be more creative with finding information,”Footnote 18 coding data,Footnote 19 and obtaining Institutional Review Board (IRB) approval.Footnote 20 As one student summarized, “I feel like the skills that we’ve gained are also really important to just being students.”Footnote 21

In addition to measuring perceptions of specific skills, we anticipated that this high-intensity experience and training would lead to greater confidence in conducting research. We therefore included a battery of indicators from SURE surveys adapted to the political science context. Respondents were asked for their level of agreement or disagreement with 15 statements (see online appendix C). Sample statements included: “I am able to think independently and formulate my own ideas”; “I am comfortable conveying political science research to a broad audience”; and “I have a difficult time interpreting research findings.” We coded all of the statements to go in a positive direction and created an additive scale that runs from 1 to 5 (alphas by wave: 0.74 in Wave 1: Pre-Fellowship; 0.84 in Wave 2: Post-Course; and 0.64 in Wave 3: Post-Apprenticeship). Figure 1 shows that students began the program close to the neutral point of “neither agree nor disagree,” with a mean of 3.55. At the end-of-course survey, Wave 2, confidence levels significantly increased to 3.92 (p=0.001, one-tailed), close to the “agree” category. There was no meaningful movement after the 25-hour apprenticeship in Wave 3, with a mean of 3.92.
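As a concrete illustration of how an additive scale of this kind and its reliability can be computed, the sketch below reverse-codes a negatively worded item, averages items onto the 1-to-5 range, and calculates Cronbach’s alpha. The response matrix is invented; the actual battery has 15 items (see online appendix C), and the same approach applies to the self-confidence subscales discussed below.

```python
# Illustrative construction of an additive confidence scale with Cronbach's
# alpha. The response matrix is invented; the real battery has 15 items.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents-by-items matrix of already-recoded responses."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Raw responses on a 1-5 agreement scale: 6 respondents x 4 items.
responses = np.array([
    [4, 2, 4, 5],
    [3, 3, 3, 4],
    [5, 1, 4, 5],
    [2, 4, 2, 3],
    [4, 2, 5, 4],
    [3, 3, 3, 3],
])

# Suppose item 2 is negatively worded ("I have a difficult time ..."); flip it
# so that higher values always indicate more confidence.
responses[:, 1] = 6 - responses[:, 1]

scale = responses.mean(axis=1)  # additive scale, kept on the 1-5 range
alpha = cronbach_alpha(responses)
print(f"alpha = {alpha:.2f}")
print("scale scores:", np.round(scale, 2))
```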

Figure 1 Confidence in Conducting Research by Wave

Finally, we assessed whether the program increased academic confidence and general self-confidence using two subscales of the Personal Evaluation Inventory (Shrauger and Schohn 1995). Respondents were asked for their level of agreement with seven statements on a four-point scale. Some of the statements to assess academic self-confidence included: “Academic performance is an area in which I can show my competence and be recognized for my achievement”; and “I have recognized that I am not as good a student as most of the people with whom I am competing.” This subscale included seven statements (see online appendix C), and we coded all of the statements to go in a positive direction. We also created an additive scale that ranged from 1 to 4; alphas for each wave were as follows: 0.80 (Wave 1: Pre-Fellowship); 0.80 (Wave 2: Post-Course); and 0.82 (Wave 3: Post-Apprenticeship). The general self-confidence scale also had seven statements and included questions such as: “I often feel unsure of myself even in situations I have successfully dealt with in the past”; and “I have more confidence in myself than most people I know.” We also combined these statements into an additive scale that ranged from 1 to 4; alphas for each wave were as follows: 0.81 (Wave 1: Pre-Fellowship); 0.80 (Wave 2: Post-Course); and 0.87 (Wave 3: Post-Apprenticeship). Figure 2 displays mean scores on academic (N=20) and general self-confidence (N=21) by survey wave. Students showed higher academic than general self-confidence in each wave, although we observed modest increases in each type of confidence over time. Whereas the increases in academic and general self-confidence between the Pre-Fellowship Wave and the Post-Course Wave were not statistically significant, the increases between the Post-Course Wave and the Post-Apprenticeship Wave (p=0.0003, one-tailed; p=0.102, one-tailed, respectively) and the Pre-Fellowship Wave and Post-Apprenticeship Wave (p=0.001, one-tailed; p=0.02, one-tailed, respectively) were.

Figure 2 Academic and General Self-Confidence by Survey Wave

Increased confidence also emerged as a key theme in the focus groups. For instance, one student reported feeling “more confident in making mistakes because I feel that I have the knowledge that will get me through, and I feel like making a mistake isn’t as daunting anymore because there’s not necessarily too much consequence to it, except knowing to correct it and not make that mistake in the future.”Footnote 22 Another student felt “a lot better at communicating to other professors now, just because I’ve had this experience talking to multiple professors about stuff related to research and things like that. So, I think I have more of a sense of how to approach professors.”Footnote 23

FGDs also revealed an increase in students’ confidence in research. One student credited the course: “The class has set me up to be able to do my own research. It has helped with being able to read research papers. I know how to copy the format of writing a research paper based on the research papers that we read. That is very helpful.”Footnote 24 Another noted, “At the beginning of the class, I was uncertain of my abilities. It took me a while, but the group [assignments] really did help.”Footnote 25 Other students emphasized the importance of the apprenticeship for their confidence: “I think the apprenticeship was a really great opportunity. And, like I said, a huge confidence booster….I had a lot of fun telling people, ‘Oh, I’m a research assistant. These are my tasks. This is what I’m doing.’ So, I had a lot of fun. And I did have a bit of a confidence boost. Like I can do a hard thing in my mind.”Footnote 26

Firsthand research experience also could address imposter syndrome. One student explained, “When I first applied to MSIRA, I kind of just did it on a whim, and then I got accepted. And I was like, oh, my gosh, I’m nervous, I have to do research now. And I contemplated even stepping away, because I was like, I just can’t do it coming from a family who’s not really into education or even have research experiences. It felt very daunting. But I just think [Professor Merolla], Professor Brookes, and [Professor] Dionne made it so easy, and it was a good introduction.”Footnote 27 A different student expressed similar sentiments: “At first, I was not confident in what I was doing, essentially. But now I feel…it was a very, very good introduction to how research goes and how it works….So it definitely left me more confident with wanting to dive more into research.”Footnote 28 Another student remarked that “seeing female leaders in political science has helped me a lot just because I feel less of an imposter syndrome in my major….I think it honestly helped me.”Footnote 29

Career Planning

In each wave, we also asked students about any plans to continue their education after college. Almost all MSIRA students said that they intended to continue their education, with only one student selecting “don’t know” in the Pre-Fellowship Wave, three in the Post-Course Wave, and two in the Post-Apprenticeship Wave. Whereas most students planned to continue their education, only a small number intended to pursue an MA or PhD in political science or a related field: six (Wave 1: Pre-Fellowship); four (Wave 2: Post-Course); and five (Wave 3: Post-Apprenticeship). The most common response about what students planned to do immediately after graduation was to go to law school: 13 (Wave 1: Pre-Fellowship); 14 (Wave 2: Post-Course); and 11 (Wave 3: Post-Apprenticeship). To summarize, we did not observe much change in future career intentions over the course of the program.

We did observe a meaningful impact of the program on students’ confidence about pursuing their career goals. In each wave, we asked students: “How confident do you feel about the next steps to take toward your future job or career?” Students responded on a seven-point scale, with higher values indicating greater confidence. Figure 3 shows mean confidence in each survey wave (N=21), which was close to the midpoint (4.62) before students took the course, increased to an average of 5 after the course (p=0.029, one-tailed), and further increased to 5.19 after the apprenticeship.Footnote 30

Figure 3 Average Confidence in Next Steps for Future Career, by Wave

In the focus groups, students elaborated on how MSIRA made them feel more confident in their career goals even though most expressed little interest in pursuing an advanced political science degree. “I was dead set on going to law school, and still am,” one student said. “But I…think these skills would be applicable. And I’m glad that the research allowed me to test out everything before heading off.”Footnote 31 Another remarked, “The skills that you guys taught us, they’re mainly related to research, but I feel like you could also implement them in other careers.”Footnote 32 Many students’ self-reported experiences can be summarized by a comment made in another focus group: “I took this course as a way to see if maybe research in this field was for me because I wanted to decide what I want to do after undergrad, and so by taking this course by learning those kind of research methods, I was better able to choose my path instead of just going blindly in one kind of direction without even looking in the research direction.”Footnote 33

DISCUSSION AND CONCLUSION

We created a research academy to increase access to research training and experiences for students at our MSI. By examining panel surveys and focus group data, we found that our two-term CURE increased students’ reported confidence in academic abilities, in conducting research, and in taking the next steps toward their future careers. Political science majors at UCR have multiple high-impact learning opportunities available to them; however, before the creation of MSIRA, they did not have access to UREs on campus. Our primary goal in creating MSIRA was to serve the discipline by increasing informed interest in and preparedness for graduate school—especially doctoral programs in political science—among students from communities that are underrepresented in the academy. However, data from the initial cohort studied herein do not show that MSIRA had an impact on students’ intentions to enroll in graduate programs. This is not unusual for opt-in pre-PhD programs because many students who participate already intend to pursue graduate study before opting in (Brutger 2024). Nevertheless, the research skills that students build in MSIRA are broadly useful—whether for those on the path to a doctoral degree or pursuing other graduate programs and career trajectories.


When designing MSIRA, we modeled the apprenticeship program on Time-sharing Experiments in the Social Sciences (TESS), which is free for investigators. Our program enabled researchers to concentrate more on the production of ideas and less on acquiring funding for the significant costs of hiring (and often training) qualified research assistants and paying them fair wages to perform tasks that advance researchers’ scholarly agendas. We designed the MSIRA apprenticeship specifically in response to the NSF’s Build and Broaden initiative because these trained research assistants can significantly enhance research support for faculty at under-resourced MSIs, and the collaborative nature of MSIRA—in terms of both students being eligible to apply and faculty members teaching and seeking research assistants from MSIRA—builds and strengthens relationships among scholars across MSIs.

Our experience with the first MSIRA cohort provided valuable lessons that led to adjustments in subsequent cohorts. For example, the 25-hour apprenticeship was simply too short for many students to gain or develop significant skills; likewise, offering apprenticeships during the summer reduced in-person interactions with faculty. We have shifted to a more intensive commitment in which students complete (for course credit) a 10-week apprenticeship during the regular school year. In the revised apprenticeship, students are expected to complete 10 to 12 hours of work per week and have in-person meetings with their respective faculty mentors at least every other week. We also shifted to offering the MSIRA course only during Fall and Winter terms such that students’ apprenticeships follow in Winter and Spring terms, respectively. We continue to collect data for the subsequent cohorts that are experiencing these programmatic changes, and it is too early to know whether the changes will have an impact and, if so, of what type.

Institutional support of CUREs is critical (Duncan et al. 2023; Zvobgo et al. 2023), as we learned when we faced multiple administrative and bureaucratic challenges in launching MSIRA at an institution that experiences severe underfunding and understaffing. Although MSIRA expanded research training for dozens of students who otherwise would not have such an opportunity, significant institutional support is essential for a research academy’s sustainability. The model at the University of Puerto Rico at Cayey, which involves a more holistic approach to building a culture of undergraduate research through faculty support, is one vision of UREs at MSIs that addresses institutional support at its core (Godreau et al. 2015; see also Zvobgo et al. 2023). Partnership between MSIs and better-resourced institutions is another viable approach (Adida et al. 2020; Perry, Zuhlke, and Tormos-Aponte 2023). However, this approach reinforces an “export model” of sending students from MSIs to train with faculty at research-intensive institutions, limiting the value added for the MSIs and the faculty who teach at them (Godreau et al. 2015).

SUPPLEMENTARY MATERIAL

To view supplementary material for this article, please visit http://doi.org/10.1017/S1049096525101480.

ACKNOWLEDGMENTS

For research assistance on this project, we thank Gabriela Arroyo-Gomez, Jhonjairo Fernandez-Dearcia, Zabdi Velásquez, John Burnett, Ding Wang, and especially Minhye Joo. Of course, this article would not have been possible without the first cohort of MSIRA fellows, who were patient with the challenges of an upstart research academy and valued research so much that they agreed to participate in this study. UCR’s political science academic advisors—Bryan Barker, Donna Perez, and Kristy Salazar—have been crucial to our success in recruiting students to MSIRA. We also are grateful to our colleagues who served as faculty mentors for this first cohort of MSIRA fellows: Ben Bishin, Ivy Cargile, Miguel Carreras, Paul D’Anieri, Kevin Esterling, Yasemin Irrepoglu-Carreras, Bronwyn Leebaw, Steven Liao, Michael Tesler, and Nicholas Weller. Finally, we thank the thoughtful anonymous reviewers and PS editors for their helpful and rigorous feedback that improved this article.

DATA AVAILABILITY STATEMENT

Research documentation and data that support the findings of this study are openly available at the PS: Political Science & Politics Harvard Dataverse at https://doi.org/10.7910/DVN/V52AOM.

CONFLICTS OF INTEREST

The authors declare that there are no ethical issues or conflicts of interest in this research.

Footnotes

1. The programming and research described in this article were made possible by a grant from the National Science Foundation (Grant No. 2222122) and support from the Cherniss Fund at UC Riverside. The views expressed are those of the authors and do not necessarily reflect those of the NSF.

2. UCR has been designated a Hispanic-Serving Institution (HSI) and an Asian American and Native American Pacific Islander-Serving Institution (AANAPISI). These designations are made by the US Department of Education under the Higher Education Act for institutions whose full-time equivalent undergraduate enrollments are at least 25% Hispanic and at least 10% Asian American and Native American Pacific Islander, respectively.

3. Data are from UCR Academic Data Dashboards, Enrollment by College and Program.

4. This study was determined to be exempt by UCR’s IRB (Protocol No. 22119).

5. UCR uses a 10-week quarter system; students earned four units for completing this course.

6. Increasing the relative proportion of team versus individual assignments can further reduce grading burden and support potential expansion of MSIRA, thereby increasing the number of students that the academy serves each term. However, increasing class size would counteract other goals for MSIRA, particularly keeping class size sufficiently small to increase the likelihood that students at our under-resourced MSI will have more academic interactions with professors and peers (Beattie and Thiele 2016).

7. During the apprenticeships in Summer 2023, the California state minimum wage was $15.50 per hour and the authors’ other undergraduate research assistants typically earned $18 per hour.

8. One student did not finish the apprenticeship and two students received extensions and completed the apprenticeship later than their cohort.

9. Two students missed the post-course focus groups.

10. Additional survey results showed that MSIRA participants were more likely than the broader population of political science majors (N=88) to report plans for further education (96% versus 72.7%). However, the proportion of those interested in research careers was only slightly higher: 28% versus 23%.

11. Many questions were designed to assess whether research experiences lead to greater interest in science, independent thinking, and active learning (see Seymour et al. 2004 for an overview).

12. Our N typically fluctuates between 20 and 21 depending on whether a student skipped a question on the survey.

13. Leah, post-apprenticeship FGD, 9/27/23. All names are pseudonyms.

14. Anaya, post-course FGD, 6/16/23.

15. Anaya, post-apprenticeship FGD, 9/27/23.

16. Lucia, post-apprenticeship FGD, 10/4/23.

17. Grace, post-course FGD, 6/16/23.

18. Isha, post-apprenticeship FGD, 8/16/23.

19. Jinwoo, post-apprenticeship FGD, 9/27/23.

20. Emilio, post-course FGD, 6/16/23.

21. Adriana, post-apprenticeship FGD, 9/26/23.

22. Leah, post-apprenticeship FGD, 9/27/23.

23. Jinwoo, post-apprenticeship FGD, 9/27/23.

24. Daniel, post-course FGD, 6/16/23.

25. Camila, post-course FGD, 6/16/23.

26. Kelsey, post-apprenticeship FGD, 9/26/23.

27. Kelsey, post-apprenticeship FGD, 9/26/23.

28. Lucia, post-apprenticeship FGD, 10/4/23.

29. Nancy, post-apprenticeship FGD, 9/26/23.

30. The difference between Wave 2 and Wave 3 was not statistically meaningful (p=0.214, one-tailed), but the difference between Wave 1 and Wave 3 was statistically significant (p=0.055, one-tailed).

31. Liam, post-apprenticeship FGD, 8/23/23.

32. Adriana, post-apprenticeship FGD, 9/26/23.

33. Anaya, post-apprenticeship FGD, 9/27/23.

REFERENCES

Adida, Claire L., Lake, David A., Shafiei, Fatemeh, and Platt, Matthew. 2020. “Broadening the PhD Pipeline: A Summer Research Program for HBCU Students.” PS: Political Science & Politics 53 (4): 723–28.
Beattie, Irenee, and Thiele, Megan. 2016. “Connecting in Class? College Class Size and Inequality in Academic Social Capital.” Journal of Higher Education 87 (3): 332–62.
Becker, Megan, Graham, Benjamin A. T., and Zvobgo, Kelebogile. 2021. “The Stewardship Model: An Inclusive Approach to Undergraduate Research.” PS: Political Science & Politics 54 (1): 158–62.
Brutger, Ryan. 2024. “The PhD Pipeline Initiative Works: Evidence from a Randomized Intervention to Help Underrepresented Students Prepare for PhDs in Political Science.” Journal of Politics 86 (1): 383–87. https://doi.org/10.1086/726954.
Carpi, Anthony, Ronan, Darcy M., Falconer, Heather M., and Lents, Nathan H. 2017. “Cultivating Minority Scientists: Undergraduate Research Increases Self-Efficacy and Career Ambitions for Underrepresented Students in STEM.” Journal of Research in Science Teaching 54 (2): 169–94. https://doi.org/10.1002/tea.21341.
Duncan, Natasha T., Balcazar, Pablo, Gonzalez, Daniella, Tamene, Meron, and Clawson, Rosalee A. 2023. “Creating, Implementing, and Experiencing Research Opportunities: A Focus on Diversity, Equity, and Inclusion.” PS: Political Science & Politics 56 (4): 499–505.
Gade, Emily K., and Wallace, Geoffrey P. R. 2023. “Productive Learning Through Labs: Data Laboratories and Their Value in Undergraduate Education and Scholarly Research.” PS: Political Science & Politics 56 (4): 481–86.
Godreau, Isar, Gavillán-Suárez, Jannette, Franco-Ortiz, Mariluz, Calderón-Squiabro, José M., Marti, Vionex, and Gaspar-Concepción, Jessica. 2015. “Growing Faculty Research for Students’ Success: Best Practices of a Research Institute at a Minority-Serving Undergraduate Institution.” Journal of Research Administration 46 (2): 55–78.
Ishiyama, John, and Breuning, Marijke. 2003. “Does Participation in Undergraduate Research Affect Political Science Students?” Politics & Policy 31 (1): 163–80. https://doi.org/10.1111/j.1747-1346.2003.tb00892.x.
Linn, Marcia C., Palmer, Erin, Baranger, Anne, Gerard, Elizabeth, and Stone, Elisa. 2015. “Undergraduate Research Experiences: Impacts and Opportunities.” Science 347 (6222): 1261757. https://doi.org/10.1126/science.1261757.
Livny, Avital. 2023. “A Student-Centered, Expanded Approach to the Undergraduate Research Experience.” PS: Political Science & Politics 56 (4): 463–68.
Lopatto, David. 2004. “Survey of Undergraduate Research Experiences (SURE): First Findings.” Cell Biology Education 3 (4): 270–77. https://doi.org/10.1187/cbe.04-07-0045.
Lopatto, David. 2007. “Undergraduate Research Experiences Support Science Career Decisions and Active Learning.” CBE Life Sciences Education 6:297–306. https://doi.org/10.1187/cbe.07-06-0039.
McClain, Paula. 2022. “APSA Presidential Task Force on Systemic Inequalities in the Discipline.” Washington, DC: American Political Science Association. www.apsanet.org/Portals/54/diversity%20and%20inclusion%20prgms/APSA%20Presidential%20Task%20Force%20Executive%20Summary%202021.pdf?ver=4us4UyUwzja7wyL7oM7NpQ%3d%3d&timestamp=1649340310263.
Merolla, Jennifer, Dionne, Kim Yi, and Brookes, Marissa. 2025. “Replication Data for ‘Course-Based Research and Mentorship: Results from a Multiterm Research Academy at a Minority-Serving Institution.’” PS: Political Science & Politics. https://doi.org/10.7910/DVN/V52AOM.
Pérez, Efrén. 2023. “Scouting and Growing Diverse Undergraduate Talent: UCLA’s Race, Ethnicity, Politics and Society Lab.” PS: Political Science & Politics 56 (4): 519–24.
Perry, Brittany N., Zuhlke, Samantha, and Tormos-Aponte, Fernando. 2023. “Building Infrastructure to Enhance Diversity in Political Methodology.” PS: Political Science & Politics 56 (1): 178–82.
Russell, Susan H., Hancock, Mary P., and McCullough, James. 2007. “Benefits of Undergraduate Research Experiences.” Science 316 (5824): 548–49. https://doi.org/10.1126/science.1140384.
Seymour, Elaine, Hunter, Anne-Barrie, Laursen, Sandra L., and DeAntoni, Tracee. 2004. “Establishing the Benefits of Research Experiences for Undergraduates in the Sciences: First Findings from a Three-Year Study.” Science Education 88 (4): 493–534. https://doi.org/10.1002/sce.10131.
Shrauger, J. Sidney, and Schohn, Mary. 1995. “Self-Confidence in College Students: Conceptualization, Measurement, and Behavioral Implications.” Assessment 2 (3): 255–78. https://doi.org/10.1177/1073191195002003006.
Sinclair-Chapman, Valeria. 2015. “Leveraging Diversity in Political Science for Institutional and Disciplinary Change.” PS: Political Science & Politics 48 (3): 454–58.
Tormos-Aponte, Fernando, and Velez-Serrano, Mayra. 2020. “Broadening the Pathway for Graduate Studies in Political Science.” PS: Political Science & Politics 53 (1): 145–46.
Weinschenk, Aaron C. 2020. “Creating and Implementing an Undergraduate Research Lab in Political Science.” Journal of Political Science Education 17 (sup1): 284–96. https://doi.org/10.1080/15512169.2020.1795873.
Zvobgo, Kelebogile, Pickering, Paula M., Settle, Jaime E., and Tierney, Michael J. 2023. “Creating New Knowledge with Undergraduate Students: Institutional Incentives and Faculty Agency.” PS: Political Science & Politics 56 (4): 512–18.