Christopher T. Belser, M. Ann Shillingford, Andrew P. Daire, Diandra J. Prescod, Melissa A. Dagley
The United States is facing a crisis with respect to filling job vacancies within science, technology, engineering, and math (STEM) industries and with students completing STEM undergraduate degrees. In addition, disparities exist for females and ethnic minorities within STEM fields. Whereas prior research has centered on disparities in STEM fields, retention rates, and some intervention programs, researchers have given little attention to the role of career development initiatives within STEM recruitment and retention programming. The purpose of the present study was to incorporate demographic variables, math performance, and career development–related factors into predictive models of STEM retention with a sample of undergraduate students in a STEM recruitment and retention program. The two resulting models accurately predicted first-year to second-year retention for 73.4% of cases and first-year to third-year retention for 70.0% of cases. Based on the results, the researchers provide a rationale for STEM career programming in K–12 and higher education settings and for the inclusion of career development and career counseling in STEM education programming.
Keywords: STEM, retention, career development, career counseling, undergraduate student
The United States lacks an adequate supply of trained workers to meet demand in science, technology, engineering, and mathematics (STEM) fields (National Center for Science and Engineering Statistics [NCSES], 2017; National Science Board, 2018; Sithole et al., 2017). Researchers have pointed to the overall stagnancy in the number of undergraduate students declaring and completing STEM degrees (Carnevale, Smith, & Melton, 2011; Doerschuk et al., 2016; Sithole et al., 2017). Additionally, racial and ethnic minorities and females remain underrepresented in STEM fields (NCSES, 2017). Because of these disparities, universities have developed programs centered on recruitment and retention of STEM undergraduates (Bouwma-Gearhart, Perry, & Presley, 2014; Dagley et al., 2016; Schneider, Bickel, & Morrison-Shetlar, 2015), and both government and private entities invest billions of dollars annually in STEM initiatives at the K–12 and higher education levels (Carnevale et al., 2011). However, many of these endeavors have failed to incorporate components centered on career development or career planning.
The National Career Development Association (2015) defined career development as “the sequence of career-related choices and transitions made over the life span” (p. 4) and career planning as a structured process through which a person makes decisions and plans for a future career. Career development activities, such as structured career planning courses, have shown efficacy with general undergraduate populations (Osborn, Howard, & Leierer, 2007; Reardon, Melvin, McClain, Peterson, & Bowman, 2015) but have been studied far less often with STEM-specific undergraduate populations (Belser, Prescod, Daire, Dagley, & Young, 2017, 2018; Prescod, Daire, Young, Dagley, & Georgiopoulos, in press). In the present study, researchers examined a STEM recruitment and retention program that included a career planning course. More specifically, the research team investigated the relationships among demographics (e.g., gender, ethnicity), math scores, aspects of the undergraduate STEM program, and student retention during the first 2 years of college.
Gender, Ethnicity, and STEM
Gender disparities are well documented within STEM degree programs and the larger STEM workforce (NCSES, 2017). Females who are interested in math and science are more likely to be tracked into non-diagnosing health practitioner fields, such as nursing (ACT, 2018; NCSES, 2017). Some researchers have pointed to the K–12 arena as the root of these gender disparities that permeate undergraduate programs and STEM professions (Mansfield, Welton, & Grogan, 2014), whereas others have identified specific problems, such as differences in math and science course completion over time (Chen & Soldner, 2013; Riegle-Crumb, King, Grodsky, & Muller, 2012), stereotype threat (Beasley & Fischer, 2012), and STEM confidence (Litzler, Samuelson, & Lorah, 2014). As a result, existing predictive models typically indicate a lower likelihood of females completing a STEM degree compared to male students (Cundiff, Vescio, Loken, & Lo, 2013; Gayles & Ampaw, 2014).
Similarly, disparities in STEM degree completion and STEM job attainment exist between ethnic groups (NCSES, 2017; Palmer, Maramba, & Dancy, 2011). Although progress has been made in degree attainment in certain STEM areas, other areas have stagnated or are declining in participation by ethnic minority students (Chen & Soldner, 2013; NCSES, 2017). Foltz, Gannon, and Kirschmann (2014) identified protective factors for minority students in STEM, such as receiving college-going expectations from home, establishing connections with STEM faculty members (particularly those of color), and developing connections with other minority students in STEM majors; however, the disparities in STEM programs help perpetuate a cycle in which many students are never exposed to these protective factors. The intersection of ethnicity and gender in STEM fields has also produced noteworthy findings (Riegle-Crumb & King, 2010). In addition to observing disparities across ethnic groups, researchers have observed disparities within ethnic groups based on gender (Beasley & Fischer, 2012; Cundiff et al., 2013; Riegle-Crumb & King, 2010). Specifically with males of color, predictive models have been inconclusive, with some showing a higher likelihood of completing a STEM degree (Riegle-Crumb & King, 2010) and others showing a lower likelihood (Cundiff et al., 2013; Gayles & Ampaw, 2014).
Mathematics and STEM
The SAT is one of the most widely used college admissions tests (CollegeBoard, 2018). Researchers have correlated the SAT math subscore with performance in first-year undergraduate math and science courses, finding that higher SAT math scores are associated with a higher probability of earning strong grades in those courses (Wyatt, Remigio, & Camara, 2012). Additionally, researchers have identified SAT scores as predictors of academic success and university retention (Crisp, Nora, & Taggart, 2009; Le, Robbins, & Westrick, 2014; Mattern & Patterson, 2013; Rohr, 2012). Despite its wide use in higher education admissions, the SAT may not be free from bias; numerous scholars have highlighted potential test bias, particularly against ethnic minorities (Dixon-Román, Everson, & McArdle, 2013; Lawlor, Richman, & Richman, 1997; Toldson & McGee, 2014). Nevertheless, its widespread use makes it a practical instrument for research.
In addition to SAT scores, researchers have demonstrated that taking higher-level math courses and having higher math self-efficacy translate to better outcomes within STEM majors (Carnevale et al., 2011; Chen & Soldner, 2013; Nosek & Smyth, 2011). Specifically, taking calculus-based courses in high school correlated with retention in STEM majors (Chen & Soldner, 2013). Nosek and Smyth (2011) found connections between gender and internalized math variables, such as warmth toward math, identification with math, and math self-efficacy; females across the life span showed lower levels of each of these variables, but the authors did not test these variables against retention outcomes in STEM majors. However, one could hypothesize that lower warmth toward math and weaker identification with math would influence career decisions, particularly those related to math and science fields.
Career Interventions and STEM
Career theory can provide a framework for understanding one’s interest in STEM fields (Holland, 1973), one’s exposure to STEM fields (Gottfredson, 1981), and one’s beliefs or expectations about the process of choosing a STEM field (Lent, Brown, & Hackett, 2002; Peterson, Sampson, Lenz, & Reardon, 2002). However, career interventions, such as a career planning class, are more likely to make a direct impact on career outcomes with undergraduates. In one review of research on undergraduate career planning courses, more than 90% of the courses produced some measurable positive result for students, such as increased likelihood of completing a major, decreased negative career thinking, and increased career self-efficacy (Reardon & Fiore, 2014). Other researchers have reported similar results with general undergraduate career planning courses (Osborn et al., 2007; Saunders, Peterson, Sampson, & Reardon, 2000).
Researchers have studied structured career planning courses specific to STEM majors far less frequently. In one such study, Prescod and colleagues (in press) found that students who took a STEM-focused career planning course scored lower on a measure of negative career thinking at the end of the semester. In a similar study, STEM-interested students in a STEM-focused career planning course had lower posttest scores on a measure of negative career thinking than declared STEM majors at the end of the same semester (Belser et al., 2018). Additionally, in a pilot study, Belser and colleagues (2017) found that greater reductions in negative career thinking predicted higher odds of being retained in a STEM major from the first to second year of college; in the same study, students who participated in a STEM-focused career planning course were more likely to be retained in a STEM major than students in an alternative STEM course. However, researchers have not yet given ample attention to how career planning and other career variables fit into predictive models of retention in STEM majors.
Statement of the Problem and Hypotheses
As previously noted, prior researchers have paid limited attention to developing predictive models that incorporate career development variables along with demographics and math performance. Developing effective predictive models has implications for researchers, career practitioners, higher education professionals, and the STEM workforce. To this end, the researchers tested two such models of retention in STEM majors using the following hypotheses:
Hypothesis 1: First-year to second-year undergraduate retention in STEM majors can be predicted by ethnicity, gender, initial major, math placement–algebra scores, SAT math scores, STEM course participation, and Career Thoughts Inventory (CTI) change scores.
Hypothesis 2: First-year to third-year undergraduate retention in STEM majors can be predicted by ethnicity, gender, initial major, math placement–algebra scores, SAT math scores, STEM course participation, and CTI change scores.
Methods
In this study, researchers examined multi-year retention data for students in a STEM recruitment and retention program at a large research university in the Southeastern United States and utilized a quasi-experimental design with non-equivalent comparison groups (Campbell & Stanley, 1963; Gall, Gall, & Borg, 2007). Because this study was part of a larger research project, Institutional Review Board approval was already in place.
The COMPASS Program
The COMPASS Program (Convincing Outstanding Math-Potential Admits to Succeed in STEM; Dagley et al., 2016) is a National Science Foundation–funded project that seeks to recruit and retain undergraduate students in STEM majors. To enter the program, students must have a minimum SAT math score of 550, an undeclared major at the time of applying to the university and program, and an expressed interest in potentially pursuing a STEM degree. However, some students accepted to the COMPASS Program declare a STEM major between the time they are accepted and the first day of class, creating a second track of students who, although uncommitted at the time of application, begin college with a declared STEM major. Students in both tracks have access to math and science tutoring in a program-specific center on campus, are matched with undergraduate mentors from STEM majors, have access to cohort math classes for students within the program, and can choose to live in a residence hall area designated for COMPASS participants. Depending on which COMPASS track students are in, they take either a STEM-focused career planning course or a STEM seminar course during their first semester.
COMPASS participants who started college without a declared major take a STEM-focused career planning class in their first semester. The activities of this course include a battery of career assessments and opportunities to hear career presentations from STEM professionals, visit STEM research labs, and attend structured career planning activities (e.g., developing a career action plan, résumé and cover letter writing, small group discussions). The first author and fourth author served as instructors for this course, and both were counselor education doctoral students at the time.
Participants who had declared a STEM major between the time they were accepted into the COMPASS Program and the first day of class took a STEM seminar course instead of the career planning class. The structure of this course included activities designed to help students engage with and be successful in their selected STEM majors, including presentations on learning styles and strategies, time management, study skills, professional experiences appropriate for STEM majors, and strategies for engaging in undergraduate research. Guest speakers for the class focused more on providing students with information about how to be successful as a STEM student. The course did not include career planning or career decision-making activities specifically geared toward helping students decide on a major or career field. A science education doctoral student served as the instructor of record for the course, with graduate students from various STEM fields serving as teaching assistants.
Participants
The university’s Institutional Knowledge Management Office provided demographic data on program participants. Table 1 displays descriptive data for participants, organized by second-year retention data (i.e., retention from the first year of college to the second year of college, for Hypothesis 1) and third-year retention data (i.e., retention from the first year of college to the third year of college, for Hypothesis 2). The frequencies for the subcategories were smaller for the third-year retention data (Hypothesis 2) because fewer participants had matriculated this far during the life of the project. Table 1 also breaks down each subset of the data based on which students were retained in a STEM major and which were not retained.
Table 1
Descriptive Statistics for Categorical Variables
| Variable | 2nd-Yr Retained n (%a) | 2nd-Yr Not Retained n (%b) | 2nd-Yr Total n (%c) | 3rd-Yr Retained n (%a) | 3rd-Yr Not Retained n (%b) | 3rd-Yr Total n (%c) |
|---|---|---|---|---|---|---|
| Gender | | | | | | |
| Male | 159 (58.9) | 74 (46.5) | 233 (54.3) | 72 (55.8) | 65 (44.8) | 137 (50.0) |
| Female | 111 (41.1) | 85 (53.5) | 196 (45.7) | 57 (44.2) | 80 (55.2) | 137 (50.0) |
| Total | 270 (100.0) | 159 (100.0) | 429 (100.0) | 129 (100.0) | 145 (100.0) | 274 (100.0) |
| Ethnicity | | | | | | |
| Caucasian/White | 147 (54.4) | 100 (62.9) | 247 (57.6) | 66 (51.2) | 85 (58.6) | 151 (55.1) |
| African Am./Black | 31 (11.5) | 16 (10.1) | 47 (11.0) | 16 (12.4) | 18 (12.4) | 34 (12.4) |
| Hispanic | 57 (21.1) | 34 (21.4) | 91 (21.2) | 29 (22.5) | 32 (22.1) | 61 (22.3) |
| Asian/Pacific Islander | 24 (8.9) | 4 (2.5) | 28 (6.5) | 10 (7.8) | 5 (3.4) | 15 (5.5) |
| Other | 11 (4.1) | 5 (3.1) | 16 (3.7) | 8 (6.2) | 5 (3.4) | 13 (4.7) |
| Total | 270 (100.0) | 159 (100.0) | 429 (100.0) | 129 (100.0) | 145 (100.0) | 274 (100.0) |
| Course | | | | | | |
| Career Planning | 137 (50.7) | 120 (75.5) | 257 (59.9) | 76 (58.9) | 112 (77.2) | 188 (68.6) |
| STEM Seminar | 133 (49.3) | 39 (24.5) | 172 (40.1) | 53 (41.1) | 33 (22.8) | 86 (31.4) |
| Total | 270 (100.0) | 159 (100.0) | 429 (100.0) | 129 (100.0) | 145 (100.0) | 274 (100.0) |
| Initial Major | | | | | | |
| Undeclared | 130 (48.1) | 72 (45.3) | 202 (47.1) | 65 (50.4) | 63 (43.4) | 128 (46.7) |
| STEM | 124 (45.9) | 40 (25.2) | 164 (38.2) | 55 (42.6) | 39 (26.9) | 94 (34.3) |
| Non-STEM | 16 (5.9) | 47 (29.6) | 63 (14.7) | 9 (7.0) | 43 (29.7) | 52 (19.0) |
| Total | 270 (100.0) | 159 (100.0) | 429 (100.0) | 129 (100.0) | 145 (100.0) | 274 (100.0) |
Note. a = percentage of the Retained group. b = percentage of the Not Retained group. c = percentage of the Total group.
Gender representation within the two samples was split relatively evenly, with female participants represented at a higher rate in the sample than in the larger population of STEM undergraduates and at a higher rate than STEM professionals in the workforce. Both samples were predominantly Caucasian/White, with no other ethnic group individually making up more than one-fourth of either sample; these ethnicity breakdowns were reflective of the university’s undergraduate population and somewhat reflective of STEM disciplines. The students who took the STEM-focused career planning course accounted for a larger percentage of both total samples and of the not-retained groups. Regarding initial major, the largest percentage of students fell within the initially undeclared category, with the next largest group being the initially STEM-declared group (these students officially declared a STEM major but remained uncommitted to that decision).
The researchers conducted an a priori power analysis using G*Power 3 (Cohen, 1992; Faul, Erdfelder, Lang, & Buchner, 2007), and the overall samples of 429 and 274 were sufficient for the binary logistic regression. With logistic regression, the ratio of cases in each of the dependent outcomes (retained or not retained) to the number of independent variable predictors must be sufficient (Agresti, 2013; Hosmer, Lemeshow, & Sturdivant, 2013; Tabachnick & Fidell, 2013). Following Peduzzi, Concato, Kemper, Holford, and Feinstein’s (1996) rule of 10 cases per outcome per predictor, the samples were sufficient for all independent variables except ethnicity, which had multiple categories with fewer than 10 cases. However, Field (2009) and Vittinghoff and McCulloch (2006) recommended a minimum of five cases per outcome per predictor, which the sample achieved for all independent variables.
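To make this case-to-predictor check concrete, the short sketch below applies both heuristics to illustrative counts; the figure of 14 predictor terms assumes dummy coding of the categorical variables and, like the outcome-group size shown, is an illustration rather than the exact input to the power analysis.

```python
# Minimal sketch of the sample-size heuristics described above: Peduzzi et
# al.'s (1996) rule of 10 cases per outcome per predictor term and the relaxed
# minimum of 5 from Field (2009) and Vittinghoff and McCulloch (2006).
# The counts below are illustrative assumptions, not the study's exact inputs.

def events_per_variable(smaller_outcome_n: int, n_predictor_terms: int) -> float:
    """Cases in the rarer outcome group divided by the number of predictor terms."""
    return smaller_outcome_n / n_predictor_terms

smaller_outcome_n = 159   # illustrative size of the rarer outcome group
predictor_terms = 14      # assumed: dummy-coded categories counted as separate terms

epv = events_per_variable(smaller_outcome_n, predictor_terms)
print(f"Events per variable = {epv:.1f}")
print("Meets Peduzzi rule (>= 10 per term):", epv >= 10)
print("Meets Field/Vittinghoff rule (>= 5 per term):", epv >= 5)
```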
Variables and Instruments
The analysis included 10 independent variables within the logistic regression models. The university’s Institutional Knowledge Management Office (IKMO) provided data for the four categorical variables displayed in Table 1 (gender, ethnicity, course, and initial major). Four of the independent variables represented the participants’ total and subscale scores on the CTI, which students completed in either the career planning course or the STEM seminar course. The other two independent variables were participants’ scores on the SAT math subtest and the university’s Math Placement Test–Algebra subscale; the IKMO provided these data as well.
Career Thoughts Inventory (CTI). The CTI includes 48 Likert-type items and measures respondents’ levels of negative career thinking (Sampson, Peterson, Lenz, Reardon, & Saunders, 1996a, 1996b). To complete the CTI, respondents read the 48 statements about careers and indicate how much they agree using a 4-point scale (strongly disagree to strongly agree). The CTI provides a total score and scores for three subscales: (a) Decision Making Confusion (DMC); (b) Commitment Anxiety (CA); and (c) External Conflict (EC). Completing the instrument yields raw scores for the assessment total and each of the three subscales, and a conversion table printed in the test booklet allows respondents to convert raw scores to T scores. Higher raw scores and T scores indicate a higher level of problematic thinking in each respective area, with T scores at or above 50 indicating clinical significance. For the college student norm group, internal consistency alpha coefficients were .96 for the total score and ranged from .77 to .94 for the three subscales (Sampson et al., 1996a, 1996b). With the sample in the present study, the researchers found acceptable alpha coefficients comparable to those of the norm group. The researchers used CTI change scores as predictors, calculated as the change in CTI total and subscale scores from the beginning to the end of either the career planning class or the STEM seminar class, with positive change scores reflecting a decrease in negative career thinking.
SAT Math. High school students typically take the SAT, a college admissions test, in their junior and/or senior years (CollegeBoard, 2018). Although the SAT has four subtests, the researchers used only the math subtest in the present study. The math subtest comprises 54 questions or tasks in the areas of basic mathematics knowledge, advanced mathematics knowledge, managing complexity, and modeling and insight (CollegeBoard, 2018; Ewing, Huff, Andrews, & King, 2005). In a validation study of the SAT, Ewing et al. (2005) found an internal consistency alpha coefficient of .92 for the math subtest and alpha coefficients ranging from .68 to .81 for the four math skill areas. The researchers were unable to analyze psychometric properties of the SAT math test with the study sample because the university’s IKMO provided only composite and subtest total scores rather than individual item responses.
Math Placement Test–Algebra Subtest. The Math Placement Test is a university-developed assessment designed to measure mathematical competence in algebra, trigonometry, and pre-calculus and to place students in their first college math course. All first-time undergraduate students at the university are required to take the test; however, this mandatory completion policy was not yet in place when data collection began, so some earlier participants had missing data in this area. The test is structured so that all respondents first take the algebra subtest and, if they achieve 70% accuracy, proceed to the trigonometry and pre-calculus subtests. As with the SAT, the researchers were unable to analyze psychometric properties of the test because the IKMO provided only composite and subtest total scores.
Procedure
Because the dependent variables (second-year retention and third-year retention) were dichotomous (i.e., retained or not retained), the researchers used the binary logistic regression procedure within SPSS Version 24 to analyze the data (Agresti, 2013; Hosmer et al., 2013; Tabachnick & Fidell, 2013). The purpose of binary logistic regression is to test predictors of the binary outcome by comparing the observed outcomes and the predicted outcomes first without any predictors and then with the chosen predictors (Hosmer et al., 2013). The researchers used a backward stepwise Wald approach, which enters all predictors into the model and removes the least significant predictors one by one until all of the remaining predictors fall within a specific p value range (Tabachnick & Fidell, 2013). The researchers chose to set the range as p ≤ .20 based on the recommendation of Hosmer et al. (2013).
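The analysis itself was run in SPSS; as a rough illustration of the backward elimination logic described above, the sketch below uses Python's statsmodels package, whose coefficient z tests play the role of the Wald statistics. The data, variable names, and coefficients are synthetic placeholders, not the study data.

```python
# Sketch of backward stepwise elimination for binary logistic regression,
# analogous to the backward Wald procedure described above: predictors are
# removed one at a time until all remaining p-values are <= .20.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
X = pd.DataFrame({
    "sat_math": rng.normal(600, 60, n),
    "mpt_algebra": rng.normal(70, 15, n),
    "cti_total_change": rng.normal(5, 10, n),
    "stem_seminar": rng.integers(0, 2, n),   # 1 = STEM seminar, 0 = career planning
})
true_logit = -6.0 + 0.01 * X["sat_math"] + 0.8 * X["stem_seminar"]
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)  # 1 = retained

def backward_stepwise(y, X, p_threshold=0.20):
    """Refit the logit model, dropping the least significant predictor each pass."""
    cols = list(X.columns)
    while True:
        model = sm.Logit(y, sm.add_constant(X[cols])).fit(disp=0)
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] <= p_threshold or len(cols) == 1:
            return model, cols
        cols.remove(worst)   # remove the predictor with the largest p-value and refit

final_model, kept = backward_stepwise(y, X)
print("Retained predictors:", kept)
print(final_model.summary())
```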
Preliminary data analysis included identifying and removing both univariate and multivariate outliers, conducting a missing data analysis, and testing the statistical assumptions for logistic regression. There were no missing values for the categorical variables, but the assessment variables (CTI, SAT, and Math Placement Test) did have missing values. Results from Little’s (1988) MCAR test in SPSS showed that these data were not missing completely at random (Chi-square = 839.606, df = 161, p < .001). The researchers chose to impute missing values using the Expectation Maximization procedure in SPSS (Dempster, Laird, & Rubin, 1977; Little & Rubin, 2002). The data met the statistical assumptions of binary logistic regression related to multicollinearity and linearity in the logit (Tabachnick & Fidell, 2013). As previously discussed, the data also sufficiently met the assumption regarding the ratio of cases to predictor variables, with the exception of the ethnicity variable; after removing outliers, the Asian/Pacific Islander subcategory in the non-retained outcome had only four cases, falling below both Peduzzi et al.’s (1996) guideline of 10 cases and the five-case minimum recommended by Field (2009) and Vittinghoff and McCulloch (2006). However, because the goal was to test the ethnicity categories separately rather than collapsing them to fit the recommendation, and because Hosmer et al. (2013) noted this was a recommendation and not a rule, the researchers chose to keep the existing categories, noting the potential limitation when interpreting this variable.
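SPSS's Expectation Maximization routine has no direct one-line counterpart in the common Python libraries; purely as an illustration of this kind of preprocessing step, the sketch below substitutes scikit-learn's IterativeImputer, a regression-based multivariate imputer, applied to placeholder columns standing in for the continuous assessment scores.

```python
# Illustrative stand-in for the missing-data step described above. SPSS's EM
# imputation is approximated here with scikit-learn's IterativeImputer, a
# regression-based multivariate imputer. Column names and data are placeholders.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "sat_math": rng.normal(600, 60, 200),
    "mpt_algebra": rng.normal(70, 15, 200),
    "cti_total_change": rng.normal(5, 10, 200),
})
df.loc[rng.random(200) < 0.15, "mpt_algebra"] = np.nan   # simulate missing scores

imputer = IterativeImputer(max_iter=25, random_state=0)
df[df.columns] = imputer.fit_transform(df)               # impute continuous scores only
print(df.isna().sum())                                   # all zeros after imputation
```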
Results
The sections that follow provide the results from each of the hypotheses and interpretation of the findings.
Hypothesis 1
Hypothesis 1 stated that the independent variables could predict undergraduate STEM retention from Year 1 to Year 2. As stated previously, the backward stepwise Wald approach involved including all predictors initially and then removing predictors one by one based on p value until all remaining predictors fell within the p ≤ .20 range. This process took five steps, resulting in the removal of four variables with p values greater than .20: (a) CTI Commitment Anxiety Change, (b) CTI External Conflict Change, (c) Gender, and (d) CTI Decision Making Confusion Change, respectively. The model yielded a Chi-square value of 91.011 (df = 10, p < .001), a -2 Log likelihood of 453.488, a Cox and Snell R-square value of .198, and a Nagelkerke R-square value of .270. These R-square values indicate that the model can explain between approximately 20% and 27% of the variance in the outcome. The model had a good fit with the data, as evidenced by the Hosmer and Lemeshow Goodness of Fit Test (Chi-square = 6.273, df = 8, p = .617). The final model accurately predicted 73.4% of cases across groups; however, the model predicted the retained students more accurately (89.6% of cases) than the non-retained cases (45.8% of cases).
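For readers less familiar with these indices, the conventional definitions of the two pseudo-R-squared values are shown below, where L_0 and L_1 denote the likelihoods of the intercept-only and fitted models and n the sample size; these are the general formulas rather than a recomputation of the values reported here.

```latex
% Cox & Snell (CS) and Nagelkerke (N) pseudo-R^2 for a fitted logistic model
R^{2}_{\mathrm{CS}} = 1 - \left(\frac{L_{0}}{L_{1}}\right)^{2/n}, \qquad
R^{2}_{\mathrm{N}} = \frac{R^{2}_{\mathrm{CS}}}{1 - L_{0}^{\,2/n}}
```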
Table 2 explains how each of the six variables retained in the model contributed to the final model. The odds ratio represents an association between a particular independent variable and a particular outcome, or for this study, the extent to which the independent variables predict membership in the retained outcome group. With categorical variables, the odds ratio represents the likelihood that being in a category increases the odds of being in the retained group relative to the reference category (e.g., African American/Black participants were 1.779 times more likely to be in the retained group than White/Caucasian students, who served as the reference category). With continuous variables, odds ratios represent the likelihood that quantifiable changes in the independent variables predict membership in the retained group (e.g., for every unit increase in SAT math score, the odds of being in the retained group increase by a factor of 1.004). Odds ratios can also be interpreted as a measure of effect size, with odds ratios closer to 1.0 indicating a smaller effect (Tabachnick & Fidell, 2013).
Table 2
Variables in the Equation for Hypothesis 1
| Variable | B | S.E. | Wald | O.R. | 95% C.I. Lower | 95% C.I. Upper |
|---|---|---|---|---|---|---|
| Ethnicity | | | 10.319* | | | |
| Ethnicity (African American/Black) | .576 | .393 | 2.148 | 1.779 | .823 | 3.842 |
| Ethnicity (Hispanic) | .068 | .290 | .054 | 1.070 | .606 | 1.889 |
| Ethnicity (Asian/Pacific Islander) | 1.889 | .637 | 8.803** | 6.615 | 1.899 | 23.041 |
| Ethnicity (Other) | .258 | .714 | .131 | 1.295 | .320 | 5.246 |
| Initial Major | | | 35.824*** | | | |
| Initial Major (Declared STEM) | .412 | .265 | 2.422 | 1.511 | .899 | 2.539 |
| Initial Major (Declared Non-STEM) | -1.944 | .375 | 26.905*** | .143 | .069 | .298 |
| STEM Seminar (Non-CP) | .850 | .258 | 10.885** | 2.340 | 1.412 | 3.879 |
| SAT Math | .004 | .002 | 2.411 | 1.004 | .999 | 1.008 |
| Math Placement–Algebra | .002 | .002 | 2.080 | 1.002 | .999 | 1.005 |
| CTI Total Change | .017 | .007 | 5.546* | 1.017 | 1.003 | 1.032 |
| Constant | -2.994 | 1.378 | 4.717 | .050 | | |
Note. B = logistic regression coefficient; S.E. = standard error; O.R. = odds ratio. *p < .05. **p < .01. ***p < .001.
With logistic regression, the Wald Chi-square test allows the researcher to determine a coefficient’s significance to the model (Tabachnick & Fidell, 2013). Based on this test, Initial Major was the most significant predictor in the model (p < .001). Students in the initially Declared STEM category were 1.511 times more likely to be in the retained group than those in the initially Undeclared category (the reference category), whereas the odds of being in the retained group for students in the initially Declared Non-STEM group were .143 times those of initially undeclared students. The STEM course was the second most statistically significant predictor (p < .01), with students in the STEM seminar class being 2.340 times more likely to be in the retained outcome than those in the career planning class. The CTI Total Change score was statistically significant (p < .05), indicating that for every unit increase in CTI Total Change score (i.e., the larger the decrease in score from pretest to posttest), the odds of being in the retained group increase by a factor of 1.017. Ethnicity was a statistically significant predictor (p < .05), with each subcategory having higher odds of being in the retained group than the White/Caucasian group; however, readers should interpret the ethnicity odds ratios cautiously because of the small number of cases in some categories. SAT Math and Math Placement–Algebra were not statistically significant but still fell within the recommended inclusion range (p < .20).
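As a concrete illustration of how the columns of Table 2 relate, the short snippet below reproduces the odds ratio and 95% confidence interval for the STEM Seminar row from its coefficient (B) and standard error, assuming the conventional Wald interval, exp(B ± 1.96 × S.E.).

```python
# Reproduce the STEM Seminar row of Table 2 from its coefficient and standard
# error: the odds ratio is exp(B), and the 95% CI is exp(B +/- 1.96 * S.E.).
import math

B, SE = 0.850, 0.258
odds_ratio = math.exp(B)               # ~2.34, matching the reported O.R. of 2.340
ci_lower = math.exp(B - 1.96 * SE)     # ~1.41, matching the reported lower bound of 1.412
ci_upper = math.exp(B + 1.96 * SE)     # ~3.88, matching the reported upper bound of 3.879
print(f"OR = {odds_ratio:.3f}, 95% CI [{ci_lower:.3f}, {ci_upper:.3f}]")
```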
Hypothesis 2
Hypothesis 2 stated that the independent variables could predict undergraduate STEM retention from Year 1 to Year 3. As stated previously, the backward stepwise Wald approach involved including all predictors initially and then removing predictors one by one based on p value until all remaining predictors fell within the p ≤ .20 range. This process took six steps, resulting in the removal of five variables with p values greater than .20: (a) CTI Commitment Anxiety Change, (b) CTI Decision Making Confusion Change, (c) Gender, (d) CTI External Conflict Change, and (e) CTI Total Change, respectively. The model yielded a Chi-square value of 55.835 (df = 9, p < .001), a -2 Log likelihood of 307.904, a Cox and Snell R-square value of .191, and a Nagelkerke R-square value of .255. These R-square values indicate that the model can explain between approximately 19% and 26% of the variance in the outcome. The model had a good fit with the data, as evidenced by the Hosmer and Lemeshow Goodness of Fit Test (Chi-square = 9.187, df = 8, p = .327). The model accurately predicted 70.0% of cases across groups. In this analysis, the model predicted the non-retained students more accurately (72.7% of cases) than the retained cases (66.9% of cases).
Table 3 explains how the variables within the model contributed to the final model. Based on the Wald test, Initial Major was the most significant predictor in the model (p < .001). Students in the initially Declared STEM category were 1.250 times more likely to be in the retained group than those in the initially Undeclared category (the reference category), whereas the odds of being in the retained group for students in the initially Declared Non-STEM group were .167 times those of initially undeclared students. The Math Placement–Algebra variable was statistically significant (p < .05), with the odds ratio indicating that for every unit increase in Math Placement–Algebra test score, the odds of being in the retained group increase by a factor of 1.005. The STEM course variable fell slightly outside the statistically significant range but within the inclusion range, with students in the STEM seminar class being 1.801 times more likely to be in the retained outcome than students in the career planning class. SAT Math was not statistically significant but still fell within the recommended inclusion range (p < .20). Ethnicity also was not a statistically significant predictor but fell within the inclusion range, with each subcategory having higher odds of being in the retained group than the White/Caucasian group; however, readers should interpret the ethnicity odds ratios cautiously because of the small number of cases in some categories.
Table 3
Variables in the Equation for Hypothesis 2
| Variable | B | S.E. | Wald | O.R. | 95% C.I. Lower | 95% C.I. Upper |
|---|---|---|---|---|---|---|
| Ethnicity | | | 6.445 | | | |
| Ethnicity (African American/Black) | .542 | .448 | 1.467 | 1.719 | .715 | 4.134 |
| Ethnicity (Hispanic) | .243 | .349 | .484 | 1.275 | .643 | 2.528 |
| Ethnicity (Asian/Pacific Islander) | 1.636 | .698 | 5.494* | 5.137 | 1.307 | 20.185 |
| Ethnicity (Other) | .403 | .684 | .347 | 1.497 | .391 | 5.725 |
| Initial Major | | | 17.362** | | | |
| Initial Major (Declared STEM) | .223 | .328 | .460 | 1.250 | .656 | 2.379 |
| Initial Major (Declared Non-STEM) | -1.792 | .468 | 14.664** | .167 | .067 | .417 |
| STEM Seminar (Non-CP) | .588 | .323 | 3.327 | 1.801 | .957 | 3.389 |
| SAT Math | .004 | .003 | 2.536 | 1.004 | .999 | 1.010 |
| Math Placement–Algebra | .005 | .002 | 5.449* | 1.005 | 1.001 | 1.009 |
| Constant | -2.994 | 1.378 | 4.717 | .050 | | |
Note. B = logistic regression coefficient; S.E. = standard error; O.R. = odds ratio. *p < .05. **p < .01. ***p < .001.
Discussion
The researchers sought to determine the degree to which a set of demographic variables, math scores, and career-related factors could predict undergraduate retention in STEM majors. Based on descriptive statistics, the participants remained in STEM majors at a higher rate than other nationwide samples (Chen & Soldner, 2013; Koenig, Schen, Edwards, & Bao, 2012). The sample in this study also differed by gender from what is commonly cited in the literature; approximately 46% of the study’s sample was female, whereas the NCSES (2017) reported that White females made up approximately 31% of those in STEM fields, with minority females lagging significantly behind. The present study’s sample was more in line with national statistics with regard to ethnicity (NCSES, 2017; Palmer et al., 2011).
With Hypothesis 1, the researchers sought to improve on a pilot study (Belser et al., 2017) that did not include demographics or math-related variables. Adding these variables improved the overall model fit and the accuracy of predicting non-retained students, but slightly decreased the accuracy of predicting retained students, as compared to the Belser et al. (2017) model. In addition to improving model fit, adding these variables reversed the Belser et al. (2017) finding that students in the STEM-focused career planning class were more likely to be retained than the STEM seminar students. In the present study, the STEM seminar students, who declared STEM majors prior to the first day of college, were more likely to be retained in STEM majors, which is in line with prior research connecting intended persistence in a STEM major to observed retention (Le et al., 2014; Lent et al., 2016).
With Hypothesis 2, the researchers sought to expand on the Belser et al. (2017) study by predicting retention one year further, into the third year of college. The analysis yielded a model that still fit the data well. However, this model was much more accurate in predicting the non-retained students and slightly less accurate in predicting the retained students, with the overall percentage of correct predictions similar to that for Hypothesis 1. This finding indicates that the included predictors may provide a more balanced ability to predict longer-term retention in STEM majors than retention in just the first year. The initial major and STEM course variables performed similarly to how they did in Hypothesis 1 and, as such, consistently with prior research (Le et al., 2014; Lent et al., 2016).
Although sampling issues warrant caution in interpreting the ethnicity results, ethnicity proved to be a useful predictor of retention in STEM majors for both Hypotheses 1 and 2. Most notably, African American/Black and Hispanic students had higher odds of being retained. This finding is inconsistent with most research, which shows underrepresented minorities as less likely to be retained in STEM majors (Chen & Soldner, 2013; Cundiff et al., 2013; Gayles & Ampaw, 2014); however, at least one prior study found that ethnic minority students were more likely to be retained in STEM majors (Riegle-Crumb & King, 2010).
Gender was removed as a predictor from both models because of its statistical non-significance. Prior research has shown that females are less likely to be retained in STEM majors (Cundiff et al., 2013; Gayles & Ampaw, 2014; Riegle-Crumb et al., 2012), which distinguishes this sample from those in prior studies. However, the COMPASS sample did have a larger representation of females than is typically observed. Moreover, the COMPASS Program was designed with prior research related to gender in mind and took steps to address gender concerns in program development (Dagley et al., 2016).
The continuous variables retained in the models showed only a small effect on predicting STEM retention. The SAT Math and Math Placement–Algebra scores performed consistently with prior research, in which higher math scores related to higher odds of retention (CollegeBoard, 2012; Crisp et al., 2009; Le et al., 2014; Mattern & Patterson, 2013; Rohr, 2012). The CTI variables retained in the models performed in line with the Belser et al. (2017) pilot study specific to STEM majors and with prior research examining negative career thoughts and undergraduate retention in other majors (Folsom, Peterson, Reardon, & Mann, 2005; Reardon et al., 2015).
Limitations and Implications
The present study has limitations, particularly with regard to research design, sampling, and instrumentation. First, the researchers used a comparison group design rather than a control group, and as such, there were certain observable differences between the two groups. Not having a control group limits the researchers’ ability to make causal claims regarding the predictor variables or the STEM career intervention. The researchers also included only a limited number of predictors; the inclusion of additional variables may have strengthened the models. Although the sample size was sufficient based on the a priori power analysis, the low number of participants in some of the categories may have resulted in overfitting or underfitting within the models. Finally, the researchers were not able to test psychometric properties of the SAT Math subtest or the Math Placement–Algebra subtest with this sample because they did not have access to the participants’ item responses. The researchers attempted to mitigate these limitations as much as possible and acknowledge that future research can and should address them.
Future research in this area would benefit from the inclusion of a wider variety of predictor variables, such as math and science self-efficacy, outcome expectations, and internal processes observed with gender and ethnic minority groups (e.g., stereotype threat; Cundiff et al., 2013; Litzler et al., 2014). The researchers also recommend obtaining a larger representation of ethnic minority groups to ensure an adequate number of cases to effectively run the statistical procedure. Future researchers should consider more complex statistical procedures (e.g., structural equation modeling) and research designs (e.g., randomized control trials) to determine more causal relationships between predictors and the outcome variables.
Because the results of this study indicate that a more solidified major selection is associated with higher odds of retention in STEM majors, university career professionals and higher education professionals should strive to develop programming that helps students decide on a major earlier in their undergraduate careers. Structured career development work, often overlooked in undergraduate STEM programming, may be one such appropriate strategy. Additionally, any undergraduate STEM programming must be sensitive to demographic underrepresentation in STEM majors and the STEM workforce and should take steps to provide support for students in these underrepresented groups.
Similar to work with undergraduates, this study’s results provide a rationale for school counselors to engage students in STEM career work so that they can move toward a solidified STEM major prior to enrolling in college. The industry-specific career development work discussed within this study is just as important, if not more important, for students in K–12 settings. Moreover, school counselors, through their continued access to students, can serve as an access point for researchers to learn more about the STEM career development process at an earlier stage of the STEM pipeline. All of these endeavors point to the need for counselor educators to better prepare school counselors, college counselors, and career counselors to do work specifically with STEM and to become more involved in STEM career research.
In the present study, the researchers built upon prior research in the area of STEM retention to determine which variables can act as predictors of undergraduate STEM retention. The binary logistic regression procedure yielded two models that provide insight on how these variables operate individually and within the larger model. Finally, the researchers identified some key implications for counselors practicing in various settings and for researchers who are interested in answering some of the key questions that still exist with regard to STEM career development and retention.
Conflict of Interest and Funding Disclosure
The data collected in this study were part of a dissertation study by the first author. The dissertation was awarded the 2018 Dissertation Excellence Award by the National Board for Certified Counselors.
References
ACT. (2018). The condition of STEM 2017. Retrieved from www.act.org/stemcondition
Agresti, A. (2013). Categorical data analysis (3rd ed.). Hoboken, NJ: Wiley.
Beasley, M. A., & Fischer, M. J. (2012). Why they leave: The impact of stereotype threat on the attrition of women and minorities from science, math, and engineering majors. Social Psychology of Education, 15, 427–448. doi:10.1007/s11218-012-9185-3
Belser, C. T., Prescod, D. J., Daire, A. P., Dagley, M. A., & Young, C. Y. (2017). Predicting undergraduate student retention in STEM majors based on career development factors. The Career Development Quarterly, 65, 88–93. doi:10.1002/cdq.12082
Belser, C. T., Prescod, D. J., Daire, A. P., Dagley, M. A., & Young, C. Y. (2018). The influence of career planning on career thoughts in STEM-interested undergraduates. The Career Development Quarterly, 66(2), 176–181. doi:10.1002/cdq.12131
Bouwma-Gearhart, J., Perry, K. H., & Presley, J. B. (2014). Improving postsecondary STEM education: Strategies for successful interdisciplinary collaborations and brokering engagement with education research and theory. Journal of College Science Teaching, 44, 40–47.
Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago, IL: Rand McNally.
Carnevale, A. P., Smith, N., & Melton, M. (2011). STEM: Science, technology, engineering, mathematics. Washington, DC: Georgetown University.
Chen, X., & Soldner, M. (2013). STEM attrition: College students’ paths into and out of STEM fields. Retrieved from http://nces.ed.gov/pubs2014/2014001rev.pdf
Cohen, J. (1992). A power primer. Psychological Bulletin, 112, 155–159. doi:10.1037/0033-2909.112.1.155
CollegeBoard. (2018). Math Test. Retrieved from https://collegereadiness.collegeboard.org/sat/inside-the-test/math
Crisp, G., Nora, A., & Taggart, A. (2009). Student characteristics, pre-college, college, and environmental factors as predictors of majoring in and earning a STEM degree: An analysis of students attending a Hispanic serving institution. American Educational Research Journal, 46, 924–942. doi:10.3102/0002831209349460
Cundiff, J. L., Vescio, T. K., Loken, E., & Lo, L. (2013). Do gender-science stereotypes predict science identification and science career aspirations among undergraduate science majors? Social Psychology of Education, 16, 541–554. doi:10.1007/s11218-013-9232-8
Dagley, M. A., Young, C. Y., Georgiopoulos, M., Daire, A. P., Parkinson, C., Prescod, D. J., & Belser, C. T. (2016). Recruiting undecided admits to pursue a STEM degree. Proceedings from the American Society for Engineering Education 123rd Annual Conference & Exposition. Retrieved from https://peer.asee.org/recruiting-undecided-admits-to-pursue-a-stem-degree
Dempster, A., Laird, N., & Rubin, D. (1977). Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, 39, 1–38.
Dixon-Román, E. J., Everson, H. T., & McArdle, J. J. (2013). Race, poverty, and SAT scores: Modeling the influences of family income on black and white high school students’ SAT performance. Teachers College Record, 115, 1–33.
Doerschuk, P., Bahrim, C., Daniel, J., Kruger, J., Mann, J., & Martin, C. (2016). Closing the gaps and filling the STEM pipeline: A multidisciplinary approach. Journal of Science Education and Technology, 25, 682–695. doi:10.1007/s10956-016-9622-8
Ewing, M., Huff, K., Andrews, M., & King, K. (2005). Assessing the reliability of skills measured by the SAT (Report No. RN-24). New York, NY: CollegeBoard. Retrieved from https://files.eric.ed.gov/fulltext/ED562595.pdf
Faul, F., Erdfelder, E., Lang, A. G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175–191.
Field, A. (2009). Discovering statistics using SPSS (3rd ed.). London, England: Sage Publications.
Folsom, B., Peterson, G. W., Reardon, R. C., & Mann, B. A. (2005). Impact of a career planning course on academic performance and graduation rate. Journal of College Student Retention: Research, Theory & Practice, 6, 461–473. doi:10.2190/4WJ2-CJL1-V9DP-HBMF
Foltz, L. G., Gannon, S., & Kirschmann, S. L. (2014). Factors that contribute to the persistence of minority students in STEM fields. Planning for Higher Education Journal, 42(4), 46–58.
Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational research: An introduction (8th ed.). Boston, MA: Allyn & Bacon.
Gayles, J. G., & Ampaw, F. (2014). The impact of college experiences on degree completion in STEM fields at four-year institutions: Does gender matter? The Journal of Higher Education, 85, 439–468. doi:10.1353/jhe.2014.0022
Gottfredson, L. (1981). Circumscription and compromise: A developmental theory of occupational aspirations. Journal of Counseling Psychology, 28, 545–579. doi:10.1037/0022-0167.28.6.545
Holland, J. L. (1973). Making vocational choices: A theory of vocational personalities and work environments (2nd ed.). Upper Saddle River, NJ: Prentice Hall.
Hosmer, D. W., Jr., Lemeshow, S., & Sturdivant, R. X. (2013). Applied logistic regression (3rd ed.). Hoboken, NJ: Wiley.
Koenig, K., Schen, M., Edwards, M., & Bao, L. (2012). Addressing STEM retention through a scientific thought and methods course. Journal of College Science Teaching, 41(4), 23–29.
Lawlor, S., Richman, S., & Richman, C. L. (1997). The validity of using the SAT as a criterion for black and white students’ admission to college. College Student Journal, 31, 507–515.
Le, H., Robbins, S. B., & Westrick, P. (2014). Predicting student enrollment and persistence in college STEM fields using an expanded P-E fit framework: A large-scale multilevel study. Journal of Applied Psychology, 99, 915–947. doi:10.1037/a0035998
Lent, R. W., Brown, S. D., & Hackett, G. (2002). Social cognitive career theory. In D. Brown (Ed.), Career choice and development (4th ed., pp. 255–311). San Francisco, CA: Jossey-Bass.
Lent, R. W., Miller, M. J., Smith, P. E., Watford, B. A., Lim, R. H., & Hui, K. (2016). Social cognitive predictors of academic persistence and performance in engineering: Applicability across gender and race/ethnicity. Journal of Vocational Behavior, 94, 79–88. doi:10.1016/j.jvb.2016.02.012
Little, R. J. A. (1988). A test of missing completely at random for multivariate data with missing values. Journal of the American Statistical Association, 83, 1198–1202. doi:10.2307/2290157
Little, R. J. A., & Rubin, D. B. (2002). Statistical analysis with missing data (2nd ed.). New York, NY: Wiley.
Litzler, E., Samuelson, C. C., & Lorah, J. A. (2014). Breaking it down: Engineering student STEM confidence at the intersection of race/ethnicity and gender. Research in Higher Education, 55, 810–832. doi:10.1007/s11162-014-9333-z
Mansfield, K. C., Welton, A. D., & Grogan, M. (2014). “Truth or consequences”: A feminist critical policy analysis of the STEM crisis. International Journal of Qualitative Studies in Education, 27, 1155–1182. doi:10.1080/09518398.2014.916006
Mattern, K. D., & Patterson, B. F. (2013). The relationship between SAT scores and retention to the second year: Replication with the 2010 SAT validity sample (College Board Statistical Report No. 2013-1). New York, NY: The College Board. Retrieved from https://files.eric.ed.gov/fulltext/ED563087.pdf
National Career Development Association. (2015). Career development theory and its application. Broken Arrow, OK: Author.
National Center for Science and Engineering Statistics. (2017). Women, minorities, and persons with disabilities in science and engineering. Retrieved from https://www.nsf.gov/statistics/2017/nsf17310/
National Science Board. (2018). Science & engineering indicators 2018. Retrieved from https://www.nsf.gov/statistics/2018/nsb20181/
Nosek, B. A., & Smyth, F. L. (2011). Implicit social cognitions predict sex differences in math engagement and achievement. American Educational Research Journal, 48, 1125–1156. doi:10.3102/0002831211410683
Osborn, D. S., Howard, D. K., & Leierer, S. (2007). The effect of a career development course on the dysfunctional career thoughts of racially and ethnically diverse college freshmen. The Career Development Quarterly, 55, 365–377. doi:10.1002/j.2161-0045.2007.tb00091.x
Palmer, R. T., Maramba, D. C., & Dancy, T. E., II (2011). A qualitative investigation of factors promoting the retention and persistence of students of color in STEM. The Journal of Negro Education, 80, 491–504.
Peduzzi, P., Concato, J., Kemper, E., Holford, T. R., & Feinstein, A. R. (1996). A simulation study of the number of events per variable in logistic regression analysis. Journal of Clinical Epidemiology, 49, 1373–1379. doi:10.1016/S0895-4356(96)00236-3
Peterson, G. W., Sampson, J. P., Jr., Lenz, J. G., & Reardon, R. C. (2002). A cognitive information processing approach to career problem solving and decision making. In D. Brown (Ed.), Career choice and development (4th ed., pp. 312–369). San Francisco, CA: Jossey-Bass.
Prescod, D. J., Daire, A. P., Young, C. Y., Dagley, M. A., & Georgiopoulos, M. (in press). Exploring negative career thoughts between STEM declared and STEM interested students. Journal of Employment Counseling, 55(4), 166–176.
Reardon, R., & Fiore, E. (2014). College career courses and learner outputs and outcomes, 1976-2014 (Technical report No. 55). Tallahassee, FL: Center for the Study of Technology in Counseling & Career Development, Florida State University. Retrieved from http://career.fsu.edu/content/download/223105/1906289/TechRept_55_201406.pdf
Reardon, R. C., Melvin, B., McClain, M. C., Peterson, G. W., & Bowman, W. J. (2015). The career course as a factor in college graduation. Journal of College Student Retention: Research, Theory, & Practice, 17, 336–350. doi:10.1177/1521025115575913
Riegle-Crumb, C., & King, B. (2010). Questioning a white male advantage in STEM: Examining disparities in college major by gender and race/ethnicity. Educational Researcher, 39, 656–664. doi:10.3102/0013189X10391657
Riegle-Crumb, C., King, B., Grodsky, E., & Muller, C. (2012). The more things change, the more they stay the same? Prior achievement fails to explain gender inequality in entry into STEM college majors over time. American Educational Research Journal, 49, 1048–1073. doi:10.3102/0002831211435229
Rohr, S. L. (2012). How well does the SAT and GPA predict the retention of science, technology, engineering, mathematics, and business students? Journal of College Student Retention: Research, Theory, & Practice, 14, 195–208. doi:10.2190/CS.14.2.c
Sampson, J. P., Jr., Peterson, G. W., Lenz, J. G., Reardon, R. C., & Saunders, D. E. (1996a). Career Thoughts Inventory: Professional manual. Odessa, FL: Psychological Assessment Resources.
Sampson, J. P., Jr., Peterson, G. W., Lenz, J. G., Reardon, R. C., & Saunders, D. E. (1996b). Improving your career thoughts: A workbook for the Career Thoughts Inventory. Odessa, FL: Psychological Assessment Resources.
Saunders, D. E., Peterson, G. W., Sampson, J. P., Jr., & Reardon, R. C. (2000). Relation of depression and dysfunctional career thinking to career indecision. Journal of Vocational Behavior, 56, 288–298. doi:10.1006/jvbe.1999.1715
Schneider, K. R., Bickel, A., & Morrison-Shetlar, A. (2015). Planning and implementing a comprehensive student-centered research program for first-year STEM undergraduates. Journal of College Science Teaching, 44(3), 37–43.
Sithole, A., Chiyaka, E. T., McCarthy, P., Mupinga, D. M., Bucklein, B. K., & Kibirige, J. (2017). Student attraction, persistence and retention in STEM programs: Successes and continuing challenges. Higher Education Studies, 7, 46–59. doi:10.5539/hes.v7n1p46
Tabachnick, B. G., & Fidell, L. S. (2013). Using multivariate statistics (6th ed.). Upper Saddle River, NJ: Pearson Education.
Toldson, I. A., & McGee, T. (2014). What the ACT and SAT mean for black students’ higher education prospects. The Journal of Negro Education, 83, 1–3.
Vittinghoff, E., & McCulloch, C. E. (2006). Relaxing the rule of ten events per variable in logistic and Cox regression. American Journal of Epidemiology, 165, 710–718.
Wyatt, J. N., Remigio, M., & Camara, W. J. (2012). SAT Subject Area Readiness Indicators: Reading, Writing, & STEM. Retrieved from https://files.eric.ed.gov/fulltext/ED562872.pdf
Christopher T. Belser, NCC, is an assistant professor at the University of New Orleans. M. Ann Shillingford is an associate professor at the University of Central Florida. Andrew P. Daire is a dean at Virginia Commonwealth University. Diandra J. Prescod is an assistant professor at Pennsylvania State University. Melissa A. Dagley is an executive director at the University of Central Florida. Correspondence can be addressed to Christopher Belser, 2000 Lakeshore Drive, Bicentennial Education Center Room 174, New Orleans, LA 70148, ctbelser@uno.edu.