Brett Zyromski, Melissa Mariani, Boyoung Kim, Sangmin Lee, John Carey

This study evaluated the impact of the Student Success Skills (SSS) classroom curriculum delivered in a naturalistic setting on the metacognitive functioning of 2,725 middle and high school students in Kentucky. SSS was implemented as one intervention to fulfill an Elementary and Secondary School Counseling Grant. Results from students’ self-reports indicated that those who received the intervention demonstrated an increased ability to regulate their levels of emotional arousal. No additional significant differences were found. These findings differ from the results of previous outcome studies involving SSS. Implications for implementing SSS in naturalistic school settings and directions for future research are discussed.

Keywords: Student Success Skills, naturalistic, metacognitive functioning, classroom curriculum, emotional arousal

The purpose of this study was to evaluate the impact of the Student Success Skills (SSS) school counseling curriculum (Brigman, Campbell, & Webb, 2004; Brigman & Webb, 2012) delivered in a naturalistic setting on students’ metacognitive functioning. In this case, the authors use the term naturalistic setting to describe a typical school environment, one that lacks the additional supports (e.g., hiring national trainers) that would be present in a more controlled research study. SSS is an evidence-based, school counselor-delivered, social-emotional learning intervention designed to support students by teaching them three integral skill sets: (a) cognitive and metacognitive skills (e.g., goal setting, progress monitoring and memory skills); (b) social skills (e.g., interpersonal skills, social problem solving, listening and teamwork skills); and (c) self-regulation skills (e.g., managing attention, motivation and anger). Research has identified this curriculum as important in promoting students’ academic achievement and success in school (Collaborative for Academic, Social, and Emotional Learning, 2015; Webb & Brigman, 2006).
SSS was designed based on reviews of educational psychology research that identified skills critical for student success, such as information processing, emotional self-management, and positive social skills (Bransford, Brown, & Cocking, 1999; Durlak, Weissberg, Dymnicki, Taylor, & Schellinger, 2011; Greenberg et al., 2003; Hattie, Biggs, & Purdie, 1996; Marzano, Pickering, & Pollock, 2001; Masten & Coatsworth, 1998; Wang, Haertel, & Walberg, 1994; Zins, Weissberg, Wang, & Walberg, 2004). The curriculum (Brigman & Webb, 2012) can be delivered in two formats, the SSS classroom program and the SSS small group program (Brigman et al., 2004), both of which are intended for use with students in grades 4 through 12. SSS is a highly structured, manualized program that consists of weekly 45-minute lessons: the classroom format includes five lessons, while the small group program includes eight. Both sets of weekly lessons are intended to be delivered in sequence over the corresponding number of consecutive weeks. A 45-minute booster session is then delivered once a month for 3 months in the spring.

Developers of SSS designed the curriculum to follow a scripted, manualized format; implementers are encouraged to follow the sequencing, format and language provided in order to ensure fidelity of treatment. If practitioners go “off-script” or change the recommended delivery of the lessons, outcomes may fall short of those found in past research. To support fidelity, the SSS program comes with detailed manuals that include recommended verbiage, descriptive diagrams and supplemental handouts (Brigman & Webb, 2012). During each lesson, students learn and practice strategies in five distinct areas: (a) goal setting, progress monitoring and success sharing; (b) creating a caring, supportive and encouraging classroom environment; (c) cognitive and memory skills; (d) calming skills; and (e) building healthy optimism. Specific strategies are taught and practiced each week and are reviewed and reinforced during subsequent lessons. Between lessons, students are encouraged to apply the new strategies and to work on the academic and personal goals that they set for themselves during the SSS session. Teachers also are expected to cue students to use the strategies during regular classroom lesson time.

 

Research has established the effectiveness of SSS in several quasi-experimental and experimental studies. SSS implementation has resulted in enhanced academic achievement as measured by standardized achievement tests (Brigman & Campbell, 2003; Brigman, Webb, & Campbell, 2007; C. Campbell & Brigman, 2005; Webb, Brigman, & Campbell, 2005) and district math and reading achievement measures (Lemberger, Selig, Bowers, & Rogers, 2015). Two studies have suggested that the effects of SSS on academic achievement are at least partially mediated by changes in students’ metacognitive functioning (Lemberger & Clemens, 2012; Lemberger et al., 2015). Lemberger and Clemens (2012) found that participation in SSS small groups was associated with improvements in students’ executive functions (as measured by the Behavior Rating Inventory of Executive Function [BRIEF]; Gioia, Isquith, Guy, & Kenworthy, 2000) and increased metacognitive activity (as measured by the Junior Metacognitive Awareness Inventory [Jr. MAI]; Sperling, Howard, Miller, & Murphy, 2002). Lemberger et al. (2015) found that participation in the classroom version of SSS was associated with improvement in executive functions (as measured by the BRIEF-SR; Guy, Isquith, & Gioia, 2004).

 

While researchers have established the efficacy of SSS in well-controlled research environments, little is known about its effectiveness when delivered in naturalistic settings. The purpose of the present study was to measure the effectiveness of the SSS curriculum when delivered in a naturalistic setting within regularly functioning schools. In this study, SSS was implemented in five schools in a district in Kentucky as part of a school counseling improvement project funded by an Elementary and Secondary School Counseling Demonstration Grant awarded by the U.S. Department of Education. The five middle and high schools collaborated through a regional educational cooperative to apply for the grant. Demographic information for each school can be found in the Setting section below. The grant funded all necessary SSS curriculum materials and provided support for school counselors in evaluating the impact of the program. However, funding was not available to hire national trainers. National trainers are not required when purchasing a manualized school intervention, and many schools do not possess the funds to hire them. Thus, this funded project provided an opportunity to evaluate the effectiveness of SSS in a naturalistic school setting.

 

The primary evaluation question was: When implemented in a naturalistic setting, does SSS impact students’ metacognitive functioning, as determined by (1) knowledge and regulation of cognition as measured by the Junior Metacognitive Awareness Inventory (Jr. MAI; Sperling et al., 2002) and (2) use of skills related to self-direction of learning, support of classmates’ learning, and self-regulation of arousal as measured by the Student Engagement in School Success Skills survey (SESSS; Carey, Brigman, Webb, Villares, & Harrington, 2013)? The secondary question was: Does the magnitude of any changes in metacognitive functioning depend on the degree to which SSS was implemented with fidelity?

Method

 

Setting

The five schools were chosen because of their participation in the Elementary and Secondary School Counseling grant. Population and demographic data for all five schools can be found in Table 1.

 

Table 1

 

Brief Description of the Five Schools in the Study

 

School 1: 1,484 students, Grades 9–12, Suburban/Rural
    Ethnicity: African American 0.8%; American Indian 0%; Asian American 1.2%; European American 95.8%; Hispanic American 0.9%; Two or More Races 1.2%
    Gender: Female 50.2%; Male 49.8%
    Socioeconomic: Qualifying for free lunch 28%; qualifying for reduced lunch 7.6%

School 2: 1,154 students, Grades 6–8, Suburban/Rural
    Ethnicity: African American 1.1%; American Indian 0.1%; Asian American 0.7%; European American 94.1%; Hispanic American 2%; Two or More Races 2%
    Gender: Female 49%; Male 51%
    Socioeconomic: Qualifying for free lunch 33.5%; qualifying for reduced lunch 7.8%

School 3: 325 students, Grades 7–12, Urban
    Ethnicity: African American 2.8%; American Indian 0%; Asian American 0.6%; European American 95.1%; Hispanic American 1.2%; Two or More Races 0.3%
    Gender: Female 54.5%; Male 45.5%
    Socioeconomic: Qualifying for free lunch 68.6%; qualifying for reduced lunch 8.6%

School 4: 570 students, Grades 6–8, Rural
    Ethnicity: African American 1.4%; American Indian 0%; Asian American 0.2%; European American 97.5%; Hispanic American 0.5%; Two or More Races 0.4%
    Gender: Female 48.4%; Male 51.6%
    Socioeconomic: Qualifying for free lunch 50.5%; qualifying for reduced lunch 7.9%

School 5: 471 students, Grades 6–8, Urban
    Ethnicity: African American 11.5%; American Indian 0.4%; Asian American 0.4%; European American 69%; Hispanic American 10%; Two or More Races 7.9%
    Gender: Female 45.2%; Male 54.8%
    Socioeconomic: Qualifying for free lunch 62.2%; qualifying for reduced lunch 6.4%

Note: The demographic categories are listed as reported by the state reporting system.

 

Participants

A total of 2,725 students participated in the study with roughly equal numbers of male (50.1%) and female (49.9%) students. A relatively large percentage of participants (41.2%) qualified for free or reduced lunch. Less than 1% of participants were classified as Limited English Proficient and 11.2% qualified for special education services. In terms of racial and ethnic diversity, the participants were less diverse than desired. Eighty-five percent of participating students identified as White (non-Hispanic), 4% as Multiracial, 6% as African American, and 4% as Hispanic. All other groups combined accounted for the remaining 1% of participants. Proportionally more sixth-, seventh-, and eighth-grade students participated in the study. The percentages of participants by grade were: 20.5% sixth grade; 20.9% seventh grade; 20% eighth grade; 10.7% ninth grade; 9.9% tenth grade; 9.2% eleventh grade; and 8.7% twelfth grade.

 

Completed Jr. MAI surveys (pretest and posttest) were obtained from 1,565 students (57% of the total), and completed SESSS surveys were obtained from 1,612 students (59% of the total). The grant required school counselors to serve as point persons for delivering and collecting the pre- and post-instruments, and counselors were trained in instrument collection using a structured, scripted protocol. Instruments were administered in paper format. Although trained in data collection procedures, not all participating school counselors successfully captured both pre- and post-assessments, and difficulties collecting paper instruments contributed to the loss of data; in addition, 4% of the Jr. MAI surveys and 4% of the SESSS surveys were incomplete. Rather than estimating missing data, the authors used only data from fully completed instruments in the analyses.

 

Preparation of the School Counselors

An SSS manual was provided to each of the schools the year prior to implementation of the program. School counselors reviewed the SSS materials and met with the grant project manager to discuss the content and instructional processes associated with the SSS classroom guidance interventions. As noted above, school counselors within the five schools did not receive formal training from the national SSS trainers on how to implement the curriculum; formal training was a cost not covered by the grant. The lack of formal training reflected the study’s naturalistic approach: more often than not, school counselors who purchase a manualized program do not have the funding to hire national trainers to guide implementation, and instead review the manual and follow it during implementation. Accordingly, school counselors at each school used the SSS curriculum manuals and did their best to adhere to the recommended lesson sequence and scripts (Brigman & Webb, 2012). The school counselors at each school then conducted a pilot SSS small group in the spring semester, prior to the onset of the whole-school SSS implementation, to ensure that they were thoroughly familiar with the materials and implementation procedures. After conducting the pilots, the school counselors provided 3 hours of SSS training to partner teachers prior to implementation of the whole-school SSS intervention. The teachers in all schools also received a copy of the SSS classroom manual (Brigman & Webb, 2012) to review prior to implementing the curriculum with their students. Delivery of the SSS classroom format began the subsequent fall semester (2013) and concluded the following spring (2014).

 

 

Procedures for Delivering SSS and Fidelity Issues

In every school, the school counselors experienced some problems implementing the SSS curriculum with fidelity. Implementing the program with complete fidelity would have meant following the exact scope and sequence scripted within the SSS manual, namely delivering the program for 45 minutes once a week for 5 consecutive weeks. Schools varied from this scope and sequence, resulting in a lack of fidelity to the recommended implementation format for SSS; however, schools adjusted delivery of SSS in ways that reflected their educational priorities, again reflecting a more naturalistic approach than a traditional controlled research study. The primary resistance school counselors encountered related to teachers’ and administrators’ reluctance to lose instructional time to in-class implementation of SSS. Each school identified a contextually appropriate approach to addressing this initial resistance to devoting class time to SSS. In two of the schools, teachers (rather than counselors) delivered the SSS curriculum within their own classrooms. In the other three schools, the SSS curriculum was delivered through learning communities (i.e., advisories), which are existing scheduled blocks of time during which teachers facilitate small groups (8–15 students) outside their normal classrooms. For example, a ninth-grade biology teacher might be responsible for leading a group of students across several grades with whom the teacher does not interact outside of this learning community. The manner in which SSS was delivered in each school is detailed below.

 

School One. The SSS curriculum was not delivered with complete fidelity. Instead, the school leadership determined it was more feasible for teachers to deliver the curriculum through learning communities, once a week for 30 minutes for 10 weeks.

 

School Two. The SSS curriculum was delivered with reasonable fidelity once a week for 60 minutes over a 5-week period by teachers. However, instead of being delivered in a traditional classroom format, the school leadership determined it more feasible to deliver the curriculum through learning communities.

 

School Three. SSS was delivered with reasonable fidelity. Five teachers (trained and supervised by the school counselor) delivered the SSS curriculum in the prescribed format detailed in the SSS classroom manual. The five teachers delivered SSS to all students in the school through various courses, including a study skills course (seventh and eighth grades), a social studies course (ninth grade), a biology course (tenth grade), a college readiness course (eleventh grade) and an English course (twelfth grade).

 

School Four. The SSS curriculum was delivered with reasonable fidelity to all grade levels (sixth through eighth) during social studies courses. The social studies teachers were trained and supervised by the school counselor to deliver the program.

 

School Five. The SSS curriculum was not delivered with complete fidelity. The school leadership determined it more feasible to deliver the curriculum through learning communities, two to three times a week for 25 minutes per session, over a 5-week period.

 

Procedures for Collecting SESSS and Jr. MAI Data

Pretest data were collected at the beginning of the 2013 school year, and posttest data were collected in late April and early May of the 2013–2014 school year, before end-of-grade standardized testing began. Prior to beginning the project, school counselors completed Collaborative Institutional Training Initiative training to alert them to issues relating to voluntary participation and confidentiality. School counselors were then trained to follow an instrument administration manual (developed for the project) so that they could administer the SESSS and the Jr. MAI in a standardized, scripted fashion. In order to protect the confidentiality of the students, school counselors recoded each student’s identification number by adding a randomly determined offset chosen separately for each school; no one besides the school counselor knew the offset used at a given school. All data were kept in a locked file cabinet in the primary investigator’s office. Data were entered into a database, which was saved on an encrypted, password-protected hard drive. As an additional safeguard, the data from each school were saved on an external hard drive and transported by hand to the primary investigator’s office.
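To make the de-identification step concrete, the following is a minimal sketch of the offset scheme described above. All names and values here are hypothetical illustrations, not details taken from the project:

```python
import secrets

def deidentify_ids(student_ids, offset=None):
    """Recode student IDs by adding one random constant per school.

    Only the school counselor retains `offset`, which is required to map
    coded IDs back to real ones. Hypothetical example, not project code.
    """
    if offset is None:
        offset = secrets.randbelow(900_000) + 100_000  # random 6-digit shift
    return [sid + offset for sid in student_ids], offset

# One school's (made-up) roster
coded_ids, secret_offset = deidentify_ids([20413, 20417, 20425])
print(coded_ids)  # shifted IDs; secret_offset stays with the counselor
```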

 

Instruments

 

Junior Metacognitive Awareness Inventory (Jr. MAI)

The present study used the 18-item version of the Junior Metacognitive Awareness Inventory (Jr. MAI; Sperling et al., 2002), a self-report scale that has two subscales that measure students’ knowledge of cognition and regulation of cognition. The Jr. MAI is used to screen learners for potential metacognitive and cognitive strategy interventions. Sperling et al. (2002) developed two versions of the Jr. MAI based on the Metacognitive Awareness Inventory (Schraw & Dennison, 1994). The 12-item version was developed for students in grades 3 through 5, while the 18-item version was developed for older students.

 

Available evidence suggests that the Jr. MAI and its subscales are reliable. Sperling et al. (2002) reported an internal consistency-based reliability estimate of .82 for the overall scale. Sperling, Richmond, Ramsay, and Klapp (2012) reported an internal consistency reliability of .76 for the knowledge of cognition subscale and .80 for the regulation of cognition subscale. Sperling et al. (2002) found that the Jr. MAI total score (for both versions of the instrument) correlated with other direct measures of student metacognition, but not with teachers’ ratings of students’ metacognitive abilities. Relatedly, Sperling et al. (2012) found that students’ scores on the 18-item version of the Jr. MAI correlated significantly with their scores on the Swanson Metacognitive Questionnaire (SMQ; Swanson, 1990), their science grade point average and their overall grade point average. More recently, the 12-item version of the Jr. MAI was used to measure the impact of SSS on students’ metacognitive functioning: Lemberger and Clemens (2012) found that SSS delivered in small group format to fourth- and fifth-grade students resulted in measurable increases in Jr. MAI scores.

 

Student Engagement in School Success Skills (SESSS) Survey

The study also employed the Student Engagement in School Success Skills (SESSS) survey. The SESSS (Carey et al., 2013) is a 27-item scale that was developed to measure the extent to which students use strategies that have been shown to be related to enhanced academic achievement (Hattie et al., 1996; Masten & Coatsworth, 1998; Wang et al., 1994). The SESSS has three subscales that measure students’ self-direction of learning, support of classmates’ learning and self-regulation of arousal.

 

Carey et al. (2013), in an exploratory factor analysis of the SESSS scores of 402 fourth through sixth graders, found that a four-factor solution provided the best model of scale dimensionality, considering both the solution’s clean factor structure and the interpretability of the factors. However, in a confirmatory factor analysis study using SESSS scores from a diverse sample of almost 4,000 fifth-grade students (Brigman et al., 2014), researchers found that while a four-factor model fit the data well, two of the factors correlated so highly (r = .90) as to be empirically indistinguishable. Consequently, the items associated with these two factors were combined, and the resulting three-factor model fit the data better.

 

Brigman et al. (2014) suggested that the SESSS is best thought of as having three underlying factors corresponding to self-direction of learning, support of classmates’ learning, and self-regulation of arousal. Based on factor loadings, Brigman et al. (2014) created three SESSS subscales. The self-direction of learning subscale (19 items) reflects the students’ intentional use of cognitive and metacognitive strategies to promote their own learning. The support of classmates’ learning subscale (six items) reflects the students’ intentional use of strategies to help classmates learn effectively. Finally, the self-regulation of arousal subscale (three items) reflects students’ intentional use of strategies to control disabling anxiety and cope with stress.

 

Available data indicate that the SESSS is a reliable assessment tool. Carey et al. (2014) reported an overall alpha coefficient of .91. Furthermore, Villares et al. (2014) reported that the coefficient alphas for the three SESSS subscales (self-direction of learning, support of classmates’ learning and self-regulation of arousal) were .89, .79 and .68, respectively.
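For readers unfamiliar with how such internal consistency estimates are obtained, the sketch below computes coefficient alpha directly from its standard formula. The item responses are invented for illustration; they are not study data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Coefficient alpha for an (n_respondents x n_items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var_sum = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical 1-5 Likert responses to a three-item subscale, five respondents
responses = [[2, 3, 2],
             [4, 4, 5],
             [3, 3, 3],
             [1, 2, 2],
             [5, 4, 4]]
print(round(cronbach_alpha(responses), 2))
```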

 

Data Analysis

To answer the current study’s research questions, the authors conducted separate multivariate analyses of variance (MANOVA) with a repeated measure (pretest-posttest time) for the Jr. MAI and the SESSS. For the Jr. MAI MANOVA, the two subscales (knowledge of cognition and regulation of cognition) were the dependent variables; for the SESSS MANOVA, the three subscales (self-direction of learning, support of classmates’ learning and self-regulation of arousal) were the dependent variables.
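As an illustration of this design, the sketch below runs an analogous analysis on toy data in long format. Note that pingouin’s rm_anova is a univariate two-way repeated-measures ANOVA (subscale × time), used here only as a simpler stand-in for the repeated-measures MANOVA the authors conducted; the data frame, column names and scores are all hypothetical:

```python
import pandas as pd
import pingouin as pg

# Toy long-format data: one row per student x subscale x time point.
df = pd.DataFrame({
    "student":  [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3, 4, 4, 4, 4],
    "subscale": ["knowledge", "knowledge", "regulation", "regulation"] * 4,
    "time":     ["pre", "post"] * 8,
    "score":    [3.1, 3.2, 3.9, 4.0, 3.4, 3.3, 3.8, 3.9,
                 2.9, 3.0, 4.1, 4.1, 3.3, 3.5, 3.7, 4.0],
})

# Main effects of subscale and time, plus the subscale x time
# interaction (cf. the effects reported in Table 2).
aov = pg.rm_anova(data=df, dv="score", within=["subscale", "time"],
                  subject="student")
print(aov)
```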

 

After performing the MANOVAs, follow-up repeated measures analyses of variance (ANOVA) were conducted, where appropriate, to determine the significance of the pretest-posttest changes for individual subscales. Effect sizes (Cohen, 1988) also were calculated to determine the magnitude of the pretest-posttest change in subscale scores associated with the intervention. For subscales showing significant change, effect sizes were compared across schools to ascertain whether the level of fidelity of SSS implementation was related to the size of the intervention’s effect.
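The effect-size step is simple enough to show directly. The sketch below uses a pooled-SD pre-post formulation of Cohen’s d (an assumption: the article does not state which variant was used), which reproduces the value later reported in Table 3 for the self-regulation of arousal subscale:

```python
import math

def cohens_d(m_pre, sd_pre, m_post, sd_post):
    """Cohen's d for a pre-post change, pooling the SDs of the two
    time points. Sign convention follows Table 3 (pretest minus posttest)."""
    pooled_sd = math.sqrt((sd_pre ** 2 + sd_post ** 2) / 2)
    return (m_pre - m_post) / pooled_sd

# Self-regulation of arousal (Table 3): pre M = 2.16, SD = .88;
# post M = 2.32, SD = .87; reported d = -.18.
print(round(cohens_d(2.16, 0.88, 2.32, 0.87), 2))  # -0.18
```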

 

Results

 

To answer the primary evaluation question, repeated measures (pretest-posttest) MANOVAs were performed to determine differences on the Jr. MAI and SESSS subtests across the two time points. The primary question was: When implemented in a naturalistic setting, does SSS impact students’ metacognitive functioning, as determined by (1) knowledge and regulation of cognition as measured by the Jr. MAI (Sperling et al., 2002) and (2) use of skills related to self-direction of learning, support of classmates’ learning and self-regulation of arousal as measured by the SESSS (Carey et al., 2013)? The results of these MANOVAs are shown in Table 2. For the Jr. MAI, the repeated measures MANOVA revealed a significant difference (F(1, 1562) = 3267.47, p < .001) between the two subscales (knowledge of cognition and regulation of cognition). However, no significant difference existed between the pretest and posttest time points, and the Subscale × Time interaction effect was not significant.

 

Table 2

 

Repeated Measures MANOVA: Effects of SSS on Students’ Metacognitive Activity

Effect                      Jr. MAI (F)     SESSS (F)
Main effects
  Subtest                   3267.47***      356.24***
  Time                      0.39            19.84***
Interaction effect
  Subtest × Time            1.44            28.25***

Note. ***p < .001

 

In contrast, the SESSS repeated measures MANOVA revealed both a significant main effect of Subscale (F(2, 1610) = 356.24, p < .001) and a significant Subscale × Time interaction effect (F(2, 1610) = 28.25, p < .001). Figure 1 shows that the self-regulation of arousal subscale (Subtest 3) corresponded to a significantly greater mean change across time compared to the self-direction of learning (Subtest 1) and support of classmates’ learning (Subtest 2) subscales.

 

Figure 1. Pre-Post SSS Treatment Changes in Metacognitive Functioning and Success Skill Use: Means for Jr. MAI and SESSS subtests at Pretest (Time 1) and Posttest (Time 2). [Figure omitted: left panel shows Jr. MAI means; right panel shows SESSS means.]

For the Jr. MAI, Subtest 1 = Knowledge of Cognition and Subtest 2 = Regulation of Cognition. For the SESSS, Subtest 1 = Self-Direction of Learning, Subtest 2 = Support of Classmates’ Learning, and Subtest 3 = Self-Regulation of Arousal.

 

Based on these MANOVA results, the authors conducted follow-up repeated measures ANOVAs to test the significance of pretest-posttest changes for all three SESSS subscales. Only the SESSS self-regulation of arousal subscale showed a significant change over time (F(1, 1610) = 46.147, p < .001), reflecting a self-reported increase in students’ abilities to regulate their levels of potentially debilitating arousal after SSS participation. As shown in Table 3, the effect size of the self-regulation of arousal subscale’s pretest-posttest change (Cohen’s d = -.18) would be classified as small (Cohen, 1988).

 

 

Table 3

 

Effect Sizes of ANOVAs with Repeated Measures (T1 and T2) for Jr. MAI and SESSS

 

Component                          Pre M     Pre SD    Post M    Post SD    Cohen’s d
Knowledge of Cognition             3.18      .83       3.16      .85        +.02
Regulation of Cognition            3.96      .60       3.96      .65        +.00
Self-Direction of Learning         2.32      .65       2.33      .66        -.02
Support of Classmates’ Learning    2.59      .74       2.63      .74        -.05
Self-Regulation of Arousal         2.16      .88       2.32      .87        -.18

 

The secondary research question was: Does the magnitude of any changes in metacognitive functioning depend on the degree to which SSS was implemented with fidelity? Implementation fidelity was not strongly related to SSS effect size. Cohen’s d effect sizes (Cohen, 1988) for the impact of SSS on self-regulation of arousal were computed for each school. Schools 2, 3 and 4 (which implemented with reasonable fidelity) had effect sizes of .20, .17 and .26, respectively; Schools 1 and 5 (which deviated most from the recommended implementation) had effect sizes of .30 and .13, respectively. Thus, the schools in this study showed considerable variability in effect sizes (.13 to .30). School differences across other factors (e.g., experience levels of SSS leaders, grade levels of students) may have contributed to this variability.

 

In summary, although no significant pretest-posttest differences were found on the Jr. MAI, a significant change was found on the SESSS self-regulation of arousal subscale (p < .001), indicating that students increased their ability to regulate levels of potentially debilitating arousal after participating in the SSS intervention. Examination of Cohen’s d effect sizes suggested that implementation fidelity, or the degree to which schools varied from the scope and sequence laid out in the SSS manuals, was not related to the size of the effect. This result suggests that the SSS intervention produced positive outcomes even when practitioners modified the scope and sequence to fit the needs of their setting.

 

Discussion

 

Evaluation Question 1. Does SSS delivered in a naturalistic setting impact students’ metacognitive functioning? The results of the present study suggest that when implemented in a naturalistic setting, SSS can be expected to result in statistically significant increases in students’ abilities to regulate potentially debilitating emotional arousal. These enhanced abilities might reasonably be expected to result in benefits related to improved academic performance (Durlak et al., 2011) and better school behavior (perhaps helping students increase their self-control related to daily interpersonal conflict and stressful events; Galla & Wood, 2015). The overall effect of SSS on emotional self-regulation, while statistically significant, was comparatively small. The present study failed to find evidence that SSS influenced other aspects of students’ metacognitive functioning, including their knowledge of cognition, regulation of cognition, use of strategies related to the self-direction of learning, or use of strategies to support fellow classmates’ learning.

 

Evaluation Question 2. Does the magnitude of any changes in metacognitive functioning depend on the degree to which SSS was implemented with fidelity? While the schools in this study showed considerable variability in effect sizes, implementation fidelity was not strongly related to SSS effect size. For example, School 1 scored lowest on implementation fidelity but demonstrated the greatest effect size (.30). The degree of departure from fidelity was evidently not large enough to detract from SSS’s effect on students’ self-regulation of arousal.

 

Relationship to Previous SSS Findings

The present study failed to replicate the results of previous studies that found significant effects of SSS on students’ metacognition (Lemberger & Clemens, 2012; Lemberger et al., 2015). While both the present study and Lemberger and Clemens’ (2012) study used the Jr. MAI to measure changes in students’ metacognition, the two studies differed in SSS delivery format (classroom vs. small group). The failure to replicate Jr. MAI-measured changes after SSS participation may therefore be related to differences in delivery format (classroom vs. group) or to differences in delivery context (naturalistic vs. controlled).

 

While this study and the Lemberger et al. (2015) study both delivered the SSS classroom format, these studies differed in the instruments they used to measure students’ metacognitive functioning (the SESSS and Jr. MAI vs. the BRIEF). Differences in results may be related to differences in instrumentation or to differences in the delivery context (naturalistic vs. controlled).

 

Limitations

A limitation of this study is the use of a one-group pre-post design rather than a quasi-experimental design with a control group. Study researchers were constrained by the practical realities of the school environment; it was not feasible to implement SSS in a way that would yield a comparison group, and researchers were unable to locate similar schools willing to have students participate in a comparison group. Since it is not always feasible to employ a control group design, a one-group pre-post design may be used to attempt to replicate the findings of stronger research studies and can point to findings that need to be investigated later using stronger evaluation designs (D. T. Campbell & Stanley, 1963; Shadish, Cook, & Campbell, 2001). The present evaluation, in fact, furthers the understanding of the effects of SSS and highlights directions for future research.

 

The failure to collect data in some schools resulted in losses of both Jr. MAI and SESSS data: complete corresponding pretest and posttest surveys were obtained from only 57% and 59% of participating students, respectively. While weak control over data collection is an inherent problem in natural-setting evaluations, care should be taken in future studies to strengthen data collection processes, as it is difficult to gauge the impact of the lost data in this study; in smaller studies, such loss could seriously threaten validity. In the present study, there was no reason to believe that the loss of data was related to students’ reactivity to the intervention; however, it is possible that this loss affected the results, and additional studies are needed to address this issue.

 

Implications for Future Research

It is important to understand the extent and nature of SSS’s impact on students’ metacognitive functioning; whether SSS’s well-established impact on academic performance (Brigman & Campbell, 2003; Brigman, Webb, & Campbell, 2007; C. Campbell & Brigman, 2005; Lemberger et al., 2015; Webb et al., 2005) is mediated by changes in students’ metacognitive functioning or by other variables; and whether the impact on academic performance reflects general improvement across all students or improvement among specific groups of students. Larger-scale intervention studies are needed to understand the relationships between proximal SSS-related changes in students’ abilities and functioning (e.g., metacognitive functioning, emotional self-regulation, engagement, self-efficacy and motivation) and distal changes in students’ academic performance, and to identify which proximal changes mediate the distal improvements. SSS’s effects on academic performance could be mediated by one or several of these variables. They could also be mediated by relatively broad variables (e.g., self-efficacy) that would be expected to be evident in virtually all students, or by relatively specific ones (e.g., reductions in debilitating test anxiety) that would be expected to be evident in only some students.

 

Understanding the mediator(s) of SSS’s effects on academic performance would be useful in identifying the most appropriate target groups for this intervention. As such, future studies are needed to explore whether SSS is most appropriate as a Tier 1 intervention for all students or as a Tier 2 intervention for some groups of “at-risk” students. Given that the results of the present study suggested that SSS helped students who struggled with self-regulation of arousal, it is especially important to examine the effectiveness of SSS as a Tier 2 intervention, specifically for students who demonstrate difficulties with emotional self-regulation. Finally, further research is needed to determine the practical significance of SSS on academic performance when it is implemented in less controlled, more naturalistic settings, and to determine how deviations from implementation fidelity and other contextual factors (e.g., expertise of the SSS leaders) correspond to expected social-emotional and academic achievement-related outcomes.

 

Implications for Practice

The results of the present study indicated that SSS, even when implemented in a naturalistic school setting (as opposed to a highly controlled setting), can have a positive impact on students’ abilities to regulate their emotional arousal. The magnitude of the overall impact appears to be relatively small. However, readers should note that this effect size was computed for students in the general population, not students experiencing difficulties with emotional self-regulation; SSS would likely have shown a larger estimated effect size if the target group had been students with emotional self-regulation difficulties. Notably, the SSS curriculum positively impacted student outcomes even when the program was not implemented exactly as designed. Though practitioners are encouraged to follow the manual and recommended schedule, these results are encouraging in that impacts can still be found when practitioners modify the design to fit their setting.

 

Conflict of Interest and Funding Disclosure

The authors reported no conflict of interest for the development of this manuscript. This project was supported by an Elementary and Secondary School Counseling Demonstration Grant project from the Department of Education, no. S215E13422.

 

 

 

References

 

Bransford, J. D., Brown, A. L., & Cocking, R. R. (1999). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.

Brigman, G., & Campbell, C. (2003). Helping students improve academic achievement and school success behavior. Professional School Counseling, 7(2), 91–98.

Brigman, G., Campbell, C., & Webb, L. (2004). Student Success Skills: Group counseling manual. Boca Raton, FL: Atlantic Education Consultants.

Brigman, G., & Webb, L. (2012). Student success skills: Classroom manual (3rd ed.). Boca Raton, FL: Atlantic Education Consultants.

Brigman, G., Webb, L., & Campbell, C. (2007). Building skills for school success: Improving the academic and social competence of students. Professional School Counseling, 10, 279–288.

Brigman, G., Wells, C., Webb, L., Villares, E., Carey, J. C., & Harrington, K. (2014). Psychometric properties and confirmatory factor analysis of the Student Engagement in School Success Skills. Measurement and Evaluation in Counseling and Development. Advance online publication. doi:10.1177/0748175614544545

Campbell, C., & Brigman, G. (2005). Closing the achievement gap: A structured approach to group counseling. Journal for Specialists in Group Work, 30, 67–82.

Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Skokie, IL: Rand McNally.

Carey, J., Brigman, G., Webb, L., Villares, E., & Harrington, K. (2013). Development of an instrument to measure student use of academic success skills: An exploratory factor analysis. Measurement and Evaluation in Counseling and Development, 47, 171–180. doi:10.1177/0748175613505622

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Collaborative for Academic, Social, and Emotional Learning. (2015). CASEL guide: Effective social and emotional learning programs. Middle and high school edition. Chicago, IL: Author. Retrieved from http://secondaryguide.casel.org/casel-secondary-guide.pdf

Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., & Schellinger, K. B. (2011). The impact of enhancing students’ social and emotional learning: A meta-analysis of school-based universal interventions. Child Development, 82, 405–432. doi:10.1111/j.1467-8624.2010.01564.x

Galla, B. M., & Wood, J. J. (2015). Trait self-control predicts adolescents’ exposure and reactivity to daily stressful events. Journal of Personality, 83, 69–83.

Gioia, G. A., Isquith, P. K., Guy, S. C., & Kenworthy, L. (2000). Behavior rating inventory of executive function. Odessa, FL: Psychological Assessment Resources.

Greenberg, M. T., Weissberg, R. P., O’Brien, M. U., Zins, J. E., Fredericks, L., Resnik, H., & Elias, M. J. (2003). Enhancing school-based prevention and youth development through coordinated social, emotional, and academic learning. American Psychologist, 58, 466–474. doi:10.1037/0003-066X.58.6-7.466

Guy, S. C., Isquith, P. K., & Gioia, G. A. (2004). Behavior Rating Inventory of Executive Function-Self-Report Version. Odessa, FL: Psychological Assessment Resources.

Hattie, J., Biggs, J., & Purdie, N. (1996). Effects of learning skills interventions on student learning: A meta-analysis. Review of Educational Research, 66, 99–136. doi:10.3102/00346543066002099

Lemberger, M. E., & Clemens, E. (2012). Connectedness and self-regulation as constructs of the Student Success Skills program in inner-city African American elementary school students. Journal of Counseling & Development, 90, 450–458.

Lemberger, M. E., Selig, J. P., Bowers, H., & Rogers, J. E. (2015). Effects of the Student Success Skills program on executive functioning skills, feelings of connectedness, and academic achievement in a predominantly Hispanic, low-income middle school district. Journal of Counseling & Development, 93, 25–37.

Marzano, R. J., Pickering, D. J., & Pollock, J. E. (2001). Classroom instruction that works: Research-based strategies for increasing student achievement. Alexandria, VA: Association for Supervision and Curriculum Development.

Masten, A. S., & Coatsworth, J. D. (1998). The development of competence in favorable and unfavorable environments: Lessons from research on successful children. American Psychologist, 53, 205–220.

Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19, 460–475.

Shadish, W., Cook, T., & Campbell, D. (2001). Experimental and quasi-experimental designs for generalized causal inference (2nd ed.). Boston, MA: Cengage.

Sperling, R. A., Howard, B. C., Miller, L. A., & Murphy, C. (2002). Measures of children’s knowledge and regulation of cognition. Contemporary Educational Psychology, 27, 51–79. doi:10.1006/ceps.2001.1091

Sperling, R. A., Richmond, A. S., Ramsay, C. M., & Klapp, M. (2012). The measurement and predictive ability of metacognition in middle school learners. The Journal of Educational Research, 105, 1–7.

Swanson, H. L. (1990). Influence of metacognition and aptitude on problem solving. Journal of Educational Psychology, 82, 306–314.

Villares, E., Colvin, K., Carey, J. C., Webb, L., Brigman, G., & Harrington, K. (2014). Convergent and divergent validity of the Student Engagement in School Success Skills survey. The Professional Counselor, 4, 541–552.

Wang, M. C., Haertel, G. D., & Walberg, H. J. (1994). What helps students learn? Educational Leadership, 51(4), 74–79.

Webb, L., & Brigman, G. (2006). Student success skills: Tools and strategies for improved academic and social outcomes. Professional School Counseling, 10, 112–120.

Webb, L., Brigman, G., & Campbell, C. (2005). Linking school counselors and student success: A replication of the Student Success Skills approach targeting the academic and social competence of students. Professional School Counseling, 8, 407–413.

Zins, J. E., Weissberg, R. P., Wang, M. C., & Walberg, H. J. (2004). Building academic success on social and emotional learning: What does the research say? New York, NY: Teachers College Press.

 

 

 

Brett Zyromski is an Assistant Professor at The Ohio State University. Melissa Mariani is an Assistant Professor at Florida Atlantic University. Boyoung Kim is a Research Professor at Korea University. Sangmin Lee is an Associate Professor at Korea University. John Carey is a Professor at the University of Massachusetts Amherst. This project was supported by an Elementary and Secondary School Counseling Demonstration Grant project from the Department of Education, no. S215E13422. Correspondence can be addressed to Brett Zyromski, Department of Educational Studies, Counselor Education, PAES Building, 305 W. 17th Ave., Columbus, OH, zyromski.1@osu.edu.