
The Professional Counselor, Volume 4, Issue 5, p. 543

Linking school counseling programs and interventions to improved student outcomes has become increasingly important (Carey et al., 2013). One way for school counselors to demonstrate the impact of classroom guidance and small group counseling on achievement is by measuring the impact of their interventions on intermediate variables associated with achievement. These intermediate variables include the previously mentioned cognitive, social and self-management skills and strategies. Instruments that measure these critically important fundamental learning skills and strategies are limited. The present article explores the convergent and divergent validity of the SESSS (Carey et al., 2013). The article builds upon previous research describing the item development and exploratory factor analysis of the SESSS (Carey et al., 2013) and a recently completed confirmatory factor analysis (Brigman et al., 2014). The current findings contribute to the establishment of the SESSS as a valid instrument for measuring the impact of school counselor-led interventions on intermediate variables associated with improved student achievement.

Method

Data collection for the SESSS occurred within the context of a multiyear, large-scale, randomized control trial funded through the U.S. Department of Education's Institute of Education Sciences. The purpose of the grant was to investigate the effectiveness of the SSS program (Brigman & Webb, 2010) with fifth graders from two large school districts (Webb, Brigman, Carey, & Villares, 2011). To guard against researcher bias, the authors hired data collectors to administer the SESSS, the Motivated Strategies for Learning Questionnaire (MSLQ) and the Self-Efficacy for Self-Regulated Learning (SESRL) measure, and standardized the training and data collection process.
The authors selected these particular surveys because they reflected factors known to be related to effective learning in different ways and because they provided a range of measures, some theoretically related to the SESSS and some not.

Procedures

During the 2011–2012 academic year, graduate students enrolled in master's-level Counselor Education programs at two universities were hired and trained in a one-day workshop to administer the SESSS, MSLQ and SESRL and to handle data collection materials. At the training, each data collector was assigned to five of the 60 schools across the two school districts. After obtaining approvals from the university institutional review board and the school districts, the research team members notified parents of fifth-grade students of the study via district call-home systems and sent a letter home explaining the study, its risks and benefits, its voluntary nature, and directions on how to decline participation. One month later, data collectors entered their assigned schools and participating classrooms to administer the study instruments. Prior to administering the instruments, each data collector read aloud the student assent. Students who gave their assent were instructed to place a precoded generic label at the top of their instrument; each data collector then read aloud the directions, along with each item and possible response choice, on the SESSS, MSLQ and SESRL. Each data collector was responsible for distributing and collecting all instruments and, upon leaving the school building, returning the completed instruments to a district project coordinator in accordance with the Survey Data Collection Manual. In addition, the data collector noted any student absences and/or irregularities, and confirmed that all procedures were followed.
The district project coordinators were responsible for verifying that all materials were returned and secured in a locked cabinet until they were ready to be shipped to a partner university for data analysis. The coordinators gathered demographic information from the district databases and matched it to the participating fifth-grade students and the precoded instrument labels through a generic coding system (district #1–2, school #1–30, classroom #1–6, student #1–25). The coordinators then saved the demographic information in a password-protected and encrypted Excel spreadsheet on an external device and shipped it to a partner university for data analysis.
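The de-identified coding system described above can be sketched in code. The label format below (zero-padded, hyphen-separated fields) is an illustrative assumption, not the study's actual scheme; only the field ranges (district 1–2, school 1–30, classroom 1–6, student 1–25) come from the text:

```python
# Illustrative sketch of a generic precoded label scheme, assuming a
# hypothetical "D-SS-CC-PP" format. Field ranges follow the description:
# district #1-2, school #1-30, classroom #1-6, student #1-25.

def make_label(district: int, school: int, classroom: int, student: int) -> str:
    """Build a de-identified label for one student's instrument packet."""
    assert 1 <= district <= 2 and 1 <= school <= 30
    assert 1 <= classroom <= 6 and 1 <= student <= 25
    return f"{district}-{school:02d}-{classroom:02d}-{student:02d}"

def parse_label(label: str) -> dict:
    """Recover the coded fields so demographics can be matched to instruments."""
    district, school, classroom, student = (int(p) for p in label.split("-"))
    return {"district": district, "school": school,
            "classroom": classroom, "student": student}

label = make_label(1, 12, 3, 7)   # → "1-12-03-07"
```

A scheme like this lets coordinators match district demographic records to instruments without any directly identifying information appearing on the instruments themselves.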
