Introduction to the Special Issue School Counselors and a Multi-Tiered System of Supports: Cultivating Systemic Change and Equitable Outcomes

Christopher A. Sink and Melissa S. Ockerman

Designed to improve preK–12 student academic and behavioral outcomes, a Multi-Tiered System of Supports (MTSS), such as Positive Behavioral Interventions and Supports (PBIS) or Response to Intervention (RTI), is a broadly applied framework now implemented in thousands of schools across the United States. Such educational restructuring and system changes require school counselors to adjust their activities and interventions to fully realize the aims of MTSS. In this special issue of The Professional Counselor, the roles and functions of school counselors in MTSS frameworks are examined from various angles. This introductory article summarizes the key issues and the basic themes explored by the special issue contributors.

Keywords: school counselors, multi-tiered system of supports, Positive Behavioral Interventions and Supports, Response to Intervention, implementation

School counselors must proactively adapt to the varied mandates of school reform and educational innovations. Similarly, with new federal and state legislation, they must align their roles and functions with these changing requirements (Baker & Gerler, 2008; Dahir, 2004; Gysbers, 2001; Herr, 2002; Leuwerke, Walker, & Shi, 2009; Paisley & Borders, 1995). One such initiative, the Multi-Tiered System of Supports (MTSS), requires educators to revise their assessment strategies, curriculum, pedagogy and interventions to best serve the academic, behavioral, and post-secondary education and career goals of all students (Lewis, Mitchell, Bruntmeyer, & Sugai, 2016). Specifically, MTSS is an umbrella term for a variety of school-wide approaches to improve student learning and behavior. The most familiar MTSS frameworks are Response to Intervention (RTI) and Positive Behavioral Interventions and Supports (PBIS; also referred to as Culturally Responsive PBIS, or CR PBIS). The latter model has been implemented throughout the U.S., spanning all 50 states and approximately 22,000 schools (H. Choi, personal communication, December 15, 2014). Moreover, 45 states have issued guidelines for RTI implementation and 17 states require RTI to be used in the identification of students with specific learning disabilities (Hauerwas, Brown, & Scott, 2013). Research indicates that when these frameworks are implemented with fidelity over several years, they represent best practice for supporting students at risk for academic or behavioral problems (Lane, Menzies, Ennis, & Bezdek, 2013; Lewis et al., 2016).

In 2014, the American School Counselor Association (ASCA) revised its RTI position statement to encompass MTSS, including both RTI and CR PBIS (ASCA, 2014). The writers averred that MTSS seamlessly aligns with the ASCA National Model (2012a) across the three developmental domains (academic, social-emotional, and college/career), although there is as yet little evidence to support this assumption. Nevertheless, school counselors should view MTSS frameworks as an opportunity to enhance their school counseling programs through the implementation of a data-driven, multi-tiered intervention system. Doing so allows school counselors to utilize and showcase their leadership skills with key stakeholders (e.g., parents, caregivers, teachers, administrators), to create systemic changes in their schools, and thus to foster equitable outcomes for all children.

The implementation of MTSS and its alignment with comprehensive school counseling programs (CSCPs) position school counselors to advance culturally responsive preventions and interventions to serve students and their families more effectively (Goodman-Scott, Betters-Bubon, & Donohue, 2016). By working collaboratively with school personnel to tap students’ strengths and create common goals, school counselors can build capacity and thereby broaden their scope of practice and accountability. Politically astute school counselors are wise to leverage their school’s MTSS framework as a way to access necessary resources, obtain additional training and further impact student outcomes.

The research is scant on school counselor involvement with—and effectiveness in—MTSS implementation. The available publications, including those presented in this special issue, suggest that the level of MTSS education and training for pre-service and in-service school counselors is insufficient (Cressey, Whitcomb, McGilvray-Rivet, Morrison, & Shander-Reynolds, 2014; Goodman-Scott, 2013, 2015; Goodman-Scott, Doyle, & Brott, 2014; Ockerman, Mason, & Hollenbeck, 2012; Ockerman, Patrikakou, & Feiker Hollenbeck, 2015). There are legitimate reasons for counselor reluctance and apprehension. For example, not only must school counselors add new and perhaps unfamiliar duties to an already harried work day, but some evidence also indicates that they are not well prepared for their MTSS responsibilities. Consequently, it is essential for both in-service professional development opportunities and pre-service preparation programs to focus on best practices for aligning CSCPs with MTSS frameworks (Goodman-Scott et al., 2016).

This special issue of The Professional Counselor was conceived to address the gaps in the counseling literature on successful school counselor MTSS training, implementation, and collaboration with other school personnel. The articles consider various facets of MTSS and their intersection with school counseling research and practice. Overall, the contributors hope to provide much-needed MTSS assistance and support to nascent and practicing school counselors.

Summary of Contributions

Sink’s lead article in this special issue situates the contributions that follow by offering a general overview of foundational MTSS theory and research, including PBIS and RTI frameworks. Subsequently, literature-based suggestions for incorporating MTSS into school counselor preparation curriculum and pedagogy are provided. MTSS roles and functions summarized in previous research are aligned to ASCA’s (2012b) School Counselor Competencies, the 2016 Council for Accreditation of Counseling and Related Educational Programs (CACREP) Standards for school counselors, and the ASCA (2012a) National Model.

The next two articles report on MTSS-related studies and specifically discuss new school counselor responsibilities associated with MTSS implementation. Ziomek-Daigle, Goodman-Scott, Cavin, and Donohue reveal through a case study the various ways MTSS and CSCPs reflect comparable features (e.g., school counselor roles, advocacy, accountability). The participating case study counselors were actively engaged in MTSS implementation at their school, suggesting that they had a relatively good idea of their responsibilities in this capacity. Addressing RTI in particular, Patrikakou, Ockerman, and Hollenbeck’s investigation reported that while most school counselors expressed positive opinions about this MTSS framework, they lacked the self-assurance to adequately perform key RTI tasks (e.g., accountability and collaboration). Perceived counselor deficiencies in RTI implementation also point to a potential disconnect between the ASCA (2012a) National Model’s program components and themes and current RTI training of pre-service and practicing school counselors, thus suggesting a need for improved pre-service and in-service education.

School counselors are called upon to be culturally responsive and competent. They are advocates for social justice and equity for all students (Ratts, Singh, Nassar-McMillan, Butler, & McCullough, 2016; Singh, Urbano, Haston, & McMahan, 2010). Two articles speak to this issue within the educational context of MTSS. Belser and colleagues maintain that the ASCA (2012a) National Model and MTSS are beneficial operational frameworks to support all students, including marginalized and so-called problem learners (e.g., at-risk students). An integrated model is then proffered as a way to improve the educational outcomes of disadvantaged students. Positive and culturally sensitive alternatives to punishment-oriented school discipline methods are discussed as well. Similarly, Betters-Bubon, Brunner, and Kansteiner address school counselor roles in devising and sustaining culturally responsive PBIS programs that meet student social, behavioral and emotional needs. In particular, they report on an action research case study showing how an elementary school counselor partnered with other stakeholders (i.e., school administrator, psychologist, teachers) to achieve this goal.

The final article by Harrington, Griffith, Gray, and Greenspan overviews a recent grant project intended to establish a quality data-driven MTSS model in an elementary school. The manuscript spotlights the role of the school counselor who collaborated with other project leaders and educators to use social-emotional data to inform and improve practice. Specifics are provided so other practitioners can replicate the project in their schools. In brief, this contribution emphasizes the importance of data-based decision-making in MTSS implementation.

Conclusion

School counselors are faced with a myriad of responsibilities that severely tax their energy and time. Competing demands from internal and external stakeholders as well as legislative changes and educational innovations stretch these practitioners to be more efficient and effective in their services to students and families. Regrettably, MTSS implementation adds to counselors’ “accountability stress.” Some counselors anticipate that PBIS and RTI frameworks will go the way of other short-lived educational trends, relieving them of the responsibility to take action. However, anecdotal and empirical evidence reported in this special issue and elsewhere suggests these professionals are in the minority. School counselors largely perceive the potential and real value of MTSS programs. They desire to partner with other school educators to help all children and youth succeed. As contributors to this issue indicate, the ASCA (2012a) National Model and PBIS and RTI frameworks can be integrated to achieve higher student academic and social-emotional outcomes. With these articles, school counselors-in-training and practitioners have additional support to successfully address their MTSS duties and advocate for increased education in this area. Continued research is needed to guide efficacious MTSS practice designed to foster equitable educational outcomes for all students.

Conflict of Interest and Funding Disclosure

The authors reported no conflict of interest or funding contributions for the development of this manuscript.

References

American School Counselor Association. (2012a). The ASCA national model: A framework for school counseling programs (3rd ed.). Alexandria, VA: Author.

American School Counselor Association. (2012b). ASCA school counselor competencies. Retrieved from https://www.schoolcounselor.org/asca/media/asca/home/SCCompetencies.pdf

American School Counselor Association. (2014). Position statement: Multi-tiered systems of support. Alexandria, VA: Author. Retrieved from https://www.schoolcounselor.org/asca/media/asca/PositionStatements/PS_MultitieredSupportSystem.pdf

Baker, S. B., & Gerler, E. R., Jr. (2008). School counseling in the twenty-first century (5th ed.). Columbus, OH: Merrill Prentice Hall.

Council for Accreditation of Counseling and Related Educational Programs. (2016). 2016 CACREP standards. Retrieved from http://www.cacrep.org/for-programs/2016-cacrep-standards

Cressey, J. M., Whitcomb, S. A., McGilvray-Rivet, S. J., Morrison, R. J., & Shander-Reynolds, K. J. (2014). Handling PBIS with care: Scaling up to school-wide implementation. Professional School Counseling, 18, 90–99. doi:10.5330/prsc.18.1.g1307kql2457q668

Dahir, C. A. (2004). Supporting a nation of learners: The role of school counseling in educational reform. Journal of Counseling & Development, 82, 344–353. doi:10.1002/j.1556-6678.2004.tb00320.x

Goodman-Scott, E. (2013). Maximizing school counselors’ efforts by implementing school-wide positive behavioral interventions and supports: A case study from the field. Professional School Counseling, 17, 111–119.

Goodman-Scott, E. (2015). School counselors’ perceptions of their academic preparedness and job activities. Counselor Education and Supervision, 54, 57–67.

Goodman-Scott, E., Betters-Bubon, J., & Donohue, P. (2016). Aligning comprehensive school counseling programs and positive behavioral interventions and supports to maximize school counselors’ efforts. Professional School Counseling, 19, 57–67.

Goodman-Scott, E., Doyle, B., & Brott, P. (2014). An action research project to determine the utility of bully prevention in positive behavior support for elementary school bullying prevention. Professional School Counseling, 17, 120–129.

Gysbers, N. C. (2001). School guidance and counseling in the 21st century: Remember the past into the future. Professional School Counseling, 5(2), 96–105.

Hauerwas, L. B., Brown, R., & Scott, A. N. (2013). Specific learning disability and response to intervention: State-level guidance. Exceptional Children, 80, 101–120. doi:10.1177/001440291308000105

Herr, E. L. (2002). School reform and perspectives on the role of school counselors: A century of proposals for change. Professional School Counseling, 5, 220–234.

Lane, K. L., Menzies, H. M., Ennis, R. P., & Bezdek, J. (2013). School-wide systems to promote positive behaviors and facilitate instruction. Journal of Curriculum and Instruction, 7, 6–31. doi:10.3776/joci.2013.v7n1pp6-31

Leuwerke, W. C., Walker, J., & Shi, Q. (2009). Informing principals: The impact of different types of information on principals’ perceptions of professional school counselors. Professional School Counseling, 12, 263–271. doi:10.5330/PSC.n.2010-12.263

Lewis, T. J., Mitchell, B. S., Bruntmeyer, D. T., & Sugai, G. (2016). School-wide positive behavior support and response to intervention: System similarities, distinctions, and research to date at the universal level of support. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Handbook of response to intervention: The science and practice of multi-tiered systems of support (2nd ed.; pp. 703–717). New York, NY: Springer.

Ockerman, M. S., Mason, E. C. M., & Hollenbeck, A. F. (2012). Integrating RTI with school counseling programs: Being a proactive professional school counselor. Journal of School Counseling, 10(15), 1–37. Retrieved from http://jsc.montana.edu/articles/v10n15.pdf

Ockerman, M. S., Patrikakou, E., & Feiker Hollenbeck, A. (2015). Preparation of school counselors and response to intervention: A profession at the crossroads. The Journal of Counselor Preparation and Supervision, 7, 161–184. doi:10.7729/73.1106

Paisley, P. O., & Borders, L. D. (1995). School counseling: An evolving specialty. Journal of Counseling and Development, 74, 150–153. Retrieved from https://www.researchgate.net/publication/232543894_School_Counseling_An_Evolving_Specialty

Ratts, M. J., Singh, A. A., Nassar-McMillan, S., Butler, S. K., & McCullough, J. R. (2016). Multicultural and social justice counseling competencies: Guidelines for the counseling profession. Journal of Multicultural Counseling and Development, 44, 28–48. doi:10.1002/jmcd.12035

Singh, A. A., Urbano, A., Haston, M., & McMahan, E. (2010). School counselors’ strategies for social justice change: A grounded theory of what works in the real world. Professional School Counseling, 13(3), 135–145.

Christopher A. Sink, NCC, is a Professor and Batten Chair of Counseling at Old Dominion University and can be reached at Darden College of Education, Norfolk, VA 23508, csink@odu.edu. Melissa S. Ockerman is an Associate Professor in the Counseling Program at DePaul University and can be reached at College of Education, Chicago, IL 60614, melissa.ockerman@depaul.edu.

 

Needs and Contradictions of a Changing Field: Evidence From a National Response to Intervention Implementation Study

Eva Patrikakou, Melissa S. Ockerman, Amy Feiker Hollenbeck

As a result of the Response to Intervention (RTI) mandate in schools across many states, school counselors are well positioned to take a leadership role in its implementation. The present study examines how school counselors across the nation perceived their training and knowledge of RTI, as well as their confidence in its implementation. Results indicate that while the majority of school counselors reported positive beliefs about RTI, they had limited confidence in their preparedness to perform certain RTI-related responsibilities, including collecting and analyzing data to determine intervention effectiveness and collaborating through teamwork. These perceived areas of deficiency point to a significant discrepancy with the American School Counselor Association National Model’s components and themes. Through building skills and capacity for leadership, school counselors can spearhead schoolwide teams to create and evaluate the effectiveness of culturally relevant and evidence-based interventions. School counselors and school counselor educators must use a multi-tiered system of supports as an opportunity to advance the field.

Keywords: collaboration, multi-tiered system of supports, Response to Intervention, school counselors, school counselor educators

The climate of accountability in today’s public schools requires all professionals to utilize data to inform decisions in the context of their practice, and the school counselor is no exception. Broader, statewide mandates such as Response to Intervention (RTI) have put additional pressure on school professionals, raising questions regarding practitioners’ preparedness to effectively utilize data to inform practice and collaborate with peers to support the needs of struggling students. The aim of this study is to examine school counselors’ beliefs, perceived level of preparedness and practices regarding RTI nationwide, specifically in states where this model has been implemented.

The reauthorization of the Individuals with Disabilities Education Act (IDEA) in 2004 and the subsequent 2008 regulations incentivized RTI, a multi-tiered system of academic and behavioral supports for struggling students (Zirkel & Thomas, 2010). In each tier of instruction, student needs and interventions are determined through ongoing data collection and analysis. To explicate, the general education environment comprises Tier 1 of RTI, with the integration of research-based practices, universal screening and differentiated small group instruction. If a child is not successful in this environment, he or she is targeted for Tier 2 intervention, small group instruction paired with ongoing progress monitoring. A continued lack of responsiveness moves the student to Tier 3, a more intensive level of intervention and progress monitoring, with possible referral for special education services (Fuchs, Mock, Morgan, & Young, 2003; National Joint Committee on Learning Disabilities, 2005; Vaughn & Fuchs, 2003). Thus, when determining whether a student has a specific learning disability (SLD) in an RTI framework, there should be a significant body of data regarding the child’s response to intervention to inform the eligibility process (Hauerwas, Brown, & Scott, 2013; Zirkel & Thomas, 2010).

RTI has become increasingly commonplace in states across the nation since the 2004 IDEA reauthorization (Individuals with Disabilities Education Improvement Act of 2004). A review of the Web sites of all 50 state departments of education indicated that 17 states require RTI in the process of identifying whether a student has an SLD, and 45 states have guidance documents to support the implementation of RTI (Hauerwas et al., 2013). In addition, Berkeley, Bender, Peaster, and Saunders (2009) found that 14 of 15 states required RTI to address both academic and behavioral domains. In a 2010 review of state laws and special education guidelines, Zirkel and Thomas noted that eight states required universal screening for academic and behavioral needs, while 23 recommended academic and behavioral screening. Thus, in some states the academic supports of RTI are specifically linked with the behavioral supports and interventions of Positive Behavioral Interventions and Supports (PBIS).

PBIS is a multi-tiered, data-based system of support for students with emotional and behavioral needs that incorporates ongoing assessments and data-based decision making, professional development in research-based practices, and provision of tiered intervention for students who need additional assistance (Sugai & Horner, 2006). Both RTI and PBIS share the premise that educational outcomes can be improved for all by integrating research-based practices in the general education environment (Fairbanks, Sugai, Guardino, & Lathrop, 2007; Hollenbeck, 2007; Sadler & Sugai, 2009; Sugai & Horner, 2009), and thus they are commonly combined in schoolwide frameworks. A multi-tiered system of supports (MTSS) is a comprehensive academic and behavioral model that integrates both RTI and PBIS (Averill & Rinaldi, 2011).

As with any significant educational reform, RTI/MTSS has a high likelihood of changing professional practices. For example, social workers have been urged to recognize the importance of evidence-based decisions and data collection when working with the social-emotional concerns of students (Harrison & Harrison, 2014) and to increase their collaborative practices (Avant, 2014). General educators, special educators and reading specialists in Pennsylvania indicated an increase in collaborative practices after RTI implementation (Bean & Lillenstein, 2012). Sullivan and Long (2010) reported that, in a survey of school psychologists, those who were actively involved with RTI spent a higher percentage of their time (25%) implementing academic interventions, compared with less than 5% for practitioners who were not actively involved. While there is an emerging body of research into the effects of RTI on the professional practice of school counselors within a handful of states (Betters-Bubon & Ratas, 2015; Luck & Webb, 2009; Miller, 2008; Ockerman, Patrikakou, & Hollenbeck, 2015; Ryan, Kaffenberger, & Caroll, 2011), there has yet to be a study of school counselors’ beliefs and perceptions of readiness to implement RTI at the national level, or of the impact of RTI upon school counselors’ professional practice.

In this article, we first review relevant literature focused on the changing role of the school counselor in relation to RTI/MTSS. Second, we present a nationwide study regarding school counselor perceptions, preparedness and professional practice in states mandating RTI or MTSS. Finally, we discuss implications for school counselor training and preparation and provide recommendations for future research and practice.

The Changing Role of the School Counselor in Multi-Tiered Frameworks

The American School Counselor Association (ASCA, 2014) recently revised its position statement on RTI to encompass MTSS. ASCA specifically outlined how all components of a comprehensive developmental school counseling program (foundation, delivery, management and accountability) align with a multi-tiered continuum and underscored school counselors’ pivotal role with data. To that end, school counselors must aid in data analysis to help identify students in need, evaluate counseling interventions to determine efficacy, and assist school staff in selecting evidence-based academic and behavioral strategies for students (ASCA, 2014; Ockerman, Mason, & Hollenbeck, 2012).

There were some notable efforts to promote school counselor involvement in this educational mandate prior to the publication of the ASCA MTSS position statement, including research conducted by the RTI Action Network (2009), which highlighted how innovative school counselors in three Western states (i.e., Colorado, Oklahoma and Wyoming) integrated their counseling services within an RTI framework. Zambrano, Castro-Villarreal, and Sullivan (2012) noted synergies between school counselors and school psychologists and called for increased collaboration to optimize services for students. Moreover, Ockerman and colleagues (2012) suggested that pairing comprehensive developmental school counseling programs with RTI has the potential to effectively serve all students, particularly those historically underserved, and to advance the position of the school counselor as a transformational leader. The authors also called for more robust research regarding the role of the school counselor and evidence-based practices using MTSS.

As such, Ockerman et al. (2015) investigated how school counselors in a Midwestern state perceived their training and knowledge of RTI, as well as their confidence in its implementation. Results indicated that the majority of school counselors had little confidence in their ability to perform essential roles, including the following: increasing parental involvement, engaging in collaborative practices, and using data to make decisions about student interventions. Overall, having knowledgeable, positive building leaders, such as school principals, assistant principals, and deans, in conjunction with a firm understanding of specific school counselor roles and responsibilities, predicted favorable views of RTI as a means to improve students’ academic and behavioral outcomes. Concomitant with these findings, Betters-Bubon and Ratas (2015) reported that school counselors in a neighboring Midwestern state experienced both positive outcomes (e.g., positive school climate, enhanced perception of the school counselor and increased teacher involvement) and barriers to success (e.g., increased record keeping, lack of training and buy-in, and lack of time to use data effectively) as a result of MTSS implementation. The authors also found that strong administrative support was associated with affirmative perceptions of MTSS, corroborating the findings of Ockerman et al. (2015). Finally, Bookard (2015) surveyed 35 elementary school counselors in North Carolina, all of whom were designated as RTI chairperson within their schools. School counselors reported a decreased amount of time to complete core school counseling responsibilities due to an increased demand to organize, communicate and coordinate logistics on behalf of the RTI team. However, these counselors reported increases in their self-efficacy to perform multiple counseling duties and perceived RTI as having a positive impact on student achievement.

While these efforts at understanding the impact of RTI/MTSS on the roles and responsibilities of school counselors should be lauded, they remain focused on the state level and therefore may be generalizable only to a particular state or region. Thus, there is an urgent need for research examining school counselors’ preparedness and experiences with RTI/MTSS nationwide, especially in states where this model has been implemented. The present study investigates school counselors’ beliefs, perceived level of preparedness, and practice regarding RTI. Specifically, the following research questions were investigated: (1) What are school counselors’ beliefs regarding RTI? (2) How prepared do school counselors feel regarding their training on the various implementation aspects of RTI? (3) What roles and responsibilities of school counselors changed due to the RTI implementation? (4) Is attitude toward RTI predicted by factors including demographics, as well as perceived confidence with various aspects of RTI?

Method

Participants

     Members of ASCA participated in this study by completing a survey. Participants were randomly selected from each of the 15 states that were reported as implementing RTI fully or partly at the time of this study’s construction (Zirkel, 2014). Specifically, participants were targeted in the following states: Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois, Iowa, Louisiana, Maine, New Mexico, New York, Rhode Island, West Virginia and Wisconsin.

In looking at the characteristics of survey respondents, 99% indicated they were currently practicing, with 96% employed full-time. Eighty-two percent were between 31 and 60 years old, and 85% were female. Ninety-two percent reported working in public school settings. Twenty-seven percent indicated working in an elementary setting, 14% in an elementary-middle school, 19% in a middle school, and 35% in a high school. A total of 81% indicated six years or more of practice as a school counselor, with 73% indicating six years or more since their last degree conferral (see Table 1 for demographic information).

Table 1
Participant Demographics

                                                  Percent
Currently practicing                                 99
Full-time employment                                 96
Age
   25 or under                                        1
   26–30                                              8
   31–40                                             33
   41–60                                             49
   Over 60                                            9
Sex
   Female                                            85
   Male                                              15
School Setting
   Public                                            92
   Charter                                            3
   Private                                            5
School Population
   Elementary                                        27
   Elementary/Middle                                 14
   Middle School                                     19
   Middle/High School                                 4
   High School                                       35
   K–12                                               1
Years in Practice
   1–5 years                                         19
   6–10 years                                        36
   11–15 years                                       19
   16+ years                                         26
Years Since Final Degree
   1–5 years                                         27
   6–10 years                                        34
   11–15 years                                       17
   16+ years                                         22

Measures

The survey was originally developed for a statewide investigation of school-based professionals in response to RTI (Hollenbeck & Patrikakou, 2014), which was then adapted specifically for school counselors and administered in the same Midwestern state (Ockerman et al., 2015). It is important to note that survey items align with the ASCA National Model (2003, 2005, 2012). Specifically, questions paralleled the four ASCA model quadrants (foundation, delivery, management and accountability) and their four surrounding themes (advocacy, collaboration, leadership and systemic change). For example, survey questions, such as perceived preparedness for counseling interventions at each tier, represented the delivery component, and items about data collection and data management systems were representative of the accountability component. Themes also were assessed through survey questions, including items addressing leadership responsibilities and effective teamwork within the RTI framework (see Table 2 for scales and specific ASCA quadrants and themes). The purpose of the survey was to illuminate school counselors’ participation in RTI, as well as their underlying beliefs and attitudes, with the goal of providing insight into changing professional practices and future preparation needs.

The survey comprised five parts. The first section addressed demographics (e.g., age, employment status, years in the field). The second section involved questions regarding RTI training and implementation (e.g., How many professional development sessions have you received in relation to RTI? What year did your school implement an RTI framework?). The third section contained 14 Likert-type items asking participants about their perceived level of preparation toward specific aspects of RTI (e.g., underlying rationale, counseling interventions for Tier 1, schoolwide data management systems for documentation and tier decision making). The fourth part included 14 Likert-type questions measuring participants’ beliefs and practices (e.g., “RTI is the best option to support struggling learners”; “RTI is a vehicle for promoting culturally responsive practices”). Lastly, the fifth section addressed changes to school counselors’ responsibilities due to RTI via seven yes-no questions, such as “I am now involved in data collection and/or data management in support of RTI decisions.” In addition, an open-ended question encouraged participants to share any additional thoughts on RTI and its implementation.

Procedure

The authors obtained a list of members from ASCA who had noted that they wished to receive ASCA-approved, research-related mailings. Participants were then randomly selected from each of the 15 states that were reported as implementing RTI fully or partly (Zirkel, 2014). Surveys were mailed to those randomly selected participants along with a self-addressed, prepaid return envelope. No incentives were provided for returned surveys. From 2,477 surveys mailed, 528 were returned, for a 21.3% return rate, higher than other online surveys (Cochrane & Laux, 2008; Sullivan, Long, & Kucera, 2011).

Scales

For the purpose of this study’s analyses, eight scales were used. These scales were constructed and tested in two previous research studies, and tests of internal consistency have yielded consistently robust results with high reliability coefficients (Hollenbeck & Patrikakou, 2014; Ockerman et al., 2015). The scales’ original construction was based on an extensive literature review of RTI and its implementation to incorporate all pertinent aspects of MTSS. To address construct and content validity, the survey underwent a piloting phase before being used in those earlier studies. During the pilot phase, items were reviewed by experts in the field as well as by 80 school-based professionals, who provided specific feedback (Hollenbeck & Patrikakou, 2014).

As a measure of internal consistency, Cronbach’s Alpha (α) was computed for each of the eight scales (scale items and reliability coefficients are reported in Table 2). For scales with more than two items, Cronbach’s α was calculated with and without each of the scale’s items to determine whether dropping an item would increase the scale’s internal consistency. There was no occasion in which the deletion of an item increased the α coefficient; therefore, no changes were made to the scales. Alpha coefficients ranged from .75 to .94. The use of a similar survey on a different population also obtained strong coefficients (Ockerman et al., 2015), indicating the robustness of the instrument across populations.
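For readers less familiar with the internal consistency index reported here, the following is a minimal sketch, not the authors’ code, of how Cronbach’s alpha can be computed for one scale from a respondents-by-items array of Likert scores. The function name and the example scores are hypothetical.

```python
# Minimal sketch of Cronbach's alpha for a single scale (hypothetical data).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array with shape (n_respondents, n_items)."""
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 5 respondents rating a 3-item scale on a 1-4 Likert scale
scores = np.array([[3, 4, 3],
                   [2, 2, 3],
                   [4, 4, 4],
                   [1, 2, 1],
                   [3, 3, 4]])
print(round(cronbach_alpha(scores), 2))  # prints 0.92 for this toy data
```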

Table 2
Scale Items and Cronbach’s Alpha Coefficients

Scale (number of items*)                                                Cronbach’s α
RTI background information (2)                                               .80
   – Historical overview
   – Underlying rationale
Responsibilities and benefits (2)                                            .75
   – Anticipated benefits
   – Roles and responsibilities within the tiered model
Tier service delivery model (2) (ASCA Model – Delivery Component)            .87
   – Tier service delivery model (general)
   – Tier service delivery model (specific to one’s school)
Counseling interventions (3) (ASCA Model – Delivery Component)               .94
   – Counseling interventions for Tier 1
   – Counseling interventions for Tier 2
   – Counseling interventions for Tier 3
Data collection, management, and implementation (3)
   (ASCA Model – Accountability Component)                                   .89
   – Collecting and analyzing outcome data to determine effectiveness of RTI interventions
   – Schoolwide data management systems for documentation and decision making about
     students who need supportive services within RTI
   – Assuming leadership in RTI implementation
Collaborative practices (2) (ASCA Model – Collaboration Component)           .86
   – Effective teamwork in RTI implementation
   – Informing and involving parents within an RTI framework
School building leadership and RTI competence (4)
   (ASCA Model – Leadership Theme)                                           .86
   – Principal describes RTI in a positive manner
   – Principal seems highly knowledgeable about RTI
   – Other building-level leaders highly knowledgeable about RTI
   – RTI concerns and challenges are addressed in a positive manner within my school
RTI viewed as beneficial (7)                                                 .84
   – RTI is the best option to support struggling learners
   – RTI is the best option to support students with social-emotional concerns
   – RTI can improve the outcomes for all students
   – RTI can improve the behavior outcomes for all students
   – RTI can inform the process of identifying students with learning disabilities (LD)
   – RTI data are sufficient in determining whether or not a student has an LD
   – RTI is a vehicle of promoting culturally responsive practices within my school

* Number of items

Data Analysis

Descriptive statistics were generated to address the first three research questions, while a simultaneous linear least squares regression model was tested to address the fourth question. Variance Inflation Factors (VIF) were calculated to test for multicollinearity in relation to the regression model. All VIFs were under 4, well below the threshold of 10 that is used as a rule of thumb to raise concerns regarding multicollinearity (O’Brien, 2007; Stevens, 1992). Additionally, White’s (1980) heteroscedasticity test was performed to determine whether the error term in the regression model had constant variance, to avoid using biased standard errors that would lead to invalid inference. Since White’s test indicated the existence of heteroscedasticity (χ² = 164.13, p < .01), the regression model was estimated with White’s correction for the standard errors.
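As an illustration of this analytic sequence, the sketch below (assumed, not the authors’ code) runs VIF screening, White’s heteroscedasticity test, and an OLS model re-estimated with White-corrected (HC0) standard errors using statsmodels. The variable names and simulated data are placeholders, not the study’s variables.

```python
# Minimal sketch: VIF screening, White's test, and OLS with HC0 robust SEs.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 528  # matches the number of returned surveys in the study
df = pd.DataFrame({
    "leadership_competence": rng.normal(size=n),
    "responsibilities_benefits": rng.normal(size=n),
    "years_in_practice": rng.integers(1, 30, size=n),
})
# Simulated outcome roughly echoing the reported pattern of predictors
df["rti_beneficial"] = (0.3 * df["leadership_competence"]
                        + 0.25 * df["responsibilities_benefits"]
                        + rng.normal(scale=0.8, size=n))

X = sm.add_constant(df[["leadership_competence",
                        "responsibilities_benefits",
                        "years_in_practice"]])
y = df["rti_beneficial"]

# Variance Inflation Factors (values approaching 10 would signal multicollinearity)
vifs = {col: variance_inflation_factor(X.values, i)
        for i, col in enumerate(X.columns) if col != "const"}
print(vifs)

# White's (1980) test for heteroscedasticity on the baseline OLS residuals
ols = sm.OLS(y, X).fit()
lm_stat, lm_pvalue, _, _ = het_white(ols.resid, X)
print(f"White's test: chi2 = {lm_stat:.2f}, p = {lm_pvalue:.3f}")

# Re-estimate with White's correction (heteroscedasticity-consistent HC0 SEs)
robust = sm.OLS(y, X).fit(cov_type="HC0")
print(robust.summary())
```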

Results

Descriptive Statistics

Research question 1: What are school counselors’ beliefs regarding RTI? Sixty-three percent of the respondents agreed and 13% strongly agreed with the statement that RTI can improve the academic outcomes of all students. Fewer participants indicated that RTI can improve the behavioral outcomes for all students (53% agreed and 9% strongly agreed). Seventy-five percent of participants agreed or strongly agreed that RTI is the best option to support struggling learners, while only 49% agreed or strongly agreed that RTI is the best option to support students with social and emotional concerns. Only about half of the respondents (54%) agreed or strongly agreed that RTI is a vehicle for promoting culturally responsive practices. The majority of participants agreed or strongly agreed that their school principal described RTI in a positive manner, but only 57% reported that they viewed their principal as highly knowledgeable about RTI. The same percentage of respondents (57%) agreed or strongly agreed with the statement that building leaders in general seemed knowledgeable, whereas only 46% agreed with the statement that the majority of their colleagues were in favor of RTI. While a striking majority of participants viewed RTI as informing the process of identifying students with learning disabilities (88%), only 26% agreed with the statement that RTI data are sufficient in determining whether or not a student has a learning disability (see Table 3).

Table 3

RTI Beliefs and Practices

Item (values are percentages)                                           SD     D     A    SA
RTI is the best option to support struggling learners                    3    22    66     9
RTI is the best option to support students with
   social-emotional concerns                                             6    45    44     5
RTI can improve academic outcomes for all students                       2    22    63    13
RTI can improve behavioral outcomes for all students                     3    35    53     9
RTI can inform the process of identifying students with
   learning disabilities                                                 3     9    71    17
RTI data are sufficient in determining whether or not a
   student has a learning disability                                    16    58    23     3
RTI is a vehicle for promoting culturally responsive practices           5    41    49     5
My principal describes RTI in a positive manner                          5    18    62    15
My principal seems highly knowledgeable about RTI                       12    31    43    14
Our building-level leaders seem highly knowledgeable about RTI          10    33    45    12
RTI concerns and challenges are addressed in a positive manner           8    30    55     7
The majority of colleagues are in favor of an RTI framework              9    45    43     3
RTI is viewed as a collaborative endeavor among school
   professionals in my school                                            8    33    51     8
There are building-wide supports for collaboration within my
   school (e.g., common planning time, teams, etc.)                     11    21    51    17

Note. SD = Strongly Disagree; D = Disagree; A = Agree; SA = Strongly Agree.

Research question 2: How prepared do school counselors feel regarding their training on the various implementation aspects of RTI? The top three aspects in which participants felt either adequately or expertly prepared are as follows: understanding the tiered service delivery model in general (69%), counseling interventions for Tier 1 (68%), and the anticipated benefits of RTI (66%). The bottom three aspects of RTI in which respondents felt adequately or expertly prepared include the following: the historical background of RTI (29%), schoolwide data management systems for documentation and decisions (36%), and collecting and analyzing data to determine effectiveness of RTI interventions (42%; see Table 4 for detailed percentages).

Table 4

Perceived Preparedness on Different Aspects of RTI

Aspect of RTI (values are percentages)                                  NP    SP    AP    EP
Historical overview of RTI                                              36    35    26     3
Underlying rationale of RTI                                              9    30    53     8
Anticipated benefits of RTI                                              8    27    56    10
Tiered service delivery model – general                                  6    25    54    15
Tiered service delivery model – school specific                         11    30    44    15
Role and responsibilities within the tiered model                       14    29    41    16
Counseling interventions for Tier 1                                     12    20    44    24
Counseling interventions for Tier 2                                     13    25    43    19
Counseling interventions for Tier 3                                     13    26    41    21
Collecting and analyzing data to determine effectiveness of
   RTI interventions                                                    23    35    34     8
Schoolwide data management systems for documentation &
   decision making                                                      26    38    27     9
Informing and involving parents within an RTI framework                 21    34    34    11
Effective teamwork in RTI framework                                     16    33    38    13
Assuming leadership in RTI implementation                               27    30    30    13

Note. NP = Not Prepared; SP = Somewhat Prepared; AP = Adequately Prepared; EP = Expertly Prepared.

Research question 3: What roles and responsibilities of school counselors changed due to the RTI implementation? The majority of respondents (55%) reported that their responsibilities had changed due to RTI. The top two new roles and responsibilities in which respondents reported being directly involved were collaborating with colleagues as part of an RTI team (52%) and involvement in data collection and data management in support of RTI (41%). The two responsibilities reported as least changed were directly providing Tier 1 academic services (14%) and assuming increased special education responsibilities (3%; Table 5 includes reported changes in various roles and responsibilities).

Table 5

Changes in Roles and Responsibilities

                                                                           Percent
Directly provide Tier 1 academic services                                     14
Directly provide Tier 1 behavioral services                                   23
Directly provide Tier 2 and/or Tier 3 academic interventions                  19
Directly provide Tier 2 and/or Tier 3 behavioral interventions                30
Involved in data collection and/or data management in support of RTI         41
Collaborate with colleagues as part of an RTI team                            52
Train others about RTI practices within my school or district                 21
Increased special education responsibilities                                   3

Regression Analysis

Research question 4: Is attitude toward RTI predicted by factors including demographics, as well as perceived confidence with various aspects of RTI? The full regression model accounted for 26% of the variance in perception of RTI as a beneficial change. In order to estimate the effect size for this analysis, Cohen’s f² was calculated. The effect size met Cohen’s (1988) convention for a large effect (f² = .35). As Cohen (1988) noted, effect size indicates “the degree to which the phenomenon is present in the population” (p. 9). In addition to the effect size, the Precision Efficacy Analysis for Regression method was used to test the appropriateness of the sample size, since regression analysis is used for prediction (Brooks & Barcikowski, 1999). The minimum sample size required was calculated at 101; therefore, with 528 observations, the sample size is appropriate for this analysis.
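For readers wishing to verify the effect size arithmetic, Cohen’s f² for a full regression model is conventionally derived from R² via the standard conversion below (a textbook formula, not reproduced from the article), which yields the reported value from the model’s R² of .26:

$$f^2 = \frac{R^2}{1 - R^2} = \frac{.26}{1 - .26} \approx .35$$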

Two variables were statistically significant: perceived leadership competence (β = .26, p < .001) and understanding the specific roles, responsibilities and benefits of RTI (β = .25, p < .01). In other words, if school counselors (a) perceived building-level leaders as knowledgeable and positively predisposed to RTI, and (b) were confident about understanding their roles and responsibilities within an RTI model, as well as the anticipated benefits of the RTI framework, they were more likely to view RTI as a vehicle to drive improvements in academic and behavioral outcomes for all students. Table 6 includes standardized coefficients (β), unstandardized coefficients (B), and standard errors (SE) for all variables in the model.

Table 6
Estimated Coefficients of Full Model With White’s Correction for Standard Errors

Variable Name                                       B      SE B        β
Age                                              -.081     .033     -.138
Sex                                              -.064     .058     -.052
Ethnicity                                        -.133     .063     -.096
Total years in practice                          -.020     .029     -.046
Years since final degree conferral                .219     .026      .045
Number of RTI trainings received                 -.029     .026     -.061
Year of RTI implementation                       -.044     .035     -.060
Leadership competence                             .183     .035      .261**
RTI background information                        .012     .023      .028
Data collection and management                    .080     .050      .145
Tier service model delivery                      -.069     .050     -.107
Counseling interventions                         -.006     .034     -.012
Collaborative practices                           .042     .050      .075
Responsibilities and benefits                     .165     .056      .253*

F = 9.056**     R² = .26     Adjusted R² = .23
* p < .01; ** p < .001

These results provided a descriptive picture of school counselors’ beliefs and practices regarding RTI/MTSS, as well as their level of perceived preparedness to complete tasks inherent in a multi-tiered framework of student support. For example, school counselors indicated they were directly involved in schoolwide data management systems for documentation and decisions; however, the majority (64%) reported they were either not prepared or somewhat prepared (26% and 38%, respectively) to fulfill such a role. Likewise, although 52% of practitioners reported that they are now required to collaborate with colleagues as part of an RTI team, 49% of them indicated that they were either not prepared (16%) or somewhat prepared (33%) to engage in effective teamwork within an RTI model. In addition, results from the regression analysis indicated the importance of role clarity and educational leadership, with school counselors having a more positive view of RTI if they themselves had a clear understanding of their roles and responsibilities within the RTI framework, and also when they considered school leaders to be positive and knowledgeable about this initiative.

Discussion

The integration of RTI into districts and schools has influenced professional practices, including the work of the school counselor. Study participants indicated the ways in which their roles and responsibilities have changed under RTI, as well as their beliefs and perceptions of preparedness to work in a multi-tiered framework. Data analysis highlights a number of needs and incongruities for the field of school counseling. We address these contradictions and highlight the needs they represent in relation to pre-service and in-service preparation.

Contradictions: Disability Identification

The results of this study suggest noteworthy contradictions that merit further exploration. First, many school counselors believe that RTI is the best option to support struggling learners and that RTI is a vehicle for identifying students with SLD. Yet, only a quarter of participants agreed that data garnered through RTI are sufficient for learning disability determination. We postulate this incongruence may be the result of an ongoing debate among school professionals regarding the process of identifying students with SLD (McKenzie, 2009; Reschly, 2003; Scruggs & Mastropieri, 2002). Historically, the process of SLD identification involved standardized testing to determine if there was a significant discrepancy between a student’s intelligence (as measured by standardized IQ tests) and levels of achievement (as measured by standardized achievement tests). However, many researchers and practitioners have objected to this method, citing the rapid increase in the identification of SLD since 1975 (Vaughn, Linan-Thompson, & Hickman, 2003) and the cultural and racial biases still inherent in IQ testing, leading to the over-representation of minorities in special education classrooms (Francis, Fletcher, & Morris, 2003). In addition, this method has been characterized as a “wait-to-fail” approach to diagnosis, since a significant discrepancy between IQ and achievement is not typically established until grade three or higher, past the crucial early intervention window (Mellard, Deshler, & Barth, 2004). This contentious discourse is reflected in varying state regulations, with some allowing for discrepancy testing (e.g., Illinois and Idaho) while others legally forbid its use (e.g., Colorado and Indiana; Zirkel & Thomas, 2010). Thus, participants’ responses might be reflective of the lack of consensus regarding best practice in identifying students with SLD.

Furthermore, the majority of surveyed school counselors believed RTI can improve academic outcomes, but were less inclined to believe that RTI can improve behavioral outcomes, and were even less convinced that RTI is the best option to support students with social-emotional concerns. When RTI was originally referenced in the 2004 IDEA reauthorization (Individuals with Disabilities Education Improvement Act of 2004), it was promoted with an academic focus as an alternative or supportive means of identifying students with learning disabilities. There was no reference in the law to identifying students with emotional or behavioral disabilities, nor was there reference to a system of supports for social-emotional and behavioral needs. However, the natural alignment of the tiered frameworks of RTI with PBIS encouraged some states to mandate a multi-tiered system of supports (Averill & Rinaldi, 2011). It is important to note that while some states, such as Wisconsin, require a comprehensive MTSS framework, this is not true of all states (Berkeley et al., 2009). Therefore, school counselors’ unease with the use of RTI in support of students with social-emotional concerns is again reflective of a greater debate in the field regarding the role of RTI or MTSS in supporting all students and informing disability identification. These contradictions point to a need for increased awareness and dialogue about the processes of disability identification within the profession of school counseling. With clear understanding and background knowledge, school counselors will be better prepared to advocate for fair and unbiased methods of disability identification, thereby helping to reduce the disproportionate disability identification of students of color.

Contradictions: Changing Responsibilities and Levels of Preparation

Two significant gaps were apparent in relation to school counselors’ RTI-related roles and their levels of confidence in regards to these changing responsibilities: School counselors felt underprepared to foster collaboration, as well as to use data to inform their practices and make decisions about students.

Collaborative practices. Beginning with collaboration, and consistent with Ockerman and colleagues’ (2015) statewide findings, an overwhelming majority of participants reported they are now required to engage more in collaborative practices as a result of RTI implementation. However, many respondents did not believe other school professionals viewed RTI favorably or as a collaborative endeavor, and over a third of respondents believed there were not building-wide supports for collaborative efforts (e.g., common planning time, teams). Additionally, about half of the respondents reported that they were not adequately prepared for teamwork. Yet, collaboration is at the core of the school counseling profession. Specifically, the ASCA National Model (2012) emphasized the importance of collaboration by including it as one of its four main themes, and several components of the ASCA National Model (e.g., advisory council, annual agreements) are only achievable through collaborative relationships. Moreover, the Transformed School Counseling Initiative (TSCI) cited teaming and collaboration as necessary components for a school counselor’s ability to create sustained systemic change (Martin, 2002; Sears, 1999). Thus, school counselors need to find pathways to build community and create a culture of shared responsibility, not only to benefit students but to be efficient and effective in their jobs.

This finding also signals counselor educators to better prepare pre-service school counselors to work in school climates viewed as divisive or individualistic and to cultivate the requisite skill sets to do so. With bolstered communication, facilitation and conflict-resolution skills, school counselors can be trained to help school teams unite around the broader goals of ensuring the academic, emotional and behavioral success of all students. Leveraging these unique skill sets, school counselors can improve the efficacy of RTI teams and ensure that they themselves remain integral to the process.

 Schoolwide data management systems for documentation and decision making. Although scholars within the school counseling profession have emphasized the importance of evidence-based research for over a decade (Dimmitt, Carey & Hatch, 2007; Whiston, 2001, 2002) and the need for school counselor accountability was discussed as early as the 1920s (Gysbers, 2004), school counselors still indicated they felt inadequately prepared to work with data to drive decisions or analyze data in meaningful ways. Similarly, an overwhelming majority of respondents in this survey indicated a lack of preparedness for schoolwide data management and reported not feeling adequately trained to analyze outcome data to determine effectiveness of RTI interventions. Yet, many reported that their roles have changed to involve data collection and data management in support of RTI. This discrepancy points to an urgent need for both pre-service and in-service professional training around the use of data, as it is central to RTI and many educational reforms. School counselors must be well-prepared to understand the utility of data rather than be stymied by it. If school counselors are to play a pivotal role in dismantling the achievement gap, which is now an ethical obligation (ASCA, 2010) rather than a laudable goal, they must be able to critically analyze data to ensure all students are served equitably. Moreover, if school counselors are active members of the RTI team, as many indicated in this survey that they are, they must be able to determine how their efforts are helping or thwarting a young person’s ability to succeed. While RTI may or may not be a welcome mandate in schools, school counselors can leverage its emphasis on data collection and management to ensure students are receiving evidence-based interventions (Ockerman et al., 2012). The inability to do so not only jeopardizes school counselors’ job security, but also shortchanges their students.

Fortunately, there are several resources that school counselors and counselor educators can employ to meet this dire need. Hatch’s recent text, The Use of Data in School Counseling (2014), centers on this subject and complements other publications including Kaffenberger and Young’s Making Data Work (2013), and Dimmitt et al.’s seminal text, Evidence-Based School Counseling: Making a Difference With Data-Driven Practices (2007). School counselors also can advocate for evidence-based small and large group counseling interventions, including Second Step: Skills for Social and Academic Success (Committee for Children, 2010) and Student Success Skills (Brigman & Webb, 2007). School counselors and counselor educators can hone and refine their data skills by attending the annual Evidence-Based National School Counseling Conference and becoming familiar with the burgeoning research conducted at the Ronald H. Frederickson Center for School Counseling Outcome Research and Evaluation. Moreover, counselor educators need to ensure this topic is discussed and evaluated in both their core school counseling and clinical courses so as to best prepare future school counselors to be accountable and data savvy (Hatch, 2014; Studer & Diambra, 2016).

Needs: Defining Roles and Leadership Opportunities

School counselors were most likely to view RTI as a means of positively impacting academic and behavioral outcomes for all students when they (a) had leaders who were knowledgeable and positive about RTI; and (b) were clear about their own roles and responsibilities, as well as the anticipated benefits of the model. These results support findings from state-level surveys of RTI preparedness and beliefs across both school counselors and school psychologists (Hollenbeck & Patrikakou, 2014; Ockerman et al., 2015). Thus, school counselors should work to ensure role clarity and consider how best to utilize their skills and knowledge in support of change.

There are several ways in which school counselors can leverage their unique skill sets to optimize their collaborative relationships with school administration and staff. This may involve meeting with the principal to discuss roles and responsibilities, advocating for a leadership role in relation to collaborative practices or data-based decision making, and working with parents to ensure they are engaged and informed. School counselors also can better define their roles in relation to RTI by documenting these duties in their annual agreement (ASCA, 2012). By working collaboratively with school personnel to harness their strengths and create common goals, school counselors can build capacity and thereby increase their ability to reach more students. Additionally, school counselors should work with their building leaders to create professional development aimed at increasing staff knowledge about RTI in positive, proactive ways. As such, school staff can begin to view school counselors as leaders within this area and collaborative partners for creating systemic change.

School counselor educators also must infuse leadership competence and role clarity within their coursework and evaluate pre-service students’ understanding and aptitudes as requisites for advancing into the profession (Chen-Hayes, Ockerman, & Mason, 2014). Introductory and foundational school counseling courses should emphasize the school counselors’ role, including appropriate and inappropriate tasks (ASCA, 2012). Moreover, field-based practicum and internship courses should require practical, experiential activities that build leadership and advocacy capacity through data collection and analysis. All graduating school counselors should be required to measure the impact of their work and its contributions to the betterment of students, schools and communities. Accordingly, state standards for the preparation of school counselors should reflect an emphasis on this pivotal skill set.

Limitations and Future Directions

The aim of the present study was to examine school counselors’ beliefs, perceived levels of preparedness and practices regarding RTI in states where this model has been implemented. A limitation inherent in survey research is the credibility of self-reports. As Paulhus and Vazire (2007) noted, “even when respondents are doing their best to be forthright and insightful, their self-reports are subject to various sources of inaccuracy” (p. 228). Participants may have exaggerated or under-reported their lack of preparedness and confidence. In addition, respondents might have remembered their training and preparation inaccurately and therefore reported it imprecisely in their responses.

While results provided a descriptive picture of perceived preparedness and its impact on the degree to which school counselors viewed RTI as beneficial, this study did not investigate possible indirect and total effects that can offer a fuller picture of influences. Future studies should apply structural equation modeling to explore direct, indirect and total effects, and therefore provide further implications for practice. Additionally, given the developmental differences between elementary, middle and high school students, the focus of school counselors’ involvement in RTI implementation may vary at the different grades. Future studies should examine whether differences exist in the way RTI is viewed by practitioners serving at various school levels so that training can be customized based on specific needs. Lastly, data for this study were collected by surveying school counselors in the 15 states that were reported as implementing RTI fully or partly. It would be beneficial to survey practitioners in states where future implementation of MTSS has been planned so that proactive and well-informed steps can be taken to better prepare school counselors for the effective implementation of such frameworks.

There are significant areas of opportunity in MTSS for school counselors. School counselors have the cultivated abilities to lead, advocate and partner with their peers, which can be foundational in the design, implementation and evaluation of MTSS systems. The school counselor is positioned to lead with a vision of creating culturally relevant and evidence-based interventions aimed at reducing the achievement gap. Therefore, school counselor educators must be producers (not just consumers) of data to assist their students in making informed, culturally responsive decisions to support academic, social and emotional learning for all students. Major educational reforms such as RTI should serve as a welcome motivation for improved practice and professional advancement. Politically aware and comprehensively trained school counselors can leverage such educational mandates to access necessary resources and become the innovators and path-charters of their profession.

Conflict of Interest and Funding Disclosure

The authors reported no conflict of interest or funding contributions for the development of this manuscript.

References

American School Counselor Association. (2003). The ASCA national model: A framework for school counseling programs. Alexandria, VA: Author.

American School Counselor Association. (2005). The ASCA national model: A framework for school counseling programs (2nd ed.). Alexandria, VA: Author.

American School Counselor Association. (2010). Ethical standards for school counselors. Retrieved from http://www.schoolcounselor.org/asca/media/asca/Resource Center/Legal and Ethical Issues/Sample Documents/EthicalStandards2010.pdf

American School Counselor Association. (2012). The ASCA national model: A framework for school counseling programs (3rd ed.). Alexandria, VA: Author.

American School Counselor Association. (2014). Position statement: Multi-tiered system of supports. Alexandria, VA: Author. Retrieved from https://www.schoolcounselor.org/asca/media/asca/PositionStatements/PositionStatements.pdf

Avant, D. W. (2014). The role of school social workers in implementation of response to intervention. School Social Work Journal, 38(2), 11–31.

Averill, O. H., & Rinaldi, C. (2011). Multi-tier system of supports. District Administration, 47(8), 91–94. Retrieved from https://www.districtadministration.com/article/multi-tier-system-supports

Bean, R., & Lillenstein, J. (2012). Response to intervention and the changing roles of schoolwide personnel. The Reading Teacher, 65, 491–501. doi:10.1002/TRTR.01073

Berkeley, S., Bender, W. N., Peaster, L. G., & Saunders, L. (2009). Implementation of response to intervention: A snapshot of progress. Journal of Learning Disabilities, 42, 85–95. doi:10.1177/0022219408326214

Betters-Bubon, J., & Ratas, L. (2015, April). The impact of multi-tiered systems of support on school counselors. Poster presented at the American Educational Research Association (AERA) Annual Meeting, Chicago, IL.

Bookard, K. L. (2015). Perceived effects of North Carolina’s response to intervention process on school counselor’s professional duties and responsibilities: A correlational study. Dissertation Abstracts International Section A, 75.

Brigman, G., & Webb, L. (2007). Student success skills: Impacting achievement through large and small group work. Group Dynamics: Theory, Practice and Research, 11, 283–292.

Brooks, G. P., & Barcikowski, R. S. (1999, April). The precision efficacy analysis for regression sample size method. Paper presented at the meeting of the American Educational Research Association, Montreal, Quebec, Canada. (ERIC Document Reproduction Service No. ED449177)

Chen-Hayes, S. F., Ockerman, M. S., & Mason, E. C. M. (2014). 101 solutions for school counselors and leaders in challenging times. Thousand Oaks, CA: Corwin Press.

Cochrane, W. S., & Laux, J. M. (2008). A survey investigating school psychologists’ measurement of treatment integrity in school-based interventions and their beliefs about its importance. Psychology in the Schools, 45, 499–507. doi:10.1002/pits.20319

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Committee for Children. (2010). Second step program and SEL research. Retrieved from http://www.cfchildren.org/second-step/research

Dimmitt, C., Carey, J. C., & Hatch, T. (2007). Evidence-based school counseling: Making a difference with data-driven practices. Thousand Oaks, CA: Corwin Press.

Fairbanks, S., Sugai, G. M., Guardino, D., & Lathrop, M. (2007). Response to intervention: Examining classroom behavior support in second grade. Exceptional Children, 73, 288–310. doi:10.1177/001440290707300302

Francis, D.  J., Fletcher, J.  M., & Morris, R. D. (2003, December). Response to intervention (RTI): A conceptually and statistically superior alternative to discrepancy. Paper presented at the National Research Center on Learning Disabilities Responsiveness-to-Intervention Symposium, Kansas City, MO.

Fuchs, D., Mock, D., Morgan, P. L., & Young, C. L. (2003). Responsiveness-to-intervention: Definitions, evidence, and implications for the learning disabilities construct. Learning Disabilities Research & Practice, 18, 157–171. doi:10.1111/1540-5826.00072

Gysbers, N. C. (2004). Comprehensive guidance and counseling programs: The evolution of accountability. Professional School Counseling, 8, 1–14. Retrieved from http://www.counseling.org/docs/default-source/vistas/comprehensive-guidance-and-counseling-program-evaluation-program-personnel-results.pdf?sfvrsn=10

Harrison, K., & Harrison, R. (2014). Utilizing direct observation methods to measure social-emotional behaviors in school social work practice. School Social Work Journal, 39, 17–33.

Hatch, T. (2014). The use of data in school counseling: Hatching results for students, programs and the profession. Thousand Oaks, CA: Corwin Press.

Hauerwas, L. B., Brown, R., & Scott, A. N. (2013). Specific learning disability and response to intervention: State-level guidance. Exceptional Children, 80, 101–120. doi:10.1177/001440291308000105

Hollenbeck, A. F. (2007). From IDEA to implementation: A discussion of foundational and future responsiveness-to-intervention research. Learning Disabilities Research & Practice, 22, 137–146. doi:10.1111/j.1540-5826.2007.00238.x

Hollenbeck, A. F., & Patrikakou, E. (2014). Response to intervention in Illinois: An exploration of school professionals’ attitudes and beliefs. Mid-Western Educational Researcher, 26, 58–82.

Individuals with Disabilities Education Improvement Act (IDEA) of 2004, PL 108–446, 20 USC §§ 1400 et seq.

Kaffenberger, C., & Young, A. (2013). Making data work (3rd ed.). Alexandria, VA: American School Counselor Association.

Luck, L., & Webb, L. (2009). School counselor action research: A case example. Professional School Counseling, 12, 408–412.

Martin, P. J. (2002). Transforming school counseling: A national perspective. Theory into Practice, 41, 148–153.

McKenzie, R. G. (2009). Obscuring vital distinctions: The oversimplification of learning disabilities within RTI. Learning Disability Quarterly, 32, 203–215.

Mellard, D. F., Deshler, D. D., & Barth, A. (2004). Learning disabilities identification: It’s not simply a matter of building a better mousetrap. Learning Disability Quarterly, 27, 229–242.

Miller, B. (2008). Jefferson intermediate school: Pella, Iowa. Retrieved from http://www.rtinetwork.org/voices-from-the-field/entry/2/84

National Joint Committee on Learning Disabilities. (2005). Responsiveness to intervention and learning disabilities. Learning Disability Quarterly, 28, 249–260. Retrieved from http://ldq.sagepub.com/content/28/4/249.full.pdf+html

O’Brien, R. M. (2007). A caution regarding rules of thumb for variance inflation factors. Quality and Quantity, 41, 673–690. doi:10.1007/s11135-006-9018-6

Ockerman, M. S., Mason, E. C. M., & Hollenbeck, A. F. (2012). Integrating RTI with school counseling programs: Being a proactive professional school counselor. Journal of School Counseling, 10. Retrieved from http://jsc.montana.edu/articles/v10n15.pdf

Ockerman, M. S., Patrikakou, E., & Hollenbeck, A. F. (2015). Preparation of school counselors and response to intervention: A profession at the crossroads. Journal of Counselor Preparation & Supervision, 7(3). doi:10.7729/73.1106 Retrieved from http://repository.wcsu.edu/jcps/vol7/iss3/7

Paulhus, D. L., & Vazire, S. (2007). The self-report method. In R. W. Robins, R. C. Fraley, & R. Krueger (Eds.), Handbook of research methods in personality psychology (pp. 224–239). New York, NY: Guilford Press.

Reschly, D. J. (2003, December). What if learning disabilities identification changed to reflect research findings? Paper presented at the National Research Center on Learning Disabilities Responsiveness-to-Intervention Symposium, Kansas City, MO.

RTI Action Network. (2009). Voices from the field. Retrieved from http://rtinetwork.org/voices-from-the-field

Ryan, T., Kaffenberger, C. J., & Caroll, A. G. (2011). Response to intervention: An opportunity for school counselor leadership. Professional School Counseling, 14, 211–221. doi:10.5330/PSC.n.2011-14.211

Sadler, C., & Sugai, G. M. (2009). Effective behavior and instructional support: A district model for early identification and prevention of reading and behavior problems. Journal of Positive Behavior Interventions, 11, 35–46. doi:10.1177/1098300708322444

Scruggs, T. E., & Mastropieri, M. A. (2002). On babies and bathwater: Addressing the problems of identification of learning disabilities. Learning Disability Quarterly, 25, 155–168.

Sears, S. J. (1999). Transforming school counseling: Making a difference for students. NASSP Bulletin, 83(603), 47–53.

Stevens, J. P. (1992). Applied multivariate statistics for the social sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.

Studer, J. R., & Diambra, J. F. (Eds.). (2016). A guide to practicum and internship for school counselors-in-training. New York, NY: Routledge.

Sugai, G. M., & Horner, R. H. (2006). A promising approach for expanding and sustaining school-wide positive behavior support. School Psychology Review, 35, 245–259.

Sugai, G. M., & Horner, R. H. (2009). Responsiveness-to-intervention and school-wide positive behavioral supports: Integration of multi-tiered system approaches. Exceptionality, 17, 223–237. doi:10.1080/09362830903235375

Sullivan, A. L., & Long, L. (2010). Examining the changing landscape of school psychology practice: A survey of school-based practitioners’ training and involvement in RTI. Psychology in the Schools, 47, 1059–1079. doi:10.1002/pits.20524

Sullivan, A. L., Long, L., & Kucera, M. (2011). A survey of school psychologists’ preparation, participation, and perceptions related to positive behavior interventions and supports. Psychology in the Schools, 48, 971–985. doi:10.1002/pits.20605

Vaughn, S., & Fuchs, L. S. (2003). Redefining learning disabilities as inadequate response to instruction: The promise and potential problems. Learning Disabilities Research & Practice, 18, 137–146. doi:10.1111/1540-5826.000070

Vaughn, S., Linan-Thompson, S., & Hickman, P. (2003). Response to instruction as a means of identifying students with reading/learning disabilities. Exceptional Children, 69, 391–409.

Whiston, S. C. (2001). Selecting career outcome assessments: An organizational scheme. Journal of Career Assessment, 9, 215–228. doi:10.1177/106907270100900301

Whiston, S. C. (2002). Response to the past, present and future of school counseling: Raising some issues. Professional School Counseling, 5, 148–157.

White, H. (1980). A heteroskedasticity-consistent covariance matrix estimator and a direct test for heteroskedasticity. Econometrica, 48, 817–838.

Zambrano, E., Castro-Villarreal, F., & Sullivan, J. (2012). School counselors and school psychologists: Partners in collaboration for student success within RTI and CDCGP Frameworks. Journal of School Counseling, 10(24). Retrieved from http://jsc.montana.edu/articles/v10n24.pdf

Zirkel, P. A. (2014). State laws and guidelines for RTI: Additional implementation features. Communiqué, 39(7), 30–32.

Zirkel, P. A., & Thomas, L. B. (2010). State laws and guidelines for implementing RTI. Teaching Exceptional Children, 43, 60–73.

Eva Patrikakou is an Associate Professor at DePaul University. Melissa S. Ockerman is an Associate Professor at DePaul University. Amy Feiker Hollenbeck is an Associate Professor at DePaul University. Correspondence can be addressed to Eva Patrikakou, DePaul University, 2247 North Halsted Street, Chicago, IL 60614–3624, epatrika@depaul.edu.

The ASCA Model and a Multi-Tiered System of Supports: A Framework to Support Students of Color With Problem Behavior

Christopher T. Belser, M. Ann Shillingford, J. Richelle Joe

The American School Counselor Association (ASCA) National Model and a multi-tiered system of supports (MTSS) both provide frameworks for systematically solving problems in schools, including student behavior concerns. The authors outline a model that integrates overlapping elements of the National Model and MTSS as a support for marginalized students of color exhibiting problem behaviors. Individually, the frameworks employ data-driven decision making as well as prevention services for all students and intervention services for at-risk students. Thus, the integrated model allows schools to provide objective alternatives to exclusionary disciplinary actions (e.g., suspensions and expulsions) that are being assigned to students of color at a disproportionate rate. The manuscript outlines the steps within the integrated model and provides implications for school counselors and counselor educators.

Keywords: ASCA National Model, multi-tiered system of supports, school counselors, marginalized students, students of color

Educational disparities are well documented for students of color in the United States (Delpit, 2006; Ford & Moore, 2013; U.S. Department of Education [USDOE], 2014). Today’s students of color face lower graduation rates, overuse of exclusionary disciplinary action, overrepresentation in exceptional education programming and school policies that negatively impact rather than support them (Moore, Henfield, & Owens, 2008; USDOE, 2014; R. Palmer & Maramba, 2010; Toldson & Lewis, 2012). School discipline policies based on a framework of zero tolerance have not reduced suspensions or expulsions as initially intended. Instead, these policies have resulted in more students being excluded from the classroom due to reactive disciplinary action (Skiba, 2014). Bernstein (2014) posited that these policies are widening the educational achievement gap and negatively impacting the development of students of color. What, then, can be done as an alternative to or as a measure to prevent exclusionary disciplinary actions such as suspensions and expulsions?

A multi-tiered system of supports (MTSS) is a systematic data-driven program designed to address academic concerns and problem behavior by utilizing both prevention and intervention strategies (Sugai & Horner, 2009). Specific to behavior-related concerns, MTSS programs offer a structured method for providing both universal and individual support for students and present data-driven alternatives to suspension and expulsion. School counselors are uniquely positioned to play a critical role in the implementation of such programs due to their training in data analysis, program development and direct service delivery. Moreover, MTSS programs align well with the American School Counselor Association (ASCA) National Model (2012a).

The ASCA National Model has themes of social justice, advocacy and systemic change infused throughout, as comprehensive school counseling programs are designed to remove barriers to student success and help students reach their potential in the areas of academic, career, social and emotional development (ASCA, 2012a). With these themes in mind, integrating the National Model with the objective and data-driven framework of MTSS may offer one solution for systemic educational disparities such as the school-to-prison pipeline. The purpose of this article is to describe a model for integrating elements of the ASCA National Model within the MTSS framework. The authors will describe steps involved in the process and will provide context for how such an intervention can specifically benefit students of color.

The School-to-Prison Pipeline

More than 6.8 million individuals were under supervision of the adult correctional system in the United States at the end of 2014, a rate of 1 in 36 adults (Kaeble, Glaze, Tsoutis, & Minton, 2015). Of those under correctional supervision, over 1.5 million were held in state and federal correctional facilities (Carson, 2015). Although these numbers mark a slight decrease in the correctional population since 2007 (Kaeble et al., 2015), the American incarceration rate has quadrupled since the 1970s (Travis, Western, & Redburn, 2014). The growth of incarceration in the United States over the past four decades has largely affected the Black and Latino communities, both of which are disproportionately represented among individuals involved with the correctional system (Carson, 2015). Scholars in multiple academic disciplines have linked American drug policy and enforcement with mass incarceration of primarily individuals of color (Alexander, 2010; Travis et al., 2014). In education, however, a parallel cause has contributed to the expansion of the correctional system in the United States. Increasingly punitive discipline policies marked by zero tolerance approaches have created a pipeline from schools to prisons where exclusion from the educational environment and criminalization of student misbehavior contribute to school dropout and involvement with the juvenile justice system (Fowler, 2011).

The effects of this school-to-prison pipeline have been particularly detrimental for students of color, who are disproportionately suspended, expelled or otherwise excluded from the academic setting. Starting in preschool, Black children are suspended at a higher rate than their White counterparts (USDOE, 2014). Whereas 5% of White students are suspended, Black students are suspended at roughly three times that rate on average (USDOE, 2014). Additionally, American Indian and Alaska Native students, who make up less than 1% of the population in American schools, account for 2% of out-of-school suspensions and 3% of expulsions. Both gender and disability intersect with race and ethnicity, resulting in disproportionate suspensions of boys and girls of color and students with disabilities (USDOE, 2014). Among students with disabilities, those with emotional-behavioral disorders are most likely to experience academic exclusion and to experience such exclusion multiple times (Bowman-Perrott et al., 2011). Double minority status can increase the likelihood of exclusion, as with Black males, who are consistently over-identified in special education (Artiles, Harry, Reschly, & Chinn, 2002; Bowman-Perrott et al., 2011; Ferri & Connor, 2005).

Similar disparities exist among the rates of arrests and referrals to law enforcement for Black students and students with disabilities. Although only 16% of the student population, Black students account for 31% of school-related arrests and 27% of referrals to law enforcement (USDOE, 2014). Similarly, students with disabilities, who comprise about 12% of the student population, represent 25% of students arrested or referred to law enforcement (USDOE, 2014). School-related arrests and referrals to law enforcement can place students at risk for future involvement with the juvenile justice system and ultimately prison. Carmichael, Whitten, and Voloudakis’s (2005) investigation of minority overrepresentation in the juvenile justice system of Texas indicated that students with a disciplinary history were more likely to be involved with juvenile justice. Although this was the case for youth in all categories of race and ethnicity, both Latino and Black youth had more frequent contact with the justice system than White youth (Carmichael et al., 2005). Demonstrating the cumulative effect of involvement with the juvenile system, Natsuaki, Ge, and Wenk’s (2008) longitudinal study of young male offenders identified age of first arrest as an indicator of criminal trajectory, with a younger age producing a steeper cumulative trajectory. Additionally, for those first arrested early during their adolescent years, the pace at which they committed criminal offenses was not slowed by completion of high school (Natsuaki et al., 2008). Hence, when school discipline policies result in the exclusion of students from the educational setting and involvement with law enforcement, students are likely to be involved with the justice system as juveniles and adults (Natsuaki et al., 2008; USDOE, 2014; Wiesner, Kim, & Capaldi, 2010).

The American School Counselor Association National Model

ASCA developed a National Model (2012a) in order to provide school counselors with clear guidelines on how to meet the needs of all students. The ASCA National Model boasts a comprehensive, data-driven approach to meeting the needs of students and focuses on addressing students’ academic, personal, social and career needs. The model is driven by a key question: “How are students different as a result of what school counselors do?” Considering the data presented on the school-to-prison pipeline, this question is significant in ensuring that school counselors are providing students of color with the necessary support systems in order to foster more positive academic and social outcomes.

The National Model highlighted a collaborative approach centered on incorporating the efforts of teachers, administrators, families and other stakeholders in developing a comprehensive school counseling program. With school counselors at the helm, the model provided a new vision for the profession and emphasized school counselor accountability, leadership, advocacy, collaboration and systemic change (ASCA, 2012a). That is, the focus shifted to elevating the function of the school counseling program to align more readily with the mission of the school at large.

As a result of this new vision, school counseling programs have been able to observe significant improvements in students’ academic as well as social performance. For instance, L. Palmer and Erford (2012) found increases in high school attendance and graduation trends as school counseling program implementation increased. L. Palmer and Erford also reported positive changes in the academic performance of high school students, particularly improvements on Maryland State Assessment English and algebra scores. These results suggested optimistic influences of utilizing a comprehensive school counseling program as promoted by the National Model. Similarly, Carey and Dimmitt (2012) reported positive associations between the delivery of the comprehensive school counseling program and student performance; most specifically, rates of student suspensions and other disciplinary actions decreased, attendance increased, and math and reading proficiency improved. Dimmitt and Wilkerson (2012) found that minority students were less likely to have access to comprehensive school counseling programs in their schools but noted correlations between an increase in counseling services and improved attendance, a decrease in suspensions, and a drop in reports of bullying. Similarly, Lapan, Whitcomb, and Aleman (2012) noted that schools with low counselor-to-student ratios and fully implemented ASCA Model programming had lower rates of suspension and fewer discipline issues.

Although much has been written on the benefits of school counselors addressing academic, personal, social and career development of students, there appears to be a paucity of research studies focused on the topic of college and career readiness of students of color. In terms of recommendations for school counselors and career development, Mayes and Hines (2014) discussed the need for more culturally sensitive and gendered approaches to college and career readiness for gifted Black females, including assisting these students in navigating through systemic and even social challenges that they may face. Similarly, Belser (2015) highlighted the impact that the school-to-prison pipeline has on career opportunities later in life for adolescent males of color. Considering the challenges that students face, especially those from marginalized populations, as well as the significant benefits of data-driven comprehensive school counseling programs, it seems appropriate that school counselors utilize the National Model as the foundation for stimulating more positive student outcomes.

Multi-Tiered System of Supports (MTSS)

Initially framed as Response to Intervention (RTI), the implementation of MTSS resulted from federal education initiatives after the 2004 reauthorization of the Individuals with Disabilities Education Improvement Act (IDEA), which called for more alignment between this policy and the No Child Left Behind Act (NCLB) of 2001 (Sugai & Horner, 2009). MTSS programs in schools are designed to provide a more systematic, data-driven and equitable approach to solving academic and behavioral issues with students. Within such programs, students are divided into three tiered categories based on the level of risk and need: (a) Tier 1 represents students who are in the general education population and who are thriving, (b) Tier 2 represents students who need slightly more intensive intervention that can be delivered either individually or in a small group setting, and (c) Tier 3 represents students who need intensive individualized interventions (Ockerman, Mason, & Hollenbeck, 2012). The process involves universal screening or testing, intervention implementation and progress monitoring.

To combat problem behaviors, MTSS is often linked to Positive Behavioral Interventions and Supports (PBIS) as an additional source of support for students. These programs have been shown to reduce office disciplinary referrals and increase attendance (Freeman et al., 2016). Moreover, Horner, Sugai, and Anderson (2010) determined that PBIS programs are associated with reductions in problem behaviors, improved perception of school safety and improved academic results. Banks and Obiakor (2015) provided strategies for implementing culturally responsive positive behavior supports in schools, noting that doing so can reduce the marginalization of minority students and foster a safe and supportive school climate. With outcomes such as these, PBIS and MTSS programs have become known as best practices (Horner et al., 2010).

Several authors have noted the overlapping elements of MTSS and the ASCA National Model (ASCA, 2012a; Martens & Andreen, 2013; Ockerman et al., 2012). As both frameworks have yielded positive outcomes with the general population and minority students, it would appear that a coordinated approach would be beneficial for schools. However, existing discussions of how to integrate the two have not been comprehensive or have not addressed the potential impact on students of color. In this manuscript, the authors have sought to provide a solution to this problem.

Putting MTSS and Comprehensive School Counseling Programs Into Practice

Integrating the ASCA National Model with MTSS involves strategic data-driven planning and decision making. The process begins with collecting baseline data on students via screening scales and surveys and then analyzing this data to group students into tiers based on indicated level of risk. A more objective approach driven by data could especially benefit students of color, who have historically been subject to disproportionate and—at times—unfair discipline policies (Hoffman, 2012). Once students have been placed in one of three MTSS tier groups, the decision-making team and school counselors can generate appropriate prevention and intervention strategies that fit with each tier and with students’ needs. The process is cyclical, as progress-monitoring data is collected periodically to determine future steps. Figure 1 outlines the process from start to finish, and the sections that follow will further highlight the phases of the process. In addition, the authors will address how these steps can affect students of color.

 

Figure 1. The MTSS Cycle for Behavior Intervention

Team Development and Planning

     The process of providing MTSS services is not a job for a single person; rather, a team of stakeholders (e.g., school counselors, administrators, teachers) must be involved in planning, enacting and evaluating the services and interventions utilized. With the integration of the ASCA National Model within MTSS, school counselors can utilize elements of the model, such as the Advisory Council and the Annual Agreement, to aid in the planning process (ASCA, 2012a). Each member of the team provides a unique role, from direct service delivery to data management. School counselors should be mindful of their numerous other duties within the school and only take the lead on program components that are appropriate and directly relate to the role of school counselors in schools (ASCA, 2014; Ockerman et al., 2012).

In the planning phase, the team should examine preliminary discipline-related data to gauge what types of universal supports might be necessary; within this conversation, understanding the school’s demographic data is crucial so the team can account for potential culture-bound concerns that may need to be addressed during the MTSS process. Additionally, the team should determine what instrument will be used for universal screening, a process that will be discussed in more detail in the next section. Once the team has a preliminary plan of action, including a timeline of key events, this information should be presented to the entire school faculty to provide a rationale for the services and procedural information to boost fidelity of implementation, especially with program elements implemented schoolwide like universal screening.

Universal Screening

Data collection through universal assessment is a necessary step in the MTSS process (Harn, Basaraba, Chard, & Fritz, 2015; von der Embse, Pendergast, Kilgus, & Eklund, 2015). School counselors often rely on referrals from teachers, parents and students to match students with interventions; however, integrating a universal screening approach into comprehensive school counseling programs can help keep students from falling through the cracks (Ockerman et al., 2012). Universal screening involves all students being evaluated using one instrument, such as the Student Risk Screening Scale (SRSS; Drummond, 1994), which allows a decision-making team to categorize students based on level of risk for the respective issue. Cheney and Yong (2014) noted that a universal screening instrument should be time efficient for teachers to complete and should be both valid and reliable; they further noted that the purpose of such a screening tool is to identify which students warrant interventions beyond Tier 1 supports (i.e., Tier 2 and 3 interventions).

Various instruments exist for universal screening of behavior or emotional risk (Lane, Kalberg, et al., 2011). The SRSS (Drummond, 1994) is one freely available screening instrument that allows teachers to rate an entire class of students quickly on seven behavioral or social subscales. This tool fits well into an MTSS framework as the scoring places students into a category of low, moderate, or high levels of risk (Lane et al., 2015); in addition, researchers have established validity and reliability for the SRSS at the elementary (Lane et al., 2012), middle (Lane, Oakes, Carter, Lambert, & Jenkins, 2013), and high school levels (Lane, Oakes, et al., 2011), as well as in urban elementary schools (Ennis, Lane, & Oakes, 2012). Other universal screening instruments that support the MTSS framework for behavior-related concerns include the Behavioral and Emotional Screening System (BESS; Kamphaus & Reynolds, 2007), the Systematic Screening for Behavioral Disorders (SSBD; Walker & Severson, 1992), and the Social, Academic, and Emotional Behavioral Risk Screener (SAEBRS; von der Embse et al., 2015).

Procedurally, the process of conducting a universal screening at a school would need to be driven by a collaborative faculty team with heavy administrative support. Carter, Carter, Johnson, and Pool (2012) described steps that educators took at one school to identify students for Tier 2 and 3 interventions and beyond. Within their process, faculty members would complete the screening instrument on a class of students whom they see regularly (e.g., a homeroom class). Ideally, multiple faculty members would complete the instrument on a single class to provide multiple data points on each student as a means of reducing teacher bias; in such an instance, the scores could be averaged together. Once the screening process is complete, the MTSS team (or whatever team has been assembled for this purpose) can view the compiled data to identify at-risk students. The faculty team can then sort the data by students’ scores on the instrument to reveal which students are most at risk based on the assessment. The final step in this process is to place students within one of the three MTSS tiers based on the results of the universal screening instrument. After this process is complete, the school counselors and the team can design interventions for students at each level. The faculty team may find it useful to consult other school discipline data points (e.g., office disciplinary referrals and suspensions) as additional baseline measures for students identified as needing Tier 2 or Tier 3 interventions. However, the team should keep in mind that these disciplinary actions have historically been applied to students of color, particularly Black males, at a disproportionate rate; thus, these data points may not be in line with the goal of using a more objective measurement strategy (Hoffman, 2012).
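
As an illustration of the aggregation and sorting steps just described, the sketch below (Python, with hypothetical student identifiers) averages multiple raters' screening scores and assigns a provisional tier. The cut scores are placeholders rather than the published thresholds of any particular instrument and would need to be replaced with the validated values for whatever screener a school adopts.

```python
# A minimal sketch of the screening workflow described above: average the
# ratings from multiple raters, sort by risk, and assign a provisional tier.
# Student identifiers, ratings, and cut scores are illustrative assumptions.
from statistics import mean

# raw screening scores: student -> ratings from each faculty rater
ratings = {
    "student_01": [2, 4, 3],
    "student_02": [11, 13, 12],
    "student_03": [6, 5, 7],
    "student_04": [0, 1, 1],
}

LOW_MAX = 3        # assumed upper bound of the "low risk" range
MODERATE_MAX = 8   # assumed upper bound of the "moderate risk" range

def provisional_tier(score: float) -> int:
    """Map an averaged screening score onto a provisional MTSS tier."""
    if score <= LOW_MAX:
        return 1
    if score <= MODERATE_MAX:
        return 2
    return 3

# average across raters, then list students from most to least at risk
averaged = {s: mean(scores) for s, scores in ratings.items()}
for student, score in sorted(averaged.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{student}: mean rating {score:.1f} -> provisional Tier {provisional_tier(score)}")
```

The output simply orders students by averaged risk score; the team would still review each placement alongside other information before finalizing tiers.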

Tiering and Intervention

Whereas school counselors can be an integral part of the universal screening process, they can also be a driving force with direct service delivery for students at all three MTSS tiers (Ockerman et al., 2012). The ASCA National Model (2012a) highlighted the overlapping nature of the model’s direct student services component to the three tiers of the MTSS model. The following sections will highlight the connections between the three MTSS tiers and the levels of service delivery within comprehensive school counseling programs; moreover, the authors will convey strategies and interventions that may be especially helpful for students of color facing social and behavioral concerns.

Tier 1. Tier 1 instruction or intervention takes place in the general education environment and is presented universally to students (Harn et al., 2015). Two programs commonly used at this level are PBIS and Social-Emotional Learning (Cook et al., 2015). However, Ockerman et al. (2012) noted that some elements of comprehensive school counseling programs (e.g., schoolwide interventions, large group interventions and the counseling core curriculum) fall within the first tier, as they are designed to target all or most students. For example, school counselors can partner with administrators and teachers to develop or adopt a data-driven PBIS program that integrates classroom lessons (e.g., character education) and schoolwide programming (e.g., an anti-bullying rally or positive behavior reward events). Additionally, school counselors can align their counseling curriculum with the goals of the MTSS or PBIS program and create lessons or units that support these goals. Potential topics for these lessons or units include social skills, conflict resolution, respecting diversity and differences in others, and managing one’s anger. School counselors can gather needs assessment data from students, teachers, parents and other stakeholders to determine which topics may be of most benefit to students. Tier 1 interventions are designed to effectively serve approximately 80–85% of students (Martens & Andreen, 2013).

Tier 2. Tier 2 interventions are enacted for students whose needs are not being met by Tier 1 services and may include a variety of interventions such as the following: (a) targeted interventions, (b) group interventions, and (c) individualized interventions for less problematic behaviors (Newcomer, Freeman, & Barrett, 2013). School counselors may be involved with any or all of these types of interventions but are more likely to provide direct services to students through small group interventions and individualized interventions for minor problem behaviors. The MTSS decision-making team should evaluate data from the universal screening process to determine which students may need a Tier 2 support and what type of intervention that should be. For example, after the first author compiled data from the SRSS at his middle school, he and his team evaluated the scores of students who fell in the moderate risk range to determine what interventions (e.g., small group counseling, behavior contract, Check-in/Check-out) would be appropriate for each student. Unlike Tier 1 supports, Tier 2 interventions should not be one-size-fits-all, but driven by the needs of each unique student.

Small group counseling. As students of color have been subject to disproportionate use of exclusionary disciplinary actions (e.g., in-school or out-of-school suspensions), school counselors and the decision-making team should utilize Tier 2 interventions that promote alternatives to suspension and help re-engage students with prosocial behaviors. Group counseling interventions can be more psychoeducational in nature (e.g., anger management, social skills development, conflict resolution, problem solving) or can be geared more toward personal growth and exploration of students’ feelings and concerns about everyday problems (Gladding, 2016). Regardless of the type of group, school counselors should foster an environment where students can openly express themselves and simultaneously work on an individual goal. Safety, trust and universality within the group may be especially helpful for marginalized students, as they can often feel disenfranchised from the school environment because of exclusionary discipline practices (Caton, 2012; Gladding, 2016).

Individualized interventions. Group counseling is not appropriate for some students, or their presenting issues may not warrant a group intervention. For these students, an individual approach to Tier 2 interventions is necessary. Two commonly used strategies are Check-in/Check-out and behavior contracts. Check-in/Check-out is a structured method for providing students with more frequent feedback regarding their behavior (Crone, Hawken, & Horner, 2010). With this strategy, students “check in” with a designated faculty member in the morning as a source of encouragement and non-contingent attention, receive a behavior report card that is carried with them throughout their day for teachers to record feedback, and “check out” with the same faculty member at the end of the day to evaluate progress and possibly receive a reward. The report card can then be taken home to parents as a form of home–school collaboration (Maggin, Zurheide, Pickett, & Baillie, 2015). Check-in/Check-out has been shown to be an intervention that successfully prevents escalation of student behavior and reduces disciplinary referrals (Maggin et al., 2015; Martens & Andreen, 2013). Moreover, it also helps students build a positive relationship with school staff members.

Behavior contracts follow a similar approach but take the form of a less intensive behavior intervention plan (BIP). With both approaches, the report card or behavior tracking form should be modified based on the developmental and behavioral needs of the student. The first author utilized an approach that integrated both of these interventions, and each identified student was matched with an adult with whom they had a trusting relationship, who acted as their designated check-in/check-out person. Students receiving an individual intervention also may benefit from small group counseling as an additional support. If Tier 2 interventions are unsuccessful in mitigating students’ problem behaviors, the team’s attention should shift to Tier 3 interventions.

Tier 3. Tier 3 interventions are appropriate for students identified as highly at risk by the universal screening and students who have not responded positively to Tier 2 interventions. As with Tier 2 interventions, school counselors’ roles with Tier 3 interventions may vary, ranging from a supporting or consultative role to directly delivering interventions. Counseling interventions at this level include individual counseling, one-on-one mentoring, or referrals to community agencies for more intensive services (Ockerman et al., 2012). School counselors should keep in mind that ASCA has identified providing long-term individual counseling as an inappropriate role for school counselors (ASCA, 2012a) due to time constraints and lack of resources. As such, referrals to community agencies may be most helpful in supporting students in need of more intensive one-on-one counseling services.

Behavior intervention plans are another Tier 3 strategy to mitigate more severe problem behaviors (Bohanon, McIntosh, & Goodman, 2015). Lo and Cartledge (2006) found that conducting functional behavioral assessments (FBAs) and creating BIPs was a successful intervention for reducing problem behaviors and increasing replacement behaviors in elementary-aged Black males. Whether through counseling intervention or intensive behavior support, structured Tier 3 interventions can provide alternatives to suspensions, which is especially helpful for students of color as previously discussed.

Progress Monitoring

The MTSS process does not end with universal screening or service delivery; the decision-making team must have a clear and systematic plan for monitoring student outcomes. Carter et al. (2012) recommended administering the universal screening tool at least twice during the school year to evaluate progress. By taking such action, the decision-making team can determine which students are responding well to interventions and which students are not. Those students responding well to Tier 2 or 3 interventions may be moved down to Tier 1, whereas those not responding well to Tier 1 or 2 may be moved up a tier. Students not responding to Tier 3 interventions may warrant additional behavioral or psychological assessment to determine if further services are more appropriate (Ockerman et al., 2012). Progress monitoring also can provide clues about the efficacy of an intervention or the fidelity of its implementation. For example, if only one student in a class is responding to a Tier 1 intervention, the team may want to evaluate the delivery of that intervention for that class or consider an alternative intervention. A primary benefit of a data-driven progress monitoring approach is that it allows for objective decision making, rather than subjective decision making that may be influenced by bias.
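
The sketch below illustrates, with invented scores and the same assumed cut points used in the earlier screening sketch, how two administrations of a screening instrument might be compared to flag possible tier changes. In practice, such output would inform, not replace, the decision-making team's judgment.

```python
# A minimal sketch of the progress-monitoring step described above, comparing
# fall and winter administrations of the same screening instrument.
# Student identifiers, scores, and cut scores are illustrative assumptions.
LOW_MAX, MODERATE_MAX = 3, 8  # assumed cut scores for the instrument

def tier(score: int) -> int:
    """Map a screening score onto an MTSS tier using the assumed cut scores."""
    if score <= LOW_MAX:
        return 1
    if score <= MODERATE_MAX:
        return 2
    return 3

fall = {"student_01": 12, "student_02": 6, "student_03": 2}
winter = {"student_01": 7, "student_02": 9, "student_03": 2}

for student in fall:
    before, after = tier(fall[student]), tier(winter[student])
    if after < before:
        note = "responding; consider moving down a tier"
    elif after > before:
        note = "not responding; consider moving up a tier"
    else:
        note = "no tier change; continue current supports"
    print(f"{student}: Tier {before} -> Tier {after} ({note})")
```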

Implications for School Counselors

In line with the ASCA National Model (2012a), school counselors are called to be advocates and agents of systemic change in their schools. Part of this calling includes implementing comprehensive school counseling programs that address inequities within the school and provide programming to address the achievement gap. As has been discussed previously, integrating MTSS and the National Model can be especially helpful for students of color who have historically been subject to bias within discipline policies and procedures, resulting in disproportionate rates of disciplinary action. School counselors acting as advocates and agents of change should be proactive in analyzing school data to determine whether these inequities are at play and must be vocal about the need to solve these problems if they do exist at their schools (ASCA, 2012b).
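
One concrete way to begin the data analysis called for above is to compute a simple risk ratio comparing each group's suspension rate with the rate for all other students. The sketch below uses hypothetical enrollment and suspension counts; a fuller equity review would, of course, examine multiple outcomes (e.g., referrals, expulsions, arrests) across several years of data.

```python
# A minimal sketch of a disproportionality check: for each group, compare its
# suspension rate to the rate for all other students (a risk ratio).
# Enrollment and suspension counts are hypothetical, not real school data.
enrollment = {"Black": 160, "White": 520, "Latino": 220, "Other": 100}
suspensions = {"Black": 48, "White": 52, "Latino": 31, "Other": 9}

total_enrolled = sum(enrollment.values())
total_suspended = sum(suspensions.values())

for group in enrollment:
    group_rate = suspensions[group] / enrollment[group]
    others_rate = (total_suspended - suspensions[group]) / (total_enrolled - enrollment[group])
    risk_ratio = group_rate / others_rate
    print(f"{group}: suspension rate {group_rate:.1%}, "
          f"risk ratio vs. all other students {risk_ratio:.2f}")

# A risk ratio well above 1.0 for a group signals possible disproportionality
# worth raising with the MTSS team and administration.
```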

As such, school counselors should ensure that they are versed in best practices such as MTSS that have been shown to positively impact racial and cultural inequities. However, school counselors cannot solve the problem alone. The other two themes of the ASCA National Model (2012a)—leadership, and collaboration and teaming—are also critically important if school counselors are to implement such programs. With training in data analysis, program development and direct service implementation, school counselors are uniquely positioned to take on leadership roles with regard to MTSS programming. However, they also should recognize their roles as collaborators and team members for program elements that do not directly fall within the role of school counselors (Ockerman et al., 2012).

Implications for Counselor Educators and Researchers

As stakeholders charged with training the next generation of school counselors, counselor educators must remain versed in newer topics within school counseling and education. Although PBIS has been around since 1997, MTSS is still a relatively new concept, especially when integrated with the ASCA National Model. School counselor educators should ensure that coursework prepares future school counselors to engage in such programming. More specifically, school counselor preparation courses should include discussion and application of MTSS, data analysis, program evaluation, behavior interventions and other concepts that are vital to coordinating ASCA Model programming. At the same time, counselor educators also must empower graduate students to become advocates for marginalized students at their future schools and for themselves as professionals. Because there is little research available that evaluates the integration of MTSS and ASCA Model programming, it is imperative that school counselors and counselor educators collaborate to conduct such research.

Conclusion

Research on the school-to-prison pipeline has demonstrated an unfortunate link between disproportionate K–12 disciplinary practices faced by students of color and the criminal justice system. An integrated system including a multi-tiered system of supports and the ASCA (2012a) National Model has been introduced in this manuscript to address disciplinary concerns in a more systemically balanced manner. MTSS and the ASCA National Model utilize a similar data-driven, structured approach to solving issues related to academic and behavioral concerns. When integrated, the overlapping elements of each framework can provide an avenue for addressing key concerns for students of color exhibiting problem behaviors. Rather than relying on disciplinary procedures that may result in students being excluded from class, an approach integrating frameworks of prevention and intervention can provide a much-needed alternative. The framework provided herein details steps that school counselors and other educators can take to address the school-to-prison pipeline. In order to best support marginalized students, school counselors must heed the call to leadership, advocacy, collaboration and systemic change given by the National Model; moreover, joining forces with other educators through collaborative efforts such as MTSS can only strengthen the effort to support the success of all students.

Conflict of Interest and Funding Disclosure

The authors reported no conflict of interest or funding contributions for the development of this manuscript.

References

Alexander, M. (2010). The new Jim Crow: Mass incarceration in the age of colorblindness. New York, NY: The New Press.

American School Counselor Association. (2012a). The ASCA national model: A framework for school counseling programs (3rd ed.). Alexandria, VA: Author.

American School Counselor Association. (2012b). ASCA school counselor competencies. Retrieved from https://www.schoolcounselor.org/asca/media/asca/home/SCCompetencies.pdf

American School Counselor Association. (2014). The school counselor and multitiered system of supports. Retrieved from http://www.schoolcounselor.org/asca/media/asca/PositionStatements/PS_MultitieredSupportSystem.pdf

Artiles, A. J., Harry, B., Reschly, D. J., & Chinn, P. C. (2002). Over-identification of students of color in special education: A critical overview. Multicultural Perspectives, 4, 3–10. doi:10.1207/s15327892mcp0401_2

Banks, T., & Obiakor, F. E. (2015). Culturally responsive positive behavior supports: Considerations for practice. Journal of Education and Training Studies, 3(2), 83–90. doi:10.11114/jets.v3i2.636

Belser, C. T. (2015). African American males: A career and college readiness crisis. In J. R. Curry & M. A. Shillingford (Eds.), African American students’ career and college readiness: The journey unraveled (pp. 279–307). Washington, DC: Lexington Books.

Bernstein, N. (2014). Burning down the house: The end of juvenile prison. New York, NY: The New Press.

Bohanon, H., McIntosh, K., & Goodman, S. (2015). Integrating academic and behavior supports within an RtI framework, part 4: Tertiary supports. Retrieved from http://www.rtinetwork.org/learn/behavior-supports/integrating-academic-and-behavior-supports-tertiary-supports

Bowman-Perrott, L., Benz, M. R., Hsu, H.-Y., Kwok, O.-M., Eisterhold, L. A., & Zhang, D. (2011). Patterns and predictors of disciplinary exclusion over time: An analysis of the SEELS national data set. Journal of Emotional and Behavioral Disorders, 21(2), 83–96. doi:10.1177/1063426611407501

Carey, J., & Dimmitt, C. (2012). School counseling and student outcomes: Summary of six statewide studies. Professional School Counseling, 16, 146–153. doi:10.5330/PSC.n.2012-16.146

Carmichael, D., Whitten, G., & Voloudakis, M. (2005). Study of minority over-representation in the Texas juvenile justice system: Final report. College Station, TX: Public Policy Research Institute at Texas A&M University.

Carson, E. A. (2015). Prisoners in 2014. Retrieved from http://www.bjs.gov/index.cfm?ty=pbdetail&iid=5387

Carter, D. R., Carter, G. M., Johnson, E. S., & Pool, J. L. (2012). Systematic implementation of a Tier 2 behavior intervention. Intervention in School and Clinic, 48, 223–231. doi:10.1177/1053451212462879

Caton, M. T. (2012). Black male perspectives on their educational experiences in high school. Urban Education, 47, 1055–1085. doi:10.1177/0042085912454442

Cheney, D. A., & Yong, M. (2014). RE-AIM checklist for integrating and sustaining Tier 2 social-behavioral interventions. Intervention in School and Clinic, 50, 39–44. doi:10.1177/1053451214532343

Cook, C. R., Frye, M., Slemrod, T., Lyon, A. R., Renshaw, T. L., & Zhang, Y. (2015). An integrated approach to universal prevention: Independent and combined effects of PBIS and SEL on youths’ mental health. School Psychology Quarterly, 30, 166–183. doi:10.1037/spq0000102

Crone, D. A., Hawken, L. S., & Horner, R. H. (2010). Responding to problem behavior in schools: The behavior education program (2nd ed.). New York, NY: Guilford Press.

Delpit, L. (2006). Other people’s children: Cultural conflict in the classroom. New York, NY: Norton.

Dimmitt, C., & Wilkerson, B. (2012). Comprehensive school counseling in Rhode Island: Access to services and student outcomes. Professional School Counseling, 16, 125–135. doi:10.5330/PSC.n.2012-16.125

Drummond, T. (1994). The Student Risk Screening Scale (SRSS). Grants Pass, OR: Josephine County Mental Health Program.

Ennis, R. P., Lane, K. L., & Oakes, W. P. (2012). Score reliability and validity of the student risk screening scale: A psychometrically sound, feasible tool for use in urban elementary schools. Journal of Emotional and Behavioral Disorders, 20, 241–259. doi:10.1177/1063426611400082

Ferri, B. A., & Connor, D. J. (2005). In the shadow of Brown: Special education and overrepresentation of students of color. Remedial and Special Education, 26, 93–100. doi:10.1177/07419325050260020401

Ford, D. Y., & Moore, J. L., III. (2013). Understanding and reversing underachievement, low achievement, and achievement gaps among high-ability African American males in urban school contexts. Urban Review, 45, 399–415. doi:10.1007/s11256-013-0256-3

Fowler, D. (2011). School discipline feeds the pipeline to prison. Phi Delta Kappan, 93(2), 14–19. doi:10.1177/003172171109300204

Freeman, J., Simonsen, B., McCoach, D. B., Sugai, G., Lombardi, A., & Horner, R. (2016). Relationship between school-wide positive behavior interventions and supports and academic, attendance, and behavior outcomes in high schools. Journal of Positive Behavior Intervention, 18, 41–51. doi:10.1177/1098300715580992

Gladding, S. (2016). Groups: A counseling specialty (7th ed.). Upper Saddle River, NJ: Prentice-Hall.

Harn, B., Basaraba, D., Chard, D., & Fritz, R. (2015). The impact of schoolwide prevention efforts: Lessons learned from implementing independent academic and behavior support systems. Learning Disabilities: A Contemporary Journal, 13, 3–20. doi:10.1177/0022219407313588

Hoffman, S. (2012). Zero benefit: Estimating the effect of zero tolerance discipline policies on racial disparities in school discipline. Education Policy, 28, 69–95. doi:10.1177/0895904812453999

Horner, R. H., Sugai, G. M., & Anderson, C. M. (2010). Examining the evidence base for school-wide positive behavior support. Focus on Exceptional Children, 42(8), 1–14.

Kaeble, D., Glaze, L. E., Tsoutis, A., & Minton, T. D. (2015). Correctional populations in the United States, 2014. Retrieved from http://www.bjs.gov/index.cfm?ty=pbdetail&iid=5519

Kamphaus, R. W., & Reynolds, C. R. (2007). BASC-2 Behavioral and emotional screening system (BASC-2 BESS). Minneapolis, MN: Pearson.

Lane, K. L., Kalberg, J. R., Menzies, H., Bruhn, A., Eisner, S., & Crnobori, M. (2011). Using systematic screening data to assess risk and identify students for targeted supports: Illustrations across the K-12 continuum. Remedial and Special Education, 32, 39–54. doi:10.1177/0741932510361263

Lane, K. L., Oakes, W. P., Carter, E. W., Lambert, W. E., & Jenkins, A. B. (2013). Initial evidence for the reliability and validity of the student risk screening scale for internalizing and externalizing behaviors at the middle school level. Assessment for Effective Intervention, 39, 24–38. doi:10.1177/1534508413489336

Lane, K. L., Oakes, W. P., Ennis, R. P., Cox, M. L., Schatschneider, C., & Lambert, W. (2011). Additional evidence for the reliability and validity of the student risk screening scale at the high school level: A replication and extension. Journal of Emotional and Behavioral Disorders, 21(2), 97–115. doi:10.1177/1063426611407339

Lane, K. L., Oakes, W. P., Harris, P. J., Menzies, H. M., Cox, M., & Lambert, W. (2012). Initial evidence for the reliability and validity of the student risk screening scale for internalizing and externalizing behaviors at the elementary level. Behavioral Disorders, 37, 99–122.

Lane, K. L., Oakes, W. P., Swogger, E. D., Schatschneider, C., Menzies, H. M., & Sanchez, J. (2015). Student risk screening scale for internalizing and externalizing behaviors: Preliminary cut scores to support data-informed decision making. Behavioral Disorders, 40, 159–170.

doi:10.17988/0198-7429-40.3.159

Lapan, R. T., Whitcomb, S. A., & Aleman, N. M. (2012). Connecticut professional school counselors: College and career counseling services and smaller ratios benefit students. Professional School Counseling, 16, 117–124. doi:10.5330/PSC.n.2012-16.124

Lo, Y.-Y., & Cartledge, G. (2006). FBA and BIP: Increasing the behavior adjustment of African American boys in schools. Behavioral Disorders, 31, 147–161.

Maggin, D. M., Zurheide, J., Pickett, K. C., & Baillie, S. J. (2015). A systematic evidence review of the check-in/check-out program for reducing student challenging behaviors. Journal of Positive Behavior Interventions, 17, 197–208. doi:10.1177/1098300715573630

Martens, K., & Andreen, K. (2013). School counselors’ involvement with a school-wide positive behavior support system: Addressing student behavior issues in a proactive and positive manner. Professional School Counseling, 16, 313–322. doi:10.5330/PSC.n.2013-16.313

Mayes, R. D., & Hines, E. M. (2014). College and career readiness for gifted African American girls: A call to school counselors. Interdisciplinary Journal of Teaching and Learning, 4, 31–42.

Moore, J. L., Henfield, M. S., & Owens, D. (2008). African American males in special education: Their attitudes and perceptions toward high school counselors and school counseling services. American Behavioral Scientist, 51, 907–927. doi:10.1177/0002764207311997

Natsuaki, M. N., Ge, X., & Wenk, E. (2008). Continuity and changes in the developmental trajectories of criminal career: Examining the roles of timing of first arrest and high school graduation. Journal of Youth and Adolescence, 37, 431–444. doi:10.1007/s10964-006-9156-0

Newcomer, L. L., Freeman, R., & Barrett, S. (2013). Essential systems for sustainable implementation of Tier 2 supports. Journal of Applied School Psychology, 29, 126–147. doi:10.1080/15377903.2013.778770

Ockerman, M. S., Mason, E. C. M., & Hollenbeck, A. F. (2012). Integrating RTI with school counseling programs: Being a proactive professional school counselor. Journal of School Counseling, 10(15), 1–37. Retrieved from http://files.eric.ed.gov/fulltext/EJ978870.pdf

Palmer, L. E., & Erford, B. T. (2012). Predicting student outcome measures using the ASCA National Model program audit. The Professional Counselor, 2, 152–159. doi:10.15241/lep.2.2.152

Palmer, R. T., & Maramba, D. C. (2010). African American male achievement: Using a tenet of Critical Theory to explain the African American male achievement disparity. Education and Urban Society, 43, 431–450. doi:10.1177/0013124510380715

Skiba, R. J. (2014). The failure of zero tolerance. Reclaiming Children and Youth, 22(4), 27–33.

Sugai, G., & Horner, R. H. (2009). Responsiveness-to-intervention and school-wide positive behavior supports: Integration of multi-tiered system approaches. Exceptionality, 17, 223–237. doi:10.1080/09362830903235375

Toldson, I. A., & Lewis, C. W. (2012). Challenge the status quo: Academic success among school-age African American males. Retrieved from http://www.cbcfinc.org/oUploadedFiles/CTSQ.pdf

Travis, J., Western, B., & Redburn, S.  (2014). The growth of incarceration in the United States: Exploring causes and consequences. Committee on Causes and Consequences of High Rates of Incarceration, Committee on Law and Justice, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academic Press.

U.S. Department of Education, Office of Civil Rights. (2014). Data snapshot: School discipline (Issue Brief No. 1). Retrieved from http://www2.ed.gov/about/offices/list/ocr/docs/crdc-discipline-snapshot.pdf

Von der Embse, N. P., Pendergast, L. L., Kilgus, S. P., & Eklund, K. R. (2015). Evaluating the applied use of a mental health screener: Structural validity of the social, academic, and emotional behavior risk screener. Psychological Assessment. Advance online publication. doi:10.1037/pas0000253

Walker, H. M., & Severson, H. H. (1992). Systematic Screening for Behavior Disorders (SSBD): User’s guide and administration manual. Longmont, CO: Sopris West.

Wiesner, M., Kim, H. K., & Capaldi, D. M. (2010). History of juvenile arrests and vocational career

outcomes for at-risk young men. Journal of Research in Crime & Delinquency, 47, 91–117. doi:10.1177/0022427809348908

Christopher T. Belser, NCC, is a doctoral candidate at the University of Central Florida. M. Ann Shillingford is an Associate Professor at the University of Central Florida. J. Richelle Joe, NCC, is an Assistant Professor at the University of Central Florida. Correspondence can be addressed to Christopher Belser, 231B Mathematical Sciences Building, University of Central Florida, Orlando, FL 32816, christopher.belser@ucf.edu.

Mental Health Practitioners’ Perceived Levels of Preparedness, Levels of Confidence and Methods Used in the Assessment of Youth Suicide Risk

Robert C. Schmidt

Youth suicide is a significant public health concern and efforts to reduce youth suicide remain a national priority (Kung, Hoyert, Xu, & Murphy, 2008; National Action Alliance for Suicide Prevention: Research Prioritization Task Force, 2014). In the United States, there were 40,600 suicides in 2012, an average of 111 suicides per day (Centers for Disease Control and Prevention [CDC], 2014a). Of these, 5,183 were youth suicides, an average of 14 per day, or one youth suicide every 1 hour and 42 minutes (Drapeau & McIntosh, 2014). Suicide is the third leading cause of death among youth ages 10–14 and has become the second leading cause of death among those ages 15–24 (CDC, 2014a). Results from the 2013 Youth Risk Behavior Surveillance (YRBS) indicated that 29.9% of high school students had felt sad or hopeless almost every day for 2 weeks or more, 17% had seriously considered attempting suicide, 13.6% had made a plan about how they would attempt suicide, and 8% had attempted suicide one or more times (CDC, 2014b).

 

Efforts to address the increasing rate of youth suicide call for the identification of existing training and preparation gaps currently faced by practitioners (National Action Alliance for Suicide Prevention: Research Prioritization Task Force, 2014). These gaps pose many challenges for practitioners seeking to provide appropriate interventions effectively. Although previous studies have investigated training gaps among specific professional disciplines (Debski, Spadafore, Jacob, Poole, & Hixson, 2007; Dexter-Mazza & Freeman, 2003; O’Connor, Warby, Raphael, & Vassallo, 2004), the current study investigated a broader representation of disciplines, including social workers, school counselors, professional counselors, school psychologists and psychologists. This study examined practitioners’ self-perceived levels of preparedness, levels of confidence and methods used in the assessment of youth suicide.

 

     Practitioner readiness in suicide assessment. In approximately eight of ten suicides, youth give advance clues or warning signs of their intentions that can be detected by others (McEvoy & McEvoy, 2000; Poland & Lieberman, 2002). In a four-year study of youth in a rural school district (N = 5,949) who were screened for suicidal thoughts, 670 (11%) reported having had suicidal thoughts within the past year or the past few days (Schmidt, Iachini, George, Koller, & Weist, 2015). Practitioners working within school or community mental health settings have an opportunity to play a critical role in the identification, assessment and prevention of youth suicide (Singer & Slovak, 2011). Within either setting, practitioners will encounter clients having suicidal thoughts or behaviors (Rudd, 2006). The practitioner’s responsibility in the assessment of suicide is to estimate risk based on identifying warning signs and associated behaviors and to respond appropriately (Bryan & Rudd, 2006).

 

In a national sampling of social workers, 93% of the respondents reported having worked with a suicidal patient (Feldman & Freedenthal, 2006), and 55% of clinical social workers reported having a patient attempt suicide (Sanders, Jacobson, & Ting, 2008). In a study of psychology doctoral interns (N = 238) completed by Dexter-Mazza and Freeman (2003), 99% reported providing services to suicidal patients and 5% reported experiencing a patient death by suicide. Across professional disciplines, 22% to 30% of social workers, counselors and psychologists reported having a patient die by suicide (Jacobson, Ting, Sanders, & Harrington, 2004).

 

Irrespective of a practitioner’s level of suicide training, comfort or experience (i.e., even those with limited training and preparedness), the circumstances under which practitioners meet with a suicidal client are not only stressful but also carry legal and ethical ramifications (Cramer, Johnson, McLaughlin, Rausch, & Conroy, 2013; Poland & Lieberman, 2002). Research suggests that significant gaps exist in practitioners’ training and readiness to perform suicide risk assessments, highlighting deficits in the level of preparedness, level of confidence and methods used to determine suicide risk level (Smith, Silva, Covington, & Joiner, 2014).

 

Although youth suicide remains a national concern and priority, the most prominent gaps appear in translating research into practice, specifically in developing and providing appropriate levels of training and supervision for practitioners (Smith et al., 2014). Research supporting this concern offers valuable recommendations (Osteen, Frey, & Ko, 2014; Schmitz et al., 2012); however, despite these recommendations, training and preparation continue to lag (Rudd, Cukrowicz, & Bryan, 2008). Practitioner competency in suicide assessment continues to be neglected by the colleges, universities, licensing bodies, clinical supervisors and training sites that could have the greatest impact in reducing youth and adult suicide (Schmitz et al., 2012).

 

     Practitioner preparedness. Over the past several decades, researchers have identified gaps in suicide risk knowledge, finding that practitioners were inadequately prepared to assess suicide risk. Only 40–50% of master’s and doctoral clinical and counseling psychology training programs were found to offer formalized training in suicide assessment and the management of suicide risk (Kleespies, Penk, & Forsyth, 1993). Suicide-specific training was included in only 2% of accredited professional counseling programs and 6% of accredited marriage and family therapist training programs (Wozny, 2005).

 

Training also has been identified as limited among social work graduate programs, averaging 4 hours or fewer specific to suicide education (Ruth et al., 2009). In a study by Feldman and Freedenthal (2006) that randomly surveyed social workers through the National Association of Social Workers (N = 598), almost all of the participants (92.3%) reported working with a suicidal client; however, only 21.1% had received any formal suicide-related training in their master’s program. Of those who had received formal training, 46% specified that their suicide-devoted training was less than 2 hours.

 

This pattern continued as additional studies found that psychology doctoral interns did not receive adequate training in suicide assessment or in managing suicide risk in clients, nor did they receive the necessary levels of clinical supervision in suicide assessment (Mackelprang, Karle, Reihl, & Cash, 2014). In a study of psychology graduate school programs, 76% of the program directors indicated a need for more suicide-specific training and education within their programs but reported barriers to implementing this training (Jahn et al., 2012). The chief barrier reported by the directors was the absence of guidance and curriculum requirements for providing such training, followed by the inability of colleges to create space in the existing curriculum schedule for added classes (Jahn et al., 2012).

 

In a survey that included members of the National Association of School Psychologists (N = 162), fewer than half (40%) of the respondents reported receiving graduate-level training in suicide risk assessment (Debski et al., 2007). Most school psychologists in that study reported feeling at least somewhat prepared to work with suicidal students, while doctorally trained practitioners reported feeling well prepared.

 

School counselors share similar gaps in their preparation to provide suicide intervention and assessment to youth. Research conducted by Wachter (2006) indicated that 30% of school counselors had no suicide prevention training. In a study conducted by Wozny (2005), just 52.3% of school counselors, averaging 5.6 years of experience, were able to identify critical suicide risk factors. That study exposed competency gaps in suicide assessment, training and intervention consistent with those identified across the other practitioner disciplines reviewed here, and with previous findings (National Action Alliance for Suicide Prevention, 2014; Schmitz et al., 2012) of insufficient training and preparation of practitioners in the assessment and prevention of youth suicide and suicide in general.

 

     Practitioner confidence. Although most practitioners will encounter youth with suicidal thoughts and behaviors, many lack the self-confidence to effectively work with suicidal youth. The lack of confidence appears related to competency levels and limited training (National Action Alliance for Suicide Prevention, 2014; Oordt, Jobes, Fonseca, & Schmidt, 2009).

 

In contrast, researchers found that as practitioner risk assessment skills increased through suicide-specific training, measurable increases in practitioner self-confidence followed (McNiel et al., 2008). Oordt and colleagues (2009) studied mental health practitioners’ levels of confidence after they received empirically based suicide assessment and treatment training. The results indicated that self-reported levels of practitioner confidence increased by 44%, and confidence related to the management of suicidal patients increased by 54%. In addition, studies of school counselors identified correlations among self-efficacy, confidence and the ability to improve clinical judgment in providing suicide interventions and assessment (Al-Damarki, 2004).

 

Adequate training and experience in suicide prevention and assessment have been found to increase practitioners’ levels of confidence in conducting risk assessments and management planning (Singer & Slovak, 2011). Research suggests that confidence increases the practitioner’s ability to estimate suicide risk level, make effective treatment decisions and ground recommendations in a quality assessment. When the assessor is not confident, however, the assessment is more prone to errors or missed information, decreasing its accuracy (Douglas & Ogloff, 2003). Paradoxically, overconfidence produces similar results to a lack of confidence. Tetlock (2005) reported that overconfident practitioners are more prone to making errors during a suicide risk assessment unless their clinical judgment is supported by objective evidence, such as a formal, validated and reliable method of assessment.

 

Methods Used in Suicide Assessment

 

There are several categories of suicide assessment instruments developed for youth (Goldston, 2003; National Action Alliance for Suicide Prevention, 2014). These include detection instruments, such as structured and semi-structured interviews; survey screenings, including self-report inventories and behavior checklists; and risk assessment instruments, including screenings, self-report questionnaires and multi-tier screening assessments.

 

Across settings including schools, emergency departments, primary care offices and community mental health offices, studies indicate that inconsistent methods are used to assess suicide risk (Horowitz, Ballard, & Pao, 2009). In most instances, published and validated suicide screening tools are not being used as intended or designed, which undermines their reliability and validity (Boudreaux & Horowitz, 2014). This may reflect practitioners’ limited training, confidence and experience in these areas.

 

The documentation of a suicide assessment also can reflect the level of the practitioner’s training and knowledge of suicide assessment. O’Connor and colleagues (2004) noted that practitioner skill deficiencies in youth suicide assessment are likely to appear in clinic notes as a brief statement, “patient currently denies suicidal thoughts,” based on the practitioner’s impressionistic and subjective perception after completing a brief unstructured interview. This is commonly the only form of documentation obtained by the practitioner (O’Connor et al., 2004). Research consistently provides evidence across disciplines that some practitioners are not prepared to make these clinical judgments (Debski et al., 2007; Jahn et al., 2012; Mackelprang et al., 2014; Ruth et al., 2009; Smith et al., 2014). This study offered an opportunity to contribute to the understanding of practitioners’ self-perceived competencies in the assessment of youth suicide while identifying existing gaps in training.

 

The Current Study

 

In previous studies, research has focused on confidence and preparedness levels only in specific disciplines related to the identification and assessment of suicidal youth (Al-Damarki, 2004; Debski et al., 2007; Wozny, 2005). This study encompassed a much broader representative sample of practitioner disciplines including psychologists, social workers, school counselors, professional counselors and school psychologists.

 

The purpose of this study was to determine relationships among practitioners’ self-perceived levels of preparedness, levels of confidence and methods used to perform suicide risk assessments in youth. These efforts were guided by the following research question: What are the relationships among the self-perceived levels of preparedness, levels of confidence, and methods used in the assessment of suicide risk for practitioners whose responsibilities require suicide risk assessment and management? In order to address this, survey questions were designed to obtain participant responses related to skill development, preparation, confidence and methods used in the process of conducting suicide risk assessments.

 

Method

 

Procedures and Instrumentation

     Because this study collected data from human subjects, the proposal was reviewed and approved by the Wilmington University Human Subjects Review Committee before the study began. An exploratory, descriptive survey design was used to examine practitioners’ self-perceived levels of preparedness, levels of confidence and methods used to assess suicide risk in youth. Using a quantitative approach, the researcher recruited practitioners whose positions carry responsibility for suicide risk assessment. Recruitment included working in cooperation with, and posting the survey on, the Maryland School Psychologists’ Association Web site and the University of Maryland Center for School Mental Health Web site. The survey was forwarded to school districts in Maryland and Virginia and directed to school counselors, school psychologists, and school-based mental health professionals, including social workers and professional counselors. In addition, the survey was forwarded to multiple outpatient mental health clinics in the mid-Atlantic region of the United States. Practitioners were provided with information about the survey, the study’s purposes and ethical standards, and it was noted that participation was voluntary and confidential. Participants were given an access link to complete the survey anonymously using SurveyGizmo, and their responses were submitted online, allowing the researcher to evaluate self-reported levels related to suicide assessment. The completed data were then entered into an Excel spreadsheet database.

 

The Child and Adolescent Suicide Intervention Preparedness Survey was the instrument developed for this study. The researcher received prior approval from the authors of two previously published surveys (Debski et al., 2007; Stein-Erichsen, 2010) to adapt their items and added specific queries for the purposes of this study. The survey by Debski and colleagues (2007) included a 42-item questionnaire with vignettes that measured the training, roles and knowledge of school psychologists. These questions targeted participant confidence and perceived levels of preparedness, which also were sought in the current study, but from a broader discipline base.

 

The survey by Stein-Erichsen (2010) included a 55-item measure designed to identify the confidence levels of school psychologists providing suicide intervention and prevention within schools. Questions from the Stein-Erichsen (2010) and Debski and colleagues (2007) instruments were adapted for this study, focusing specifically on preparedness levels, confidence, roles and methods used to assess suicide risk; survey questions not relevant to this study were omitted. The result was a 23-item survey targeting practitioner levels of training, preparedness and confidence and the identification of additional training needs.

 

Participants

The study had 339 participants representing school counselors (N = 107/32%); social workers (N = 90/27%); school psychologists (N = 37/11%); professional counselors (N = 35/11%); psychologists (N = 5/1%); other (N = 62/18%); and three participants with unknown professional identification.

 

The final sampling of participants included 43 males, 292 females and four participants with unknown gender identification. Participants’ ages fell within the ranges of 22–29 (N = 33/10%), 30–39 (N = 105/31%), 40–49 (N = 94/28%), 50–59 (N = 61/18%) and 60 and above (N = 45/13%). In response to the item querying level of education, participants reported holding a bachelor’s degree (N = 18/6%), doctoral degree (N = 14/4%), master’s degree (N = 275/81%) or other credential (N = 28/8%), including associate levels of education; four participants (1%) had unknown educational levels.

 

The participants represented a broad but targeted sampling from a variety of employers, including school settings (N = 166/49%); outpatient mental health settings (N = 108/32%); mental health agencies (N = 31/9%); and other settings (N = 33/10%); as well as one participant with an unknown employment setting. The participants also identified their employment environment as urban (N = 56/17%), rural (N = 174/52%) or suburban (N = 105/31%).

 

Participants identified the practitioners responsible for assessing suicide risk within their work settings, with multiple response options allowed (see Table 1). These included a psychiatrist (N = 85/25%), nurse (N = 57/17%), school counselor (N = 179/53%), social worker (N = 168/50%), teacher (N = 7/2%), school psychologist (N = 154/46%), school mental health professional (N = 125/37%), psychologist (N = 64/19%), professional counselor (N = 101/30%), and other (N = 29/9%), including paraprofessionals; 19 participants (6%) reported that they do not complete suicide risk assessments.

 

     Prior exposure to suicidal students/clients. In the survey, 288 (86%) of the participants reported having had a student or client referred to them for being potentially suicidal, 45 (14%) had not received such a referral, and six participants did not respond. A majority of participants (N = 287/86%) reported having worked with a student or client initially found to be presenting with active suicidal thoughts, and 48 (14%) reported not yet having worked with a suicidal student or client.

 

Analysis

 

Participant responses were first examined descriptively to determine frequencies and percentages of the total responses. Inferential statistics were then computed in SPSS to examine possible relationships among variables. Data from the primary survey questions guided the examination of possible relationships among practitioner preparedness, confidence and the methods used in determining suicide risk level.
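
Although the analyses were run in SPSS, a minimal Python sketch can illustrate the descriptive step. The file name and column name below are hypothetical, and the sketch is only an analogue of the tabulation described above, not the study’s actual procedure.

import pandas as pd

# Hypothetical export of the survey data: one row per participant,
# one column per survey item (e.g., a "preparedness" item).
responses = pd.read_csv("survey_responses.csv")

# Frequencies and percentages of the total responses for one item.
counts = responses["preparedness"].value_counts(dropna=False)
percents = (counts / len(responses) * 100).round(1)
print(pd.concat([counts.rename("n"), percents.rename("%")], axis=1))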

 

Results

     Self-perceived preparedness in suicide assessment. The majority of respondents reported some type of exposure to or training in suicide intervention and assessment. Participants could select multiple answers: graduate coursework (N = 174/52%), attending professional development workshops (N = 233/69%), in-service trainings at work (N = 213/63%), and having not received any training (N = 21/6%). In addition, participants had multiple answer options representing self-perceived preparedness levels: not feeling at all prepared (N = 15/4%), feeling somewhat prepared (N = 120/36%), feeling well prepared (N = 202/60%), and requesting that someone more prepared meet with or assess a suicidal student/client (N = 32/9%).

 

     Self-reported confidence in suicide assessment. The confidence levels reported by the participants reflect their professional skill development in conducting suicide risk assessments. The responses included feeling very confident (N = 49/15%), confident (N = 212/63%), and not very confident (N = 63/19%). A similar survey item asked about confidence levels when working with a suicidal student or client. The responses included feeling very confident (N = 42/12%), confident (N = 231/69%), and not very confident (N = 63/19%). An additional survey item sought information regarding participant feelings when assessing for suicidal thoughts. Results indicated feeling not prepared (N = 39/12%), anxious (N = 116/34%), calm (N = 145/43%), and confident (N = 185/55%).

 

     Methods used to determine suicide risk level during assessment. Several survey items queried participant levels of training and methods used to assess a suicidal student or client. A survey item asked participants if they had received formal training to conduct suicide risk assessments. The respondents indicated Yes (N = 201/60%) or No (N = 133/40%). In addition, a survey question asked participants if they felt qualified to complete a suicide risk assessment: Yes (N = 241/73%) or No (N = 91/27%). A follow-up survey item asked participants how they determined if the student or client was at imminent risk, high to moderate risk or low risk. The participant responses indicated they would conduct an informal, non-structured interview (N = 213/64%) or use a formal, valid suicide assessment instrument (N = 90/27%); the remaining respondents indicated other (N = 31/9%).

 

Participants were asked what would limit their ability to provide a suicide intervention. Using a “check all that apply” format, responses included not having received formal training to work with suicidal students or clients (N = 55/17%), believing that suicide intervention and response is the job of others (N = 19/6%), not feeling adequately prepared to provide a suicide intervention or assessment (N = 65/20%), workplace policy not allowing formal suicide assessments (N = 12/4%), and feeling prepared (N = 225/68%). The discipline most frequently reported to encounter and assess a youth presenting with suicidal thoughts or behaviors in this study was the school counselor (53%). This finding supports previous research by Poland (1989), who identified that “the task of suicide assessment was likely to fall on the school counselor” (p. 74).

 

To determine whether relationships existed among self-perceived levels of preparedness, levels of confidence, and methods used in youth suicide assessment, the researcher conducted chi-square analyses of the categorical responses. In order to compare differences among groups, variables were collapsed into confident/not confident and prepared/not prepared. The first analysis compared practitioners’ reports of feeling prepared or not prepared according to whether they provided an informal versus a formal suicide risk assessment with youth. The analysis indicated significant differences in preparedness levels according to the method used: 73% of those reporting use of formal assessments, versus approximately 50% of those using informal assessments, indicated confidence in their preparedness abilities (χ2 = 12.79, df = 1, Cramér’s V = .206, p < .001). A further analysis indicated similar significant differences in practitioner confidence levels between those conducting informal, non-structured suicide risk assessments and those using formal assessments (χ2 = 23.54, df = 1, Cramér’s V = .280, p < .001); 95.6% of practitioners using formal suicide risk assessments reported higher levels of confidence versus 70.1% of practitioners using informal, non-structured suicide risk assessments.
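
For readers who wish to see how statistics of this kind are computed, the sketch below builds a 2 x 2 contingency table (assessment method by collapsed preparedness rating) and derives a chi-square statistic and Cramér’s V in Python. The cell counts are hypothetical because the article reports only the summary statistics, and scipy is assumed to be available; the sketch is illustrative, not a reproduction of the study’s analysis.

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2 x 2 table: rows are assessment method (formal, informal);
# columns are collapsed preparedness ratings (prepared, not prepared).
# Cell counts are illustrative only and are not taken from the article.
table = np.array([[66, 24],     # formal assessment users
                  [107, 106]])  # informal, non-structured interview users

chi2, p, df, _ = chi2_contingency(table, correction=False)

n = table.sum()
k = min(table.shape)                       # smaller dimension of the table
cramers_v = np.sqrt(chi2 / (n * (k - 1)))  # effect size for the association

print(f"chi2({df}) = {chi2:.2f}, p = {p:.4f}, Cramer's V = {cramers_v:.3f}")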

 

To identify existing gaps, participants were asked to rank by priority the trainings they needed to increase their competency levels. The highest priorities were (1) receiving comprehensive training on warning signs, symptoms and suicidal behaviors, and (2) attending several suicide assessment workshops.

 

Discussion

 

The purpose of this study was to determine whether relationships existed among practitioners’ self-perceived levels of preparedness, levels of confidence and methods used when assessing for suicide risk in youth. A survey was designed to query participants representing a broad sampling of disciplines about their perceptions, experience and involvement in youth suicide risk assessment. The results of the survey were analyzed using chi-square tests to determine whether relationships existed among variables, including participant perceptions of feeling prepared and confident, and whether these perceptions were related to the methods used to determine suicide risk in youth.

 

Results of the survey indicated that a majority of the participants (86%) reported having worked with suicidal youth; however, inconsistencies emerged in participant responses related to the constructs of feeling prepared and confident in the assessment of suicide. The results suggested that preparedness and training in suicide assessment are linked to practitioner confidence levels when assessing for suicide risk among youth. This finding is supported by earlier research by Oordt and colleagues (2009), who reported that practitioner confidence in suicide assessment is primarily related to competency and training levels. The interrelationship between preparedness and confidence is often reflected in the practitioner’s ability to accurately estimate risk level; when either is lacking, the likelihood of omitting critical information increases, which may affect the estimate of suicide risk (Douglas & Ogloff, 2003; Singer & Slovak, 2011). These results represent an important finding and highlight existing gaps in practitioner preparation. The gaps may reflect a struggle for many university and college graduate degree programs to offer a more diversified curriculum (Allen, Burt, Bryan, Carter, Orsi, & Durkan, 2002) that includes courses specific to identifying, intervening in and assessing for suicide risk in youth (Schmitz et al., 2012).

 

The inconsistencies in participant responses related to feeling prepared and confident became apparent when participants rated themselves on working with a suicidal youth. Although over half of the respondents reported feeling well prepared and qualified in their ability, a much smaller percentage reported feeling very confident in working with suicidal youth (12%) or in their skill preparation to assess for suicide (15%). This finding may reflect a self-evaluation dilemma: wanting to report feeling prepared to work with a suicidal youth while, in actuality, not feeling prepared or confident to provide a suicide intervention or complete an assessment.

 

When practitioner responses related to preparedness and confidence were examined more broadly, additional inconsistencies appeared in self-reported feelings of preparedness and confidence when conducting a suicide intervention or assessment. Despite predominantly high levels of reported confidence, skill development and preparedness to determine whether a student or client was at imminent, high to moderate, or low risk, few participants (N = 90/27%) reported using a formal suicide assessment instrument. Most respondents (N = 213/64%) reported basing their clinical judgment solely on an informal, non-structured interview. Although practitioners reported feeling prepared and having a sense of confidence in assessing for suicide risk, basing clinical judgment on this method alone raises concerns. O’Connor and colleagues (2004) described how practitioner skill deficiencies in suicide assessment are commonly reflected in clinic notes such as “patient currently denies suicidal thoughts,” based on the practitioner’s impressionistic and subjective perceptions. Consistent with these identified training deficiencies in preparation, 52% (N = 174) of the participants reported receiving limited suicide intervention or assessment training in graduate coursework.

 

The participants in this study who reported using a formal suicide assessment, however, indicated feeling better prepared to conduct a suicide assessment than practitioners using an informal, non-structured interview. In addition, practitioners using a formal assessment reported greater confidence than practitioners using an informal, non-structured interview. When participants were asked to rank the training they needed to provide a more thorough suicide intervention, they identified skill deficiencies and training gaps in identifying warning signs and behaviors and in assessing for suicide using a suicide risk assessment instrument. These deficiencies pose serious concerns and competency challenges for practitioners charged with assessing suicide risk. Combining skill attributes, a guided interview and a diagnostic assessment synthesizes the available information and allows practitioners to determine risk level and base clinical judgment on a variety of sources (Rudd, 2006; Sullivan & Bongar, 2009). The skill deficiencies reflected across all disciplines represent significant training gaps. This study suggests the need for increased commitment by colleges and universities to prepare future practitioners to more effectively address the growing national youth suicide crisis.

 

Implications

 

Despite suicide being identified as a national public health priority, no significant reduction in suicide has been recorded in the past 50 years (Kung et al., 2008; National Action Alliance for Suicide Prevention, 2014). “With the majority of youth suicide deaths being preventable” (O’Connor, Platt, & Gordon, 2011, p. 581), ongoing and increasingly urgent calls for improving practitioner preparedness, confidence and competency skills continue to be neglected.

 

Each of the disciplines represented in this study is faced with the challenge of addressing and estimating suicide risk. This study highlighted the critical role of school counselors, who were identified by participants (53%) as the practitioners most likely to respond and provide a suicide assessment. Across a variety of disciplines and settings, participant responses suggest training deficiencies in levels of preparedness, confidence and exposure to formal assessment measures. Previous research has made strong recommendations to increase provisions for and training in suicide assessment. Despite these previous calls and recommendations to prepare practitioners, more attention is needed to address both previously and currently identified training deficiencies among practitioners.

 

Transitioning research into practice includes revisiting several recommendations identified by Schmitz et al. (2012), including the adoption of consistent core standards and competencies across disciplines by educational accrediting institutions. This may call for increased suicide-specific educational and training requirements beyond the baccalaureate level, including dissecting vignettes, role-playing, and exposing practitioners to several suicide assessment instruments and the interpretation of their results (Fenwick, Vassilas, Carter, & Haque, 2004). It also would include increased emphasis on recognizing the signs and symptoms of depression and of suicidal thoughts and behaviors, and on understanding potential next steps once a suicide risk level has been determined. In addition, to sustain these skills, state licensing boards can require continuing education specific to suicide identification, assessment and management. Rudd and colleagues (2008) emphasized practitioners receiving increased exposure to suicide assessment strategies through supervision. The prevailing need practitioners identified as a chief priority in this study was to become more familiar with the warning signs, symptoms and behaviors associated with suicide and with suicide assessment. The findings of this study offer future research opportunities to monitor the suicide training, preparation and continuing education requirements of the colleges, universities and licensing boards that govern and are responsible for producing competent practitioners.

 

Although attention has focused on practitioner training deficits in the identification and assessment of youth suicide, future studies also are warranted on the measurement and impact of existing suicide prevention training programs that may provide opportunities for practitioners to increase their skills in these areas. Another area meriting future study is a national sampling of school counselor preparation in suicide identification and assessment and exposure to assessment tools. In this study, school counselors were identified as the practitioners most likely to be called upon to provide an initial suicide intervention or assessment, given their access to a large number of youth. This is a valuable finding, highlighting the call for increased and expanded counselor education, training and preparation in suicide risk identification and assessment in graduate school.

 

Limitations

 

     Providing a suicide intervention or assessment involves many complex issues, and the many variables paralleling these efforts could not all be assessed in this study. The study was intended to explore current levels of practitioner preparedness, confidence and the methods used to assess youth suicide risk. There are notable limitations to the current study; therefore, caution is warranted regarding the generalizability of the findings.

 

Although the Internet gave the researcher broad access to the targeted participants and disciplines, this method did not allow a response rate to be calculated. In addition, previous Internet survey research (W. C. Schmidt, 1997) has noted that participants may submit multiple responses, although ethical practice instructions and consent procedures for completing this survey were provided. In order to reach participants from multiple disciplines, the survey was administered online as a self-report measure, and self-report instruments, including surveys, inherently carry participant response bias, which may be reflected in answering questions in a socially desirable or expected manner (Heppner, Wampold, & Kivlighan, 2007). In addition, online surveys can be submitted containing omitted and blank responses (Sue & Ritter, 2012).

 

As previously noted, the Child and Adolescent Suicide Intervention Preparedness Survey used in this study was adapted from two previous research surveys (Debski et al., 2007; Stein-Erichsen, 2010). Survey questions were created and adapted to measure participant constructs in the assessment of youth suicide, but the adapted instrument has not itself been psychometrically validated. The use of a psychometrically sound survey instrument would be ideal for future research and replication.

 

Conclusion

 

The findings from this study identify significant interrelationships among practitioners’ self-perceived feelings of preparedness, confidence levels and methods used to assess for suicide risk among youth. The self-reported feelings of being prepared and confident seem to contradict the methods used to obtain a suicide risk level. This finding suggests that many practitioners are well intentioned but lack the necessary skills to conduct a thorough suicide risk assessment. The majority of practitioners participating in this study reported conducting suicide risk assessments using an informal, non-structured interview to formulate a risk level rather than a formalized suicide risk assessment instrument. Prior experience with and exposure to suicide risk assessment instruments, along with increased emphasis on suicide-specific training in graduate school curricula, can help practitioners feel better prepared, feel more confident and use more effective methods to determine a youth’s suicide risk level. At present, gaps in training are typically addressed by practitioners seeking out additional training and workshops on their own. Colleges and universities must make efforts to increase competency skills in this area if the growing number of youth suicides is ever to be reduced. The findings from this study support the limited previous research sounding urgent calls to better prepare practitioners, especially school counselors, to identify youth presenting with suicidal thoughts or behaviors.

 

 

Conflict of Interest and Funding Disclosure

The author reported no conflict of interest or funding contributions for the development of this manuscript.

 

 

 

 

References

 

Al-Damarki, F. R. (2004). Counselor training, anxiety, and counseling self-efficacy: Implications for training psychology students from the United Arab Emirates University. Social Behavior and Personality, 32, 429–439. doi:10.2224/sbp.2004.32.5.429

Allen, M., Burt, K., Bryan, E., Carter, D., Orsi, R., & Durkan, L. (2002). School counselor’s preparation for and participation in crisis intervention. Professional School Counseling, 6, 96–102.

Boudreaux, E. D., & Horowitz, L. M. (2014). Suicide risk screening and assessment: Designing instruments with dissemination in mind. American Journal of Preventive Medicine, 47(32), 163–169. doi:10.1016/j.amepre.2014.06.005

Bryan, C. J., & Rudd, D. M. (2006). Advances in the assessment of suicide risk. Journal of Clinical Psychology, 62, 185–200.

Centers for Disease Control and Prevention. (2014a). Fatal injury data. Web-based injury statistics query and reporting system (WISQARS). Retrieved from http://www.cdc.gov/injury/wisqars/index.html

Centers for Disease Control and Prevention. (2014b). Youth risk behavior surveillance, United States, 2013. Morbidity and Mortality Weekly Report, 63(SS-4), 1–168.

Cramer, R. J., Johnson, S. M., McLaughlin, J., Rausch, E. M., & Conroy, M. A. (2013). Suicide risk assessment training for psychology doctoral programs. Training and Education in Professional Psychology, 7, 1–11.

Debski, J., Spadafore, C. D., Jacob, S., Poole, D. A., & Hixson, M. D. (2007). Suicide intervention: Training, roles and knowledge of school psychologists. Psychology in the Schools, 44, 157–170. doi:10.1002/pits.20213

Dexter-Mazza, E. T., & Freeman, K. A. (2003). Graduate training and the treatment of suicidal clients: The students’ perspective. Suicide and Life-Threatening Behavior, 33, 211–218.

Douglas, K. S., & Ogloff, J. R. P. (2003). The impact of confidence on the accuracy of structured professional and actuarial violence risk judgments in a sample of forensic psychiatric patients. Law and Human Behavior, 27, 573–587.

Drapeau, C. W., & McIntosh, J. L. (2014). U.S.A. suicide 2012: Official final data. Washington, DC: American Association of Suicidology.

Feldman, B. N., & Freedenthal, S. (2006). Social work education in suicide intervention and prevention: An unmet need? Suicide and Life-Threatening Behavior, 36, 467–480.

Fenwick, C. D., Vassilas, C. A., Carter, H., & Haque, M. S. (2004). Training health professionals in the recognition, assessment and management of suicide risk. International Journal of Psychiatry in Clinical Practice, 8, 117–121. doi:10.1080/13651500410005658

Goldston, D. B. (2003). Measuring suicidal behavior and risk in children and adolescents. Washington, DC: American Psychological Association.

Heppner, P. P., Wampold, B. E., & Kivlighan, D. M., Jr. (2007). Research design in counseling: Research, statistics, and program evaluation (3rd ed.). Belmont, CA: Brooks/Cole.

Horowitz, L., Ballard, E., & Pao, M. (2009). Suicide screening in schools, primary care and emergency departments. Current Opinion in Pediatrics, 21, 620–627.

Jacobson, J. M., Ting, L., Sanders, S., & Harrington, D. (2004). Prevalence of and reactions to fatal and nonfatal client suicidal behavior: A national study of mental health social workers. Omega, 49, 237–248.

Jahn, D. R., Wacha-Montes, A., Drapeau, C. W., Grant, B., Nadorff, M. R., Pusateri, M. J., Jr., . . . Cukrowicz, K. C. (2012). Suicide-specific courses and training: Prevalence, beliefs, and barriers. Part I: Graduate psychology programs and professional schools of psychology. Manuscript in preparation.

Kleespies, P., Penk, W., & Forsyth, J. (1993). The stress of patient suicidal behavior during clinical training: Incidence, impact, and recovery. Professional Psychology: Research and Practice, 24, 293–303.

Kung, H. S., Hoyert, D. L., Xu, J., & Murphy, S. L. (2008). Deaths: Final data for 2005. National Vital Statistics Reports, 56, 1–66.

Mackelprang, J. L., Karle, J., Reihl, K. M., & Cash, R. E. (2014). Suicide intervention skills: Graduate training and exposure to suicide among psychology trainees. Training and Education in Professional Psychology, 8, 136–142.

McEvoy, M. L., & McEvoy, A. W. (2000). Preventing youth suicide: A handbook for educators and human service professionals. Holmes Beach, FL: Learning Publications.

McNiel, D. E., Fordwood, S. R., Weaver, C. M., Chamberlain, J. R., Hall, S. E., & Binder, R. L. (2008). Effects of training on suicide risk assessment. Psychiatric Services, 59, 1462–1465. doi:10.1176/appi.ps.59.12.1462

National Action Alliance for Suicide Prevention: Research Prioritization Task Force. (2014). A prioritized research agenda for suicide prevention: An action plan to save lives. Rockville, MD: National Institute of Mental Health.

O’Connor, R. C., Platt, S., & Gordon, J. (Eds.). (2011). International handbook of suicide prevention: Research, policy and practice. United Kingdom: John Wiley & Sons.

O’Connor, N., Warby, M., Raphael, B., & Vassallo, T. (2004). Changeability, confidence, common sense and corroboration: Comprehensive suicide risk assessment. Australasian Psychiatry, 12, 352–360.

Oordt, M. S., Jobes, D. A., Fonseca, V. P., & Schmidt, S. M. (2009). Training mental health professionals to assess and manage suicidal behavior: Can provider confidence and practice behaviors be altered? Suicide and Life-Threatening Behavior, 39, 21–32. doi:10.1521/suli.2009.39.1.21

Osteen, P. J., Frey, J. J., & Ko, J. (2014). Advancing training to identify, intervene, and follow up with individuals at risk for suicide through research. American Journal of Preventive Medicine, 47, 216–221. doi:10.1016/j.amepre.2014.05.033

Poland, S. (1989). Suicide intervention in the schools. New York, NY: The Guilford Press.

Poland, S., & Lieberman, S. (2002). Best practices in suicide intervention. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology IV – Volume II (pp. 1151–1165). Bethesda, MD: National Association of School Psychologists.

Rudd, M. D. (2006). The assessment and management of suicidality. Sarasota, FL: Professional Resource Press.

Rudd, M. D., Cukrowicz, K. C., & Bryan, C. J. (2008). Core competencies in suicide risk assessment and management: Implications for supervision. Training and Education in Professional Psychology, 2, 219–228.

Ruth, B. J., Muroff, J., Gianino, M., Feldman, B. N., McLaughlin, D., Ross, A., & Hill, E. (2009). Suicide prevention education in social work education: What do MSW deans, directors and faculty have to say? Paper session presented at the meeting of the American Public Health Association, Philadelphia, PA.

Sanders, S., Jacobson, J. M., & Ting, L. (2008). Preparing for the inevitable: Training social workers to cope with client suicide. Journal of Teaching in Social Work, 28, 1–18.

Schmidt, W. C. (1997). World-wide web survey research: Benefits, potential problems, and solutions. Behavior Research Methods, Instruments, & Computers, 29, 274–279.

Schmidt, R. C., Iachini, A. L., George, M., Koller, J., & Weist, M. (2015). Integrating a suicide prevention program into a school mental health system: A case example from a rural school district. Children & Schools, 37, 18–27.

Schmitz, W. M., Allen, M. H., Feldman, B. N., Gutin, N. J., Jahn, D. R., Kleespies, P. M., . . . Simpson, J. D. (2012). Preventing suicide through improved training in suicide risk assessment and care: An American Association of Suicidology task force report addressing serious gaps in U.S. mental health training. Suicide and Life-Threatening Behavior, 42, 292–304. doi:10.1111/j.1943-278X.2012.00090.x

Singer, J. B., & Slovak, K. (2011). School social workers’ experiences with youth suicidal behavior: An exploratory study. Children & Schools, 33, 215–228.

Smith, A. R., Silva, C., Covington, D. W., & Joiner, T. E. (2014). An assessment of suicide related knowledge and skills among health professionals. Health Psychology, 33, 110–119.

Stein-Erichsen, J. L. (2010). School psychologists’ confidence level with suicide intervention and prevention in the schools (Doctoral dissertation). Retrieved from http://digitalcommons.pcom.edu/psychology_dissertations/132

Sue, V. M., & Ritter, L. A. (2012). Conducting online surveys (2nd ed.). Thousand Oaks, CA: Sage.

Sullivan, G. R., & Bongar, B. (2009). Assessing suicide risk in the adult patient. In P. M. Kleespies (Ed.), Behavioral emergencies: An evidence-based resource for evaluating and managing risk of suicide, violence, and victimization (pp. 59–78). Washington, DC: American Psychological Association.

Tetlock, P. E. (2005). Expert political judgment: How good is it? How can we know? Princeton, NJ: Princeton University Press.

Wachter, C. A. (2006). Crisis in the schools: Crisis, crisis intervention training and school counselor burnout. ACES Research Grant. Association for Counselor Education and Supervision. Retrieved from http://libres.uncg.edu/ir/uncg/f/umi-uncg-1190.pdf

Wozny, D. A. (2005). Suicide risk assessment: Counselor competency. In J. R. Rodgers (Ed.), Suicide 2006: Proceedings of the 39th Annual Conference of the American Association of Suicidology (pp. 224–227). Washington, DC: American Association of Suicidology.

 

 

 

Robert C. Schmidt, NCC, is a Behavioral Specialist at Talbot County Public Schools in Easton, MD. Correspondence can be addressed to Robert C. Schmidt, Talbot County Public Schools, 12 Magnolia Street, Easton, MD 21601, rschmidt@tcps.k12.md.us.

Development and Factor Analysis of the Protective Factors Index: A Report Card Section Related to the Work of School Counselors

Gwen Bass, Ji Hee Lee, Craig Wells, John C. Carey, Sangmin Lee

The scale development and the exploratory and confirmatory factor analyses of the Protective Factors Index (PFI) are described. The PFI is a 13-item component of elementary students’ report cards that replaces typical items associated with student behavior. The PFI is based on the Construct-Based Approach (CBA) to school counseling, which proposes that the primary and secondary prevention activities of school counseling programs should focus on socio-emotional, development-related psychological constructs that are associated with students’ academic achievement and well-being, that have been demonstrated to be malleable, and that are within the range of expertise of school counselors. Teachers use the PFI to rate students’ skills in four construct-based domains that are predictive of school success. School counselors use teachers’ ratings to monitor student development and plan data-driven interventions.

 

Keywords: protective factors, factor analysis, school counselors, construct-based approach, student development

 

Contemporary models for school counseling practice (ASCA, 2012) emphasize the importance of school counselors using quantitative data related to students’ academic achievement to support professional decisions (Poynton & Carey, 2006), to demonstrate accountability (Sink, 2009), to evaluate activities and programs (Dimmitt, Carey, & Hatch, 2007), to advocate for school improvement (House & Martin, 1998) and to advocate for increased program support (Martin & Carey, 2014). While schools are data-rich environments and great emphasis is now placed on the use of data by educators, the readily available quantitative data elements (e.g., achievement test scores) are much better aligned with the work of classroom teachers than with the work of school counselors (Dimmitt et al., 2007). While teachers are responsible for students’ acquisition of knowledge, counselors are responsible for the improvement of students’ socio-emotional development in ways that promote achievement. Counselors need data related to students’ socio-emotional states (e.g., self-efficacy) and abilities (e.g., self-direction) that predispose them toward achievement so that they are better able to help students profit from classroom instruction and make sound educational and career decisions (Squier, Nailor, & Carey, 2014). Measures directly associated with constructs related to socio-emotional development are not routinely collected or used in schools. The development of sound and useful measures of salient socio-emotional factors that are aligned with the work of school counselors and that are strongly related to students’ academic success and well-being would greatly contribute to the ability of counselors to identify students who need help, use data-based decision making in planning interventions, evaluate the effectiveness of interventions, demonstrate accountability for results, and advocate for students and for program improvements (Squier et al., 2014).

 

Toward this end, we developed the Protective Factors Index (PFI) and describe herein its development and initial exploratory and confirmatory factor analyses. The PFI is a 13-item component of elementary students’ report cards that replaces typical items associated with student deportment. The PFI is based on the Construct-Based Approach (CBA) to school counseling (Squier et al., 2014), which holds that primary and secondary prevention activities of school counseling programs should be focused on socio-emotional development-related psychological constructs that research has shown to be strongly associated with students’ academic achievement and well-being, that have been demonstrated to be malleable, and that are within the range of expertise of school counselors. The CBA clusters these constructs into four areas reflecting motivation, self-direction, self-knowledge and relationship competence.

 

The present study was conducted as a collaboration between the Ronald H. Fredrickson Center for School Counseling Outcome Research and Evaluation and an urban district in the Northeastern United States. As described below, the development of the PFI was guided by the CBA-identified clusters of psychological states and processes (Squier et al., 2014). With input from elementary counselors and teachers, a 13-item report card and a scoring rubric were developed, such that teachers could rate each student on school counseling-related dimensions that have been demonstrated to underlie achievement and well-being. This brief measure was created with considerable input from the school personnel who would be implementing it, with the goal of targeting developmentally appropriate skills in a way that is efficient for teachers and useful for counselors. By incorporating the PFI into the student report card, we ensured that important and useful student-level achievement-related data could be easily collected multiple times per year for use by counselors. The purpose of this study was to explore relationships among the variables measured by the scale and to assess the factor structure of the instrument as the first step in establishing its validity. The PFI has the potential to become an efficient and accurate way for school counselors to collect data from teachers about student performance.

 

Method

 

Initial Scale Development

The PFI was developed as a tool to gather data on students’ socio-emotional development from classroom teachers. The PFI includes 13 items on which teachers rate students’ abilities related to four construct-based standards: motivation, self-direction, self-knowledge and relationships (Squier et al., 2014). These four construct clusters are believed to be foundational for school success (Squier et al., 2014). Specific items within a cluster reflect constructs that have been identified by research to be strongly associated with achievement and success.

 

The PFI assessment was developed through a collaborative effort between the research team and a group of district-level elementary school administrators and teachers. The process of creating the instrument involved an extensive review of existing standards-based report cards, socio-emotional indicators related to different student developmental levels, and rating scales measuring identified socio-emotional constructs. In addition, representatives from the district and members of the research team participated in a two-day summer workshop in August of 2013. These sessions included school counselors and teachers from each grade level, as well as a teacher of English language learners, a special education representative, and principals. All participants, except the principals, were paid for their time. Once the draft PFI instrument was completed, a panel of elementary teachers reviewed the items for developmental appropriateness and utility. The scale was then adopted across the district and piloted at all four (K–5) elementary schools during the 2013–2014 school year as a component of students’ report cards.

 

The PFI component of the report card consists of 13 questions, which are organized into four segments, based on the construct-based standards: motivation (4 items), self-direction (2 items), self-knowledge (3 items) and relationships (4 items). The items address developmentally appropriate skills in each of these domains (e.g., demonstrates perseverance in completing tasks, seeks assistance when needed, works collaboratively in groups of various sizes). The format for teachers to evaluate their students includes dichotomous response options: “on target” and “struggling.” All classroom teachers receive the assessment and the scoring rubric that corresponds to their grade level. The rubric outlines the observable behaviors and criteria that teachers should use to determine whether or not a student demonstrates expected, age-appropriate skills in each domain. Because the PFI instrument is tailored to address developmentally meaningful competencies, three rubrics were developed to guide teacher ratings at kindergarten and first grade, second and third grade, and fourth and fifth grade.
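To make this structure concrete, the sketch below shows one way the 13 dichotomous items and their scoring might be represented in code. The motivation and self-knowledge item numbers follow the factor structure reported later in this article; the split of the remaining items between self-direction and relationships, the variable names, and the helper function are illustrative assumptions, not the published instrument.

```python
# Minimal sketch of the PFI item-to-domain structure (assumed layout).
# Motivation and self-knowledge item numbers follow the reported factor
# analysis; the self-direction/relationships split of the remaining six
# items is an assumption for illustration only.
PFI_DOMAINS = {
    "motivation":     ["item_01", "item_02", "item_05", "item_10"],   # 4 items
    "self_direction": ["item_06", "item_07"],                          # 2 items (assumed)
    "self_knowledge": ["item_08", "item_09", "item_12"],               # 3 items
    "relationships":  ["item_03", "item_04", "item_11", "item_13"],    # 4 items (assumed)
}

# Dichotomous rubric-based rating: 1 = "on target", 0 = "struggling"
RATING = {"on target": 1, "struggling": 0}

def domain_scores(student_ratings: dict) -> dict:
    """Return the proportion of items rated 'on target' in each domain."""
    return {
        domain: sum(student_ratings[item] for item in items) / len(items)
        for domain, items in PFI_DOMAINS.items()
    }
```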

 

At the same time that the PFI scale was developed, the district began using a computer-based system to enter report card data. Classroom teachers complete the socio-emotional section of the standards-based report card electronically at the close of each marking period, when they also evaluate students’ academic performance. The data collected can be accessed and analyzed electronically by school administrators and counselors. Additionally, data from two marking periods during the 2013–2014 school year were exported to the research team for analysis (with appropriate steps taken to protect students’ confidentiality). These data were used in the exploratory and confirmatory factor analyses described in this paper.

 

Sample

The PFI was adopted across all four of the school district’s elementary schools, which house grades kindergarten through five. All elementary-level classroom teachers completed the PFI for each of the students in their classes. The assessment was completed three times during the 2013–2014 school year, in December, March and June; the December and March collections were used for the present analyses. Data from the December collection (N = 1,158) were used for the exploratory factor analysis (EFA), and data from the March collection were randomly divided into two subsamples (subsample A = 599 students; subsample B = 591 students) in order to perform the confirmatory factor analyses (CFAs).
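For illustration, a minimal pandas sketch of this sampling scheme might look like the following; the file names and the random seed are assumptions, not details from the study.

```python
# Sketch of the sampling scheme described above (file and column names assumed).
import pandas as pd

december = pd.read_csv("pfi_december.csv")   # hypothetical export, N = 1,158
march = pd.read_csv("pfi_march.csv")         # hypothetical export

# December ratings feed the EFA; March ratings are split at random
# into two roughly equal subsamples for the two CFAs.
cfa_subsample_a = march.sample(frac=0.5, random_state=42)
cfa_subsample_b = march.drop(cfa_subsample_a.index)

print(len(december), len(cfa_subsample_a), len(cfa_subsample_b))
```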

 

The sample for this study was highly diverse: 52% were African American, 17% were Asian, 11% were Hispanic, 16% were Caucasian, and the remaining students identified as multi-racial, Pacific Islander, Native Hawaiian, or Native American. In the EFA, 53.2% (n = 633) of the sample were male and 46.8% (n = 557) of the sample were female. Forty-seven kindergarten students (3.9%), 242 first-grade students (20.3%), 216 second-grade students (18.2%), 222 third-grade students (18.7%), 220 fourth-grade students (18.5%), and 243 fifth-grade students (20.4%) contributed data to the EFA.

 

The first CFA included data from 599 students, 328 males (54.8%) and 271 females (45.2%). The data included 23 kindergarten students (3.8%), 136 first-grade students (22.7%), 100 second-grade students (16.7%), 107 third-grade students (17.9%), 102 fourth-grade students (17.0%), and 131 fifth-grade students (21.9%). The data analyzed for the second CFA included assessments of 591 students, 305 males (51.6%) and 286 females (48.4%). The data consisted of PFI assessments from 24 kindergarten students (4.1%), 106 first-grade students (17.9%), 116 second-grade students (19.6%), 115 third-grade students (19.5%), 118 fourth-grade students (20.0%), and 112 fifth-grade students (19.0%).

 

Procedures

Classroom teachers completed PFI assessments for all students in their classes at the close of each marking period using the rubrics described above. Extraction of the data from the district’s electronic student data management system was coordinated by the district’s information technology specialist in collaboration with members of the research team. This process included establishing mechanisms to ensure confidentiality, and identifying information was removed from student records before the data were shared.

 

Data Analyses

The PFI report card data were analyzed in three phases. The first phase involved conducting an EFA on the data collected at the conclusion of the first marking period. In the second phase, half of the data compiled during the second marking period were randomly selected and subjected to a confirmatory factor analysis. Finally, the remaining half of the second marking period data were analyzed through another CFA.

 

Phase 1. Exploratory factor analysis. An initial EFA of the 13 items on the survey instrument was conducted using weighted least squares mean-adjusted (WLSM) estimation with an oblique Geomin rotation. The WLSM estimator appropriately uses tetrachoric correlation matrices when items are categorical (Muthén, du Toit, & Spisic, 1997). The EFA was conducted using Mplus version 5 (Muthén & Muthén, 1998–2007).
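All analyses reported here were run in Mplus. As a rough open-source analogue, the sketch below uses the Python factor_analyzer package; it substitutes minres extraction with an oblimin rotation for WLSM estimation with Geomin rotation, so its results would only approximate those reported. File and column names are assumed.

```python
# Approximate Python analogue of the EFA (the study itself used Mplus/WLSM/Geomin).
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = [f"item_{i:02d}" for i in range(1, 14)]                 # 13 dichotomous items
december = pd.read_csv("pfi_december.csv")[items].dropna()      # hypothetical export

efa = FactorAnalyzer(n_factors=3, rotation="oblimin", method="minres")
efa.fit(december)

eigenvalues, _ = efa.get_eigenvalues()
print(eigenvalues[:4])                                           # compare with reported values
print(pd.DataFrame(efa.loadings_, index=items).round(2))        # rotated factor loadings
```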

 

Model fit was assessed using several goodness-of-fit indices: comparative fit index (CFI), Tucker-Lewis Index (TLI), root mean square error of approximation (RMSEA), and standardized root mean square residual (SRMR). We assessed model fit based on the following recommended cutoff values from Hu and Bentler (1999): CFI and TLI values greater than 0.95, RMSEA value less than 0.06, and SRMR value less than 0.08.
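As a simple illustration of how these cutoffs were applied, the following hypothetical helper encodes the Hu and Bentler (1999) thresholds; it is a convenience sketch, not part of the analysis software used in the study.

```python
# Hu and Bentler (1999) cutoffs used to judge model fit in this study.
def acceptable_fit(cfi: float, tli: float, rmsea: float, srmr: float) -> bool:
    """True if all four indices meet the recommended thresholds."""
    return cfi > 0.95 and tli > 0.95 and rmsea < 0.06 and srmr < 0.08

# Example: the three-factor EFA solution reported in the Results section.
print(acceptable_fit(cfi=0.994, tli=0.988, rmsea=0.052, srmr=0.036))  # True
```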

 

Phase 2. First confirmatory factor analysis. An initial CFA was conducted on the 13 items from the survey instrument to assess a three-factor measurement model that was based on theory and on the results of the exploratory analysis. Figure 1 provides the conceptual path diagram for the measurement model. Six items (3, 4, 6, 7, 11 and 13) loaded on factor one (C1), named “academic temperament.” Three items (8, 9 and 12) loaded on factor two (C2), referred to as “self-knowledge.” Four items (1, 2, 5 and 10) loaded on factor three (C3), titled “motivation.” All three latent variables were expected to be correlated in the measurement model.
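The item-to-factor assignment can be expressed compactly in code. The sketch below uses factor_analyzer’s confirmatory module with its default maximum-likelihood estimation rather than the WLSMV estimator actually used in Mplus, so it illustrates only the model structure; file and column names are assumed.

```python
# Sketch of the three-factor CFA structure (estimation differs from Mplus/WLSMV).
import pandas as pd
from factor_analyzer import ConfirmatoryFactorAnalyzer, ModelSpecificationParser

items = [f"item_{i:02d}" for i in range(1, 14)]
march_a = pd.read_csv("pfi_march_subsample_a.csv")[items].dropna()   # hypothetical export

model_dict = {
    "academic_temperament": ["item_03", "item_04", "item_06", "item_07", "item_11", "item_13"],
    "self_knowledge":       ["item_08", "item_09", "item_12"],
    "motivation":           ["item_01", "item_02", "item_05", "item_10"],
}
spec = ModelSpecificationParser.parse_model_specification_from_dict(march_a, model_dict)
cfa = ConfirmatoryFactorAnalyzer(spec, disp=False)
cfa.fit(march_a.values)

# Standardized pattern of loadings: each item should load on its assigned factor.
print(pd.DataFrame(cfa.loadings_, index=items, columns=list(model_dict)).round(2))
```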

 

This CFA was used to assess the measurement model with respect to fit as well as convergent and discriminant validity. Large standardized factor loadings, which indicate strong inter-correlations among items associated with the same latent variable, support convergent validity. Discriminant validity is evidenced by correlations among the latent variables that are less than the standardized factor loadings; that is, the latent variables are distinct, albeit correlated (see Brown, 2006; Kline, 2011; Schumacker & Lomax, 2010).

 

The computer program Mplus 5 (Muthén & Muthén, 1998–2007) was used to conduct the CFA with weighted least squares mean- and variance-adjusted (WLSMV) estimation, a robust estimator for categorical data in CFA (Brown, 2006). For the CFA, Mplus provides fit indices for a given dimensional structure that can be interpreted in the same way as those obtained when conducting an EFA.

 

Phase 3. Second confirmatory factor analysis. A second CFA was conducted for cross-validation. This second CFA was conducted on the 13 items from the survey instrument to assess the three-factor measurement model supported by the first confirmatory factor analysis. The same computer program and estimation method were used to conduct the second CFA.


Results

 

Phase 1. Exploratory Factor Analysis

Complete descriptive statistics for the responses to each of the 13 items are presented in Table 1. The response categories for all questions are dichotomous and are identified in Table 1 as “On Target” or “Struggling,” while incomplete data are labeled “Missing.” A total of 1,158 surveys were analyzed through the EFA. The decision to retain factors was initially guided by visual inspection of the scree plot and the eigenvalues. The EFA yielded two factors with eigenvalues greater than one (first eigenvalue = 8.055, second = 1.666, third = 0.869). The scree test also supported retaining two factors, because two factors fell to the left of the point where the scree plot approached its asymptote. However, considering goodness-of-fit indices, the models specifying a three-factor structure and a four-factor structure fit the data well. Methodologists have suggested that “underfactoring” is more problematic than “overfactoring” (Wood, Tataryn, & Gorsuch, 1996). Thus, there was a need to arrive at a factor solution that balanced plausibility and parsimony (Fabrigar, Wegener, MacCallum, & Strahan, 1999).
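For readers who wish to visualize the retention decision, the snippet below simply re-plots the three reported eigenvalues as a scree plot against the eigenvalue-greater-than-one criterion; it uses matplotlib and adds nothing beyond the published values.

```python
# Scree plot of the reported EFA eigenvalues (values taken from the text above).
import matplotlib.pyplot as plt

eigenvalues = [8.055, 1.666, 0.869]          # first three eigenvalues from the EFA
plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, "o-")
plt.axhline(1.0, linestyle="--", label="Kaiser criterion (eigenvalue = 1)")
plt.xlabel("Factor")
plt.ylabel("Eigenvalue")
plt.legend()
plt.show()
```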

Methodologists (e.g., Costello & Osborne, 2005; Fabrigar et al., 1999) have indicated that when the number of factors to retain is unclear, conducting a series of analyses is appropriate. Therefore, two-, three-, and four-factor models were evaluated and compared to determine which model might best explain the data in the most parsimonious and interpretable fashion. In this case, the two-factor model was eliminated because it did not lend itself to meaningful interpretability. The four-factor model was excluded because one of the factors was related to only one item, which is not recommended (Fabrigar et al., 1999). Researchers evaluated models based on model fit indices, item loadings above 0.40 (Kahn, 2006), and interpretability (Fabrigar et al., 1999).

 

The three-factor measurement model fit the data well (RMSEA = 0.052, SRMR = 0.036, CFI = 0.994, TLI = 0.988, χ2 = 173.802, df = 42, p < 0.001). As shown in Table 2, the standardized factor loadings were large, ranging from 0.58 to 0.97. The first factor included six items. These items reflected students’ emotional self-control and their ability to maintain good social relationships in school (e.g., demonstrates resilience after setbacks and works collaboratively in groups of various sizes). This first factor was named “academic temperament.”
The second factor included three items. All of the items reflected the understanding that students have about their own abilities, values, preferences and skills (e.g., identifies academic strengths and abilities and identifies things the student is interested in learning). This second factor was named “self-knowledge.” The third factor included four items. All of the items reflected personal characteristics that help students succeed academically by focusing and maintaining energies on goal-directed activities (e.g., demonstrates an eagerness to learn and engages in class activities). This third factor was named “motivation.” The three-factor measurement model proved to have parsimony and interpretability.

 

The two-factor model did not fit the data as well as the three-factor model (RMSEA = 0.072, SRMR = 0.058, CFI = 0.985, TLI = 0.978, χ2 = 371.126, df = 53, p < 0.001). As shown in Table 2, the standardized factor loadings were large, ranging from 0.59 to 0.94. The first factor included seven items and conflated self-knowledge and motivation; for interpretability, it was more appropriate to differentiate these two constructs. The two-factor model thus provided relatively poor goodness-of-fit indices and interpretability.

 

The four-factor model fit the data slightly better than the three-factor model (RMSEA = 0.035, SRMR = 0.023, CFI = 0.998, TLI = 0.995, χ2 = 76.955, df = 32, p < 0.001). As shown in Table 2, the standardized factor loadings were large, ranging from 0.54 to 1.01. However, the first factor included only one item, and a retained factor should include at least three items with salient loadings (Fabrigar et al., 1999), so the first factor was removed. The second factor comprised six items that all relate to the construct of academic temperament. The third factor included four items that reflect motivation. The fourth factor was composed of three items that relate to self-knowledge. Although the four-factor model was strong in terms of goodness-of-fit indices, its first factor could not be retained because it involved only one item. Therefore, across this series of analyses, the three-factor model was selected as the most appropriate.

 

Phase 2. First Confirmatory Factor Analysis

Complete descriptive statistics for the items are presented in Table 3. The responses for all items were dichotomous. A total of 569 (95.0%) of 599 surveys were completed and were used in the first CFA.

 

 

 

 

The three-factor measurement model provided good fit to the data (RMSEA = 0.059, CFI = 0.974, TLI = 0.984, χ2 = 104.849, df = 35, p < 0.001). Table 4 reports the standardized factor loadings, which can be interpreted as correlation coefficients, for the three-factor model. The standardized factor loadings were statistically significant (p < 0.001) and sizeable, ranging from 0.72 to 0.94. The large standardized factor loadings support convergent validity in that each indicator was primarily related to the respective underlying latent variable. Table 5 reports the correlation coefficients among the three latent variables. The correlation coefficients were less than the standardized factor loadings, thus supporting discriminant validity.

 

 

 

Phase 3. Second Confirmatory Factor Analysis

Complete descriptive statistics for the items are presented in Table 3. Responses for all items were dichotomous. A total of 564 (95.4%) of 591 surveys had all items complete and were used in the second CFA.

 

The second CFA was conducted on the three-factor measurement model to cross-validate the results from the first CFA. The three-factor model provided acceptable fit to the data in this second CFA (RMSEA = 0.055, CFI = 0.976, TLI = 0.983, χ2 = 100.032, df = 37, p < 0.001). Table 4 reports the standardized factor loadings, which can be interpreted as correlation coefficients, for the three-factor model. The standardized factor loadings were statistically significant and large, ranging from 0.70 to 0.93. These large standardized factor loadings support convergent validity in that each indicator was primarily related to the respective underlying latent variable. Table 5 reports the correlation coefficients among the three latent variables. The correlation coefficients were less than the standardized factor loadings, supporting discriminant validity. Given these results, the three-factor model appears to be the most reasonable solution.

 

Discussion

 

The ASCA National Model (2012) for school counseling programs underscores the value of using student achievement data to guide intervention planning and evaluation. This requires schools to find ways to collect valid and reliable information that provides a clear illustration of students’ skills in areas that are known to influence academic achievement. The purpose of developing the PFI was to identify and evaluate socio-emotional factors that relate to students’ academic success and emotional health, and to use the findings to inform the efforts of school counselors. The factor analyses in this study were used to explore how teachers’ ratings of students’ behavior on the 13-item PFI scale clustered around specific constructs that research has shown are connected to achievement and underlie many school counseling interventions. Because the scoring rubrics are organized into three grade levels (kindergarten and first grade, second and third grade, and fourth and fifth grade), the behaviors associated with each skill are focused at an appropriate developmental level. This level of detail allows teachers to respond to questions about socio-emotional factors in ways that are consistent with behaviors that students are expected to exhibit at different ages and grade levels.

 

Considering parsimony and interpretability, the EFA and both CFAs resulted in the selection of a three-factor model as the best fit for the data. Through the EFA, we compared two-, three- and four-factor models; the three-factor model showed appropriate goodness-of-fit indices, item loadings and interpretability, and the two CFAs then cross-validated it. In this model, the fundamental constructs identified as associated with students’ academic behavior are “academic temperament,” “self-knowledge” and “motivation.” “Self-knowledge” and “motivation” correspond to two of the four construct clusters identified by Squier et al. (2014) as critical socio-emotional dimensions related to achievement. The “academic temperament” items reflected either self-regulation skills or the ability to engage in productive relationships in school; Squier et al. (2014) differentiated between the self-direction (including emotional self-regulation constructs) and relationship skills clusters.

 

Although not perfectly aligned, this factor structure of the PFI is consistent with the CBA model for clustering student competencies and corresponds to previous research on the links between construct-based skills and academic achievement. Teacher ratings on the PFI seemed to reflect their perceptions that self-regulation abilities and good relationship skills are closely related constructs. These results indicate that the PFI may be a useful instrument for identifying elementary students’ strengths and needs in terms of exhibiting developmentally appropriate skills that are known to influence academic achievement and personal well-being.

 

Utility of Results

The factor analyses conducted in this study suggest that the PFI yields meaningful data that can support data-based decision making and evaluation. This tool has implications for school counselors in their efforts to provide targeted support addressing the academic and socio-emotional needs of elementary school students. The PFI can be completed in conjunction with the academic report card, and it is minimally time-intensive for teachers. In addition to school-based applications, the socio-emotional information yielded is provided to parents along with their child’s academic report card, which has the potential to strengthen school–home connections and engage families in interventions, a practice known to benefit students. Finally, the instrument can help school counselors identify struggling students; create small, developmentally appropriate groups based on specific needs; work with teachers to address student challenges that are prevalent in their classrooms; evaluate the success of interventions; advocate for program support; and share their work with district-level administrators. The PFI could also serve as an early warning indicator, identifying students whose socio-emotional development predisposes them toward disengagement and underachievement.
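A hypothetical screening pass of this kind might look like the sketch below; the flagging threshold, file name and column names are illustrative assumptions and are not part of the PFI as implemented.

```python
# Illustrative early-warning screen: flag students rated "struggling" (0)
# on several PFI items for counselor follow-up. Threshold and columns assumed.
import pandas as pd

items = [f"item_{i:02d}" for i in range(1, 14)]
ratings = pd.read_csv("pfi_marking_period.csv")            # one row per student (hypothetical)
ratings["n_struggling"] = (ratings[items] == 0).sum(axis=1)

flagged = ratings.loc[ratings["n_struggling"] >= 4, ["student_id", "n_struggling"]]
print(flagged.sort_values("n_struggling", ascending=False))
```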

 

The PFI also may prove useful as a school counseling evaluation measure. Changes on PFI items (and perhaps on subscales related to the three underlying dimensions identified in the present study) could be used as data in the evaluation of school counseling interventions and programs. Such evaluations would be greatly facilitated by the availability of data that are both within the domain of school counselors’ work and known to be strongly related to achievement.

 

The findings offer promise in terms of practical implications for school personnel and parents. The analysis clearly identified “academic temperament,” “self-knowledge” and “motivation” as coherent factors, each of which prior research has shown to be foundational to school success. The results indicate that teachers’ ratings of students’ behavior align with the findings of existing research and, thus, that the instrument evaluates appropriate skills and constructs.

 

Implications for School Counselors

The PFI was developed as a data collection tool that could be easily integrated into schools for the purpose of assessing students’ development of skills that correspond to achievement-related constructs. Obtaining information about competencies that underlie achievement is critical for school counselors, who typically lead interventions that target such skills in an effort to improve academic outcomes. Many developmental school counseling curricula address skills that fall within the domains of “academic temperament,” “self-knowledge” and “motivation” (see http://www.casel.org/guide/programs for a list of socio-emotional learning programs). Teachers can complete the PFI electronically, at the same intervals as report cards and in a similarly user-friendly format; the PFI therefore facilitates regular communication between teachers and school counselors throughout the school year. Counselors can use the data to identify appropriate interventions and to monitor students’ responsiveness to school counseling curricula over time and across settings. Although not included in this analysis, school counselors could also examine correlations between PFI competencies and achievement to explore how academic outcomes relate to school counseling interventions and curricula, as sketched below.
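Such a follow-up analysis might be sketched as follows; the merged data file, the achievement measure and the domain score columns are hypothetical, since the present study did not include achievement data.

```python
# Illustrative only: correlate PFI domain scores with an achievement measure.
# The data file, achievement column and domain score columns are assumed.
import pandas as pd

df = pd.read_csv("pfi_with_achievement.csv")   # hypothetical merged data set
domains = ["academic_temperament", "self_knowledge", "motivation"]
print(df[domains + ["reading_score"]].corr()["reading_score"].round(2))
```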

 

Limitations and Further Study

Despite the promising findings from these factor analyses, further research is needed to confirm these results and to address the limitations of the present study. Additional studies are needed to confirm the reliability of PFI teacher ratings, and future research should explore inter-rater reliability. Further research also is needed to determine whether reliable and valid PFI subscales can be created based on the three dimensions found in the present study. Analyses of test–retest reliability, construct validity and subscale inter-correlations should be conducted to determine whether PFI subscales with adequate psychometric characteristics can be created. Subsequent studies should consider whether students identified by the PFI as being in need of intervention also are found by other measures to be in need of support. Another important direction for future research is to examine the relationships between teachers’ ratings of students’ socio-emotional skills on the PFI and the students’ academic performance. Establishing a strong link between the PFI and actual academic achievement is an essential step in documenting the potential utility of the index as a screening tool. Because this measure was developed to enhance data collection for data-based decision making, future research should also explore school counselors’ experiences with implementation, as well as qualitative reporting on the utility of PFI results for informing programming.

 

Although the present study suggests that the PFI in its current iteration is practically useful, researchers may consider altering the tool in subsequent iterations. One possible revision involves changing the format from dichotomous ratings to a Likert scale, which could allow teachers to evaluate student behavior with greater specificity and would benefit subscale construction. Another possible change is revising the rubrics to improve the examples of student behavior that correspond to each rating on the scale and to ensure that each relates accurately to expectations at each developmental level. Furthermore, most of the items on the current PFI examine externalizing behaviors, which raises the possibility that students who achieve at an academically average level, but who experience more internalizing difficulties (such as anxiety), might not be identified for intervention. Subsequent iterations of the PFI could include additional areas of assessment, such as ratings of school behavior indicative of internalized challenges. Finally, it will be important to evaluate school counselors’ use of the PFI to determine whether it provides the information needed for program planning and evaluation in an efficient, cost-effective fashion, as intended.

Conflict of Interest and Funding Disclosure

The authors reported no conflict of interest or funding contributions for the development of this manuscript.

 

 


References

 

American School Counselor Association. (2012). The ASCA National Model: A framework for school counseling programs (3rd ed.). Alexandria, VA: Author.

Brown, T. A. (2006). Confirmatory factor analysis for applied research. New York, NY: Guilford.

Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations for
getting the most from your analysis. Practical Assessment, Research & Evaluation, 10(7), 1–9.

Dimmitt, C., Carey, J. C., & Hatch, T. (Eds.) (2007). Evidence-based school counseling: Making a difference with data-driven practices. Thousand Oaks, CA: Corwin.

Fabrigar, L. R., Wegener, D. T., MacCallum, R. C., & Strahan, E. J. (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychological Methods, 4, 272–299. doi:10.1037//1082-989X.4.3.272

House, R. M., & Martin, P. J. (1998). Advocating for better futures for all students: A new vision for school
counselors. Education, 119, 284–291.

Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1–55. doi:10.1080/10705519909540118

Kahn, J. H. (2006). Factor analysis in counseling psychology research, training, and practice: Principles, advances, and applications. The Counseling Psychologist, 34, 684–718. doi:10.1177/0011000006286347

Kline, R. B. (2011). Principles and practice of structural equation modeling (3rd ed.). New York, NY: Guilford.

Martin, I., & Carey, J. (2014). Development of a logic model to guide evaluations of the ASCA National Model
for School Counseling Programs. The Professional Counselor, 4, 455–466. doi:10.15241/im.4.5.455

Muthén, B. O., du Toit, S. H. C., & Spisic, D. (1997). Robust inference using weighted least squares and quadratic estimating equations in latent variable modeling with categorical and continuous outcomes. Psychometrika, 75, 1–45.

Muthén, L. K., & Muthén, B. O. (1998–2007). Mplus user’s guide (5th ed.). Los Angeles, CA: Muthén & Muthén.

Poynton, T. A., & Carey, J. C. (2006). An integrative model of data-based decision making for school counseling. Professional School Counseling, 10, 121–130.

Schumacker, R. E., & Lomax, R. G. (2010). A beginner’s guide to structural equation modeling (3rd ed.). New York,
NY: Routledge.

Sink, C. A. (2009). School counselors as accountability leaders: Another call for action. Professional School Counseling, 13, 68–74. doi:10.5330/PSC.n.2010-13.68

Squier, K. L., Nailor, P., & Carey, J. C. (2014). Achieving excellence in school counseling through motivation, self-direction, self-knowledge and relationships. Thousand Oaks, CA: Corwin.

Wood, J. M., Tataryn, D. J., & Gorsuch, R. L. (1996). Effects of under- and overextraction on principal axis factor analysis with varimax rotation. Psychological Methods, 1, 354–365. doi:10.1037//1082-989X.1.4.354

 

 

Gwen Bass is a doctoral researcher at the Ronald H. Fredrickson Center for School Counseling Outcome Research at the University of Massachusetts. Ji Hee Lee is a doctoral student at Korea University in South Korea and a Center Fellow of the Ronald H. Fredrickson Center for School Counseling Outcome Research at the University of Massachusetts. Craig Wells is an Associate Professor at the University of Massachusetts. John C. Carey is a Professor of School Counseling and the Director of the Ronald H. Fredrickson Center for School Counseling Outcome Research at the University of Massachusetts. Sangmin Lee is an Associate Professor at Korea University. Correspondence can be addressed to Gwen Bass, School of Cognitive Science, Adele Simmons Hall, Hampshire College, 893 West Street, Amherst, MA 01002, gjbass@gmail.com.