
in their efforts, we employed an exploratory research design. Exploratory designs are used when there is limited prior research to warrant the examination of a directional hypothesis (Swedberg, 2020). Within the framework of an exploratory design, we developed a non-standardized instrument to answer the three research questions. Although this constitutes a limitation of the study, we endeavored to address validity concerns by following the principles of the tailored design method of survey research (Dillman, 2007). Prior to constructing the survey, we reviewed the extant literature on students’ COVID-19–related issues, school counselors’ roles, and professional issues. We also conducted a focus group (N = 7) with school counselors and school counseling supervisors from across the state in which the study was conducted to explore their perceptions of changes in student functioning, the strategies they had deployed to assist students, and the obstacles they had encountered. Focus group data were used to inform the development of survey items and to ensure the instrument covered relevant content. For example, the focus group provided expert insight into the non-counseling duties frequently assigned to counselors in the state, as well as the nature of students’ psychological, academic, and behavioral problems witnessed since the onset of COVID-19.

Before launching the survey, we piloted it with 19 school counselors in Tennessee to elicit feedback about its flow and coverage. Based on their responses, we added an item addressing universal intervention and edited the language of multiple items to align with state-specific terminology (e.g., “MTSS coordination” was expanded to “RTI2B/MTSS/PBIS coordinator” to reflect more state-recognized school counselor titles when operating in these capacities). The final survey consisted of 64 items in predominantly binary, checkbox, and Likert-scale formats.

Demographic items were informed by categories outlined by the U.S. Census, the Tennessee DOE, and inclusive practices for data collection (Fernandez et al., 2016). Twenty-one items gathered demographic data related to school counselor characteristics (e.g., age, race, gender), counseling program variables (e.g., caseload, division of time, non-counseling duties, fair-share responsibilities), and school variables (e.g., school level, Title I status, location, staffing patterns). SES was measured using a school’s designated Title I status, with response categories of “yes,” “no,” and “unsure.” Likewise, to determine whether school counselors dedicated 80% of their time to direct service, we created a multiple-choice item with the options of “yes,” “no,” and “unsure.” A concise description of the state guidelines was embedded in the survey to promote accurate responses to this item. We gathered data on counselors’ perspectives of their students’ current functioning in the areas of mental health, academics, social skills, and behavior through multiple-choice items on a 5-point scale ranging from “much better” to “much worse.” For each area of functioning, school counselors were required to indicate specific areas of concern via a checkbox item. Additionally, checkbox items were used to identify the strategies school counselors used to assist students, the barriers they encountered, and the resources they needed. As noted, these response categories were based on the extant literature and expert input.
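To make the measurement formats concrete, the sketch below shows one way responses to the “yes”/“no”/“unsure” items and to a checkbox item could be coded for analysis. It is an illustrative sketch only: the variable names and example checkbox options are hypothetical and do not come from the actual instrument, whose response categories were derived from the literature review and focus group described above.

```python
from typing import Dict, List

# Possible coding for the Title I (SES proxy) item; "unsure" is treated as
# indeterminate rather than forced into a yes/no value. This mapping is an
# assumption for illustration, not the study's actual coding scheme.
TITLE_I_CODES = {"yes": 1, "no": 0, "unsure": None}

def code_checkbox(selected: List[str], options: List[str]) -> Dict[str, int]:
    """Expand a single checkbox response into one binary indicator per option."""
    return {option: int(option in selected) for option in options}

# Hypothetical checkbox options for the "barriers encountered" item; the real
# options were drawn from the extant literature and expert input.
barrier_options = ["caseload size", "non-counseling duties", "limited family contact"]

print(TITLE_I_CODES["unsure"])                                   # None
print(code_checkbox(["non-counseling duties"], barrier_options))
# {'caseload size': 0, 'non-counseling duties': 1, 'limited family contact': 0}
```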
Cronbach’s alpha coefficients were computed to determine the reliability of the survey items in indicating overall post–COVID-19 student functioning as reported by school counselors. The coefficient across all four areas indicated that they were moderately related, with acceptable consistency (α = .653). When additional comparisons were made among the four constructs, two areas, behavior and social skills, were found to be more consistent (α = .705; Sheperis et al., 2020). These reliability scores likely reflect the exploratory design, which asked participants to respond to conceptually related but not convergent constructs (i.e., academics, mental health, social skills, and behavior). For example, a change in student academics would not necessarily signify a change in student mental health, and vice versa. Thus, participant responses would not necessarily be uniform across items measuring students’ mental health, academics, social skills, and behavior, which would in turn attenuate overall instrument consistency.
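As a concrete reference for the reliability analysis, the sketch below computes Cronbach’s alpha from a respondents-by-items matrix of 5-point ratings for the four functioning areas. The ratings and the assumed coding direction (1 = “much better” through 5 = “much worse”) are placeholders for illustration and do not reproduce the study’s data or results.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) matrix of item scores."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative ratings for the four areas (mental health, academics,
# social skills, behavior); placeholder values only, not the study's data.
ratings = np.array([
    [4, 3, 4, 4],
    [5, 4, 4, 5],
    [3, 3, 2, 3],
    [4, 4, 5, 4],
    [2, 3, 3, 2],
])

print(round(cronbach_alpha(ratings), 3))
```

Applying the same function to only the behavior and social skills columns corresponds to the two-item comparison reported above.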
