Moderation Effects of Supervisee Levels on the Relationship Between Supervisory Styles and the Supervisory Working Alliance

Dan Li

Supervisee development is integral to counselor training. Despite the general acknowledgement that supervisors adopt different styles when supervising counselor trainees at varying levels, there is a paucity of studies that (a) measure supervisee levels using reliable and valid psychometric instruments, other than a broad categorization of supervisees based on their training progression (e.g., master’s level vs. doctoral level, practicum vs. internship, counselor trainee vs. postgraduate); and (b) empirically document how the matching of supervisory styles and supervisee levels relates to supervision processes and/or outcomes. The supervisory working alliance is key to the supervision process and outcome. To test the hypothesized moderation effects of supervisee levels on the relationship between supervisory styles and the supervisory working alliance, the author performed a series (n = 16) of moderation analyses with a sample (N = 113) of master’s- and doctoral-level counseling trainees and practitioners. Results suggested that supervisee levels and their three indicators (self and other awareness, motivation, and autonomy) were statistically significant moderators under different contexts. These findings (a) revealed extra intricacies of the relationships among the study variables, (b) shed light on future research directions concerning supervisee development, and (c) encouraged supervisors to adopt a composite of styles to varying degrees to better foster supervisee growth.

Keywords: supervisee development, supervisory styles, supervisory working alliance, supervisee levels, moderation analyses

Clinical supervision is integral to promoting counseling supervisees’ learning (Goodyear, 2014), safeguarding the quality of professional services offered to supervisees’ clients, and gatekeeping the counseling profession (Bernard & Goodyear, 2019). Because supervisors and supervisees are two parties of the tripartite entity of supervision, literature has extensively documented supervisor characteristics (e.g., supervisory styles, self-disclosure, cultural humility), supervisee characteristics (e.g., professional development levels), and the relationship between the two (e.g., supervisory working alliance) as related to supervision processes and outcomes (King et al., 2020; Ladany, Walker, & Melincoff, 2001; Stoltenberg & McNeill, 2010).

Of these relationships, research has consistently revealed a positive correlation between supervisory styles and the supervisory working alliance (Efstation et al., 1990; Heppner & Handley, 1981; Ladany & Lehrman-Waterman, 1999; Ladany, Walker, & Melincoff, 2001). Although this direct positive correlation is theoretically appealing and statistically compelling, little if any research has further investigated its intricacy (e.g., whether the direction or strength of the relationship may change across contexts). Particularly, abundant supervision literature (Friedlander & Ward, 1984; Li et al., 2018; Li et al., 2019; Li, Duys, & Granello, 2020; Li, Duys, & Vispoel, 2020; Stoltenberg & McNeill, 2010) suggested the adoption of different supervision approaches when working with supervisees at various levels of professional development. Therefore, supervisee levels present a potential context in which to examine how supervisory styles relate to the supervisory working alliance.

However, supervisee levels are frequently conceptualized based on supervisees’ training progression (e.g., master’s level vs. doctoral level, practicum vs. internship, counselor trainee vs. postgraduate), which may not accurately approximate where supervisees are. As such, I adopted the Supervisee Levels Questionnaire-Revised (SLQ-R; McNeill et al., 1992), a reliable and valid psychometric instrument, to measure supervisee levels (collectively as an overall assessment and separately with their three indicators) in this study.

Supervisory Styles
     Supervisory styles embody a constellation of behavior patterns that supervisors exhibit in establishing a working relationship with supervisees (Hunt, 1971) and are related to the interactional pattern that supervisors foster in a direct or indirect manner (Munson, 1993). Specifically, supervisory styles encompass supervisors’ consistent focus in supervision, the manner in which they articulate their theoretical orientation, and the philosophy of practice and supervision and how it is communicated to supervisees (Munson, 1993). Friedlander and Ward (1984) identified three distinctive factors that correspond to three supervisory styles—attractive, interpersonally sensitive, and task-oriented—as measured by the Supervisory Styles Inventory (SSI) used in the present study. Attractive style supervisors appear to be warm, supportive, friendly, open, and flexible, denoting the collegial dimension of supervision; the interpersonally sensitive style is a relationship-oriented approach, and supervisors of this style tend to be invested, committed, therapeutic, and perceptive; and task-oriented supervisors are content-focused, goal-oriented, thorough, focused, practical, and structured (Friedlander & Ward, 1984). These styles resonate with the consultant, counselor, and teacher roles of the supervisor, respectively, in Bernard’s (1997) discrimination model.

Of the three styles, the interpersonally sensitive and task-oriented styles appear to be empirically distinct from one another and distinct from the attractive style (Shaffer & Friedlander, 2017). For instance, Li, Duys, and Vispoel (2020) studied 34 supervisory dyads and found the interpersonally sensitive style was the only discriminant variable, based on which supervisory dyads exhibited statistically different state-transitional patterns (i.e., movement patterns across six common supervision states). Earlier, Fernando and Hulse-Killacky (2005) also found this same style was the only predictor that uniquely and significantly explained supervisees’ satisfaction with supervision, but the task-oriented style was the only significant predictor in explaining supervisees’ perceived self-efficacy.

Supervisory Working Alliance
     Park et al.’s (2019) meta-analysis indicated that the supervisory working alliance was positively related to supervision outcome variables. Bordin (1983) first coined the concept of the supervisory working alliance as a parallel concept to the therapeutic working alliance and introduced the three aspects of the therapeutic working alliance to the alliance in supervision—mutual agreements on the goals, tasks, and bond—which laid the foundation for the adapted Working Alliance Inventory (WAI; Bahrick, 1989) for both supervisors and supervisees. Efstation et al. (1990) instead used three supervisor factors (client focus, rapport, and identification) and two supervisee factors (rapport and client focus) to conceptualize the supervisory working alliance in their Supervisory Working Alliance Inventory (SWAI). In view of the collinearity issue for the goal and task dimensions in the WAI (Hatcher et al., 2020), I adopted the SWAI in the present study.

The working alliance is one of the most robust predictors of outcome in psychotherapy (Norcross, 2011). Although such robust prediction has not been directly replicated in supervision between the supervisory working alliance and supervision outcomes (Goodyear, 2014), scholars (DePue et al., 2016; DePue et al., 2022) have found the supervisory working alliance to be related to the therapeutic working alliance. Specifically, supervisees’ perception of the supervisory working alliance was positively related to their perception of the therapeutic alliance (DePue et al., 2016). However, supervisees’ perception of the supervisory working alliance did not significantly contribute to clients’ perception of the therapeutic working alliance (DePue et al., 2016).

Supervisory Styles and the Supervisory Working Alliance
     Extensive research has documented a close relationship between supervisory styles and the supervisory working alliance (Efstation et al., 1990; Heppner & Handley, 1981; Ladany, Walker, & Melincoff, 2001; Shaffer & Friedlander, 2017). Broadly, as supervisees perceived a greater mixture of supervisory styles in their supervisors (i.e., higher ratings on all three styles; Ladany, Marotta, & Muse-Burke, 2001), supervisees were more likely to report a stronger supervisory working alliance (Li et al., 2021). Despite this global positive correlation, when scholars examined each style independently in relation to each dimension of the supervisory working alliance, such statistical significance was not consistent (Ladany, Walker, & Melincoff, 2001). For instance, in Ladany, Walker, and Melincoff’s (2001) study, participants’ perceptions of an attractive style uniquely and significantly accounted for their perceptions of the bond dimension in alliance, whereas both the interpersonally sensitive and task-oriented styles had this unique and significant association with the task dimension in alliance.

The Moderating Role of Supervisee Levels
     It is not uncommon for a counselor supervisor to start supervision with an expectation of a supervisory style to use (Hart & Nance, 2003). But supervisors have to decide what to address with the supervisee and adopt the most functional style (Bernard, 1997), which could be subject to a myriad of factors, such as contextual factors (Holloway, 1995), cultural considerations (Li et al., 2018), and supervisees’ developmental levels and needs (Friedlander & Ward, 1984; Stoltenberg & McNeill, 2010), among others. Particularly, in Friedlander and Ward’s (1984) study, supervisory styles were differentially related to supervisees’ experience levels. For example, supervisors reported that they were more task-oriented with practicum students but more attractive and interpersonally sensitive with internship students. This interaction effect was also echoed by practicum students’ higher ratings on the task-oriented style but lower ratings on the interpersonally sensitive style, compared to their internship counterparts (Friedlander & Ward, 1984). Similarly, in the study conducted by Li, Duys, and Granello (2020), supervisory dyads with less experienced supervisees tended to be more preoccupied with foundational competencies (e.g., counseling skills and theories, maintenance of standards of service) than dyads with more experienced supervisees. Consistently, more experienced supervisees in Li et al.’s (2019) study were more likely to display positive social emotional behaviors (e.g., self-disclosure, empathy, reflection of feelings, expanding on supervisors’ ideas, praise) in response to supervisors’ opinions, which in turn were more likely to elicit supervisors’ opinions that helped facilitate supervisees’ growth.

However, supervisees’ developmental levels were not always significantly associated with supervision processes or outcomes. For instance, in Bucky et al.’s (2010) study, doctoral-level supervisees’ ratings of supervisor characteristics related to the supervisory working alliance did not differ by developmental level. Nevertheless, researchers in that study (Bucky et al., 2010) gauged supervisees’ developmental levels by training progression (i.e., current level or year in the program), as is commonly practiced (e.g., practicum vs. internship), which may not accurately capture supervisees’ actual developmental levels. Alternatively, supervisee levels may simply not have been strikingly distinct within doctoral programs, at least in that sample. In this study, supervisee levels were conceptualized not only as an overall assessment of where supervisees are but also along three dimensions (self and other awareness, motivation, and autonomy) aligned with Stoltenberg and McNeill’s (2010) integrative developmental model (IDM), using the Supervisee Levels Questionnaire-Revised (SLQ-R; McNeill et al., 1992).

Statement of Purpose
     Although the literature evidenced the overall positive correlation between supervisory styles and the supervisory working alliance, the direction and strength of such a relationship in different contexts warrant additional attention. Particularly, supervisees’ developmental progression entails a flexible mixture of different supervisory styles, as suggested theoretically and empirically, but whether and how the relationship between supervisory styles and the supervisory working alliance may vary across supervisee levels calls for further investigation. To this end, the purpose of the current study was to test the potential moderation effects of supervisee levels on the relationship between supervisory styles and the supervisory working alliance.

Given that supervisees at earlier stages of professional development may need more guidance and support from supervisors, which necessitates a variety of supervision styles that are critical to their perception of the working alliance with their supervisors, I hypothesized that the positive relationship between supervisory styles and the supervisory working alliance would be more sensitive for supervisees at earlier stages of development, compared to their more experienced counterparts. In other words, the positive relationship would be stronger for supervisees at lower levels of professional development and weaker for supervisees at higher levels of professional development.

Method

Participants
     The data set of this study is part of a larger national quantitative study with a cross-sectional sample (Li et al., 2021). Yet, researchers have not yet examined supervisee levels, which are crucial to measuring supervisee development, using a robust psychometric instrument. The current sample comprised 113 participants (see Table 1), the majority being master’s-level (n = 54, 47.79%) or doctoral-level (n = 46, 40.71%) students. Approximately 17% of participants (n = 19) identified as post-master’s or post-doctoral practitioners or other. Some participants reported both a training and a practicing level (e.g., both doctoral student and post-master’s practitioner), so summing the frequencies across the three categories yields a total larger than 113. Most participants reported specialty areas in clinical mental health counseling (n = 53, 46.90%), school counseling (n = 43, 38.05%), and counselor education and supervision (n = 27, 23.89%). Because some participants indicated more than one specialty area, the percentages do not sum to 100.

In this sample, 90 participants were female (79.65%) and 23 were male (20.35%). At the time of completing the questionnaire, most fell in the 21–30 age range (n = 72, 63.72%), with 19 in the 31–40 range (16.81%), 13 in the 41–50 range (11.50%), and nine older than 50 (7.96%). Participants predominantly identified as White (n = 97, 85.84%), with eight as Asian (7.08%), five as Black or African American (4.42%), one as American Indian and Alaska Native (0.88%), one as biracial or multiracial (0.88%), and one indicating other (0.88%). Most participants reported their counseling experience as 1 year or less (n = 44, 38.94%) or longer than 3 years (n = 37, 32.74%), with the rest reporting in between (n = 32, 28.31%). See Table 1 for more detailed demographic information.

Procedure
     Upon receiving IRB approval, I collected data online through Qualtrics in 2017–2018. The recruitment criteria required that participants (a) be at least 18 years of age at the time of completing the survey and (b) be students or practitioners with supervision experience in the counseling field. I disseminated the recruitment post through several professional networks, including the Counselor Education and Supervision Network-Listserv (CESNET-L) and American Counseling Association (ACA) Connect. In addition to this convenience sampling, I also used snowball sampling, as participants were encouraged to share the recruitment post with anyone they thought might be eligible to participate. The recruitment post contained a survey link that directed potential participants to the informed consent webpage and then to the compiled questionnaire webpage.

Measures
Demographic Questionnaire
     The purpose of including this self-constructed Demographic Questionnaire was to report the basic demographic information of participants. Specifically, the questionnaire included the gender, age, race/ethnicity, length of counseling-related work experience, training/practicing level, and training or specialty area of participants.

Supervisory Styles Inventory
     The SSI (Friedlander & Ward, 1984) is a 33-item instrument used to measure the degree to which one endorses descriptors representative of each of the three dimensions of supervisory style: Attractive (7 items), Interpersonally Sensitive (8 items), and Task-Oriented (10 items), with the remainder as the filler items (8 items). Participants rate each item along a 7-point Likert scale from 1 (not very) to 7 (very). Higher scores in each dimension mean that one endorses descriptors of a certain supervisory style to a larger extent. Sample items for the Attractive, Interpersonally Sensitive, and Task-Oriented subscales are “supportive,” “perceptive,” and “didactic,” respectively.

Friedlander and Ward (1984) reported that Cronbach’s alphas for the three subscales, separately and combined, ranged from .76 to .93 (Ns ranging from 105 to 202). Additionally, the item–scale correlations ranged from .70 to .88 for the Attractive subscale, from .51 to .82 for the Interpersonally Sensitive subscale, and from .38 to .76 for the Task-Oriented subscale (N1 = 202, N2 = 183; Friedlander & Ward, 1984). The test-retest reliability (N = 32) was .92 for the combined scale and .94, .91, and .78 for the Attractive, Interpersonally Sensitive, and Task-Oriented subscales, respectively (Friedlander & Ward, 1984). They also reported convergent validity based on moderate to high positive relationships (ps < .001) between the SSI and Stenack and Dye’s (1982) measure of supervisor roles (i.e., consultant, counselor, and teacher; N = 90). In the present study, the Cronbach’s alpha was .96 for the Attractive style, .94 for the Interpersonally Sensitive style, .92 for the Task-Oriented style, and .96 for the entire measure.

Supervisory Working Alliance Inventory
     The SWAI (Efstation et al., 1990) is used to measure the relationship in counselor supervision. It has both the supervisor and supervisee forms. The supervisee form applied to the current study includes two scales: Rapport (12 items) and Client Focus (7 items). Supervisees indicate the extent to which the behavior described in each item seems characteristic of their work with their supervisors on a 7-point Likert scale, with 1 being almost never and 7 being almost always. Higher scores in the Rapport scale indicate a stronger perceived rapport with their supervisor, and higher scores in the Client Focus scale suggest more attention to issues related to the client in supervision. A sample item for the Rapport scale is “I feel free to mention to my supervisor any troublesome feelings I might have about him/her.” A sample item for the Client Focus scale is “I work with my supervisor on specific goals in the supervisory session.”

Efstation et al. (1990) reported that the alpha coefficient for the supervisee form was .90 for Rapport and .77 for Client Focus (N = 178). Moreover, the item–scale correlations ranged from .44 to .77 for Rapport and from .37 to .53 for Client Focus. They used the SSI to obtain initial estimates of convergent and divergent validity for the SWAI (Efstation et al., 1990). As expected, the Client Focus dimension of the SWAI showed a moderate correlation (r = .52) with the Task-Oriented style in the SSI supervisee form, but low correlations with the Attractive (r = .04) and Interpersonally Sensitive (r = .21) styles. The Rapport dimension of the SWAI had a negligible correlation (r < .00) with the Task-Oriented style of the SSI. In the present study, the Cronbach’s alpha was .95 for Rapport, .90 for Client Focus, and .96 for the entire scale.

Supervisee Levels Questionnaire-Revised
     The Supervisee Levels Questionnaire-Revised (SLQ-R; McNeill et al., 1992) is used to measure supervisees’ developmental levels (Stoltenberg & Delworth, 1987). It has 30 items developed around three dimensions: Self and Other Awareness (12 items), Motivation (8 items), and Dependency-Autonomy (10 items). Supervisees rate their current behavior along a 7-point Likert scale, with 1 representing never, 2 rarely, 3 sometimes, 4 half the time, 5 often, 6 most of the time, and 7 always. Higher scores (after reverse-scoring some of the items) in these dimensions reflect higher levels of supervisee development in Self and Other Awareness, Motivation, and Autonomy, respectively. A sample item for the Self and Other Awareness dimension is “I feel genuinely relaxed and comfortable in my counseling/therapy sessions”; a sample item (reverse-scored) for the Motivation dimension is “The overall quality of my work fluctuates; on some days I do well, on other days, I do poorly”; and a sample item for the Dependency-Autonomy dimension is “I am able to critique counseling tapes and gain insights with minimum help from my supervisor.”
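Because some SLQ-R items are negatively worded, they are reverse-scored before dimension scores are computed. On a Likert scale, this amounts to subtracting the raw rating from the sum of the scale endpoints; the following helper is a generic sketch (not part of the published scoring materials):

```python
def reverse_score(item: int, scale_min: int = 1, scale_max: int = 7) -> int:
    """Reverse-score a Likert item: on a 1-7 scale, 7 -> 1, 4 -> 4, 1 -> 7."""
    return scale_max + scale_min - item
```

For example, a rating of 2 on the reverse-scored Motivation item above would contribute 6 to the Motivation score.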

McNeill et al. (1992) reported that the Cronbach’s alpha coefficients of the SLQ-R (N = 105) were .83, .74, and .64 for the three subscales, respectively, and .88 for the total scores. To assess the construct validity of the SLQ-R, they examined the differences in subscale and total scores across the beginning, intermediate, and advanced groups. Hotelling’s test of significance indicated that the three groups differed significantly both on the total SLQ-R scores, F(2, 102) = 7.37, p < .001, and on a linear combination of SLQ-R subscale scores, F(6, 198) = 2.45, p < .026. In the present study, the Cronbach’s alpha was .89 for Self and Other Awareness, .85 for Motivation, .57 for Autonomy, and .91 for the entire measure.
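The internal-consistency coefficients reported throughout this section follow the standard Cronbach's alpha formula, alpha = k/(k - 1) x (1 - sum of item variances / variance of total scores). The sketch below illustrates the computation on a small, hypothetical rating matrix (not the study data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative 7-point ratings from five hypothetical respondents on four items
ratings = np.array([
    [7, 6, 7, 6],
    [5, 5, 6, 5],
    [3, 4, 3, 4],
    [6, 6, 5, 6],
    [2, 3, 2, 2],
])
print(round(cronbach_alpha(ratings), 2))  # → 0.97
```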

Data Analysis
     To thoroughly test the potential moderation effects of supervisee levels on the relationship between supervisory styles and the supervisory working alliance, I carried out three rounds of moderation analysis in which the supervisory working alliance was always the outcome variable. In the first round (n = 1), supervisory styles as a whole were the predictor, and supervisee levels as a whole were the moderator. The second round (n = 6) involved two series of analyses. In the first series (n = 3), each supervisory style was the predictor, and supervisee levels as a whole were the moderator. In the second series (n = 3), supervisory styles as a whole were the predictor, and each indicator of supervisee levels was the moderator. In the third round (n = 9), each supervisory style was the predictor, and each indicator of supervisee levels was the moderator. Figure 1 presents path diagrams of three rounds of tests and Table 2 lists all tested models (n = 16).
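Each tested model follows the standard moderated-regression form, Y = b0 + b1X + b2W + b3XW + e, where a significant interaction coefficient b3 indicates moderation. The sketch below illustrates this setup on simulated, mean-centered data (the variables and coefficient values are illustrative only; the study itself used PROCESS in SPSS):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 113

# Simulated, mean-centered predictor (a supervisory style) and moderator
# (supervisee levels); coefficients are illustrative, not study estimates.
x = rng.normal(size=n)
w = rng.normal(size=n)
y = 1.4 * x + 0.5 * w - 0.2 * x * w + rng.normal(scale=0.5, size=n)

# Moderated regression: y = b0 + b1*x + b2*w + b3*(x*w)
X = np.column_stack([np.ones(n), x, w, x * w])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3 = b
# b3 estimates the moderation effect; a negative b3 means the x-y slope
# weakens as the moderator increases.
```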

I followed up each significant moderation effect (n = 5) with a simple slopes analysis (Aiken & West, 1991) to interpret the nature of the interaction effect. The PROCESS v4.0 tool in SPSS was employed to perform all these analyses. A total of 166 potential participants accessed the survey, but only 113 of them completed all three study instruments (SSI, SWAI, and SLQ-R). To alleviate the impact of significantly incomplete responses, I removed the 53 respondents who left at least one instrument unanswered. The a priori power analysis via G*Power indicated that the minimum sample size would be 55 to detect an interaction effect with a medium effect size (f 2 = .15), given the desired statistical power level of .80 and Type I error rate of .05. As such, the ultimate sample size of 113 met this requirement.
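The G*Power computation can be approximated directly from the noncentral F distribution, with noncentrality lambda = f-squared x n and one numerator degree of freedom for the single tested interaction term. This is a sketch under those standard assumptions (not the G*Power software itself):

```python
from scipy.stats import f as f_dist, ncf

def power_interaction(n, f2=0.15, alpha=0.05, n_predictors=3):
    """Power to detect one interaction term in a moderated regression
    with n_predictors total predictors (x, w, and x*w)."""
    df1, df2 = 1, n - n_predictors - 1
    crit = f_dist.ppf(1 - alpha, df1, df2)      # critical F under H0
    return 1 - ncf.cdf(crit, df1, df2, f2 * n)  # noncentrality = f2 * n

n = 10
while power_interaction(n) < 0.80:
    n += 1
print(n)
```

Run as-is, the loop stops in the mid-50s, consistent with the reported minimum of 55.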

I assessed the linearity and homoscedasticity assumptions using the zpred vs. zresid plot, which did not show a systematic relationship between the predicted values and the errors in the model (Field, 2017). Provided that participants independently filled out the study survey, I held the assumption of independence that the errors in the model were not dependent on each other. Further screening detected 12 missing values scattered across the three scales, which accounted for 0.13% of the 9,266 possible values. To determine the nature of these missing values, I performed Little’s (1988) test, and the results signified that the values were missing completely at random (MCAR; χ2 = 884.185, df = 890, p = .549). Because multiple imputation (MI; Schafer, 1999) can provide unbiased and valid estimates of associations based on information from the available data and can handle MCAR (Pedersen et al., 2017), I adopted MI to replace the missing values before performing further analyses.
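Multiple imputation generates m completed data sets, analyzes each, and pools the m estimates with Rubin's rules: the pooled point estimate is the mean of the imputation-specific estimates, and the total variance adds the within-imputation variance to the (inflated) between-imputation variance. A minimal sketch of the pooling step, with illustrative numbers rather than study values:

```python
import numpy as np

def pool_rubin(estimates, variances):
    """Pool m imputation-specific estimates with Rubin's rules."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    q_bar = estimates.mean()       # pooled point estimate
    u_bar = variances.mean()       # within-imputation variance
    b = estimates.var(ddof=1)      # between-imputation variance
    t = u_bar + (1 + 1 / m) * b    # total variance of the pooled estimate
    return q_bar, t

# Illustrative slope estimates and their variances from m = 5 imputed data sets
q, t = pool_rubin([1.40, 1.42, 1.39, 1.41, 1.38],
                  [0.010, 0.011, 0.009, 0.010, 0.012])
```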

Results
Results of this study in part supported my broad hypothesis that the positive relationship between supervisory styles and the supervisory working alliance would be more sensitive for supervisees at earlier stages of development, compared to their more experienced counterparts. Examining each supervisory style and each indicator of supervisee levels independently revealed the intricacy of the relationship between the two constructs.

There were two groups of major findings. First, supervisee levels as a whole were a significant moderator between the interpersonally sensitive style and the supervisory working alliance according to supervisees’ perceptions, ΔR2 = .0272, F(1, 109) = 7.8551, p = .006, with a small to medium effect size (f 2 = .07; Lorah & Wong, 2018). Specifically, the strength of the relationship between the interpersonally sensitive style and the supervisory working alliance differed based on supervisee levels (see Table 3).

In view of this significant moderation effect, I conducted a simple slopes analysis as a follow-up, which indicated that the simple slopes for 1 standard deviation (SD) below the mean, at the mean, and 1 SD above the mean of supervisee levels were 1.6185, 1.4019, and 1.1853, respectively (see Figure 2). In other words, the interpersonally sensitive style and the supervisory working alliance were positively associated (B = 1.4019, p < .001), but the strength of this correlation decreased as supervisees reported higher levels of professional development. It is worth noting that supervisees at higher developmental levels tended to report a stronger supervisory working alliance in general, compared to those at lower levels. The linear model of the interpersonally sensitive style, supervisee levels, and the product of the two (interpersonally sensitive style × supervisee levels) explained 62.31% (p < .001) of the variance in the supervisory working alliance. A further look into the moderation effect of supervisee levels indicated that statistical significance consistently persisted as each indicator of supervisee levels (self and other awareness, motivation, and autonomy) was independently tested as a moderator between the interpersonally sensitive style and the supervisory working alliance (see Round 3 in Table 2).
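For centered variables, the simple slope of the alliance on the style at a given moderator value w is b1 + b3·w, evaluated at w = −1 SD, 0, and +1 SD. The sketch below reproduces the three reported slopes; b1 is the reported coefficient, whereas b3 and the moderator SD are hypothetical values chosen only so that their product matches the reported slope differences:

```python
# Illustrative coefficients from a moderated regression with centered variables:
# y = b0 + b1*x + b2*w + b3*x*w
b1 = 1.4019   # reported slope of the style at the moderator mean
b3 = -0.15    # hypothetical interaction coefficient
sd_w = 1.444  # hypothetical SD of the centered moderator (b3 * sd_w = -0.2166)

for label, w in [("-1 SD", -sd_w), ("mean", 0.0), ("+1 SD", sd_w)]:
    slope = b1 + b3 * w  # simple slope of the alliance on the style at this w
    print(f"{label}: {slope:.4f}")
# prints 1.6185, 1.4019, and 1.1853, the three reported simple slopes
```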

Figure 2
Moderation Effect of Supervisee Levels With the Interpersonally Sensitive Style on the Supervisory Working Alliance

Note. N = 113. Predictor = Interpersonally Sensitive Style; Moderator = Supervisee Levels; Outcome = Supervisory Working Alliance. The three lines of color represent three regressions with the interpersonally sensitive style as predictor and the supervisory working alliance as outcome at different supervisee levels. The blue regression line denotes the group in which supervisee levels were one standard deviation (SD) below the mean, the green denotes the group in which supervisee levels were at the mean, and the pink denotes the group in which supervisee levels were one SD above the mean.

The second major finding was about the task-oriented supervisory style. When the three indicators of supervisee levels were independently examined as moderators, it was found that self and other awareness moderated the relationship between the task-oriented style and the supervisory working alliance, ΔR2 = .0311, F(1, 109) = 5.0639, p = .0264, with a small to medium effect size (f 2 = .05; Lorah & Wong, 2018). Similar to the first group of findings, the strength of the relationship between the task-oriented style and the supervisory working alliance varied based on the level of supervisee self and other awareness (one indicator of supervisee levels; see Table 4). A simple slopes analysis signified a consistent pattern—the task-oriented style and the supervisory working alliance were positively correlated, but the strength of this relationship decreased as supervisees rated higher on self and other awareness (see Figure 3). Specifically, the simple slopes for one SD below the mean, at the mean, and one SD above the mean of supervisee self and other awareness were 1.2620, 0.9540, and 0.6460, respectively. The area below the moderator (self and other awareness) value of 13.3857 constituted a region of significance in which the relationship between the task-oriented style and the supervisory working alliance was significant (p < .05; Johnson & Neyman, 1936). The linear model of the task-oriented style, supervisee self and other awareness, and the product of the two (task-oriented style × self and other awareness) accounted for 33.13% (p < .001) of the variance in the supervisory working alliance.
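The Johnson–Neyman region of significance comprises the moderator values at which the simple slope's t statistic exceeds the critical value, where the slope at moderator value w is b1 + b3·w and its sampling variance is var(b1) + 2w·cov(b1, b3) + w²·var(b3). The grid-scan sketch below uses hypothetical coefficient and covariance values (not the study's estimates, whose boundary was 13.3857):

```python
import numpy as np
from scipy.stats import t as t_dist

# Hypothetical estimates: x slope (b1), interaction (b3), and their (co)variances
b1, b3 = 0.954, -0.02
v1, v3, c13 = 0.010, 0.0001, -0.0006
df = 109
t_crit = t_dist.ppf(0.975, df)     # two-tailed critical value at alpha = .05

w_grid = np.linspace(0, 30, 3001)  # candidate moderator values to scan
slope = b1 + b3 * w_grid           # simple slope at each moderator value
se = np.sqrt(v1 + 2 * w_grid * c13 + w_grid**2 * v3)
significant = np.abs(slope / se) >= t_crit
# The region of significance is the set of w values where `significant` is
# True; with these hypothetical numbers, the slope is significant only below
# a boundary around w = 26.
```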

Discussion
Findings of the present study corroborated the positive correlation between supervisory styles and the supervisory working alliance that has been consistently identified in the existing literature (Efstation et al., 1990; Heppner & Handley, 1981; Ladany & Lehrman-Waterman, 1999; Ladany, Walker, & Melincoff, 2001). The intricacy of this relationship was further explored, and the current study confirmed that the strength of such correlation varied across different contexts. Supervisee levels and their three indicators turned out to be significant moderators in five of the 16 tested models. Specifically, the positive correlation between the interpersonally sensitive style and the supervisory working alliance was stronger for supervisees at lower levels of professional development but weaker for supervisees at higher levels. Furthermore, this significant moderation effect existed not only when supervisee levels were viewed as an overarching construct but when each indicator of supervisee levels was independently examined. Moreover, this moderation pattern was echoed by the positive association between the task-oriented style and the supervisory working alliance, wherein the correlation was stronger for supervisees at lower levels of self and other awareness (one indicator of supervisee levels) but weaker for those at higher levels of self and other awareness. Notably, supervisees at higher developmental levels (whether assessed via the overall construct or its indicators) in all models with significant moderation effects reported a stronger supervisory working alliance than did their counterparts at lower levels.

According to developmental theories of supervision, supervisees broadly progress through a series of qualitatively different levels in the process of becoming effective counselors, despite myriad individual idiosyncrasies (Chagnon & Russell, 1995; Stoltenberg & McNeill, 2010). Entry-level supervisees typically focus on their own anxiety, their lack of skills and knowledge, and the likelihood that they are being regularly evaluated (Stoltenberg & McNeill, 2010). Accordingly, beginning supervisees identified supervisor care and concern as one of the most important supervisor variables to allow supervisees to take risks and grow (Jordan, 2007). As such, interpersonally sensitive supervisors who are invested, committed, therapeutic, and perceptive (Friedlander & Ward, 1984) would be easily perceived as relationship-oriented and helpful in rapport building (one indicator of the supervisory working alliance) for supervisees early on in their training. Similarly, task-oriented supervisors are content-focused, goal-oriented, thorough, focused, practical, and structured (Friedlander & Ward, 1984).

Figure 3
Moderation Effect of Self and Other Awareness With the Task-Oriented Style on the Supervisory Working Alliance

Note. N = 113. Predictor = Task-Oriented Style; Moderator = Self and Other Awareness; Outcome = Supervisory Working Alliance. The three lines of color represent three regressions with the task-oriented style as predictor and the supervisory working alliance as outcome at different levels of self and other awareness (one indicator of supervisee levels). The blue regression line denotes the group in which supervisee self and other awareness was one standard deviation (SD) below the mean, the green denotes the group in which supervisee self and other awareness was at the mean, and the pink denotes the group in which supervisee self and other awareness was one SD above the mean.
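The plotting convention in this note (regression lines at the moderator mean and one SD above and below it) corresponds to the simple-slopes, or pick-a-point, approach of Aiken and West (1991). The following is a minimal sketch with synthetic data and hypothetical coefficient values, not the study's actual data or results:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins (hypothetical values, for illustration only):
# X = task-oriented style, W = self and other awareness, Y = working alliance.
n = 113
X = rng.normal(0.0, 1.0, n)
W = rng.normal(0.0, 1.0, n)
Y = 4.0 + 0.6 * X + 0.3 * W - 0.2 * X * W + rng.normal(0.0, 0.5, n)

# Mean-center predictor and moderator before forming the product term
# (Aiken & West, 1991), then fit Y = b0 + b1*Xc + b2*Wc + b3*(Xc*Wc).
Xc, Wc = X - X.mean(), W - W.mean()
design = np.column_stack([np.ones(n), Xc, Wc, Xc * Wc])
b, *_ = np.linalg.lstsq(design, Y, rcond=None)

# Simple slope of Y on X at a chosen moderator value: b1 + b3*w.
# Evaluate at -1 SD, the mean, and +1 SD of W (the three lines in Figure 3).
sd = Wc.std(ddof=1)
for label, w in [("-1 SD", -sd), ("mean", 0.0), ("+1 SD", sd)]:
    print(f"simple slope at W {label}: {b[1] + b[3] * w:.2f}")
```

A negative interaction coefficient (b3) reproduces the reported pattern: the style–alliance slope is steepest for supervisees low on the moderator and flattens at higher moderator levels.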

Task-oriented supervisors can be perceived as particularly helpful and informative with respect to client focus (a second indicator of the supervisory working alliance) by beginning supervisees (as indicated by their lower self and other awareness), who commonly experience substantial anxiety or fear stemming from their lack of confidence in knowing what to do, being able to do it, and being evaluated by their clients or supervisors (Stoltenberg & McNeill, 2010).

Therefore, supervisees at lower levels of professional development were more likely to report a stronger supervisory working alliance as they perceived more interpersonally sensitive or task-oriented supervisor characteristics. As they progress to higher levels of development with accumulated knowledge, skills, and competencies, supervisees become more aware of clients and themselves, more intrinsically and consistently motivated, and more autonomous as practitioners (Stoltenberg & McNeill, 2010), which may in part explain why their ratings of the supervisory working alliance were less related to their perceptions of supervisor characteristics yet generally higher than those of supervisees at lower levels of development.

In the present study, the moderator of supervisee levels as a composite score was significant only when the interpersonally sensitive style was the predictor; the moderator of self and other awareness (one indicator of supervisee levels) was also significant when the task-oriented style was the predictor. These findings resonated with the existing literature in that, compared to the attractive style, the interpersonally sensitive and task-oriented styles tend to have stronger discriminating effects (Friedlander & Ward, 1984). For instance, practicum and internship students differed significantly in rating the task-oriented and interpersonally sensitive styles of their supervisors, but their perceptions of the attractive style were similar at both levels (Friedlander & Ward, 1984). Li, Duys, and Vispoel (2020) also found that supervisory state–transitional patterns differed significantly based on the interpersonally sensitive style but not the other two styles.

Implications for Clinical Supervision
     The supervisory working alliance is inextricably intertwined with supervisees’ willingness to disclose (Ladany et al., 1996), supervisee satisfaction with clinical supervision (Cheon et al., 2009; Ladany, Ellis, & Friedlander, 1999), supervisee work satisfaction and work-related stress (Sterner, 2009), and the therapeutic working alliance (DePue et al., 2016; DePue et al., 2022), among others. Nelson et al. (2001) proposed that a key task in early supervision is to build a strong supervisory working alliance that serves as a foundation for managing future potential dilemmas in supervision, and that the ongoing maintenance of this working alliance should be the supervisor’s responsibility throughout the supervisory relationship. Although the three supervisory styles appear to be clear-cut, with distinguishable characteristics and roles (Friedlander & Ward, 1984), supervisors are encouraged to adopt a composite of different styles to varying degrees to better serve supervisees’ needs. As revealed by the present study and the extant literature (Efstation et al., 1990; Ladany, Walker, & Melincoff, 2001; Li et al., 2021), supervisees were more likely to report a stronger supervisory working alliance when they perceived their supervisors to adopt a mixture of the three supervisory styles (i.e., higher overall ratings of supervisory styles).

In particular, beginning supervisees are characterized by a strong focus on self, extrinsic motivation, and high dependency on supervisors (Stoltenberg & McNeill, 2010). Supervisors’ emphases on relationship building (interpersonally sensitive style) and task focus (task-oriented style) would help build a safe, predictable supervision environment and enhance the working alliance with supervisees. Notably, although the correlations between the interpersonally sensitive or task-oriented style and the supervisory working alliance were stronger for beginning supervisees, this does not suggest that these styles would be ineffective in augmenting the alliance for supervisees at higher levels of professional development. The positive correlations still existed, albeit attenuated, for more advanced supervisees, and these supervisees reported higher levels of supervisory working alliance in general, which may imply that these styles help maintain the working alliance established early on in supervision.

Another point worth noting is that although no significant moderator was detected between the attractive style and the supervisory working alliance in the present study, the attractive style explained the most variance (68.1%, p < .001) in the supervisory working alliance, compared to the interpersonally sensitive (55.9%, p < .001) and task-oriented (24.1%, p < .001) styles. This finding suggests that the warm, supportive, friendly, open, and flexible features of attractive-style supervisors are foundational to building and maintaining the supervisory working alliance, a contribution that did not differ across supervisee levels. As such, supervisors are encouraged to bring these qualities to their supervision and to make them visible to supervisees.
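The variance-explained figures above are R² values from regressing the alliance measure on each style score. As a brief sketch with synthetic, hypothetical data: for a single predictor, R² equals the squared Pearson correlation, so an R² of .681 corresponds to r ≈ .83.

```python
import numpy as np

def r_squared(x, y):
    """Proportion of variance in y explained by a simple regression on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1.0 - residuals.var() / y.var()

# Synthetic data with a known linear relationship (hypothetical values).
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 200)
y = 0.8 * x + rng.normal(0.0, 0.5, 200)

# For one predictor, R^2 is identical to the squared Pearson r.
print(round(r_squared(x, y), 2), round(np.corrcoef(x, y)[0, 1] ** 2, 2))
```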

Limitations and Future Research
     This study is not exempt from limitations that may be addressed in future research. Although two moderators (supervisee levels, self and other awareness) were found to be significant in the present study, the effect sizes of both were small to medium (f² = .07 and f² = .05, respectively), lower than the medium effect size (f² = .15) assumed in the a priori power analysis. Given effect sizes of .07 and .05 for the moderation effect, achieving statistical power of .80 with an α error probability of .05 would require sample sizes of 115 and 159, respectively. Researchers need to be mindful when recruiting participants to ensure a sufficient sample size. Additionally, although supervisees were asked to respond to the questionnaires consistently based on their perceptions of one supervisor, a constellation of factors could have affected their perceptions—for example, the timing of a participant’s supervisee status (e.g., currently receiving supervision vs. received supervision in the past), the potential dual role that a participant may hold (e.g., a doctoral student who is both a supervisee and a supervisor), the level of supervision (e.g., practicum, internship), and the length of the supervisory relationship (e.g., 2 months vs. 2 years). Researchers in future studies could also collect more information about participants (e.g., geographic distribution) to help readers better contextualize study results. Also, the current data set was collected in 2017–2018 and thus could not capture more recent societal, cultural, political, and economic changes (e.g., the COVID-19 pandemic) that could have affected supervisee perceptions.
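The required sample sizes above can be reproduced, approximately, with a noncentral-F power computation. The sketch below assumes the conventional setup for testing a single interaction term (numerator df = 1, three model predictors, noncentrality λ = f²·N); it illustrates the arithmetic and is not the author's original power analysis.

```python
from scipy.stats import f as f_dist, ncf

def required_n(f2, alpha=0.05, power=0.80, n_predictors=3):
    """Smallest N at which the F-test of one added term (the interaction)
    reaches the target power, using noncentrality lambda = f2 * N."""
    n = n_predictors + 3  # smallest n with positive error df
    while True:
        df_error = n - n_predictors - 1
        f_crit = f_dist.ppf(1 - alpha, 1, df_error)
        achieved = 1 - ncf.cdf(f_crit, 1, df_error, f2 * n)
        if achieved >= power:
            return n
        n += 1

# Effect sizes reported above: f^2 = .07 and f^2 = .05.
print(required_n(0.07), required_n(0.05))
```

The smaller the anticipated f², the larger the sample needed, which is why the observed effect sizes of .07 and .05 demand more participants than the assumed .15.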

In the present study, the association between supervisory styles and the supervisory working alliance was examined in the context of different supervisee levels. Indeed, this alliance could be subject to many other factors, such as discussions of cultural variables in supervision (Gatmon et al., 2001), supervisor adherence to ethical guidelines (Ladany & Lehrman-Waterman, 1999), and relational supervision strategies (Shaffer & Friedlander, 2017), among others. Scholars may include more related variables to expand the current model so as to further disentangle the complex relationships among predictors of the supervisory working alliance.

Last, although multiple moderation effects identified in the present study were statistically significant and theoretically coherent, exactly how supervisees experience the supervisory working alliance in relation to different supervisory styles as they proceed through professional development remains less known. A longitudinal study tracking the same sample with repeated measures, or a qualitative inquiry into participants’ lived experiences of the targeted phenomenon, could enrich our understanding of the study variables in this research.


Conclusion

Although the positive correlation between supervisory styles and the supervisory working alliance is well documented in the existing literature, the present study examined this relationship specifically in the context of supervisee levels. Both supervisee levels (as a whole) and self and other awareness (one indicator of supervisee levels) appeared to be significant moderators under different contexts. These findings further revealed the intricacies embedded in the broad relationship between supervisory styles and the supervisory working alliance, pointed out future research directions concerning supervisee development, and encouraged supervisors to adopt a composite of styles to varying degrees to better support supervisee growth.

Conflict of Interest and Funding Disclosure
The authors reported no conflict of interest or funding contributions for the development of this manuscript.


References

Aiken, L. S., & West, S. G. (1991). Multiple regression: Testing and interpreting interactions. SAGE.

Bahrick, A. S. (1989). Role induction for counselor trainees: Effects on the supervisory working alliance (Order No. 9014392) [Doctoral dissertation, The Ohio State University]. ProQuest Dissertations and Theses Global.

Bernard, J. M. (1997). The discrimination model. In C. E. Watkins, Jr. (Ed.), Handbook of psychotherapy supervision (pp. 310–327). Wiley.

Bernard, J. M., & Goodyear, R. K. (2019). Fundamentals of clinical supervision (6th ed.). Pearson.

Bordin, E. S. (1983). A working alliance based model of supervision. The Counseling Psychologist, 11(1), 35–42.

Bucky, S. F., Marques, S., Daly, J., Alley, J., & Karp, A. (2010). Supervision characteristics related to the supervisory working alliance as rated by doctoral-level supervisees. The Clinical Supervisor, 29(2), 149–163.

Chagnon, J., & Russell, R. K. (1995). Assessment of supervisee developmental level and supervision environment across supervisor experience. Journal of Counseling & Development, 73(5), 553–558.

Cheon, H.-S., Blumer, M. L. C., Shih, A.-T., Murphy, M. J., & Sato, M. (2009). The influence of supervisor and supervisee matching, role conflict, and supervisory relationship on supervisee satisfaction. Contemporary Family Therapy, 31(1), 52–67.

DePue, M. K., Lambie, G. W., Liu, R., & Gonzalez, J. (2016). Investigating supervisory relationships and therapeutic alliances using structural equation modeling. Counselor Education and Supervision, 55(4), 263–277.

DePue, M. K., Liu, R., Lambie, G. W., & Gonzalez, J. (2022). Examining the effects of the supervisory relationship and therapeutic alliance on client outcomes in novice therapists. Training and Education in Professional Psychology, 16(3), 253–262.

Efstation, J. F., Patton, M. J., & Kardash, C. M. (1990). Measuring the working alliance in counselor supervision. Journal of Counseling Psychology, 37(3), 322–329.

Fernando, D. M., & Hulse-Killacky, D. (2005). The relationship of supervisory styles to satisfaction with supervision and the perceived self-efficacy of master’s-level counseling students. Counselor Education and Supervision, 44(4), 293–304.

Field, A. (2017). Discovering statistics using IBM SPSS statistics (5th ed.). SAGE.

Friedlander, M. L., & Ward, L. G. (1984). Development and validation of the Supervisory Styles Inventory. Journal of Counseling Psychology, 31(4), 541–557.

Gatmon, D., Jackson, D., Koshkarian, L., Martos-Perry, N., Molina, A., Patel, N., & Rodolfa, E. (2001). Exploring ethnic, gender, and sexual orientation variables in supervision: Do they really matter? Journal of Multicultural Counseling and Development, 29(2), 102–113.

Goodyear, R. K. (2014). Supervision as pedagogy: Attending to its essential instructional and learning processes. The Clinical Supervisor, 33(1), 82–99.

Hart, G. M., & Nance, D. (2003). Styles of counselor supervision as perceived by supervisors and supervisees. Counselor Education and Supervision, 43(2), 146–158.

Hatcher, R. L., Lindqvist, K., & Falkenström, F. (2020). Psychometric evaluation of the Working Alliance Inventory—Therapist version: Current and new short forms. Psychotherapy Research, 30(6), 706–717.

Heppner, P. P., & Handley, P. G. (1981). A study of the interpersonal influence process in supervision. Journal of Counseling Psychology, 28(5), 437–444.

Holloway, E. (1995). Clinical supervision: A systems approach. SAGE.

Hunt, D. E. (1971). Matching models in education: The coordination of teaching methods with student characteristics. Ontario Institute for Studies in Education, Monograph, 10, 87.

Johnson, P. O., & Neyman, J. (1936). Tests of certain linear hypotheses and their application to some educational problems. Statistical Research Memoirs, 1, 57–93.

Jordan, K. (2007). Beginning supervisees’ identity: The importance of relationship variables and experience versus gender matches in the supervisee/supervisor interplay. The Clinical Supervisor, 25(1–2), 43–51.

King, K. M., Borders, L. D., & Jones, C. T. (2020). Multicultural orientation in clinical supervision: Examining impact through dyadic data. The Clinical Supervisor, 39(2), 248–271.

Ladany, N., Ellis, M. V., & Friedlander, M. L. (1999). The supervisory working alliance, trainee self-efficacy, and satisfaction. Journal of Counseling & Development, 77(4), 447–455.

Ladany, N., Hill, C. E., Corbett, M. M., & Nutt, E. A. (1996). Nature, extent, and importance of what psychotherapy trainees do not disclose to their supervisors. Journal of Counseling Psychology, 43(1), 10–24.

Ladany, N., & Lehrman-Waterman, D. E. (1999). The content and frequency of supervisor self-disclosures and their relationship to supervisor style and the supervisory working alliance. Counselor Education and Supervision, 38(3), 143–160.

Ladany, N., Marotta, S., & Muse-Burke, J. L. (2001). Counselor experience related to complexity of case conceptualization and supervision preference. Counselor Education and Supervision, 40(3), 203–219.

Ladany, N., Walker, J. A., & Melincoff, D. S. (2001). Supervisory style: Its relation to the supervisory working alliance and supervisor self-disclosure. Counselor Education and Supervision, 40(4), 263–275.

Li, D., Duys, D. K., & Granello, D. H. (2019). Interactional patterns of clinical supervision: Using sequential analysis. Asia Pacific Journal of Counselling and Psychotherapy, 10(1), 70–92.

Li, D., Duys, D. K., & Granello, D. H. (2020). Applying Markov chain analysis to supervisory interactions. The Journal of Counselor Preparation and Supervision, 13(1), Article 6.

Li, D., Duys, D. K., & Liu, Y. (2021). Working alliance as a mediator between supervisory styles and supervisee satisfaction. Teaching and Supervision in Counseling, 3(3), Article 5.

Li, D., Duys, D. K., & Vispoel, W. P. (2020). Transitional dynamics of three supervisory styles using Markov chain analysis. Journal of Counseling & Development, 98(4), 363–375.

Li, D., Liu, Y., & Lee, I. (2018). Supervising Asian international counseling students: Using the integrative developmental model. Journal of International Students, 8(2), 1129–1151.

Little, R. J. A. (1988). A test of missing completely at random for multivariate data with missing values. Journal of the American Statistical Association, 83(404), 1198–1202.

Lorah, J. A., & Wong, Y. J. (2018). Contemporary applications of moderation analysis in counseling psychology. Journal of Counseling Psychology, 65(5), 629–640.

McNeill, B. W., Stoltenberg, C. D., & Romans, J. S. (1992). The Integrated Developmental Model of supervision: Scale development and validation procedures. Professional Psychology: Research and Practice, 23(6), 504–508.

Munson, C. E. (1993). Clinical social work supervision (2nd ed.). Haworth Press.

Nelson, M. L., Gray, L. A., Friedlander, M. L., Ladany, N., & Walker, J. A. (2001). Toward relationship-centered supervision: Reply to Veach (2001) and Ellis (2001). Journal of Counseling Psychology, 48(4), 407–409.

Norcross, J. C. (Ed.). (2011). Psychotherapy relationships that work: Evidence-based responsiveness (2nd ed.). Oxford University Press.

Park, E. H., Ha, G., Lee, S., Lee, Y. Y., & Lee, S. M. (2019). Relationship between the supervisory working alliance and outcomes: A meta-analysis. Journal of Counseling & Development, 97(4), 437–446.

Pedersen, A. B., Mikkelsen, E. M., Cronin-Fenton, D., Kristensen, N. R., Pham, T. M., Pedersen, L., & Petersen, I. (2017). Missing data and multiple imputation in clinical epidemiological research. Clinical Epidemiology, 9, 157–166.

Schafer, J. L. (1999). Multiple imputation: A primer. Statistical Methods in Medical Research, 8(1), 3–15.

Shaffer, K. S., & Friedlander, M. L. (2017). What do “interpersonally sensitive” supervisors do and how do supervisees experience a relational approach to supervision? Psychotherapy Research, 27(2), 167–178.

Stenack, R. J., & Dye, H. A. (1982). Behavioral descriptions of counseling supervision roles. Counselor Education and Supervision, 21(4), 295–304.

Sterner, W. (2009). Influence of the supervisory working alliance on supervisee work satisfaction and work-related stress. Journal of Mental Health Counseling, 31(3), 249–263.

Stoltenberg, C. D., & Delworth, U. (1987). Supervising counselors and therapists: A developmental approach. Jossey-Bass.

Stoltenberg, C. D., & McNeill, B. W. (2010). IDM supervision: An integrative developmental model for supervising counselors and therapists (3rd ed.). Routledge.

Dan Li, PhD, NCC, LSC (NC, K–12), is an assistant professor of counseling at the University of North Texas. Correspondence may be addressed to Dan Li, Welch Street Complex 2-112, 425 S. Welch St., Denton, TX 76201,

Validation of the Adapted Response to Stressful Experiences Scale (RSES-4) Among First Responders

Warren N. Ponder, Elizabeth A. Prosek, Tempa Sherrill


First responders are continually exposed to trauma-related events. Resilience is evidenced as a protective factor for mental health among first responders. However, there is a lack of assessments that measure the construct of resilience from a strength-based perspective. The present study used archival data from a treatment-seeking sample of 238 first responders to validate the 22-item Response to Stressful Experiences Scale (RSES-22) and its abbreviated version, the RSES-4, with two confirmatory factor analyses. Using a subsample of 190 first responders, correlational analyses of the RSES-22 and RSES-4 were conducted with measures of depressive symptoms, post-traumatic stress, anxiety, and suicidality, confirming convergent and criterion validity. The two confirmatory factor analyses revealed a poor model fit for the RSES-22; however, the RSES-4 demonstrated an acceptable model fit. Overall, the RSES-4 may be a reliable and valid measure of resilience for treatment-seeking first responder populations.

Keywords: first responders, resilience, assessment, mental health, confirmatory factor analysis


     First responder populations (i.e., law enforcement, emergency medical technicians, and fire rescue) are often repeatedly exposed to traumatic and life-threatening conditions (Greinacher et al., 2019). Researchers have concluded that such critical incidents could have a deleterious impact on first responders’ mental health, including the development of symptoms associated with post-traumatic stress, anxiety, depression, or other diagnosable mental health disorders (Donnelly & Bennett, 2014; Jetelina et al., 2020; Klimley et al., 2018; Weiss et al., 2010). In a systematic review, Wild et al. (2020) suggested the promise of resilience-based interventions to relieve trauma-related psychological disorders among first responders. However, they noted the operationalization and measurement of resilience as limitations to their intervention research. Indeed, researchers have conflicting viewpoints on how to define and assess resilience. For example, White et al. (2010) purported that popular measures of resilience rely on a deficit-based approach. Counselors operate from a strength-based lens (American Counseling Association [ACA], 2014) and may prefer measures with a similar perspective. Additionally, counselors are mandated to administer assessments with acceptable psychometric properties that are normed on populations representative of the client (ACA, 2014, E.6.a., E.7.d.). For counselors working with first responder populations, resilience may be a factor of importance; however, appropriately measuring the construct warrants exploration. Therefore, the focus of this study was to validate a strength-based measure of resilience among a sample of first responders.

Risk and Resilience Among First Responders

In a systematic review of the literature, Greinacher et al. (2019) described the incidents that first responders may experience as traumatic, including first-hand life-threatening events; secondary exposure and interaction with survivors of trauma; and frequent exposure to death, dead bodies, and injury. Law enforcement officers (LEOs) reported that the most severe critical incidents they encounter are making a mistake that injures or kills a colleague; having a colleague intentionally killed; and making a mistake that injures or kills a bystander (Weiss et al., 2010). Among emergency medical technicians (EMTs), critical incidents that evoked the most self-reported stress included responding to a scene involving family, friends, or others known to the crew and seeing someone dying (Donnelly & Bennett, 2014). Exposure to these critical incidents may have consequences for first responders. For example, researchers concluded that first responders may experience mental health symptoms as a result of stress-related, repeated exposure (Jetelina et al., 2020; Klimley et al., 2018; Weiss et al., 2010). Moreover, considering the cumulative nature of exposure (Donnelly & Bennett, 2014), researchers concluded that first responders are at increased risk for post-traumatic stress disorder (PTSD), depression, and generalized anxiety symptoms (Jetelina et al., 2020; Klimley et al., 2018; Weiss et al., 2010). Symptoms commonly experienced among first responders include those associated with post-traumatic stress, anxiety, and depression.

In a collective review of first responders, Kleim and Westphal (2011) determined a prevalence rate for PTSD of 8%–32%, which is higher than the general population lifetime rate of 6.8%–7.8% (American Psychiatric Association [APA], 2013; National Institute of Mental Health [NIMH], 2017). Some researchers have explored rates of PTSD by specific first responder population. For example, Klimley et al. (2018) concluded that 7%–19% of LEOs and 17%–22% of firefighters experience PTSD. Similarly, in a sample of LEOs, Jetelina and colleagues (2020) reported 20% of their participants met criteria for PTSD.

Generalized anxiety and depression are also prevalent mental health symptoms for first responders. Among a sample of firefighters and EMTs, 28% disclosed anxiety at moderate–severe and severe levels (Jones et al., 2018). Furthermore, the overall prevalence of generalized anxiety disorder among patrol LEOs was 17% (Jetelina et al., 2020). Additionally, first responders may be at higher risk for depression (Klimley et al., 2018), with estimated prevalence rates of 16%–26% (Kleim & Westphal, 2011). Comparatively, the past 12-month rate of major depressive disorder among the general population is 7% (APA, 2013). In a recent study, 16% of LEOs met criteria for major depressive disorder (Jetelina et al., 2020). Moreover, in a sample of firefighters and EMTs, 14% reported moderate–severe and severe depressive symptoms (Jones et al., 2018). Given these higher rates of distressful mental health symptoms, including post-traumatic stress, generalized anxiety, and depression, protective factors to reduce negative impacts are warranted.

     Broadly defined, resilience is “the ability to adapt to and rebound from change (whether it is from stress or adversity) in a healthy, positive and growth-oriented manner” (Burnett, 2017, p. 2). White and colleagues (2010) promoted a positive psychology approach to researching resilience, relying on strength-based characteristics of individuals who adapt after a stressor event. Similarly, other researchers explored how individuals’ cognitive flexibility, meaning-making, and restoration offer protection that may be collectively defined as resilience (Johnson et al., 2011).

A key element among definitions of resilience is one’s exposure to stress. Given their exposure to trauma-related incidents, first responders require the ability to cope or adapt in stressful situations (Greinacher et al., 2019). Some researchers have defined resilience as a strength-based response to stressful events (Burnett, 2017), in which healthy coping behaviors and cognitions allow individuals to overcome adverse experiences (Johnson et al., 2011; White et al., 2010). When surveyed about positive coping strategies, first responders most frequently reported resilience as important to their well-being (Crowe et al., 2017).

Researchers corroborated the potential impact of resilience for the population. For example, in samples of LEOs, researchers confirmed resilience served as a protective factor for PTSD (Klimley et al., 2018) and as a mediator between social support and PTSD symptoms (McCanlies et al., 2017). In a sample of firefighters, individual resilience mediated the indirect path between traumatic events and global perceived stress of PTSD, along with the direct path between traumatic events and PTSD symptoms (Lee et al., 2014). Their model demonstrated that those with higher levels of resilience were more protected from traumatic stress. Similarly, among emergency dispatchers, resilience was positively correlated with positive affect and post-traumatic growth, and negatively correlated with job stress (Steinkopf et al., 2018). These consistent associations of resilience with protection led researchers to develop resilience-based interventions. For example, researchers reported promising results from mindfulness-based resilience interventions for firefighters (Joyce et al., 2019) and LEOs (Christopher et al., 2018). Moreover, Antony and colleagues (2020) concluded that resilience training programs demonstrated potential to reduce occupational stress among first responders.

Assessment of Resilience
     Given the significance of resilience as a mediating factor in PTSD among first responders and as a promising basis for interventions when working with LEOs, a reliable means of measuring it among first responder clients is warranted. In a methodological review of resilience assessments, Windle and colleagues (2011) identified 19 different measures of resilience, 15 of which came from original development and validation studies, with four subsequent validation manuscripts building on an original assessment; none were developed with military or first responder samples.

Subsequently, Johnson et al. (2011) developed the Response to Stressful Experiences Scale (RSES-22) to assess resilience among military populations. Unlike deficit-based assessments of resilience, they proposed a multidimensional construct representing how individuals respond to stressful experiences in adaptive or healthy ways. Cognitive flexibility, meaning-making, and restoration were identified as key elements when assessing individuals’ characteristics connected to resilience when overcoming hardships. Initially, they validated a five-factor structure for the RSES-22 with military active-duty and reserve components. Later, De La Rosa et al. (2016) re-examined the RSES-22, discovered a unidimensional factor structure, and validated a shorter 4-item subset of the instrument, the RSES-4, again among military populations.
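For readers unfamiliar with the technique, a confirmatory factor analysis of a unidimensional model such as the RSES-4's estimates one loading and one uniqueness per item by minimizing a maximum-likelihood discrepancy between the observed and model-implied covariance matrices. The sketch below is a minimal illustration on a synthetic population covariance matrix with hypothetical loadings; applied CFA work would instead use dedicated SEM software (e.g., lavaan or semopy), which also reports fit indices such as CFI and RMSEA.

```python
import numpy as np
from scipy.optimize import minimize

def fit_one_factor(S):
    """ML loading estimates for a one-factor model, Sigma = lam lam' + diag(psi),
    given an observed covariance matrix S."""
    p = S.shape[0]

    def discrepancy(params):
        lam, log_psi = params[:p], params[p:]
        Sigma = np.outer(lam, lam) + np.diag(np.exp(log_psi))  # psi kept positive
        _, logdet = np.linalg.slogdet(Sigma)
        # ML fit function, up to constants: log|Sigma| + tr(S Sigma^-1)
        return logdet + np.trace(S @ np.linalg.inv(Sigma))

    start = np.concatenate([np.full(p, 0.5), np.full(p, -1.0)])
    result = minimize(discrepancy, start, method="L-BFGS-B")
    return np.abs(result.x[:p])  # the sign of the factor is arbitrary

# Population covariance implied by hypothetical standardized loadings.
true_loadings = np.array([0.8, 0.7, 0.6, 0.5])
S = np.outer(true_loadings, true_loadings) + np.diag(1 - true_loadings**2)
print(np.round(fit_one_factor(S), 2))
```

Because the input here is exactly the model-implied covariance matrix, the estimated loadings recover the hypothetical values; with real item data the residual misfit is what the reported fit indices summarize.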

It is currently unknown whether the performance of the RSES-4 generalizes to first responder populations. While there are some overlapping experiences between military populations and first responders in terms of exposure to trauma and high-risk occupations, the Substance Abuse and Mental Health Services Administration (SAMHSA; 2018) noted differences in training and types of risk. In the counseling profession, these populations are categorized together, as evidenced by the Military and Government Counseling Association division of ACA. There may also be dual identities within the populations: for example, Lewis and Pathak (2014) found that 22% of LEOs and 15% of firefighters identified as veterans. Although the similarities between the populations may be enough to theorize the use of the same resilience measure, validation of the RSES-22 and RSES-4 among first responders remains unexamined.

Purpose of the Study
     First responders are repeatedly exposed to traumatic and stressful events (Greinacher et al., 2019) and this exposure may impact their mental health, including symptoms of post-traumatic stress, anxiety, depression, and suicidality (Jetelina et al., 2020; Klimley et al., 2018). Though most measures of resilience are grounded in a deficit-based approach, researchers using a strength-based approach proposed resilience may be a protective factor for this population (Crowe et al., 2017; Wild et al., 2020). Consequently, counselors need a means to assess resilience in their clinical practice from a strength-based conceptualization of clients.

Johnson et al. (2011) offered a non-deficit approach to measuring resilience in response to stressful events associated with military service. Thus far, researchers have conducted analyses of the RSES-22 and RSES-4 with military populations (De La Rosa et al., 2016; Johnson et al., 2011; Prosek & Ponder, 2021), but not yet with first responders. While there are some overlapping characteristics between the populations, there are also unique differences that warrant research with discrete sampling (SAMHSA, 2018). In light of the importance of resilience as a protective factor for mental health among first responders, the purpose of the current study was to confirm the reliability and validity of the RSES-22 and RSES-4 when utilized with this population. We hypothesized that the measures would perform similarly among first responders; if so, the RSES-4 would offer counselors a brief assessment option for clinical practice that is both reliable and valid.


Method

Participants

     Participants in the current non-probability, purposive sample study were first responders (N = 238) seeking clinical treatment at an outpatient, mental health nonprofit organization in the Southwestern United States. Participants’ mean age was 37.53 years (SD = 10.66). The majority of participants identified as men (75.2%; n = 179), with women representing 24.8% (n = 59) of the sample. In terms of race and ethnicity, participants identified as White (78.6%; n = 187), Latino/a (11.8%; n = 28), African American or Black (5.5%; n = 13), Native American (1.7%; n = 4), Asian American (1.3%; n = 3), and multiple ethnicities (1.3%; n = 3). The participants identified as first responders in three main categories: LEO (34.9%; n = 83), EMT (28.2%; n = 67), and fire rescue (25.2%; n = 60). Among the first responders, 26.9% reported previous military affiliation. As part of the secondary analysis, we utilized a subsample (n = 190) that was reflective of the larger sample (see Table 1).

Procedure
     The data for this study were collected between 2015 and 2020 as part of routine clinical assessment procedures at a nonprofit organization serving military service members, first responders, frontline health care workers, and their families. Agency representatives conduct clinical assessments with clients at intake, Session 6, Session 12, and Session 18 or when clinical services conclude. We consulted with the second author’s Institutional Review Board, which determined the research to be exempt given the de-identified, archival nature of the data. For inclusion in this analysis, data needed to represent first responders, ages 18 or older, with a completed RSES-22 at intake. The RSES-4 comprises four items embedded within the RSES-22; therefore, participants did not need to complete an additional measure. For the secondary analysis, we also included data from participants who completed other mental health measures at intake (see Measures).


Table 1

Demographics of Sample

Characteristic                  Sample 1 (N = 238)   Sample 2 (n = 190)

Age (Years)
    Mean                        37.53                37.12
    Median                      35.50                35.00
    SD                          10.66                10.30
    Range                       46                   45
Time in Service (Years)
    Mean                        11.62                11.65
    Median                      10.00                10.00
    SD                           9.33                 9.37
    Range                       41                   39
                                n (%)                n (%)
First Responder Type
    Emergency Medical           67 (28.2%)           54 (28.4%)
    Fire Rescue                 60 (25.2%)           45 (23.7%)
    Law Enforcement             83 (34.9%)           72 (37.9%)
    Other                        9 (3.8%)             5 (2.6%)
    Two or More                 10 (4.2%)             6 (3.2%)
    Not Reported                 9 (3.8%)             8 (4.2%)
Gender
    Women                       59 (24.8%)           47 (24.7%)
    Men                         179 (75.2%)          143 (75.3%)
Race/Ethnicity
    African American/Black      13 (5.5%)             8 (4.2%)
    Asian American               3 (1.3%)             3 (1.6%)
    Latino(a)/Hispanic          28 (11.8%)           24 (12.6%)
    Multiple Ethnicities         3 (1.3%)             3 (1.6%)
    Native American              4 (1.7%)             3 (1.6%)
    White                       187 (78.6%)          149 (78.4%)

Note. Sample 2 is a subset of Sample 1. Time in service for Sample 1, n = 225;
time in service for Sample 2, n = 190.


Measures

Response to Stressful Experiences Scale
     The Response to Stressful Experiences Scale (RSES-22) is a 22-item measure assessing dimensions of resilience, including meaning-making, active coping, cognitive flexibility, spirituality, and self-efficacy (Johnson et al., 2011). Participants respond to the prompt “During and after life’s most stressful events, I tend to” on a 5-point Likert scale from 0 (not at all like me) to 4 (exactly like me). Total scores range from 0 to 88, with higher scores representing greater resilience. Example items include see it as a challenge that will make me better, pray or meditate, and find strength in the meaning, purpose, or mission of my life. Johnson et al. (2011) reported that the RSES-22 demonstrates good internal consistency (α = .92) and test-retest reliability (α = .87) among samples from military populations. Further, the developers confirmed convergent, discriminant, concurrent, and incremental criterion validity (see Johnson et al., 2011). In the current study, Cronbach’s alpha of the total score was .93.
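The scoring rule above (22 items rated 0–4, summed to a 0–88 total) can be sketched in code. This is an illustrative sketch, not the authors' scoring procedure, and the example responses are hypothetical.

```python
# Illustrative sketch of RSES-22 scoring (not the study's code).
# Each of the 22 items is rated 0 ("not at all like me") to 4 ("exactly like me"),
# so totals range from 0 to 88, with higher scores indicating greater resilience.

def score_rses22(responses):
    """Sum 22 item ratings after validating the 0-4 response range."""
    if len(responses) != 22:
        raise ValueError("RSES-22 requires exactly 22 item responses")
    if any(not 0 <= r <= 4 for r in responses):
        raise ValueError("Item ratings must fall between 0 and 4")
    return sum(responses)

# Hypothetical respondent endorsing most items moderately:
example = [3] * 15 + [2] * 7
print(score_rses22(example))  # 59, within the 0-88 range
```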

Adapted Response to Stressful Experiences Scale
     The adapted Response to Stressful Experiences Scale (RSES-4) is a 4-item measure to assess resilience as a unidimensional construct (De La Rosa et al., 2016). The prompt and Likert scale are consistent with the original RSES-22; however, it only includes four items: find a way to do what’s necessary to carry on, know I will bounce back, learn important and useful life lessons, and practice ways to handle it better next time. Total scores range from 0 to 16, with higher scores indicating greater resilience. De La Rosa et al. (2016) reported acceptable internal consistency (α = .76–.78), test-retest reliability, and demonstrated criterion validity among multiple military samples. In the current study, the Cronbach’s alpha of the total score was .74.

Patient Health Questionnaire-9
     The Patient Health Questionnaire-9 (PHQ-9) is a 9-item measure to assess depressive symptoms in the past 2 weeks (Kroenke et al., 2001). Respondents rate the frequency of their symptoms on a 4-point Likert scale ranging from 0 (not at all) to 3 (nearly every day). Total scores range from 0 to 27, in which higher scores indicate increased severity of depressive symptoms. Example items include little interest or pleasure in doing things and feeling tired or having little energy. Kroenke et al. (2001) reported good internal consistency (α = .89) and established criterion and construct validity. In this sample, Cronbach’s alpha of the total score was .88.

PTSD Checklist-5
     The PTSD Checklist-5 (PCL-5) is a 20-item measure for the presence of PTSD symptoms in the past month (Blevins et al., 2015). Participants respond on a 5-point Likert scale indicating frequency of PTSD-related symptoms from 0 (not at all) to 4 (extremely). Total scores range from 0 to 80, in which higher scores indicate more severity of PTSD-related symptoms. Example items include repeated, disturbing dreams of the stressful experience and trouble remembering important parts of the stressful experience. Blevins et al. (2015) reported good internal consistency (α = .94) and determined convergent and discriminant validity. In this sample, Cronbach’s alpha of the total score was .93.

Generalized Anxiety Disorder-7
     The Generalized Anxiety Disorder-7 (GAD-7) is a 7-item measure to assess for anxiety symptoms over the past 2 weeks (Spitzer et al., 2006). Participants rate the frequency of the symptoms on a 4-point Likert scale ranging from 0 (not at all) to 3 (nearly every day). Total scores range from 0 to 21 with higher scores indicating greater severity of anxiety symptoms. Example items include not being able to stop or control worrying and becoming easily annoyed or irritable. Among patients from primary care settings, Spitzer et al. (2006) determined good internal consistency (α = .92) and established criterion, construct, and factorial validity. In this sample, Cronbach’s alpha of the total score was .91.

Suicidal Behaviors Questionnaire-Revised
     The Suicidal Behaviors Questionnaire-Revised (SBQ-R) is a 4-item measure to assess suicidality (Osman et al., 2001). Each item assesses a different dimension of suicidality: lifetime ideation and attempts, frequency of ideation in the past 12 months, threat of suicidal behaviors, and likelihood of suicidal behaviors (Gutierrez et al., 2001). Total scores range from 3 to 18, with higher scores indicating more risk of suicide. Example items include How often have you thought about killing yourself in the past year? and How likely is it that you will attempt suicide someday? In a clinical sample, Osman et al. (2001) reported good internal consistency (α = .87) and established criterion validity. In this sample, Cronbach’s alpha of the total score was .85.

Data Analysis
     Statistical analyses were conducted using SPSS version 26.0 and SPSS Analysis of Moment Structures (AMOS) version 26.0. We examined the dataset for missing values, replacing 0.25% of the data (32 of 12,836 values) with series means. We reviewed descriptive statistics for the RSES-22 and RSES-4 scales. We determined that the data approximated normality, as evidenced by skewness values less than 2.0 and kurtosis values less than 7.0 (Dimitrov, 2012). We assessed reliability for the scales by interpreting Cronbach’s alphas and inter-item correlations to confirm internal consistency.
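The two screening steps described above can be sketched as follows. This is an assumed illustration (the authors worked in SPSS): series-mean imputation of missing values and the skewness/kurtosis benchmarks from Dimitrov (2012), applied to a hypothetical item series.

```python
# Hedged sketch (not the authors' SPSS syntax): series-mean imputation and
# normality screening against skewness < 2.0 and kurtosis < 7.0 (Dimitrov, 2012).
from statistics import mean, pstdev

def impute_series_mean(series):
    """Replace None entries with the mean of the observed values."""
    observed = [x for x in series if x is not None]
    m = mean(observed)
    return [m if x is None else x for x in series]

def moments(series):
    """Return (skewness, kurtosis) using population moment formulas."""
    m, sd = mean(series), pstdev(series)
    skew = mean(((x - m) / sd) ** 3 for x in series)
    kurt = mean(((x - m) / sd) ** 4 for x in series)  # non-excess kurtosis
    return skew, kurt

item = [4, 3, None, 2, 4, 3, 3, 1]  # hypothetical item responses
completed = impute_series_mean(item)
skew, kurt = moments(completed)
print(abs(skew) < 2.0 and kurt < 7.0)  # True -> treat as approximately normal
```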

We conducted two separate confirmatory factor analyses to determine the model fit and factorial validity of the 22-item measure and the adapted 4-item measure. We used several indices to evaluate model fit: the minimum discrepancy per degree of freedom (CMIN/DF) and its p-value, root mean square residual (RMR), goodness-of-fit index (GFI), comparative fit index (CFI), Tucker-Lewis index (TLI), and root mean square error of approximation (RMSEA). According to Dimitrov (2012), values of CMIN/DF < 2.0, p > .05, RMR < .08, GFI > .90, CFI > .90, TLI > .90, and RMSEA < .10 provide evidence of a strong model fit. To determine criterion validity, we assessed a subsample of participants (n = 190) who had completed the RSES-22, RSES-4, and four other psychological measures (i.e., PHQ-9, PCL-5, GAD-7, and SBQ-R). We determined convergent validity by conducting bivariate correlations between the RSES-22 and RSES-4.
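The Dimitrov (2012) benchmarks listed above can be encoded so that each fitted model is screened programmatically. This is a sketch for illustration: the dictionary of index names and the screening function are assumptions, and the values plugged in are the RSES-4 indices reported in Table 3.

```python
# Sketch (assumed helper, not AMOS output code): screening fit indices
# against Dimitrov's (2012) benchmarks listed in the Data Analysis section.

CUTOFFS = {
    "CMIN/DF": lambda v: v < 2.0,
    "p":       lambda v: v > .05,
    "RMR":     lambda v: v < .08,
    "GFI":     lambda v: v > .90,
    "CFI":     lambda v: v > .90,
    "TLI":     lambda v: v > .90,
    "RMSEA":   lambda v: v < .10,
}

def fit_report(indices):
    """Return True/False per index, judged against Dimitrov's benchmarks."""
    return {name: CUTOFFS[name](value) for name, value in indices.items()}

# RSES-4 values from Table 3:
rses4 = {"CMIN/DF": 2.94, "p": .053, "RMR": .020, "GFI": .988,
         "CFI": .981, "TLI": .944, "RMSEA": .091}
report = fit_report(rses4)
print(report["CMIN/DF"], report["RMSEA"])  # False True -> all pass except CMIN/DF
```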


Results

Descriptive Analyses
     We computed means, standard deviations, 95% confidence intervals (CIs), and score ranges for the RSES-22 and RSES-4 (Table 2). Scores on the RSES-22 ranged from 19 to 88. Scores on the RSES-4 ranged from 3 to 16. Previous researchers using the RSES-22 with military samples reported mean scores of 57.64–70.74 with standard deviations of 8.15–15.42 (Johnson et al., 2011; Prosek & Ponder, 2021). In previous research on the RSES-4 with military samples, mean scores were 9.95–11.20 with standard deviations of 3.02–3.53 (De La Rosa et al., 2016; Prosek & Ponder, 2021).
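For readers replicating the descriptives, a normal-approximation confidence interval for the mean can be computed as below. This is a sketch under an assumption: Table 2's reported interval (58.52, 61.86) is slightly asymmetric around the mean, suggesting it may have been estimated differently (e.g., bootstrapped), so the approximation is close but not identical.

```python
# Sketch: 95% CI for a mean via the normal approximation M +/- 1.96 * SD / sqrt(N),
# using the RSES-22 descriptives from Table 2 (M = 60.12, SD = 13.76, N = 238).
from math import sqrt

def ci95(m, sd, n):
    """Normal-approximation 95% confidence interval for a mean."""
    half = 1.96 * sd / sqrt(n)
    return round(m - half, 2), round(m + half, 2)

print(ci95(60.12, 13.76, 238))  # (58.37, 61.87), near Table 2's 58.52, 61.86
```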


Table 2

Descriptive Statistics for RSES-22 and RSES-4

Variable          M       SD      95% CI            Score Range
RSES-22 scores    60.12   13.76   [58.52, 61.86]    19–88
RSES-4 scores     11.66    2.62   [11.33, 11.99]    3–16

Note. N = 238. RSES-22 = Response to Stressful Experiences Scale 22-item; RSES-4 = Response
to Stressful Experiences Scale 4-item adaptation.

Reliability Analyses
     To determine the internal consistency of the resilience measures, we computed Cronbach’s alphas. For the RSES-22, we found strong evidence of internal consistency (α = .93), consistent with the developers’ estimate (α = .93; Johnson et al., 2011). For the RSES-4, we found acceptable internal consistency (α = .74), slightly lower than previous estimates (α = .76–.78; De La Rosa et al., 2016). We also calculated correlations between all item pairs and averaged the coefficients. The average inter-item correlation for the RSES-22 was .38, which falls within the acceptable range (.15–.50). The average inter-item correlation for the RSES-4 was .51, slightly above the acceptable range. Overall, evidence of internal consistency was confirmed for each scale.

Factorial Validity Analyses
     We conducted two confirmatory factor analyses to assess the factor structure of the RSES-22 and RSES-4 for our sample of first responders receiving mental health services at a community clinic (Table 3). For the RSES-22, a proper solution converged in 10 iterations. Item loadings ranged from .31 to .79, with 15 of 22 items loading significantly (> .60) on the latent variable. The model did not meet statistical criteria for good fit: χ2(209) = 825.17, p < .001, RMSEA 90% CI [0.104, 0.120]. For the RSES-4, a proper solution converged in eight iterations. Item loadings ranged from .47 to .80, with three of four items loading significantly (> .60) on the latent variable. The model met statistical criteria for good fit: χ2(2) = 5.89, p = .053, RMSEA 90% CI [0.000, 0.179]. The CMIN/DF was above the suggested < 2.0 benchmark; however, the other fit indices indicated good model fit.


Table 3

Confirmatory Factor Analysis Fit Indices for RSES-22 and RSES-4

Scale      df   χ²/p           CMIN/DF   RMR    GFI    CFI    TLI    RMSEA   90% CI
RSES-22   209   825.17/.000    3.95      .093   .749   .771   .747   .112    [.104, .120]
RSES-4      2     5.89/.053    2.94      .020   .988   .981   .944   .091    [.000, .179]

Note. N = 238. RSES-22 = Response to Stressful Experiences Scale 22-item; RSES-4 = Response to Stressful Experiences Scale 4-item adaptation; CMIN/DF = Minimum Discrepancy per Degree of Freedom; RMR = Root Mean Square Residual;
GFI = Goodness-of-Fit Index; CFI = Comparative Fit Index; TLI = Tucker-Lewis Index; RMSEA = Root Mean Squared Error of Approximation.


Criterion and Convergent Validity Analyses
     To assess the criterion validity of the RSES-22 and RSES-4, we conducted correlational analyses with four established psychological measures (Table 4), utilizing the subsample of participants (n = 190) who completed the PHQ-9, PCL-5, GAD-7, and SBQ-R at intake. Normality of the data was not a concern, as skewness and kurtosis values fell within appropriate ranges (± 1.0). The internal consistency of the RSES-22 (α = .93) and RSES-4 (α = .77) in the subsample was comparable to the larger sample and previous studies. Both the RSES-22 and RSES-4 related to the psychological measures of distress in the expected direction: the correlations were significant and negative, indicating that higher resilience scores were associated with lower scores on symptoms associated with diagnosable mental health disorders (i.e., post-traumatic stress, anxiety, depression, and suicidal behavior). We verified convergent validity with a correlational analysis of the RSES-22 and RSES-4, which demonstrated a significant and positive relationship.


Table 4

Criterion and Convergent Validity of RSES-22 and RSES-4

Measure    M (SD)          Cronbach’s α   RSES-22   PHQ-9    PCL-5    GAD-7    SBQ-R
RSES-22    60.16 (14.17)   .93            —         −.287*   −.331*   −.215*   −.346*
RSES-4     11.65 (2.68)    .77            .918      −.290*   −.345*   −.220*   −.327*

Note. n = 190. RSES-22 = Response to Stressful Experiences Scale 22-item; RSES-4 = Response to Stressful Experiences Scale 4-item adaptation; PHQ-9 = Patient Health Questionnaire-9;
PCL-5 = PTSD Checklist-5; GAD-7 = Generalized Anxiety Disorder-7; SBQ-R = Suicidal Behaviors Questionnaire-Revised.
*p < .01.



Discussion

The purpose of this study was to validate the factor structure of the RSES-22 and the abbreviated RSES-4 with a first responder sample. Aggregated means were similar to those in the articles that validated and normed the measures with military samples (De La Rosa et al., 2016; Johnson et al., 2011; Prosek & Ponder, 2021). Additionally, the internal consistency was similar to previous studies. In the original article, Johnson et al. (2011) proposed a five-factor structure for the RSES-22, which was later established as a unidimensional assessment after further exploratory factor analysis (De La Rosa et al., 2016). Subsequently, confirmatory factor analyses with a treatment-seeking veteran population revealed that the RSES-22 demonstrated unacceptable model fit, whereas the RSES-4 demonstrated good model fit (Prosek & Ponder, 2021). In both the veteran sample and the current sample, the RSES-4 GFI, CFI, and TLI were all .944 or higher, whereas the RSES-22 GFI, CFI, and TLI were all .771 or lower. Additionally, criterion and convergent validity estimates, as measured against the PHQ-9, PCL-5, and GAD-7, were highly similar across the two samples. Similarly, in this sample of treatment-seeking first responders, confirmatory factor analyses indicated an inadequate model fit for the RSES-22 and a good model fit for the RSES-4. Lastly, convergent and criterion validity were established through correlational analyses of the RSES-22 and RSES-4 with four other standardized assessment instruments (i.e., PHQ-9, PCL-5, GAD-7, SBQ-R). We concluded that among the first responder sample, the RSES-4 demonstrated acceptable psychometric properties, as well as criterion and convergent validity with other mental health variables (i.e., post-traumatic stress, anxiety, depression, and suicidal behavior).

Implications for Clinical Practice
     First responders are a unique population and are regularly exposed to trauma (Donnelly & Bennett, 2014; Jetelina et al., 2020; Klimley et al., 2018; Weiss et al., 2010). Although first responders could potentially benefit from espousing resilience, they are often hesitant to seek mental health services (Crowe et al., 2017; Jones, 2017). The RSES-22 and RSES-4 were originally normed with military populations. The results of the current study indicated initial validity and reliability among a first responder population, revealing that the RSES-4 could be useful for counselors in assessing resilience.

It is important to recognize that first responders have perceived coping with traumatic stress as an individual process (Crowe et al., 2017) and may believe that seeking mental health services is counter to the emotional and physical training expectations of the profession (Crowe et al., 2015). Therefore, when first responders seek mental health care, counselors need to be prepared to provide culturally responsive services, including population-specific assessment practices and resilience-oriented care.

Jones (2017) encouraged a comprehensive intake interview and battery of appropriate assessments be conducted with first responder clients. Counselors need to balance the number of intake questions while responsibly assessing for mental health comorbidities such as post-traumatic stress, anxiety, depression, and suicidality. The RSES-4 provides counselors a brief, yet targeted assessment of resilience.

Part of cultural competency entails assessing constructs (e.g., resilience) that have been shown to be protective factors against PTSD among first responders (Klimley et al., 2018). Because the items forming the RSES-4 were developed to highlight the positive characteristics of coping (Johnson et al., 2011) rather than deficits, the measure aligns with the strength-based grounding of the counseling profession. It is also congruent with first responders’ perceptions of resilience. Indeed, in a content analysis of focus group interviews with first responders, participants defined resilience as a positive coping strategy that involves emotional regulation, perseverance, personal competence, and physical fitness (Crowe et al., 2017).

The RSES-4 is a brief, reliable, and valid measure of resilience with initial empirical support among a treatment-seeking first responder sample. In accordance with the ACA (2014) Code of Ethics, counselors are to administer assessments normed with the client population (Standard E.8.). Thus, the results of the current study support counselors’ use of the measure in practice. First responder communities are facing unprecedented work tasks in response to COVID-19, and their mental health may suffer as a result (Centers for Disease Control and Prevention, 2020); experts have recommended promoting resilience as a protective factor for combating the negative mental health consequences of COVID-19 (Chen & Bonanno, 2020). Therefore, the relevance of assessing resilience among first responder clients in the current context is evident.

Limitations and Future Research
     This study is not without limitations. The sample of first responders was homogeneous in terms of race, ethnicity, and gender. Subsamples of first responders (i.e., LEO, EMT, fire rescue) were too small to conduct within-group analyses to determine if the factor structure of the RSES-22 and RSES-4 would perform similarly. Also, our sample of first responders included two emergency dispatchers. Researchers reported that emergency dispatchers should not be overlooked, given an estimated 13% to 15% of emergency dispatchers experience post-traumatic symptomatology (Steinkopf et al., 2018). Future researchers may develop studies that further explore how, if at all, emergency dispatchers are represented in first responder research.

Furthermore, future researchers could account for first responders who have prior military service. In a study of LEOs, Jetelina et al. (2020) found that participants with military experience were 3.76 times more likely to report mental health concerns compared to LEOs without prior military affiliation. Although we reported the prevalence rate of prior military experience in our sample, the within-group sample size was not sufficient for additional analyses. Finally, our sample represented treatment-seeking first responders. Future researchers may replicate this study with non–treatment-seeking first responder populations.

Conclusion
     First responders are at risk for sustaining injuries, experiencing life-threatening events, and witnessing harm to others (Lanza et al., 2018). The nature of their exposure can be repeated and cumulative over time (Donnelly & Bennett, 2014), indicating an increased risk for post-traumatic stress, anxiety, and depressive symptoms, as well as suicidal behavior (Jones et al., 2018). Resilience is a promising protective factor that promotes wellness and healthy coping among first responders (Wild et al., 2020), and counselors may choose to routinely measure for resilience among first responder clients. The current investigation concluded that among a sample of treatment-seeking first responders, the original factor structure of the RSES-22 was unstable, although it demonstrated good reliability and validity. The adapted version, RSES-4, demonstrated good factor structure while also maintaining acceptable reliability and validity, consistent with studies of military populations (De La Rosa et al., 2016; Johnson et al., 2011; Prosek & Ponder, 2021). The RSES-4 provides counselors with a brief and strength-oriented option for measuring resilience with first responder clients.


Conflict of Interest and Funding Disclosure
The authors reported no conflict of interest or funding contributions for the development of this manuscript.



References

American Counseling Association. (2014). ACA code of ethics.

American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.).

Antony, J., Brar, R., Khan, P. A., Ghassemi, M., Nincic, V., Sharpe, J. P., Straus, S. E., & Tricco, A. C. (2020). Interventions for the prevention and management of occupational stress injury in first responders: A rapid overview of reviews. Systematic Reviews, 9(121), 1–20.

Blevins, C. A., Weathers, F. W., Davis, M. T., Witte, T. K., & Domino, J. L. (2015). The Posttraumatic Stress Disorder Checklist for DSM-5 (PCL-5): Development and initial psychometric evaluation. Journal of Traumatic Stress, 28(6), 489–498.

Burnett, H. J., Jr. (2017). Revisiting the compassion fatigue, burnout, compassion satisfaction, and resilience connection among CISM responders. Journal of Police Emergency Response, 7(3), 1–10.

Centers for Disease Control and Prevention. (2020, June 30). Coping with stress.

Chen, S., & Bonanno, G. A. (2020). Psychological adjustment during the global outbreak of COVID-19: A resilience perspective. Psychological Trauma: Theory, Research, Practice, and Policy, 12(S1), S51–S54.

Christopher, M. S., Hunsinger, M., Goerling, R. J., Bowen, S., Rogers, B. S., Gross, C. R., Dapolonia, E., & Pruessner, J. C. (2018). Mindfulness-based resilience training to reduce health risk, stress reactivity, and aggression among law enforcement officers: A feasibility and preliminary efficacy trial. Psychiatry Research, 264, 104–115.

Crowe, A., Glass, J. S., Lancaster, M. F., Raines, J. M., & Waggy, M. R. (2015). Mental illness stigma among first responders and the general population. Journal of Military and Government Counseling, 3(3), 132–149.

Crowe, A., Glass, J. S., Lancaster, M. F., Raines, J. M., & Waggy, M. R. (2017). A content analysis of psychological resilience among first responders. SAGE Open, 7(1), 1–9.

De La Rosa, G. M., Webb-Murphy, J. A., & Johnston, S. L. (2016). Development and validation of a brief measure of psychological resilience: An adaptation of the Response to Stressful Experiences Scale. Military Medicine, 181(3), 202–208.

Dimitrov, D. M. (2012). Statistical methods for validation of assessment scale data in counseling and related fields. American Counseling Association.

Donnelly, E. A., & Bennett, M. (2014). Development of a critical incident stress inventory for the emergency medical services. Traumatology, 20(1), 1–8.

Greinacher, A., Derezza-Greeven, C., Herzog, W., & Nikendei, C. (2019). Secondary traumatization in first responders: A systematic review. European Journal of Psychotraumatology, 10(1), 1562840.

Gutierrez, P. M., Osman, A., Barrios, F. X., & Kopper, B. A. (2001). Development and initial validation of the Self-Harm Behavior Questionnaire. Journal of Personality Assessment, 77(3), 475–490.

Jetelina, K. K., Mosberry, R. J., Gonzalez, J. R., Beauchamp, A. M., & Hall, T. (2020). Prevalence of mental illnesses and mental health care use among police officers. JAMA Network Open, 3(10), 1–12.

Johnson, D. C., Polusny, M. A., Erbes, C. R., King, D., King, L., Litz, B. T., Schnurr, P. P., Friedman, M., Pietrzak, R. H., & Southwick, S. M. (2011). Development and initial validation of the Response to Stressful Experiences Scale. Military Medicine, 176(2), 161–169.

Jones, S. (2017). Describing the mental health profile of first responders: A systematic review. Journal of the American Psychiatric Nurses Association, 23(3), 200–214.

Jones, S., Nagel, C., McSweeney, J., & Curran, G. (2018). Prevalence and correlates of psychiatric symptoms among first responders in a Southern state. Archives of Psychiatric Nursing, 32(6), 828–835.

Joyce, S., Tan, L., Shand, F., Bryant, R. A., & Harvey, S. B. (2019). Can resilience be measured and used to predict mental health symptomology among first responders exposed to repeated trauma? Journal of Occupational and Environmental Medicine, 61(4), 285–292.

Kleim, B., & Westphal, M. (2011). Mental health in first responders: A review and recommendation for prevention and intervention strategies. Traumatology, 17(4), 17–24.

Klimley, K. E., Van Hasselt, V. B., & Stripling, A. M. (2018). Posttraumatic stress disorder in police, firefighters, and emergency dispatchers. Aggression and Violent Behavior, 43, 33–44.

Kroenke, K., Spitzer, R. L., & Williams, J. B. W. (2001). The PHQ-9: Validity of a brief depression severity measure. Journal of General Internal Medicine, 16, 606–613.

Lanza, A., Roysircar, G., & Rodgers, S. (2018). First responder mental healthcare: Evidence-based prevention, postvention, and treatment. Professional Psychology: Research and Practice, 49(3), 193–204.

Lee, J.-S., Ahn, Y.-S., Jeong, K.-S., Chae, J.-H., & Choi, K.-S. (2014). Resilience buffers the impact of traumatic events on the development of PTSD symptoms in firefighters. Journal of Affective Disorders, 162, 128–133.

Lewis, G. B., & Pathak, R. (2014). The employment of veterans in state and local government service. State and Local Government Review, 46(2), 91–105.

McCanlies, E. C., Gu, J. K., Andrew, M. E., Burchfiel, C. M., & Violanti, J. M. (2017). Resilience mediates the relationship between social support and post-traumatic stress symptoms in police officers. Journal of Emergency Management, 15(2), 107–116.

National Institute of Mental Health. (2017). Post-traumatic stress disorder.

Osman, A., Bagge, C. L., Gutierrez, P. M., Konick, L. C., Kopper, B. A., & Barrios, F. X. (2001). The Suicidal Behaviors Questionnaire–revised (SBQ-R): Validation with clinical and nonclinical samples. Assessment, 8(4), 443–454.

Prosek, E. A., & Ponder, W. N. (2021). Validation of the Adapted Response to Stressful Experiences Scale (RSES-4) among veterans [Manuscript submitted for publication].

Spitzer, R. L., Kroenke, K., Williams, J. B. W., & Löwe, B. (2006). A brief measure for assessing generalized anxiety disorder (The GAD-7). Archives of Internal Medicine, 166(10), 1092–1097.

Steinkopf, B., Reddin, R. A., Black, R. A., Van Hasselt, V. B., & Couwels, J. (2018). Assessment of stress and resiliency in emergency dispatchers. Journal of Police and Criminal Psychology, 33(4), 398–411. https://doi.org/10.1007/s11896-018-9255-3

Substance Abuse and Mental Health Services Administration. (2018, May). First responders: Behavioral health concerns, emergency response, and trauma. Disaster Technical Assistance Center Supplemental Research Bulletin.

Weiss, D. S., Brunet, A., Best, S. R., Metzler, T. J., Liberman, A., Pole, N., Fagan, J. A., & Marmar, C. R. (2010). Frequency and severity approaches to indexing exposure to trauma: The Critical Incident History Questionnaire for police officers. Journal of Traumatic Stress, 23(6), 734–743.

White, B., Driver, S., & Warren, A. M. (2010). Resilience and indicators of adjustment during rehabilitation from a spinal cord injury. Rehabilitation Psychology, 55(1), 23–32.

Wild, J., El-Salahi, S., Degli Esposti, M., & Thew, G. R. (2020). Evaluating the effectiveness of a group-based resilience intervention versus psychoeducation for emergency responders in England: A randomised controlled trial. PLoS ONE, 15(11), e0241704.

Windle, G., Bennett, K. M., & Noyes, J. (2011). A methodological review of resilience measurement scales. Health and Quality of Life Outcomes, 9, Article 8, 1–18.


Warren N. Ponder, PhD, is Director of Outcomes and Evaluation at One Tribe Foundation. Elizabeth A. Prosek, PhD, NCC, LPC, is an associate professor at Penn State University. Tempa Sherrill, MS, LPC-S, is the founder of Stay the Course and a volunteer at One Tribe Foundation. Correspondence may be addressed to Warren N. Ponder, 855 Texas St., Suite 105, Fort Worth, TX 76102,

Military Spouses’ Perceptions of Suicide in the Military Spouse Community

Rebekah F. Cole, Rebecca G. Cowan, Hayley Dunn, Taryn Lincoln


Newly released data from the U.S. Department of Defense shows military spouse suicide to be an imminent concern for the U.S. military. Currently, there is an absence of research in the counseling profession related to suicide prevention and intervention for this population. Therefore, this qualitative phenomenological study explored the perceptions of military spouses regarding suicide within their community. Ten military spouses were interviewed twice and were asked to provide written responses to follow-up questions. Six main themes emerged: (a) loss of control, (b) loss of identity, (c) fear of seeking mental health services, (d) difficulty accessing mental health services, (e) the military spouse community as a protective factor, and (f) desire for better communication about available mental health resources. Implications for practicing counselors and military leadership in helping to prevent military spouse suicide as well as recommendations for future research regarding ways to support military spouse mental health and prevent suicide in this community are included.

Keywords: military spouse, suicide, prevention, intervention, phenomenological


     In 2018, there were 624,000 active-duty military spouses in the United States, 92% of whom were female (U.S. Department of Defense [DOD], 2018). Recent data also noted that the average age of a military spouse was 31.5 years and that 88% of spouses had postsecondary education (U.S. Chamber of Commerce, 2017). Twenty-four percent of spouses were unemployed (DOD, 2018) and 35%–40% were underemployed (U.S. Chamber of Commerce, 2017). Further, 74% of military spouses had children under the age of 18 and often acted as single parents because of the responsibilities of the service member (Institute for Veterans and Military Families, 2016). Of particular note, the Substance Abuse and Mental Health Services Administration (SAMHSA; 2015) reported that 29.1% of military spouses have had a mental illness, with 11.8% having had at least one major depressive episode and 6.5% having had a major depressive episode with severe impairment.

Military Lifestyle and Spousal Mental Health
     Military spouses do not serve in combat as service members do, but they are subject to many stressors brought on by the military lifestyle that may affect their mental health (Cole, 2014). One of the primary stressors of the military lifestyle is frequent moving (Tong et al., 2018). Military families move every 2–3 years to a new location (Burke & Miller, 2016), which they may not have adequate time to prepare for, adding to the stress of the relocation process (Tong et al., 2018). Military spouses may feel isolated after moving, as 70% of military families live in civilian communities rather than in military housing (Blue Star Families, 2019). Although social support has been found to be key in ameliorating mental health issues in military spouses (Ross et al., 2020), this support is lost and must be rebuilt when the family moves to a new duty station.

Because of these frequent moves, military spouses are often unable to build consistent careers or finish their education (Institute for Veterans and Military Families, 2016). Relocating spouses may experience difficulty finding a new job or utilizing their professional license or certification in their new home state or country (DOD, 2020b). As a result of these lifestyle challenges, 24% of military spouses are unemployed (DOD, 2018) and 77% of employed spouses have been underemployed at least once (Blue Star Families, 2019). These employment challenges often result in anxiety and depression among military spouses (Linn et al., 1985). In addition, the inability to find work may result in financial stress for the family and often affects spousal mental and behavioral health (Blue Star Families, 2019; Center for the Study of Traumatic Stress, 2020).

In addition to stressful relocations and career disruption, spouses also face frequent deployments of their partners (Allen et al., 2011). These deployments result in increased depression and anxiety in spouses (Baer, 2019; Eaton et al., 2008; O’Keefe, 2016), with 92% of spouses reporting increased stress during a deployment and 85% reporting that they feel anxious or depressed during a deployment (Romo, 2019). This deployment stress may be amplified when the spouse lives overseas, away from friends and family in an unfamiliar culture (McNulty, 2003). When their service member is deployed, military spouses have to take on new roles and responsibilities in the home, which may contribute to these high stress levels (Eaton et al., 2008). In addition, they may live in constant fear for their service member’s physical safety, as they are unable to contact their spouse regularly, or communication may be limited to social media, which lacks tone and context and can itself be anxiety-inducing (Allen et al., 2011; O’Keefe, 2016).

Military Spouses and Mental Health Treatment
     Although military spouses are under constant stress in their everyday lives (Cole, 2012; Eaton et al., 2008; Mailey et al., 2018), they often resist seeking mental health treatment (Lewy et al., 2014). Past studies have revealed that spouses often do not seek therapy because they cannot locate a counselor they trust or who understands their culture, they are concerned that someone will find out they are seeking counseling, or they do not know where to find counseling services (Lewy et al., 2014). The stigma that military spouses fear regarding mental health treatment affecting their service member’s career progression mirrors that of the active-duty service member population (Britt et al., 2015). In addition, the pressure that spouses feel to take care of their families without their service member’s support and the sense that they must prioritize their families before themselves has led them to resist receiving mental health help for themselves (Mailey et al., 2018). When they do seek mental health services, spouses are likely to visit their primary care doctor at a military care facility; however, these facilities are not equipped to meet spouses’ mental health needs because of lack of personnel and resources for specialized mental health services (Eaton et al., 2008; Lewy et al., 2014).

Military Spouses and Suicide
     Although many of these studies have focused on risk factors and barriers for military spouse mental health treatment, no research has examined the consequences of these barriers, including suicide in this population. Although considerable attention has been paid to researching service member and veteran suicide (Blosnich et al., 2010), statistics regarding military spouse suicide were recently tracked for the first time and released to the public in September 2019 (DOD, 2019). In 2018, 128 military spouses died by suicide, a rate of 12.1 deaths per 100,000 individuals (DOD, 2020a). Of those who died by suicide, 57.8% were female and 85.1% were under the age of 40. Given the alarming numbers of spousal suicide outlined in the DOD report, it is essential that pioneering research be done to investigate suicidality in the military spouse population. This study, therefore, explored the perceptions of military spouses related to suicide in this population by interviewing military spouses themselves, who are the experts on the military spouse lifestyle and experience (Sargeant, 2012). The purpose of this study was not to focus on the experiences of spouses who have themselves attempted suicide, but rather on how members of the military spouse population made meaning of suicide within their community. Thus, a qualitative phenomenological design was appropriate for exploring this meaning making (Christensen et al., 2017; Creswell & Poth, 2017). As experts on their own community and experiences, the participants provided perceptions that proved valuable in understanding the causes and risk factors associated with suicide in this population.

Purpose Statement and Research Questions
     The purpose of this qualitative phenomenological study was to explore the perceptions of military spouses related to military spouse suicide and how these spouses made meaning of suicide within the military spouse community. Based on the perceptions and recommendations of the participants, this study makes suggestions to the civilian and military communities regarding best practices for preventing suicide in and providing mental health services for this population. This study was guided by the following research questions:

  1. What are the perceptions of military spouses of suicide in the military spouse community?
  2. What are the perceptions of military spouses regarding resources to prevent military spouse suicide?

Method
     Our research team utilized the descriptive phenomenological tradition in qualitative inquiry, in which the researcher explores the participants’ meaning-making experience and how they translate this experience into their consciousness (Christensen et al., 2017; Creswell & Poth, 2017). To gather information and perspective regarding suicide within the military spouse community, Rebekah F. Cole, our team’s principal investigator, interviewed 10 spouses of active-duty service members, using a semi-structured interview format, to explore their experiences in depth and to understand how they made meaning of suicide within the military spouse community. A qualitative researcher does not aim to generalize but to draw out depth of insight from participants; hence, a small sample size was appropriate and justified given the aim of collecting a wealth of information from each participant (Creswell & Poth, 2017). Cole interviewed each spouse twice for approximately 30 minutes over the course of 4 weeks and then sent each participant an email with follow-up reflection questions (e.g., “What was it like for you to participate in this study?”) and demographic questions regarding the participants’ age group, gender, race/ethnicity, military branch, years as a spouse, and spouse’s rank.

     We selected the participants based on their status as active-duty spouses as well as their willingness and availability to participate in two interviews and complete the follow-up questions. We identified and recruited participants via purposeful sampling following approval by the IRB at our university (Creswell & Poth, 2017). Cole made a posting on a military spouse Facebook page explaining the nature and purpose of the study and asking for volunteers who were married to an active-duty service member. We offered each participant a $250 Target gift card to participate in the study, given to them upon completion of the two interviews and return of the emailed follow-up questions. We selected the first 10 volunteers who responded to the Facebook post as the 10 participants in this study. Once they showed interest in participating in the study, Cole contacted each participant via email to explain the nature and goals of the study and provide the participants with the informed consent document to sign and return.

The participants in this study were all spouses of active-duty service members (see Appendix A for a demographic chart). Three of the participants were Army spouses, three were Air Force spouses, three were Navy spouses, and one was a Coast Guard spouse. Two of the spouses were in the 18–29 age range, five were in the 30–39 age range, and three were in the 40–49 age range. The time spent as a spouse ranged from 1–20 years with a mean of 9.5 years. Eight of the spouses identified as White or having a European heritage and two of the spouses identified as having Asian or Pacific Islander heritage. All of the spouses identified as female. The participants were assigned numbers (Participant 1, Participant 2, etc.) to protect their confidentiality throughout the study.

Research Team
     The research team in this study consisted of Cole and two school counseling graduate students, Hayley Dunn and Taryn Lincoln. These students had been trained in research methodology and were familiar with the qualitative data analysis process. Lincoln is a 35-year-old White female whose husband is a retired service member. Dunn is a 33-year-old White female with no military connections. Cole worked closely with Dunn and Lincoln to review the transcriptions of the interviews, develop a comprehensive codebook, and discuss the themes and patterns that emerged from the data.

Data Collection
     Cole conducted and recorded the interviews via phone. She transcribed the interviews using an automated transcription service and reviewed each transcription word-by-word to verify the accuracy and reliability of the transcription (Creswell & Creswell, 2018; Creswell & Poth, 2017). In each interview, Cole asked questions related to suicide in the spouse population (see Appendix B). She also utilized probing follow-up questions (e.g., “Can you tell me more about that?” or “Why do you think that is?”) to gather additional information throughout the interviews (Creswell & Creswell, 2018). Finally, Cole sent a follow-up email consisting of process questions related to the interview experience (see Appendix B) as well as demographic questions.

Data Analysis
     We analyzed the data in a step-by-step process: 1) organizing the data, 2) looking over all of the data, 3) coding the data, 4) generating a description of themes, and 5) presenting the description of themes (Creswell & Creswell, 2018). Cole first organized the data, sorting each participant’s file and memoing ideas that began to emerge from the data (Creswell & Creswell, 2018; Creswell & Poth, 2017). We then each reviewed the transcripts and email responses in detail. After reviewing the data, we coded the interviews and follow-up questions. Cole compiled the codes that we generated into a codebook. We then identified and defined themes and patterns that emerged from the study. This collaboration continued until we decided that no additional themes and patterns were emerging from the data. Cole then sent the codebook, as well as the themes and patterns, to the external auditor of the study, Rebecca G. Cowan, who confirmed the findings of the research team. Cole then wrote a detailed narrative of the themes, which are presented in the Findings section of this article.

Strategies to Increase Trustworthiness
     In order to increase trustworthiness of the study, Cole, the key data collector in this study, engaged in reflexivity and self-analysis throughout the study (Creswell & Creswell, 2018; Darawsheh, 2014; Meyer & Willis, 2019). As a military spouse and professional counselor, Cole inherently has her own thoughts and feelings related to spousal mental health. Thus, it was important to bracket these thoughts and feelings to prevent them from interfering with the data collection and analysis process. Cole used reflective journaling throughout the study to engage in self-reflection and to increase her self-awareness of her reactions to the participants’ perspectives (Malacrida, 2007; Meyer & Willis, 2019). She also discussed these thoughts and feelings with the research team to explore her position as the researcher in the context of this study (Barrett et al., 2020).

In addition to this reflexivity, Cole kept an audit trail throughout the study, which included the transcriptions of the interviews, the participants’ emailed responses, the codebook, reflexive journal entries, and the notes from the research team (Creswell & Creswell, 2018; Creswell & Poth, 2017). Cowan, an auditor with a PhD in counselor education who has been a counselor and counselor educator for the past 10 years, reviewed the study in full to verify the data collection and analysis process (Creswell & Creswell, 2018) as well as the rigor of the study (Patton, 2002).

To triangulate the data and increase the validity of the study’s results, data were collected through two individual interviews as well as through an email questionnaire, both open-ended forms of data collection (Creswell & Creswell, 2018). Prolonged engagement assisted with the development of trust and rapport (Korstjens & Moser, 2018). Additionally, because the study’s themes emerged from both verbal and written data sources, they gained more credibility (Creswell & Creswell, 2018).

Finally, we used member checking (Creswell & Creswell, 2018) to request the participants’ feedback on the credibility of the data (Creswell & Poth, 2017). Member checking allows the study’s participants to become actively involved in and make additions to the data review process (Birt et al., 2016). Cole emailed the participants transcriptions of their interviews and asked them to review and make any additions or changes they would like to the transcriptions, allowing them ownership of their thoughts and words and increasing the trustworthiness of the data (Birt et al., 2016). In addition, Cole discussed the findings of the study with the participants as the themes and patterns emerged (Shenton, 2004).

Findings
The study’s data yielded six main themes: (a) loss of control, (b) loss of identity, (c) fear of seeking mental health services, (d) difficulty accessing mental health services, (e) the military spouse community as a protective factor, and (f) desire for better communication about available mental health resources.

Theme 1: Loss of Control
     Each of the 10 participants perceived their circumstances as a military spouse to be out of their control. For example, all of the participants mentioned deployments, especially those on short notice, to be a risk factor for suicide. One spouse described how her active-duty husband “might be home on Thursday and then he’s gone the next day. He finds out on such short notice, that’s really tricky, and a lot of my friends are constantly, you’re just so constantly anxious all the time.”

Four of the participants described how they fear for their spouse’s safety during these deployments, which impacts their mental health. One spouse, for example, described how she lives “just constantly not knowing what’s happening, but then being fearful for the significant other as well.” Another spouse explained how spouses live with a “constant fear of whether or not your spouse will return.” One participant discussed how military spouses are thus more prone to mental health issues:

[T]he stress of your life and the stress you have over your spouse’s military career, whether they’re in danger or not, worrying about their mental health . . . probably aggravates all of the mental disorders that anyone could experience, but just magnifies them if you’re a military spouse.

Participants also felt like they lacked control because of frequently relocating. All 10 participants described the stress involved with moving unpredictably. One spouse described how “you’re always worried about what’s coming next and what you can plan for and what you can’t plan for.” Another participant mirrored this same sentiment: “It’s that ‘Where are we going to be next? We just moved here, but I know in two years we’re going to move again’ type deal . . . always just kind of being on your toes and not knowing what to expect.” Another spouse expressed similar thoughts: “I hope for the best but expect the worst, which is kind of sad, but that is the kind of mentality I’ve had to live by because of how unpredictable this lifestyle is.”

As a result of these constant relocations, spouses are separated and isolated from family and friends, or their “network of support” in the words of one participant. All of the participants recognized the risk of losing this support with regard to their mental health. One spouse, for example, explained the danger of not having “long-standing relationships where you could say like, ‘Wow that person really seems like they’re going through something.’”

Theme 2: Loss of Identity
     All 10 participants struggled with a loss of their identity, especially regarding their careers. Many participants described how career struggles and finding purpose are related to spousal mental health. One spouse explained how “not having that career is part of the anxiety and depression. And not having a purpose in life.” Another spouse described the struggle to maintain a career: “Eventually, it kind of weighs on you and eventually your mind can play tricks on you and you feel like you’re not worthy.” One participant summed up these career struggles in these words: “Part of being a military spouse is sacrificing your own life . . . there’s a lot of hurt and loneliness and sacrifice.”

In addition to this struggle for career identity and purpose, five of the participants described how the military fails to recognize their value. One spouse described how spousal suicide “is definitely brushed under the rug because people are kind of like, ‘You’re not going to war, you’re not doing any of these things.’” Another participant described her own experiences: “We’ve had situations where wives were struggling, but . . . he couldn’t get off that day, he had to report in because she’s not at the hospital . . . it’s not serious.” Another explained how “the military in general, they’re so focused on their job that they kind of forget that we’re all humans and that we are people.” One participant said that “spouses get beat down and they just kind of feel like there’s the whole ‘If the military wanted you to have a family, they would have issued you one.’”

The participants also described the military spouse’s tendency to prioritize family and the military over oneself and the impact of this inclination on spouses’ mental health. As one participant explained, “So much of the burden of the family falls onto the military spouse, I think it’s easy for the spouse to not consider their own mental health a priority, and therefore the risk factors may go undetected or untreated.” Another described how spouses “go through this constant cycle that’s always churning. You move to a new place, you try to get settled . . . then we hit the point of going, ‘Ok, now what about me?’ If we ever get to that point.” One spouse described that after each of the moves and deployments, “I feel like we lose a sense of ourselves too . . . it’s like having a new baby all of the time. . . . You kind of reach a point where you’re like, ‘Where am I? What the heck am I doing?’”

As a result of prioritizing family and the military over themselves, spouses feel unworthy of receiving mental health services and feel guilty for suffering, as described by eight of the participants. One spouse explained that “spouses can feel weak or feel like they’re not holding up their end of the bargain if they get help.” Another participant noted that spouses “consider themselves less worthy of getting treatment or that their problems [are] not as important.” Finally, a spouse explained that there is a “weird mentality, I think, in the military spouse community, where you don’t complain because someone else has it worse. . . . If you’re an Air Force spouse, maybe the Army deployments are longer, so you just don’t want to complain.”

Theme 3: Fear of Seeking Mental Health Services
     Despite these challenges that military spouses face, eight of the participants described a fear of seeking out mental health services. Five of the participants, for example, said that spouses fear appearing to be unstable or, as one spouse described, a “fear of being ostracized, or the fear of having people talk behind your back, or embarrassment.” One spouse explained how mental health issues are viewed as, “Oh, she was a crazy spouse. Oh, she got everything that she needed . . . so she was just kind of crazy.” Another participant described how a spouse was viewed after verbalizing her mental health struggles: “I’ve been told by other spouses not to go hang out with her in group settings because she’s batshit crazy.” One spouse noted that “there’s still that stigma of reaching out and being known to have the mental health issue.” Finally, spouses may fear being honest with their medical providers for this same reason. One participant described her own perception of this fear of being transparent with the doctor regarding a suicidal assessment: “If you answer it honestly, sometimes you’re like ‘They’re going to put me in a padded room if I really tell you what my last 2 weeks has been like.’”

In addition to appearing unstable, seven of the participants described how military spouses fear that seeking mental health services would negatively impact or bring “backlash” on their service member’s career. One participant noted: “People keep it quiet because they don’t want their spouse, their military member, to not get promoted or not get more responsibility and stuff like that because they’re not keeping it together.” Another participant stated that often “you run into people who are kind of skittish about going just because of the stigma.” She further explained that “you don’t want to hurt your husband’s career, and that’s what you’ve heard for a long time. He looks like he can’t handle the situations at home.”

Theme 4: Difficulty Accessing Mental Health Services
     Spouses who do decide to seek help for their mental health may experience difficulties in securing an appointment, as described by six of the participants in this study. Each of these spouses expressed difficulties with finding a mental health provider in the community or accessing mental health treatment at a military facility. One participant explained that “the reality is they can’t guarantee that the local community and local providers will be able to provide everything we need when we need it.” Another spouse expressed frustration that “TRICARE can sometimes be a pain when you’re trying to schedule something, and it will make you schedule at 6 weeks out because that’s the first available.” One participant described her experience with trying to find a counselor covered by TRICARE. She stated, “You hope that you get an appointment and hope you can jive with whoever you called because you may have to wait another month or two to try to find someone else.” Three spouses in the study also expressed concern about the consistency of care due to frequent relocations. One participant explained the need to streamline mental health services at each duty station “so that if [spouses] are seeing a psychiatrist in one place and they go to the next place, they’re not waiting for 2 or 3 months before they can get in to see a new psychiatrist.”

Five of the study’s participants also expressed concern over not having access to a mental health specialist. For example, one spouse shared that “the person I did see, who was a social worker, I just don’t feel was very equipped to talk to me about the things I wanted to talk about.” Another spouse described her perception of military family life consultants’ work with spouses on military bases:

They just kind of give them the same spiel, like you should exercise, make sure you’re eating well, getting enough sleep, instead of saying, “You know what? This is outside of the realm of what I can handle, let’s get you in to the type of professional that you need.”

Theme 5: The Military Spouse Community as a Protective Factor
     In the midst of these mental health challenges and difficulty seeking and accessing mental health services, seven of the participants described the military spouse community as a protective factor against suicidal ideation. As one participant explained, “Anyone can try to take their own life, but if they have people around them who are looking out for them, who are with them physically and emotionally, it’s harder to do.” In addition, one participant pointed out that the spouse community can offer a sense of shared understanding: “Someone else probably very close by has gone through the same thing that you have . . . and you’re not the first person to go through this and someone might be able to help lighten your load.” The participants emphasized the need to create “a friendly, inclusive environment where spouses can network and establish relationships” as well as establish a “connection and feeling of belonging.” One participant noted that within this environment and community, it is important to normalize conversations about mental health in order to decrease the stigma attached to it. “Letting people see that while we might post pretty pictures on Facebook and someone looks all together when they’re at that unit function, we’ve all had to reach out for help, and looking at that as being strong.”

To increase this protective factor as a community, six spouses described the importance of training for spouses geared toward suicide prevention so they could recognize the signs of suicide in others. One spouse said that training in “prevention measures of how to spot suicide, signs of suicide, or who to talk to, where to go, what to say” would be helpful “because spouses are probably already witnessing all of these signs in their homes or in their neighbors or in their friend groups of depression and suicidality.” Another participant described how “spouses could be looking out for friends, if they know some warning signs or give friends resources to go to so their friend could find it if they need help.”

Theme 6: Desire for Better Communication About Available Mental Health Resources
     Each of the 10 participants expressed the need for the military to communicate more with them about mental health resources. One spouse, for example, pointed out that such “information needs to be put out there clearly at military hospitals, on military bases. . . . So I think the military could make it more clear, destigmatize it, and just make the programs more widely available and advertised.” In this proposed advertisement, the spouses would want to know “what kind of help we can get, what it costs, where we can get help, and will it matter to our spouse’s career?”

In addition to this suggested advertising, six of the participants said they would like the military leadership to communicate with them directly regarding available mental health resources specifically designed for spouses. One participant described how “it’s harder for the spouse to get that information . . . if they had information sent directly to them, I think they would be more willing to seek it out and use those resources.” Another spouse noted that “military spouses need to be presented with the resources available for their mental health directly instead of solely relying on the service member to relay the information.” As a result of receiving this information on resources available specifically for them, one participant explained that “the military spouse wouldn’t have to consider themselves less worthy of getting treatment or that their problems [were] not as important.”

Finally, six of the spouses suggested that the check-in process for each duty station could be a key opportunity to provide spouses with resources and preventative services. One spouse noted: “I think that when you move somewhere new there should be someone checking to make sure you’re okay and you’re not alone all the time. I think it’s the military’s responsibility to make sure there’s a process in place.” Another spouse proposed this check-in process as being “part of the standard procedure to make sure the spouse maybe is brought in and made aware of all of the programs that are available to them.”

Discussion
     In this study, all of the military spouse participants described how spouses’ loss of control and loss of identity may contribute to their increased risk for suicide. These feelings resulted from continually moving to new duty stations (often unexpectedly), being isolated and separated from their support systems, fearing for their spouse’s safety during deployments, and struggling to maintain a sense of self and a career while making their families and the military their priority. Although they were committed to prioritizing the military lifestyle and their spouses’ career, these spouses did not feel that their needs were prioritized by the military in turn.

Each of these challenges for military spouses has been previously addressed in the professional literature (Eaton et al., 2008; Lewy et al., 2014; Mailey et al., 2018), although their direct connection to suicidality has not yet been explored. Because increased levels of suicidality have been found in other populations when social isolation increases (Calati et al., 2019; Heuser & Howe, 2019; Pompili et al., 2007) or when stressful life transitions or events occur (Oquendo et al., 2014; Paul, 2018), it is important to continue to consider how these risk factors affect military spouses’ suicidality.

Most of the participants likewise described the tendency of spouses to feel guilty for suffering because they are not the ones on the battlefield, a phenomenon not yet explored in the professional literature. One participant concluded that these feelings of guilt may lead spouses to feel they are unworthy of using mental health resources intended for active-duty service members. To address these feelings of guilt, one spouse described the need to normalize the conversation about mental health among spouses, which would ameliorate these feelings of unworthiness and increase spouses’ use of resources. Finally, all of the participants felt that providing and advertising mental health and suicide prevention programs and services specifically for spouses would help them feel more confident in utilizing these services.

When speaking about risk factors associated with suicide, most spouses described their fears of the stigma associated with accessing mental health services and the struggles associated with finding mental health providers qualified to help them when they did decide to seek help. These fears and struggles directly correspond to results in past quantitative and mixed-methods research regarding barriers to treating military spouse mental health (Eaton et al., 2008; Lewy et al., 2014). The participants in this study likewise described their frustration with not being able to get an appointment with military or community providers. These struggles echo the results of previous research describing the challenges of spouses to access mental health services (Lewy et al., 2014), highlighting the consistency of this issue.

Although the participants’ struggles with mental health and mental health providers confirm the findings of existing studies, their suggestions for preventing suicide within the military spouse community are new ideas generated from this study. Primarily, the participants focused on the community itself as a protective factor against suicide. They described how building a strong spousal community prevents feelings of isolation, as spouses can care for each other because they share common experiences of the military lifestyle. This sense of connection is especially important, as spouses are separated from their support systems when relocating from one duty station to the next (Ross et al., 2020). To strengthen the protective factor of their community, the spouses discussed how they wanted more training from military leadership in the areas of suicide prevention and intervention so that they can help others around them. Interestingly, contradictory themes arose in this study’s findings: the spouse community can shun those struggling with mental health issues, yet it can also serve as a much-needed protective factor. Perhaps the participants’ suggestion of normalizing mental health support within their community would help reduce the tendency to shun and increase the tendency to support.

In addition to focusing on increasing the protective factor of the spouse community itself, all of the participants stated that they desired increased communication from the military regarding mental health services and programs available specifically to them. Some of the spouses suggested that a direct line of communication from military leadership to spouses would be helpful for finding out about mental health resources available to them, as well as to their spouses. This communication would involve more strategic and widely spread advertising about suicide prevention resources and mental health services in places that spouses often frequent, such as military hospitals or on-base/on-post facilities.

Finally, several spouses suggested an innovative, structured check-in process at each duty station that would promote spousal awareness and understanding of the resources available to them. They explained that this check-in would provide an immediate sense of connection and community for the spouse and a way to formally network with other spouses in the area. This formalized check-in process carried out by the administration at the new duty station may be especially helpful for newer spouses who may not be familiar with the military’s mental health resources or health care system or who may be hesitant to reach out on their own to make connections with others, a pattern noticed by three of the most senior spouses in this study.

Implications for Future Training and Practice
     Both the military community and the mental health counseling profession are called to recognize the mental health struggles that military spouses face in order to help prevent suicide in this population. Military leadership should strategize ways to provide easier access to mental health services for spouses, including suicide prevention programs designed specifically for this population. In addition, suicide education programs for spouses may help them identify warning signs in others, ultimately strengthening the protective factor of the military spouse community. Military leadership should also work to reduce the stigma of receiving mental health services, not only for active-duty service members but for their family members as well. Military leaders may likewise consider the participants’ suggestions regarding direct communication between military leadership and spouses, including a formalized check-in process at each duty station. Each of these suggestions offers a solution to the mental health challenges and suicide risk factors outlined by both the professional literature and the spouse participants in this study.

Next, mental health counselors are called to be aware of and screen for risk factors for suicide in the military spouse population that may be correlated with the inherent challenges of the military lifestyle. As prevention is a primary focus within the counseling profession (Sale et al., 2018), counselors might create preventative, psychoeducational groups for spouses to enhance their sense of connectedness and wellness. These groups would also serve to identify spouses who may need additional supportive services to mitigate the risk of depression, anxiety, and other mental health issues. Additionally, when relocations occur, counselors should consider connecting their military spouse clients with mental health services in their new location and, with the permission of the client, reach out to those providers to ensure continuity of care. Finally, mental health counselors should actively seek out and build partnerships with military leadership in order to develop evidence-based resources specific to preventing suicide in the spouse population and to reduce the mental health stigma present in both active-duty service member and spouse communities.

Limitations
     Several limitations related to the nature of qualitative methodology apply to this study. First, in qualitative research, the researcher is the primary instrument of data collection and analysis; thus, inherent biases exist throughout the data collection and analysis process (Anderson, 2010). However, bracketing and reflexivity reduced the potential impact of this limitation. Additionally, because mental health stigma exists within the military community, it is possible that participants were guarded during their interviews. Prolonged engagement assisted with mitigating this limitation. Finally, because of the nature of qualitative research, the sample size of the study was small (Atieno, 2009). The sample did not include the perspectives of any male spouses or spouses who are African American or Hispanic, and although it included Army, Navy, Air Force, and Coast Guard spouses, no Space Force or Marine Corps spouses were represented. Because of these limitations in gender, ethnicity, and branch representation, the sample is not representative of the military spouse community as a whole.

Implications for Future Research
     Given these limitations of qualitative research, future quantitative research might focus on specific causes of suicide among military spouses. For example, studies might examine the characteristics of spouses who have committed suicide to detect any patterns or correlations that may exist. Particular focus should be placed on exploring any ethnic, racial, sexual minority, or gender identity disparities. Future researchers could pilot training programs in the military aimed at preventing military spouse suicide to develop best practices in this area. Finally, future qualitative studies should focus on the experiences of male military spouses. This is critical, as the male military spouse suicide rate was recently found to be significantly higher than the overall male suicide rate in the U.S. population (40.9 per 100,000 and 28.4 per 100,000, respectively; DOD, 2020a).

     Overall, the military spouses’ perceptions of risk factors for suicide in this study align with previous studies of military spouse mental health conducted throughout the past 12 years. With new knowledge of the number of spouses who are committing suicide, it is imperative that both the counseling profession and military leadership continue to work toward solutions for spousal mental health. These stakeholders are called to recognize the inherent risk factors of the military lifestyle and provide military spouses with the resources, training, and services that they need (and want) to address and prevent suicide within their community.


Disclosure and Disclaimer Statements

This research was partially funded by a faculty research grant from Arkansas State University.

The opinions and assertions expressed herein are those of the author(s) and do not necessarily reflect the official policy or position of the Uniformed Services University or the Department of Defense.

This research protocol was reviewed and approved by the Arkansas State University Institutional Review Board (IRB) in accordance with all applicable Federal regulations governing the protection of human subjects in research.

Neither the authors nor their family members have a financial interest in any commercial product, service, or organization providing financial support for this research.


References

Allen, E. S., Rhoades, G. K., Stanley, S. M., & Markman, H. J. (2011). On the home front: Stress for recently deployed Army couples. Family Process, 50(2), 235–247.

Anderson, C. (2010). Presenting and evaluating qualitative research. American Journal of Pharmaceutical Education, 74(8), 141.

Atieno, O. P. (2009). An analysis of the strength and limitation of qualitative and quantitative research paradigms. Problems of Education in the 21st Century, 13, 13–18.

Baer, M. D. (2019). The experiences of spouses/partners of military RPA pilots and sensor operators: A generic qualitative study (Order No. 27666760). Available from ProQuest Dissertations & Theses Global. (2331474736).

Barrett, A., Kajamaa, A., & Johnston, J. (2020). How to . . . be reflexive when conducting qualitative research. The Clinical Teacher, 17(1), 9–12.

Birt, L., Scott, S., Cavers, D., Campbell, C., & Walter, F. (2016). Member checking: A tool to enhance trustworthiness or merely a nod to validation? Qualitative Health Research, 26(13), 1802–1811.

Blosnich, J. R., Montgomery, A. E., Dichter, M. E., Gordon, A. J., Kavalieratos, D., Taylor, L., Ketterer, B., & Bossarte, R. M. (2020). Social determinants and military veterans’ suicide ideation and attempt: A cross-sectional analysis of electronic health record data. Journal of General Internal Medicine, 35(6), 1759–1767.

Blue Star Families. (2019). 2019 Military Family Lifestyle Survey executive summary.

Britt, T. W., Jennings, K. S., Cheung, J. H., Pury, C. L. S., & Zinzow, H. M. (2015). The role of different stigma perceptions in treatment seeking and dropout among active duty military personnel. Psychiatric Rehabilitation Journal, 38(2), 142–149.

Burke, J., & Miller, A. (2016). The effect of military change-of-station moves on spousal earnings.

Calati, R., Ferrari, C., Brittner, M., Oasi, O., Olié, E., Carvalho, A. F., & Courtet, P. (2019). Suicidal thoughts and behaviors and social isolation: A narrative review of the literature. Journal of Affective Disorders, 245, 653–667.

Center for the Study of Traumatic Stress. (2020). Financial stress and behavioral health in military servicemembers report.

Christensen, M., Welch, A., & Barr, J. (2017). Husserlian descriptive phenomenology: A review of intentionality, reduction and the natural attitude. Journal of Nursing Education and Practice, 7(8), 113–118.

Cole, R. F. (2012). Professional school counselors’ role in partnering with military families during the stages of deployment. Journal of School Counseling, 10(7).

Cole, R. F. (2014). Understanding military culture: A guide for professional school counselors. The Professional Counselor, 4(5), 497–504.

Creswell, J. W., & Creswell, J. D. (2018). Research design: Qualitative, quantitative, and mixed methods approaches (5th ed.). SAGE.

Creswell, J. W., & Poth, C. N. (2017). Qualitative inquiry and research design: Choosing among five approaches (4th ed.). SAGE.

Darawsheh, W. (2014). Reflexivity in research: Promoting rigour, reliability and validity in qualitative research. International Journal of Therapy and Rehabilitation, 21(12), 560–568.

Eaton, K. M., Hoge, C. W., Messer, S. C., Whitt, A. A., Cabrera, O. A., McGurk, D., Cox, A., & Castro, C. A. (2008). Prevalence of mental health problems, treatment need, and barriers to care among primary care-seeking spouses of military service members involved in Iraq and Afghanistan deployments. Military Medicine, 173(11), 1051.

Heuser, C., & Howe, J. (2019). The relation between social isolation and increasing suicide rates in the elderly. Quality in Ageing and Older Adults, 20(1), 2–9.

Institute for Veterans and Military Families. (2016). The force behind the force.

Korstjens, I., & Moser, A. (2018). Series: Practical guidance to qualitative research. Part 4: Trustworthiness and publishing. European Journal of General Practice, 24(1), 120–124.

Lewy, C. S., Oliver, C. M., & McFarland, B. H. (2014). Brief report: Barriers to mental health treatment for military wives. Psychiatric Services, 65(9), 1170–1173.

Linn, M. W., Sandifer, R., & Stein, S. (1985). Effects of unemployment on mental and physical health. American Journal of Public Health, 75(5), 502–506.

Mailey, E. L., Mershon, C., Joyce, J., & Irwin, B. C. (2018). “Everything else comes first”: A mixed-methods analysis of barriers to health behaviors among military spouses. BMC Public Health, 18, 1013.

Malacrida, C. (2007). Reflexive journaling on emotional research topics: Ethical issues for team researchers. Qualitative Health Research, 17(10), 1329–1339.

McNulty, P. A. F. (2003). Does deployment impact the health care use of military families stationed in Okinawa, Japan? Military Medicine, 168(6), 465–470.

Meyer, K., & Willis, R. (2019). Looking back to move forward: The value of reflexive journaling for novice researchers. Journal of Gerontological Social Work, 62(5), 578–585.

O’Keefe, P. H. (2016). Using Facebook to communicate with husbands while deployed: A qualitative study of Army wives’ experiences (Order No. 10196383). Available from ProQuest Central; ProQuest Dissertations & Theses Global. (1853893200).

Oquendo, M. A., Perez-Rodriguez, M. M., Poh, E., Sullivan, G., Burke, A. K., Sublette, M. E., Mann, J. J., & Galfalvy, H. (2014). Life events: A complex role in the timing of suicidal behavior among depressed patients. Molecular Psychiatry, 19(8), 902–909.

Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). SAGE.

Paul, E. (2018). Proximally-occurring life events and the first transition from suicidal ideation to suicide attempt in adolescents. Journal of Affective Disorders, 241, 499–504.

Pompili, M., Amador, X. F., Girardi, P., Harkavy-Friedman, J., Harrow, M., Kaplan, K., Krausz, M., Lester, D., Meltzer, H. Y., Modestin, J., Montross, L. P., Mortensen, P. B., Munk-Jørgensen, P., Nielsen, J., Nordentoft, M., Saarinen, P. I., Zisook, S., Wilson, S. T., & Tatarelli, R. (2007). Suicide risk in schizophrenia: Learning from the past to change the future. Annals of General Psychiatry, 6, 10.

Romo, V. (2019). Military families experience high stress, anxiety and unemployment, report says.

Ross, A. M., DeVoe, E. R., Steketee, G., Emmert-Aronson, B. O., Brown, T., & Muroff, J. (2020). Outcomes of a reflective parenting program among military spouses: The moderating role of social support. Journal of Family Psychology, 34(4), 402–413.

Sale, E., Hendricks, M., Weil, V., Miller, C., Perkins, S., & McCudden, S. (2018). Counseling on Access to Lethal Means (CALM): An evaluation of a suicide prevention means restriction training program for mental health providers. Community Mental Health Journal, 54(3), 293–301.

Sargeant, J. (2012). Qualitative research part II: Participants, analysis, and quality assurance. Journal of Graduate Medical Education, 4(1), 1–3.

Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22(2), 63–75.

Substance Abuse and Mental Health Services Administration. (2015). Spouses and children of U.S. military personnel: Substance use and mental health profile from the 2015 National Survey on Drug Use and Health.

Tong, P. K., Payne, L. A., Bond, C. A., Meadows, S. O., Lewis, J. L., Friedman, E. M., & Maksabedian Hernandez, E. J. (2018). Enhancing family stability during a permanent change of station: A review of disruptions and policies.

U.S. Chamber of Commerce. (2017). Military spouses in the workplace: Understanding the impacts of spouse unemployment on military recruitment, retention, and readiness.

U.S. Department of Defense. (2018). Military spouse demographics and employment information.

U.S. Department of Defense. (2019). Annual suicide report.

U.S. Department of Defense. (2020a). Annual suicide report.

U.S. Department of Defense. (2020b). DOD releases military spouse licensure report.


Appendix A

Participant Age Group Gender Race Military Branch Spouse’s Rank Years as Spouse
Participant 1 30–39 Female White Army Officer 10
Participant 3 Coast Guard
Participant 5 Air Force
Participant 7 Air Force
Participant 8 Air Force
[Only the entries above are recoverable; the remaining demographic rows for Participants 2–10 were lost from the source.]

Appendix B
Interview Protocol

First Interview

What are your perceptions of suicide in the military spouse community?

What are the risk factors for suicide in the military spouse population?

What mental health challenges do military spouses face?

What resources currently exist to help prevent military spouse suicide?

What would you like to let the civilian world know about your life as a military spouse that they might not be aware of?

Is there anything else you would like to add?

Second Interview

Do you have anything else to add from our first interview?

What do you think causes military spouses to commit suicide?

What needs to be done to prevent suicide in the military spouse community?

What might be the consequences of not addressing suicide in the military spouse community?

What type of mental health support is most needed for the military spouse community?

How would your mental health differ, if at all, if you weren’t a military spouse?

Is there anything else you would like to add?

Follow-Up Email Questions

Is there anything else you would like to add to your interview responses?

What was it like for you to participate in this study?

What is the most important resource that military spouses need to prevent future suicides?                                                             


Rebekah F. Cole, PhD, NCC, LPC, is formerly an assistant professor at Arkansas State University and is now a research associate professor at the Uniformed Services University. Rebecca G. Cowan, PhD, NCC, BC-TMH, LPC, DCMHS, is a core faculty member at Walden University. Hayley Dunn is a graduate student at Arkansas State University. Taryn Lincoln is a graduate student at Arkansas State University. Correspondence may be addressed to Rebekah Cole, Uniformed Services University of the Health Sciences, Department of Military and Emergency Medicine, 4301 Jones Bridge Rd., Bethesda, MD 20814,

Assessment of Dispositions in Program Admissions: The Professional Disposition Competence Assessment—Revised Admission (PDCA-RA)

Curtis Garner, Brenda Freeman, Roger Stewart, Ken Coll


Tools to assess the dispositions of counselor education applicants at the point of program admission are important as mechanisms to screen entrance into the profession. The authors developed the Professional Disposition Competence Assessment—Revised Admission (PDCA-RA) as a screening tool for dispositional assessment in admissions interviews. In this study, 70 participants engaged in a video-based training protocol designed to increase the interrater reliability of the PDCA-RA. An intraclass correlation coefficient was calculated as an index of interrater reliability. Cronbach’s alpha coefficients were calculated for internal consistency, and Fleiss’ kappa, free-marginal kappa, and percent agreement were calculated for absolute agreement. Calculations were made for pretest and posttest scores. Results of the study suggest that the PDCA-RA demonstrates “good” reliability in terms of interrater reliability and “excellent” reliability in terms of internal consistency. The video-based training improved interrater reliability.

Keywords: dispositions, counselor education, interrater reliability, counseling admissions, PDCA-RA


Beyond ethical codes and standardized education requirements, one criterion understood to be a demarcation of a profession is that it controls entry into its occupation (Miller, 2006). The stature of any profession is heavily influenced by the collective quality, preparation, and professional fit of those who are allowed to enter the profession. In the profession of counseling, counselor preparation programs, practicing counselors, field site supervisors, and state licensure boards share the overarching charge to screen for the profession (Freeman et al., 2016), but counselor educators alone bear the responsibility of initial screening of potential new entrants into the profession. The funnel of individuals seeking entrance into the profession begins with admission to graduate programs. This responsibility is a solemn one because post-admission gatekeeping can lead to high-stakes legal disputes (Dugger & Francis, 2014; Hutchens et al., 2013; McAdams et al., 2007).

Similar to other graduate programs, criteria for entrance into counselor preparation programs generally incorporate academic and career factors, but unlike many other graduate programs, the dispositions (traits and characteristics) of applicants are also critical factors for identifying appropriate candidates for the profession (Hernández et al., 2010). The use of admissions interviews is a common method for observing dispositions (Swank & Smith-Adcock, 2014). Characteristics such as interpersonal skills, warmth, emotional stability, and self-awareness are examples of traits deemed important to many counseling academic programs (Crawford & Gilroy, 2013; McCaughan & Hill, 2015), though counselor educators lack agreement about which dispositions should be screened at admission (Bryant et al., 2013).

Once applicants have been accepted into a counselor education program, if problematic dispositional issues arise the American Counseling Association (ACA) ethical codes require remediation (ACA, 2014), which is sometimes followed by suspension or dismissal. Therefore, gatekeeping, defined as the process of deterring program graduation of those lacking sufficient knowledge or skills (Koerin & Miller, 1995), begins at the point of program screening and admission (Kerl & Eichler, 2005). Bryant et al. (2013) emphasized that effective screening of applicants prior to formal admission into the academic program may greatly reduce the need to address problematic student behaviors after admission.

In addition to conducting admissions screening as a form of gatekeeping, the courts are more likely to support universities in admissions-related legal disputes if screening policies, standards for admission, and admission procedures are clear and fair (Cole, 1991). Legal research also underscores the importance of programs communicating clearly with students about the expected dispositions and other criteria from admission through exit (McCaughan & Hill, 2015). Reliable admissions tools designed to assess dispositions represent one method of showing fidelity in implementing policies (Hutchens et al., 2013). Despite the research support for sound structures to scaffold the admissions process, assessments with published psychometric properties measuring dispositions in admissions interviews are scarce (Hernández et al., 2010).

Jonsson and Svingby (2007) noted that a number of forms of reliability and validity are important in establishing the psychometric properties of admissions tools, but when multiple raters are involved, such as in the admissions process, interrater reliability for rubrics is particularly essential. Specific training in the tool is critical to improving interrater reliability (Jonsson & Svingby, 2007). Video training protocols to increase interrater reliability are becoming more important in professional dispositional research (Kopenhaver Haidet et al., 2009; Rosen et al., 2008). The use of video technology to train raters to capture behavioral observations has two advantages: the opportunity for admissions personnel to practice admissions interview ratings prior to real-time observations, and the relative ease of using modern, sophisticated recording equipment (Kopenhaver Haidet et al., 2009).

Admissions Processes and Criteria
     Overwhelmingly, admission criteria and procedures for counselor education programs have focused upon undergraduate grade point average (GPA); standardized test scores, such as the Graduate Record Examination (GRE) or the Miller Analogies Test (MAT); a personal interview; and some form of personal statement (Bryant et al., 2013). Such procedures have been shown to be reasonably predictive of academic success, but less so for counselor development (Smaby et al., 2005). Some programs have utilized Carkhuff’s Rating Scale (Carkhuff, 1969) or Truax’s Relationship Questionnaire (Truax & Carkhuff, 1967) to measure applicants’ ability to communicate the conditions of empathy, genuineness, and respect effectively (Hernández et al., 2010; Swank & Smith-Adcock, 2014). Carkhuff’s Rating Scale and Truax’s Relationship Questionnaire have been found to exhibit good interrater reliability and, when correlated with one another, have been found to exhibit considerable overlap (Engram & Vandergoot, 1978).

Dispositional Assessment
Following the gatekeeping dispute in Ward v. Wilbanks (2010), in which a graduate student in counselor education refused to work with a gay client, and the litigation that ensued upon that student’s dismissal from their program, the need for a reliable method for evaluating counseling student dispositions has become increasingly apparent. This high-profile legal case also highlighted the need to monitor and document student dispositions (Dugger & Francis, 2014; McAdams et al., 2007). Correspondingly, in 2009, the Council for Accreditation of Counseling and Related Educational Programs (CACREP) released standards that made monitoring student dispositions a mandatory aspect of program evaluation. In the 2016 CACREP standards, the expectation for the assessment of counselor-in-training dispositions was expanded to include the monitoring of dispositions at multiple points throughout students’ enrollment in a counselor education program. The accreditation expectations for screening at the point of admission are found in Section I.L., where the standards delineate the expectation that counseling programs consider dispositions (CACREP, 2015). Dispositions for consideration include relationship skills and cultural sensitivity.

As the need for dispositional appraisal has become increasingly imperative in the counselor education profession, there have been various efforts to design specific approaches to assess student dispositions (Frame & Stevens-Smith, 1995; Kerl et al., 2002; Lumadue & Duffey, 1999; McAdams et al., 2007; Redekop & Wlazelek, 2012; Williams et al., 2014). One early approach was the utilization of standardized personality tests (Demos & Zuwaylif, 1966; Utley Buensuceso, 2008). However, the use of personality tests fell into disfavor because of the potential for conflicts with the Americans with Disabilities Act (U.S. Department of Justice, 2010) as well as for their inherent deficit orientation. Consequently, the use of standardized tests has been generally replaced by rating scales and rubrics (Panadero & Jonsson, 2013).

One reason that rubrics were considered superior to rating scales was their transparency (Panadero & Jonsson, 2013). Transparency empowers students by equipping them with an understanding of expectations for performance prior to their creating a product or performing a skill. Rubrics also have greater potential to align with learning outcomes and they provide useful direct feedback to students (Alexander & Praeger, 2009; Panadero & Jonsson, 2013).

Examples of dispositional assessments for counselors include the Counselor Characteristics Inventory (Pope, 1996), an inventory that assesses personality characteristics of effective counselors. Also, Spurgeon et al. (2012) described a process that includes a Likert-style assessment of dispositional traits. In addition, Swank et al. (2012) developed the Counseling Competencies Scale (CCS), a tool for measuring counselor competence. Frame and Stevens-Smith (1995) developed a 5-point Personal Characteristics Evaluation Form, and finally, Lumadue and Duffey (1999) published a Professional Performance Fitness Evaluation to evaluate specific behaviors of pre-professional counselors. Few studies of the reliability and validity of the tools were found in published research, especially related to admissions. However, some do have limited published psychometric research and in some cases norms (Flynn & Hays, 2015; Pope, 1996; Swank et al., 2012; Taub et al., 2011).

One example of a dispositional tool for counselor education with published psychometrics is the Counselor Personality Assessment (CPA) developed by Halinski (2010). The CPA is a 28-item scale with a reported Cronbach’s alpha reliability of .82. Another tool, the CCS (Swank et al., 2012), is a 32-item rubric for measuring counseling skills, professional conduct, and professional dispositions in practicum. Cronbach’s alpha for the CCS was reported at .93, and interrater reliability was reported at .57. Criterion validity was established by correlating the CCS score with the semester grade and was reported as moderate. The available psychometric data for the CPA and CCS represent exceptions; in general, the lack of psychometric information may limit confidence in available assessment tools for appraising counseling students’ dispositions.
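For readers unfamiliar with the alpha coefficients reported above, the following is a minimal sketch of how Cronbach’s alpha is computed from an item-response matrix. The function and the ratings matrix are illustrative assumptions, not the authors’ analysis or data.

```python
# Minimal sketch of Cronbach's alpha, the internal-consistency index reported
# for the CPA (.82) and CCS (.93). The ratings below are hypothetical.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 5 respondents x 4 items on a 1-5 scale
ratings = np.array([
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
alpha = cronbach_alpha(ratings)  # ~0.93 for this toy matrix
```

Alpha rises when items covary strongly relative to their individual variances, which is why a homogeneous rubric such as the CCS can reach values above .90.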

Interrater Reliability
     Interrater reliability, essentially the extent to which raters assign the same scores when observing the same behaviors (McHugh, 2012), is critical for fairness to applicants in counseling admissions interviews. Gwet (2014) stated, “If the inter-rater reliability is high, then raters can be used interchangeably without the researcher having to worry about the categorization being affected by a significant rater factor. Interchangeability of raters is what justifies the importance of inter-rater reliability” (p. 4). Consistency ensures that the data collected are dependable enough for practical use. When interrater reliability is poor, interviews conducted by overly critical raters (hawks) naturally bias scores against applicants relative to interviews rated by less critical raters (doves) within the same applicant pool. Epstein and Synhorst (2008) described interrater reliability as the degree to which different people rate the same behavior in the same way. Thus, interrater reliability can also be understood as rater consensus.
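To make the agreement indices used in this study concrete, here is a minimal sketch of percent agreement and Randolph’s free-marginal kappa for multiple raters. The ratings are hypothetical, and the function names are assumptions for illustration, not the study’s actual procedure.

```python
# Minimal sketch of two absolute-agreement indices: percent agreement and
# Randolph's free-marginal kappa. Ratings below are hypothetical.
from itertools import combinations

def percent_agreement(ratings):
    """Mean proportion of agreeing rater pairs per subject."""
    per_subject = []
    for subject in ratings:                      # one list of labels per subject
        pairs = list(combinations(subject, 2))
        agree = sum(1 for a, b in pairs if a == b)
        per_subject.append(agree / len(pairs))
    return sum(per_subject) / len(per_subject)

def free_marginal_kappa(ratings, k):
    """Chance agreement is fixed at 1/k when raters may use any of k categories."""
    p_obs = percent_agreement(ratings)
    p_chance = 1 / k
    return (p_obs - p_chance) / (1 - p_chance)

# Hypothetical: 4 applicants, 3 raters, 3 rubric levels
ratings = [
    [3, 3, 3],
    [2, 2, 3],
    [1, 1, 1],
    [2, 3, 3],
]
p = percent_agreement(ratings)           # 2/3 of rater pairs agree
kappa = free_marginal_kappa(ratings, 3)  # 0.5 after correcting for chance
```

Free-marginal kappa is appropriate here because admissions raters are not constrained to assign a fixed number of applicants to each rubric level, so chance agreement can be fixed at 1/k.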

Purpose of the Present Study
     Effectively screening and selecting new entrants is one of the hallmarks that distinguishes a profession. Unfortunately, there is a dearth of available literature on assessment tools for rating admissions interviews. Further, lack of information on the reliability of the tools that exist represents a significant deficiency in professional literature (Johnson & Campbell, 2002). The Professional Disposition Competence Assessment—Revised Admission (PDCA-RA; Freeman & Garner, 2017; Garner et al., 2016) is a global rubric designed to assess applicant dispositions in brief graduate program interviews. The PDCA-RA includes a video training protocol developed to facilitate consistency across raters in scoring admissions interviews on dispositional domains.

The purpose of the study was to examine the internal stability and the interrater reliability of the PDCA-RA. The rationale for the study was that no similar rubrics assessing dispositions at admissions using training videos were found in published research, suggesting a gap in the literature. Interrater reliability was the key focus of this study because of the importance of interrater reliability for rubrics utilized in situations with multiple raters, a typical scenario for counselor education admissions processes.


Method

Participants
     Raters for the study included 70 counselor educators, counseling doctoral students, adjunct faculty, and site supervisors. Counselor educators, doctoral students, and adjunct faculty at two universities were asked to participate in trainings on the new admissions screening tool. Site supervisors providing supervision for practicum and internship students at the two universities were offered training in the PDCA-RA as a component of continued professional development to maintain their supervision status. Training in both instances was free and included professional development credits. Informed consent for participation was obtained from all participants in accordance with ACA ethical codes (ACA, 2014) and IRB oversight at both universities. All participants in the study fully completed the PDCA-RA video-based training. The mean age of the raters was 43.9 years (SD = 11.4, range 24–72). Sixty-four percent identified as female and 36% identified as male. Mean years of experience as a faculty member or field supervisor was 12.2 (SD = 9.7, range 1–50). Ninety-three percent identified as White/Caucasian, 6% as Latino/a, and 1% as another ethnicity.

The counselor educators (27% of the sample) were primarily from two CACREP-accredited counseling programs in the Western United States. Participating universities included one private university and a state research university, both with CACREP-accredited programs. Counselor education doctoral students and adjunct faculty participants comprised 7% of the sample. The doctoral students participated in the training because they were involved as raters of master’s-level counselor education applicants in the admissions process at one institution. The remaining 66% of the participants were field site supervisors. Because field site supervisors were involved in gatekeeping, attending training in dispositional assessment was natural to their role as internship site supervisors. 

Measure: PDCA-RA
     The PDCA-RA was developed on the basis of the Professional Disposition Competence Assessment (PDCA; Garner et al., 2016). The PDCA, a dispositional gatekeeping tool, was revised to the Professional Disposition Competence Assessment-Revised (PDCA-R) after several rounds of use and with feedback from expert panels (Freeman & Garner, 2017). Advice from legal counsel was also reflected in the revision of the PDCA to the PDCA-R. The PDCA-R was originally used for both gatekeeping and admissions purposes, but it was determined that the PDCA-R was best used for gatekeeping, not for admissions screening, because the tool implied that the rater had prior knowledge of the student. Because this is often not the case in admissions screening, the PDCA-RA was developed.

The PDCA, PDCA-R, and PDCA-RA were conceptualized and developed through a comprehensive review of the literature, several rounds of field testing, and adjustments from expert faculty panels at two institutions. In addition to counseling literature on impairment and expert panel feedback, the Five-Factor Model, often referred to as the “Big Five” (Costa & McCrae, 1992), influenced three of the nine dispositional items. The Five-Factor Model consists of five personality traits consistently associated with positive mental health, academic success, and healthy habits and attitudes across the life span: Extraversion, Agreeableness, Conscientiousness, Emotional Stability, and Openness. The PDCA-RA dispositions are identical to those of the PDCA-R, with the exception of the disposition of Ethics. Ethics was removed from the PDCA-RA because its description assumed knowledge of professional ethical standards, an unrealistic expectation for program applicants with no prior training in counseling. The behavioral descriptions in the PDCA-RA were narrowed to include only behaviors that can be observed in admissions interviews with no prior knowledge of the applicants. In addition, the rubric item descriptions were shortened to align with the practical context of brief (20- to 30-minute) admissions interviews, in which there may be limited time for in-depth assessment.

If dispositions are understood as traits, per the definition of dispositions in the CACREP glossary (CACREP, 2015), the PDCA-RA does not technically measure dispositions directly. Based on advice from legal counsel, as well as the practicality of assessing applicants during short admissions interviews, the PDCA-RA assesses behaviors associated with dispositions rather than the dispositions themselves. The behaviors identified for each disposition can be observed during a short admissions interview, whereas personality traits would require a more in-depth assessment approach, one that counselor educators fear might be found legally problematic (Freeman et al., 2019; Schuermann et al., 2018).

The nine dispositions assessed in terms of observable behaviors via the rubric are Conscientiousness, Coping and Self-Care, Openness, Cooperativeness, Moral Reasoning, Interpersonal Skills, Cultural Sensitivity, Self-Awareness, and Emotional Stability. Each disposition in the PDCA-RA is rated at one of three levels: developing, meets expectation, and above expectation. The PDCA-RA is described in more detail in a manual that includes the tools as well as three suggested admissions questions for each of the nine dispositions (Freeman & Garner, 2017). For faculty ratings of the original PDCA rubric, internal consistency was estimated at a Cronbach’s alpha of .94 (Garner et al., 2016); Cronbach’s alpha was .82 for self-ratings and .89 for peer ratings. The modifications from the original PDCA to the PDCA-RA were minimal and unlikely to substantially affect these estimates of internal consistency.
Video-Based Training Protocol
     A video-based training protocol was developed for the purpose of training faculty in counselor education programs, doctoral students, site supervisors, and other admissions raters to use the PDCA-RA to assess the dispositions of graduate program applicants (Freeman & Garner, 2017; Garner et al., 2016). The video was presented to participants by a trainer, who also greeted participants, obtained informed consent, passed out PDCA-RA forms when prompted by the training video, and collected completed PDCA-RA forms for later analysis. Training in the use of the PDCA-RA was important not only as a mechanism to establish interrater reliability but also as a means of informing adjustments to the tool during its initial iterative development process. Development of the video-based training protocol progressed through several stages. At first, the original 90-minute training consisted of a faculty team of seven working together as a group to read and discuss each disposition, after which each faculty member viewed an admissions interview video and rated the applicant independently. Faculty then discussed their ratings, leading to subtle adjustments in the rubric item descriptions. Additional benefits of the training included increased faculty awareness of “dove” and “hawk” (lenient and severe) rating tendencies and of interviewer bias. With continued training and feedback, the original training protocol was significantly improved.

To complete the next step in the creation of the video-based training protocol, counseling student volunteers were offered a minimal incentive to come to the film studio, and after signing waivers to allow the film clips to be used, the student volunteers were asked to respond to various admissions interview questions. The faculty filming the students instructed them to “give a strong answer” or “give a weak answer.” The researchers treated all responses as unscripted role plays. The questions asked by the interviewer for each disposition were those found in the PDCA-RA materials (Freeman & Garner, 2017). Finally, the authors and developers of the training video reviewed over 100 film clips, removed those in which the acting interfered with the purpose of the video, and rated the remaining clips using the PDCA-RA, resulting in ratings of 1, 3, or 5. These numerical ratings corresponded to descriptive ratings of developing, meets expectation, and above expectation, respectively. Clips in which the researchers found the rating to be difficult were removed from consideration. In selecting the final 18 clips (two for each of nine dispositions), the researchers considered diversity in age, ethnicity, gender, and disability of the student volunteers. The goal was to create video clips of student volunteers with diverse characteristics.

The result was a video-based training protocol that could still be completed by trainees in 90 to 120 minutes. The video training protocol began with an introduction to the PDCA-RA, followed by prompts to rate the video-recorded vignettes using the PDCA-RA prior to receiving training. This initial rating of the vignettes was considered the pretest condition. Training on the application of the PDCA-RA to the vignettes was next. Training included revealing ideal scores as determined by the authors, the reasoning behind the scoring, and opportunities to discuss scoring among participants. Following the training on the PDCA-RA, participants were, once again, given the PDCA-RA rubric along with a new set of video-recorded vignettes. This was considered the posttest condition. Participants were asked to rate the new vignettes using the PDCA-RA.

The video-based training protocol, designed for use in small groups, allowed for group discussion of ratings after each participant completed the PDCA-RA independently. This was indicated by an on-screen message reading, “Pause video for discussion.” The training video ended with a narrator’s discussion of how to use the PDCA-RA in actual admissions interviews, including comments on cultural sensitivity in admissions interviews.

The video-based training protocol was used as the means of training participants in dispositional assessment. The purpose of the trainings was to increase the consistency of admissions raters in evaluating the admissions interviews of applicants to a master’s-level counselor education program. Typically, participants completed the video training in small groups of approximately six to 10 people. In addition to viewing the training video, participants took part in group discussion and reached consensus on group ratings of the video clips. Coming to a consensus on ratings, which also included feedback on rubric items and video clips, was an important aspect of the training.

Statistical Analysis
     The PDCA-RA scores from the counselor education faculty, adjunct faculty, doctoral students, and site supervisors’ ratings of the vignettes before training served as the pretest, or baseline, interrater reliability. The PDCA-RA scores after participants were trained in the tool served as the posttest. The intraclass correlation coefficient (ICC) was calculated as a measure of interrater reliability; interrater reliability correlations quantify rater subjectivity (Herman et al., 1992). The ICC was calculated for pretest and posttest scores. Cronbach’s alpha coefficients were calculated for internal consistency, and Fleiss’ kappa (κ) was calculated for absolute agreement. In addition, Fleiss’ free-marginal kappa (κfree) and percent overall agreement were calculated. Calculations were made for both the pretest and posttest ratings, and a t-test was conducted in SPSS to determine whether training improved interrater reliability.

Results
The ICC estimates and associated 95% confidence intervals were calculated using SPSS version 23, based on a single-rating, absolute-agreement, two-way random-effects model. The single-measures ICC for absolute agreement was .53 (95% CI [.333, .807]) for the pretest administration of the PDCA-RA and .76 (95% CI [.582, .920]) for the posttest administration. Cronbach’s alpha was calculated at .99 for both pretest and posttest scores. Pretest and posttest ICCs were compared using a t-test with an a priori significance level of .05. The test was significant (p < .05), suggesting a difference between pretest and posttest reliability, with reliability improving from the “moderate” range to the “good” range (Koo & Li, 2016) with training.
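The single-rating, absolute-agreement, two-way random-effects ICC reported here (often labeled ICC(2,1)) can be computed directly from the two-way ANOVA mean squares of a subjects-by-raters rating matrix. The study used SPSS; a minimal NumPy sketch of the same model, illustrated with the well-known Shrout and Fleiss (1979) example data rather than the study's ratings, might look like this:

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single measures.

    `ratings` is an (n subjects x k raters) matrix of scores.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # subject (row) means
    col_means = ratings.mean(axis=0)   # rater (column) means

    # Mean squares from the two-way ANOVA decomposition
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)  # subjects
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)  # raters
    sse = ((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                       # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Classic Shrout & Fleiss (1979) example: 6 targets rated by 4 judges
example = np.array([
    [9, 2, 5, 8],
    [6, 1, 3, 2],
    [8, 4, 6, 8],
    [7, 1, 2, 6],
    [10, 5, 6, 9],
    [6, 2, 4, 7],
], dtype=float)
print(round(icc_2_1(example), 2))  # 0.29, the published ICC(2,1) for these data
```

Because absolute agreement is required, systematic rater severity (the `msc` term) lowers the coefficient, which is the appropriate behavior when interchangeable admissions raters are assumed.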

Using Excel, kappa (κ) was calculated as a measure of overall agreement for pretest and posttest scores. This form of kappa, extended by Fleiss (1971), accommodates multiple raters such as those rating the PDCA-RA. Assumptions underpinning Fleiss’ kappa include categorical data (i.e., nominal or ordinal) with mutually exclusive categories, symmetrical cross-tabulations, and independence of raters. The data in this study met all assumptions: they were ordinal, with three mutually exclusive response categories for each dispositional area assessed, which resulted in all cross-tabulations being symmetrical. Although raters were trained in a collaborative setting in which discussion about ratings was fostered, the actual ratings were made without discussion, so the raters were independent of one another. Pretest scores for the nine rubric items reflected a κ of .33, fair agreement according to Landis and Koch (1977). After training, posttest scores on the nine items reflected a κ of .55, moderate agreement according to Landis and Koch.
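Fleiss' (1971) kappa is computed from a subjects-by-categories count matrix, where each cell holds the number of raters who placed that subject in that category. The study computed it in Excel; a minimal Python sketch of the same calculation, with illustrative counts rather than the study's data:

```python
import numpy as np

def fleiss_kappa(counts: np.ndarray) -> float:
    """Fleiss' kappa for `counts`, an (N subjects x q categories) matrix
    whose cell (i, j) is the number of raters placing subject i in
    category j. Every subject must be rated by the same number of raters."""
    n_raters = counts.sum(axis=1)[0]
    # Per-subject observed agreement across all rater pairs
    p_i = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_obs = p_i.mean()
    # Chance agreement from the marginal category proportions
    p_j = counts.sum(axis=0) / counts.sum()
    p_exp = (p_j ** 2).sum()
    return (p_obs - p_exp) / (1 - p_exp)

# Two raters, three categories (1 = developing, 3 = meets, 5 = above)
perfect = np.array([[2, 0, 0], [0, 2, 0], [0, 0, 2]], dtype=float)
print(fleiss_kappa(perfect))  # 1.0 for perfect agreement
```

Because the chance term is built from the observed category marginals, skewed marginal distributions can depress κ even when raw agreement is high, which motivates the free-marginal variant reported next.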

As an additional analysis, percent overall agreement and κfree were calculated. κfree is appropriate when raters do not know how many cases should be distributed into each category; in addition, κfree is resistant to influence by prevalence and bias (Randolph, 2005). Percent overall agreement measures agreement between raters directly and has historically also been used to quantify interrater reliability (McHugh, 2012). As Table 1 illustrates, κfree for the pretest was .36, with a percent overall agreement of 57.6%; for the posttest, κfree was .56, with a percent overall agreement of 70.4%. The pretest-to-posttest changes in both κfree and percent overall agreement provide additional evidence that training improved the agreement of dispositional ratings on the PDCA-RA.
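Randolph's free-marginal kappa replaces the marginal-based chance term with a fixed 1/q for q categories, so it depends only on the observed proportion of agreement. A sketch; plugging in the study's reported agreement proportions (q = 3 rating levels) recovers the reported kappas:

```python
def kappa_free(p_obs: float, q: int) -> float:
    """Free-marginal multirater kappa (Randolph, 2005):
    chance agreement is fixed at 1/q for q rating categories."""
    p_chance = 1.0 / q
    return (p_obs - p_chance) / (1.0 - p_chance)

# The study's observed agreement values, with q = 3 categories
print(round(kappa_free(0.576, 3), 2))  # pretest:  0.36
print(round(kappa_free(0.704, 3), 2))  # posttest: 0.56
```

This is why κfree is unaffected by prevalence: two rating tasks with the same raw agreement yield the same κfree regardless of how ratings pile up in particular categories.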

Table 1

Pre and Post Statistics: Percent Overall Agreement and Free-Marginal Fleiss’ Kappa

Time of Rating         Percent Overall Agreement   Free-Marginal Kappa   95% CI for Free-Marginal Kappa
Before Training: Pre   57.6                        .36                   [.23, .49]
After Training: Post   70.4                        .56                   [.31, .80]


The overall κ, κfree, and percent agreement results were promising, but a comparison of the percent of correct responses (the response intended by the research team) by disposition showed that correct responses decreased by more than 2% from pre- to posttesting for three dispositions (Openness, Cooperativeness, and Moral Reasoning). Because this was an unexpected finding, the research team analyzed the incorrect responses and learned that raters appeared better able to discern the difference between a rating of 1 (developing) and 3 (meets expectation) than between 3 and 5 (above expectation). As a post-hoc analysis, we calculated the percent of agreement with the correct score after collapsing the 3 and 5 ratings. The percent of correct responses with dichotomous categories, 1 versus a collapsed category for 3 and 5, is shown in Table 2. As is evident in Table 2, using the collapsed category, the percent of correct responses for eight of the nine dispositions improved from pretest to posttest. The percent of correct responses for one disposition, Cooperativeness, decreased by more than 2% from pretest to posttest.
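The post-hoc collapsing step amounts to a simple recode: ratings of 3 and 5 are mapped to one category before agreement with the intended score is tallied. A sketch with hypothetical ratings (not the study's data):

```python
def percent_correct(ratings, intended, collapse=False):
    """Percent of raters matching the intended score.

    With collapse=True, 3 (meets expectation) and 5 (above expectation)
    are merged into one category, so only the 1-vs-higher distinction
    is scored.
    """
    recode = (lambda r: 1 if r == 1 else 3) if collapse else (lambda r: r)
    target = recode(intended)
    hits = sum(1 for r in ratings if recode(r) == target)
    return 100.0 * hits / len(ratings)

# Hypothetical: intended score 5, raters split between 3 and 5
ratings = [5, 3, 5, 3]
print(percent_correct(ratings, 5))                 # 50.0 on the full scale
print(percent_correct(ratings, 5, collapse=True))  # 100.0 once 3 and 5 merge
```

The example mirrors the pattern observed in the study: disagreement concentrated at the 3-versus-5 boundary disappears under the dichotomous scoring, while failure to screen out a 1 (developing) rating would still register as disagreement.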

Table 2

Pre and Post Percent of Correct Responses by Disposition

Disposition               Pre Percent          Post Percent         Pre Percent          Post Percent
                          Overall Agreement    Overall Agreement    Overall Agreement    Overall Agreement
                          (1, 3, 5)            (1, 3, 5)            (1, 3 & 5)           (1, 3 & 5)

1. Conscientiousness      62.0                 97.1                 77.1                 98.6
2. Coping & Self-Care     59.9                 94.4                 22.9                 97.1
3. Openness               51.0                 49.4                 94.3                100.0
4. Cooperativeness        47.3                 39.0                 94.3                 87.1
5. Moral Reasoning        84.1                 68.8                 91.4                 98.6
6. Interpersonal Skills   48.0                 94.4                 98.6                 97.1
7. Cultural Sensitivity   69.3                 94.4                100.0                100.0
8. Self-Awareness         40.7                 40.0                 54.3                 64.3
9. Emotional Stability    56.3                 56.5                 67.1                 95.7


Discussion
The results of the study suggest that the PDCA-RA has potential as a reliable instrument for assessing counseling applicants at the point of program admission. The PDCA-RA demonstrated strong reliability from the standpoint of internal consistency. The interrater reliability, as measured by the ICC, moved from the “moderate” to the “good” range with the application of the standardized training protocol.

The results of the study also provide evidence that counselor educators, supervisors, and doctoral students can improve their agreement on ratings of student dispositions with adequate and appropriate training. Multiple statistical techniques for measuring agreement (the ICC, κ, κfree, and percent agreement), computed under pre-training and post-training conditions, demonstrated overall improvement in rater agreement with training. The observed post-training improvement in interrater reliability corroborates the literature, underscoring the necessity of training protocols as the pathway to improved interrater reliability (Jonsson & Svingby, 2007).

The results of the second analysis, conducted by collapsing the meets expectation and above expectation categories, suggest that the PDCA-RA has higher reliability as a tool to screen out inappropriate candidates than to distinguish excellence within the pool of acceptable candidates. For programs seeking to eliminate problematic applicants, the PDCA-RA could prove reliable. However, for programs that aim to select a small group of students from a large pool of acceptable applicants, the PDCA-RA may be less reliable from an interrater reliability perspective. The PDCA-RA item descriptions for above expectation need further consideration.

The percent of correct responses after training with collapsed categories was over 87% for seven of the nine dispositions. The results suggest that the PDCA-RA or the PDCA-RA training protocol needs revision on two dispositions, Cooperativeness and Self-Awareness. The decrease in correct responses for Cooperativeness may be due to a posttest interview with a higher level of difficulty than the pretest interview. The posttest percent was 87%, suggesting that overall the rubric descriptions functioned acceptably, though not excellently, with this sample of raters. The percent of correct ratings for Self-Awareness increased from pre- to posttesting, but only to 64% agreement. One explanation could be that the Self-Awareness rubric descriptions are behavioral (as recommended by legal counsel), yet Self-Awareness as a trait is difficult to describe in behavioral terms. This could leave raters confused about the difference between their intuitive sense of an applicant’s self-awareness and the narrow behavioral descriptions on the rubric. An alternative explanation is that there is a lack of agreement in the profession on the extent of self-awareness expected of students entering an academic program, leading some raters to find an applicant’s level of self-awareness acceptable while others found it unacceptable. In either case, the training protocol for the PDCA-RA, and perhaps the rubric descriptions, need improvement. The 100% posttest agreement on the dichotomous categories for Openness and Cultural Sensitivity was encouraging, given the critical importance of these two dispositions (Freeman et al., 2019).

Interrater reliability is of paramount importance for the responsible use of rubrics. To improve the interrater reliability of the PDCA-RA, three issues may need to be addressed. First, the training protocol may need to be lengthened to encompass three rather than two opportunities to rate video clips. Second, structuring the discussion between raters with questions that focus attention on gaps in ratings could be beneficial. Third, because alternate forms of the videos are used in the training (different actors with different responses to the same question), the complexity of the video clips should be compared. It may be desirable to revise the training protocol to use less complex responses for Part 1, interviews of equivalent complexity for Part 2, and more complex interview responses for Part 3. More complex responses, meaning responses that are partially descriptive of two categories on the rubric, are realistic to actual admissions interviews in the field.

In conducting trainings for the PDCA-RA, a notable observation was that raters appeared predisposed to using their own subjective experience to rate the video interviews instead of applying the item descriptions in the rubric. The trainers often observed that a disposition title, such as Self-Awareness, triggered automatic confidence among raters in their ability to judge self-awareness without carefully reading the rubric descriptions. The tendency of raters to believe they are “right” rather than applying a rubric description is a potential barrier for any dispositional measure.

Implications of the Study
     The implications of this study relate primarily to counselor education programs. As evident from the review of literature, careful admissions processes are critical to prevent or diminish the number of gatekeeping and remediation situations that occur in academic programs after admission. In addition to the importance of fair admissions procedures from a legal perspective, the effort applicants invest in the application process further justifies developing fair processes in which acceptance or denial decisions are not based solely on faculty subjectivity.

For those academic programs utilizing admissions interviews, one important implication of the study is that the results suggest that without training, raters will have high variability in their ratings of admissions applicants, as illustrated by the variability of the pretest scores in this study. Structuring the rating of admissions interviews by using an assessment is one method of mitigating the variability of faculty ratings of applicants. A holistic (global) rubric such as the PDCA-RA is unlikely to ever garner the almost perfect interrater reliability associated with analytic rubrics, but the PDCA-RA is available as one practical, field-tested tool with promising reliability to help facilitate transparent and fair admissions interview rating processes.

Limitations and Future Research
     In light of the lack of an established list of professional dispositions, the PDCA-RA’s utility may be limited, as the selected dispositions may not align with the values of all counselor education programs. A second limiting factor is that the sample included both field site supervisors and faculty, and all participants were from the rural Western United States; the reliability of the tool is therefore bounded by the demographics of the sample. Another limitation was that the study’s pretest and posttest video clips, although similar, differed from one another. The initial decision to use different pretest and posttest video clips was an attempt to reduce the influence of testing as a threat to internal validity. However, it also introduced the possibility that one set of video clips was inherently easier or more difficult to rate than the other. Future research could randomly counterbalance pretest and posttest video clips, or use the same clips at both points, to rule out the possibility that differences between the clip sets, rather than the training itself, were responsible for the improvement in score reliability. A further potential limitation is that some of the graduate students filmed in the vignettes may have been known to six of the faculty members from one of the institutions. The impact of this possibility was reduced by the use of multiple student actors, but prior knowledge of a student could have influenced raters’ scores.

A final issue for consideration is the decision to use site supervisors as raters for the research. Site supervisors more commonly use the PDCA-R than the PDCA-RA, the version specific to admissions screening; supervisors use the PDCA-R to monitor student dispositions and to communicate with counselor educators and counseling program clinical personnel. Further, at least one of the counselor education programs involves site supervisors in the admissions process. Because the training protocol for both versions of the PDCA is the same, and site supervisors routinely participate in the training, the decision was made to include site supervisors as raters. It is possible, however, that site supervisors may differ from counselor educators, adjunct faculty, and doctoral students in how they respond to the training protocol.

A possibility for future research is to measure the extent to which the improvement in reliability can be maintained over time. At this point, little is known about whether and how often educators and site supervisors would need training updates to function optimally as raters of student dispositions. Accordingly, rating reliability could be observed at intervals of 3 months, 6 months, or 1 year after training to monitor decay.

Future research is also needed to determine the extent to which the length of the training protocol influences interrater reliability. In addition, cultural and gender bias in the use of the PDCA-RA should be studied, as one criticism of rubrics is the potential for cultural bias.

As a tool for consistently rating counselor education program applicants, the PDCA-RA demonstrates potential, though more research needs to be conducted to increase the interrater reliability. Training improved the interrater reliability results but not to the extent that excellent interrater reliability was achieved. Adjusting the training protocol may be fruitful as a mechanism to improve interrater reliability.

Conclusion
There is a need for reliable admissions tools to assess the dispositional behaviors of counseling program applicants. Interrater reliability is an important form of reliability in situations such as admissions interviews, in which multiple raters are often involved in the process. The importance of interrater reliability is founded in the critical premises of fairness and transparency to applicants, though legal protection of counselor education programs is also enhanced by using clear, standardized processes. Dispositional assessment is in its infancy, especially as applied to counselor education in general and to program admissions in particular. Exactly how dispositions should be defined, and how dispositional assessment will serve as a means of selection and gatekeeping for the profession, has yet to be determined. Yet counselor educators perceive both an ethical and a professional responsibility to monitor counseling student dispositions as a means of safeguarding the integrity of the profession (Freeman et al., 2019; Schuermann et al., 2018). The continued development of the PDCA-R and the PDCA-RA, as well as the associated training materials, represents initial steps toward standardizing and improving dispositional appraisal. The video-based training, and the exploration of the training as a means of improving rater consistency, will potentially increase the ability of counselor educators to consistently assess and monitor developing counseling students. Consistent dispositional ratings can also contribute to the development of a common language for discussing student progress. The current research represents a promising effort to continually improve the dispositions assessment process for counselor educators, counseling programs, and the counseling profession.


Conflict of Interest and Funding Disclosure
The authors reported no conflict of interest
or funding contributions for the development
of this manuscript.


References
Alexander, C. R., & Praeger, S. (2009, June). Smoke gets in your eyes: Using rubrics as a tool for building justice into assessment practices [Paper presentation]. Annual Conference of the Australian Teacher Education Association (ATEA), Australia.

American Counseling Association. (2014). ACA code of ethics.

Bryant, J. K., Druyos, M., & Strabavy, D. (2013). Gatekeeping in counselor education programs: An examination of current trends. In Ideas and research you can use: VISTAS 2013. American Counseling Association.

Carkhuff, R. R. (1969). Critical variables in effective counselor training. Journal of Counseling Psychology, 16(3), 238–245.

Cole, B. S. (1991). Legal issues related to social work program admissions. Journal of Social Work Education, 27(1), 18–24.

Costa, P. T., Jr., & McCrae, R. R. (1992). NEO PI-R professional manual. Psychological Assessment Resources, Inc.

Council for Accreditation of Counseling and Related Educational Programs. (2009). CACREP 2009 standards.

Council for Accreditation of Counseling and Related Educational Programs. (2015). CACREP 2016 standards.

Crawford, M., & Gilroy, P. (2013). Professional impairment and gatekeeping: A survey of master’s level training programs. The Journal of Counselor Preparation and Supervision, 5(1).

Demos, G. D., & Zuwaylif, F. H. (1966). Characteristics of effective counselors. Counselor Education and Supervision, 5(3), 163–165.

Dugger, S. M., & Francis, P. C. (2014). Surviving a lawsuit against a counseling program: Lessons learned from Ward v. Wilbanks. Journal of Counseling & Development, 92(2), 135–141.

Engram, B. E., & Vandergoot, D. (1978). Correlation between the Truax and Carkhuff scales for measurement of empathy. Journal of Counseling Psychology, 25(4), 349–351.

Epstein, M. H., & Synhorst, L. (2008). Preschool behavioral and emotional rating scale (PreBERS): Test–retest reliability and inter-rater reliability. Journal of Child and Family Studies, 17(6), 853–862.

Fleiss, J. L. (1971). Measuring nominal scale agreement among many raters. Psychological Bulletin, 76(5), 378–382.

Flynn, S. V., & Hays, D. G. (2015). The development and validation of the Comprehensive Counseling Skills Rubric. Counseling Outcome Research and Evaluation, 6(2), 87–99.

Frame, M. W., & Stevens-Smith, P. (1995). Out of harm’s way: Enhancing monitoring and dismissal processes in counselor education programs. Counselor Education and Supervision, 35(2), 118–129.

Freeman, B. J., & Garner, C. M. (2017). Professional Dispositions Competency Assessment, Revised. Unpublished instrument, ScholarWorks.

Freeman, B. J., Garner, C. M., Fairgrieve, L. A., & Pitts, M. E. (2016). Gatekeeping in the field: Strategies and practices. Journal of Professional Counseling: Practice, Theory & Research, 43(2), 28–41.

Freeman, B. J., Garner, C. M., Scherer, R., & Trachok, K. (2019). Discovering expert perspectives on dispositions and remediation: A qualitative study. Counselor Education and Supervision, 58(3), 209–224.

Garner, C. M., Freeman, B. J., & Lee, L. (2016). Assessment of student dispositions: The development and psychometric properties of the professional disposition competence assessment (PDCA). In Ideas and research you can use: VISTAS 2016. American Counseling Association.

Gwet, K. L. (2014). Handbook of inter-rater reliability: The definitive guide to measuring the extent of agreement among raters (4th ed.). Advanced Analytics.

Halinski, K. H. (2010). Predicting beginning master’s level counselor effectiveness from personal characteristics and admissions data: An exploratory study [Doctoral dissertation, University of North Texas].

Herman, J. L., Aschbacher, P. R., & Winters, L. (1992). A practical guide to alternative assessment. Association for Supervision and Curriculum Development.

Hernández, T. J., Seem, S. R., & Shakoor, M. A. (2010). Counselor education admissions: A selection process that highlights candidate self-awareness and personal characteristics. Journal of Counselor Preparation and Supervision, 2(1), 74–87.

Hutchens, N., Block, J., & Young, M. (2013). Counselor educators’ gatekeeping responsibilities and students’ first amendment rights. Counselor Education and Supervision, 52(2), 82–95.

Johnson, W. B., & Campbell, C. D. (2002). Character and fitness requirements for professional psychologists: Are there any? Professional Psychology: Research and Practice, 33(1), 46–53.

Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2(2), 130–144.

Kerl, S., & Eichler, M. (2005). The loss of innocence: Emotional costs to serving as gatekeepers to the counseling profession. Journal of Creativity in Mental Health, 1(3–4), 71–88.

Kerl, S. B., Garcia, J. L., McCullough, C. S., & Maxwell, M. E. (2002). Systematic evaluation of professional performance: Legally supported procedure and process. Counselor Education and Supervision, 41(4), 321–334.

Koerin, B., & Miller, J. (1995). Gatekeeping policies: Terminating students for nonacademic reasons. Journal of Social Work Education, 31(2), 247–260.

Koo, T. K., & Li, M. Y. (2016). A guideline of selecting and reporting intraclass correlation coefficients for reliability research. Journal of Chiropractic Medicine, 15(2), 155–163.

Kopenhaver Haidet, K., Tate, J., Divirgilio Thomas, D., Kolanowski, A., & Happ, M. B. (2009). Methods to improve reliability of video-recorded behavioral data. Research in Nursing & Health, 32(4), 465–474.

Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174.

Lumadue, C. A., & Duffey, T. H. (1999). The role of graduate programs as gatekeepers: A model for evaluating student counselor competence. Counselor Education and Supervision, 39(2), 101–109.

McAdams, C. R., III, Foster, V. A., & Ward, T. J. (2007). Remediation and dismissal policies in counselor education: Lessons learned from a challenge in federal court. Counselor Education and Supervision, 46(3), 212–229.

McCaughan, A. M., & Hill, N. R. (2015). The gatekeeping imperative in counselor education admission protocols: The criticality of personal qualities. International Journal for the Advancement of Counseling, 37, 28–40.

McHugh, M. L. (2012). Interrater reliability: The kappa statistic. Biochemia Medica, 22(3), 276–282.

Miller, S. (2006). Professionalisation, ethics and integrity systems: The promotion of professional ethical standards, and the protection of clients and consumers. A report for the Professional Standards Councils, Centre for Applied Philosophy and Public Ethics, Australia.

Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: A review. Educational Research Review, 9, 129–144.

Pope, V. T. (1996). Stable personality characteristics of effective counselors: The Counselor Characteristic Inventory (Doctoral dissertation). Retrieved from ProQuest Dissertations & Theses Global (Order No. 9625345).

Randolph, J. J. (2005). Free-marginal multirater kappa (multirater κfree): An alternative to Fleiss’ fixed-marginal multirater kappa. Department of Computer Science, 1, 1–20.

Redekop, F., & Wlazelek, B. (2012). Counselor dispositions: An added dimension for admission decisions. In Ideas and research you can use: VISTAS 2012. American Counseling Association.

Rosen, J., Mulsant, B. H., Marino, P., Groening, C., Young, R. C., & Fox, D. (2008). Web-based training and interrater reliability testing for scoring the Hamilton Depression Rating Scale. Psychiatry Research, 161(1), 126–130.

Schuermann, H., Avent Harris, J. R., & Lloyd-Hazlett, J. (2018). Academic role and perceptions of gatekeeping in counselor education. Counselor Education and Supervision, 57(1), 51–65.

Smaby, M. H., Maddux, C. D., Richmond, A. S., Lepkowski, W. J., & Packman, J. (2005). Academic admission requirements as predictors of counseling knowledge, personal development, and counseling skills. Counselor Education and Supervision, 45(1), 43–57.

Spurgeon, S. L., Gibbons, M. M., & Cochran, J. L. (2012). Creating personal dispositions for a professional counseling program. Counseling and Values, 57(1), 96–108.

Swank, J. M., Lambie, G. W., & Witta, E. L. (2012). An exploratory investigation of the counseling competencies scale: A measure of counseling skills, dispositions, and behaviors. Counselor Education and Supervision, 51(3), 189–206.

Swank, J. M., & Smith-Adcock, S. (2014). Gatekeeping during admissions: A survey of counselor education programs. Counselor Education and Supervision, 53(1), 47–61.

Taub, D. J., Servaty-Seib, H. L., Wachter Morris, C. A., Prieto-Welch, S. L., & Werden, D. (2011). Developing skills in providing outreach programs: Construction and use of the POSE (Performance of Outreach Skills Evaluation) rubric. Counseling Outcome Research and Evaluation, 2(1), 59–72.

Truax, C. B., & Carkhuff, R. (1967). Toward effective counseling and psychotherapy: Training and practice. Aldine.

U.S. Department of Justice, Civil Rights Division. (2010). Americans with Disabilities Act Title III Regulations: Part 36 Nondiscrimination on the Basis of Disability in Public Accommodations and Commercial Facilities (CRT Docket No. 106).

Utley Buensuceso, J. M. (2008). The Sixteen Personality Factor Questionnaire and ratings of counselor effectiveness (Order No. 3341140) [Doctoral dissertation, Azusa Pacific University]. ProQuest Dissertations and Theses Global.

Ward v. Wilbanks. (2010). No. 09-CV-112 37, 2010 U.S. Dist. WL 3026428 (E.D. Michigan, July 26, 2010).

Williams, J. L., Williams, D. D., Kautzman-East, M., Stanley, A. L., Evans, W. J., & Miller, K. L. (2014). Assessing student dispositions in counselor training programs: Implications for supervision, program policy, and legal risk management [PowerPoint slides]. DocPlayer.


Curtis Garner, EdD, NCC, NCSC, LCPC, is a professor and department chair at Gonzaga University. Brenda Freeman, PhD, is a professor at the University of Nevada, Reno. Roger Stewart, PhD, is a professor at Boise State University. Ken Coll, PhD, is the Dean of the School of Education at the University of Nevada, Reno. Correspondence may be addressed to Curtis Garner, 502 East Boone Ave., Spokane, WA 99258-0102,

Technology in Counselor Education: HIPAA and HITECH as Best Practice

Tyler Wilkinson, Rob Reinhardt

The use of technology in counseling is expanding, and the ethical use of technology in counseling practice is now a stand-alone section in the 2014 American Counseling Association Code of Ethics. The Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health (HITECH) Act provide a framework of best practices that counselor educators can draw on when incorporating technology into counselor education programs. This article discusses guidelines, standards, and regulations of HIPAA and HITECH that counselor educators can use to design policies and procedures guiding the ethical use of technology in programs that prepare and train future counselors.

Keywords: counselor education, technology, best practice, HIPAA, HITECH

The enactment of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) brought forth a variety of standards addressing the privacy, security and transaction of individual protected health information (PHI; Wheeler & Bertram, 2012). According to the language of HIPAA (2013, §160.103), PHI is defined as “individually identifiable health information” (p. 983) that is transmitted by or maintained in electronic media or any other medium, with the exception of educational or employment records. “Individually identifiable health information” is specified as follows:


Information, including demographic data, that relates to:

  • the individual’s past, present or future physical or mental health or condition,
  • the provision of health care to the individual, or
  • the past, present, or future payment for the provision of health care to the individual, and that identifies the individual for which there is a reasonable basis to believe can be used to identify the individual. Individually identifiable health information includes many common identifiers. (U.S. Department of Health and Human Services [HHS], n.d.-b, p. 4)

The HIPAA standards identify 18 different elements that are considered to be part of one’s PHI. These include basic demographic data such as names, street addresses, elements of dates (e.g., birth dates, admission dates, discharge dates) and phone numbers. It also includes information such as vehicle identifiers, Internet protocol address numbers, biometric identifiers and photographic images (HIPAA, 2013, § 164.514, b.2.i).
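To make the identifier categories concrete, the following Python sketch flags a few identifier types in free text. It is illustrative only: the pattern names and regular expressions are the authors of this sketch's assumptions, and genuine de-identification under the Safe Harbor standard requires removing all 18 categories, not pattern matching alone.

```python
import re

# A few of the 18 HIPAA "Safe Harbor" identifier categories (45 CFR 164.514(b)(2)),
# paired with deliberately naive regular expressions for illustration.
PHI_PATTERNS = {
    "phone_number": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def flag_possible_phi(text: str) -> list[str]:
    """Return the identifier categories whose patterns appear in `text`."""
    return [name for name, pattern in PHI_PATTERNS.items() if pattern.search(text)]

flags = flag_possible_phi("Client admitted 3/14/2015, callback 509-555-0142.")
# flags == ["phone_number", "date"]
```

A screening pass like this could help students notice when a transcript or case note still contains identifying elements before it is stored or transmitted.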

According to language in HIPAA, its standards, requirements and implementation apply only to “covered entities,” which are “(1) a health plan (2) a health care clearinghouse (3) a health care provider who transmits any health information in electronic form in connection with [HIPAA standards and policies]” (HIPAA, 2013, § 160.102). Covered entities must weigh an array of required and suggested privacy and security measures in order to protect individuals’ PHI; failure to protect individuals’ information can result in serious fines. For example, one recent ruling found a university medical training clinic to be in violation of HIPAA statutes because network firewall protection had been disabled; the oversight resulted in a $400,000 penalty (Yu, 2013). Moreover, the implementation of the Health Information Technology for Economic and Clinical Health (HITECH) Act in 2009 increased the fines for failure to comply with HIPAA, with penalties ranging from $100–$50,000 per violation even for individuals claiming they “did not know” (Modifications to the HIPAA Privacy, 2013, p. 5583). The final omnibus ruling of HIPAA–HITECH, enforcing these violations, went into effect on March 26, 2013 (Modifications to the HIPAA Privacy, 2013; Ostrowski, 2014). Enforcement of the changes from the HITECH Act on HIPAA standards began on September 23, 2013, for covered entities (Modifications to the HIPAA Privacy, 2013).


Academic departments and universities must understand the importance of HIPAA and HITECH regulations in order to determine whether the department or university is considered a covered entity. Risk analysis and management need to be employed to avoid violations leading to penalties and fines (HIPAA, 2013, §164.308). Some counselor education programs that have students at medically related practicum or internship sites also may be considered business associates (see HIPAA, 2013, § 160.103) and would need to comply with HIPAA regulations (see HIPAA, 2013, § 160.105). The authors recommend that all counselor education programs confer with appropriate legal sources to understand any risks or liabilities related to HIPAA regulations and relationships with practicum and internship sites. Many states also have their own unique privacy laws that must be considered in addition to those described in HIPAA regulations. This article assumes that a counselor education department is not considered a covered entity under the regulations set forth by HIPAA. However, as an increasing number of counselor education programs incorporate the use of digital videos or digital audio recordings, a need for a set of policies and procedures to guide the appropriate use of digital media is evident.


The authors believe that the regulations set forth by HIPAA and HITECH create a series of guidelines that could dictate best practices for counselor educators when considering how to utilize technology in the collection, storage and transmission of any individual’s electronic PHI (Wheeler & Bertram, 2012) within counselor education programs. HIPAA regulations (2013, §160.103) describe electronic protected health information (ePHI) as any information classified as PHI, as described above, either “maintained by” or “transmitted in” (p. 983) electronic media. For example, audio recordings used in practicum and internship courses are often collected electronically by digital recorders. If the recordings remain on the device, this protected information is being maintained in an electronic format. If the data is shared through e-mail or uploaded to a computer, then it is being transmitted in electronic format. As it relates to counselor training, the PHI that is collected could be real or fictitious (i.e., from someone role playing in the program). Though fictitious information is not necessarily protected, encouraging students to engage in implementing a set of policies and procedures guided by regulations of HIPAA and HITECH creates an experiential milieu whereby students become aware of and learn the importance of security and privacy when handling digital ePHI. The authors will discuss throughout this article how specific regulations from HIPAA and HITECH can be utilized to create a set of policies and procedures that guide the ways in which members of counselor education programs can handle any ePHI they encounter during their training. These direct experiences will give faculty and students greater familiarity with current HIPAA and HITECH regulations, thus making them better prepared to work ethically and legally in modern mental health culture.


This article is not meant to cover HIPAA and HITECH regulations in a comprehensive manner. Overviews of these standards have been written concerning the regulations of HIPAA and HITECH regarding the work of mental health practitioners (see Letzring & Snow, 2011). The degree to which the myriad regulations of HIPAA will be implemented in various counselor education programs will need to be decided by the members of individual programs and by necessary stakeholders. The authors hope to introduce a dialogue regarding the thoughtful use of technology in counselor education programs guided by the parameters set forth by HIPAA.


According to the Substance Abuse and Mental Health Services Administration (SAMHSA; 2013), the trend in mental health care treatment spending is toward growth in public (i.e., Medicare and Medicaid) and private insurance as a means of payment. Spending for all mental health and substance abuse services totaled $172 billion in 2009, accounting for 7.4% of all health care spending that year, and is projected to reach $238 billion by 2020 (SAMHSA, 2014). However, the rate at which individuals pay out-of-pocket for mental health and substance abuse services is expected to decrease steadily (SAMHSA, 2014). Historical trends show out-of-pocket spending decreased from 18% of all spending in 1986 to 11% in 2009 (SAMHSA, 2013, 2014). It is projected that out-of-pocket spending for mental health treatment will level off at approximately 10% of all spending, while Medicaid, Medicare, and private insurance will account for approximately 70% (SAMHSA, 2014). The trend toward greater insurance use will increase the number of professional counselors who either will themselves be, or will work within organizations that are, HIPAA-covered entities. Implementing policies and procedures in counseling departments that incorporate some of the HIPAA regulations is a useful way to prepare future professionals for the working environment they will enter (SAMHSA, 2013).


The implementation of the HITECH Act (2009) as a supplement to HIPAA emphasized the need to make sure future counselors understand the importance of the increasing role of technology in the practice of counseling (Lawley, 2012). The HITECH Act established an expectation that professionals in health care must be familiar with technology, specifically as it relates to policies guiding the storage and transmission of ePHI. The objectives of HITECH include “the electronic exchange and use of health information and the enterprise integration of such information” and “the utilization of an electronic health record for each person in the United States by 2014” (HITECH, 2009, §3001.c.A, emphasis added). Additionally, HITECH strengthened the enforcement of penalties for those who violate HIPAA (Modifications to the HIPAA Privacy, 2013). A multi-tiered system of violations allows for civil money penalties to range from $100–$50,000 per violation (Modifications to the HIPAA Privacy, 2013). The American Counseling Association’s (ACA) 2014 Code of Ethics acknowledged the increasing use of technology by professional counselors by introducing a new section (Section H) addressing the ethical responsibility of counselors to understand proper laws, statutes, and uses of technology and digital media. Ethical counselors are expected to understand the laws and statutes (H.1.b), the uniqueness of confidentiality (H.2.b), and the proper use of security (H.2.d) regarding the use of technology and digital media in their counseling practice.


The mental health care system exists inside the broader health care system. As such, graduates of counseling programs must be familiar with HIPAA regulations and the various modes of technology to implement these regulations (ACA, 2014; Lawley, 2012). Students will be expected to understand what security and privacy standards are required of them once they begin working as counseling professionals (ACA, 2014). For example, the movement toward increased use of ePHI across health care will place increasing demands on students to understand how to appropriately keep electronic data private and secure. Counselor educators need to be mindful of how the use of technology in the practice of counseling is being taught and implemented with counseling students. Counselor educators should thoughtfully consider how students will learn the ways in which technology can be used professionally while maintaining ethical and legal integrity (Association for Counselor Education and Supervision [ACES] Technology Interest Network, 2007; Wheeler & Bertram, 2012). Having standards to guide the use of ePHI throughout counselor education programs is a way in which students can become knowledgeable and skilled regarding the laws and ethics surrounding digital media. Policies and procedures should include information guiding the ways in which students collect, store and transmit digital media (e.g., audio recordings or videotapes) while a member of the counseling program. By requiring students to utilize the ePHI (real or fictitious) they collect in accordance with policies and procedures informed by HIPAA and HITECH, students crystallize their understanding of these complicated laws.


HIPAA Compliance and Technology


Complying with HIPAA Privacy and Security Rules requires individuals to be mindful of policies and procedures, known as “administrative safeguards” (HIPAA, 2013, §164.308, p. 1029), and work to implement safeguards consistently. The HHS has made clear that it does not provide any type of credential to certify that an individual, business, software or device is HIPAA compliant (HHS, n.d.-a; Reinhardt, 2013). Complying with HIPAA rules requires organizations and individuals to address many different processes where choice of hardware or software is only one aspect (Christiansen, 2000). Being HIPAA compliant is less about a certification or a credential on a device and more about having a set of policies and procedures in place that ensure the integrity, availability and confidentiality of clients’ ePHI (Christiansen, 2000; HHS, n.d.-b). Hardware and software technology companies who make claims that a product or an educational resource is HIPAA compliant are likely doing so for marketing purposes. Claims of this type are mostly meaningless (HHS, n.d.-a) and would not provide protection in the case of a breach (HITECH, 2009). Being HIPAA compliant is an “organizational obligation not a technical specification” (Christiansen, 2000, p. 7). The distinction is important for educators to understand as they seek to implement technology in counselor education programs. When establishing a set of policies and procedures within a counseling department, the recommendations set forth in describing the security and privacy of PHI in Part 164 of HIPAA (2013) can be an appropriate framework for establishing best practices for counselors and counselor educators. The general requirements in complying with HIPAA security standards are to ensure the confidentiality, integrity and availability of individuals’ ePHI while protecting against any reasonably anticipated threats to the security and privacy of said ePHI (HIPAA, 2013, §164.306.a). 
The key phrase to consider is that covered entities are asked to protect against any “reasonably anticipated” (HIPAA, 2013, §164.306.a, p. 1028) threat. Educators must understand the importance of spending time considering reasonable, foreseeable risks. A primary responsibility is to create administrative safeguards that address any reasonable, foreseeable risks, which the individual, department or covered entity establishes.


Before looking at key aspects of HIPAA Privacy and Security guidelines, key definitions should be understood:


  • Administrative safeguards include policies and procedures used to manage the development, selection, implementation and security in protecting individuals’ ePHI (HIPAA, 2013, § 164.304).
  • Authentication includes “the corroboration that a person is the one claimed” (HIPAA, 2013, § 164.304, p. 1027).
  • Confidentiality defines “the property that data or information is not made available or disclosed to unauthorized persons or processes” (HIPAA, 2013, § 164.304, p. 1027).
  • Encryption is “the use of an algorithmic process to transform data into a form in which there is a low probability of assigning meaning without the use of a confidential process or key” (HIPAA, 2013, § 164.304, p. 1027).
  • Security incident is described as “the attempted or successful unauthorized access, use, disclosure, modification, or destruction of information or interference with system operation in an information system” (HIPAA, 2013, § 164.304, p. 1027).


HIPAA (2013) standards are categorized as either required or addressable as indicated in Section 164.306.d.1. The rest of this document will highlight the standards that the authors believe shape a set of best practices for counselor educators when implementing technology into their counselor education programs. The degree to which a counseling program decides to implement those standards that are considered required or addressable will be determined by their status as a covered entity, state laws, needs of their counseling program and the financial feasibility of implementing these standards.




     HIPAA requires that all covered entities maintain policies and procedures that (1) ensure the confidentiality, integrity, and availability of all electronic PHI, (2) protect against any reasonably (emphasis added) anticipated threats or hazards to the security or integrity of ePHI, (3) protect against any reasonably anticipated uses or disclosures of ePHI, and (4) ensure compliance by the workforce. The following sections will discuss ways in which HIPAA Privacy and Security rules can be utilized as best practices in counselor education programs so that foreseeable risks, threats and vulnerabilities may be minimized. Please note that this interpretation of safeguards is intended for the consideration of counselor education programs that are not covered entities, but may use HIPAA Privacy and Security rules to establish a set of policies and procedures as a means of best practice. (For a sample guide for counselor educators to use in developing policies and procedures, please contact the first author).


Administrative Safeguards

Administrative actions and oversight make up an important component of the language within HIPAA (2013). Administrative safeguards consist of the policies and procedures designed to “manage the selection, development, [and] implementation” (§ 164.304, p. 1027) of the security and privacy of one’s ePHI. This section describes HIPAA standards to consider when establishing administrative safeguards.


Assigned responsibility. A faculty or staff member within the counselor education program should be identified as responsible for the development, oversight and implementation of the policies and procedures for the department. The faculty member needs to be familiar with the privacy and security policies of HIPAA in order to implement the policies and procedures and to facilitate student training in ways that address the specific needs of the program. Developing a relationship with a staff member in the university information technology department may result in collaborative efforts regarding specific procedures for the use of technology within the university.


     Risk analysis. Before counselor educators can design a set of policies and procedures to guide appropriate technology use, the foreseeable risks must be analyzed. An accurate and thorough assessment is needed to identify potential risks to the protection and security of ePHI (HIPAA, 2013, §164.308) that is collected, stored and transmitted in the counseling program. Analyzing potential risk is essential to the minimization of potential disasters in the future (Dooling, 2013). HHS (2007) makes clear that it is important to spend time considering reasonably anticipated threats and vulnerabilities and then to implement policies and procedures to address the assessed risks. HIPAA security standards do not state that covered entities should protect against all possibly conceived threats, but those that can be “reasonably anticipated” based upon the technologies employed, work environments and employees of the covered entity. The National Institute of Standards and Technology (NIST; 2012) defines a threat “as any circumstance or event . . . with the potential to adversely impact organization operations . . . through an information system via unauthorized access, destruction, disclosure, or modification of information” (p. B-13). A risk is a measure of the probability of a threat triggering a vulnerability in the procedures that an organization uses to ensure the privacy and security of ePHI (NIST, 2012). Vulnerabilities are technical and non-technical weaknesses, which include limitations in utilized technology or ineffective policies within the organization (HHS, 2007). In counselor education programs, risk analysis may include looking at the threats and vulnerabilities associated with counseling students traveling between their residence, campus, and practicum or internship sites while carrying ePHI. 
Moreover, the analysis must include assessing the potential risks associated with the transmission and storage of protected information using technological media (e.g., e-mail, personal computers, cloud-based storage, external storage devices).


Risk management. Risk management is the ongoing process of implementing measures to reduce the threats that were determined as a part of the risk analysis (HHS, 2007). Once a counseling program has assessed and identified potential risks associated with the collection, transmission and storage of any identifiable information, it must begin to manage these risks. HHS has provided an example list of steps to assist organizations in conducting risk analysis and risk management (see Table 1). Members of counselor education programs can begin to incorporate programmatic policies and procedures that address how media containing ePHI should be handled by members of the program. The previously mentioned document (available from the first author) provides sample policies and procedures developed to serve as a guide for counseling programs. Many counselor education programs utilize student handbooks that detail policies related to the academic and professional expectations of students enrolled in their program. Incorporating an additional set of policies to address the treatment of ePHI is a seamless way to begin managing the risks of technology use in mental health. By implementing policies and procedures across the curriculum, students become increasingly knowledgeable and skilled at handling ePHI in an ethical manner.

Table 1


Example Risk Analysis and Risk Management Steps

Risk Analysis

1. Identify the scope of the analysis.
2. Gather data.
3. Identify and document potential threats and vulnerabilities.
4. Assess current security measures.
5. Determine likelihood of threat occurring.
6. Determine potential impact of threat occurrence.
7. Determine level of risk.
8. Identify security measures and finalize documentation.

Risk Management

1. Develop and implement a risk management plan.
2. Implement security measures.
3. Evaluate and maintain security measures.

Note. Adapted from “Basics of Risk Analysis and Risk Assessment,” by the U.S.
Department of Health and Human Services, 2007, HIPAA Security Series, 2(6), p. 5.
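The "determine level of risk" step in the table above is commonly operationalized as a function of a threat's likelihood and its potential impact. The sketch below uses a simple 1–3 rating scale and thresholds chosen for illustration; they are not prescribed by HHS or NIST.

```python
# Illustrative risk scoring: likelihood x impact on a 1-3 scale.
# The scales and cutoffs here are assumptions, not regulatory values.
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
IMPACT = {"low": 1, "medium": 2, "high": 3}

def risk_level(likelihood: str, impact: str) -> str:
    """Combine likelihood and impact ratings into a coarse risk level."""
    score = LIKELIHOOD[likelihood] * IMPACT[impact]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# e.g., a lost, unencrypted digital recorder: plausible and highly damaging
lost_recorder = risk_level("medium", "high")  # "high"
```

A counseling program might rate each identified threat (a stolen laptop, an e-mailed recording, a shared password) this way and direct its policy-writing effort at the highest-scoring items first.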


Sanction policy. It must be communicated to all members of counselor education programs that failure to comply with the policies will result in sanctions. HIPAA (2013, §164.308) requires organizations to enforce sanctions against individual members who fail to comply with their organization’s policies and procedures. A counselor education program should have clearly documented policies and procedures for students and staff involved with the facilitation of ePHI. The language of HIPAA does not specify what these sanctions should entail; however, language needs to exist that addresses individuals’ failure to comply. For counseling students, one option is a tiered sanction policy similar to the structure established by the HITECH Act (Modifications to the HIPAA Privacy, 2013) and § 1176 of the Social Security Act (2013), in which categories of violations ranging from “did not know” (p. 5583) to uncorrected willful neglect result in increasingly severe fines (Modifications to the HIPAA Privacy, 2013). Because this experience is primarily educational for students, sanctions could vary with the degree of noncompliance. For counselor education programs, this language also could easily be tied to the student remediation processes that many counseling programs utilize.
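A tiered structure of the kind described above can be modeled as a simple lookup. The sketch below uses the civil money penalty tiers of the 2013 omnibus rule; the text confirms only the overall $100–$50,000 per-violation range and the "did not know" and uncorrected willful neglect endpoints, so the middle-tier minimums shown are the commonly cited 2013 figures and should be treated as illustrative.

```python
# Tiered per-violation penalty ranges (minimum, maximum) under the 2013
# HIPAA-HITECH omnibus rule. Middle-tier minimums are illustrative figures,
# not drawn from the article itself.
PENALTY_TIERS = {
    "did_not_know": (100, 50_000),
    "reasonable_cause": (1_000, 50_000),
    "willful_neglect_corrected": (10_000, 50_000),
    "willful_neglect_uncorrected": (50_000, 50_000),
}

def penalty_range(category: str) -> tuple[int, int]:
    """Return the (minimum, maximum) fine per violation for a category."""
    return PENALTY_TIERS[category]
```

A program's student sanction policy could mirror this shape, mapping categories of noncompliance to increasingly severe remediation steps rather than fines.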


Information review. Ongoing review of the activity of students, faculty and staff that involves the creation, storage and transmission of ePHI is a required safeguard according to HIPAA standards (2013, §164.308). Within an educational unit, it is understandable that individuals might make mistakes regarding the implementation of HIPAA safeguards. A regular review of the activity and records of the individuals whose ePHI is being collected is important. Organizations are required to have policies in place for recording system activity, including access logs and incident reports (§ 164.308). Additionally, protections must be in place to ensure that only those individuals who should have access to any ePHI are able to access this protected information. In the case of the sanctioned university medical training clinic cited earlier, the breaches might have been avoided with an ongoing review of the system’s firewall settings (Yu, 2013). Monitoring and developing policies regarding information review may require developing relationships and discussions with the appropriate information technology personnel at the organization.
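The access-log component of information review can be sketched with a few lines of Python's standard logging module. This is a minimal illustration under assumed names (the log file and field layout are invented); a production system would also capture failed access attempts and tie entries to authenticated accounts.

```python
import logging

# Minimal access-log sketch: every touch of a recording is appended to a log
# that faculty can review periodically. File name and fields are illustrative.
audit = logging.getLogger("ephi_access")
audit.setLevel(logging.INFO)
handler = logging.FileHandler("ephi_access.log")
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
audit.addHandler(handler)

def log_access(user: str, recording_id: str, action: str) -> None:
    """Record who touched which recording and how."""
    audit.info("user=%s recording=%s action=%s", user, recording_id, action)

log_access("student42", "session-017", "viewed")
```

Reviewing such a log each term is one concrete way a program can satisfy the "regular review of activity" expectation without specialized software.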


Response, recovery and reporting plan. HIPAA regulations require that a covered entity have a plan in place should ePHI be breached or disclosed to an unauthorized party (HIPAA, 2013, § 164.308). When developing departmental policies and procedures, it is important to have such a plan in place. Whether the breach or disclosure is intentional or unintentional, each individual whose information has potentially been compromised needs to be notified. Moreover, in cases where more than 500 individuals’ PHI have been breached, the entity may need to report this information to local media or to HHS (HIPAA, 2013, §164.406–164.408). It should be noted that covered entities could be exempted from breach notification through employing security techniques such as encryption (Breach Notification, 2009; HIPAA, 2013, §164.314). The regulations of HIPAA require that a plan be in place to address emergencies (HIPAA, 2013, §164.308). In the case of theft, emergency or disaster, counseling departments need a data backup and recovery plan in place to retrieve ePHI.
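The notification duties described above follow a simple decision rule, sketched below. The function and its names are illustrative; the thresholds reflect the sections cited in the text (individuals are notified of any breach of unsecured ePHI, breaches affecting more than 500 individuals additionally involve HHS and potentially local media, and properly secured data may be exempt).

```python
# Sketch of breach-notification duties per the sections cited in the text
# (HIPAA, 2013, 164.406-164.408). Names and structure are illustrative.
def notification_duties(num_affected: int, encrypted: bool) -> list[str]:
    """Return the parties that must be notified after a suspected breach."""
    if encrypted:
        return []  # adequate encryption can exempt the entity from notification
    duties = ["affected individuals"]
    if num_affected > 500:
        duties += ["HHS", "local media"]
    return duties
```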


Physical Safeguards

     Establishing policies and procedures that protect against unauthorized physical access and damage from natural or environmental hazards is critical to maintaining the security and privacy of PHI (HIPAA, 2013, §164.310).


     Access control. When using technology to store and transmit ePHI, the recommendation is that policies address ways in which physical access to protected information will be limited. For example, many counseling departments now incorporate the use of digitally recorded data from counseling sessions (e.g., audio or video). Policies need to clearly address how to best limit physical access to these recordings. Students need to understand what it means to keep data physically secure. The HITECH Act (Modifications to the HIPAA Privacy, 2013) includes the category “did not know” as a punishable violation. Students need to understand the consequences of failing to implement such physical safeguards. For example, keeping devices stored under lock and key when not in use is just one important step in moving toward a set of best practices. Many universities already require students to utilize login information with a username and passcode in order to access computers affiliated with their respective university. Consideration may need to be given regarding policies and procedures for accessing ePHI off campus, where the technical security may be less controlled.


     Disposal and re-use. HIPAA requires covered entities to implement policies that address the disposal and re-use of ePHI on electronic media. A detailed discussion of the various types of disposal, also known as media sanitization, and of re-use is beyond the scope of this article (see Kissel, Regenscheid, Scholl, & Stine, 2014). Counselor education programs must recognize the importance of properly removing protected information from media devices once it is no longer required. Media sanitization is a critical element in assuring confidentiality of information (Kissel et al., 2014). For example, in counseling internship courses, students may be asked to delete recorded sessions during the last day of classes so that the instructor has evidence of the appropriate disposal of this information. NIST identifies four different types of media sanitization: disposal, clearing, purging and destroying (Kissel et al., 2014). The decision as to which type of media sanitization is appropriate requires a cost/benefit analysis, as well as an understanding of the available means to conduct each type of sanitization. (The authors recommend that counseling departments consult with an individual from the university information technology department.)
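As a classroom illustration of the difference between simple disposal and clearing, the short Python sketch below overwrites a recording file with random bytes before deleting it. This is a simplified, hypothetical example rather than NIST-compliant guidance: a single overwrite pass is not sufficient for all media (e.g., solid-state drives with wear leveling), and departments should still consult their information technology staff.

```python
import os
import secrets

def clear_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file with random bytes before deleting it.

    A minimal illustration of the "clearing" level of media
    sanitization described in NIST SP 800-88. It is NOT sufficient
    for all storage media and does not replace IT consultation.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # random data, same length
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to disk
    os.remove(path)
```

Simply deleting a file, by contrast, typically removes only the directory entry and leaves the recording's bytes recoverable on the disk.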


Technical Safeguards

The language in HIPAA is clear regarding the implementation of technical safeguards, requiring that access to electronic media devices containing PHI be granted only to those who need such access to perform their duties.


     Unique user identification. If a device allows for unique user identification, one should be assigned to minimize the unintended access of ePHI. HIPAA standards (2013, §164.514) state that an assigned code should not be “derived from or related to information about the individual” (p. 1064).
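By way of illustration, a randomly generated identifier satisfies this standard because it carries no information about the individual. The Python sketch below (the function name is our own, hypothetical choice) draws the ID from a cryptographic random source:

```python
import secrets

def assign_user_id(num_bytes: int = 8) -> str:
    """Generate a login identifier from a cryptographic random source.

    Because the ID is random, it cannot be "derived from or related to
    information about the individual" (HIPAA, 2013, §164.514) the way
    initials, birth dates or student numbers could be.
    """
    return secrets.token_hex(num_bytes)  # num_bytes of randomness as hex text
```

An ID built this way is linked to the person only through a separately secured roster, never through the ID itself.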


     Emergency access. Covered entities are required to have procedures in place that allow ePHI to be accessed in the event of an emergency (HIPAA, 2013, §164.310). Within counselor education programs, such procedures should ensure that both the student and the supervisor have access to the ePHI at its designated storage location.


     Encryption. Encryption is a digital means of increasing the security of electronic data. Using an algorithmic process, the data is scrambled so that the probability of interpretation is minimal without the use of a confidential key to decode the information. Though the language of HIPAA categorizes encryption as addressable rather than required, the implementation of encryption policies is a best practice to help ensure the protection of ePHI. The language of HIPAA makes it clear that an “addressable” item must be implemented if it is “reasonable and appropriate” (HIPAA, 2013, §164.306, p. 1028) to do so. Huggins (2013) has recommended that ePHI be stored on drives that allow for “full disk encryption” at a minimum strength of 128 bits. With the availability of many different types of software packages that can encrypt at a recommended strength, implementing encryption standards in a counseling department is affordable and reasonable. Most modern computer operating systems have options to encrypt various drives built into the functionality of the system. Full disk encryption is recommended because of its higher level of security and also because it can provide exemption from the Breach Notification Rule mentioned earlier (Breach Notification, 2009). In case of a breach, the burden is on the covered entity to prove that the ePHI was not accessed; otherwise, Breach Notification Rules must be followed. The assumption is that if a disk is fully encrypted, even if accessed by an unauthorized person, it is highly unlikely that an unauthorized party will obtain access to the ePHI (Breach Notification, 2009). The authors strongly encourage the use of encrypted devices as a standard policy for the collection and storage of ePHI (see Scarfone, Souppaya, & Sexton, 2007). The policy creates greater protection against the accidental disclosure of an individual’s ePHI. 
Additionally, organizations that use commercial cloud storage service providers should investigate whether these providers are willing to sign a Business Associate Agreement, in which the provider agrees to adhere to regulations of HIPAA (2013, §160.103). If not, the storage of ePHI may not be in alignment with HIPAA standards.
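To make the idea of keyed scrambling concrete for students, the toy Python sketch below XORs data with a keystream derived from a 128-bit key; applying the same operation with the same key recovers the original. This is strictly a teaching illustration of the principle, not an implementation suitable for ePHI; real deployments should rely on vetted full disk encryption such as the operating system's built-in tools.

```python
import hashlib
import secrets

def _keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the key (a counter-mode idea)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR the data with a key-derived stream; applying it again decrypts.

    CLASSROOM TOY ONLY: demonstrates how an algorithmic process
    scrambles data so it is unreadable without the key. Do not use
    for real ePHI; use vetted encryption (e.g., AES full disk tools).
    """
    ks = _keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# 16 random bytes = 128 bits, the minimum strength Huggins (2013) recommends.
key = secrets.token_bytes(16)
```

Without the key, the scrambled bytes reveal essentially nothing about the original recording or note.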


Disk encryption works well for the storage and collection of protected information while at rest (Scarfone et al., 2007); however, counselor education programs also should consider assessing the risk associated with the transmission of ePHI (HIPAA, 2013, §164.312). Protected information often remains encrypted while at rest, yet becomes unencrypted while in transmission. Programs need to “guard against unauthorized access to electronic PHI that is being transmitted over an electronic communication network” (HIPAA, 2013, §164.312, p. 1032). Commonly used e-mail systems, for example, often do not transmit information in an encrypted state. Programs should therefore assess the risks of sending protected information by unsecured means.
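Where programs do build their own tools that transmit ePHI, one concrete technical control is to require modern TLS on every connection. The Python sketch below (an illustrative configuration choice, not a requirement stated in HIPAA) builds a client-side TLS context that verifies certificates and refuses legacy protocol versions:

```python
import ssl

# Client-side TLS context with sensible defaults: certificate
# verification and hostname checking are enabled automatically.
context = ssl.create_default_context()

# Refuse legacy protocol versions when transmitting ePHI.
context.minimum_version = ssl.TLSVersion.TLSv1_2
```

Any socket wrapped with this context will fail loudly, rather than silently downgrading, if the remote server cannot negotiate an encrypted channel.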




     The language of HIPAA allows each covered entity some leeway in how it wants to implement policies. However, HIPAA standards (2013, §164.316) are very clear that entities should “implement reasonable and appropriate policies” (p. 1033) that include administrative, physical and technical safeguards that reasonably and appropriately protect the confidentiality, integrity and availability of electronic PHI that the entity creates, receives, maintains or transmits. The implementation of HITECH (2009) and the meaningful use policies of the Affordable Care Act (Medicare and Medicaid Programs, 2014) emphasized the movement of the broader health care system toward increasing use of health care technology such as Electronic Health Records. Students graduating from counseling programs find themselves working in myriad settings, many of which are considered covered entities as defined in the HIPAA standards (2013, §160.103). It is imperative for counselor educators to recognize the trend toward increased technology use in the health care market and to consider ways that technology can be infused into counselor education so that students enter the workforce with greater technological competence. Specifically, counselor educators must teach the ethical and legal technological mandates that exist in relation to regulations of HIPAA (2013) and HITECH (2009) so as to create competent counselors. As the health care industry continues to incorporate more technology, counselor educators must stay informed regarding ways in which graduates will utilize this technology in their professional careers.


Recommendations for Counselor Educators

     ACES (2007) published a document that recommends guidelines for infusing technology into counselor education curriculum, research and evaluation. This document provides a basic framework by which programs can guide the broad use of technology in counseling programs. Technology is presented as a useful enhancement or supplement to practice. The shift in the broader health care culture, however, has moved technology from a supplementary role into one that is primary to the ongoing success of a practitioner. The authors believe that counselor educators can utilize HIPAA and HITECH regulations to continue to infuse technology into counselor education programs, and recommend the following:


  1. Counselor educators need to increase the importance placed on technology in counselor education programs. The movement of technology into increasingly primary roles in health care is indicative of the need for it to become a primary focus during the education and training of counselors. Counselors and counselor educators must stay abreast of the trends and developments regarding health care law and technology. The implementation of Section H, “Distance Counseling, Technology, and Social Media,” in the 2014 ACA Code of Ethics also is indicative of this need. The counseling profession needs to increase the research, education and training available to counselors and counselor educators.


  2. Counselor educators need to have policies and procedures in place guiding the use of technology in their departments. The overview of HIPAA regulations will help provide guidelines for developing a set of policies and procedures. All policies and procedures must be in writing and accessible to students, faculty and staff who have access to any ePHI. Many counseling programs maintain a student handbook into which a set of standards governing the use of technology could easily be incorporated. Departmental policies should be in place that dictate the consequences should an individual fail to adhere to the stated policies and procedures.


  3. Counselor educators should be actively seeking ways in which technology and HIPAA can be incorporated to best prepare students for their future work environment. The regulations and language of HIPAA and HITECH should be addressed in course activities. Are counseling students getting opportunities to become familiar with Electronic Health Records? Are students having opportunities to write and store notes electronically? Have students addressed the ethical and legal concerns related to the use of technology in practice? Do students understand what it means to maintain encrypted files or how to appropriately de-identify ePHI? Do students understand how to submit health insurance claims electronically? Students must be able to answer questions like these to be prepared to work in the current mental health environment as competent professionals.


The use of technology in counseling is moving from a secondary role to a primary one in counselor education. The expectation that students can acquire this knowledge after graduation in the form of a workshop is no longer acceptable. The shifts in the language of HIPAA and HITECH have moved the broad health care field in an electronic, digital direction. Familiarity with technology is becoming a core competency for counselor education programs and faculty. The laws dictated by HIPAA and HITECH provide a framework by which counselor educators can continue to infuse technology into the classroom and clinical experiences.



Conflict of Interest and Funding Disclosure

The authors reported no conflict of interest or funding contributions for the development of this manuscript.




References

American Counseling Association. (2014). ACA code of ethics. Alexandria, VA: Author.

Association for Counselor Education and Supervision Technology Interest Network. (2007). Technical competencies for counselor education: Recommended guidelines for program development. Retrieved from

Breach Notification for Unsecured Protected Health Information, 74 Fed. Reg. 162 (August 24, 2009) (to be codified at 45 CFR §§ 160 & 164).

Christiansen, J. (2000). Can you really get “HIPAA Compliant” software and devices? IT Health Care Strategist, 2(12), 1, 7–8.

Dooling, J. A. (2013). It is always time to prepare for disaster. Journal of Health Care Compliance, 15(6), 55–56.

Health Information Technology for Economic and Clinical Health (HITECH) Act, Title XIII § 13001 of Division A of the American Recovery and Reinvestment Act of 2009 (AARA), Pub. L. No. 111-5 (2009).

Health Insurance Portability and Accountability Act (HIPAA), 45 CFR §§ 160, 162, & 164 (2013). Retrieved from

Huggins, R. (2013, April 5). HIPAA “safe harbor” for your computer (the ultimate in HIPAA compliance): The compleat [sic] guide [Blog post]. Retrieved from

Kissel, R., Regenscheid, A., Scholl, M., & Stine, K. (2014). Guidelines for media sanitization (NIST Publication No. 800-88, Rev. 1). Retrieved from

Lawley, J. S. (2012). HIPAA, HITECH and the practicing counselor: Electronic records and practice guidelines. The Professional Counselor, 2, 192–200. doi:10.15241/jsl.2.3.192

Letzring, T. D., & Snow, M. S. (2011). Mental health practitioners and HIPAA. International Journal of Play Therapy, 20, 153–164. doi:10.1037/a0023717

Medicare and Medicaid Programs; Modifications to the Medicare and Medicaid Electronic Health Record (EHR) Incentive Program for 2014 and Other Changes to the EHR Incentive Program; and Health Information Technology: Revisions to the Certified EHR Technology Definition and EHR Certification Changes Related to Standards Final Rule, 79 Fed. Reg., 179 (September 4, 2014) (to be codified at 45 CFR pt. 170).

Modifications to the HIPAA Privacy, Security, Enforcement, and Breach Notification Rules Under the Health Information Technology for Economic and Clinical Health Act and the Genetic Information Nondiscrimination Act; Other Modifications to the HIPAA Rules; Final Rule, 78 Fed. Reg., 5566 (January 25, 2013) (to be codified at 45 CFR pts. 160 and 164).

National Institute of Standards and Technology. (2012). Guide for conducting risk assessments (NIST Special Publication No. 800-30, Rev. 1). Retrieved from

Ostrowski, J. (2014). HIPAA compliance: What you need to know about the new HIPAA-HITECH rules. Retrieved from

Reinhardt, R. (2013, October 3). Your software and devices are not HIPAA compliant [Blog post]. Retrieved from

Scarfone, K., Souppaya, M., & Sexton, M. (2007). Guide to storage encryption technologies for end user devices: Recommendations of the National Institute of Standards and Technology (NIST Special Publication No. 800-111). Retrieved from

Social Security Act, 42 U.S.C. § 1176 (a)(1). (2013). Retrieved from

Substance Abuse and Mental Health Services Administration. (2013). National expenditures for mental health services & substance abuse treatment, 1986–2009 (HHS Publication No. SMA-13-4740). Retrieved from

Substance Abuse and Mental Health Services Administration. (2014). Projections of national expenditures for treatment of mental and substance use disorders, 2010–2020 (HHS Publication No. SMA-14-4883). Retrieved from

U.S. Department of Health and Human Services. (n.d.-a). Be aware of misleading marketing claims. Retrieved from

U.S. Department of Health and Human Services. (n.d.-b). Summary of the HIPAA privacy rule. Retrieved from

U.S. Department of Health and Human Services (HHS). (2007). Basics of risk analysis and risk management. Retrieved from

Wheeler, A. M. N., & Bertram, B. (2012). The counselor and the law: A guide to legal and ethical practice (6th ed.). Alexandria, VA: American Counseling Association.

Yu, E. H. (2013). HIPAA privacy and security: Analysis of recent enforcement actions. Journal of Health Care Compliance, 15(5), 59–61.


Tyler Wilkinson, NCC, is an Assistant Professor at Mercer University. Rob Reinhardt, NCC, is in private practice in Fuquay-Varina, NC. Correspondence may be addressed to Tyler Wilkinson, 3001 Mercer University Drive, AACC 475, Atlanta, GA 30341,

Parent–Child Interaction Therapy for Children With Special Needs

Carl Sheperis, Donna Sheperis, Alex Monceaux, R. J. Davis, Belinda Lopez

Parent–Child Interaction Therapy (PCIT) is an evidence-based method for reducing disruptive behavior in children and improving parent management of behavior. PCIT is a form of behavioral intervention that can be used in clinical, home and school settings. Although initially designed for intervention related to oppositional defiant disorder and conduct disorder, PCIT has been found to be a promising intervention for addressing behavioral issues among children with special needs. We present methods, research-based instructions and a case example of PCIT with a child diagnosed with autism. This article is intended to assist professional counselors in designing appropriate interventions for both children and parents.

Keywords: autism, parent–child interaction therapy, special needs, behavioral intervention, case example

Counseling techniques for children stem from a myriad of theoretical perspectives, and professional counselors are often in the unique position to provide systems intervention and training to families when a child has disruptive behavior. Despite the long history of behaviorism in the field of psychology, behavioral family approaches have only recently been acknowledged as an effective practice in professional counseling. According to Gladding (2011), the following three premises underlie behavioral family counseling: (a) all behaviors are learned, (b) maladaptive behaviors are the target for change, and (c) not everyone in the family has to be treated for change to occur. From these assumptions, it is easily deduced that the following also are true: (a) behavior can be unlearned and new behaviors can be taught, (b) underlying, unresolved issues are not the key components of treatment, and (c) positive changes for one family member will impact the entire family system and surrounding ecology.

When working with children of preschool or early elementary age, it is important to directly involve the child’s caregivers. Parents’ influence on their children is significant, and some parenting practices may exacerbate children’s problems (McNeil & Hembree-Kigin, 2010). Behavioral family counseling models recognize the relationship between the child’s behavior and the family system. One behavioral family counseling technique, Parent–Child Interaction Therapy (PCIT), helps families work together with their children in reaching therapeutic goals. PCIT involves teaching parents fundamental relationship-building strategies, including therapeutic play techniques for parents to use in the home environment (Johnson, Franklin, Hall, & Prieto, 2000). The clinician typically teaches and models PCIT techniques for the parents over the course of 8–10 weeks.

The purpose of this article is to demonstrate the practicality of PCIT as a component of behavioral family counseling, to facilitate the professional counselor’s understanding of the model through a review of PCIT and to illustrate the utility of this model for children with special needs through a case study.


An Overview of PCIT

PCIT (Neary & Eyberg, 2002) is an assessment-driven form of behavioral parent training designed for families with preschool-aged children. We present a brief overview of PCIT, define the key components integral to the process, provide the framework for implementation and discuss the application of PCIT to special populations. The core of PCIT is twofold—to create nurturing parent–child relationships and to model prosocial behaviors while increasing a child’s appropriate, compliant behaviors (Eyberg & Boggs, 1989). Play therapy skills are introduced to parents within the PCIT model to enhance the relationship between the parent and child. Additionally, PCIT cultivates problem-solving skills with parents who can then generalize gains made in the therapeutic milieu into the family environment. Similar to other models of family counseling, PCIT begins with the assessment process. Counselors using PCIT collect psychosocial information from the family through either structured or semistructured clinical interviews. Because PCIT is a behavioral model, family dynamics also are assessed through direct observation of clients. Once clinical interview and observational data are collected and evaluated, the counselor can explore family relationship dynamics.

PCIT counselors attempt to identify family roles, interactions and maladaptive disciplinary techniques (e.g., yelling, lack of follow-through, unrelated consequences) inherent in the system. The ultimate goal of these observations is to derive methods for replacing current ineffective parenting strategies with more adaptive ones, thus creating greater stability in the family system. Similar to other parenting approaches, family counselors using PCIT offer support and assistance in improving parent–child relationships without placing blame on the child or the parents (Webster-Stratton & Herbert, 1993).


The Benefits of PCIT

There are many benefits to PCIT: it is a brief family counseling procedure that teaches effective parenting skills and helps parents interact better with their children on a daily basis. Fundamentally, PCIT’s two-tailed approach benefits both parents and children (Asawa, Hansen, & Flood, 2008) by reducing the internalization of problems and parent–child stress. Additionally, PCIT empowers parents through teaching positive interactive techniques that build parent–child rapport. PCIT fosters creativity and increases child self-esteem, decreases noncompliance or disruptive behavior, and increases the quality of parent-provided positive regard through developmentally appropriate play (Eyberg et al., 2001). These positive interactions effectively decrease problem behavior, resulting in a reduction or elimination of emergency counseling visits. In one randomized controlled trial with physically abusive parents, only 19% of participants had a re-report of physical abuse more than 2 years after the implementation of the PCIT model (Chaffin et al., 2004).

While PCIT sessions may focus on home and play, the behavioral skills that the parent learns can be generalized to other children and additional settings, building stronger interactions across a spectrum of familial and social settings. According to Urquiza and Timmer (2012), PCIT focuses on the following three essential non-fixed elements: (a) increased positive parent–child interaction and emotional communication skills, (b) appropriate and consistent discipline methods, and (c) direct scaffolding for parent training in the interventions. Thus, once the parent has mastered these skills in the session with the child and therapist, the parent is able to transfer the skills to any location or setting to maintain positive interactions, emotional communication and disciplinary consistency with the child.


The Effectiveness of PCIT

Eyberg and her colleagues have researched and published extensively on the efficacy of PCIT and have empirically supported the effectiveness of PCIT with children exhibiting a range of behavioral and emotional problems (Neary & Eyberg, 2002). Specifically, PCIT has proven effective with problems including attention-deficit/hyperactivity disorder (ADHD), conduct disorders, separation anxiety, depression, postdivorce adjustment, self-injurious behavior and abuse (Eyberg et al., 2001; McNeil & Hembree-Kigin, 2010). For example, Nieter, Thornberry, and Brestan-Knight (2013) conducted a pilot study with 17 families completing PCIT treatment and found a significant decrease in disruptive child behaviors as well as a decrease in inappropriate parent behaviors over the 12-week group program. This study was the first to deliver PCIT via group sessions. The researchers found that treatment effects remained in place for more than 6 months after the group’s completion.

Eyberg et al. (2001) investigated long-term treatment outcomes of PCIT for families of preschoolers with conduct disorders over a period of 1–2 years, and found that treatment effects were sustained over time. According to the researchers, the study was the first of its kind to include long-term follow-up with families receiving PCIT treatment (Eyberg et al., 2001). Hood and Eyberg (2003) established further evidence in another follow-up study on PCIT treatment effects over a period of 3–6 years. In the study of treatment effects on families with young children diagnosed with oppositional defiant disorder, the researchers found that treatment effects and behavioral change were again sustained over time. Thomas and Zimmer-Gembeck (2007) conducted a review of behavior outcomes in 24 studies on PCIT and another parenting intervention, Triple P—Positive Parenting Programs. All of the studies involved children aged 3–12 and their caregivers. Meta-analyses revealed positive effects for PCIT as well as the other intervention, adding support within the literature on the efficacy of PCIT.

To demonstrate the effectiveness of PCIT for treating ADHD, Guttmann-Steinmetz, Crowell, Doron, and Mikulincer (2011) examined the interactions of children with ADHD and their mothers. Their findings suggest that while Behavior Parent Training is useful in managing ADHD, PCIT may be highly effective in impacting the attachment-related processes during the child’s later developmental stages. These researchers suggested that parents’ successful adaptation of PCIT’s verbal and behavioral skills during interaction with their child might increase the child’s sense of security.

The effectiveness of PCIT has been expanded to other disorders such as separation anxiety. For example, Choate, Pincus, Eyberg, and Barlow (2005) conducted a pilot study involving three families with children 4–8 years of age diagnosed with separation anxiety disorder. The researchers found that the child-directed activities fostered children’s sense of control and reduced separation anxiety symptomology to normative levels by the conclusion of treatment. Again, the treatment effects were shown to persist long after treatment ceased. This study was replicated by Anticich, Barrett, Gillies, and Silverman (2012), providing further support of PCIT’s effectiveness in alleviation of separation anxiety disorder symptomology.

Individuals or populations with special needs also appear to respond positively to PCIT. Bagner and Eyberg (2007) found that mothers of young children diagnosed with mental retardation and oppositional defiant disorder reported a reduction in disruptive behaviors, increased compliance and less parenting stress after participating in a randomized, controlled trial utilizing PCIT. PCIT also has been cited as a promising evidence-based intervention for autism (Agazzi, Tan, & Tan, 2013). Solomon, Ono, Timmer, and Goodlin-Jones (2008) conducted a randomized trial of PCIT for treating autism and found results consistent with those reported for other disorders: PCIT was shown to reduce behavioral disruptions, increase adaptability and increase positive parental perceptions of child behavior. While PCIT was originally developed to address behavioral disorders, it clearly serves as an intervention for various other disorders that impact parent–child interactions.


The impact of PCIT on parents. PCIT has been shown to be as effective for parent-related issues as it is for child behavioral disruptions. For example, Luby, Lenze, and Tillman (2012) reported highly favorable results for using PCIT to reduce behavioral disruptions and improve executive function among preschoolers. However, PCIT also showed significant effects for parents. Specifically, PCIT interventions helped to reduce depression severity and parenting stress while increasing emotion recognition. Furthermore, Urquiza and Timmer (2012) found that parental depression decreases the likelihood that the child will complete the treatment course. However, if the parents are persuaded to continue until completion, their own psychological symptoms may be relieved.

PCIT has been shown to have positive effects on parents in a variety of circumstances. For example, Baker and Andre (2008) suggested that PCIT might offer a unique and promising advantage in the treatment of postdivorce adjustment issues in children. PCIT also has been found to be effective in working with abusive parents, many of whose histories included depression, substance abuse and violent behavior (Chaffin et al., 2004). Although still effective in reducing parenting stress and child behavior problems, Timmer et al. (2011) found that PCIT was less effective in foster parent homes than in non-foster parent homes. While PCIT is clearly an effective intervention for both children and parents, in cases with complex systems like foster care placement and abuse, PCIT could be used in conjunction with other interventions. The same is true for clients with special needs.

Diverse population efficacy. Although we recognize that one size does not fit all, PCIT has shown significant results with ethnic minorities and underserved populations. Parenting styles (e.g., strict vs. relaxed) vary across cultural and ethnic groups in the United States. In addition to effectively increasing positive parenting behaviors and decreasing behavioral problems in children, treatment outcomes and efficacy studies support the notion that PCIT is culturally effective and produces robust modifications among diverse groups (see Bagner & Eyberg, 2007; Borrego, Anhalt, Terao, Vargas, & Urquiza, 2006; Matos, Torres, Santiago, Jurado, & Rodríguez, 2006; McCabe & Yeh, 2009). Additional literature and empirical research is available for review regarding work with specific populations such as African Americans and Asians. There also is promising evidence pointing to PCIT’s efficacy in populations exhibiting neurological and behavioral disorders such as autism (Tarbox et al., 2009).

Efficacy through translation. Matos et al. (2006) conducted research in Puerto Rico with parents of children aged 4–6 with ADHD. The manual and handouts were translated into Spanish with a few modifications. Results showed significant decreases in behavior problems and hyperactivity. A recent follow-up study using the culturally adapted version further revealed significant and robust outcomes with large treatment effect sizes. Mothers reported reductions in “hyperactivity-impulsivity, inattention, and oppositional defiant and aggressive behavior problems, as well as a reduced level of parent-child related stress and improved parenting practices” (Matos, Bauermeister, & Bernal, 2009, p. 246). Additionally, in a single-case study with a Spanish-speaking foster mother and a 3-year-old Mexican-Chilean-Filipina child, PCIT proved to be effective; reports from other family members noted substantive behavior improvement (Borrego et al., 2006). Thus, we can deduce that PCIT can be used effectively across cultural groups.


Key Components

There are three main components of PCIT: child-directed interaction (CDI), parent-directed interaction (PDI) and cleanup. Depending on the session being held, the components are presented in 5-minute segments with varying degrees of parent control required. CDI is characteristically the first stage in PCIT. Similar in approach to filial play therapy, this first stage creates an opportunity to strengthen the parent–child relationship. Because PCIT is utilized in the context of dyadic play, it is conducted in a room conducive to play (McNeil & Hembree-Kigin, 2010). Thus, a room designated for CDI should contain a variety of toys, crayons, paper, modeling compounds and other developmentally appropriate activities for a child. As with other play techniques, in order to give children the opportunity to determine the rules by which they will play, games with rules are generally excluded from a CDI playroom. Children engaged in CDI should be allowed to play with any or all of the items in the room. Encouraging free play indicates to the child that he or she is the creator of the play, not the caregiver. This approach allows the time to truly be child-directed.

Within CDI, the establishment of a positive therapeutic relationship is a crucial step in building a foundation for the introduction of compliance training. Compliance training is simply teaching a child to mind or comply over a period of time, through small compliance goals set by the parents. To lay the groundwork for this process during CDI, the parents are instructed to praise, reflect, imitate and describe their child’s play, while not asking questions, placing demands or criticizing the activities that transpire unless harmful to the child (McNeil, Eyberg, Eisentadt, Newcomb, & Funderburk, 1991).

Another essential concept introduced during CDI is fostering the enthusiasm and willingness of the parent. Although responding positively to a child’s free play during CDI may appear simple, parents often need considerable practice to master this response set. For example, one of the toys in our clinic is a Mr. Potato Head. Young children can be very creative in their placement of the various accessories that come with the toy. Often they will place an arm on top of the head, lips on the ear hole or eyes over the mouth hole. In PCIT, we view this action as an expression of creativity. However, when we observe parents in free play with their children, we often witness the parents limiting their children’s creativity by redirecting the placement of the appendages on Mr. Potato Head. Parents often say, “No, honey, the lips go here,” or “That’s not where the arms go.” Instructing parents to refrain from making such comments is generally all that a PCIT counselor needs to do. PCIT counselors understand that this is a difficult skill for most parents to master, and they teach parents the acronym PRIDE for use during CDI as well as other elements of PCIT. PRIDE simply stands for praise, reflection, imitation, description and enthusiasm (Eyberg, 1999). Table 1 provides some practical examples of desired responses from parents during CDI using the PRIDE approach.

Table 1

Responses Using the PRIDE Model

Praise        Parent: “Thank you for putting away the toys.”
Reflection    Child: “I’m drawing a dinosaur.” Parent: “I see. You are drawing a dinosaur.”
Imitation     Child is playing with a car. Parent gets a similar car and begins playing in the same manner.
Description   Child is playing with a toy airplane. Parent says, “You are making the airplane fly.”
Enthusiasm    Parent: “Wow. Your drawing is very creative.”


PDI, the second stage of PCIT, usually is initiated once parents master CDI. Mastery is evidenced during the child’s play by the parents exhibiting proper implementation of the PRIDE responses. PDI also is conducted in the playroom or room selected for CDI. PDI consists of teaching parents how to manage their child’s behavior and promoting compliance with parental requests (Bahl, Spaulding, & McNeil, 1999). Parents should understand that PDI is more difficult for children than CDI and will likely be challenging for both the child and parent. When beginning PDI, parents must understand the importance of appropriate discipline techniques and receive training in giving clear directions to their children. Because children require a great deal of structure, professional counselors emphasize the importance of consistency, predictability and follow-through in this training (McNeil & Hembree-Kigin, 2010). In order to initiate compliance training, parents practice giving effective instructions to their child. McNeil and Hembree-Kigin (2010) offered several rules for giving good instructions as part of the parent training element of PDI that can be conceptualized in the following ways: Command Formation, Command Delivery and Command Modeling:

Command Formation

  1. Give direct commands for things you are sure the child can do. This increases the child’s opportunity for success and praise.
  2. Use choice commands with older preschoolers. This fosters development of autonomy and decision making. (e.g., “You can put on this dress or this dress” rather than “What do you want to wear?” or “Wear this”).
  3. Make direct commands. Tell the child what to do instead of asking whether he or she would like to comply (e.g., “Put on your coat”).
  4. State commands positively by telling the child what to do instead of what not to do. Avoid using words such as “stop” and “don’t.”
  5. Make commands specific rather than vague. In doing so, the child knows exactly what is expected and it is easier to determine whether or not the child has been compliant.

Command Delivery

  1. Limit the number of commands given.
  • Some children are unable to remember multiple commands. The child’s opportunity for success and praise increases with fewer, more direct instructions given at a time.
  • When giving too many commands, parents have difficulty following through with consequences for each command. Additionally, it may be best for the parent to ignore some minor behaviors.
  2. Always provide a consequence for obedience and disobedience. Consequences are the quickest ways to teach children compliance. Consistency when providing consequences is the key to encouraging compliance.
  3. Use explanations sparingly. Some children would rather stall than know the answer. Avoiding the explanation trap prevents children from thinking they have an opportunity to talk their way out of it.

Command Modeling

  1. Use a neutral tone of voice instead of pleading or yelling. Interactions are more pleasant in this manner and the child learns to comply with commands that are given in a normal conversational voice.
  2. Be polite and respectful while still being direct. This models appropriate social skills and thus interactions are more pleasant.

After teaching parents to deliver effective instructions and allowing time for in vivo practice, professional counselors introduce appropriate discipline strategies to be used in PDI. The in vivo coaching model utilizes an observation room with a two-way mirror and the ability for the counselor to communicate with the parent via microphone. The focus of training includes communication and behavior management skills with additional homework sessions (Urquiza & Timmer, 2012). In a study by Shanley and Niec (2010), parents who were coached via a bug-in-ear receiver with in vivo feedback during parent–child interactive play demonstrated rapid increases in positive parenting skills and interactions. Of these discipline strategies, timeout is the most common, as it is “a brief, effective, aversive treatment that does not hurt a child either physically or emotionally” (Eaves, Sheperis, Blanchard, Baylot, & Doggett, 2005, p. 252). Furthermore, Eaves et al. (2005) wrote that timeout benefits both children with problematic behaviors and those who view the technique being used on other children, in addition to those children and adolescents demonstrating developmental delays, psychiatric issues and defiance. However, for the parent to experience timeout’s full benefit, the technique must be appropriately and consistently administered. Eaves et al.’s (2005) article, “Teaching Time-Out and Job Card Grounding Procedures to Parents: A Primer for Family Counselors,” offers an excellent overview of timeout and why it is an effective intervention.

All aspects of timeout are reviewed with the parents, such as the rationale for timeout, where timeout should take place in the home, what to do when the child escapes timeout, what to do if the child does not comply with timeout, the length of timeout, what should happen right before timeout and what should happen right after timeout. Parents receive written instructions illustrating each step of timeout and offering guidance on how to implement the procedure. These discipline strategies may not be necessary if a child is motivated to be compliant. Determining compliance, however, is often difficult for parents. According to McNeil and Hembree-Kigin (2010), there are several rules used to help parents determine compliance or noncompliance.

  1. Parents must be sure that the instructions are developmentally appropriate for the child. If the child is asked to bring the orange cup to the parent, one must know that the child can determine which cup is actually orange.
  2. Parents should know that the request is completely understood by the child. If there are any questions about this the parents should point or direct the child to help him or her fully understand the request.
  3. Parents should allow the child approximately 3 seconds to begin to attempt the task. If the child has not begun to attempt the task by this time, it should be considered noncompliance.
  4. Parents should state the request only once. If the child pretends not to hear the request, this should be considered noncompliance.
  5. Parents should not allow the child to partially comply with instructions. If parents accept half-compliance then children will often repeat the negative behavior because they know they can get away with it.
  6. Parents should not respond to a child’s bad attitude in completing a request. As long as he or she completes the instruction, it is compliance.
  7. Parents should consider it compliance if a child does what is asked and then undoes what is asked. Compliance is compliance no matter how long it lasts.

When a parent determines that a child is compliant, verbal praise should be provided. This praise should be given immediately and focus on the child’s compliance. Parents are encouraged to practice the skills of giving good directions by delivering multiple commands to the child. These commands are given during the playtime and may include requests to hand things to the parent (e.g., “Give me the red block”) or play with certain toys (e.g., “Place the blue car in the box”). This activity allows the child to practice following directions, while also affording the parent the opportunity to practice praise (McNeil & Hembree-Kigin, 2010). The child begins to learn that when he or she follows directions, his or her parents are very appreciative and excited. After the small tasks are accomplished, parents begin to place demands on the child that are less desirable, such as cleaning up the toys or moving on to another task (McNeil & Hembree-Kigin, 2010). By assigning less desirable tasks, parents find themselves in a position to practice a timeout procedure with the child. The professional counselor is there to model timeout and coach the parents when practicing timeout.

The third and final component to consider is called cleanup, which occurs as part of PDI. This time during the PCIT process is exactly what one might think; it is intended to teach the child to clean up the toys at the end of the parent–child interaction in both the counseling and home milieus. Cleanup should be done without the parents’ help but with the parents’ direction. Although this component may seem simple, it tends to be a challenging situation, as significant behavior problems often are displayed during this phase. The expectation is that this phase lasts 5 minutes, but time varies depending upon the behavior of the child (McNeil & Hembree-Kigin, 2010). Cleanup occurs only at the end of parent-directed play, not at the end of child-directed play, to avoid confusing the child about the role of parental help during cleanup. All three components—CDI, PDI and cleanup—are opportunities for behavioral observation and data collection.

Implementing PCIT

According to McNeil and Hembree-Kigin (2010), there are six steps in conducting PCIT with a family. These authors have briefly described the contents of each step as well as provided guidelines for the number of sessions typically devoted to completing the tasks within each step. Step 1 requires one to two sessions for the intake process, Step 2 requires one session to introduce and teach parents CDI skills, and Step 3 requires two to four sessions in which the parents are coached on these skills. Steps 4 and 5 introduce and coach the PDI and may take up to six sessions. The final session is the follow-up session. These six steps complete a 10- to 15-session triadic training program.

Step 1 is the initial intake and can be accomplished in one to two counseling sessions, unless classroom or other observations are warranted. These sessions consist of assessing family dynamics, obtaining the family’s perception of the presenting problems, probing for insights into the current disciplinary beliefs and methods held by the parents, and observing the natural interactions between parents and child. In addition to the information-gathering component, the clinician defines the parameters of the sessions as well as the rules and expectations. Certain parameters involve an understanding by the parents that this CDI time is designated for the child to lead and for the parent to follow—a time often described to the parents as time-in for the child. Thus, time-in is defined as a time when the child facilitates play by selecting the type of play and initiating all play interactions.

The initial informal observation usually takes place in a sitting area while the family is waiting to visit with the counselor. In this informal observation, the counselor looks for “the child’s ability to play independently, strategies the child uses to engage the parent’s attention, parental responsiveness to child overtures, parental limit-setting, warmth of parent-child interactions, and evidence of clinging and separation anxiety” (McNeil & Hembree-Kigin, 2010, p. 20). After this stage of observation, a more formal observation is completed using the Dyadic Parent–Child Interaction Coding System (DPICS; Eyberg & Robinson, 1983). This observation is typically accomplished in three 5-minute increments in which behaviors and verbalizations are marked on the DPICS sheet. The formal observation occurs over the three PCIT stages previously described—CDI, PDI and cleanup. Following the initial observations, a third observation may be executed as a classroom observation. This type of observation is done with students who attend day care, preschool or elementary school, and allows one to see the child interact within his or her daily environment. Observation occurring in an alternate setting increases the chances of obtaining normative behavior (McNeil & Hembree-Kigin, 2010).

In Steps 2 and 3, the counselor presents and provides coaching on the CDI skills. Step 2 typically requires one counseling session. During this time the parents are taught the behavioral play therapy skills of CDI. The third step, coaching the CDI skills, may take two to four sessions depending on how the family adopts these principles into their daily interactions with their child. Coaching is described as modeling the behavior for the family, allowing the family to practice in session with feedback and prompts as needed, assigning the family homework to practice, and then repeating these steps until the parents are comfortable and fluent in the process.

In Steps 4 and 5, respectively, the counselor teaches and coaches the parents about discipline skills. These skills include both PDI and compliance training. Step 4 is typically accomplished in one session. Coaching may last from four to six sessions. Again, coaching is described as modeling, in-session practice with feedback and prompts, assigning homework, and evaluating success.

Step 6 consists of having a follow-up counseling session. This session is an opportunity to assess the family’s progress with proper implementation of each component of the PCIT model, gauge the strides made in compliance and assess the overall family satisfaction gained throughout the process. Finally, booster sessions should be used to help parents maintain learned skills as they face new challenges with their children. Table 2 delineates the steps to implementing PCIT over a typical 10–15-session treatment plan.

Table 2

Implementing PCIT


Step   Task                                                         Number of sessions
1      Intake; informal and formal observation                      1–2
2      Teaching CDI skills                                          1
3      Coaching CDI skills                                          2–4
4      Teaching discipline skills via PDI and compliance training   1
5      Coaching PDI skills                                          4–6
6      Follow-up                                                    1


Case Study

PCIT was selected for use in the treatment of Manny, a 6-year-old Hispanic male diagnosed with autism who was noncompliant with his mother. Like many children with autism, Manny had difficulty with unpredicted changes and verbalization of concerns. As Manny’s frustration with communication increased, he demonstrated stereotypies such as hand flapping and eventually progressed to tantrum behavior. The two goals of treatment were to increase the frequency of appropriate verbalizations and to decrease the frequency of inappropriate behavior, including physical aggression, noncompliance and making noises. Manny was experiencing other issues related to autism, but his mother indicated that the behavioral problems were preventing him from making progress in other areas.

As a result, we decided to conduct a functional behavior analysis prior to beginning treatment. This assessment of Manny’s behavior indicated that some of the behavior disruptions were a means of seeking attention, and therefore it was determined that PCIT would teach the mother to provide more consistent attention for appropriate behavior and to encourage appropriate communication more effectively. If needed, the addition of the timeout component was available after the mother began adequately attending to Manny’s appropriate behavior and ignoring inappropriate behavior.

Session 1

The counselor explained the procedure and rationale for PCIT to the mother, including CDI, PDI and timeout. CDI was modeled and demonstrated with Manny. The mother was uncomfortable about being judged on her parenting skills, so it was decided that she would practice the skills at home using the Child’s Game nightly with Manny. The Child’s Game is simply defined as any free play activity the child chooses. The family would return to the clinic in 1 week.

Session 2

The counselor reviewed CDI and had the mother conduct the Child’s Game for 5 minutes. During CDI, the counselor observed and noted the mother’s responses. The mother included 13 questions, one criticism and one demand in the 5-minute session. The mother praised Manny frequently, but did not use the other desired skills often. Manny was compliant with the demand that the mother gave and did not exhibit any of the disruptive behaviors. Following the CDI, feedback was given to the mother about increasing descriptions, reflections, imitations and praises, and reducing questions. The mother also was encouraged to recognize and praise communication attempts. Overall, the mother was directed to allow Manny to lead the play. When queried about CDI practice at home, the mother reported that the activity the family had used for the Child’s Game was watching television. Because there is no inherent interaction in television viewing, the mother was directed to provide a choice to play with action figures or art materials, both indicated as reinforcing by Manny, in place of video games or television. The Child’s Game was again given as homework.

Session 3

The professional counselor reviewed CDI and viewed the family during the Child’s Game. The mother showed improvement using descriptions (16), reflections (3), imitations (1) and praises (15). She also limited her use of questions (6), criticisms (0) and demands (0). However, Manny exhibited disruptive behavior in 23% of the observed intervals. The mother also reported that Manny continued to be noncompliant and make noises at home. The professional counselor introduced PDI and timeout. Each was modeled with Manny, and his mother was allowed to practice and receive feedback. Homework was to continue the Child’s Game, issue 10 demands throughout the day and follow through with the brief timeout procedure. Also, the mother was asked to develop five house rules to bring the following week. To keep a record of the number of instructions with which Manny complied before going to timeout, and the number of timeouts per day, the mother received a homework compliance worksheet to keep for 1 week. This log allows the parent to record the homework—in this case, using the Child’s Game daily, issuing 10 demands throughout each day and recording Manny’s compliance with each, and using timeout as indicated.

Session 4

To begin the session, the counselor reviewed PDI, effective instruction delivery and timeout. The counselor then observed the family during CDI/PDI. The mother gave clear, concise instructions six out of nine times, only failing to wait before reissuing instructions when Manny did not immediately comply. Manny complied with all issued demands except when the mother reissued the demands too quickly. The mother followed Manny’s compliant behavior with praise statements four out of nine times. Manny was put in timeout for disruptive behavior and the mother used the procedure correctly. Manny demonstrated disruptive behavior during 33% of the observed intervals. A review of the homework compliance worksheet from the previous week indicated that Manny complied with 10 out of 10 instructions on 5 out of 7 days, and nine out of 10 instructions the remaining 2 days. The mother was encouraged to continue generalizing the skills she had learned throughout the day. The house rules developed by the family over the previous week were discussed, worded in positive statements and then introduced to Manny. The rules were explained, and both examples and non-examples were modeled. Homework was given to continue incorporating the Child’s Game, issuing 10 demands in a brief period of time, using timeout as needed and recording compliance rates for 1 week.

Session 5

The counselor reviewed PDI, effective instruction delivery, timeout and the homework compliance worksheet. The mother indicated that Manny had been compliant before timeout 10 out of 10 times for 6 days and nine out of 10 times for 1 day. The mother also noted that Manny had been placed in timeout for breaking house rules. The mother reported that Manny’s behavior had improved and he had had fewer tantrums related to schedule changes. She was encouraged to continue using the PCIT skills and adapting them to more situations. Because compliance was increasing, it was not necessary to continue CDI and PDI in this session. The family was given homework to continue the Child’s Game, PDI, using timeout as needed and recording compliance rates. This time, the family was to work at home for 2 weeks before the next session.

Session 6

The counselor reviewed the family’s progress and addressed further generalization and concerns about daycare. The mother indicated that the child had been compliant before timeout on 10 out of 14 days. On two of the other days, Manny had been placed in timeout 10 times and six times for violating house rules. The zero out of 10 compliance rating occurred during his birthday party, and the six out of 10 compliance rating was primarily the result of an unexpected trip to the grocery store. The family was again given homework to continue practicing and generalizing CDI and PDI, using timeout as needed and recording compliance rates for 2 weeks.

Session 7

The counselor addressed concerns, including the beginning of school in a few weeks, and provided suggestions to ease the transition. While the mother indicated that Manny had been compliant before timeout on only 4 of the previous 14 days, a review of the compliance rates revealed that on the other 10 days, Manny was compliant no less than 80% of the time. These compliance rates from various family settings were indicative of behaviors being generalized across settings. The mother also showed evidence of her generalization of skills by adapting the house rules to address new problematic behaviors. The family was encouraged to begin reviewing material learned in the previous sessions and work on behavioral skills such as sitting for appropriate lengths of time. The mother was instructed to continue both the use of her attending skills to reinforce appropriate behavior and the use of the timeout procedure to diminish inappropriate behaviors.

Session 8

For the final follow-up session, the counselor reviewed the family’s progress and determined that treatment goals were met. Concerns about how to get other family, friends and teachers to use PCIT skills with Manny were addressed in this final session. The family noted the improvements made as a result of PCIT and felt equipped to maintain the behavioral changes gained as a result of this counseling approach. Termination of the PCIT intervention was appropriate at this time; the case provided clear evidence of the application and utility of the PCIT model. Manny’s mother was offered the opportunity to continue interventions related to the other autism-specific issues that Manny was experiencing.


Conclusion

Professional counselors, whether working with children who have disruptive behavior or providing parent training to families, should be knowledgeable about the application of various behavioral techniques in order to utilize them effectively and to teach them to parents. Research has demonstrated that, when implemented appropriately, PCIT procedures are effective in reducing undesirable and problematic behaviors in children and adolescents. Furthermore, it is clear that PCIT can be effectively applied to behavioral issues faced by children with special needs. We suggest that counselors who are interested in PCIT seek additional training to develop mastery of the techniques.

PCIT is a complex process that is often mistakenly viewed as simplistic. Thus, counselors who use PCIT without appropriate training will likely provide ineffective parental coaching. This point is especially important when working with children who have special needs. These children often present with numerous significant issues and deserve appropriate application of evidence-based interventions. We strongly suggest that counselors complete the web-based training provided by the University of California at Davis Children’s Hospital. The training is free and can be accessed online. Given that PCIT is an effective approach and that the effectiveness of the model increases with appropriate education, professional counselors who further educate themselves on PCIT’s uses and applications can benefit their practices and the families they serve through the correct use of this empirically validated method of behavioral family counseling.

Counselors who are interested in PCIT also should consider advancing research related to counseling applications. While PCIT has been shown to be an effective intervention for autism and other disorders, more research is needed. We encourage counselors to consider implementation of studies that determine outcomes of PCIT for various child disorders and to conduct program evaluation for PCIT-based clinics.

Conflict of Interest and Funding Disclosure

The authors reported no conflict of interest or funding contributions for the development of this manuscript.


References

Agazzi, H., Tan, R., & Tan, S. Y. (2013). A case study of parent–child interaction therapy for the treatment of autism spectrum disorder. Clinical Case Studies, 12, 428–442. doi:10.1177/1534650113500067

Anticich, S. A. J., Barrett, P. M., Gillies, R., & Silverman, W. (2012). Recent advances in intervention for early childhood anxiety. Australian Journal of Guidance and Counselling, 22, 157–172. doi:10.1017/jgc.2012.24

Asawa, L. E., Hansen, D. J., & Flood, M. F. (2008). Early childhood intervention programs: Opportunities and challenges for preventing child maltreatment. Education and Treatment of Children, 31, 73–110.

Bagner, D. M., & Eyberg, S. M. (2007). Parent–child interaction therapy for disruptive behavior in children with mental retardation: A randomized controlled trial. Journal of Clinical Child and Adolescent Psychology, 36, 418–429. doi:10.1080/15374410701448448

Bahl, A. B., Spaulding, S. A., & McNeil, C. B. (1999). Treatment of noncompliance using parent–child interaction therapy: A data-driven approach. Education and Treatment of Children, 22, 146–156.

Baker, A. J. L., & Andre, K. (2008). Working with alienated children & their targeted parents. Annals of the American Psychotherapy Association, 11(2), 10–17.

Borrego, J., Jr., Anhalt, K., Terao, S. Y., Vargas, E. C., & Urquiza, A. J. (2006). Parent–child interaction therapy with a Spanish-speaking family. Cognitive and Behavioral Practice, 13, 121–133.

Chaffin, M., Silovsky, J. F., Funderburk, B., Valle, L. A., Brestan, E. V., Balachova, T., . . . Bonner, B. L. (2004). Parent–child interaction therapy with physically abusive parents: Efficacy for reducing future abuse reports. Journal of Consulting and Clinical Psychology, 72, 500–510. doi:10.1037/0022-006X.72.3.500

Choate, M. L., Pincus, D. B., Eyberg, S. M., & Barlow, D. H. (2005). Parent–child interaction therapy for treatment of separation anxiety disorder in young children: A pilot study. Cognitive and Behavioral Practice, 12, 126–135. doi:10.1016/j.cbpra.2005.09.001

Eaves, S. H., Sheperis, C. J., Blanchard, T., Baylot, L., & Doggett, R. A. (2005). Teaching time-out and job card grounding procedures to parents: A primer for family counselors. The Family Journal: Counseling and Therapy for Couples and Families, 13, 252–258. doi:10.1177/1066480704273638

Eyberg, S., & Boggs, S. (1989). Parent training for oppositional-defiant preschoolers. In C. E. Schaefer & J. M. Briesmeister (Eds.), Handbook of parent training: Parents as co-therapists for children’s behavior problems (pp. 105–132). New York, NY: Wiley & Sons.

Eyberg, S. M. (1999). Parent–child interaction therapy: Integrity checklists and session materials. Retrieved from

Eyberg, S. M., Funderburk, B. W., Hembree-Kigin, T. L., McNeil, C. B., Querido, J. G., & Hood, K. K. (2001). Parent–child interaction therapy with behavior problem children: One and two year maintenance of treatment effects in the family. Child & Family Behavior Therapy, 23(4), 1–20. doi:10.1300/J019v23n04_01

Eyberg, S. M., & Robinson, E. A. (1983). Conduct problem behavior: Standardization of a behavioral rating scale with adolescents. Journal of Clinical Child Psychology, 12, 347–354. doi:10.1080/15374418309533155

Gladding, S. T. (2011). Family therapy: History, theory, and practice (5th ed.). Upper Saddle River, NJ: Prentice Hall.

Guttmann-Steinmetz, S., Crowell, J., Doron, G., & Mikulincer, M. (2011). Associations between mothers’ and children’s secure base scripts in ADHD and community cohorts. Attachment & Human Development, 13, 597–610. doi:10.1080/14616734.2011.609010

Hood, K. K., & Eyberg, S. M. (2003). Outcomes of parent–child interaction therapy: Mothers’ reports of maintenance three to six years after treatment. Journal of Clinical Child and Adolescent Psychology, 32, 419–429. doi:10.1207/S15374424JCCP3203_10

Johnson, B. D., Franklin, L. C., Hall, K., & Prieto, L. R. (2000). Parent training through play: Parent–child interaction therapy with a hyperactive child. The Family Journal: Counseling and Therapy for Couples and Families, 8, 180–186. doi:10.1177/1066480700082013

Luby, J., Lenze, S., & Tillman, R. (2012). A novel early intervention for preschool depression: Findings from a pilot randomized controlled trial. Journal of Child Psychology and Psychiatry, 53, 313–322. doi:10.1111/j.1469-7610.2011.02483.x

Matos, M., Bauermeister, J. J., & Bernal, G. (2009). Parent–child interaction therapy for Puerto Rican preschool children with ADHD and behavior problems: A pilot efficacy study. Family Process, 48, 232–252. doi:10.1111/j.1545-5300.2009.01279

Matos, M., Torres, R., Santiago, R., Jurado, M., & Rodríguez, I. (2006). Adaptation of parent–child interaction therapy for Puerto Rican families: A preliminary study. Family Process, 45, 205–222. doi:10.1111/j.1545-5300.2006.00091.x

McCabe, K., & Yeh, M. (2009). Parent–child interaction therapy for Mexican Americans: A randomized clinical trial. Journal of Clinical Child and Adolescent Psychology, 38, 753–759. doi:10.1080/15374410903103544

McNeil, C. B., Eyberg, S., Eisentadt, T. H., Newcomb, K., & Funderburk, B. (1991). Parent–child interaction therapy with behavior problem children: Generalization of treatment effects to the school setting. Journal of Clinical Child Psychology, 20, 140–151. doi:10.1207/s15374424jccp2002_5

McNeil, C. B., & Hembree-Kigin, T. L. (2010). Parent–child interaction therapy. New York, NY: Springer.

Neary, E. M., & Eyberg, S. M. (2002). Management of disruptive behavior in young children. Infants and Young Children, 14(4), 53–67.

Nieter, L., Thornberry, T., Jr., & Brestan-Knight, E. (2013). The effectiveness of group parent–child interaction therapy with community families. Journal of Child and Family Studies, 22, 490–501.

Shanley, J., & Niec, L. N. (2010). Coaching parents to change: The impact of in vivo feedback on parents’ acquisition of skills. Journal of Clinical Child and Adolescent Psychology, 39, 282–287. doi:10.1080/15374410903532627

Solomon, M., Ono, M., Timmer, S., & Goodlin-Jones, B. (2008). The effectiveness of parent–child interaction therapy for families of children on the autism spectrum. Journal of Autism and Developmental Disorders, 38, 1767–1776. doi:10.1007/s10803-008-0567-5

Tarbox, J., Wilke, A. E., Najdowski, A. C., Findel-Pyles, R. S., Balasanyan, S., Caveney, A. C., . . . Tia, B. (2009). Comparing indirect, descriptive, and experimental functional assessments of challenging behavior in children with autism. Journal of Developmental and Physical Disabilities, 21, 493–514. doi:10.1007/s10882-009-9154-8

Thomas, R., & Zimmer-Gembeck, M. J. (2007). Behavioral outcomes of parent–child interaction therapy and Triple P—Positive Parenting Program: A review and meta-analysis. Journal of Abnormal Child Psychology, 35, 475–495.

Timmer, S. G., Ho, L. K. L., Urquiza, A. J., Zebell, N. M., Fernandez y Garcia, E., & Boys, D. (2011). The effectiveness of parent–child interaction therapy with depressive mothers: The changing relationship as the agent of individual change. Child Psychiatry & Human Development, 42, 406–423. doi:10.1007/s10578-011-0226-5

Urquiza, A. J., & Timmer, S. (2012). Parent–child interaction therapy: Enhancing parent–child relationships. Psychosocial Intervention, 21, 145–156. doi:10.5093/in2012a16

Webster-Stratton, C., & Herbert, M. (1993). What really happens in parent training? Behavior Modification, 17, 407–456. doi:10.1177/01454455930174002


Carl Sheperis, NCC, is the Chair of the Department of Counseling and Special Populations at Lamar University. Donna Sheperis, NCC, is an Associate Professor at Lamar University. Alex Monceaux is an instructor at Lamar University. R. J. Davis and Belinda Lopez are Assistant Professors at Lamar University. Correspondence may be addressed to Carl Sheperis, Box 10034, Beaumont, TX 77710,