The Professional Counselor | Volume 10, Issue 4

in doctoral-level CES programs (M = 17.3 years, SD = 9.2 years, Mdn = 16 years), ranging from 3 to 33 years. More than half of the participants (n = 9, 60%) had spent their entire careers working in doctoral-level CES programs. The geographic distribution of the programs where participants worked was as follows: eight belonged to the Southern region (53.3%); two each (13.3%) belonged to the North Atlantic, North Central, and Western regions; and one program (6.7%) belonged to the Rocky Mountain region. Twelve participants (80%) were working in brick-and-mortar programs, and three (20%) were working in online or hybrid programs. With regard to Carnegie classification, nine participants (60%) were working at Doctoral Universities – Very High Research Activity (i.e., R1) institutions, two (13.3%) were working at Doctoral Universities – High Research Activity (i.e., R2) institutions, and four (26.7%) were working at universities with the Master's Colleges and Universities: Larger Programs designation (The Carnegie Classification of Institutions of Higher Education, 2019; Preston et al., 2020).

Procedure

After receiving approval from the last author's IRB, the last author used the CACREP (2018) website directory to identify and recruit doctoral-level counselor educators working at CACREP-accredited CES programs. Recruitment emails were sent to one faculty member at each of the 85 accredited programs. Fifteen of the 34 faculty who responded (a 40% response rate) were selected to participate on the basis of maximal variation.

Interview Protocol

Each interview began with demographic questions that addressed self-identified characteristics such as race, ethnicity, gender, sexual/affectional orientation, years as a faculty member, years working in doctoral-level CES programs, number of doctoral programs in which the participant had worked, and the regions of the programs in which the counselor educator had worked.
A series of eight in-depth interviews followed, addressing the research questions of the larger qualitative study. Interview questions, developed in accordance with Patton's (2014) guidelines, were open-ended, as neutral as possible, avoided "why" questions, and were asked one at a time in a semi-structured interview protocol, with sparse follow-up questions salient to the main questions to ensure understanding of participant responses. Adhering to the interview protocol as outlined in Appendix A helped ensure that data were gathered for each research question to the fullest extent possible. Participants received the interview questions ahead of time upon signing the informed consent agreement. The interview protocol was piloted with a faculty member in a doctoral-level CES program before the study commenced. Interviews lasted approximately 60 minutes and were recorded using the Zoom online platform. The one exception was an interview that occurred in person during a professional conference and was therefore recorded with a Sony digital audio recorder. All demographic information and recordings were assigned an alphabetical identifier known only to the last author and were blinded to subsequent transcribers and coders.

Data Analysis

Data analysis, as outlined by Corbin and Strauss (2015), employed coding of interview data to derive and develop concepts. In the initial step of open coding, the primary task is to "break data apart and delineate concepts to stand for blocks of raw data" (Corbin & Strauss, 2015, p. 197). During this step, the coding team sought to identify significant participant statements about how they and their departments perceived, valued, and experienced the responsibility of recruiting, retaining, and supporting underrepresented cultural groups. We met to code the first three of the 15 transcripts together via the Zoom video platform.
The task of identifying codes included searching for data salient to the research questions and engaging in constant comparison until reaching saturation (Corbin & Strauss,