
The Professional Counselor / Volume 5, Issue 4

Participants

One hundred sixty-six respondents completed the ACES survey (33% response rate). In terms of rank, 35 respondents (21%) indicated they were a professor, 53 (32%) associate professor, 49 (30%) assistant professor, 23 (14%) non-tenure track (clinical or adjunct), and 6 (3%) indicated they fell into an other category. About 51% of the respondents (84) had taught a doctoral-level counselor education course before, and the other half (81) had not, having taught only master's-level classes. Twenty-seven percent (44) of respondents reported they had never served on a CES faculty search committee. Among the respondents who had served on CES faculty search committees, 44% (72) had served on 1–4 committees, 19% (31) on 5–8 committees, 4% (7) on 9–12 committees, and 6% (10) on more than 12 committees. Eighteen of 57 CACREP liaisons responded to the survey (32% response rate). Demographic data were not collected from this group.

Survey Design

To address the stated research questions, the authors deemed it important to request demographic information on rank, programs offered, doctoral teaching experience and the number of search committees on which the participants had served. Two questions were developed asking for the level of importance of qualifications when considering candidates for a tenure-track position and a non-tenure track (i.e., adjunct or clinical) position. The qualifications the authors identified were: post-master's counseling, publications, grants, supervision, college teaching, professional organization involvement and professional organization leadership. Participants rated the level of importance as 1 (not at all), 2 (somewhat), 3 (quite a bit) or 4 (extremely). The participants also were asked to provide a minimum quantity for each qualification if they deemed the qualification quite a bit or extremely important.
The qualifications included were selected based on a survey of position announcements for CES positions. Four hypothetical scenarios were presented to the participants, involving serving on a search committee and serving as an advisor to a master's student with particular questions about pursuing a doctoral degree. Each hypothetical scenario question asked for a response and a rationale for that response. Researchers piloted the survey with three faculty members, who all reported that the survey was clear. The pilot participants' responses were reviewed to ensure survey questions measured what was intended.

Data Analysis

The authors analyzed the demographic and scaling questions by counts and percentages using the results produced by the SurveyMonkey software. The results include numerical counts of the participant responses. The authors analyzed responses to the open-ended comment requests using a constant comparative method described by Anfara, Brown, and Mangione (2002), along with a form of check coding described by Miles and Huberman (1994). The first three authors served as the analysis team for this process. Two team members independently conducted a first iteration of assigning open codes for each of the five open-ended questions by reading the data from each question broadly and noticing regularities (Anfara et al., 2002). The two authors then conducted a second iteration of comparison within and between codes in order to create categories and identify themes. The constant comparative method allows researchers to make sense of large amounts of data by first organizing it into manageable parts and subsequently identifying themes and patterns. The third team member served in a peer review capacity (Miles & Huberman, 1994) during the categorizing and theme identification for that question. For each question, different team members were assigned as coders and the peer reviewer. Once the team members assigned individually
