New Research Study: Job Analysis
Job analysis is an important part of the validity argument for credentialing programs. As Myers, Morral, and Sares1 point out, a mixed-methods approach on a three- to seven-year cycle is most useful for establishing content validity and currency. The I.C.E. Research and Development (R&D) Committee was interested in documenting how credentialing organizations approach job analysis, evaluating common industry practices and exploring new technologies that may be applied to job analysis. Thus, it embarked on a study in which a taskforce within the committee surveyed the credentialing community on current practices regarding frequency, format and operational considerations, using both targeted and open-ended questions about new approaches being explored.
The Job Analysis Research Study reports on that survey of industry practices conducted by the R&D Committee, and begins to document how credentialing organizations approach job analysis, benchmarking common practices and identifying new technologies under exploration.
ACCESS THE STUDY NOW
Learn more below about the report and what its authors hope readers will gain from it.
“The job analysis practitioner survey covered current practices, new technologies being explored, survey construction, and psychometric practices used in the credentialing industry. Job analysis is one of the critical steps in ensuring the ongoing relevance, validity and legal defensibility of a credential. For example, job analysis guidelines suggest that a mixed-methods approach every three to seven years is appropriate, and this report shows how frequently organizations conduct job analyses and what methods are used. From a practical perspective, it is also useful to know the average cost of conducting a job analysis.” —Kirk Becker, PhD
“I think readers will be interested to see some of the fine details of how individual organizations carry out job analysis surveys. For instance, how common is it to use frequency and criticality as rating categories versus importance or other measures, and how do those choices break down by industry or credential type? It’s also interesting to see what purposes, other than blueprint construction, organizations put their job analyses to — e.g., evaluation of eligibility requirements. There is also data on what response rates organizations are able to achieve and what incentives they use to do so, a topic that can be uncomfortable to discuss.” —Matt Ferris, MA, MBA, CAE, ELS
“The paper’s primary value lies in its benchmarking of practices across the industry. We’ve all heard before that ‘there’s no single way to do a job analysis.’ The paper tries to encapsulate all the different dimensions on which job analyses tend to vary — approaches, survey rating scales, even terminology (job task analysis, practice analysis, role delineation study, skills inventory). It also describes the degree of variation within each of these dimensions across organizations. Readers will see how their credentialing body ‘fits in’ within broad industry practice.” —Tim Muckle, PhD, ICE-CCP
“The paper gives readers a sense of what organizations are doing in their job analyses — frequency of conducting them, number of task statements, incentives, types of analyses, etc. My hope is that the paper can help organizations that are looking to undertake a job analysis for the first time, looking to change their approach moving forward, or looking to affirm how they conducted their prior job analysis.” —Pooja Shivraj, PhD