Features

Advancing Credentialing with the ICE R&D Committee

The ICE R&D Committee keeps pace with cutting-edge trends and best practices in credentialing in order to keep the larger community informed. We spoke with John Wickett, R&D Committee Chair, to learn how the group operates, hear highlights from 2017 and see what the committee has in store this year.

For individuals leading credentialing organizations, the ability to keep pace with cutting-edge trends and best practices is critical. This information not only helps them improve the programs they administer but ultimately ensures those programs are held to a standard of excellence. No group of individuals understands this better than the ICE Research and Development (R&D) Committee. Created in 2007, this group of credentialing professionals – all volunteers – works to better inform the ICE community through its data and research reports, released regularly on the ICE website.

This year’s committee is made up of 18 members, including Chair John Wickett, PhD, and Vice Chair Jerry Reid, PhD; members are divided into six task forces carrying forward from 2017. At least four new topics are on the agenda for 2018: further work on noncognitive competency assessment, competency modeling, new models for credentialing and statistical models for small candidate-volume programs.

Topics and potential research questions are drawn from a variety of sources, including the ICE Board, staff and the committee members themselves. ICE Exchange attendees are also invited to submit topics during the conference, and this is often a rich source of ideas. Following the Exchange, the Chair and Vice Chair meet to narrow the list – between 30 and 40 potential topics submitted throughout the year – down to 10-15. The committee then votes, ranking the topics by their value to the ICE community and by how compelling each is as a research question. Both criteria are essential, notes Wickett, because “the success of any task force is based on its topic having interest for researchers and on its utility to the ICE community.”

The heart of the committee’s work rests with the task forces. Typically composed of four to six members each, these dedicated groups determine which specific research questions will be addressed, as well as the deliverables that will result. A project plan is outlined, followed by data collection, which may involve a literature review, a survey tool or telephone interviews, among other options. Once the data has been collected and analyzed, the task force divides the work of creating the final deliverables. After the task force has completed its draft, the full committee reviews it and provides feedback. At least two committee members then conduct a thorough review of the end product, and once revised, it is submitted to ICE for final editorial review before publication. While the process can be summarized in a few steps on paper, it can take anywhere from one to two years – allowing the research its due time, with input from multiple stakeholders.

Wickett sees great value in the research for ICE members. “If you’re a member organization in ICE, you’re probably surrounded by similarly structured and sized organizations,” he said. “Getting to see what others are doing has a great deal of utility. It provides ideas on how to do things better and metrics on how other organizations perform. There are all sorts of benchmarking information in these reports that may lead to better decisions – or validation of the processes that organizations are already following.” Knowing these reports may help improve the work of others in the profession drives Wickett’s passion for leading the R&D Committee.

The committee regularly explores new and emerging topics, a critical step in helping ensure credentialing organizations have the research and tools to continuously advance their practice. Recent examples include microcredentialing and digital badging (led by Patricia Muenzen), remote proctoring (led by Karen Plaus) and noncognitive competency assessment (led by John Weiner). “These emerging areas are not widely adopted but are of potential interest to credentialing organizations; the research we conduct is a way to help organizations weigh the benefits and risks, or, if nothing else, review and decide it’s not the right fit for their organization,” Wickett said.

The committee has also seen increased interest in noncognitive competency testing, which is driving some of its research this year. Anecdotal data shows employers are expressing more interest in soft skills – things like emotional intelligence, communication and personality traits. And because the competency approach has historically worked well for testing technical skills, there is new interest in applying it to soft skills. However, certain issues should be thoughtfully reviewed. For example, there is no one right way to communicate. “There may be some wrong ways to communicate, but defining and then measuring the right way is a difficult task to do reliably with a relatively small number of test items,” Wickett explained. “It raises an important issue that yes, you may have the technical skills [to complete the job], but you may not be effective in a work setting. And the question arises as to whether it is the responsibility of the certifying body to address that before the credential is issued.” The committee will continue its work on noncognitive competency this year with a part-two follow-up to last year’s report, Assessing Noncognitive Competence.

Other projects released in 2017 include work produced jointly with ATP on Innovative Item Types, led by Patricia Young. The paper provides an overview of considerations and best practices for incorporating alternative item types into an assessment. Last year also saw the release of Validation Strategies for Credentials, a project led by Lawrence Fabrey. This research report documents common methods licensure and certification programs use to establish their programs’ validity. Any program looking to verify the validity evidence for its credential will find considerable value in this report.

ICE members should stay tuned for new work slated for release in 2018, including a revamped business of certification survey on a new survey platform (led by Patricia Young), a follow-up on recertification benchmarks in use across credentialing bodies (led by Patricia Muenzen), best practices in standard setting (led by Lawrence Fabrey), and guidance for organizations considering international expansion (led by Tony Zara). This year will also see the launch of a new online research library (led by JG Federman), which will serve as an essential resource for anyone working in credentialing.

The record of past ICE research projects is extensive, and the year ahead is promising for the R&D Committee. Learn more and take advantage of this valuable resource by visiting the Research section of the ICE website.