Using Nudges to Improve the Credentialing Program

By Cynthia G. Parshall and Ellen Julian

When people want to take action but have difficulty following through, behavioral design tools called “nudges” can help. Nudges encourage people to make choices in their own interest (Thaler & Sunstein, 2008). The best applications of behavioral design (also known as Action Design) use subtle influences to support people’s choices to exercise more, save for retirement, develop a study habit, or carry out other positive actions.

Behavioral design draws on a large body of research from behavioral economics, cognitive psychology, and user experience, and has accumulated substantial evidence that well-planned nudges can make a powerful difference. These behavioral techniques are being used extensively at companies like Google and Amazon, in more than 60 governmental “nudge units” around the world, and in a wide range of industries, including education, health, and finance.

In assessment, nudges can support examinees, subject-matter experts (SMEs), staff, and other stakeholders in taking actions they already want to take. For example, well-framed and well-timed messages can effectively increase the number of test-takers who show up for a scheduled exam. Nudges can also help test-takers meet registration deadlines, persist in their studies, and take their tests honestly. Nudging SMEs can increase survey response rates, committee volunteer numbers, and timely item-writing contributions.

Nudge Techniques

While dozens of behavioral tools have been identified in the research literature, a handful of these nudge techniques are particularly useful and worth considering for credentialing programs. These high-value nudge tools include loss aversion, social proof, and framing. Each of these tactics is illustrated below in a credentialing context.

Loss aversion to encourage test-takers to schedule an exam before the end of a testing window

People typically anticipate greater pain for a loss than they do pleasure for an equivalent gain. As a result, people respond more strongly when they are told they might lose something.

  • Encourage your test-takers to take an action by describing it in terms of what they might lose (“Schedule your test before the window closes...”), rather than gain (“The testing window is now open…”).

Social proof to encourage honesty in test preparation and test taking

Social proof is one of the most powerful – and most often misused – of all the nudge tools available. People are frequently influenced toward a course of action when they are told what the majority of others do, especially when they admire those others or are uncertain about what they should do.

  • Use social proof to give test-takers a sense that other test-takers prepare for and respond to tests honestly, e.g., “The vast majority of our test-takers prepare for their tests honestly and earn the scores they receive.”
  • However, it’s important not to emphasize the number of dishonest test-takers. When that misuse of social proof occurs, it backfires and creates a social norm that “if so many are doing it, it must not be that bad.”

Framing to encourage SMEs to respond to surveys

Nudges can encourage behaviors when they are framed in terms that are personally meaningful to the users.

  • To increase survey response and completion rates, help the recipients understand why completing the survey is meaningful to them, e.g., “You can help determine what content the test will cover” rather than defining the survey as useful to the exam program.


While the evidence for the effectiveness of nudges is clear, there is no magic pill. A truly successful implementation requires research within the actual context. In other words, while there is a great deal of evidence that nudges can be generally effective, any nudge built for a specific purpose will need to be tried out in that context and with that target audience.

The first step in planning a nudge implementation is to identify the key behavior to be targeted. As an example, consider a certification program that has a problem with test-takers failing to recertify on time. The organization sends certificants an email reminder that includes a link to the renewal website page. However, few people use the link. Researchers at the testing organization may decide that “register with the email renewal link” is the key behavior they want to affect.

The next step is to conduct a behavioral audit: a detailed analysis of every action that a user must take to accomplish the task. A major purpose of a behavioral audit is to identify every potential barrier, or point of friction, that the user encounters.

In the recertification email example, where the key behavior is “register with the email renewal link,” the researchers may find that many recipients never open the email. This could lead the researchers to suspect that one barrier could be the email subject line, which might be unmotivating. Another point of friction could be within the message itself, which might be overly long or unclear. Or there may be barriers with the renewal process on the website. If certificants expect that the process will be time-consuming and difficult, many of them may postpone the anticipated unpleasant task indefinitely.
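A behavioral audit of this kind can be sketched as a simple funnel drop-off analysis. The step names and counts below are hypothetical, purely to illustrate how comparing drop-off rates between steps can point to the barrier most worth addressing first:

```python
# Hypothetical funnel counts for the renewal email (illustrative only)
funnel = [
    ("Email delivered", 10000),
    ("Email opened", 2200),
    ("Renewal link clicked", 600),
    ("Renewal completed", 450),
]

# Compute the relative drop-off at each step of the funnel
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    drop = 1 - next_count / count
    print(f"{step} -> {next_step}: {next_count}/{count} ({drop:.0%} drop-off)")

# The largest relative drop-off suggests which friction point to study first
worst = max(zip(funnel, funnel[1:]), key=lambda pair: 1 - pair[1][1] / pair[0][1])
print(f"Largest drop-off: {worst[0][0]} -> {worst[1][0]}")
```

With these illustrative numbers, the biggest loss occurs before the email is even opened, which is consistent with suspecting the subject line as a barrier.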

From the list of barriers, researchers can select the most important one. This is often a friction point that seems especially likely to be affecting the key behavior. In the example, an unmotivating subject line seems likely to be reducing the number of recipients who “register with the email renewal link.”

Potential nudges geared to the key behavior are then considered. In our certification example, the researchers might decide to try a new email subject line using the nudge technique of loss aversion (e.g., “Renew before it’s too late!”) or social proof (e.g., “Most certificants easily renew using this link”). With a key behavior in mind, a specific friction point to address, and a promising nudge technique identified, an experiment can be planned.

Different types of research can be employed, depending on the barrier that is addressed. Changes to the format and content of the email might be quickly tested with a small qualitative usability study, before approving a version to use with the full population. Changes to an email subject line or a webpage can be investigated with a quantitative A/B test. For example, the “open rate” of two versions (version A and version B) of the email subject line can be compared to see if the new subject line had an impact.
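As a sketch of how such an A/B comparison of open rates might be analyzed, the following uses a standard two-proportion z-test. The send and open counts are hypothetical, chosen only to show the mechanics of the comparison:

```python
from math import sqrt, erf

def two_proportion_z_test(opens_a, sent_a, opens_b, sent_b):
    """Compare the open rates of two email subject lines (two-sided z-test)."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled proportion under the null hypothesis of equal open rates
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical counts: 5,000 emails sent with each subject line
p_a, p_b, z, p = two_proportion_z_test(opens_a=900, sent_a=5000,
                                       opens_b=1050, sent_b=5000)
print(f"Version A open rate: {p_a:.1%}, Version B: {p_b:.1%}")
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value would suggest the new subject line genuinely changed the open rate, rather than the difference arising by chance; a dedicated statistics library would normally be used for production analysis.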

Results of the research may suggest that further refinements are needed, perhaps trying a different behavioral nudge technique. However, if the results are satisfactory, then the researchers’ efforts can progress to the next identified barrier. In this way, each study can build on previous findings and substantial improvements can be made over time.


Research has shown that nudge tools are effective and deliver a positive return on investment. The tools are inexpensive, and the investment in research can produce substantial results. With these tools we can make our credentialing programs better.

Through the use of well-studied behavioral tools and an iterative research process, an organization can go beyond recognizing pain points to actually addressing the underlying problems. Nudges can help SMEs volunteer for committees and persist in completing their CE requirements. Nudges can also enhance test-takers’ trust in their certification organizations, and encourage them to follow honest test preparation methods and test-taking practices.

When the barriers that have been hindering people are addressed through an iterative research process and thoughtfully selected nudge techniques, a meaningful impact can be achieved and the credentialing program can be improved.

Learn More at the ICE Exchange, Nov. 6-9

Interested in learning more about nudges and how you can use action design to engage candidates? Join Cynthia Parshall, Ellen Julian and Dot Horber for their ICE Exchange presentation, “Improve Your Candidate Experience With Action Design.” The ICE Exchange takes place from Nov. 6-9 in Austin, Texas. Learn more here and register today.


Ariely, D. (2012). The (Honest) Truth About Dishonesty. New York: HarperCollins.

Bell, M. (2017, October). How the likes of Amazon are moving beyond nudge theory. Retrieved on 7/24/18 from

Benson, B. (2016, September). Cognitive bias cheat sheet: Because thinking is hard. [Blog post]. Retrieved on 8/7/18 from

Campaign Monitor. (n.d.). A/B test your email campaigns. [Blog post]. Retrieved on 8/7/18 from

Dynarski, S. (2015, January). Helping the Poor in Education: The Power of a Simple Nudge.

Fried, C. (2018, March). Behavioral Economics: Are Nudges Cost-Effective? Retrieved on 8/10/18 from

Heath, C., & Heath, D. (2010). Switch: How to Change Things When Change Is Hard. New York: Broadway Books.

Holmes, B. (2018, February). Nudging grows up (and now has a government job). Retrieved on 7/24/18 from

Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

NBR (2009). Behavioral Finance Basics: Your Mind and Your Money. [Video file.] Retrieved on 7/24/18 from

Parshall, C.G. (2017, March). Action Design for Instruction and Assessment [Blog series on nudges: initial post]. Retrieved from

Parshall, C.G. (2017). Action Design and the Assessment Program: The Use of “Nudges” to Help Our Candidates Become Their Best Selves. Presented at the NBOME’s Research Advisory Forum, Chicago, October 17, 2017.

Parshall, C.G., & Fremer, J. (2018). Using Nudges to Encourage Rule-Following by Our Test-Takers. To be presented at the annual meeting of COTS, Park City, UT, Oct 10-12, 2018.

Parshall, C.G., & Johnson, B. (2018). Developing Nudges for Your Exam Program. [Blog post]. Retrieved from

Parshall, C.G., Julian, E., & Horber, D. (2018). Improve Your Candidate Experience with Action Design. To be presented at the annual meeting of ICE, Austin, TX, Nov 6-9, 2018.

Parshall, C.G., & Parikh, S. (2018). Access Isn’t Enough: The Use of Behavioral Tools to Transform an Adult Education Benefit Program. Presented at the annual meeting of the Behavioral Science & Policy Association (BSPA), Washington, DC, May 18, 2018.

Social and Behavioral Sciences Team. An official website of the United States Government.

Thaler, R.H., & Sunstein, C.R. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness. New Haven, CT: Yale University Press.

Wendel, S. (2014). Designing for Behavior Change: Applying Psychology and Behavioral Economics. Sebastopol, CA: O’Reilly Media.
