Licensure Exams: What Truly Conveys a Hygiene Student’s Competency?


The American Dental Education Association (ADEA) defines the competencies required for entering the practice of dental hygiene. The four core competencies are health promotion/disease prevention, patient/client care, community involvement, and professional growth and development.3

The five-page ADEA document breaks down these core competencies into five domains, and each domain is further broken into numerous individual competencies. The domain directly associated with patient care is organized using the ADPIED model. ADPIED stands for:

  • Assessment
  • Dental hygiene diagnosis (based on the unmet human needs of the patient)
  • Planning
  • Implementation
  • Evaluation
  • Documentation

A Debate on Clinical Licensure Exams?

Each area described in ADPIED is then broken down into the competencies necessary to adequately complete each section in a manner that will provide the best patient care possible.3 The idea that a one-shot clinical licensure exam focused on calculus removal is broad enough to measure an individual’s competence to practice dental hygiene deserves debate.

Objective structured clinical examinations (OSCEs), such as mock boards, are consistently used in dental hygiene education. In a survey of dental hygiene educators (125 program directors participated), approximately 97% of programs utilize OSCEs, and 75% use a mock board style evaluation.7

These OSCEs can be used to simulate the environment and stress of high-stakes clinical licensure exams at varying times in a student’s education. The intention is that a mock board experience can assess a student’s competency with clinical skills and is often used to predict the student’s readiness for the clinical licensure exam.

If mock board performance correlates positively with performance on one-shot licensure exams, then clinical licensure exams are redundant at best. Conversely, if there is no correlation, or a negative one, it becomes important to determine what each evaluation tool is assessing and whether one is better at assessing competency than the other.

What Should Be the Focus of the Exam?

I contend that clinical licensure exams almost exclusively measure calculus removal. The Western Regional Examining Board (WREB) bases 75% of the candidate’s final grade on calculus removal.5 For the Commission on Dental Competency Assessments (CDCA), calculus removal accounts for 66% of the final score.11 OSCEs, considered alongside a student’s entire portfolio of accomplishments during their education, are a better measure of total patient care and, more importantly, patient safety. Although debridement is an evidence-based procedure, the roles of plaque biofilm, host response, systemic factors, and mechanical factors are just as important, if not more so, to complete patient care.

Determining competency to practice dental hygiene through such a limited lens should be openly questioned. Student performance on OSCEs is not highly correlated with performance on licensure exams, and it has been suggested that OSCE exams more likely measure other qualities, such as problem-solving, critical thinking, and communication.4

A recent study has shown a correlation between OSCE progress test scores and future performance on high-stakes examinations.9 However, further research has suggested that although there may be a correlation between mock boards and licensure exams, the existing research is “conflicting and limited” and that additional dental hygiene programs should be included in future studies to “incorporate additional testing agencies board exams and a variety of mock board experiences.”7

A 2005 study published in the Journal of Dental Education discussed the “predictive validity” of competency assessments gathered during dental hygiene school compared to one-shot clinical licensure exams.6 The research pointed out that dental hygiene educators have too often seen very clinically competent students fail their clinical exams and have witnessed how devastating it is for them. On the WREB clinical licensure exam, 91.3% of students pass on the first attempt. What is concerning is the second-attempt pass rate, where 77% of candidates pass, bringing the cumulative pass rate to 98.4%.11 That a student can fail the clinical exam and then pass the next day without any remediation illustrates the validity problems with these types of exams.

Educators Are the Gatekeepers

An assessment of this relationship is relevant because there is evidence that OSCEs do a better job of assessing clinical competency than clinical licensure exams.10 Determining a correlation might impact how mock boards are used in the future.

CODA is a programmatic accrediting body, much like those used for all other health care programs. Yet students in programs such as nursing are not required to take a clinical exam; they graduate from an accredited program and are deemed competent by virtue of that fact. The educators are the gatekeepers of the profession, not a clinical licensure exam that represents a single moment in time without consideration for the years of assessments that occurred during the educational process. Dentistry should follow pathways to licensure similar to those of other medical specialties because our accrediting agency is no less rigorous than those of other health professions.

CODA’s curriculum standards require a dental hygiene program to maintain a written list of competencies for graduation and to utilize assessment tools that measure these defined competencies.1 Throughout their education, dental hygiene students are evaluated on process competencies and must show improvement as they move through the program. Multiple educators evaluate them on typodonts, simulation units, and in live patient clinics. Each course within the program must have clearly defined learning objectives and employ evaluation procedures that measure and document these objectives.

The dental hygiene curriculum must also include biomedical, dental, dental hygiene sciences, and general education courses. Each course must have a “sufficient scope and depth” to ensure course and programmatic competencies are met. Didactic and clinical instruction are clearly defined and measured to ensure each individual student’s ability to provide competent, safe, and complete patient care.

CODA includes standards that monitor each program to ensure enough patient experiences and total clinical hours are provided to achieve stated competencies. Numerous standards also define the curriculum necessary to provide graduates with critical thinking skills in areas like the process of care, ethics, communication, interdisciplinary collaboration, and clinical skill. An accredited dental hygiene program must provide evidence that students receive ample instruction to become competent in ethical reasoning, the scope of practice, problem-solving, and the ability to understand research and remain current regarding scientific literature. To remain accredited, a program must show evidence of learning tools and assessments for each didactic and clinical learning objective that falls under each competency.

I suggest that this body of evidence provides a much better measure of competency than a one-shot clinical licensure exam. The most important issue when determining competency is patient safety. Each state’s board of dental examiners is tasked with maintaining public safety, yet a clinical licensure exam does very little to help determine a student’s competency in this regard. Removing calculus on one or two quadrants of a person’s mouth falls exceedingly short of the quality of care to which patients should be entitled. There is no follow-up to ensure that patients have returned to a dental provider to complete their care.

Questionable Value of Clinical Exam

The additional stress and expense that candidates for licensure are made to endure for an exam that tells us so little about a dental hygiene student’s competency are hard to comprehend. The ethical issues inherent in live patient exams have been debated for decades.2

Live patient exams put the patient at unnecessary risk. For this reason, almost all other health professions have stopped live patient examinations. Furthermore, students are put into the ethical dilemma of deciding whether putting off necessary treatment for months to save their patient for boards is the right course of action. They feel forced into choosing whether their need for the perfect patient outweighs the patient’s need for treatment. Requiring an exam that puts people at risk and has very little evidence-based support raises significant ethical concerns.

A review of the literature shows that mini-mock and mock boards are not necessarily the best predictors of how students score on their clinical licensing exams. Additional research is needed to determine why. One possible explanation is that the instructor doing the grading is influenced by qualitative observations that skew the quantitative measure. Instructors may be more prone to bias than examiners from the licensure exams, and this bias may present itself as more stringent expectations of clinical competency. Another possibility is that after students finish the mock boards, the resulting scores lead to increased remediation for students who performed poorly, with less attention paid to students who were successful.

Further discussion is needed to define what exactly we are trying to measure with a clinical licensure examination. Dental educators have defined the competencies necessary for a practicing dental hygienist in incredible detail. CODA has provided standards and regulations to guarantee that these competencies are being met. Dental hygiene programs provide an immense amount of aggregate data, which includes four semesters of process evaluations on all aspects of clinical competency.

Due to the COVID-19 pandemic, manikin exams are being accepted in states across the United States. These manikin exams represent a standardized evaluation that provides all dental hygiene graduates a level playing field.

Manikins also erase the ethical issues that accompany a live patient exam. Personally, I prefer the standardized OSCE that a manikin exam represents. This way, everyone is taking the same test. Students know their patients will show up and not have any herpetic lesions. And most importantly, actual patient care is not compromised.

Will the manikin exam remain post-COVID-19? There is very little standardization among the states regarding practice law and licensure requirements, and I would guess that the adoption of manikin exams will be no different. I have often heard dental professionals suggest that new graduates should have to take a live patient exam simply because they did. It feels a little like hazing to get into a fraternity or sorority. Why is it that only dental professionals and barbers/cosmetologists take live patient boards? As much as I prefer manikin exams to live patient exams, the question still remains: Is an additional exam warranted on top of the academic rigor to which dental hygiene students are already exposed?



  1. Accreditation Standards for Dental Hygiene Education Programs. (n.d.). Commission on Dental Accreditation.
  2. Chu, T.G., Makhoul, N.M., Silva, D.R., Gonzales, T.S., Letra, A., Mays, K.A. Should Live Patient Licensing Examinations in Dentistry Be Discontinued? Two Viewpoints. Journal of Dental Education. 2018; 82(3): 246-251.
  3. ADEA Competencies for Entry into the Allied Dental Professions: (As approved by the 2011 ADEA House of Delegates). J Dent Educ. 2017 Jul; 81(7): 853-860. PMID: 28668791.
  4. Dennehy, P., Susarla, S., Karimbux, N. Relationship Between Dental Students’ Performance on Standardized Multiple-choice Examinations and OSCEs. Journal of Dental Education. 2008; 72(5): 585-592.
  5. Exam Preparation: Dental Hygiene Exam Manual. (2021). WREB.
  6. Gadbury-Amyot, C., Bray, K., Branson, B., Holt, L., Keselyak, N., Mitchell, T., Williams, K. Predictive Validity of Dental Hygiene Competency Assessment Measure on One-Shot Clinical Licensure Examinations. Journal of Dental Hygiene. 2005; 69(3): 363-370.
  7. Martin, V., Rogo, E., Hodges, K., Piland, N., Osborn Popp, S. The Relationship Between Mock Boards and Clinical Board Examinations in Dental Hygiene Education. Journal of Dental Education. 2017: 81(1): 54-64.
  8. Navickis, M., Bray, K., Overman, P., Emmons, M., Hessel, R., Cowman, S. Examining Clinical Assessment Practices in U.S. Dental Hygiene Programs. Journal of Dental Education. 2010; 74(3): 297-306.
  9. Pugh, D., Bhanji, F., Cole, G., et al. Do OSCE Progress Test Scores Predict Performance in a National High-stakes Examination? Medical Education. 2016; 50(3): 351-358. doi:10.1111/medu.12942.
  10. Terry, R., Hing, W., Orr, R., Milne, N. Do Coursework Summative Assessments Predict Clinical Performance? A Systematic Review. BMC Med Educ. 2017; 17(40). doi:10.1186/s12909-017-0878-3.
  11. WREB 2016 Technical Report Dental Hygiene Examinations 2016. (2017). WREB. Retrieved from Technical Report 2016 Dental Hygiene Examinations.pdf
  12. 2021 Dental Hygiene Exam Manuals. (n.d.).