Foundations of the NBCOT Certification Examinations
Exam Development and Construction
NBCOT annually recruits OTR and COTA subject matter experts (SMEs) to develop new items for the certification examinations in accordance with accreditation standards relating to the use of qualified personnel for examination development. Specifically, NCCA standards (2014) indicate that SMEs must represent the appropriate demographics of the population to be certified and provide insight and guidance into examination processes.
SMEs are OTR and COTA certificants who represent the profession in terms of practice experiences, geographic regions, gender, and ethnicity. After questions or “items” for NBCOT examinations are developed, the items then undergo a rigorous review process by an additional committee of SMEs. This review is designed to: validate that the knowledge and tasks measured are compatible with the domain-level content specifications; assess the relative importance and frequency of each item to occupational therapy practice; and confirm that each item meets generally accepted fairness guidelines.
NBCOT addresses the fairness of its examinations during item development. NBCOT adheres to recognized item writing, test development, and review procedures to ensure readability, neutral language, and accurate, consistent use of terminology in its items. Additional fairness criteria include, but are not limited to:
- Editing items for issues of bias and stereotyping;
- Coding items to the approved examination content outline;
- Referencing items to approved and published resources in occupational therapy;
- Selecting subject matter experts who are OTR and COTA practitioners and educators from diverse geographical areas, practice experiences, and cultures;
- Field-testing items prior to their use as scored items on the exam.
During the examination process, fairness is addressed through standardized procedures regarding the registration process, accessibility issues, roles of proctors, and security of test materials and equipment. Fairness is addressed after the examination by consideration of confidentiality, accuracy of scoring, and timeliness of reporting the results.
Exam Format
How are the COTA exams formatted?
The COTA examination consists of single-response multiple-choice items and six-option multi-select items.
The single-response multiple-choice items contain a stem and three or four possible response options. Of the response options presented, there is only one correct or best answer. Candidates receive credit for selecting the correct response option. Points are not deducted for selecting incorrect response options.
The six-option multi-select items include a question stem followed by six possible response options. Of the options provided, three are correct responses and the other three are incorrect responses. The candidate must select three response options. Candidates receive credit for selecting the correct response options. Points are not deducted for selecting incorrect response options.
Examples of multi-select items can be found here. These sample items do not replicate the candidate experience on exam day. Please view the exam tutorial for more information.
What is the layout of the computer-delivered COTA exam?
Candidates are allotted four hours to complete the examination. The COTA examination consists of 200 multiple-choice items based on the exam content outline. Multiple-choice items include single-response items, in which the candidate selects the single BEST option, as well as multi-select items, in which the candidate must select the three BEST options out of six. Multiple-choice and multi-select items are presented one at a time to the candidates. Some items include a picture or chart that contains information needed to answer the question. During the examination, candidates can highlight text in the stem of an item that they deem important. A strikeout feature is also available to help candidates visually eliminate response options. Candidates can flag items for review and change their responses as testing time allows, or until they submit the examination for scoring. If time runs out before a candidate reviews the flagged items, the selected response(s) will be submitted for scoring. No credit is given for flagged items that have no response option(s) selected. Candidates also have the ability to modify the color scheme by changing the background and text colors of the exam at any time.
At the start of the examination, candidates have the option of taking a tutorial about the functionality of the test screens. Time spent on the examination tutorial is not deducted from the four-hour test clock. Details on the computer features, as well as additional functionality of the exam, can be viewed by accessing the online tutorial.
Candidates can access the exam tutorial here. Please note that candidates can revisit the tutorial at any time during the exam; however, the exam timer will continue to run.
How are the OTR exams formatted?
The OTR exam comprises three clinical simulation test (CST) items as well as single-response, multiple-choice items. Each CST item consists of three main components:
- Opening scene – This contains general background information about a practice-based situation that sets the scene for the CST item.
- Section headers – Following the opening scene, each CST item has four parts that each begin with a section header. These section headers contain information specific to the OT process addressed in the section. The candidate is asked a specific question based on this information.
- Response options and feedback – This includes a list of potential options the OTR may consider in response to the question posed in the section header. The list of options in the CST item consists of positive and negative options. Candidates must select either Yes or No for each option before proceeding to the next part of the CST item. Selecting Yes will cause a feedback box to appear to the right of the option. The feedback provides additional information related to the outcome selected but does not give information on whether or not the candidate’s response is correct. Feedback is not provided when No is selected.
Candidates receive credit for selecting the correct response options. Points are not deducted for selecting incorrect response options.
Candidates can navigate to previous screens within the CST item to review the information presented, the selections made, and the feedback received in response to those selections; however, candidates cannot change their responses. Candidates must complete the CST items in the order they are presented.
Candidates can access the exam tutorial here. Please note that there is one exam tutorial presented before the exam that describes the CST and MC portions of the exam. The time allotted for the tutorial prior to beginning the exam is separate from the exam timer. Candidates can revisit the tutorial at any time during the exam; however, the exam timer will continue to run.
What is the layout of the computer-delivered OTR exam?
Candidates are allotted four hours to complete the OTR exam. The exam comprises three clinical simulation test (CST) items and 170 single-response, multiple-choice items. The CST items are presented one at a time. Candidates are required to select Yes or No for each response option presented. Once a response is selected, it cannot be changed. Candidates can click “Next” to proceed to the next part of the item after making selections for all response options in the present part. Candidates can navigate to previous screens within the CST item, but selections for the response options cannot be changed. Candidates must complete the CST portion of the OTR exam before beginning the multiple-choice section. NBCOT offers access to sample CST problems so candidates can orient themselves to the Yes/No structure of the items and the feedback boxes. These sample items do not replicate the candidate experience on exam day. Please view the exam tutorial for more information.
Sample CST problems can be accessed here:
https://secure.nbcot.org/CSTDemo/?CSTID=1
https://secure.nbcot.org/CSTDemo/?CSTID=2
Candidates cannot access the CST portion of the exam once it is complete. In the multiple-choice portion, candidates can flag items for review and change their choices, as testing time allows or until they submit the exam for scoring. If time runs out before the candidate reviews the flagged items, the selected response will be submitted for scoring. No credit will be given to a flagged item that has no response option selected.
Throughout the entire exam, candidates can highlight text they deem important; this can be done in the CST opening scenes and section headers and in the multiple-choice stems. A strike-out feature is also available in the multiple-choice portion of the exam to help candidates visually eliminate possible response options. Candidates also have the ability to modify the color scheme by changing the background and text colors of the exam at any time.
Details on all the features, as well as additional functionality of each section of the exam, can be viewed by accessing the online tutorial.
Candidates can access the exam tutorial here. Please note that there is one exam tutorial presented before the exam that describes the CST and MC portions of the exam. The time allotted for the tutorial prior to beginning the exam is separate from the exam timer. Candidates can revisit the tutorial at any time during the exam; however, the exam timer will continue to run.
How are the exams delivered?
The OTR and COTA examinations are computer-delivered at testing centers located throughout the United States and internationally. Candidates can schedule the examination for any day of the week during the business hours of the testing center they select. Scheduling instructions are provided in the candidate’s Authorization to Test Letter.
Are testing accommodations offered?
NBCOT provides reasonable and appropriate accommodations for qualified individuals with a disability who submit appropriate documentation. Additional information can be found on the Testing Accommodations page.
Scoring
Scaled scores provide consistent and comparable scoring across exam forms. Percent-correct and number-correct scores are simply other ways of reporting raw scores and therefore do not resolve the issue of comparability across different versions of the exam. Although each version of the examination tests the same domains of occupational therapy practice, each form contains a different set of test items, which means that questions on one test form may differ in difficulty from the questions appearing on another test form. A raw score alone does not account for this difference.
Norm-referenced scoring indicates performance differences among test takers, whereas criterion-referenced scoring uses a pre-defined minimum standard, or criterion, that all candidates must achieve as an indicator of whether the candidate has acquired specific knowledge as defined by a valid content outline. Best testing practice uses scaled scoring to allow direct comparison of scores across multiple versions of the examination, rather than comparison to other candidates’ performance.
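To make the comparability argument concrete, here is a minimal sketch of a linear raw-to-scaled conversion, assuming hypothetical form statistics and an arbitrary reporting scale (mean 500, SD 50). It illustrates the general idea behind scaled scoring and is not NBCOT’s actual equating procedure or score scale.

```python
# Illustrative only: a linear raw-to-scaled conversion with hypothetical values.
# Each exam form maps raw scores onto a common reporting scale so that a single
# scaled passing standard applies even though forms differ slightly in difficulty.

def scale_score(raw, form_mean, form_sd, scale_mean=500.0, scale_sd=50.0):
    """Linearly map a raw score onto the common reporting scale for one form."""
    z = (raw - form_mean) / form_sd   # standardize within the form
    return scale_mean + scale_sd * z  # project onto the shared scale

# Two hypothetical forms: Form B is slightly harder (lower mean raw score),
# so the same raw score of 150 maps to a higher scaled score on Form B.
print(round(scale_score(150, form_mean=145.0, form_sd=15.0), 1))  # Form A -> 516.7
print(round(scale_score(150, form_mean=140.0, form_sd=15.0), 1))  # Form B -> 533.3
```

Because the conversion shifts with each form’s difficulty, identical raw scores on different forms map to different scaled scores, which is the comparability that raw or percent-correct scores cannot provide.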
Candidates who pass the examination receive a performance feedback report that includes a congratulatory letter and the candidate’s earned exam score.
NBCOT also provides performance feedback to those candidates who do not achieve the passing standard on the examination. This performance feedback report includes the candidate’s score, as well as the average score of new graduates who recently passed the examination. The report also includes a domain-level performance chart to help a candidate identify areas of relative strength and weakness and additional information regarding how individual domain scores can be used. Finally, a list of frequently asked questions is presented to address queries regarding the determination of the passing score, use of scaled scores, candidate performance comparisons, score reporting, exam preparation, and exam content.
Sample OTR performance feedback report for candidate with failing score
Sample COTA performance feedback report for candidate with failing score
Test Metrics
Each NBCOT exam includes a pre-selected number of field-test items. Although these items are not considered when determining candidates’ scores, performance data is collected and analyzed for each field-test item. The statistical analysis of the field-test items is an important quality control measure NBCOT uses to preserve the reliability and validity of the examinations. Field-test items are presented randomly throughout the exams. Candidates are not able to distinguish between the scored and field-test (non-scored) items.
Once a sufficient number of responses are collected on a field-test item, the item statistics are reviewed against pre-determined psychometric measures. Field-test items meeting these metrics are entered into the bank of items that can be used as scored items on subsequent exams. Field-test items whose statistics fall below these metrics are flagged for additional review and revision before undergoing further field-testing.
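As an illustration of the kind of screening described above, the sketch below computes two classical item statistics commonly used in field-test analysis: item difficulty (the proportion of candidates answering correctly) and the point-biserial correlation between item performance and total exam score. The function names, sample data, and threshold values are hypothetical assumptions, not NBCOT’s published criteria.

```python
# Illustrative classical item analysis for a field-test item (hypothetical data
# and thresholds; the statistics themselves are standard psychometric measures).
from statistics import mean, pstdev

def item_statistics(item_scores, total_scores):
    """Return item difficulty (proportion correct) and point-biserial discrimination."""
    p = mean(item_scores)  # difficulty: proportion of candidates answering correctly
    correct = [t for i, t in zip(item_scores, total_scores) if i == 1]
    incorrect = [t for i, t in zip(item_scores, total_scores) if i == 0]
    sd = pstdev(total_scores)
    # Point-biserial correlation between the 0/1 item score and the total score
    r_pb = ((mean(correct) - mean(incorrect)) / sd) * (p * (1 - p)) ** 0.5
    return p, r_pb

def passes_screen(p, r_pb, p_range=(0.30, 0.90), min_r=0.15):
    """Hypothetical promotion rule: the item is neither too easy nor too hard,
    and is positively related to overall exam performance."""
    return p_range[0] <= p <= p_range[1] and r_pb >= min_r

# Hypothetical responses: 1 = correct on the field-test item, paired with each
# candidate's total score on the scored portion of the exam.
item = [1, 0, 1, 1, 0, 1, 0, 1]
totals = [172, 141, 165, 180, 150, 168, 138, 175]
p, r_pb = item_statistics(item, totals)
print(f"difficulty={p:.2f}, point-biserial={r_pb:.2f}, promote={passes_screen(p, r_pb)}")
```

Under a rule like this, items falling outside the acceptable range would be returned to SMEs for review and revision before further field-testing, consistent with the process described above.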