AI guidance for certifications seeking NCCA accreditation
This document provides certification programs with guidance on using Surpass Copilot in a way that is consistent with the NCCA AI Guidance and the NCCA Standards.
Discover everything you need to know in our free report!
What’s inside?
The report is organized around the following key themes, summarizing five major areas of focus that run throughout the NCCA AI Guidance document.
Organizations that use AI in test development and other certification activities, and that seek NCCA accreditation, can align their programs with the NCCA AI Guidance by ensuring these areas of focus are adequately addressed in their certification policies, procedures, and operations.
Human oversight of AI
This includes implementing processes and instructions for AI use, reviewing all AI outputs, approving any AI-generated content selected for use, and editing AI outputs where needed. Staff, SMEs, consultants, vendors, and others involved in test development, delivery, scoring, or other certification activities where AI is used must be appropriately trained, and the specific AI-related work for which they are responsible must be documented.
Security
As with other aspects of certification programs, the use of AI should be secure. Items generated by AI, and any source documents used for item generation, must remain within a secure test development system and must not be released to open, publicly accessible AI tools. In addition, that system must be kept separate from any systems used to create preparatory education for the certification exam.
Transparency
Some countries and regions have laws requiring organizations to be transparent with stakeholders about their use of AI, and the NCCA AI Guidance document carries a similar expectation. Organizations using AI in their certification programs should be transparent about that use.
Policies and procedures for AI use
Organizations using AI in their certification-related activities must have policies and procedures (P&P) in place for consistent and appropriate use of AI. The P&P should clarify how and where AI may be used and include rationales for its use. This covers, but is not limited to, the use of AI in test development, delivery, and scoring.
Quality management and documentation
Policies and procedures should describe the quality assurance steps the program takes to ensure that its use of AI is managed by qualified humans who make all critical decisions, and that AI outputs and the related human decisions are documented.
Request the report