A Privacy Threshold Analysis (PTA) is necessary to assess privacy and security risks. The PTA asks about the use of AI and requires a description of the project or system that will use AI, along with an explanation of how that use aligns with the state's AI principles of responsible use. The PTA is used to analyze the privacy risks of projects or systems that use AI or generative AI and to document AI use for auditing and accountability.

Privacy risk assessment is based on the Fair Information Practice Principles (FIPPS), which underpin NCDIT’s Principles for Responsible Use of AI, a foundational component of the North Carolina State Government Responsible Use of Artificial Intelligence Framework.

State agencies seeking additional information about the PTA or the privacy AI risk assessment can email the Office of Privacy and Data Protection. Agencies seeking support in maturing generative AI use cases can email the AI Working Group.
