Ruth Yakubu for Microsoft Azure


How to protect PII in your AI apps?

Is your generative AI app exposing PII? With the surge in building generative AI apps and AI solutions, security and privacy tend to be overlooked. An AI system can cause real harm if PII is not extracted or masked. For example, attributes such as name, race, gender, or age can introduce bias into employment decision-making. And anyone gaining access to people's social security or credit card numbers can cause privacy violations and security breaches.

✨ Join the #MarchResponsibly challenge and try out the hands-on tutorial.

Check out my colleague @dikosomeleze's blog tutorial on how to use Azure AI Language services and Microsoft Fabric Lakehouse to protect your PII. Data is an essential part of AI, and the blog covers a lot of valuable ground: building data-cleansing pipelines to handle PII, plus how to identify, extract, and mask PII data. A quick taste is sketched below.
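
To give you an idea of what PII detection and masking looks like in practice, here is a minimal sketch using the Azure AI Language SDK for Python. This is not code from the tutorial; the endpoint, key, and sample text are placeholders you would replace with your own.

```python
# pip install azure-ai-textanalytics
# Minimal sketch: detect and mask PII with the Azure AI Language service.
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-language-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

documents = [
    "My name is Jane Doe, my SSN is 859-98-0987, "
    "and my phone number is 555-555-5555."
]

# recognize_pii_entities returns each detected entity plus a
# redacted_text field with the PII already masked out.
for doc in client.recognize_pii_entities(documents):
    if doc.is_error:
        continue
    print("Redacted:", doc.redacted_text)
    for entity in doc.entities:
        print(f"  {entity.category}: {entity.text} "
              f"(confidence {entity.confidence_score:.2f})")
```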

Whether you are using LLMs or training your own AI models, the tutorial also discusses the responsible AI implications and some best practices for protecting PII.
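
Continuing the sketch above: if you mainly need to catch the high-risk identifiers mentioned earlier (social security and credit card numbers) before text ever reaches an LLM or a training pipeline, the same call can be scoped with a categories filter. This is an illustrative sketch, so double-check the category names available in your SDK version.

```python
# Restrict detection to specific high-risk PII categories before masking.
# PiiEntityCategory ships with azure-ai-textanalytics; `client` and
# `documents` come from the previous sketch.
from azure.ai.textanalytics import PiiEntityCategory

results = client.recognize_pii_entities(
    documents,
    categories_filter=[
        PiiEntityCategory.US_SOCIAL_SECURITY_NUMBER,
        PiiEntityCategory.CREDIT_CARD_NUMBER,
    ],
)
for doc in results:
    if not doc.is_error:
        # Only SSNs and credit card numbers are masked here;
        # all other text passes through unchanged.
        print(doc.redacted_text)
```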

👉🏽 See @dikosomeleze's blog: https://aka.ms/march-rai/pii-extraction

🎉Happy Learning :)
