Ruth Yakubu for Microsoft Azure


How to protect PII information from your AI apps?

Is your generative AI app exposing PII? With the rise in popularity of generative AI apps and AI solutions, security and privacy tend to be overlooked. An AI system can cause many harms if PII data is not extracted or masked. For example, exposing information such as name, race, gender, or age can lead to biased employment decision-making. In addition, anyone gaining access to people's social security or credit card numbers can cause privacy violations or security breaches.

✨ Join the #MarchResponsibly challenge and try out the hands-on tutorial.

Check out my colleague @dikosomeleze's blog tutorial on how to use AI Language services and Microsoft Fabric Lakehouse to protect your PII. Data is an important part of AI. The blog covers a lot of valuable information, from using data-cleansing pipelines to handle PII to how to identify, extract, and mask PII data.
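To give a feel for what "masking" means in practice, here is a minimal, hypothetical Python sketch that redacts a couple of common US PII patterns (SSNs and 16-digit card numbers) with simple regexes. This is only an illustration of the concept; the regex patterns and function name are my own assumptions, and a real application should rely on a dedicated PII detection service like the one the tutorial covers rather than hand-rolled patterns.

```python
import re

# Illustrative patterns only -- real PII detection needs far more coverage
# (names, addresses, international formats, context, etc.).
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CreditCard": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def mask_pii(text: str) -> str:
    """Replace each matched PII span with asterisks of the same length."""
    for pattern in PII_PATTERNS.values():
        text = pattern.sub(lambda m: "*" * len(m.group()), text)
    return text

print(mask_pii("My SSN is 123-45-6789 and my card is 4111 1111 1111 1111."))
```

A managed service goes well beyond this sketch: it also classifies each entity (name, age, phone number, etc.) and returns a redacted copy of the text for you.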

Whether you are using LLMs or training AI models, the tutorial discusses the responsible AI implications and some best practices for protecting your PII.

👉🏽 See @dikosomeleze's blog: https://aka.ms/march-rai/pii-extraction

🎉Happy Learning :)
