SEC595: Applied Data Science and AI/Machine Learning for Cybersecurity Professionals

Large Language Models (LLMs) and Generative AI have inherent limitations, such as outdated knowledge, lack of access to private data, and the potential for hallucinations. In this session, we will introduce a strategy for overcoming these challenges: Retrieval-Augmented Generation (RAG). Attendees will see how a GenAI RAG application can provide access to real-time, private data stored in an external knowledge base without needing to fine-tune the base LLM.
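The abstract describes the RAG pattern only in prose, so the sketch below is an illustrative, self-contained Python example of the retrieve-then-augment flow. The in-memory knowledge base, keyword retriever, and llm_generate() stub are assumptions for demonstration, not the webcast's actual application, which would typically use a vector index (such as Azure AI Search) and a hosted LLM.

```python
# Minimal RAG sketch (illustrative only): retrieve relevant private documents,
# then augment the LLM prompt with them. The knowledge base, scoring, and
# llm_generate() stub are placeholders, not the webcast's implementation.

from dataclasses import dataclass


@dataclass
class Document:
    doc_id: str
    text: str


# Stand-in for an external knowledge base (e.g., an Azure AI Search index).
KNOWLEDGE_BASE = [
    Document("kb-001", "Q3 incident report: credential stuffing against the VPN portal."),
    Document("kb-002", "Internal policy: storage accounts must disable anonymous blob access."),
    Document("kb-003", "Runbook: rotate Azure AI Search admin keys every 90 days."),
]


def retrieve(query: str, k: int = 2) -> list[Document]:
    """Naive keyword-overlap retriever; a real system would use vector search."""
    query_terms = set(query.lower().split())
    scored = [
        (len(query_terms & set(doc.text.lower().split())), doc)
        for doc in KNOWLEDGE_BASE
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:k] if score > 0]


def build_prompt(query: str, docs: list[Document]) -> str:
    """Augment the user question with retrieved context before calling the LLM."""
    context = "\n".join(f"[{d.doc_id}] {d.text}" for d in docs)
    return (
        "Answer using only the context below. If the answer is not in the "
        f"context, say so.\n\nContext:\n{context}\n\nQuestion: {query}"
    )


def llm_generate(prompt: str) -> str:
    """Placeholder for the base LLM call (e.g., an Azure OpenAI deployment)."""
    return f"(model response grounded in {prompt.count('[kb-')} retrieved documents)"


if __name__ == "__main__":
    question = "How often should Azure AI Search admin keys be rotated?"
    answer = llm_generate(build_prompt(question, retrieve(question)))
    print(answer)
```

The key point of the pattern is that the private, up-to-date knowledge lives outside the model: only the retrieved snippets are passed into the prompt at query time, so the base LLM never needs retraining or fine-tuning.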
Building on that understanding of the GenAI RAG application, we will explore an example cloud infrastructure that hosts it using Azure AI Search, Azure Storage, and Azure Container Apps. The architecture review will uncover attack vectors and cloud security misconfigurations that can unintentionally leak RAG data to an attacker, and attendees will see how these vulnerabilities can be exploited to gain unauthorized access to AI data. Finally, we will examine the cloud security controls needed to properly authorize access to the RAG data.
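To make the authorization discussion concrete, the following sketch contrasts two access paths using the azure-search-documents Python SDK: a static key that an attacker could reuse if leaked through a misconfiguration (for example, a key exposed in container app environment variables or a publicly readable storage account), versus Microsoft Entra ID authentication with a managed identity and a least-privilege RBAC role. The endpoint, index name, and role mentioned in the comments are placeholders, not details taken from the webcast.

```python
# Illustrative contrast of key-based vs. identity-based access to a RAG index
# hosted in Azure AI Search. Requires azure-search-documents and azure-identity.

from azure.core.credentials import AzureKeyCredential
from azure.identity import DefaultAzureCredential
from azure.search.documents import SearchClient

ENDPOINT = "https://example-search.search.windows.net"  # placeholder
INDEX_NAME = "rag-knowledge-base"                        # placeholder


def dump_index_with_leaked_key(leaked_key: str) -> None:
    """What an attacker can do with a leaked admin or query key:
    enumerate every document in the RAG knowledge base."""
    client = SearchClient(
        endpoint=ENDPOINT, index_name=INDEX_NAME,
        credential=AzureKeyCredential(leaked_key),
    )
    for doc in client.search(search_text="*"):
        print(doc)


def query_index_with_managed_identity(query: str) -> None:
    """Preferred control: authenticate with Microsoft Entra ID (here via
    DefaultAzureCredential, which picks up a managed identity when running
    in Azure Container Apps) and grant only a data-plane reader role,
    such as Search Index Data Reader."""
    client = SearchClient(
        endpoint=ENDPOINT, index_name=INDEX_NAME,
        credential=DefaultAzureCredential(),
    )
    for doc in client.search(search_text=query):
        print(doc)
```

Disabling key-based authentication on the search service removes the static secret entirely and forces every caller, including the RAG application itself, through the identity and RBAC path.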
Attendees will walk away with an understanding of GenAI RAG applications, the underlying cloud infrastructure powering these AI systems, and the security controls needed to protect sensitive RAG data.
This webcast supports content and knowledge from SEC510: Cloud Security Engineering and Controls.
Eric is a co-founder and principal security engineer at Puma Security, focusing on modern static analysis product development and DevSecOps automation. A SANS Fellow, he is co-author and instructor for three SANS Cloud Security courses.