
Bhargav Vadgama

Data Engineer, Sunnyvale, California, USA

Title of the Talk:

Airflow for Fraud Detection: Building Dynamic DAGs Against Evolving Threat Models

Abstract:

Financial fraud is evolving faster than ever. Traditional static workflows can no longer keep pace with sophisticated, context-aware attack strategies.
This talk presents a dynamic, intelligent fraud detection framework built on Apache Airflow that adapts automatically in real time. Using modular, event-driven dynamic DAGs, the system reconfigures detection workflows on the fly by integrating real-time Apache Kafka data streams, continuous machine-learning model monitoring, and conditional retraining.
The solution delivers dramatic improvements: significantly faster fraud response times, reduced false positives, and superior model lifecycle management. Built on a complete architecture and validated with synthetic transaction data, this framework represents a practical leap toward future-ready fraud orchestration.
The talk will also share deployment strategies and exciting future directions including reinforcement learning, multimodal detection, and serverless orchestration.
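The core "dynamic DAG" pattern the abstract describes — generating one detection workflow per active threat rule from configuration — can be sketched as follows. This is a minimal illustration in plain Python; the rule names, task names, and parameters are hypothetical, and in actual Airflow each entry would be materialized as a DAG of operators (e.g. via the `@dag`/`@task` decorators).

```python
# Sketch of config-driven dynamic DAG generation: detection rules that
# might arrive from a control stream (e.g. a Kafka topic carrying
# updated threat models) are turned into one workflow per rule.
FRAUD_RULES = {
    "velocity_check": {"threshold": 5, "window_minutes": 10},
    "geo_mismatch": {"max_distance_km": 500},
}

def build_detection_dags(rules):
    """Build one workflow description per active fraud rule."""
    dags = {}
    for rule_name, params in rules.items():
        # In Airflow, each of these would be an operator in a DAG
        # named after the rule, created inside this loop.
        dags[f"fraud_{rule_name}"] = [
            ("ingest_transactions", {}),     # read from the event stream
            ("score_transactions", params),  # apply the rule's parameters
            ("evaluate_model_drift", {}),    # monitor model quality
            ("alert_or_retrain", {}),        # conditional retraining branch
        ]
    return dags

dags = build_detection_dags(FRAUD_RULES)
print(sorted(dags))                        # one DAG id per rule
print(len(dags["fraud_velocity_check"]))   # tasks in each workflow
```

Because the workflow set is derived from data rather than hard-coded, adding or retiring a detection rule changes the orchestration without a code redeploy — the adaptivity the talk centers on.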

Profile:

I work as a data engineer, where my main role is building data pipelines in cloud environments. I have experience and knowledge of both the AWS and Azure clouds.
I have experience building both batch and streaming data pipelines: I used Airflow to orchestrate the batch pipelines, and Kafka to build the streaming pipelines.
My previous project was a data migration project in which we migrated on-premises data into the AWS cloud by loading it into Amazon S3.
I developed Spark jobs on an AWS EMR cluster to read the data from S3 into DataFrames and perform the actual data cleansing, transformations, deduplication, and standardization.
The final data is then pushed to the Snowflake data warehouse running on AWS.
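The transformation steps in that pipeline can be illustrated with a minimal sketch, using plain Python dicts in place of Spark DataFrame rows. The column names and sample values here are hypothetical; in the real pipeline these would be Spark operations such as `dropDuplicates()` and `withColumn()` running on EMR.

```python
# Toy records standing in for rows read from S3 into a DataFrame.
raw_rows = [
    {"id": "001", "amount": " 250.00 ", "country": "usa"},
    {"id": "001", "amount": " 250.00 ", "country": "usa"},  # exact duplicate
    {"id": "002", "amount": "99.5", "country": "US"},
]

def standardize(row):
    """Cleansing/standardization: trim whitespace, fix types and casing."""
    return {
        "id": row["id"],
        "amount": float(row["amount"].strip()),
        "country": row["country"].strip().upper(),
    }

def deduplicate(rows, key="id"):
    """Keep the first row seen per key (analogous to dropDuplicates)."""
    seen, out = set(), []
    for row in rows:
        if row[key] not in seen:
            seen.add(row[key])
            out.append(row)
    return out

clean = [standardize(r) for r in deduplicate(raw_rows)]
print(clean)  # two rows remain, with normalized amount and country
```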