Processing terabytes of data every day and sleeping at night

15:10–15:50

This is the story of how we built a highly available data pipeline that processes terabytes of network data every day, making it available to security researchers for assessment and threat hunting. Building this kind of thing in the cloud is not that complicated, but if you have to make it near real-time, fault tolerant and available 24/7, well... that's another story. In this talk, we will tell you how we achieved this ambitious goal and how we missed a few good nights of sleep while trying to do it! Spoiler alert: contains AWS, serverless, Elasticsearch, monitoring, alerting & more!

Language: English

Level: Intermediate

Luciano Mammino

Cloud Architect & FullStack dev - fourTheorem

I was born in 1987, the same year Super Mario Bros was released in Europe, which is my favourite game! I started coding at the age of 12, hacking away on my father's old i386 armed only with MS-DOS and the QBasic interpreter, and I have now been a professional software developer for more than 10 years. I love the full-stack web, Node.js & serverless. I co-authored "Node.js Design Patterns" (https://www.nodejsdesignpatterns.com) and launched fstack.link and Serverlesslab.com.


Domagoj Katavic

Software Engineer - Vectra AI

I am a technology enthusiast from Split, Croatia. From my early days, I have enjoyed playing with technology, from programming robots in PBASIC to gesture recognition and tinkering with FPGAs. I have always liked programming, so my first jobs were building the cloud editor Codeanywhere and teaching programming at a local university. Now I am a Software Engineer at Vectra AI, where I am making the hunt for cyber attacks as easy and automated as possible.
