I’m a Senior Engineer specializing in Data Platforms and Developer Experience, with 8+ years of experience building production infrastructure, decoupled data systems, and automated workflows. I bring hands-on depth in Python, Go, Java, and cloud-native tooling (GCP), with a focus on making teams faster and systems more reliable.
At Lighthouse Intelligence, I was appointed to the company's AI Steering Committee and DevEx working group. I architected internal agentic coding tools for infrastructure workflows and data investigations, including MCP servers for BigQuery metadata and semantic code search, driving AI adoption across ~400 engineers.
I also architected an internal cloud cost management platform integrating asset inventory, Atlan data lineage, and automated workflows. By replacing a recurring 8-person manual review with event-driven notifications, I automated owner identification and status tracking. On the data side, I engineered containerized batch processing pipelines with Python and Airflow on Kubernetes, decoupling processing stages via Pub/Sub and optimizing BigQuery and Spanner data models for strict consistency and performant downstream access.
Earlier, as a Software Developer at Bank Degroof Petercam and KBC Group N.V., I delivered measurable performance and scalability improvements: I built Java-based tools that cut processing time by 40%, developed SQL Server integration and reconciliation systems for financial risk reporting, and served as Scrum Master and developer on the BI team supporting finance leadership.