Open to opportunities

Ken Danny

@kendanny

Senior data engineer focused on real-time data platforms, DataOps, and reliable analytics at scale.

United States

What I'm looking for

I'm looking to lead reliable, real-time data platforms, owning ingestion, transformation, and serving end to end, while using DataOps, automated quality checks, and governance to cut latency and eliminate on-call surprises on AWS and Snowflake.

I’m a Senior Data Engineer with eleven years of experience across fintech, logistics, and SaaS, moving teams from batch pipelines to event-driven architectures that process billions of records daily. I build the “pipeline nobody pages about at 2 AM,” improving latency from 4 hours to under 90 seconds and maintaining 99.97% pipeline uptime with automated data quality using Great Expectations.

I established DataOps practices with Airflow, Terraform, and GitHub Actions, using blue-green deployments to keep scheduler uptime at 99.97% while cutting recovery time dramatically. I've engineered lakehouse transformation layers with dbt (420 models), architected real-time streaming with Kafka, Flink, Spark, and Delta Lake, migrated Redshift to Snowflake, and driven governance (PII classification, column-level lineage, access policies) to achieve SOC 2 compliance six weeks ahead of schedule.
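The blue-green idea behind that scheduler uptime can be sketched in a few lines; this is a hypothetical, simplified illustration of the cutover decision, not the actual deployment tooling:

```python
# Hypothetical sketch of a blue-green cutover: stand up the new
# ("green") environment alongside the live one, verify its health,
# then switch atomically, keeping the old environment as an
# instant rollback target. Environment names are illustrative.

def cut_over(active, standby, health_check):
    """Promote the standby environment if healthy; otherwise keep the
    current active environment and leave the standby untouched."""
    if health_check(standby):
        return standby, active  # new active, rollback target
    return active, standby      # failed check: nothing changes
```

Because the switch only happens after a passing health check, a bad release never takes traffic, which is what keeps recovery time low.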

Experience

Work history, roles, and key accomplishments

Current

Senior Data Engineer

Intellinestsystems

Jan 2022 - Present (4 years 4 months)

Led an enterprise data platform serving 90 internal analytics consumers and 14 production ML models, reducing end-to-end data latency from 4 hours to 90 seconds for 3.2B daily events. Built the dbt transformation layer with 420 models (bronze/silver/gold) and automated data-quality checks in Great Expectations, maintaining 99.97% pipeline uptime over 18 months.
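The automated quality gate described above can be illustrated with a minimal plain-Python stand-in; the real checks use Great Expectations, and the thresholds and field names here are hypothetical:

```python
from dataclasses import dataclass

# Simplified stand-in for an automated data-quality gate of the kind
# a tool like Great Expectations provides: a batch is promoted to the
# next layer (e.g. bronze -> silver) only if row count and null rates
# stay within agreed bounds. All thresholds here are illustrative.

@dataclass
class QualityResult:
    passed: bool
    failures: list

def check_batch(rows, required_fields, min_rows=1, max_null_rate=0.01):
    """Validate a batch of dict records before promoting it downstream."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        rate = nulls / len(rows) if rows else 1.0
        if rate > max_null_rate:
            failures.append(
                f"{field}: null rate {rate:.2%} exceeds {max_null_rate:.2%}"
            )
    return QualityResult(passed=not failures, failures=failures)
```

Gating promotion on checks like these is what turns "99.97% uptime" from luck into a property of the pipeline: bad batches fail loudly at the boundary instead of paging someone downstream.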

Data Engineer

Palantir Technologies

May 2018 - Dec 2021 (3 years 7 months)

Built and maintained data integration pipelines for 6 government and commercial clients, processing classified and sensitive datasets from 500M to 12B records using Spark on Foundry. Developed a reusable PySpark pipeline framework that standardized ingestion and schema evolution, reducing pipeline development time by 60%.
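A reusable ingestion framework of this kind typically centers on a declarative source spec from which read options and schema-evolution policy are derived. A minimal sketch of that idea in plain Python (field names and the helper are hypothetical, not the actual framework):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the declarative core of a reusable ingestion
# framework: each source is described once, and the framework derives
# reader options and the schema-evolution policy from the spec instead
# of hand-writing each pipeline.

@dataclass
class SourceSpec:
    name: str
    fmt: str                   # e.g. "parquet", "json", "csv"
    path: str                  # input location
    target_table: str
    merge_schema: bool = True  # allow additive schema evolution
    options: dict = field(default_factory=dict)

def reader_config(spec: SourceSpec) -> dict:
    """Translate a source spec into the options a Spark reader would use."""
    opts = {"format": spec.fmt, "path": spec.path, **spec.options}
    if spec.merge_schema:
        # Spark's mergeSchema option tolerates additive schema changes
        opts["mergeSchema"] = "true"
    return opts
```

Standardizing on a spec like this is what makes the 60% reduction in development time plausible: adding a source becomes a configuration change rather than a new pipeline.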

Data Engineer

Optum Health

Aug 2014 - Apr 2018 (3 years 8 months)

Developed HIPAA-compliant ETL pipelines on Hadoop and Hive processing 800M daily clinical and claims events from 14 provider networks for population health and risk-adjustment analytics. Migrated a healthcare warehouse from Teradata to AWS Redshift (14TB, 220 tables) and replaced 150 cron scripts with Airflow, cutting pipeline incident response time by 70%.
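The core win of replacing standalone cron scripts with an orchestrator like Airflow is explicit task dependencies: the scheduler derives a valid run order instead of relying on hand-tuned time offsets. A toy sketch of that idea (task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Toy illustration of dependency-driven scheduling, the property that
# makes an orchestrator like Airflow safer than 150 independent cron
# scripts: each task declares its upstream dependencies, and a valid
# execution order is computed. Task names are hypothetical.

deps = {
    "extract_claims": set(),
    "extract_clinical": set(),
    "conform_events": {"extract_claims", "extract_clinical"},
    "load_warehouse": {"conform_events"},
    "risk_adjustment": {"load_warehouse"},
}

def run_order(dependency_graph):
    """Return one dependency-respecting execution order."""
    return list(TopologicalSorter(dependency_graph).static_order())
```

With cron, a slow extract silently feeds stale data downstream; with declared dependencies, downstream tasks simply wait, which is largely why incident response time dropped.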

Education

Degrees, certifications, and relevant coursework

Colorado State University

Bachelor of Science, Computer Science

Earned a B.S. in Computer Science from Colorado State University in 2014.
