Ravi Kumar
@ravikumar12
Experienced Data Engineer with a passion for scalable data solutions.
What I'm looking for
As an experienced Data Engineer with over 6 years of hands-on expertise, I specialize in designing, building, and optimizing scalable data pipelines and architectures. My proficiency in Python, SQL, Spark, and AWS cloud services has enabled me to deliver data-driven solutions across diverse industries, including healthcare, media, and gaming. I am committed to performance, reliability, and data governance, ensuring that the solutions I provide meet the highest standards.
Throughout my career, I have successfully designed and developed client-specific data pipelines, optimized SQL queries, and implemented CI/CD pipelines to automate and streamline processes. My recent role at ConcertAI involved significant improvements in data processing efficiency, achieving up to 60% faster runtimes through innovative multithreading techniques. I have also led migrations to modern data platforms like Databricks, enhancing scalability and operational governance.
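The multithreading mentioned above follows the general pattern sketched below: overlapping I/O-bound reads and writes with a thread pool. This is a minimal illustration only; the file names, transform step, and worker count are assumptions, not details from the actual ConcertAI pipelines.

```python
# Minimal sketch of the kind of multithreaded batch processing described above.
# File paths, the transform, and the worker count are illustrative assumptions.
from concurrent.futures import ThreadPoolExecutor, as_completed

import pandas as pd


def process_file(path: str) -> int:
    """Read one extract, apply a lightweight transform, and return the row count."""
    df = pd.read_csv(path)
    df = df.dropna(how="all")  # placeholder transform
    df.to_parquet(path.replace(".csv", ".parquet"))
    return len(df)


def run(paths: list[str], max_workers: int = 8) -> int:
    """Process files concurrently; I/O-bound reads overlap, cutting wall-clock time."""
    total = 0
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(process_file, p): p for p in paths}
        for future in as_completed(futures):
            total += future.result()
    return total


if __name__ == "__main__":
    print(run(["extract_1.csv", "extract_2.csv"]))
```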
I thrive in environments that challenge my technical skills and allow me to contribute to impactful projects. My goal is to continue leveraging my expertise in data engineering to drive business success and foster data-driven decision-making.
Experience
Work history, roles, and key accomplishments
Data Engineer
ConcertAI
Apr 2023 - Present (2 years 2 months)
Designed and developed scalable, client-specific data pipelines to ingest data from diverse sources including APIs, SFTP, RDS, and S3. Managed modular ETL workflows using AWS Glue by separating ingestion, transformation, and aggregation into dedicated workflows, and delivered final datasets via secure SFTP transfers or Snowflake Secure Data Sharing. Led the migration of existing data pipelines to modern data platforms such as Databricks, improving scalability and operational governance.
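As an illustration of the modular workflow design described above, here is a minimal sketch that chains dedicated Glue workflows for ingestion, transformation, and aggregation using boto3. The workflow names and region are hypothetical placeholders, not the actual ConcertAI resources.

```python
# Minimal sketch: kick off separate, dedicated Glue workflows for ingestion,
# transformation, and aggregation. Workflow names and region are hypothetical.
import time

import boto3

glue = boto3.client("glue", region_name="us-east-1")

WORKFLOWS = ["client_ingestion", "client_transformation", "client_aggregation"]


def wait_for(workflow: str, run_id: str, poll_seconds: int = 60) -> None:
    """Block until the given workflow run finishes, raising on failure."""
    while True:
        run = glue.get_workflow_run(Name=workflow, RunId=run_id)["Run"]
        status = run["Status"]
        if status == "COMPLETED":
            return
        if status in ("STOPPED", "ERROR"):
            raise RuntimeError(f"{workflow} ended with status {status}")
        time.sleep(poll_seconds)


def run_pipeline() -> None:
    """Run the three workflows in sequence so each stage sees the previous stage's output."""
    for workflow in WORKFLOWS:
        run_id = glue.start_workflow_run(Name=workflow)["RunId"]
        wait_for(workflow, run_id)


if __name__ == "__main__":
    run_pipeline()
```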
Data Engineer
Agilisium Consulting
Dec 2021 - Present (3 years 6 months)
Built scalable, end-to-end data pipelines on Databricks using Delta Live Tables, Workflows, and Notebooks to automate data ingestion, transformation, and scheduling. Migrated legacy Data Lake architecture to Delta Lake, creating Delta tables from existing Parquet files and implementing Bronze-Silver-Gold layer architecture. Automated monitoring and workflow orchestration using Apache Airflow.
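The Bronze-Silver-Gold layering described above can be expressed in Delta Live Tables roughly as in the sketch below. It runs only inside a Databricks DLT pipeline, and the source path, table names, and columns are hypothetical placeholders.

```python
# Minimal sketch of Bronze-Silver-Gold layering with Delta Live Tables.
# Runs only inside a Databricks DLT pipeline; paths and columns are hypothetical.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Bronze: raw Parquet files ingested as-is.")
def orders_bronze():
    return spark.read.format("parquet").load("/mnt/raw/orders/")


@dlt.table(comment="Silver: cleaned and de-duplicated records.")
def orders_silver():
    return (
        dlt.read("orders_bronze")
        .dropDuplicates(["order_id"])
        .filter(F.col("order_id").isNotNull())
    )


@dlt.table(comment="Gold: daily aggregates ready for reporting.")
def orders_gold():
    return (
        dlt.read("orders_silver")
        .groupBy("order_date")
        .agg(F.sum("amount").alias("total_amount"))
    )
```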
Product Engineer
LinkCXO Global Private Ltd
Apr 2019 - Present (6 years 2 months)
Designed and implemented scalable data engineering solutions using Azure Data Factory, Azure Data Lake Gen2, Blob Storage, Azure SQL Database, Azure Databricks, and HDInsight, enabling end-to-end data integration, transformation, and analytics workflows. Developed metadata-driven and event-triggered ADF pipelines with branching, chaining, parameterization, and CI/CD integration via Azure DevOps.
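A metadata-driven, parameterized pipeline of the kind described above can be triggered from Python with the azure-mgmt-datafactory SDK, roughly as sketched below. The subscription ID, resource group, factory, pipeline, and parameter names are hypothetical placeholders.

```python
# Minimal sketch: start a parameterized Azure Data Factory pipeline run.
# Subscription ID, resource group, factory, pipeline, and parameters are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-data-platform"
FACTORY_NAME = "adf-example"
PIPELINE_NAME = "pl_ingest_source"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Pass metadata as pipeline parameters so one pipeline serves many sources.
run = client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={"source_table": "dbo.customers", "load_date": "2024-01-01"},
)
print(f"Started pipeline run {run.run_id}")
```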
Education
Degrees, certifications, and relevant coursework
Visvesvaraya Technological University
Bachelor of Engineering, Computer Science
Completed a Bachelor of Engineering in Computer Science. The curriculum covered core concepts in computer science, including programming, data structures, and algorithms.
Tech stack
Software and tools used professionally
Availability
Location
Authorized to work in
Job categories