Type of Requisition: Regular
Clearance Level Must Currently Possess: None
Clearance Level Must Be Able to Obtain: None
Public Trust/Other Required: NACI (T1)
Job Family: Software Engineering

Job Qualifications:
Skills: Agile Methodology, Apache Airflow, Data Warehousing (DW), ETL Design, Extract Transform Load (ETL)
Certifications: None
Experience: 3+ years of related experience
US Citizenship Required: No

Job Description:
Seize your opportunity to make a personal impact as a Cloud ETL Engineer supporting the Drug Data Processing System (DDPS) Part D processing for CMS. GDIT is your place to make meaningful contributions to challenging projects and grow a rewarding career.
At GDIT, people are our differentiator. As a Cloud ETL Engineer, you will help ensure today is safe and tomorrow is smarter. Our work depends on a Cloud ETL Engineer joining our team to support DDPS Part D processing for CMS, including the ETL development and testing mandated by IRA legislation for Medicare Part D processing.
How a Cloud ETL Engineer will Make an Impact:
- Builds and codes applications and/or models using various computer programming languages.
- Designs, develops, deploys, and maintains advanced operating systems and operating system software.
- Installs enhancements and performs updates to software of existing systems, including middleware and application programs that run on the system.
- Performs troubleshooting of advanced problems and provides customer support for software systems and application issues.
- Debugs advanced problems with system software and provides recommendations for continuous improvement.
- Performs maintenance tasks to keep systems running smoothly.
- Writes and updates test procedures and programs.
- May coach and provide guidance to less-experienced professionals.
- May serve as a team or task lead.
What You’ll Need to Succeed:
Education:
- BA/BS in Computer Science or a related technical discipline, or an equivalent combination of education, technical certifications or training, and work experience.
Required Experience:
- 3+ years of directly related computer programming experience.
- 3+ years of IT experience, with at least 4 years of SQL development experience on multiple relational database platforms such as Snowflake.
- 2+ years of Cloud ETL development experience using AWS services/tools, Databricks, Snowflake, and/or similar technologies.
- 3+ years of physical data modeling, partitioning, and developing optimization/indexing strategies on the Teradata platform or a similar DBMS.
- 2+ years of experience with Snowflake and Snowflake ETL for loading data from AWS S3.
- 2+ years of experience with UNIX scripting and utilities.
- In-depth knowledge of Data Warehouse (DW) concepts for ETL development.
- 2+ years of experience in code migration and deployment using AWS resources in a cloud environment.
- 3+ years of experience with Python and Spark programming.
- Candidate must be able to obtain and maintain a Public Trust clearance and must have lived in the United States for at least three (3) out of the last five (5) years.
Required Technical Skills:
- AWS development using S3, EC2, and Lambda functions
- Extract Transform Load (ETL)
- Python (Programming Language)
- Apache Spark programming
- Knowledge of Snowflake Data Warehouse and ETL
- Knowledge of Databricks notebooks, coding, and execution
- GitHub code configuration and management
Required Skills and Abilities:
- Attend daily stand-up scrum calls.
- Collaborate in a "war-room" setting with business analysts, developers, testers, architects, the scrum master, and the product owner to assist in grooming, designing, coding, and unit testing user stories related to the Program Increment and current iteration.
- Exercise positive interpersonal communication skills and work independently and within an agile team during all phases of the software development lifecycle.
- Design, develop and implement complex ETL processes of healthcare data to meet a wide range of business and system requirements.
- Support the ETL operational processes including but not limited to: automation, job scheduling, dependencies, monitoring, maintenance, patches, upgrades, security, and administration.
- Investigate and correct software defects; analyze and maintain data quality.
- Mentor and provide guidance to junior team members.
- Identify process improvements and innovative ways to solve existing or new problems.
Preferred Skills:
- Prior experience developing healthcare IT solutions strongly preferred.
- 2+ years of experience with GitHub or similar version control tools.
- Prior experience with the Agile development framework and CI/CD DevOps practices.
- Prior working experience with Medicare Part D data and ETL development.
Location: Remote
Clearance Level: Requires the ability to pass a CMS background check and meet the residency requirement of having resided in the US at least three (3) out of the last five (5) years in order to obtain a Public Trust.
Sponsorship will not be provided for this position.
What GDIT Can Offer You:
- Full-flex work week to own your priorities at work and at home, with core work hours Monday – Friday 9:00 AM ET – 3:00 PM ET
- 401K with company match
- Comprehensive health and wellness packages
- Internal mobility team dedicated to helping you own your career
- Professional growth opportunities including paid education and certifications
- Cutting-edge technology you can learn from
- Rest and recharge with paid vacation and holidays
- Challenging work that makes a real impact on the world around you
- Remote work
#GDITFedHealthJobs
The likely salary range for this position is $102,000 - $138,000. This is not, however, a guarantee of compensation or salary. Rather, salary will be set based on experience, geographic location, and possibly contractual requirements, and could fall outside of this range.
Scheduled Weekly Hours: 40
Travel Required: None
Telecommuting Options: Remote
Work Location: Any Location / Remote
Additional Work Locations:
