Type of Requisition: Regular
Clearance Level Must Currently Possess: None
Clearance Level Must Be Able to Obtain: None
Public Trust/Other Required: MBI (T2)
Job Family: IT Infrastructure and Operations
Job Qualifications:
Skills: Data Analysis, Kubernetes, Solution Architecture
Certifications: None
Experience: 8+ years of related experience
US Citizenship Required: No
Job Description:
Position Description:
The Solutions Architect/Data Engineer will implement complex systems to meet the current and future needs of a federal agency in Washington, DC, working closely with stakeholders to ensure IT systems are efficient, secure, and compliant with industry standards. The candidate will also perform engineering work associated with the design, development, maintenance, and testing of data infrastructure, as well as optimizing data flow and collection for cross-functional teams.
Position Duties:
Provides enterprise-level technical design and support to leadership to align IT systems and data solutions with organizational goals.
Develops and maintains scalable, secure, and integrated system architectures across cloud and on-premises platforms.
Bridges business needs and technology capabilities to guide complex solution development and implementation.
Leverages expertise in SQL, Python, ETL automation, data pipelines, and cloud platforms (AWS, Azure).
Applies best practices in system integration, data governance, cybersecurity, and enterprise frameworks (e.g., TOGAF, Zachman).
Designs, develops, and implements methods, processes and systems to consolidate and analyze diverse data sets, both structured and unstructured.
Builds infrastructure pipelines required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
Qualifications:
Requires a BA/BS degree in a related discipline and at least 8 years of experience in solution architecture/data engineering.
Strong experience in Kubernetes orchestration for scalable deployment environments.
Expertise in software development, preferably using Python.
Knowledge of machine learning model deployment practices.
Familiarity with ML orchestration tools (e.g., Kubeflow, MLflow, Airflow, SageMaker, or similar).
Experience with infrastructure-as-code using Terraform and OpenTofu.
Proficiency with GitLab for source control, CI/CD, and DevOps workflows.
Hands-on experience with Ansible for configuration management and automated provisioning.
Experience with Databricks for large-scale data engineering, ML workflows, and collaborative analytics.
A Kubernetes administration certification and/or a machine learning certification is a plus.
Additional Requirements:
This position requires an existing Public Trust or the ability to obtain one.
This position is 100% remote.
GDIT IS YOUR PLACE
At GDIT, the mission is our purpose, and our people are at the center of everything we do.
Growth: AI-powered career tool that identifies career steps and learning opportunities
Support: An internal mobility team focused on helping you achieve your career goals
Rewards: Comprehensive benefits and wellness packages, 401K with company match, and competitive pay and paid time off
Flexibility: Full-flex work week to own your priorities at work and at home
Community: Award-winning culture of innovation and a military-friendly workplace
OWN YOUR OPPORTUNITY
Explore an enterprise IT career at GDIT and you’ll find endless opportunities to grow alongside colleagues who share your desire to drive operations forward.
Scheduled Weekly Hours: 40
Travel Required: None
Telecommuting Options: Remote
Work Location: Any Location / Remote
Additional Work Locations:
