We're looking for a Geospatial Data Engineering Intern to help build and scale our geospatial data infrastructure over the summer. You'll work directly with our data team, contributing to operational infrastructure that powers geospatial analysis and decision-making across the organization.
Requirements
- Currently a junior or senior undergraduate in Computer Science, Data Science, GIS, Geospatial Engineering, Software Engineering, or a related field
- At least one prior internship (or equivalent team-based engineering experience): you've shipped code in a shared repo, responded to code review, and worked a ticket end-to-end
- Available to work full-time for 3 months during the summer, then part-time through the fall semester
- Strong fundamentals in Python, including classes, inheritance, decorators, type hints, and explicit imports
- Strong fundamentals in SQL: joins, CTEs, window functions, and aggregations
- Comfortable working in Git/GitHub with a dev → main PR-to-deploy workflow
- Comfortable on the Unix command line or eager to learn (bash, navigating a filesystem, running scripts)
- Familiarity with geospatial concepts (CRS, spatial joins, indexing) and tooling such as PostGIS, GeoPandas, or QGIS is a plus
- Exposure to AWS or similar cloud providers; ideally S3, IAM, Athena, Glue, ECS, or Redshift
- Experience with Airflow or similar orchestration tools is a plus, as is a strong eagerness to learn quickly; you'll be ramping on two Airflow versions in parallel and contributing directly to our migration effort
- Familiarity with dbt, Pandas, or Parquet/columnar data is a plus
- Exposure to AI agent architectures (e.g., ReAct) and protocols (A2A, MCP, AG-UI) is a plus
Benefits
- 100% remote work from home
- Competitive hourly wage: $35–$40 per hour
- Opportunities to learn and grow – exposure to all things startup
- A chance to play a role in defining the roadmap as we pursue a bold vision and a big goal
- Work from (almost) anywhere
- Optional (and encouraged) team retreats: we all convene 1–2 times a year to get away together
