About JerseySTEM
All JerseySTEM roles are pro-bono (unpaid) positions. JerseySTEM is a mission-driven professional network of pro-bono contributors dedicated to improving access to STEM education and career pathways for underserved middle school girls in New Jersey.
Members contribute their professional skills and leverage their networks in service of the organization’s gender-equity agenda. Membership is a minimum six-month commitment of approximately six flexible hours per week and includes a $100 refundable deposit, returned after six months of active membership. K–12 educators, retirees, veterans, interns, and students are exempt from the deposit.
Overview
This is a pro-bono volunteer position.
JerseySTEM is seeking experienced Data Engineers to stabilize and scale the core data pipelines that power analytics and reporting. The platform's current complexity requires ownership from engineers who can design, implement, and maintain production-grade workflows across multiple data sources. This role focuses on operational reliability, sound data modeling, and long-term platform sustainability.
Responsibilities
- Design, build, and maintain production-grade ETL pipelines using MySQL and external data sources
- Integrate third-party systems and APIs, including Integrate.io
- Implement CDC and incremental loading strategies for efficient and reliable refresh
- Manage schema changes, late-arriving data, and source inconsistencies
- Design and maintain analytical models, including fact and dimension tables
- Build and evolve data marts and a centralized data warehouse
- Implement monitoring, documentation, and pipeline standards
- Ensure data quality, consistency, and operational resilience
- Provide technical leadership and define data engineering best practices
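The incremental loading and CDC work described above often starts from a watermark pattern: track the newest change timestamp already loaded, and pull only rows modified since. A minimal sketch, assuming a hypothetical `students` table with an `updated_at` change column; the stdlib `sqlite3` module stands in for MySQL here so the example is self-contained, and the row shapes are illustrative:

```python
import sqlite3

def ensure_table(conn):
    """Create the (illustrative) target table if it does not exist."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS students "
        "(id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)"
    )

def incremental_load(conn, source_rows, last_watermark):
    """Upsert only rows changed since last_watermark.

    Returns (rows_loaded, new_watermark). ISO-8601 timestamp strings
    compare correctly as text, so no date parsing is needed here.
    """
    changed = [r for r in source_rows if r["updated_at"] > last_watermark]
    conn.executemany(
        "INSERT OR REPLACE INTO students (id, name, updated_at) "
        "VALUES (?, ?, ?)",
        [(r["id"], r["name"], r["updated_at"]) for r in changed],
    )
    conn.commit()
    # Advance the watermark only if something actually changed.
    new_watermark = max((r["updated_at"] for r in changed), default=last_watermark)
    return len(changed), new_watermark
```

Rerunning the load with the stored watermark is then idempotent: an unchanged source yields zero rows, which is what makes nightly refreshes predictable. On MySQL itself the upsert would typically use `INSERT ... ON DUPLICATE KEY UPDATE` rather than SQLite's `INSERT OR REPLACE`.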
Qualifications
- Seven or more years of hands-on experience in data engineering or data platform roles
- Strong experience working with MySQL in analytical or hybrid environments
- Proven experience integrating external APIs and third-party systems
- Demonstrated experience implementing CDC or incremental load patterns
- Deep understanding of dimensional modeling and warehouse architecture
- Advanced SQL skills and strong proficiency in Python or similar languages
- Ability to operate independently and own pipelines end to end
What This Role Is Not
- Not limited to ad hoc scripts or one-off fixes
- Not a purely advisory position
- Not a passive oversight role
What Success Looks Like
- ETL pipelines are stable, incremental, and predictable
- API ingestion runs with minimal manual intervention
- Data models are trusted and analytics ready
- Analytics teams focus on insights rather than resolving data issues
