The Role
This internship can be full-time or part-time. We will work with you to deepen your knowledge of the energy industry and sales operations while strengthening your process-improvement, operational, and analytical skills.
Responsibilities for this role:
- Develop a data warehouse and data marts that transform raw data into high-performance, structured data models to power analyses & visualization tools for the Sales organization
- Build “point-in-time” data models that enable trend analysis such as pipeline flow reporting
- Enable Sales Ops users to perform their own ad-hoc analysis by maintaining clean, well-documented, easy-to-query semantic layers
- Design automated ingestion paths that validate manual spreadsheet data and integrate it into the central data environment
- Implement automated data quality checks and monitoring to ensure accuracy across all sales reporting
- Maintain clear documentation of data lineage, business logic, and definitions for all sales-related metrics
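To give a flavor of the ingestion and quality-check responsibilities above, here is a minimal sketch using pandas; the table name, column names, and validation rules are hypothetical examples, not the team's actual schema:

```python
import pandas as pd

# Hypothetical requirements for an incoming sales spreadsheet.
REQUIRED_COLUMNS = {"opportunity_id", "account_name", "stage", "amount"}
VALID_STAGES = {"prospecting", "negotiation", "closed_won", "closed_lost"}

def validate_sales_rows(df: pd.DataFrame) -> pd.DataFrame:
    """Run basic quality checks on spreadsheet data before it is
    loaded into the central environment; raise on structural problems."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    if df["opportunity_id"].duplicated().any():
        raise ValueError("duplicate opportunity_id values")

    # Flag (rather than silently drop) rows failing row-level checks,
    # so monitoring can report them back to the spreadsheet owner.
    bad_stage = ~df["stage"].isin(VALID_STAGES)
    bad_amount = df["amount"].isna() | (df["amount"] < 0)
    return df.assign(quality_flag=bad_stage | bad_amount)
```

In practice checks like these would run automatically on every load, turning a fragile manual copy-paste step into a tested, reproducible pipeline stage.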
Key Requirements
- Engineering Mindset: A preference for "code-first" analytics—moving away from fragile manual processes toward reproducible, tested pipelines.
- Expert SQL & Python: Must be comfortable writing complex transformations in SQL and automating pipelines in Python, with tools such as Jupyter or Marimo
- Deep understanding of dimensional modeling (star schemas, fact and dimension tables) to ensure intuitive self-service for data users
- Experience with dbt or similar frameworks that prioritize version-controlled, modular data modeling.
- Familiarity with (or eagerness to learn) customer lifecycles and CRM data structures (e.g., Salesforce)
- Uncommonly driven to succeed and relentlessly self-motivated, with extreme attention to detail
- Passion for clean energy and sustainability
- Background in a quantitative field such as Mathematics, Economics, Finance, or Engineering is required; an MBA is preferred
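For candidates unfamiliar with the fact/dimension requirement above, the idea can be sketched in a few lines; the tables and columns here are hypothetical illustrations, not real company data:

```python
import pandas as pd

# A toy star schema: one fact table of sales events,
# one conformed dimension describing accounts.
dim_account = pd.DataFrame({
    "account_id": [1, 2],
    "account_name": ["Acme", "Globex"],
    "segment": ["enterprise", "smb"],
})
fact_sales = pd.DataFrame({
    "account_id": [1, 1, 2],
    "amount": [100.0, 250.0, 40.0],
})

# Self-service stays simple because every ad-hoc question is the
# same shape: join facts to a dimension, then group by an attribute.
report = (
    fact_sales.merge(dim_account, on="account_id")
              .groupby("segment", as_index=False)["amount"].sum()
)
```

The same join-then-aggregate pattern is what a well-designed semantic layer exposes to Sales Ops users, whatever the actual tooling.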
