GTM Engineering Team at TRACTIAN
What you'll do
Responsibilities:
- Architect and manage data pipelines and infrastructure for GTM systems, leveraging BigQuery, Clay, and ZoomInfo, as well as cloud storage and ETL frameworks.
- Develop, test, and maintain highly available scripts and microservices (Python, JavaScript/Node.js) to orchestrate data acquisition and processing workflows.
- Design and manage a unified GTM data lake, merging raw and enriched datasets into a single source of truth for segmentation, ICP scoring, and territory assignments.
- Incorporate LLM-based enrichment pipelines to intelligently augment account/contact metadata and drive predictive targeting.
- Build and maintain batch & streaming ingestion pipelines for GTM data using Apache Beam, Kafka, or Pub/Sub, with Airflow orchestration.
- Develop semantic layers and star/snowflake schemas for GTM analytics in dbt, ensuring BI tools query from pre-aggregated, materialized datasets.
- Partner closely with RevOps, Enablement, and Sales Leadership to improve seller workflows, define KPIs, and ensure data consistency across tools.
- Proactively identify areas for automation and workflow efficiency; build scalable solutions for repetitive GTM processes.
- Design and maintain high-cardinality indexing strategies for real-time ICP scoring and territory assignment.
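To illustrate the kind of ICP-scoring and territory-assignment logic described above, here is a minimal sketch of a weighted account scorer. All field names, weights, and thresholds are hypothetical illustrations, not TRACTIAN's actual scoring model; in practice these would come from enriched warehouse tables rather than in-memory dicts.

```python
# Minimal sketch of a weighted ICP (Ideal Customer Profile) scorer.
# Field names, weights, and thresholds are hypothetical examples,
# not TRACTIAN's actual model.

ICP_WEIGHTS = {
    "employee_count": 0.40,  # larger operations -> more assets to monitor
    "industry_fit": 0.35,    # 0-1 fit score from enrichment (e.g., vertical)
    "tech_signals": 0.25,    # 0-1 score from detected tooling signals
}

def normalize_employee_count(n: int) -> float:
    """Map headcount to a 0-1 score, saturating at 5,000 employees."""
    return min(n, 5000) / 5000

def score_account(account: dict) -> float:
    """Return a 0-100 ICP score from pre-enriched account fields."""
    features = {
        "employee_count": normalize_employee_count(account.get("employee_count", 0)),
        "industry_fit": account.get("industry_fit", 0.0),
        "tech_signals": account.get("tech_signals", 0.0),
    }
    raw = sum(ICP_WEIGHTS[k] * features[k] for k in ICP_WEIGHTS)
    return round(raw * 100, 1)

def assign_territory(account: dict, threshold: float = 60.0) -> str:
    """Route high-scoring accounts to a rep pod; everything else to nurture."""
    return "enterprise_pod" if score_account(account) >= threshold else "nurture_queue"
```

In a production pipeline, a function like `score_account` would typically live in a dbt model or a batch job reading from the unified GTM data lake, so that BI tools and routing workflows consume the same pre-computed score.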
Requirements:
- 5+ years of experience in a technical data engineering, data analytics, or data science role, preferably in high-growth B2B SaaS.
- Comfort with API integrations and at least one scripting language (e.g., Python or Go) to automate and connect tools.
- Experience managing large datasets and transforming GTM insights into tactical recommendations.
- Excellent problem-solving skills and a bias for automation and scale.
- Ability to collaborate across functions and communicate technical concepts to non-technical stakeholders.
- Comfortable working autonomously in a fast-paced environment.