Senior Data Engineer (Contract-to-Full-Time)
Company: Throttle Up Technologies, LLC
Location: Remote
Job Type: Contract (1–3 months), converting to a full-time permanent role
Salary: $100,000 – $160,000 per year
Benefits: Health, Dental, Vision, Life Insurance, Paid Time Off
About the Role
Throttle Up Technologies is hiring a Senior Data Engineer to own and modernize a complex, affiliate- and content-driven data ecosystem. You’ll collaborate closely with a Data Engineer and Data Scientist to optimize pipelines, improve observability, and lead orchestration architecture. This role begins as a short-term contract with the expectation of transitioning into a permanent leadership position.
Key Responsibilities
Data Pipeline Ownership
- Manage and optimize ETL pipelines ingesting data from affiliate platforms (Rakuten, Impact), ad networks, and web/CDN logs
- Maintain and enhance Python-based ingestion, transformation, and reverse ETL workflows
- Expand composable CDP capabilities (Hightouch) for identity resolution and audience segmentation
Orchestration & Observability
- Lead the selection and implementation of an orchestration framework (Dagster or Prefect); a minimal sketch follows this list
- Build monitoring, alerting, and observability to reduce downtime and speed up troubleshooting
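To make the orchestration and observability expectations concrete, here is a minimal sketch of what a flow could look like in Prefect, one of the two candidate frameworks. The task bodies, table names, and retry settings are illustrative placeholders, not our production pipeline:

    # Hypothetical skeleton: extract affiliate transactions and load them to
    # BigQuery, with retries and logged run states for basic observability.
    from prefect import flow, task

    @task(retries=3, retry_delay_seconds=60)
    def extract_affiliate_report(network: str) -> list[dict]:
        # Placeholder: call the network's reporting API (Rakuten, Impact).
        return []

    @task
    def load_to_bigquery(rows: list[dict], table: str) -> None:
        # Placeholder: stream rows into a BigQuery staging table.
        print(f"loaded {len(rows)} rows into {table}")

    @flow(log_prints=True)
    def affiliate_ingest():
        for network in ("rakuten", "impact"):
            rows = extract_affiliate_report(network)
            load_to_bigquery(rows, table=f"staging.{network}_transactions")

    if __name__ == "__main__":
        affiliate_ingest()

An equivalent Dagster version would express the same steps as software-defined assets; the point is that retries, run states, and logs come from the framework rather than ad-hoc scripts.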
Data Modeling & Warehouse Architecture
- Refactor and scale core data models for new business lines
- Design dimensional models and fact tables for analytics and reporting
Reporting & Analytics
- Support reporting in Hex using R/Python
- Partner with Data Science on experimentation, A/B testing, and advanced analytics; a worked example follows this list
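As one concrete example of the experimentation work this role supports, a two-proportion z-test on conversion rates is a typical read-out; the counts below are made up for illustration:

    # Two-sided z-test for a difference in conversion rates (illustrative data).
    from math import sqrt, erfc

    def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)         # pooled rate under H0
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        p_value = erfc(abs(z) / sqrt(2))                 # two-sided p-value
        return z, p_value

    z, p = two_proportion_ztest(conv_a=480, n_a=10_000, conv_b=550, n_b=10_000)
    print(f"z = {z:.2f}, p = {p:.4f}")                   # z ≈ 2.24, p ≈ 0.025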
AI Integration
- Apply AI-driven tools to improve pipeline performance and scalability
- Stay current with emerging AI and data engineering best practices
Collaboration & Leadership
- Work cross-functionally with engineering, analytics, and business teams
- Mentor junior engineers and promote best practices
Model Ops for Paid Media
- Build production pathways for models, including packaging, orchestration, and monitoring
- Align offline/online metrics and ad-platform attribution (gclid/gbraid, hashing, deduplication); see the sketch below
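The sketch below illustrates the hashing and deduplication steps named in this list, in the style of an ad-platform offline-conversion upload (Google's enhanced conversions, for instance, expect SHA-256 over a trimmed, lowercased email). Field names and rows are illustrative, not our schema:

    import hashlib

    def normalize_and_hash_email(email: str) -> str:
        # Trim and lowercase before hashing, per typical platform guidance.
        return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

    def dedupe_conversions(rows: list[dict]) -> list[dict]:
        # Keep one conversion per click ID + action; re-uploading the same
        # event would otherwise inflate attributed revenue.
        seen, out = set(), []
        for row in rows:
            key = (row.get("gclid") or row.get("gbraid"), row["conversion_action"])
            if key not in seen:
                seen.add(key)
                out.append(row)
        return out

    rows = [
        {"gclid": "abc123", "conversion_action": "purchase",
         "email_sha256": normalize_and_hash_email(" User@Example.com ")},
        {"gclid": "abc123", "conversion_action": "purchase"},  # duplicate click
    ]
    print(dedupe_conversions(rows))                            # duplicate dropped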
Required Qualifications
- Expert-level Python for data pipelines and custom transformations
- Advanced SQL with dbt for complex transformations and performance optimization
- Experience with GCP (BigQuery; Cloud Functions preferred)
- Hands-on experience with ETL/ELT tools (dbt, Airbyte, or similar)
- Orchestration experience (Airflow, Dagster, Prefect)
- CDP familiarity (Hightouch, Segment, RudderStack)
- Strong data modeling and warehousing expertise
- Practical enthusiasm for using AI in daily workflows
Soft Skills
- Strong problem-solving and communication skills
- Ability to work independently in a fast-paced environment
- Comfortable owning projects end-to-end with minimal oversight
- Adaptable and resilient to changing priorities
Application Notes
- This role starts with a 1–3 month contract before conversion to full-time
- Candidates will be asked about orchestration experience and real-world production incidents related to models or ad-platform attribution
