Senior Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of experience in data engineering or a data-facing role.
  • Bachelor's degree in a quantitative field such as Computer Science, Engineering, or Mathematics.
  • Experience in data modeling, data warehousing, and building ETL pipelines.
  • Proven experience leveraging AI tools and integrating AI-driven solutions into workflows.

Key responsibilities:

  • Develop and maintain scalable data pipelines and build new integrations for increasing data volume.
  • Implement automated monitoring and alerting features for data consumption pipelines.
  • Ensure data quality by implementing processes and systems for accurate production data.
  • Collaborate with business units and engineering teams to develop long-term data platform architecture.

Apollo.io · 501 - 1000 Employees
https://www.apollo.io/demo

Job description

Apollo.io is the leading go-to-market solution for revenue teams, trusted by over 500,000 companies and millions of users globally, from rapidly growing startups to some of the world's largest enterprises. Founded in 2015, the company is one of the fastest-growing companies in SaaS, having raised approximately $250 million to date at a $1.6 billion valuation. Apollo.io provides sales and marketing teams with easy access to verified contact data for over 210 million B2B contacts and 35 million companies worldwide, along with tools to engage and convert these contacts in one unified platform. By helping revenue professionals find the most accurate contact information and automating the outreach process, Apollo.io turns prospects into customers. Apollo raised a Series D in 2023 and is backed by top-tier investors, including Sequoia Capital, Bain Capital Ventures, and more, and counts the former President and COO of HubSpot, JD Sherman, among its board members.

**This is a Permanent EoR role and not a B2B Contract** 
Your Role & Mission

As a Senior Data Engineer, you will be responsible for maintaining and operating the data warehouse and connecting Apollo’s data sources to it.

Daily Adventures and Responsibilities
  • Develop and maintain scalable data pipelines and build new integrations to support continuing increases in data volume and complexity.
  • Implement automated monitoring, alerting, and self-healing (restartable/graceful failure) features while building the consumption pipelines (a minimal sketch follows this list).
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available.
  • Write unit/integration tests, contribute to the engineering wiki, and document your work.
  • Define company data models and write jobs to populate data models in our data warehouse.
  • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
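
To make the monitoring/alerting/self-healing responsibility above concrete, here is a minimal sketch, assuming Apache Airflow 2.4+: the DAG id, task, and alerting hook are hypothetical placeholders for illustration, not Apollo.io's actual pipelines. Retries with a delay give restartability, and a failure callback provides the alerting.

```python
# A minimal sketch, assuming Apache Airflow 2.4+. The DAG id, task, and
# alerting hook are hypothetical placeholders, not Apollo.io's pipelines.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Hypothetical alert hook: a real pipeline might page on-call or post to
    # Slack; here we only log which task instance failed.
    print(f"ALERT: task {context['task_instance'].task_id} failed")


def load_batch():
    # Placeholder load step. Keeping it idempotent is what makes automatic
    # retries (the "self-healing"/restartable part) safe to run.
    pass


with DAG(
    dag_id="consumption_pipeline_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 3,                              # restart transient failures
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,  # alert on graceful failure
    },
):
    PythonOperator(task_id="load_batch", python_callable=load_batch)
```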
Competencies
  • Excellent communication skills to work with engineering, product, and business owners to develop and define key business questions and build data sets that answer those questions.
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized, diligent, and great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open; loves learning
  • Critical thinking and proven problem-solving skills required
Skills & Relevant Experience
Required:
  • 5+ years of experience in data engineering or in a data-facing role
  • Experience in data modeling, data warehousing, and building ETL pipelines
  • Deep knowledge of data warehousing with an ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical/Computer Science, Engineering, or Mathematics/Statistics)
  • Proven experience leveraging AI tools, with demonstrated fluency in integrating AI-driven solutions into workflows and a willingness to stay current with emerging AI technologies
Preferred:
  • Experience using the Python data stack
  • Experience deploying and managing data pipelines in the cloud (preferably AWS or GCP)
  • Experience working with technologies like Airflow, Hadoop, and Spark
  • Understanding of streaming technologies like Kafka and Spark Streaming (see the sketch below)
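
For context on how those streaming pieces typically fit together, here is a minimal sketch of landing a Kafka topic into warehouse staging with Spark Structured Streaming (pyspark). The broker, topic, and paths are hypothetical placeholders, not Apollo.io's setup, and it assumes the spark-sql-kafka connector is on the classpath.

```python
# A minimal sketch: consume a Kafka topic with Spark Structured Streaming.
# Broker, topic, and paths below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events_stream_example").getOrCreate()

# Read the raw event stream from Kafka; Kafka delivers key/value as bytes.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # hypothetical broker
    .option("subscribe", "events")                        # hypothetical topic
    .load()
    .select(col("value").cast("string").alias("payload"))
)

# Land the stream in a staging area with checkpointing, so the job can
# restart from where it left off after a failure.
query = (
    events.writeStream.format("parquet")
    .option("path", "/warehouse/staging/events")          # hypothetical path
    .option("checkpointLocation", "/warehouse/checkpoints/events")
    .start()
)
query.awaitTermination()
```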



Why You’ll Love Working at Apollo

At Apollo, we’re driven by a shared mission: to help our customers unlock their full revenue potential. That’s why we take extreme ownership of our work, move with focus and urgency, and learn voraciously to stay ahead.

We invest deeply in your growth, ensuring you have the resources, support, and autonomy to own your role and make a real impact. Collaboration is at our core—we’re all for one, meaning you’ll have a team across departments ready to help you succeed. We encourage bold ideas and courageous action, giving you the freedom to experiment, take smart risks, and drive big wins.

If you’re looking for a place where your work matters, where you can push boundaries, and where your career can thrive—Apollo is the place for you.

Required profile

Spoken language(s): English

Other Skills

  • Communication
  • Assertiveness
  • Critical Thinking
  • Detail Oriented
  • Self-Motivation
  • Problem Solving
