Data Engineer (Rewards Data)

Remote: Full Remote

Offer summary

Qualifications:

  • Proficiency in Python and experience with data pipeline implementation.
  • Familiarity with Apache Airflow, Kubernetes, Postgres, and Google Cloud Storage.
  • Strong understanding of data quality assurance and automated checks.
  • Experience working with blockchain technology and APIs.

Key responsibilities:

  • Design and implement data pipelines for rewards and pricing data.
  • Maintain automated workflows for asset management and APY data.
  • Develop internal dashboards for revenue and asset analysis.
  • Collaborate with cross-functional teams to meet data needs and business requirements.

Chorus One Startup https://chorus.one/
51 - 200 Employees

Job description

About Us

Chorus One is one of the leading operators of infrastructure for Proof-of-Stake networks and decentralized protocols. Tens of thousands of retail customers and institutions stake billions in assets through our infrastructure, helping to secure protocols and earn rewards. Our mission is to increase freedom and speed of innovation through decentralized technologies.

We are a diverse team of around 75 people distributed all over the globe. We value radical transparency, striving for excellence and improvement while treating each other with kindness and generosity. If this sounds like you, we'd love to hear from you.

Role

As a Data Engineer, you will be responsible for the design, implementation, and maintenance of data pipelines that power user-facing products as well as internal tools. You will integrate with internal blockchain RPC nodes and third-party APIs to fetch stake and rewards data for multiple blockchains, and store and process this data to power internal reports and dashboards, as well as external customer reporting. You will play a critical role in enabling data-driven decision-making.

Our current data pipelines are implemented in Python, and run on a mix of Apache Airflow and Kubernetes, storing data in Postgres and Google Cloud Storage. We use various dashboarding and analysis tools, including Streamlit.

Responsibilities
  • Implement data pipelines that ingest rewards, pricing, and commercial contract data. Ensure data quality via automated sanity checks.

  • Maintain automated workflows that make assets-under-management numbers and APY data available to our website and third parties.

  • Develop and maintain internal dashboards that present and analyze company revenue, assets under management, and detailed breakdowns of these numbers.

  • Work with cutting-edge blockchain technology so that all of the above data is available from the moment a new blockchain network launches.

  • Collaborate with our Networks, Engineering, Finance, and Customer Success teams to understand data needs and deliver on business requirements.
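
To give a flavor of the automated sanity checks mentioned above, here is a minimal sketch in Python (the posting's stated pipeline language). The record shape, field names, and validation rules are illustrative assumptions, not part of the role description:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical record shape; field names are illustrative only.
@dataclass
class RewardRecord:
    network: str          # e.g. "cosmos"
    validator: str        # validator address
    amount: float         # reward amount in the network's native token
    fetched_at: datetime  # when the pipeline fetched this value

def sanity_check(records: list[RewardRecord]) -> list[str]:
    """Return human-readable problems; an empty list means the batch passes."""
    problems = []
    now = datetime.now(timezone.utc)
    for i, r in enumerate(records):
        if r.amount < 0:
            problems.append(f"record {i}: negative reward amount {r.amount}")
        if r.fetched_at > now:
            problems.append(f"record {i}: fetched_at is in the future")
        if not r.network or not r.validator:
            problems.append(f"record {i}: missing network or validator")
    return problems
```

In a pipeline, a check like this would typically run as a gating step before loading a batch into Postgres, failing the run (and alerting) when the returned list is non-empty.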

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Detail Oriented
  • Collaboration
  • Problem Solving
