Wagepoint is on a mission to simplify payroll, time-tracking and employee management – and maybe even dare to make it delightful! Our online software was created just for small businesses, automating the most time-consuming parts of these tasks, so that our customers can get back to doing what they love most.
Backed by the world’s friendliest team, Wagepoint is always supportive, never stuffy and refreshingly human.
At Wagepoint, we’re good people. Sure, maybe we’re biased. But that’s only because Wagepointers bring the best versions of ourselves to do some of the best work in our careers.
We’re an authentic team who bring our diverse backgrounds, ideas, experiences and cultures together to make payroll magic. Speaking of, we “pull rabbits out of hats” for thousands of small business owners across Canada who rely on our payroll software to pay their employees and process their payroll taxes. And we’re just getting started.
We’re growing and looking for more awesome people to join our merry band of misfits. We are looking for people who share our understanding that when it comes to our commitment to our customers: we mean business. Especially given that the consequences of getting payroll wrong can sometimes be dire.
If you are the kind of person who has always wanted to make a difference and be heard at work, Wagepoint will give you plenty of opportunity to do just that.
The Role At A High Level
We’re looking for a talented Senior Data Engineer who doesn’t shy away from building robust, scalable, and secure data systems. You’ll lead the development of our cloud-based data infrastructure and work closely with teams across analytics, engineering, and business intelligence to support decision-making with high-quality data. You will report directly to the Engineering Manager.
What You'll Be Expected To Own
Architect, document, implement and maintain scalable cloud-native data pipelines using modern tools that may include Apache Airflow, Databricks, Snowflake, Spark, dbt, and Azure.
Design and optimize ETL/ELT workflows and ensure robust data quality, security, and governance practices.
Build and maintain data lakes, warehouses, and customer data architectures with a focus on performance and scalability.
Automate validation and anomaly detection across the data ecosystem.
Standardize and maintain the data dictionary and metadata repositories.
Monitor and optimize performance, cost-efficiency, and system uptime.
Collaborate cross-functionally to ensure smooth data accessibility and support business intelligence tools like Power BI and Tableau.
Mentor peers, lead code reviews, and drive best practices across the Data Engineering team.
Contribute to real-time analytics, machine learning pipelines, and customer-facing reporting systems.
What You Bring To The Table
Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
5+ years of experience in data engineering, with expertise in ETL/ELT workflows and big data technologies.
Advanced skills in Python and SQL.
Experience with Azure (Data Lake, Synapse, CosmosDB) or other cloud platforms (AWS, GCP).
Proven experience with tools like Apache Spark, Databricks, Snowflake, or Airflow.
Solid knowledge of data warehousing, data modeling, and distributed systems.
Familiarity with CI/CD pipelines, containerization (Docker/Kubernetes), and security compliance frameworks.
Excellent problem-solving skills and attention to detail in optimizing data systems.
Strong communication skills and ability to work across multiple teams.
Bonus Points If
You have DevOps experience using Azure DevOps.
You’ve supported machine learning or AI workflows.
You’ve built or contributed to enterprise data dictionaries.
You have experience building real-time data monitoring and validation frameworks.
You’ve worked in fast-paced SaaS environments with a remote team.
What We Bring To The Table
Impact: Roll up your sleeves and directly contribute to the growth and success of Wagepoint by shaping our workforce.
Culture: The opportunity to work with the world’s friendliest team, solving interesting problems together with an endless amount of laughter (we work hard, but we always have time for a bad joke or two).
Growth: Opportunities for professional development and career advancement - we are always, always learning with a growth-oriented mindset.
Innovation: Work in an environment that encourages creativity and new ideas.
Remote: The ability to work from home, forever! Wagepoint is a remote company, so you don’t have to worry about commuting to an office. Plus, more time with your pets is always a bonus!
Ready to embark on an exciting journey with us? Apply now and help us build the future of Wagepoint!
Required profile
Experience
Industry:
Fintech: Finance + Technology
Spoken language(s):
English