Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
  • 5+ years of experience in data engineering or a similar role.
  • Strong hands-on experience with GCP services including Dataflow, Pub/Sub, and BigQuery.
  • Proficiency in Python and SQL for data manipulation and transformation.

Key responsibilities:

  • Design, build, and maintain data pipelines using GCP tools like Dataflow and BigQuery.
  • Lead data ingestion efforts from on-premise systems and external APIs into the cloud.
  • Collaborate with data scientists and analysts to ensure data availability and quality.
  • Monitor and optimize data pipelines for performance and reliability.

Job description

Who we are: 

We are a full-service agency & content studio helping companies thrive through strategy, creative, technology services, and human talent.


Job Purpose:

We’re looking for a Senior Data Engineer with deep expertise in building scalable, efficient, and secure data pipelines on Google Cloud Platform (GCP). You will play a key role in designing and implementing robust data solutions that empower data-driven decisions across the organization.

If you have a passion for cloud-native architectures, streaming data, and modern data integration strategies, we’d love to talk.


Job Details:

  • Location: 100% Remote - Open to candidates based in Central America, Mexico, or Colombia.
  • Schedule: Monday to Friday, 8:00 AM to 5:00 PM, Costa Rica time
  • Job Type: Full-Time Employment
  • Language Proficiency: Spanish: Advanced (Fluent) / English: C1 in reading, writing, and conversation
  • Availability: Immediate availability is preferred


Key Responsibilities:

  • Design, build, and maintain data pipelines using Dataflow, Cloud Pub/Sub, BigQuery, and Cloud Data Fusion.
  • Develop scalable streaming and batch pipelines to support real-time and historical data use cases.
  • Lead data ingestion efforts from on-premise systems, data lakes, and external APIs into the cloud environment.
  • Collaborate with data scientists, analysts, and platform teams to ensure data availability and quality.
  • Write efficient and production-grade Python and SQL code for data transformation and validation.
  • Implement pipeline orchestration using tools such as Cloud Composer, Airflow, or similar.
  • Monitor, troubleshoot, and optimize data pipelines to ensure performance and reliability.
  • Contribute to architecture and design decisions that support long-term scalability and maintainability.
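To give candidates a concrete flavor of the pipeline work described above, a typical ingestion step pairs a record-level transformation with validation and dead-letter handling. This is only an illustrative, library-free Python sketch with hypothetical field names, not the company's actual codebase:

```python
from datetime import datetime, timezone


def transform_event(raw: dict) -> dict:
    """Validate and normalize one raw event record (hypothetical schema)."""
    # Require the fields downstream consumers depend on.
    for field in ("event_id", "ts", "amount"):
        if field not in raw:
            raise ValueError(f"missing required field: {field}")
    return {
        "event_id": str(raw["event_id"]),
        # Normalize epoch seconds to an ISO-8601 UTC timestamp.
        "ts": datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
        # Store money as integer cents to avoid floating-point drift.
        "amount_cents": round(float(raw["amount"]) * 100),
    }


def run_batch(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into transformed rows and rejected rows (dead-letter style)."""
    good, bad = [], []
    for rec in records:
        try:
            good.append(transform_event(rec))
        except (ValueError, TypeError, OSError):
            bad.append(rec)
    return good, bad
```

In a real Dataflow or Composer pipeline the same validate/transform/reject pattern appears inside a DoFn or task, with rejected records routed to a dead-letter table for inspection.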


Required Academic Background:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.


Required Skills and Experience:

  • 5+ years of experience in data engineering or a similar role.
  • Strong hands-on experience with GCP services including Dataflow, Pub/Sub, BigQuery, and Cloud Data Fusion.
  • Proficiency in building stream processing systems using Kafka.
  • Familiarity with Docker, Kubernetes, and cloud services (AWS, GCP).
  • Advanced knowledge of Python and Linux shell scripting.
  • Proven expertise in streaming data architectures and real-time processing.
  • Experience ingesting and integrating data from on-premise sources, data lakes, and streaming platforms.
  • Experience with business intelligence software (e.g., Power BI, Tableau) and the graphic display of quantitative data.
  • Proficient in Python and SQL for data manipulation, transformation, and automation.
  • Skilled in pipeline orchestration tools such as Airflow, Cloud Composer, or equivalent.
  • Solid understanding of data modeling, data governance, and performance optimization best practices.
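As an illustration of the Python-plus-SQL proficiency listed above, the filter-then-aggregate pattern common in warehouse transformations can be sketched with the standard library's sqlite3 module (table and column names here are hypothetical, standing in for a BigQuery table):

```python
import sqlite3

# In-memory database standing in for a warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", 10.0, "ok"), ("u1", 5.0, "ok"), ("u2", 7.5, "failed")],
)

# A typical transformation: filter out bad rows, then aggregate per user.
rows = conn.execute(
    """
    SELECT user_id, SUM(amount) AS total
    FROM events
    WHERE status = 'ok'
    GROUP BY user_id
    ORDER BY user_id
    """
).fetchall()
print(rows)  # [('u1', 15.0)]
```

The same SELECT would run nearly unchanged in BigQuery; the skill being assessed is expressing transformations in SQL and driving them from Python.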


Nice to Have

  • Experience with CI/CD for data pipelines.
  • Familiarity with Terraform or Infrastructure as Code.
  • Background in security and compliance related to data.
  • Knowledge of other cloud platforms (AWS, Azure) is a plus.


Core Competencies:

At our company, we believe that success is not just about technical proficiency but also how you work with others and approach challenges. As part of our team, you’ll be expected to demonstrate the following key competencies:

  1. Cultivates Innovation:
    • Comes up with useful ideas that are new, better, or unique.
    • Introduces new ways of looking at problems.
  2. Collaborates:
    • Works cooperatively with others across the organization to achieve shared objectives.
    • Credits others for their contributions and accomplishments.
  3. Manages Complexity:
    • Evaluates pros and cons, risks and benefits of different solution options.
    • Analyzes multiple and diverse sources of information to define problems accurately before moving to solutions.
  4. Communicates effectively:
    • Delivers messages in a clear, compelling, and concise manner.
    • Provides timely and helpful information to others across the organization. 
    • Proficiency in English for written and verbal communication.
    • Ability to articulate technical challenges and solutions to diverse audiences.

Additional Notes:

  • This role requires an advanced Spanish speaker with intermediate to advanced English proficiency (both written and verbal).
  • We are open to candidates based in Central America, Mexico, or Colombia.



Compensation: $2,000 - $2,500 USD per month

Required profile

Spoken language(s): Spanish, English

Other Skills

  • Collaboration
  • Communication
  • Problem Solving
