Data Architect-GCP

Remote: Full Remote

Offer summary

Qualifications:

  • 7+ years of experience architecting and optimizing ETL pipelines and data warehouses
  • Strong skills in GCP architecture, data governance, and security
  • Expertise in data modeling and system design, particularly in GCP environments
  • Proven ability to communicate effectively between business and technology teams

Key responsibilities:

  • Design and implement microservices in AWS and GCP.
  • Architect the migration from AWS to GCP, focusing on ELT/ETL processes.
  • Build scalable and fault-tolerant ETL pipelines while ensuring data integrity and quality.
  • Provide mentorship and technical leadership to junior data engineers and collaborate with other architects on core data systems.

Saviance Technologies Pvt. Ltd. SME https://saviance.com/
51 - 200 Employees

Job description

 

Data Architect-GCP
Boston, MA (entirely remote)

About BigR.io:
BigR.io is a remote-based technology consulting firm headquartered in Boston, MA. We deliver software solutions ranging from custom development and software implementation to data analytics and machine learning/AI integrations. We are a one-stop shop that attracts clients from a variety of industries because of our proven ability to deliver cutting-edge, cost-conscious software solutions.
With extensive domain knowledge, BigR.io has teams of data architects, data engineers, software engineers, web developers, and consultants who deliver best-in-class solutions across a variety of verticals. Our diverse industry exposure equips us with invaluable tools, tricks, and techniques for tackling complex software and data challenges.

About the Job:

The Data Team designs, builds, and maintains the integrated platform that securely procures and links critical business data from disparate internal and external sources. This involves assimilating all structured, unstructured, and semi-structured data. This role will touch all aspects of the data operation, particularly data infrastructure, to ensure a robust, efficient, and consistent foundation for enterprise consumption and application development.

 

Responsibilities:

  • Design microservices-based solutions in AWS and GCP
  • Architect the migration from AWS to GCP, including ELT/ETL processes
  • Implement solutions using GCP BigQuery
  • Build robust, scalable, and fault-tolerant ETL pipelines that allow flexibility and efficiency with minimal overhead and maintenance
  • Ensure data integrity, consistency, and quality of new releases
  • Provide mentorship and technical leadership to junior data engineers on the team
  • Work with other architects and engineers to define, execute, and update core data systems while maintaining a high level of availability and transactional correctness
  • Help define future technical directions for data systems in collaboration with senior management, product management, and stakeholders

 

Minimum Job Qualifications:

  • 7+ years of full-time professional experience architecting, building, and optimizing ETL pipelines and data warehouses to onboard and streamline data cleansing, transformation, standardization, and aggregation processes, with a design mindset centered on flexibility, robustness, computing efficiency, and maintainability
  • Excellent and proven abilities in:
    • Analytical skills
    • GCP architecture
    • Data governance
    • Security
    • Data modeling
  • Proven track record of owning data integrity and QA, with an exceptional attention to detail
  • Expert in data modeling and system design, particularly in GCP environments
  • Comfort in working in a fast-paced environment with moving targets and changing priorities
  • Proven ability to speak both business and technology, and to liaise effectively between both teams
  • Demonstrated expertise in Information Architecture, Data Engineering, and data warehousing
  • Strong experience designing distributed systems for scale and high availability.
  • Extensive experience designing microservice-based applications and data pipelines, including ETL concepts
  • Deep experience with AWS and GCP

Preferred Job Qualifications:

  • Direct experience with GCP BigQuery
  • Hands-on experience with web analytics, 3rd-party augmentation data, and A/B testing is a plus
  • Experience hosting, provisioning, and maintaining database servers in cloud environments, preferably GCP, is a plus

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Detail Oriented
  • Communication
  • Analytical Skills
