Snowflake Consultant_Bharti_RF

Remote: Full Remote

Offer summary

Qualifications:

  • Strong understanding of Snowflake capabilities such as data ingestion and sharing.
  • Expertise in data modeling techniques like Star schema and Data Vault.
  • Proficiency in programming languages such as Python, Scala, Java, or PHP.
  • Experience with ETL/ELT processes and familiarity with AWS integration.

Key responsibilities:

  • Work with large datasets to optimize data usage and performance.
  • Automate deployment processes and manage data pipelines.
  • Collaborate with clients to understand their data needs and provide solutions.
  • Utilize data visualization tools to present insights and findings.

CodersBrain SME https://www.codersbrain.com/
201 - 500 Employees

Job description

• A passion for all things data: understanding how to work with it at scale and, more importantly, how to get the most out of it
• Good understanding of native Snowflake capabilities like data ingestion, data sharing, zero-copy cloning, tasks, Snowpipe, etc.
• Expertise in data modeling, with a good understanding of modeling approaches like Star schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java, or PHP
• Experience in ETL/ELT, either via a code-first approach or using low-code tools like AWS Glue, AppFlow, Informatica, Talend, Matillion, Fivetran, etc.
• Experience with one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools like Tableau, Power BI, Domo, or any similar tool
• Experience with data virtualization tools like Trino, Starburst, Denodo, Data Virtuality, Dremio, etc.
• SnowPro Advanced: Data Engineer certification is a must

Required profile

Experience

Spoken language(s):
English
