• A passion for all things data: understanding how to work with it at scale and, more
importantly, how to get the most out of it
• Good understanding of native Snowflake capabilities such as data ingestion, data sharing,
zero-copy cloning, tasks, and Snowpipe
• Expertise in data modeling, with a good understanding of modeling approaches such as
star schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java, or PHP
• Experience in ETL/ELT, either via a code-first approach or using low-code tools such as
AWS Glue, Amazon AppFlow, Informatica, Talend, Matillion, and Fivetran
• Experience with one or more AWS services, especially in relation to integration with
Snowflake
• Familiarity with data visualization tools such as Tableau, Power BI, Domo, or any similar
tool
• Experience with data virtualization tools such as Trino, Starburst, Denodo, Data Virtuality,
and Dremio
• SnowPro Advanced: Data Engineer certification is a must.