About the Role
The Snowflake Data Engineer will design, build, and maintain scalable data pipelines and transformations on Snowflake and Azure to produce authoritative data sources for analytics and decision-making. The role emphasizes SQL and ETL expertise, collaboration with business and analytics teams, and active use of AI-assisted development tools to accelerate delivery.
Key Responsibilities
- Design, build, and maintain scalable, reliable data pipelines using Snowflake, Azure, and modern ETL frameworks.
- Develop and optimize SQL-based transformations and data models to support enterprise reporting and analytics.
- Partner with business, analytics, and engineering teams to create authoritative data sources for self-service analytics and commercialization.
- Modernize legacy ingestion and transformation workflows and support data integration processes.
- Collaborate with BI and analytics teams to enable reporting and self-service analytics using platforms such as Power BI or Sigma.
- Participate in Agile delivery practices using tools such as Jira and ServiceNow.
- Apply AI-powered coding tools and automation techniques (e.g., Claude Code, GPT-based assistants, Cursor, Kizer) to streamline development and improve productivity.
Requirements
- 5+ years of experience in data engineering, data architecture, or a related technical discipline.
- Strong hands-on experience with Snowflake, including data modeling, performance optimization, and large-scale transformation workloads.
- Expertise in Azure, SQL, and ETL development for cloud-based data platforms and integrations.
- Experience designing, building, and maintaining enterprise-grade data pipelines and transformation frameworks.
- Familiarity with Agile methodologies and tools such as Jira and ServiceNow.
- Experience supporting analytics and reporting platforms such as Power BI or Sigma and enabling self-service analytics.
- Hands-on AI-assisted development experience using tools such as Claude Code, GPT-based assistants, Cursor, or Kizer.
- Experience with healthcare or other regulated-industry data, including data quality, governance, and sensitivity considerations, is a strong plus.