About the Role
A hands-on Data Engineering & Integration Engineer responsible for building scalable, high-performance data platforms on Snowflake and AWS. The role involves designing and optimizing ETL/ELT pipelines, integrating diverse data sources, and enabling analytics and AI-driven use cases through production-grade data systems.
Key Responsibilities
- Design, build, and optimize ETL/ELT pipelines using Snowflake and AWS
- Integrate data from APIs, files, logs, and streaming sources
- Develop advanced SQL, stored procedures, and Snowflake tasks
- Optimize Snowflake performance (query tuning, clustering, cost optimization)
- Implement secure, scalable cloud architectures on AWS
- Partner with stakeholders to translate business needs into data solutions
- Perform exploratory data analysis and support analytics/reporting use cases
- Build and maintain CI/CD pipelines for data workflows
- Automate testing, deployment, and monitoring of data pipelines
- Leverage Snowflake Cortex and related AI capabilities where applicable
Requirements
- 2–3 years of relevant experience
- Strong experience with Snowflake Data Cloud
- Advanced SQL and strong Python programming skills
- Hands-on experience with AWS
- Expertise in building scalable ETL/ELT pipelines
- Solid understanding of data modeling and warehousing
- Experience with CI/CD and DevOps practices
Nice to Have
- Experience with AI/ML pipelines and vector databases
- Familiarity with data quality, governance, and observability frameworks
- Familiarity with AI-assisted ("vibe") coding workflows