Software Engineering Lead Analyst - HIH - Evernorth
About the Role
Lead software engineer focused on designing and implementing ETL and data integration solutions for Evernorth's Data & Analytics organization. The role delivers scalable data transformation, ensures data quality and performance, and involves cross-team collaboration to support enterprise data platforms such as Databricks and Denodo.
Job Description
Role
Evernorth is seeking a Software Engineering Lead Analyst to design, implement, and maintain ETL and data integration solutions in the Data & Analytics organization. The role supports enterprise-scale data capabilities, improves development velocity, and delivers production-ready data transformation workflows.
Key Responsibilities
- Design and implement ETL processes to extract, transform, and load data from various sources.
- Conduct research and perform proof-of-concept (POC) activities.
- Use "vibe coding" (AI-assisted) development tools to improve development velocity.
- Collaborate with cross-functional teams to gather data requirements and deliver solutions.
- Optimize and maintain ETL workflows for efficiency and performance.
- Ensure data integrity and quality through testing and validation.
- Create and implement best practices for large data transformation layers.
- Prepare detailed technical documentation.
- Collaborate with team members to deliver production-ready applications.
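To make the ETL responsibilities above concrete, here is a minimal extract-transform-load sketch in Python. It is purely illustrative: the schema, data-quality rules, and SQLite target are placeholders, not Evernorth's actual stack (which centers on Databricks), but the shape (extract, validate/transform, load) is the same.

```python
import csv
import io
import sqlite3

# Hypothetical source data standing in for an upstream extract;
# the columns and validation rules below are illustrative only.
RAW = """member_id,claim_amount,state
1001,250.00,CT
1002,,NY
1003,75.50,ct
"""

def extract(raw: str) -> list:
    """Extract: read rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Transform: enforce data-quality rules and normalize values."""
    out = []
    for row in rows:
        if not row["claim_amount"]:
            continue  # quality rule: claim_amount is required
        out.append({
            "member_id": int(row["member_id"]),
            "claim_amount": float(row["claim_amount"]),
            "state": row["state"].upper(),  # normalize state codes
        })
    return out

def load(rows: list, conn: sqlite3.Connection) -> int:
    """Load: insert transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS claims "
        "(member_id INTEGER, claim_amount REAL, state TEXT)"
    )
    conn.executemany(
        "INSERT INTO claims VALUES (:member_id, :claim_amount, :state)", rows
    )
    return conn.execute("SELECT COUNT(*) FROM claims").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
print(loaded)  # 2 of 3 rows pass validation
```

In a production workflow, the same extract/transform/load boundaries would map onto orchestrated tasks (e.g., Airflow operators) so each stage can be monitored and retried independently.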
Requirements
Required
- 5–8 years of relevant ETL development experience.
- 4+ years of experience with Databricks.
- 4+ years of experience with Python.
- Hands-on experience building data integration solutions.
- Experience with both NoSQL and relational databases (e.g., MongoDB, PostgreSQL, Teradata, SQL Server).
- Experience with Data Warehouse, Data Lake, and Lakehouse technologies.
- Strong understanding of database concepts, query optimization, and performance tuning.
- Hands-on scripting experience with Python, plus experience monitoring workflows in Airflow.
- Experience with Denodo Platform administration (VDP server configuration, data source setup, view creation, RBAC, cache management).
- Experience with Denodo Solution Manager for environment promotion and deployment management.
- Experience configuring and tuning Presto/MPP within Denodo.
- Familiarity with Denodo SSO integration using SAML/OAuth providers (e.g., Okta) and user/group mapping.
- Experience with Denodo JDBC/ODBC connectivity to BI tools such as Tableau.
- Hands-on experience automating code/artifact migration across environments using CI/CD pipelines or platform-native tools (e.g., Denodo Solution Manager, Databricks Repos, AWS CodePipeline).
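As a rough sketch of the artifact-migration automation the last requirement describes, the step below promotes a job definition from one environment to another and verifies the copy by checksum. Everything here is hypothetical: a real pipeline would use Denodo Solution Manager revisions, Databricks Repos, or AWS CodePipeline stages rather than filesystem copies, but the promote-then-verify pattern carries over.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of an artifact, used to verify the promotion was faithful."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def promote(artifact: Path, target_env: Path) -> Path:
    """Copy an artifact into the target environment and verify integrity."""
    target_env.mkdir(parents=True, exist_ok=True)
    dest = target_env / artifact.name
    shutil.copy2(artifact, dest)
    if checksum(artifact) != checksum(dest):
        raise RuntimeError(f"promotion of {artifact.name} failed verification")
    return dest

# Hypothetical dev -> test promotion; paths and file contents are placeholders.
root = Path(tempfile.mkdtemp())
src = root / "dev" / "etl_job.json"
src.parent.mkdir(parents=True)
src.write_text('{"job": "claims_etl", "schedule": "daily"}')

promoted = promote(src, root / "test")
print(promoted.read_text() == src.read_text())  # True
```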
Preferred
- Databricks certifications and cloud certifications (e.g., AWS Developer Associate).
- Denodo Platform 8.0+ certification or equivalent admin experience.
- Experience deploying and managing Denodo on AWS EC2, including OS-level configuration and JVM tuning.
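As a rough illustration of the JVM tuning mentioned above, a Java-based server such as Denodo VDP on EC2 might be started with heap and garbage-collector options along these lines. The values are placeholders, not recommended settings; actual sizing depends on the instance type, workload, and Denodo's own documentation.

```
# Illustrative HotSpot JVM options (placeholder values only):
-Xms8g                    # initial heap sized for steady-state load
-Xmx8g                    # fixed max heap avoids resize pauses
-XX:+UseG1GC              # G1 collector for large heaps / latency goals
-XX:MaxGCPauseMillis=200  # soft pause-time target for G1
```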
Tools & Technologies
Databricks, Python, Airflow, Denodo (VDP, Solution Manager), Presto/MPP, MongoDB, PostgreSQL, Teradata, SQL Server, JDBC/ODBC, Tableau, SAML/OAuth (e.g., Okta), Databricks Repos, AWS CodePipeline, AWS (EC2), Azure, Data Lake/Warehouse/Lakehouse patterns.
Team & Environment
Work in an agile development environment focused on delivering user-oriented, enterprise-scale data capabilities. The team emphasizes technical excellence, collaboration, and building maintainable solutions.