This role explicitly calls out vibe coding and the use of AI-powered coding assistants (e.g., GitHub Copilot, Gemini) and agentic AI frameworks to accelerate development.
About the Role
Lead the design and deployment of cloud-native, scalable AI solutions and APIs at Equifax, applying agentic AI frameworks and modern development practices. The role focuses on building production-ready AI agents, data pipelines, and microservices while mentoring junior engineers.
Job Description
Role
Equifax is hiring an AI Engineer (Intermediate) to lead technology transformation efforts by architecting and deploying cloud-native, scalable AI and data solutions. The role covers full-stack development, cloud infrastructure, containerized microservices, and building and deploying AI agents on Google Cloud Platform.
Key Responsibilities
- Design, develop, and deploy AI agents, APIs, microservices, and PaaS/SaaS platforms.
- Build and maintain data pipelines to gather, integrate, cleanse, and structure data from multiple sources.
- Apply data-quality rules, implement automated quality checks, and troubleshoot data anomalies.
- Implement security best practices in pipelines and infrastructure.
- Use and evaluate GCP AI/ML services and stay current with platform advancements.
- Work with Kubernetes/Docker for containerized workloads and CI/CD tooling for automated delivery.
- Provide guidance and mentorship to junior data engineers and review their implementations.
Requirements
- Bachelor’s degree or equivalent experience.
- 3+ years in software engineering delivering complex, scalable systems.
- 1+ years in a dedicated AI/ML role with hands-on model integration and MLOps experience.
- 1+ years architecting and building with LangChain, LangGraph, or similar agentic AI frameworks.
- 2+ years working with Google Cloud Platform and its AI/ML services (e.g., Vertex AI).
- 2+ years of experience with Kubernetes workloads and Docker.
- Proficiency in Python plus JavaScript/TypeScript and/or Java; familiarity with a modern front-end framework (Angular, React, or Vue).
- Experience with LLM observability/monitoring tools (e.g., Langfuse).
- Experience with CI/CD tools such as GitHub Actions, Argo CD, or Jenkins.
- Strong database knowledge across SQL (Cloud Spanner, AlloyDB, PostgreSQL, MySQL) and NoSQL (MongoDB, DynamoDB, Firestore).
Nice-to-have / Differentiators
- Hands-on experience with generative AI models such as Gemini, GPT (ChatGPT), Claude, or Llama.
- Demonstrated experience deploying AI agents to production and using AI-powered coding assistants (e.g., GitHub Copilot).
- Experience solving ambiguous, large-scale technical problems and mentoring engineering teams.