
Lead Data Engineer – AI Systems (Snowflake / dbt / LLM)

Rockwoods – Dallas, Texas, US

Title: Lead Data Engineer – AI Systems (Snowflake / dbt / LLM)

Location: Dallas, TX (Hybrid)

Contract

US Citizens Only

About the Role

Rockwoods is hiring a Lead Data Engineer for a high-visibility engagement with an insurance client.

We are looking for someone who has genuinely worked on modern cloud data platforms and supported AI/LLM-driven initiatives in production environments.

This is not a traditional ETL or reporting role.

We need an engineer who understands how scalable data systems power AI applications — including LLM integrations, semantic search, vector-based retrieval, AI-ready data modeling, and production-grade pipelines.

You should be someone who:

  • enjoys solving messy real-world data problems
  • can build and optimize systems hands-on
  • understands performance, scale, and reliability
  • has worked beyond proof-of-concepts and actually deployed solutions

This is a strong opportunity for senior engineers who want ownership, technical influence, and meaningful architecture work.

Responsibilities

  • Build and optimize scalable Python + Snowflake + dbt pipelines supporting analytics and AI use cases
  • Design modern data architectures for LLM workflows, RAG patterns, semantic search, and AI-enabled applications
  • Develop API and event-driven ingestion frameworks for structured and unstructured data
  • Improve platform reliability, observability, data quality, and performance
  • Prepare high-quality datasets for AI/ML inference and downstream applications
  • Tune Snowflake performance and optimize transformation efficiency/costs
  • Partner closely with engineering and business teams to solve operational data challenges
  • Help establish scalable engineering standards and modern data platform best practices

Required Experience

  • 7+ years of hands-on Data Engineering experience
  • Strong expertise in Python, Snowflake, SQL, and dbt
  • Experience building production-grade pipelines and modern cloud data platforms
  • Experience supporting AI/LLM-related workflows in real environments
  • Hands-on experience with OpenAI, Anthropic, embeddings, vector search, semantic retrieval, or RAG-style architectures
  • Strong orchestration experience with Airflow or similar tools
  • Experience handling imperfect enterprise-scale data
  • Strong understanding of data modeling, optimization, transformation strategies, and scalability
  • Ability to work independently in a fast-moving engineering environment

Strong Plus

  • Insurance domain experience (Claims, Policy, Billing, Underwriting, etc.)
  • Experience with vector databases or AI search architectures
  • Exposure to MLOps or AI deployment workflows
  • Experience designing reusable enterprise data frameworks

What This Role Is NOT

This is NOT:

  • a junior ETL developer role
  • a reporting/dashboard-only role
  • an AI “prompt engineering” role
  • a heavily bureaucratic environment with layers of approvals

We are looking for builders and problem-solvers.

Why Engineers Like This Role

  • Modern cloud + AI-focused tech stack
  • High ownership and technical influence
  • Direct impact on real business initiatives
  • Strong engineering culture
  • Fast interview process
  • Less process, more execution
  • Opportunity to shape architecture decisions early

Important

Please apply only if you have hands-on experience with modern Data Engineering AND practical AI/LLM-related implementations in production environments.

Candidates with only reporting/dashboard backgrounds or purely academic AI exposure will likely not be a fit.



Interested in this role?

Apply now and take the next step in your career.
