
Senior Data Architect

LanceSoft Seattle, Washington, US

About the Role

W2 only; no C2C.


Title: Senior Data Architect

Pay Range: $60-$85/hr on W2

Duration: 12 months (with possible extension or conversion to FTE for the right fit)

Location: 100% Onsite in Seattle, WA

In this role, you will leverage the Unified Data Model (UDM) to design and govern data models, process layers, transformations, routing, and schema evolution for a data lake built on Azure Databricks and Microsoft Azure. You will map incoming and changing source data to the UDM, define and manage data transformation rules, and collaborate closely with distributed data engineering teams (including team members in India) to implement robust, scalable data pipelines and governance.

Position Responsibilities:

· Design, document, and maintain the Unified Data Model (UDM) artifacts and mappings.

· Model process layers in the data lake, defining transformation responsibilities and lineage across layers.

· Define and enforce rules for data processing, including routing tables, validation rules, enrichment logic, and error-handling policies.

· Author and maintain schema definitions, attribute dictionaries, and change management processes for schema evolution.

· Translate business and source system requirements into data transformation specifications to be implemented in Azure Databricks and downstream systems.

· Collaborate with data engineers to design performant transformations, partitioning, and storage strategies in the data lake.

· Review and approve data pipeline designs, ensuring adherence to UDM, governance, and security policies.

· Work with DevOps and engineering teams to operationalize CI/CD for Databricks notebooks, jobs, and infrastructure-as-code.

· Provide technical leadership and mentorship to data engineering teams, including remote collaboration with engineers located in India; coordinate design, implementation, and delivery across time zones (may require off-hour work).

· Establish data quality metrics, monitoring, and remediation guidance; ensure traceability and lineage from source to consumption.

· Participate in architecture and design reviews, code reviews, and agile ceremonies; drive best practices for data modeling and transformation.

· Communicate architecture decisions and trade-offs to stakeholders, product owners, and engineering teams.

Project and Day-to-Day Activities:

Designing, improving, communicating, and managing data and data models for a training analytics application.

Top 3-5 Technical/Software Skills Needed for This Role:

Unified Data Model (UDM)

Data lakehouse concepts

Business intelligence and analytics understanding

Basic Qualifications (Required Skill/Experience):

· 9+ years of experience in data architecture, data modeling, or related data engineering roles.

· Proven experience designing and implementing Unified Data Models (UDM).

· Strong expertise in Structured Query Language (SQL).

· Strong expertise in data modeling across multiple process layers (raw/ingest, transformed, curated) and defining transformation logic.

· Deep understanding of data governance, data lineage, metadata management, and data quality concepts.

· Demonstrated experience with Databricks and building/architecting data lake solutions.

· Experience defining schema evolution processes, new attribute definitions, and backward-compatible changes.

· Experience authoring route tables, processing rules, and data routing/ingestion patterns.

· Experience collaborating with geographically distributed engineering teams and willingness to work off hours to coordinate with teams in India.

· Strong communication, documentation, and stakeholder engagement skills.

Preferred Qualifications (Desired Skills/Experience):

· Prior experience in aviation, training analytics, or related operational data domains.

· Experience with Azure Data Factory, Delta Lake, Unity Catalog, or equivalent data governance tools.

· Familiarity with infrastructure-as-code and CI/CD practices for data pipelines (Terraform, Azure DevOps, GitOps).

· Knowledge of streaming ingestion patterns, Kafka/Event Hubs, and near-real-time processing.

· Experience with metadata platforms and data catalog tools (e.g., Purview, Alation).

· Experience with containerization and orchestration (Docker, Kubernetes) is a plus.

· Bachelor’s or advanced degree in Computer Science, Information Systems, Engineering, or related field.

Required Skills

Azure Databricks, Unified Data Model (UDM), data governance, schema evolution, data pipelines

Keywords

Senior Data Architect, Azure Databricks, UDM

Interested in this role?

Apply now and take the next step in your career.