Careers at Sagility

Explore meaningful roles that let you make an impact in healthcare while growing your career with purpose, innovation, and global opportunities.

Software Architect-Analytics

Partially Remote/Hybrid

Job Code: REQ-023379

Sagility

Bangalore

Job Details

Job title

Software Architect-Analytics

About Sagility

Sagility combines industry-leading technology and transformation-driven BPM services with decades of healthcare domain expertise to help clients draw closer to their members. The company optimizes the entire member/patient experience through service offerings for clinical, case management, member engagement, provider solutions, payment integrity, claims cost containment, and analytics. Sagility has more than 25,000 employees across 5 countries.

As a Data Architect within our Enterprise Analytics COE, you will lead the strategic design and governance of our high-performance Cloud Lakehouse. We are seeking a visionary technical leader to architect scalable, Medallion-based data structures on Azure Databricks that bridge the gap between complex raw data and AI-ready insights. Whether your expertise lies in Databricks, Snowflake, or BigQuery, you will be responsible for defining the technical roadmap, implementing robust governance via Unity Catalog, and mentoring engineering teams to build resilient, automated pipelines. This is a high-impact role centered on driving architectural excellence and data reliability across a diverse portfolio of client engagements.

Job Description:

Role Overview

We are looking for a Data Architect to lead the design and governance of our Enterprise Data Lakehouse. While our primary ecosystem is Azure Databricks, we value architectural expertise across equivalent cloud platforms (Snowflake, BigQuery). You will be responsible for defining how data is structured, secured, and scaled to support BI and AI initiatives.

Key Responsibilities

  • Architectural Design: Design and implement Cloud Lakehouse architectures (Medallion pattern: Bronze/Silver/Gold).
  • Governance & Security: Lead the implementation of Unity Catalog (or equivalent governed catalogs) to manage metadata, lineage, and fine-grained access control.
  • Data Modeling: Create scalable physical and logical data models, ensuring high performance for both batch and real-time streaming (Structured Streaming).
  • Strategic Roadmapping: Evaluate and integrate cloud-native services (Azure Data Factory, Key Vault, etc.) to build a cohesive ecosystem.
  • Mentorship: Act as the technical North Star for data engineers, ensuring code quality and architectural consistency.
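To illustrate the Medallion pattern named above, here is a minimal, language-agnostic sketch of the Bronze/Silver/Gold refinement flow. Plain Python lists and dicts stand in for what would be PySpark DataFrames and Delta tables on Databricks; the record fields (`member_id`, `claim_amount`) are hypothetical examples, not part of the role description.

```python
# Medallion pattern sketch: Bronze (raw) -> Silver (validated) -> Gold (aggregated).
# Field names are illustrative only.

# Bronze layer: records ingested as-is, including malformed rows.
bronze = [
    {"member_id": "M1", "claim_amount": "120.50"},
    {"member_id": "M2", "claim_amount": "80.00"},
    {"member_id": "M1", "claim_amount": "bad"},  # invalid row survives ingestion
]

def to_silver(rows):
    """Silver layer: typed, validated records; invalid rows are dropped."""
    silver = []
    for row in rows:
        try:
            silver.append({"member_id": row["member_id"],
                           "claim_amount": float(row["claim_amount"])})
        except ValueError:
            # A real pipeline would quarantine these for inspection.
            continue
    return silver

def to_gold(rows):
    """Gold layer: business-level aggregate (total claims per member)."""
    totals = {}
    for row in rows:
        totals[row["member_id"]] = totals.get(row["member_id"], 0.0) + row["claim_amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # {'M1': 120.5, 'M2': 80.0}
```

In a production Lakehouse each layer would be a governed Delta table registered in Unity Catalog, with the Silver step enforcing schema and quality rules rather than a simple `try/except`.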

Required Skills & Experience

  • Experience: 6–12 years in Data Architecture or Senior Data Engineering.
  • Cloud Platforms: Expert-level knowledge of Azure (preferred), AWS, or GCP.
  • The "Core" Engine: Deep experience with Databricks/Spark is ideal. However, we highly value candidates with equivalent expertise in Snowflake or Google BigQuery who understand decoupled storage/compute and cloud-native scaling.
  • Languages: Proficient in SQL and Python (PySpark).
  • Modern Standards: Proven experience with Delta Lake, Parquet, or Iceberg formats.
  • Data Governance: Familiarity with modern discovery and security tools (Unity Catalog, Microsoft Purview, or Collibra).

Preferred Qualifications

  • Experience migrating legacy data warehouses to a Cloud Lakehouse.
  • Certifications: Databricks Certified Data Engineer Professional or Azure Solutions Architect (AZ-305).
  • Knowledge of dbt (data build tool) for modular SQL modeling.

Location:

Bangalore, India

Join our team; we look forward to talking to you!

Sagility is an Equal Opportunity Employer.
