Software Architect-Analytics
Job Code: REQ-023379
Sagility
Bangalore
Job Details
Job title: Software Architect-Analytics
About Sagility
Sagility combines industry-leading technology and transformation-driven BPM services with decades of healthcare domain expertise to help clients draw closer to their members. The company optimizes the entire member/patient experience through service offerings for clinical, case management, member engagement, provider solutions, payment integrity, claims cost containment, and analytics. Sagility has more than 25,000 employees across 5 countries.
Job Description:
Role Overview
We are looking for a Data Architect to lead the design and governance of our Enterprise Data Lakehouse. While our primary ecosystem is Azure Databricks, we value architectural expertise across equivalent cloud platforms (Snowflake, BigQuery). You will be responsible for defining how data is structured, secured, and scaled to support BI and AI initiatives.
Key Responsibilities
- Architectural Design: Design and implement Cloud Lakehouse architectures (Medallion pattern: Bronze/Silver/Gold).
- Governance & Security: Lead the implementation of Unity Catalog (or equivalent governed catalogs) to manage metadata, lineage, and fine-grained access control.
- Data Modeling: Create scalable logical and physical data models, ensuring high performance for both batch workloads and real-time streaming (Spark Structured Streaming).
- Strategic Roadmapping: Evaluate and integrate cloud-native services (Azure Data Factory, Key Vault, etc.) to build a cohesive ecosystem.
- Mentorship: Act as the technical North Star for data engineers, ensuring code quality and architectural consistency.
Required Skills & Experience
- Experience: 6–12 years in Data Architecture or Senior Data Engineering.
- Cloud Platforms: Expert-level knowledge of Azure (preferred), AWS, or GCP.
- The "Core" Engine: Deep experience with Databricks/Spark is ideal. However, we highly value candidates with equivalent expertise in Snowflake or Google BigQuery who understand decoupled storage/compute and cloud-native scaling.
- Languages: Proficient in SQL and Python (PySpark).
- Modern Standards: Proven experience with Delta Lake, Parquet, or Iceberg formats.
- Data Governance: Familiarity with modern discovery and security tools (Unity Catalog, Microsoft Purview, or Collibra).
Preferred Qualifications
- Experience migrating legacy data warehouses to a Cloud Lakehouse.
- Certifications: Databricks Certified Data Engineer Professional or Azure Solutions Architect (AZ-305).
- Knowledge of dbt (data build tool) for modular SQL modeling.
Location: Bangalore