

Senior Data Platform Engineer II (Databricks)

FULL TIME
Experience: 3 years
Remote
Open until filled
👤 Posted by Azam Rafique


Job Overview

  • Role: Senior Data Platform Engineer II (Databricks)
  • Department: Technology
  • Team: Engineering
  • Location: Remote, United States
  • Employment Type: Full Time
  • Workplace Type: Remote
  • Posted Timestamp: 1773081238077

About the Role:

As a Senior Data Platform Engineer II, you will architect and manage the high-performance, distributed data environments that power our healthcare analytics. You will move beyond traditional maintenance to ensure our Databricks Lakehouse and Snowflake environments scale indefinitely. You will be responsible for the health, optimization, and security of our data platforms, making complex data accessible and expressive for web applications and AI.

Primary Duties:

  • Develop and implement scalable and performant solutions.
  • Partner, as a peer, with Engineering Managers, Product Managers, and stakeholders throughout Aledade to develop and execute technical roadmaps using Agile processes.
  • Mentor and coach more junior engineers, including through thorough pull-request reviews for other developers, and be receptive to critical feedback on your own work.

Minimum Qualifications:

  • BS/BTech (or higher) in Computer Science, Engineering, or a related field, or equivalent experience.
  • 6+ years of experience as an engineer building and optimizing highly scalable distributed data systems (e.g., Databricks, Spark, or Snowflake).
  • 3+ years of experience working with SQL and data modeling on large multi-table data sets.
  • 3+ years of experience acting as a trusted technical decision-maker in a team setting, solving for short-term and long-term business value.
  • 3+ years of experience coaching other engineers.

Preferred KSAs:

Platform & Infrastructure (the "Databricks/Cloud" core):

  • Databricks & Lakehouse Architecture: Deep expertise in managing Databricks workspaces, including Unity Catalog for data governance, lineage, and fine-grained access control.
  • Infrastructure as Code (IaC): Advanced proficiency with Terraform (or similar) to automate the provisioning and scaling of Databricks clusters, cloud resources (AWS preferred), and networking.
  • Snowflake Proficiency (nice-to-have): Experience managing Snowflake environments, specifically focusing on warehouse cost optimization, security integration, and secure data sharing.
  • Modern Database Internals: In-depth knowledge of distributed systems, including partitioning, liquid clustering/Z-Ordering, sharding, and high-availability strategies for petabyte-scale data.

Performance, Reliability & DevOps:

  • Observability & Optimization: Proven track record in performance monitoring and query tuning for distributed workloads to ensure system reliability and cost-efficiency.
  • Data Engineering Lifecycle: Experience designing and optimizing high-throughput ETL/ELT pipelines and ingestion systems (batch and streaming) using Spark.
  • Deployment & Orchestration: Experience building robust CI/CD pipelines for data infrastructure and deploying services using containerization (Docker, Kubernetes).

Security, Compliance & Domain Knowledge:

  • Sensitive Data Handling: Expertise in building systems that handle protected information, with specific experience in HIPAA and SOX compliance frameworks.
  • Healthcare Data Expertise: Experience navigating health-tech data complexities, such as Electronic Health Records (EHR), clinical data formats (HL7/FHIR), and claims data.
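To make the "partitioning and sharding" expectation above concrete, here is a minimal, purely illustrative sketch of hash-based partitioning in Python — the mechanism distributed data systems use to route each record to exactly one shard. The record fields and shard count are hypothetical and are not part of the role's actual stack:

```python
# Minimal sketch of hash-based partitioning, as used by distributed
# data systems to spread rows across shards. Record layout and shard
# count are hypothetical, for illustration only.
import hashlib

NUM_SHARDS = 4  # hypothetical shard count


def shard_for(key: str, num_shards: int = NUM_SHARDS) -> int:
    """Map a partition key to a shard index via a stable hash."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards


records = [
    {"patient_id": "p-001", "claim": 120.0},
    {"patient_id": "p-002", "claim": 80.5},
    {"patient_id": "p-003", "claim": 42.0},
]

# Route each record to its shard.
shards = {i: [] for i in range(NUM_SHARDS)}
for rec in records:
    shards[shard_for(rec["patient_id"])].append(rec)

# Every record lands on exactly one shard, and the same key always
# routes to the same shard -- the property that lets query planners
# prune partitions instead of scanning everything.
assert sum(len(v) for v in shards.values()) == len(records)
assert shard_for("p-001") == shard_for("p-001")
```

The same idea underlies the list's clustering bullets: Z-Ordering and liquid clustering co-locate related keys so queries touch fewer files, just as the stable hash here confines a key to one shard.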

Physical Requirements:

  • Sitting for prolonged periods of time. Extensive use of computers and keyboard. Occasional walking and lifting may be required.

How to Apply:

Apply through the official Aledade Lever application flow.
