Role Name: Databricks Developer
Core Responsibilities (Data Engineer / Developer)
Data Pipeline Development: Design, build, and maintain scalable ETL/ELT processes using Databricks and Spark.
Spark Optimization: Tune and optimize Spark jobs and data transformations for performance.
Lakehouse Architecture: Implement data reliability, governance, and security using Delta Lake and Unity Catalog.
Workflow Orchestration: Automate data workflows using Databricks Workflows or tools such as Airflow.
Collaboration: Work with analysts and business stakeholders to gather requirements and deliver usable, well-documented datasets.
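As an illustration of the pipeline and Delta Lake work described above, a typical incremental load in Databricks SQL might look like the following sketch (the catalog, schema, table, and column names are hypothetical, not taken from this posting):

```sql
-- Hypothetical incremental load: upsert staged records into a Delta table.
MERGE INTO main.sales.orders AS target
USING main.sales.orders_staging AS source
  ON target.order_id = source.order_id
WHEN MATCHED THEN
  UPDATE SET *
WHEN NOT MATCHED THEN
  INSERT *;

-- Compact the small files produced by frequent incremental writes.
OPTIMIZE main.sales.orders;
```

MERGE-based upserts and periodic OPTIMIZE runs are common patterns behind the "data reliability" and "Spark optimization" responsibilities listed above.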
Key Skills & Technologies
Languages: Python (with PySpark), SQL.
Databricks Features: Delta Lake, Structured Streaming, Delta Live Tables (DLT).
Cloud Platforms: AWS, Azure, or GCP.
DevOps: Git, Terraform, CI/CD practices.
Tools: Databricks Notebooks, Databricks Repos.
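To show how the orchestration and DevOps skills above fit together, a Databricks Workflows job can be declared as JSON (e.g., for the Jobs API or a Terraform/CI pipeline). This is a minimal sketch; the job name, cron schedule, and notebook path are hypothetical:

```json
{
  "name": "daily_orders_pipeline",
  "schedule": {
    "quartz_cron_expression": "0 0 6 * * ?",
    "timezone_id": "UTC"
  },
  "tasks": [
    {
      "task_key": "ingest_orders",
      "notebook_task": {
        "notebook_path": "/Repos/data-eng/pipelines/ingest_orders"
      }
    }
  ]
}
```

Keeping job definitions in version control like this is what ties the Git, Terraform, and CI/CD items above to Databricks Workflows in practice.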
Job Type: Full-time
Pay: Up to $10,000.00 per month
Work Location: In person