Job Description:
Design and implement robust, scalable integrations between Databricks and key enterprise business systems (NetSuite, Workday, Salesforce)
Translate business requirements into technical specifications and integration workflows
Develop and maintain automated data pipelines using a combination of low-code tools and custom AWS solutions (Lambda, Step Functions, S3, CloudWatch, VPCs, IAM)
Own the provisioning and management of infrastructure for integrations by applying Infrastructure as Code principles with Terraform
Mentor engineers and champion best practices in software engineering, source control (Gitflow), and DevOps workflows within GitHub repositories
Collaborate closely with data scientists, product managers, and stakeholders to ensure integrations deliver transformative business value
Proactively monitor, troubleshoot, and ensure reliability and observability of integration solutions across distributed systems
Job Requirements:
Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent professional experience
6+ years of hands-on experience in software engineering
Hands-on expertise with Terraform and modern DevOps practices, including CI/CD and infrastructure lifecycle automation
Proficiency in Python or Node.js
AWS experience: Lambda, Step Functions, S3, CloudWatch, VPCs, IAM
Hands-on experience with Databricks, Snowflake, or similar for data warehousing, modeling, or pipeline optimization
Experience with modern source control workflows (Gitflow), using GitHub


