Job Title

DataOps / DevOps Engineer (Python | AWS | Snowflake)

Experience: 5–10 years

Employment Type: Full-time

Location: Vadodara

About the Role

We are looking for a DataOps/DevOps Engineer to streamline and automate data platform operations across AWS and Snowflake environments. The ideal candidate will have strong experience in Python scripting, CI/CD automation, infrastructure-as-code, and data pipeline reliability. You’ll collaborate with data engineers and analysts to ensure data products are deployed, monitored, and scaled efficiently.


Key Responsibilities

DataOps Automation

  • Build and maintain CI/CD pipelines for data ingestion, transformation, and analytics workloads.
  • Automate ETL/ELT job deployments using tools like GitHub Actions, Jenkins, or AWS CodePipeline.
  • Develop Python utilities for orchestration, metadata tracking, and operational workflows.
  • Implement data quality and validation checks as part of deployment workflows (see the sketch after this list).
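
For illustration, a minimal sketch of the kind of deployment-gate quality check this role owns, assuming snowflake-connector-python; the ANALYTICS.PUBLIC.ORDERS table, LOADED_AT column, warehouse name, and thresholds are hypothetical placeholders:

```python
# Minimal post-deployment data-quality gate (illustrative).
# Table/column names and thresholds below are placeholders.
import os
import sys

import snowflake.connector

CHECKS = {
    # check name -> (SQL returning one number, predicate it must satisfy)
    "orders_not_empty": (
        "SELECT COUNT(*) FROM ANALYTICS.PUBLIC.ORDERS",
        lambda n: n > 0,
    ),
    "no_null_order_ids": (
        "SELECT COUNT(*) FROM ANALYTICS.PUBLIC.ORDERS WHERE ORDER_ID IS NULL",
        lambda n: n == 0,
    ),
    "loaded_within_24h": (
        "SELECT DATEDIFF('hour', MAX(LOADED_AT), CURRENT_TIMESTAMP()) "
        "FROM ANALYTICS.PUBLIC.ORDERS",
        lambda hours: hours <= 24,
    ),
}

def main() -> int:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="COMPUTE_WH",  # placeholder warehouse
    )
    failed = []
    try:
        cur = conn.cursor()
        for name, (sql, ok) in CHECKS.items():
            value = cur.execute(sql).fetchone()[0]
            print(f"{name}: {value} -> {'PASS' if ok(value) else 'FAIL'}")
            if not ok(value):
                failed.append(name)
    finally:
        conn.close()
    return 1 if failed else 0  # non-zero exit fails the CI job

if __name__ == "__main__":
    sys.exit(main())
```

Wired into a CI/CD stage, the non-zero exit code blocks the release until every check passes.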

DevOps & Cloud Infrastructure

  • Manage and optimize AWS infrastructure for data workloads — EC2, Lambda, S3, Step Functions, Glue, ECS/EKS, and IAM.
  • Design and manage infrastructure-as-code (IaC) using Terraform or AWS CloudFormation.
  • Set up monitoring, alerting, and logging via CloudWatch, Prometheus, or Datadog (see the monitoring sketch after this list).
  • Ensure high availability, backup, and DR strategies for Snowflake and data pipelines.
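
As a sketch of the monitoring bullet above, the snippet below publishes a custom pipeline metric and attaches an alarm with boto3; the namespace, pipeline name, region, and SNS topic ARN are assumptions, not a prescribed setup:

```python
# Illustrative pipeline-health monitoring via CloudWatch (boto3).
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

# Publish a custom "rows loaded" metric from a pipeline run.
cloudwatch.put_metric_data(
    Namespace="DataPlatform/Pipelines",  # hypothetical namespace
    MetricData=[{
        "MetricName": "RowsLoaded",
        "Dimensions": [{"Name": "Pipeline", "Value": "orders_ingest"}],
        "Value": 125_000,
        "Unit": "Count",
    }],
)

# Alarm when nothing was loaded across two 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="orders-ingest-no-data",
    Namespace="DataPlatform/Pipelines",
    MetricName="RowsLoaded",
    Dimensions=[{"Name": "Pipeline", "Value": "orders_ingest"}],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=2,
    Threshold=0,
    ComparisonOperator="LessThanOrEqualToThreshold",
    TreatMissingData="breaching",  # missing data counts as a failure
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:data-alerts"],  # placeholder ARN
)
```

Setting TreatMissingData to "breaching" means a silent pipeline (no metric emitted at all) raises the alarm instead of going unnoticed.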

Snowflake & Data Platform

  • Automate Snowflake warehouse management, user provisioning, and role-based access using Python or Terraform (see the provisioning sketch after this list).
  • Manage Snowpipe, Streams, and Tasks for continuous data ingestion.
  • Collaborate with data engineers on performance tuning and query optimization.
  • Build integrations between Snowflake, AWS (S3, Lambda, Kinesis), and BI tools (e.g., Superset, Tableau).
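
A minimal sketch of scripted warehouse and RBAC provisioning, assuming snowflake-connector-python; all object names (REPORTING_WH, ANALYST_ROLE, JDOE) are placeholders, and a real setup would drive these statements from config or Terraform rather than hard-coded strings:

```python
# Illustrative Snowflake provisioning helper (warehouse + role grant).
import os

import snowflake.connector

DDL = [
    # IF NOT EXISTS keeps re-runs idempotent and safe in CI.
    """CREATE WAREHOUSE IF NOT EXISTS REPORTING_WH
         WAREHOUSE_SIZE = 'XSMALL'
         AUTO_SUSPEND = 60
         AUTO_RESUME = TRUE""",
    "CREATE ROLE IF NOT EXISTS ANALYST_ROLE",
    "GRANT USAGE ON WAREHOUSE REPORTING_WH TO ROLE ANALYST_ROLE",
    "GRANT ROLE ANALYST_ROLE TO USER JDOE",  # placeholder user
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="SECURITYADMIN",  # a role with grant privileges
)
cur = conn.cursor()
try:
    for stmt in DDL:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```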


Collaboration & Best Practices

  • Work closely with data engineering and analytics teams to improve delivery efficiency and reliability.
  • Define and enforce data versioning, testing, and release standards.
  • Contribute to observability and incident response playbooks.
  • Promote a culture of automation, testing, and continuous improvement.


Required Skills

  • Programming: Python (must have); Bash scripting
  • Cloud: AWS (S3, Lambda, Glue, ECS, IAM, CloudFormation)
  • Data Platform: Snowflake (SQL, Snowpipe, Streams, Tasks, RBAC)
  • DevOps: CI/CD pipelines (GitHub Actions, Jenkins, or AWS CodePipeline)
  • IaC: Terraform or AWS CloudFormation
  • Version Control: Git, GitHub, GitLab
  • Monitoring: CloudWatch, Grafana, or Datadog
  • OS: Linux / Unix


Nice to Have

  • Experience with Airflow or dbt for pipeline orchestration, or Spark/Flink for large-scale data processing
  • Exposure to Kubernetes (EKS) for scalable data workloads
  • Knowledge of cost optimization for Snowflake and AWS resources
  • Familiarity with data governance and security frameworks (e.g., AWS KMS, Secrets Manager)