Infometry - MLOps Engineer - Python/PySpark

Min Experience

7 years

Location

Bengaluru, Karnataka, India

Job Type

Full-time

About the job

This job is sourced from a job board.

About the role

Location: Remote

Role: MLOps Engineer

Experience: 7-10 years

Job Summary

We are seeking a highly skilled and motivated MLOps Engineer to join our team. The ideal candidate will be responsible for streamlining the deployment, scaling, and monitoring of machine learning models in production, and should have a strong background in Python, PySpark, cloud platforms (AWS and Azure), and container orchestration with Kubernetes. You will bridge the gap between our AI team and our data platform engineers, ensuring the seamless integration of ML models into scalable and reliable systems. In this role, you will design and implement robust MLOps pipelines, automate model training and deployment processes, and monitor the performance of ML systems.

Role Responsibilities

  • Design and implement end-to-end machine learning infrastructure and workflows in production environments.
  • Collaborate with data scientists, data engineers, and software engineers to streamline ML model development and deployment.
  • Develop robust and scalable CI/CD pipelines for continuous model integration, testing, and deployment.
  • Manage and optimize distributed data processing pipelines using PySpark.
  • Deploy and maintain ML models on AWS (e.g., SageMaker, EKS, Lambda) and Azure (e.g., ML Studio, AKS) environments.
  • Use Kubernetes to manage containerized applications and ensure scalability and high availability.
  • Monitor model performance, data drift, and system reliability in production.
  • Implement model versioning, reproducibility, and governance using tools like MLflow, DVC, or Kubeflow.
  • Drive automation and best practices in MLOps across the organization.

Required Qualifications

  • Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
  • 7+ years of industry experience with a focus on MLOps, machine learning systems, or data engineering.
  • Strong programming experience in Python with knowledge of ML libraries (scikit-learn, pandas, etc.).
  • Advanced skills in PySpark for distributed data processing.
  • Hands-on experience with both AWS and Azure cloud platforms, including services for ML and data workflows.
  • Proficient in Docker and Kubernetes for containerized deployment and orchestration.
  • Experience with CI/CD tools (e.g., Jenkins, GitHub Actions, Azure DevOps).
  • Deep understanding of software development best practices, Agile methodologies, and DevOps culture.

(ref:hirist.tech)

About the company

Infometry

Skills

Python
PySpark
AWS
Azure
Kubernetes
CI/CD
DevOps