Position Description:
As an MLOps Engineer at the Company, you will be an integral part of the Quality Analytics Team within the GDIA organization. Your role will involve collaborating with the modeling team to transition new models from proof of concept to production. Additionally, you will be responsible for establishing a robust back-end infrastructure to deploy our machine learning models in real-time streaming contexts. We are seeking a talented individual who possesses a combination of technical skills, industry experience, and strong communication abilities.
Responsibilities:
- Collaborate with the modeling team to facilitate the smooth transition of new models from proof of concept to production, ensuring scalability, reliability, and efficiency.
- Create a back-end infrastructure that supports the deployment of our machine learning models in real-time streaming contexts.
- Design and develop ETL pipelines to ensure seamless data integration and processing for model training and inference.
- Work closely with the data engineering team to optimize and streamline data pipelines and workflows.
- Stay updated with the latest advancements in MLOps and implement best practices to enhance model deployment and monitoring.
- Collaborate with cross-functional teams to ensure the successful integration of ML models into production systems.
Skills Required:
- 1+ year of experience working with Google Cloud Platform (GCP) services, leveraging its capabilities for ML model deployment.
- 2+ years of experience in Python programming, including experience with libraries such as TensorFlow, PyTorch, or scikit-learn.
- 2+ years of experience in Java programming, including familiarity with streaming platforms such as Apache Kafka.
Experience Required:
- Experience working with Kubernetes in an industry context, managing containerized applications and orchestrating deployments.
- Proficiency in writing and maintaining ETL pipelines, extracting data from various sources and transforming it for model training and inference.
- Familiarity with Apache Beam for building data processing pipelines.
Experience Preferred:
Previous experience working in a large, data-driven organization, with exposure to complex analytics workflows and data systems.
Education Required:
Bachelor's degree in computer science or a related field.
Education Preferred:
Master's degree in computer science or a related field.
Additional Information:
Strong communication skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.