JD:
• Experience in AWS system and network architecture design, with specific focus on AWS SageMaker and AWS ECS
• Experience developing and maintaining ML systems built with open source tools
• Experience developing with containers and Kubernetes in cloud computing environments
• Experience with one or more data-oriented workflow orchestration frameworks (Kubeflow, Airflow, Argo)
• Design the data pipelines and engineering infrastructure to support our clients’ enterprise machine learning systems at scale
• Develop and deploy scalable tools and services for our clients to handle machine learning training and inference
• Support model development, with an emphasis on auditability, versioning, and data security
• Experience with data security and privacy solutions such as Denodo, Protegrity, and synthetic data generation
• Ability to develop applications using Python and deploy to AWS Lambda and API Gateway
• Ability to develop Jenkins pipelines using Groovy scripting
• Good understanding of testing frameworks such as pytest
• Ability to work with AWS services like S3, DynamoDB, Glue, Redshift and RDS
• Proficient understanding of Git and version control systems
• Familiarity with continuous integration and continuous deployment
• Ability to develop Terraform modules to deploy standard infrastructure
• Ability to develop deployment pipelines using Jenkins and XL Release
• Experience using Python (boto3) to automate cloud operations
• Experience in documenting technical solutions and solution diagrams
• Good understanding of how simple Python applications can be deployed as Docker containers
• Experience creating workflows using AWS Step Functions
• Ability to create Docker images containing custom Python libraries
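To illustrate the Python testing skill listed above, a minimal sketch of a unit-testable helper with a pytest-style test follows. The function `tag_for_glacier` and its object-record shape are hypothetical examples, not part of this posting:

```python
def tag_for_glacier(objects, min_size_bytes=1024 * 1024):
    # Pure helper: return keys of S3-style object records (dicts with "Key"
    # and "Size") that are large enough to archive. No AWS calls are made,
    # so the logic is unit-testable without credentials.
    return [o["Key"] for o in objects if o.get("Size", 0) >= min_size_bytes]


def test_tag_for_glacier():
    # pytest discovers test_* functions automatically; run with: pytest <file>.py
    objs = [
        {"Key": "a.log", "Size": 10},
        {"Key": "b.parquet", "Size": 5_000_000},
    ]
    assert tag_for_glacier(objs) == ["b.parquet"]
```

Keeping business logic free of AWS client calls like this is what makes the "auditability" and testing requirements above practical to meet.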
Any Graduate
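As an illustration of the Python / AWS Lambda / API Gateway requirement, a minimal handler for an API Gateway Lambda proxy integration might look like the following. The greeting logic and parameter name are purely illustrative assumptions; the event/response shapes are the standard proxy format:

```python
import json


def lambda_handler(event, context):
    # API Gateway (Lambda proxy integration) passes query parameters in
    # event["queryStringParameters"]; the key is None when no parameters are sent.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    # A proxy integration response must provide statusCode, headers,
    # and a string body.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }


# Local invocation with a sample proxy event (context is unused here):
response = lambda_handler({"queryStringParameters": {"name": "dev"}}, None)
```

Because the handler is a plain function of a dict, it can be exercised locally like this before being deployed behind API Gateway.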