Designing, developing, testing, documenting, deploying, and monitoring data engineering solutions to support high-volume, fault-tolerant data pipelines using distributed cloud technologies and Python. Building large-scale batch and real-time cloud-based distributed data systems that provide low-latency delivery of high-quality data for machine learning systems. Optimizing and integrating distributed data architectures to support the next generation of products and data initiatives. Designing, developing, and managing automated workflows utilizing various technologies (Python, GIS, SQL, Azure Cloud, Alteryx, Microsoft services, Snowflake, and more) to minimize human error. Work under supervision. Travel and/or relocation to unanticipated client sites throughout the USA is required.
Bachelor’s degree in Computer Science, IT, IS, Engineering (any), Business, Statistics, or a closely related field, with twelve (12) months of experience in the job offered or as an IT Consultant, Analyst, Programmer, Engineer, Developer, or a related occupation.
Twelve (12) months of experience working with Alteryx, Apache Spark, Azure Cloud, Databricks, Terraform, Kubernetes, Snowflake, DBT Analytics, Linux, SQL, and Python is required. Travel and/or relocation to unanticipated client sites within the USA is required. International travel is not required. The frequency of travel is currently unknown, as it depends on client and project requirements that cannot be anticipated at this time. The employer provides information technology services to various clients in the USA, and implementing projects will therefore require such travel.