Design, build, and operationalize large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party services – Cloud Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.
Analyze, re-architect, and re-platform on-premises data warehouses to data platforms on GCP using GCP and third-party services.
Design and build production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.
Architect and implement next-generation data and analytics platforms on GCP.
Design and implement data engineering, ingestion, and curation functions on GCP using GCP-native services or custom programming.
Work with recommendation engines, data pipelines, and distributed machine learning, applying data analytics and data visualization techniques and software.
Perform detailed assessments of current-state data platforms and create an appropriate transition path to GCP.
Design and implement solutions at production scale.
Required Skills
Familiarity with the software development lifecycle and agile methodologies.
Good knowledge of Java, J2EE, Spring, and microservices.
Experience with cloud platforms such as AWS and Google Cloud Platform.
Proven ability to work effectively in a fast-paced, interdisciplinary, and deadline-driven environment.
Proficiency in financial modeling and quantitative analysis.
Strong problem-solving and troubleshooting skills.
Required Experience
Minimum 7 years of experience designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party services – Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.
1-3 years of experience managing applications and operations in a Google Cloud Platform environment.
Minimum 2 years of hands-on experience analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on GCP using GCP and third-party services.
Minimum 2 years of experience designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.
Minimum 2 years of experience architecting and implementing next-generation data and analytics platforms on GCP, including work in an agile environment.
Experience in Java, J2EE, Spring, and microservices development.
Education Requirements
Bachelor’s degree in Computer Science, Computer Engineering, or a closely related field.