Responsibilities:
- Design, develop, test, and implement technical solutions using GCP data technologies and tools
- Develop data solutions in distributed microservices and full stack systems
- Utilize programming languages like Python and Java, and GCP technologies like BigQuery, Dataproc, Dataflow, Cloud SQL, Cloud Functions, Cloud Run, Cloud Composer, Pub/Sub, and APIs
- Lead performance engineering and ensure systems are scalable
- Ensure clarity on non-functional requirements (NFRs) and implement them
- Work with the Client Technical Manager to understand the customer's landscape and IT priorities
Desired Candidate Profile

Technology & Engineering Expertise:
- 8+ years of overall experience implementing data solutions using cloud/on-prem technologies
- At least 3 years of experience in data pipeline development using GCP cloud technologies
- Proficient in data ingestion, storage, and processing using GCP technologies like BigQuery, Dataproc, Dataflow, Cloud SQL, Cloud Functions, Cloud Run, Cloud Composer, Pub/Sub, and APIs
- Experience implementing microservices on GCP
- Knowledge of master data management (MDM)
- Knowledge of data cataloging, data governance, and data security
- Excellent SQL skills
- Must hold a Google Cloud certification
- Experience with different development methodologies (RUP, Scrum, XP)
Soft skills:
- Able to work with a diverse set of stakeholders
- Strong articulation, communication, and presentation skills
- High integrity
- Problem-solving skills and a learning attitude
- Team player