Job Responsibilities:
- Assist in the presales and client problem-definition phases of an Azure data engagement. Work closely with presales engineers to design and scope a best-in-class Azure data solution that satisfies client requirements.
- Handle multiple client engagements at different stages of the implementation lifecycle.
- Provide hands-on guidance, mentoring, and validation for project teams during implementation of client engagements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Required Skills:
- Demonstrated work experience with handling streaming/time-series data using AWS Kinesis/AWS-managed Kafka
- ETL development experience in SQL, Python, and Java/Scala
- Experience implementing a self-service data layer using data mesh, data fabric, or other architectures for data sharing and governance
- Familiarity with Lakehouse architectures
- Experience with dashboarding solutions such as Tableau, Qlik, or similar tools is a plus.
- Excellent verbal and written communication skills to effectively interact with both technical and executive stakeholders on client engagements.
Additional Desired Skills:
- Working experience with Snowflake or Databricks is a plus
- Experience with the Azure and GCP data stacks is a plus
Education and Experience:
- Bachelor's degree or equivalent experience and/or military experience
- 10+ years of overall experience in Data Engineering
- 10+ years of advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database systems
- 5+ years of strong work experience with the AWS data analytics stack - EMR, Redshift, Kinesis/managed Kafka