Job summary
A hands-on data application developer with strong experience in Databricks, Snowflake, Airflow, Kubernetes, Kafka, lakehouses, Delta Lake, and object stores such as AWS S3 and Azure Blob Storage.
Job Responsibilities
• Developing reusable data-processing components and libraries based on the data mesh design pattern
• Designing and implementing a modern, highly responsive factory approach, and inner-sourcing the components for enterprise use
• Translating designs and wireframes into high-quality code
• Extensive experience building data integrations and warehouses in the cloud (Azure and AWS)
• Strong hands-on experience in Python and Java
• Extremely strong SQL skills across OLAP and OLTP technologies
• Ability to build data models using Data Vault and dimensional modeling
• Understanding and use of CI/CD on AWS/Azure and Kubernetes
• Willingness to learn new tools and techniques across Databricks, Snowflake, NoSQL databases, data governance, and data quality
Qualifications
Any graduate