Key Responsibilities:
• Design and maintain cloud-based data warehouse and data lake infrastructure on AWS
• Develop ETL processes for integrating retail datasets into a unified data model
• Create QuickSight dashboards for retailer insights
• Enable self-serve analytics and support ad-hoc reporting needs
• Implement real-time monitoring and alerting based on operational data
• Manage multiple projects, collaborating across teams to deliver results
• Mentor junior engineers and contribute to team growth
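As a rough illustration of the ETL responsibility above, the sketch below normalizes retail records from two hypothetical source systems into one unified model. All field names ("pos", "ecom", "item_code", "net_amount", etc.) are made up for illustration; in practice this logic would typically run inside an AWS Glue or EMR/Spark job.

```python
# Hypothetical ETL transform step: mapping source-specific retail
# records onto a unified data model. Plain Python (no Spark) is used
# only to keep the example self-contained.
from datetime import date


def to_unified(record: dict, source: str) -> dict:
    """Map a source-specific record onto the unified schema (assumed names)."""
    if source == "pos":  # point-of-sale export
        return {
            "sku": record["item_code"],
            "sold_on": date.fromisoformat(record["txn_date"]),
            "revenue": round(record["amount_cents"] / 100, 2),
            "channel": "store",
        }
    if source == "ecom":  # e-commerce export
        return {
            "sku": record["sku"],
            "sold_on": date.fromisoformat(record["order_date"]),
            "revenue": float(record["net_amount"]),
            "channel": "online",
        }
    raise ValueError(f"unknown source: {source}")


rows = [
    to_unified({"item_code": "A-1", "txn_date": "2024-05-01",
                "amount_cents": 1999}, "pos"),
    to_unified({"sku": "A-1", "order_date": "2024-05-02",
                "net_amount": "24.50"}, "ecom"),
]
total = sum(r["revenue"] for r in rows)
print(round(total, 2))  # 44.49
```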
Key Qualifications:
• 5+ years of data engineering experience
• Expert-level Python skills
• Comprehensive expertise in AWS Big Data tools: S3, Redshift, QuickSight, Glue, Lake Formation, EMR/Spark, Kinesis, Firehose, Kafka, Athena, Lambda, Step Functions, and IAM roles and permissions
• Proficiency with non-relational databases and data stores
• Experience with big data file formats: Parquet, Avro, ORC
• Understanding of distributed systems for data storage and computing
• Proven ability to extract actionable insights from large datasets
• Experience in technical leadership and mentoring
• Expertise in data security best practices, including encryption, governance, and API authorization (OWASP Top 10, OAuth 2.0, JWT)
• Proficiency in comprehensive software testing strategies (unit, integration, end-to-end)
Education: Any graduate (bachelor's degree in any discipline)
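To make the API-authorization qualification concrete: an HS256 JWT (RFC 7519) is a base64url-encoded header and payload signed with HMAC-SHA256. The stdlib-only sketch below shows signing and constant-time verification; the key and claims are invented for illustration, and production code should use a vetted library (e.g. PyJWT) with expiry and header validation.

```python
# Minimal illustration of the HS256 signing scheme used by JWTs.
# Not production code: no expiry check, no header/algorithm validation.
import base64
import hashlib
import hmac
import json


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def sign(payload: dict, key: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(key, signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"


def verify(token: str, key: bytes) -> bool:
    header, body, sig = token.rsplit(".", 2)  # header.payload.signature
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    # compare_digest is constant-time, preventing timing side channels.
    return hmac.compare_digest(sig, expected)


key = b"demo-secret"                # made-up key, for illustration only
token = sign({"sub": "retailer-42"}, key)
print(verify(token, key))           # True
print(verify(token + "x", key))     # False: tampered signature
```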