Description

Key Responsibilities:

1. Data Engineering (AWS Glue & AWS Services):

• Design, develop, and optimize ETL pipelines using AWS Glue (PySpark); a brief sketch follows this list.

• Manage and transform structured and unstructured data from multiple sources into AWS S3, Redshift, or Snowflake.

• Work with AWS Lambda, S3, Athena, and Redshift for data orchestration.

• Implement data lake and data warehouse solutions in AWS.
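
To illustrate the kind of Glue work described above, the following is a minimal PySpark sketch of such an ETL job, not a prescribed implementation; the catalog database, table, and S3 path are placeholders:

# Minimal AWS Glue (PySpark) ETL sketch: read a Glue Data Catalog table,
# rename/cast a few columns, and write Parquet to S3.
# Database, table, and bucket names are placeholders.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_events"  # placeholder catalog entries
)
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[  # (source column, source type, target column, target type)
        ("event_id", "string", "event_id", "string"),
        ("event_ts", "string", "event_timestamp", "timestamp"),
        ("payload", "string", "payload", "string"),
    ],
)
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/events/"},  # placeholder
    format="parquet",
)
job.commit()

A job like this would typically be scheduled through Glue triggers or workflows and pointed at the real catalog entries and buckets.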

2. Infrastructure as Code (Terraform & AWS Services):

• Design and deploy AWS infrastructure using Terraform.

• Automate resource provisioning and manage Infrastructure as Code.

• Monitor and optimize cloud costs, security, and compliance.

• Maintain and improve CI/CD pipelines for deploying data applications.
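
As a sketch of how Terraform runs could be wired into such a CI/CD pipeline, the short Python wrapper below shells out to the standard terraform init, plan, and apply commands; the infra/ directory is a placeholder, and the actual AWS resources would be defined in the .tf files it contains:

# Illustrative CI/CD step: drive Terraform non-interactively from Python via
# subprocess. Assumes the terraform binary is on PATH, AWS credentials come
# from the CI environment, and infra/ (a placeholder) holds the *.tf files.
import subprocess

TF_DIR = "infra/"  # placeholder directory containing the Terraform configuration

def tf(*args: str) -> None:
    """Run one terraform command in TF_DIR and fail the pipeline on error."""
    subprocess.run(["terraform", *args], cwd=TF_DIR, check=True)

def deploy() -> None:
    tf("init", "-input=false")                 # fetch providers and modules
    tf("plan", "-input=false", "-out=tfplan")  # write a reviewable plan file
    tf("apply", "-input=false", "tfplan")      # apply exactly the saved plan

if __name__ == "__main__":
    deploy()

Keeping plan and apply as separate steps lets the saved plan be reviewed or approved before anything changes in AWS.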

3. Business Intelligence (Tableau Development & Administration):

• Develop interactive dashboards and reports using Tableau.

• Connect Tableau with AWS data sources such as Redshift, Athena, and Snowflake.

• Optimize SQL queries and extracts for performance efficiency.

• Manage Tableau Server administration, including security, access controls, and performance tuning.
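
As one possible example of the administration side, the sketch below assumes the Tableau Server Client Python library (tableauserverclient), which this posting does not name, to sign in with a personal access token and list workbooks; the server URL, token, and site are placeholders:

# Illustrative Tableau Server administration task using the Tableau Server
# Client library (tableauserverclient), which is an assumption and not named
# in this posting. Signs in with a personal access token and lists workbooks.
# Server URL, token, and site are placeholders.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth(
    token_name="ci-admin-token",       # placeholder token name
    personal_access_token="<secret>",  # placeholder; inject from a secrets store
    site_id="analytics",               # placeholder site
)
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    workbooks, _pagination = server.workbooks.get()
    for wb in workbooks:
        print(wb.name, wb.project_name)

The same library also exposes project and permission endpoints, which would cover the security and access-control tasks listed above.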

Required Skills & Experience:

• 5+ years of experience in AWS Data Engineering with Glue, Redshift, and S3.

• Strong expertise in ETL development using AWS Glue (PySpark, Scala, or Python).

• Experience with Terraform for AWS infrastructure automation.

• Proficiency in SQL, Python, or Scala for data processing.

• Hands-on experience in Tableau development & administration.

• Strong understanding of cloud security, IAM roles, and permissions.

• Experience with CI/CD pipelines (Git, Jenkins, AWS CodePipeline, etc.).

• Knowledge of data modeling, warehousing, and performance optimization.

Education

Any Graduate