Description

Requirements / Must Have:

  • Bachelor’s degree in Computer Science, IT, or related field.
  • 3+ years of experience designing efficient dimensional models (star and snowflake schemas).
  • 3+ years ensuring data quality, security, and governance.
  • 5+ years of experience as a Data Analyst, Data Engineer, or in a similar role.
  • 2+ years using Git, CI/CD pipelines, containerization (Docker/Kubernetes), and Infrastructure as Code (Terraform, CloudFormation).
  • 5+ years of experience extracting and manipulating data from diverse on-premises and cloud-based sources.
  • 3+ years of experience with SSIS, Azure Data Factory (ADF), and API integrations.
  • 2+ years performing data migrations across on-premises, cloud, and cross-database environments.

Experience:

  • Designing and maintaining data pipelines on cloud platforms (Azure, GCP, AWS).
  • Creating and optimizing data models to enhance query performance.
  • Integrating data from multiple sources (SQL, NoSQL, APIs).
  • Developing ETL/ELT processes and automation through CI/CD pipelines.
  • Managing data lakes and warehouses with proper governance.
  • Building Power BI dashboards and developing DAX measures.
  • Applying statistical and machine learning techniques using Python or R.

Responsibilities and Duties:

  • Design, build, and maintain robust data pipelines on-premises and in the cloud.
  • Optimize ETL workflows and ensure high-quality data integration.
  • Implement security best practices and governance in data environments.
  • Develop dimensional models and curated data marts for analytics.
  • Deliver dashboards, reports, and predictive models for actionable insights.
  • Collaborate with cross-functional teams to translate requirements into solutions.
  • Present findings and data-driven recommendations to stakeholders.
  • Support agile delivery processes and mentor teams on analytics practices.

Nice to Have:

  • Experience with modern app development (Next.js, Node.js, D3.js).
  • Knowledge of PostgreSQL, MongoDB, Azure Cosmos DB, Azure Synapse, Talend.
  • Exposure to AI/ML tools within cloud platforms like Databricks and Azure.

Skills:

  • Strong understanding of data architecture, ETL/ELT design, and governance.
  • Proficiency in Power BI, DAX, Python, and/or R.
  • Familiarity with containerization, CI/CD, and cloud infrastructure tools.

Qualifications and Education:

  • Bachelor’s degree in Computer Science, IT, or related discipline.