Key Skills: Data Engineering, AI (Artificial Intelligence), SQL, Python, Java.
Roles and Responsibilities:
- Architect and implement modern, scalable data solutions on cloud platforms, specifically Google Cloud Platform (GCP).
- Collaborate with cross-functional teams to assess, redesign, and modernize legacy data systems.
- Design and develop efficient ETL pipelines for data extraction, transformation, and loading to support analytics and ML models.
- Ensure robust data governance by maintaining high standards of data security, integrity, and compliance with regulatory requirements.
- Monitor, troubleshoot, and optimize data workflows and pipelines for enhanced system performance and scalability.
- Provide hands-on technical expertise and guidance across data engineering projects, with a focus on cloud adoption and automation.
- Work in an agile environment and contribute to continuous delivery and improvement initiatives.
Experience Requirements:
- 5-10 years of experience designing and implementing data engineering solutions on GCP or other leading cloud platforms.
- Solid understanding of legacy data infrastructure with demonstrated success in modernization and migration projects.
- Proficiency in programming languages such as Python and Java for building data solutions and automation scripts.
- Strong SQL skills, with experience working with both relational (SQL) and non-relational (NoSQL) databases.
- Familiarity with data warehousing concepts, tools, and practices.
- Hands-on experience with data integration tools and frameworks.
- Excellent analytical, problem-solving, and communication skills.
- Experience working in fast-paced agile environments and collaborating with multi-disciplinary teams.
Education: B.Tech, M.Tech, B.Com, M.Com, MBA, or any postgraduate degree.