Description

• Developing and maintaining data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
• Designing and optimizing data storage solutions, including data warehouses and data lakes.
• Ensuring data quality and integrity through data validation, cleansing, and error handling.
• Collaborating with data analysts, data architects, and software engineers to understand data requirements and deliver relevant data sets (e.g., for business intelligence).
• Implementing data security measures and access controls to protect sensitive information.
• Automating and improving data processes and workflows for scalability and efficiency.
• Monitoring data infrastructure for performance and reliability to address issues promptly.
• Keeping abreast of industry trends and emerging technologies in data engineering.
• Documenting data pipelines, processes, and best practices for knowledge sharing.
• Participating in data governance and compliance efforts to meet regulatory requirements.
• Providing technical support and mentoring to junior data engineers, if applicable.
• Continuously optimizing data architecture to support the company's evolving data needs.
• Collaborating with cross-functional teams to drive data-driven decision-making within the organization.
• Strong communication skills, with the ability to translate complex technical topics for senior stakeholders.
• Excellent knowledge of cloud computing technologies and current computing trends.
• Understanding of and experience with well-architected frameworks.
• Proven ability to collaborate with multi-disciplinary teams of Product Managers/Owners, Architects, Scrum Masters, and Subject Matter Experts.
• Azure, AWS, and/or GCP Architect certifications desired, along with proven hands-on experience.
• Bachelor's degree in computer engineering or equivalent, backed by extensive practical experience.
Specific Skills:
• Proficiency in data modeling and database management.
• Strong programming skills (e.g., Python, Java, or SQL).
• Knowledge of big data technologies like Hadoop and Spark.
• Experience with ETL (Extract, Transform, Load) processes.
• Familiarity with data warehousing and cloud platforms (e.g., AWS, Azure, or Google Cloud).
• Degree in computer science or a related field.
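
As an illustration of the ETL, validation, and cleansing work described above, here is a minimal pipeline sketch in Python using only the standard library. All table names, fields, and sample records are hypothetical, chosen for the example:

```python
import sqlite3

# Hypothetical raw records standing in for an extract step
# (a real pipeline would pull from files, APIs, or a source database).
RAW_ROWS = [
    {"id": "1", "amount": "19.99", "region": "emea"},
    {"id": "2", "amount": "not-a-number", "region": "amer"},  # fails validation
    {"id": "3", "amount": "5.00", "region": "APAC"},
]

def transform(rows):
    """Validate and cleanse rows; drop records that fail validation."""
    clean = []
    for row in rows:
        try:
            clean.append({
                "id": int(row["id"]),
                "amount": round(float(row["amount"]), 2),
                "region": row["region"].strip().upper(),
            })
        except (KeyError, ValueError):
            # Error handling: in production this row would be logged
            # or routed to a dead-letter table for review.
            continue
    return clean

def load(rows, conn):
    """Load cleansed rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales "
        "(id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany(
        "INSERT INTO sales (id, amount, region) VALUES (:id, :amount, :region)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(RAW_ROWS), conn)
print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # prints 2
```

In practice the same extract/transform/load structure scales up to orchestrated jobs (e.g., scheduled Spark or cloud-native pipelines), with the invalid records routed to quarantine storage rather than silently dropped.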

Education

Bachelor's Degree