Description

Key Responsibilities:

  • Develop, maintain, and optimize data pipelines and workflows using Python.
  • Perform complex data analysis to identify trends, patterns, and actionable insights.
  • Write and optimize SQL queries for data extraction, transformation, and reporting.
  • Work with Snowflake to store, manage, and process large datasets.
  • Design and implement ETL processes for structured and unstructured data.
  • Collaborate with cross-functional teams to understand data needs and deliver solutions.
  • Utilize cloud-based tools and platforms for data storage, processing, and analytics.
  • Ensure data quality, accuracy, and security across systems.

Key Requirements and Technology Experience:  

  • Key skills: Data Analysis, SQL, Snowflake, ETL, and Cloud. Prior Freddie Mac/Fannie Mae experience is a must.
  • Proven experience as a Data Analyst or Data Engineer with a strong focus on Python programming.
  • Proficiency in SQL for data querying and manipulation.
  • Experience with Snowflake or similar cloud-based data warehouses.
  • Knowledge of ETL concepts, tools, and best practices.
  • Familiarity with cloud platforms such as AWS, Azure, or GCP.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration abilities.
  • Experience with data visualization tools (e.g., Tableau, Power BI, Looker).
  • Understanding of data governance, security, and compliance practices.
  • Knowledge of version control systems (e.g., Git).

Education

Any Graduate