Description

Responsibilities

  1. Analyze and interpret complex data sets; identify data anomalies and resolve data issues
  2. Understand specific business processes and domain concepts and relate them to data subject domains
  3. Optimize SQL queries, scripts, and stored procedures in Snowflake and SQL Server to improve efficiency and reduce processing time.
  4. Integrate and manage data from various sources, ensuring data quality, reliability, and performance.
  5. Design, develop, and maintain scalable and high-performance ETL pipelines using Matillion ETL to ingest, transform, and load data into Snowflake.
  6. Monitor data pipeline and ETL performance, identifying bottlenecks and making necessary adjustments to ensure optimal system performance.
  7. Perform troubleshooting and root cause analysis to resolve data issues and prevent future occurrences.
  8. Develop and implement data models within Snowflake to support business intelligence and analytics requirements.
  9. Collaborate with Data Leads, Product Managers, and QA Engineers to validate requirements and participate in user requirement sessions
  10. Test and validate data flows and prepare ETL processes according to business requirements
  11. Perform ETL tuning and SQL tuning
  12. Document data flows representing business logic in ETL routines
  13. Design and implement data conversion strategies from legacy platforms to new platforms
  14. Perform design validation, reconciliation, and error handling in data load processes
  15. Design and prepare technical specifications and guidelines including ER diagrams and related documents

Qualifications:

  1. Must be well versed in data warehousing concepts, including design patterns (star and snowflake schemas), and in data modeling concepts, including normal forms
  2. Knowledge of AWS infrastructure, including S3, SNS, EC2, CloudWatch, and RDS
  3. 8+ years working on ETL/data transformation projects with one or more related products, such as Informatica, Talend, or Microsoft SSIS
  4. Experience in business intelligence and data warehousing initiatives
  5. Advanced working SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of databases such as Oracle, MS SQL Server, or Snowflake
  6. Strong SQL and Python skills, with experience in query optimization and performance tuning
  7. Strong analytical, problem-solving, and communication skills

Education/Experience:

  1. Bachelor’s degree in Mathematics, Computer Science, or a related technical field.
  2. Minimum 8 years of relevant experience.

Preferred Qualifications:

  1. Additional Tools and Platforms: Experience with additional ETL tools, data transformation methods, or cloud platforms is a plus.
  2. Healthcare and PBM: Pharmacy benefit management (PBM)-specific knowledge and experience transforming various types of healthcare data is a plus, but not required.
  3. Experience with the Matillion ETL tool and the Snowflake database
  4. Experience building data pipelines using Python, Spark, or Spark SQL in any cloud environment (AWS, Azure, or Google Cloud).
