Job Description

ETL - Big Data/Data Warehousing, MS SQL, SQL, Python, Spark, RDBMS, Data Analytics


Job Summary

Key skills: very strong SQL, data pipeline management, Python, and Airflow. Location: New York, on-site a mandatory three days a week.

1. A highly skilled Data Engineer who can work independently and manage their own deliverables
2. Capable of writing ETL pipelines in Python from scratch
3. Expert in OOP principles and concepts
4. Able to independently write efficient, reusable code for ETL pipelines
5. Expert in data modeling concepts such as schemas and entity relationships
6. Expert at analyzing and developing SQL queries across dialects (SQL Server, DB2, Oracle)
7. Familiar with Airflow and able to develop DAGs
8. Expert in data warehouses such as BigQuery and Databricks Delta Lakehouse, including how to programmatically ingest, cleanse, govern, and report data out of them
9. Familiarity with Spark is a plus
10. Willing to learn new cloud-based business apps and tools
11. Prior knowledge of IBM Apptio is a plus
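To illustrate the kind of from-scratch ETL pipeline the role calls for, below is a minimal sketch in plain Python. All data, table, and function names here are hypothetical examples, not part of the role's actual stack; SQLite stands in for the real warehouse target.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract, standing in for a real file or API source.
RAW_CSV = """id,name,amount
1,Alice, 100.5
2,Bob,200
3,,50
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: drop rows failing a basic quality check, trim and cast fields."""
    cleaned = []
    for row in rows:
        if not row["name"].strip():
            continue  # skip records with a missing name
        cleaned.append((int(row["id"]), row["name"].strip(), float(row["amount"])))
    return cleaned

def load(records: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: insert cleaned records into a warehouse table (SQLite as a stand-in)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", records)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    loaded = load(transform(extract(RAW_CSV)), conn)
    print(f"loaded {loaded} rows")
```

In a production setting each step would typically become a task in an Airflow DAG, with the warehouse connection pointed at BigQuery or Databricks rather than SQLite.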

Education

Any Graduate