Responsibilities:
Design, develop, enhance, code, test, deliver and debug solutions.
Implement large, complex stories spanning multiple technology domains.
Participate in requirement gathering and design sessions.
Participate in code reviews, maintain code quality, and introduce and enforce development standards.
Develop deliverables and meet project deadlines.
Identify and communicate technical trends and/or emerging technology.
Participate in all sprint ceremonies (Refinement, Planning, Daily Standup, and Retrospective).
Participate in pair programming, knowledge sharing, and brown bag sessions.
Qualifications:
5 or more years of relevant software development experience required.
5 or more years of in-depth experience with Python, MS SQL Server, and T-SQL is required. Experience with Azure Databricks and Azure Data Factory is required.
Experience with data engineering tools such as Databricks is required.
Experience with big data technologies such as Hadoop, Spark, and Kafka is required.
Experience understanding and developing data lake/Delta Lake ecosystems is required.
Experience working with large volumes of data (millions of records) is required.
Experience building and using CI/CD pipelines is required.
Experience with IaC technologies (Terraform, Pulumi) is required.
Knowledge of or experience working with .NET is a plus.
Experience working in an Agile-based development environment is required.
Strong written and verbal communication skills.
Any Graduate.