Writes ETL (Extract / Transform / Load) processes, designs database systems, and develops tools for real-time and offline analytic processing.
Troubleshoots software and processes for data consistency and integrity. Integrates data from a variety of sources for business partners to generate insight and make decisions.
Experience building AWS cloud architecture and supporting services and technologies (e.g., ECS, S3, Glue and Glue Crawlers, Redshift, Step Functions, Lambda, Secrets Manager, SageMaker, and foundational services such as IAM, CloudWatch, and CloudFormation).
Translates business specifications into design specifications and code. Responsible for writing programs, ad hoc queries, and reports. Ensures that all code is well structured, includes sufficient documentation, and is easy to maintain and reuse.
Partners with internal clients to gain a basic understanding of business functions and informational needs. Gains working knowledge in tools, technologies, and applications/databases in specific business areas and company-wide systems.
Participates in all phases of solution development. Explains technical considerations at related meetings, including those with business clients.
Expert in building Spark data processing applications (Python, PySpark).
Expert in SQL development and Tableau reporting.
Experience with test automation and test-driven development practices.
Experience with CI/CD pipeline tools such as GitHub Actions.
Tests code thoroughly to verify it fulfills its intended purpose. Reviews the end product with the client to ensure adequate understanding. Provides data analysis guidance as required.
Provides tool and data support to business users and fellow team members.