Job Description
- Experience level - 14+ years
- Big Data skills: AWS, Glue, Redshift, Python/PySpark, Denodo, SQL Server.
- Big Data application and data pipeline development
- Experience in creating and managing data pipelines
- SQL querying and data analysis
- Language: Python, PySpark
- Tools: Git, SQL Server Management Studio, design tools (e.g., Lucidchart, Draw.io)
- OS: Windows
What you’ll do
- Lead data integration and data transformation efforts
- Perform data visualization, data migration, and data modeling.
- Design databases, AWS cloud architectures, and data lakes.
- Implement cloud-native development on AWS.
- Implement automation and DevOps practices using tools like CloudFormation, Terraform, and CI/CD pipelines
- Develop batch and real-time processing jobs using Apache Spark, Spark Streaming, Kafka, and Python
- Develop diagrams representing key data entities and their relationships.
- Generate a list of components needed to build the designed system.
- Apply an in-depth understanding of database structure principles.
- Gather and analyze system requirements.
- Work with data visualization tools.
- Performance tuning
- Work effectively with cross-functional teams in a fast-paced, dynamic environment
- Communicate clearly, simply, and effectively with all stakeholders
- Make hands-on contributions.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or related field (or equivalent experience).
- 5+ years of experience in cloud architecture, specifically with AWS.
- AWS Certified Solutions Architect (Associate or Professional) certification is strongly preferred.
Bravens is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability.