- Database Design and Architecture: Lead the design, implementation, and optimization of scalable and efficient database architectures to support business requirements.
- Data Modeling and Management: Develop data models, database schemas, and data pipelines for efficient storage, retrieval, and management of data across platforms.
- AWS Database Services: Manage AWS databases, including Aurora (PostgreSQL/MySQL) and DynamoDB, optimizing for cost, performance, and scalability.
- Python Scripting: Use Python for data manipulation, automation of database tasks, and integration with other systems (a brief illustrative sketch follows this list).
- Database Performance Optimization: Monitor and optimize database performance, troubleshoot issues, and ensure high availability and disaster recovery.
- Documentation and Reporting: Maintain up-to-date documentation for database architectures, processes, and configurations. Generate reports on database performance, issues, and improvements.
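
The snippet below is a minimal sketch of the kind of Python/AWS automation these responsibilities involve: creating a manual Aurora cluster snapshot with boto3. The cluster identifier is hypothetical, and AWS credentials and region are assumed to be configured in the environment.

```python
"""Minimal sketch: automate a routine Aurora maintenance task with boto3."""
from datetime import datetime, timezone

import boto3

rds = boto3.client("rds")  # credentials/region come from the environment

CLUSTER_ID = "orders-aurora-cluster"  # hypothetical cluster identifier


def snapshot_cluster(cluster_id: str) -> str:
    """Create a manual snapshot of an Aurora cluster and return its identifier."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    snapshot_id = f"{cluster_id}-manual-{stamp}"
    rds.create_db_cluster_snapshot(
        DBClusterSnapshotIdentifier=snapshot_id,
        DBClusterIdentifier=cluster_id,
    )
    return snapshot_id


if __name__ == "__main__":
    print("Created snapshot:", snapshot_cluster(CLUSTER_ID))
```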
Primary Skills (Essential):
- Programming Experience: At least five years of professional programming experience writing performant stored procedures, functions, and SQL statements (see the sketch after this list).
- Database Management: Expertise in database administration, tuning, and troubleshooting.
- SQL Expertise: Strong SQL skills across relational database systems, including PostgreSQL and MySQL.
- AWS Databases: Proficiency with Amazon Aurora (MySQL/PostgreSQL) and DynamoDB, including setup, optimization, and scaling.
- Python: Strong Python skills for data manipulation, automation, and database task management.
- Data Modeling: Proficiency in designing logical and physical data models to support business processes.
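
As an illustration of the stored-procedure/SQL and Python skills above, the sketch below runs a parameterized query that invokes a SQL function from Python using psycopg2. The DSN and the function name refresh_daily_sales are placeholders, not part of any real system.

```python
"""Minimal sketch: call a SQL function via a parameterized psycopg2 query."""
import psycopg2

DSN = "host=aurora-writer.example.internal dbname=analytics user=dba"  # placeholder

conn = psycopg2.connect(DSN)
try:
    with conn, conn.cursor() as cur:
        # Parameterized statement: the date is bound as a parameter,
        # never interpolated into the SQL string.
        cur.execute(
            "SELECT refresh_daily_sales(%s)",  # hypothetical stored function
            ("2024-01-31",),
        )
        (rows_refreshed,) = cur.fetchone()
        print("rows refreshed:", rows_refreshed)
    # Leaving the `with conn` block commits on success, rolls back on error.
finally:
    conn.close()
```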
Secondary Skills (Highly Beneficial):
- Snowflake: Hands-on experience with Snowflake data warehousing, data modeling, and data ingestion processes.
- ETL/ELT Tools: Familiarity with ETL/ELT processes and tools (e.g., Apache Airflow, AWS Glue) for data transformation and ingestion (a brief Airflow sketch follows this list).
- Data Governance and Compliance: Knowledge of data governance practices, including data security and regulatory compliance standards.
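
For the ETL/ELT tooling noted above, the following is a minimal Airflow 2.x sketch of a daily extract-and-load DAG. The DAG name, bucket path, and task bodies are placeholders; in practice the load step would hand off to warehouse-side transformations (ELT).

```python
"""Minimal sketch of a daily ELT-style DAG using the Airflow 2.x TaskFlow API."""
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_orders_pipeline():
    @task
    def extract() -> str:
        # Placeholder: export yesterday's orders to object storage.
        return "s3://example-bucket/orders/latest.parquet"  # hypothetical path

    @task
    def load(path: str) -> None:
        # Placeholder: COPY the extracted file into the warehouse,
        # then trigger in-warehouse transformations.
        print("loading", path)

    load(extract())


daily_orders_pipeline()
```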