Responsibilities:
Ensure the efficient and reliable operation of corporate data systems and databases;
Provide technical and operational support for tools and platforms such as CloudSQL, BigQuery, Bigtable, Hadoop, Composer and Dataflow, ensuring adequate availability and performance;
Administer and maintain databases, ensuring integrity, security and optimization;
Develop and implement automation solutions for data transfers and loads between systems, promoting efficiency and scalability.
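As a rough illustration of the automation duty above, the sketch below uses Python's standard sqlite3 module as a lightweight stand-in for the actual CloudSQL/BigQuery stack; the `events` table and its columns are hypothetical, and a production pipeline would run this kind of step from an orchestrator such as Composer/Airflow.

```python
import sqlite3

def transfer_rows(source_db: str, target_db: str) -> int:
    """Copy all rows from the source 'events' table into the target.

    Uses INSERT OR REPLACE keyed on the primary key so that reruns
    of the load are idempotent -- a common requirement for scheduled
    data transfers.
    """
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(target_db)
    # Ensure the destination table exists before loading.
    dst.execute(
        "CREATE TABLE IF NOT EXISTS events (id INTEGER PRIMARY KEY, payload TEXT)"
    )
    rows = src.execute("SELECT id, payload FROM events").fetchall()
    dst.executemany(
        "INSERT OR REPLACE INTO events (id, payload) VALUES (?, ?)", rows
    )
    dst.commit()
    src.close()
    dst.close()
    return len(rows)
```

In a real GCP environment the same pattern (extract, idempotent load, commit) would be expressed against the managed services' client libraries rather than sqlite3.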
Requirements:
Knowledge of GCP, Azure or AWS cloud services (mandatory);
Knowledge of SQL, Hadoop and Airflow (mandatory);
Knowledge of Python (a plus);
Intermediate/advanced English (a plus);
Ability to configure data pipelines in the GCP environment;
Clear communication skills to translate business needs into technical solutions;
Ability to document workflows and data architectures;
Effective collaboration in multidisciplinary teams;
Critical thinking and complex problem solving;
Flexibility to learn and adapt to new technologies.
Education: any degree.