Description

What you’ll do  

Engage with business stakeholders to gather requirements for specific use cases. 
Collaborate with team members and stakeholders to review and refine requirements. 
Assist in developing functional specifications. 
Design, develop, and maintain data models using SQL and BigQuery to support business intelligence and reporting needs. 
Build data pipelines to gather data from multiple sources and systems. 
Integrate, consolidate, cleanse, and structure data for client use in solutions. 
Create optimized, efficient, repeatable, and scalable SQL and Python code. 
Analyze, understand, and prepare complex data sources for project integration. 
Stay current with GCP and related technologies, and propose new solutions. 
Perform QA testing on data and reports; validate systems. 
Support production deployment planning and installation. 
Prepare reports for management, highlighting trends, patterns, and predictions. 
Build and maintain dashboards using BI tools. 
Provide ongoing application support and actively resolve issues, prioritizing and escalating as needed. 
Implement data management, maintenance, and reporting best practices to improve solutions.

What Experience You Need

BS degree in Computer Science or related technical field 
1 year of experience working with Python and R 
2 years of experience with data processing platforms and Google Cloud technologies such as BigQuery and Dataproc 
At least 1 year of experience with data cleaning, preprocessing, feature engineering, and statistical analysis, using popular frameworks such as PyTorch or scikit-learn 
2 years of experience developing dashboards using tools such as Tableau, Power BI, or Looker Studio.

What could set you apart

Cloud certification strongly preferred 
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others 
Experience with source code management systems (e.g., Git, SVN, GitHub)

Education

Any Graduate