Drive the development effort end-to-end for the timely delivery of high-quality solutions that meet requirements, align with the architectural vision, and adhere to all applicable standards.
Proficiency in creating and managing large-scale data pipelines, handling complex, high-volume, multi-dimensional data, and building machine learning models.
Proven record of successful design, architecture, and development using Big Data technologies for large data volumes and transactional systems.
Design and coding skills with Big Data technologies such as Hadoop, Spark, Hive, and Kafka.
Excellent coding hands-on expertise in SQL and one or more programming languages such as Java, Scala, or Python.
Proven experience and coding skills with real-time streaming and analytics platforms such as Flink, Spark Streaming, and Pinot.
Work closely with other engineers and teams to integrate data solutions into existing systems.
Proven ability to optimize performance for low-latency querying and high-throughput data ingestion.
Hands-on experience with scheduling tools such as Airflow and Control-M.
Experience with Docker and Kubernetes.
Excellent communication and interpersonal skills; a strong team player.
Proven skills and expertise in NoSQL databases such as ClickHouse, MongoDB, Cassandra, HBase, and Redis.
Familiarity with Agile development practices, including CI/CD.
Any Graduate.