Responsibilities:
· Design, develop, and test robust, scalable data platform components to meet the needs of millions of users.
· Collaborate with product engineers to understand data pipeline requirements and deliver innovative solutions.
· Build high-performance data solutions that directly support business objectives.
· Work closely with cross-functional teams to define new data products and features.
· Apply your expertise in big data technologies to challenging projects involving very large data sets.
Skills:
· 7+ years of hands-on experience with Scala/Java, focused on building business logic layers and high-volume data pipelines.
· 5+ years of experience in real-time stream processing using Apache Flink or Apache Spark, with messaging infrastructure such as Kafka or Pulsar.
· 7+ years of experience in data pipeline development, ETL, and processing structured and unstructured data.
· Proficiency in NoSQL systems such as MongoDB and relational databases like PostgreSQL.
· Strong analytical skills and a commitment to quality in all deliverables.
Education:
· Any graduate (bachelor's degree in any discipline).