Detailed JD
Responsibilities:
- Technical Leadership: Serve as the primary technical point of contact for the assigned customer account, providing expert guidance on Databricks architecture, best practices, and implementation.
- Solution Design & Implementation: Collaborate with the customer to understand their business requirements and translate them into technical solutions leveraging the Databricks Lakehouse Platform. Lead the design, development, and deployment of these solutions.
- Hands-on Development: Write code (Spark, Python, SQL, etc.), configure Databricks clusters, tune performance, and integrate the platform with other systems; a brief illustrative sketch follows this list.
- Proof of Concept (POC) Development: Lead the development of POCs to demonstrate the value of Databricks and address specific customer use cases.
- Technical Enablement: Conduct training and workshops for the customer's technical teams, empowering them to effectively use the Databricks platform.
- Relationship Management: Build and maintain strong relationships with key stakeholders within the customer organization, including technical leads, architects, and business executives.
- Customer Advocacy: Represent the customer's interests within Databricks, channeling feedback to product management and engineering teams on product enhancements and feature requests.
- Sales Support: Assist the sales team in identifying and qualifying new opportunities within the assigned account. Contribute to technical proposals and presentations.
- Project Management: Manage technical projects related to Databricks implementations, ensuring timely delivery and successful outcomes.
- Thought Leadership: Contribute to the Databricks community by sharing best practices, writing blog posts, and presenting at conferences.
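For illustration only, the hands-on development work described above might resemble the following minimal PySpark sketch. All paths and table names are hypothetical; a real engagement would follow the customer's naming conventions, governance model, and deployment standards.

```python
# Minimal, illustrative sketch of a simple ingest-and-clean step on Databricks.
# Paths and table names are hypothetical, for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("illustrative-etl").getOrCreate()

# Read raw events from a hypothetical landing zone and apply a basic transformation.
raw = spark.read.json("/mnt/raw/events/")
clean = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Persist as a Delta table (Delta Lake is the default table format on Databricks;
# outside Databricks this would require the delta-spark package).
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.events_clean")
```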
Qualifications:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree or PhD preferred.
- Experience: [Specify years] years of experience in data engineering, data science, or a related field, with a focus on cloud-based solutions.
- Technical Skills:
- Deep understanding of distributed computing frameworks (e.g., Apache Spark); a short illustrative sketch follows the Qualifications section.
- Proficiency in programming languages such as Python, SQL, Scala, or Java.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP).
- Experience with data warehousing, data lake architectures, and ETL processes.
- Knowledge of machine learning algorithms and techniques is a plus.
- Familiarity with DevOps practices and tools.
- Experience with the Databricks Lakehouse Platform is highly desirable; candidates without it should bring strong experience with comparable technologies and a demonstrated ability to learn new platforms quickly.
- Soft Skills:
- Excellent communication, presentation, and interpersonal skills.
- Strong problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Customer-focused and results-oriented.
- Ability to manage multiple projects simultaneously.
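As a rough illustration of the Spark and SQL proficiency described above, the sketch below joins a large fact table to a small dimension with a broadcast hint and aggregates by day. Table and column names are invented for illustration and are not part of this role's requirements.

```python
# Hypothetical sketch: broadcast join of a small dimension to a large fact table,
# followed by a daily aggregation. All table and column names are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("illustrative-aggregation").getOrCreate()

fact = spark.table("analytics.events_clean")    # hypothetical fact table
dim = spark.table("analytics.dim_customers")    # hypothetical dimension table

daily = (
    fact.join(F.broadcast(dim), "customer_id")  # broadcast the small dimension
        .groupBy("event_date", "segment")
        .agg(
            F.count("*").alias("events"),
            F.countDistinct("customer_id").alias("customers"),
        )
)

daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_engagement")
```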
Preferred Qualifications:
- Databricks certifications (must have).
- Experience with specific industry verticals (e.g., finance, healthcare, retail).
- Contributions to open-source projects.