Senior Kafka Developer | ProArch | Remote (India)

Remote (India)
Application ends: January 24, 2025

Job Description

As a Senior Kafka Developer at ProArch, you will be responsible for designing, developing, troubleshooting, and maintaining scalable, efficient Kafka-based messaging solutions and microservices applications. You will work closely with cross-functional teams to ensure seamless data flow, scalability, and reliability across our Kafka ecosystem. You will also develop and deliver proofs of concept (POCs). Hands-on experience is a must.

Key Responsibilities:

  • Set up and configure Kafka clusters, including brokers, ZooKeeper nodes, and other components.
  • Develop and implement best practices for Kafka configuration, management, and monitoring.
  • Troubleshoot and resolve Kafka-related issues, ensuring high availability and performance.
  • Collaborate with development teams to integrate Kafka with existing systems and applications.
  • Design, develop, and manage Kafka-based data streaming applications and pipelines.
  • Stay current with the latest trends and advancements in Kafka and related technologies.
  • Implement security measures and ensure compliance with industry standards.
  • Integrate Kafka with other systems and applications, ensuring seamless data flow and real-time processing.
  • Conduct thorough code reviews, providing constructive feedback and ensuring adherence to coding standards and best practices.
  • Work with cloud platforms such as AWS.
  • Apply containerization technologies such as Docker and orchestration tools such as Kubernetes.
  • Work with other messaging systems and data streaming platforms, such as RabbitMQ, Apache Pulsar, or Apache Flink, where appropriate.
  • Follow CI/CD pipelines and DevOps practices.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • 5-8 years of hands-on experience in software development, with a focus on distributed systems and data streaming technologies.
  • Proven experience setting up, configuring, and troubleshooting Kafka clusters in production environments using Amazon MSK (Serverless or Provisioned) or Confluent Platform is a must.
  • Strong understanding of Kafka architecture, including brokers, ZooKeeper, producers, consumers, and Kafka Streams.
  • Strong experience with schema registries (e.g., AWS Glue Schema Registry or Confluent Schema Registry) and serialization formats such as Avro.
  • Strong experience with the Kafka Connect ecosystem, specifically building source and sink connectors using open-source components.
  • Strong experience with PostgreSQL and its data types, such as jsonb.
  • Proficiency in programming languages such as Java, Scala, or Python.
  • Experience designing, building, deploying, and maintaining enterprise cloud solutions on AWS.
  • Demonstrable experience with microservices-based architecture in the cloud at scale.
  • In-depth understanding of microservices architecture and best practices.
  • Experience with RESTful APIs and web services.
  • Strong testing skills with JUnit and Mockito.
  • Experience with the Karate framework for API testing.
  • Familiarity with version control systems like Git.
  • Strong problem-solving skills and the ability to work under pressure.
  • Excellent communication and teamwork skills.