We are looking for a Data Engineer for a client project.
Main stack: Pulsar, AWS Data Pipeline, Azure Data Factory, Kafka, Airflow
Priority given to candidates with experience in Apache NiFi and Kubernetes, and an appreciation of Data Mesh concepts
— Crafting Seamless Interfaces: Design sophisticated Kafka / Pulsar queue and topic interfaces that set a high bar for the organization.
— Elevating Deployment: Take the lead in developing Helm charts tailor-made for Kubernetes, ensuring swift and efficient streaming service deployment.
— Architecting Excellence: Engineer service and backup strategies that deliver strong performance and reliability.
— Automating Brilliance: Harness automation to orchestrate the entire life cycle of Kafka / Pulsar topics, from creation through retirement, streamlining workflows.
— Documentation as an Art: Shape comprehensive documentation that serves as a roadmap for the platform's processes, leaving no detail overlooked.
— Future-Ready Streaming Platform: Fuel the evolution of a state-of-the-art cloud-native streaming platform that pushes boundaries.
— Scaling Consultation: Provide expert consultation, guiding clients seamlessly through the integration of messaging services at scale.
— Innovative App Development: Push the envelope by designing and developing Flink applications on top of dynamic messaging frameworks.
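To give a flavor of the topic life-cycle automation described above, here is a minimal Python sketch. All names (the naming scheme, the per-environment defaults) are illustrative assumptions, not the client's actual standards; in practice the resulting spec would be handed to a tool such as Kafka's AdminClient or Pulsar's admin API.

```python
from dataclasses import dataclass

# Hypothetical per-environment defaults; real values would come from the
# client's platform standards, not from this sketch.
ENV_DEFAULTS = {
    "dev":  {"partitions": 3,  "retention_ms": 86_400_000},   # 1 day
    "prod": {"partitions": 12, "retention_ms": 604_800_000},  # 7 days
}

@dataclass
class TopicSpec:
    name: str
    partitions: int
    retention_ms: int

def build_topic_spec(domain: str, dataset: str, env: str) -> TopicSpec:
    """Derive a topic specification from env defaults and a naming convention.

    Enforces an (assumed) '<env>.<domain>.<dataset>' naming scheme so that
    creation, ACLs, and retirement can all be driven from one source of truth.
    """
    if env not in ENV_DEFAULTS:
        raise ValueError(f"unknown environment: {env}")
    defaults = ENV_DEFAULTS[env]
    return TopicSpec(
        name=f"{env}.{domain}.{dataset}",
        partitions=defaults["partitions"],
        retention_ms=defaults["retention_ms"],
    )
```

Driving topic creation from declarative specs like this, rather than ad-hoc CLI calls, is what makes the life cycle auditable and repeatable across environments.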
— Kafka Mastery: Showcase deep, hands-on expertise with Kafka in production environments.
— Topic Orchestra: Demonstrate your adeptness at orchestrating enterprise-wide Kafka topic landscapes, elevating organizational efficiency.
— Disaster-Ready: Display your capabilities in setting up disaster recovery processes for Kafka, ensuring operational resilience in the face of challenges.
— Pipeline Maestro: Exemplify your prowess in architecting real-time data pipelines using Python/Scala, scaling to meet complex demands.
— Agility and Collaboration: Flourish in Agile/Scrum work environments, collaborating seamlessly to drive projects to success.
— Container Insight: Navigate effortlessly in container ecosystems, enhancing your ability to thrive in modern tech landscapes.
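The real-time pipeline work mentioned above centers on event-time aggregation. The following is a toy, in-memory Python stand-in for the kind of tumbling-window count a Flink or Kafka Streams job would perform at scale; the function name and event shape are assumptions for illustration only.

```python
from collections import defaultdict
from typing import Iterable

def tumbling_window_counts(
    events: Iterable[tuple[int, str]], window_ms: int
) -> dict[tuple[int, str], int]:
    """Count events per (window start, key) using tumbling event-time windows.

    Each event is a (timestamp_ms, key) pair. A window is identified by its
    start timestamp, computed by rounding the event time down to the nearest
    window boundary.
    """
    counts: dict[tuple[int, str], int] = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms
        counts[(window_start, key)] += 1
    return dict(counts)

# Example: two events for "a" land in the [0, 1000) window.
sample = [(0, "a"), (500, "a"), (1000, "a"), (1200, "b")]
result = tumbling_window_counts(sample, window_ms=1000)
```

A production pipeline would additionally handle out-of-order events, watermarks, and state checkpointing, which is precisely where engines like Flink earn their keep.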