Senior Kafka Consultant

Miami, Florida

Post Date: 06/19/2017 Job ID: 23700 Industry: Other Area(s)

Applications Consultants have expertise in a specific technology environment. They are responsible for software-specific design and realization, as well as testing, deployment and release management, or technical and functional application management of client-specific package-based solutions (e.g. SAP, Oracle). These roles also require functional and methodological capabilities in testing and training.

 

Required Skills and Experience:

 

You are an expert in one or more business processes and technology practices and are accountable for translating a business case into a detailed technical design. Alternatively, you are responsible for operational and technical issues and translate technical blueprints into requirements and specifications. You may also be responsible for integration testing and user acceptance testing. You act as a stream lead, guiding team members from experience. You are seen as an active member of technology communities.

 

• Qualification: Minimum 5 years' experience; Bachelor's degree.

• Certification: SE level 1 and seeking level 2.

• Must have experience in Package Configuration.

• Should be proficient in Business Analysis, Business Knowledge, Testing, Architecture Knowledge, Technical Solution Design and Vendor Management.

• Experience with various messaging systems, such as Kafka or RabbitMQ.

• Experience in real-time data processing and/or messaging with products such as Storm, Kafka, and Spark.

• Good understanding of cloud and data center architecture and technologies, including high availability and disaster recovery.

• Implement new real-time stream processing and ETL features in Logstash/Kafka.

• Good understanding of DevOps and microservices architecture.

• Build frameworks to detect data patterns using complex event processing capabilities.

• Experience integrating Kafka with external services such as Kinesis, Google Pub/Sub, and HDFS.

• Experience with code, build, and deployment tools such as Bitbucket, Maven, Ansible, and Jenkins.

• Candidate must have experience supporting performance-intensive, high-availability applications.

• Experience building Hadoop-based ETL workflows to ingest, transform, and aggregate data using Apache technologies.

 

50% Implement new real-time stream processing and ETL features in Logstash/Kafka

 

25% Build frameworks to detect data patterns using complex event processing capabilities

 

25% Build high-availability solutions in AWS cloud data centers using OpenStack and SDN

 

AWS, Kafka, SDN, Logstash, Streaming technologies

 

Work in a DevOps culture and collaborate with multiple Dev, QA, and Ops teams from various vendors.

 

Should be highly analytical and willing to experiment with new technologies in a very fast-paced environment.

 

Should work independently and achieve goals in agile, two-week sprint cycles.