We are looking for a Big Data Engineer to build solutions for a real-time, highly data-driven organization and help drive technical innovation.
- At least 2 years of design and development experience in Java with Flink, Beam (or a similar streaming toolset), and Kafka
- Experience working with real-time/streaming data
- Ability to execute Beam pipelines on the Apache Flink runner
- Experience working with the Beam SDK
- Knowledge of microservices in Java
- Experience working with GitHub
- Experience using Spark Streaming
- Experience working with SQL and NoSQL databases
- Experience building or operating real-time data pipelines
- Experience setting up topics in Kafka (Confluent)
- Knowledge of real-time data pipelines, including developing Kafka producers and streaming applications for consuming data
- Experience with Kubernetes
- Well versed in CI/CD principles (GitHub, Jenkins, etc.) and actively involved in troubleshooting issues in a distributed services ecosystem
- High-level knowledge of data compliance and regulatory requirements, including but not limited to encryption, anonymization, data integrity, and policy control features in large-scale infrastructures
With over 20 years of experience, we have come to understand that innovation is the only way to provide agile, practical solutions that transform businesses and careers.
Our resourcing and smart services help you realize tomorrow’s potential. Discover what becomes possible when you bring the right people and the right technologies together.
To apply for this job please visit www.monstergulf.com.