
Events Streaming Service (Hadoop) | Fortune 500 E-commerce Retail Firm | 2-8 years

  • 2-8 years
  • Not Disclosed

Job Description

Responsibilities:
Own various infrastructure components in data pipelines, such as Kafka, Elasticsearch, and Hadoop
Write and review code necessary to keep our infrastructure components up and running
Triage and debug the problems arising in our platform services and evolve systems by pushing for changes that improve reliability
Collaborate with customers to understand their requirements and tweak various components to meet their needs
Participate in a 24/7 on-call support rotation
Conduct R&D on new data engineering technologies


Minimum qualifications:
2+ years of experience managing Elasticsearch, Kafka, and/or other data engineering applications
3+ years of experience with operating systems (Linux/Unix) and good knowledge of network fundamentals
3+ years of experience configuring and maintaining infrastructure components such as load balancers, messaging systems, storage systems, and/or web servers
2+ years of experience coding in at least one higher-level language (e.g. Java, Python, Go, Ruby)
2+ years of experience with software frameworks and APIs


Preferred qualifications:
2+ years of experience with a configuration management tool (e.g. Chef, Ansible)
1+ year of experience building internet-facing, multi-tenant, distributed web applications
Familiarity with the latest trends in computing platforms (e.g. Docker, Kubernetes)
Interest in building software systems for distributed, internet-scale environments


Required Education:
Bachelor’s degree in Computer Science, or more than 3 years of experience working in the DevOps field.