
Intelliconnect Technologies - Big Data Hadoop Engineer - MapReduce/Hive/HDFS (1-3 yrs), Mumbai/Pune (DevOps)

  • Experience: 1-3 years
  • Salary: Not Disclosed

Job Description

We are looking for passionate programmers who would like to make a difference in the field of Big Data and Machine Learning. Please apply if: (1) you are excited and can continuously self-learn in a fast-paced environment to become an expert, enhance the customer experience, and contribute to building expertise that benefits all; (2) you are a team player and proactive; (3) you can commit yourself to the mission of building solutions; (4) you believe in meritocracy.

Responsibilities and Duties (Hadoop)
1. Write software to interact with HDFS and MapReduce (a brief illustrative sketch follows these lists).
2. Assess requirements and evaluate existing solutions.
3. Build, operate, monitor, and troubleshoot Hadoop infrastructure.
4. Develop tools and libraries, and maintain processes for other engineers to access data and write MapReduce programs.
5. Develop documentation and playbooks to operate Hadoop infrastructure.
6. Evaluate and use hosted solutions on AWS, Google Cloud, or Azure.
7. Write scalable and maintainable ETLs.
8. Understand Hadoop's security mechanisms and implement Hadoop security.
9. Write software to ingest data into Hadoop.

Skills
1. You know the JVM runtime, the Java language, and ideally another JVM-based programming language.
2. You know computer science fundamentals, particularly algorithmic complexity.
3. You understand trade-offs in distributed systems.
4. You are proficient in software engineering principles that produce maintainable software, and you can apply them in practice.
5. You have worked with a Hadoop distribution.
6. You have worked with one or more computation frameworks, such as Spark.
7. You are familiar with HBase, Kafka, ZooKeeper, and other Hadoop components.
8. You know Linux and its operation, networking, and security.
9. You know how to move large volumes of data around efficiently.

Responsibilities and Duties (General)
1. Write clean, quality, testable code within the given schedule.
2. Lead and participate in design and architecture decisions.
3. Work, contribute, and collaborate in a cross-functional agile team.
4. Be passionate and continue to advance your craft.
5. Contribute to IntelliConnect open source projects and initiatives.
6. Complete quarterly self-learning certifications in Hadoop and other technologies as required by the company to enhance skills.

Required Experience, Skills and Qualifications
1. Minimum 1 year of practical experience designing high-performance solutions to big data problems, and developing and testing modular, reusable, efficient, and scalable code to implement those solutions.
2. Aptitude to learn and implement new technologies.
3. Must be able to demonstrate at least one Big Data application/project implemented in Java, Scala, or Python (preferably Java).
4. Good object-oriented programming skills using Java or Python (preferably Java).
5. Hands-on real-project experience with Hadoop and its components.
6. Strong working knowledge of Linux.
7. Working knowledge of Agile software development.
8. Hands-on experience using Git.
9. Good understanding of object-oriented programming.
10. Great, innovative problem solver who can turn ambiguous problem spaces into clear design solutions.
11. Degree in Computer Science or a related technical field.
12. Team player.
13. Excellent written and verbal communication skills.
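For illustration only, the sketch below shows the kind of HDFS interaction described in Responsibilities and Duties (Hadoop), item 1, using the standard Hadoop FileSystem API in Java. The namenode URI and file path are hypothetical placeholders; in a real deployment, fs.defaultFS would normally come from the cluster's core-site.xml rather than being hard-coded.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Minimal HDFS round-trip: write a small text file, then read it back.
    public class HdfsRoundTrip {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder URI for illustration; usually supplied by core-site.xml.
            conf.set("fs.defaultFS", "hdfs://namenode:8020");

            Path path = new Path("/tmp/hdfs-roundtrip-demo.txt"); // hypothetical path

            try (FileSystem fs = FileSystem.get(conf)) {
                // Write a line, overwriting any previous run of the demo.
                try (FSDataOutputStream out = fs.create(path, true)) {
                    out.write("hello from HDFS\n".getBytes(StandardCharsets.UTF_8));
                }

                // Read the file back and print its contents.
                try (BufferedReader reader = new BufferedReader(
                        new InputStreamReader(fs.open(path), StandardCharsets.UTF_8))) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        System.out.println(line);
                    }
                }
            }
        }
    }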
Please send the following information while applying (mandatory):
1) Your willingness to work from office/client locations in Pune or Thane, MH, India, and other offices in India and abroad.
2) Your GitHub account handle (if you have contributed to open source projects and/or have uploaded your personal work).
3a) Your current CTC.
3b) Your expected CTC.
4) Your notice/joining period.
5a) Your willingness to complete an open-book test assignment (remote), in which you will develop a demo solution and explain/demonstrate how it works to the technical panelist team.
5b) Your willingness to complete the HackerRank Algorithms Evaluation (Easy and Medium complexity).
6) A detailed resume that includes project information, duration of the project, team size, your role and responsibilities, and your contribution to the projects.
7) Your willingness to self-learn new technologies in Machine Learning and others as required from time to time to design and implement solutions.
8) Your willingness to commit a minimum of 18 months with Intelliconnect.
9) Confirmation that you are physically and mentally fit for the software engineering profession.
10) Your willingness to take complete ownership of allocated work; share verifiable examples/instances where you took ownership of work.
11) Your willingness to self-register on the NASSCOM National Skill Registry site.
12) Whether you are a team player and willing to work as part of a team.
