Hadoop Openings/Walkins

- Jobs Last Updated On 09-December-2016

If you have knowledge of Apache Hadoop, we have many openings for you in various IT companies. Find Hadoop jobs in top-tier cities such as Noida, Bangalore, Mumbai, Pune, Kolkata and many more. The companies listed with Peel jobs are hiring across various positions built around managing and maintaining big data, including Hadoop developer and Hadoop administrator. If you do not have much experience as a Hadoop developer, these jobs will give you a chance to learn Hadoop and enhance your skills. So submit your application to these companies and give your career a strong foundation.


    • Extensive experience in designing, capacity planning and cluster setup for Hadoop.
    • Hadoop operational expertise: troubleshooting, identifying bottlenecks, and basics of memory, CPU, OS, storage and networks.
    • Good knowledge of HBase, Hive, Pig and the Apache web server
    • Good knowledge of performance tuning, monitoring and administration using Cloudera
    • Hands-on experience in Unix administration and shell scripting to handle file management and job scheduling


    Good to Have Skills:

    • Familiarity with open source configuration management and deployment tools such as Puppet or Chef.
    • Knowledge of any scripting language (Bash, Perl, Python).
    • Good to have knowledge of Nagios, Kafka or any message broker (ActiveMQ, RabbitMQ).
    • Good knowledge of Linux and tools like Splunk and Tableau



    Interested candidates should send their profiles to vinod.kumar@datamtaics.com

    ...
    Hadoop
    • Extensive experience working with Hadoop ecosystem tools: MapReduce, Hive, Pig, Sqoop
    • Must have experience in designing and building Hadoop-based applications, evaluating tools against requirements, and working with one or more major Hadoop distributions
    • If it is OK for you, revert back to me with your updated CV ( sowmya.palepu@anantha.co.in ) and I will share your profile with the client
    • Please also refer your friends and colleagues
    ...
    Unix Hadoop Java Linux
    • Resource who has worked on middleware applications, with excellent Java development experience on Big Data Hadoop technologies
    • Quick learner with good exposure to a direct client-facing work culture
    • 10+ years of experience designing, developing, deploying and supporting large-scale distributed systems and API development
    • 10+ years of experience (must) in Core Java, web services, REST and JMS technologies, with good knowledge of various design patterns alongside Big Data technologies
    • 5+ years with data serialization formats (POF, Thrift, JSON, XML, Avro, etc.)
    • Should have knowledge of build tools like Maven and Ant
    • Should have knowledge of continuous integration tools like Jenkins
    • Should have experience with software versioning and revision control systems like SVN, Git or CVS
    • Should have experience with static source code analyzer tools like PMD, SonarQube, Checkstyle or FindBugs
    • Should have knowledge of the TDD (test-driven development) methodology
    • Knowledge of the PowerMock, Mockito or EasyMock APIs for unit testing
    • Hands-on experience with NoSQL databases, data modelling, etc.
    • 2+ years of hands-on experience in Oracle Coherence (types of caches, caching schemes, cache services, POF format, etc.) or similar technologies
    • Please note: this role requires working in shifts (1 PM to 10 PM)
    ...
    Hadoop
  • We are hiring for a client

    We have urgent openings for the below requirements; if you are interested, please send us your updated profile in Word format.

    Mandatory Skills: Hadoop (CDH), data ingestion for CDH
    Good to have Skills: SAS/R analytics, QV, Tableau
    Domain: Financial Services

    If interested, please send your updated profile to raj@vedainfo.in along with the following details (mandatory).

    • Full name:
    • Mobile No:
    • Total Experience:
    • Relevant Experience:
    • Notice Period:
    • Current Organization:
    • Current Location:
    • DOB:
    • Current CTC:
    • Expected CTC:


    Please fill all the above details so that we can send the exact information to the client.

    ...
    Hadoop
    • Relevant Experience: 3.5-4.5 years (Total: 4-8 yrs)
    • UNIX: Shell Scripting (must have), Unix utilities like sed, awk, perl, python
    • Scheduling knowledge (Control M, Autosys, Maestro, TWS, ESP)
    • ETL Skills (Preferred): ETL Mapping Development, ETL standard environment parameters, SDLC, Data Analysis
    • Developer experience: at least 2 years
    • Project experience: minimum 2 complete projects over the last 2 years, in at least 1 of which the candidate independently led a team of 3-4
    • Database (preferred): proficient in SQL, expert in DB load/unload utilities, with relevant experience in Oracle, DB2 or Teradata
    • Project profiles: at least 2-3 source systems, multiple targets, simple business transformations on daily and monthly schedules
    • Expected to produce LLDs, work with testers and the PMO, and develop ETL mappings and schedules
    • Primary skills (must have): Hadoop, Big Data, Unix shell scripting
    • Secondary skills (good to have): Oracle, DB2, Teradata (preferred)
    • Hadoop exposure/certification preferred

    Note: This is a scheduled drive; please share your resume along with your availability to farhana.khan@capgemini.com

    ...
    Unix Big data Hadoop
    • Must have hands-on experience with one of the enterprise Hadoop distributions (Cloudera, Hortonworks or MapR)
    • Strong knowledge of data modeling, Hive, Oozie, HDF, Pig and shell scripting; good expertise in MapReduce and Sqoop
    • Ability to write complex Hive queries and sound knowledge of Cassandra, Spark and Kafka
    • Knowledge of build tools (Maven, Ant) and SVN or Git will be icing on the cake
    • Experience in databases like SQL/MySQL is desirable
    • Product development experience for large-scale systems with high-volume, high-performance requirements, plus fundamentals of multi-threading on multi-core systems
    • Experience with the product development life cycle and a process-oriented agile development environment
    • Must be technically equipped with Java, J2EE, Spring, Hibernate, REST APIs, JavaScript, jQuery and HTML
    • Must possess strong written and verbal communication skills to interact with customers/clients on a regular basis
    • Must function independently with limited supervision, act as a team mentor, and be a team player who proactively engages in a highly collaborative environment

    If you are interested, kindly send your updated profile to rajesh@msr-it.com along with the details below:

    Current salary:
    Expected Salary:
    Notice Period:
    Reason for Job Change:

    ...
    Hadoop
    • Proven expert-level understanding of Cloudera Hadoop and the Apache ecosystem, namely YARN, Impala, Hive, Flume, HBase, Sqoop, Apache Spark, Apache Storm, Crunch, Java, Oozie, Pig, Scala, Python, Kerberos/Active Directory/LDAP, etc.
    • Proven experience demonstrating the technical and operational feasibility of Hadoop architecture solutions
    • Experience and detailed knowledge of Hadoop development utilizing SyncSort DMX-H, Subversion, SQL and equivalent technologies
    • 4-year degree in Computer Science/Software Engineering or a related degree program, or equivalent application development, implementation and operations experience
    • Minimum 3+ years of related database development experience, including data architect experience working with 'Big Data' technologies such as Hadoop, ETL tools, and large datasets
    • Excellent verbal and written skills, proficiency with MS Office tools, and strong analytical and problem-solving skills
    ...
    Hadoop
    • Good hands-on experience in Java/J2EE (strength in core Java is mandatory)
    • Good hands-on experience in Spring, Hibernate and JDBC
    • Good hands-on experience in the Hadoop ecosystem (MapReduce, Spark, Hive and Oozie) at an intermediate to advanced level (AWS cloud is good to have)
    ...
  • Conduct training for our corporate clients throughout India. Conduct in-house modular trainings. Support software development at BitCode.

    ...
    Big data Hadoop
    • Experience in Java/J2EE /Hadoop
    • Open Source contributor and technology evangelist
    • Experience in designing/architecting and implementing complex projects/products with considerable data size (GB/PB) and high complexity
    • Strong knowledge of any NoSQL/graph database (Cassandra/MongoDB/HBase/CouchDB/Neo4j, etc.)
    • Knowledge of clustered deployment architecture
    • Good in mathematics, specifically statistics
    ...
    J2EE Hadoop Java
    • 2-6 years of experience with testing using Python, SoapUI, JUnit, Java, Perl and Scala
    • Basic Networking Knowledge
    • Strong Python and Java programming skills
    • Experience in Big Data technologies (Hadoop/NoSQL databases/streaming platforms/MongoDB) a big plus
    • Strong experience with automated testing and continuous integration
    • Good Unix and Linux background with administration skills
    • Experience working with an Agile team using test-driven development, with short sprint cycles of 2-4 weeks
    • Ability to influence product design in order to meet testability requirements
    • Ability to design test frameworks
    • Ability to deep dive technically in product design and modules designs
    • Knowledge of storage/network protocols a big plus
    • Knowledge of NFS, DFS and associated environments highly desired
    • Hands-on knowledge of various Java stack trace and memory mapping tools is a must
    • Experience with automation on CI environment
    • Experience with scripting languages Python, Perl
    • Experience with analyzing technical requirements and design
    • Should be a self-starter and able to work with minimal guidance
    • Demonstrated experience with data validation testing on large, complex projects using PL/SQL and big databases
    • Strong knowledge of back-end testing of databases, BI/DW, and REST/SOAP APIs
    • Understand when to use black box, white box and gray box test approaches
    ...
    • Design and develop applications using Java/J2EE technology
    • Strong knowledge and hands-on experience with MapReduce (Hadoop), Hive and Pig scripting
    • Passionate about exploring new technologies
    • Good knowledge and hands-on experience with web services development
    • Excellent knowledge of Agile Development
    • Good Communication Skills
    ...
    Hadoop
    • Programming experience in Java
    • Ability to use programming concepts and develop automation framework
    • Ability to write good quality code
    • Exposure to Big Data technologies (Hadoop, NoSQL, Storm, Kafka, etc.)
    • Experience of working in Linux environments and RDBMS – MySQL/ PostgreSQL
    • Good at scripting languages (Perl/ Ruby/ Shell/ Python)
    • Exposure to application deployment and configurations
    • Experience in Apache, Tomcat, JBoss installation, setup, configuration and writing deployment scripts
    • Understanding of application monitoring and exposure to tools like Nagios/ Ganglia/ JMX tools
    • Exposure to Cloud & Virtualized environments (AWS, EC2, ESX) will be an added advantage
    • Exposure to automated deployment tools like Chef and Puppet will be an added advantage
    • Strong verbal and written communication skills
    • Excellent analytical and problem solving skills
    ...
    J2EE Big data Hadoop Java
    • Must have experience in Java/J2EE development
    • Must have experience in Spring and Hibernate
    • Must have experience in Hadoop development
    • Must have experience in team management
    • Must have experience in client interactions
    • Good to have product development experience
    ...
    J2EE Hadoop Java
    • Must have experience in Java/J2EE development
    • Must have experience in Spring and Hibernate
    • Must have experience in Hadoop development
    • Must have experience in team management
    • Must have experience in client interactions
    • Good to have product development experience
    ...
    Hadoop Java
  • We are looking for an expert programmer in Big Data. The person should have a minimum of 1 year of experience in Hadoop, Hive and Cassandra, plus experience with Spring MVC, Tomcat and Kafka, and knowledge of the Java API.

    Strong problem-solving and good communication skills
    Performance-oriented development approach
    Must have the ability to manage several assignments

    To apply please send your CV to hr@tickto.com
     

    ...
    • Hands-on experience as a Hadoop admin/developer
    • Strong experience in Hive, Pig, etc.
    • Good communication and team skills
    ...
    Hadoop