Recruiting Hadoop Specialists

RPO Case Studies

In two years, GlobalCareer found 30 Hadoop-skilled engineers, analysts and developers. Most of these candidates received job offers from a large bank that was building a new IT project. Today we’ll share how we tracked down these hard-to-find specialists.

Hadoop is an open-source framework for the distributed storage and processing of large datasets. It originated in 2006 and is maintained by the Apache Software Foundation; its design is based on Google’s MapReduce and Google File System (GFS) papers. Today it is a primary tool for managing huge amounts of information, and IT companies, telecoms, fintechs and retail chains all compete for Hadoop specialists in Russia.
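To give a sense of the programming model these specialists work with, here is a minimal, purely illustrative sketch of MapReduce (the model Hadoop popularized), expressed as a local word count in Python. In a real Hadoop cluster the map, shuffle and reduce phases run distributed across many nodes; this single-process simulation only shows the shape of the computation.

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle step: group values by key, as Hadoop does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: sum the counts collected for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data needs big tools", "hadoop handles big data"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts: {'big': 3, 'data': 2, 'needs': 1, 'tools': 1, 'hadoop': 1, 'handles': 1}
```

The value of the model is that each phase is independently parallelizable, which is what lets Hadoop scale the same logic to terabytes of input.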


A regular client of ours launched a new large-scale project to create data marts and implement integration processes between its analytical and product systems and those of its partners. This meant working with a massive amount of data, so experienced Hadoop specialists were required. These specialists are hard to come by on the open market. A search using traditional sources was ineffective, so the client turned to us for help.

Our consultants selected the following specialists:

  • Hadoop engineers with Java experience, specializing in data warehouse (DWH) systems
  • Java developers familiar with the Hadoop ecosystem
  • Analysts who use Hadoop for big data projects

Alena Daragan, a GlobalCareer consultant, explains: “When looking for engineers and developers, we look to see if they have the skills to create flexible, scalable, manageable systems that can withstand huge loads and large amounts of data. For analyst vacancies, we select candidates with serious mathematical skills: big data analysis is impossible without excellent knowledge of statistics, probability theory, etc.”


The search is complicated by the fact that many IT specialists list Hadoop in their resumes, which creates a lot of noise to filter through. We formed an experienced team of recruiters who developed a suitable search strategy:

  • We searched through our own database (of over 2 million resumes) and selected about 700 Russian specialists with experience in Hadoop, Java/Python/R, SQL, Kafka, Spark, Cassandra and ETL processes, as well as those who have worked on big data projects
  • We analyzed the subscribers of channels like @hadoopusers, @datajobschannel and @datajobs on Telegram and compared them with our database entries
  • We studied the online profiles of authors and commentators who regularly write about big data
  • We expanded the funnel with the help of young specialists who actively participate in big data meetups and hackathons

Once we had our final list, we discussed the project with suitable candidates and asked them to recommend colleagues with the skill set we were looking for.


Through full team involvement and multiple sourcing channels, we found the Hadoop talent our client needed. The selection process took eight months, and all vacancies for the project have now been successfully filled.

The GlobalCareer database contains over 2 million resumes of IT specialists with a range of skills. Whether you’re looking for an experienced candidate with a unique set of abilities or a whole development team, get in touch with us and we’ll help you find the right candidates.

Ready to Hire the Best IT Talent?

Submit your query and we’ll get in touch
