Hadoop Apps Engineer at Jobbatical

Bangkok, Thailand

Posted on Apr 26, 2017

Agoda.com is one of the world's largest online hotel and accommodation platforms and, as a Priceline Group company, part of the world's largest online travel company. Offering over 1 million accommodation options across 197 destinations in 38 languages, our globe-spanning network of travel experts is at the cutting edge of an industry that is rapidly transforming the way the world travels. Technology is not just what we do: it's at the heart of who we are.

From IT professionals managing hundreds of millions of data points, to market managers building relationships with hotels, to PR specialists networking with media around the globe, Agoda.com delivers an exciting, fast-paced environment in every one of our 36 locations worldwide. Our workforce of over 2,800 professionals from 70 nationalities is truly one of our biggest strengths. We pride ourselves on a dynamic workplace where creativity thrives and collaboration is key. A positive, enjoyable culture where people work hard but smile often keeps our team spirit high, and we encourage communication that is open, frequent, and constructive. We choose people who are dedicated to making things great, who push boundaries, and who understand that cutting-edge products come from cutting-edge ideas. Our industry moves fast, and so must we, but we have a great time doing it. Journey with us.

Are you the Hadoop Apps Engineer we're looking for?

Our systems scale across multiple data centers, handling a few million writes per second and managing petabytes of data. We deal with problems spanning real-time data ingestion, replication, enrichment, storage, and analytics. We're not just using big data technologies; we're pushing them to the limit. In the competitive world of online travel agencies, finding even the slightest advantage in the data can make or break a company, which is why we put data systems among our top priorities. And while we're proud of what we've built so far, there's still a long way to go to fulfill our vision for data. We're looking for people who are as excited about data technology as we are to join the fight.

You will be part of designing, building, deploying (and probably debugging) all aspects of our core data platform products, solving some of the most difficult challenges in the Hadoop ecosystem today. We focus largely on building tools and applications for other teams to use; most of these run on YARN and use Apache Spark along with other cutting-edge technologies. We are also the engine that drives a fully functional, world-class data warehouse on top of Hadoop, which means providing tools to define data cubes and syncing high volumes of data from the processing Hadoop cluster to other systems using applications built in-house.
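To give a flavor of this kind of work, here is a minimal sketch of a Spark batch job in Scala that rolls up hypothetical booking events into a cube-style aggregate. This is not Agoda's actual code: the object name, input/output paths, and column names (country, property_id, revenue) are illustrative assumptions.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object BookingCubeJob {
  def main(args: Array[String]): Unit = {
    // On a cluster this would typically be submitted to YARN via spark-submit.
    val spark = SparkSession.builder()
      .appName("booking-cube-rollup")
      .getOrCreate()

    // Hypothetical input: booking events landed on HDFS as Parquet.
    val bookings = spark.read.parquet("hdfs:///data/bookings/events")

    // Cube-style rollup over country and property: counts and revenue
    // at every combination of grouping levels, including grand totals.
    val cube = bookings
      .cube(col("country"), col("property_id"))
      .agg(count("*").as("bookings"), sum(col("revenue")).as("revenue"))

    // Write the aggregate back to HDFS for downstream sync into other systems.
    cube.write.mode("overwrite").parquet("hdfs:///warehouse/cubes/bookings_by_country_property")

    spark.stop()
  }
}
```

Submitted against YARN, this is roughly the shape of a cube-defining batch application; the real pipelines described above additionally involve streaming ingestion, replication, and far larger scale.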

How we'd like you to help us:

  • You will design, build, test and deploy scalable and efficient tools and applications to process large amounts of data while keeping to the highest standards of testing and code quality.
  • You will improve the scalability, stability, accuracy, speed and efficiency of our existing data systems.
  • You will build monitoring for the data warehouse and for our other SLAs.
  • You will work with experienced engineers and product owners to identify and build tools to automate many large-scale data management/analysis tasks.


What we'd like to see in the candidate:

  • Ideally, you hold a bachelor’s degree in computer science, information systems, engineering or another related field.
  • You have at least three years of experience in designing and implementing large, scalable distributed systems.
  • You’re a proficient coder in either Java or (preferably) Scala.
  • You have operational experience debugging production issues.
  • You have a good understanding of data architecture principles.
  • You're a proficient communicator in written and spoken English.
  • NICE-TO-HAVES:
  • Working knowledge of the Hadoop ecosystem or other big data experience;
  • Experience with Apache Spark (Scala API preferred);
  • Strong design and OOP skills;
  • Python/Django;
  • Experience working with open source products;
  • Working in an agile environment using test-driven methodologies.


Compensation and perks

  • Your package will include a competitive salary, private medical insurance, a pension fund, and a hotel discount scheme.
  • We offer a full relocation package for you and your family (including flights, a 20ft container, pet relocation, and 1 month's free accommodation).
  • We are happy to receive applications from both international and local candidates and offer visa sponsorship for eligible candidates.
  • You'll enjoy our quarterly drinks, annual parties, and monthly social activities.
  • We have a casual dress code in our offices, located in one of the most desirable areas of Bangkok.
