Agoda.com is one of the largest online hotel and accommodation platforms in the world. As a Priceline Group company, we are part of the largest online travel company in the world. With over 700,000 properties across 197 destinations, offered in 38 languages, our globe-spanning network of travel experts is at the cutting edge of an industry that is rapidly transforming the way the world travels.
Technology is not just what we do—it’s at the heart of who we are. From IT professionals managing hundreds of millions of data points to market managers building relationships with hotels to PR specialists networking with media around the globe, Agoda.com delivers an exciting, fast-paced environment in every one of our 36 locations worldwide.
Our workforce of over 1,800 comprises 64 nationalities and is truly one of our biggest strengths. We pride ourselves on a dynamic workplace where creativity thrives and collaboration is key. A positive, enjoyable culture where people work hard but smile often is what keeps our team spirit high, and we encourage communication to be open, frequent, and constructive.
We choose people who are dedicated to making things great, who are able to push boundaries, and who understand that cutting-edge products come from cutting-edge ideas. Our industry moves fast, and so must we—but we have a great time doing it.
Journey with us.

Are you the Big Data Pipeline Engineer we're looking for?
We are a small, passionate team looking for exceptional individuals to be a part of designing, building, deploying (and probably debugging) our data pipeline. Our systems scale across a multitude of data centers, totaling a few million writes per second and managing petabytes of data. We tackle problems ranging from real-time data ingestion and replication to enrichment, storage, and analytics. We are not just using Big Data technologies: we're pushing them to the limit.
Join the team that builds and runs the data pipeline infrastructure: the backbone of all data event logging within Agoda, and crucial to real-time monitoring of Agoda systems across geographically distributed data centers. The always-on data pipeline feeds logs, events, and metrics into Hadoop, ElasticSearch, Spark clusters, and other distributed systems that drive key business processes such as data-driven business intelligence, near-real-time (NRT) monitoring, A/B testing, centralized application logging, and stream processing, to name a few.
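Conceptually, the pipeline above is a producer fanning each event out to several downstream sinks. A purely illustrative Python sketch, with in-memory lists standing in for Kafka and the downstream stores (all class and sink names here are invented for illustration):

```python
class InMemoryPipeline:
    """Toy stand-in for the event pipeline: routes each published
    event to every registered sink."""

    def __init__(self):
        self.sinks = {}

    def register_sink(self, name, store):
        self.sinks[name] = store

    def publish(self, event):
        # In the real pipeline this would be a Kafka produce; each sink
        # (Hadoop, ElasticSearch, Spark) consumes from the same topic.
        for store in self.sinks.values():
            store.append(event)


pipeline = InMemoryPipeline()
hadoop, elastic = [], []
pipeline.register_sink("hadoop", hadoop)
pipeline.register_sink("elasticsearch", elastic)
pipeline.publish({"type": "page_view", "dc": "ams"})
# Both sinks now hold a copy of the event.
```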
Ensuring data quality, integrity and accuracy is a core part of our identity. You should be eager to solve problems that come from managing and making sense of large amounts of data. You should be comfortable navigating the following technology stack: Linux, JVM, Java/Scala, C#, Golang, Kafka, scripting (Bash/Python), Hadoop, ElasticSearch.
If that sounds like you, apply now and let's talk!

How we'd like you to help us:
- Build, administer, and scale data pipelines that process billions of messages a day spanning multiple data centers
- Develop and expand upon existing frameworks that are used by teams throughout Agoda to produce messages to the data pipeline
- Build and manage data ingestion into multiple systems (Hadoop, ElasticSearch, other Distributed Systems)
- Build tools that monitor high data accuracy SLAs for the data pipeline
- Fix production problems
- Profile systems for performance, self-recovery, and stability
- Collaborate with other teams and departments
- Automate system tasks via code as needed
- Explore available new technologies that improve our quality of data, processes and data flow
- Develop quality software through design review, code reviews, and test-driven development
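To make the SLA-monitoring bullet above concrete, one simple check compares the number of messages produced against the number that actually landed downstream. A hedged sketch (the function name and SLA threshold are invented for illustration, not Agoda's actual tooling):

```python
def completeness_ok(produced: int, ingested: int, sla: float = 0.999) -> bool:
    """Return True when the ingested/produced ratio meets the accuracy SLA.

    produced: messages written to the pipeline in some window
    ingested: messages observed in the downstream store for that window
    sla:      minimum acceptable completeness ratio (illustrative default)
    """
    if produced == 0:
        return True  # nothing was produced, so nothing could be lost
    return ingested / produced >= sla


# e.g. 999,500 of 1,000,000 messages landed: ratio 0.9995 meets a 0.999 SLA
```

In practice such a check would run per topic and per data center, and a breach would page the on-call engineer rather than just return False.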
What we'd like to see in the candidate:
- You probably have a degree in computer science, information systems, computer engineering, or a related field.
- You have upwards of two years of industry experience, preferably at a tech company.
- You have a passion for Big Data (petabytes' worth of it).
- You have good knowledge of data architecture principles.
- You have operational experience debugging production issues.
- You have experience with at least one of the following: Scala, Java, C#, Golang, or another functional language.
- You're an experienced coder who can stand your ground.
- You have experience building systems that are flexible, well-tested, maintainable, and scalable.
- You're detail-oriented and consider every outcome of a particular decision.
- You have no issues being on-call and working at odd hours as needed.
- You can communicate fluently in written and spoken technical English.
It would be great if you also have:
- A good understanding of how Kafka works;
- Kafka administration experience;
- Experience producing messages to Kafka from any of the following languages: Java, Scala, C#, Golang;
- An understanding of concepts relating to schema registries and schema evolution;
- Experience with serialization formats such as Protocol Buffers, Avro, or Thrift;
- Proficiency in ElasticSearch;
- Development experience on Hadoop (MapReduce, Spark, Hive, Impala, Spark SQL);
- Experience with data ingestion from Kafka into Hadoop, ElasticSearch, and other distributed systems;
- Strong Linux systems administration skills;
- Experience working on or contributing to open source projects.
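The schema-evolution item above can be illustrated with a simplified version of the rule Avro-style schema registries enforce: a new schema stays backward compatible if every field it adds carries a default value. A toy Python check (real registries enforce many more rules; field names and the None-as-no-default convention are purely illustrative):

```python
def backward_compatible(old_fields: dict, new_fields: dict) -> bool:
    """Simplified backward-compatibility check.

    old_fields / new_fields map field name -> default value, where None
    means "no default" (a toy convention; real schemas distinguish the two).
    A reader using the new schema can decode data written with the old
    schema only if every newly added field has a default to fall back on.
    """
    added = set(new_fields) - set(old_fields)
    return all(new_fields[f] is not None for f in added)


v1 = {"hotel_id": None, "price": None}
v2_ok = {"hotel_id": None, "price": None, "currency": "USD"}  # added with default
v2_bad = {"hotel_id": None, "price": None, "currency": None}  # added, no default
```

Here evolving from v1 to v2_ok is safe, while v2_bad would break readers on old data.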
Compensation and perks
- Your package will include a competitive salary, private medical insurance, a pension fund, and a hotel discount scheme.
- We are happy to receive applications from both international and local candidates and offer visa sponsorship for eligible candidates.
- We offer a full relocation package for you and your family (including flights, a 20ft container, pet relocation, and 1 month's free accommodation).
- You'll enjoy our quarterly drinks, annual parties, and monthly social activities.
- We have a casual dress code in our offices in one of the most desirable locations in Bangkok.