We’re looking for a senior data engineer with a background in software development to keep improving our existing Hadoop-based system and to work with us on the migration to our future GCP-based cloud data platform.
As the engineer responsible for the big data platform, you’ll have a big impact on all data users at trivago. Thousands of workflows and processes run on our platform every day, processing terabytes of new data and creating thousands of analytical insights and dashboards. Our mission as the data platform team is to provide a state-of-the-art data platform and to enable data engineers, analysts, and data scientists to do their work.
This involves creating and managing big data infrastructure, developing tools, and writing software to make things more efficient, as well as DevOps and DataOps activities and the occasional tuning of very large jobs. You’ll get to work with a great team of experienced engineers and have the chance to work on both our open-source, Hadoop-based system and the GCP platform.
How you’ll make an impact:
- Design, implement, run, and maintain the core data infrastructure, including the Hadoop ecosystem and our future GCP-based platform.
- Develop and maintain tools for import/export, processing, monitoring, and internal metrics to observe and improve the platform and its processes.
- Continuously improve our approach, architecture, and algorithms. We encourage thinking outside the box and want our developers to challenge the status quo.
- Ensure the solutions you define are reusable and follow good design patterns and architecture principles.
- Collaborate with and provide technical solutions to other software development teams, DevOps, product, and business intelligence teams.
What we’re looking for:
- Experience in data systems and programming.
- A solid foundation in programming languages, data systems, and architecture, ideally with experience in object-oriented programming and other paradigms. Java is used extensively in our team.
- Experience engineering and setting up distributed systems.
- Ability to model and work with complex data structures.
- Ability to take responsibility for your tasks and drive them forward, while also working in a team and collaborating with others.
What makes you stand out:
- Experience working with Hadoop or cloud platforms is a plus.
- Experience with Kafka or stream processing in general.
Don’t let these nice-to-haves stop you from applying for the job! As long as you have relevant experience and the right attitude, we’d love to hear from you!
What you can expect from life at trivago:
Entrepreneurship: The freedom to take ownership of your work and drive initiatives independently. It’s the idea that counts, not the position.
Growth: Support for your development, constant new opportunities, regular peer feedback, mentorship and training.
International workforce: Collaboration with international talents from 80+ nations bringing different perspectives, backgrounds and expertise together to ensure a truly global focus.
Flexibility: Self-determined working hours and the opportunity to split your time between home and our campus in Düsseldorf: At least 2 days on campus and 3 days at home per week. We also offer further flexibility for parents of young children and talents with severe disabilities or health conditions.
Relocation and integration: Support with relocation costs, insurance, work permit and visa questions.
Equal opportunity: Commitment to creating an all-inclusive workplace, because we know representing the diversity of our users in our talent base enables us to create a more meaningful product.
A note from our recruitment team:
We know that applying for a new job can be both exciting and intimidating, but don’t worry, we’ve got you. Our recruiting team will be on hand every step of the way, but if you have any questions or concerns before applying, feel free to reach out to us at firstname.lastname@example.org. You can also find out more about our hiring process here. We look forward to your application!