Just Eat Takeaway.com is a leading global online food delivery marketplace headquartered in Amsterdam and listed on the London Stock Exchange.
We’ve built our business on having the widest choice available on our platform, connecting millions of customers with over 155,000 restaurants across 24 countries and more than 100 different cuisines – from local independents to globally famous restaurants – all available to order via our app and website.
We provide the platform and tools to help independent restaurants move online and reach a significantly broader customer base – to generate increased orders and grow their businesses. We also provide the insights, advice, and support our growing community needs to satisfy customers and help raise standards across a vibrant takeaway sector.
We’re built to deliver behind the scenes too. Making Just Eat the great company it is takes a great team of people, which is why all of our colleagues are welcomed into a diverse and inclusive workplace where they feel they can belong. We’re passionate about nurturing our people and offer a full programme of training and support to our employees, helping them to develop their careers in a way that suits them.
No matter who you are, what you look like, who you love, where you are from, or what your religious beliefs or takeaway preferences may be, you could find your place at Just Eat Takeaway.com. We’re a diverse and inclusive workplace that promotes a sense of belonging, allowing all of our people to bring their most colourful and complex selves to work every day.
You can read more about us at: https://www.justeattakeaway.com
Description
On behalf of Just Eat Takeaway, Ciklum is looking for a Senior Data Engineer for our team in Kyiv on a full-time basis.
As our Senior Data Engineer, you’ll be part of the diverse Data Engineering team based at our Amsterdam HQ. The team’s goal is to support our existing platform, create new components and integrations, and ensure production runs smoothly.
Your main goal will be to verify data integrity and transform the data into the best format for each stakeholder, using the right tools to support their operational work.
Responsibilities
- Collect and transform unstructured data from different sources into structured outputs – e.g. columnar databases, flat files, Parquet/ORC files, NoSQL stores or streams
- Use your skills and best practices to create reliable, frequently or continuously running pipelines (see the sketch after this list)
- Create reusable, maintainable and scalable integrations and services on cutting-edge cloud infrastructure
- Model and test data, implement proper logging and troubleshoot any issues swiftly
- Collaborate effectively with team members and other stakeholders in Agile iterative processes
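For a flavour of the day-to-day work, here is a minimal sketch of the kind of frequently running pipeline described above, written with Python and Apache Airflow (both named in the requirements below). The DAG name, schedule and placeholder extract/transform/load steps are illustrative assumptions, not a description of our actual platform.

```python
# A minimal, hypothetical Airflow DAG: extract raw order events, convert
# them to Parquet, and load them into a warehouse. All names are invented
# for illustration.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    ...  # placeholder: pull raw JSON events from an upstream source


def transform(**context):
    ...  # placeholder: validate the events and write them out as Parquet


def load(**context):
    ...  # placeholder: copy the Parquet files into the warehouse


default_args = {
    "owner": "data-engineering",
    "retries": 2,                          # retry transient failures
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="orders_daily",                 # hypothetical pipeline name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",            # one run per day
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the steps strictly in order; Airflow handles scheduling,
    # retries and logging around them.
    extract_task >> transform_task >> load_task
```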
Requirements
- 4+ years’ experience in handling data pipelines, data warehouses or other (preferably distributed) data stores
- In-depth knowledge of Python, including 3+ years’ experience
- Proficient in working with Airflow
- Experience with MPP data warehouses like Redshift, Teradata, Snowflake and databases like Postgres, Oracle, MySQL etc.
- Experience in the cloud is preferred (AWS, GCP, Azure)
- Skilled in parsing structured and unstructured data (a brief sketch follows this list); knowledge of data warehousing is a plus
- An independent way of working: you prefer deploying services to the cloud yourself rather than waiting for a DevOps engineer to hand over your servers
- Passionate about clean code and ideas, and able to balance development speed against documentation and testing
- Fluent English (written & spoken) and good communication skills
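To illustrate the parsing and data-integrity skills listed above, here is a minimal sketch using pandas and PyArrow: it flattens semi-structured JSON events into a typed table and writes columnar Parquet output. The file names and fields are invented for the example.

```python
# Hypothetical example: flatten newline-delimited JSON order events into
# a table, run basic integrity checks, and write the result as Parquet.
import pandas as pd

# Each line is one raw event, e.g.
# {"order_id": 1, "customer": {"city": "Amsterdam"}, "total": 19.95}
events = pd.read_json("raw_events.jsonl", lines=True)

# Flatten nested objects (e.g. "customer") into top-level dotted columns.
flat = pd.json_normalize(events.to_dict(orient="records"))

# Basic integrity checks before handing the data to stakeholders.
assert flat["order_id"].notna().all(), "every event needs an order_id"
flat = flat.drop_duplicates(subset="order_id")

# Columnar output: compact, typed and fast to query downstream
# (requires pyarrow to be installed).
flat.to_parquet("orders.parquet", engine="pyarrow", index=False)
```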
What’s in it for you
- Challenging tasks – you and the team define each sprint: its tasks and how they will be implemented
- Smart solutions – yes, we produce them
- Career and professional growth opportunities
- Conferences, knowledge sharing activities, certifications
- Sport and team-building activities
- Smiles and jokes
To help us with our recruitment effort, please indicate in your cover letter that you saw this job posting on vacanciesinukraine.com.