Senior Data Engineer
Job Location: Pune, India
Vanderlande provides baggage-handling systems for 600 airports around the globe, capable of moving over 4 billion pieces of baggage per year. For the parcel market, our systems handle 52 million parcels per day. All these systems generate data. Do you see a challenge in building data-driven services for our customers using that data? Do you want to contribute to the fast-growing Vanderlande Technology Department on its journey to become more data driven? If so, then join our Digital Service Factory team!
As a senior data engineer you will work in the Predictive Maintenance team, consisting of data engineers, data scientists, front- and back-end developers, testers, a system engineer and a product owner. You will participate in solution design discussions led by our product architect, where your knowledge and input will be highly valued. You will work collaboratively with IT and business SMEs to deliver high-quality end-to-end data ingestion pipelines.
- You will develop, test and document the data collection and processing pipelines for Predictive Maintenance. Data collection consists of (complex) data pipelines from (IoT) sensors and low/high-level control components to our data platform. Once the data is in the cloud, it is processed and made available as data products to our data scientists.
- You will align implementation efforts with other back-end developers across multiple development teams.
- You will develop scalable data pipelines to transform and aggregate data for business use, following software engineering best practices. For these pipelines you will use the best available data processing frameworks, such as Spark and Splunk. We continuously improve the solutions we use and encourage you to keep challenging the status quo.
- You will develop our data services for customer sites into a product, using test and deployment automation, componentization, templates and standardization to reduce delivery time of our customer projects. The product provides insights into the performance of our material handling systems at customer sites all around the globe.
- You will contribute to designing, building and improving a CI/CD pipeline, including automated (integration) testing for data pipelines. In this process you will strive for an ever-increasing degree of automation.
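To give a flavor of the transform-and-aggregate work described above, here is a minimal, stdlib-only Python sketch with made-up sensor records (field names and values are illustrative; a production pipeline would run this kind of aggregation at scale in Spark):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical raw readings collected from conveyor-belt sensors
# via low-level control components (values are illustrative).
readings = [
    {"sensor_id": "belt-01", "metric": "vibration", "value": 40},
    {"sensor_id": "belt-01", "metric": "vibration", "value": 60},
    {"sensor_id": "belt-02", "metric": "vibration", "value": 30},
]

def aggregate_by_sensor(records):
    """Group readings per sensor and compute the mean value --
    the kind of aggregation a Spark job would perform at scale."""
    grouped = defaultdict(list)
    for record in records:
        grouped[record["sensor_id"]].append(record["value"])
    return {sensor: mean(values) for sensor, values in grouped.items()}

summary = aggregate_by_sensor(readings)
```

The resulting per-sensor summary is the sort of data product that downstream data scientists would consume for predictive maintenance models.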
- Bachelor’s or Master’s degree in Computer Science, IT or equivalent, with 8+ years of relevant work experience
- Programming experience in Python, Scala or Java
- Hands-on experience with CI/CD and data/code testing (e.g., Bamboo, Artifactory, Git)
- Experience with scalable data processing frameworks (e.g., Spark)
- Experience deploying services as containers (e.g., Docker and Kubernetes)
- Experience with Azure is desired
- Experience with serverless concepts on Azure
- Experience with SQL and NoSQL databases
- Experience with automated/unit testing and test-driven development
- Interest in AI and/or machine learning technology