Data Engineer at Operational Awareness team within Digital Service Factory (DSF)
Start: December 2022
Job Location: Pune, India
Vanderlande provides baggage handling systems for 600 airports around the globe, capable of moving over 4 billion pieces of baggage per year. For the parcel market, our systems handle 52 million parcels per day. All these systems generate data. Do you see a challenge in building data-driven services for our customers using that data? Do you want to contribute to the fast-growing Vanderlande Technology Department on its journey to become more data-driven? If so, join our Digital Service Factory team!
As a data engineer, you will deliver data intelligence solutions to our customers around the globe, based on an innovative new product that provides insights into the performance of their material handling systems. You will implement and deploy the product and design solutions that tailor it to customer needs. You will work with an energetic, multidisciplinary team to build end-to-end data ingestion pipelines and to implement and deploy dashboards.
· You will design and implement solutions to bridge the gap between the data intelligence product and the customer needs.
· You will be deploying, testing, and documenting project solutions for our customers and customizing them for their specific needs.
· You collect feedback and continuously look for opportunities to improve the existing standardized product.
· You will build data pipelines and enable further project implementation.
· You will design and develop data models that map raw data to domain knowledge.
· You will work with multidisciplinary internal teams to develop and deploy our product as well as project-specific solutions.
· You will monitor and support implemented project solutions at our existing customers.
· You embrace working in an international, diverse team, with an open and respectful atmosphere.
· You will be part of an agile team that encourages you to speak up freely about improvements, concerns, and blockages.
· You enjoy working independently and self-reliantly, communicate proactively, and take ownership to provide the best possible solution.
· Experience in guiding engineers
· At least 2 years of experience building complex data pipelines and data solutions.
· Bachelor’s or Master’s degree in Computer Science, IT, or equivalent.
· Experience with event-processing tools such as Splunk or the ELK stack.
· Hands-on experience with data modeling
· Hands-on experience with programming in Python.
· Experience in data engineering using DevOps principles
· Experience in building highly performant and secure data pipelines
· Data schemas (e.g., JSON, XML, Avro).
· Storage technologies (e.g., Azure Blob Storage, SQL, NoSQL).
· Deploying services as containers (e.g. Docker)
· Streaming and/or batch storage (e.g., Kafka, Oracle) is a plus.
· Stream processing (e.g., Spark) is a plus.
· Working with cloud services (preferably with Azure)
· Experience in building APIs is a plus.
· Experience in data quality management and monitoring is a plus.