DevSecOps Engineer in the Data Acquisition team within the Digital Service Factory (DSF)
Hay Level: Hay 60 & 70
Starting date: November 2022
Job Location: Pune (India)
Vanderlande provides baggage handling systems for 600 airports around the globe, capable of moving over 4 billion pieces of baggage per year. For the parcel market, our systems handle 52 million parcels per day. All these systems generate data. Do you see a challenge in building data-driven services for our customers using that data? Do you want to contribute to the fast-growing Vanderlande Technology Department on its journey to become more data-driven? If so, then join our Digital Service Factory (DSF) team!
Your Position
Within the Data Acquisition team we are developing a secure and efficient data collection system that gathers operational data from the various equipment developed by Vanderlande and makes this data available on our Digital Services Platform. The data will be exposed through standard interfaces, enabling data consumers to build digital services such as operational awareness, predictive maintenance and process optimization, based on machine learning and AI.
Each piece of equipment can use a different protocol and different interfaces to provide data. As a DevSecOps Engineer you will design and develop standard, scalable IoT modules that collect this data and make it available on a standard platform supporting streaming and blob storage. You will collaborate with mechanical engineers responsible for the MHS, data engineers who will help you understand the requirements and define data products and integration interfaces, and platform engineers who will support you on the platform stack. You and the team are also responsible for CI/CD and monitoring in production. You may also be involved in production support during working hours.
What are your main tasks and responsibilities?
- Develop, test, deploy, document and support the IoT data collection modules. The modules consist of (complex) data pipelines that move data from (IoT) sensors and low-/high-level control components into central blob storage and a streaming platform.
- You set up and adapt our continuous integration/deployment pipeline(s) so that teams get fast feedback and can work effectively. In this process you strive for an ever-increasing degree of automation.
- You help the product deployment team with implementing and commissioning the IT infrastructure.
- You support the software that your team develops at multiple customer sites.
- You ensure that release and deployment times are as short as possible.
- You ensure that the infrastructure is set up in such a way that it supports continuous integration/deployment.
- You are continuously seeking ways to improve our DevSecOps practices so that we can improve our products and services.
Your Profile
- A Bachelor's or Master's degree in Computer Science, or at least 8 years of relevant experience with DevOps and/or software development activities.
- Able to set up a Continuous Integration and Deployment pipeline.
- Ability to inspire and coach others with regard to DevOps.
- Background in working within agile team environments, especially using the Scrum and Kanban frameworks.
- Hands-on experience in:
  - Programming:
    - Java (prerequisite).
    - C# (nice to have).
    - Python 3 (nice to have).
  - DevOps tooling for test automation and CI/CD.
  - Linux and shell scripting.
  - Streaming and/or batch solutions (e.g. Kafka).
  - APIs and databases.
  - Docker and Kubernetes.
  - Functional programming.
- Nice to have:
  - Experience with Azure cloud services.
  - Experience with Azure IoT Edge.
  - Experience with Infrastructure as Code (IaC) frameworks (e.g. Terraform).
  - Experience with the Atlassian stack.
  - Development experience with IoT data acquisition/collection modules.
  - Experience developing with technologies routinely found in IoT platforms (e.g. MQTT, TCP/IP, AMQP).
  - Experience with automated testing at different levels.