Veovo launches 3D camera to track passengers’ end-to-end movements
At Passenger Terminal EXPO 2019, Veovo gave us an exclusive insight into its new BlipVision technology, which monitors end-to-end passenger movements.
In a busy terminal, with vast numbers of passengers, gaining an accurate and detailed picture of people's movement patterns can be a challenging task.
To address this, Veovo has launched its new BlipVision solution, which visually counts and tracks the end-to-end movement of people through a terminal to help minimise the effect of queues.
Blip Systems' Co-Founder and CTO told us more.
How does BlipVision work?
This technology is an addition to BlipTrack. It measures waiting times using 3D sensors spread throughout the airport, which follow individual passengers through a terminal and give a detailed view that supports both real-time and predictive queue analysis. Using advanced deep-learning algorithms, the solution captures individualised and fully anonymised movement patterns, providing a deeper level of flow insight. BlipVision protects an individual's identity by reporting only a numerical ID and their position within the system.
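The interview does not reveal the internals, but the privacy model described, reporting only a numeric ID and a position, can be sketched as follows. All names and structure here are my own illustration, not Veovo's implementation.

```python
from dataclasses import dataclass
from itertools import count

@dataclass
class AnonymousTrack:
    """What leaves the system: a numeric ID and a position, never an image."""
    track_id: int
    x: float
    y: float
    timestamp: float

class AnonymisingTracker:
    """Hypothetical sketch: each newly detected person gets a fresh numeric ID,
    and only (id, position, time) tuples are exposed downstream."""
    def __init__(self):
        self._next_id = count(1)
        self._active = {}  # sensor-internal detection handle -> public numeric ID

    def report(self, sensor_handle, x, y, timestamp):
        # Allocate a numeric ID the first time a person is seen; no
        # appearance data ever leaves the sensor layer.
        if sensor_handle not in self._active:
            self._active[sensor_handle] = next(self._next_id)
        return AnonymousTrack(self._active[sensor_handle], x, y, timestamp)

tracker = AnonymisingTracker()
t1 = tracker.report("det-A", 4.2, 1.0, 0.0)
t2 = tracker.report("det-A", 4.5, 1.3, 0.5)  # same person keeps the same ID
```

The key design point is that the mapping from raw detections to IDs lives entirely inside the sensor layer, so downstream analytics only ever see anonymous coordinates.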
How does real-time analysis ultimately benefit the passenger?
The BlipVision solution can provide per-airline, per-passenger-class wait times at check-in areas, including areas where counters are dynamically assigned throughout the day. This is achieved through integration with Veovo's Airport Operational Database, FIDS solutions and similar systems. The system visually counts and tracks end-to-end movements and displays the resulting anonymised movement patterns in a web-based user interface, giving wait times both in dynamically forming queues and per unique zone. For the passenger, this means the time they spend in an airport can be planned properly: knowing that the security queue is 10 minutes long means passengers can allow ample time to get through.
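Per-zone wait times could, in principle, be derived from the anonymous tracks' entry and exit timestamps. A minimal sketch of that idea (the event format and zone names are assumptions, not Veovo's actual data model):

```python
from statistics import median

def zone_wait_times(events):
    """Given (track_id, zone, entry_time, exit_time) tuples in seconds from
    anonymous tracks, return the median dwell time per zone in minutes.
    Illustrative only; the production logic is not public."""
    per_zone = {}
    for _track_id, zone, entry, exit_ in events:
        per_zone.setdefault(zone, []).append(exit_ - entry)
    return {zone: median(durations) / 60 for zone, durations in per_zone.items()}

events = [
    (1, "security", 0, 600),        # 10-minute transit
    (2, "security", 30, 690),       # 11-minute transit
    (3, "check-in-SAS", 0, 240),    # 4-minute transit
]
waits = zone_wait_times(events)
# waits == {"security": 10.5, "check-in-SAS": 4.0}
```

Because each event carries only a numeric track ID, the per-airline and per-class breakdowns would come from which zone (e.g. a dynamically assigned counter bank) the track passed through, not from anything about the person.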
Predicting what is going to happen is all well and good, but what happens when the unexpected occurs in the terminal?
In the U.S., airports often use dogs as an additional layer of security at immigration. This means that border queues are sometimes stopped so the dogs can patrol a large group of people at once, which would result in the system displaying increased queue times. However, the real-time sensors adapt to the situation and detect when the queues shorten again. The system adapts to known unknowns within the airport. Similarly, a change in the behaviour of security attendants, for example opening another lane to ease congestion, will alter the displayed queue time even though the system was never explicitly told about the change. The ability to adapt to these real-world environments means the information displayed stays accurate and up-to-date.
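One simple way such adaptation can work, offered here purely as a sketch and not as Veovo's algorithm, is to weight recently completed transits more heavily than older ones, so the displayed figure climbs while a queue is held and falls quickly once an extra lane opens:

```python
def displayed_wait(completed_transits, alpha=0.5):
    """Exponentially weighted estimate over completed transit times (seconds),
    oldest first. Newer exits dominate, so the displayed wait reacts to
    real-world changes without being told about them. Hypothetical sketch."""
    estimate = completed_transits[0]
    for t in completed_transits[1:]:
        estimate = alpha * t + (1 - alpha) * estimate
    return estimate

# Queue held for a dog patrol: transits lengthen and the estimate climbs.
held = displayed_wait([300, 300, 900, 900])    # 750.0 seconds
# An extra lane opens: recent short transits pull the estimate back down.
eased = displayed_wait([900, 900, 300, 300])   # 450.0 seconds
```

The point is that the sensors never need an explicit "lane opened" signal; the effect of the change shows up in the observed transit times and flows straight into the estimate.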
By expanding the data in the network what do you see the next steps being for this technology?
The forecasting horizon can keep expanding. The accumulation of data on different passenger flows at different times of day, on different days, means the system will always know what to expect. These forecasts can be made as long as a flight schedule is in place, which makes the real-time data invaluable to the forecasting system.
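The idea of learning what to expect from accumulated data can be illustrated with a toy forecast that averages historical counts per weekday-and-hour bucket. This is my own simplification; the real system presumably also conditions on the flight schedule itself.

```python
from collections import defaultdict

def build_forecast(history):
    """Average passenger counts per (weekday, hour) bucket from a history of
    (weekday, hour, count) observations. With enough accumulated data, each
    bucket encodes what the system should 'expect' for that slot.
    Illustrative sketch only."""
    buckets = defaultdict(list)
    for weekday, hour, n in history:
        buckets[(weekday, hour)].append(n)
    return {slot: sum(counts) / len(counts) for slot, counts in buckets.items()}

history = [("Mon", 7, 1200), ("Mon", 7, 1400), ("Mon", 14, 600)]
forecast = build_forecast(history)
# forecast[("Mon", 7)] == 1300.0 -- expected Monday-morning volume
```

In the interview's framing, the real-time sensor data plays two roles here: it fills the history that makes the buckets reliable, and it corrects the forecast when the current day deviates from the pattern.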
Can this technology be implemented at existing BlipTrack airports?
This technology is adaptable to a multitude of airport environments. It can be expanded to areas such as check-in simply by adding additional cameras. The only limiting obstacle we face is airport architecture: low ceilings mean the system requires more cameras to achieve the same coverage that fewer cameras provide under higher ceilings. With a range of cameras, the system eliminates blind spots around the terminal, and as such the technology is able to work anywhere.
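The ceiling-height trade-off is straightforward geometry: a downward-facing camera with a fixed field of view covers a circle whose radius grows with mounting height. The sketch below uses made-up numbers (field of view, overlap factor) purely to show why lower ceilings demand more cameras; none of these figures come from Veovo.

```python
import math

def cameras_needed(floor_area_m2, ceiling_height_m, fov_deg=90.0, overlap=1.3):
    """Rough geometric estimate: coverage radius = height * tan(fov/2),
    so the per-camera footprint shrinks quadratically as ceilings get lower.
    `overlap` pads the count so adjacent views overlap and blind spots are
    avoided. Assumed parameters, for illustration only."""
    radius = ceiling_height_m * math.tan(math.radians(fov_deg / 2))
    footprint = math.pi * radius ** 2
    return math.ceil(overlap * floor_area_m2 / footprint)

low = cameras_needed(500, 2.5)   # low ceiling: many small footprints
high = cameras_needed(500, 5.0)  # high ceiling: fewer, larger footprints
```

Halving the mounting height quarters each camera's footprint, which is why the same floor area under a low ceiling needs roughly four times as many sensors.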