In an article dating from last May, The Economist argued that data was “the world’s most valuable resource.” We could not agree more. But gathering data is not enough: finding appropriate ways to visualize and communicate it is just as crucial.
In her “#musedata is cool” presentation, Angie Judge from DEXIBIT reminded us that we all take in information in different ways. Some people respond better to raw numbers; others prefer visual representations, creative propositions, stories or text. It is essential to adapt to whoever you are trying to reach. Good data visualization should help your team make informed strategic decisions. When paired with geolocation technology, it gives you a better understanding of how visitors progress through a space and how you can improve the visiting experience based on their habits.
Data visualization is not dumbing down information
Elizabeth Bollwerk is an archaeological analyst at the Digital Archaeological Archive of Comparative Slavery (DAACS), an initiative of the Thomas Jefferson Foundation that shares all of the data about its archaeological collections on its website. In her presentation, she stressed the need to “[portray] the data meaning accurately and ethically”: presenting data in a creative way shouldn’t compromise the accuracy of the information being shared. Good data visualization should help you understand information quickly but also allow you to make informed strategic decisions.
The future of data visualization is bright
Jeff Steward from the Harvard Art Museums presented a series of extremely creative data visualizations, including a representation of the museum’s whole collection in the form of a solar system. “Can data be immersive?” he asked. “What if I could walk through data with virtual reality?” This would certainly be a wonderful way to weave data into a tangible and playful experience.
Jeff Steward: Applying astronomical metaphors to render the collection as a series of solar systems.
Angie Judge described what the future of data visualization looks like for DEXIBIT: machine learning models predicting what tomorrow’s data will be and friendly chat bots answering decision-makers’ questions based on the data collected. Exciting, isn’t it?
Our partner, Wezit, is a software development company that integrates Accuware’s technology to develop complete solutions, such as tour apps for cultural institutions. Our latest collaboration is a geolocated mobile app for the newly reopened Musée d’Arts de Nantes.
This post is about our latest product: a computer vision-based indoor positioning system (IPS) capable of delivering a mobile device’s location with high accuracy, with no infrastructure and minimal site preparation.
We asked our CEO to walk us through a demonstration of the new product. He happily agreed. This is what happened.
Dragonfly’s 2.0 Capabilities
Accuware Visual Positioning System, code-named Dragonfly 2.0, is a SLAM-based system. In brief: Dragonfly uses a camera to perform Simultaneous Localization And Mapping (SLAM), a technique described as “the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent’s location within it.” The agent in question is an autonomous mobile device, most likely a robot that moves through physical space carrying a camera hooked up to a computer.
A minimal site setup is required: at least 3 markers must be placed throughout the site. They encode specific location coordinates, and are used by Dragonfly to initialize its position. Markers are easily generated on a regular printer.
Dragonfly was designed to be thrifty in its use of available resources. As expected, the device’s camera is the main drain on battery power, while the software, delivered as an SDK, has been optimized to run efficiently on a regular smartphone and is also available on other platforms.
Dragonfly uses the device’s camera to analyze its surroundings, map the environment around it, and determine its location in reference to its physical context. This data can be shared with an external system (ex. a server) to track the device’s location over time.
Putting the demo together
Our first challenge was finding a suitable autonomous mobile robot, fitted with a computer and camera, to run the demo. After some deliberation, we realized that our faithful office iRobot is an autonomous mobile platform, while a regular smartphone could provide the requisite computational power plus video camera. A smartphone car bracket mounted on top of the iRobot completed the setup for an Android Nexus 5X.
We installed the test mobile app on the Nexus 5X. It integrates the Dragonfly 2.0 SDK and uses available WiFi, 3G or LTE to upload data to our cloud-based server, whose dashboard displays the phone’s real-time location on the floor plan we uploaded.
The advantage of this setup is that it can be easily replicated by anyone interested in testing this product to learn how it behaves, with the convenience of using everyday items. Note also that household robots like the iRobot are designed to discover and avoid obstacles in the physical environment where they move. This matches the practical use cases we envision.
Here is the demonstration
The video shows the highlights of this experiment: mount the smartphone on the robot and start it. As the robot moves about, Dragonfly maps the environment while reporting the sequence of locations reached, and the server displays these on the dashboard as a moving dot.
Note the spaghetti diagram depicting the robot’s entire trajectory:
And note the heatmap, which highlights the physical areas covered by the robot. Both diagrams were built using location data collected during the test.
What applications do we envision for this product? Here’s a short list:
Guiding pick-and-place autonomous mobile robots in warehouses
Mobile industrial robots in “Smart Manufacturing” environments
Roaming security robots patrolling buildings, parking lots and garages
In other words, physical location, as provided by Dragonfly 2.0, can be a key ingredient for IoT applications in industrial settings. It is the “location sensor” in environments variously referred to as “Industry 4.0”, “Industrial Internet”, “Connected Enterprise”, “Smart Manufacturing”, “Smart Factory”, “Manufacturing 4.0”, “Internet of Everything” or “IoT for Manufacturing”.
Rounding it up
Accuware is a technology-driven company; it is in our DNA. We build technology with practical uses in mind, developing products that leverage different technologies, all focused on providing physical location in the real world.
Dragonfly 2.0 is today’s accomplishment, the result of many months of R&D. The product launch follows the model of previous releases: interested parties get a chance to evaluate the new product in their own environment, our Tech Support group assists committed users to ensure their success, our Partners network comes up to speed on the new release, and case studies follow.
We are thrilled by the possibilities of this product. Do you want to try this system in your environment? Contact us for details.
Incidentally, the heatmap helped us visualize the iRobot’s coverage, so we redirected it until the office had been thoroughly swept clean and looked pristine. We have the data to prove it. Now, that’s quality control.
Based in Barcelona, Spain, BeRetail is a provider of innovative solutions in Spain’s retail market. Part of the Be Motion group, which focuses on online marketing, BeRetail delivers solutions for bricks-and-mortar clients that combine mobile, location and analytics, enabling new practices and methods for its customers.
As we well know, retail is a fiercely competitive space that over the past few years has been transitioning from simple bricks-and-mortar to a new era that merges the physical world, online and mobile. Responses to these challenges have included omnichannel strategies with a renewed awareness of the power of mobile, location-awareness and analytics.
Adoption of new ways of shopping throughout the world has been uneven. PricewaterhouseCoopers’ (PwC) Total Retail 2017 describes the retail market worldwide, using the Spanish market as an example to illustrate a societal transition in which online shopping is gradually but firmly gaining ground on bricks-and-mortar. Major retailers, both domestic and foreign (European and American), are changing their game plans, responding with new approaches, technology and methods. BeRetail has played a key role in delivering innovation to this market. Here are some of their projects.
Major Supermarket Chain
Taking advantage of the ubiquitous mobile devices, a major chain of supermarkets is deploying an array of in-store solutions targeting their customers and their operations:
Proximity marketing. A customer walks by a specific area of the store and receives information about an item of interest: either a promotional offer or simply product information about an item available nearby. The information is always contextual: it may take into account the customer’s interests, the time of day, and the location inside the store, a gentle reminder that the object of interest is just down the aisle or around the corner. This is the classic application of iBeacons paired with a custom app, made possible by having the customer install the store’s app on their phone and, ideally, share their shopping preferences.
Business intelligence. Through analytics, merchandising specialists and store managers can gain insights by analyzing customer behavior inside the store: Where do customers go? How long do they linger? What are the preferred paths through the various sections? How do path, speed and dwell time change throughout the day? On different days of the week? Does anyone linger in front of endcaps? How many customers spend time at brand-specific exhibits? How long are the checkout lines at various times of the day?
These insights, and many others, are made possible by tracking customers’ movements throughout the store. For instance, at a supermarket, shopping carts and baskets can be fitted with electronic tags that broadcast their location inside the store. This makes it possible to trace the whereabouts of customers moving through physical space, enabling heat maps and traces that illustrate people’s movements over time. Note that personal privacy is never compromised by this technique, as no personally identifiable information is ever collected.
Online ordering with in-store pickup services. Supermarkets have implemented systems that let customers order the items on their shopping list online or via their smartphones, and have store employees gather the items, bag the order, and have it ready for pickup when the customer shows up. This can be especially convenient for hurried parents driving home from work, with children in the back seat, just picked up from school or daycare.
The store can radically increase its employees’ efficiency with a system that analyzes the shopping lists and plots the most efficient route through the store, so that the employee collecting the order is guided from item to item without missing anything. An indoor navigation system enables the required step-by-step guidance. Just picture a tablet attached to the shopping cart, guiding the employee along the shortest path. Such a system requires a database that accurately maps items to their locations throughout the store.
Auto Mechanic Shop’s performance
A chain of auto mechanic shops serving insurance agencies decided to use actual performance data to differentiate itself from the competition. Customers could now have answers to performance questions: How long does it take for a vehicle to be serviced for a given problem? What is the estimated service time per problem type? How many vehicles are being serviced? At what step in the service chain is a given vehicle? What’s the estimated service completion time for each vehicle? What’s the shop’s performance rating?
The company decided to deploy a solution to measure the efficiency and performance of its shops. The solution tracked the location of each vehicle throughout the service stages at the shop. An indoor location tracking tag was attached to each vehicle. Each vehicle’s position within the shop was tracked automatically throughout the service process and its location was made available both through an online dashboard for customers to access, and through a mobile app, which also provided status alerts.
The system significantly increased transparency. No longer were customers in the dark about how service to their vehicle was progressing: they could now check where their vehicle was in the chain, get its status, and see the shop’s performance rating.
Behind the scenes
It takes an innovative solution provider like BeRetail to bring complex applications like these to life, addressing real business needs with technology. And it takes technology partners to provide some of the required building blocks, reducing time to market and risk.
BeRetail bridges the gap between customers’ needs and technology. Accuware provides its location platform and support to ensure a successful implementation.
Using Accuware’s products, BeRetail has access to functionality such as:
Proximity applications were implemented using a standard iBeacons infrastructure: Bluetooth Low Energy (BLE) iBeacons (hardware) deployed throughout a venue to trigger branded mobile apps running on customers’ phones and tablets. When actual location, rather than mere proximity, is required, Accuware Indoor Location may be the right tool.
Indoor tracking of objects (ex. shopping carts, cars) was enabled by Accuware Bluetooth Beacon Tracker. This product uses small form-factor BLE beacons (ex. iBeacons) attached to objects, whose signals are triangulated by a set of beacon sensors (hardware) deployed throughout the venue. Note that tracking people may also be possible with this system.
People tracking was made possible by Accuware Wearabouts, which relies on apps running on mobile devices (either smartphones or “tags”), which sense ambient Wi-Fi signals to estimate their location in a venue.
Indoor navigation was implemented leveraging Accuware Indoor Navigation, which relies on apps running on tablets or smartphones that sense ambient WiFi signals augmented by signals from device sensors like gyroscopes and accelerometers.
Experience shows that no single technology works best for all location-aware applications. It depends. That’s why solution providers like BeRetail can partner with platform providers that combine the latest technologies with worldwide coverage and support.
It is a win for everyone; especially for our common customers.
On June 23, 2017, the Musée d’Arts de Nantes reopened after a six-year, €88+ million renovation. The beautiful main building was renovated and expanded. A new building, the Cube, was dedicated to contemporary art, and a large inner space was opened for public exhibits. In addition, digital technologies were deployed throughout the museum to make it all accessible, creating unforgettable new experiences for all visitors.
Created in 1801 as a provincial museum, the Fine Arts Museum of Nantes, now named Musée d’Arts de Nantes, is the largest and most important institution in France’s Loire region. Its collections contain works from all major French and European art movements, placing it among the largest public collections in France, alongside the notable Museums of Fine Arts of Grenoble, Lille, Lyon, Montpellier and Rouen.
The physical renovation was commissioned to the British architectural firm Stanton Williams, and the digital implementation was entrusted to the French digital agency Mazedia. The results were stunning.
Stanton Williams delivered a magnificent new and expanded physical space, transforming the 19th-century complex into an “urban quarter” that encompasses the original building (“le palais”), the chapel (“l’oratoire”), the garden area now open for exhibits, and a new building, the Cube, destined to house contemporary art. Exhibit space was enlarged by almost a third, making 900 additional pieces of the 13,000-piece collection available to visitors.
The digital transformation was equally remarkable. Mazedia deployed Wezit, a transmedia storytelling platform for implementing interactive programs at cultural institutions using multiple media formats. At Nantes, three Wezit-enabled attractions were deployed: the Ma Visite mobile app for visitors’ smartphones and tablets, interactive kiosks and the “Under the magnifying glass” stations.
There were twin objectives for the digital transformation: enabling the content curation desired by the museum and creating a superb visitor experience.
Curation meant presenting the content so as to highlight the artwork as conceived by the museum’s curators.
Visitor experience required additional work: a key requirement was that visitors be able either to create their own tour and operate autonomously, or to take advantage of predefined tours designed around specific topics.
Since the renovated museum complex is quite large, new passageways have been created between buildings, and some spaces are now used differently than before the renovation. As a result, one of the museum’s big challenges was helping visitors find their way easily inside the new, larger space. Location awareness, both of artwork and of visitors, became a key concern, and indoor location helped tackle these problems.
Ma Visite mobile app
Ma Visite (“My visit”) is an app available for both iOS and Android devices. It provides access to key information and functions:
the list of all curated museum tours, which encompass the museum’s collections, its architecture, works grouped by time period, or tours such as “looking for love” and the “literary tour”
tours for families
the “must-see” exhibits
the ability for visitors to build custom tours based on their personal interests, even before reaching the museum grounds
special event announcements
indoor navigation, for visitors to find their way throughout this vast physical space.
Especially notable are the “must-see” exhibits (“Œuvres incontournables”), the masterpieces that set this museum apart. Here is a peek at the user experience:
The museum is large, spanning multiple levels over different buildings.
Complementing the Ma Visite app, the kiosks give visitors the ability to instantly find their location, and search for exhibits of their interest with the convenience of a larger screen.
All artifacts can be found through these kiosks. This way visitors can learn how to locate the specific artworks they wish to see.
“Under the magnifying glass” stations
Strategically placed in front of the major exhibits, these stations give visitors the ability to explore notable artwork in detail.
The stations provide the “magnifying glass” (“Œuvre à la loupe”) that enables visitors to delve deeper into the artist, history and significance of these exhibits.
Interestingly, young visitors, accompanied by their parents, are instantly fascinated by these digital devices, rapidly mastering the ability to learn through them.
Behind the scenes
What made the digital magic possible? Here is how it happened.
Mazedia created the graphics and designed interfaces for the various devices: the website, the mobile app, the kiosks, and the “magnifying glass” stations. All of them are linked to Wezit, the transmedia platform that enables structuring and personalizing the digital content published across devices. Visitors, in particular, may use a variety of devices, such as smartphones, tablets, kiosks or the website portal. Also, Wezit enables museum curators and administrators to manage the digital content.
A key element of the information displayed to visitors is their location and the location of the artwork they may see during their visit. Artwork location is displayed as part of search results.
Technically, there are two different ways of providing location. One uses proximity, as in “I’m standing near the ‘Le Déluge’ exhibit.” The other uses actual geographic coordinates, as in “I’m on the 2nd floor, in the 3rd gallery, near the center of the room”; this second version delivers latitude, longitude and level.
iBeacons were deployed throughout the museum to provide location. These Bluetooth beacons deliver both proximity to specific spots, as the standard iBeacon functionality delivers, as well as indoor location using Accuware’s Indoor Navigation product, which first maps the location of each and every iBeacon, and then delivers coordinates to mobile apps based on the user’s real time location.
With location functionality firmly embedded in Ma Visite, visitors were able to pinpoint their current location and get guidance to locate and see any artwork of their choosing, combining autonomy and flexibility for a great experience.
Early feedback from digital infrastructure users has been very positive. Children, in particular, were wildly enthusiastic in their embrace of new technology, which instantly multiplied their interest in exploring and learning. And that is music to the ears of all those who contributed to this remarkable transformation.
Our latest solutions for 3-D positioning and video analysis
The Accuware Team will be at the International Security Conference & Exposition, ISC West, from April 5th to 7th. At our booth we will demonstrate the latest features of products designed to deliver high reliability and location accuracy in a variety of applications. Our latest offerings include the Video Positioning System (VPS), code-named Dragonfly, and the Video Location Monitor. These products expand Accuware’s portfolio, which harnesses multiple technologies to provide solutions for locating, tracking and monitoring people and mobile assets in the physical world.
An early version of VPS Dragonfly made its debut at CTIA 2016, the main trade show for the wireless industry. At that time, VPS Dragonfly was able to estimate the location of a camera-equipped mobile device in three dimensions, with an accuracy of just a few inches. It determined its position by decoding visual markers placed around a venue. The coordinates it produced included latitude, longitude, height off the floor and level (e.g. 1st floor, 2nd floor), which makes it appealing for autonomous mobile robot and drone applications.
The latest Dragonfly has learned new tricks: by first performing a “visual fingerprinting” of a venue with a device’s camera, Dragonfly can now take a snapshot anywhere around the venue and determine its location by matching that image against the database of “ambient visual fingerprints”. The new system is also self-optimizing: it keeps “learning” the environment, so its resolution and location accuracy improve over time.
This video shows a demonstration of the new VPS Dragonfly functionality as it would be seen when run on a drone:
Visit us at booth 6140 for a live demonstration of this product.
Video Tracker is a new tool that delivers useful features for intelligent video analysis.
Imagine a CCTV system displaying a video feed on multiple screens, under the watchful gaze of several human operators. Now imagine those operators suddenly wondering whether a person currently visible on the screen has appeared before. Is that person currently on camera 3 the same one we saw on camera 6 about 20 minutes ago? Have we seen this person earlier? Perhaps earlier this morning? At what time? Was that same person present multiple times? When and where?
Now picture the CCTV feed being analyzed in real time: people visible on the screen are being identified by their appearance, including body bulk, height and clothing color (face recognition is not used). Imagine a database containing small snippets of video collected over time, with annotations of who is visible on each snippet, when it was taken and where. Incidentally, these snippets are referred to as “tracklets”, meaning “portions of a track”, a “track” being the sequence of physical positions of a given subject as it moves through physical space over time.
Once a person’s appearance is uniquely identified and categorized, this enables tracking the movement of that specific individual across multiple camera views over time. In other words, classifying tracklets in real-time provides the ability to track individuals moving through a monitored venue over time. Most importantly, the tracklets database enables visual search.
Visual search works as follows: knowing a specific individual’s appearance, it is possible to search the tracklets database to learn when and where that individual has appeared on screen over time. Visual search is one of Video Tracker’s key features. Here is a demonstration.
As shown in the video, other key features of Video Tracker include the ability to perform traffic analysis, definition of geofences, and the visualization of the varying density of people present in a monitored area over a period of time.
Regarding Video Tracker’s search feature, it is important to emphasize that the system does not use facial recognition, given the impracticality of this technology on relatively small and low-resolution video footage. Privacy considerations should take into account that Video Tracker’s features enhance video surveillance mechanisms already in place.
Accuware is an engineering-driven company that develops products to serve a wide variety of industries. We strive to provide cost-effective products that are easy to deploy, maintain and integrate into complete solutions. And our worldwide technical support coverage helps us ensure that our customers and partners succeed.
We work directly with customers, and through a network of Application and Implementation Partners, which gives us worldwide reach across many industries.
We will be delighted to meet you at ISC West 2017, booth 6140. We look forward to our in-person conversation about your requirements for location-aware applications. Contact us for further information.
As we know, Apple’s launch of iBeacons in 2013 started a gold rush in proximity-based applications. Since then we have become familiar with using the tiny, long-lived devices to trigger delivery of contextual information. Museum visitors get vivid displays about the exhibit they are approaching. Shoppers at bricks-and-mortar stores receive promotions when walking by a specific section of a store. And there’s more…
Later on came the systems that leveraged iBeacons’ ambient signals to provide location coordinates, latitude and longitude. And now, the latest systems enable efficient tracking of people and assets through a physical indoor space. How does it all work?
How beacons are used
Bluetooth beacons are small form-factor devices consisting of a radio transmitter and a battery in a tough enclosure. They are a class of Bluetooth Low Energy (BLE) devices that periodically broadcast their identifier; that is, they “advertise”. Nearby devices that “listen” for BLE signals react to these advertisements. Different behaviors are possible for the receiver of these signals. Some examples:
Proximity applications. Apple’s iBeacons protocol enables an app running on the client device to react to iBeacons placed at given locations in a venue. The app “listens” for advertisements and, upon detecting a specific identifier, displays content downloaded from a server.
Device location by ambient signals. Again, iBeacons are placed throughout a venue. By first scanning the iBeacons’ signals, a database is built of the iBeacon identifiers detected throughout the space. Client devices running Accuware Indoor Navigation can then obtain their physical location (latitude, longitude and level) in the site.
Tracking mobile devices with BLE receivers. In this case, BLE “receivers”, referred to as BLE nodes, are placed at known locations throughout a venue. This time, the BLE beacons are the ones moving, attached to mobile assets or carried by people. Each beacon is configured to identify an asset or a person.
Note that this turns the iBeacon model on its head. It is the BLE beacons that move while the BLE nodes are static. When three or more BLE nodes detect the same beacon, the system can triangulate that beacon’s location. This is how Accuware Bluetooth Beacon Tracker works.
To summarize, three models are possible: BLE beacons can be used to trigger (contextual) proximity actions, to obtain indoor location, or to perform real-time tracking. But why use BLE beacons for tracking?
Let’s review what it takes to implement a tracking system based on BLE beacons.
System components include iBeacons, BLE nodes, and a cloud-based server. The iBeacons move around advertising their identifiers; the BLE nodes collect these advertisements and upload the data to the server; and the server estimates each beacon’s location relative to the BLE nodes’ known locations.
Simple. But why use BLE beacons instead of other technologies such as WiFi? Are there any advantages? Yes: the beacons’ long battery life. Let’s take a closer look.
Long battery life
One of the big challenges of tracking movement using electronic mobile devices is their battery life. Imagine tracking patients at a hospital using mobile devices whose battery life is just a couple of days. That means recharging or changing the battery overnight every two days. Now imagine tracking a valuable asset, like an EKG machine at an emergency room. How often would you want to have to recharge the battery? Now imagine tracking dozens, or even hundreds of people or assets. From the maintenance standpoint, it can be expensive and inconvenient to do it daily. Now, what if the battery charge lasted many months?
That is precisely what BLE beacons provide: extended battery life. This is because of their very low energy demand (they simply advertise their presence) and the ability to configure their advertising frequency, which is the major culprit in draining the battery.
There are several BLE beacon providers. All of them highlight battery life, for good reason.
What does it take to set up indoor tracking?
Bluetooth Beacon Tracker provides a dashboard for system administration and management. That’s where the setup process starts. The steps are:
Physically place BLE nodes on each floor, in a grid pattern. BLE nodes are simply plugged into regular power outlets.
Upload one or more floor plans or maps to the dashboard.
Overlay and resize the floor plans on top of a Google map representation of the target venue, one floor at a time.
Mark the location of each BLE node on the corresponding floor plan on the dashboard. Note that this process “anchors” the location of each BLE node to a latitude and longitude on a given level.
Register each BLE beacon identifier via the dashboard, and configure the system to receive data uploads from the BLE nodes.
And the setup is complete.
How does this work in a real application?
Tracking in real time the physical location of people and mobile assets is a critical need in many applications. Tracking patients in a nursing facility, lift trucks in a warehouse, mobile medical equipment in a hospital, or children at a child care facility, all share a strong requirement to identify and locate in real time. And very often, to learn where everyone and everything has been over time.
Setting up an application takes a few steps:
Configuring each BLE beacon’s identifier and its advertising frequency.
Recording the BLE beacon identifier via the system dashboard, to associate its identifier with a person or asset to track.
Affixing the BLE beacon to the corresponding asset, or enabling a person to carry it.
BLE beacons must be configured with a unique identifier. In addition, depending on application requirements, the frequency of advertisement should be adjusted to meet practical needs while maximizing battery life. For example, advertising every 10 seconds might suffice to track patients in a hospital, while tracking fast-moving vehicles, such as autonomous robots in a warehouse, may require more frequent advertisements. In all cases, refer to the manufacturer’s specifications to achieve the desired effect.
Would you like to try this system in your environment? Contact us for details.
CTIA Super Mobility 2016 in Las Vegas, the event “about everything wireless”, was bigger and better than ever. A global audience descended on the Sands Expo for the yearly sensory feast of the latest technologies in the mobile world. And once again, visitors to our booth had a chance to see our latest indoor location products in action.