Last week, the VR/AR Association held their VR Enterprise and Training Forum, discussing the increasingly viable business applications for mixed reality technology. As part of the one-day event, Overlay CEO and co-founder Christopher Morace gave a keynote talk in which he introduced the company’s new Asset Vision feature, which utilizes artificial intelligence (AI) within the augmented reality (AR) app to quickly and automatically identify features in water utility spaces. Recently, Morace spoke with Geo Week News to discuss the new feature as well as to provide more background on the company as a whole.
Prior to founding Overlay, Morace had spent his career primarily working with enterprise software, including with a social collaboration platform called Jive. There, he says he got experience working with Fortune 500 companies and began to “understand what it takes to really transform a business.” This was about a decade ago, as many different tools were starting to be developed for enterprise uses, and Morace started to become intrigued by technology like AI and AR. However, he says, he and Overlay co-founder and CTO Josh Ricau “felt pretty strongly that the white collar workers inside the four walls had been a bit overserved.”
He continued, “They were kind of drowning in technology, and everybody out in the real world trying to keep the internet up and running, and keep water flowing, they’re just struggling.” Morace notes that technology designed for those in the field did exist, but it was extremely expensive. That led to the ultimate development of Overlay, providing simple-to-use technology to water agencies for solving their problems in the field.
As alluded to above, Overlay uses AR and AI to provide easy access to crucial data in the field right on a user’s phone. For example, information from GIS systems can be brought into Overlay, and then a user can hover over an asset – such as a sewer in a road – with their iPhone and instantly receive all available metadata for that asset. Additionally, Overlay takes advantage of all of the sensors available on iPhones, including the relatively recent addition of lidar, to enable 3D scanning and modeling. Among other benefits, this cuts down on return trips for professionals in the field – a significant and persistent issue in the industry – and lets users create digital twins of assets by incorporating Internet of Things (IoT) data.
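To make that workflow a little more concrete – Overlay has not published its internals, so every name and number below is purely hypothetical – a minimal Python sketch of the kind of lookup such an app might perform looks something like this: take the point the phone’s camera is aimed at, find the nearest imported GIS feature within a small tolerance, and surface its metadata.

from dataclasses import dataclass, field
from math import hypot

@dataclass
class GISFeature:
    feature_id: str
    asset_type: str
    x: float               # easting in a local projected coordinate system (metres)
    y: float               # northing (metres)
    metadata: dict = field(default_factory=dict)

def lookup_feature(hit_x, hit_y, features, tolerance_m=1.5):
    """Return the nearest GIS feature to an AR hit point, if any lies within tolerance."""
    best, best_dist = None, tolerance_m
    for f in features:
        d = hypot(f.x - hit_x, f.y - hit_y)
        if d <= best_dist:
            best, best_dist = f, d
    return best

# Hypothetical example: the phone raycasts to a point on the road surface, the hit
# is converted to projected map coordinates, and the sewer's GIS record is returned
# so its metadata can be drawn on screen.
catalog = [GISFeature("SWR-1042", "sewer manhole", 512034.2, 4281190.7,
                      {"depth_m": 2.4, "installed": "1998"})]
match = lookup_feature(512034.9, 4281191.1, catalog)
if match:
    print(match.feature_id, match.metadata)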
The ability to do all of this with just an iPhone was a crucial piece of Overlay’s foundation. Morace acknowledges that scanning with these mobile devices certainly doesn’t solve every issue, and there are still plenty of projects that require high-end laser scanners, but for many of their users working on water utilities, projects are small enough to take advantage of simpler technology, to say nothing of the accessibility (or lack thereof) of more traditional devices. He told Geo Week News, “It was important to us that everything was off-the-shelf, inexpensive hardware like an iPhone, because we feel like when real transformation happens, it happens because you can put the device in everyone’s pocket.”
That brings us to the recent development of Asset Vision, which, as mentioned above, utilizes AI to identify assets automatically and keep an inventory for agencies. Morace notes that as he and his colleagues started going out in the field and seeing the real problems faced by professionals, he learned that utilities don’t actually have a great picture of what assets they have. They know, of course, where their pump stations are, for example, but not necessarily every asset within those stations. That’s a clear barrier to maximum efficiency. As Morace puts it, “It’s great to have IoT sensors, but IoT sensors aren’t that valuable if you don’t know what it’s on.”
That’s where Asset Vision comes in. Trained to recognize assets typical of these types of water utility stations, the tool is able to take a 3D model scanned using the Overlay app and automatically recognize and register assets within the model, thereby creating the kind of accurate inventory the industry has largely been lacking. While the AI can’t necessarily identify exactly which individual asset it is looking at – just the asset type – it can read things like serial numbers and QR codes, and it is location-aware, which allows users to subsequently attach identified assets to those IoT sensors, maximizing the value of that real-time data and creating a functional digital twin.
Asked about the current AI boom, as well as some of its shortcomings – like “hallucinations” with ChatGPT – Morace noted that this is obviously a different kind of AI than the generative tools that are dominating the mainstream news cycle. One of the crucial features of Asset Vision is that it reports a percentage indicating how sure it is that it has correctly identified an asset, providing some guidance as to when a little more human intervention may be needed.
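Again as an illustration only – none of this reflects Overlay’s actual code, and every name below is hypothetical – a confidence score like that typically feeds a simple triage step: detections above a threshold go straight into the inventory, while the rest are flagged for a person to confirm.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    asset_type: str                      # e.g. "pump", "valve", "flow meter"
    confidence: float                    # 0.0-1.0 score reported by the recognition model
    serial_number: Optional[str] = None  # read off a label or QR code when visible
    position: Optional[tuple] = None     # location of the asset within the 3D scan

def triage(detections, review_threshold=0.8):
    """Split detections into auto-accepted inventory entries and items needing human review."""
    accepted, needs_review = [], []
    for d in detections:
        (accepted if d.confidence >= review_threshold else needs_review).append(d)
    return accepted, needs_review

accepted, needs_review = triage([
    Detection("pump", 0.97, serial_number="PX-44871", position=(1.2, 0.0, 3.4)),
    Detection("valve", 0.62, position=(2.8, 0.1, 1.1)),
])
print(f"{len(accepted)} assets added automatically, {len(needs_review)} flagged for review")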
Morace also talked about the process for identifying asset types that are not already in the Overlay database, something that they say is significantly more efficient than more traditional AI training. Within Asset Vision, if there is an asset that is not identified by the AI, a user can simply put a digital box around the asset within the 3D model and enter the asset type within the app. After just a few seconds, that asset is in the programming and will be identified moving forward. Additionally, Morace mentioned that if a utility has some sort of “proprietary relationship to an asset,” they would be able to quarantine that to just their business’ account, though he notes that since most of these assets are third-party purchases they have yet to come across that scenario.
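As a rough sketch of that labeling loop – hypothetical names again, and a real system would update or fine-tune a recognition model rather than keep a simple list – the user-supplied box and label effectively add a new entry to the set of types the tool can recognize, with an optional flag gesturing at the quarantine scenario Morace described.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BoundingBox3D:
    center: tuple   # (x, y, z) position inside the scanned model, in metres
    size: tuple     # (width, height, depth) in metres

@dataclass
class AssetTypeCatalog:
    """Toy stand-in for the set of asset types the recognition model knows about."""
    known_types: set = field(default_factory=set)
    examples: list = field(default_factory=list)

    def register(self, label: str, box: BoundingBox3D, private_to: Optional[str] = None):
        # A real system would use the cropped region of the 3D scan to update the
        # recognition model; this sketch just records the label and example region.
        # `private_to` marks an asset type as visible only to one account.
        self.known_types.add(label)
        self.examples.append((label, box, private_to))

catalog = AssetTypeCatalog(known_types={"pump", "valve"})
catalog.register("chemical dosing skid",
                 BoundingBox3D(center=(4.1, 0.0, 2.3), size=(1.2, 1.8, 0.9)))
print(sorted(catalog.known_types))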
Utilities are arguably under more pressure now than at any other time, for a variety of reasons: climate change looms large over water and energy utilities around the world, while ever-increasing reliance on remote work and global connectivity puts pressure on communication utilities. It’s something that is on the mind of the Overlay team, which is why they are looking to take advantage of recent technological developments and lean on relatively simple tools to complete complex tasks. Morace reiterated the points above about using phones for digital twin creation, looking back at when iPhones first came out.
He said, “When the iPhone first came out, they had these really terrible cameras, and everyone was like, ‘Why would I use that camera? It’s so terrible.’ And the answer became, the best camera is the one you have with you. We see the same approach to this technology, which is: The best technology you can have is the one that’s going to be in your truck or in your pocket.”
Overlay is using this mantra to try to address the issues in the field being experienced by those working in these pressurized utility sectors. “I think at this time when there’s so much pressure on all these spaces – energy and water and communication – we all just need to get a lot more with a lot less, and we think this type of technology can help play that role.”
Asset Vision will officially launch on June 1.