In just about any major infrastructure project, one of the most important aspects of the work is determining what kinds of utilities and other assets are buried underground, and where. As we’ve covered in the past here at Geo Week News, the plans from the time assets were originally buried are often outdated or outright inaccurate, leading to utility strikes during the digging portions of a project. These strikes are expensive in the best-case scenario, and often extremely dangerous on top of the financial cost.
While the technology to detect what is located under the ground has long existed, we are starting to see more of an emphasis on using tools like ground-penetrating radar (GPR) and other detection technologies, and in turn, the products are getting better and easier to use. One of the major improvements around GPR in recent years has been the ability to collect subsurface data at driving speeds, rather than having to walk a push cart over an area. Given those improvements, some of the more cutting-edge firms are taking the work a step further, creating multi-array setups that combine mobile mapping and GPR workflows into one.
Recently, Geo Week News spoke with Skyler Neumeyer, a mobile mapping applications engineer with Trimble, and Matt Okubo, president, principal partner, and licensed land surveyor with WestLAND Group (WLG). Each shed some light on this growing development, what the multi-array approach adds to workflows, and some of the challenges that had to be overcome to fully implement it.
Given that most large-scale construction projects involve data both above and below the surface, it stands to reason that most could benefit from this kind of workflow. That said, Neumeyer points to a couple of buckets for which this integration makes the most sense. One is subsurface utility information, or SUI, surveys.

“Locating any subsurface utilities is part of any major construction project,” Neumeyer said. “They’re just taking this GPR that was going to be part of the major project, and the as-built and survey side of it, and just kind of combining them and running it simultaneously. They were both components that had to be done on projects.”
He also mentioned projects involving asphalt and roadway management, where firms are looking at the subgrade – the layer where the dirt meets the asphalt or concrete. This use of the multi-array approach is a bit more nascent, he says, but it makes sense in areas like Colorado, where freeze-thaw cycles come into play and GPR can “pre-identify” failures before they would appear on the road surface.
Of course, it’s not as simple as deciding it’s time to start combining these two workflows for it to happen. Okubo explains that, while the value is clear and WLG has been using these workflows for some time now, there is an initial investment – both financially and intellectually.
For GPR in particular, he says, “They’re not extremely expensive, but they do take some time to understand the technologies, and understand how to read the data.”
And even if these systems aren’t “extremely expensive,” he acknowledges that getting the right equipment for your work requires a lot of research and development, which costs both money and time. In the long run, though, that initial investment ultimately pays off.
After that portion, both Neumeyer and Okubo talked about the difficulty that can come in integrating the data from the two systems. Each acknowledged that the physical integration of hardware isn’t too much of a headache – “I’m pretty good at telling them, You make this cable and pin it out, and connect it here, and set the settings here. We can make all of that work pretty easily,” Neumeyer said – but bringing the data together in the processing phase can be a little more of a challenge.
Okubo said that process was indeed a challenge, and told Geo Week News, “There were a lot of intelligent people pulling on the same end of the rope to make that happen.”

He went on to note that even just getting people to understand the subsurface data was more difficult than with the mobile mapping point cloud, as the latter is fairly simple to intuit: people can physically go to the area to verify what they see. Still, the deliverables shown throughout this article demonstrate what they’re ultimately able to present to customers with the data.
Even with the challenges, it’s clear that the benefits are worth the initial investment in money and time. One of the biggest benefits that both relayed to Geo Week News was safety. As noted above, GPR work was traditionally done by people walking a street with a push cart, often putting them in danger from passing traffic. Being able to do this work from a vehicle significantly reduces the possibility of serious injury.
Neumeyer noted that he’s heard from customers, “I don’t care if it was slower and less accurate. If it keeps people off the road, we’re going to switch to [the multi-array approach].”
The data also has the potential to significantly increase in accuracy with this approach compared to more traditional GPR usage thanks to the ability to utilize the mobile mapping unit’s positioning.
“One of the biggest benefits is that a lot of the GPR systems have just a regular, lower-grade GPS chipset in them,” Neumeyer said. “So when used standalone, the GPR is less accurate. Our system’s much more expensive, so we have a very precise positioning system that is improving the overall accuracy.”
WLG uses Trimble’s MX9 mobile mapping system in their rig, and Okubo says the multi-array approach with GPR has been extremely beneficial to their processes overall. That improved accuracy from the mobile mapping system was one of the main motivations for adopting the approach in the first place.
“The thought process was, If I can export the GPS positioning from the mobile mapper into this, in theory, it should work. That motivated us to say, Okay, well let’s push the envelope on that and see what we can do.”
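The idea Okubo describes – exporting the mobile mapper’s high-accuracy positioning into the GPR data – can be sketched in a minimal way. The snippet below is a hypothetical illustration, not Trimble’s or WLG’s actual workflow or data format: it assumes a trajectory of timestamped easting/northing fixes from the mobile mapping unit, and linearly interpolates a position for each GPR trace timestamp.

```python
# Hypothetical sketch: attaching a mobile mapper's high-accuracy trajectory
# to GPR traces by timestamp. Formats and coordinates are illustrative only.
from bisect import bisect_left

def interpolate_position(trajectory, t):
    """Linearly interpolate (easting, northing) along the trajectory at time t.
    trajectory: time-sorted list of (time, easting, northing) tuples."""
    times = [p[0] for p in trajectory]
    i = bisect_left(times, t)
    if i == 0:                       # before first fix: clamp to start
        return trajectory[0][1:]
    if i == len(trajectory):         # after last fix: clamp to end
        return trajectory[-1][1:]
    t0, e0, n0 = trajectory[i - 1]
    t1, e1, n1 = trajectory[i]
    f = (t - t0) / (t1 - t0)         # fraction of the way between fixes
    return (e0 + f * (e1 - e0), n0 + f * (n1 - n0))

def georeference_traces(trajectory, trace_times):
    """Pair every GPR trace timestamp with an interpolated position."""
    return [(t, *interpolate_position(trajectory, t)) for t in trace_times]

# Example: a short trajectory segment and three GPR trace timestamps
trajectory = [(0.0, 1000.0, 2000.0), (1.0, 1001.0, 2000.5), (2.0, 1002.0, 2001.0)]
traces = georeference_traces(trajectory, [0.5, 1.0, 1.5])
print(traces)
```

In practice, the hard parts are exactly what Neumeyer and Okubo describe – synchronizing clocks between the two systems and accounting for the physical offset between the antenna and the positioning unit – but the core substitution of a survey-grade trajectory for the GPR’s onboard GPS follows this pattern.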
Most importantly, he reiterated the value of putting in the initial time for R&D, particularly with GPR. When that doesn’t happen, he says, it leads to horror stories that ultimately bring the entire sector down.
“I always just try to stress to them to make sure that they do the R&D that they should be doing. A lot of times what happens, and what gives this type of technology a bad rap, is that somebody doesn’t vet it all the way through, and then they get a bad experience. And then that client, next time someone even brings this up, they’re like, Oh no, I don’t want to do that. We had a horrible experience. Don’t do a disservice to the entire community and profession and just run it into the river.”