Visual effects houses like Marvel, Weta, and Mr. X have proven their rendering and modeling chops in a number of recent blockbusters. These artists use vector- or polygon-based rendering data to produce models and textures so convincing that it's nearly impossible to tell them from reality. There is, however, another data type taking root in the VFX industry: point clouds.
If you've been following point cloud render technology for the past several years, and seen the numerous evolutions it's made, this development won't be surprising. Bentley's POD format, through Pointools, showed the world that we could achieve photo-realism with point cloud data and render it in real time. FARO Scene showed us its Project Point Cloud tools, which color-balanced and rendered hundreds of point clouds at once. Euclideon caused some controversy with their Unlimited Detail algorithm, which demonstrated that even thousands of point clouds from thousands of scan positions could be rendered all at once… over the internet. Other companies have found ways to render point clouds quickly and with great visual fidelity as well; the list includes Autodesk, Leica, and many others.
Though Nurulize comes from a VFX background, they fit right in with any of these companies because they are using point clouds to provide more than just striking renderings. They are animating point cloud data as efficiently and effectively as any polygon rendering method I have seen, and they are using occlusion-filling algorithms to enrich this data for VR. Moreover, because this point cloud data can be streamed and rendered much like a raster, Nurulize seems to have found a way to circumvent the polygon-count limitations common with VR and AR platforms. An attention-grabber for sure…
Recently I had the great fortune of catching Nurulize chief creative officer Scott Metzger and CEO Philip Lunn on Skype. The two VFX industry veterans spoke about Atom View, their stunning visual effects and point cloud editing software.
Company background
Metzger has spent nearly two decades working in VFX, which has given him an in-depth knowledge of modeling, texturing, and reality capture. He began his career in VR capturing point clouds for measurement, shooting HDR photos, then aligning them all by hand. After discovering FARO scanners, which offered higher-density data and colorization, he took a big step toward improving this laborious process. At that time, using point clouds for pure visualization was very new, and that gap is what set Metzger on the path to conceptualizing Atom View.
About four years ago, Philip Lunn took notice of Metzger's work during a demonstration for NVIDIA. Lunn had founded or co-founded a number of VFX companies and had deep expertise in developing new kinds of rendering software. After a few conversations, they began collaborating and Nurulize was born.
What makes Atom View unique
Nurulize's Atom View is different from other applications that use occlusion filling, a technique that detects missing parts of an image and fills them in. According to Metzger, Atom View doesn't generate any new data to add to the existing point cloud. Nor does Atom View filter, reduce, or modify the data in any way. It doesn't even use voxel technology or store normals for any sort of rendering.
What Atom View does do is use a unique GPU-based algorithm that fills in point cloud data by interpolating between points, making the point cloud appear to have a surface. In other words, it makes point clouds look like polygon geometry. While the level-of-detail algorithms do require some computation, this visual filling is performed in real time as the user zooms into various areas. Zoom in on a wall in the point cloud, for instance, and the system immediately renders it as a surface with no gaps between the points.
This is a big leap forward for point cloud computer visualization, to be sure.
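To make the idea concrete, here is a toy sketch of gap filling by interpolation. It is not Nurulize's GPU algorithm, and the arrays and resolution are invented for illustration; it only shows how interpolating colors between projected points can produce a gap-free, surface-like image without adding new source data or building a mesh.

```python
# Toy illustration only: fill the gaps between projected scan points by
# interpolating color from neighboring samples (not Nurulize's algorithm).
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
pts = rng.uniform(0, 512, size=(5000, 2))    # hypothetical projected point positions (pixels)
colors = rng.uniform(0, 1, size=(5000, 3))   # RGB color sampled at each point

# Dense pixel grid for the current view.
gx, gy = np.meshgrid(np.arange(512), np.arange(512))

# Interpolate each color channel between the scattered points; the holes
# between samples disappear without adding points or building polygons.
filled = np.dstack([
    griddata(pts, colors[:, c], (gx, gy), method="linear", fill_value=0.0)
    for c in range(3)
])
print(filled.shape)  # (512, 512, 3): a gap-free image of the point data
```

In a real-time viewer this kind of filling would run on the GPU every frame, but the principle is the same: the screen is filled in while the stored point data remains untouched.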
Untitled Project, courtesy RedPillVR / Marshmallow Laser Feast
It works with industrial scanners
Fortunately, Atom View accepts many file types, including FLS, PTX, PTC (RenderMan), V-Ray volumetric renders, E57, and PLY. This bodes well for owners of just about any scanner on the market, as these formats are relatively universal or can be generated with a quick conversion from a proprietary scan format. So, unlike some software out there that is built for only a handful of scanner types, Atom View is designed to be the "Photoshop" of point cloud data.
Metzger explains the reasoning behind this choice. Since the volumetric images produced by industrial scanners include depth and color information, he says, "being able to use [these] scanners for the transition into holographic capture makes sense. Atom View is pretty much agnostic in terms of what we can accept with the software."
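As a small illustration of how universal these interchange formats are, the sketch below loads a converted PLY with the open-source Open3D library. The file name is a placeholder and the library is not part of Atom View; it simply shows that once a proprietary scan has been exported to a common format, general-purpose tools can read it.

```python
# Minimal sketch: read a point cloud that has been converted from a proprietary
# scan format into PLY, using the open-source Open3D library (not Atom View).
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan_converted.ply")  # hypothetical converted file
print(pcd)               # point count and bounds
print(pcd.has_colors())  # True if per-point RGB survived the conversion
```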
Untitled Project (Tree scans: a two-day project that included a Surphaser scanner with Nikon photography captured by Mimic 3D). 1:1 re-creation of an area within a 30' traversable space in VR. Courtesy RedPillVR / Marshmallow Laser Feast
It offers better colorization
Many of the scanner-friendly point-cloud colorization tools on the market today use only a few exposures and tone mapping to generate a final image. Tone mapping is a complex process that makes images easier for standard displays to reproduce, at the cost of greatly compressing the image's dynamic range.
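To see why that compression matters, consider a simple global tone-mapping operator. The Reinhard curve below is just one common choice, used here purely as an illustration rather than as what any particular scanner tool actually applies, but it shows how a huge spread of linear radiance values gets squeezed into a narrow display range.

```python
# Illustrative only: a global Reinhard tone-mapping curve, L / (1 + L).
import numpy as np

linear = np.array([0.01, 0.5, 1.0, 10.0, 100.0])  # linear radiance values
tone_mapped = linear / (1.0 + linear)             # squeezed into [0, 1)
print(tone_mapped)  # [0.0099 0.3333 0.5    0.9091 0.9901]

# A 10,000:1 ratio in the scene collapses to roughly 100:1 on screen.
print(linear[-1] / linear[0], tone_mapped[-1] / tone_mapped[0])
```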
Nurulize’s background in the VFX industry, which is generally more color savvy than our own, has enabled them to develop an excellent color workflow that retains the original range of colors. They are branding their imagery as GDR (glorious dynamic range) rather than the somewhat out-of-context HDR. With GDR, Nurulize is set to change the landscape by offering better point cloud colorization than most people have seen to date.
In the tree visualization example, the team set up a Nikon D810 at the same nodal point as the scanner and, calibrating through Mimic 3D software, captured eight directions vertically and horizontally, with five exposures each, three stops apart. The result is an EXR file (an open-source image format developed by Industrial Light & Magic), a 16- or 32-bit float, linear-color image container with full dynamic range.
Technical detail: 32-bit float is a format that allows for roughly 4 billion distinct values per color channel (a lot of range!). In addition, an EXR file can carry extra channels for storing things like alpha, masks, and metadata for more robust color editing.
There is no tone mapping; Atom View keeps the tonal data as is, enabling it to mimic reality more closely. Atom View is capable of true 32-bit float color, and having that range allows users to match many environments, as can be seen in the tree scan example.
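A rough sketch of that bracketed-exposure workflow appears below, using OpenCV's exposure-merging functions. The file names and shutter times are placeholders and this is not Nurulize's calibrated Mimic 3D pipeline; it only shows the general shape of merging five brackets, three stops apart, into a linear 32-bit float EXR with no tone mapping applied.

```python
# Minimal sketch: merge five bracketed exposures (three stops apart) into a
# linear 32-bit float radiance map and save it as EXR, with no tone mapping.
# Assumes OpenCV built with its optional OpenEXR support enabled.
import cv2
import numpy as np

files = ["bracket_-6.jpg", "bracket_-3.jpg", "bracket_0.jpg",
         "bracket_+3.jpg", "bracket_+6.jpg"]                          # placeholder file names
times = np.array([1/2000, 1/250, 1/30, 1/4, 2.0], dtype=np.float32)  # placeholder shutter times

imgs = [cv2.imread(f) for f in files]
hdr = cv2.createMergeDebevec().process(imgs, times)  # float32 linear radiance
cv2.imwrite("pano_face.exr", hdr)                    # full dynamic range preserved
```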
Users have quality control
When you load your scans, Atom View gives you the option to process them manually for best results. To do this, pull up a single scan, which the program shows in 2D, and select the areas you want to keep. When you pull up a second or third scan, areas you've already tagged show up as blue. The process is non-destructive, so you can add and remove selections as needed.
For those who do need full control, this feature helps identify areas of the scan or photos that may produce less-than-favorable results, such as noise, lens flare, or unfocused imagery. By tagging these, Atom View lets you choose the best of the best for a given data set's final visualization. Nurulize says that version 1.5 of the software will include a fully automatic mode for those who don't need full control.
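One way to picture this non-destructive tagging is as a per-scan keep mask that never alters the underlying points. The sketch below is purely illustrative; the file name and array shapes are invented, and Atom View's internal representation is not public.

```python
# Illustrative only: model non-destructive "keep" tags as a boolean mask per scan,
# so selections can be added or removed without touching the scan data itself.
import numpy as np

points = np.load("scan_01_points.npy")    # hypothetical (N, 3) array of scan points
keep = np.zeros(len(points), dtype=bool)  # nothing tagged yet

keep[1000:5000] = True                    # tag a region worth keeping
keep[2000:2500] = False                   # untag noisy returns (lens flare, blur, etc.)

selected = points[keep]                   # applied only at export; the points stay intact
```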
Once you've compiled imagery and point clouds within Atom View, you can export them into a cache format for streaming across networks and viewing in VR.
Nu Design for connection, context, collaboration
Nurulize wants to use VR for practical purposes that can benefit you every day. At the heart of what they're selling is high-quality visual communication that facilitates collaboration and judgment calls, like decisions about how to understand complex spaces, designs, or even organic subjects.
With the Nu Design tool for design collaboration, users can stream in Atom View data and throw it around, draw on it, scale their avatars up and down, and take measurements. Historically, these have been challenging tasks to do accurately in VR with polygon-based data that has been optimized, decimated, and finagled to fit into VR platforms' mesh-size limitations. You can even do it with a team in the same environment.
Abbaye, France reality capture. Courtesy Romain Rouffet
Lunn suggests that the rise of the new Leica BLK360 could make this kind of collaboration much more widely accessible, and I would say he is absolutely right. The democratization of reality capture tech is starting to do for point clouds what Photoshop did for photo editing: making this data type completely accessible, and editable, to everyone.
Lunn further pointed out that point clouds from drones, whether they are lidar or photogrammetry-based, will continue to grow as an important data type for collaboration, and that Nurulize’s technology is providing an access point to such information.
Nurulize tomorrow
We could be seeing several new developments over the next year or so from Nurulize. It was suggested that Atom View will work with Unity (current iterations are based on the Unreal platform), and we can expect AR, MR, and mobile versions of Atom View at some point down the line. Given the pace of 3D sensor integration and development in smartphones, this seems like a very good move on their part.
Does your family know…?
When asked if anyone in their families had a clue about what they do for a living, they laughed a bit and said that, for the most part, not really (this author can relate)! At some point in this career, creative and technical involvement merge into a very unusual skill set that is not your typical "head to the office for my nine-to-five" sort of day. It requires a lot of trial and error, dealing with limitations, and finding workarounds to make a vision come to life. But, as could be heard in this interview, there is nothing that compares to the excitement, intrigue, and persistence required to transform reality into a digital universe.
In terms of what the future holds for Atom View, the Nurulize team believes that it will be a leader in the future of photography and image capture.
"Consider," Metzger suggested, "that if we want to replicate reality, we can look to the fundamentals of matter, treating everything like an atom: a point." And in his words, "at the end of the day, anything else would be pointless…"
For more information, see Nurulize.com