3 Ways Smart Glasses Cleared a Path for Tunnel and Concourse Construction at LAX
Tunnel vision—focusing on a single objective while remaining blind to peripheral risks and opportunities—can make even the steadiest projects sway. Especially when those projects take place in, well, tunnels, where tunnel vision is as literal as it is figurative.
Aviation design firm Corgan discovered this firsthand at Los Angeles International Airport (LAX) as the lead design firm creating the new Midfield Satellite Concourse (MSC) for the international terminal. The $1.6 billion project, whose construction began in early 2017, includes a 750,000-square-foot concourse plus utility and passenger tunnels connecting it to the existing Tom Bradley International Terminal (TBIT). The tunnels span about a quarter mile and also house a new baggage-handling system.
Project manager Monica Sosa felt that Corgan would benefit from using DAQRI Smart Glasses, portable augmented-reality (AR) glasses that layer digital information on top of the physical environment, to visualize 3D models alongside as-built construction in the field. The glasses also provide remote work assistance, which benefits the architect, owner, consultants, and other stakeholders who are not available to walk the site.
Sosa obtained funding from CorganCreate, an internal committee tasked with seeding the company’s innovation. She also acquired a Matterport 3D camera, which captures 360-degree scans at 4K resolution and combines them with geometric data to create high-resolution 3D models.
“Phase one of the tunnel was 575 linear feet, which is about 18,000 square feet,” Sosa says. “The team captured the data in approximately 46 scans, which captured measurements from that geometric information, generated a digital point cloud, and shared all of this information with the project team. The scanning process took less than an hour on-site and about 7.5 hours to process in the cloud.”
Corgan devised numerous ways to use these new technologies; following are three of the most successful. If your company is considering 4K 360-degree smart cameras or AR smart glasses for reality capture, these examples could be the first step toward building a business case for greater speed, efficiency, and quality.
1. Construction Validation
During construction, using AR smart glasses and a smart camera in tandem can help architects and builders cross-reference as-built conditions with initial project designs or design changes.
This has been especially important at LAX, where Corgan is working with many consultants and the design-build team on the MSC project. After scanning the utility tunnel, Corgan extracted the raw data into a point-cloud file, which can be indexed in Autodesk ReCap or Revit. The indexed file can also be used in Autodesk AutoCAD and Navisworks. Autodesk BIM 360 Docs was linked to the smart glasses so 3D models could be pushed to the glasses for viewing on-site.
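To make that "indexing" step concrete, here is a minimal sketch, not Autodesk's actual pipeline: raw scan points are binned into a voxel grid so any region of the tunnel can be queried quickly. The file name and voxel size are illustrative assumptions.

```python
# Minimal sketch: spatially index a raw point cloud with a voxel grid
# so a region of interest (e.g., around one pipe run) can be pulled fast.
from collections import defaultdict
import numpy as np

VOXEL_SIZE_FT = 1.0  # assumed grid resolution

def build_voxel_index(points, voxel_size=VOXEL_SIZE_FT):
    """Group point indices by the voxel cell (ix, iy, iz) they fall into."""
    index = defaultdict(list)
    cells = np.floor(points / voxel_size).astype(int)
    for i, cell in enumerate(map(tuple, cells)):
        index[cell].append(i)
    return index

def query_region(index, points, lo, hi, voxel_size=VOXEL_SIZE_FT):
    """Return all scan points inside the axis-aligned box [lo, hi]."""
    lo_cell = np.floor(np.asarray(lo) / voxel_size).astype(int)
    hi_cell = np.floor(np.asarray(hi) / voxel_size).astype(int)
    hits = []
    for ix in range(lo_cell[0], hi_cell[0] + 1):
        for iy in range(lo_cell[1], hi_cell[1] + 1):
            for iz in range(lo_cell[2], hi_cell[2] + 1):
                hits.extend(index.get((ix, iy, iz), []))
    pts = points[hits]
    mask = np.all((pts >= lo) & (pts <= hi), axis=1)
    return pts[mask]

# Hypothetical export: an N x 3 array of x, y, z coordinates in feet.
points = np.load("msc_utility_tunnel_phase1.npy")
index = build_voxel_index(points)
pipe_zone = query_region(index, points, lo=(100, 0, 8), hi=(120, 30, 12))
```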
Overlaying the postconstruction point cloud with the preconstruction design let Corgan validate construction for future tenants and verify its design models. In one instance, the design overlay revealed a set of utility pipes installed higher than planned. Although this mistake had no negative consequences, on another project it might have created major issues.
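Here is a minimal sketch of that overlay check under assumed data: design elevations come from the model, as-built elevations from the scan, and anything outside a tolerance (here, two inches) gets flagged for review. The element IDs, values, and tolerance are hypothetical, not project data.

```python
# Compare as-built elevations against design elevations and flag deviations.
import numpy as np

TOLERANCE_FT = 2 / 12  # assumed: flag anything more than 2 inches off design

def flag_deviations(element_ids, design_z, as_built_z, tol=TOLERANCE_FT):
    """Return (id, design, as-built, delta) for each element outside tolerance."""
    design_z = np.asarray(design_z, dtype=float)
    as_built_z = np.asarray(as_built_z, dtype=float)
    delta = as_built_z - design_z
    return [
        (eid, d, a, dz)
        for eid, d, a, dz in zip(element_ids, design_z, as_built_z, delta)
        if abs(dz) > tol
    ]

# Hypothetical values: pipe P-103 was installed about four inches high.
issues = flag_deviations(
    element_ids=["P-101", "P-102", "P-103"],
    design_z=[9.50, 9.50, 10.00],
    as_built_z=[9.52, 9.49, 10.33],
)
for eid, d, a, dz in issues:
    print(f"{eid}: designed {d:.2f} ft, built {a:.2f} ft ({dz:+.2f} ft)")
```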
The real objective is discovering errors before they’re built, not after. To that end, project teams should be encouraged to use smart glasses to reference the design on-site as they build it.
In one instance on the LAX project, design models showed what appeared to workers to be a typical steel beam. In reality, it was a hoist beam on which a chain pulley system would be installed to open the sump-pit lids beneath it. “We went on a site walk and saw the hoist beam, and there was all this conduit beneath it,” Sosa says. “If workers had an opportunity to put on the glasses, they would have seen the chain pulley system in our model, understood the use of the beam, and not run all of their conduit below. Unfortunately, they must now move the conduit, which is just added time and expense.”
2. Remote Work Assistance
Contractors appreciate working with 3D models in the field for greater accuracy and coordination, but project sponsors especially love AR smart glasses’ ability to connect remote collaborators to the jobsite.
“The owner doesn’t really care about seeing a model in the field because they’re not doing any coordination,” Sosa says. “What they care about most is commissioning at the very end of the project. If you have a few pairs of glasses out there, someone can be at his or her desk and participate in commissioning or other processes without having to go to the site.”
Indeed, workers who are wearing DAQRI glasses on-site can livestream what they’re seeing to remote parties, who can communicate with them through the glasses and even circle things in the workers’ field of vision by drawing on the screen. Or, someone wearing the glasses can record a video for remote playback. This technology is useful for executives who want to keep up with construction; it’s also useful for injured or disabled team members who can’t visit the site because it’s dangerous or inaccessible.
The technology can also be effective for training. “Users can see what you’re seeing, and you can see them,” Sosa says. “It’s like a little FaceTime in the glasses. Let’s say you’re maintenance staff and you need to fix something, but you don’t know how to fix it. You can get an expert to call in and walk you through the maintenance procedure. And then you also can record that and use it as a training video for onboarding new employees. It’s very hands-on.”
3. Data Visualization
Because construction sites are constantly in flux, Sosa says data visualization is one of the most promising use cases for AR smart glasses and reality capture. During the 2.5-year construction of the MSC utility tunnel at LAX, her team tested using AR smart glasses to create and maintain a digital punch list of issues to be addressed. Team members also tested the smart-camera scans, which they could use to virtually tag issues and share them with the rest of the team.
“We needed some way to document issues in a 3D world,” she explains. “So, we had the idea of scanning the project and tagging the issues.” Corgan’s team color-coded its tagging system for each discipline.
Using the AR smart glasses, project team members can walk the tunnel and digitally “tag” issues they encounter, such as cracks. Because each tag is attached to a 3D model of the space—with embedded links to requests for information (RFIs), drawing details, additional images, or preset tags—workers can use the same glasses later to locate, diagnose, track, and ultimately resolve outstanding issues.
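As a rough illustration of what sits behind each of those tags, here is a minimal sketch of a punch-list record: a 3D position in the shared model, a color-coded discipline, links out to RFIs or drawing details, and a status the team can update. The field names and discipline color map are assumptions, not Corgan's actual schema.

```python
# Minimal sketch of a digital punch-list tag anchored in the 3D model.
from dataclasses import dataclass, field

DISCIPLINE_COLORS = {"structural": "red", "mechanical": "blue", "electrical": "yellow"}  # assumed

@dataclass
class PunchListTag:
    tag_id: str
    position: tuple          # (x, y, z) in the shared model's coordinates, feet
    discipline: str          # keys into DISCIPLINE_COLORS
    description: str
    links: list = field(default_factory=list)  # RFI numbers, drawing details, photos
    status: str = "open"     # open -> in_review -> resolved

    @property
    def color(self) -> str:
        return DISCIPLINE_COLORS.get(self.discipline, "gray")

def open_tags(tags, discipline=None):
    """Filter the punch list to unresolved tags, optionally by discipline."""
    return [
        t for t in tags
        if t.status != "resolved" and (discipline is None or t.discipline == discipline)
    ]

# Hypothetical tag: a crack spotted on a tunnel wall, linked to an RFI.
crack = PunchListTag("T-0042", (212.0, 14.5, 6.0), "structural",
                     "Hairline crack in wall panel", links=["RFI-118"])
print(open_tags([crack], discipline="structural"))
```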
Sosa’s initial experience with smart glasses convinced her that reality capture and AR smart glasses are a winning combination, one whose value will only grow as the technology evolves. “I don’t know what the future holds,” she says, “but I know it’s going to be awesome.”