
Apple Vision Pro: Spatial Computing and Its Impact on Construction Tech

In anticipation of the Apple Vision Pro release, our latest blog delves into the evolving world of spatial computing and its profound impact on the AEC (Architecture, Engineering and Construction) industry. Discover why Argyle embraced spatial computing, how it transcends traditional tools, and what the future holds with Apple's latest innovation.
Published: January 12, 2024
Written by Maret Thatcher

The countdown to the most anticipated hardware drop is now measured in days. Developer chatter is abuzz with the term Spatial Computing instead of Augmented Reality or AR/XR.

Who invented the term spatial computing?

Why did Argyle use the term “spatial computing” when “metaverse” dominated for years?

We took our cue from Magic Leap, actually.

While Apple and others were quietly working on their AR ambitions, Magic Leap was boldly and messily building a new paradigm it called spatial computing.

At the same time, with grit and just a smidgen less financing, we at Argyle began building our dream application--one that would "just" show us our Building Information Models in position on site.

I changed my profile to “Pioneering Spatial Project Management” and never looked back.

Now using the term “spatial computing” is literally required of Apple app makers–so let’s talk Spatial Computing for AEC (Architecture, Engineering and Construction). We’re so glad to have you with us.

What is Spatial Data?

Spatial data, broadly speaking, is any data that occurs in three dimensions–even when the content itself isn’t 3D. It ranges from a 3D model viewer, to a localized app, to a location-based datapoint that isn’t volumetric at all. Spatial computing places that spatial data into a physical context.

Argyle is a spatial computing application for construction that does all three at once.
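To make that concrete, here is a minimal sketch of a single location-based spatial datapoint–the simplest of the three flavors above. It’s written in Swift with names I’ve made up for illustration; it isn’t Argyle’s actual data model.

```swift
import simd

// Illustrative sketch only: one location-based spatial datapoint.
// The payload isn't volumetric or even visual–what makes it "spatial"
// is that it is anchored to a position (and orientation) in real space.
struct SpatialDatapoint {
    var position: SIMD3<Float>   // meters, in some site coordinate frame
    var orientation: simd_quatf  // which way the datapoint "faces"
    var label: String            // the non-3D payload attached to that spot
}

// A hypothetical field note pinned to a specific spot on site.
let fieldNote = SpatialDatapoint(
    position: SIMD3(12.5, 1.2, -4.0),
    orientation: simd_quatf(angle: 0, axis: SIMD3(0, 1, 0)),
    label: "Verify sleeve location before pour"
)
```

A spatial computer’s job is then to take a record like that and render it at its real-world position, rather than as a row in a table on a flat screen.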

Why Spatial Data in AEC (Architecture, Engineering and Construction)? 

AEC has always been spatial work. Translating 2D drawings into 3D structures and constructing them safely is a spatial job full of human error–it’s why our industry battles waste in the billions annually. Communicating design and construction to laypeople is a spatial problem too, and it leaves the industry famous for being over budget and overdue.

We’ve created many tools for communication--paper blueprints were too flat, 3D-printed models were too static, and computers are limited tools.

But with spatial computers--the tool is beginning to match the job.

Spatial computers have everything the typical PC has–a CPU, internet access, and local storage–plus cameras, depth sensors, accelerometers, and specialized GPUs to process and display visual information efficiently and on the fly.

A spatial computer is the first of its kind--it can capture existing conditions on a site at the same time that it helps you visualize the future state in situ. 
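For a sense of how that extra hardware surfaces to software, here is a small hedged sketch using Apple’s ARKit on iOS/iPadOS (chosen because its capability checks are public API; Vision Pro exposes sensing through its own visionOS frameworks):

```swift
import ARKit

// A sketch of how an app can ask whether the device's cameras, motion
// sensors, and depth hardware support spatial tracking and capture.
func logSpatialCapabilities() {
    // World tracking fuses camera frames, accelerometer/gyro data, and
    // GPU-accelerated vision to keep virtual content fixed in real space.
    let worldTracking = ARWorldTrackingConfiguration.isSupported

    // Scene depth needs a depth sensor (LiDAR on recent iPhones and iPads)
    // and is what lets an app capture existing conditions as geometry.
    let sceneDepth = ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth)

    print("World tracking supported: \(worldTracking); scene depth: \(sceneDepth)")
}
```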

And when it comes to spatial computers, Apple’s spatial computer has been the one to watch.

Why is Apple Vision Pro the one everyone is talking about?

Because Apple is famous for its human-centered hardware. 

Apple has the unique experience of pioneering and popularizing new human/digital interfaces. 

We remember having Pocket PCs--PDAs with extruded buttons and the occasional stylus--strictly nerds owned them.

One fateful day, Apple put multitouch on a pocket computer and it exploded. The change from a stylus to a couple of fingers seems subtle. But it is these subtle design choices complementing the human body that make Apple so good at what it does.

From the point-and-click mouse to the sound fade of my AirPods Pro, these design choices matter to my experience.

It’s also why I was (naively) hopeful for clear glass AR from Apple.

When I saw eyes in the Apple Vision Pro launch, I excitedly hooted–then I experienced a visceral full-body cringe when I realized the eyes were projected. I had been so used to my Magic Leap 2 and HoloLens 2--true clear glass computers. Uncanny valley feelings set in. Then came ideas for really fun digital makeup.

Projected Eyes on Apple Vision Pro

So, no. I don't love the projected eyes. But what do I know? I laughed at iPads when Apple first introduced them, and now? I own seven. 

Projected eyes? Maybe the windows to our human souls are just fine projected on screen… 

Apple has done more human-wearable tech than most–and has made it respond intelligently to the human body at a scale, degree of attention, and level of success that is unparalleled.

Spatial computing has been in sore need of that kind of attention. Everyone is benefiting from that innovation.

"floating flat apps that are disinterested in their surroundings is–honestly– not what the spatial computing medium wants to be. It wants awareness, persistence, and intelligent interaction with space."

Mobile computing meets spatial computing 

Activities in a spatial environment–like physical movement and interpersonal interaction–are experientially different from the activities on our desktop and mobile computers. It’s the difference between “in” vs. “on.”

The jump to spatial computing is eased by Apple Vision Pro’s initial offerings, which present the apps we already have as monitors in space.

Surely, everyone can relate to big monitors and lots of open tabs. In spatial computing, they’ll surround you.

But floating flat apps that are uninterested in their surroundings are–honestly–not what the spatial computing medium wants to be. It wants awareness, persistence, and intelligent interaction with space.

Spatial computing will come into its own not with floating windows of mobile apps, but when it enables the things you can’t do in the real environment–things like the holographic overlay of BIM in Argyle or surgical directions in a medical app.

Who needs Spatial Computers Today? 

Who actually needs to buy Apple Vision Pro or Magic Leap 2 or any of these devices? Two groups.

Group 1: People Using Spatial Data


These are the people who today work with spatial data on 2D screens–the same people who buy high-end graphics computers and are already looking at spatial data. They’ll want to also experience that data in real space to remain competitive.

Group 2: People Doing Spatial Jobs

Think, of course, of building construction, human surgery, or manufacturing. For them, spatial computing enables the impossible: real-time connection to relevant data in space.

Spatial computing is here to stay as a major paradigm shift.

What AR hardware is needed?

That will depend on the job being done. In North America, Magic Leap 2 and Apple Vision Pro will be the professional-level devices, whereas Meta Quest will be the accessible spatial option.

Magic Leap 2

Meta Quest 3

Group 1, people using spatial data, may find a Meta Quest 3 more than adequate for their needs. You will be able to do many of the same things on a $600 Quest 3–looking at 3D models, collaborating, screensharing–as you can on the more expensive devices.

But Group 2–at least in industrial manufacturing and building construction–will most often require a pro-level device to run its models. And in many cases, I worry, that device will need to be clear glass AR.

My kid wears his Quest 3 around the house without incident, but when I saw him walk with a batch of cookies to open a hot oven while wearing a headset–I pounced into Site Safety mode. “We do not bake in the metaverse, son.”

Construction-founder optimism made me fantasize that Santa would bring me clear glass AR from Apple, but I have to remind myself that this is Apple’s first generation. I don’t expect clear glass in this iteration.

Most people can get by without clear glass AR for a little while–their floating-window applications won’t demand it. The next generation of Magic Leap will benefit from learning what Apple does, and we hope, in the great tradition of American competition, that Apple will eventually create its own clear glass spatial computer.

When will everyone own a Spatial Computer? 

Think: movies. When pictures were first strung together quickly to create movies, it took decades before these motion pictures began to shine as a medium distinct from watching a (worse) play at the theater.

Early on, it was just a camera on a stage. Cutaways, tilts, and even characters leaving the frame were learned and implemented over time. A couple of generations later, I have several streaming accounts that I leave on like podcasts–without appreciating any of the cutaways.

Consumers will use the hardware when it looks like a pair of Oakleys and interfaces like Star Trek. Spatial computing comes into its own when it gives you an easy, just-works, location-aware understanding all the time.

When the hardware always knows where it is the way you know where you are.

When it seamlessly switches between apps and lets you share with people both with and without AR devices like yours. Multiplatform.

And in this way, I am so proud of Argyle. Minimalist and multi-platform by design, our goal has always been to be on the best devices available for the industries we serve.

What will the AEC industry want to do with Spatial Computing?

On day one it's hard to say. Will they use 3D image capture? Will they begin incorporating 3D pictures as an underlay for Revit? Will the industry hold out for handheld AR, and will that ultimately contribute to the 3D picture and image environment? All of this is spatial data.

But there are some spatial productivity things the industry might like–spatial FaceTime, spatial Slack–so if you’re multitasking in your Apple Vision Pro, it will be way less isolating.

And of course there are apps like ours at Argyle that today let you convert Revit or Navisworks models into AR experiences.

Clear glass AR on HoloLens 2 and Magic Leap 2 is already being used for layout and QA on projects large and small. The common thread in these AEC projects is a Building Information Model and the need to get the job done more efficiently--whether the pressure is time or labor. Augmented reality--ahem--spatial computing is already here.

Day one of Apple Vision Pro

The day-one AVP apps will actually be pretty cool–you’ll get a small suite of apps designed for AR, like 3D images, and some entertainment, like Disney+ 3D movies. You’ll also get all the iOS stuff that isn’t augmented reality.

Normal, everyday iPhone and iPad apps will also be available by default. The interface will be an interpretation of the app you’re used to–floating screens around you, kind of like a limitless iPhone home screen.

But, with some irony, this means apps using the artist formerly known as ARKit will not be available by default. True spatial apps will need another round of development to be specially adapted for the Apple Vision Pro.
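To give a feel for what that adaptation looks like, here is a minimal sketch of a spatial view on visionOS. On iOS, content like this would live inside an ARView driven by ARKit’s world-tracking configuration; on Vision Pro it is hosted in SwiftUI’s RealityView. The view name and the placeholder “beam marker” are my own illustrative assumptions, not Argyle’s code.

```swift
import SwiftUI
import RealityKit

// Minimal visionOS sketch: place a stand-in marker where a structural
// element belongs, expressed in meters relative to the scene origin.
struct BeamMarkerView: View {
    var body: some View {
        RealityView { content in
            let marker = ModelEntity(
                mesh: .generateBox(size: 0.1),  // a 10 cm placeholder box
                materials: [SimpleMaterial(color: .orange, isMetallic: false)]
            )
            marker.position = [0, 1.0, -0.5]    // x, y, z in meters
            content.add(marker)
        }
    }
}
```

The porting work is real but incremental: the RealityKit entities and materials carry over, while the hosting layer and input model change.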

What about you? 

What type of spatial data are you most excited to visualize in the very near future?

Maret Thatcher is the co-founder and CEO of Argyle, and speaks on spatial computing, augmented reality, and construction technology.
Connect with Maret on LinkedIn

Interested in Augmented Reality for your company?

Book a free demo today and see how our AR can cut your build time by 50%.
