Last night, I fell asleep under the stars, the chirp of crickets intermingling with the old radiator’s whistle off in the distance. I just finished an episode of Justified: City Primeval on the big screen. It was a constant 68 degrees, but I tucked myself into the duvet, nonetheless. For tonight, I’m thinking the surface of the moon, or perhaps the edge of a Hawaiian volcano.
According to most estimates, the average American spends around seven hours a day in front of screens. The Centers for Disease Control and Prevention recommends something in the neighborhood of two. But for all the increased focus on sleep hygiene and the harmful effects of staring at displays all day, society seems to be swiftly moving in the opposite direction.
When we talk about "screen time," we mostly mean devices like computers, phones and televisions. Meanwhile, a whole different paradigm has been looming on the horizon for a few years now. With the Vision Pro, two screens with a combined 23 million pixels—one for each eye—are involved.
These screens are, of course, significantly smaller than the other examples, but they’re right there in front of your eyes, like a $3,500 pair of glasses. This is something I’ve been thinking about quite a bit over my first 48 hours with the Vision Pro.
In 2018, Apple introduced Screen Time as part of iOS 12. The feature is designed to alert users to their — and their children’s — device usage. The thinking goes that when presented with such stark figures at the end of each week, people will begin to rethink the way they interface with the world around them. Tomorrow, Apple is finally releasing the Vision Pro. The device is another effort to get people to rethink the way they interact with the world, albeit in entirely the opposite direction.
I’ve spent much of the past two years attempting to break myself of some of my worse pandemic habits. Toward the top of the list are all those nights I fell asleep watching some bad horror movie on my iPad. I’ve been better about this. I’m reading more and embracing the silence. That is, until this week. The moment the Vision Pro arrived, all that went out the window.
Now, there’s a certain extent to which much of this can be written off as part of my testing process. To review a product, you need to live with it as much as possible. In the case of the Vision Pro, that means living my life through the product as much as possible. I’m taking work calls on it, and using it to send emails and Slack messages. I’m listening to music through the audio pods and — as mentioned up top — using it to watch my stories.
My morning meditation routine has also shifted to the headset. It's the age-old paradox of using technology to help mitigate some of the problems technology brought into our lives in the first place.
While my job requires me to use the Vision Pro as much as humanly possible while I have it, I have to assume my experience won’t be entirely dissimilar from that of most users. Again, you’re going to want to make the most of the $3,500 device as you’re able, which invariably translates to using it as much as you can.
When I wrote yesterday's Day One entry for this journal, I advised users to take it slow navigating the Vision Pro universe. I sincerely regret not taking my own advice to heart. By the end of the first day, I was experiencing severe nausea. Your mileage will vary, of course. Personally, I'm prone to motion and sea sickness. In some of the Vision Pro pictures, you can spot a patch behind my right ear for the former. (It's almost certainly a placebo, but sometimes self-deception works like magic.)
VR sickness and car sickness actually operate in similar ways. They’re caused by a mismatch between what your eyes and inner ear are perceiving. Effectively, your brain is getting mixed signals that it’s having trouble reconciling.
In some ways, this phenomenon gets to the heart of something fundamental in mixed reality. Even in the world of passthrough AR, there’s a disconnect between what you see and what your body feels. The Vision Pro’s passthrough is the best I’ve experienced in a consumer device. The cameras capture your environment and transmit it to your eyes as quickly as possible. Using this technology, the headset can overlay computer graphics over the real world — a phenomenon Apple refers to as “spatial computing.”
This gets to something important about this brave new world. Extended reality isn’t reality. It’s the world filtered through a computer screen. Now, we spiral into an existential argument fairly quickly here.
I was reminded this week of a statement a Samsung executive made in response to criticism that the company's high-end smartphones are "faking" photos of the moon: "[T]here is no such thing as a real picture. As soon as you have sensors to capture something, you reproduce [what you're seeing], and it doesn't mean anything. There is no real picture. You can try to define a real picture by saying, 'I took that picture,' but if you used AI to optimize the zoom, the autofocus, the scene — is it real? Or is it all filtered? There is no real picture, full stop."
Sorry, but I'd need to be a lot more stoned to have that particular conversation. For the time being, though, the Vision Pro has me wondering how at ease I'll be if "screen time" in the future mostly consists of having screens strapped to my face. The effect is undeniably fascinating, and it hints at some truly creative applications to come (I'm sure there will be plenty among the first 600 apps).
Maybe bracing yourself for the future is a combination of embracing bleeding-edge technologies while knowing when it’s time to touch grass. That 2.5-hour battery pack might not be the worst thing after all.
Apple Vision Pro: Day One
Your $3,500 headset just arrived. Now what?
It’s Friday, February 2, 2024. Today is the day. You’ve been eyeing the Vision Pro since Tim Cook stepped onstage with the product at last year’s WWDC. Longer than that, really, if you factor in the years of rumors, leaks and renderings. The price wasn’t anywhere near what you had hoped, but it’s a first-gen product. Manufacturing isn’t at full consumer scale and you’ve got to factor in the millions poured into seven or eight years of R&D.
After a few months of waffling, you hovered your cursor over the “Buy” button, held your breath, closed your eyes and committed to the tune of $3,500. Congratulations, you’re an early adopter.
The box arrives. It’s huge. It’s also quintessentially Apple — it’s premium, designed with intention. Tear the tabs on either side and slide off the top. The visor is inside, anchored to a small platform that’s more display case than shipping container. Dig deeper, and you’ll find another strap and a second “light seal” insert.
Me, I’m currently partial to the Dual Loop Band. It doesn’t look as cool as the Solo Knit Band, but the top strap does a much better job distributing weight (the Vision Pro is not a light headset). As for the light seal inserts, I advise glasses-wearers to go with the larger of the pair to create more distance between your eyes and the inserts.
Last, of course, is the now-infamous battery pack. Plug it into the port on the left side and give it a twist. A small white light pulses before turning solid. The boot-up has begun.
After eight months, what’s another 60 seconds between friends? There’s a bit of a setup process. Understandably so. The Vision Pro has to orient its sensors, get to know your space and lighting. If you had Zeiss optical inserts made for your vision, now is the time to snap them in, magnetically. If you’re a glasses wearer, don’t freak out about the image too much until you’ve enrolled your lenses by holding up a piece of paper with a QR-like code on it. Pairing the device to your iPhone works in much the same fashion.
You’ll be asked to pull the headset off for a bit, to take a scan of your face. But first, a short introductory video.
The face scan process utilizes the camera on the front of the visor to construct a shoulders-up 3D avatar. The process is extremely similar to enrolling in Face ID on your iPhone. Look forward. Turn your head to the side. Then the other. Look up and rotate down. Look down and rotate up. Find some good lighting. Maybe a ring light if you have one. If you wear glasses, make a point not to squint. I apparently did, and now my Persona looks like it spent the last week celebrating the passage of Ohio’s Issue 2 ballot measure.
The Personas that have thus far been made public have been a mixed bag. All of the influencers nailed theirs. Is it the lighting? Good genes? Maybe it’s Maybelline. I hope yours goes well, and don’t worry, you can try again if you didn’t stick the landing the first time. Mine? This is actually the better of the two I’ve set up so far. I still look like a talking thumb with a huffing addiction, and the moment really brings out the lingering Bell’s palsy in my right eye. Or maybe more of a fuzzy Max Headroom? I’ll try again tomorrow, and until then be mindful of the fact that the feature is effectively still in beta.
This is the version of you who will be speaking to people through FaceTime and other teleconferencing apps. This is meant to circumvent the fact that 1) You have a visor on your face and 2) There (probably) isn’t an external camera pointed at you. It definitely takes some getting used to.
Oh, you’ll also have to take it again if you want to change your hair or shirt. I was hoping for something a bit more adaptable à la Memojis, but that’s not in the current feature set. It will, however, respond to different facial expressions like smiling, raising your eyebrows and even sticking out your tongue (handy for Zoom work calls). The scan is also used to generate an image of your eyes for the EyeSight feature on the front of the visor to alert others in the room when you are looking in their direction.
Put the headset back on and hold your hands up so the hand-tracking feature knows what to look out for. Next, three circles of dots will appear, each with brighter light than the last. Here you’ll have to look at each while pinching your thumb and index fingers together. This helps calibrate eye tracking.
Input has long been a big question mark in the world of extended reality. You can pair Bluetooth game controllers, keyboards and trackpads with the headset, but in Apple’s vision of the future, the lion’s share of interaction utilizes your eyes and hands. Look at an object to highlight and pinch your fingers to select. Pinches also come into play when zooming (pinch with both hands before pulling them apart) and scrolling (pinch and swipe).
The digital crown is your friend. It’s basically a bigger version of the one on your Apple Watch. Pressing it brings up an apps display, similar to Launchpad on macOS. The apps sidebar also showcases different Environments and People/Contacts. Long-pressing the crown centers visionOS where you’re looking.
My biggest tip to you, the owner of a shiny new Vision Pro, is give yourself time to adjust. This is going to be the last thing you want to hear. Listen, I get it. You spent a car down payment on a device you’ve been waiting more than half a year to try. But coming face to face with a new version of reality can do weird things to your brain if you don’t take breaks. People have reported headaches from the weight. Personally, I’m prone to motion sickness and am feeling a bit off at the end of my first full day with the device.
Watch an episode of a TV show. Play a quick game (you can play the iPadOS versions of Fruit Ninja and Angry Birds with an Apple Arcade subscription). If this is, indeed, the dawn of a new era for computing, you’ve got plenty of time to acclimate.
Apple Vision Pro’s secret weapon? Mindfulness
Sometimes you just have to stop and breathe
Above all else, the Vision Pro is just the beginning. There are moments while using it when it feels like a portal to another world, or a window into the future. It isn't perfect, but after years of extended reality false starts, it's a remarkable accomplishment. Spending even a short while with the headset is captivating, and it offers real insight into where things are headed.
I’ve long been a sucker for mindfulness apps. That’s not to say that I use them much these days, but it feels like I’ve tried every one. One key thing that engaging with a Headspace or Calm lacks on a phone or tablet is immersion. It’s hard to overstate how powerful a tool immersion is, particularly for those who are just beginning their practice. It’s something the Vision Pro has in spades.
Much like the world of AR and VR, my own practice has been littered with false starts. Meditation is hard. Full stop. Even seasoned veterans have difficulty silencing noisy brains. It’s far more difficult when first attempting to get your footing. It can be frustrating and anxiety-inducing, effectively having the opposite of the intended effect; but you have to be patient, committed and willing to put in the time if you’re going to push through.
Beyond the apps, I'm a sucker for testing out just about any gadget that claims to help kick-start a mindfulness practice. As you've probably already guessed, most of them are junk — the kind of plastic trash you use twice before it disappears into a drawer until your relatives start pricing things for your estate sale. (Here's where I'll give a shout-out to Muse's clever meditation band; that one actually did help.)
I’ve always found mindfulness to be one of the most compelling use cases for extended reality. Again, it’s that sense of full immersion that does so much heavy lifting, drowning out life’s distractions. Well, to a point. My jarring NYC apartment buzzer went off right in the middle of this morning’s session. There’s only so much you can do. As for the cluttered apartment, the whine of the radiator, the rabbit rustling and my yell-talking neighbors, pop in a pair of AirPods Pro and you’re off to the races.
Apple has offered Mindfulness on the iPhone and Apple Watch for a while now. It’s a basic app, focused primarily on breathing. It lacks the complexity and content of apps like Calm, but anyone who’s ever successfully meditated will gladly tell you how important breathing is to the process. It’s like the drums in a rock track — it’s centering and constant.
Here’s Zen monk Shunryu Suzuki from his seminal 1970 book, “Zen Mind, Beginner’s Mind”:
When we practice zazen our mind always follows our breathing. When we inhale, the air comes into our inner world. When we exhale, the air goes out to the outer world. The inner world is limitless, and the outer world is also limitless. We say “inner world” or “outer world,” but actually there is just one whole world. In this limitless world, our throat is like a swinging door. The air comes in and goes out like someone passing through a swinging door.
If you think “I breathe,” the “I” is extra. There is no you to say “I.” What we call “I” is just a swinging door which moves when we inhale and when we exhale. It just moves; that is all. When your mind is pure and calm enough to follow this movement, there is nothing: no “I,” no world, no mind nor body; just a swinging door.
The Vision Pro version of Mindfulness operates in much the same manner. An image like a circle of flower petals expands and contracts to help you center your breathing, while a narrator offers a guided meditation. It’s simple, like the best parts of Zen.
Couple it with the headset’s Environments (effectively 3D desktop wallpaper for the world around you) and you’ve got a level of immersion that forces you to focus on the app, which in turn forces you to focus on your breathing, making you mindful of a powerful and important aspect of our lives that most people take for granted most of the time. Suddenly, you’re meditating on a sandy beach or the moon.
It’s the kind of tool I would have loved to have had access to in those early days when I struggled so hard to focus. It’s also a welcome respite in a device that is half productivity, half entertainment.
Apple’s Mindfulness app is hardly the end-all, be-all for the space. It’s the tip of the iceberg, but a hopeful tip. When such a basic app can have such a powerful effect, it’s exciting to think about the direction devs might go in, both in terms of mindfulness tools and altered states. Sorry, I can’t help sounding like late-period Timothy Leary when I write about this stuff, but I’m very much looking forward to seeing where this goes.