The Apple Vision Pro is supposed to be the start of a new spatial computing revolution. After several days of testing, it’s clear that it’s the best headset ever made — which is the problem.

  • AutoTL;DR@lemmings.world · 9 months ago

    This is the best summary I could come up with:


    In Apple’s photos, it looks like a big, bright screen that shows a video of your eyes to people around you so they feel comfortable talking to you while you’re wearing the headset — a feature adorably called EyeSight.

    On the top edge, you’ll find what feel like larger versions of familiar Apple Watch controls: on the right (as you look through the headset), a digital crown that adjusts both the volume and the level of virtual reality immersion; on the left, a button that lets you take 3D photos and videos.

    You can also see Apple’s incredible video processing chops right in front of your eyes: I sat around scrolling on my phone while wearing the Vision Pro, with no blown-out screens or weird frame rate issues.

    A lot of work has gone into making it feel like the multitouch screen on an iPhone directly controls the phone, and when it goes sideways, like when autocorrect fails or an app doesn’t register your taps, it’s not pleasant.

    I asked Apple about web support, and the company told me that it is actively contributing to WebXR and wants to “work with the community to help deliver great spatial computing experiences via the web.” So let’s give that one a minute and see how it goes.
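    For context, WebXR is the W3C-drafted browser API for running VR and AR experiences from a web page, so “spatial computing experiences via the web” would mean a visionOS browser exposing it. As a rough, hypothetical sketch (not anything Apple has shipped or documented), this is what feature-detecting and starting an immersive session looks like; it assumes a browser that implements the WebXR Device API and, for TypeScript, the @types/webxr definitions.

    ```typescript
    // Minimal WebXR sketch: detect support, then request an immersive VR session.
    // Assumes navigator.xr exists (WebXR Device API) and @types/webxr is installed.
    async function enterImmersiveSession(): Promise<void> {
      const xr = navigator.xr;
      if (!xr) {
        console.log("WebXR Device API not available in this browser.");
        return;
      }

      // Ask whether fully immersive VR sessions are supported on this device.
      if (!(await xr.isSessionSupported("immersive-vr"))) {
        console.log("Immersive VR sessions are not supported here.");
        return;
      }

      // Browsers only grant the session in response to a user gesture,
      // e.g. a button click that calls this function.
      const session = await xr.requestSession("immersive-vr");
      session.addEventListener("end", () => console.log("XR session ended."));

      // Rendering would continue via session.requestAnimationFrame(...) with a
      // WebGL layer; omitted to keep the sketch short.
    }

    // Typical usage: wire it to a button so the request happens inside a gesture.
    // document.querySelector("#enter-vr")?.addEventListener("click", () => enterImmersiveSession());
    ```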

    There’s a part of me that says the Vision Pro only exists because Apple is so incredibly capable, stocked with talent, and loaded with resources that the company simply went out and engineered the hell out of the hardest problems it could think of in order to find a challenge.


    The original article contains 8,148 words; the summary contains 264 words. Saved 97%. I’m a bot and I’m open source!