It's not just Netflix, Spotify, and YouTube that don't have apps for Apple's Vision Pro at launch. New App Store data indicates the new mixed reality headset is launching with only around 150 apps designed specifically for it.

I left the headline as in the original, but I see this as a massive win for Apple. The device is ridiculously expensive, isn’t even on sale yet, and already has 150 apps designed specifically for it.

If Google did this, it wouldn’t get 150 dedicated apps even years after launch (and its guaranteed demise), even if the device were something super cheap, like being made of fucking cardboard.

This is something that as an Android user I envy a lot from the Apple ecosystem.

Apple: this is a new feature => devs implement it in their apps the very next day, even if it launches officially in 6 months.

Google: this is a new feature => devs ignore it; apps only start to support it after 5–6 Android versions.

OK, yes, with the Oculus it’s actually similar. You can poke at the letters, but the exact depth detection isn’t great (mainly because your finger is pointing directly away from the tracking cameras), so it’s a bit hit-and-miss.

And moving the “virtual mouse pointer” and then pinching is also a pain to do. My Oculus doesn’t have eye tracking, but you can move your hand to move the “pointer”.

Both methods are a PITA. Using the controllers to point and then clicking the trigger is better, but it’s still slow going that way, of course. It’s like typing on a keyboard hanging in front of you by pressing the keys with a stick. Considering that’s the most comfortable option (which the Vision Pro doesn’t have, for lack of controllers), it’s pretty sad.

But yeah I see the potential too… I hope it will come to pass.

I can imagine a return to some sort of T9-style typing, where you wear a thin sensor on your fingertips and tap certain fingers a certain number of times to enter specific characters. People who were used to typing with T9 could do it very quickly and without looking.
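Just to make that idea concrete, here’s a minimal sketch of what such a multi-tap mapping could look like. The finger names, letter groups, and the `decode_tap` helper are all made up for illustration; nothing like this actually ships on the Vision Pro or Oculus.

```python
# Hypothetical T9-style mapping: each finger carries a few letters,
# and the number of consecutive taps selects one of them.
FINGER_LETTERS = {
    "index":  "abc",
    "middle": "def",
    "ring":   "ghi",
    "pinky":  "jkl",
}

def decode_tap(finger: str, tap_count: int) -> str:
    """Return the character selected by tapping `finger` `tap_count` times."""
    letters = FINGER_LETTERS[finger]
    # Wrap around like classic T9: a 4th tap on a 3-letter finger cycles back to the 1st.
    return letters[(tap_count - 1) % len(letters)]

if __name__ == "__main__":
    # index x2 -> 'b', ring x1 -> 'g', middle x3 -> 'f'
    print(decode_tap("index", 2), decode_tap("ring", 1), decode_tap("middle", 3))
```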

True, but it’s still about adapting the user to the tech instead of the other way around. I don’t think Apple will go for that.

Personally, I would think more in the direction of a separate sensor you can place in the room: from a third-person point of view, the finger tracking would be much easier, because your fingers aren’t moving straight away from the camera.
