iPad & iPhone User - USA (2019-08)

iOS 13

There’s little doubt, though, that Apple is still
championing augmented reality with a zeal we find
in few other competitors, and the progress in the
new ARKit 3 (and so in iOS 13 and iPadOS) feels like
a great leap forward. That leap is so great, in fact,
that many of the features only work with iPhones
or iPads running A12 chips or better. (And for that
matter, that progress strongly suggests that Apple is
designing a headset or glasses that will allow these
features to shine, but anyway.)
The slight downside is that current phones can only
pull this off with machine learning tied to the
iPhone’s rear camera, so you won’t always
get the fluid captures you’d expect from the front-facing
camera’s TrueDepth sensor. As we’d hoped, though,
Apple is rumoured to be including VCSEL time-of-flight
sensors in an upcoming iPhone’s rear camera, but we
likely won’t see them until 2020. So maybe we won’t
have glasses anytime soon, but at least the iPhones
and iPads may be better.
On the more immediate horizon, here’s what you
can expect in Apple’s latest devices once iOS 13 and
iPadOS drop later this autumn.

People occlusion
An easy way to understand how occlusion works is to
think of an eclipse. When the moon passes in front
of the sun, it’s occluding it. In AR terms, that means
a Pokémon you find in Pokémon Go’s AR view might
appear to stand behind chairs and other furniture. As
things stand today with ARKit 2, the illusion of ‘reality’
only holds if the Pokémon is ‘sitting’ on a flat surface.
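
For the developers among us, here’s a hint of how this
surfaces in ARKit 3: people occlusion is switched on
through a frame-semantics option on the session
configuration. The short Swift sketch below is our own
illustration rather than Apple’s sample code, and it
assumes a bare ARSession; in a real app you’d use the
session owned by your ARView or ARSCNView.

import ARKit

// A minimal sketch: turning on ARKit 3's people occlusion.
// The bare ARSession is illustrative; real apps use the
// session owned by their ARView or ARSCNView.
let session = ARSession()
let configuration = ARWorldTrackingConfiguration()

// People occlusion needs an A12 chip or later, so check
// for support before enabling it.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Segment people in the camera feed and estimate their
    // depth, so virtual content can pass behind them.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

session.run(configuration)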
