Making Augmented Reality in UE5 | Unreal Fest 2022
Summary
In this exciting session from Unreal Fest 2022, Jeff Fisher, an XR programmer at Epic, dives deep into the world of augmented reality using Unreal Engine 5. He outlines the requirements for making AR applications and walks through various templates and features that UE5 offers for AR development. From handheld devices to head-mounted displays, Fisher explains the intricacies of AR interactions, the mapping of real and virtual worlds, and the importance of platform anchors in creating stable AR experiences. He also provides insight into cross-platform development and the future of AR in relation to open standards like OpenXR.
Highlights
Learn about setting up AR applications using Unreal Engine 5 with insights from an Epic Games XR programmer.
Explore UE5 templates for AR, including handheld and HMD-based setups, to see what suits your needs.
Understand how 'tracking space' aligns virtual and real-world interactions for immersive AR experiences.
Discover the role of platform anchors in ensuring that AR content stays aligned in different environments.
Unlock the secrets of cross-platform AR and how to handle different AR development frameworks.
Get a glimpse into the potential of open standards like OpenXR in streamlining AR application development.
Key Takeaways
Augmented Reality (AR) in UE5 allows for creating virtual content in the context of the real world.
UE5 offers templates for different AR platforms, including handheld and HMD-based (head-mounted display) AR.
Mapping virtual and real worlds involves understanding 'tracking space' and 'Unreal world space'.
Platform anchors help maintain consistent positioning of virtual objects in relation to the real world.
Cross-platform AR development requires handling differences in interaction methods and rendering capabilities.
Future AR development may benefit from open standards like OpenXR for more consistent API interactions.
Overview
Jeff Fisher from Epic Games takes us on an AR adventure with Unreal Engine 5, showing how to build amazing augmented reality experiences. Whether you're programming for handheld devices or head-mounted displays, Unreal Engine 5 provides templates and features to help make your AR dreams a reality.
Understanding the relationship between the tracking space and the Unreal world space is crucial for effective AR application development. Fisher explains these concepts along with the role of platform anchors, which hold virtual objects in place relative to the real world, enhancing the stability of AR interactions.
As we look to the future, open standards like OpenXR seem promising for augmented reality. By covering common APIs across platforms, developers can create more seamless and consistent AR experiences. Fisher's insights provide a roadmap for tackling cross-platform AR challenges.
Chapters
00:00 - 01:00: Introduction to Augmented Reality in UE5
01:00 - 05:00: Features and Templates for AR in UE5 The chapter discusses the integration of Augmented Reality (AR) functionalities in Unreal Engine 5 (UE5). It highlights the presence of features that support the blending of digital elements with the real world by recognizing objects, people, and locations. Moreover, UE5 offers several templates designed to demonstrate certain AR-related features, helping users quickly understand the capabilities offered within the engine. Although one of the showcased templates focuses on virtual reality rather than AR, the chapter primarily centers on AR functionalities.
05:00 - 11:00: Real World vs. Virtual World Space The chapter discusses the differences and interactions between real-world and virtual-world spaces, particularly within the contexts of augmented reality (AR) and virtual reality (VR). It elaborates on the usefulness of a 3D space setup for AR, featuring motion controllers and camera arrangements in platforms like Unreal Engine. The manipulation of objects through physics and the use of an arm-attached, in-virtual scene menu system are highlighted as effective UI strategies in VR or head-mounted displays (HMDs).
11:00 - 20:00: Working with Tracking and World Space This chapter discusses different AR (Augmented Reality) platforms with a focus on handheld devices like tablets and phones. It explains how virtual content is overlaid on the camera feed of these devices, providing examples such as Android and iOS platforms, and discusses UI design considerations for these applications.
20:00 - 30:00: Platform Anchors and AR Pins This chapter discusses the user interface design in augmented reality (AR), particularly focusing on placing UI elements directly onto the screen, similar to a mobile game. It introduces the concept of 'spawned AR content,' where virtual elements only appear dynamically in response to user interactions. The chapter refers to the 'handheld AR template,' which, when opened, contains minimal pre-loaded content such as lights and documentation tips, emphasizing dynamic content generation based on user input.
30:00 - 40:00: Cross-Platform AR Application Development The chapter discusses the development of a cross-platform AR application, focusing on the method of placing objects in a 3D space using AR. It explains how to collect additional context about the world, specifically using a rectangular plane in front of the user. By pointing the device and tapping, users can place objects onto the plane. This method effectively constrains the 3D interaction problem into a 2D solution.
40:00 - 52:00: Anchor Persistence and Sharing The chapter discusses an easier approach to creating content in AR projects using an HMD-based viewer template in Unreal Engine. It emphasizes a method where users can place objects directly into the environment, simplifying the workflow by allowing them to point and drop elements at specific spots. This approach aligns with a conventional Unreal workflow of adding content to a level for viewing.
52:00 - 60:00: Plugins and Future of AR Platforms The chapter discusses the integration of a mechanical device with an app, focusing on user convenience and customization. Users can adjust the location of the device within the app to a preferred position, which is automatically saved for future sessions, enhancing the user experience by remembering personalized settings.
Making Augmented Reality in UE5 | Unreal Fest 2022 Transcription
00:00 - 00:30 I'm Jeff Fisher, an XR programmer at Epic for the last six years, working on virtual reality and augmented reality during that time, and we're going to talk about how to make augmented reality applications for UE5. First, we're going to talk about what you need for augmented reality. Augmented reality is virtual content in the context of the real world, and the context is the space you're in,
00:30 - 01:00 the objects around you, the people around you, and where you are in the world, or at least some of these elements, to be AR. So we're going to have features to support these things, and Unreal 5 ships with a number of templates to show you how to do some AR-related features. We're going to go through those quickly to see what they have to offer. First, the Virtual Reality template. This is not an AR template, but it is virtual content in
01:00 - 01:30 a 3D space that you interact with, and it has a lot of features that are useful for AR, like motion controllers, the way you set up a camera in Unreal for virtual reality or augmented reality, and grabbing objects and manipulating them using physics. It also has an in-virtual-scene menu system attached to your arm, which is a good way to do UIs in virtual reality or on HMD
01:30 - 02:00 platforms or HMD AR platforms. Next we have the Handheld AR template. This is an example of tablet- or phone-based AR, where you hold the device in front of you and the virtual content is overlaid on top of the camera feed, so Android or iOS devices. This is a good example of how to do a UI for
02:00 - 02:30 that form factor, where you probably just put the UI onto the screen like a normal phone game and tap it directly. It's also an example of what we could call spawned AR content. If you open up the level for the Handheld AR template, you'll see that there is not very much in it: a couple of lights, some documentation tips, and nothing else. All of the virtual content is going to be spawned dynamically in response to user actions.
02:30 - 03:00 In this case we collect a little additional context about the world, a rectangular plane in front of you, and then you place objects onto it by pointing the device and tapping. This also illustrates another thing about AR interactions: placing items into a 3D space is difficult, and you want to constrain it however you can. In this app we constrain it by using this plane to turn a 3D problem into a 2D problem.
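At its core, the plane-constrained placement described here is a ray-plane intersection: cast a ray from the device through the tapped screen point and solve for where it hits the detected plane. In the actual template the platform's AR hit test does this for you; the following is a minimal standalone sketch of the underlying math in plain C++, with illustrative names (`RayPlaneIntersect` is not a UE API).

```cpp
#include <cassert>
#include <cmath>
#include <optional>

struct Vec3 {
    double x, y, z;
};

static double Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Intersect a ray (origin + t * dir, t >= 0) with an infinite plane given by
// a point on the plane and its normal. Returns the hit point, or nothing if
// the ray is parallel to the plane or the plane is behind the ray origin.
static std::optional<Vec3> RayPlaneIntersect(const Vec3& rayOrigin, const Vec3& rayDir,
                                             const Vec3& planePoint, const Vec3& planeNormal) {
    const double denom = Dot(planeNormal, rayDir);
    if (std::abs(denom) < 1e-9) {
        return std::nullopt;  // ray parallel to the plane
    }
    const Vec3 toPlane{planePoint.x - rayOrigin.x,
                       planePoint.y - rayOrigin.y,
                       planePoint.z - rayOrigin.z};
    const double t = Dot(planeNormal, toPlane) / denom;
    if (t < 0.0) {
        return std::nullopt;  // plane is behind the device
    }
    return Vec3{rayOrigin.x + t * rayDir.x,
                rayOrigin.y + t * rayDir.y,
                rayOrigin.z + t * rayDir.z};
}
```

Because the hit point always lies on the detected plane, the user is effectively picking a 2D spot, which is exactly the constraint the template exploits.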
03:00 - 03:30 Now you're just pointing at a spot and dropping onto that spot, and that's way easier for people to do. Next we have the HoloLens Viewer template. This is an HMD-based AR project, and it's also an example of another way to create AR content in Unreal, which is putting stuff in your level. That's the more typical workflow for Unreal: you have your level, you put content into it, and then people see that content.
03:30 - 04:00 So we have this complicated mechanical device that's the centerpiece of the experience. When you start up the app, that device is in a spot, and then you adjust it to be in a convenient spot for you to work on it. This template will also store that position, so that in a later session the content will automatically be in the spot that you chose previously.
04:00 - 04:30 And then we have one more example, which is a showcase sample, the Mission AR project. This doesn't run in Unreal 5; you probably have to back up to 4.27 to get it to run, but you can open it and look at how it was made. At first this seems a lot like the HoloLens Viewer template: there's some content, it was authored in levels, and you just need an open spot in front of you for it to live. But the more interesting thing is the original version of this project, which
04:30 - 05:00 was a stage presentation. There were two presenters, and there was a camera filming those presenters as they looked at the virtual content, and one of them interacted with it to drive the demo forward. Then we overlaid the virtual content onto that camera feed, so we had three users experiencing the same AR virtual content in the same real-world context. They were all seeing the same thing at the same time in the same spot,
05:00 - 05:30 and there was a dedicated server replicating the state so that they all saw the responses to the interaction from the one presenter. So this is an important thing to understand in AR: the real-world space versus the virtual space. The real-world space is the real world, where all the real objects are, and the virtual space is where all the virtual objects are
05:30 - 06:00 inside of your running instance of Unreal. These are both spatial mappings, and they align in a particular way, and we need to understand that sometimes. VR has the same distinction, but the real world in VR is very simple: you've got your HMD, and we need to know where it is in the real world so that we can render the right thing for the person to feel like they're in the virtual space, and their controllers are
06:00 - 06:30 in the real world, and we need to know where they are so that when you move your arm it interacts. And then we have a safe play area around you, perhaps to keep you from falling out of a window or bumping into something. That's not much of a real world, though, and in AR we have a lot more real world to worry about. In Unreal we talk about the real-world space as the tracking space; that's the space the device is using to describe the real world.
06:30 - 07:00 When the platform is telling us the position of a tracked object, it's giving it to us in tracking space, and that's lined up one-to-one with the real world. Then we have the Unreal world space; that's the space inside of Unreal. If you open up a level, if you're building level-based content, that corresponds to the space inside of the level, where everything is
07:00 - 07:30 in the level, and at runtime it's where everything is at runtime in Unreal. You can change the relationship between these two. If you look inside of any of the platform plugins, you'll see a tracking-to-world transform, and that's the transform that takes a point in tracking space and turns it into a point in Unreal space. In Unreal we do that with the player Pawn. The player Pawn is at the tracking space origin, so the tracking-to-
07:30 - 08:00 world transform is the same as the Pawn's transform. If you put (0, 0, 0) in tracking space into Unreal, it shows up at the position of your Pawn, and your Pawn's position is also that same transform from the Unreal world origin. If you think about how motion works in a virtual reality game, you can see how you move the Pawn through the world
08:00 - 08:30 and you see the world moving by you. That works for any game, not just a virtual reality game: you move the Pawn through the virtual space and you see new virtual space as you go. But with VR or AR you also have a real-world position, and your Pawn is in the same position in the real world all the time, barring maybe a little bit of room-scale walking around, but you can't go a long way. So if you were a third-party observer who could see both the real world and the virtual space, what you would see is the Unreal
08:30 - 09:00 world space moving past the user. The user is staying still, and the Unreal world space is moving around them. That's also how we can manipulate level-based content for AR, and I have a little example, hoping to illustrate these things better than just talking about them. All right, so I've started up my test level and we can see the content that's in it. In front of me I've got some controls to
09:00 - 09:30 control the test level up above, and I've got some meshes down below, and a little coordinate-system marker in the middle; that is the Unreal world origin, so (0, 0, 0) in Unreal space. If you opened up the editor for this level, this is pretty much what you would see. The meshes down below are set to be static, so they can't move, and they have some baked shadows on those white blocks. The controls are all movable and attached together, so that we can
09:30 - 10:00 position them where we want, and then we can move our potentially heavier-weight content to where we want it to be. So this is all in Unreal space. I've got a box here; I can show this big wire-mesh rectangle I'm inside of, representing the Unreal space, so it's oriented this way. Turn that back off, and if I back up a little bit, we can see that there's a pin here called
10:00 - 10:30 tracking origin. This is the tracking space origin, so more or less where my HoloLens was when I started up the level. The Pawn's origin, or really the camera component's parent's origin, is the tracking space origin, so in Unreal space my Pawn is here, a couple of meters in front of the content, and we can turn on
10:30 - 11:00 a visualization of the tracking space. You can see it's a little bit closer to me, but it's oriented the same way. Turn off the tracking space box. So one thing we can do is move these controls around to a more convenient place. I spawned an actor and I've attached the controls to it. I call this a pin actor, and it's an actor that knows how to create an AR pin, and thus the underlying
11:00 - 11:30 platform anchor, in this case a Windows Mixed Reality anchor, to hold it in real-world space. So there we go, I've pinned it. Now my pin, called 'controls', is holding my controls there by the wall. What about the static content? Static content can't move, so what we need to do is move the whole world. I want to get that content to be on top of this cabinet, so let's spawn another pin actor. This one's called
11:30 - 12:00 'origin', and I'm going to place it down here on top of the cabinet and pin it, and then I'm going to move the Unreal world origin to that actor. I've got a little delay on this so we can see it happen, and it's popped over there. We can turn on the Unreal box and we can see that it has also rotated and moved this way a little bit, although you can't really see that. So we've just moved the whole Unreal space to align with the real world in the way that we want it to, and all of our
12:00 - 12:30 content that was authored in Unreal space is now lined up with the real world the way we want. This stuff all stayed in the same place because it's pinned, and the pin's job is to keep it in the same real-world location, which is also the tracking space location. We can illustrate this a little bit better by starting the world moving. I'm now rotating the world around my pin, and what that really means, if we back up a little bit, is: here's
12:30 - 13:00 the tracking space origin, which is staying in the same place relative to the real world, and I'm teleporting my Pawn around in a circle around my Unreal space origin. That's how we move the world: by moving our Pawn inside of it. If you think about a VR game, your Pawn is moving through it while in the real world you're staying in the same place. This is the same thing, where our Pawn is moving through the virtual
13:00 - 13:30 space, and we could use this to move forward or whatever; in this case I'm just spinning. So let's stop the Unreal world from moving. What else is in tracking space? My hands. They're in the real world, right? So when I'm tracking my hands, and I've got these little motion-controller visualizations attached to them, those are in tracking space. If we move the Unreal world, they would stay still in the real world, but they're moving through the Unreal world even when I don't move my hands,
13:30 - 14:00 and the same goes for my headset: it's in a real-world location, and its real-world location is important because my head moving is what controls its movements. So I mentioned AR pins in that video, and all this mapping of the real world to the virtual world is not the easiest thing in the world to deal with,
14:00 - 14:30 so the platforms have a feature to help you with it: a platform anchor. A platform anchor is a fixed point relative to the real world that the platform maintains, and it tries to keep it in the same spot, particularly relative to nearby geometry, so it can try to keep you on top of a table, or on a wall, or just near something interesting. We make sure that it also survives moving the Unreal space relative to the tracking space,
14:30 - 15:00 and then we wrap that in an AR pin, which is an Unreal class that just wraps the platform anchor and makes different platform anchors look the same to your application, so you can make cross-platform applications more easily. With the AR pin you pin a scene component in place: you've got a scene component on an actor and a transform that you want it to stay at, you create a pin, and now that actor is going to stay in that real-world position.
15:00 - 15:30 An important point is that platform anchors aren't just about moving the Unreal world; they're also about maintaining a better fix against real-world geometry. These devices are scanning the world all the time, trying to figure out where they are and how they have moved, and they might change their mind a little bit: they might decide after scanning more that a wall is actually a little bit closer than they thought. If everything were fixed in the Unreal world space, your content would now be through the wall, but potentially the platform could know
15:30 - 16:00 that and adjust the tracking space position of the anchor, and your content would stay better aligned to the real world. This has some important consequences: be careful not to assume that the relative positions of your anchors are always exactly the same, because the platform could change them. The platforms also have recommendations about how close to an anchor you should stay, because even a tiny rotation, when you're on the
16:00 - 16:30 end of a big lever, is going to start moving you a lot. So you want to stay within two or three meters; the recommendations may vary a little by platform, but basically the closer you can stay to your pin, the better. This could get complicated with really big objects; you might need multiple pins, and then a spline or something like that might work really well to stick to multiple pins across an area in case they adjust. So this is the blueprint for doing AR
16:30 - 17:00 pin creation. This is from my AR pin actor that I used in that sample, so it's basically an actor that pins itself to its current position. We get the actor transform down here at the bottom, we get the default scene root, and we call Pin Component, which gives us a pin, which we store in a variable so that we can do things with it later. That's all you have to do. And then we'll talk a little bit about the math for moving the world origin,
17:00 - 17:30 because this is one case where you would actually have to use the tracking-to-world transform to do something like this. I've got my Unreal origin, I've got my AR pin, and I want to move my Unreal origin to be on my AR pin in the real world, and I'm going to do that by moving my Pawn, because that's how I change the relationship between the Unreal world and the real world. These two blue objects, the AR pin and the Pawn, both have a real-world
17:30 - 18:00 position, and they will maintain it: the Pawn is the reference for the whole tracking space, and the AR pin is fixed in tracking space, because that's what it does. So those two things are going to stay in the same position relative to each other if we move the Unreal world. In this little parallelogram diagram, we need to move the Pawn until we've zeroed out the
18:00 - 18:30 transform of the AR pin. You zero out a transform, or turn it into an identity, by combining it with its inverse. So if we take the Pawn and we apply the inverse of the AR pin's transform, we will get to the position which puts the AR pin at the Unreal origin, and we've got the math for that at the bottom. This is what the blueprint looks like: I have a player Pawn and a target actor. I'm going to get the transform out of my
18:30 - 19:00 player Pawn, which is the tracking-to-world transform. There's also a special blueprint function you can call to get the tracking-to-world transform; I did it this way here for clarity, to reinforce the specialness of the Pawn. Then we have the AR pin's transform; we invert that and combine the two, and then there are a couple more steps I go through here. First I'm going to break the transform, break its
19:00 - 19:30 rotation, and zero out roll and pitch. For this app I don't want to roll or pitch the world; you could, but I don't want to, and this is probably redundant anyway, because the Pawn class I'm using enforces Z-up, so it probably wouldn't actually accept the roll or the pitch. But it could be important if you're doing something else, or it could be important to not do this if you actually want to be able to roll or pitch the world. The other thing
19:30 - 20:00 is kind of done by the teleport function: you can see that it doesn't take a transform; it takes a location and a rotation separately. You don't want a transform matrix in a loop where it's both the input and the output, because you could accumulate error inside that transform and something weird could happen. You're pretty well protected from this here at multiple levels, so we don't have to do anything explicit about it.
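The "combine the Pawn's transform with the inverse of the pin's transform" step can be verified with plain transform algebra. Below is a self-contained sketch using a yaw-plus-translation 2D rigid transform rather than UE's `FTransform` (all names here are illustrative); `Compose(A, B)` means "apply A, then B", matching the argument order of Blueprint's Compose Transforms.

```cpp
#include <cassert>
#include <cmath>

// A rigid 2D transform: rotate by 'yaw' radians, then translate by (tx, ty).
struct Xform {
    double yaw, tx, ty;
};

// Apply a, then b: p -> R_b * (R_a * p + t_a) + t_b.
static Xform Compose(const Xform& a, const Xform& b) {
    const double c = std::cos(b.yaw), s = std::sin(b.yaw);
    return Xform{a.yaw + b.yaw,
                 c * a.tx - s * a.ty + b.tx,
                 s * a.tx + c * a.ty + b.ty};
}

// Inverse of (R, t) is (R^-1, -R^-1 * t).
static Xform Inverse(const Xform& a) {
    const double c = std::cos(-a.yaw), s = std::sin(-a.yaw);
    return Xform{-a.yaw, -(c * a.tx - s * a.ty), -(s * a.tx + c * a.ty)};
}

// The Pawn's transform IS the tracking-to-world transform. To put the pin at
// the Unreal world origin, teleport the Pawn to pawn composed with the
// inverse of the pin's world transform, which zeroes the pin's transform out.
static Xform MoveWorldOriginToPin(const Xform& pawn, const Xform& pinWorld) {
    return Compose(pawn, Inverse(pinWorld));
}
```

Since the pin is fixed in tracking space, its world transform is its tracking-space transform composed with the Pawn's transform; after the teleport, that composition collapses to the identity, which is exactly "the pin is now the Unreal origin".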
20:00 - 20:30 So another thing that we needed for Mission AR was anchor persistence and sharing. We don't want to have to do that setup stage live in the show, and we don't want to run it and hope that our app keeps running for an hour while we wait for the show to actually start. We want to be able to rehearse and have that location be exactly the same every time, and we have three users and we need them all to have the same idea of where the content is.
20:30 - 21:00 So let's talk about how we did that, and we've got another video just showing me doing that in this test level. So we've set all this up and we don't want to have to do this every time we come into our app: maybe we've placed these things where we want them and we want to be able to come and go without doing that setup again. So we have the ability to persist pins, and there are various ways of doing that. In this demo we're using Azure Spatial Anchors, which persists them up to the cloud, and we can see we've got this ASA
21:00 - 21:30 session status. We started an Azure Spatial Anchors session when this level started up, and it shows ready-to-create and recommended-create values, 16 and 3. If those numbers are better than 1, then you're ready to create pins, and we're obviously way over 1, so we're totally fine. So I can save my pins: I can save my controls pin and I can save my origin pin. You see it says 'ASA saving'; that one succeeded, and this one has now succeeded. So we've uploaded some data about
21:30 - 22:00 the geometry here, and I have allowed it to know about the Wi-Fi and the GPS coordinates so that it can find where it is. Let's create another pin. I accidentally created a controls pin; we won't save that one. So let's create another pin, some other kind of content we wanted to put over there, and we'll save that one, so that one's saving,
22:00 - 22:30 and then we can quit, and we will start it up again and stand in a different spot, so our tracking position will be a little bit different and the tracking space will be aligned differently, so the content's over there instead of over there. If I back up somewhere here, we'll see the tracking origin.
22:30 - 23:00 We haven't loaded our pins yet; this level is not set up to automatically start loading the pins, so I have to trigger that. We're going to start a watcher, which is their word for their API for finding pins around you. Here we go: we've loaded up the controls, and this level is set up to automatically snap the controls to that pin when it's loaded. We've loaded the origin pin, and we've loaded this other pin, pin number four, over here. Let's
23:00 - 23:30 move. It's not set up to move the world automatically; I'm going to do that as a manual step. I've got the same little delay, and there we go: my world is now lined up and my content is all the same as it was before. These pins can last a long time; you set an expiration time on them. I had a couple of them in this room for over a year before I deleted them, and I moved furniture around and all that, and they were still working well.
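At the application level, the persistence flow shown in the demo is mostly bookkeeping around anchor IDs: persist a pin, get back an opaque ID plus any app metadata (like the labels 'origin' and 'controls'), store or transmit it, and later resolve it back into a pin. A toy sketch of that bookkeeping, with an in-memory map standing in for the cloud service (the real calls go through the Azure Spatial Anchors plugin; every name here is illustrative):

```cpp
#include <cassert>
#include <map>
#include <string>

// What the app keeps per persisted anchor: the opaque cloud ID plus a small
// metadata label ("origin", "controls", ...) so we know what to snap to it.
struct SavedAnchor {
    std::string cloudId;  // opaque ID returned by the persistence service
    std::string label;    // app-level metadata stored alongside the anchor
};

// Toy stand-in for the cloud anchor store, keyed by cloud ID.
using AnchorStore = std::map<std::string, SavedAnchor>;

// "Persist": record the anchor and hand back the ID the app should keep,
// either in a save-game file or sent over the network to another device.
static std::string PersistAnchor(AnchorStore& store, const std::string& cloudId,
                                 const std::string& label) {
    store[cloudId] = SavedAnchor{cloudId, label};
    return cloudId;
}

// "Resolve": given an ID found by the watcher, recover the label so the app
// knows which content (world origin, controls, ...) to attach to the pin.
static std::string ResolveLabel(const AnchorStore& store, const std::string& cloudId) {
    const auto it = store.find(cloudId);
    return it == store.end() ? std::string{} : it->second.label;
}
```

As the talk notes, for richer apps you would pair this with networking or save-game files; the metadata alone only gets you simple cases like tagging which pin is the world origin.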
23:30 - 24:00 So this is also cross-platform. I've got a video of it on the phone, and I'm going to turn that on, but I'm going to mute it and just talk over it. This is on a Pixel 6 phone using ARCore and Azure Spatial Anchors; it's the same app. Instead of just repeating what I'm saying in the video, I'm going to say some things about making a cross-platform app. The very first step of this was super easy: it was
24:00 - 24:30 just enabling ARCore and getting set up to build for Android, and then my app was running. But all the interactions implemented in my app were based on tracked hands or motion controllers, and I don't have those on a phone. Luckily this was my test app, and it was made to be lowest-common-denominator while implementing all these other features, so it's basically look-and-poke: all I do is look at one of those boxes and it lights up, and then I do the pinch gesture. So all I needed to
24:30 - 25:00 do was replace the pinch gesture, because I'm still looking at things with my phone. But a lot of the work in going between form factors, between HMD and tablet, is going to be in the interactions; they're going to have to be very different. Another difference, which you can probably see in this video somewhere, there we go: that cylinder is wireframe on the HoloLens, but the renderer here doesn't support wireframe, so it's just solid white, and I
25:00 - 25:30 couldn't do that thing where I showed the Unreal space by drawing a wireframe box around me, because it would just be a white box. So of course you'll have small differences in the way rendering works on the different platforms, and then potentially performance is the other big stumbling factor. Now, these are both mobile-type devices, and with this app I'm not streaming to the HoloLens, I'm running on the HoloLens, so I didn't really have that problem; they were roughly comparable, and it's a simple level, so it doesn't have a lot of that
25:30 - 26:00 anyway. But if you were trying to go between, say, PC-based AR with a lot more power, you could easily have trouble porting it to a handheld. I think something that's really cool about this, though, maybe for a marketing purpose or something like that, is that you could have a more expensive device that's harder to set up for the main viewer, and then you could have a secondary person who can also see the content and control it but
26:00 - 26:30 is using a much you know cheaper device that doesn't that doesn't separate from the world as much so we need to add some things to our diagram we've got our AR pin pinning our scene component it's backed by a platform anchor and now we have a persisting anchor and we have the stored anchor data in the platform in the platforms system so the persisting anchor is is either just an ID which which we would
26:30 - 27:00 typically store in a string or it's an object that wraps the ID and lets you call functions on the object and what you would do is you would have your AR pin and you would say you would call a function to persist it and you would get a persisting anchor or an ID and then you could store that in a save game file or if your anchor persisting system supports sharing to other devices you could transmit it over the network to another device and on the other end you've got your ID
27:00 - 27:30 or your persisting anchor object and you um you you call a function and it gives you back an AR pin wrapping a platform anchor for that spot and now you have a real world position on a different device or at a different time that's that's the same as before this kind of stuff was possible you you had some some options that were uh bad and some options that were really bad and the the bad ones were kind of like
27:30 - 28:00 putting QR codes everywhere. That works, but it's ugly, and it means you have to set up in advance. Actually, that can be a really good solution if you completely control the environment and don't mind a QR code sitting in the middle of it. The worst solutions are where you try to get the tracking space origin to be the same by putting your device in the same spot, turning it on, and hoping it chooses the exact same orientation and position. That'll probably get you close enough for a lot of purposes, but it won't get
28:00 - 28:30 you really close, and it's a huge hassle. So these kinds of systems are a lot better, and Azure Spatial Anchors is not the only one: Google Cloud Anchors can do the same sharing through time and across devices, and there are other ways to do it too. If you're planning an app and you're going to use anchors like this, it's important to investigate all these different systems and see which
28:30 - 29:00 one does the things you need it to do. One thing Azure Spatial Anchors gave me for this demo is the ability to add a little bit of metadata to my stored anchors. That's how I know, when I load up this anchor, that it's the origin, and this other one is the controls: I stored just the word "controls" and the word "origin" on those pins. I didn't have to do any networking or any save game files for this demo. I don't think you can make that many interesting apps using only that metadata; you'd probably need some kind
29:00 - 29:30 of networking connection to do more interesting stuff, but you can do things just like that. And you could definitely use it for a local-only app where you save data to your own file. Say you went to another room and you had that type of pin stored in that other location: now you can find those pins just by scanning that room through the system. You don't have to somehow know that you're in a
29:30 - 30:00 different room and then look into your saved data to find the anchors for that room; that starts to get a little chicken-and-egg. So here is the wrap-up demo where I'm putting it all together, everything you need to do something like Mission AR. Okay, let's put it all together: starting my app up on my HoloLens, starting up the watcher, moving the world origin,
30:00 - 30:30 and then bringing in a second device. We'll need to start the watcher on the phone. It's loaded the pins; we'll move the controls, move the world origin. Now look, it's showing the same thing: we've got the controls over here attached to that pin, we've moved the world origin so it's on top of the cabinet, and we've got our other random pins up there.
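The persist-and-restore flow the watcher just exercised, persist a pin to an ID, store or transmit the ID, then resolve it back to a pin on another device, can be sketched in Python. The `CloudAnchorStore` class here is a hypothetical stand-in for a service like Azure Spatial Anchors, not a real UE5 or Azure API, and the metadata labels mirror the "origin"/"controls" trick from the demo:

```python
import json
import uuid

class CloudAnchorStore:
    """Hypothetical stand-in for a cloud anchor service such as
    Azure Spatial Anchors; not a real UE5 or Azure API."""
    def __init__(self):
        self._anchors = {}  # persisted ID -> (pose, metadata)

    def persist(self, pose, metadata=""):
        """Persist a platform anchor plus a small metadata string; returns
        the ID you would store in a save game or send over the network."""
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = (pose, metadata)
        return anchor_id

    def resolve(self, anchor_id):
        """On another device, or in a later session, turn the ID back into
        a pose (an AR pin wrapping a platform anchor, in UE terms)."""
        return self._anchors[anchor_id]

# Device A: persist two pins, labeling them with metadata as in the demo.
store = CloudAnchorStore()
save_game = json.dumps({
    "pins": [
        store.persist({"position": (0.0, 0.0, 0.0)}, metadata="origin"),
        store.persist({"position": (2.0, 1.0, 0.8)}, metadata="controls"),
    ]
})

# Device B: load the IDs and resolve each one; the metadata alone tells us
# which pin is which, with no extra networking or save-file bookkeeping.
labels = [store.resolve(pin_id)[1] for pin_id in json.loads(save_game)["pins"]]
```

In a real app the JSON blob would be the save game file (or a network message), and the resolve step would happen asynchronously as the device scans the room.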
30:30 - 31:00 What we've got now is that on both devices the Unreal world space is aligned to the real world in the same way. At this point we could use normal Unreal networking functionality to move things through the world, to spawn them, to replicate their positions and their state, and they would show up in the same place relative to the real world on both devices. And we have a very rudimentary version of that we can do right here, which is
31:00 - 31:30 I can place another pin. Let's put it on the corner of the chair down there. I place the pin on the corner of the chair on my HoloLens; it's not on the phone yet, see. Then we can save it, so it's saving, and then we can watch it get loaded on the phone. There we go, it's loaded that pin, pin
31:30 - 32:00 number one, on the edge of the chair. So we've got a space aligned, we can continue to add new real-world reference points, and we can share them between the two devices. I also want to go over the plugins that
32:00 - 32:30 are relevant for the different AR platforms we're supporting right now. For HoloLens: HoloLens is an OpenXR-based platform, which is good; it means there's a lot of consistency in its behavior versus other OpenXR HMDs, including the Meta Oculus HMDs that have some AR features. There's also an eye tracker, a hand tracker, and then this hand interaction, which makes your tracked hands act as a controller, which
32:30 - 33:00 just makes it easier to do a fairly simple cross-platform application with the HoloLens. Then there's the Microsoft OpenXR plugin. This is their plugin, which they maintain and provide through the Marketplace and GitHub, and it gives you access to all of their platform features for HoloLens. And then I have this Azure Spatial Anchors plugin; that's the plugin that gives you the editor side of Azure Spatial Anchors, and then you also need support for each platform for that,
33:00 - 33:30 and in this case Microsoft OpenXR provides that support. Then we have the ARCore plugins: Google ARCore, and Google ARCore Services, which I wasn't using here but which includes the Google Cloud Anchors. AR Utilities includes some helpers for working with handheld AR, and then I have the two Azure Spatial Anchors plugins in this case. For ARKit, we've got the ARKit
33:30 - 34:00 plugin and a face tracking one. Again, Azure Spatial Anchors supports this; it's a handheld platform, so we have those handheld helpers with AR Utilities, and the Google Cloud Anchors also work on ARKit, so that's another option there. And then Oculus: I don't know a huge amount about the Oculus plugins. They maintain them; they have a plugin and a
34:00 - 34:30 somewhat modified branch of Unreal that they support, so I would refer you to their documentation. It is an OpenXR-based implementation, though. So what about the future? Augmented reality is in a similar place to where VR was a few years ago: there's a lot of feature overlap and not a lot of API overlap, so you get a different API for every platform.
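Before looking ahead, it's worth pinning down why the shared anchor made the two devices agree in the demo: positions are expressed relative to the anchor rather than to each device's own tracking origin. A minimal, translation-only sketch (hypothetical helper names; a real implementation would use full rotation-and-translation transforms):

```python
def tracking_to_shared(point, anchor_in_tracking):
    """Re-express a tracking-space point relative to the shared anchor.
    Translation only, for brevity; real code would use full transforms."""
    return tuple(p - a for p, a in zip(point, anchor_in_tracking))

# The same physical anchor, as seen in each device's own tracking space.
# Each device picked its own arbitrary tracking origin at startup.
anchor_on_hololens = (3.0, 1.0, 0.0)
anchor_on_phone = (-2.0, 0.5, 0.0)

# The same physical point (say, the corner of the chair), per device:
chair_on_hololens = (4.0, 1.0, 0.0)
chair_on_phone = (-1.0, 0.5, 0.0)

# Relative to the shared anchor, both devices compute the same coordinate,
# which is why replicated positions land on the same real-world spot.
a = tracking_to_shared(chair_on_hololens, anchor_on_hololens)
b = tracking_to_shared(chair_on_phone, anchor_on_phone)
```

This is the essence of what an AR pin backed by a resolved platform anchor does for you: it hides the per-device tracking origins behind one agreed-upon reference frame.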
34:30 - 35:00 Will OpenXR help with that? Microsoft and Meta are both working with OpenXR-based platforms, but right now they each have their own extensions for doing AR things, so there isn't a lot of overlap there. It does mean that the maintenance work of activating features is similar between the two, which helps, and they both fit into OpenXR as their display system.
35:00 - 35:30 So right now it's still up to us to wrap these things and try to make them look the same so that it's easier to develop cross-platform features. If you want to build cross-platform features, you should definitely look into that and see where you need to put layers inside your app so that you can switch to different systems on different platforms, because you might be able to save yourself a lot of work in the future for only a little bit of work at the beginning.
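The "layers inside your app" advice can be sketched as a small interface that the game code targets, with one backend per platform anchor system. All names here are hypothetical placeholders, not real UE5 or plugin APIs:

```python
from abc import ABC, abstractmethod

class AnchorBackend(ABC):
    """App-side interface; game code calls only these two methods."""
    @abstractmethod
    def persist(self, pose) -> str: ...
    @abstractmethod
    def resolve(self, anchor_id): ...

class InMemoryBackend(AnchorBackend):
    """Test double; real backends would wrap Azure Spatial Anchors,
    Google Cloud Anchors, and so on behind the same two calls."""
    def __init__(self):
        self._store = {}

    def persist(self, pose):
        anchor_id = f"anchor-{len(self._store)}"
        self._store[anchor_id] = pose
        return anchor_id

    def resolve(self, anchor_id):
        return self._store[anchor_id]

def backend_for(platform: str) -> AnchorBackend:
    # A real app would branch to the right platform plugin here; swapping
    # anchor systems later then touches only this function, not game code.
    return InMemoryBackend()

backend = backend_for("hololens")
anchor_id = backend.persist((0.0, 1.0, 2.0))
```

The payoff is exactly the one described in the talk: when a platform changes its anchor API, or you adopt a new system with better sharing support, the change is confined to one backend instead of being scattered through the application.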