I’m learning AR development and have enjoyed working with RealityKit, but less so with Unity. I’m unsure whether a game engine is the best option for developing non-gaming apps, though it does have the advantage of being cross-platform. How was Fologram developed? Do you use a cross-platform solution or platform-specific SDKs, e.g. ARKit, ARCore, or WMR?
You’re right that the key advantage of using an engine is cross-platform support.
In favor of using an engine:
- Write once, run anywhere - one language compiled to native code on all platforms (IL2CPP is amazing)
- “Automatic” support for new hardware with minimal code changes
- Examples/libraries/communities etc are well supported and easy to find
- Simulation/diagnostics/profiling built-in
Against using an engine:
- Can be slower to support native features (less of an issue in 2021, but was more prevalent when ARKit and ARCore had quite a different feature set)
- Sharing projects/scenes between different platforms isn’t always a good idea when UI/input mechanisms are different
- Version control can be difficult
- Harder to optimize the last 1% (but the other 99% is done for you)
- Hard to make UI that feels native - engines are historically based around games, though this is changing.
- Larger binaries, as the engine is included in the distribution.
In the case of Fologram, the arguments for far outweigh those against, so we use Unity as an engine - the alternative would be Java/Kotlin for Android, Swift/Objective-C for iOS, and C++/DirectX for HoloLens, each with its own associated tooling.
Another thing to keep an eye on over the next few months/years is WebXR. At the moment it lacks the performance and feature richness of available engines but the APIs are slowly getting formalized, and proper “appless” AR certainly has perks for in-and-out experiences.
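To give a sense of how lightweight "appless" WebXR can be, here is a minimal feature-detection sketch. The `supportsImmersiveAR` helper and its `nav` parameter are my own (the parameter just makes it testable outside a browser); `navigator.xr` and `isSessionSupported` are the actual WebXR Device API surface.

```javascript
// Minimal WebXR feature-detection sketch. Accepting the navigator object
// as a parameter keeps the function testable outside a real browser.
async function supportsImmersiveAR(nav) {
  // navigator.xr is simply absent on browsers without WebXR
  // (e.g. iOS Safari at the time of writing).
  if (!nav || !nav.xr) return false;
  // Resolves to true when an 'immersive-ar' session could be
  // started on this device.
  return nav.xr.isSessionSupported('immersive-ar');
}

// In a real page, a supported browser would then start a session with:
//   const session = await navigator.xr.requestSession('immersive-ar');
```

Until adoption is broader, a check like this is also how you decide whether to fall back to an app-store link or a model viewer.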
Thanks for the comprehensive reply. I was put off Unity because of the poor 2D UI in their downloadable AR demo but Fologram has a polished UI so it seems it can be done!
WebXR makes a lot of sense, unfortunately Apple seems to be in no rush to implement it in Mobile Safari.
Yeah - lots of Unity demos have very mediocre (default) UI, so unless the demo is actually UI-focused I’d look past it and judge the feature itself.
That said, in the age of React/Angular/Vue and the ease of implementing Material/Bootstrap on the web, Unity UI is definitely a major pain point. Many things that are trivial in a browser are difficult in Unity, and both the built-in and third-party tools for building UI are lacking. Some change is happening here with the introduction of Unity Style Sheets (USS), but this is still immature.
Apple is definitely slowing down WebXR adoption, but it’s making its way into WebKit, which is the base of Safari - so hopefully not too much longer… Who knows. No doubt it will be timed with an iOS feature/device.
If you’re only after mobile devices, you could also check out some of the AR wrappers in React Native or Flutter - or if you’re up for an interesting challenge there are ways of embedding Unity to create hybrid apps.