The 3 best ways to track live geometry in mixed reality with Fologram

Get the example: 2.10_Attaching Geometry to a Marker.gh (6.1 KB)

This example shows how simple it is to use augmented reality to check whether parts are correct, orient design models to markers, or track physical tools, even without much Grasshopper expertise. Learning to work with markers and QR codes is a great way to begin to understand some of the more powerful computer vision tools in Fologram.

Here are the three best ways to track live geometry in Fologram:

  1. Using Aruco Markers for tracking.
  2. Using Aruco Cubes for tracking.
  3. Using QR codes for tracking.

Templates for each of these tracking markers can be found on the Fologram website. Simply print them out, turn on tracking in Fologram, and a live update of the location of each marker will appear in Grasshopper. Move your marker around in front of your device and you will see the geometry from Grasshopper following it live.
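Under the hood, attaching geometry to a marker is just a change of coordinate frame: the tracker reports each marker as a plane (an origin plus x/y/z axes), and your model geometry is re-expressed in that plane's frame every frame. Here is a minimal plain-Python sketch of that math (not Fologram's actual API; the function name and tuple representation are illustrative assumptions):

```python
# Sketch of the "attach geometry to a marker" transform: a model-space point
# is re-expressed in the frame of a tracked marker plane (origin + axes).
# This is illustrative math only, not Fologram's API.

def orient_to_marker(point, origin, xaxis, yaxis, zaxis):
    """Map a model-space point onto a marker plane's coordinate frame."""
    px, py, pz = point
    return tuple(
        origin[i] + px * xaxis[i] + py * yaxis[i] + pz * zaxis[i]
        for i in range(3)
    )

# A marker detected at (5, 2, 0), rotated 90 degrees about the world Z axis:
marker_origin = (5.0, 2.0, 0.0)
marker_x = (0.0, 1.0, 0.0)
marker_y = (-1.0, 0.0, 0.0)
marker_z = (0.0, 0.0, 1.0)

# A point one unit along model X lands one unit along the marker's x axis:
p = orient_to_marker((1.0, 0.0, 0.0), marker_origin, marker_x, marker_y, marker_z)
# p == (5.0, 3.0, 0.0)
```

As the marker moves, the reported plane updates and the same transform keeps the geometry locked to it, which is why the hologram appears to follow the printed marker.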

You might see the tracked geometry jumping around or skipping frames. This can occur when your mobile camera either can't clearly see the marker or has a poor map of the space you are working in. Check out this article to learn more about how to map your space for robust tracking.
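If some jitter remains even with a good map, a common trick on the Grasshopper side (this is a generic smoothing technique, not a Fologram feature) is to run the marker position through an exponential moving average, trading a little responsiveness for stability:

```python
# Generic exponential moving average over a stream of tracked positions.
# Lower alpha = smoother but laggier; this is a common filter for noisy
# tracking data, not part of Fologram itself.

def smooth(positions, alpha=0.3):
    """Blend each new sample with the running estimate."""
    est = None
    out = []
    for p in positions:
        if est is None:
            est = p  # first sample: no history to blend with
        else:
            est = tuple(alpha * n + (1 - alpha) * e for n, e in zip(p, est))
        out.append(est)
    return out

# A jittery stream of x/y positions hovering around (10, 5):
raw = [(10.0, 5.0), (10.4, 4.8), (9.7, 5.3), (10.2, 4.9)]
smoothed = smooth(raw, alpha=0.3)
```

The same idea can be wired up in Grasshopper with a data dam or a small script component between the tracker output and your geometry.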

Like this example? Try different markers and apply different geometries to each, or check out our article ‘Augment a physical tool with a holographic overlay with Fologram in these 5 simple steps’ to learn about augmenting physical objects with markers!

Hi Sean,
is it possible to scan custom images or even objects to use for tracking? I’m trying to ‘attach’ something in AR to a physical object and if possible I’d like to avoid sticking a QR Code / Aruco Marker on it (designer problems). Do you maybe have an idea?
That would be super awesome!
Thank you!

Hi @Bekki5000

Efficient 3D object tracking typically requires neural networks to be pre-trained on the specific geometry of the object to be tracked, so it isn't easy to let Fologram users track custom objects at run time. From a quick search on the Grasshopper forum, you might be able to fudge something together using a Kinect sensor or webcam and your own object tracking algorithm, though this would probably run much slower than the simple QR code / Aruco marker solution.
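To give a sense of what "your own object tracking algorithm" means at its simplest: threshold each camera frame into a binary mask, then follow the centroid of the detected blob from frame to frame. A real webcam or Kinect pipeline (e.g. with OpenCV) would do this on live images; the toy sketch below stands in a 2D list of 0/1 values for a frame, purely to illustrate the idea:

```python
# Toy centroid tracking: each "frame" is a binary mask (2D list of 0/1),
# and the tracked position is the centroid of the nonzero pixels.
# Illustrative only -- a stand-in for a real webcam/Kinect pipeline.

def blob_centroid(mask):
    """Return the (row, col) centroid of all nonzero pixels, or None if empty."""
    pts = [(r, c) for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    if not pts:
        return None
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

# Two frames where a 2x2 "object" moves one column to the right:
frame1 = [[0, 1, 1, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 0]]
frame2 = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 0, 0, 0]]
# Centroid goes from (0.5, 1.5) to (0.5, 2.5): the object moved +1 column.
```

Markerless tracking of an arbitrary real object is much harder than this, which is why the pre-trained-network caveat above matters, and why printed markers remain the practical option.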

What are you wanting to track? And what’s the intended use case?

Hi @Gwyll,
well, my own object tracking algorithm sounds intense!
I’m coming at it more from a conceptual, speculative view - I want to create objects that have a presence in ‘real’ space and virtual space. So in a way, I want to create virtual extensions to actual objects that change our experience of them (say a chair, a trashcan, or a teapot).
Ideally, I want to showcase them as a live AR installation where everyone can walk around and look at them or I will make a video of me interacting with the things.
That would mean I would need to be able to reliably track the actual object and place the AR extension in relation to it - just like the white and red sphere on the drill bit in your tutorial video.
In the end I can probably just attach the Aruco markers (even though those don’t seem to be working super reliably for me either) - I was just wondering if there was a more ‘invisible’ option.
Thank you!

Hey @Bekki5000

The performance of marker tracking will depend largely on the make and type of mobile device you’re using (new iPhones and iPads work best) along with how visible the marker is to the phone camera (large / close markers track better than small / far ones). So unfortunately no invisible option here! You could potentially design how the marker works with the object - etching them into the back or seat of the chair could be an interesting idea.