Did you know you can live track surface curvature in augmented reality with Fologram?

Get the example:
2.31_Surface Aruco Tracking_Curvature.gh (12.1 KB)

And example marker sheet:
Aruco 4x3.pdf (61.7 KB)

Learn how to create a digitized version of a curved surface with this easy-to-use computer vision tool in augmented reality with Fologram. If you’re working with curved physical geometry and want to create a live digitized version, or you’re just interested in learning the basics of computer vision in AR, this is a perfect example to get started with.

Doubly curved surfaces are notoriously difficult to fabricate accurately, and documenting the outcomes often relies on complex setups involving point cloud scans. Fologram’s marker tracking tools can be used to visualize surface curvature in AR and create an accurate digital copy of your surfaces in Grasshopper.

The tool tracks a grid of Aruco marker locations in physical space and updates the surface curvature live in augmented reality, with the digital copy of your surface sent straight back to your Grasshopper document, where you can check its accuracy against your digital model. The feedback is live and requires no extra equipment other than a HoloLens or mobile device.

Want to learn about other computer vision techniques in augmented reality? Check out our article ‘Have you built a complex brick wall? Use computer vision in AR to check its accuracy with this helpful guide!’ for more digitization techniques.

When trying to use the example, I keep getting an error in the ‘Surface from Point’ component saying “Surface could not be fitted”. I am not sure how to fix this error, but it is preventing the surface from showing. I haven’t made any changes to the Grasshopper script yet. Any ideas how to fix this? I attached my setup and Rhino view for reference.

Hi @rnaab

The Surface from Points component is expecting a grid of points as the input. In the example above we used a 4x7 grid of markers printed on a sheet of paper, and then set the U input of the Surface from Points component to 4 to match. You could use the marker grids here Aruco Marker Download (3x4 or 6x8) or make your own in Illustrator etc.
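To illustrate why the U input matters, here is a minimal pure-Python sketch of arranging a flat list of tracked marker points into a grid of rows, the structure Surface from Points expects. The point values are hypothetical stand-ins for the tracker output; inside Grasshopper you would wire the marker points and U value directly into the component rather than doing this by hand.

```python
# Sketch: split a flat list of marker points into rows of length U.
# Assumes points arrive in row-major order (sorted by marker ID).

def to_grid(points, u):
    """Split a flat point list into rows of u points each."""
    if len(points) % u != 0:
        raise ValueError("point count must be a multiple of the U count")
    return [points[i:i + u] for i in range(0, len(points), u)]

# Hypothetical 4x7 sheet: 28 (x, y, z) tuples, 4 markers per row.
pts = [(float(x), float(y), 0.0) for y in range(7) for x in range(4)]
rows = to_grid(pts, 4)
print(len(rows), len(rows[0]))  # 7 rows of 4 points
```

If the point count is not a multiple of U (for example, a marker has not been detected yet), the fit will fail, which is one common cause of the “Surface could not be fitted” error.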

If you’re only going to be manipulating your physical surface in plan (rather than, say, twisting it or picking it up and re-orienting it), then an alternative way to digitize it from the Aruco marker positions would be to draw vertical lines from each of the detected marker positions and then loft between them.

I’m trying out this method, and it appears to be reading the points in an order that causes the surface to twist around on itself. Am I missing something simple? The points from the Aruco grid are working just fine, but perhaps the order it reads them matters?

Hey @erb02

I had a quick look into this. It looks like two things are required to make it work:

  1. Sort the detected Aruco markers by their ID so that the Surface from Points component receives them in the correct order.
  2. Set the U input to match the dimensions of your grid of markers. I tested this with this PDF:
    Aruco 4x3.pdf (61.7 KB)
    and so I had my U input set to 3.
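Step 1 can be sketched in plain Python as below. The (id, point) pairs are hypothetical stand-ins for the values the Fologram tracking components output; in Grasshopper you could do the same with a Sort List component keyed on the marker IDs.

```python
# Sketch: sort detected markers by Aruco ID so Surface from Points
# receives them in a consistent row-major order.

detections = [
    (3, (0.0, 1.0, 0.2)),
    (0, (0.0, 0.0, 0.0)),
    (4, (1.0, 1.0, 0.1)),
    (1, (1.0, 0.0, 0.0)),
    (5, (2.0, 1.0, 0.3)),
    (2, (2.0, 0.0, 0.1)),
]

ordered = sorted(detections, key=lambda d: d[0])
points = [p for _, p in ordered]
# Feed `points` to Surface from Points with U = 3 (a 3-wide grid).
```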

It will still be a bit glitchy until the HoloLens / mobile has detected every marker in the grid. After that it works as expected.

Great! Worked much better after the sort. Should have sussed that out myself. Thanks.