Using a mobile device to aggregate recycled objects


Fologram on a mobile device allows one to position physical objects according to a design made in Grasshopper. This tool will be especially helpful for future students to use on or off campus. Our current studies look at how found or recycled objects can be aggregated to create form and space. After Rhino and Grasshopper are used to design irregular aggregations and connections between objects, Fologram will be used to precisely place and join the items. Let us know about your experiences using Fologram on a mobile device, or share any thoughts on incorporating preexisting found items into iterative virtual design. How might a user increase accuracy for the most precise real-time positioning when access to a HoloLens headset might vary?


Great idea for a project, Rachel - looking forward to seeing this one!

The Sync Object component in Grasshopper has an option (in its right-click menu) to show only Outlines, which might help when placing 3D objects like this: you won't need to worry about the hologram occluding the physical object and making it difficult to judge whether you have placed it at the correct depth.

A couple of design ideas:

  • Could these repeating objects be located in different orientations as well as positions?
  • Could there be multiple object types?
  • Could there be custom joints between these cans? It could be really interesting to boolean the cans together in Rhino and then use Fologram to show the 3D cut curve required to fit the two physical objects together, sort of like a low-tech version of Greg Lynn's Blob Wall (a rough sketch of the intersection step follows below).
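For anyone wanting to try that last idea, here is a minimal GhPython sketch of the intersection step, assuming the cans are modelled as capped cylinders; the dimensions, placement, and tolerance are placeholders, not values from this project:

```python
import Rhino
import Rhino.Geometry as rg

tol = 0.01  # placeholder; use your document's absolute tolerance

# Two cans modelled as capped cylinders (example dimensions in mm).
plane_a = rg.Plane.WorldXY
plane_b = rg.Plane(rg.Point3d(0, -60, 40), rg.Vector3d(0, 1, 0))
can_a = rg.Cylinder(rg.Circle(plane_a, 33), 115).ToBrep(True, True)
can_b = rg.Cylinder(rg.Circle(plane_b, 33), 120).ToBrep(True, True)

# BrepBrep returns the 3D intersection curves - the cut lines you would
# stream to Fologram and trace onto the physical cans.
ok, curves, points = Rhino.Geometry.Intersect.Intersection.BrepBrep(can_a, can_b, tol)
cut_curves = list(curves) if ok else []
```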

@Gwyll

Hi, I wanted to chime in because Rachel is a Research Assistant working with me over the summer. As she said, we're interested in how accurately we can use the phone app as a fabrication tool, and some of what you've suggested is certainly the plan. I love the reference to Blob Wall and the potential to cut objects to fit.

I’m planning to run an elective course this fall that will engage students in using this tool to aggregate found materials, and then to pass on their designs to people in other locations to see how well the design can be replicated by others with their digital instructions. We’d love to have your input on this idea, and if you’re willing to join us virtually sometime during the semester, that would be such a benefit to the class.

Would you mind if I assigned students to make posts about their work in this space?

Thanks!

Hey @erb02

Oh yeah - the idea of distributing the fabrication instructions is a good one. Because of the lack of depth perception and hands-free interaction on the phone, this is probably going to work well if either:

a) the design is very tolerant of large margins of error (a bit like this project https://vimeo.com/312240310 that used the Fologram mobile app), or

b) the AR model doesn't need to be precise (e.g. if you are assembling lasercut parts with precise joints and you just need to know whether you have the right part).

I’m really interested in seeing how it goes, so yes, definitely happy to jump on a video call with the students at some point, and they are welcome to post their work on the forum. It’s a good way for us to be involved, actually.

@Gwyll
Very cool example project! I’m just trying to suss out some good prompts for the class, and the idea of high tolerance makes sense. Since we’re not sure whether we’ll be meeting in person, we’re planning assignments they can do at home, and I like the idea of trying to push the mobile version, if only because we have just two older HoloLens headsets and the school hasn’t yet committed to the software subscription anyway.

I’m also really interested in the idea of projecting a cut line onto a can/tube/other, and then assembling those parts. We’re trying the outline function now.

Do you have any good examples of projects that use tracking in interesting ways that could be done with mobile?

Thanks!

I think @tomo8177’s project using the tracked device position of the mobile phone (Camera tracking with mobile phone) to create camera paths is a good starting point for thinking about the value of Fologram in mobile AR applications. It is effective because it solves a problem (how to map a 3D camera path in physical space and translate it to Rhino) without running into the usual problems of mobile AR: lack of depth perception, lack of hands-free use, and often unreliable tracking.
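As a rough illustration of that camera path idea (not @tomo8177’s actual implementation), here’s a minimal GhPython sketch, assuming the tracked phone positions arrive as a list of Point3d samples; the jitter threshold is a placeholder:

```python
import Rhino.Geometry as rg

def camera_path(positions, min_step=10.0):
    """Filter jittery samples and interpolate a degree-3 camera path curve."""
    filtered = [positions[0]]
    for p in positions[1:]:
        # Skip samples closer than min_step (model units) to damp tracking noise.
        if p.DistanceTo(filtered[-1]) >= min_step:
            filtered.append(p)
    return rg.Curve.CreateInterpolatedCurve(filtered, 3)
```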

So the challenge is how to overcome these limitations:

Lack of depth perception:

  • This could potentially be solved with a secondary tracking system - e.g. colouring holograms by their distance to a tracked aruco marker would help the user judge depth (see the sketch after this list).
  • Alternatively you could work only in 2D planes - then you don’t have to worry about depth. This might become interesting if you switch between many 2D planes (maybe a folding process?).
  • This could also potentially be addressed using the occlusion material setting in the Sync Object component. If you have a 1:1 relationship between your digital model and the as-built physical model, the occlusion material might help hide holograms that would otherwise get in the way of placing a particular part and as a result make judging depth difficult.
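A minimal GhPython sketch of that depth-colouring idea, assuming `marker` is the tracked marker’s position as a Point3d and `pts` are the placement targets; the gradient range is a placeholder:

```python
import Rhino.Geometry as rg
from System.Drawing import Color

def depth_colours(pts, marker, max_dist=500.0):
    """One colour per point: green when near the marker, red when far."""
    colours = []
    for p in pts:
        t = min(p.DistanceTo(marker) / max_dist, 1.0)
        # Linear near/far blend is enough for a rough depth cue.
        colours.append(Color.FromArgb(int(255 * t), int(255 * (1 - t)), 0))
    return colours
```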

Lack of hands free use:

  • You could mount the phone in place using a tripod.
  • You could design in-and-out applications with the phone (where the AR model is used to check something occasionally, rather than being relied upon all of the time).
  • The phone could be the tracking device - e.g. you mount the phone to a tool that is tracked and then provide some simple feedback to the user on the screen as to the direction to move the tool. This is a bit of a crazy idea (a rough sketch of the feedback step follows this list).
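A rough sketch of that tool-feedback idea, assuming `tool_pt` streams from the tracked phone and `target_pt` comes from the design model; the tolerance is a placeholder:

```python
# tool_pt and target_pt are assumed to be Rhino.Geometry.Point3d values.
def guidance(tool_pt, target_pt, tol=2.0):
    """Return a plain-text instruction for moving the tool onto its target."""
    offset = target_pt - tool_pt  # vector the user should move the tool along
    dist = offset.Length
    if dist <= tol:
        return "On target"
    u = offset / dist  # unit direction of the required move
    return "Move %.0f units along (%.2f, %.2f, %.2f)" % (dist, u.X, u.Y, u.Z)
```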

Unreliable tracking:

  • New iPhones / iPads have very good tracking, so use these if possible.
  • Take care to introduce students to best practice for setting up physical spaces for AR tracking.
  • Consider how the AR app might complement other forms of instructions / documentation. E.g. in the example where you are showing cut paths on cans: because the cans are cylinders, you might be able to print the cut path in 2D and then just use AR to locate where it should wrap around the can (a sketch of the unrolling math follows this list).
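A sketch of that unrolling step, assuming the can is centred on the world Z axis: a point at angle theta and height z on a cylinder of radius r flattens to (r * theta, z). `cut_curve` is assumed to be the 3D cut curve lying on the cylinder:

```python
import math
import Rhino.Geometry as rg

def unroll_on_cylinder(cut_curve, radius, samples=100):
    """Flatten a curve on a cylinder to the XY plane for 2D printing."""
    params = cut_curve.DivideByCount(samples, True)
    flat, prev_theta = [], None
    for t in params:
        p = cut_curve.PointAt(t)
        theta = math.atan2(p.Y, p.X)
        if prev_theta is not None:
            # Unwrap the angle so the polyline stays continuous at the seam.
            while theta - prev_theta > math.pi:
                theta -= 2 * math.pi
            while theta - prev_theta < -math.pi:
                theta += 2 * math.pi
        prev_theta = theta
        flat.append(rg.Point3d(radius * theta, p.Z, 0))
    return rg.Polyline(flat)
```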

If you have access to a HoloLens you might also be able to condense the fabrication aspect into a small chunk of the semester and then convince the school to get a monthly license of Fologram - this is pretty cheap.

@Gwyll
Thanks for these thorough thoughts.

I decided to try intersecting tubes (toilet paper tubes in this case) to see how accurately I could map the intersection line and construct it physically. I set up a printed QR code with an offset base location for the tubes. (Is there a way to turn off the text marker on the QR code? It gets in the way in the AR environment.)

the target intersecting tubes:

the target lines to trace onto tube surfaces:

AR mapping:

comparison:

cut one tube and intersect them:


Of course it’s tough to get a perfect trace, but I found that the two traced lines met up very closely and that the physical assembly was very close to the digital model.

Fun experiment. More to come.


Hey @erb02 that looks like a pretty good first test!

I noticed the curves in Fologram are pretty faceted due to the scale of the model - if you want them to appear smooth you can try using the mesh pipe component and streaming the mesh instead. (We downsample curves based on a real-world scale.)
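If it helps, here’s a minimal GhPython sketch of meshing a pipe yourself with RhinoCommon’s Mesh.CreateFromCurvePipe (Rhino 6+), so the facet count is under your control before streaming; `curve` and `radius` are assumed inputs:

```python
import Rhino.Geometry as rg

def mesh_pipe(curve, radius, segments=16):
    """Pipe a curve as a mesh with an explicit around-the-pipe facet count."""
    # segments = faces around the pipe; 50 = lengthwise accuracy factor.
    return rg.Mesh.CreateFromCurvePipe(
        curve, radius, segments, 50, rg.MeshPipeCapStyle.Flat, False, None)
```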

@Gwyll
Yes, I did use the pipe tool on the intersection line, but I’m also using the outline function on the tube itself, and it seemed to intersect with the line in a strange way. The tube is a Fologram synced object, and its resolution seemed low. Do you mean I should use the pipe tool for the tube geometry too?

@Gwyll Also, is it possible to turn off the text in the Fologram QR? It gets in the way in AR (it seems very large and floats on top of my small geometry), and if it weren’t there it would be easier for me to print my QR along with an offset reference point in order to align geometries well. I did print the small QR and a circle for the base of the tube onto a single sheet, and although the QR had a text label at its center, it was still readable in the app.
Thanks!

Now that I check again, it looks like the pipes are showing up in Rhino but not in the mobile app - it’s just the line, as you said. It doesn’t seem to matter whether preview is enabled or disabled in GH; still no pipes show up in the app. Do you know what could be happening?


Here’s the piped line in Rhino.

Hey @erb02, cool project! In terms of the Fologram QR code text, you can hide it by turning off the ‘Fologram QR’ layer in Rhino if it’s becoming distracting:
Or you could turn it off directly on your device in the Layers menu of the Fologram for Mobile application.

In terms of the pipe not appearing, my first thought is that you need to send it as a separate Sync Object to get it to appear in Fologram, like the setup below:

Let me know if that solves both issues, and if not I can take a look further.

@Sean
Thanks for the reply. Strange - I think my setup with the piping was already like that, but I’ll double-check.

Also, will the snap feature still work with the QR layer off? I guess that’s where I was getting hung up.

Thanks!

Hi @erb02, the snap feature will still work if you turn the layer off. Let me know if you get it all working!