
Good morning,

Hope this is not a stupid question; I am very new to Blender. My setup is:

  • A 3D environment built from iPad photogrammetry

  • Some lasers inserted into the scene (each a simple cylinder with an Emission node)

  • The lasers controlled via QLC+ → Art-Net → BlenderDMX, with a Python driver expression that modulates each laser's emission color from a separate DMX channel (roughly as sketched below)

We would now love to store the DMX animation directly in Blender as keyframes, so we can export the animation and put it back on the iPad for AR simulation. Is there any way to record the driver data in real time?
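For reference, the driver side is wired up roughly like this (a simplified sketch; `dmx_channels` and the `dmx()` helper are stand-ins for whatever BlenderDMX actually exposes):

```python
import bpy

# Stand-in for the buffer BlenderDMX fills from Art-Net;
# dmx_channels[i] would hold the latest 0-255 value of channel i.
dmx_channels = [0] * 512

def dmx(channel):
    """Map a DMX channel value to the 0.0-1.0 range drivers expect."""
    return dmx_channels[channel] / 255.0

# Expose the helper so a driver expression on an Emission node's
# Color input can simply read, e.g., dmx(5).
bpy.app.driver_namespace["dmx"] = dmx
```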

pennomi@lemmy.world · 1 point · 2 months ago

That’s a pretty niche use case, so I can’t be of specific help, but it sounds like a job for Python scripting. Assuming you have the same number of lasers the whole time, you can set animation keyframes on your cylinders to have the correct position/rotation/scale as your data comes in. Keyframe animation should export easily to USDZ for use in AR.

Again, I don’t know much about your specific setup, but this is the direction I’d look.
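As a minimal sketch of that idea (the object name and the pose values here are placeholders):

```python
import bpy

# Keyframe one cylinder's transform at the current frame as the
# incoming data arrives. "Laser.001" is a placeholder object name.
laser = bpy.data.objects["Laser.001"]
frame = bpy.context.scene.frame_current

laser.rotation_euler = (0.0, 0.3, 1.2)  # pose decoded from your data
laser.keyframe_insert(data_path="rotation_euler", frame=frame)
laser.keyframe_insert(data_path="location", frame=frame)
```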

TDSOJohn@lemmy.world · 2 points · 2 months ago

First of all, thank you for your answer! What I meant by animation is the light-color modulation itself, so I would basically need to listen to the incoming driver data and record it so I can play it back afterwards as an animation (and yes, we would export to USDZ in order to import into Reality Composer).
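Something along these lines is what I picture: a handler that, on every frame change, bakes the driver-evaluated emission color into real keyframes (a rough sketch; the `Laser` naming scheme and the `Emission` node name are assumptions, and after recording the drivers would have to be removed so the baked keyframes take effect):

```python
import bpy

def bake_laser_colors(scene, depsgraph=None):
    """Copy each laser's driver-evaluated emission color into a keyframe.

    Runs on every frame change, so playing the timeline while QLC+
    sends DMX data records the whole show.
    """
    for obj in scene.objects:
        if not obj.name.startswith("Laser"):  # placeholder naming scheme
            continue
        color = obj.active_material.node_tree.nodes["Emission"].inputs["Color"]
        color.keyframe_insert(data_path="default_value",
                              frame=scene.frame_current)

bpy.app.handlers.frame_change_post.append(bake_laser_colors)
# When the recording pass is done:
# bpy.app.handlers.frame_change_post.remove(bake_laser_colors)
```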

pennomi@lemmy.world · 1 point · 2 months ago

I do not believe USDZ supports animating an object’s color or any kind of driver-based logic (it certainly did not two years ago, when I was in the AR industry). Blender can do it, but Reality Composer probably can’t.

There are only three paths I can think of that might be viable:

  1. If USDZ supports UV animation (unlikely?), you could use the driver to animate the UV coordinates of the cylinders across a rainbow gradient, based on what wavelength of light you need.
  2. If you have a small number of colors, you can have redundant cylinders for each color and show/hide them in the animation to fake the color change (see the sketch after this list).
  3. In Reality Composer, set up some hooks that swap the materials of the cylinders based on various triggers. I think you can use a timer as a trigger. This is a very manual process and would be an absolute nightmare.
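For option 2, the show/hide keyframes could be scripted roughly like this (a sketch; the object names are placeholders, and whether visibility keyframes survive Blender's USDZ export is something to verify; keyframing scale to zero is a common fallback if they don't):

```python
import bpy

def show_only(visible_name, all_names, frame):
    """Keyframe visibility so exactly one pre-colored cylinder shows."""
    for name in all_names:
        obj = bpy.data.objects[name]
        hidden = (name != visible_name)
        obj.hide_viewport = hidden
        obj.hide_render = hidden
        obj.keyframe_insert(data_path="hide_viewport", frame=frame)
        obj.keyframe_insert(data_path="hide_render", frame=frame)

# e.g. switch the laser to green at frame 100
show_only("Laser_green", ["Laser_red", "Laser_green"], frame=100)
```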
TDSOJohn@lemmy.world · 2 points · 2 months ago

Thanks for the detailed explanation; not the answer I hoped for, but definitely the answer I needed! I will look into all three options. Thanks again!
