this post was submitted on 12 Jun 2024
27 points (100.0% liked)

Programming

16752 readers
234 users here now

Welcome to the main community in programming.dev! Feel free to post anything relating to programming here!

Cross posting is strongly encouraged in the instance. If you feel your post or another person's post makes sense in another community cross post into it.

Hope you enjoy the instance!

Rules

Rules

  • Follow the programming.dev instance rules
  • Keep content related to programming in some way
  • If you're posting long videos try to add in some form of tldr for those who don't want to watch videos

Wormhole

Follow the wormhole through a path of communities [email protected]



founded 1 year ago
MODERATORS
 

cross-posted from: https://programming.dev/post/15448730

To learn how to program holograms, I'd like to gather some sources in this post.

The linked paper describes an already-optimised way of rendering holograms. I'd like to find a naive implementation of a hologram, e.g. on ShaderToy, that uses interferometric processing of stored interference patterns, the way a physical hologram works (I guess). I also want this post to be a resource for learning how laser holograms work in real life.

To create an introductory project on holographic rendering, these steps will be required:

  1. Store the interference patterns of a sphere or a cube in a texture. This will be the model of our physically correct hologram. Note: if this step requires saving thousands of textures, we should limit the available viewing angles (if that helps).
  2. Load the rendered patterns as a texture or an array of textures into a WebGL program.
  3. Create a shader that does the interferometric magic to render the sphere/cube from the hologram model.
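As a sketch of step 1, here's one way the interference pattern could be computed and stored as a texture, using numpy rather than a shader. All the numbers (HeNe wavelength, 2 µm pixel pitch, 5 mm distance) are hypothetical choices, and the object is a single point scatterer; a sphere or cube would be a sum over many such points:

```python
import numpy as np

# Hypothetical recording setup
N = 512              # texture resolution
pitch = 2e-6         # sample spacing on the recording plane (m)
wavelength = 633e-9  # HeNe laser (m)
k = 2 * np.pi / wavelength

# Recording-plane coordinates, centred on the optical axis
x = (np.arange(N) - N / 2) * pitch
X, Y = np.meshgrid(x, x)

# Object: a single point scatterer 5 mm above the plate
z = 5e-3
r = np.sqrt(X**2 + Y**2 + z**2)
object_wave = np.exp(1j * k * r)            # unit-amplitude spherical wave (1/r falloff ignored)
reference_wave = np.ones_like(object_wave)  # on-axis plane wave

# A physical plate records only intensity -> the interference pattern
pattern = np.abs(object_wave + reference_wave) ** 2
pattern /= pattern.max()  # normalise to [0, 1] for an 8-bit texture
```

The resulting `pattern` is the Fresnel zone-plate-like fringe pattern you'd then load into WebGL in step 2.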

The performance of the solution is irrelevant. Even if it takes an hour to generate the data and a minute to render one frame at low resolution, that's fine.

Note: the goal is not to create something that merely looks like a cool hologram, nor to render 3D objects with volume as with SDFs or volume rendering. It's all about creating a basic physical simulation of viewing a real hologram.

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago) (1 children)

A couple of ideas:

Encoding holograms

  • Model the object in 3D space (using Blender maybe?)
  • Use the Angular Spectrum algorithm to model light propagation, its interaction with the object, and it hitting the recording medium.
  • Your final recording should consist of two maps (aka "images") across (x, y): one of the light's amplitude and another of its phase offset. Together these are your recorded hologram.
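The encoding steps above could be sketched in numpy (a GLSL port would follow the same structure). The wavelength, pitch, distance, and the square-aperture object are hypothetical values; the angular spectrum method itself is just an FFT, a multiplication by a transfer function, and an inverse FFT:

```python
import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a complex field by distance z with the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)        # spatial frequencies (cycles/m)
    FX, FY = np.meshgrid(fx, fx)
    arg = (1.0 / wavelength) ** 2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)    # transfer function; evanescent waves suppressed
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Object plane: a small square aperture lit by a plane wave (hypothetical setup)
obj = np.zeros((256, 256), dtype=complex)
obj[120:136, 120:136] = 1.0

# Field at the recording plane, 5 mm away
recorded = angular_spectrum(obj, wavelength=633e-9, pitch=2e-6, z=5e-3)
amplitude_map = np.abs(recorded)   # first texture
phase_map = np.angle(recorded)     # second texture
```

Since the transfer function has unit magnitude for all propagating frequencies, the total energy of the field is preserved by the propagation, which is a handy sanity check.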

Decoding holograms:

  • Use the angular spectrum algorithm again except reverse the light's propagation direction. The amplitude and phase maps from the encoding phase are the initial conditions you'll use for the light.
  • The light's amplitude and phase information you calculate at various planes above the recording plane are the 3D "reconstructed" image.
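A minimal numpy sketch of this decode step, self-contained with its own propagation function (wavelength, pitch, and distance are hypothetical; at this sampling the evanescent cutoff is never reached, so the round trip is lossless):

```python
import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a complex field by distance z (negative z = back-propagation)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fx)
    arg = (1.0 / wavelength) ** 2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)  # evanescent components dropped
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Hypothetical setup: square aperture, HeNe wavelength, 2 micron pixel pitch
obj = np.zeros((256, 256), dtype=complex)
obj[120:136, 120:136] = 1.0
recorded = angular_spectrum(obj, 633e-9, 2e-6, 5e-3)   # encode at z = 5 mm

# Decode: same algorithm, propagation direction reversed
reconstructed = angular_spectrum(recorded, 633e-9, 2e-6, -5e-3)
```

Evaluating the back-propagation at several different z values instead of just one gives the stack of planes that forms the 3D reconstructed image.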

Last thought

Holography is often used to record information from the real world, and in that process it's impossible to record the light's phase during the encode step. Physicists call it "the phase problem", and there are all kinds of fancy tricks to try to get around it when decoding holograms in the computer. If you're simulating everything from scratch, then you have the luxury of recording the phase as well as the amplitude, and this should make decoding much easier as a result!
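A quick way to see the phase problem concretely in simulation: decode once from the full complex field and once from its amplitude alone. This is a hypothetical numpy sketch, with `angular_spectrum` being the textbook propagation described in the encoding steps:

```python
import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fx)
    arg = (1.0 / wavelength) ** 2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z) * (arg > 0))

obj = np.zeros((128, 128), dtype=complex)
obj[56:72, 56:72] = 1.0
recorded = angular_spectrum(obj, 633e-9, 2e-6, 5e-3)

# Full complex field kept: decoding recovers the object
good = angular_spectrum(recorded, 633e-9, 2e-6, -5e-3)

# Phase thrown away, as a real intensity-only recording would do:
bad = angular_spectrum(np.abs(recorded).astype(complex), 633e-9, 2e-6, -5e-3)
```

`good` matches the original aperture to numerical precision, while `bad` does not, which is exactly the luxury a from-scratch simulation buys you.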

[–] [email protected] 1 points 2 months ago

Thanks! Finally something concrete. Once I return to this to write a POC I'll revisit your tips here.