# How on "earth" can light rays reflect a person from a mirror!??

## Recommended Posts

Of course not: 6 multiplied by 2 is 12; 6 divided by 2 is 3.

Yes, it is true: frequency is the number of cycles in one second; period is how long (in seconds) one cycle lasts. (This has nothing to do with your statement that multiplication and division are the same.)
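As a quick sanity check, the two quantities are just reciprocals; a minimal sketch (the 50 Hz value is only an example):

```python
# Period and frequency are reciprocals: T = 1 / f.
f = 50.0     # frequency in hertz (cycles per second); example value
T = 1.0 / f  # period in seconds per cycle
print(T)     # 0.02: each cycle lasts 20 milliseconds
```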

From that page:

So, yes, obviously light can pass through some matter (air, water, glass, for example) but bosons do not pass through all matter.

So why do they call them Bosons??

Why was it so hard for the latter person to say this? Thanks!

About division I was meaning the reciprocal "in formulas", as used for refraction index purposes for lenses..

Do I really need to be this specific? Sorry if that sounds dumb..

Here is a statement of one of them.

http://en.wikipedia.org/wiki/Geometrical_optics

The magnification of a lens, where the negative sign is given, by convention, to indicate an upright image for positive values and an inverted image for negative values. As with mirrors, upright images produced by single lenses are virtual while inverted images are real:
M = - s2 / s1 = f / [f-s1]
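A quick numerical check that the two expressions agree, using the Gaussian lens equation 1/s1 + 1/s2 = 1/f (the focal length and object distance below are illustrative values, not from the article):

```python
# Thin-lens sketch: both magnification formulas give the same answer.
f = 10.0                # focal length (cm), illustrative
s1 = 15.0               # object distance (cm), illustrative
s2 = f * s1 / (s1 - f)  # image distance, solved from 1/s1 + 1/s2 = 1/f
M1 = -s2 / s1
M2 = f / (f - s1)
print(s2, M1, M2)       # 30.0 -2.0 -2.0: a real, inverted image, twice the size
```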

All photons are bosons. Not all bosons are photons.

Some photons can pass through some matter. But it is not a general property of bosons that they pass through matter.

Let's not forget that some bosons even bounce off things, making the matter even more incomprehensible than it may seem.

Lord help Us

In one frequency range a material can be transparent, and in another frequency range the same material can be opaque.

In one episode of Mythbusters they tested an IR camera and experimented with different materials to see which ones would hide things from the IR camera's view. Glass was pretty good at hiding from the IR camera.

The glass test is at about 8 minutes into the video.

I love science!

##### Share on other sites

About division I was meaning the reciprocal "in formulas", as used for refraction index purposes for lenses..

You should be aware that the refraction index is not constant.

It varies depending on the frequency of the photon.

We can even change the refraction index of a given material on the fly.

It's called the Kerr effect:

http://en.wikipedia.org/wiki/Kerr_effect

This can be used to create windows that are transparent or opaque whenever we need it.

http://en.wikipedia.org/wiki/Magneto-optic_Kerr_effect
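For a sense of scale, the (DC) Kerr effect changes the index by Δn = λKE², where K is the material's Kerr constant. A minimal sketch; the Kerr constant (roughly that of nitrobenzene) and the field strength below are approximate, illustrative values:

```python
# Kerr effect sketch: induced birefringence grows with the square of the field.
wavelength = 589e-9  # sodium D line, metres
K = 4.4e-12          # Kerr constant, m/V^2 (roughly nitrobenzene; illustrative)
E = 1.0e6            # applied electric field, V/m (illustrative)
delta_n = wavelength * K * E**2
print(delta_n)       # on the order of 1e-6: tiny, yet enough to switch light
```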

##### Share on other sites

So why do they call them Bosons??

Because they have integer spin and obey Bose-Einstein statistics. Like the other types of boson (W, Z, Higgs, gluon, etc.)

Do I really need to be this specific?

Yes. You often make very confused or confusing statements. It is hard to tell if this is because you do not understand the words you use, or are just being very loose with the meaning.

##### Share on other sites

Integer spin; now that makes more sense!

At times I tend to assume that a scientist would already know what I am talking about. I shouldn't do this, and I am working on it, thanks...

Wow! Now this is some new information! Thanks!

In this animation, is the plane of incidence a 2d plane, or is it a cubed volume?

This model makes it look as though we are surrounded by spheres, and as if we see reflections as millions and millions of random-sized spheres floating in the air, creating an image of the world around us.

Think in terms of 2d sprites in a video game, if that helps clarify my question.

I think what I am asking is whether we live in a flat hyperspace? It seems so..

I want to make sure I am not confusing these models and photos, because they seem flat and stationary, not dimensional.

Snell's law wavefronts

Here is another, with the semi-transparent plane; does this describe a wave as a 2d flat surface?

I thought waves had thickness to them, like a log rolling down a hill, if that helps clarify this question.

But they appear flat...

##### Share on other sites

In this animation, is the plane of incidence a 2d plane, or is it a cubed volume?

It's the surface of the material.

On top there is a material with refraction index 1.0, like vacuum or (almost) air; on the bottom there is a material with a higher refraction index.

Photons are emitted in all directions (isotropic emission). The ones that travel up and to the left/right never collide with the bottom material.

Those that do collide with the material pass through it (it's transparent), and are perhaps partially absorbed by the medium (not shown).

The dashed line shows an example path of such a photon.
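The bending at that boundary follows Snell's law, n1·sin(θ1) = n2·sin(θ2). A minimal sketch (the indices are illustrative, roughly air and glass); note that going from the denser medium out, steep angles give total internal reflection:

```python
import math

# Snell's law: n1*sin(theta1) = n2*sin(theta2).
def refract(n1, n2, theta1_deg):
    s = n1 / n2 * math.sin(math.radians(theta1_deg))
    if abs(s) > 1.0:
        return None  # total internal reflection: no transmitted ray
    return math.degrees(math.asin(s))

print(refract(1.0, 1.5, 30.0))  # air -> glass: bends toward the normal (~19.5 deg)
print(refract(1.5, 1.0, 60.0))  # glass -> air past the critical angle: None
```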

Edited by Sensei

##### Share on other sites

"On top there is material with refraction index 0.0 like vacuum"

Try 1.0.

##### Share on other sites

Good observation. Of course I meant 1.0.

##### Share on other sites

No, I believe now it's the normal that lies in the center of the plane of incidence.

It appears flat and has no volume... Why so?

If that plane of incidence were to spin in place through 360° in one second, or rather one cycle, while the surface remained still, would this resemble a praxinoscope?

http://en.wikipedia.org/wiki/Praxinoscope

What I am trying to comprehend is the logic behind the plane of incidence and its normal, in relation to how humans perceive light as something that has form, something that moves in the real world: buildings, cars, other people, sounds, traffic, people shopping. They are all moving, and light bounces off these things, but where is the big mirror it all happens in?

Are we all looking at the same surface? Is there a big mirror out there we have not discovered?

Sorry if my questions sound dumb at this point, but I am being honest and sincere about them.

Light, from what I understand, does not refract through humans, unless humans are some type of projection, or rather holograms? I am so confused at this point, now having frequency = 1 all the time; something does not make sense here.

Where then would this incidence plane and its normal be located? In the center of the city? Of the world?

Or is this model only applicable for experimental purposes?

I want to understand the greater picture.

You say they pass through it?

Ok, I see the grey thick line as what separates the two media, the dashed line is the light beam, and this is an isotropic type of emission.

They pass through it because the material is transparent, I got that. Or is it because light beams are invisible? I have read that light beams are invisible.

If I am honestly understanding this, then this would only mean that we do live in some type of hyperplane.

My reason for this is that it appears that everything is isotropic in relation to time: as I see it, frequency does not change, so the emission of light beams should not either, at least concerning magnetic waves...

In any case, these models seem too perfect in how light beams just pass through things. It seems that there is always one and only one light source, whether in the room, the office, or the universe, which serves as a guide of measurement...

Surely light is more radical and unpredictable, isn't it?

##### Share on other sites

The plane of incidence doesn't exist as a real thing.
It's just a mathematical tool to illustrate one of the many ways photons can be reflected from a material.

A smooth surface like water (in a closed room, where air can't influence it much) or flat glass has all its normal vectors pointing in the same direction.
A rough surface has disturbed normal vectors, resulting in disturbed reflections (which is why in 3d applications the attributes are called "bump map" and "roughness").

If you have the patience, watch this 10-minute video on how you can manually disturb normal vectors to get disturbed reflections (by adding a randomized vector to the surface normal vector and then normalizing the result):
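The trick the video demonstrates can be sketched in a few lines; the roughness value and the uniform jitter are illustrative choices, not any particular application's method:

```python
import math
import random

# Roughen a reflection: add a small random vector to the surface normal,
# then renormalize so it is a unit vector again.
def perturb_normal(n, roughness=0.2):
    jittered = [c + random.uniform(-roughness, roughness) for c in n]
    length = math.sqrt(sum(c * c for c in jittered))
    return [c / length for c in jittered]

smooth = [0.0, 1.0, 0.0]        # flat surface: normal points straight up
rough = perturb_normal(smooth)  # slightly different every call -> blurry reflection
print(rough)
```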

Light from what I understand does not refract through humans

http://en.wikipedia.org/wiki/Subsurface_scattering

If you place your hand between a strong light source and your eyes, you can see photons passing through your fingers (where the body is thinnest).

Modern 3d movies use subsurface scattering all the time, and 3d games are also starting to use this effect, because it adds a lot of realism.

Here is a quite extreme example:

On the left, the girl without subsurface scattering; on the right, with subsurface scattering:

Subsurface scattering can be quickly described as the multi-bounce of photons through different layers of a material after refraction, causing the material to be brighter as a whole.

Refracted photons exiting the material reveal the material's interior (like veins, bones, internal damage, or internal color, e.g. a car painted multiple times with different colors).
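This is also why thin body parts glow while thick ones stay dark: transmitted light falls off roughly exponentially with thickness (the Beer-Lambert law). A sketch with a purely illustrative absorption coefficient, not measured tissue data:

```python
import math

# Beer-Lambert sketch: transmittance = exp(-mu * thickness).
mu = 0.5  # absorption coefficient, 1/mm (illustrative, not real tissue data)
for thickness_mm in (2, 10, 30):  # roughly fingertip, palm, forearm
    transmitted = math.exp(-mu * thickness_mm)
    print(thickness_mm, round(transmitted, 4))
```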

Edited by Sensei

##### Share on other sites

So the photons in subsurface scattering pass through people in "real life" too? I just want to make sure.

I can't hear any sound in this video; I checked several times. But I know about this scattering in digital graphics and 3d.

Many ways photons can be reflected from a material: yes! I kind of thought this was true, thanks! I should have remembered subsurface scattering! I used to do this all the time and waited long hours for render times in my 3d scenes!

In the case of 3d texture maps, though, there had to be some material nodes with transparency; otherwise subsurface scattering and global illumination (i.e. set lighting with digital photons) would not really work correctly in the final rendered images, at least with mine. This is why I question whether photons pass through people in real life too, as in the 3d setup for 3d mesh models.

Having in mind that people's skin has many attributes (shiny, etc.), I assume human skin to be a texture that is able to reflect light in its own right.

##### Share on other sites
So the photons in subsurface scattering pass through people in "real life" too?

Yes, of course. But it's a small amount, and it depends strongly on thickness.

Close your eyes and "look" at the Sun or another strong source of light: you will see not black, like at night, but a reddish color, because photons pass through the thin skin of your eyelids.

I can't hear any sound in this video; I checked several times. But I know about this scattering in digital graphics and 3d.

There is no sound in this video. It shows how blurred reflections can be simulated by hand (and what 3d applications do internally), and has nothing to do with subsurface scattering. It's important knowledge for somebody making 3d.

I used to do this all the time and waited long hours for render times in my 3d scenes!

Because calculating the multi-bounces of photons between the different internal layers in a subsurface scattering node/shader takes a lot of CPU time.

It's one of the effects that slows down 3d rendering the most.

##### Share on other sites

Sensei, are you familiar with 3d texture maps in 3d?

You basically get a particle system set up. The one I've used uses flat circles emitting in many directions in the 3d program; then you add this 3d texture map over these particles (you hook them up in the nodes) and arrange the effects in a 3d space.

Would you say that reflections of any kind work similarly to this digital setup?

Or is that completely not the case?

It would appear that this may be the case, since real reflections in the real world are flat at the surface and appear to go through the mirror.

What I see is this particle system emitting a flat 2d surface in the real world, and the 3d texture would be the photons bouncing off matter; then we would visually see ourselves and other things in these reflections.

It is hard to place this as an example; sorry if that made no sense to you.

On another note, I assume total internal reflection has light beams coming from the bottom of the water up?

And then I assume the top of the water would have different reflections, say from the clouds in the sky, i.e., water reflects the sky's color on top of the water, right?

It would then be logical to say that both the bottom of the water and the top of the water exist in two dimensions but converged as 1 and -1? It kind of reminds me of a secant line...

##### Share on other sites

A particle system that uses sprites (2d flat planes always rotated toward the camera) is a cheap trick to render stuff quickly on computers; it gives quite nice flames and so on.

Each particle in 3d has parameters such as position, direction, velocity, acceleration, rotation, and time of life.

Time of life is attached to a gradient. At t=0 the particle is the hottest and brightest: the sprite has a white color.

Then, the longer the particle lives, the gradient changes through yellow, orange, red, and brown to black, and then the particle disappears.

Such a particle color can be used as a point light source (fading away with the inverse-square law) to brighten the regular geometry surrounding the fire in 3d space.

Sprites are rendered on the image one on top of another, stacking and blending together so they look like one piece.

Pretty often it's a pure post-process effect.

When the quantity of particles is counted in millions, and their paths are nicely simulated, we get a pretty nice fire like the one in your picture.

The same procedure is used to render clouds, but with different colors and different methods of decay and movement; the rendering procedure barely changes.
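The time-of-life gradient described above can be sketched as a lookup from normalized age to color; the RGB stops below are illustrative, not any package's defaults:

```python
# Map a particle's normalized age (0.0 = born, 1.0 = dying) to a fire color
# by linearly interpolating between gradient stops.
STOPS = [
    (0.0, (255, 255, 255)),  # white-hot at birth
    (0.3, (255, 220, 0)),    # yellow
    (0.5, (255, 130, 0)),    # orange
    (0.7, (200, 30, 0)),     # red
    (0.9, (80, 40, 20)),     # brown
    (1.0, (0, 0, 0)),        # black, then the particle disappears
]

def fire_color(age):
    for (t0, c0), (t1, c1) in zip(STOPS, STOPS[1:]):
        if t0 <= age <= t1:
            f = (age - t0) / (t1 - t0)
            return tuple(round(a + f * (b - a)) for a, b in zip(c0, c1))
    return STOPS[-1][1]

print(fire_color(0.0))  # (255, 255, 255)
print(fire_color(1.0))  # (0, 0, 0)
```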

Sensei, are you familiar with 3d texture maps in 3d?

Yes. But particle systems don't need texture maps; they can rely only on gradients and time-of-life.

A procedural texture is often used to disturb the directions of movement of the particles, to get semi-random directions and velocities. You can also modify the paths of particles using wind, gravity, attractors, and repellers.

In LightWave 3D such a procedural texture disturbing the particles is called a HyperTexture.

The particle system asks the hypertexture "is location xyz void or filled?", and the HyperTexture returns the result.

This is repeated, e.g., 100x100x100 times in a given box, so the 3d application knows where there is stuff (like a cloud), what parameters it has, and where there is nothing to render.

Example hypertexture used to render cloud:
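The void-or-filled query can be sketched as sampling a toy density function over a grid; the sphere falloff below stands in for a real hypertexture:

```python
import math

# Toy "hypertexture": density is 1 at the blob's center, fading to 0 at its
# edge.  The particle system samples it cell by cell over a box.
def density(x, y, z, center=(0.5, 0.5, 0.5), radius=0.4):
    d = math.dist((x, y, z), center)
    return max(0.0, 1.0 - d / radius)

N = 10  # sample a 10x10x10 box (the text's example uses 100x100x100)
filled = sum(
    density((i + 0.5) / N, (j + 0.5) / N, (k + 0.5) / N) > 0.0
    for i in range(N) for j in range(N) for k in range(N)
)
print(filled, "of", N**3, "cells contain cloud stuff")
```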

Or is that completely not the case?

Particle systems in 3d applications, and especially in 3d games, are cheap tricks to get a nice realistic look, but they're still tricks.

Unlike regular multi-bounce ray-tracing, which is very physically accurate IMHO (as long as you forget about the wave nature of matter and photons; 3d simulations use exclusively the particle nature!).

Particle systems can also render using volumetrics. This is much more physically accurate than a stack of 2d sprites, but it also takes much longer to render, and is therefore quite unusable in real-time 3d game engines.

Volumetrics intercept every ray-tracing routine and inject samples between ray-origin and ray-origin + ray-direction * ray-length.

This allows, for example, clouds to cast shadows on the ground.
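That sampling loop can be sketched as a simple ray march; the slab-shaped "cloud" and the absorption factor are toy choices for illustration:

```python
# March along a ray from origin to origin + direction * length, accumulating
# absorption wherever the density function says there is cloud.
def march(origin, direction, length, steps=100):
    step = length / steps
    transmittance = 1.0
    for i in range(steps):
        t = (i + 0.5) * step
        point = [o + d * t for o, d in zip(origin, direction)]
        rho = 1.0 if 2.0 <= point[1] <= 3.0 else 0.0  # toy slab cloud at y in [2, 3]
        transmittance *= 1.0 - rho * step * 0.5       # simple absorption per step
    return transmittance

# A ray going straight up passes through the slab and comes out darkened;
# that missing light is the cloud's shadow on the ground.
print(march([0.0, 0.0, 0.0], [0.0, 1.0, 0.0], 5.0))
```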

Edited by Sensei

##### Share on other sites

Wow! This information really helps me understand 3d and ray-tracing much better. I wonder if you could create a realistic scientific experiment with the refraction indexes of materials and computerized light beams, but I am not sure whether it would be physically accurate at all, i.e. matching the physical world...
