MIT's trillion frames per second light-tracking camera


Shadow

A camera capable of visualising the movement of light has been unveiled by a team of scientists in the US.

 

http://www.bbc.co.uk/news/technology-16163931

http://web.media.mit.edu/~raskar/trillionfps/

 

I would be curious as to how they deal with storing the frames. If we assume 1 KB/frame, which is probably way less than the actual size, we get 30 PB of data per second, which, according to Wolfram Alpha, is about 1/30 of the estimated data content of the internet.
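For what it's worth, the raw data rate is easy to sanity-check in a couple of lines (assuming, as above, 1 KB per frame at a sustained trillion frames per second):

```python
# Back-of-envelope check of the storage estimate above,
# assuming 1 KB per frame at a sustained trillion frames per second.
FRAMES_PER_SEC = 10**12      # 1 trillion fps
BYTES_PER_FRAME = 1024       # assumed 1 KB/frame
PB = 10**15                  # 1 petabyte in bytes

rate_pb = FRAMES_PER_SEC * BYTES_PER_FRAME / PB
print(f"{rate_pb:.2f} PB/s")  # → 1.02 PB/s
```

The straight multiplication comes out to about 1 PB/s; a later reply in the thread asks about the 30 PB figure too.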


http://www.bbc.co.uk...nology-16163931

http://web.media.mit...ar/trillionfps/

 

I would be curious as to how they deal with storing the frames. If we assume 1 KB/frame, which is probably way less than the actual size, we get 30 PB of data per second, which, according to Wolfram Alpha, is about 1/30 of the estimated data content of the internet.

 

If you dig through their site a bit you'll see that it's not actually saving a trillion frames per second; it's compiling trillionth-of-a-second snapshots taken over a much longer time.

They had a few videos which looked like they 'ran' for a few nanoseconds each. Also, the things you're interested in that happen that quickly generally don't last very long, so a microsecond or so for each video is probably enough.
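The scheme described here is essentially equivalent-time (stroboscopic) sampling: the event is repeated many times, and each exposure samples it at a slightly later delay before everything is assembled into one "video". A toy sketch, with entirely made-up numbers:

```python
# Toy sketch of equivalent-time sampling (hypothetical numbers):
# a repeatable event is sampled once per repetition, each time at a
# slightly later delay, and the samples are stacked into a "video".
import numpy as np

def event(t_ps):
    """Toy repeatable event: a Gaussian light pulse whose position
    advances with time (one sample per picosecond of delay)."""
    return np.exp(-((np.arange(100) - 0.3 * t_ps) ** 2) / 4.0)

DELAY_STEP_PS = 1.0  # one frame per picosecond -> "a trillion fps"
N_FRAMES = 2000      # 2 ns of footage, gathered over many repetitions

video = np.stack([event(i * DELAY_STEP_PS) for i in range(N_FRAMES)])
print(video.shape)   # → (2000, 100): one row per picosecond of delay
```

No single exposure is ever a trillionth of a second of continuous recording; the "frame rate" comes from how finely the delay is stepped between repetitions.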


http://www.bbc.co.uk/news/technology-16163931

http://web.media.mit.edu/~raskar/trillionfps/

 

I would be curious as to how they deal with storing the frames. If we assume 1 KB/frame, which is probably way less than the actual size, we get 30 PB of data per second, which, according to Wolfram Alpha, is about 1/30 of the estimated data content of the internet.

 

 

I'm kind of curious how 1 Tfps × 1 KB = 30 PB? Forgive me if it's something obvious...

 

30 PB is a lot, and is costly as can be seen here, but when the date of the post and Schrödinger's hat's comments on the length of the sequences are factored in, it really isn't unimaginable. Even at 10 nanoseconds (the bottle shot was ~1 ns), with 1 MB frames, a clip would be, what, 10 GB... which really isn't much at all.
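A quick check of those numbers (10 ns of footage at 10^12 fps, 1 MB assumed per frame):

```python
# Clip-size estimate: 10 ns of footage at a trillion fps,
# assuming 1 MB per frame as in the post above.
fps = 10**12
clip_seconds = 10e-9          # 10 ns clip
bytes_per_frame = 10**6       # 1 MB/frame

n_frames = fps * clip_seconds
clip_gb = n_frames * bytes_per_frame / 10**9
print(f"{n_frames:.0f} frames, {clip_gb:.0f} GB")  # → 10000 frames, 10 GB
```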

Edited by Xittenn

If you dig through their site a bit you'll see that it's not actually saving a trillion frames per second; it's compiling trillionth-of-a-second snapshots taken over a much longer time.

They had a few videos which looked like they 'ran' for a few nanoseconds each. Also, the things you're interested in that happen that quickly generally don't last very long, so a microsecond or so for each video is probably enough.

 

So, a billionth of a second. A few thousand frames.

 

Already queued for my blog tomorrow, though my take was on a press-release-style video in which the PI says "where we can see photons, or light particles, moving through space". Ugh.


Very clever, but I think it's misleading to say it's zillions of frames per second.

If a bus goes past my house at exactly 8 o'clock each morning and I take a picture at 08:00:00 today then 08:00:01 tomorrow, 08:00:02 the next day and so on, then stick the images together into an animation, have I really taken a picture every second?

Give or take a change of scale, that's what this new camera does.

 

In any event, if I want to make a time lapse or slow motion video of anything I'm going to arrange to play it back over the course of something like half a minute (at least for most things).

It doesn't matter if it's a hummingbird's wing flapping, or a tree growing.

If I make the end result take about half a minute then it will be slow enough to see what's happening, but not so slow as to be dull.

If I play that back at 25 frames a second then I need 25*30 pictures.

If each is a one megapixel image (which isn't bad resolution; it's probably comparable with the screen on which you are reading this) then I need 25*30*1000000 pixels' worth of storage.

If each is one byte resolution then I need 750 meg (with no compression).

The speed at which I film those frames doesn't affect the memory I need.
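The playback-side arithmetic above, written out (25 fps × 30 s × 1 megapixel × 1 byte per pixel):

```python
# Storage depends only on how many frames end up in the final video,
# not on how fast they were captured.
playback_fps = 25
playback_seconds = 30
pixels_per_frame = 1_000_000  # 1 megapixel
bytes_per_pixel = 1           # 8-bit greyscale, no compression

total_mb = (playback_fps * playback_seconds
            * pixels_per_frame * bytes_per_pixel) / 10**6
print(f"{total_mb:.0f} MB")   # → 750 MB
```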


Very clever, but I think it's misleading to say it's zillions of frames per second.

If a bus goes past my house at exactly 8 o'clock each morning and I take a picture at 08:00:00 today then 08:00:01 tomorrow, 08:00:02 the next day and so on, then stick the images together into an animation, have I really taken a picture every second?

Give or take a change of scale, that's what this new camera does.

 

In any event, if I want to make a time lapse or slow motion video of anything I'm going to arrange to play it back over the course of something like half a minute (at least for most things).

It doesn't matter if it's a hummingbird's wing flapping, or a tree growing.

If I make the end result take about half a minute then it will be slow enough to see what's happening, but not so slow as to be dull.

If I play that back at 25 frames a second then I need 25*30 pictures.

If each is a one megapixel image (which isn't bad resolution; it's probably comparable with the screen on which you are reading this) then I need 25*30*1000000 pixels' worth of storage.

If each is one byte resolution then I need 750 meg (with no compression).

The speed at which I film those frames doesn't affect the memory I need.

 

 

This is fine if the sole purpose of MIT's experiment is to visually inspect the process. I would assume, though, that precise detail is the goal and that dropping frames would be like deleting important data. I wouldn't be surprised if the resolution were on the order of 16 megapixels with a bit depth of 32, making each frame on the order of 64 MB, and increasing my earlier projection to 640 GB for a 10 ns clip. This would still be reasonable, although transfer rates become tight and buffering is a less than trivial issue. OCZ's Max IOPS SSD cards only transfer at 1 GB/s, and I'm sure that techniques similar to what I was thinking about in a previous post on high-speed oscilloscopes would become important.
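The revised estimate is straightforward to reproduce (a hypothetical 16 MP sensor at 32-bit depth over a 10 ns clip), along with how long a ~1 GB/s SSD would take to absorb it:

```python
# Revised frame/clip sizes from the post above (hypothetical sensor).
megapixels = 16 * 10**6
bytes_per_pixel = 4                            # 32-bit depth
frame_mb = megapixels * bytes_per_pixel / 10**6  # 64 MB/frame

frames = 10_000                                # 10 ns clip at 10^12 fps
clip_gb = frames * frame_mb / 1000             # 640 GB

ssd_rate_gb_s = 1.0                            # ~1 GB/s, as quoted
minutes = clip_gb / ssd_rate_gb_s / 60
print(f"{clip_gb:.0f} GB, ~{minutes:.1f} min to write at 1 GB/s")
# → 640 GB, ~10.7 min to write at 1 GB/s
```

The gap between a sub-microsecond capture and a ten-minute write-out is exactly why buffering becomes the hard part.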

Edited by Xittenn

The results, as seen in the video, don't look like wave propagation.

It would be interesting to observe a double slit experiment that way.

 

 

To visualize the propagation of a light burst onto the projected screen? The time evolution of such an event might prove interesting...

 

 

I'm interested in what a more developed device like this could do for the petri dish.

