Tag Archives: Big Data

NVIDIA Slow Motion Interpolation With AI Deep Learning Tech!

NVIDIA has been hard at work on the problem of interpolating high frame rate slow motion from video shot at lower frame rates.  We have had this technology since the late 1990s with the advent of Twixtor, and it has been refined over the decades in tools like Twixtor Pro and Adobe's Optical Flow in Premiere Pro. You are still not getting real temporal detail, since the new frames are synthesized by interpolating motion vectors and pixel values between existing frames.
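To make the classic approach concrete, here is a minimal Python sketch of flow-based interpolation using OpenCV's Farneback optical flow. This illustrates the motion-vector-and-blend idea described above; it is not NVIDIA's method or Twixtor's actual algorithm, and the function name and parameter values are our own choices.

```python
import cv2
import numpy as np

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Synthesize an in-between frame at time t (0..1) by estimating
    dense per-pixel motion, warping both source frames toward time t,
    and blending the results."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Dense motion vectors in both directions (Farneback's method).
    flow_ab = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                           0.5, 3, 15, 3, 5, 1.2, 0)
    flow_ba = cv2.calcOpticalFlowFarneback(gray_b, gray_a, None,
                                           0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = gray_a.shape
    gx, gy = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))

    # Backward-warp each frame to time t, approximating the flow at
    # the destination pixel by the flow at the same coordinates.
    warp_a = cv2.remap(frame_a, gx - t * flow_ab[..., 0],
                       gy - t * flow_ab[..., 1], cv2.INTER_LINEAR)
    warp_b = cv2.remap(frame_b, gx - (1 - t) * flow_ba[..., 0],
                       gy - (1 - t) * flow_ba[..., 1], cv2.INTER_LINEAR)

    # Blend the two warps, weighting by temporal proximity.
    return cv2.addWeighted(warp_a, 1 - t, warp_b, t, 0)
```

Calling interpolate_frame(a, b, 1/3) and interpolate_frame(a, b, 2/3) between every original pair would turn 30fps footage into pseudo-90fps, and it is exactly at occlusions and fast motion that this vector-based retiming produces the ghosting and tearing artifacts a real high-speed camera never has.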

We explored this technique in our post on interpolation here and why it is no substitute for a real slow motion camera solution.  NVIDIA's new method uses deep learning trained on 11,000 videos to arrive at a much more convincing result. Considering the relatively small sample size, we can imagine a future where hundreds of thousands or millions of footage samples are used to generate near flawless interpolation. The technique still takes serious computation and large data sets, so as of now it is not really ready for the mass market, but cloud processing could change that very soon.

100 Million fps High Speed Camera Developed to Fight Cancers!

The Rosalind Franklin Institute has started development of a 100 million fps high-speed camera that will operate at 1-megapixel resolution, to study how new cancer drug treatments interact with ultrasound to create effective cures for the deadliest forms of the disease.  This type of performance is unheard of at that resolution and would herald a new way of looking at minute slices of time, events that pass so quickly no detector so far has been able to capture them at a usable resolution.

To put this in numbers, the camera will be able to capture 100,000,000,000,000 (one hundred trillion) pixels per second, or one hundred million megapixels/sec.  Those are staggering numbers: saved in a raw format at roughly one megabyte per frame, that is 100 terabytes of imaging data per second, which will demand one extremely large frame buffer. Compression and image optimization could bring that figure down, but even so, played back at 30fps a single 1-second capture would take about 38 days to watch. Staggering speed for sure!
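The arithmetic is easy to sanity-check. A quick back-of-the-envelope script, assuming (as the raw figure above does) 1 byte per pixel:

```python
fps = 100_000_000             # capture rate: 100 million frames/sec
pixels_per_frame = 1_000_000  # 1-megapixel sensor

# Pixel throughput: 1e14 = one hundred trillion pixels per second.
pixels_per_second = fps * pixels_per_frame
print(f"{pixels_per_second:,} pixels/s")

# Raw data rate at ~1 byte/pixel (~1 MB per frame): 100 TB/s.
terabytes_per_second = pixels_per_second / 1e12
print(f"{terabytes_per_second:.0f} TB/s")

# Watching 1 second of capture at 30fps playback: ~38.6 days.
days_to_watch = fps / 30 / 86_400
print(f"{days_to_watch:.1f} days")
```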