Tag Archives: gpu

Rife-App Creates Higher Frame Rates 25x Faster!

A few months back we took a look at Dain-App and how it was able to use AI and machine learning to create in-between frames from almost any source footage, producing something that looked and felt like real footage shot with higher-fps cameras. The algorithm was so revolutionary that it took the world by storm, making older re-timing software from Adobe and others look antiquated and underpowered. Dain-App was great, and it was a pay-what-you-want app, but it had an Achilles heel: the software required a powerful Nvidia GPU with as much VRAM as you could muster to convert and re-time footage.

The new Rife-App, the direct successor to Dain-App by the same creator GRisk, is up to 25x faster than the original, improves the algorithm, and in many examples betters it by creating more seamless transitions. The flow of frames is frankly jaw-droppingly beautiful, especially on low frame rate animation. We expect 2D animation studios will eat this app up immediately, and even 3D animation studios could reduce their render times by calculating fewer frames and using Rife-App to increase them to 24p, 30p or 60p from a lower source like 20fps or 12fps.   → Continue Reading Full Post ←
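To put rough numbers on that render saving, here is a minimal back-of-the-envelope sketch in Python. The function and figures are ours for illustration only, not anything from Rife-App itself:

def frames_needed(source_fps: float, target_fps: float, duration_s: float) -> dict:
    """Return how many frames must be rendered versus synthesized."""
    rendered = int(source_fps * duration_s)      # frames the studio actually draws or renders
    delivered = int(target_fps * duration_s)     # frames in the final delivery
    return {
        "rendered": rendered,
        "interpolated": delivered - rendered,    # frames the interpolator has to invent
        "multiplier": target_fps / source_fps,
    }

# Example: a 10-second shot animated on 12fps, delivered at 60p
print(frames_needed(12, 60, 10))  # {'rendered': 120, 'interpolated': 480, 'multiplier': 5.0}

In that hypothetical, four out of every five delivered frames come from the interpolator rather than the renderer, which is where the time saving lives.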

Galaxy S21 Slow Motion Feature is Retained!

Galaxy S21 Slow Motion

This week we had the announcement by Samsung of the Galaxy S21 phone line, the de-facto Android handset in most of the world. The Galaxy S21 slow motion has essentially the same feature set as the Galaxy S20 of 2020 when it comes to high-speed video. Samsung has taken slow motion seriously since the Galaxy S9 introduced a 960fps mode at 720p, which is still present here with somewhat improved recording time and image quality.

The S21 line is by almost any standard a refinement, with subtle but important updates over the previous S20 phones. We get the Snapdragon 888, the fastest SoC in non-Apple land, a 120Hz adaptive-refresh-rate screen, and a 108MP main camera on the Ultra, which also offers 10x hybrid zoom, while the regular S21 and S21+ make do with a 64MP telephoto and 3x hybrid zoom. We also get 8K video recording at 24p on all of the phones, which we will have to test to see if it really delivers on the promise.

Galaxy S21 Slow Motion Video Features:

Resolution

  • 8K video recording at 24 fps (7680×4320)
  • 4K UHD video recording at 60 fps (3840×2160)
  • 1080p FHD video recording at 60 fps (1920×1080)
  • 720p HD video recording at 30 fps (1280×720)

Shooting speed

  • Super Steady 1080p video support at 60 fps
  • Super Slow-mo 720p video support at 960 fps
  • Slow motion 1080p video support at 240 fps & 120fps
  • Hyperlapse 4K video support at 30 fps

Other

  • HDR10+ recording
  • VDIS (Video Digital Image Stabilization)
  • Digital zoom up to 12x
  • High CRI LED Flash
  • Tracking AF
  • Take 33MP still photos while playing 8K video
  • Playback zoom
  • Video location tags
  • *HDR10+ recording is currently in beta and only available on the rear camera. Quality may vary based on sharing platform and playback environment; HDR10+ color is only available on supported devices or platforms.
  • *Super Steady is only available when shooting in FHD resolution.
  • *Super Smooth 60fps Video supports FHD resolution in Auto mode only.
  • *Super Slow-mo only supports HD resolution. On Galaxy S21 5G and S21+ 5G, users can record approximately 0.5 seconds of video captured at 960 fps with approximately 16 seconds of playback. On Galaxy S21 Ultra 5G, users can record approximately 1 second of video captured at 480 fps and digitally enhance the video to 960 fps with approximately 32 seconds of playback. Playback time can be edited in Super Slow-mo player.

Samsung Galaxy S21 event in 12 minutes by The Verge:

Same feature, mildly upgraded results!

So the S21 and S21+ record true 960fps at 720p, while the Ultra is capped at 480fps at 720p and uses AI deep learning to interpolate to 960fps slow motion for the final result on the highest-spec phone.

It is the same as last year’s S20, which also had that limitation on the Ultra version of the handset.  It is not a matter of processing power but of sensor choice: the Ultra’s sensor seems to be capped at 480fps readout while the non-Ultra sensors can scan at 960fps in 720p.

When it comes to slow motion, the cheaper phones are actually more capable, which may be ideal if this feature is important to you.

On the S21 and S21+ you get half a second of recording time at 960fps, which translates to 16 seconds of playback at 30p.  On the Ultra you get 480fps for 1 second of recording time, interpolated to 960fps, for a 32-second playback time at 30p.  This is exactly the same as last year’s S20 line.
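The playback math behind those numbers is simple. A quick sketch, assuming 30p playback as Samsung specifies:

def playback_seconds(effective_capture_fps: float, capture_seconds: float, playback_fps: float = 30) -> float:
    """Seconds of slow-motion playback = frames captured divided by playback frame rate."""
    total_frames = effective_capture_fps * capture_seconds
    return total_frames / playback_fps

print(playback_seconds(960, 0.5))  # S21 / S21+: true 960fps for half a second -> 16.0 seconds
print(playback_seconds(960, 1.0))  # S21 Ultra: 480fps for 1 second, interpolated to an effective 960fps -> 32.0 seconds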

We have to test the feature fully but we estimate slightly better color, dynamic range, and artifact control on the S21 but nothing that would make you upgrade your S20.

Galaxy S21 Slow Motion Early Samples:

We will be testing the Galaxy S21 in the coming weeks as we are receiving a test unit. In the meantime, we found some samples online that could be legitimate; they certainly look real in resolution and speed for the most part, but it is so early that we cannot confirm their validity. We will re-examine the slow motion samples in a later post at HSC, including our own tests.

Samsung Galaxy S21 Camera Slow Motion Test by MKP Captain: → Continue Reading Full Post ←

Dain app GPU and why you should wait for 2021!

Dain app GPU

Probably no computer-related technology has received more attention on the PC side than GPUs.  Nvidia launched the 3000 series of cards with aggressive pricing that completely obliterates the previous 2000 series GPUs for much less money. Things are great for PC gamers, machine learning coders, and 3D animators in 2020 with these levels of performance, except for the fact that there is a near-complete scarcity of GPU cards across most lines, including but not limited to the 2000 and 3000 series, for a variety of reasons.

Dain app, the machine learning frame rate interpolation software, lists CUDA compute capability 5.0 as the minimum requirement, or a GeForce GTX 750 as the minimum card to run it. But that does not mean it will be fast or even able to finish interpolating your high-res footage. In fact, interpolating the frame rate of a 720p clip needs about 10-11GB of VRAM, since the app runs the entirety of the calculation in video memory for the predictive algorithm to work.
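If you want to check whether your card clears those bars before queuing up a render, a quick PyTorch check works, since Dain app is built on PyTorch. The thresholds in the messages below are this article’s rough guidance, not hard limits enforced by the app:

import torch

if not torch.cuda.is_available():
    raise SystemExit("No CUDA-capable Nvidia GPU detected.")

major, minor = torch.cuda.get_device_capability(0)   # e.g. (5, 0) for a GeForce GTX 750
props = torch.cuda.get_device_properties(0)
vram_gb = props.total_memory / 1024 ** 3

print(f"{props.name}: compute capability {major}.{minor}, {vram_gb:.1f} GB VRAM")

if (major, minor) < (5, 0):
    print("Below the compute capability 5.0 floor, so Dain app will not run.")
elif vram_gb < 10:
    print("It will run, but expect to lean on 'Split frames into sections' for 720p footage and above.")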

Dain app and the GPU Shortage:

Dain App will run on any Nvidia card that supports CUDA compute capability 5.0 or higher, including the following:

  • GeForce GTX 750 Ti, GeForce GTX 750, GeForce GTX 960M, GeForce GTX 950M, GeForce 940M, GeForce 930M, GeForce GTX 860M, GeForce GTX 850M, GeForce 845M, GeForce 840M, GeForce 830M, GeForce GTX 870M
  • GeForce GTX Titan X, GeForce GTX 980 Ti, GeForce GTX 980, GeForce GTX 970, GeForce GTX 960, GeForce GTX 950, GeForce GTX 750 SE,
    GeForce GTX 980M, GeForce GTX 970M, GeForce GTX 965M
  • Nvidia TITAN Xp, Titan X,
    GeForce GTX 1080 Ti, GTX 1080, GTX 1070 Ti, GTX 1070, GTX 1060,
    GTX 1050 Ti, GTX 1050, GT 1030,
    MX350, MX330, MX250, MX230, MX150, MX130, MX110
  • NVIDIA TITAN RTX,
    GeForce RTX 2080 Ti, RTX 2080 Super, RTX 2080, RTX 2070 Super, RTX 2070, RTX 2060 Super, RTX 2060,
    GeForce GTX 1660 Ti, GTX 1660 Super, GTX 1660, GTX 1650 Super, GTX 1650, MX450
  • GeForce RTX 3090, RTX 3080, RTX 3070, RTX 3060 Ti

However, since the app works in video memory (VRAM), you need a card with a minimum of 4GB to have any sort of success at interpolating frames, and even then you will have too little VRAM to do anything over VGA resolution. There is a workaround, however: the Split frames into sections feature renders small pixel buckets of each frame and merges the parts back into a final frame when done. This lets you render up to 4K footage at higher frame rates without needing more video memory.
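Conceptually, Split frames into sections works like tiled rendering. Here is a simplified sketch of the idea, with a naive average standing in for the actual network call; Dain app’s real implementation also pads and blends tile borders to hide seams:

import numpy as np

def interpolate_tile(tile_a: np.ndarray, tile_b: np.ndarray) -> np.ndarray:
    """Placeholder for the network call that synthesizes the in-between tile."""
    return ((tile_a.astype(np.float32) + tile_b.astype(np.float32)) / 2).astype(np.uint8)

def tiled_interpolate(frame_a: np.ndarray, frame_b: np.ndarray, grid: int = 2) -> np.ndarray:
    """Split two frames into a grid x grid set of buckets, interpolate each, and merge the result."""
    h, w = frame_a.shape[:2]
    out = np.empty_like(frame_a)
    ys = np.linspace(0, h, grid + 1, dtype=int)
    xs = np.linspace(0, w, grid + 1, dtype=int)
    for i in range(grid):
        for j in range(grid):
            y0, y1, x0, x1 = ys[i], ys[i + 1], xs[j], xs[j + 1]
            out[y0:y1, x0:x1] = interpolate_tile(frame_a[y0:y1, x0:x1],
                                                 frame_b[y0:y1, x0:x1])
    return out

Each bucket only has to fit a fraction of the frame in VRAM at once, which is why the trick works even on 4GB cards, at the cost of many more passes through the model.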

How to Create Slow Motion Videos with DAIN APP | AI Frame Interpolation by GreenBox:

This workaround is very slow and can take days on a mid-range PC to render a 4K clip only a few seconds long.  Your best bet is a faster GPU with plenty of VRAM; in our view a minimum of 10GB, or better yet 12GB, is preferable for the best performance.  In Dain app your card’s CUDA cores are the primary speed accelerator, but without enough VRAM it becomes slow as molasses.

The good news is that new Nvidia GPUs carry more VRAM than ever before.  The just-launched 3000 series ranges from 24GB at the high end down to 8GB on the 3060 Ti at the low end.  However, none of these cards are available at this time at their suggested retail prices: scalpers literally bought up the entire free supply and are selling them on eBay and Amazon at ridiculous markups of anywhere from 40% to 150%.

You would think that the 3060 Ti FE, starting at $399 for an 8GB card, would be ideal for Dain app: cheap, great performance, and close to the ideal 10GB of VRAM. But therein lies the problem.  8GB will force you to do segmented rendering for higher frame rates, which limits your speed and frame sizes; render times balloon, and artifacts from joining segments can show up in some instances.

What to do?

If money is no object then we suggest a 3090 RTX card with 24GB of VRAM. This is the ideal setup, but it comes at a hefty price increase, as MSRP cards are nowhere to be found and relying on scalpers will cost you dearly.  You could still get a 3080 with 10GB of VRAM, or a 3070 or 3060 Ti with 8GB, for about 40% more money.

There is a better option in our view, and that is to wait for next year. Nvidia is going to ramp up production of 3000 series cards by Q1 2021, offering better-priced versions of current cards and also a new 3060 launching with fewer CUDA cores and ray-tracing/tensor cores but with a whopping 12GB VRAM option.

With 12GB of VRAM, the 3060 will be ideal for Dain app on a budget, able to render footage directly without segmenting the frames. You will take a big cut in CUDA cores compared to the 3060 Ti: rumors say the 4,864 CUDA cores of the 3060 Ti will drop to 3,840 on the regular 3060, a cut of just over 1,000 cores.  It will still have more CUDA cores than most of the 2000 series, so it will remain a very capable card; for example, a standard 2080 only features 2,944 CUDA cores.  You will really be getting better technology for a smaller price tag for gaming, graphics, and machine learning applications like Dain app.

Card Options Today?

1. Nvidia RTX 3000 Series

We start with the RTX 3000 series. You can get them today at high prices, but prices should come back down to much lower levels as supply catches up to demand.



2. Nvidia RTX 2000 Series:

The previous-generation 2000 series is still an option if you can find a card with enough VRAM, but with the 3000 series beating it on both price and performance, it only makes sense at a clear discount.



3. Get in the EVGA Queue!

The company EVGA has developed a product queue at https://www.evga.com/ that lets you place your name on a waiting list; when your turn comes, you are emailed the right to buy a card within 8 hours. If you do not buy it, it goes to the next person in line and you have to register a new slot.

We feel this is great for two reasons: you get the product at MSRP, and you refrain from supporting the scalping market.  The downside is that you may have to wait weeks or months for a slot to become available.

4. Wait for next year!

If you wait until 2021, Nvidia will have a January announcement event for RTX with the rumored RTX 3060 card, which is said to be a renamed 3050 Ti.  There should be an option with 12GB of VRAM, which will be the best in price to performance and should come in under $350 USD when it ships in quantity.

Final note: 

We feel you should only buy a card now at inflated prices if you absolutely need it for mission-critical work.  If you can wait, please do so to combat price gouging and the scalpers who distort the legitimate market for technology parts.

We have rarely seen such a blatant attack on the consumer. The RTX cards have been gone in mere seconds from online stores due to bots that continuously scan for new stock and snatch it up automatically.  By not buying from scalpers you support the community at large and save money in the process.

You can read our article on Dain App and the interpolation of footage, which lets you create very convincing super slow motion from almost any frame rate. The app shines even more with high-speed footage: you can literally create a 4000fps video from a 1000fps source that looks almost as good as the real thing. Of course, you are creating frames and data out of the ether, so for mission-critical and lab studies Dain app will not be an option.

If you get a new card and run it through Dain app, please share your results and footage below.  Merry Christmas -HSC


NVIDIA Slow Motion Interpolation With AI Deep Learning Tech!

NVIDIA Slow Motion Interpolation

NVIDIA has been hard at work on the problem of interpolating high frame rates from video shot at lower fps.  We have had this kind of technology since the late 1990s with the advent of Twixtor, refined over the decades in tools like Twixtor Pro and Adobe’s Optical Flow in After Effects. But you are still not getting real temporal detail, since the new frames are created by extrapolating velocity and direction vectors plus pixel values between frames to get the result.
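For reference, the classic flow-based approach described above can be sketched in a few lines with OpenCV: estimate a motion field between two frames and pull pixels halfway along it. This is a bare-bones illustration of the old technique, not NVIDIA’s method:

import cv2
import numpy as np

def midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Crude in-between frame: warp frame_a halfway along the estimated optical flow."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Sample frame_a half a flow vector away; occlusions and fast motion are where this breaks down.
    map_x = (grid_x + 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y + 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)

The artifacts you see around occlusions and fast-moving edges in this kind of warp are exactly what the deep learning approach below is trying to clean up.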

We explored this technique in our post on interpolation and why it is no substitute for a real slow motion camera solution.  NVIDIA’s new method uses machine learning trained on 11,000 videos to arrive at a more convincing result. Considering the relatively small sample size, we can imagine a future where hundreds of thousands or millions of footage samples are used to generate near-flawless interpolation. The technique takes serious computation and large data sets, so as of now it is not really ready for the mass market, but that could change with the cloud very soon.

NVIDIA Slow Motion Is New But Still Flawed:

As you can see in the sample video below, the artifacts produced by interpolation are very evident, and more so when fluid or fabric motion is introduced. The human eye can hide some of these in real-time playback thanks to persistence of vision and the brain’s image processing, but they are still quite apparent if you look with a critical eye.

Transforming Standard Video Into Slow Motion with AI by NVIDIA:

There is no question this might be the best-looking interpolation method we have seen to date, but it is still not generating new information of any scientific value. In other words, you can’t create something from nothing, nothing here being the estimated values between two frames distant in time.  It is a marvel of computation and could really help in getting many more frames where detail is preserved and artifacts suppressed, but there is no real image captured from the live event.  If you record an explosion or a fluid with this technique you will get what the computer estimates should be there, not what actually happened; any rogue debris or physically distinct motion phenomena simply will not be there.  That makes the technique useless for education and scientific research.

That said, the technique can make slow-mo videos shot on your phone a little more interesting even when shot at 30 or 60fps. As with any interpolation technique, you get better results the more frames you give the system.  If you shoot at 1000fps with a shutter of 1/4000, for example, you can interpolate up to 3,000 or 4,000fps without much artifacting.  And if you shoot at 4000fps, like an edgertronic SC2+ can, you could interpolate up to 16,000fps without much artifacting either.
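Taking those examples at face value, a rough rule of thumb falls out of them: the synthesized frame rate should not exceed the reciprocal of the shutter speed, since beyond that point you are just re-slicing motion blur. This is our reading of the example above, not a published formula:

def safe_interpolation_multiplier(source_fps: float, shutter_seconds: float) -> float:
    """Rough ceiling on how far to interpolate: at most one synthesized frame per exposure window."""
    return (1.0 / shutter_seconds) / source_fps

print(safe_interpolation_multiplier(1000, 1 / 4000))  # 4.0 -> roughly 4,000fps from a 1,000fps source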

We can certainly see a future in which you upload your lower frame rate footage to the cloud and choose which frame rate you want within a reasonable range.   The cloud AI and its machine learning algorithms will get better as more and more videos are added to the collection; it is possible to train on millions of samples instead of the 11,000 videos the NVIDIA researchers used in the lab, and the interpolation should keep improving as the computer learns from the added content.

It will also be possible to create footage from scratch using video parts, much like what Google did with machine learning on images to create new art.  What an interesting future it will be.

We are all for better interpolation, but do not believe the hype when you are told you may never need a slow-motion camera again. Temporal detail and nature recording cannot be interpolated into real information, so keep using your slow motion camera and expect to get a more capable one as the technology improves and prices continue to fall. -HSC

Nvidia Slow Motion Interpolation Press Release on the Technology Below:

Link to the article here: https://news.developer.nvidia.com/transforming-standard-video-into-slow-motion-with-ai/?ncid=–43539

Transforming Standard Video Into Slow Motion with AI

June 18, 2018

Researchers from NVIDIA developed a deep learning-based system that can produce high-quality slow-motion videos from a 30-frame-per-second video, outperforming various state-of-the-art methods that aim to do the same.  The researchers will present their work at the annual Computer Vision and Pattern Recognition (CVPR) conference in Salt Lake City, Utah this week.

“There are many memorable moments in your life that you might want to record with a camera in slow-motion because they are hard to see clearly with your eyes: the first time a baby walks, a difficult skateboard trick, a dog catching a ball,” the researchers wrote in the research paper.  “While it is possible to take 240-frame-per-second videos with a cell phone, recording everything at high frame rates is impractical, as it requires large memories and is power-intensive for mobile devices,” the team explained.

With this new research, users can slow down their recordings after taking them.

Using NVIDIA Tesla V100 GPUs and cuDNN-accelerated PyTorch deep learning framework the team trained their system on over 11,000 videos of everyday and sports activities shot at 240 frames-per-second. Once trained, the convolutional neural network predicted the extra frames.

The team used a separate dataset to validate the accuracy of their system.

The result can make videos shot at a lower frame rate look more fluid and less blurry.

“Our method can generate multiple intermediate frames that are spatially and temporally coherent,” the researchers said. “Our multi-frame approach consistently outperforms state-of-the-art single frame methods.”

To help demonstrate the research, the team took a series of clips from The Slow Mo Guys, a popular slow-motion based science and technology entertainment YouTube series created by Gavin Free, starring himself and his friend Daniel Gruchy, and made their videos even slower.

The method can take everyday videos of life’s most precious moments and slow them down to look like your favorite cinematic slow-motion scenes, adding suspense, emphasis, and anticipation.

The researchers, which include Huaizu Jiang, Deqing Sun, Varun Jampani, Ming-Hsuan Yang, Erik Learned-Miller, and Jan Kautz, will present on Thursday, June 21 from 2:50 – 4:30 PM at CVPR.

Are Slow Motion Camera Days Numbered?

Slow Motion Camera Days Numbered

Recent advances in computation, simulation, and real-time rendering have become so significant that some visual effects artists are opting to render everything with CGI instead of filming it outright.  In making-of featurettes, we have seen VFX teams replace explosions, fluids, car crashes and actor stunts with CGI inserts that offer precise control over lighting, debris, motion, and quality, all of which are a crap shoot when filming with a real camera.

NVIDIA just released demos of real-time ray tracing that surpass the best quality renderers of the 1990s, with playable frame rates and ambient occlusion that, in the right hands, could be indistinguishable from many special effects in a film. The new RTX technology in their Volta architecture GPUs can do area shadows, true multi-level depth reflections, refraction and global illumination in real time at 30+fps on a personal computer.

Video Examples of Real-Time RTX Raytracing on the latest NVIDIA GPUs:

NVIDIA RTX Real Time Ray Tracing Demo:

NVIDIA RTX and GameWorks Ray-Tracing Technology Demonstration by NVIDIA GeForce:

SEED – Project PICA PICA – Real-time Raytracing by Electronic Arts: → Continue Reading Full Post ←

HTC U11 Plus Does 120fps Full HD 1080p!

HTC U11 Plus

The HTC U11+ is a refined version of the U11, one of the better-specced Android phones of 2017. The new iteration boosts speed and grows the screen from 5.5 inches to 6 inches, and the larger screen was accommodated without making the phone itself much bigger, which is a feat of design and engineering.

For many, the big disappointment will be that HTC does not seem to be making a US version of the U11 Plus, leaving it for Asia and Europe only at this time.  The US market is very competitive and HTC has had a hard time cracking the sales ranks of the Samsung, Apple and Google phones. If you are indeed able to get an HTC U11 Plus, the slow motion component has a 120fps full HD 1080p spec that should be competitive.

HTC U11 Plus Main New Features:

  • 6-inch Quad HD+ Super LCD screen
  • 2880 x 1440 resolution
  • Snapdragon 835 CPU
  • 3,930mAh battery
  • 6GB RAM
  • 64 or 128GB Flash Internal Storage
  • 12MP Rear camera
  • 8MP Front Camera
  • 2160p@30fps, 1080p@30/60/120fps
  • HDR Video
  • 24-bit/192kHz stereo sound recording mode
  • microSD Expansion
  • Android 8 OS Oreo
  • USB-C 3.1 Connector
  • No Headphone Jack! 🙁
  • MSRP £699

Video Previews HTC U11 Plus:

HTC U11 Plus hands on (HTC U11+) by Android Authority:

HTC U11 Plus (U11+) hands-on preview! by Android Central: → Continue Reading Full Post ←