Kicking off this week is SIGGRAPH, the annual North American professional graphics conference that sees everyone from researchers to hardware vendors come together to show off new ideas and new products. Last year’s show ended up being particularly important, as NVIDIA used it as the backdrop for the announcement of their Turing graphics architecture. This year the company’s presence is going to be far more low-key – there’s no new hardware this time – but NVIDIA is still at the show with some announcements.

Diving right into matters then, this year NVIDIA has an announcement that all professional and prosumer users will want to take note of. At long last, NVIDIA is dropping the requirement to use a Quadro card to get 30-bit (10bpc) color support on OpenGL applications; the company will finally be extending that feature to GeForce and Titan cards as well.

Dubbed their Studio Driver: SIGGRAPH Edition, NVIDIA’s latest driver will eliminate the artificial restriction that prevented OpenGL applications from drawing in 30-bit color. For essentially all of the company’s existence, NVIDIA has restricted this feature to their professional visualization Quadro cards in order to create a greater degree of product segmentation between the two product families. With OpenGL (still) widely used for professional content creation applications, this restriction didn’t prevent applications like Photoshop from running on GeForce cards, but it did keep professional users from working with the full, banding-free precision that the program (and their monitors) were capable of. So for the better part of 20 years, it has been one of the most important practical reasons to buy a Quadro card over a GeForce card: while it was possible to use 30-bit color elsewhere (e.g. DirectX), it was held back in the one scenario that most directly impacted content creators.
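The precision gap at stake here is easy to quantify. As a quick sketch (an editor’s illustration of the arithmetic, not anything from NVIDIA’s materials), here is the difference between 24-bit (8 bits per channel) and 30-bit (10 bits per channel) color:

```python
# Colors representable at 8 vs. 10 bits per channel (bpc) for an RGB pixel.
def colors(bits_per_channel: int, channels: int = 3) -> int:
    """Total distinct colors for an RGB pixel format."""
    return (2 ** bits_per_channel) ** channels

print(f"24-bit (8 bpc):  {colors(8):,} colors")    # 16,777,216
print(f"30-bit (10 bpc): {colors(10):,} colors")   # 1,073,741,824

# Banding is governed per channel: a full-screen grayscale gradient
# only has 2**bpc distinct steps, so 10 bpc quadruples the shade count.
print(f"Gray shades at 8 bpc:  {2 ** 8}")    # 256
print(f"Gray shades at 10 bpc: {2 ** 10}")   # 1024
```

A 10 bpc pipeline offers four times as many steps per channel, which is what smooths out the visible banding in gradients that 8 bpc output can produce.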

But with this latest Studio Driver, that’s going away. NVIDIA’s Studio drivers, which can be installed on any Pascal or newer GeForce/Titan card – desktop and mobile – will no longer come with this 30-bit restriction. It will be possible to use 30-bit color anywhere that the application supports it, including OpenGL applications.

To be honest, this wasn’t a restriction I was expecting NVIDIA to lift any time soon. Rival AMD has offered unrestricted 30-bit color support for ages, and it has never caused NVIDIA to flinch. NVIDIA’s official rationale for the change feels kind of thin – it was a commonly requested feature since the launch of the Studio drivers, so they decided to enable it – but as their official press release notes, working with HDR material pretty much requires 30-bit color, so it’s seemingly no longer a feature NVIDIA can justify restricting to Quadro cards. Still, I suppose one shouldn’t look a gift horse in the mouth.

Otherwise, at this point I’m not clear on whether this is going to remain limited to the Studio drivers, or will come to the regular “game ready” GeForce drivers as well. Keeping in mind that both drivers are essentially identical software stacks – the difference being their testing and release cadences – there’s no reason to think it won’t show up in future GeForce drivers as well. But for now, it’s only being mentioned in the Studio drivers.

Meanwhile, the latest Studio driver release, true to its purpose, will also include updated support for several applications, including Cinema 4D and Blender. So while the 30-bit color announcement is likely to overshadow everything else, NVIDIA is continuing to iterate on their software support as previously promised.

New RTX Studio Laptops & RTX-Supporting ProViz Software

Along with their latest Studio drivers, NVIDIA is also using the show to announce their partners’ latest hardware and software developments.

On the hardware side of matters, another 10 Studio laptops are being announced. The NVIDIA branding program, first launched at Computex earlier this year, establishes minimum standards for participating laptops: in short, a laptop needs to include a 45W Core i7 and a GeForce RTX 2060/Quadro RTX 3000 or better, along with a calibrated display. The latest hardware release cycle will see new laptops from Lenovo, HP, Dell, and Boxx, bringing the Studio program to 27 laptops in total.

Meanwhile on the software side of matters, NVIDIA is celebrating the adoption of their RTX technology, as well as the additional applications that are adding support for it. According to the company, 40 professional visualization applications already support RTX in some form, with more to come. At this year’s show in particular, Adobe, Autodesk, Daz, and Blender are all showing off new software versions/updates that add support, typically for hardware ray tracing. NVIDIA sees RTX as an important product differentiator, especially as it seems AMD won’t have comparable technology for at least another year, so it’s something the company has continued to invest in – and an advantage it’s happy to tout.


Source: NVIDIA

Comments

  • FXi - Monday, July 29, 2019 - link

    I suspect Intel will bring HDMI 2.1 (which will be widespread by the time they release their cards), VRR and I'll bet 10 bit to the regular user.

    The same thing happened when users started thinking AMD for VRR because freesync was becoming an "everywhere" product with no cost premium. Given a looming loss in competitive advantage they bring features they were not going to prior to that point.

    But yes, with the advent of HDR and 10 or 12 bit hitting more product areas, this has some basis in good sense. Just should have happened years ago.
  • 0ldman79 - Tuesday, July 30, 2019 - link

    How are they going to market this to normal folks?

    The normal terms for the colors are 8 bit, 16 bit, 24 bit and 32 bit, 24 bit being truly what we have, but alpha being thrown in for 32 bit.

    30 bit is a downgrade by the terminology. What are they going to call it? 1 trillion colors? 40 bit? HDR?
  • Oliseo - Tuesday, July 30, 2019 - link

    Why do you think they will?

    And if they do, perhaps they can pay someone on Love Island to do a demo.
  • crimsonson - Tuesday, July 30, 2019 - link

    You are making a mountain out of molehills here.

    10 bit video has been a mainstay of professional video production for 2+ decades now.

    For consumers, since the arrival of high-end LCDs and OLEDs, it's something that is often quoted in system specs and confirmed by reviewers.

    Video bit depth DOES NOT need to be in chunks of 8. You can argue that the container is likely in 8/16/24/32 encoding but 10 or 12 bit is technically accurate.
  • Freakie - Tuesday, July 30, 2019 - link

    Yes 10-bit is also referred to as 1 trillion colors but what you're getting it mixed with is total bit depth vs. bit per channel. When they refer to 10-bit here, they are referring to 10 bits per channel. Whereas the 24 and 32 bits that you refer to is combined from all 3 color channels. So your 24 bit is 6 bits per channel, and 32 is 8 bits per channel.
  • mode_13h - Sunday, August 4, 2019 - link

    WTF? Nobody uses 6 bits per channel. I was with you 'till that point. You're confusing 3 vs 4 channel specifications. Anyone talking about 24-bit color means 8-bits per channel, for 3 channels.

    32-bit color is *practically* the same as 24-bit color - just that they're counting a 4th channel that's usually used to hold alpha (transparency) values that can be used for compositing.
  • D. Lister - Friday, August 2, 2019 - link

    @0ldman79: 24-bit color is 8 bits(red)+8 bits(blue)+8 bits(green), giving you a total of 16.7 million colors. 32-bit color is actually just 24-bit color + 8 bits for alpha transparencies, and still only gives you 16.7 mil colors.

    30-bit color OTOH is actually 10 bits each for red, green and blue channels, giving you a total of 1.07 billion colors. And it is definitely no gimmick. Properly set up, it can provide noticeably smoother gradients in compatible videos and image types (eg. PNG or TIFF). For gamers that means significantly better smoke/fog/explosions/god rays/ambient occlusion.
  • Agent Smith - Tuesday, July 30, 2019 - link

    So if i selected 10bit in the nVidia control panel that was just for DirectX applications all along - is that right?
  • Freakie - Tuesday, July 30, 2019 - link

    DirectX and full screen OpenGL applications. So if you watched a 10-bit movie using MPC-HC in full-screen then it's supposed to work before this change. Also games that use OpenGL HDR could theoretically work as long as they are in full-screen mode. It was a weird mess, glad it's finally being sorted.
