The largest issue with UltraHD displays, or any HiDPI display, is operating system and application support. Sure, you can run a desktop at full resolution with no scaling but that is almost impossible for anyone to actually use. To get any real benefit from any HiDPI display you are going to need OS and Application support.

In this area OS X is far ahead of Windows. With the Retina MacBook Pro released almost 18 months ago now, there has been a much bigger push to get OS and app support working there. It still isn't perfect, as many apps (including Office) lack HiDPI support, and how well OS X handles HiDPI displays that aren't the system's native display leaves something to be desired.

Plugging my 2013 15” MacBook Pro into the Dell UP3214Q, I expected to see options for scaling. Unfortunately I saw nothing of the sort, with only the native resolution available to choose from. Attempting to use SwitchResX and other hacks to enable scaling also did not work for me. As always, user error is a likely culprit there, but it is surprising that OS X isn't aware of high-resolution external displays by default. Perhaps Apple will not address this until it ships its own UltraHD panel, but with UltraHD support being such a big selling point of the new Mac Pro, the lack of support here is a shortcoming. (Note: this is updated in the first beta build of OS X 10.9.3, which I don't have access to but Anand wrote about.)

Windows still lags behind here. Windows 8.1 was supposed to deliver better DPI scaling for multiple-monitor setups, but I have not seen that. Setting the UP3214Q to scale correctly means my other 27” displays now have giant icons and are worthless to work on. Since dropping to a single display is a sacrifice I am not willing to make, I have to choose the scaling option that best bridges the two.

Application support is still lacking on the PC side. Most programs exhibit jagged edges and other issues when DPI scaling is enabled. Some applications handle it well, but they are the exception rather than the rule. However, with Ultrabooks adopting HiDPI displays at an increasing pace, I fully expect Windows to push to get this right in the next 8-12 months.

The DisplayPort 1.2 interface is also behind the times. As I mentioned earlier, you need to enable Multi-Stream Transport (MST) mode to get a 60 Hz UltraHD image on the Dell. This treats it as a pair of 1920x2160 displays instead of a single monitor, because there are no DisplayPort scaler chips that can support the full resolution as a single stream. The specification allows for it, but no silicon vendor has taken advantage of that, as there was no need until now.
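The numbers bear this out. A quick back-of-the-envelope sketch (assuming CVT-R2 reduced-blanking timing and 8-bit RGB; figures are approximate) shows that DisplayPort 1.2's four-lane HBR2 link has enough payload bandwidth for single-stream 4K60, so the bottleneck really was the scaler silicon rather than the spec:

```python
# Sketch: does DP 1.2 have the bandwidth for single-stream 3840x2160 @ 60 Hz?
# Assumed figures: CVT-R2 reduced-blanking pixel clock, 8-bit RGB.

PIXEL_CLOCK_HZ = 533.25e6   # 3840x2160 @ 60 Hz with CVT-R2 reduced blanking
BITS_PER_PIXEL = 24         # 8 bits per channel, RGB, no subsampling

needed_gbps = PIXEL_CLOCK_HZ * BITS_PER_PIXEL / 1e9

# DP 1.2 HBR2: 4 lanes x 5.4 Gbit/s raw; 8b/10b coding leaves 80% for payload
dp12_payload_gbps = 4 * 5.4 * 0.8

print(f"needed:    {needed_gbps:.2f} Gbit/s")     # ~12.80 Gbit/s
print(f"available: {dp12_payload_gbps:.2f} Gbit/s")  # 17.28 Gbit/s
```

The link budget clears the requirement with room to spare; what was missing in 2013/2014 was a timing controller that could accept it as one stream.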

Unfortunately, MST support is incredibly flaky. It works great, and then your computer hibernates and the monitor won't wake up until you power cycle it. Or the two halves get out of sync, so you have correct colors on one side and an incorrect color profile on the other. One day, half of the screen changed resolution on me while the other half remained the same. After a firmware update, most of these issues seemed resolved, but as soon as I updated the Dell calibration software, the monitor would no longer stay in sync in MST mode. You also have to give up Uniformity Compensation on the Dell to use MST.

Note: The firmware update that I installed is not being provided to end users. You would need to exchange your monitor for a refurbished one with the updated firmware from Dell. More details can be read in the thread on Dell's website here.

HDMI 2.0 could also provide a solution, but no one currently ships HDMI 2.0 products. Most TVs claiming HDMI 2.0 really have HDMI 1.4 silicon that supports one specific HDMI 2.0 feature (4:2:0 chroma subsampling), but they are labeled HDMI 2.0 anyway. Until real HDMI 2.0 silicon is available, HDMI support for UltraHD is limited to 30 Hz. So right now you have two real choices for UltraHD resolution support: 30 Hz that works, or 60 Hz that can be problematic.
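A rough sketch of why the 4:2:0 shortcut works for TVs: 4:2:0 stores the two chroma planes at quarter resolution, which halves the average bits per pixel, so a 4K60 4:2:0 signal fits in roughly the same bandwidth as 4K30 at full 4:4:4. It is also why the trick is useless for desktop use, where subsampled text looks terrible.

```python
# Average bits per pixel for common chroma subsampling modes.
# 4:2:0 shares one Cb and one Cr sample across each 2x2 pixel block.

def avg_bits_per_pixel(bit_depth, chroma):
    luma = bit_depth                        # one Y sample per pixel
    if chroma == "4:4:4":
        return luma + 2 * bit_depth         # Cb, Cr for every pixel
    if chroma == "4:2:0":
        return luma + 2 * bit_depth / 4     # Cb, Cr once per 2x2 block
    raise ValueError(chroma)

full = avg_bits_per_pixel(8, "4:4:4")
sub = avg_bits_per_pixel(8, "4:2:0")
print(full, sub)  # 24 12.0 -- exactly half the data rate
```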

The MST feature on the Dell UP3214Q started out working poorly for me, with the wake-from-sleep and sync issues mentioned above. A firmware update from Dell seemed to resolve all of these: the monitor always woke from sleep, and the color profiles stayed in sync as well. Dell also released an update to its calibration software that lets you take advantage of the monitor's two CAL presets, but as soon as it was installed a new issue cropped up. In MST mode, the two halves of the monitor would flicker, then the display would turn off completely, then back on, and then repeat. Only disabling MST fixes this, which puts me back at a 30 Hz refresh rate.

So at the moment, UltraHD is only half-ready when it comes to hardware and software. It has improved a bit over the past few months, but it still isn't quite ready for everyone yet.

Comments

  • willis936 - Tuesday, April 1, 2014

    I'm not sure this is right. Companies are usually making and testing IP while a standard is in the works. In some cases products are out before the standard is done.
  • cheinonen - Tuesday, April 1, 2014

    This is correct. There is currently no full HDMI 2.0 silicon out there that I'm aware of, and since the Dell started shipping last fall it certainly didn't have access to it then. There are devices shipping in the AV world that claim "HDMI 2.0" support, but that isn't full HDMI 2.0. It is support for 4:2:0 chroma subsampling, which is part of the HDMI 2.0 spec and enables UltraHD resolution at 60 Hz. Since computers don't use chroma subsampling, this isn't relevant, and there is no HDMI 2.0 silicon right now.
  • Penti - Tuesday, April 1, 2014

    Not even Maxwell can output it, so what sources are you supposed to use?
  • BMNify - Tuesday, April 1, 2014

    Where did you get that idea from? It's false; you need a GeForce 600 "Kepler" graphics card or newer to drive a display at up to 4096x2160.

    Hell, even the ChromeOS guys have merged the Linux UHD patch into their tree; Intel Haswell/Iris graphics can drive "UHD-1" 3840x2160p if you are not gaming.
  • cheinonen - Tuesday, April 1, 2014

    You can do that resolution at 24 Hz, or 3840x2160 at 30 Hz, but you can't do it at 60 Hz without MST right now. HDMI 2.0 allows it at 60 Hz, but that isn't available in any product yet.
  • Penti - Tuesday, April 1, 2014

    I was speaking about 600 MHz HDMI, not ~300 MHz. 300 MHz HDMI has been around since GCN 1.0 and Kepler. It's also available in Haswell and works fine in Windows, OS X, or GNU/Linux at that resolution, but it's limited to 30 Hz at 3840x2160. That's not HDMI 2.0 spec. You can't use anything other than DisplayPort for 60 Hz 4K/UHD, and DisplayPort receivers only do that with MST. You would need two 300 MHz HDMI ports to do UHD @ 60 Hz. So gaming in UHD over HDMI is out regardless of GPU/source.

    Maxwell doesn't do H.265/HEVC for that matter either. You only need ~300 MHz HDMI 1.4 to do 4096x2160 @ 24 Hz; it's HDMI 2.0 that can do it @ 60 Hz.
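The ~300 MHz and ~600 MHz figures in this exchange follow from the standard CTA-861 4K raster, which pads 3840x2160 out to a 4400x2250 total including blanking (a sketch; exact totals vary by video format):

```python
# TMDS pixel clock for the standard CTA-861 4K timing at 30 and 60 Hz.
# Total raster: 4400x2250 (3840x2160 active plus blanking intervals).

H_TOTAL, V_TOTAL = 4400, 2250

clocks = {hz: H_TOTAL * V_TOTAL * hz / 1e6 for hz in (30, 60)}
print(clocks)  # {30: 297.0, 60: 594.0}
# 297 MHz fits under HDMI 1.4's 340 MHz TMDS limit;
# 594 MHz needs HDMI 2.0's 600 MHz ceiling.
```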
  • zanon - Tuesday, April 1, 2014

    As far as things that still aren't there, I'd throw in color space (both gamut and bit depth) as well. Official UHDTV (see Rec. 2020), beyond bumping the resolution standards to 4K or 8K, at last features a significantly larger color space and the bit depth necessary to go with it (either 10-bit or 12-bit). That's another marquee feature of HDMI 2.0: 12-bit 4:2:2 4K@60fps. Without the increased depth, a wider gamut isn't a straight upgrade, since the delta between adjacent colors increases too; 8-bit AdobeRGB, say, isn't a clear superset of 8-bit sRGB. It's exciting that along with HiDPI we'll finally see an industry-wide shift to a color space that will be a strict improvement and is large enough to basically be "done" as far as human vision goes.

    There's still a lot more pieces needed on the PC side though, including both hardware (video cards, interconnect) and OS/applications. High DPI is slowly improving, but even Apple has slipped a bit in terms of color management and support. That said, given the economies of scale that'll come with the general UHDTV push the market pressure should be there at least.
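A rough way to see zanon's depth-vs-gamut point in numbers (the coverage figures below are approximate CIE 1931 area percentages, used purely for illustration): spreading the same 8 bits over a wider gamut makes each code-value step coarser, while 10 bits over Rec. 2020 brings the step back below sRGB's.

```python
# Very rough model: step size between adjacent code values, proportional
# to gamut coverage divided by the number of steps the bit depth allows.
# Assumed coverage of CIE 1931 area: sRGB ~35.9%, Rec. 2020 ~75.8%.

def relative_step(coverage, bits):
    # larger value = bigger jump between adjacent colors (banding risk)
    return coverage / (2 ** bits - 1)

srgb_8 = relative_step(0.359, 8)
r2020_8 = relative_step(0.758, 8)
r2020_10 = relative_step(0.758, 10)

print(f"8-bit Rec.2020 step is {r2020_8 / srgb_8:.1f}x the sRGB step")
print(f"10-bit Rec.2020 step is {r2020_10 / srgb_8:.2f}x the sRGB step")
```

Which is exactly why Rec. 2020 pairs the wider gamut with 10- or 12-bit depth rather than keeping 8 bits.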
  • npz - Tuesday, April 1, 2014

    "Sure, you can run a desktop at full resolution with no scaling but that is almost impossible for anyone to actually use. To get any real benefit from any HiDPI display you are going to need OS and Application support."

    I don't understand this thinking. To fully utilize all the pixels at your disposal, why would you scale? You want native 1:1 resolution. It makes NO sense to scale. If your eyes can't make out the details in the finer pixels, why bother getting more pixels in the first place?

    Objectively utilizing a higher resolution requires that your eyes be able to *resolve* the pixels better, which means its purpose is more screen real estate for display items, not scaling the picture on this 4K monitor to the same size as on a 1080p monitor.

    4K on a 32" monitor at 1:1 pixels is very usable if you don't have bad eyesight. If you need to scale, meaning your eyes can't resolve well at this pixel density, then why not just get something like 2560x1440 at the same size instead? Plus you'll have the added benefit of avoiding the distortion and blurriness introduced by scaling bitmaps.
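The density comparison npz is making can be put in numbers with simple geometry (diagonal pixel count divided by diagonal size, assuming flat 16:9 panels):

```python
# Pixels per inch for the two panels being compared.
import math

def ppi(width_px, height_px, diagonal_in):
    # diagonal length in pixels divided by diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_in

print(f"{ppi(3840, 2160, 32):.0f} PPI")  # ~138 PPI: 4K at 32 inches
print(f"{ppi(2560, 1440, 27):.0f} PPI")  # ~109 PPI: 1440p at 27 inches
```

So the 32" 4K panel is roughly 27% denser than a 27" 1440p one, which is the gap npz argues your eyes must be able to resolve for 1:1 use to pay off.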
  • peterfares - Tuesday, April 1, 2014

    He doesn't mean scaling as in stretching the image out; he means the more intelligent scaling that phones and newer desktop programs do.

    Interface elements are composed of more pixels to increase clarity while staying big enough to be usable. If Photoshop worked well with HiDPI systems, the buttons' physical size would remain the same but they would be made up of more pixels. The photo work area, though, would be 1:1.
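The split peterfares describes can be sketched in a few lines (the function names and the 2x factor here are hypothetical, chosen just to make the distinction concrete):

```python
# Sketch of HiDPI-aware rendering: UI chrome scales with the display's
# scale factor to keep its physical size, while the document canvas
# maps one image pixel to one device pixel.

SCALE = 2.0  # assumed device pixels per logical point on a HiDPI display

def ui_size_px(logical_pts):
    # a 24-pt button stays 24 pt on screen, drawn with 2x the pixels
    return int(logical_pts * SCALE)

def canvas_size_px(image_px):
    # the photo work area stays 1:1 with device pixels
    return image_px

print(ui_size_px(24), canvas_size_px(1000))  # 48 1000
```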
  • npz - Tuesday, April 1, 2014

    What I mean is: why scale? Why not have everything proportionate to the DPI to actually utilize the extra pixels? Adding extra pixels to interface elements doesn't add detail that wasn't there to begin with. It's not like a photograph; it just adds pixels for the sake of increasing size. Even if adding extra pixels to interface elements increases "clarity"--let's assume it's something like a high-res texture--that means your eyes can resolve the details better, i.e. actually perceive the difference in resolution. So if your eyes can resolve the difference in detail at a finer level, why not keep the elements small in order to gain more usable working space? That's where the real utility comes in.

    Most applications, like the majority of graphics and audio programs that use their own custom UI, don't use vector graphics. They instead use bitmaps placed with pixel-level precision, and bitmap fonts, in order to have small widgets. Most of these (DAWs, 3D packages, Adobe-like programs, etc.) pack a ton of things into the interface. Why not take advantage of the resolution for more working space, then?
