DisplayPort vs. HDMI: Which Is Better For Gaming?

The best gaming monitors and best graphics cards are packed with features, but one aspect that often gets overlooked is the choice between DisplayPort and HDMI. What are the differences between the two ports, and is using one to connect to your system definitively better?

You might think it's a simple matter of hooking up whatever cable comes with your monitor to your PC and calling it a day, but there are differences that can often mean a loss of refresh rate, color quality, or both if you're not careful. Here's what you need to know about DisplayPort vs. HDMI connections.

If you're looking to buy a new PC monitor or buy a new graphics card, you'll want to consider the capabilities of both sides of the connection — the video output of your graphics card and the video input on your display — before making any purchases. Our GPU Benchmarks hierarchy will tell you how the various graphics cards rank in terms of performance, but it doesn't dig into the connectivity options, which is something we'll cover here.

The Major Display Connection Types

The latest display connectivity standards are DisplayPort and HDMI (High-Definition Multimedia Interface). DisplayPort first appeared in 2006, while HDMI came out in 2002. Both are digital standards, meaning all the data about the pixels on your screen is represented as 0s and 1s as it zips across your cable, and it's up to the display to convert that digital information into an image on your screen.

Earlier digital monitors used DVI (Digital Visual Interface) connectors, and going back even further we had analog VGA (Video Graphics Array) — along with component RGB, S-Video, composite video, EGA and CGA. You don't want to use VGA or any of those others in the 2020s, though. They're old enough that new GPUs generally don't support them, and even if one did, you'd be stuck with an analog signal that's prone to interference. Yuck.

DVI is the bare minimum you want to use today, and even that has limitations. It has a lot in common with early HDMI, just without audio support. It works fine for gaming at 1080p, or 1440p resolution if you have a dual-link connection. Dual-link DVI-D is basically double the bandwidth of single-link DVI-D via extra pins and wires, and most modern GPUs with a DVI port support dual-link. But truly modern graphics cards like Nvidia's Ada Lovelace RTX 40-series and AMD's RDNA 3 RX 7000-series almost never include DVI connectors these days.

If you're wondering about Thunderbolt 2/3, it basically just routes DisplayPort over the Thunderbolt connection. Thunderbolt 2 supports DisplayPort 1.2, and Thunderbolt 3 supports DisplayPort 1.4 video. It's also possible to route HDMI 2.0 over Thunderbolt 3 with the right hardware.

For newer displays, it's best to go with DisplayPort or HDMI. But is there a clear winner between the two? Let's dig into the details.

DisplayPort vs. HDMI: Specs and Resolutions

Not all DisplayPort and HDMI ports are created equal. The DisplayPort and HDMI standards are backward compatible, meaning you can plug in an HDTV from the mid-00s and it should still work with a brand new RTX 20-series or RX 5000-series graphics card. However, the connection between your display and graphics card will end up using the best option supported by both the sending and receiving ends of the connection. That might mean the best 4K gaming monitor with 144 Hz and HDR will end up running at 4K and 24 Hz on an older graphics card!

Here's a quick overview of the major DisplayPort and HDMI revisions, their maximum signal rates and the GPU families that first added support for the standard.

DisplayPort vs. HDMI Specs

Version | Max Transmission Rate | Max Data Rate | Resolution/Refresh Rate Support (24 bpp, uncompressed) | GPU Introduction

DisplayPort Versions

1.0-1.1a | 10.8 Gbps | 8.64 Gbps | 1080p @ 144 Hz, 4K @ 30 Hz | AMD HD 3000 (R600), Nvidia GeForce 9 (Tesla)
1.2-1.2a | 21.6 Gbps | 17.28 Gbps | 1080p @ 240 Hz, 4K @ 75 Hz, 5K @ 30 Hz | AMD HD 6000 (Northern Islands), Nvidia GK100 (Kepler)
1.3 | 32.4 Gbps | 25.92 Gbps | 1080p @ 360 Hz, 4K @ 98 Hz, 5K @ 60 Hz, 8K @ 30 Hz | AMD RX 400 (Polaris), Nvidia GM100 (Maxwell 1)
1.4-1.4a | 32.4 Gbps | 25.92 Gbps | 4K @ 98 Hz, 8K @ 30 Hz | AMD RX 400 (Polaris), Nvidia GM200 (Maxwell 2)
2.0-2.1 | 80.0 Gbps | 77.37 Gbps | 4K @ 240 Hz, 8K @ 85 Hz | AMD RX 7000 (54 Gbps), Intel Arc A-series (40 Gbps)

HDMI Versions

1.0-1.2a | 4.95 Gbps | 3.96 Gbps | 1080p @ 60 Hz | AMD HD 2000 (R600), Nvidia GeForce 9 (Tesla)
1.3-1.4b | 10.2 Gbps | 8.16 Gbps | 1080p @ 144 Hz, 1440p @ 75 Hz, 4K @ 30 Hz, 4K 4:2:0 @ 60 Hz | AMD HD 5000, Nvidia GK100 (Kepler)
2.0-2.0b | 18.0 Gbps | 14.4 Gbps | 1080p @ 240 Hz, 4K @ 60 Hz, 8K 4:2:0 @ 30 Hz | AMD RX 400 (Polaris), Nvidia GM200 (Maxwell 2)
2.1 | 48.0 Gbps | 42.6 Gbps | 4K @ 144 Hz, 8K @ 30 Hz | Nvidia RTX 30 (Ampere), AMD RX 6000 (RDNA 2); partial 2.1 VRR on Nvidia Turing

Note that there are two bandwidth columns: transmission rate and data rate. DisplayPort and HDMI digital signals use a line encoding of some form — 8b/10b for most of the older standards, 16b/18b for HDMI 2.1, and 128b/132b for DisplayPort 2.0 and later. With 8b/10b encoding, for example, every 8 bits of data are transmitted as 10 bits, with the extra bits used to help maintain signal integrity (e.g., by ensuring zero DC bias).

That means only 80% of the theoretical bandwidth is actually available for data use with 8b/10b. 16b/18b encoding improves that to 88.9% efficiency, while 128b/132b encoding yields 97% efficiency. There are still other considerations, like the auxiliary channel on HDMI, but that's not a major factor for PC use.
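
To see how that overhead plays out in numbers, here's a minimal Python sketch (our own illustration, not part of either spec) that applies each encoding's efficiency to the transmission rates from the table above:

```python
# Fraction of transmitted bits that carry actual display data:
ENCODING_EFFICIENCY = {
    "8b/10b": 8 / 10,        # DisplayPort 1.x, HDMI 2.0 and earlier
    "16b/18b": 16 / 18,      # HDMI 2.1
    "128b/132b": 128 / 132,  # DisplayPort 2.x
}

def data_rate_gbps(transmission_gbps, encoding):
    """Usable data rate after line-encoding overhead."""
    return transmission_gbps * ENCODING_EFFICIENCY[encoding]

print(data_rate_gbps(32.4, "8b/10b"))     # DP 1.4 (HBR3): 25.92 Gbps
print(data_rate_gbps(48.0, "16b/18b"))    # HDMI 2.1: ~42.7 Gbps
# Published DP 2.x figures (e.g., 77.37 Gbps) land slightly below this
# raw math because of additional link-layer overhead:
print(data_rate_gbps(80.0, "128b/132b"))  # DP 2.1 (UHBR 20): ~77.6 Gbps
```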

Let's Talk More About Bandwidth

To understand the above chart in context, we need to go deeper. What all digital connections — DisplayPort, HDMI and even DVI-D — end up coming down to is the required bandwidth. Every pixel on your display has three components: red, green, and blue (RGB) — alternatively: luma, blue chroma difference, and red chroma difference (YCbCr/YPbPr) can be used. Whatever your GPU renders internally (typically 16-bit floating point RGBA, where A is the alpha/transparency information), that data gets converted into a signal for your display.

The standard in the past has been 24-bit color, or 8 bits each for the red, green and blue color components. HDR and high color depth displays have bumped that to 10-bit color, with 12-bit and 16-bit options as well, though the latter two are mostly in the professional space. Generally speaking, display signals use either 24 bits per pixel (bpp) or 30 bpp, with the best HDR monitors opting for 30 bpp. Multiply the color depth by the number of pixels and the screen refresh rate and you get the minimum required bandwidth. We say 'minimum' because there are a bunch of other factors as well.

Display timings are relatively complex calculations. The VESA governing body defines the standards, and there's even a handy spreadsheet that spits out the actual timings for a given resolution. A 1920x1080 monitor at a 60 Hz refresh rate, for example, uses 2,000 pixels per horizontal line and 1,111 lines once all the timing stuff is added, because display blanking intervals need to be factored in. (These blanking intervals are partly a holdover from the analog CRT days, but the standards still include them even for digital displays.)
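
If you'd rather not open the spreadsheet, the reduced-blanking math can be approximated in a few lines of Python. This is a simplified sketch of VESA's CVT-R2 reduced-blanking formula (a fixed 80-pixel horizontal blank plus at least 460 microseconds of vertical blanking), which is what yields the 2,000 x 1,111 totals for 1080p at 60 Hz:

```python
import math

def required_gbps(hactive, vactive, refresh_hz, bpp=24):
    """Approximate uncompressed data rate with CVT-R2 reduced-blanking timings."""
    htotal = hactive + 80  # fixed 80-pixel horizontal blanking
    # Smallest total line count whose blanking lasts at least 460 us:
    # (vtotal - vactive) / (vtotal * refresh_hz) >= 460e-6
    vtotal = math.ceil(vactive / (1 - 460e-6 * refresh_hz))
    pixel_clock = htotal * vtotal * refresh_hz  # pixels per second
    return pixel_clock * bpp / 1e9              # Gbps

print(required_gbps(1920, 1080, 60))       # ~3.20 Gbps
print(required_gbps(3840, 2160, 144, 30))  # ~39.19 Gbps
```

Both results match the table below, and the same function reproduces the other rows.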

Using the VESA spreadsheet and running the calculations gives the following bandwidth requirements. Look at the following table and compare it with the first table; if the required data bandwidth is less than the max data rate that a standard supports, then the resolution can be used.

Common Resolution Bandwidth Requirements

Resolution | Color Depth | Refresh Rate | Required Data Bandwidth
1920 x 1080 | 8-bit | 60 Hz | 3.20 Gbps
1920 x 1080 | 10-bit | 60 Hz | 4.00 Gbps
1920 x 1080 | 8-bit | 144 Hz | 8.00 Gbps
1920 x 1080 | 10-bit | 144 Hz | 10.00 Gbps
2560 x 1440 | 8-bit | 60 Hz | 5.63 Gbps
2560 x 1440 | 10-bit | 60 Hz | 7.04 Gbps
2560 x 1440 | 8-bit | 144 Hz | 14.08 Gbps
2560 x 1440 | 10-bit | 144 Hz | 17.60 Gbps
3840 x 2160 | 8-bit | 60 Hz | 12.54 Gbps
3840 x 2160 | 10-bit | 60 Hz | 15.68 Gbps
3840 x 2160 | 8-bit | 144 Hz | 31.35 Gbps
3840 x 2160 | 10-bit | 144 Hz | 39.19 Gbps

The above figures are all for uncompressed signals, however. DisplayPort 1.4 added the option of Display Stream Compression 1.2a (DSC), which is also part of HDMI 2.1. In short, DSC helps overcome bandwidth limitations, which are becoming increasingly problematic as resolutions and refresh rates increase. For example, basic 24 bpp at 8K and 60 Hz needs 49.65 Gbps of data bandwidth, or 62.06 Gbps for 30 bpp (10-bit) HDR color. 8K 120 Hz with 10-bit HDR, a mode we're likely to see more of in the future, needs 127.75 Gbps. Yikes!

DSC can provide up to a 3:1 compression ratio by converting to YCgCo and using delta PCM encoding. It provides a "visually lossless" (and sometimes even truly lossless, depending on what you're viewing) result. Using DSC, 8K 120 Hz HDR is suddenly viable, with a bandwidth requirement of 'only' 42.58 Gbps.
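
The arithmetic behind that claim is simple enough to sanity-check. Here's a hypothetical helper (remember that 3:1 is DSC's best case, not a guarantee):

```python
def dsc_link_gbps(uncompressed_gbps, ratio=3.0):
    """Link bandwidth needed after DSC compression (up to 3:1)."""
    return uncompressed_gbps / ratio

print(dsc_link_gbps(49.65))   # 8K 60 Hz, 24 bpp:  ~16.6 Gbps
print(dsc_link_gbps(127.75))  # 8K 120 Hz, 30 bpp: ~42.6 Gbps
```

That 42.58 Gbps result is what makes 8K 120 Hz HDR fit within DisplayPort 2.x, and just barely within HDMI 2.1.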

There's a catch with DSC, however: Support tends to be rather hit and miss. We've tested a bunch of graphics cards using a Samsung Odyssey Neo G8 32, which supports up to 4K at 240 Hz over DisplayPort 1.4 or HDMI 2.1. On DisplayPort connections, most of the latest GPUs are fine, but cards from a generation or two back may not even allow the use of 240 Hz. We've also seen video signal corruption on occasion, where dropping to 120 Hz (still with DSC) often fixes the problem. In short, cable quality and the DSC hardware implementation still factor into the equation.

Both HDMI and DisplayPort can also carry audio data, which requires bandwidth as well, though it's a minuscule amount compared to the video data. DisplayPort and HDMI currently use a maximum of 36.86 Mbps for audio (enough for eight channels of 24-bit, 192 kHz PCM), or 0.037 Gbps if we keep things in the same units as video. Earlier versions of each standard can use even less data for audio. One important note is that HDMI supports audio pass-through, while DisplayPort does not. If you're planning to hook up your GPU to an amplifier, HDMI provides a better solution.

That's a lengthy introduction to a complex subject, but if you've ever wondered why the simple math (resolution * refresh rate * color depth) doesn't match published specs, it's because of all the timing standards, encoding, audio and more. Bandwidth isn't the only factor, but in general, the standard with a higher maximum bandwidth is 'better.'

DisplayPort: The PC Choice

Currently DisplayPort 1.4 is the most capable and readily available version of the DisplayPort standard. The DisplayPort 2.0 spec came out in June 2019, and Intel's Arc Alchemist GPUs along with AMD's RDNA 3 GPUs support the standard (which has since been bumped to DisplayPort 2.1). Nvidia for its part has decided to stick with DisplayPort 1.4a with its Ada Lovelace parts.

There are now cards with DisplayPort 2.1 support, but at different capability levels. Intel's Arc GPUs support 10 Gbps per lane, for a 40 Gbps maximum connection speed (before 128b/132b encoding overhead). AMD opted for the faster 13.5 Gbps per lane (54 Gbps total), but neither company supports the top 20 Gbps per lane variant. But perhaps the bigger issue now isn't GPU support.

There still aren't many displays that support DisplayPort 2.1. Those are starting to appear, but it's the old chicken and egg scenario. While AMD's latest and greatest supports DisplayPort 2.1, very few people have purchased those cards. Nvidia's decision to stick with DisplayPort 1.4a will undoubtedly play a role as well, since Nvidia still accounts for 75% or more of all GPU sales. DisplayPort 1.4 doesn't have as much bandwidth available as HDMI 2.1, but it's sufficient for up to 4K 240 Hz and 8K 60 Hz with DSC, and HDMI 2.1 support is there for people who need up to 48 Gbps.

One advantage of DisplayPort is that variable refresh rates (VRR) have been part of the standard since DisplayPort 1.2a. We also like the robust DisplayPort connector (but not mini-DisplayPort), which has hooks that latch into place to keep cables secure. It's a small thing, but we've definitely pulled loose more than a few HDMI cables by accident. DisplayPort can also connect multiple screens to a single port via Multi-Stream Transport (MST), and the DisplayPort signal can be piped over a USB Type-C connector that also supports MST.

One area where there has been some confusion is in regards to licensing and royalties. DisplayPort was supposed to be a less expensive standard (at least, that's how I recall it being proposed back in the day). But today, both HDMI and DisplayPort have various associated brands, trademarks, and patents that have to be licensed. With technologies like HDCP (High-bandwidth Digital Content Protection), DSC, and more, companies have to pay a royalty for DP just like HDMI. The current rate appears to be $0.20 per product with a DisplayPort interface, with a cap of $7 million per year. HDMI charges $0.15 per product, or $0.05 if the HDMI logo is used in promotional materials.

Because the standard has evolved over the years, not all DisplayPort cables will work properly at the latest speeds. The original DisplayPort 1.0-1.1a spec allowed for RBR (reduced bit rate) and HBR (high bit rate) cables, capable of 5.18 Gbps and 8.64 Gbps of data bandwidth, respectively. DisplayPort 1.2 introduced HBR2, which doubled the maximum data rate to 17.28 Gbps and remains compatible with standard HBR DisplayPort cables. HBR3 with DisplayPort 1.3-1.4a increased things again to 25.92 Gbps and added the requirement of DP8K-certified cables.

Finally, with DisplayPort 2.1 there are three new transmission modes: UHBR 10 (ultra high bit rate), UHBR 13.5 and UHBR 20. The number refers to the bandwidth of each lane, and DisplayPort uses four lanes, so UHBR 10 offers up to 40 Gbps of transmission rate, UHBR 13.5 can do 54 Gbps, and UHBR 20 peaks at 80 Gbps. All three UHBR standards are compatible with the same DP8K-certified cables, thankfully, and use 128b/132b encoding, meaning data rates of 38.69 Gbps, 52.22 Gbps, and 77.37 Gbps.
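
Pulling the cable generations together, here's a small sketch that tabulates DisplayPort's link modes. The per-lane rates of 1.62, 2.7, 5.4 and 8.1 Gbps for RBR through HBR3 are the spec values (they're not listed above, so treat them as our addition); the UHBR numbers come from the paragraph above:

```python
# (per-lane Gbps, encoding efficiency) for each DisplayPort link mode:
DP_MODES = {
    "RBR":       (1.62, 8 / 10),
    "HBR":       (2.70, 8 / 10),
    "HBR2":      (5.40, 8 / 10),
    "HBR3":      (8.10, 8 / 10),
    "UHBR 10":   (10.0, 128 / 132),
    "UHBR 13.5": (13.5, 128 / 132),
    "UHBR 20":   (20.0, 128 / 132),
}

for mode, (lane_gbps, efficiency) in DP_MODES.items():
    link = lane_gbps * 4  # DisplayPort always runs four lanes
    # The UHBR rows print a touch higher than the published 38.69,
    # 52.22 and 77.37 Gbps, which also subtract link-layer overhead.
    print(f"{mode:9} {link:5.1f} Gbps link, {link * efficiency:6.2f} Gbps data")
```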

Officially, the maximum length of a DisplayPort cable is 3m (9.8 feet), which is one of the potential drawbacks, particularly for consumer electronics use.

With a maximum data rate of 25.92 Gbps, DisplayPort 1.4 can handle 4K resolution with 24-bit color at 98 Hz, and dropping to 4:2:2 YCbCr gets it to 144 Hz with HDR. Alternatively, DSC allows up to 4K and 240 Hz, even with HDR. Keep in mind that 4K HDR monitors running at 144 Hz or more carry premium pricing, so gamers will more likely be looking at something like a 144 Hz display at 1440p. That only requires 14.08 Gbps for 24-bit color or 17.60 Gbps for 30-bit HDR, which DP 1.4 can easily handle.
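
That kind of fits-or-doesn't check is easy to mechanize. Here's a sketch using the data rates from the spec table above (the helper and its names are just for illustration):

```python
# Maximum uncompressed data rates (Gbps) from the spec table above:
LINK_DATA_RATE = {
    "HDMI 2.0": 14.40,
    "DP 1.2": 17.28,
    "DP 1.4": 25.92,
    "HDMI 2.1": 42.60,
    "DP 2.1 (UHBR 20)": 77.37,
}

def links_that_fit(required_gbps):
    """All links whose data rate covers the requested mode without DSC."""
    return [name for name, rate in LINK_DATA_RATE.items() if rate >= required_gbps]

print(links_that_fit(17.60))  # 1440p 144 Hz HDR: DP 1.4, HDMI 2.1, DP 2.1
```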

If you're wondering about 8K content in the future, the reality is that even though it's doable right now via DSC and DisplayPort 1.4a, the displays and PC hardware needed to drive such displays aren't generally within reach of consumer budgets. GeForce RTX 4090 sort of overcomes that limitation, but 8K pixel densities often outstrip modest human eyesight. By the time 8K becomes a viable resolution, both in price and in the GPU performance required to run it adequately, we'll likely have gone through another generation or three of GPU hardware.

HDMI: Ubiquitous Consumer Electronics

Updates to HDMI have kept the standard relevant for more than two decades. The earliest versions of HDMI have become outdated, but later versions have increased bandwidth and features.

HDMI 2.0b and earlier are 'worse' in some ways compared to DisplayPort 1.4, but if you're not trying to run at extremely high resolutions or refresh rates, you probably won't notice the difference. Full 24-bit RGB color at 4K 60 Hz has been available since HDMI 2.0 released in 2013, and higher resolutions and/or refresh rates are possible with 4:2:0 YCbCr output — though you generally don't want to use that with PC text, as it can make the edges look fuzzy.

For AMD FreeSync users, HDMI has also supported VRR via an AMD extension since 2.0b, but HDMI 2.1 is where VRR became part of the official standard. Both AMD and Nvidia support HDMI 2.1 with VRR, starting with Turing and RDNA 2. Nvidia has also opted to call its HDMI 2.1 VRR solution "G-Sync Compatible," and you can find a list of all the officially tested and supported displays on Nvidia's site.

One major advantage of HDMI is that it's ubiquitous. Millions of devices with HDMI shipped in 2004 when the standard was young, and it's now found everywhere. These days, consumer electronics devices like TVs often include support for three or more HDMI ports. TVs and consumer electronics hardware have been shipping HDMI 2.1 devices for a while, before PCs even had support.

HDMI cable requirements have changed over time, just like DisplayPort. One of the big advantages is that high quality HDMI cables can be up to 15m (49.2 feet) in length — five times longer than DisplayPort. That may not be important for a display sitting on your desk, but it can definitely matter for home theater use. Originally, HDMI had two categories of cables: category 1 or standard HDMI cables are intended for lower resolutions and/or shorter runs, and category 2 or “High Speed” HDMI cables are capable of 1080p at 60 Hz and 4K at 30 Hz with lengths of up to 15m.

More recently, HDMI 2.0 introduced “Premium High Speed” cables certified to meet the 18 Gbps bit rate, and HDMI 2.1 has created a fourth class of cable, “Ultra High Speed” HDMI that can handle up to 48 Gbps. HDMI also provides for routing Ethernet signals over the HDMI cable, though this is rarely used in the PC space.
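
As a quick reference, here's a sketch that picks the minimum HDMI cable class for a given link rate. The certification figures match the classes described above (High Speed cables are certified to 10.2 Gbps, a spec value we've added); the helper itself is hypothetical:

```python
# Certified signaling ceiling (Gbps) for each HDMI cable class:
HDMI_CABLE_RATING = {
    "High Speed (Category 2)": 10.2,  # 1080p @ 60 Hz, 4K @ 30 Hz
    "Premium High Speed": 18.0,       # HDMI 2.0
    "Ultra High Speed": 48.0,         # HDMI 2.1
}

def cable_for(link_gbps):
    """Slowest certified cable class that still meets the link rate."""
    for name, rating in sorted(HDMI_CABLE_RATING.items(), key=lambda kv: kv[1]):
        if rating >= link_gbps:
            return name
    raise ValueError("No certified HDMI cable class is fast enough")

print(cable_for(18.0))  # full HDMI 2.0 signal -> "Premium High Speed"
```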

We mentioned licensing fees earlier, and while the HDMI organization doesn't explicitly state the cost, this website details the various HDMI licensing fees as of 2014. The short summary: for a high volume business making a lot of cables or devices, it's $10,000 annually, plus $0.05 per HDMI port provided HDCP (High-bandwidth Digital Content Protection) is used and the HDMI logo is displayed in marketing material. In other words, the cost to end users is easily absorbed in most cases — unless some bean counter comes down with a case of extreme penny pinching.

Like DisplayPort, HDMI also supports HDCP to protect content from being copied. That's a separate licensing fee, naturally (though it reduces the HDMI fee). HDMI has supported HDCP since the beginning, starting at HDCP 1.1 and reaching HDCP 2.2 with HDMI 2.0. HDCP can cause issues with longer cables, and ultimately it seems to annoy consumers more than the pirates: known workarounds that strip HDCP 2.2 from video signals can already be found.

HDMI 2.1 allows for up to 48 Gbps signaling rates, and it also supports DSC. Theoretically, that means resolutions and refresh rates of up to 4K at 480 Hz or 8K at 120 Hz are supported over a single connection and cable. We're not aware of any 4K 480 Hz displays yet, though there are prototype 8K 120 Hz TVs that have been shown at CES.

DisplayPort vs. HDMI: The Bottom Line for Gamers

We've covered the technical details of DisplayPort and HDMI, but which one is actually better for gaming? Some of that will depend on the hardware you already own or intend to purchase. Both standards are capable of delivering a good gaming experience, but if you want a great gaming experience, right now DisplayPort 1.4 is generally better than HDMI 2.0, HDMI 2.1 technically beats DP 1.4, and DisplayPort 2.1 trumps HDMI 2.1. The problem is, you'll need support for the desired standard from both your graphics card and your display for things to work right.

For Nvidia gamers, your best option right now is a DisplayPort 1.4 connection to a G-Sync certified (compatible or official) display. Alternatively, HDMI 2.1 with a newer display works as well. Both the RTX 30-series and 40-series cards support the same connection standards, for better or worse. Most graphics cards will come with three DisplayPort connections and a single HDMI output, though you can find models with two HDMI and two (or three) DisplayPort connections as well — only four active outputs at a time are supported.

AMD gamers have a few more options, at least with RX 7000-series cards. You can find DisplayPort 2.1 monitors and TVs, if you look hard enough. Maybe. The Asus ROG Swift PG32UXQR, for example, supports DisplayPort 2.1, but it hasn't officially launched yet (and it's not the same as the previous PG32UXQ). HDMI 2.1 connectivity is also sufficient, and there are more displays available. Keep in mind that the maximum bandwidth of the RDNA 3 GPUs is 54 Gbps over DisplayPort 2.1, or 48 Gbps over HDMI 2.1, so it's not a huge difference. Most AMD RX 7900-series cards that we've seen include two DisplayPort 2.1 ports, and either two HDMI 2.1 ports or a single HDMI 2.1 port alongside a USB Type-C connection.

What if you already have a monitor that isn't running at higher refresh rates or doesn't have G-Sync or FreeSync capability, and it has both HDMI and DisplayPort inputs? Assuming your graphics card also supports both connections (and it probably does if it's a card made in the past five years), in many instances the choice of connection won't really matter.

2560x1440 at a fixed 144 Hz refresh rate and 24-bit color works just fine on DisplayPort 1.2 or higher, as well as HDMI 2.0 or higher. Anything lower than that will also work without trouble on either connection type. About the only caveat is that sometimes HDMI connections on a monitor will default to a limited RGB range, but you can correct that in the AMD or Nvidia display options. (This is because old TV standards used a limited color range, and some modern displays still think that's a good idea. News flash: it's not.)

Other use cases might push you toward DisplayPort as well, like if you want to use MST to have multiple displays daisy chained from a single port. That's not a very common scenario, but DisplayPort does make it possible. Home theater use on the other hand continues to prefer HDMI, and the auxiliary channel can improve universal remote compatibility. If you're hooking up your PC to a TV, HDMI is usually required, as there aren't many TVs that have a DisplayPort input.

You can do 4K at 60 Hz on both standards without DSC, so it's only 8K or 4K at refresh rates above 60 Hz where you actually run into limitations on recent GPUs. We've used AMD and Nvidia GPUs at 4K and 98 Hz (8-bit RGB) with most models going back several generations, and 4:2:2 chroma can push even higher refresh rates if needed. Modern gaming monitors like the Samsung Odyssey Neo G8 32 with 4K and up to 240 Hz are also available, with DisplayPort 1.4 and HDMI 2.1 connectivity.

Ultimately, while there are certain specs advantages to DisplayPort and some features on HDMI that can make it a better choice for consumer electronics use, the two standards end up overlapping in many areas. The VESA standards group in charge of DisplayPort has its eyes on PC adoption growth, whereas HDMI is defined by a consumer electronics consortium and thinks about TVs first. But DisplayPort and HDMI end up with similar capabilities.

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.


Comments from the forums

  • Toadster88

    what about Thunderbolt 3 in comparison?

    Reply

  • JarredWaltonGPU

    Toadster88 said:

    what about Thunderbolt 3 in comparison?

    Thunderbolt just uses DisplayPort routed over the connection. Thunderbolt 2 supports DP 1.2 resolutions, and Thunderbolt 3 supports DP 1.4. I'll add a note in the article, though.

    Reply

  • jonathan1683

    I tried to setup a g-sync monitor with HDMI lol yea didn't work and took me forever to figure it out. Easy choice now.

    Reply

  • CerianK

    Another big issue for me is input switching latency when I flip between sources, which is something missing from product specifications and reviews (unless I've somehow missed seeing it).

    I know that HDMI can be very slow (depending on monitor)... sometimes as much as 5 seconds to see the new source. I assumed that was content protection built into the standard and/or slow decoder ASIC.
    I have not compared switch latency to DisplayPort, so I would be curious if anyone here has impressions.

    Honestly, I would probably pay quite a bit extra for a monitor and/or TV that has much faster input source switching. I have lost patience for technology regression as I have aged... I recall using monitors and TVs that had nearly instantaneous source switching back in the analog days.

    Reply

  • JarredWaltonGPU

    CerianK said:

    Another big issue for me is input switching latency when I flip between sources, which is something missing from product specifications and reviews (unless I've somehow missed seeing it).

    I know that HDMI can be very slow (depending on monitor)... sometimes as much as 5 seconds to see the new source. I assumed that was content protection built into the standard and/or slow decoder ASIC.
    I have not compared switch latency to DisplayPort, so I would be curious if anyone here has impressions.

    Honestly, I would probably pay quite a bit extra for a monitor and/or TV that has much faster input source switching. I have lost patience for technology regression as I have aged... I recall using monitors and TVs that had nearly instantaneous source switching back in the analog days.

    My experience is that it's more the monitor than the input. I've had monitors that take as long as 10 seconds to switch to a signal (or even turn on in the first place), and I've had others that switch in a second or less. I'm not sure if it's just poor firmware, or a cheap scaler, or something else.

    I will say that I have an Acer XB280HK 4K60 G-Sync display that only has a single DisplayPort input, and it powers up or wakes from sleep almost instantly. I have an Acer G-Sync Ultimate 4K 144Hz HDR display meanwhile that takes about 7 seconds to wake from sleep. Rather annoying.

    Reply

  • Molor#1880

    "36.86 Mbps for audio, or 0.37 Gbps" Actually it would be 0.037 Gbps, which takes it from relatively small to a near rounding error for several of those tables.

    Reply

  • excalibur1814

    The best option is the one you have! My gaming laptop only has hdmi.

    Reply

  • bit_user

    JarredWaltonGPU said:

    My experience is that it's more the monitor than the input. I've had monitors that take as long as 10 seconds to switch to a signal (or even turn on in the first place), and I've had others that switch in a second or less. I'm not sure if it's just poor firmware, or a cheap scaler, or something else.

    I always figured it's to do with HDMI's handshaking and auto-negotiation.

    HDMI was designed for home theater, where you could have the signal pass through a receiver or splitter. Not only do you need to negotiate resolution, refresh rate, colorspace, bit-depth, link speed, ancillary & back-channel data, but also higher-level parameters like audio delay. So, probably just a poor implementation of that process, running on some dog-slow embedded processor.

    As such, you might find that locking down the configuration range of the source can speed things up, a bit.

    Reply

  • JarredWaltonGPU

    Molor#1880 said:

    "36.86 Mbps for audio, or 0.37 Gbps" Actually it would be 0.037 Gbps, which takes it from relatively small to a near rounding error for several of those tables.

    Oops, you're correct. Speaking of rounding errors, I seem to have misplaced my decimal point. ;-)

    Reply

  • waltc3

    I'm surprised the article didn't mention TVs, as currently that's the main reason people go HDMI instead of DP, imo. I appreciated the fact that the article mentioned the loss of color fidelity the 144Hz compromise forces, although most people seem to ignore that difference. I include a link below that is informative on that topic--it's not mine but I saved the link to remind me...;) I haven't looked in a while, but last time I checked few if any TVs feature Display Ports--my TV at home is strictly HDMI. Personally, I use an AMD 50th Ann Ed 5700XT with a 1.4 DP cable plugged into my DP 1.4 monitor, the Ben Q 3270U.

    View: https://www.reddit.com/r/hardware/comments/8rlf2z/psa_4k_144_hz_monitors_use_chroma_subsampling_for/

    Reply

