Is HDMI Suitable For Gaming?
- May 17, 2018 -

Any modern video card will have several display output connectors, and usually among them will be the familiar, many-pinned DVI. That connector is fast becoming a legacy one, though. (The occasional high-end video card these days drops it altogether.) Old-school VGA is fading out even faster, and it now tends to be found only on low-end video cards, if at all. Nowadays it's mainly a solution for connecting legacy monitors.


HDMI is one of your modern choices. An HDMI port ("HDMI" stands for "High-Definition Multimedia Interface") is almost a given card-side, with most video cards offering at least one, if not more. HDMI is used primarily to connect consumer electronics gear to a television, be that a game console, an A/V receiver, or your Roku/Amazon box. The vast majority of computer-centric monitors also sport an HDMI input, though the port is more prevalent on consumer/home monitors than on business-oriented displays.


The latest HDMI spec (at this writing) is HDMI 2.1, and its capabilities are impressive compared to older digital display connector tech. This version of HDMI boasts a 48Gbps-rated bandwidth, carries forward support for HDR (of varying flavors, first introduced in HDMI 2.0a and 2.0b), and adds Enhanced Audio Return Channel (eARC) functionality. (eARC allows a TV to send audio signals back to a receiver, in the event that's a factor in your gaming setup.)
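To put that 48Gbps figure in context, here is a minimal sketch of the back-of-the-envelope math for a demanding gaming target such as 4K at 120Hz with 10-bit color. This is an illustrative estimate only: it assumes uncompressed RGB (4:4:4) output and ignores blanking intervals and link-encoding overhead, and the function and constant names are my own, not from the article or the HDMI spec.

```python
def video_bitrate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Raw pixel-data rate in Gbps (ignores blanking and link encoding)."""
    bits_per_second = width * height * refresh_hz * bits_per_channel * channels
    return bits_per_second / 1e9

HDMI_2_1_GBPS = 48  # rated link bandwidth per the HDMI 2.1 spec

# 4K @ 120 Hz with 10-bit color, a demanding HDR gaming target
rate = video_bitrate_gbps(3840, 2160, 120, 10)
print(f"4K120 10-bit needs roughly {rate:.1f} Gbps raw; HDMI 2.1 is rated {HDMI_2_1_GBPS} Gbps")
```

Even this rough estimate (about 30Gbps of raw pixel data) shows why the jump to 48Gbps matters for high-refresh HDR gaming; older HDMI versions topped out well below that.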


What could be most important for some gamers is HDMI's ability to support FreeSync. If you own a late-model AMD Radeon video card that supports FreeSync, that could be your decision-maker right there between the two major interfaces. (Of course, your display will need to support FreeSync as well for you to garner any benefit.)