Best options for playing Retro Consoles on HDTV – Games Connection


Reduce input lag with Game Mode

Modern televisions do a lot of work to make your picture look good. For example, when your 1080p or 4K HDTV has to upscale the 240p picture from a gaming console, the television may perform motion interpolation or other post-processing in the background. This can cause “input lag,” in which you experience a delay between pressing a button on the controller and seeing a reaction on the screen. This is a problem for lots of video games, but it’s especially noticeable on older, difficult, fast-twitch ones like Tetris or Super Mario Bros.

You can alleviate this by turning on your set’s so-called Game Mode. As we’ve mentioned in the past, Game Mode disables a lot of your TV’s post-processing to reduce input lag.

If you feel like Mario’s not jumping until long after you press the A button, head into your TV’s settings and make sure Game Mode is turned on. This won’t eliminate lag completely—for that, you have to replace your modern set with an old-school CRT TV—but it’ll still provide a noticeable improvement.


Why does your Sega Genesis look horrible on an HDTV?

Older game consoles were designed to work with older televisions, so they often don't play nicely with modern sets. Below, we'll go over the main reasons why.

  • Resolutions fail to match up.

For example, most games on the Super Nintendo Entertainment System use a display resolution of 256×224. A standard 1080p television, at 1920×1080, has more than 36 times as many pixels; it's practically a jump from California to New York!

  • 240p may not be supported at all.

Sometimes 240p signals are not even supported on modern TVs, leaving some entirely incompatible with consoles from the PlayStation era and earlier.

  • The aspect ratios differ.

Old TVs were 4:3, while today's HDTVs are 16:9 widescreen. If you display an older console on a new TV and let it stretch the image to full screen, everything will come out about 1.33 times wider than it should be (16:9 divided by 4:3). You can set your new TV to a 4:3 aspect ratio manually. Another option is to zoom the image, but this crops off a portion of the top and bottom, which can hide crucial game information like remaining lives or ammunition.
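If you're curious where that stretch factor comes from, here's a quick back-of-the-envelope check: stretching a 4:3 picture across a 16:9 screen widens it by the ratio of the two aspect ratios.

```python
from fractions import Fraction

# Stretching a 4:3 picture to fill a 16:9 screen widens everything
# by the ratio of the two aspect ratios.
crt = Fraction(4, 3)    # classic 4:3 tube TV
hdtv = Fraction(16, 9)  # modern widescreen

stretch = hdtv / crt
print(stretch, "~", float(stretch))  # 4/3 ~ 1.333...
```

That's why a stretched retro game looks noticeably fat: every sprite is about a third wider than intended.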

A CRT television displays its picture almost instantly, with a processing delay of under 3-4 milliseconds. The digital pipelines in modern TVs and monitors are more complicated, and even an expensive gaming monitor will have an input lag of around eight milliseconds.

Screen Types: CRT vs. LCD/LED (or Similar)

Now that you understand the kind of video signal your Atari, NES, or SNES (or similar) console is putting out, what kind of screen are you using to display that signal, and how is it being interpreted?

A CRT (Cathode Ray Tube) TV uses an electron gun to fire a beam of electrons at the inside of its screen, hitting a bunch of tiny red, green, and blue phosphors to make them light up. These can handle a range of different video signal resolutions (up to 480 lines of vertical resolution), and they just fit the image to the screen (as long as it's a 4:3 signal being shown). An old game console and a DVD player output different numbers of pixels (levels of detail), and your TV happily scaled whatever it got to light up the right phosphors and give you your image.

An HD (or 4K) TV has a specific, fixed resolution it wants to display everything in, and when it gets a lower-resolution signal, it upscales it. Upscaling stretches the image to fit the larger number of pixels required to match the resolution of your screen. While gamers might consider it ideal for HDTVs to just upscale everything crisply, TV makers opted for upscaling tuned to the look of video rather than pixel art. The upscaling used also tends not to be the best, due to cost savings and the assumption that non-HD signals won't be that common.
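The "crisp" upscale pixel-art fans want is essentially nearest-neighbor integer scaling: each source pixel simply becomes a hard-edged s-by-s block on screen, with no blending. Here's a toy sketch of the idea in Python (purely illustrative; real TVs and external scalers do this in hardware on the video signal, not on lists of numbers):

```python
def nearest_neighbor_upscale(image, s):
    """Scale a 2D grid of pixel values by an integer factor s,
    repeating each pixel into an s-by-s block (no blending)."""
    out = []
    for row in image:
        wide = [px for px in row for _ in range(s)]   # repeat each pixel horizontally
        out.extend([list(wide) for _ in range(s)])    # repeat the whole row vertically
    return out

# A tiny 2x2 "sprite" doubled to 4x4 -- every pixel stays a sharp block.
sprite = [[1, 0],
          [0, 1]]
for row in nearest_neighbor_upscale(sprite, 2):
    print(row)
```

Video-oriented upscalers instead interpolate between neighboring pixels, which smooths photographic footage nicely but turns chunky pixel art into a blurry mess.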

If you want to do a quick test: run a video cable (male RCA connectors on both ends – see below) from your NES to a female-RCA-to-male-F-type adapter (see on Amazon), and connect that to the coaxial jack on the back of your HDTV (assuming it has one). Whoa! Not at all the glorious wonder you may remember from back in the day.

Or, if this is your first exposure to retro gaming, you may think this looks so bad, you don’t understand why people like retro games.

This single image makes the entire case against just plugging your retro video game console into your modern TV. The left side is exactly that: relying on the HDTV's built-in upscaling. The right is the EXACT SAME CARTRIDGE played on THE EXACT SAME HDTV via the RetroN 5. (Game shown: Q*bert on the NES.)

Keep reading to learn how to get your games looking their best.

The originalist perspective: Many feel that since the consoles and games were designed to run on CRTs, that is the optimal way to play them. CRTs use illuminated phosphors, and the screens all have a certain degree of blending/softness in the way images look. This was taken into account when game art was created, and so the purest ideal of what looks “right” is what the developers were intending it to look like. That softer feel was the medium they were working in, not just an inferior version of our “perfect” crisp modern display look. Also, that phosphor glow has a special allure you have to get a look at (on a decent CRT, not junk) to appreciate. Phosphors have a glowing, scintillating warmth that is very appealing compared to the much flatter, matter-of-fact “perfect” look of flat panel TVs.

The pixelist perspective: A proper HD signal of a crisply upscaled low-resolution game can be a very inviting look as well. That pixel art look is boldly emphasized, with each onscreen element having a Lego-like chunkiness. It’s an unusual look to the person accustomed to games on CRTs, but it definitely does have its merits. There’s a clean precision that can be very attractive.

This comparison of images rendered on a CRT vs. an HDTV makes it crystal clear how the literal, blocky pixel rendering of oldschool graphics on a modern TV has a very different feel from the softer shading of the CRT the game was originally designed for. Not every comparison is so stark, but this one makes it obvious. This image is from a fantastic article about the world of creating video game graphics in 1980s Japan over at VGDensetsu.

Video Signal (and Screen) Aspect Ratios

I spent most of my life during the era when TV screens were a nearly squarish shape. Today’s screens are much wider in their proportions, being nearly twice as wide as they are tall.

The aspect ratio difference between oldschool CRT TVs and HDTVs. New TVs are significantly wider than the nearly squarish CRT screens.

This relationship between width and height is referred to as aspect ratio. Standard HD TVs show a 1080p image, meaning they display video 1920 pixels wide by 1080 pixels tall. Note that the resolution is always referred to by the height (1080 in this case), and the "p" in 1080p refers to "progressive" vs. "interlaced" (see the next section). If you take the 1920 × 1080 rectangle and shrink it down while keeping the width-to-height proportion (the aspect ratio) the same, you'll find that 1920:1080 reduces to 16:9. So 16:9 is just the simplest whole-number expression of that relationship between width and height.
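If you want to verify that reduction yourself, it's just dividing both numbers by their greatest common divisor, for example in Python:

```python
from math import gcd

width, height = 1920, 1080
d = gcd(width, height)   # greatest common divisor: 120
print(f"{width // d}:{height // d}")  # prints 16:9
```

The same trick reduces any resolution to its aspect ratio, e.g. 640:480 comes out as 4:3.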

