In the past, Digital Foundry has enthused about the quality of high-end CRTs, the display technology of yesteryear that can still deliver some aspects of image quality that no modern screen can match. We've talked about contrast, precision, motion resolution and much more. Now, two years on from us acquiring the Sony GDM-FW900 - quite possibly the best gaming CRT money can buy - we've tested out how the display works with the new wave of consoles: Xbox Series X and PlayStation 5. Are all of the benefits of CRT still there? How on earth do you connect an HDMI device to an 18-year-old display? How does today's 4K rendering stand up on a CRT screen? And since we first looked at CRTs, have modern displays made any strides in matching up to the strengths of the cathode ray experience?
You can see for yourself by checking out the video below, where I test out a range of PlayStation 5 and Xbox Series X games on my own FW900 and show off how the latest OLED screens from LG are able to compete against one of the key strengths of CRT - but first of all, let's go back to basics. What makes the FW900 so special? Put simply, it's Sony deploying its Trinitron tech to maximum effect, with a relatively large 24-inch 16:10 screen. It's able to process virtually any resolution up to 2560x1600 at 60Hz, and if you reduce the resolution, it's possible to increase the refresh rate - so yes, high refresh rate gaming is possible. The downsides? A 24-inch screen is small by today's standards, yet the FW900 is still a huge, desk-dominating box, and at 42kg it is hardly portable.
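The resolution/refresh trade-off comes down to the horizontal scan frequency the tube can sustain: more scanlines per frame, or more frames per second, both push that figure up. As a rough sketch of the arithmetic, here's a small Python check - note that the ~121kHz horizontal limit and the ~7 per cent vertical blanking overhead are assumptions based on commonly cited FW900 specs, not manufacturer-verified figures:

```python
# Rough check of whether a video mode fits within a CRT's horizontal scan limit.
# Horizontal frequency = total scanlines per frame (active + blanking) x refresh rate.

BLANKING_OVERHEAD = 1.07   # ~7% extra scanlines for vertical blanking (estimate)
MAX_H_FREQ_KHZ = 121.0     # assumed horizontal scan limit for the FW900

def horizontal_freq_khz(vertical_res: int, refresh_hz: float) -> float:
    """Estimate horizontal scan frequency in kHz for a given mode."""
    total_lines = vertical_res * BLANKING_OVERHEAD
    return total_lines * refresh_hz / 1000.0

def mode_fits(vertical_res: int, refresh_hz: float) -> bool:
    """True if the mode stays within the assumed scan limit."""
    return horizontal_freq_khz(vertical_res, refresh_hz) <= MAX_H_FREQ_KHZ

# 2560x1600 at 60Hz lands around 103kHz - comfortably within the limit.
print(mode_fits(1600, 60))   # True
# Drop to 1200 active lines and there's headroom for higher refresh rates.
print(mode_fits(1200, 85))   # True
```

This is why lowering the resolution unlocks higher refresh rates on the same tube: fewer scanlines per frame means each frame costs less of the fixed horizontal scan budget.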
Inputs are via VGA or BNC, where five sockets allow for individual connection of the red, green and blue signals along with horizontal and vertical sync. In terms of connecting a modern device, HDMI to VGA adapters are readily available, and 1080p60 is achievable even with cheap units. We've tested a Vention box from Amazon UK, which seems to allow for 1440p60 too. USB-C and DisplayPort adapters are also available which will get the job done for PC users.