From CRT to Ultra HD: the evolution of screen resolutions

Year after year, the resolution of device screens keeps increasing, making the displayed picture ever sharper: HD, 4K, 8K… Yet it all started with black-and-white TVs whose picture was smaller than a matchbox. In this article, we will trace the evolution of screens and look at what the future holds for the technology.

Early television
The first mass-produced TVs were arguably the black-and-white American Visionette sets, released in 1929. The screen on the first models was about the size of a postage stamp, and a magnifying glass was mounted in front of it so the viewer could make out anything at all. These were mechanically scanned sets: the image was built up by a slotted disc spinning continuously, driven by an electric motor. The concept of “screen resolution” in the literal sense did not exist yet. Image quality was measured in scan lines, and this model had 45 of them: just enough to recognize individual objects or faces shot in extreme close-up.

TVs gained real popularity only with the advent of picture tubes. In 1934, the German company Telefunken began producing the first mass-produced CRT TVs with 120 scan lines, enough to transmit an image roughly equivalent to a resolution of 160 x 120 pixels. In practice, the picture carried slightly less detail than that, owing to the peculiarities of interlaced scanning. Televisions began to resemble the models we are used to: devices with a diagonal of 30 cm or more, though in bodies that were huge by modern standards. Within a couple of years, similar models were being produced in the USA, Great Britain and France, and in 1940 production of such TVs was also established in the Soviet Union. For the next half century, picture tubes, or cathode ray tubes (CRTs), would remain the basis for most devices with screens.

Classic era: SECAM, PAL, CRT
In 1953, the NTSC standard for television broadcasting was developed and approved. It allowed color broadcasting with 480 scan lines at a field rate of 60 Hz: the first field carried the 240 odd lines, the second the 240 even ones. This interlaced scheme produced a 640 x 480 image at an effective 30 frames per second, and the same principle was used by all televisions, including the SECAM and PAL systems that appeared in the 1960s. In those standards, the maximum image size was 720 x 576 pixels at 50 Hz and 25 frames per second. CRT TVs with this resolution remained in use until the early 21st century, and the DVD-Video standard is based on exactly these resolution and frame-rate parameters.
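
As a minimal sketch of how interlacing works (simplified, not a broadcast-accurate model), each full 480-line frame arrives as two 240-line fields that are woven together, which is why a 60 Hz field rate yields an effective 30 frames per second:

```python
import numpy as np

HEIGHT, WIDTH = 480, 640  # the effective NTSC frame size discussed above

def weave_fields(odd_field: np.ndarray, even_field: np.ndarray) -> np.ndarray:
    """Interleave two 240 x 640 fields into one 480 x 640 frame."""
    frame = np.empty((HEIGHT, WIDTH), dtype=odd_field.dtype)
    frame[0::2] = odd_field   # lines 1, 3, 5, ... come from the first field
    frame[1::2] = even_field  # lines 2, 4, 6, ... come from the second field
    return frame

# Two fields make one frame, so 60 fields per second -> 30 full frames per second.
```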

For a long time, TV manufacturers saw no reason to improve picture quality. Progress came instead from computers, which demanded ever higher resolutions for comfortable work with the text elements of the operating system. In the early nineties, CRT monitors reached 1024 x 768, and a little later, with the spread of 15- and 17-inch models, 1280 x 1024 and 1600 x 1200. The latter, however, saw little use in those years: until the advent of LCD monitors, most computers ran at a fairly modest desktop resolution of 1024 x 768 or 1280 x 1024 pixels.

However, only personal computers could make use of such resolutions. Game consoles connected to a TV produced a very low-resolution picture, inferior even to the television standards of the 1960s. For example, the NES (known in Russia as the Dendy) output only 256 x 240 pixels in PAL mode, while the SEGA Mega Drive and the Sony PlayStation managed up to 320 x 240.

LCD panels and the HD era
The beginning of the 21st century was marked by the mass adoption of LCD panels. The first such monitors suffered from slow panels and poor contrast, but they took up far less space than CRTs. They also made mainstream laptops and mobile phones with color screens possible.

Early LCDs were rarely used in televisions; at first, manufacturers opted for plasma panels, which offered good colors and a contrasty image. There were drawbacks, of course: plasma TVs consumed a lot of power, weighed a lot and were prone to screen burn-in. Most importantly, such panels could only offer a resolution of 1024 x 768 pixels. That was not enough: already in 2005 the HD Ready standard was being introduced, requiring a resolution of 1280 x 720 or higher. LCD panels, meanwhile, had matured enough to deliver the necessary resolution, so a full transition to them was only a matter of time. The arrival of the Full HD standard in 2007 was the final nail in the coffin of plasma panels: there was simply no alternative left to LCD.

The first mainstream LCD computer monitors had a resolution of 1280 x 1024 at diagonals of 17-19 inches; such models were typical of 2002-2006. Later, widescreen monitors with a 16:10 aspect ratio appeared and resolutions grew: high-end models boasted 1920 x 1200, while simpler ones offered 1680 x 1050 or 1440 x 900 pixels. The latter were also used in premium notebooks, while budget laptops made do with a more modest 1280 x 800.

Resolution in game consoles also increased. The PlayStation 2 and the first Xbox rendered the bulk of their games at up to 720 x 576 pixels, while the next-generation Xbox 360 and PlayStation 3 were already capable of true HD gaming at 1280 x 720.

Mobile phone screens also developed throughout the 2000s. In the first models with color LCD displays, the resolution did not exceed 128 x 96 pixels. By 2005-2006, typical resolutions ranged from 176 x 208 to 320 x 240 pixels, which helped multimedia features and mobile games flourish. The first mainstream Symbian smartphones used these resolutions as well, and until 2008, 320 x 240 remained the standard for mobile devices.

LED, OLED, 4K
In the 2010s, TVs and monitors began to be equipped with backlighting based on light-emitting diodes (LEDs); before that, large panels used fluorescent lamps. The new technology improved the contrast and color gamut of the panels.

By this time, the 16:10 format was considered obsolete, and new monitors came only with 16:9 panels: 1920 x 1080, 1600 x 900 and 1366 x 768. This format is still used in most inexpensive laptops, TVs and computer monitors.

The smartphone segment evolved as well: touchscreen models went truly mainstream. Since they interacted with the user through the screen rather than a keyboard, the screens naturally began to grow in size. Typical resolutions of the new form factor in 2008-2011 were 480 x 320, 640 x 360, 800 x 480 and 960 x 540.

In the 2010s, mobile devices also received a new type of panel based on organic light-emitting diodes (OLED). The HD era began for smartphones: models with 1280 x 720 screens appeared, and flagships soon moved to 1920 x 1080. In 2015, the first models arrived with resolutions of 2560 x 1440 and even 3840 x 2160, which corresponds to the 4K standard. Manufacturers quickly realized, however, that chasing huge resolutions on small screens was pointless: most smartphones now ship with panels that are 1080 pixels across the narrow side of the screen (so-called Full HD+).

By the mid-2010s, OLED had made its way into televisions, though it competes with LED-backlit LCDs only in the upper price range: the new technology is not cheap.

The eighth-generation PlayStation 4 and Xbox One consoles, released in 2013, finally delighted users with Full HD gaming. However, as the graphics of new games grew ever more demanding on the consoles' hardware, many modern titles run on them at much more modest resolutions: 1600 x 900 or even 1280 x 720.

In 2014, a new resolution standard appeared: 4K. It specifies 3840 x 2160 pixels, exactly four times as many as the popular Full HD. By now, TVs supporting the format are available even in the entry-level segment. 4K computer monitors exist as well, but they are still fairly rare in everyday use; in that segment, so-called 2.5K models with a resolution of 2560 x 1440 are currently gaining ground.
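
The arithmetic behind the “four times Full HD” claim is easy to check (the resolution figures are the ones quoted in this article):

```python
# Pixel counts for the resolutions mentioned above
full_hd = 1920 * 1080   # 2,073,600 pixels
qhd_25k = 2560 * 1440   # 3,686,400 pixels (the so-called 2.5K)
uhd_4k  = 3840 * 2160   # 8,294,400 pixels

print(uhd_4k / full_hd)   # 4.0  -> 4K carries exactly four times the pixels of Full HD
print(uhd_4k / qhd_25k)   # 2.25 -> and 2.25 times the pixels of a 2560 x 1440 panel
```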

With the spread of the 4K format, Sony and Microsoft released improved versions of their eighth-generation consoles: the PlayStation 4 Pro and the Xbox One X, launched in 2016 and 2017. Despite nominal 4K support, however, many games render at lower resolutions on these machines: some upscale from 2560 x 1440, others use checkerboard rendering, in which only every other pixel of the frame is shaded each pass and the missing ones are reconstructed to reach the target resolution (see the sketch below).
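
A very rough sketch of the checkerboard idea, assuming the simplest possible reconstruction (real consoles use far more sophisticated methods, so this is purely illustrative):

```python
import numpy as np

def checkerboard_merge(prev_frame: np.ndarray, new_pixels: np.ndarray, even_phase: bool) -> np.ndarray:
    """Combine the freshly shaded half of the pixels (laid out in a checkerboard)
    with the other half taken from the previous full-resolution frame."""
    h, w = prev_frame.shape[:2]
    ys, xs = np.indices((h, w))
    mask = ((ys + xs) % 2 == 0) if even_phase else ((ys + xs) % 2 == 1)
    out = prev_frame.copy()
    out[mask] = new_pixels[mask]  # half the pixels are new, the rest are reused
    return out
```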

8K, 16K … What’s next?
In 2020, Sony and Microsoft released their ninth-generation consoles: the PlayStation 5 and the Xbox Series X/S. Both support image output in 8K Ultra HD. Although games on them actually run at 4K at most, the course toward higher resolutions has been set.

8K TVs and monitors are already on sale. Admittedly, there are not many models yet, and they cost a fortune, but the moment they go mainstream is not far off, and then 16K devices will start gaining momentum… Progress cannot be stopped. But does such a resolution even make sense?

Take the most popular diagonal for modern TVs, 55 inches, as an example. At Full HD resolution, the picture looks perfectly sharp from about 2.5 meters away, which is roughly the distance most people watch a TV of this size. With a modern 4K model of the same diagonal, you can see a perfectly smooth image from as close as about 120 centimeters.

Go further and buy an ultra-modern 8K model, and the threshold halves again: now the picture looks perfectly sharp from about 60 cm. Could you push resolution even higher? You could, of course, but from a normal viewing distance you would no longer see any difference from 4K, or in many cases even from Full HD. And this is for devices with really large screens; on a computer monitor or smartphone with an ultra-high-resolution panel, the difference is visible only if you literally press your nose against the screen.
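
These distances can be estimated with a back-of-the-envelope calculation, assuming the common rule of thumb that the eye resolves detail down to about one arcminute (the exact numbers depend on individual eyesight, so treat this as an approximation):

```python
import math

def max_sharp_distance_m(diag_inches: float, horizontal_pixels: int) -> float:
    """Distance beyond which adjacent pixels of a 16:9 screen blend together."""
    width_m = diag_inches * 0.0254 * 16 / math.hypot(16, 9)  # screen width in meters
    pixel_pitch_m = width_m / horizontal_pixels               # size of one pixel
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch_m / math.tan(one_arcminute)

for name, px in [("Full HD", 1920), ("4K", 3840), ("8K", 7680)]:
    print(name, round(max_sharp_distance_m(55, px), 2), "m")
# Roughly 2.2 m, 1.1 m and 0.55 m for a 55-inch screen: close to the figures above.
```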

The use of 4K and 8K on computers is further complicated by the fact that rendering 3D graphics at high resolutions demands many times more hardware power. That is why even owners of high-performance gaming PCs often prefer 2560 x 1440 monitors: the difference from 4K at a normal viewing distance is not striking, but game performance is much higher.

TVs, meanwhile, have already made the transition to 4K but are in no hurry to move to 8K. Most users would simply not see the difference, since few people watch TV from less than a meter away, and there is practically no content in that resolution anyway.

Conclusions
So why do we need ultra-high resolutions if there is no real benefit from them today? In fact, they are a serious reserve for the future. Ultra-high-resolution panels will be useful in next-generation VR headsets, which will finally get rid of the visible pixel grid. Large 8K and 16K panels will find use in video walls installed in shopping centers: at last, you will be able to walk right up to such a screen and see a clear image rather than blurry frames. Further ahead, such resolutions will find application in AR windshields of cars.

8K is also useful for shooting video: downsampling footage of this resolution yields a better-looking 4K recording. There are plenty of applications, so the further development of ultra-high resolutions is inevitable.
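
As a minimal illustration of why 8K footage downsamples so cleanly to 4K: every output pixel can be averaged from an exact 2 x 2 block of source pixels (a simple box filter; real video pipelines use more sophisticated scalers, so this is only a sketch):

```python
import numpy as np

def downsample_2x(frame: np.ndarray) -> np.ndarray:
    """Halve an (H, W, 3) frame in each dimension by averaging 2 x 2 pixel blocks,
    e.g. 7680 x 4320 (8K) -> 3840 x 2160 (4K)."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

demo = np.random.rand(8, 8, 3)        # tiny stand-in; a real 8K frame would be 4320 x 7680
print(downsample_2x(demo).shape)      # (4, 4, 3): half the size along each axis
```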