HD vs Full HD: What’s the Difference?

What is the difference between HD (HD ready), FHD, UHD and 4K?

The main difference between HD, FHD, UHD, and 4K televisions is resolution: each step up packs more pixels into the screen, so a 4K television produces a noticeably sharper picture than an HD television of the same size.


HD

HD, or High Definition, has a resolution of 1280 x 720 pixels (just under 1 megapixel). HD is also referred to as HD Ready and 720p.


Full HD

Full HD or FHD is the resolution currently found on most televisions, Blu-ray players, and video content. The image is 1920 pixels wide and 1080 pixels high (2.07 megapixels). Full HD is also referred to as 1080i and 1080p.


Ultra HD

Ultra HD, also known as UHD, is increasingly common on televisions, media players, and video content. The image resolution is 3840 x 2160 (8.3 megapixels). TVs with Ultra HD resolution display 4 times as many pixels as Full HD.



4K

4K has a resolution of 4,096 × 2,160 (8.8 megapixels), slightly higher than UHD.

Better resolution at the same image size

A higher-resolution TV (e.g., UHD vs. FHD) displays a greater number of pixels and can show more detail in an image of the same size. Materials, textures, the actors’ skin, the backgrounds: everything looks richer and more realistic.

At the same diagonal, the higher the definition, the more precise and detailed the image. A UHD (4K) television has 4x as many pixels as a Full HD television to display the same scene on the same surface; at the same time, the linear resolution (pixels along each axis) is doubled.
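The pixel arithmetic above is easy to verify. This short Python sketch uses the resolution figures quoted in this article to show the "4x the pixels, 2x the linear resolution" relationship:

```python
# Common display resolutions (width, height) in pixels, as quoted above.
resolutions = {
    "HD (720p)": (1280, 720),
    "Full HD":   (1920, 1080),
    "Ultra HD":  (3840, 2160),
    "DCI 4K":    (4096, 2160),
}

# Total pixel counts in megapixels.
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} megapixels")

fhd = 1920 * 1080
uhd = 3840 * 2160
print(uhd / fhd)    # 4.0  -> UHD has 4x the pixels of Full HD
print(3840 / 1920)  # 2.0  -> ...but only 2x the linear resolution
```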

A larger image at equal distance

This is the second advantage of switching to a higher-definition image: with more pixels in the image, you can enjoy a larger picture while keeping the same distance between the screen and the viewers, without the individual pixels becoming visible.

For example, you can move without fear from a 32-inch (81 cm) 1080p television placed 2.5 m from the viewers to a 65-inch (165 cm) UHD television, without having to push the sofa back. The image will be much larger and the immersion much better, but the pixels that make up the image will remain invisible to viewers at this distance.

Note: To take full advantage of a 4K image, the entire signal chain must be capable of encoding, transmitting, receiving, decoding, and displaying 4K. In other words: the camera used to record the film must be 4K, the signal must be transmitted in 4K, the receiver (terrestrial, cable, or satellite) must accept 4K, the cables (between the box and the TV, for example) must support 4K, the decoder must output 4K, and the display panel must be 4K.


HD Ready vs. Full HD vs. Ultra HD: What’s the Difference? Explained

Virtually every television available today supports high-definition (HD) video. But there’s still a bit of jargon to wade through when it comes to display technology. Particularly, you might get confused on the differences between the terms HD Ready, Full HD, and Ultra HD.

Let’s take a look at the distinction between HD Ready and Full HD, how they compare to Ultra HD, why these terms are used, and what they mean in practical use.

HD Ready vs. Full HD

In the most basic terms, HD Ready TVs (and set-top boxes) are capable of displaying 720p video, which is 1280×720 pixels. Full HD TVs and boxes can show 1080p video, which is 1920×1080 pixels. The HD Ready standard came about in Europe around 2005, so that people could be sure they were buying TVs that actually supported HD.

However, it’s not quite this simple. Depending on where you live, the definition of HD Ready is slightly different. Specifically, the US and Europe define it differently.

In the US, HD Ready for a TV means that the display can output 720p images. In most cases, this also indicates that the TV has a built-in digital tuner, which is necessary to accept digital TV broadcasts (which have largely replaced analog signals). This same HD Ready logo is also printed on several projectors, computer monitors, and other devices, which don’t have a tuner.

In Europe, the HD Ready logo doesn’t mean that a TV has a digital tuner. The display must be able to show 720p video to earn the HD Ready logo, but the sticker indicates only that display capability.

There are other logos/stickers used in the past that aren’t as common now. HD Ready 1080p means that the TV is capable of outputting 1080p video without distortion, while HD TV 1080p means the 1080p-capable TV also has a digital tuner.

Worldwide, the golden Full HD 1080p logo is a standard that denotes the display can show 1080p images. It does not indicate anything about a digital tuner, but in the US, most Full HD TVs have one.

What Is High-Definition? 720 vs. 1080 Explained

Logo aside, what is the actual difference in the quality?

TVs show video as a series of lines; resolution is simply the number of pixels that make up a display, both horizontally and vertically. The shorthand numbers used for resolution (720p and 1080p) refer to the vertical resolution: how many horizontal rows of pixels your TV can display at one time.

1920×1080 resolution (1080p) means that there are 1920 pixels horizontally and 1080 pixels vertically. 720p resolution is 1280×720 pixels. Having a higher resolution results in a sharper image, because there’s more information on the screen at once.

Image Credit: Raskoolish/Wikimedia Commons

As you can probably tell from the discussion above, “HD” isn’t a well-defined term. Technically, high definition just means anything that’s better than standard definition. In the US, standard definition is 480i (640x480px). In many other places in the world, standard definition is 576i (768x576px).

Read about the differences between NTSC and PAL for more about the history of these resolutions.

Interlaced vs. Progressive Displays

In addition to the resolution, it’s also important to know the scanning type of the display. There’s a difference between 1080p and 1080i; they don’t use the same technology to display video.

The p in a display type stands for progressive scan, while the i stands for interlaced scan. In progressive scan, the video displays all lines in a given frame (one image of the video) at the same time.

In interlaced scan, each frame is divided into two fields. One field contains all the even-numbered lines, while the other has all the odd-numbered ones. These two fields alternate quickly enough that the human eye perceives continuous motion.

Interlaced video conserves bandwidth, and was thus used in older analog TV broadcasting. While efficient, it’s also more susceptible to distortion, especially for fast-moving video. In the US, most TV broadcasts today are either 1080i or 720p, with the latter preferred for sports since they move quickly.

A 1080p (“Full HD”) TV can display progressive scan HD signals from video game consoles, Netflix streaming, and similar. These TVs can also show interlaced signals, but since the process of deinterlacing isn’t perfect, you can sometimes spot imperfections.

An HD Ready TV might mention that it can display 1080i video, but this isn’t quite the same as “Full HD,” as we’ve seen.

Where Will You See the HD Ready and Full HD Logos?

You’ll typically see the HD Ready or Full HD logo on TVs, but they show up on other similar gadgets too. These include projectors and monitors, as well as set-top boxes.

Remember that video will play at the lowest resolution supported by any device in the chain. For example, if your TV is Full HD (1080p), but your set-top box is only HD Ready (720p), your TV will show 720p video. A PlayStation 4 capable of outputting in 1080p won’t be able to show that 1080p video on a 720p TV.
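The "weakest link" rule above can be sketched as a tiny helper function. This is an illustrative model only; the device names and the `effective_resolution` function are hypothetical, not part of any real API:

```python
def effective_resolution(chain):
    """Return the lowest vertical resolution (in lines) supported
    by any device in a playback chain: the weakest link wins."""
    return min(max_lines for _, max_lines in chain)

# Hypothetical chain matching the article's example:
# a 1080p-capable source going through a 720p set-top box.
chain = [
    ("PlayStation 4", 1080),  # source can output up to 1080p
    ("set-top box",    720),  # HD Ready: 720p maximum
    ("TV panel",      1080),  # Full HD panel
]
print(effective_resolution(chain))  # 720
```

The viewer sees 720p even though both the source and the panel are 1080p-capable, which is why every device in the chain matters, not just the TV.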

Some TVs will attempt to upscale the video, but this is a workaround that doesn’t always result in better-quality images.

Are “HD Ready” and “Full HD” Relevant Today?

We’ve explained this so that you understand the distinction between these terms mostly used for marketing. But today, you don’t really need to worry about the “HD Ready” or similar tags on most devices.

720p resolution has become the default minimum for nearly every display device. If you’re buying a TV, monitor, projector, or anything like that, it will almost certainly support 720p video at least. Unless it’s extremely cheap, chances are that it supports 1080p as well; the Full HD tag lets you know for sure.

But when considering a purchase, you should go beyond these stickers and check the actual product details of a display before you buy it. Online, look in the specifications for a field titled Resolution or similar, which should have a value like 720p or 1920×1080. When in a store, look at the device’s box or ask an employee for more details.

In general, unless you’re looking to spend as little money as possible, we don’t recommend buying any display that’s under 1080p. While 720p is still referred to as “HD,” 1080p is the HD standard in most people’s minds. It’s used for Netflix streaming, Blu-ray discs, game consoles, and similar.

What About 4K and Ultra HD?

After HD became the baseline, new technology has brought us even better display options. 4K TVs, monitors, and other displays are now affordable for most people. In most cases, you can treat “4K” and “Ultra HD” as interchangeable.

As a result, you may see stickers labeled Ultra HD or 4K Ultra HD on TVs, monitors, and projectors now. Like “HD,” the “4K” moniker is not an exact standard. It refers to any resolution that has around 4,000 pixels horizontally, but the exact count differs between TV and cinematography usage.

Read more: How 4K TV Resolution Compares to 8K, 2K, UHD, 1440p, and 1080p

4K TVs are typically 3840x2160px, which is exactly four times the number of pixels in a 1080p display. In addition to the 4K or Ultra HD name, this resolution is sometimes called 2160p, in line with lower-resolution naming conventions.

See our comparison of 4K and Ultra HD for more info. At even higher resolutions, there’s also 8K Ultra HD (sometimes called Full Ultra HD), which is 7680x4320px. However, 8K resolution is rarely seen in actual use so far and will take some time to be adopted.

Other Measures of TV Quality

Now you understand the differences between HD Ready and Full HD, and how these compare to Ultra HD. In a lot of ways, these terms are outdated since 1080p and 4K TVs are readily available and affordable now. Either way, you shouldn’t buy a TV without checking the specific product details; don’t go off these marketing stickers alone.

Remember that the resolution is only one factor that goes into the quality of a TV, too. You should consider the viewing angles, features, HDR support, and similar when buying a new display.

Image Credit: semisatch/Depositphotos, Rubenlodi/Wikimedia Commons

HD vs Full HD: what’s the difference?

You’ve probably seen the abbreviations “HD” and “FHD” more than once. Everything seems simple: the latter means a higher-quality picture. But how is this quality actually measured? And is everything really so clear-cut?

Even a user with no particular interest in the intricacies of technology, when choosing a smartphone, tablet, or laptop, will see a “Full HD” sticker and assume the device has a good display. A more experienced user can even explain why Full HD is better than HD: the first term denotes screens with more image detail than the second.

However, Full HD has other advantages that few people know about. What are they? And is “Full HD” a separate quality standard at all, as many of us are used to thinking, or is it just a marketing ploy?


  • 1 What is screen resolution
  • 2 Why isn’t Full HD really a separate standard?
  • 3 Comparison of Full HD and HD
  • 4 Other resolution standards

What is screen resolution

The only thing that needs to be understood when talking about resolution is that pixels are the smallest indivisible objects that form an image on the screen. Screen resolution, then, is the number of pixels (luminous dots) a display has along each axis. It is written quite simply, for example “1920 x 1080” or “1280 x 720”, where the first number is the pixel count horizontally and the second is the pixel count vertically.

For a visual comparison, imagine two screens, each with a diagonal of 6.36 inches. One has FHD resolution (1920 x 1080), the second HD (1280 x 720). The FHD screen will clearly produce a sharper, more detailed picture than the one with an HD matrix. This is easily explained: at equal display sizes, the first one has more pixels both horizontally and vertically (1920 versus 1280 and 1080 versus 720, respectively), so the picture is divided into smaller elements than the human eye can distinguish.

But is it possible to calculate how much one screen is superior to another? There is a value called pixel density (“PPI”, Pixels Per Inch). It is the ratio of the diagonal resolution in pixels (for 1920 x 1080, for example, √(1920² + 1080²) ≈ 2202.9) to the screen diagonal in inches, i.e., the number of pixels per inch.

Let’s go back to the example above and calculate the PPI for each screen. The HD screen has a density of about 231, while the FHD matrix boasts about 346. It turns out that at an equal diagonal, the difference in pixel density (clarity and detail) reaches almost 34%, which is very noticeable. And if we count the total pixels in each screen, the gap is almost 56%: 2.07 million versus 921.6 thousand for the FHD and HD panels, respectively.
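The PPI calculation described above is straightforward to reproduce. Here is a minimal Python sketch using the article's 6.36-inch example (the `ppi` helper is our own illustration, not a standard library function):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixel density: diagonal length in pixels divided by
    diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The article's example: two 6.36-inch screens, HD vs FHD.
hd  = ppi(1280, 720, 6.36)   # ~231 PPI
fhd = ppi(1920, 1080, 6.36)  # ~346 PPI
print(round(hd), round(fhd))
print(f"density difference: {(fhd - hd) / fhd:.0%}")  # ~33%
```

This matches the figures in the text: roughly 231 vs 346 PPI, a difference of about a third at the same diagonal.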

While screen resolution and diagonal are almost always listed by manufacturers, a parameter like PPI is often omitted. Yet it is an important figure worth checking, for example when choosing a smartphone: even an FHD screen may fall short of expectations if the diagonal is too large, and conversely, HD or HD+ displays may look perfectly fine on smaller panels. There are many online calculators for PPI.

Why isn’t Full HD really a separate standard?

One interesting fact helps answer this question. In 2007, Sony introduced the name “Full HD” (Full High Definition) for some of its products with a resolution of 1920 x 1080 pixels. The step was needed to separate this resolution from the much lower 1280 x 720. The 1920 x 1080 resolution itself had already been approved by EICTA in 2005. Even so, 15 years later we are still used to calling 1920 x 1080 “Full HD” or simply FHD.

Depending on the aspect ratio of the screen, pixels can be square or rectangular. The “standard” 1920 x 1080 is 16:9, but other FHD variations exist: 1440 x 1080 (on a 4:3 screen), 3840 x 1080 (32:9), and more.

Comparison of Full HD and HD

Now that we have established the main difference between the two resolutions, we can dwell on other details. What are the advantages and disadvantages of Full HD screens over HD? We list all aspects:


Advantages:

  • More than 2 million pixels. This resolution provides comfortable viewing of content not only on a mobile device, but also on a laptop, monitor, or even some TVs.
  • A huge amount of equipment with FHD screens. Displays with this resolution are very common across a wide variety of devices from well-known manufacturers.
  • Support and compatibility. FHD is a baseline quality for content on the Internet, in advertising, and elsewhere, and it is well supported by common codecs such as H.264.
  • Noticeably better clarity. A 1080p screen has 2.25x as many pixels as a 720p screen, so it can display considerably more detail.


Disadvantages:

  • Requires more computing power. For smartphones or laptops, outputting a higher-quality FHD image puts more load on the device. This is especially noticeable in games: rendering at a higher resolution forces the GPU to process more data.
  • Consumes more Internet traffic. When streaming video in FHD, the requirements for connection speed increase, because a 1080p picture carries more information to download.

Other resolution standards

Now that we understand how image quality is measured, what resolution is, and why pixel density is a universal measure of clarity, we can finish by looking at other standards.

In addition to those discussed above, Quad HD (2560 x 1440) and Ultra HD (3840 x 2160) are among the most common resolutions. QHD screens display about 3.69 million pixels, and for UHD displays this figure reaches 8.29 million (or even 8.8 million for Cinema 4K, a format common in cinema). A 12K resolution, with a pixel count exceeding 74.64 million, also exists today.

Ultra HD vs Full HD: What’s the Difference?

What does Ultra HD mean?

Ultra HD is the marketing name for an ultra high definition format with a resolution of 3840×2160 pixels. This means that the screen of a TV, monitor, display or projector contains 3840 horizontal pixels and 2160 vertical pixels: about 8.3 million pixels in total. The resolution of 4K screens is usually specified as 3840 x 2160.

What does Full HD mean?

Full HD, also often referred to as 1080p, is the marketing name for an older high definition format at 1920×1080 pixels. The term usually implies a 16:9 widescreen aspect ratio, which works out to about 2.1 megapixels. Screens are often sold as Full HD to distinguish 1080p from 720p (1280×720) screens, which are simply labeled HD.

To simplify: Ultra HD differs from Full HD in higher image quality and clarity. Ultra HD has 4 times as many pixels as Full HD (1920 x 1080). The higher pixel density makes the image crisp and fine details more visible.

The difference between Ultra HD and Full HD is especially noticeable on large screens. Compare the picture quality of a regular 65″ Full HD screen with a 65″ Ultra HD screen of the same size. The high-resolution (Ultra HD) screen has smaller pixels than the lower-resolution (Full HD) screen. With an Ultra HD screen you can therefore enjoy full immersion at almost any viewing distance; high resolution is also especially useful for displaying drawings, tables, graphs, and small text.

What does 4K mean?

The most common resolution for new multimedia hardware is now 4K. Many people are familiar with the concept, but the marketing tricks of some manufacturers have led to confusion in the naming of resolutions. And those who claim that 4K and Ultra HD are not the same thing are absolutely right.

The problem is that the term 4K often refers to consumer TVs with a screen resolution of 3840×2160 pixels, although technically speaking, 4K means a horizontal resolution of 4096 pixels. That resolution is specified by Digital Cinema Initiatives (DCI), a consortium of film studios, in the standards that describe the requirements for digital cinema projection and digital cinema cameras. Since movies differ in aspect ratio from the TV screen, in some cases it is important to account for this difference, as well as the difference between 1080p and 2K (2048×1080).
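The aspect-ratio difference between the TV and cinema formats mentioned above can be seen with a little arithmetic. A quick Python sketch (resolution values as quoted in this article):

```python
# Aspect ratios of the TV and DCI cinema formats discussed above.
formats = {
    "Full HD":  (1920, 1080),
    "DCI 2K":   (2048, 1080),
    "Ultra HD": (3840, 2160),
    "DCI 4K":   (4096, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w}x{h}, aspect ratio {w / h:.3f}")

# Ultra HD keeps the TV's 16:9 ratio (~1.778);
# the DCI formats are wider (~1.896) at the same height.
```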

So yes, technically Ultra HD screens are not 4K, since their resolution is 3840 x 2160. But in everyday use, outside of film production and theaters, the difference doesn’t really matter: platforms such as Amazon, Netflix, and Apple stream in Ultra HD (3840×2160), and that is quite enough.

| VESA designation | Horizontal pixels | Vertical pixels |
|---|---|---|
| HD 720 | 1280 | 720 |
| WXGA | 1280 | 800 |
| WXGA | 1366 | 768 |
| Full HD 1080 | 1920 | 1080 |
| WUXGA | 1920 | 1200 |
| WQXGA | 2560 | 1600 |
| Ultra HD | 3840 | 2160 |
| 4K | 4096 | 2160 |
| 8K UHD 4320p | 7680 | 4320 |

4K Zoom

Of course, it’s great when you can see content in 4K or Ultra HD on the screen. But not all content is available in 4K yet, and not all media players, even those built into Ultra HD displays, are capable of playing video files at this resolution.