What is the difference between 1080i and 1080p?

They’re the same format, right? Not quite: one is definitely better than the other.

With so many high-definition and ultra-high-definition resolution formats on the market, it can be hard to tell them apart. Take 1080i and 1080p, for example: from the outside, little or nothing is revealed about their attributes or differences.

High Definition (HD) refers to a screen resolution of 1920 pixels wide and 1080 pixels high (hence the use of “1080”). This means 1080i and 1080p have the same resolution. So what’s the difference between them? Keep reading to find out.

SEE ALSO: What is the difference between Xbox One S and Xbox One?

The difference between 1080i and 1080p

The first thing to note is that the letters in 1080i and 1080p refer to the raster scan technique used. A raster scan is simply how an image is reconstructed on a display monitor.

The “i” in 1080i stands for interlaced scan and the “p” in 1080p stands for progressive scan. These are two separate methods of producing an image on a screen at a resolution of 1920 x 1080. So if both formats have the same 2,073,600 pixels in total, what’s the difference?
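
As a quick sanity check on that number, here is a minimal Python sketch; it contains nothing beyond the arithmetic implied by the resolution itself:

```python
# Both 1080i and 1080p describe the same 1920 x 1080 pixel grid.
width, height = 1920, 1080
total_pixels = width * height
print(total_pixels)  # 2073600 -- identical for 1080i and 1080p
```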

Imagine your TV screen as rows of pixels. It is 1080 pixels high, so there are 1080 rows of pixels from the top of the TV to the bottom. The rate at which those pixels are redrawn is called the refresh rate. Most televisions and displays operate at a refresh rate of 60Hz (60 refreshes per second).

For video to work, every pixel on a digital screen must be refreshed quickly enough for us to perceive the result as movement (even though, technically, the screen is only flashing a sequence of still images).

The difference between 1080i and 1080p is how those pixels are updated to produce a consistent, easy-to-view “moving” image.

What is 1080i and how does it work?

Interlaced scanning produces an image by displaying the odd and even rows of pixels alternately. All the odd lines are refreshed 30 times per second, and all the even lines are refreshed 30 times per second, in alternating sequence.

Because the odd and even lines each refresh 30 times per second in alternation, an interlaced scan effectively doubles the perceived frame rate to 60 fields per second without using additional bandwidth.
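
One rough way to picture the two fields is the Python sketch below. It only illustrates the odd/even split, with a “frame” reduced to a list of row labels:

```python
# Treat a frame as a simple list of 1080 rows, numbered 1 to 1080.
frame = [f"row {i}" for i in range(1, 1081)]

# Interlaced scan splits the frame into two fields of 540 rows each.
odd_field = frame[0::2]   # rows 1, 3, 5, ... 1079
even_field = frame[1::2]  # rows 2, 4, 6, ... 1080

# Each field is refreshed 30 times per second, alternating, so the
# viewer sees 60 field updates per second while only half the rows
# are transmitted for any single update.
print(len(odd_field), len(even_field))  # 540 540
```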

The 1080i method was developed to counter a problem on older CRT screens: when the entire screen was refreshed from top to bottom too slowly, the top of the screen could end up showing half of a different image from the bottom. On those older screens, the top of the picture also became duller and less illuminated than the bottom by the end of each pass.

The interlaced scan format was especially important when technology was limited, and it was essential to use as little bandwidth as possible. For television broadcasting, it was an absolute necessity. But with the advent of better technology, 1080p has arrived.

1080i vs. 1080p

1080p is the format typically used on all modern screens and televisions. Instead of refreshing half the pixels at a time – as 1080i does – 1080p refreshes the entire screen at once. For this reason, 1080p is sometimes referred to as “true HD”.

With the entire screen refreshed in a single pass, 1080p effectively processes twice as much information as 1080i at the same frame rate. The refresh is usually a top-to-bottom “wave”, with each row redrawn in turn. On a 60Hz display, this means every row is refreshed within 1/60th of a second.
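
To make that timing concrete, here is a small back-of-the-envelope calculation, assuming the 60Hz figure mentioned above:

```python
refresh_rate_hz = 60       # typical display refresh rate
rows_per_frame = 1080      # 1080p redraws every row in every pass

frame_time = 1 / refresh_rate_hz          # one full frame every 1/60 s
time_per_row = frame_time / rows_per_frame

print(f"{frame_time * 1000:.2f} ms per frame")   # ~16.67 ms
print(f"{time_per_row * 1e6:.2f} µs per row")    # ~15.43 µs
```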

This is why 1080p requires more bandwidth than 1080i and why 1080i has been used more historically. Now that this is no longer a limitation, 1080p has become the main format for new digital displays.

Interestingly, many TV programs are still broadcast in an interlaced format, usually 1080i. This means that 1080p compatible displays must have a deinterlacing component to properly display the image and avoid visual artifacts.

Deinterlacing is the process of building a complete image from the two fields of alternating pixel rows that 1080i uses. When this happens, picture quality is somewhat reduced compared with true 1080p.
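
A deliberately naive “weave” style deinterlacer might look like the sketch below. It simply interleaves the two fields and ignores any motion between them, which is exactly why real deinterlacers are more complicated and why some quality is lost:

```python
def weave_deinterlace(odd_field, even_field):
    """Rebuild a full frame by interleaving the rows of two fields
    (a naive "weave" deinterlace; real hardware also has to
    compensate for motion between the two fields)."""
    frame = []
    for odd_row, even_row in zip(odd_field, even_field):
        frame.append(odd_row)
        frame.append(even_row)
    return frame

# Using the fields from the earlier sketch:
# full_frame = weave_deinterlace(odd_field, even_field)
# len(full_frame) == 1080
```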

What about 4K?

Most brand-new televisions and many computer monitors offer 4K capabilities. Known as “ultra-high definition”, 4K has a resolution of 3840 x 2160 pixels, four times the pixel count of 1080p or 1080i (and don’t get me started on 8K). This resolution brings a massive change in the quality, clarity and sharpness of the picture.
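
The “four times” figure falls straight out of the pixel counts:

```python
full_hd_pixels = 1920 * 1080   # 2,073,600
uhd_4k_pixels = 3840 * 2160    # 8,294,400
print(uhd_4k_pixels / full_hd_pixels)  # 4.0 -- exactly four times as many pixels
```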

But just as 1080p delivery is still limited by streaming technology, 4K streaming via cable or satellite is even more constrained. That said, major sporting events are now broadcast in 4K, which means such broadcasts will likely become more common over time.

One downside is that much 4K content is heavily compressed for more efficient transmission. This means that, most of the time, you are not experiencing true 4K.

READ ALSO: Difference between hacker, programmer, developer and security researcher

Which is better: 1080i or 1080p?

The main drawback of 1080i is how it displays fast movement. Since only half of the image is shown at a time, rapid movements can cause so-called “motion artifacts”: odd visual effects that result from displaying two fields captured at slightly different moments at the same time.
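
As a toy illustration of where the “combing” artifact comes from, the hypothetical sketch below prints the edge of an object that moved between the two fields. The column numbers are made up purely for illustration:

```python
# An object's edge sits at column 10 when the odd field is captured,
# but has moved to column 14 by the time the even field is captured.
# Weaving the fields back together leaves the edge alternating between
# the two positions on successive rows -- the jagged "combing" pattern
# seen on fast motion in interlaced video.
edge_in_odd_field = 10
edge_in_even_field = 14

for row in range(1, 7):
    edge = edge_in_odd_field if row % 2 else edge_in_even_field
    print("#" * edge)
```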

1080p avoids this problem, showing much better picture quality in fast-moving scenes. Additionally, 1080p is generally more vivid and lifelike, which most people prefer. The improved picture quality (often quoted as around 60% better in motion) comes from the fact that in 1080i the odd and even lines of pixels are never displayed simultaneously. In practical terms, that makes 1080i closer in perceived quality to 720p.

One problem, though, is that a lot of satellite and broadcast television is still delivered in the interlaced format, which means the full quality of 1080p is not what is actually being transmitted.

With constant technological improvements in this space, progressive scan is already becoming the primary format for digital displays. Ultimately, most shows will likely use the progressive scan format.


