1080i vs 1080p: What's the Difference?
Published on April 10, 2026
Both 1080i and 1080p display 1920x1080 pixels, but they draw those pixels differently. 1080p (progressive scan) renders every line of each frame sequentially, giving you a complete image every refresh. 1080i (interlaced scan) splits each frame into two fields of alternating lines, displaying odd lines first, then even lines. For most modern viewing, 1080p is the better choice because it produces a sharper, more stable image.
How Progressive Scan Works
With 1080p, the display draws all 1,080 horizontal lines from top to bottom in a single pass. Every frame is complete. This means fast-moving objects stay sharp, and there are no artifacts from blending two different time slices. All modern streaming services, Blu-ray discs, gaming consoles, and computer monitors output in progressive scan.
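To make the drawing order concrete, here is a tiny Python sketch (plain illustration, not a real video pipeline) of the order in which a progressive display refreshes its rows:

```python
def progressive_refresh(height=1080):
    """Row order for one progressive refresh: every line, top to bottom.
    Each refresh therefore carries a complete frame from a single instant."""
    return list(range(height))

rows = progressive_refresh()
print(len(rows))   # 1080 -- the whole image in one pass
print(rows[:4])    # [0, 1, 2, 3]
```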
How Interlaced Scan Works
1080i divides each frame into two fields. The first field contains lines 1, 3, 5, 7, and so on. The second field fills in lines 2, 4, 6, 8. These fields are captured at slightly different moments. When they combine on screen, static content looks fine, but fast motion can produce a visible "combing" artifact where the two fields do not line up perfectly. This is the signature flaw of interlaced video.
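The field split is easy to see in code. The NumPy sketch below uses a toy 8x8 "frame" and a hypothetical moving object rather than real video: it separates a frame into its two fields, then weaves together fields captured at two different moments, which is exactly where combing comes from.

```python
import numpy as np

def split_into_fields(frame):
    """Split a full frame into its two interlaced fields.
    Field 1 holds lines 1, 3, 5, ...; field 2 holds lines 2, 4, 6, ...
    (the article's 1-based line numbering; array indices are 0-based)."""
    field_odd = frame[0::2]   # lines 1, 3, 5, ... -> array rows 0, 2, 4, ...
    field_even = frame[1::2]  # lines 2, 4, 6, ... -> array rows 1, 3, 5, ...
    return field_odd, field_even

def weave(field_odd, field_even):
    """Recombine two fields into one frame. If the fields were captured at
    different moments, moving edges will not line up: the combing artifact."""
    height = field_odd.shape[0] + field_even.shape[0]
    frame = np.empty((height,) + field_odd.shape[1:], dtype=field_odd.dtype)
    frame[0::2] = field_odd
    frame[1::2] = field_even
    return frame

# Toy example: an "object" at column 2 in the first moment, column 4 a bit later.
frame_t0 = np.zeros((8, 8), dtype=np.uint8); frame_t0[:, 2] = 255
frame_t1 = np.zeros((8, 8), dtype=np.uint8); frame_t1[:, 4] = 255

odd_field, _ = split_into_fields(frame_t0)   # odd lines from the earlier moment
_, even_field = split_into_fields(frame_t1)  # even lines from the later moment
combed = weave(odd_field, even_field)        # alternating rows disagree: combing
print(combed[:4, :6])
```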
Bandwidth and Broadcasting
Interlaced scanning was invented to save bandwidth in the analog TV era. By sending half the lines at a time, broadcasters could deliver smooth motion within tight frequency limits. Many over-the-air TV stations still broadcast in 1080i because it requires less bandwidth than 1080p at the same frame rate, and major US networks such as CBS, NBC, and HBO deliver their HD feeds in 1080i.
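For a rough sense of the saving, compare raw pixel rates. This is only a back-of-the-envelope illustration using uncompressed samples; real broadcasts add chroma subsampling and compression on top.

```python
WIDTH, HEIGHT = 1920, 1080

# 1080p60: 60 complete frames per second
pixels_per_sec_1080p60 = WIDTH * HEIGHT * 60         # ~124.4 million samples/s

# 1080i60: 60 fields per second, each carrying half the lines
pixels_per_sec_1080i60 = WIDTH * (HEIGHT // 2) * 60   # ~62.2 million samples/s

print(pixels_per_sec_1080p60 / pixels_per_sec_1080i60)  # 2.0 -- half the raw data rate
```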
De-interlacing on Modern TVs
Modern displays are natively progressive. When they receive a 1080i signal, they de-interlace it by combining the two fields into a single frame. Good de-interlacing processors handle this seamlessly for most content. However, the processing adds a small delay, and fast action scenes may still show slight softness compared to native 1080p input.
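As one concrete illustration, here is a sketch of "bob" de-interlacing, one of the simplest strategies; real TV processors use motion-adaptive methods that are far more sophisticated than this.

```python
import numpy as np

def bob_deinterlace(field):
    """Build a full-height progressive frame from a single field by repeating
    each line to stand in for the missing ones ("bob" / line doubling).
    No combing is possible, because only one moment in time is used, but
    vertical detail is halved -- one reason de-interlaced 1080i can look
    slightly softer than native 1080p."""
    return np.repeat(field, 2, axis=0)

field = np.arange(4 * 6, dtype=np.uint8).reshape(4, 6)  # a tiny 4-line field
frame = bob_deinterlace(field)
print(frame.shape)    # (8, 6) -- full-height frame rebuilt from one field
print(frame[:4, :3])  # each source line appears twice
```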
Which Should You Use?
If you have a choice, always pick 1080p. It gives you a cleaner image with no combing artifacts and works natively with every modern screen. The only reason 1080i still exists is backward compatibility with broadcast infrastructure. For recording, editing, streaming, and gaming, 1080p is the standard. If you are comparing resolution options, also consider whether 4K makes sense for your use case.
Working with video files? Try our video to MP4 converter or video compressor. For more video format comparisons, see interlaced vs progressive, 720p vs 1080p, and 4K vs 1080p.