Thread: 720p vs 1080i
#9
Old March 3, 2013, 11:01 AM
10e
Join Date: Jan 2008
Location: Markham, Ontario
Posts: 775

Originally Posted by sswilson
Was there ever a definitive response to which would provide better image quality, or are we still stuck with choosing between one or the other based on refresh rate (60 vs 30) and whether or not our normal viewing includes a lot of high-speed motion?

I'll keep looking, but if a member has a link to a paper with definitive answers I'd be greatly appreciative.

Device = Motorola DCX3400-M, and the TV is an LG 1080p TV, so I'm assuming that the early issues with converting non-native signals no longer apply.
When viewing a 1080i signal on a 1080p LCD TV, the scaler deinterlaces the two fields and shows all lines at once at 30 fps, removing any "comb" effect you might otherwise see from the alternating lines. I believe much broadcast content these days is film-sourced at 23.976 fps anyway, so it doesn't matter much whether the action is fast or slow. Most motion issues are due to Rogers' wonderful (sarcasm) compression algorithms (a result of their stretched/limited bandwidth) or poor TV electronics.
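To make the deinterlacing point concrete, here's a minimal sketch of the simplest "weave" approach a scaler can take: interleaving the two 540-line fields of a 1080i frame back into one progressive frame. The function name and tiny array sizes are purely illustrative, not any real TV firmware; it just shows why lines "comb" when the scene moves between fields.

```python
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two fields into one progressive frame.

    top_field carries the even-numbered lines (0, 2, 4, ...),
    bottom_field the odd-numbered lines (1, 3, 5, ...).
    """
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines from the top field
    frame[1::2] = bottom_field   # odd lines from the bottom field
    return frame

# Tiny 4x4 example: if the two fields were captured at different
# instants and the scene moved between them, alternating lines
# disagree -- that disagreement is the "comb" artifact.
top = np.zeros((2, 4), dtype=np.uint8)         # object at position A
bottom = np.full((2, 4), 255, dtype=np.uint8)  # object moved before field 2
frame = weave(top, bottom)
print(frame.shape)  # (4, 4), with alternating 0/255 rows
```

A real deinterlacer does more than this (motion-adaptive blending, cadence detection), which is why the comb effect mostly disappears on a decent 1080p set.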

Back when CRT was king, 720p was hands down better than 1080i, IMHO. 30 fps interlaced was torture for me due to the more prevalent flicker: weather channels, sports tickers, and other generated graphics bothered me the most.

These days the only issue for gamers with non-native signal up-conversion is the introduction of (or increase in) input lag. Wii players noticed this the most on some TVs a few years ago.

If the box is 1080i, just use that.
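As a back-of-the-envelope check on the 60 vs 30 trade-off from the quoted question, the raw luma pixel rates of the two formats are actually close (this ignores chroma subsampling and MPEG compression, which dominate real broadcast bitrates):

```python
# 720p delivers full 1280x720 frames, 60 per second.
p720 = 1280 * 720 * 60     # 55,296,000 pixels/s

# 1080i delivers 60 fields/s, i.e. 30 full 1920x1080 frames/s.
i1080 = 1920 * 1080 * 30   # 62,208,000 pixels/s

print(p720, i1080)
```

So neither format has a big raw-bandwidth edge; the practical difference is temporal resolution (720p) versus spatial resolution (1080i), which is why fast-motion content was the usual tiebreaker.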

Xeven: How about 10^8.450980400142567e-001 -as a possible replacement for "10e"