720p vs. 1080i
Was there ever a definitive answer to which provides better image quality, or are we still stuck choosing between one or the other based on refresh rate (60 vs. 30) and whether our normal viewing includes a lot of high-speed motion?
I'll keep looking, but if a member has a link to a paper with definitive answers, I'd greatly appreciate it.
The device is a Motorola DCX3400-M and the TV is an LG 1080p set, so I'm assuming the early issues with converting non-native signals no longer apply.
Are you asking about upscaling? It seems your 3400 does 1080, but what model is the TV? That would tell us whether its native resolution is 720 or 1080.
For me, on a 1080 TV I prefer 720p if it's high-speed action and 1080i otherwise. That's not a definitive answer, though. If I had to pick one over the other, I'd pick 720p.
Unfortunately that link is about 1080p vs. 720p; I'm looking for information on 1080i vs. 720p. The reason I mentioned upscaling is that it was one of the deciding factors back when 1080p TVs weren't the norm, so much of the discussion of the merits of 720p vs. 1080i appears to revolve around matching the input source with the native format of the TV.
A quick search seems to suggest that Rogers signal is for the most part put out at 1080i with only a few channels using the 720p format. As it stands I haven't found any setting that would allow me to pass the signal through to let the TV decode it.
I've found 1080i looks like poo compared to 720p; then again, that was on a 720p TV that accepts 1080i (HDMI output from a 2400Pro as the source).
Just to refresh my memory: the interlaced part of 1080i means that 30 times a second half of the scan lines are refreshed, alternating with the other half of the scan lines, to come up with 2 x 30 for 60 Hz? The assumption being that the eye will be drawn to the most recently refreshed scan lines and not concern itself all that much with the older, non-refreshed ones?
Edit... OK, not exactly as I described it above: the non-active scan lines aren't visible, so you're actually only seeing half of the image at a time.
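That's the right idea. As a toy illustration (my own sketch, not any real codec or box's pipeline): an interlaced stream sends the even-numbered scan lines in one field and the odd-numbered lines in the next, and a simple "weave" deinterlacer just re-interleaves the two fields back into a full frame.

```python
# Toy model of interlacing: a frame is a list of scan lines.
# Each field carries only half of them.

def split_fields(frame):
    """Split a progressive frame into top and bottom fields."""
    top_field = frame[0::2]     # lines 0, 2, 4, ...
    bottom_field = frame[1::2]  # lines 1, 3, 5, ...
    return top_field, bottom_field

def weave(top_field, bottom_field):
    """Simple 'weave' deinterlace: re-interleave the two fields."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.extend([t, b])
    return frame

frame = [f"line {i}" for i in range(1080)]
top, bottom = split_fields(frame)
print(len(top), len(bottom))        # 540 540 -- each field is half the lines
assert weave(top, bottom) == frame  # weave recovers the full frame
```

Weave works perfectly for static images; the trouble (combing artifacts) shows up when there's motion between the two fields, which is why fast action is where 1080i tends to suffer.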
Pretty funny looking at the Google results that come up; most of them seem to be from 5+ years ago, and really the answer seems to be that it's a personal-preference thing, aside from the motion blur for things like sports.
Guess I'll just have to play with it a bit, although I'm inclined to stick with the default 720p if I don't see much difference.
While I know I'm wrong, the quick-and-dirty answer is: think of 1080i as 540p that has been spread out a bit, and 720p as roughly the same as 1440i. Kinda sorta... or at least that is my take on thangs. :)
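The "half-resolution" intuition can be made concrete with some back-of-the-envelope arithmetic (assuming standard 60 Hz North American broadcast timings, and ignoring overscan, blanking, and compression):

```python
# What each format actually delivers per second of video.

# 720p60: 720-line frames, 60 full frames per second
lines_720p = 720 * 60            # 43,200 scan lines per second
pixels_720p = 1280 * 720 * 60    # ~55.3 million pixels per second

# 1080i60: 540-line fields, 60 fields per second (30 full frames)
lines_1080i = 540 * 60           # 32,400 scan lines per second
pixels_1080i = 1920 * 540 * 60   # ~62.2 million pixels per second

print(lines_720p, lines_1080i)   # 43200 32400
print(pixels_720p, pixels_1080i) # 55296000 62208000
```

So at any given instant 1080i shows fewer lines than 720p (hence the "540p" quip), but because each line is wider (1920 vs. 1280 pixels) its raw pixel throughput is actually a bit higher; which looks better depends on the deinterlacer and the amount of motion.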
With Rogers boxes, if you enable all output formats, the box outputs whatever format the channel is broadcast in and the TV can do with it what it wants. I'm not sure about your Motorola box, though. This makes changing channels painfully slow, and the TV typically flickers as it figures out what to do with the source change.
That being said, I have mine set to 720p-only output. I find no noticeable difference between 720p and 1080i, but I figure having all lines output continuously should result in a better picture.
I think modern TVs are pretty good at de-interlacing and upscaling content regardless, though.
Back when CRT was king, 720p was hands down better than 1080i, IMHO. 30 fps interlaced was torture for me due to more prevalent flickering; weather channels, sports tickers, and other generated graphics bothered me.
These days the only issue for gamers with non-native signal up-conversion is the introduction or increase of input lag. Wii players noticed this the most on some TVs a few years ago.
If the box is 1080i, just use that.
Since most HD channels broadcast at 1080i, use 1080i. This ensures your TV receives the maximum, native resolution, and it also means fewer conversion steps in the chain (720p output: 1080i -> 720p -> 1080p; 1080i output: 1080i -> 1080p). Also, the de-interlacers and scalers found in HDTVs are usually of inferior quality compared to those in HD PVRs and other input devices. A case in point is the 2012 Panasonic plasma televisions, which have a horrible scaler (the de-interlacer is fine).
Of course, for most hardware setups and viewing distances the difference between 720p/1080i won't be noticeable. I only notice a slight oil-painting effect when viewing a 720p signal on my 2012 Panasonic plasma because it's 60" and my viewing distance is ~8'. Luckily my Motorola box outputs at 1080i and both my WDTV Live and Xbox 360 have solid scalers that output at 1080p.