Sunday, October 13, 2013

24 fps. vs. 60 fps. on Blu-ray?




Dude12


Let me first state that my TV's resolution is 1366x768. I notice on my Blu-ray player, there is an option to have movies play at 24fps. It's set to 60fps by default. I've read everywhere on several sites that if your TV is capable of accepting 24fps, I should use that feature as it will give better results. Well, on my TV, it seems to be the opposite. If I set it to 24fps, my TV accepts it, but I notice that there's a little more noticeable motion blur present, not to mention that the input lag is slightly longer. On 60fps, there is less motion blur and the input lag isn't as bad.

Why exactly am I not getting the best results from 24fps? What exactly is the difference picture-wise between watching a Blu-ray movie in 24fps vs. 60fps?
My TV's refresh rate is supposedly 60 Hz, but when I press the INFO button on the remote while watching a Blu-ray movie, it says 24 Hz. Not sure if that means it's displaying 24 Hz natively.



Answer
There are TVs that can natively refresh at multiples of 24 Hz. In that case they don't need to interpolate movie frames.

Your TV most probably accepts 24 Hz, but still interpolates frames to display at 60 Hz, which does not help you (instead of the interpolation being done by the player, it is done by the TV).

So, I suggest you go back to 60 Hz on the BD player.
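To make the 24-versus-60 relationship concrete, here is a minimal, purely illustrative Python sketch (not anything a real player or TV runs; the helper name cadence_for_24fps is made up). It just shows why a refresh rate that divides evenly by 24 can repeat each film frame a whole number of times, while a 60 Hz panel has to convert the cadence somewhere:

```python
# Illustrative only: a refresh rate divisible by 24 can repeat each film
# frame a whole number of times; 60 Hz cannot, so 24 fps material needs
# 2:3 pulldown or interpolation somewhere in the chain.

def cadence_for_24fps(refresh_hz: int) -> str:
    """Describe how a panel refreshing at refresh_hz would show 24 fps film."""
    if refresh_hz % 24 == 0:
        repeat = refresh_hz // 24
        return f"native cadence: every film frame shown {repeat} times"
    if refresh_hz == 60:
        return "needs 2:3 pulldown or interpolation (uneven cadence)"
    return "needs some other frame-rate conversion"

for hz in (24, 48, 60, 72, 120):
    print(f"{hz} Hz -> {cadence_for_24fps(hz)}")
```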

what's the difference between a 1080i tv and a 1080p?




deb1033


I am buying a new TV and I want to know which one is better.


Answer
(DEEEEEP BREATH)

There has been a lot of concern and confusion over the difference between 1080i and 1080p. This stems from the inability of many TVs to accept 1080p. To make matters worse, the help lines at many of the TV manufacturers (that means you, Sony) are telling people that their newly-bought 1080p displays are really 1080i. They are idiots, so let me say this in big bold print: as far as movies are concerned, THERE IS NO DIFFERENCE BETWEEN 1080i AND 1080p. See, I did it in caps too, so it must be true. Let me explain (if your eyes glaze over, the short version is at the end).
For clarification, let me start by saying that there are essentially no 1080i TVs anymore. Unless you bought a CRT-based TV, every modern TV is progressive scan (as in LCD, Plasma, LCOS, DLP). They are incapable of displaying a 1080i signal as 1080i. So what we're talking about here mostly applies to people with 1080p native displays.

Movies and almost all TV shows are shot at 24 frames per second (either on film or on 24fps HD cameras). Nearly every TV in the US has a refresh rate of 60 Hz, meaning the screen refreshes 60 times a second. In order to display something that is 24fps on something that is essentially 60fps, you need to make up, or create, new frames. This is done using a method called 3:2 pulldown (or more accurately 2:3 pulldown). The first frame of film is doubled, the second frame of film is tripled, the third frame is doubled, and so on, creating a repeating 2,3,2,3,2,3 sequence. It basically looks like this: 1a,1b,2a,2b,2c,3a,3b,4a… Each number is the original film frame. This lovely piece of math allows the 24fps film to be converted for display on 60 Hz products (nearly every TV in the US, ever).
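As a rough illustration of that cadence, here is a toy Python sketch (not real player code; the helper name pulldown_2_3 is made up) that repeats film frames in the 2,3,2,3,... pattern described above, turning 24 film frames into 60 displayed frames:

```python
# Toy sketch of 2:3 pulldown: repeat film frames in a 2,3,2,3,... pattern
# so 24 film frames fill 60 display refreshes per second.

from itertools import cycle

def pulldown_2_3(film_frames):
    """Yield each film frame 2 or 3 times, alternating (2:3 cadence)."""
    repeats = cycle([2, 3])
    for frame, n in zip(film_frames, repeats):
        for _ in range(n):
            yield frame

# Four film frames become ten displayed frames: 1,1,2,2,2,3,3,4,4,4
print(list(pulldown_2_3([1, 2, 3, 4])))
# 24 film frames -> 60 displayed frames, matching a 60 Hz refresh
print(len(list(pulldown_2_3(range(24)))))  # 60
```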

This can be done in a number of places. With DVDs, it was all done in the player. With HD DVD, it is done in the player to output 1080i. With Blu-ray, there are a few options. The first player, the Samsung, added the 3:2 to the signal, interlaced it, and then output that (1080i) or de-interlaced the same signal and output that (1080p). In this case, the only difference between 1080i and 1080p is where the de-interlacing is done. If you send 1080i, the TV de-interlaces it to 1080p. If you send your TV the 1080p signal, the player is de-interlacing the signal. As long as your TV is de-interlacing the 1080i correctly, then there is no difference. Check out this article for more info on that.
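For film-sourced material, "correct" de-interlacing is essentially a weave: the two 540-line fields of a 1080i frame came from the same film frame, so interleaving them rebuilds the original 1080p picture exactly. Here is a hedged, illustrative Python sketch of that idea (the split_into_fields and weave helpers are made-up names, not any real video pipeline):

```python
# Illustrative "weave" de-interlacing for film-sourced 1080i: both fields
# come from the same original frame, so interleaving the 540 odd lines and
# 540 even lines rebuilds the full 1080p frame exactly.

def split_into_fields(frame_lines):
    """Split a progressive frame (list of scan lines) into two fields."""
    top = frame_lines[0::2]     # even-indexed scan lines (top field)
    bottom = frame_lines[1::2]  # odd-indexed scan lines (bottom field)
    return top, bottom

def weave(top, bottom):
    """Re-interleave two fields back into a progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

original = [f"line {i}" for i in range(1080)]
top, bottom = split_into_fields(original)
assert weave(top, bottom) == original  # lossless for film-sourced material
```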

The next Blu-ray players (from Pioneer and the like) will have an additional option. They will be able to output the 1080p/24 from the disc directly. At first you may think that if your TV doesn't accept 1080p, you'll miss out on being able to see the "unmolested" 1080p/24 from the disc. Well, even if your TV could accept the 1080p/24, your TV would still have to add the 3:2 pulldown itself (the TV is still 60 Hz). So you're not seeing the 1080p/24 regardless.

The only exception to that rule is if you can change the refresh rate on the TV. Pioneer's plasmas can be set to refresh at 72 Hz. These will take the 1080p/24 and do a simple 3:3 pulldown (repeating each frame 3 times).
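A quick illustrative sketch of that 3:3 cadence (again a toy Python example, not firmware; pulldown_3_3 is a made-up name): each film frame is repeated exactly three times, so 24 frames fill 72 refreshes with no uneven 2,3 pattern.

```python
# Toy sketch: at 72 Hz, each 24 fps film frame is simply shown 3 times (3:3),
# giving an even cadence with no 2,3,2,3 judder.

def pulldown_3_3(film_frames):
    for frame in film_frames:
        for _ in range(3):
            yield frame

print(list(pulldown_3_3([1, 2, 3, 4])))   # 1,1,1,2,2,2,3,3,3,4,4,4
print(len(list(pulldown_3_3(range(24))))) # 72, matching a 72 Hz panel
```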

Short Version
What this all means is this:

• When it comes to movies (as in HD DVD and Blu-ray) there will be no visible difference between the 1080i signal and the 1080p signal, as long as your TV correctly de-interlaces 1080i. So even if you could input 1080p, you wouldn't see a difference (because there is none).

• There is no additional or new information in a 1080p signal from movie-based content.

• The only time you would see a difference is if you have native 1080p/60 content, which at this point would only come from a PC and maybe the PS3. 1080p/60 does have more information than 1080i/30 (see the rough pixel-rate arithmetic after this list), but unless you're a gamer you will probably never see native 1080p/60 content. It is incredibly unlikely that they will ever broadcast 1080p (too much bandwidth) or that 1080p/60 content will show up on discs (too much storage space and no one is using it to record/film).
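For the curious, the "more information" claim in that last point is simple arithmetic. Here is a rough, back-of-the-envelope Python calculation (illustrative numbers only, ignoring blanking and compression):

```python
# Back-of-the-envelope pixel rates: 1080p/60 carries twice the raw picture
# information of 1080i/30 (60 fields per second of 540 lines each).

width, height = 1920, 1080

p60 = width * height * 60          # 1080p/60: 60 full frames per second
i30 = width * (height // 2) * 60   # 1080i/30: 60 fields/s, 540 lines each

print(f"1080p/60: {p60:,} pixels/s")  # 124,416,000
print(f"1080i/30: {i30:,} pixels/s")  #  62,208,000
print(p60 // i30)                     # 2 -> twice the raw picture data
```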

So all of you people who bought 1080p displays only to be told by the companies that you had bought 1080i TVs, relax. The TV will convert everything to 1080p. Now if you bought a TV that doesn't de-interlace 1080i correctly, well, that's a whole other story.




Powered by Yahoo! Answers
