I had been reading a bit about this strange "1080p 10 bit" message, and it may be the root of the problem: apparently v3.0.5 pushes the HDMI output signal to 10-bit color depth while in 1080p. According to the specs of most 1080p TVs (mine is 3 years old, manufactured in Q1 2013), all of them so far have 8-bit color depth, and an HDMI 1.4 port is not 10-bit color depth capable; so far this can only be done through HDMI 2.0 ports. On my 4K TV I have to connect the device to the HDMI 2.0 port and activate 10-bit color depth in the TV menu to obtain this; that is when the 4K 50 Hz and 4K 60 Hz modes appear in the list of HDMI modes. So even though the 4K TV is 10-bit color depth capable, the default configuration is 4K at 8-bit color depth, which gives a maximum of 4K 30 Hz on both the HDMI 1.4 and 2.0 ports.
This problem has appeared only with v3.0.5, so if it ships a newer kernel than v3.0.4a, this could be a regression in the v3.0.5 kernel.
I came across this statement on the web (http://www.trustedreviews.com/opinions/hdmi-2-0-vs-1-4):
"Just as important as the higher frame rates for Ultra HD/4K HDMI 2.0 enables, the extra bandwidth also means HDMI is able to transport 4K video at 10-bit and 12-bit colour depths. With HDMI 1.4 it was limited to 8-bit."