I just switched over to Ignite. So far I like the results, though I did compile a list of things I thought needed rethinking. One issue was the remote's four flush-mounted, hard-to-press arrow keys, but given that this box is "made" by Comcast, I doubt Rogers has much more than superficial control over what happens.
Putting that aside, I have two XI6 boxes, each attached to a Sony Bravia. When the older TV first starts up, the top left corner shows screen settings such as 16:9 aspect ratio, 1080p resolution and 60 fps frame rate. The newer TV shows the same but adds one other bit of info... 12-bit. I never saw that with the NextBox 3. What does that mean?
@fireborne I'm surprised that the second Bravia shows 1080p resolution and 60 fps. I wonder if that TV can't run at a higher resolution? I'd have a look at the specs to see what it can run. It's possible that if the TV can run at a higher resolution, you might have to manually select it on the Xi6-A/T set top box instead of letting the box run at a detected default level.
The 12-bit indication suggests that the TV is receiving a High Dynamic Range input that uses 12 bits to specify each colour level instead of 8 or 10 bits. Running at 12 bits should present a very nice colour image. That's why I wonder: if the TV is running 12-bit HDR, should it also be running at a higher resolution? Some TVs will only support higher resolutions and refresh rates on one of their HDMI inputs, so I'd suggest taking a look at the owner's manual, specifically the HDMI input port specs, to see if that is the case. If so, switch the HDMI cable from its current input port to the high-resolution port. Maybe @57 can offer some advice on this one.
Here's a reference page for HDR colour bit levels:
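To put those bit depths in perspective, here's a quick back-of-the-envelope sketch (plain Python, just powers of two, no assumptions about any particular TV or standard) of how many shades each depth allows per colour channel and in total:

```python
# Per-channel levels and total displayable colours for common bit depths.
# levels = 2^bits shades per channel (R, G, B); total = levels^3 combinations.
for bits in (8, 10, 12):
    levels = 2 ** bits
    total = levels ** 3
    print(f"{bits}-bit: {levels:>4} levels per channel, {total:,} total colours")
# 8-bit:  256 levels per channel, 16,777,216 total colours
# 10-bit: 1024 levels per channel, 1,073,741,824 total colours
# 12-bit: 4096 levels per channel, 68,719,476,736 total colours
```

So each extra pair of bits quadruples the shades per channel, which is why 10- and 12-bit signals can show smoother gradients with less banding than 8-bit.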
Just to note, the Xi6 set top boxes are a technological leap over the NextBox technology, so you would never see HDR output from equipment designed and built in the NextBox days; HDR was probably still at the idea stage at that point in time.
If you do switch the HDMI input port on the TV to run at a higher resolution, have a look to see whether the end result is a higher resolution that's still running 12-bit HDR. Maybe higher resolutions will cause the HDR level to drop to 8 or 10 bits? Just wondering.
With 12-bit HDR running on the second Bravia, I would think you should notice a difference in colour for those programs that are broadcast in 12-bit HDR. It might take some research to determine what is actually broadcast that way. Again, maybe @57 can offer some advice here.
I double checked and the newer TV's screen actually shows "1080i HD | 12bit 16:9 | Dolby Digital", so it's receiving 12-bit HDR (isn't that Dolby Vision?) even on a 1080 interlaced signal. I thought HDR was just for 4K signals. The newer TV is a Full HD model, so I was surprised to see any mention of HDR. The box offers only 720p, 1080i and 1080p as choices when attached to the newer TV.
I switched the XI6's output to 1080i when I received it; so much of what we see on television is still interlaced that I thought there might be some minor picture improvement from not "upgrading" it.
One attempt to figure out what the 12bit meant involved passing the HDMI through a 5.1 Bravia receiver I got at the same time as the older TV, and from there on to the new TV. When processing that signal, the newer TV didn't show 12bit, but the picture looked the same, so I concluded that some aspect of the signal was being safely removed or ignored. I couldn't figure out what, though.
Since Dolby Vision is a less popular, more expensive standard from what I've read: has anyone seen 10-bit (or something similar) on their screens, meaning HDR10? Does the XI6 only support HDR in the form of 12-bit?