How to force 10 bit depth monitor settings?
I just bought a Dell P2715Q, a UHD 10-bit-depth monitor, but the Nvidia control panel (on both Linux and Windows) only offers a maximum of 8 bpc (24/32-bit depth), which will not do.
After a bit of digging I found a potential solution: create a custom EDID file that adds support for the 10-bit configuration. On Windows this can be done by AMD GPU users and by Nvidia Quadro and NVS users; however, EDID overrides are disabled for GTX cards (for no apparent reason) and have never been available for iGPUs on Windows either.
Under Linux, however (even with the official Nvidia drivers), the EDID data can be dumped and edited, so I tried what that forum suggested (changing the value 0xa5 to 0xb5) and loading the edited EDID on Linux, but it did not work: the driver apparently failed to load the EDID after the edit, so that solution may not work after all.
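A likely culprit for the load failure: every 128-byte EDID block must sum to 0 modulo 256, so flipping a single byte from 0xa5 to 0xb5 without recomputing the checksum byte (byte 127) leaves an invalid block that drivers typically reject. Below is a minimal patch sketch, assuming an EDID 1.4 base block whose Video Input Definition byte (offset 0x14) is 0xa5 (digital input, 8 bpc, DisplayPort); setting bits 6-4 of that byte to 011 declares 10 bpc, giving 0xb5. The file names are placeholders.

    #!/usr/bin/env python3
    """Patch an EDID dump to advertise 10 bpc and fix the checksum.

    Sketch only: assumes byte 0x14 of the base block is 0xa5
    (digital, 8 bpc, DisplayPort) and rewrites bits 6-4 to 011 (10 bpc).
    """
    import sys

    def patch_edid(data: bytes) -> bytes:
        block = bytearray(data[:128])              # EDID base block = 128 bytes
        if block[0x14] != 0xA5:
            print(f"warning: byte 0x14 is {block[0x14]:#04x}, not 0xa5",
                  file=sys.stderr)
        block[0x14] = (block[0x14] & 0x8F) | 0x30  # bits 6-4 := 011 -> 10 bpc
        # Every EDID block must sum to 0 mod 256; recompute the checksum
        # byte, otherwise the one-byte edit alone makes the block invalid.
        block[127] = (-sum(block[:127])) & 0xFF
        return bytes(block) + data[128:]           # extension blocks untouched

    if __name__ == "__main__":
        with open(sys.argv[1], "rb") as src, open(sys.argv[2], "wb") as dst:
            dst.write(patch_edid(src.read()))

On Linux the raw EDID can be dumped with something like: cat /sys/class/drm/card0-DP-1/edid > edid.bin (the connector name varies per machine).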
Therefore my question is: how do I force an Nvidia GTX graphics card to set its color depth to 10 bits per channel, under Windows 7, on a monitor whose EDID (falsely) claims to support only 8 bpc?
(Bonus points if you can tell me how to do it under Linux too, but that is not required; I'm sure I can figure something out there eventually.)
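For the Linux side, the proprietary Nvidia driver documents a CustomEDID option that feeds it an EDID file in place of the one the monitor reports. A sketch of the xorg.conf fragment, assuming the display hangs off connector DP-0 and the patched file lives at /etc/X11/edid.bin (both are placeholder values; check the connector name with xrandr):

    Section "Device"
        Identifier "Nvidia Card"
        Driver     "nvidia"
        # Use the patched EDID file instead of the one read from the monitor.
        Option     "CustomEDID" "DP-0:/etc/X11/edid.bin"
    EndSection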
windows display nvidia-geforce color-depth edid
asked Jul 14 '17 at 17:16 by Cestarian
I don't know the answer to your question so I'm not going to post this as one. But nVidia has a history of differentiating their Quadro and GeForce product lines based on features like this. They deliberately disable features on their consumer models because if you really want them, they want you to pay extra for their pro models, and they use their drivers to enforce that business model. VGA passthrough is a perfect example of this. You have to hack and do things in an unsupported way to get what you want. AMD is much more forgiving, but "buy a different video card" isn't really an answer :-)
– Wes Sayeed
Jul 14 '17 at 17:55
1 Answer
I don't know for sure; I have the same problem with anything above 8-bit. Windows can't tell a display is 10-bit, except for one Asus Predator display model I saw once... Windows should just let people force the setting; nine times out of ten Windows is screwing it up anyway. The Nvidia Control Panel will allow a 10-bit setting, but I believe (though I'm not sure) that it then disables your color profile and takes over. That is annoying at times, because the Nvidia Control Panel does not work with monitor color profiles! And since there is no official information about how it manages color, other than the manual settings in the control panel, this is a serious issue that will probably not get an answer...
But to answer half of your question:
The GPU is where you plug in your monitor or TV. The two pieces of hardware are directly connected, and the Nvidia Control Panel can get information about your monitor from the GPU directly, through the Nvidia graphics driver, bypassing the monitor driver altogether. Your monitor has a hardware ID and so does your graphics card; plug-and-play hardware like a monitor transmits its hardware ID over the connection. DisplayPort and HDMI are data connections: albeit high-bandwidth ones specialized for video, they are still data connections. It is critical for monitor operation that the GPU and monitor can communicate over more than a one-way color transmission, or features like V-Sync would not work. That is why the Nvidia driver and GPU can recognize 10-bit-plus capability and make the appropriate setting available even when Windows fails. The problem is simply Windows and Microsoft's lack of hardware support. It is also Microsoft assuming you are too stupid to know you have a 10-bit display, and that if they let you control the setting you will break the display, crash the driver, or have a bad experience and blame Windows, i.e. the product. Not that transmitting 10-bit video to an 8-bit panel will break it; you might get strange effects or a blank screen, but that is easily solved by having the setting revert automatically if the user hasn't confirmed the prompt within 15 seconds.
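To see concretely what Windows itself has cached, the EDID each monitor reported is stored in the registry under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY (the same data that driver-level override tricks manipulate). A small read-only sketch using only Python's standard library; the layout is the well-known one, but treat the exact key names as assumptions on any given system:

    """Dump the EDID blocks Windows has cached for attached monitors."""
    import winreg

    BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

    def dump_edids():
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
            for i in range(winreg.QueryInfoKey(display)[0]):        # monitor models
                model = winreg.EnumKey(display, i)
                with winreg.OpenKey(display, model) as model_key:
                    for j in range(winreg.QueryInfoKey(model_key)[0]):  # instances
                        inst = winreg.EnumKey(model_key, j)
                        try:
                            sub = inst + r"\Device Parameters"
                            with winreg.OpenKey(model_key, sub) as params:
                                edid, _ = winreg.QueryValueEx(params, "EDID")
                                # Byte 0x14 carries the declared input/bit depth.
                                print(f"{model}\\{inst}: byte 0x14 = {edid[0x14]:#04x}")
                        except OSError:
                            pass    # instance without a cached EDID

    if __name__ == "__main__":
        dump_edids()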
answered Feb 10 at 7:33 by adam