A little more information on what I do and why.

I do product photography professionally that goes on the web. Pardon the pun, but I shoot firearms that are unique. I bought the Titan RTX to go in an Intel Core i9 7900X for shooting tethered, meaning a Nikon D810 is connected to the computer via a USB 3 cable.

I use Capture One Pro 12.0 as a RAW processing engine and, normally, it works great and is fast. I cannot set up for the next shot until the current shot is rendered and I see it. The combination I just mentioned renders a 40 MB+ file in about 2.5 seconds from when the shutter is tripped until I see the image (currently using 2X 1080 Ti).

I also do some printing, both in-house and with custom labs for specialty stuff, so it is important to see on the screen exactly what prints. I upgraded two workstations right before the end of the year and am still in the process of getting things worked out. I do have a couple of AMD cards that are 10-bit color but are disappointing from a performance standpoint. One is a WX 8200, which has a certified driver. The others are 2X Vega Frontier Edition GPUs, which do support 10-bit color depth but the driver is not certified. I couldn't care less about a certified driver as long as it works.

It has always been my impression that the only way to get 10-bit color depth was with a professional card. If NVIDIA prosumer/consumer GPUs will support 10-bit color, the AMDs are heading for eBay.