I’m still confused. All the HDR processing should be done on the client side if it’s an HDR-capable display playing a compatible format. The Shield can play most formats, so that shouldn’t be the issue. How do you know HDR isn’t displaying properly?
Tone mapping is for converting HDR content to SDR for non-HDR displays. Why do you need it for an HDR TV?
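(For context on why tone mapping exists at all: HDR luminance goes well beyond what an SDR display can reproduce, so the server squeezes it down with a curve before encoding. Here's a very rough Python sketch of the classic Reinhard operator just to show the idea; the function name and the 100-nit reference white are my own choices, and real players/servers use fancier curves like Hable or BT.2390:)

```python
# Rough sketch of global Reinhard tone mapping: compress unbounded HDR
# luminance into [0, 1] so an SDR display can show it without clipping.
# Illustrative only; not what any particular player actually implements.

def reinhard_tonemap(luminance_nits: float, reference_white_nits: float = 100.0) -> float:
    """Map a linear HDR luminance value (in nits) to a 0-1 SDR signal."""
    l = luminance_nits / reference_white_nits  # normalize against SDR reference white
    compressed = l / (1.0 + l)                 # Reinhard curve: rolls off highlights
    return compressed ** (1.0 / 2.2)           # simple gamma encode for display

# A 1000-nit HDR highlight gets rolled off instead of clipping to pure white.
print(reinhard_tonemap(1000.0))  # ~0.96
print(reinhard_tonemap(100.0))   # ~0.73
```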
I guess compared to your situation, they’re fantastic. I have a static IP and a copper connection, but they don’t offer any symmetric plans. I’m stuck with 200down/15up, and the fastest upload they offer is 25up on their 500down plan.
I’ve got a 1070 that I use for transcodes and some tone mapping where necessary, and I don’t have GPU-related issues (my ISP causes its own problems). I can usually run a few small streams at once, and I have a PC that I use to handle files too large to reliably stream to my Chromecast with Google TV over WiFi.
I think he mentioned in the video that he has some patents on it. You could probably use existing hardware and those patents to reconstruct it as needed.