>>1068900
>What is the issue here
The chipsets that connect a digital display's HDMI/DP/DVI port to its panel tend to be far, far laggier in TVs than in PC monitors. This becomes especially galling when comparing a $100 PC monitor with a $5000 TV. For instance, compare these numbers:
https://www.rtings.com/tv/tests/inputs/input-lag
With these numbers:
https://www.rtings.com/monitor/tests/inputs/input-lag
And you'll notice that most TVs have minimum input lag in the 30ms range, rising to over 120ms outside game mode, with even the "best" TVs suffering over 10ms of latency. Whereas even the worst PC monitor in the list is barely over 40ms, and the best gayman monitors are under 6ms. All at 60Hz.
This is exacerbated by the fact that TVs are basically never capable of refresh rates higher than 60Hz, imposing even greater input lag.
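To put a number on the refresh rate point, here's a rough sketch (my own arithmetic, not rtings data) of the latency floor the refresh interval alone imposes, assuming inputs arrive at random times within a frame:

```python
# The refresh interval itself sets a latency floor, before the panel's
# chipset adds anything: at 60 Hz a frame takes 1000/60 ms, so on
# average an input waits about half a frame before it can even start
# being scanned out.

def frame_interval_ms(hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / hz

def avg_refresh_wait_ms(hz: float) -> float:
    """Average wait for the next refresh, assuming inputs land
    uniformly at random within the refresh interval."""
    return frame_interval_ms(hz) / 2.0

for hz in (60, 120, 240):
    print(f"{hz:>3} Hz: frame {frame_interval_ms(hz):5.2f} ms, "
          f"avg wait {avg_refresh_wait_ms(hz):5.2f} ms")
```

So a 60Hz cap alone bakes in ~8ms of average latency that a 120Hz or 240Hz monitor simply doesn't have, on top of whatever the TV's processing adds.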
Because LCD (and before it, CRT) monopolized the PC market, all other technologies have tended to exist entirely in the TV market, and as such have been shackled with hand-me-down chipsets intended for LCD TVs, even when (as in the case of OLED, plasma, and DLP video projectors) the underlying technology is capable of far faster pixel switching than LCD, which would be excellent for low-latency, high-FPS PC applications.
>how are digital display drivers in any way comparable to analog CRTs?
Because in order to go from the digital information generated by a computer to the analog voltages needed to drive pixels in a flat panel, a DAC very similar to those which drove CRTs is used. CRT DACs exhibit pixel lag in the nanosecond range, which shows the latency isn't a physical limit: chipsets designed primarily for LCDs are simply engineered to much looser constraints.
>>1068901
>>1068905
The type of technology I'm talking about is not used in any consumer OLED display product I'm aware of. That is, continually recording the wear of each individual subpixel (on top of an initial calibration for manufacturing defects), permanently keeping this information in accumulators throughout the panel's lifespan, and compensating the brightness of every subpixel in every frame against this information in real time.
>What if I don't want display controllers doing wear leveling autism?
There is no reason such a system would have to add latency or otherwise be externally noticeable to an end user, unlike image orbiting or whatever.
Here's a good example:
https://www.ignisinnovation.com/technology/
Notice that it completely eliminates unevenness from the factory, TFT aging issues, and (given sufficient OLED lifespan headroom) burn-in. It's really pretty pathetic something like this didn't become standard issue back when plasma was on the market.