The way it was explained to me once is that the ASIC in the GPU makes assumptions that are baked into the chip. It made sense, because they can't reasonably "hardcode" for every possible variation of input the chip will get.
The great thing, though, is that if you're transcoding you can use the GPU to do the decoding half, which will work fine and free up more CPU for the encoding half.
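For anyone curious what that looks like in practice, here's a rough sketch that shells out to ffmpeg, GPU decode plus CPU encode. The filenames are placeholders and "cuda" assumes an NVIDIA card; swap in vaapi, qsv, etc. for other hardware.

```python
import subprocess

# Sketch: hardware-accelerated decode + software (CPU) encode with ffmpeg.
# "input.mkv"/"output.mkv" are placeholder names; "cuda" assumes an NVIDIA GPU.
cmd = [
    "ffmpeg",
    "-hwaccel", "cuda",   # let the GPU handle the decode side
    "-i", "input.mkv",
    "-c:v", "libx264",    # encode on the CPU with x264
    "-preset", "slow",
    "-crf", "18",
    "-c:a", "copy",       # pass audio through untouched
    "output.mkv",
]
subprocess.run(cmd, check=True)
```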
Feeling personally attacked here… I'm coincidentally trying to get IPv6 remote access to my machine working, but I don't know what I'm doing, so while everything works fine on its own, anything I try fails… 💀
(I don’t need help, it’s better if I learn it the hard way)