Today I was surprised: browsing Wikipedia, I found that modern Apple devices still can't do hardware encoding of the AV1 codec. (I showed you AVIF pictures before — that's the same codec, only for video.) Modern GPUs from AMD, Nvidia, Intel, and Qualcomm support it, and so does the Google Pixel 8.

I don't know where exactly AV1 encoding matters right now. I was just surprised, because I'd thought Apple's processors were beyond competition. They're hardly bad, but maybe there's too much PR around them, or maybe they're simply heavily optimized for certain (typical) tasks.

I replaced the battery in my Retina MacBook. I bought two: a used original at 60% health (that's the one installed for now) and a new one (I haven't installed it yet; I'll try it one of these days). The 60% one now holds a charge for 4–5 hours (not under load), so a new one at 100% should last up to 9–10 hours. I mean, the 20-hour battery life of current MacBooks is cool, but I'd somehow had the impression that everything before them was just rubbish. Likewise that everything before the M-series processors was rubbish. (I don't plan to buy anything but M-series machines — there's not much point otherwise, but mostly because the older keyboards and screens really are rubbish too.)

I mean, it turns out the competitors are actually developing quite well, and in some areas (like AV1 encoding) they're even ahead of Apple.

https://en.wikipedia.org/wiki/AV1#Hardware