

PNG gets you the best compatibility and features, at the expense of file size. But I probably wouldn’t use it for uploading photographs to the web, of course.
WebP is the same: it’s got a lossy mode (VP8) and a lossless mode (which is more limited than PNG, but beats it where they overlap). But to make it more complicated, the lossless mode also has lossy processing modes, where it alters the image first to achieve smaller output sizes.
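For illustration, here’s a minimal sketch of the three modes driving libwebp’s cwebp tool from Python (assuming cwebp is on your PATH; the file names are just placeholders):

```python
import subprocess

# Lossy mode (VP8): quality-controlled, throws away data.
subprocess.run(["cwebp", "-q", "80", "photo.png", "-o", "photo_lossy.webp"], check=True)

# Lossless mode: pixel-exact output, usually beats PNG where features overlap.
subprocess.run(["cwebp", "-lossless", "art.png", "-o", "art_lossless.webp"], check=True)

# "Near-lossless": preprocesses the pixels first (a lossy step), then stores
# the result with the lossless codec to get a smaller file.
subprocess.run(["cwebp", "-near_lossless", "60", "art.png", "-o", "art_nl.webp"], check=True)
```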
People have a long habit of turning JPEG files into PNG files, so the file extension won’t help you there. They could also have reduced the colour depth or resized it, all lossy operations. All the extension really tells you is that the image can have an alpha channel.
As for AVIF, personally I don’t like the format; it feels like an “open media” (but still patented) version of HEIF to oppose Apple. Like WebP, it makes the (baseless IMO) assumption that a format designed to encode motion data is better at encoding still data than a format designed to encode still data. It’s got all the limitations of a video format (a max resolution, only up to 12-bit images, no progressive decoding), and they left out the enhancements WebP added, like the dedicated lossless mode: “lossless AVIF” files are huge and, last I checked, badly supported, so nobody actually uses them, and very high quality settings just get called “lossless”.
A team inside Google was working on WebP2 around the same time; it used AV1 but actually added the useful stuff like efficient lossless encoding. It got killed too, in favour of AVIF.
So it depends on the specific HDR encoding used. Rec2020 is the most common one you’ll see (it’s meant for “pure” setups where the source and output are tightly linked, e.g. gaming consoles or Blu-ray), and its raw data won’t look great. Something like HLG (Hybrid-Log Gamma) is designed for a better fallback (it’s meant for TV broadcast, where the output device is “whatever TV the user has”), so it should just look dimmer.
This is an HDR screenshot I took of Destiny 2 (which uses Rec2020), tone mapped to SDR.
And here’s the raw screenshot data from before tonemapping.
If the second image had all the right HDR metadata, and the viewer supported it properly, then both images would match.
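To make “tone mapped to SDR” concrete, here’s a toy sketch of a simple Reinhard-style operator in Python (not the mapping Destiny 2 or any real pipeline actually uses, just the general idea):

```python
import numpy as np

def tonemap_sdr(hdr_linear: np.ndarray, exposure: float = 1.0) -> np.ndarray:
    """Squash linear HDR values (which can exceed 1.0) into [0, 1] for an SDR display."""
    x = hdr_linear * exposure
    sdr_linear = x / (1.0 + x)                          # Reinhard curve: compresses highlights smoothly
    return np.clip(sdr_linear, 0.0, 1.0) ** (1 / 2.2)   # rough gamma encode for SDR output

# Without a step like this (or proper HDR metadata plus an HDR-aware viewer),
# the raw values just get clipped or misinterpreted, which is why the second
# screenshot looks off.
```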
AVIF is generally smaller in size than both WebP and PNG. AVIF supports animation while PNG does not.
The lossless mode in AVIF is so bad that a BMP in a ZIP file produces smaller results.
Which makes sense, as it doesn’t actually have a dedicated lossless mode (like WebP does); the encoder is just told to not quantise the video data it produces.
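If you want to reproduce that comparison, something like this Python sketch would do it (assumes Pillow and libavif’s avifenc are installed; input.png is a placeholder):

```python
import os
import subprocess
import zipfile

from PIL import Image

# Same pixels stored two ways: a BMP inside a ZIP, vs. a "lossless" AVIF.
Image.open("input.png").convert("RGB").save("input.bmp")

with zipfile.ZipFile("input_bmp.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("input.bmp")

subprocess.run(["avifenc", "-l", "input.png", "input_lossless.avif"], check=True)

print("zipped BMP   :", os.path.getsize("input_bmp.zip"), "bytes")
print("lossless AVIF:", os.path.getsize("input_lossless.avif"), "bytes")
```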
JXL can do lossy images (like JPEG) and lossless ones (like PNG), and on average it’ll produce smaller file sizes than both (while beating JPEG quality-wise). The killer feature is that it can losslessly recompress existing JPEG files and shave off about 20% of the file size, and it’s reversible, so you can turn those JXL files back into JPEG images for existing software.
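Roughly what that round trip looks like, driving the cjxl/djxl tools from libjxl (assuming they’re installed; photo.jpg is a placeholder):

```python
import os
import subprocess

# Losslessly repack the existing JPEG bitstream into a .jxl container (no re-encode).
subprocess.run(["cjxl", "--lossless_jpeg=1", "photo.jpg", "photo.jxl"], check=True)

# Reconstruct a byte-identical JPEG from the .jxl for software that only speaks JPEG.
subprocess.run(["djxl", "photo.jxl", "restored.jpg"], check=True)

print("original JPEG:", os.path.getsize("photo.jpg"), "bytes")
print("JXL          :", os.path.getsize("photo.jxl"), "bytes")
```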
The downside is that it was created by Google Research (among others), but the Chrome team went with AVIF instead and decided that’s all they’d support.
At least Safari supports it.
I switched a year ago, after trying and failing multiple times over the years.
I find I’m a lot more willing to let issues slide though. Like, I’ve had some Thunar crashes which I’m cool with, since there’s like 4 devs maintaining it, vs. the multi-billion-dollar company working on Explorer, which I expect better from. Also, unsurprisingly, the only actual show-stopper issue I’ve had was a memory leak in the Nvidia drivers; the actual FLOSS stuff has been great.
Or Automattic doesn’t have enough employees left to implement it
For a community called “technology” there’s a pretty strong anti-AI bubble going on here.
Are you surprised people have opinions about technology, in a community dedicated to discussing technology?
If everyone has moved on from 32bit, and the old stuff doesn’t change, where is the maintenance requirement?
The problem is that it’s not old, unchanging code; people want the latest versions so they can still run their 32-bit binaries with up-to-date supporting libraries.
And if the upstream developers don’t consider 32-bit support important, then it falls on the distro maintainers to patch the code to keep it running in these situations.
Use Zola or Hugo then
I always thought it was purely a hardware limitation, but reading up on it I found it’s actually just “virtual 8086 mode” that was dropped; 16-bit protected mode is still available even when running the CPU in “long mode”.
So that rules out DOS apps, but 16-bit Win 3.x apps should still run. It’s probably a compatibility minefield though, and even MS decided it wasn’t important (iirc the only thing they kept around was support for 16-bit app installers, and only by internally swapping them out with 32-bit versions when run, since it was apparently common for 32-bit 9x apps to still use 16-bit installers so they could show a proper error message when run under Win 3.x).
It seems to me that 16-bit applications are already basically broken with 32-bit Wine if you’re running a 64-bit kernel: by default the kernel places extra restrictions on top of what the hardware already does, preventing apps from loading 16-bit code at all.
https://gitlab.winehq.org/wine/wine/-/wikis/FAQ#16-bit-applications-fail-to-start
Guessing that’s why they don’t feel it’s that important to continue supporting; it seems a VM is the future for these apps.
WINE’s WoW64 does not work for all games.
Ok but is that because of fundamental limitations, or just because of bugs?
One’s easier to fix than the other.
Yeah, this is a perfect use case for torrents; could go a step further and keep track of a downloader’s ratio to stop people leeching.
Tizen (resting place of Meego)
I’d say SailfishOS is the final resting place of MeeGo, especially since it’s maintained by ex-Nokia devs.
I’m not convinced LLMs as they exist today don’t prioritize sources – if trained naively, sure, but these days they can, for instance, integrate search results, and can update on new information.
Well, it includes the text from the search results in the prompt; it’s not actually updating any internal state (the network weights), and a new “conversation” starts from scratch.
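As a toy sketch of what “integrating search results” amounts to (the web_search and call_llm functions here are hypothetical placeholders, not a real API):

```python
def web_search(query: str) -> list[str]:
    # Placeholder: imagine this returns text snippets from a search engine.
    return [f"(snippet about {query})"]

def call_llm(prompt: str) -> str:
    # Placeholder for a real model call; the point is it only ever sees `prompt`.
    return "(model output)"

def answer(question: str) -> str:
    snippets = web_search(question)
    context = "\n\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    prompt = (
        "Answer using the search results below.\n\n"
        f"Search results:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return call_llm(prompt)  # stateless: the weights aren't touched

# The next question builds a fresh prompt from scratch; nothing was "learned".
```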
Also, compared to something like the Switch? I don’t see MS remotely bricking these devices if you run “homebrew” on them.
And there are still web directories hanging around, similar to the now-dead dmoz site.
https://url.town/ and https://curlie.org/ for example
If anything Subnautica lets you see too much.