Lossless is fine, lossy is worse than JPEG.
The_Decryptor
spoiler
made you look
- 0 Posts
- 76 Comments
That’d just be overall worse: it’d never be smaller than a comparable JPEG image, and it wouldn’t allow for any compression/quality benefits.
Yep, their frontend used a shared caller that would return the parsed JSON response if the request was successful, and error otherwise. And then the code that called it would use the returned object directly.
So I assume that most of the backend did actually surface error codes via the HTTP layer, and it was just this one endpoint that didn’t (which then broke the client-side code when it tried to access non-existent properties of the response object), because otherwise basic testing would have caught it.
That’s also another reason to use the HTTP codes: by storing the error in the response body, you now need extra code between the function doing the API call and the function handling a successful result, just to examine the body and check whether there was actually an error, all based on an ad-hoc per-endpoint format.
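A minimal sketch of the contrast, assuming a hypothetical shared caller (the function names and the `error` envelope here are illustrative, not the actual frontend’s code):

```python
import json

class ApiError(Exception):
    """Raised when the backend reports failure."""

def call_api(status, body):
    # With real HTTP status codes, one generic check covers every
    # endpoint; callers only ever see a successfully parsed result.
    if not 200 <= status < 300:
        raise ApiError(f"HTTP {status}")
    return json.loads(body)

def call_api_adhoc(status, body):
    # The anti-pattern: the server always answers 200, so the caller
    # also has to know this endpoint's ad-hoc error envelope.
    data = json.loads(body)
    if data.get("error"):  # per-endpoint convention, easy to miss
        raise ApiError(data["error"])
    return data
```

The first wrapper works unchanged for any endpoint; the second needs per-endpoint knowledge, which is exactly what silently breaks when one endpoint deviates.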
Ehh, that really feels like “but other people do it wrong too” to me; half the 4xx error codes are application-layer errors, for example (404 isn’t a transport-layer error, and neither are 403, 415, 422, or 451).
It also complicates actually processing the request, as you’ve got to duplicate error handling between “request failed” and “request succeeded but actually failed”. My local cinema actually hits that error: their web frontend expects the backend to return errors, but the backend lies and says everything was successful, and then certain things break in the UI.
Well no, the HTTP error codes are about the entire request, not just whether or not the header part was received and processed correctly.
Take HTTP 403: HTTP only has a basic form of authentication built in, and anything else needs the server to handle it externally (e.g. via session cookies). It wouldn’t make sense to send HTTP 200 in response to someone trying to access a resource without being logged in just because the request was well formed.
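As a sketch (the handler and session store are hypothetical, not a real framework API): the authentication mechanism lives outside HTTP itself, but the outcome still belongs in the status code.

```python
def handle_request(session_cookie, valid_sessions):
    # Authentication happens externally (a session cookie checked
    # against server-side state), but the result is still reported
    # through the status code: 403 for "well formed but not allowed".
    if session_cookie not in valid_sessions:
        return 403, "Forbidden"
    return 200, "the resource"
```

A well-formed request without a valid session gets a 403, not a 200 with an error buried in the body.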
60 in particular is a superior highly composite number, 12 divisors compared to a paltry 8 for 24.
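The divisor counts are easy to check with a quick trial-division sketch:

```python
def divisor_count(n):
    # Count divisors by trial division up to sqrt(n); each divisor i
    # below the square root pairs with n // i above it.
    count = 0
    i = 1
    while i * i <= n:
        if n % i == 0:
            count += 1 if i * i == n else 2
        i += 1
    return count
```

`divisor_count(60)` gives 12 (1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60) against 8 for 24.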
So it’s an “open standard” not in the sense that anybody can contribute to its development, but in the sense that the details of the standard are open and you can learn about them.
The format itself is an XML version of the existing Office document formats, and those grew organically over decades with random bugs, features, and bug compatibilities with other programs. e.g. there will be a random flag on an object that makes no sense, but is necessary for interoperating with some Lotus 1-2-3 files a company had, and that company worked with Microsoft back in the ’90s to get them supported. Things you can’t change, that nobody really cares about, but that get written down because the software already implements them (and will sometimes emit them).
The_Decryptor@aussie.zone to Technology@lemmy.world • Nepal bans social media (Facebook, X, Reddit, Mastodon, Discord, Signal, YouTube and more) for failing to register with the government; Only 7 to be open (Viber, TikTok, Telegram and more) • English • 1 • 17 days ago
You can turn the feature off entirely, or just not talk to people who post them? It’s not something like TikTok where you get pushed a bunch of random videos; it’s stuff that people you know are sending you.
It makes sense, never know when somebody is going to try to impersonate you for any reason. That’s why I told all my friends and family that the best way to know I’m the real me is if I say my codeword “chariots”.
The_Decryptor@aussie.zone to Linux@lemmy.ml • Linux and Secure Boot certificate expiration • English • 6 • 24 days ago
It’s real, but probably not an issue in practice.
If it does actually turn out to pose a problem, then just disable secure boot on those systems, not like it’s really securing anything at that point.
The_Decryptor@aussie.zone to Technology@lemmy.world • Our Channel Could Be Deleted - Gamers Nexus • English • 24 • 27 days ago
He stores all his footage in full quality instead of just storing his final edited videos in a compressed format.
That’s the right way to do it, you want to avoid generation loss as much as possible.
The_Decryptor@aussie.zone to Linux@lemmy.ml • I have an Nvidia GPU, can I game on Linux? • English • 7 • 28 days ago
AMD has its own mix of issues with Vulkan between RADV (Mesa), AMDVLK, and AMD’s proprietary driver, on a per-game basis at times.
Good news, they’re going away. AMD is focusing entirely on Mesa now.
The_Decryptor@aussie.zone to Technology@lemmy.world • Microsoft's latest Windows 11 24H2 update breaks SSDs/HDDs, may corrupt your data • English • 8 • 1 month ago
Most likely a hardware issue; ZFS has seen similar types of corruption with certain drives under normal operation.
The_Decryptor@aussie.zone to Technology@lemmy.world • SpaceX says states should dump fiber plans, give all grant money to Starlink • English • 4 • 1 month ago
It was an issue for a long time that browsers just ignored the caching headers on content delivered over HTTPS, a baked-in assumption that it must be private, individual content. That’s not the case now, so sites have to specifically mark those pages as uncacheable (I think Steam got hit by something like this not that long ago; a proxy was serving up other people’s user pages it had cached).
But for something like Google Fonts, the whole point was that a site could embed a large font family, and then every other visited site that also used it would simply share the first cached copy, saving bandwidth and amortizing the initial cost across the sharing sites. Except that no longer holds: instead of dividing the resource cost by the number of sites using it, it’s multiplying it. So while a CDN might put the content physically closer to users, it doesn’t actually save any bandwidth (and depending on how it’s configured, it can actually slow page loads down).
The_Decryptor@aussie.zone to Technology@lemmy.world • SpaceX says states should dump fiber plans, give all grant money to Starlink • English • 4 • 1 month ago
Browsers partition the cache by “origin” now though, so while they can cache HTTPS content, they can’t effectively cache shared content (they’ll store multiple independent copies).
So YouTube still works fine, but Google Fonts is pointless now.
Edit: Oh yeah, and any form of shared JavaScript/CSS/etc. CDN is now also useless and should be avoided, but that’s always been the case.
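A toy illustration of why partitioning defeats the shared-font trick (the dict here just stands in for the browser’s HTTP cache; no real browser internals are modelled, and the site/URL names are made up):

```python
cache = {}

def fetch(top_level_site, url):
    # Partitioned caching: the key includes the site that triggered the
    # load, not just the resource URL, so the same font fetched from
    # two different sites occupies two separate cache entries.
    key = (top_level_site, url)
    if key not in cache:
        cache[key] = f"bytes of {url}"  # stand-in for a network fetch
    return cache[key]
```

Two sites embedding the same font URL now cause two fetches and two stored copies, where the old shared cache would have fetched and stored it once.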
The_Decryptor@aussie.zone to Microblog Memes@lemmy.world • I am font hunting again • English • 216 • 1 month ago
Leads to cooler art, too.
I want my NKRO.
Which can be done over USB, cheap keyboards just aren’t wired for it.
The_Decryptor@aussie.zone to Technology@lemmy.world • GitHub is no longer independent at Microsoft after CEO resignation • English • 4 • 1 month ago
Mercurial and Darcs had a rather fatal flaw, though: they were so much slower than git. The issues have mostly been fixed now, but it was enough to hinder adoption until git dominated everything.
Git also has a rather big flaw: it’s “good enough”. So trying to displace it will be near impossible, outside of “git-like” tools such as Jujutsu.
The_Decryptor@aussie.zone to Technology@lemmy.world • Intel CPU Temperature Monitoring Driver For Linux Now Unmaintained After Layoffs • English • 2 • 1 month ago
Ahh, yep, it turns out ARM actually removed Thumb support with the 64-bit transition, so the instruction length is fixed now, and Thumb never made it into the M* SoCs.
AVIF is funny because they kept the worst aspect of WebP (lossy video-based encoding) while removing the best (the lossless mode). There was an attempt at WebP2, using AV1 and a proper lossless mode, but Google killed that off as well.
But hey, now that they’re releasing AV2 soon, we’ll eventually have an incompatible AVIF2 to deal with. Good thing they didn’t support JPEG-XL, it’d just be too confusing to have to deal with multiple formats.