The actual answer is on Stack Exchange, in the comments.
It comes down to a mix of converting the actual display resolution to a virtual (scaled) resolution, plus the use of single precision floating point calculations.
Essentially, my understanding is that it’s storing the value needed to convert your actual resolution’s number of pixels (2160p) to a virtual resolution’s number of pixels (2160/1.75 horizontally), but that leaves you with fractions of a virtual pixel. So instead of 1.75 it scales by 1.75182… to land on a whole number of virtual pixels to work with. Then, on top of that, the figure is slightly altered from what we’d expect by floating point error.
If you take the actual horizontal resolution, 2160 pixels, and divide it by the virtual resolution it’s trying to use, 1233 pixels, you get the conversion factor 1.75182… that avoids fractions of a pixel. If you used 1.75 you’d get 1234.2857… pixels. So GNOME is storing the fraction that gets you a clean conversion, accurate to about 4 decimal places of a pixel.
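A quick sketch of that arithmetic in JavaScript (assuming, per the linked answer, that the precision loss comes from a round trip through a 32-bit float; Math.fround emulates that here):

```javascript
const physical = 2160; // actual pixels
const virtual = 1233;  // the whole number of virtual pixels GNOME settled on

// Scaling by exactly 1.75 would give a fractional virtual size:
console.log(physical / 1.75); // ≈ 1234.2857…

// The factor that maps 2160 physical pixels to exactly 1233 virtual ones:
const adjusted = physical / virtual;
console.log(adjusted); // ≈ 1.7518248175…

// Round-tripping that through single precision and widening back to a
// double reproduces the value seen in the config file:
console.log(Math.fround(adjusted)); // 1.7518248558044434
```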
Full credit to rakslice at Stack Exchange, who goes into more detail.
For the same reason a lot of programming languages can’t calculate 0.1+0.2 properly.
There’s a website explaining it: https://0.30000000000000004.com/
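For example (the same thing happens in any language using IEEE 754 doubles, not just JavaScript):

```javascript
// Neither 0.1 nor 0.2 has a finite binary representation, so each is
// stored slightly off, and the sum exposes the rounding.
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false
```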
Floating point error? Yeaahhh no. No. Just… no. That is NEVER as big as 0.01 unless the number is also insanely massive.
The error is relative in scale. It doesn’t magically put you off by significant fractions.
TBF the error can become that big if you do a bunch of unstable operations (i.e. operations that continue to increase the relative error), though that’s probably not what is happening here.
To get to 0.01 error, you’d need to add up trillions of trillions of floating point errors. It will not happen solely because of floating point unless you’re doing such crazy math that you shouldn’t be using primitives in the first place.
That’s why I said unstable operations. Addition is considered a stable operation (for values with the same sign).
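A rough sketch of the stable/unstable distinction (illustrative numbers only, not anything GNOME actually computes):

```javascript
// Stable: same-sign addition keeps the *relative* error tiny even after
// ten million operations.
let sum = 0;
for (let i = 0; i < 1e7; i++) sum += 0.1;
console.log(Math.abs(sum - 1e6) / 1e6); // on the order of 1e-10

// Unstable: subtracting nearly equal numbers cancels the leading digits,
// leaving mostly rounding noise.
console.log((0.1 + 0.2) - 0.3); // 5.551115123125783e-17 instead of 0
```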
0.001, but still
As the answer in the link explains, it’s an adjustment of your scaling factor to the nearest whole pixel, plus a loss of precision from rounding between single- and double-precision floating point values.
So I’m not really sure of the point of this post. It’s not a question, as the link quite effectively answers it. It’s more just “here’s why your scaling factor looks weird in your GNOME config file”, and it’s primarily the first reason - rounding to whole pixels.
If I’m not mistaken, 1.75 can be stored exactly as a float, as its fractional part consists of 1/2 and 1/4 only.
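Easy to verify, since toString(2) prints the binary expansion:

```javascript
// 1.75 = 1 + 1/2 + 1/4, a finite sum of powers of two, so it is exact
// in binary floating point at any precision.
console.log((1.75).toString(2));         // "1.11"
console.log(Math.fround(1.75) === 1.75); // true: exact even as a 32-bit float
```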
True, but it is not that difficult to truncate (or round) the value to two decimal places.
Gnome is coded with JavaScript (lmao 🤣)
So yeah, I think you are right. EDIT: Actually, even if JavaScript and other languages have this issue, the value 1.7518248558044434 doesn’t come from it. There is another reply that explains it and makes total sense. But it’s still pretty lame to know the desktop runs on JavaScript. (Yeah, I hate GNOME)
It’s not a “language” issue, it’s a “computer” issue. This math is being done on the CPU.
IEEE 754
Some languages do provide arbitrary-precision math (Java’s BigDecimal, for example), but it’s slower. Not what you want if you’re multiplying a 4k matrix every millisecond.
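JavaScript has no built-in decimal type, but the same trade-off can be sketched with BigInt fixed-point arithmetic (a toy illustration, not something GNOME does):

```javascript
// Store values as integer hundredths: arithmetic is exact, but every
// value has to be converted in and out, and operations are slower.
const toCents = (x) => BigInt(Math.round(x * 100));
const fromCents = (c) => Number(c) / 100;

console.log(fromCents(toCents(0.1) + toCents(0.2))); // 0.3, exactly
```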
I see, thanks for the explanation.
GNOME is primarily written in C
the desktop shell is mostly javascript though
Closer to 50/50, and other parts of the GNOME desktop, like mutter, are largely C. Saying the entire GNOME desktop is mostly JS is silly.
No one here said GNOME desktop is mostly JS.
You’re right, they said the desktop shell, which is still incorrect, but I guess a little less incorrect. My bad.
Well, I started this thread saying it runs on JavaScript, and I meant that they need JS for most of the interactions with the desktop, like gesture or mouse events. 😞 Even if most of the code is C, we all know you need to write many more lines of C to do the same thing as in JS, so most of the logic in GNOME is computed by JS. We need some Rust here. 🦀 🦀 🦀 🦀
On the other hand, saying that there’s way too much javascript in it is objectively factual.
You don’t get to decide what counts as too much JS in the project unless you actually work on it and have in-depth knowledge of it. I don’t like JS, but it has its uses.
Many people are conflating modern electron bloatware with ‘JS bad’, but things are not that simple.
The JavaScript and TypeScript GTK bindings are nice and make building apps pleasant.
Yeah, their Git repo says 46% of the code is JavaScript: https://gitlab.gnome.org/GNOME/gnome-shell
That’s quite a lot, almost half of the code.
That page also shows that there is more C. That page is also specifically the shell, not all of the desktop.
There is less than 4% more C code than JavaScript. That’s quite a lot, and many features of the GNOME desktop use JavaScript too, like gestures and mouse events.
Okay, but it still needs JavaScript; they are slowly trying to remove or improve it. But it is a fact that it also runs on JavaScript. 🤣
Using JavaScript isn’t inherently a bad thing. JavaScript can be very useful when used for scripting. Obviously anything with a need for performance will be done in C.
JavaScript isn’t the best language to build a desktop interface in, in my opinion. It can be very efficient, but you could see from bugs (at least in the past) how bad its performance was, and they needed to refactor it, replacing parts with C or improving the JavaScript. I’m just laughing and making fun of it using JavaScript, not saying it is slow; GNOME is pretty fast nowadays.
Javascript was a toy created in the mid 90s to make dumb interactive animations and have some sort of dynamic aspect to a web page. The world starting to code entire desktop programs and servers in it was a giant, horrific, societal mistake.
I don’t think you know how to write good JavaScript code. It is a very useful and robust language that is used in a lot of places.
Um, I’m not sure if you realize that basically everything is written in JavaScript. Try programming in C for a while and you will quickly see the benefit of high-level languages.
It’s mostly C.
And GNOME is far from the only desktop that uses JS; KDE Plasma, for example, also uses a lot of JavaScript.
It’s weird when people bash Gnome for using JS, when practically everybody else uses it a lot too. Shows that they’re just regurgitating “Gnome = bad!!!” nonsense.
We get it, you think disliking Gnome is a quirky, edgy personality trait.
Mostly C because you need to type more C code to do the same thing as in JavaScript, so I suppose most of the logic is in JavaScript. The Plasma desktop has 2% JavaScript (https://invent.kde.org/plasma/plasma-desktop); it’s not comparable. 🙂
There’s a lot more to your UX than just the Plasma desktop. And you’re also trying to pass off Gnome’s shell as being Gnome desktop. Pretty disingenuous.
But at least the desktop itself doesn’t use JavaScript as much as GNOME does. Show me the repo with the percentages so I can see what you’re referring to.