The author decidedly has expert syndrome -- they deny both the history and the rationale behind memory unit nomenclature. Memory measurements evolved around the binary organization of computing architectures. A proud French pedant might agree with the decimal normalization of memory units discussed here (it aligns more closely with the metric system and may have benefits for laypeople), but it fails to account for how memory is partitioned in historic and modern computing.
It’s not them denying it; it’s the LLM that generated this slop.
All they had to say was that KiB et al. were introduced in 1998, and that adoption has been slow.
And not “but a kilobyte can be 1000 bytes,” as if it’s an effort issue.
They are managed by different standards organizations. One doesn't like the other encroaching on its turf. "kilo" has only one official meaning, as a base-10 prefix (×1000).
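For the arithmetic, here is a minimal Python sketch (the names and formatting are mine, not from the thread) showing how far the decimal (SI) and binary (IEC) prefixes drift apart as they scale up:

    # The gap between SI (decimal) and IEC (binary) prefixes grows with scale.
    SI  = {"kB": 10**3, "MB": 10**6, "GB": 10**9, "TB": 10**12}
    IEC = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30, "TiB": 2**40}

    for (si_name, si), (iec_name, iec) in zip(SI.items(), IEC.items()):
        gap = (iec - si) / si * 100
        print(f"1 {iec_name} = {iec:,} bytes vs 1 {si_name} = {si:,} bytes (+{gap:.1f}%)")

That prints a gap of +2.4% at the kilo scale, +4.9% at mega, +7.4% at giga, and +10.0% at tera, which is why the distinction stops being pedantry once drives are measured in terabytes.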