Quote: "That's the 1950s created US centric JEDEC definition, it was never a global standard.
The 1999 revised IEC 60027-2 version is the current true standard, and it clearly lists 1024 bytes as a kibiBit, in an attempt to clear up the confusion that surrounds this sphere of computing.
This is still a large area of discussion and debate in the IT world, but that doesn't make your claims a standard. The International Electrotechnical Commission and ISO bodies create the global standards; not you."
Well, if you're going to go by the ISO on this, you should be writing them as kB, mB, etc., as that is their given standardisation, which is currently ONLY adopted by Linux.
Unix, Windows, MacOS, etc. still use the JEDEC definitions, which use the capitalised prefixes, or alternatively the IEC ones, which follow the capital with a little i.
e.g. MB and MiB are identical at 1024^2 bytes, whereas mB is 1000^2 bytes.
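Just to put actual numbers to that, here's a quick sketch in C; it's purely illustrative, and the variable names are mine rather than anything out of a standard's text:

#include <stdio.h>

int main(void)
{
    /* Illustrative figures only: the binary reading of a megabyte versus the decimal one. */
    const unsigned long long binary_mb  = 1024ULL * 1024ULL;  /* JEDEC "MB" / IEC "MiB": 1,048,576 bytes */
    const unsigned long long decimal_mb = 1000ULL * 1000ULL;  /* SI-style decimal megabyte: 1,000,000 bytes */

    printf("binary  MB/MiB: %llu bytes\n", binary_mb);
    printf("decimal MB    : %llu bytes\n", decimal_mb);
    printf("difference    : %llu bytes per megabyte\n", binary_mb - decimal_mb);
    return 0;
}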
Given that Microsoft Windows and MacOS-based products are what almost the entire hard disk market is aimed towards, those are the products dictating the standards, rather than some board of people who are not a group of peers but are instead there simply to say "this is right, you do it our way or else it isn't standard".
That's just bullcrap. Market leaders or a group of peers should dictate the standards... Manufacturers hide behind the ISO whenever they want to advertise to the public to make themselves look better, but the second they want to do something else they'll hide behind the market leaders.
CDs, even using ISO 9660, do not show available space in the ISO/IEC units but in the JEDEC ones. In fact Joliet is the only one that does, and it isn't required to follow any given standard at all.
If someone using a standard doesn't even follow their own guidelines, how the hell does that make it a standard?
Microsoft and Apple have been using this standard for displaying data space for over two decades; declaring they should change it simply because a bunch of suits say so is just naive and stupid, especially given the platforms you're trying to use them on.
What makes this worse is that the JEDEC system actually makes far more sense when you want to convert between bits and bytes. It is a physical measurement of the data space available and utilised, the same as what it is in memory, rather than something the public can figure out in their heads a little more easily.
What the ISO did was simply cause more confusion about which units companies will present to the public, as they can hide behind what they've said in a semantics debate; it suits them to claim that more space is available than is physically there.
It means that people not only have to deal with a system that is already there and defined by the sodding hardware itself, but also spend extra CPU cycles converting it into something more "obvious" rather than simply displaying what the memory offset spits back.
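As a rough sketch of that point in C (the buffer size and names here are made up, just to show that the binary units fall straight out of shifts while the decimal ones need real division):

#include <stdio.h>

int main(void)
{
    /* Made-up buffer size, purely for illustration. */
    unsigned long long bytes = 262144ULL;          /* a 256 KiB buffer */

    unsigned long long bits   = bytes << 3;        /* bytes -> bits: multiply by 8, a single shift */
    unsigned long long kib    = bytes >> 10;       /* bytes -> KiB: divide by 1024, another shift */
    unsigned long long kb_dec = bytes / 1000ULL;   /* bytes -> decimal kB: needs an actual divide */

    printf("%llu bytes = %llu bits = %llu KiB = %llu kB (decimal)\n",
           bytes, bits, kib, kb_dec);
    return 0;
}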
I firmly believe they should completely scrap the whole concept of metric measurements for anything computer related, because as I said before... there aren't 10 bits per byte, there are 8. That is how the hardware has been designed and developed for over half a century.
What the ISO did when they declared those as the "standards" that almost no one uses was to tell people the hardware is wrong and they are right.
That isn't standardisation, that's just mounting confusion. Standardisation would've been to declare 1 KB = 1024 bytes, end of story.
When only a single operating system, one barely used by business and home users alike, utilises it, how on earth can it be called a standard?
Seriously, explain to me how this is in any fashion a standardisation?
It's like having a room full of 100 people who all fold napkins. 90 of them fold them into squares and the other 10 fold them into triangles... yet a small group of 5 people in another room decide that triangles are the better way of doing it. It had nothing to do with anyone in that room; they just decided it on their own, and for some reason that makes them right.
Because that's how these damn standards for "showing data size" came about. Unlike things like C and C#, where the creators themselves wrote out the standards that must be followed for something to call itself that language.
You know why it happened like that? It's because no one person invented the computer and digital technology. What I don't bloody understand, though, is why it wasn't a peer-review system like the ARB has for OpenGL. That would've actually made some bloody sense, but NOOOOOO, it was a bunch of suits in a room who know sod all about computers making the decisions, because it was believed "we need an external, unbiased decision"; which is bullcrap in itself, because it should be the industry that works daily with such things that decides it.
As I said before... a standardisation of this fashion isn't something that can be dictated by some "board", especially when the majority do not agree. If you want to follow the ISO then fine, do so; sit there and claim it as a standard. On the flipside, the computer industry itself doesn't follow these standards.
In fact, try this: read through the manual that came with your HDD and you'll notice it states "Estimated 250,000,000,000 Bytes" rather than saying 250GB. All hardware manufacturers also have disclaimers about this on their websites as well.
Sure it sounds good... 250GB, but the physical data it can hold shows up as 232GB; it's just crap in order to sell more, because it sounds larger and more rounded. You know that's the only reason retailers and manufacturers will ever claim to follow the standards as well: simply to cover their backsides legally. It has nothing to do with them believing whatsoever in the standard.
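For what it's worth, the gap is nothing but unit arithmetic; here's a quick sketch, with the 250,000,000,000 figure taken from the advertised example above:

#include <stdio.h>

int main(void)
{
    /* Advertised decimal capacity from the HDD example above; the rest is just arithmetic. */
    const double advertised_bytes = 250000000000.0;                          /* 250 decimal GB */
    const double decimal_gb = advertised_bytes / 1e9;                        /* 250.0 */
    const double binary_gb  = advertised_bytes / (1024.0 * 1024.0 * 1024.0); /* ~232.8 */

    printf("advertised: %.1f GB (decimal)\n", decimal_gb);
    printf("reported  : %.1f GB (binary, i.e. GiB)\n", binary_gb);
    return 0;
}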
Anyone who feels even remotely surprised by such things in this day and age is just beyond naive.
Hell, you guys at TGC have done similar stuff with your products. It's all about how you say something rather than what you've said... eh?
At the end of the day it's all business though, so it's cool.
You'll follow whatever suits your interests best at that given moment. Screw the public... once they've parted with their money there's nothing they can do about it. Legally they don't have a leg to stand on, despite having been blatantly lied to.
I mean, you say this is purely my view, not a standard; but again, look at the industry itself. I'd say there are far more companies that back my view than back the so-called "standards", which I doubt will ever be adopted by the big names.
http://en.wikipedia.org/wiki/Kilobyte
Note the line saying it's been "Forbidden to use 1024" as the definition; this has been the case for several years now... yet Microsoft and Apple both continue this usage. You honestly believe that will change anytime soon?
Hell, they don't even use the ISO units for their bitrates for data transfers.
Just to lighten the mood