In 1988, while philosophizing with a fellow programmer, I digressed to the point of insanity with the spark of a concept that was to change my life. I had been reading papers on Quantum Mechanics and was musing over the brain's incapacity to see that an object could be in multiple states simultaneously. The cat is dead, alive or absent… but I couldn't 'swallow' that the cat could be both dead and alive at the same time. I was a scientist, but obviously I was having a hard time making the jump to being a Quantum Physicist.
The computer world has, for many years, stabilised on an 8 bit hardware addressing system that is beautifully expressed in Hexadecimal as 00 through to FF. This encompasses all of the values from 0 to 255 within 2 digits and is still present (and required), even on our wonderful 64 bit operating systems. The Hexadecimal numbering system (base 16) writes its digits as: 0 1 2 3 4 5 6 7 8 9 A B C D E F.
This is shown in our 'extended keyboard values' and is the basis of virtually all computer code ever written. In other words, everything that can be typed at the keyboard has a value of 0 – 255. Compilers and interpreters mash these values in remarkable ways to create the 'executable' files that run on our computers. Of course, the computer's processor, while remarkable, is still based on a set of binary gates. Binary gates are simply On or Off switches (Open or Closed Circuits) that direct electrical current through a maze, arriving at different positions to create different effects.
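To make the byte-to-hex mapping concrete, here is a minimal illustration (the sample string is arbitrary): every character value from 0 to 255 fits in exactly two hex digits and eight binary digits.

```python
# Every 8-bit character value maps to a two-digit hex code 00..FF
# and an eight-digit binary pattern.
for ch in "Hi!":
    b = ord(ch)
    print(f"{ch!r} -> decimal {b:3d} -> hex {b:02X} -> binary {b:08b}")
```

Running this shows, for example, that 'H' is decimal 72, hex 48, binary 01001000.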
Data stored on computers is always stored in binary format. It is sent to the hard disk (or whatever the medium may be) as a stream of binary data, although we normally view it in 'words': bunches of binary digits grouped in 8 bit format and represented as Hexadecimal chunks. Binary data is extremely difficult for humans to read, but it is the only way for computers to read… so far.
I was relating this to electronic data, which is stored magnetically as either a North-South ' I ' or an East-West ' – ', creating a series of 'compass points' from the molecules on the substrate of the disk. So ' – – – – – – ' was a bunch of zeroes and ' I I I I I ' was a bunch of ones. That was easy to visualize, as I had seen a compass often enough to understand the process. A hard disk is simply millions of compass needles, each one pointing either NS or EW.
There were times when the data was corrupted, and upon examination of the actual magnetic fields it could be ascertained that some of the compass points were not perpendicular; simply put, some were pointed NE or NW etc… ' / ' or ' \ '. This is what is generally known as 'noise', and previously, computer programs were asked to skip the offending areas and jump to the pure NS or EW data.
My programming skills were called on to throw 'What If' formulae at the data stream, making the irrational decision that if the data pointed NE, we would 'pretend' it was NS, and if it pointed NW we would pretend it was representing EW. Oddly, real answers started popping out, followed by junk. By creating a series of these What If statements within the program, however, known data that had been intentionally corrupted could always be recovered. We were not changing a single byte of data, just reading it through a slightly warped magnifying glass. This work culminated in a series of patents that revolutionized the data recovery industry and brought our 'Recovery Rate' from the best in the industry at 74% to greater than 99.9% (as reported by our clients). We had won.
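The 'What If' idea can be sketched as follows. This is not the patented method, just a toy model of the pretence it describes: each reading is an angle off the platter (90° ≈ North-South, a '1'; 0° ≈ East-West, a '0'), and a tilted reading is snapped to whichever pure axis it is closest to. The angle values are invented for illustration.

```python
# Toy model: snap each noisy "compass" reading to the nearest pure axis.
# 90 degrees ~ North-South ('1'), 0/180 degrees ~ East-West ('0').

def snap(angle_deg: float) -> int:
    """Pretend a tilted reading is whichever pure orientation it is closest to."""
    a = angle_deg % 180
    # distance to the NS axis (90) versus the EW axis (0 or 180)
    return 1 if abs(a - 90) < min(a, 180 - a) else 0

readings = [88.0, 3.0, 47.0, 92.5, 130.0, 1.5]   # hypothetical noisy angles
bits = [snap(a) for a in readings]
print(bits)   # -> [1, 0, 1, 1, 1, 0]
```

In the real system, as the article describes, many such guesses were tried in series and the results checked against known data, so nothing on the disk was ever altered.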
Which started me thinking.
What if we didn't have to store data in a binary mode? If computers can only read 'on or off', can they be asked to do that in parallel, or in ways that emulate more than On or Off? If these are the X and Y values that plot the data point, what about more values?
I came up with the concept of storing data with a ‘Z’ value also.
Consider this: a 'NewDVD' type of product stores data based not only on the NS and EW system, but also on the colour of that burned dash or stroke. There can be multiple colours (modern colour systems define millions), so now we can store the letter ' A ' as being not only present, but also with attributes such as 'Font', 'Bold', 'Size' etc… and 64 million other characteristics.
Practically, all NewDVDs would need a CRC or reference check done to calibrate the colours for the 'NewDVD' reader as it opens the disk… then data could be read X-Y first, followed by Z. This means that the primary functions of a program could be running while the Z data is still loading, or the raw text of a web page could load before the 'decorations' follow.
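One way to picture the Z-depth idea in code, purely as a sketch: each mark carries one ordinary orientation bit (the X-Y value) plus a 6-bit colour index (a 64-colour palette) as an extra 'Z' payload. The function names and packing scheme here are invented for illustration.

```python
# Hypothetical "Z-depth" mark: one X-Y orientation bit plus a 6-bit
# colour index (64 colours), so a single mark holds 7 bits in all.

def write_mark(bit: int, colour: int) -> int:
    assert bit in (0, 1) and 0 <= colour < 64
    return (colour << 1) | bit          # pack the colour above the bit

def read_mark(mark: int) -> tuple[int, int]:
    return mark & 1, mark >> 1          # X-Y bit first, Z colour second

mark = write_mark(1, 37)
print(read_mark(mark))                  # -> (1, 37)
```

Reading the low bit first mirrors the article's point that X-Y data could be consumed immediately while the Z layer follows.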
The neat thing is that much more data can be stored more efficiently, although the cost of the reader and writer would be higher. In 1988 this was a pipe dream – today, it is a probability.
A standard DVD could hold approximately 50 terabytes of information, based on a simple 64 colour system. Future generations could hold much more data, based on the colour depth (Z factor), and provide a commercial upgrade route.
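The scaling behind the upgrade route follows from a simple fact: a palette of N distinguishable colours carries log2(N) extra bits per mark, so deeper colour means more data from the same physical marks. A quick back-of-envelope check:

```python
# Extra bits per mark contributed by the colour ("Z") channel:
# a palette of N distinguishable colours encodes log2(N) bits.
import math

for palette in (64, 16_000_000, 64_000_000):
    extra_bits = math.log2(palette)
    print(f"{palette:>10} colours -> about {extra_bits:.1f} extra bits per mark")
```

A 64-colour palette yields exactly 6 extra bits per mark; 16 million colours, roughly 24.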
This process was the subject of my 5th Patent Application, the only one that failed: the term I had used to describe the multiple depths of data, 'Multinary', had been coined by someone else.
I gave up the fight and moved on to other moving targets. I was wrong to do so.
Some day soon, the technology of reading and writing binary bits with a Z depth will become practical. Blu-ray already uses a laser to calibrate itself against a known starting dataset on the disc. The jump to calibrating 64 colours, 16 million colours or 64 million colours is simply a matter of time. There would still be only 256 characters on your keyboard, but effectively each character could be assigned any of 64 million colour attributes.
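The calibration step might be sketched as a nearest-neighbour lookup: the reader first scans a known reference strip, then classifies each later mark against the measured reference values. The reference intensities below are invented for illustration (colours reduced to simple greyscale numbers).

```python
# Hypothetical calibration: scan 64 known reference intensities first,
# then classify each measured mark as the nearest reference colour.

reference = {i: i * 4 for i in range(64)}   # invented intensities 0..252

def classify(measured: float) -> int:
    # nearest-neighbour match against the calibrated reference colours
    return min(reference, key=lambda i: abs(reference[i] - measured))

print(classify(121.7))   # -> 30 (nearest reference intensity is 120)
```

A real reader would need this step per disc, since dye, burn strength and ageing would shift the measured colours, which is exactly why the article calls for a reference check on open.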
The programming opportunities are endless when we consider that a 'Colour Compiler' could reduce a program like Photoshop or Word to a megabyte in size, and that could include all the language variations etc…
Bring it on!