would have had to re-tool and to completely re-educate, to accommodate such a radically different technology. Indeed, when forced forward by genuine entrepreneurs intent on proving its usefulness, “trigital” or three-character computing did prove to be a relatively brief foray.

Originally, binary code, and thus digital technology, was developed because the unsophisticated electronics of the day could easily detect whether a switch was “on” or was “off.” When the speed limitations of silicon chips were at last reached, chip developers could either abandon hope of getting faster or sacrifice accuracy, an impossible choice. However, detection of state had achieved a high enough resolution that now, instead of just a third state, five states were as easy to implement. But why stop there?
People like Rob were assigned the task of seeing beyond the current technology. For Rob the developments seemed very timely, for his interest in creating true artificial intelligence had been smothered by the facts. Utilizing the best storage and the fastest processing technologies had failed miserably to provide enough computing power to produce anything more than mimicry of human thought. Here at last was something he considered important enough to be given his full and undivided attention, and he threw himself into this work.
In short form, the two characters in an eight-bit byte permit just 256 combinations. Increasing the number of characters to three raises that to 6,561. Jumping it up to five characters makes it 390,625. Eight bits are no longer optimum for that many combinations. Shortened bytes mean much shorter strings to carry the same data. But to that point, the most significant advance of all had been overlooked, and Rob found it while looking at options afforded by additional characters: markers within bytes of information allowed a mathematical turn that created a form of machine shorthand. A derivation of fractals would be employed. Entire phrases, sometimes whole ideas, could be expressed in one short string. An entire new language unfolded before him, like the opening of a book.
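The counts in that passage follow from raising the number of states per character to the number of characters in a byte; a minimal sketch of the arithmetic (illustrative only, with the eight-character byte length taken from the text):

```python
# Distinct combinations in an eight-character "byte" as the number of
# states per character grows: binary (2), trigital (3), five-state (5).
BYTE_LENGTH = 8

for states in (2, 3, 5):
    print(states, states ** BYTE_LENGTH)
# prints:
# 2 256
# 3 6561
# 5 390625
```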
To Rob, this was more than just a breakthrough. It was more like the shutters to a darkened room being thrown wide on a sunny day, permitting a glorious view into a previously unseen world. And if he was impressed, the world would be astounded.
Here at last was a means to attain goals he had set for himself more than two decades earlier. A machine would actually think.
Alas, new infrastructure required new support. Back to the algorithms. To the human mind, the requirements would be complex beyond imagination. Only a soon-to-be-outmoded digital super-computer could generate them, and it was not a two-second wait while numbers were crunched. It was not unlike the dilemma faced by a primitive man who had discovered that he could extract metal from ore by heating it in a fire. Here was something entirely new with desirable properties, but how could it be shaped into something he could use?
Rob believed that machine thought would include a sequence of models that the computer would sort through, in search of similarities or references to any stimulus or problem at hand. No single algorithm would suffice; each would have to reference others as a situation dictated. A new kind of progression would take place; it would be comparatively simple for the new machine language to manufacture, almost instantaneously, new algorithms of its own. It looked startlingly like realization that would develop new ideas and permit new conclusions to be drawn. Learning. Data reception becoming perception.
Yet there was a single element that could scarcely be overlooked. The true nature of