Yes! One can build such a (really slow) processor by putting together discrete logic.

Or implement it in an FPGA.
(Assuming this may not have an immediate application and is just for thought! Apologies if it is too
lengthy. Just ideation!) Here are some basic steps one can consider:
1.Develop an instruction set first. List each instruction and the required number of
operands, and assign each instruction a unique binary number (its opcode). Your instruction
decoder is going to recognize instructions by the number you assign (a minimal encoding
sketch is given after this list).
If it's for a calculator application, focus on math operations. Basic arithmetic is easy; scientific
computing needs a more sophisticated instruction set and architecture.
2.Your instructions and architecture implicitly define the bus widths (instruction and
data: 8, 16, 32, 64 or even 128 bits!).
3.Build the instruction decoder. It will be a combinational logic circuit, plus
supporting sequential logic for timing, sequencing and synchronization. At a very
high level you can think of the instruction decoder as a decoder/demultiplexer: the
signals from this block are going to enable/disable and set up subsequent operations
(the sketch after this list models it as a simple function).
4.Build all the necessary registers, special function registers (SFRs), I/O buffers, pins,
and enable/disable signals.
5.Build the timers, counters and sync circuits.
6.Build special peripheral-driving hardware circuits. With the example of a calculator,
that could be the screen, keypad, battery monitoring, speaker/buzzer, etc.
7.Build the ALU (Arithmetic Logic Unit). Actually build an advanced ALU!
8.Build nice math hardware. Also implement floating-point arithmetic hardware!
(Special functions such as logs and trigonometric functions are implemented as a Taylor
series or other custom series math on some hardware; a small series sketch follows this list.)
9.Actually, we live in the age of data science and AI, so implement array/vector
processing units. Build a vector processor!
10.Implement scratchpad, cache and other internal book-keeping memory areas.
11.Build bus peripherals such as I2C, SPI and memory interfaces, and any other
useful peripheral one can think of!
12.Build the program counter and fetch circuits that feed instructions to the
decoder. Or build an instruction pipeline and some instruction-level parallelism.
(A toy fetch-decode-execute loop is sketched after this list.)
13.Look into the C programming language specification. Try to make your own CPU
instructions a suitable compilation target for that language. Maybe develop your own
implementation specifics and write your own compiler!
14.Most importantly, have a RESET implemented inside the circuit and also make it
available on an external pin!
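
As an illustration of steps 1 and 3, here is a minimal behavioral sketch in Python (software, not HDL) of a made-up 8-bit encoding and the decoder that turns an instruction word into enable signals. The opcode values, field widths and signal names are assumptions chosen for the example, not a prescription.

```python
# Hypothetical 8-bit ISA sketch: a 4-bit opcode plus two 2-bit register fields.
OPCODES = {
    "NOP": 0x0,
    "ADD": 0x1,    # ADD rd, rs   -> rd = rd + rs
    "SUB": 0x2,    # SUB rd, rs   -> rd = rd - rs
    "LDI": 0x3,    # LDI rd, imm2 -> rd = 2-bit immediate (reuses the rs field)
    "LOAD": 0x4,   # LOAD rd, [rs]
    "STORE": 0x5,  # STORE rd, [rs]
}

def encode(mnemonic, rd=0, rs=0):
    """Pack an instruction into one 8-bit word: [opcode:4][rd:2][rs:2]."""
    return (OPCODES[mnemonic] << 4) | ((rd & 0x3) << 2) | (rs & 0x3)

def decode(word):
    """Behavioral model of the combinational decoder: word in, control signals out."""
    opcode = (word >> 4) & 0xF
    signals = {
        # One enable line per functional unit, as a decoder/demultiplexer would drive.
        "alu_en": opcode in (OPCODES["ADD"], OPCODES["SUB"]),
        "imm_load": opcode == OPCODES["LDI"],
        "mem_read": opcode == OPCODES["LOAD"],
        "mem_write": opcode == OPCODES["STORE"],
        "rd": (word >> 2) & 0x3,
        "rs": word & 0x3,
    }
    return opcode, signals

word = encode("ADD", rd=1, rs=2)
print(hex(word), decode(word))   # 0x16 plus the decoded control signals
```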
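
For the series math mentioned in step 8, a software model of a Taylor-series sine with crude range reduction might look like the sketch below. The term count is arbitrary, and a real math unit would more likely evaluate the same series in fixed point.

```python
import math

def sin_taylor(x, terms=10):
    """Approximate sin(x) with the Taylor series x - x^3/3! + x^5/5! - ..."""
    # Crude range reduction into [-pi, pi] so the series converges quickly.
    x = math.fmod(x, 2.0 * math.pi)
    if x > math.pi:
        x -= 2.0 * math.pi
    elif x < -math.pi:
        x += 2.0 * math.pi

    result = 0.0
    term = x                       # first term: x^1 / 1!
    for n in range(terms):
        result += term
        # Next odd-power term, computed incrementally to avoid big factorials.
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))
    return result

print(sin_taylor(1.0), math.sin(1.0))   # should agree to many decimal places
```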
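
And for the program counter and fetch circuits of step 12 (plus the RESET state of step 14), a toy fetch-decode-execute loop continuing the first sketch could look like this. It assumes OPCODES, encode and decode from that sketch are in scope; the register count, memory size and halt-on-end-of-program convention are simplifications, not a recommendation.

```python
def run(program, max_steps=100):
    """Toy fetch-decode-execute loop; program is a list of encoded 8-bit words."""
    # RESET state: program counter, registers and memory all cleared (step 14).
    pc = 0
    regs = [0, 0, 0, 0]
    mem = [0] * 16

    for _ in range(max_steps):
        if pc >= len(program):               # running off the end acts as HALT here
            break
        opcode, sig = decode(program[pc])    # fetch + decode
        pc += 1                              # default next PC; no branches in this sketch
        if sig["imm_load"]:
            regs[sig["rd"]] = sig["rs"]      # the 2-bit immediate lives in the rs field
        elif sig["alu_en"]:
            if opcode == OPCODES["ADD"]:
                regs[sig["rd"]] += regs[sig["rs"]]
            else:                            # SUB
                regs[sig["rd"]] -= regs[sig["rs"]]
        elif sig["mem_read"]:
            regs[sig["rd"]] = mem[regs[sig["rs"]] % len(mem)]
        elif sig["mem_write"]:
            mem[regs[sig["rs"]] % len(mem)] = regs[sig["rd"]]
        # NOP: nothing is enabled, only the PC advances.
    return regs

program = [encode("LDI", 1, 3), encode("LDI", 2, 2), encode("ADD", 1, 2)]
print(run(program))   # expect [0, 5, 2, 0]
```

In the discrete-logic or FPGA version, this Python loop is replaced by a clocked state machine: the sequencing logic from step 3 steps through fetch, decode and execute, and RESET forces that state machine and the program counter back to their initial state.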
