
Computer programming is the process of designing and building an executable computer
program to accomplish a specific computing result or to perform a specific task. Programming
involves tasks such as: analysis, generating algorithms, profiling algorithms' accuracy and resource
consumption, and the implementation of algorithms in a chosen programming language (commonly
referred to as coding).[1][2] The source code of a program is written in one or more languages that are
intelligible to programmers, rather than machine code, which is directly executed by the central
processing unit. The purpose of programming is to find a sequence of instructions that will automate
the performance of a task (which can be as complex as an operating system) on a computer, often
for solving a given problem. Proficient programming thus often requires expertise in several different
subjects, including knowledge of the application domain, specialized algorithms, and formal logic.
Tasks accompanying and related to programming include: testing, debugging, source
code maintenance, implementation of build systems, and management of derived artifacts, such as
the machine code of computer programs. These might be considered part of the programming
process, but often the term software development is used for this larger process with the
term programming, implementation, or coding reserved for the actual writing of code. Software
engineering combines engineering techniques with software development practices. Reverse
engineering is a related process used by designers, analysts, and programmers to understand and
re-create/re-implement an existing program.[3]

History

Ada Lovelace, whose notes added to the end of Luigi Menabrea's paper included the first algorithm designed
for processing by an Analytical Engine. She is often recognized as history's first computer programmer.
See also: Computer program § History, Programmer § History, and History of programming
languages
Programmable devices have existed for centuries. As early as the 9th century, a
programmable music sequencer was invented by the Persian Banu Musa brothers, who described
an automated mechanical flute player in the Book of Ingenious Devices.[4][5] In 1206, the Arab
engineer Al-Jazari invented a programmable drum machine where a musical
mechanical automaton could be made to play different rhythms and drum patterns, via pegs
and cams.[6][7] In 1801, the Jacquard loom could produce entirely different weaves by changing the
"program" – a series of pasteboard cards with holes punched in them.
Code-breaking algorithms have also existed for centuries. In the 9th century, the Arab
mathematician Al-Kindi described a cryptographic algorithm for deciphering encrypted code, in A
Manuscript on Deciphering Cryptographic Messages. He gave the first description
of cryptanalysis by frequency analysis, the earliest code-breaking algorithm.[8]
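As a concrete illustration of that method, the following minimal sketch (a modern Python example, not a reconstruction of Al-Kindi's manuscript) tallies letter frequencies in a ciphertext; in a simple substitution cipher, the most frequent ciphertext symbols likely stand for the most frequent letters of the plaintext language.

    from collections import Counter

    def letter_frequencies(ciphertext):
        # Count only alphabetic characters, ignoring case and spacing.
        letters = [c for c in ciphertext.upper() if c.isalpha()]
        # Return letters ordered from most to least frequent.
        return Counter(letters).most_common()

    # Hypothetical ciphertext: a Caesar shift of 3 applied to an English sentence.
    sample = "WKLV LV D VLPSOH FDHVDU FLSKHU"
    print(letter_frequencies(sample))  # 'V' and 'L' dominate, hinting at common plaintext letters

Matching such counts against the expected frequency order of the target language is the essence of the frequency-analysis attack described above.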
The first computer program is generally dated to 1843, when mathematician Ada Lovelace published
an algorithm to calculate a sequence of Bernoulli numbers, intended to be carried out by Charles
Babbage's Analytical Engine.[9]
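For comparison with that 1843 program, here is a minimal modern sketch (in Python with exact rational arithmetic; it is not a transcription of Lovelace's diagram for the Analytical Engine) that computes the same kind of sequence from the standard Bernoulli-number recurrence: the sum of C(m+1, k)·B_k over k = 0..m equals 0 for every m ≥ 1.

    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        # B_0 = 1; each later B_m is obtained by solving
        # sum(C(m+1, k) * B_k for k = 0..m) = 0 for B_m.
        B = [Fraction(1)]
        for m in range(1, n + 1):
            s = sum(comb(m + 1, k) * B[k] for k in range(m))
            B.append(-s / (m + 1))
        return B

    print([str(b) for b in bernoulli(8)])
    # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']

The recurrence and the use of exact fractions are standard; the function name and output format are illustrative choices only.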

Data and instructions were once stored on external punched cards, which were kept in order and arranged in
program decks.
In the 1880s, Herman Hollerith invented the concept of storing data in machine-readable form.[10]
Later, a control panel (plug board) added to his 1906 Type I Tabulator allowed it to be programmed
for different jobs, and by the late 1940s unit record equipment such as the IBM 602 and IBM 604
were programmed by control panels in a similar way, as were the first electronic computers.
However, with the concept of the stored-program computer introduced in 1949, both programs and
data were stored and manipulated in the same way in computer memory.[11]

Machine language
Machine code was the language of early programs, written in the instruction set of the particular
machine, often in binary notation. Assembly languages were soon developed that let the
programmer specify instructions in a text format (e.g., ADD X, TOTAL), with abbreviations for each
operation code and meaningful names for specifying addresses. However, because an assembly
language is little more than a different notation for a machine language, any two machines
with different instruction sets also have different assembly languages.
Wired control panel for an IBM 402 Accounting Machine.
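To illustrate the point that an assembly language is essentially a different notation for machine code, the sketch below shows a toy assembler in Python; the opcode values, symbol addresses, and instruction layout are hypothetical and do not correspond to any real instruction set.

    # Hypothetical operation codes and memory addresses, for illustration only.
    OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3}
    SYMBOLS = {"X": 0x10, "TOTAL": 0x11}

    def assemble(line):
        # Translate one instruction such as "ADD X, TOTAL" into a numeric word
        # laid out as: opcode | address of first operand | address of second operand.
        mnemonic, operands = line.split(maxsplit=1)
        addr_a, addr_b = (SYMBOLS[s.strip()] for s in operands.split(","))
        return (OPCODES[mnemonic] << 16) | (addr_a << 8) | addr_b

    print(hex(assemble("ADD X, TOTAL")))  # -> 0x21011

A different machine would use different opcode values and a different word layout, which is why each instruction set comes with its own assembly language.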

Compiler languages
High-level languages made the process of developing a program simpler and more understandable,
and less bound to the underlying hardware. The first compiler-related tool, the A-0 System, was
developed in 1952[12] by Grace Hopper, who also coined the term 'compiler'.[13][14] FORTRAN, the first
widely used high-level language to have a functional implementation, came out in 1957,[15] and many
other languages were soon developed—in particular, COBOL aimed at commercial data processing,
and Lisp for computer research.
These compiled languages allow the programmer to write programs in terms that are syntactically
richer and more abstract than machine code, making it easier to target varying machine
instruction sets via compilation declarations and heuristics. Compilers harnessed the power of
computers to make programming easier[15] by allowing programmers to specify calculations by
entering a formula using infix notation.

Source code entry


See also: Computer programming in the punched card era
Programs were mostly still entered using punched cards or paper tape. By the late 1960s, data
storage devices and computer terminals became inexpensive enough that programs could be
created by typing directly into the computers. Text editors were also developed that allowed changes
and corrections to be made much more easily than with punched cards.
