
Artificial intelligence

Artificial intelligence (AI) is the intelligence of machines and the branch of
computer science that aims to create it. AI textbooks define the field as "the study
and design of intelligent agents",[1] where an intelligent agent is a system that
perceives its environment and takes actions that maximize its chances of success.[2]
John McCarthy, who coined the term in 1956,[3] defines it as "the science and
engineering of making intelligent machines."[4]
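
The agent definition above maps naturally onto a simple perceive-act control
loop. The following Python snippet is a minimal sketch under that reading; the
GridEnvironment class, its observe and apply methods, and the choose_action
policy are illustrative inventions, not part of any standard AI library.

    # Minimal sketch of an intelligent agent's perceive-act loop.
    # GridEnvironment, observe(), apply() and choose_action() are
    # hypothetical placeholders invented for this illustration.

    class GridEnvironment:
        """Toy world: the agent moves along a line toward a goal position."""
        def __init__(self, start=0, goal=5):
            self.position = start
            self.goal = goal

        def observe(self):
            return self.position

        def apply(self, action):
            self.position += action  # action is -1 or +1

    def choose_action(percept, goal):
        """Greedy policy: take the action that closes the gap to the goal."""
        return 1 if percept < goal else -1

    env = GridEnvironment()
    for step in range(10):
        percept = env.observe()                      # perceive
        if percept == env.goal:
            break
        env.apply(choose_action(percept, env.goal))  # act
    print("reached goal after", step, "steps")

Here "maximizing the chance of success" collapses to a trivial greedy rule;
real agents replace choose_action with planning, learning or search.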

The field was founded on the claim that a central property of humans, intelligence
—the sapience of Homo sapiens—can be so precisely described that it can be
simulated by a machine.[5] This raises philosophical issues about the nature of the
mind and the limits of scientific inquiry, issues which have been addressed by
myth, fiction and philosophy since antiquity.[6] Artificial intelligence has been the
subject of optimism,[7] but has also suffered setbacks[8] and, today, has become an
essential part of the technology industry, providing the heavy lifting for many of
the most difficult problems in computer science.[9]

AI research is highly technical and specialized, deeply divided into subfields that
often fail to communicate with each other.[10] Subfields have grown up around
particular institutions, the work of individual researchers, the solution of specific
problems, longstanding differences of opinion about how AI should be done and
the application of widely differing tools. The central problems of AI include such
traits as reasoning, knowledge, planning, learning, communication, perception and
the ability to move and manipulate objects.[11] General intelligence (or "strong AI")
is still among the field's long term goals.[12]

Virtual memory
(Figure caption: The program thinks it has a large range of contiguous
addresses, but in reality the parts it is currently using are scattered around
RAM, and the inactive parts are saved in a disk file.)

In computing, virtual memory is a memory management technique developed for
multitasking kernels. This technique virtualizes a computer architecture's
various hardware memory devices (such as RAM modules and disk storage drives),
allowing a program to be designed as though:

• there is only one hardware memory device, and this "virtual" device acts
like a RAM module.
• the program has, by default, sole access to this virtual RAM module as the
basis for a contiguous working memory (an address space).
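
To make the virtualization concrete, here is a toy Python sketch of the
address translation a virtual memory system performs; PAGE_SIZE, page_table,
the frame numbers and the translate function are all invented for this
illustration and do not model any real MMU. A contiguous virtual address maps
to a scattered RAM frame, or to a disk slot when the page is swapped out.

    # Toy simulation of virtual-to-physical address translation.
    # The page size, table contents and frame numbers are arbitrary.

    PAGE_SIZE = 256  # bytes per page in this toy example

    # Virtual page number -> (location, frame or disk slot). Pages 0 and 2
    # are resident in scattered RAM frames; page 1 is swapped out to disk.
    page_table = {
        0: ("ram", 7),
        1: ("disk", 3),
        2: ("ram", 2),
    }

    def translate(virtual_address):
        """Map a contiguous virtual address onto scattered real storage."""
        page, offset = divmod(virtual_address, PAGE_SIZE)
        location, frame = page_table[page]
        if location == "disk":
            # A real kernel would take a page fault here and load the page.
            return f"page fault: load disk slot {frame}, then retry"
        return f"RAM frame {frame}, physical byte {frame * PAGE_SIZE + offset}"

    print(translate(10))   # page 0 -> RAM frame 7
    print(translate(300))  # page 1 -> swapped out, would fault

The program only ever computes with the virtual address on the left; where the
bytes actually live is the business of the kernel and the memory management
unit.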

Systems that employ virtual memory:

• use hardware memory more efficiently than systems without virtual memory.
• make the programming of applications easier by:
  o hiding fragmentation.
  o delegating to the kernel the burden of managing the memory hierarchy;
there is no need for the program to handle overlays explicitly.
  o obviating the need to relocate program code or to access memory with
relative addressing.

Memory virtualization is a generalization of the concept of virtual memory.

Virtual memory is an integral part of a computer architecture; all implementations
(excluding emulators and virtual machines) require hardware support, typically in
the form of a memory management unit built into the CPU. Consequently, older
operating systems (such as DOS[1] of the 1980s or those for the mainframes of the
1960s) generally have no virtual memory functionality, though notable exceptions
include the Atlas, B5000, IBM System/360 Model 67, IBM System/370 mainframe
systems of the early 1970s, and the Apple Lisa project circa 1980.

Embedded systems and other special-purpose computer systems which require
very fast and/or very consistent response times may opt not to use virtual memory
due to decreased determinism; virtual memory systems trigger unpredictable
interrupts that may produce unwanted "jitter" during I/O operations. This is
because embedded hardware costs are often kept low by implementing all such
operations with software (a technique called bit-banging) rather than with
dedicated hardware. In any case, embedded systems usually have little use for
multitasking features or complicated memory hierarchies.
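
As a rough illustration of bit-banging, the sketch below emulates a software
UART transmitter in Python; set_pin is a hypothetical stand-in for a real GPIO
register write, and a recorded trace replaces actual electrical levels so the
example runs anywhere. It shows why such code is timing-sensitive: every bit
transition is produced by ordinary instructions, so an unpredictable interrupt
can stretch a bit period and corrupt the waveform.

    # Schematic bit-banged UART transmit: every bit level is driven by
    # software. set_pin() is a hypothetical placeholder for a GPIO write;
    # here it records levels in a list so the sketch is runnable anywhere.

    output_trace = []

    def set_pin(level):
        output_trace.append(level)  # stand-in for a GPIO register write

    def uart_send_byte(byte):
        """Emit start bit, 8 data bits (least significant first), stop bit."""
        set_pin(0)                    # start bit
        for i in range(8):
            set_pin((byte >> i) & 1)  # data bit
            # A real driver would busy-wait one bit period here; if an
            # interrupt fires during the wait, the bit timing jitters.
        set_pin(1)                    # stop bit

    uart_send_byte(0x41)  # ASCII 'A'
    print(output_trace)   # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]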
Multi-user
Multi-user is a term that defines an operating system or application software that
allows concurrent access by multiple users of a computer. Time-sharing systems
are multi-user systems. Most batch processing systems for mainframe computers
may also be considered "multi-user", to avoid leaving the CPU idle while it waits
for I/O operations to complete. However, the term "multitasking" is more common
in this context.

An example is a Unix server where multiple remote users have access (such as via
Secure Shell) to the Unix shell prompt at the same time. Another example uses
multiple X Window sessions spread across multiple terminals powered by a single
machine; this is an example of the use of thin clients.
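
The defining property, concurrent access by several users, can be sketched
with a small threaded server. This is a toy illustration, not how SSH or X
actually works; the port number and the echo "session" are arbitrary choices.

    # Toy multi-user server: each connecting client is served concurrently
    # in its own thread. Purely illustrative; not a real SSH or X server.

    import socket
    import threading

    def session(conn, addr):
        """One user's session: echo input back until the user disconnects."""
        with conn:
            conn.sendall(b"welcome, user at %s\n" % str(addr).encode())
            while data := conn.recv(1024):
                conn.sendall(data)  # echo stands in for real per-user work

    def serve(port=5000):
        """Accept connections forever; each user gets an independent thread."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind(("127.0.0.1", port))
            srv.listen()
            while True:
                conn, addr = srv.accept()
                threading.Thread(target=session, args=(conn, addr)).start()

    if __name__ == "__main__":
        serve()  # try connecting twice, e.g. with: nc 127.0.0.1 5000

Each accepted connection runs in its own thread, so one user's slow session
does not block another's; a real multi-user operating system enforces the same
independence with processes, user accounts and permissions.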

Management systems are implicitly designed to be used by multiple users,
typically one or more system administrators and an end-user community.

The complementary term, single-user, is most commonly used when talking about
an operating system being usable only by one person at a time, or in reference to a
single-user software license agreement. Multi-user operating systems such as Unix
sometimes have a single user state or runlevel available for emergency
maintenance.

Biotechnology
Biotechnology is a field of applied biology that involves the use of living
organisms and bioprocesses in engineering, technology, medicine and other fields
requiring bioproducts. Modern use of the term also includes genetic engineering as
well as cell and tissue culture technologies. The concept encompasses a wide
range of procedures (and history) for modifying living organisms according to
human purposes, going back to the domestication of animals, the cultivation of
plants, and "improvements" to these through breeding programs that employ
artificial selection and hybridization. By comparison to biotechnology, bioengineering is
generally thought of as a related field with its emphasis more on higher systems
approaches (not necessarily altering or using biological materials directly) for
interfacing with and utilizing living things. The United Nations Convention on
Biological Diversity defines biotechnology as:[1]
"Any technological application that uses biological systems, living organisms, or
derivatives thereof, to make or modify products or processes for specific use."

Biotechnology draws on the pure biological sciences (genetics, microbiology,
animal cell culture, molecular biology, biochemistry, embryology, cell biology)
and in many instances is also dependent on knowledge and methods from outside
the sphere of biology (chemical engineering, bioprocess engineering, information
technology, biorobotics). Conversely, modern biological sciences (including even
concepts such as molecular ecology) are intimately entwined and dependent on the
methods developed through biotechnology and what is commonly thought of as the
life sciences industry.

Electrical engineering
Electrical engineering is a field of engineering that generally deals with the study
and application of electricity, electronics and electromagnetism. The field first
became an identifiable occupation in the late nineteenth century after
commercialization of the electric telegraph and electrical power supply. It now
covers a range of subtopics including power, electronics, control systems, signal
processing and telecommunications.

Electrical engineering may include electronic engineering. Where a distinction is
made, usually outside of the United States, electrical engineering is considered to
deal with the problems associated with large-scale electrical systems such as power
transmission and motor control, whereas electronic engineering deals with the
study of small-scale electronic systems including computers and integrated
circuits.[1] Alternatively, electrical engineers are usually concerned with using
electricity to transmit energy, while electronic engineers are concerned with using
electricity to transmit information. More recently, the distinction has become
blurred by the growth of power electronics.

Electronics
Electronics is the branch of science and technology which makes use of the
controlled motion of electrons through different media. The ability to control
electron flow is usually applied to information handling or device control.
Electronics is distinct from electrical science and technology, which deals with the
generation, distribution, control and application of electrical power. This
distinction started around 1906 with the invention by Lee De Forest of the triode,
which made electrical amplification possible with a non-mechanical device. Until
1950 this field was called "radio technology" because its principal application was
the design and theory of radio transmitters, receivers and vacuum tubes.

Most electronic devices today use semiconductor components to perform electron
control. The study of semiconductor devices and related technology is considered a
branch of physics, whereas the design and construction of electronic circuits to
solve practical problems come under electronics engineering. This article focuses
on engineering aspects of electronics.

Electronic devices and components

An electronic component is any physical entity in an electronic system used to
affect the electrons or their associated fields in a desired manner consistent with
the intended function of the electronic system. Components are generally intended
to be connected together, usually by being soldered to a printed circuit board
(PCB), to create an electronic circuit with a particular function (for example an
amplifier, radio receiver, or oscillator). Components may be packaged singly or in
more complex groups as integrated circuits. Some common electronic components
are capacitors, inductors, resistors, diodes, transistors, etc. Components are often
categorized as active (e.g. transistors and thyristors) or passive (e.g. resistors and
capacitors).
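
As a small numeric illustration of passive components combining into a circuit
with a function, the following Python sketch computes the cutoff frequency of
a first-order RC low-pass filter from the standard formula f_c = 1/(2πRC); the
component values are arbitrary examples.

    # One resistor and one capacitor combined into a circuit with a
    # function: a first-order RC low-pass filter. Values are arbitrary.

    import math

    R = 10_000   # resistance in ohms (10 kΩ)
    C = 100e-9   # capacitance in farads (100 nF)

    # Cutoff frequency of the filter: f_c = 1 / (2 * pi * R * C)
    f_c = 1 / (2 * math.pi * R * C)

    print(f"cutoff frequency: {f_c:.0f} Hz")  # about 159 Hz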

Technology
Technology is the usage and knowledge of tools, techniques, crafts, systems or
methods of organization. The word technology comes from the Greek technología
(τεχνολογία) — téchnē (τέχνη), an "art", "skill" or "craft" and -logía (-λογία), the
study of something, or the branch of knowledge of a discipline.[1] The term can
either be applied generally or to specific areas: examples include construction
technology, medical technology, and state-of-the-art or high technology.
Technologies can also be embodied in a material product; for example, an object
can be termed state of the art.

Technologies significantly affect human as well as other animal species' ability to
control and adapt to their natural environments. The human species' use of
technology began with the conversion of natural resources into simple tools. The
prehistorical discovery of the ability to control fire increased the available sources
of food and the invention of the wheel helped humans in travelling in and
controlling their environment. Recent technological developments, including the
printing press, the telephone, and the Internet, have lessened physical barriers to
communication and allowed humans to interact freely on a global scale. However,
not all technology has been used for peaceful purposes; the development of
weapons of ever-increasing destructive power has progressed throughout history,
from clubs to nuclear weapons.

Technology has affected society and its surroundings in a number of ways. In
many societies, technology has helped develop more advanced economies
(including today's global economy) and has allowed the rise of a leisure class.
Many technological processes produce unwanted by-products, known as pollution,
and deplete natural resources, to the detriment of the Earth and its environment.
Various implementations of technology influence the values of a society and new
technology often raises new ethical questions. Examples include the rise of the
notion of efficiency in terms of human productivity, a term originally applied only
to machines, and the challenge of traditional norms.

Philosophical debates have arisen over the present and future use of technology in
society, with disagreements over whether technology improves the human
condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar
movements criticise the pervasiveness of technology in the modern world, opining
that it harms the environment and alienates people; proponents of ideologies such
as transhumanism and techno-progressivism view continued technological progress
as beneficial to society and the human condition. Indeed, until recently, it was
believed that the development of technology was restricted only to human beings,
but recent scientific studies indicate that other primates and certain dolphin
communities have developed simple tools and learned to pass their knowledge to
other generations.

Science
Science (from the Latin scientia, meaning "knowledge") is an enterprise that builds
and organizes knowledge in the form of testable explanations and predictions about
the natural world.[1][2][3][4] An older meaning still in use today is that of Aristotle, for
whom scientific knowledge was a body of reliable knowledge that can be logically
and convincingly explained (see "History and etymology" section below).[5]

Since classical antiquity, science as a type of knowledge was closely linked to
philosophy, the way of life dedicated to discovering such knowledge. Into
early modern times, the two words "science" and "philosophy" were sometimes
used interchangeably in the English language. By the 17th century, "natural
philosophy" (which is today called "natural science") could be considered
separately from "philosophy" in general.[6] But "science" continued to also be used
in a broad sense denoting reliable knowledge about a topic, in the same way it is
still used in modern terms such as library science or political science.

The narrower sense of "science" that is common today developed as science
became a distinct enterprise of defining "laws of nature", based on early
examples such as Kepler's laws, Galileo's laws, and Newton's laws of motion. In
this period it became more common to refer to natural philosophy as "natural
science". Over the course of the 19th century, the word "science" became
increasingly strongly associated with the disciplined study of the natural world
including physics, chemistry, geology and biology. This sometimes left the study
of human thought and society in a linguistic limbo, which was resolved by
classifying these areas of academic study as social science. Similarly, several other
major areas of disciplined study and knowledge exist today under the general
rubric of "science", such as formal science and applied science.[7]
