1. Automation and Computerization
Automation and computerization are related but separate
phenomena. Automation means the mechanization, and usually the
speeding up, of production, not only in manufacturing but also in
services. Computerization is an advanced form of automation.
Fantasies and anxieties about automation predate the nation's
founding. The golem of Jewish legend, for example, is a powerful
automaton. But Americans' historic receptivity to technological
innovation and their need for machines to compensate for scarcity of
labor have made the United States a center for automation.
American automation began in the early New England textile mills
and other communities of the 1820s and 1830s, where workers
labored amid rows of complex, noisy, and dangerous machines, and
in the pioneering Midwestern slaughterhouses of the 1840s and
1850s, where workers butchered, divided into parts, and packed
hogs in disassembly lines. In both enterprises employees stood in
place while the work came to them and repeated their respective
assigned tasks day after day. By contrast, in the much smaller early
American craft and machine shops, employees had moved about
more freely from task to task as the work itself remained stationary.
Automation severely limited workers' physical movement and
occupational diversity alike.
Nevertheless, automation does not inevitably result in deskilling, or
the steady loss of skills, both technical and intellectual, to machines.
In the iron and steel industry, new machines may have diluted skills,
but in textiles and other industries, the number of semiskilled workers
actually increased, as employers required a labor force with the
knowledge and dexterity to operate complex machinery. Mere
physical strength mattered ever less. Hence the irrelevance, to
industries not dependent on sheer manual labor, of Frederick Taylor's
system of scientific management (Taylorism): the elimination of
allegedly extraneous work motions and the acceleration of others.
Although scientific management, as conceived by Taylor, and
automation were thus two quite different concepts, Taylorism
provided ideological justification for subsequent automation in
industry. For Taylor, workers as well as machines lacked intelligence
and performed most efficiently when controlled completely by
engineers and managers.
Automation usually means the substitution of machines for workers,
causing technological unemployment. The term became popular
during the economic depression and mass unemployment of the
1930s, when for the first time the American public singled out
President Herbert Hoover and other engineers for being as
responsible as greedy industrialists for the efficient large-scale
manufacturing machines hitherto lauded as engines of prosperity
and job creation. Although America has never experienced anything
comparable to England's early nineteenth-century Luddites, or machine
breakers, both white-collar citizens and industrial workers of the
1930s did finally associate automated machinery and job losses.
Automation need not, however, always mean fewer workers. At
Henry Ford's pioneering Michigan automobile assembly lines at
Highland Park (1910s) and River Rouge (1920s), low-cost mass
production and increased efficiency created thousands of new jobs.
In fact, the process of mass production characterized much of the
American industrial landscape. Yet these two huge Ford plants came
to epitomize automation in popular culture, technological history, and
scholarly discourse under the rubric Fordism. They were also the
models for the classic critique of automation in Charlie Chaplin's
1936 movie Modern Times.
Recovery from the Great Depression through World War II military
production, followed by postwar prosperity, lessened concern about
automation until the mid-1950s, when another wave of technological
innovation arose. Now the threat that computers might replace white-
collar workers, even intellectual workers like librarians, generated
growing unease. The 1957 movie Desk Set, starring Spencer Tracy
as an efficiency expert computerizing a corporate research
department headed by Katharine Hepburn, popularized postwar
anxiety about technological unemployment.
Similarly, Kurt Vonnegut's novel Player Piano (1952) envisioned the
United States as a prosperous welfare state dependent on one huge
computer for all major decisions. In Vonnegut's technological
dystopia, only a few engineers and managers hold meaningful jobs
while most citizens resent their menial daily tasks despite the
domestic comforts provided by technological progress. Significantly,
the term "computer," hitherto applied to men and women skilled in
numerical calculations, was now applied to the mechanical devices
that replaced them. (Years later, automatic programming codes
would replace human computer programmers.) Congressional
hearings on automation in 1955, which revealed that substantial
numbers of blue- and white-collar workers alike were being displaced
by machines, made automation a public issue.
Not until the radical critiques of American technology of the 1960s
and 1970s, however, did earlier piecemeal condemnations of
automation and computerization become parts of a broader
indictment of the overall quality of work. Harvey Swados's
pathbreaking 1957 essay in The Nation, "The Myth of the Happy
Worker," vigorously argued that assembly-line workers' high pay and
good benefits hardly compensated for their daily grind and loss of
autonomy.
White-collar workers less threatened by technological
unemployment, particularly those in presumably lifetime corporate
positions, often thought themselves immune to the ills endured by
production workers. Studs Terkel's Working (1974) and the
Department of Health, Education, and Welfare's Work in America
(1973) amply demonstrated otherwise. In the 1980s and 1990s
America's largest corporations discharged large numbers of
managerial and white-collar employees. Technological innovation
now seemed to threaten even more educated employees, though
blue-collar workers lost still more jobs owing to automation. Half as
many factory positions existed in 1996 as in 1966.
By the 1990s, the computerization of America had become a fact of
life. In the 1940s and 1950s, computer pioneers like John Mauchly
and John von Neumann never anticipated more than a few giant
computers that would be operated by skilled programmers employed
by the largest national and international institutions to solve the most
complex quantitative problems. By the 1980s, computers had
become available to ordinary Americans and embedded in their lives
in countless ways.
At the end of the twentieth century, many Americans anticipated an
ever more automated and computerized high-tech utopia. But other
citizens, aware of actual and potential technological and
environmental disasters, retreated from the nation's historically
uncritical embrace of technological progress and saw automation
and computerization as, at best, profoundly mixed blessings. For
them, chess champion Garry Kasparov's 1997 loss to International
Business Machines' (IBM's) Deep Blue computer symbolized the
human implications of technological triumphs.
2. Production and Manufacturing
Production is the process of converting inputs into outputs through
various operations. The operations that consume resources to create
goods are collectively known as manufacturing. The two terms are
often used interchangeably, yet they differ. One difference lies in
the raw material. In production, the raw material is not procured
from outside; the company owns it and processes it into the final
product. In manufacturing, the company procures the raw material
from outside and then makes the final product.
Manufacturing is the process of converting raw material into a
finished product using various processes, machines, and energy; it is
the narrower term. Production is the process of converting inputs
into outputs; it is the broader term. Every type of manufacturing is
production, but not every type of production is manufacturing. For
example, making a turbine by various processes is manufacturing,
while assembling the various parts to make an engine is production,
not manufacturing. Production is not limited to making the product
itself; it also includes other stages such as design, sales,
management, and marketing.
3. MIS and GIS
Management Information Systems (MIS) is the study of people,
technology, organizations and the relationships among them. MIS
professionals help firms realize maximum benefit from investment in
personnel, equipment, and business processes. MIS is a people-
oriented field with an emphasis on service through technology. If you
have an interest in technology and have the desire to use technology
to improve people's lives, a degree in MIS may be for you.
Businesses use information systems at all levels of operation to
collect, process and store data. Management aggregates and
disseminates this data in the form of information needed to carry out
the daily operations of business. Everyone who works in business,
from someone who pays the bills to the person who makes
employment decisions, uses information systems. A car dealership
could use a computer database to keep track of which products sell
best. A retail store might use a computer-based information system
to sell products over the Internet. In fact, many (if not most)
businesses concentrate on the alignment of MIS with business goals
to achieve competitive advantage over other businesses.
MIS professionals create information systems for data management
(i.e., storing, searching and analyzing data). In addition, they
manage various information systems to meet the needs of
managers, staff and customers. By working collaboratively with
various members of their work group, as well as with their customers
and clients, MIS professionals are able to play a key role in areas
such as information security, integration and exchange. As an MIS
major, you will learn to design, implement and use business
information systems in innovative ways to increase the effectiveness
and efficiency of your company.
A geographic information system, or GIS, is a computerized data
management system used to capture, store, manage, retrieve,
analyze, and display spatial information. Data captured and used in a
GIS commonly are represented on paper or other hard-copy maps. A
GIS differs from other graphics systems in several respects. First,
data are georeferenced to the coordinates of a particular projection
system. This allows precise placement of features on the earth's
surface and maintains the spatial relationships between mapped
features. As a result, commonly referenced data can be overlaid to
determine relationships between data elements. For example, soils
and wetlands for an area can be overlaid and compared to determine
the correspondence between hydric soils and wetlands. Similarly,
land use data for multiple time periods can be overlaid to determine
the nature of changes that may have occurred since the original
mapping. This overlay function is the basis of change detection
studies across landscapes.
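The overlay operation described above can be sketched in miniature. The example below assumes two co-registered land-use rasters for the same area, represented as plain NumPy arrays with hypothetical category codes; a real GIS would read such grids from georeferenced files, but once the cells line up on a common projection, change detection reduces to an element-wise comparison.

```python
import numpy as np

# Two hypothetical co-registered land-use rasters for the same area,
# one per time period. Cell codes (assumed): 1 = forest, 2 = wetland, 3 = urban.
landuse_1990 = np.array([
    [1, 1, 2],
    [1, 2, 2],
    [1, 1, 1],
])
landuse_2020 = np.array([
    [1, 3, 2],
    [3, 3, 2],
    [1, 1, 3],
])

# Overlay: because both grids share the same georeference, cells at the
# same index describe the same patch of ground, so comparison is element-wise.
changed = landuse_1990 != landuse_2020

# Change detection: tally each (old code, new code) transition.
transitions = {}
for old, new in zip(landuse_1990[changed].tolist(), landuse_2020[changed].tolist()):
    transitions[(old, new)] = transitions.get((old, new), 0) + 1

print(f"{changed.sum()} of {changed.size} cells changed")
print(transitions)  # prints {(1, 3): 3, (2, 3): 1}
```

The same element-wise logic underlies the soils-and-wetlands comparison mentioned earlier: overlaying a hydric-soils grid with a wetlands grid and testing where both conditions hold.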