Published by vishnudutdharma on Jan 28, 2010. Copyright: Attribution Non-commercial.
Nanotechnology, shortened to "nanotech", is the study of the control of matter on an atomic and molecular scale. Generally, nanotechnology deals with structures 100 nanometers or smaller in at least one dimension, and involves developing materials or devices within that size range. Nanotechnology is very diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly, and from developing new materials with dimensions on the nanoscale to investigating whether we can directly control matter on the atomic scale.

There has been much debate on the future implications of nanotechnology. Nanotechnology has the potential to create many new materials and devices with a vast range of applications, such as in medicine, electronics, and energy production. On the other hand, nanotechnology raises many of the same issues as any introduction of a new technology, including concerns about the toxicity and environmental impact of nanomaterials and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.
History of nanotechnology
Although nanotechnology is a relatively recent development in scientific research, the development of its central concepts happened over a longer period of time.
In 1965, Gordon Moore, one of the founders of Intel Corporation, made the remarkable prediction that the number of transistors that could fit in a given area would double every 18 months for the next ten years. It did, and the phenomenon became known as Moore's Law. This trend has continued far past the predicted 10 years until this day, going from just over 2,000 transistors in the original 4004 processors of 1971 to over 700,000,000 transistors in the Core 2. There has, of course, been a corresponding decrease in the size of individual electronic elements, going from millimeters in the 1960s to hundreds of nanometers in modern circuitry.

At the same time, the chemistry, biochemistry, and molecular genetics communities have been moving in the other direction. Over much the same period, it has become possible to direct the synthesis of complex molecules, either in the test tube or in modified living organisms.

Finally, the last quarter of a century has seen tremendous advances in our ability to control and manipulate light. We can generate light pulses as short as a few femtoseconds (1 fs = 10⁻¹⁵ s). Light, too, has a size, and this size is also on the hundred-nanometer scale.

Thus now, at the beginning of a new century, three powerful technologies have met on a common scale, the nanoscale, with the promise of revolutionizing both the worlds of electronics and of biology. This new field, which we refer to as biomolecular nanotechnology, holds many possibilities, from fundamental research in molecular biology and biophysics to applications in biosensing, biocontrol, bioinformatics, genomics, medicine, computing, information storage, and energy conversion.
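The doubling arithmetic in Moore's prediction is easy to check against the transistor counts quoted above. The sketch below uses 2,300 transistors for the 4004 (the commonly cited count, consistent with "just over 2,000") and assumes release years of 1971 and 2006 for the 4004 and Core 2; the year figures are illustrative assumptions, not from the text.

```python
import math

# Transistor counts from the text: just over 2,000 in the 4004 (1971),
# over 700,000,000 in the Core 2 (assumed here to be the 2006 release).
t0 = 2_300          # Intel 4004, commonly cited count (assumption)
t1 = 700_000_000    # Core 2 figure from the text
years = 2006 - 1971 # release-year span, an assumption for illustration

doublings = math.log2(t1 / t0)  # how many times the count doubled
period = years / doublings      # implied doubling period in years

print(f"{doublings:.1f} doublings over {years} years")
print(f"implied doubling period: {period:.2f} years")
```

The implied period comes out close to two years, a bit longer than the 18-month figure quoted above but in line with the popularized statements of Moore's Law.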
Historical background
Humans have unwittingly employed nanotechnology for thousands of years, for example in making steel, in paintings, and in vulcanizing rubber. Each of these processes relies on the properties of stochastically formed atomic ensembles mere nanometers in size, and is distinguished from chemistry in that it does not rely on the properties of individual molecules. But the development of the body of concepts now subsumed under the term nanotechnology has been slower.

The first mention of some of the distinguishing concepts of nanotechnology (predating use of that name) was in 1867 by James Clerk Maxwell, when he proposed as a thought experiment a tiny entity, known as Maxwell's demon, able to handle individual molecules.

The first observations and size measurements of nanoparticles were made during the first decade of the 20th century. They are mostly associated with Richard Adolf Zsigmondy, who made a detailed study of gold sols and other nanomaterials with sizes down to 10 nm and less, and who published a book on the subject in 1914.
He used an ultramicroscope, which employs the dark-field method, to see particles with sizes much less than the wavelength of light. Zsigmondy was also the first to use the nanometer explicitly for characterizing particle size, defining it as 1/1,000,000 of a millimeter, and he developed the first system of classification based on particle size in the nanometer range.

There have been many significant developments during the 20th century in characterizing nanomaterials and related phenomena, belonging to the field of interface and colloid science. In the 1920s, Irving Langmuir and Katharine B. Blodgett introduced the concept of a monolayer, a layer of material one molecule thick; Langmuir won a Nobel Prize in chemistry for his work. In the early 1950s, Derjaguin and Abrikosova conducted the first measurements of surface forces. There have also been many studies of periodic colloidal structures and of the principles of molecular self-assembly. Many other discoveries that serve as the scientific basis for modern nanotechnology can be found in "Fundamentals of Interface and Colloid Science" by H. Lyklema.
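Zsigmondy's definition of the nanometer, and the particle sizes above, can be sanity-checked with a few lines of unit arithmetic. The 500 nm wavelength below is an assumed round figure for visible light, not from the text.

```python
# Zsigmondy's definition: 1 nm = 1/1,000,000 of a millimeter.
mm_per_nm = 1 / 1_000_000
m_per_mm = 1 / 1_000
m_per_nm = mm_per_nm * m_per_mm  # should come out to 1e-9 m

gold_sol_nm = 10   # particle sizes Zsigmondy resolved, from the text
light_nm = 500     # rough visible-light wavelength (assumed figure)
ratio = light_nm / gold_sol_nm   # particles far below the light wavelength

print(f"1 nm = {m_per_nm:.0e} m")
print(f"wavelength / particle size = {ratio:.0f}x")
```

This is why an ordinary light microscope cannot resolve such particles, and why the dark-field ultramicroscope, which detects scattered light rather than forming an image, was needed.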
Conceptual origins
The topic of nanotechnology was again touched upon by "There's Plenty of Room at the Bottom," a talk given by physicist Richard Feynman at an American Physical Society meeting at Caltech on December 29, 1959. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, and so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, while surface tension and Van der Waals attraction would become more important. This basic idea appears feasible, and exponential assembly enhances it with parallelism to produce a useful quantity of end products. At the meeting, Feynman announced two challenges, offering a prize of $1,000 to the first individual to solve each one. The first challenge involved the construction of a nanomotor, which, to Feynman's surprise, was achieved by November 1960 by William McLellan. The second challenge involved scaling down letters far enough to fit the entire Encyclopaedia Britannica on the head of a pin; this prize was claimed in 1985 by Tom Newman.