Nanotechnology (or "nanotech") is the study of manipulating matter on an atomic and molecular scale. It has also been described as the engineering of functional systems at the molecular scale, or as the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high-performance products. Generally, nanotechnology deals with developing materials, devices, or other structures possessing at least one dimension sized from 1 to 100 nanometres. The field is very diverse, ranging from extensions of conventional device physics to completely new approaches based on molecular self-assembly, and from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale. Nanotechnology entails the application of fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, and microfabrication. There is much debate on the future implications of nanotechnology. It may be able to create many new materials and devices with a vast range of applications, such as in medicine, electronics, biomaterials, and energy production. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials, their potential effects on global economics, and speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

NANO TECHNOLOGY 09me84

Everything we see around us is made of atoms, the tiny elemental building blocks of matter. From stone, to copper, to bronze, iron, steel, and now silicon, the major technological ages of humankind have been defined by what these atoms can do in huge aggregates, trillions upon trillions of atoms at a time, molded, shaped, and refined as macroscopic objects. Even in our vaunted microelectronics of 1999, in our highest-tech silicon computer chip, the smallest feature is a mountain compared to the size of a single atom. The resultant technology of our 20th century is fantastic, but it pales when compared to what will be possible when we learn to build things at the ultimate level of control, one atom at a time. It is argued that through nanotechnology it has become possible to create functional devices, materials, and systems on the 1 to 100 nanometre (one nanometre being one billionth of a metre) length scale. The reasons why the nanoscale has become so important are presented below.

The history of nanotechnology traces the development of the concepts and experimental work falling under the broad category of nanotechnology. Although nanotechnology is a relatively recent development in scientific research, the development of its central concepts happened over a longer period of time. The emergence of nanotechnology in the 1980s was caused by the convergence of experimental advances such as the invention of the scanning tunneling microscope in 1981 and the discovery of fullerenes in 1985, with the elucidation and popularization of a conceptual framework for the goals of nanotechnology beginning with the 1986 publication of the book Engines of Creation. The field was subject to growing public awareness and controversy in the early 2000s, with prominent debates about both its potential implications as well as the feasibility of the applications envisioned by advocates of molecular nanotechnology, and with governments moving to promote and fund research into nanotechnology. The early 2000s also saw the beginnings of commercial applications of nanotechnology, although these were limited to bulk applications of nanomaterials rather than the transformative applications envisioned by the field.

2.0.1. Richard Feynman
The American physicist Richard Feynman delivered the lecture "There's Plenty of Room at the Bottom" at an American Physical Society meeting at Caltech on December 29, 1959, which is often held to have provided inspiration for the field of nanotechnology. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another, proportionally smaller set, and so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, while surface tension and van der Waals attraction would become more important. After Feynman's death, scholars studying the historical development of nanotechnology concluded that his actual role in catalyzing nanotechnology research was limited, based on recollections from many of the people active in the nascent field in the 1980s and 1990s. Chris Toumey, a cultural anthropologist at the University of South Carolina, found that the published versions of Feynman's talk had a negligible influence in the twenty years after it was first published, as measured by citations in the scientific literature, and not much more influence in the decade after the scanning tunneling microscope was invented in 1981. Interest in "Plenty of Room" in the scientific literature then greatly increased in the early 1990s. This is probably because the term "nanotechnology" gained serious attention just before that time, following its use by K. Eric Drexler in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which took the Feynman concept of a billion tiny factories and added the idea that they could make more copies of themselves via computer control instead of control by a human operator, and following a cover article headlined "Nanotechnology", published later that year in a mass-circulation science-oriented magazine, OMNI. Toumey's analysis also includes comments from distinguished scientists in nanotechnology who say that "Plenty of Room" did not influence their early work; in fact most of them had not read it until a later date.

These and other developments hint that the retroactive rediscovery of Feynman’s “Plenty of Room” gave nanotechnology a packaged history that provided an early date of December 1959, plus a connection to the charisma and genius of Richard Feynman. Feynman's stature as a Nobel laureate and as an iconic figure in 20th century science surely helped advocates of nanotechnology and provided a valuable intellectual link to the past.

2.0.2. Norio Taniguchi
The Japanese scientist Norio Taniguchi of the Tokyo University of Science was the first to use the term "nano-technology", in a 1974 conference paper, to describe semiconductor processes such as thin-film deposition and ion-beam milling exhibiting characteristic control on the order of a nanometre. His definition was: "'Nano-technology' mainly consists of the processing of, separation, consolidation, and deformation of materials by one atom or one molecule."

2.0.3. K. Eric Drexler
In the 1980s the idea of nanotechnology as deterministic, rather than stochastic, handling of individual atoms and molecules was conceptually explored in depth by K. Eric Drexler, who promoted the technological significance of nanoscale phenomena and devices through speeches and two influential books. In 1979, Drexler encountered Feynman's provocative 1959 talk "There's Plenty of Room at the Bottom". The term "nanotechnology", which had been coined by Taniguchi in 1974, was unknowingly appropriated by Drexler in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale "assembler" that would be able to build a copy of itself and of other items of arbitrary complexity. He also first published the term "grey goo" to describe what might happen if a hypothetical self-replicating molecular nanotechnology went out of control. Drexler's vision of nanotechnology is often called "molecular nanotechnology" (MNT) or "molecular manufacturing"; Drexler at one point also proposed the term "zettatech", which never became popular.

His 1991 Ph.D. work at the MIT Media Lab was the first doctoral degree on the topic of molecular nanotechnology, and (after some editing) his thesis, "Molecular Machinery and Manufacturing with Applications to Computation," was published as Nanosystems: Molecular Machinery, Manufacturing, and Computation, which received the Association of American Publishers award for Best Computer Science Book of 1992. Drexler founded the Foresight Institute in 1986 with the mission of "Preparing for nanotechnology." He is no longer a member of the Foresight Institute.

When K. Eric Drexler popularized the word 'nanotechnology' in the 1980s, he was talking about building machines on the scale of molecules, a few nanometres wide: motors, robot arms, and even whole computers, far smaller than a cell. Drexler spent the next ten years describing and analyzing these incredible devices and responding to accusations of science fiction. Meanwhile, mundane technology was developing the ability to build simple structures on a molecular scale. As nanotechnology became an accepted concept, the meaning of the word shifted to encompass the simpler kinds of nanometre-scale technology. The U.S. National Nanotechnology Initiative was created to fund this kind of nanotech: its definition includes anything smaller than 100 nanometres with novel properties. Much of the work being done today that carries the name 'nanotechnology' is not nanotechnology in the original meaning of the word. Nanotechnology, in its traditional sense, means building things from the bottom up, with atomic precision. This theoretical capability was envisioned as early as 1959 by the renowned physicist Richard Feynman: "I want to build a billion tiny factories, models of each other, which are manufacturing simultaneously... The principles of physics, as far as I can see, do not speak against the possibility of maneuvering things atom by atom. It is not an attempt to violate any laws; it is something, in principle, that can be done; but in practice, it has not been done because we are too big." — Richard Feynman, Nobel Prize winner in physics.

Based on Feynman's vision of miniature factories using nanomachines to build complex products, advanced nanotechnology (sometimes referred to as molecular manufacturing) will make use of positionally controlled mechanochemistry guided by molecular machine systems. Formulating a roadmap for the development of this kind of nanotechnology is now an objective of a broadly based technology roadmap project led by Battelle (the manager of several U.S. National Laboratories) and the Foresight Nanotech Institute. Once this envisioned molecular machinery is created, it is expected to bring about a manufacturing revolution, probably causing severe disruption, with serious economic, social, environmental, and military implications.

3.0.1. Nanoterms and Nanosystems
The terms ‘nanoreplicator’, ‘nanobot’, ‘assembler’, and ‘molecular manufacturing’ are often used in confusing ways. As used here, an ‘assembler’ is a mechanism for guiding chemical reactions by positioning reactive molecular tools, moving its tool-holding end in three dimensions like an industrial robot arm. A generic ‘nanobot’, then, may be an assembler or some other sort of nanoscale robotic mechanism. ‘Molecular manufacturing’ is a process of construction based on atom-by-atom control of product structures, which may use assemblers (or more specialized mechanisms) to guide a sequence of chemical reactions. If an assembler were packaged together with all of the machinery needed to power it, direct it, and prepare its reactive molecular tools, and with all of the instructions needed to guide the construction of another identical microscopic package, it would then form the heart of (one kind of) a ‘nanoreplicator’. A nanoreplicator is thus a complex and specialized sort of nanomachine. A molecular manufacturing technology base is potentially self-replicating, but that potential can be harnessed in various ways, for example to build desktop-scale ‘nanofactories’ preprogrammed to produce a wide range of macroscopic products, including, on demand, parts for more nanofactories (Drexler, 1992). These contain no autonomous mobile robots. The term ‘nanotechnology’ itself now embraces a broad range of science and technology working at a length scale of approximately 1–100 nanometres, including the more specific goal it originally denoted.

3.0.2. Nanofabrication
Nanofabrication evolved from microfabrication. Since the advent of the first transistor in 1947, the microelectronics and integrated-circuit (IC) industry has been the main driving force continuously pushing fabrication technologies to new dimensional limits. Following the famous "Moore's law", the semiconductor industry has been able to double the density of transistors on a unit area of silicon chip every 18 months. In fact, the shrinking of IC feature size has been faster than Moore's law predicts. According to the prediction of the first edition of the International Technology Roadmap for Semiconductors (ITRS), published in 1993, the minimum circuit feature should have been 100 nm by 2007. The minimum circuit feature actually achieved in mass production is 65 nm, a full generation ahead, and volume manufacturing of 45-nm ICs is already on the horizon. From thousands of transistors on a chip in the 1970s to multibillion-transistor chips today, such a feat has been possible only thanks to constant innovation in microlithography-based IC manufacturing technologies. The minimum feature size in an IC has been reduced from 250 nm a decade ago to 45 nm today, and a functional complementary metal-oxide-semiconductor (CMOS) memory IC with a minimum circuit feature of 32 nm was demonstrated in December 2007. By the 100-nm dimensional mark, the current generation of ICs is already in the nanotechnology regime: nanofabrication is already taking place in semiconductor manufacturing. While the semiconductor industry has spent billions of dollars to develop ever more sophisticated equipment and technologies to downsize IC features, it has had the sole purpose of mass production of ICs. The industry has stuck with optical lithography throughout the past half century because it has been the only technology with the capacity to pattern over a hundred wafers per hour at the desired circuit feature dimension. The nanoscience and nanotechnology research community, on the other hand, is quite content with a vast variety of other, less expensive nanofabrication tools and techniques. This is the way nanofabrication technologies have developed over the last few decades, in parallel with the huge investment in semiconductor manufacturing technologies.
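The doubling rate quoted above can be made concrete with a short sketch. The 1971 baseline of roughly 2,300 transistors (the first commercial microprocessor) is an assumed starting point for illustration, not a figure from the discussion above, and the fixed 18-month period is an idealization:

```python
# Illustrative sketch of a fixed Moore's-law doubling period.
# Baseline year/count are assumptions for illustration only; real
# scaling has varied over time, as the text notes.
def transistors_per_chip(year, base_year=1971, base_count=2300,
                         months_per_doubling=18):
    """Projected transistor count per chip under a fixed doubling period."""
    doublings = (year - base_year) * 12 / months_per_doubling
    return base_count * 2 ** doublings

# A fixed 18-month doubling carries a chip from thousands of
# transistors in the early 1970s to billions by the mid-2000s.
print(f"{transistors_per_chip(2007):.2e}")
```

Even as a rough idealization, the exponential makes clear why each process generation demands the kind of lithography innovation described above.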
It is a fact that academic researchers rarely have access to state-of-the-art microfabrication or nanofabrication tools, except via semiconductor foundry services with limited choices of fabrication processes. They often have to work with whatever they have, trying to make various nanostructures at affordable cost. Many new techniques and processes have therefore been developed, with low cost as their main feature. An excellent example is the development of nanoimprinting technology, devised in the mid-1990s for patterning sub-100-nm structures in laboratories that could not afford expensive optical lithography tools. The nanoimprinting technique has the parallel patterning capability of optical lithography but is much less expensive than an optical lithography system. Since then, many variants of the technology have been developed, such as working at room temperature and low pressure, using UV-curable polymers, and using soft printing masters. They may not be as good as state-of-the-art optical lithography, but the low cost certainly outweighs the shortcomings in a research laboratory. Nanofabrication is not equivalent to nanomanufacturing. The two distinctive features of any manufacturing technology are volume and yield. The typical example is the manufacturing of ICs. To meet the volume requirement, processing has to be parallel; this is the sole reason optical lithography has survived through so many generations of ICs. Any alternative technology will have to be able to pattern features in parallel fashion, whether by charged particles (electron or ion projection lithography) or by stamps (nanoimprint lithography).
Though optical lithography is already able to pattern ICs at 32-nm feature size, very few technologies are available as candidates to succeed it further down the dimensional scale. Another key issue is that any nanofabrication technology that is to qualify as a manufacturing technology will have to establish a complete supporting infrastructure. For example, optical lithography cannot stand alone in serving the IC manufacturing industry: there have to be photoresist suppliers, photomask manufacturers, and a whole set of inspection and characterization technologies to accompany the optical patterning technology. The same applies to the other so-called "next-generation lithography" (NGL) technologies, such as extreme-UV (EUV) lithography, step-and-flash imprint lithography (SFIL), and maskless lithography (ML2). They all have to have the supporting infrastructure in place before they can enter the IC manufacturing arena. To meet the yield requirement, a nanofabrication technology has to be reliable, repeatable, and low in defects. Few emerging nanofabrication technologies are able to meet this requirement, and even a mature technology may stumble over the reliability criterion. A good lesson was learnt from X-ray lithography: it is by nature a parallel patterning technique, and its much shorter wavelength made it an ideal candidate to succeed optical lithography as the next-generation patterning technology for IC manufacturing, but the difficulty of making X-ray masks and the related manufacturing reliability issues finally sealed the fate of the technology. Though very few nanofabrication technologies will eventually become true nanomanufacturing technologies, this has not stopped many new nanofabrication processes from being developed every year. Nanoscience research requires nanostructures to be made, and in these applications volume and yield are not critical issues. This is the area where creativity and ingenuity have led to many new and unconventional ways of making a nanostructure, whether directly or indirectly. A noticeable example is the use of scanning probes for various types of nanofabrication, whether optical, electronic, chemical, or mechanical. For a few nanostructures or devices, a scanning probe system is a good alternative, with easy setup and guaranteed nanoscale resolution. Below the 10-nm scale, all other nanofabrication technologies reach their limits, and molecular self-assembly becomes the new force in the field.
Though these unconventional nanofabrication technologies have a long way to go before becoming conventional, they definitely have their niches in the nanoscience and nanotechnology research and development community, and some of them may one day be developed into industrial-scale manufacturing technologies.

The Greek word "nano" (meaning dwarf) refers to a reduction of size, or time, by a factor of 10⁻⁹, one thousand times smaller than a micron. One nanometre (nm) is one billionth of a metre (10⁻⁹ m) and is equivalent to ten angstroms. A human hair is about 50 microns (50 × 10⁻⁶ m) in diameter, so a nanometre is roughly 50,000 times smaller than the diameter of a hair, and a 50-nanometre object is about 1/1000th of the thickness of a hair. One cubic nanometre (nm³) is roughly 20 times the volume of an individual atom. A nanoelement is to a basketball roughly as a basketball is to the Earth. The figure shows various size ranges for different nanoscale objects, starting with such small entities as ions, atoms, and molecules. Size ranges of a few nanotechnology-related objects (such as nanotube, single-electron-transistor, and quantum-dot diameters) are also shown in this figure. It is obvious that nanoscience, nanoengineering, and nanotechnology all deal with very small objects and systems. Officially, the United States National Science Foundation defines nanoscience/nanotechnology as studies that deal with materials and systems having the following key properties.
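The length-scale relations above reduce to a few lines of arithmetic, using the 50-micron hair diameter quoted in the text:

```python
# Quick numeric check of the length-scale relations in the text.
nm = 1e-9                # one nanometre, in metres
angstrom = 1e-10         # one angstrom, in metres
micron = 1e-6            # one micron, in metres
hair = 50 * micron       # typical human-hair diameter quoted above

print(nm / angstrom)     # a nanometre is ten angstroms
print(hair / (50 * nm))  # a 50 nm object is ~1/1000 of a hair's thickness
print(hair / nm)         # a nanometre is ~50,000x smaller than a hair
```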



Figure: Comparison of size ranges for several entities and some nanotechnology devices: SET (single-electron transistor), GMR (giant magnetoresistive), Q-DOTS (quantum dots). SE stands for scanning electron and ST for scanning tunneling.

(1) Dimension: at least one dimension from 1 to 100 nanometres (nm).
(2) Process: designed with methodologies that show fundamental control over the physical and chemical attributes of molecular-scale structures.
(3) Building-block property: they can be combined to form larger structures.

Nanoscience, in a general sense, is quite natural in the microbiological sciences, considering that the sizes of many bioparticles (such as enzymes and viruses) fall within the nanometre range. The nanoscale is a magical point on the dimensional scale: nanostructures sit at the borderline between the smallest human-made devices and the largest molecules of living systems. Our ability to control and manipulate nanostructures will make it possible to exploit new physical, biological, and chemical properties of systems that are intermediate in size between single atoms or molecules and bulk materials. There are many specific reasons why the nanoscale has become so important, some of which are the following:

(i) The quantum mechanical (wavelike) properties of electrons inside matter are influenced by variations on the nanoscale. By nanoscale design of materials it is possible to vary their micro- and macroscopic properties, such as charge capacity, magnetization, and melting temperature, without changing their chemical composition.
(ii) A key feature of biological entities is the systematic organization of matter on the nanoscale. Developments in nanoscience and nanotechnology would allow us to place man-made nanoscale things inside living cells, and to make new materials using the self-assembly features of nature. This would be a powerful combination of biology with materials science.
(iii) Nanoscale components have a very high surface-to-volume ratio, making them ideal for use in composite materials, reacting systems, drug delivery, and chemical energy storage (such as hydrogen and natural gas).
(iv) Macroscopic systems made up of nanostructures can have much higher density than those made up of microstructures, and can be better conductors of electricity. This can yield new electronic device concepts, smaller and faster circuits, more sophisticated functions, and greatly reduced power consumption, all by controlling nanostructure interactions and complexity.
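The high surface-to-volume ratio behind point (iii) follows directly from geometry. A minimal sketch for a spherical particle:

```python
# For a sphere, A/V = (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r, so the ratio
# grows without bound as the radius shrinks toward the nanoscale.
import math

def surface_to_volume(r):
    """Surface-to-volume ratio of a sphere of radius r (units: 1/length)."""
    return (4 * math.pi * r**2) / ((4 / 3) * math.pi * r**3)

for r in (1e-2, 1e-6, 1e-9):   # 1 cm, 1 micron, 1 nm
    print(f"r = {r:g} m -> A/V = {surface_to_volume(r):.3g} per metre")
```

Shrinking the radius from a centimetre to a nanometre raises A/V by seven orders of magnitude, which is why surface effects dominate at the nanoscale.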


The molecular theory of matter starts with quantum mechanics and statistical mechanics. According to the quantum mechanical Heisenberg uncertainty principle, the position and momentum of an object cannot be determined simultaneously and precisely. The first question that may come to mind, then, is how one could brush aside the Heisenberg uncertainty principle (see figure) to work at the atomic and molecular level, atom by atom, as is the basis of nanotechnology. The Heisenberg uncertainty principle helps determine the size of electron clouds, and hence the size of atoms. In Werner Heisenberg's words, "The more precisely the POSITION is determined, the less precisely the MOMENTUM is known." In practice the uncertainty principle is significant only for subatomic particles such as the electron, positron, and photon. It does not forbid nanotechnology, which has to do with the position and momentum of comparatively large particles such as atoms and molecules: because the masses of atoms and molecules are quite large, the Heisenberg uncertainty principle places no practical limit on how well they can be held in place.
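A back-of-the-envelope estimate makes the mass argument concrete. The velocity uncertainty of 1 m/s below is an assumed figure chosen purely for illustration:

```python
# Minimum position uncertainty dx >= hbar / (2 * m * dv) from the
# Heisenberg relation, comparing an electron with a carbon-12 atom
# at the same (assumed) velocity uncertainty of 1 m/s.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_electron = 9.109e-31   # electron mass, kg
m_carbon = 1.993e-26     # mass of one carbon-12 atom, kg
dv = 1.0                 # assumed velocity uncertainty, m/s

dx_electron = hbar / (2 * m_electron * dv)
dx_carbon = hbar / (2 * m_carbon * dv)
print(f"electron: dx >= {dx_electron:.2e} m")
print(f"C atom:   dx >= {dx_carbon:.2e} m")
# The atom's minimum position uncertainty is ~20,000x smaller than
# the electron's, which is why whole atoms can be pinned down well
# enough to be manipulated one at a time.
```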

Although we have long been aware of, and many investigators have long dealt with, "nano"-sized entities, the historic birth of nanotechnology is commonly credited to Feynman. Nanotechnology was first formally recognized as a viable field of research with the landmark lecture delivered by Richard P. Feynman, the famous Nobel Laureate physicist, on December 29, 1959, at the annual meeting of the American Physical Society. His lecture was entitled "There's Plenty of Room at the Bottom: An Invitation to Enter a New Field of Physics". Feynman stated in his lecture that the entire Encyclopaedia Britannica could be put on the tip of a needle and that, in principle, there is no law preventing such an undertaking. He then described the advances made in this field in the past and envisioned the future of nanotechnology. His lecture was published in the February 1960 issue of Engineering & Science, the quarterly magazine of the California Institute of Technology. In his talk Feynman also argued that the laws of nature do not limit our ability to work at the molecular level, atom by atom; rather, what limited us was our lack of appropriate equipment and techniques for doing so. Feynman talked about "How do we write small?", "Information on a small scale", the possibility of "Better electron microscopes" that could image an atom, doing things at small scale through "The marvelous biological system", "Miniaturizing the computer", "Miniaturization by evaporation" (an example of which is thin-film formation by chemical vapor deposition), solving the "Problems of lubrication" through miniaturization of machinery and nanorobotics, "Rearranging the atoms" to build various nanostructures and nanodevices, and the behavior of "Atoms in a small world", which included atomic-scale fabrication as a bottom-up approach, as opposed to the top-down approach we are accustomed to. The bottom-up approach is self-assembly of machines from basic chemical building blocks, considered the ideal through which nanotechnology will ultimately be implemented; the top-down approach is assembly by manipulating components with much larger devices, which is more readily achievable with current technology. It is important to mention that almost all of the ideas presented in Feynman's lecture, and more, are now under intensive research by numerous nanotechnology investigators all around the world. In his lecture Feynman also challenged the scientific community, setting a monetary reward for experiments demonstrating miniaturization, and proposed radical ideas about miniaturizing printed matter, circuits, and machines. "There's no question that there is enough room on the head of a pin to put all of the Encyclopaedia Britannica," he said. He emphasized, "I'm not inventing anti-gravity, which is possible someday only if the laws (of nature) are not what we think," and added, "I am telling what could be done if the laws are what we think; we are not doing it simply because we haven't yet gotten around to it." Feynman's challenge for miniaturization, an unerringly accurate forecast, was met forty years later, in 1999, by a team of scientists using a nanotechnology tool called the atomic force microscope (AFM) to perform dip-pen nanolithography (DPN). In DPN an AFM tip is simply coated with molecular 'ink' and then brought into contact with the surface to be patterned.
Water condensing from the immediate environment forms a capillary between the AFM tip and the surface. Work is currently under way to investigate the potential of the DPN technique as more than a quirky tool for nanowriting, focusing on applications in microelectronics, pharmaceutical screening, and biomolecular sensor technology. In 1983 Feynman talked about a scalable manufacturing system that could be made to manufacture a smaller-scale replica of itself, which in turn would replicate itself at smaller scale, and so on down to the molecular scale. Feynman was subscribing to the "Theory of Self-Reproducing Automata" proposed by von Neumann, the eminent 1940s mathematician and physicist, who was interested in the question of whether a machine can self-replicate, that is, produce copies of itself. The study of man-made self-replicating systems has now been going on for more than half a century; much of this work is motivated by the desire to understand the fundamentals of self-replication and to advance our knowledge of single-cell biological self-replication. Other recent achievements related to ideas Feynman mentioned in his 1959 lecture include the manipulation of single atoms on a silicon surface, the positioning of single atoms with a scanning tunneling microscope, and the trapping of single colloidal particles, 3 nm in diameter, from solution using electrostatic methods. In the early 1960s there was other ongoing research on small systems, but with a different emphasis; a good example is the publication of two books on "Thermodynamics of Small Systems" by T. L. Hill in the early 1960s. The thermodynamics of small systems is now called "nanothermodynamics". The author of this book was privileged to offer a short course on nanothermodynamics to a large group of university professors, other scientists, and graduate students last May.
In the 1960s, when Feynman recognized and championed the importance of nanotechnology, the devices necessary for it had not yet been invented. At that time the world was intrigued with space exploration, discoveries, and the desire and pledges to travel to the moon, partly due to the political rivalries of the day and partly due to the bigger promise of new frontiers that man had not yet captured. Research and development in small (nano) systems did not sell well with governmental research funding agencies, and as a result the scientific community paid little attention to it. It is only appropriate to name the nanometre scale "the Feynman scale" after Feynman's great contribution, and we suggest a notation for it, just as Å is used for the angstrom scale and μ for the micron scale:

One Feynman ≡ 1 nanometre (nm) = 10 angstroms (Å) = 10⁻³ micron (μm) = 10⁻⁹ metre (m)

In this chapter, we present details of interatomic and intermolecular forces and potential energy functions from the point of view of the formation and functions of nanostructures. Detailed information about interatomic and intermolecular interactions is necessary for modeling, predicting, and simulating the behavior of assemblies of finite numbers of molecules, which is what exists in nanosystems. In addition, such information will guide the design of positional assembly and the prediction of self-assembly and self-replication, which are the fundamentals of bottom-up nanotechnology. We first introduce a short description of covalent interactions and their differences from non-covalent interactions, and then present a detailed analysis of both, covering experimental and theoretical modeling followed by phenomenological models developed for non-covalent and covalent interactions.

6.0.1 Covalent and Noncovalent Interactions
The interactions between atoms and molecules are either of covalent or non-covalent type [1,2]. Covalent interactions take the form of chemical bonds in compounds that result from the sharing of one or more pairs of electrons. Atoms can combine into molecules to achieve an octet of valence electrons by sharing electrons. In this process the electron clouds of the two atoms overlap where they are densest and their electric charge is strongest. Because each nucleus is attracted to the dense shared electron cloud more strongly than it is repelled by the other nucleus (owing to the greater distance between the nuclei), the two atoms are held together and form a molecule.

As a result, the formation of covalent bonds requires overlapping of partially occupied orbitals of the interacting atoms, which share a pair of electrons. In non-covalent interactions, on the other hand, no overlapping is necessary, because the attraction comes from the electrical properties of the interacting atoms and molecules. The electrons in the outermost shell are called the valence electrons; these are the electrons that can be gained or lost in a chemical reaction. The number of covalent bonds an atom can form depends on how many valence electrons it has. Generally, a covalent bond implies the sharing of just a single pair of electrons. The sharing of two pairs of electrons is called a double bond, and the sharing of three pairs a triple bond. The latter is relatively rare in nature, and two atoms are not observed to bond more than triply. Most frequently, covalent bonding occurs between atoms which possess similar electronegativities, that is, similar affinities for attracting electrons. This occurs when neither atom has sufficient affinity, or attractive potential energy, to completely remove an electron from the other atom. Covalent interactions are stronger than non-covalent bonds. The weak non-covalent interactions were first recognized by J. D. van der Waals in the nineteenth century [2,3]. Covalent interactions are of short range, and the resulting bonds are generally less than 0.2 nm long; non-covalent interactions are of longer range, acting over a few nm. Table 1 summarizes the differences between covalent and non-covalent interactions.

The main point to be discussed in this chapter is how properties and behaviour change as the characteristic dimension is reduced. Of particular interest are discontinuous changes occurring at the nanoscale.

7.0.1. Materials
An object is delineated by its boundary. Dividing matter into small particles has an effect on purely physical processes. Suppose a spherical object of radius r is heated by internal processes, and the amount of heat is proportional to the volume V = 4πr³/3. The loss of heat to the environment will be proportional to the surface area A = 4πr². Now let the object be divided into n small particles of equal volume: the total surface area becomes n^(1/3) × 4πr². This is the basic reason why small mammals have a higher metabolic rate than larger ones: they need to produce more heat to compensate for its relatively greater loss through the skin in order to keep their bodies at the same steady temperature. This also explains why so few small mammals are found in the cold regions of the earth.

Materials referred to as "nanomaterials" generally fall into two categories: fullerenes and inorganic nanoparticles.

Fullerenes. The fullerenes are a class of allotropes of carbon which conceptually are graphene sheets rolled into tubes or spheres. These include the carbon nanotubes (or silicon nanotubes), which are of interest both because of their mechanical strength and because of their electrical properties.
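The n^(1/3) growth of total surface area described above can be verified numerically. This sketch (illustrative values only) divides a sphere into n equal-volume particles and compares the total area with the original:

```python
import math

def total_area_after_division(r, n):
    """Total surface area when a sphere of radius r is divided
    into n equal-volume spherical particles."""
    # Equal volume: n * (4/3)*pi*r_small^3 = (4/3)*pi*r^3  =>  r_small = r / n^(1/3)
    r_small = r / n ** (1 / 3)
    return n * 4 * math.pi * r_small ** 2

r = 1.0
original = 4 * math.pi * r ** 2
for n in (1, 1000, 10 ** 9):
    ratio = total_area_after_division(r, n) / original
    print(f"n = {n:>10}: total area = {ratio:.1f} x original  (theory: n^(1/3))")
```

Dividing into 1000 particles multiplies the exposed area tenfold, which is exactly the n^(1/3) factor in the text.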

Nanoparticles. Nanoparticles or nanocrystals made of metals, semiconductors, or oxides are of particular interest for their mechanical, electrical, magnetic, optical, chemical and other properties. Nanoparticles have been used as quantum dots and as chemical catalysts.

7.0.2. Chemical reactivity
Consider a heterogeneous reaction A + B → C, where A is a gas or a substance dissolved in a liquid and B is a solid. Only the surface atoms are able to come into contact with the environment; hence, for a given mass of material B, the more finely it is divided, the more reactive it will be, in terms of the amount of C produced per unit time. The above considerations do not imply any discontinuous change upon reaching the nanoscale. Granted, however, that matter is made up of atoms, the atoms situated at the boundary of an object are qualitatively different from those in the bulk. A cluster of six atoms (in two-dimensional Flatland) has only one bulk atom, and any smaller cluster is "all surface". This has a direct impact on chemical reactivity (considering here, of course, heterogeneous reactions): the surface atoms are expected to be individually more reactive than their bulk neighbours, since they have some free valences (i.e., bonding possibilities). Consideration of chemical reactivity, and its enhancement for a given mass by dividing matter into nanoscale pieces, thus suggests a discontinuous change when matter becomes "all surface".
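The approach to "all surface" can be illustrated with a simple counting argument in three dimensions: in a cubic cluster of n × n × n atoms, the bulk atoms form an inner (n−2)³ cube and everything else is surface. A short sketch (the cubic-lattice model is an assumption for illustration):

```python
def surface_fraction(n):
    """Fraction of atoms on the surface of an n x n x n cubic cluster.
    Bulk atoms form the inner (n-2)^3 cube; the rest are surface atoms."""
    total = n ** 3
    bulk = max(n - 2, 0) ** 3
    return (total - bulk) / total

for n in (2, 3, 5, 10, 100):
    print(f"{n:>3}^3 cluster: {surface_fraction(n):.1%} of atoms are on the surface")
```

A 2×2×2 cluster is literally all surface; even a 10×10×10 cluster (1000 atoms, a few nm across) is still nearly half surface, while a 100³ cluster is down to a few percent.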

The boundary of an object shown as a cross-section in two dimensions. The surface atoms (white) are qualitatively different from the bulk atoms (grey): the latter have six nearest neighbours (in the two-dimensional cross-section) of their own kind, whereas the former have only four.

In practice, however, the surface atoms may have already satisfied their bonding requirements by picking up reaction partners from the environment. For example, many metals become spontaneously coated with a film of their oxide when left standing in air, and as a result are chemically more inert than the pure material. These films are typically thicker than one atomic layer; on silicon, for example, the native oxide layer is about 4 nm thick. This implies that a piece of freshly cleaved silicon undergoes some lattice disruption that enables oxygen atoms to penetrate deeper than the topmost layer. If the object is placed in the "wrong" environment, the surface compound may be so stable that nanoparticles coated with it are actually less reactive than the same mass of bulk matter. A one-centimetre cube of sodium taken from its protective fluid (naphtha) and thrown into a pool of water will act in a lively fashion for some time, but if the sodium is first cut up into one-micrometre cubes, most of the metallic sodium will have already reacted with moist air before it reaches the water.

7.0.3. Forces
The magnitudes of the forces (gravitational, electrostatic, etc.) between objects depend on their sizes and on the distance z between them. At the nanoscale, gravitational forces are so weak that they can be neglected; conversely, the range of the strong nuclear force is much smaller than the nanoscale, so it too can be neglected. Of particular importance are several forces (e.g., the van der Waals force) that are electrostatic in origin, since these are especially important for self-assembly. A cavity consisting of two mirrors facing each other disturbs the pervasive zero-point electromagnetic field, because only certain wavelengths can fit exactly into the space between the mirrors. This lowers the zero-point energy density in the region between the mirrors, resulting in an attractive Casimir force. The force falls off rapidly with the distance z between the mirrors (as z⁻⁴), and hence is negligible at the microscale and above, but at a separation of 10 nm it is comparable with atmospheric pressure (10⁵ N/m²), and therefore can be expected to affect the operation of nanoscale mechanical devices.
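The comparison with atmospheric pressure can be checked with the standard ideal-mirror Casimir formula P = π²ħc/(240 z⁴) (CODATA constants; perfectly conducting parallel plates assumed):

```python
import math

hbar = 1.054_571_8e-34   # reduced Planck constant, J*s
c = 2.997_924_58e8       # speed of light, m/s

def casimir_pressure(z):
    """Casimir pressure (N/m^2) between ideal parallel mirrors at separation z (m)."""
    return math.pi ** 2 * hbar * c / (240 * z ** 4)

z = 10e-9  # 10 nm separation
print(f"Casimir pressure at z = 10 nm: {casimir_pressure(z):.2e} N/m^2")
print(f"Atmospheric pressure:          {101_325:.2e} N/m^2")
```

The result, about 1.3 × 10⁵ N/m², is indeed comparable with the atmospheric 1.013 × 10⁵ N/m², and the z⁻⁴ dependence makes the force negligible a decade further out in separation.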

7.0.4. Device performance
Analysis of device performance begins by noting how key parameters scale with device length: area (and power and thermal losses) as length squared, volume and mass as length cubed, electromagnetic force as length to the fourth power, natural frequency as inverse length, and so on. Relationships such as these are used to derive how a device's performance scales as it is made smaller. Natural phenomena comprising discrete non-interacting entities (such as photons) can be approximated by the Poisson distribution. A fundamental property of this distribution is that its variance equals its mean; the uncertainty (e.g., of the magnitude of a certain exposure of an object to light), expressed as a standard deviation, therefore equals the square root of the exposure. When objects become very small, the number of entities conveying information necessarily also becomes small, and small signals are more vulnerable to noise. Repetition of a message is the simplest way of overcoming noise. A nanoscale device using only one entity (e.g., an electron) to convey one bit of information would, in most circumstances, be associated with an unacceptably high equivocation in the transmission of information.
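The variance-equals-mean property, and the resulting 1/√N relative uncertainty, can be seen in a small simulation. The sketch below draws Poisson counts with Knuth's multiplication method (adequate for modest means; purely illustrative):

```python
import math
import random

random.seed(0)

def poisson(lam):
    """Sample a Poisson variate (Knuth's multiplication method; fine for modest lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def relative_noise(mean_count, trials=10_000):
    """Estimated std/mean of a Poisson count; theory predicts 1/sqrt(mean)."""
    counts = [poisson(mean_count) for _ in range(trials)]
    m = sum(counts) / trials
    var = sum((x - m) ** 2 for x in counts) / trials
    return math.sqrt(var) / m

for n in (10, 50, 200):
    print(f"mean {n:>4}: measured {relative_noise(n):.3f}, theory {n ** -0.5:.3f}")
```

As the mean count shrinks, the relative noise grows as 1/√N, which is exactly why a signal carried by very few entities is hard to read reliably.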

7.0.5. Design
Although the most obvious consequence of nanotechnology is the creation of very small objects, an immediate corollary is that there must be a great many of these objects. If r is the relative device size and R the number of devices, then usefulness may require that rR ∼ 1, implying the need for 10⁹ devices. This corresponds to the number of components (with a minimum feature length of about 100 nm) on a very large-scale integrated electronic chip, for example. At present, all these components are explicitly designed and fabricated. But will this still be practicable if the number of components increases by a further two or more orders of magnitude? Because it is not possible to give a clear affirmative answer to this question, alternative routes to the design and fabrication of such vast numbers are being explored. The human brain serves as an inspiration here. Its scale is far vaster: it has ∼ 10¹¹ neurons, and each neuron has hundreds or thousands of connexions to other neurons. There is insufficient information contained in our genes to specify all these interconnexions; rather, our genes specify an algorithm for generating them. In this spirit, evolutionary design principles may become essential for designing nanodevices. An example of an evolutionary design algorithm is shown in the figure. It might be initialized by a collection of existing designs, or guesses at possible new designs. Since new variety within the design population is generated randomly, the algorithm effectively expands the imagination of the human designer.



An evolutionary design algorithm. All relevant design features are encoded in the genome (a very simple genome has each gene as a single binary digit indicating the absence (0) or presence (1) of a feature). The genomes are evaluated ("survivor selection strategy"); this stage could include human (interactive) as well as automated evaluation, and only genomes fulfilling the evaluation criteria are retained. The diminished population is then expanded in numbers and in variety: typically, the successful genomes are used as the basis for generating new ones via biologically inspired processes such as recombination and mutation. Although this strategy enables the design size (i.e., the number of individual features that must be explicitly specified) to be expanded practically without limit, one sacrifices knowledge of the exact internal workings of the device, introducing a level of unpredictability into device performance that may require a new engineering paradigm before it is acceptable.
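A minimal version of the loop just described, with single-bit genes, survivor selection, one-point recombination, and point mutation, might look like this. The fitness function here is a toy stand-in (counting "present" features) for a real design evaluation:

```python
import random

random.seed(1)
GENOME_LEN, POP_SIZE, GENERATIONS = 20, 30, 40

def fitness(genome):
    # Toy evaluation criterion standing in for a real design score:
    # more features present (1-bits) scores higher.
    return sum(genome)

def evolve():
    # Initialize with random "guesses at possible designs".
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Survivor selection: keep the better half of the population.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: POP_SIZE // 2]
        # Expand in numbers and variety via recombination and mutation.
        children = []
        while len(survivors) + len(children) < POP_SIZE:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, GENOME_LEN)
            child = a[:cut] + b[cut:]        # one-point crossover
            i = random.randrange(GENOME_LEN)
            child[i] ^= 1                    # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("best genome:", "".join(map(str, best)), "fitness:", fitness(best))
```

Because new variety is injected randomly each generation, the search explores designs no one explicitly specified, which is the sense in which the algorithm "expands the imagination of the human designer".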

Component failure and redundancy. As the number of components on a "chip" increases, it may become more cost-effective to build in functional redundancy, such that failures of some components will not affect the performance of the whole (more explicitly, their failure would be detected by their congeners, which would switch in substitutes). As a first approximation, failures can be considered to occur independently of each other.
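Under that independence approximation, the value of built-in redundancy follows directly from the binomial distribution. The sketch below uses hypothetical numbers (a task needing 100 working components, each with a 1% failure probability) to compare a system with and without ten spares:

```python
from math import comb

def system_reliability(n, k, p_fail):
    """Probability that at least k of n identical components work,
    assuming each fails independently with probability p_fail."""
    p = 1.0 - p_fail
    return sum(comb(n, m) * p ** m * (1 - p) ** (n - m) for m in range(k, n + 1))

# Hypothetical numbers: the device needs 100 working components,
# each with a 1% chance of failure.
print(f"no spares (n=100): {system_reliability(100, 100, 0.01):.3f}")
print(f"10 spares (n=110): {system_reliability(110, 100, 0.01):.6f}")
```

With no redundancy the system works only about a third of the time (0.99¹⁰⁰ ≈ 0.37), while a modest 10% overhead of spares makes failure of the whole vanishingly unlikely.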

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high-performance products. One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon–carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double helix has a diameter of around 2 nm. On the other hand, the smallest cellular life forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm, following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, approximately a quarter of a nm in diameter), since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary, but is around the size at which phenomena not observed in larger structures start to become apparent and can be exploited in nanodevices. These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology. To put that scale in context, the ratio of a nanometer to a meter is the same as that of a marble to the size of the earth. Or another way of putting it: a nanometer is the amount an average man's beard grows in the time it takes him to raise the razor to his face. Two main approaches are used in nanotechnology. In the "bottom-up" approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition. In the "top-down" approach, nano-objects are constructed from larger entities without atomic-level control. Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation for nanotechnology.

8.0.1. Larger to smaller: a materials perspective
A number of physical phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the "quantum size effect", where the electronic properties of solids are altered by great reductions in particle size. This effect does not come into play in going from macro to micro dimensions; quantum effects become dominant only when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface area to volume ratio, altering the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. The mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials. Materials reduced to the nanoscale can show properties different from those they exhibit on a macroscale, enabling unique applications. For instance, opaque substances become transparent (copper); stable materials turn combustible (aluminum); insoluble materials become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.
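The quantum size effect can be made concrete with the elementary particle-in-a-box model: the gap between the two lowest electron levels, E₂ − E₁ = 3h²/(8mL²), grows as the confinement length L shrinks. The sketch below (free-electron mass and an idealized box are the assumptions) shows the gap overtaking room-temperature thermal energy only in the nanometre range:

```python
# Particle-in-a-box estimate of the quantum size effect.
h = 6.626_070_15e-34    # Planck constant, J*s
m_e = 9.109_383_7e-31   # electron mass, kg
eV = 1.602_176_634e-19  # J per electronvolt

def level_gap_eV(L):
    """Gap between the two lowest levels, E2 - E1 = 3 h^2 / (8 m L^2), in eV."""
    return 3 * h ** 2 / (8 * m_e * L ** 2) / eV

for L_nm in (1000, 100, 10, 1):
    print(f"L = {L_nm:>5} nm: E2 - E1 = {level_gap_eV(L_nm * 1e-9):.2e} eV")
kT_300K = 1.380_649e-23 * 300 / eV
print(f"thermal energy kT at 300 K ≈ {kT_300K:.3f} eV")
```

At 100 nm and above the level spacing is buried far below kT ≈ 0.026 eV; around a few nm it becomes comparable to or larger than kT, which is when discrete quantum levels start to govern electronic behaviour.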

8.0.2. Simple to complex: a molecular perspective
Modern synthetic chemistry has reached the point where it is possible to prepare small molecules of almost any structure. These methods are used today to manufacture a wide variety of useful chemicals, such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level: seeking methods to assemble single molecules into supramolecular assemblies consisting of many molecules arranged in a well-defined manner.


These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to arrange components automatically into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick basepairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of a protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole. Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick basepairing and enzyme–substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.

8.0.3. Molecular nanotechnology: a long-term view
Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom by atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles. When the term "nanotechnology" was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi) it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular-scale biological analogies of traditional machine components demonstrated that molecular machines were possible: from the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced. It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification. The physics and engineering performance of exemplar designs were analyzed in Drexler's book Nanosystems. In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness.
Another view, put forth by Carlo Montemagno, is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Yet another view, put forward by the late Richard Smalley, is that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.


This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003. Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley. They have constructed at least three distinct molecular devices whose motion is controlled from the desktop by changing voltage: a nanotube nanomotor, a molecular actuator, and a nanoelectromechanical relaxation oscillator. An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

Mihail (Mike) Roco of the U.S. National Nanotechnology Initiative has described four generations of nanotechnology development (see chart below). The current era, as Roco depicts it, is that of passive nanostructures, materials designed to perform one task. The second phase, which we are just entering, introduces active nanostructures for multitasking; for example, actuators, drug delivery devices, and sensors. The third generation is expected to begin emerging around 2010 and will feature nanosystems with thousands of interacting components. A few years after that, the first integrated nanosystems, functioning (according to Roco) much like a mammalian cell with hierarchical systems within systems, are expected to be developed.

Some experts may still insist that nanotechnology can refer to measurement or visualization at the scale of 1–100 nanometers, but a consensus seems to be forming around the idea (put forward by the NNI's Mike Roco) that control and restructuring of matter at the nanoscale is a necessary element. CRN's definition is a bit more precise than that, but as work progresses through the four generations of nanotechnology leading up to molecular nanosystems, which will include molecular manufacturing, we think it will become increasingly obvious that "engineering of functional systems at the molecular scale" is what nanotech is really all about.

9.0.1. Conflicting Definitions
Unfortunately, conflicting definitions of nanotechnology and blurry distinctions between significantly different fields have complicated the effort to understand the differences and develop sensible, effective policy. The risks of today's nanoscale technologies (nanoparticle toxicity, etc.) cannot be treated the same as the risks of longer-term molecular manufacturing (economic disruption, unstable arms race, etc.). It is a mistake to put them together in one basket for policy consideration: each is important to address, but they pose different problems and will require different solutions. As used today, the term nanotechnology usually refers to a broad collection of mostly disconnected fields. Essentially, anything sufficiently small and interesting can be called nanotechnology. Much of it is harmless. For the rest, much of the harm is of familiar and limited quality. But as we will see, molecular manufacturing will bring unfamiliar risks and new classes of problems.

9.0.2. General-Purpose Technology
Nanotechnology is sometimes referred to as a general-purpose technology. That's because in its advanced form it will have significant impact on almost all industries and all areas of society. It will offer better built, longer lasting, cleaner, safer, and smarter products for the home, for communications, for medicine, for transportation, for agriculture, and for industry in general. Imagine a medical device that travels through the human body to seek out and destroy small clusters of cancerous cells before they can spread. Or a box no larger than a sugar cube that contains the entire contents of the Library of Congress. Or materials much lighter than steel that possess ten times as much strength. — U.S. National Science Foundation

9.0.3. Dual-Use Technology
Like electricity or computers before it, nanotech will offer greatly improved efficiency in almost every facet of life. But as a general-purpose technology, it will be dual-use, meaning it will have many commercial uses and it also will have many military uses—making far more powerful weapons and tools of surveillance. Thus it represents not only wonderful benefits for humanity, but also grave risks. A key understanding of nanotechnology is that it offers not just better products, but a vastly improved manufacturing process. A computer can make copies of data files—essentially as many copies as you want at little or no cost. It may be only a matter of time until the building of products becomes as cheap as the copying of files. That's the real meaning of nanotechnology, and why it is sometimes seen as "the next industrial revolution."


My own judgment is that the nanotechnology revolution has the potential to change America on a scale equal to, if not greater than, the computer revolution. — U.S. Senator Ron Wyden (D-Ore.)

The power of nanotechnology can be encapsulated in an apparently simple device called a personal nanofactory that may sit on your countertop or desktop. Packed with miniature chemical processors, computing, and robotics, it will produce a wide range of items quickly, cleanly, and inexpensively, building products directly from blueprints.

10.0.1. Nanomaterials
The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.
 Interface and colloid science has given rise to many materials which may be useful in nanotechnology, such as carbon nanotubes and other fullerenes, and various nanoparticles and nanorods. Nanomaterials with fast ion transport are related also to nanoionics and nanoelectronics.
 Nanoscale materials can also be used for bulk applications; most present commercial applications of nanotechnology are of this flavor.
 Progress has been made in using these materials for medical applications; see Nanomedicine.
 Nanoscale materials are sometimes used in solar cells, which combats the cost of traditional silicon solar cells.
 Development of applications incorporating semiconductor nanoparticles to be used in the next generation of products, such as display technology, lighting, solar cells and biological imaging; see quantum dots.

10.0.2. Bottom-up approaches
These seek to arrange smaller components into more complex assemblies.
 DNA nanotechnology utilizes the specificity of Watson–Crick basepairing to construct well-defined structures out of DNA and other nucleic acids.
 Approaches from the field of "classical" chemical synthesis (inorganic and organic synthesis) also aim at designing molecules with well-defined shape (e.g. bis-peptides).
 More generally, molecular self-assembly seeks to use concepts of supramolecular chemistry, and molecular recognition in particular, to cause single-molecule components to automatically arrange themselves into some useful conformation.
 Atomic force microscope tips can be used as a nanoscale "write head" to deposit a chemical upon a surface in a desired pattern in a process called dip pen nanolithography. This technique fits into the larger subfield of nanolithography.



10.0.3. Top-down approaches
These seek to create smaller devices by using larger ones to direct their assembly.
 Many technologies that descended from conventional solid-state silicon methods for fabricating microprocessors are now capable of creating features smaller than 100 nm, falling under the definition of nanotechnology. Giant magnetoresistance-based hard drives already on the market fit this description,[25] as do atomic layer deposition (ALD) techniques. Peter Grünberg and Albert Fert received the Nobel Prize in Physics in 2007 for their discovery of giant magnetoresistance and contributions to the field of spintronics.
 Solid-state techniques can also be used to create devices known as nanoelectromechanical systems or NEMS, which are related to microelectromechanical systems or MEMS.
 Focused ion beams can directly remove material, or even deposit material when suitable precursor gases are applied at the same time. For example, this technique is used routinely to create sub-100 nm sections of material for analysis in transmission electron microscopy.
 Atomic force microscope tips can be used as a nanoscale "write head" to deposit a resist, which is then followed by an etching process to remove material in a top-down method.



10.0.4. Functional approaches
These seek to develop components of a desired functionality without regard to how they might be assembled.
 Molecular scale electronics seeks to develop molecules with useful electronic properties. These could then be used as single-molecule components in a nanoelectronic device. For an example see rotaxane.
 Synthetic chemical methods can also be used to create synthetic molecular motors, such as in a so-called nanocar.

10.0.5. Biomimetic approaches
 Bionics or biomimicry seeks to apply biological methods and systems found in nature to the study and design of engineering systems and modern technology. Biomineralization is one example of the systems studied.
 Bionanotechnology is the use of biomolecules for applications in nanotechnology, including the use of viruses. Nanocellulose is a potential bulk-scale application.

10.0.6. Speculative
These subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than on the details of how such inventions could actually be created.
 Molecular nanotechnology is a proposed approach which involves manipulating single molecules in finely controlled, deterministic ways. This is more theoretical than the other subfields and is beyond current capabilities.
 Nanorobotics centers on self-sufficient machines of some functionality operating at the nanoscale. There are hopes for applying nanorobots in medicine, though it may not be easy to do such a thing because of several drawbacks of such devices. Nevertheless, progress on innovative materials and methodologies has been demonstrated, with some patents granted for new nanomanufacturing devices for future commercial applications, which also progressively helps in the development towards nanorobots with the use of embedded nanobioelectronics concepts.
 Productive nanosystems are "systems of nanosystems": complex nanosystems that produce atomically precise parts for other nanosystems, not necessarily using novel nanoscale-emergent properties but well-understood fundamentals of manufacturing. Because of the discrete (i.e. atomic) nature of matter and the possibility of exponential growth, this stage is seen as the basis of another industrial revolution. Mihail Roco, one of the architects of the USA's National Nanotechnology Initiative, has proposed four states of nanotechnology that seem to parallel the technical progress of the Industrial Revolution, progressing from passive nanostructures to active nanodevices to complex nanomachines and ultimately to productive nanosystems.

 Programmable matter seeks to design materials whose properties can be easily,

reversibly and externally controlled though a fusion of information science and materials science.  Due to the popularity and media exposure of the term nanotechnology, the words picotechnology and femtotechnology have been coined in analogy to it, although these are only used rarely and informally.

There are several important modern developments. The atomic force microscope (AFM) and the scanning tunneling microscope (STM) are two early versions of the scanning probes that launched nanotechnology. There are other types of scanning probe microscopy, all flowing from the ideas of the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, that made it possible to see structures at the nanoscale. The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). The feature-oriented scanning methodology suggested by Rostislav Lapshin appears to be a promising way to implement these nanomanipulations in automatic mode. However, this is still a slow process because of the low scanning velocity of the microscope. Various techniques of nanolithography, such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography and nanoimprint lithography, were also developed. Lithography is a top-down fabrication technique where a bulk material is reduced in size to a nanoscale pattern. Another group of nanotechnological techniques includes those used for the fabrication of nanotubes and nanowires; those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition; and molecular self-assembly techniques such as those employing di-block copolymers. However, all of these techniques preceded the nanotech era and are extensions of ongoing scientific advancement rather than techniques devised with the sole purpose of creating nanotechnology or resulting from nanotechnology research. The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made.
Scanning probe microscopy is an important technique both for characterization and for synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. Using, for example, the feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques. At present this is expensive and time-consuming for mass production, but very suitable for laboratory experimentation.



Typical AFM setup. A microfabricated cantilever with a sharp tip is deflected by features on a sample surface, much like in a phonograph but on a much smaller scale. A laser beam reflects off the backside of the cantilever into a set of photodetectors, allowing the deflection to be measured and assembled into an image of the surface.
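As a toy illustration of the raster-scan principle just described, the following Python sketch scans a synthetic surface and assembles point-by-point deflection readings into a height map. The surface function and all parameter values are invented for illustration, and the constant-height imaging mode is a deliberate simplification of how a real AFM feedback loop works:

```python
import math

def sample_height(x_nm, y_nm):
    """Stand-in for a real surface: nm-scale bumps (arbitrary test function)."""
    return 2.0 * math.sin(x_nm / 5.0) * math.cos(y_nm / 5.0)

def afm_raster_scan(width_nm, height_nm, step_nm):
    """Raster-scan the surface and assemble deflection readings into an image.

    In this simplified constant-height mode the cantilever deflection tracks
    the local surface height, so the recorded deflection map is itself the
    topography image.
    """
    image = []
    for row in range(int(height_nm / step_nm)):
        line = []
        for col in range(int(width_nm / step_nm)):
            deflection = sample_height(col * step_nm, row * step_nm)
            line.append(deflection)
        image.append(line)
    return image

image = afm_raster_scan(width_nm=50, height_nm=50, step_nm=1)
print(len(image), len(image[0]))  # 50 50: a 50 x 50 pixel topography map
```

A real instrument adds a feedback loop that adjusts the tip height to keep the deflection constant, but the scan-and-assemble structure is the same.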

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self-assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories, including John R. Arthur, Alfred Y. Cho, and Art C. Gossard, developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect, for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics. In addition, new therapeutic products based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and already approved for human use in some countries.

Nanomechanics is a branch of nanoscience studying the fundamental mechanical (elastic, thermal and kinetic) properties of physical systems at the nanometer scale. Nanomechanics has emerged at the crossroads of classical mechanics, solid-state physics, statistical mechanics, materials science, and quantum chemistry. As an area of nanoscience, nanomechanics provides a scientific foundation for nanotechnology. Often, nanomechanics is viewed as a branch of nanotechnology, i.e., an applied area with a focus on the mechanical properties of engineered nanostructures and nanosystems (systems with nanoscale components of importance). Examples of the latter include nanoparticles, nanopowders, nanowires, nanorods, nanoribbons, nanotubes, including carbon nanotubes (CNTs) and boron nitride nanotubes (BNNTs); nanoshells, nanomembranes, nanocoatings, nanocomposite/nanostructured materials (fluids with dispersed nanoparticles); nanomotors, etc. Some of the well-established fields of nanomechanics are nanomaterials, nanotribology (friction, wear and contact mechanics at the nanoscale), nanoelectromechanical systems (NEMS), and nanofluidics. As a fundamental science, nanomechanics is based on some empirical principles (basic observations): 1) general mechanics principles; 2) specific principles arising from the smallness of the physical sizes of the object of study. General mechanics principles include:
- Energy and momentum conservation principles
- Variational Hamilton's principle
- Symmetry principles

Due to the smallness of the studied object, nanomechanics also accounts for:

- Discreteness of the object, whose size is comparable with the interatomic distances
- Plurality, but finiteness, of degrees of freedom in the object
- Importance of thermal fluctuations
- Importance of entropic effects
- Importance of quantum effects
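The importance of thermal fluctuations for small objects can be made concrete with the Stokes-Einstein relation D = k_B·T/(6·π·η·r), which governs the Brownian diffusion of a spherical particle in a fluid. The short sketch below assumes water at room temperature; the particle sizes are illustrative:

```python
import math

def stokes_einstein_diffusivity(radius_m, temperature_k=298.0, viscosity_pa_s=8.9e-4):
    """Diffusion coefficient of a sphere in a fluid (Stokes-Einstein relation).

    D = k_B * T / (6 * pi * eta * r); the default viscosity is that of
    water at about 25 degrees C.
    """
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    return k_b * temperature_k / (6.0 * math.pi * viscosity_pa_s * radius_m)

# Brownian motion is far more vigorous for a 10 nm particle than a 10 um one:
d_nano = stokes_einstein_diffusivity(10e-9)
d_micro = stokes_einstein_diffusivity(10e-6)
print(d_nano / d_micro)  # ratio of about 1000: D scales as 1/r
```

Because D grows as the radius shrinks, thermal agitation that is negligible for a macroscopic bead dominates the motion of a nanoparticle.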

These principles provide a basic insight into the novel mechanical properties of nanometer-scale objects. Novelty is understood in the sense that these properties are not present in similar macroscale objects, or differ greatly from the properties of those (e.g., nanorods vs. usual macroscopic beam structures). In particular, the smallness of the object itself gives rise to various surface effects determined by the higher surface-to-volume ratio of nanostructures, and thus affects the mechanoenergetic and thermal properties (melting point, heat capacitance, etc.) of nanostructures. Discreteness serves as a fundamental reason, for instance, for the dispersion of mechanical waves in solids and for some special behavior of basic elastomechanics solutions at small scales. The plurality of degrees of freedom and the rise of thermal fluctuations are the reasons for thermal tunneling of nanoparticles through potential barriers, as well as for the cross-diffusion of liquids and solids. Smallness and thermal fluctuations provide the basic reasons for the Brownian motion of nanoparticles. The increased importance of thermal fluctuations and configurational entropy at the nanoscale gives rise to superelasticity, entropic elasticity (entropic
forces), and other exotic types of elasticity of nanostructures. Aspects of configurational entropy are also of great interest in the context of self-organization and cooperative behavior of open nanosystems. Quantum effects determine the forces of interaction between individual atoms in physical objects, which are introduced in nanomechanics by means of averaged mathematical models called interatomic potentials. Subsequent utilization of the interatomic potentials within classical multibody dynamics provides deterministic mechanical models of nanostructures and systems at the atomic scale/resolution. Numerical methods of solution of these models are called molecular dynamics (MD), and sometimes molecular mechanics (especially in relation to statically equilibrated (still) models). Nondeterministic numerical approaches include Monte Carlo, kinetic Monte Carlo (KMC), and other methods. Contemporary numerical tools also include hybrid multiscale approaches allowing concurrent or sequential utilization of atomistic-scale methods (usually MD) with continuum (macro) scale methods (usually the finite element method) within a single mathematical model. Development of these complex methods is a separate subject of applied mechanics research. Quantum effects also determine novel electrical, optical and chemical properties of nanostructures, and therefore they receive even greater attention in adjacent areas of nanoscience and nanotechnology, such as nanoelectronics, advanced energy systems, and nanobiotechnology.
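As a minimal illustration of the molecular-dynamics approach just described, the sketch below integrates the separation of a single pair of atoms interacting through a Lennard-Jones interatomic potential, using the velocity Verlet scheme. Units are reduced (epsilon = sigma = mass = 1) and the one-dimensional two-body setup is deliberately simplified; real MD codes track many atoms in 3-D with neighbor lists and thermostats:

```python
def lj_force(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones force between two atoms at separation r (reduced units).

    F(r) = 24*eps*(2*(sigma/r)**12 - (sigma/r)**6)/r; positive = repulsive.
    """
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * sr6 * sr6 - sr6) / r

def velocity_verlet(r0, v0, dt=1e-3, steps=1000, mass=1.0):
    """Integrate the 1-D two-body separation with the velocity Verlet scheme."""
    r, v = r0, v0
    a = lj_force(r) / mass
    for _ in range(steps):
        r += v * dt + 0.5 * a * dt * dt      # position update
        a_new = lj_force(r) / mass           # force at the new position
        v += 0.5 * (a + a_new) * dt          # velocity update (averaged force)
        a = a_new
    return r, v

# Start slightly stretched from the LJ minimum at r = 2**(1/6) ~ 1.12:
# the bound pair oscillates about the potential minimum.
r_final, v_final = velocity_verlet(r0=1.3, v0=0.0)
```

The same update loop, applied to thousands of atoms with a realistic potential, is the core of every deterministic MD simulation mentioned above.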

Anyone coming to nanotechnology for the first time may experience mixed feelings: perhaps excitement tinged with anxiety. Kulinowski warns in Chapter 2 that Wow! (wonderful) could easily turn to Yuck! (horrible) in the public mind, depending on several factors not necessarily under the control of scientists, technologists, researchers, corporations and government departments and agencies. ‘Nanotechnology’ as a conception will be nurtured within preexisting popular mindsets; and the media and popular art forms (films, novels and so on) will have an impact on those mindsets and should not be underestimated. Nanotechnology is as much a public issue as it is an expert issue, and as much a social science subject as a natural science subject. Kulinowski points out that, despite this latent instability in perception, there remains a significant disparity between the research effort that is going into applications and the scant attention given to the whole range of social implications of nanotechnology. Here we could include public understanding, media reception, cultural and religious issues, ethical and legal dimensions, the globalizing context, governance and accountability, disruptive impact on other technologies and on economies, and political and military implications. Of course, it is not just a case of either a Wow! or a Yuck! response, but one of choices and tensions between human welfare benefits and hazards and risks to human health and the environment. The most important questions about nanotechnologies may not be posed, or not posed sufficiently quickly, systematically and deeply, if it is left to the powerful forces of commerce and competition. 
In the context of the latter the benefits may be stressed and the questions skewed towards issues of sufficiency of investment, profitability, receptivity of markets, intellectual property, speed of innovation and application (Mehta, 2002), the necessary economic infrastructures, funding of research and development, commercial confidentiality and the like. These are all important questions, but they belong to a discourse that may overlap with, but is not the same as, the human welfare discourse. This book makes

forays into both discourses, paying more attention to the latter to help achieve an overall balance, and emerges with concerns as well as hopes.

With nanotechnology, a large set of materials and improved products rely on a change in physical properties when the feature sizes are shrunk. Nanoparticles, for example, take advantage of their dramatically increased surface-area-to-volume ratio. Their optical properties, e.g. fluorescence, become a function of the particle diameter. When brought into a bulk material, nanoparticles can strongly influence the mechanical properties of the material, such as stiffness or elasticity. For example, traditional polymers can be reinforced by nanoparticles, resulting in novel materials which can be used as lightweight replacements for metals. Therefore, an increasing societal benefit from such nanoparticles can be expected. Such nanotechnologically enhanced materials will enable a weight reduction accompanied by an increase in stability and improved functionality. Practical nanotechnology is essentially the increasing ability to manipulate (with precision) matter on previously impossible scales, presenting possibilities which many could never have imagined; it therefore seems unsurprising that few areas of human technology are exempt from the benefits which nanotechnology could potentially bring.
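The increase in surface-area-to-volume ratio mentioned above is easy to quantify for a spherical particle, where the ratio works out to 6/d. A few lines of Python make the scaling explicit (the particle sizes are illustrative):

```python
import math

def surface_to_volume_ratio(diameter_nm):
    """Surface-area-to-volume ratio of a sphere, in 1/nm (equals 6/d)."""
    radius = diameter_nm / 2.0
    area = 4.0 * math.pi * radius ** 2
    volume = (4.0 / 3.0) * math.pi * radius ** 3
    return area / volume

# Shrinking a particle from 1 um to 10 nm boosts the ratio a hundredfold:
print(surface_to_volume_ratio(10))    # 0.6 per nm
print(surface_to_volume_ratio(1000))  # 0.006 per nm
```

This inverse-diameter scaling is why a gram of nanoparticles exposes vastly more surface, and hence more reactive or optically active atoms, than the same gram of bulk material.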



14.0.1 Consumer goods
Nanotechnology is already impacting the field of consumer goods, providing products with novel functions ranging from easy-to-clean to scratch-resistant. Modern textiles are wrinkle-resistant and stain-repellent; in the mid-term, clothes will become "smart" through embedded "wearable electronics". Various nanoparticle-improved products are already in use. Especially in the field of cosmetics, such novel products have a promising potential.

Foods
A complex set of engineering and scientific challenges in the food and bioprocessing industry for manufacturing high-quality and safe food through efficient and sustainable means can be solved through nanotechnology. Bacteria identification and food quality monitoring using biosensors; intelligent, active, and smart food packaging systems; and nanoencapsulation of bioactive food compounds are a few examples of emerging applications of nanotechnology for the food industry. Nanotechnology can be applied in the production, processing, safety and packaging of food. A nanocomposite coating process could improve food packaging by placing antimicrobial agents directly on the surface of the coated film. Nanocomposites could increase or decrease gas permeability of different fillers as needed for different products. They can also improve the mechanical and heat-resistance properties and lower the oxygen transmission rate. Research is being performed to apply nanotechnology to the detection of chemical and biological substances for sensing changes in foods.

Nano-foods
New foods are among the nanotechnology-created consumer products coming onto the market at the rate of 3 to 4 per week, according to the Project on Emerging Nanotechnologies (PEN), based on an inventory it has drawn up of 609 known or claimed nano-products. On PEN's list are three foods: a brand of canola cooking oil called Canola Active Oil, a tea called Nanotea and a chocolate diet shake called Nanoceuticals Slim Shake Chocolate. According to company information posted on PEN's Web site, the canola oil, by Shemen Industries of Israel, contains an additive called "nanodrops" designed to carry vitamins, minerals and phytochemicals through the digestive system. The shake, according to U.S. manufacturer RBC Life Sciences Inc., uses cocoa-infused "NanoClusters" to enhance the taste and health benefits of cocoa without the need for extra sugar.

Household
The most prominent application of nanotechnology in the household is self-cleaning or "easy-to-clean" surfaces on ceramics or glasses. Nano ceramic particles have improved the smoothness and heat resistance of common household equipment such as the flat iron.

Optics
The first sunglasses using protective and anti-reflective ultrathin polymer coatings are on the market. For optics, nanotechnology also offers scratch-resistant surface coatings based on nanocomposites. Nano-optics could allow for an increase in precision of pupil repair and other types of laser eye surgery.

Textiles
The use of engineered nanofibers already makes clothes water- and stain-repellent or wrinkle-free. Textiles with a nanotechnological finish can be washed less frequently and at lower temperatures. Nanotechnology has been used to integrate tiny carbon particles into membranes and
guarantee full-surface protection from electrostatic charges for the wearer. Many other applications have been developed by research institutions such as the Textiles Nanotechnology Laboratory at Cornell University, and the UK's Dstl and its spin-out company P2i.

Cosmetics
One field of application is in sunscreens. The traditional chemical UV-protection approach suffers from poor long-term stability. A sunscreen based on mineral nanoparticles such as titanium dioxide offers several advantages. Titanium dioxide nanoparticles have a UV-protection property comparable to that of the bulk material, but lose the cosmetically undesirable whitening as the particle size is decreased.

Agriculture
Applications of nanotechnology have the potential to change the entire agriculture sector and the food industry chain from production to conservation, processing, packaging, transportation, and even waste treatment. Nanoscience concepts and nanotechnology applications have the potential to redesign the production cycle, restructure the processing and conservation processes and redefine the food habits of the people. Major challenges related to agriculture, such as low productivity in cultivable areas, large uncultivable areas, shrinkage of cultivable lands, wastage of inputs like water, fertilizers and pesticides, wastage of products and, of course, food security for growing populations, can be addressed through various applications of nanotechnology.
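The size-dependent loss of whitening noted above for titanium dioxide sunscreens can be rationalized with a rough scattering estimate: for particles well below the wavelength of visible light, Rayleigh scattering strength scales as d⁶/λ⁴, so shrinking the particles suppresses visible scattering (whiteness) dramatically while the UV absorption of the material is largely retained. The sketch below is only an order-of-magnitude illustration; refractive-index prefactors are omitted and the larger particle is near the edge of the Rayleigh regime:

```python
def rayleigh_scattering_scale(diameter_nm, wavelength_nm):
    """Relative Rayleigh scattering strength, proportional to d**6 / lambda**4.

    Valid only for particles much smaller than the wavelength; prefactors
    (refractive-index terms) are omitted, so only ratios are meaningful.
    """
    return diameter_nm ** 6 / wavelength_nm ** 4

# Visible light (550 nm) scattered by 200 nm vs 20 nm particles:
bulk_like = rayleigh_scattering_scale(200, 550)
nanoscale = rayleigh_scattering_scale(20, 550)
print(bulk_like / nanoscale)  # about 1e6: the small particles look transparent
```

A million-fold drop in visible scattering is why a nanoparticle sunscreen rubs in clear where a conventional mineral paste looks white.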

14.0.2 Heavy industry
An inevitable use of nanotechnology will be in heavy industry.

Aerospace
Lighter and stronger materials will be of immense use to aircraft manufacturers, leading to increased performance. Spacecraft will also benefit, where weight is a major factor. Nanotechnology would help to reduce the size of equipment and thereby decrease the fuel consumption required to get it airborne. Hang gliders may be able to halve their weight while increasing their strength and toughness through the use of nanotech materials. Nanotech is lowering the mass of supercapacitors that will increasingly be used to power assistive electrical motors for launching hang gliders off flatland to thermal-chasing altitudes.

Catalysis
Chemical catalysis benefits especially from nanoparticles, due to their extremely large surface-to-volume ratio. The application potential of nanoparticles in catalysis ranges from fuel cells to catalytic converters and photocatalytic devices. Catalysis is also important for the production of chemicals. Chemical synthesis provides novel materials with tailored features and chemical properties: for example, nanoparticles with a distinct chemical surrounding (ligands), or specific optical properties. In this sense, chemistry is indeed a basic nanoscience. In a short-term perspective, chemistry will provide novel "nanomaterials", and in the long run superior processes such as "self-assembly" will enable energy- and time-preserving strategies. In a sense, all chemical synthesis can be understood in terms of nanotechnology, because of its ability to manufacture certain molecules. Thus, chemistry forms a base for nanotechnology, providing tailor-made molecules and polymers as well as clusters and nanoparticles.
Platinum nanoparticles are now being considered for the next generation of automotive catalytic converters, because the very high surface area of nanoparticles could reduce the amount of platinum required. However, some concerns have been raised by experiments demonstrating that they will spontaneously combust if methane is mixed with the ambient air. Ongoing research at the Centre National de la Recherche Scientifique (CNRS) in France may resolve their true usefulness for catalytic applications. Nanofiltration may come to be an important application, although future research must be careful to investigate possible toxicity.

Construction
Nanotechnology has the potential to make construction faster, cheaper, safer, and more varied. Automation of nanotechnology construction can allow for the creation of structures, from advanced homes to massive skyscrapers, much more quickly and at much lower cost. In the near future, nanotechnology could be used to sense cracks in the foundations of buildings and to send nanobots to repair them.

Nanotechnology and construction
Nanotechnology is one of the most active research areas, encompassing a number of disciplines such as electronics, bio-mechanics and coatings, including civil engineering and construction materials. The use of nanotechnology in construction involves the development of new concepts and an understanding of the hydration of cement particles, and the use of nano-size ingredients such as alumina and silica and other nanoparticles. Manufacturers are also investigating methods of manufacturing nano-cement. If cement with nano-size particles can be manufactured and processed, it will open up a large number of opportunities in the fields of ceramics, high-strength composites and electronic applications, since at the nanoscale the properties of a material differ from those of its bulk counterpart.
When materials become nanosized, the proportion of atoms on the surface increases relative to those inside, and this leads to novel properties. Some applications of nanotechnology in construction are described below.

Nanoparticles and steel
Steel is a widely available material that has a major role in the construction industry. The use of nanotechnology in steel helps to improve the properties of steel. Fatigue can lead to the structural failure of steel under cyclic loading, such as in bridges or towers. Current steel designs are based on a reduction in the allowable stress, service life or regular inspection regime. This has a significant impact on the life-cycle costs of structures and limits the effective use of resources. Stress risers are responsible for initiating cracks from which fatigue failure results. The addition of copper nanoparticles reduces the surface unevenness of steel, which limits the number of stress risers and hence fatigue cracking. Advancements in this technology using nanoparticles would lead to increased safety, less need for regular inspection regimes and more efficient materials free from fatigue issues for construction. Nano-size steel produces stronger steel cables, which can be used in bridge construction. These stronger cable materials would also reduce the costs and period of construction, especially in suspension bridges, where the cables are run from end to end of the span. This would require high-strength joints, which leads to the need for high-strength bolts. The capacity of high-strength bolts is obtained through quenching and tempering. The microstructures of such products consist of tempered martensite. When the tensile strength of tempered martensite steel exceeds 1,200 MPa, even a very small amount of hydrogen embrittles the grain boundaries and the steel material may fail during use. This phenomenon, known as delayed fracture, has
hindered the strengthening of steel bolts, whose highest strength is limited to only around 1,000 to 1,200 MPa. The use of vanadium and molybdenum nanoparticles improves the delayed-fracture problems associated with high-strength bolts, reducing the effects of hydrogen embrittlement and improving the steel microstructure by reducing the effects of the intergranular cementite phase. Welds, and the heat-affected zone (HAZ) adjacent to welds, can be brittle and fail without warning when subjected to sudden dynamic loading. The addition of nanoparticles of magnesium and calcium makes the HAZ grains finer in plate steel, and this leads to an increase in weld toughness. The increase in toughness would result in a smaller resource requirement, because less material is required in order to keep stresses within allowable limits. Carbon nanotubes are an exciting material with tremendous strength and stiffness, but they have found little application compared to steel, because it is difficult to bind them with bulk material and they pull out easily, which makes them ineffective in construction materials.

Vehicle manufacturers
Much like aerospace, lighter and stronger materials will be useful for creating vehicles that are both faster and safer. Combustion engines will also benefit from parts that are more hard-wearing and more heat-resistant.

14.0.3 Medicine
The biological and medical research communities have exploited the unique properties of nanomaterials for various applications (e.g., contrast agents for cell imaging and therapeutics for treating cancer). Terms such as biomedical nanotechnology, nanobiotechnology, and nanomedicine are used to describe this hybrid field. Functionalities can be added to nanomaterials by interfacing them with biological molecules or structures. The size of nanomaterials is similar to that of most biological molecules and structures; therefore, nanomaterials can be useful for both in vivo and in vitro biomedical research and applications. Thus far, the integration of nanomaterials with biology has led to the development of diagnostic devices, contrast agents, analytical tools, physical therapy applications, and drug delivery vehicles.

Diagnostics
Nanotechnology-on-a-chip is one more dimension of lab-on-a-chip technology. Magnetic nanoparticles, bound to a suitable antibody, are used to label specific molecules, structures or microorganisms. Gold nanoparticles tagged with short segments of DNA can be used for detection of a genetic sequence in a sample. Multicolor optical coding for biological assays has been achieved by embedding different-sized quantum dots into polymeric microbeads. Nanopore technology for analysis of nucleic acids converts strings of nucleotides directly into electronic signatures.

Drug delivery
Nanotechnology has been a boon for the medical field by delivering drugs to specific cells using nanoparticles. The overall drug consumption and side effects can be lowered significantly by depositing the active agent in the morbid region only, and in no higher dose than needed. This highly selective approach reduces costs and human suffering. Examples can be found in dendrimers and nanoporous materials. Another example is the use of block copolymers, which form micelles for drug encapsulation; they can hold small drug molecules and transport them to the desired location. Another vision is based on small electromechanical systems: nanoelectromechanical systems are being investigated for the active release of drugs. Some potentially important applications include cancer treatment with iron nanoparticles or gold shells. Targeted or personalized medicine reduces drug consumption and treatment expenses, resulting in an overall societal benefit by reducing the costs to the public health system. Nanotechnology is also opening up new opportunities in implantable delivery systems, which are often preferable to the use of injectable drugs, because the latter frequently display first-order kinetics (the blood concentration rises rapidly, but drops exponentially over time). This rapid rise may cause difficulties with toxicity, and drug efficacy can diminish as the drug concentration falls below the targeted range.

Tissue engineering
Nanotechnology can help reproduce or repair damaged tissue. "Tissue engineering" makes use of artificially stimulated cell proliferation by using suitable nanomaterial-based scaffolds and growth factors. For example, bones can be regrown on carbon nanotube scaffolds. Tissue engineering might replace today's conventional treatments such as organ transplants or artificial implants. Advanced forms of tissue engineering may lead to life extension.
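The kinetic argument behind implantable delivery systems can be illustrated with a simple one-compartment model: a bolus injection decays exponentially (first-order elimination), while an implant releasing drug at a constant (zero-order) rate approaches a steady plateau. All parameter values below are hypothetical and chosen only to show the contrast in the two concentration profiles:

```python
import math

def bolus_concentration(t_h, dose=100.0, k_elim=0.35):
    """One-compartment bolus: C(t) = C0 * exp(-k*t), first-order elimination.

    Units are arbitrary: t_h in hours, dose as the initial concentration C0.
    """
    return dose * math.exp(-k_elim * t_h)

def implant_concentration(t_h, rate=10.0, k_elim=0.35):
    """Zero-order release into the same compartment: C(t) = (R/k)*(1 - exp(-k*t)).

    Concentration rises smoothly to a plateau R/k instead of spiking and decaying.
    """
    return (rate / k_elim) * (1.0 - math.exp(-k_elim * t_h))

# The bolus spikes high, then falls toward zero; the implant holds a plateau.
print(round(bolus_concentration(0), 1), round(bolus_concentration(12), 1))
print(round(implant_concentration(24), 1))  # near the plateau rate/k_elim
```

The bolus curve spends part of its time above the toxic range and part below the therapeutic floor, while the zero-order profile can sit inside the targeted window indefinitely, which is exactly the advantage the text ascribes to implantable systems.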
