
CIET, COIMBATORE-109

S.Suresh Kumar / L / ECE Dept

UNIT V COMPUTER APPLICATIONS IN MEDICAL FIELD

BIO MEDICAL TELEMETRY

Biomedical telemetry is the use of telemetry methods for sending signals from a living organism over some distance to a receiver. Usually, biotelemetry is used for gathering data about the physiology, behavior, or location of the organism. Generally, the signals are carried by radio, light, or sound waves; consequently, biotelemetry implies the absence of wires between the subject and the receiver. Biotelemetry techniques are necessary in situations where wires running from a subject to a recorder would inhibit the subject's activity, where the proximity of an investigator might alter the subject's behavior, and where the movements of the subject and the duration of the monitoring make it impractical for the investigator to remain within sight of the subject.

Biotelemetry is widely used in medical fields to monitor patients and research subjects, and now even to operate devices such as drug delivery systems and prosthetics. Sensors and transmitters placed on or implanted in animals are used to study physiology and behavior in the laboratory and to study the movements, behavior, and physiology of wildlife species in their natural environments.

Biotelemetry is an important technique for biomedical research and clinical medicine. Perhaps cardiovascular research and treatment have benefited the most: heart rate, blood flow, and blood pressure can be measured in ambulatory subjects and transmitted to a remote receiver-recorder. Telemetry has also been used to obtain data about local oxygen pressure on the surface of organs (for example, the liver and myocardium) and for studies of capillary exchange (that is, oxygen supply and discharge). Biomedical research with telemetry includes measuring cardiovascular performance during the weightlessness of space flight and portable monitoring of radioactive indicators as they are dispersed through the body by the blood vessels. See also space biology.

Telemetry has been applied widely to animal research, for example, to record electroencephalograms, heart rates, heart muscle contractions, and respiration, even from sleeping mammals and birds. Telemetry and video recording have been combined in research on the relationships between neural and cardiac activity and behavior.

Using miniature electrodes and transmitters, ethologists have studied the influence of one bird's song on the heart rate and behavior of a nearby bird. Many species of wildlife are difficult to find and observe because they are secretive, nocturnal, wide-ranging, or move rapidly. Most commonly, a transmitter is placed on a wild animal so that biologists can track or locate it, either by homing toward the transmitted signal or by estimating the location from the intersection of two or more bearings from the receiver toward the signal. For some purposes, after homing to a transmitter-marked animal, the biologists observe its behavior. For other studies, successive estimates of location are plotted on a map to describe movement patterns, to delineate the amount of area the animal requires, or to determine dispersal or migration paths. Ecologists can associate the vegetation or other features of the environment with the locations of the animal.

There are usually two concerns associated with the use of biotelemetry: the distance over which the signal can be received, and the size of the transmitter package. Often, both of these concerns depend on the power source for the transmitter. Integrated circuits and surface-mount technology allow production of very small electronic circuitry in transmitters, making batteries the largest part of the transmitter package. However, the more powerful transmitters with their larger batteries are more difficult to place on or implant in a subject without affecting the subject's behavior or energetics.

RADIO PILL

A radio pill is a capsule containing a miniature radio transmitter that can be swallowed by a patient. During its passage through the digestive tract, the radio pill transmits information about internal conditions (acidity, etc.).

PHYSIOLOGICAL PARAMETER MONITORING IN SPACE STATION
A wearable physiological monitoring system consists of an array of sensors embedded into the fabric of the wearer to continuously monitor the physiological parameters and transmit them wirelessly to a remote monitoring station. At the remote monitoring station the data are correlated to study the overall health status of the wearer. In a conventional wearable physiological monitoring system, the sensors are integrated at specific locations on the vest and are interconnected to the wearable data acquisition hardware by wires woven into the fabric. The drawback of these systems is that the cables woven into the fabric pick up noise, such as power-line interference and signals from nearby radiating sources, thereby corrupting the physiological signals. Also, repositioning the sensors in the fabric is difficult once they are integrated.

These problems can be overcome by using physiological sensors with miniaturized electronics for conditioning, processing, digitizing and wireless transmission integrated into a single module. These sensors are strategically placed at various locations on the vest. A number of sensors integrated into the fabric form a network (a Personal Area Network) that interacts with the human body to acquire and transmit the physiological data to a wearable data acquisition system. The wearable data acquisition hardware collects the data from the various sensors and transmits the processed data to the remote monitoring station, where the data are correlated to study the overall health status of the wearer. Wearable monitoring systems allow an individual to monitor his or her vital signs remotely and receive feedback to maintain good health status; these systems also alert medical personnel when abnormalities are detected.

The conventional physiological monitoring systems used in hospitals cannot be used for wearable physiological monitoring for the following reasons:
* The conventional physiological monitoring systems are too bulky for wearable monitoring.
* The gels used in the electrodes dry out when used over a period of time, which leads to an increase in contact resistance and thereby degrades the signal quality.
* The gels used in the electrodes cause irritation and rashes when used for longer durations.
* There are a number of hampering wires from the sensors to the data acquisition system.
* The acquired signals are affected by motion artifacts and baseline wander because the electrodes float on a layer of gel.
* The sensors used in conventional monitoring systems are bulky and are not comfortable to wear for longer durations.

To overcome these problems, there is a need to develop sensors for wearable monitoring, integrate them into the fabric of the wearer, and continuously monitor the physiological parameters. Wearable data acquisition, processing and transmission hardware that is portable, comfortable to wear for longer durations and has sustainable battery power, together with a remote monitoring station, must be developed. The wearable physiological monitoring system consists of three subsystems, namely (a) a vest with the sensors integrated, (b) wearable data acquisition and processing hardware, and (c) a remote monitoring station. In the vest, sensors for acquiring the physiological parameters are integrated. The sensor outputs and power cables are interconnected

CARDIAC ARRHYTHMIA MONITORING SYSTEM

A looping monitor continuously records the patient's ECG. When an event is triggered manually or automatically, the monitor goes into loop mode to save the selected pre-symptom portion of the ECG rhythm while continuing to record a post-symptom portion of the ECG. Patients are monitored with looping cardiac event monitors for up to 30 days. Each patient is provided with a Life Watch cardiac monitor, which is used when symptoms occur. An ECG is recorded when symptoms occur, and the data are transmitted to the Life Watch cardiac monitoring center. Upon receipt, the ECG is immediately reviewed and acted upon by nurses and certified cardiac technicians according to the enrolling physician's orders.
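To make the loop-mode idea concrete, below is a minimal Python sketch of a looping event recorder with a rolling pre-symptom buffer. The class name, sampling rate and buffer durations are illustrative assumptions and are not taken from the Life Watch device.

from collections import deque

class LoopRecorder:
    """Minimal sketch of a looping cardiac event recorder.

    A rolling window of the most recent ECG samples is kept; when an
    event is triggered (manually or automatically), the pre-symptom
    buffer is frozen and post-symptom samples are appended until the
    requested strip length has been captured.
    """

    def __init__(self, fs=250, pre_s=45, post_s=15):
        self.fs = fs                          # sampling rate in Hz (assumed)
        self.pre = deque(maxlen=fs * pre_s)   # rolling pre-symptom buffer
        self.post_needed = fs * post_s        # samples to keep after the event
        self.capture_len = 0
        self.event = None                     # captured strip, once triggered

    def push(self, sample):
        if self.event is None:
            self.pre.append(sample)           # normal looping mode
        elif len(self.event) < self.capture_len:
            self.event.append(sample)         # still recording post-symptom ECG

    def trigger(self):
        """Freeze the pre-symptom loop and start the post-symptom capture."""
        self.capture_len = len(self.pre) + self.post_needed
        self.event = list(self.pre)

# usage: feed samples continuously and call trigger() when symptoms occur
rec = LoopRecorder()
for t in range(300 * rec.fs):                 # five minutes of dummy samples
    rec.push(0.0)
    if t == 200 * rec.fs:
        rec.trigger()
print(len(rec.event) / rec.fs, "seconds of ECG captured")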

REMOTE ARRHYTHMIA MONITORING SYSTEM DEVELOPED

Telemedicine is taking a step forward with the efforts of team members from the NASA Glenn Research Center, the Metro Health campus of Case Western University, and the University of Akron. The Arrhythmia Monitoring System is a completed, working test bed developed at Glenn that collects real-time electrocardiogram (ECG) signals from a mobile or homebound patient, combines these signals with Global Positioning System (GPS) location data, and transmits them to a remote station for display and monitoring. Approximately 300,000 Americans die every year from sudden cardiac arrhythmias. However, not all patients identified as at risk for arrhythmias can be monitored continuously because of technological and economic limitations. Patients at moderate risk of arrhythmias would benefit from technology that permits long-term continuous monitoring of electrical cardiac rhythms outside the hospital environment. Embedded Web Technology, developed at Glenn to remotely command and collect data from embedded systems using Web technology, is the catalyst for this new telemetry system (ref. 1).

In the end-to-end system architecture, ECG signals are collected from a patient using an event recorder and are transmitted to a handheld personal digital assistant (PDA) using Bluetooth, a short-range wireless technology. The PDA concurrently tracks the patient's location via a connection to a GPS receiver. A long-distance link is established via a standard Internet connection over a 2.5-generation Global System for Mobile Communications/General Packet Radio Service (GSM/GPRS) cellular wireless infrastructure. Then, the digital signal is transmitted to a call center for monitoring by medical professionals.
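As a rough illustration of how such an end-to-end link might package its data, the Python sketch below bundles a short block of ECG samples with the latest GPS fix and pushes it over a plain TCP socket. The JSON layout, host name and port are invented stand-ins; they do not describe the actual protocol used by the Glenn system.

import json
import socket
import time

def make_packet(ecg_samples, lat, lon, patient_id="demo-001"):
    """Bundle a block of ECG samples with the latest GPS fix.
    The field names are purely illustrative."""
    return json.dumps({
        "patient_id": patient_id,
        "timestamp": time.time(),
        "gps": {"lat": lat, "lon": lon},
        "ecg": ecg_samples,                  # e.g. one second of samples per lead
    }).encode()

def send_packet(packet, host="callcenter.example.org", port=5000):
    """Push one packet over TCP (a stand-in for the GSM/GPRS + Internet link)."""
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(packet)

# usage with dummy data (commented out because it needs a listening server):
# send_packet(make_packet([0.10, 0.12, 0.09], lat=41.50, lon=-81.68))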

Figure: Three-lead ECG signal displayed at a call center, along with real-time GPS tracking of a patient's location.

The call center is a personal computer with an Internet address that collects and displays the ECG signal in the traditional strip-chart fashion. Because the GPRS network capacity is shared among many users in a given coverage area, data throughput varies. Software developed for the call center monitors the data rate, buffers the ECG signal as needed, and dynamically adjusts the display update to keep the strip chart in constant motion. Non-ECG data are also transmitted from the patient event recorder to create a safer, viable system. The event recorder can display a low-battery indicator and send an alert to the call center to ensure the condition is acknowledged and addressed. In addition, a patient can send a noncritical medical alert to the call center by pressing a button on the event recorder when a heart flutter or other unusual feeling occurs. The time of the alert is marked in the ECG signal stream for later inspection by a cardiac specialist. Finally, a panic button is available to patients to send a critical alert for help. Call center personnel can then dispatch 911 services and provide them with the most recent GPS position to locate the patient.

ELECTROENCEPHALOGRAPHY (EEG) SIGNAL ANALYSIS THROUGH DSP ALGORITHMS

Electroencephalography (EEG) is the recording of electrical signals emanating from the human brain, collected from the scalp. The parameters and patterns of these signals indicate the health of the brain, and EEG is a key area of biomedical data analysis. Using digital signal processing (DSP) functions, EEG signals can be analyzed to properly diagnose the patient. The latest biomedical embedded electronic systems with DSP processors can display the computed results, helping the doctor save time in analyzing complex EEG waveforms. The main DSP-based analysis methods are:

1. Spectral estimation
2. Periodogram
3. Maximum entropy method
4. AR method
5. Moving average method
6. ARMA method
7. Maximum likelihood method

Here we provide some basics of each method.

Spectral estimation: Spectral estimation helps in finding the rhythms present in the EEG signal. A short segment of EEG data is analyzed for spectral parameters such as the location and amount of spectral energy. Wave-shaping filters are extensively used in this technique: they produce a desired output signal for a given input signal. If the desired signal is a unit impulse, the filter is called a spiking filter; spiking filters can be used to locate positions of concentrated energy in the signal. Such study of the energy concentration in an EEG signal is called spectral estimation analysis.

Periodogram: The periodogram is an estimate of the spectral density, which can be obtained from the estimated correlation function.

Maximum entropy method: This helps to measure the randomness and uncertainty associated with the EEG signal. The maximum entropy method works even if information or constraints on a process X(n) are absent.

AR method: The autoregressive (AR) method is preferred when the signal's spectrum has sharp peaks. AR modeling is popular because an accurate estimate of the power spectral density (PSD) can be obtained by solving linear equations. The AR model is called an all-pole model, in which each sample of the signal is expressed as a combination of previous samples plus an error signal.

ARMA (autoregressive moving average) method: This model is suggested for modeling signals with sharp peaks and valleys in their frequency content, and also signals with severe background noise.

Maximum likelihood method: This blends the information already available from prior knowledge with the latest measurements. The resulting value is an optimal estimate of the actual value.
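As a small illustration of two of the methods listed above, the Python sketch below estimates the spectrum of a synthetic EEG-like signal with a periodogram and with an AR (all-pole) model fitted via the Yule-Walker equations. The sampling rate, model order and 10 Hz test rhythm are assumptions chosen for demonstration only.

import numpy as np
from scipy.signal import periodogram

fs = 256                                     # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
# synthetic "EEG": a 10 Hz alpha-like rhythm buried in noise
x = np.sin(2 * np.pi * 10 * t) + 0.8 * np.random.randn(t.size)

# 1. Periodogram: direct estimate of the power spectral density
f, pxx = periodogram(x, fs=fs)

# 2. AR (all-pole) model: each sample is a weighted sum of previous
#    samples plus noise; coefficients come from the Yule-Walker equations
def yule_walker_ar(x, order):
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[x.size - 1:] / x.size   # autocorrelation
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])   # AR coefficients
    sigma2 = r[0] - a @ r[1:order + 1]       # driving-noise variance
    return a, sigma2

def ar_psd(a, sigma2, freqs, fs):
    # PSD of an AR model: sigma^2 / |1 - sum_k a_k exp(-j 2 pi f k / fs)|^2
    k = np.arange(1, a.size + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs / fs, k)) @ a) ** 2
    return sigma2 / denom

a, s2 = yule_walker_ar(x, order=8)
pxx_ar = ar_psd(a, s2, f, fs)
print("peak (periodogram): %.1f Hz" % f[np.argmax(pxx)])
print("peak (AR model):    %.1f Hz" % f[np.argmax(pxx_ar)])

The AR estimate is much smoother than the raw periodogram, which is one reason all-pole modeling is popular for short EEG segments with sharp spectral peaks.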

EEG
An EEG (electroencephalogram) is a measurement of time-varying potential differences that reflect the electrical activity of (for instance) the human brain. The EEG is an important clinical aid for the diagnosis of epilepsy, since the EEG of patients with epilepsy can reveal typical epileptiform activity, during seizures (ictal EEG) as well as in between seizures (interictal EEG). The most prominent example of interictal epileptiform activity is the epileptic spike. Using electrical source localization techniques, it is possible to identify the so-called irritative zone, i.e., the area in the brain where the interictal spikes originate.

EEG source analysis


In the context of EEG source analysis, the forward problem is the calculation of the potential fields that result from given current sources. The reconstruction of these current sources from the EEG is referred to as the inverse problem. The solution to the inverse problem is not unique: different source configurations can give rise to the same EEG. Therefore, a model of the sources is required to constrain the possible solutions. A widely used source model is the current dipole, suitable for modeling neural currents that are assumed to be localized in one small area.
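A minimal sketch of the forward problem under the simplest possible assumption, an infinite homogeneous conductor, where the potential of a current dipole has a closed form. The electrode positions, conductivity and dipole moment below are illustrative values only; realistic head models require the numerical techniques described next.

import numpy as np

def dipole_potential(electrodes, r0, p, sigma=0.33):
    """Potential (V) at each electrode due to a current dipole at r0
    with moment p, in an infinite homogeneous conductor of
    conductivity sigma (S/m):
        V(r) = p . (r - r0) / (4 * pi * sigma * |r - r0|**3)
    Positions in metres, dipole moment in A*m."""
    d = electrodes - r0                       # vectors from dipole to electrodes
    dist3 = np.linalg.norm(d, axis=1) ** 3
    return (d @ p) / (4 * np.pi * sigma * dist3)

# illustrative ring of 8 electrodes on a 9 cm "scalp" sphere
theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)
electrodes = 0.09 * np.c_[np.cos(theta), np.sin(theta), np.zeros_like(theta)]
r0 = np.array([0.0, 0.0, 0.05])               # dipole 5 cm above the centre
p = np.array([0.0, 0.0, 1e-8])                # 10 nA*m, roughly spike-sized
print(dipole_potential(electrodes, r0, p))    # one potential per electrode

Source localization (the inverse problem) then amounts to searching for the dipole position and moment whose forward-computed potentials best match the measured EEG, subject to the constraints of the chosen source model.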

To solve the forward problem, a realistic head model is required, based on a geometrical description of the head and a specification of the conductivity of the different brain tissues. To solve the forward problem in such a realistic head model, numerical techniques are required.

EEG Signal Processing

Electroencephalograms (EEGs) are becoming increasingly important measurements of brain activity, and they have great potential for the diagnosis and treatment of mental and brain diseases and abnormalities. With appropriate interpretation methods they are emerging as a key methodology to satisfy the increasing global demand for more affordable and effective clinical and healthcare services. Developing and understanding advanced signal processing techniques for the analysis of EEG signals is crucial in the area of biomedical research. This book focuses on these techniques, providing expansive coverage of algorithms and tools from the field of digital signal processing. It discusses their applications to medical data, using graphs and topographic images to show simulation results that assess the efficacy of the methods. The book also covers: explanations of the significance of EEG signal analysis and processing (with examples) and a useful theoretical and mathematical background for the analysis and processing of EEG signals; an exploration of normal and abnormal EEGs, neurological symptoms and diagnostic information, and representations of the EEGs; reviews of theoretical approaches in EEG modeling, such as restoration, enhancement, segmentation, and the removal of different internal and external artifacts from the EEG and ERP (event-related potential) signals; coverage of major abnormalities such as seizures, and mental illnesses such as dementia, schizophrenia, and Alzheimer's disease, together with their mathematical interpretations from the EEG and ERP signals and the sleep phenomenon; and descriptions of nonlinear and adaptive digital signal processing techniques for abnormality detection, source localization and brain-computer interfacing using multi-channel EEG data, with emphasis on non-invasive techniques, together with future topics for research in the area of EEG signal processing. The information within EEG Signal Processing has the potential to enhance the clinically related information within EEG signals, thereby aiding physicians and ultimately providing more cost-effective, efficient diagnostic tools. It will be beneficial to psychiatrists, neurophysiologists, engineers, and students or researchers in neurosciences.

Undergraduate and postgraduate biomedical engineering students and postgraduate epileptology students will also find it a helpful reference.

ROLE OF EXPERT SYSTEMS

Expert systems are computer applications that combine computer equipment, software, and specialized information to imitate expert human reasoning and advice. As a branch of artificial intelligence, expert systems provide discipline-specific advice and explanation to their users. While artificial intelligence is a broad field covering many aspects of computer-generated thought, expert systems are more narrowly focused. Typically, expert systems function best with specific activities or problems and a discrete database of digitized facts, rules, cases, and models. Expert systems are used widely in commercial and industrial settings, including medicine, finance, manufacturing, and sales.

As a software program, an expert system integrates a searching and sorting program with a knowledge database. The searching and sorting program of an expert system is known as the inference engine. The inference engine contains all the systematic processing rules and logic associated with the problem or task at hand. Mathematical probabilities often serve as the basis for many expert systems. The second component, the knowledge database, stores the necessary factual, procedural, and experiential information representing expert knowledge. Through a procedure known as knowledge transfer, expertise (those skills and knowledge that sustain a much better than average performance) passes from the human expert to a knowledge engineer. The knowledge engineer creates and structures the knowledge database by completing certain logical, physical, and psychosocial tasks. For this reason, expert systems are often referred to as knowledge-based information systems. By widely distributing human expertise through expert systems, businesses can realize benefits in consistency, accuracy, and reliability in problem-solving activities.

Businesses may or may not differentiate between a decision support system (DSS) and an expert system. Some consider each one, alternately, to be a subcategory of the other. Whether they are one and the same, closely related, or completely independent is frequently debated in trade and professional literature. Like expert systems, the DSS relies on computer hardware, software, and information to function effectively.

The debatable distinction, however, between an expert system and a DSS seems to lie in their practical applications. Decision support systems are used most often in specific decision-making activities, while expert systems operate in the area of problem-solving activities. But this distinction may be blurry in practice, and therefore investigation of an expert system often implies research on DSS as well. Four interactive roles form the activities of the expert system:

* diagnosing
* interpreting
* predicting
* instructing

The systems accomplish each of these by applying rules and logic specified by the human expert during system creation or maintenance or determined by the system itself based on analysis of historical precedents. Instruction, in particular, emerges as a result of the expert system's justification system. Synthesizing feedback with various combinations of diagnostic, interpretative and predictive curriculum, the expert system can become a finely tuned personal tutor or a fully developed and standardized group class. Computer-aided instruction (CAI) thrives as a field of inquiry and development for businesses.

EARLY MODELS
Early expert systems appeared in the mid-1960s as an offshoot of research in artificial intelligence. Many early systems (GPPS and DENDRAL at Stanford University, XCON at Digital Equipment Corp., and CATS-1 at General Electric) pioneered the concept of a computer expert. But one, MYCIN, most clearly introduced two essential characteristics of an expert system: modularity and justification. MYCIN was developed at Stanford University as an expert system to aid in the diagnosis of bacterial meningitis. As it was developed, MYCIN emerged as a product of modular design with a facility to explain its own advice. Modular design refers to the concept and practice of developing software as stand-alone sets of programming code and informational materials. Each set connects as a module, or self-contained capsule, to other modules. This idea of modular design led to the further advance of expert shells. An expert shell program simplifies the development of an expert system by providing a preexisting inference engine and modular knowledge database components. The forward and backward chaining of MYCIN (its ability to recount the steps it took to arrive at any recommendation) still influences the design of expert systems.

As a result, the ability to explain or justify is a standard facility of commercially produced expert systems and programs. Perhaps the most important discovery for MYCIN and other early expert systems was the importance of the human expert in the expert system.

BUILDING A KNOWLEDGE BASE


The basic role of an expert system is to replicate a human expert and replace him or her in a problem-solving activity. In order for this to happen, key information must be transferred from a human expert into the knowledge database and, when appropriate, the inference engine. Two different types of knowledge emerge from the human expert: facts and procedural or heuristic information. Facts encompass the definitively known data and the defined variables that comprise any given activity. Procedures capture the if-then logic the expert would use in any given activity. Through a formal knowledge acquisition process that includes identification, conceptualization, formalization, implementation, and testing, expert databases are developed. Interviews, transactional tracking, observation, case study, and self-reporting choices are common means of extracting information from a human expert. Using programmatic and physical integration of logic, data, and choice, expert systems integrate the examination and interpretation of data input with specific rules of behavior and facts to arrive at a recommended outcome.
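The Python sketch below shows, in miniature, how such facts and if-then rules can be combined by forward chaining to arrive at a recommendation. The rules and facts are invented for illustration and are not drawn from any real medical knowledge base.

# Facts are strings; rules are (set-of-conditions, conclusion) pairs.
# All rule content here is illustrative only.
RULES = [
    ({"fever", "stiff neck"}, "suspect meningitis"),
    ({"suspect meningitis"}, "recommend lumbar puncture"),
    ({"cough", "fever"}, "suspect respiratory infection"),
]

def forward_chain(facts, rules):
    """Fire every rule whose conditions are satisfied until no new facts
    can be derived; return the final facts plus a simple trace that can
    serve as a crude justification facility."""
    facts = set(facts)
    trace = []
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                trace.append("%s -> %s" % (sorted(conditions), conclusion))
                changed = True
    return facts, trace

facts, trace = forward_chain({"fever", "stiff neck"}, RULES)
print(facts)
for step in trace:
    print(step)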

APPLYING EXPERTISE: THE INFERENCE ENGINE


When an expert system must choose which piece of information is an appropriate answer to the specific problem at hand, uncertainty is intrinsic; thus, uncertainty is an underlying consideration in the overall conceptualization, development, and use of an expert system. One popular treatment of uncertainty uses fuzzy logic. Fuzzy logic divides the simple yes-no decision into a scale of probability. This extension of probability criteria allows the expert system to accommodate highly complex problems and activities in an attempt to more closely model human expert assistance and interaction. Probabilities of uncertainty vary from system to system based on the kind of information being stored and its intended uses. In its diagnostic role, an expert system offers to solve a problem by analyzing yes or no with the likelihood of correctly identifying a cause of a problem or disturbance. By inferring difficulties from past observations, the expert system identifies possible problems while offering possible advice and/or solutions. Diagnostic systems typically infer causes of problems.
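As a small sketch of the idea, the Python fragment below replaces a yes-no "fever" decision with a degree of membership and combines two graded conditions with a fuzzy AND. The membership function, thresholds and rule are invented for illustration and are not clinical definitions.

def fuzzy_fever(temp_c):
    """Degree (0..1) to which a temperature counts as 'fever';
    the 37-39 degC ramp is an illustrative membership function."""
    if temp_c <= 37.0:
        return 0.0
    if temp_c >= 39.0:
        return 1.0
    return (temp_c - 37.0) / 2.0

def fuzzy_and(a, b):
    return min(a, b)      # a common choice of t-norm

def fuzzy_or(a, b):
    return max(a, b)

# rule: "infection likely IF fever AND elevated heart rate"
fever = fuzzy_fever(38.2)          # 0.6 rather than a hard yes/no
tachy = 0.4                        # assumed membership for "elevated heart rate"
print("rule strength:", fuzzy_and(fever, tachy))   # 0.4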

Applications include medicine, manufacturing, service, and a multitude of narrowly focused problem areas. As an aid to human problem solving, the diagnostic system or program assists by relying on past evidence and problems. By inferring descriptions from observations rather than problems, the expert system takes an interpretive rather than diagnostic role. Interpretive systems explain observations by inferring their meaning based on previous descriptions of situations; the probability of uncertainty is quantified as the likelihood of being an accurate representation. In a predictive role, the expert system forecasts future events and activities based on past information. Probabilities of uncertainty are expressed as the chances or likelihood of being right. Finally, in an instructive role, the expert system teaches and evaluates the successful transfer of educational information back to the user. Through explanation of its decision-making process, supplemental materials, and systematic testing, the instructive system accounts for uncertainty by measuring the likelihood that knowledge transfer was achieved.

Regardless of the role of an expert system or how it deals with uncertainty, its anatomy is similar. The inference engine forms the heart of the expert system, and the knowledge base serves as its brain. The inference engine churns through countless potential paths and possibilities based on some combination of rules, cases, models, or theories. Some rules, such as predicate logic, mimic human reasoning and offer various mathematical arguments to any query. A decision tree of branching steps and actions synthesizes probability with rules and information to arrive at a recommendation; the probabilities mirror the human expert's own experience with an activity or problem. Other models or cases structure some systematic movement through a problem-solving exercise in different ways. Case-based reasoning uses specific incidents or models of behavior to simulate human reasoning. Other inference engines are based on semantic networks (associated nodes and links of association), scripts (preprogrammed activities and responses), objects (self-contained variables and rule sets), and frames (more specialized objects allowing inheritance). In all cases, the inference engine guides the processing steps and expert information together in a systematic way.

The knowledge database provides the fuel for the inference engine. The knowledge database is composed of facts, records, rules, books, and countless other resources and materials. These materials are the absolute values and documented evidence associated with the database structure. If-then procedures and pertinent rules are an important part of the knowledge database. Imitating human reasoning, rules or heuristics use logic to record expert processing steps and requirements.

Logic, facts, and past experience are woven together to make an expert database. As a result of knowledge transfer, significant experiences, skills, and facts fuse together in a representation of expertise. This expert database, or knowledge-based information system, is the foil for the inference engine. As such, the knowledge database must be accurately and reliably conceived, planned, and realized for optimum performance. Additionally, the knowledge database must be validated and confirmed as accurate and reliable. Expert databases containing inaccurate information or procedural steps that result in bad advice are ineffective and potentially destructive to the operation of a business. When, however, the inference engine and knowledge database synchronize correctly, businesses may realize gains in productivity and decreases in costs.

BENEFITS AND COSTS


Expert systems capture scarce expert knowledge and render it archival. This is an advantage when losing the expert would be a significant loss to the organization. Distributing the expert knowledge enhances employee productivity by offering the assistance necessary to make the best decision. Improvements in reliability and quality frequently appear when expert systems distribute expert advice, opinion, and explanation on demand. Expert systems are capable of handling enormously complex tasks and activities as well as an extremely rich knowledge database structure and content, so they are well suited to modeling human activities and problems. Expert systems can reduce production downtime and, as a result, increase output and quality. Additionally, expert systems facilitate the transfer of expertise to remote locations using digital communications. In specific situations, ongoing use of an expert system may be cheaper and more consistent than the services of a human expert.

Some benefits of an expert system are direct. Loma Engineering reduced its staff requirements from five engineers to the equivalent of 1.5 by using an expert system to customize machine specifications. Other benefits are less direct and may include improved managerial functions. The Federal Aviation Administration uses the Smart Flow Traffic Management System to better coordinate air traffic activities. The American Stock Exchange also put expert systems to use in monitoring insider trading. Hospitals use expert systems to interpret patient data against a large database of drug knowledge in order to identify harmful drug interactions and other problems. Thanks to one New England hospital system, doctors don't even have to be at the computer to get the results: if the system discovers a problem as new data are analyzed, it can automatically send a message to the doctor's pager.

In manufacturing, expert systems are common and successful as well. Expert systems can track production variables, tabulate statistics, and identify processes that don't match the expected patterns, signaling potential problems. Moreover, integrated expert systems can immediately notify the appropriate person to correct a problem in the manufacturing process.

The costs of expert systems vary considerably and often include post-development costs such as training and maintenance. Prices for the software development itself range from the low thousands of dollars for a very simple system to millions for a major undertaking. For large companies and complex activities, sufficiently powerful computer hardware must be available, and frequently programming must be done to integrate the new expert system with existing information systems and process controls. Additionally, depending on the application, the knowledge database must be updated frequently to maintain relevance and timeliness. Increased costs may also appear with the identification and employment of a human expert or a series of experts. Retaining an expert involves the potentially expensive task of transferring expertise to a digital format; depending on the expert's ability to conceive and digitally represent knowledge, this process may be lengthy. Even after such efforts, some expert systems fail to recover their costs because of poor design or inadequate knowledge modeling. Expert systems also suffer from the systematic integration of preexisting human biases and ignorance into their original programming. Using an expert shell, a kind of off-the-shelf computer program for building an expert application, is one way to reduce the costs of obtaining an expert system. The expert shell simplifies the expert system by providing preprogrammed modules and a ready-to-use inference engine structure. A number of companies provide expert shells that support business and industrial operations, including those conducted in Internet environments.

PATTERN RECOGNITION TECHNIQUES

Pattern recognition is "the act of taking in raw data and taking an action based on the category of the pattern." Most research in pattern recognition is about methods for supervised learning and unsupervised learning. Pattern recognition aims to classify data (patterns) based either on a priori knowledge or on statistical information extracted from the patterns. The patterns to be classified are usually groups of measurements or observations, defining points in an appropriate multidimensional space.

This is in contrast to pattern matching, where the pattern is rigidly specified.

Overview

A complete pattern recognition system consists of a sensor that gathers the observations to be classified or described, a feature extraction mechanism that computes numeric or symbolic information from the observations, and a classification or description scheme that does the actual job of classifying or describing observations, relying on the extracted features. The classification or description scheme is usually based on the availability of a set of patterns that have already been classified or described. This set of patterns is termed the training set, and the resulting learning strategy is characterized as supervised learning. Learning can also be unsupervised, in the sense that the system is not given an a priori labeling of patterns; instead, it establishes the classes itself based on the statistical regularities of the patterns.

The classification or description scheme usually uses either a statistical (decision-theoretic) or a syntactic (structural) approach. Statistical pattern recognition is based on statistical characterizations of patterns, assuming that the patterns are generated by a probabilistic system. Syntactic (or structural) pattern recognition is based on the structural interrelationships of features. A wide range of algorithms can be applied for pattern recognition, from simple naive Bayes classifiers and neural networks to k-nearest-neighbour (KNN) decision rules. Pattern recognition is more complex when templates are used to generate variants. For example, in English, sentences often follow the "N-VP" (noun - verb phrase) pattern, but some knowledge of the English language is required to detect the pattern. Pattern recognition is studied in many fields, including psychology, ethology, cognitive science and computer science. Holographic associative memory is another type of pattern matching, where a large set of learned patterns based on cognitive meta-weights is searched for a small set of target patterns.
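As a minimal supervised-learning example in the spirit of the KNN decision rule mentioned above, the Python sketch below classifies a feature vector by majority vote among its k nearest training patterns; the toy two-class training set is invented.

import numpy as np

def knn_classify(x, train_X, train_y, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training patterns (Euclidean distance)."""
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(dists)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]

# toy training set: two clusters in a 2-D feature space
train_X = np.array([[1.0, 1.2], [0.8, 1.0], [1.1, 0.9],    # class 0
                    [3.0, 3.1], [2.9, 3.3], [3.2, 2.8]])   # class 1
train_y = np.array([0, 0, 0, 1, 1, 1])

print(knn_classify(np.array([1.0, 1.1]), train_X, train_y))   # expected: 0
print(knn_classify(np.array([3.0, 3.0]), train_X, train_y))   # expected: 1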

Within medical science, pattern recognition is the basis for computer-aided diagnosis (CAD) systems. CAD describes a procedure that supports the doctor's interpretations and findings. Typical applications are automatic speech recognition, classification of text into several categories (e.g. spam/non-spam email messages), the automatic recognition of handwritten postal codes on postal envelopes, and the automatic recognition of images of human faces. The last two examples form the subtopic image analysis of pattern recognition, which deals with digital images as input to pattern recognition systems.

METHODS IN PATTERN RECOGNITION


Two different approaches to track finding have been established. The first is a global method, which consists of retrieving the pattern of hit cells in all layers that possibly belong to a track. The database, which is stored in dynamic memory in a tree-like structure, must be set up in a prior learning phase, during which a large sample of tracks is sent through the FTD detector to generate the patterns of cells that were hit. Since the algorithm can be coded to be well suited for parallel computing, it is planned to implement this method within the second-level trigger processor, both to make its result available to trigger decisions at that stage and to store its results with the raw data for use in offline track finding. To keep the storage to a manageable size, the method has to be limited to higher momenta and to tracks coming from the interaction region.

The second method starts from digitizings in single-cell clusters and assigns the digitizings to planar track segments, which are then fitted to a straight line. In order to convert the measured times into coordinates, a simplified model of the drift process is used which, however, needs the position in space because of its dependence on the magnetic field. An estimate of the spatial position is provided by calculating the intersections of fired cells in all three layers of a module. In the next stage, these planar track elements are combined with each other at the module level to construct track elements in space. The over-determination given by the three projections is used, first to reduce the ambiguities and second to constrain the fit. In the final stage, a cubic spline technique is used to link the spatial track elements found in each module into FTD tracks.

E-health

E-health is a relatively recent term for healthcare practice that is supported by electronic processes and communication. The term is used inconsistently: some would argue it is interchangeable with healthcare informatics and is a subset of health informatics, while others use it in the narrower sense of healthcare practice using the Internet. The term can encompass a range of services that are at the edge of medicine/healthcare and information technology:

Electronic Health Records: Enable easy communication of patient data between different healthcare professionals (GPs, specialists, care team, pharmacy)

Telemedicine: includes all types of physical and psychological measurements that do not require a patient to travel to a specialist. When this service works, patients need to travel to a specialist less often; conversely, the specialist has a larger catchment area.

Consumer Health Informatics (or citizen-oriented information provision): both healthy individuals and patients want to be informed on medical topics.

Health knowledge management (or specialist-oriented information provision): e.g. an overview of the latest medical journals, best-practice guidelines or epidemiological tracking. Examples include physician resources such as Medscape and MDLinx.

Virtual healthcare teams: consist of healthcare professionals who collaborate and share information on patients through digital equipment (for transmural care).

mHealth or m-Health: includes the use of mobile devices for collecting aggregate and patient-level health data, providing healthcare information to practitioners, researchers, and patients, real-time monitoring of patient vital signs, and direct provision of care (via mobile telemedicine).

Medical research uses e-health grids that provide powerful computing and data management capabilities to handle large amounts of heterogeneous data.[1]

Healthcare Information Systems: also often refer to software solutions for appointment scheduling, patient data management, work schedule management and other administrative tasks surrounding health. Whether these tasks are part of e-health depends on the chosen definition; they do, however, interface with most e-health implementations due to the complex relationship between administration and healthcare at healthcare providers.

APPLICATION OF VLSI DESIGN TOOLS IN BIO ELECTRONICS

Overview: New enabling micro/nano/bio-technologies toward the development of all-on-chip systems for on-line bio-monitoring have been explored for applications in diagnosis, treatment of patients, cell cultures and environmental monitoring.

The project comprises three main research tasks: nano-bio films for applications in stem cell monitoring, nano-bio films for drug detection, and innovative ideas in VLSI design for bio-applications.

Figure (A): Improved sensitivity registered by using CNTs in peroxide detection. Figure (B): Improved sensitivity registered by using CNTs in drug detection. Figure (C): The 3D architectures investigated as a replaceable bio-layer for biosensing purposes.

Main Results on Nano-Bio-Chip for Stem Cells Monitoring

Nano-biosensing provides new tools to investigate cellular differentiation and proliferation. Among the various metabolic compounds secreted by cells during their life cycle, glucose, lactate and hydrogen peroxide (H2O2) are of main interest. Glucose is the fuel of cells, while lactate and hydrogen peroxide production are related to cell suffering. Nano-structured electrodes may enhance the compound sensitivity in order to precisely detect cell-cycle variations. In this research task, detection with electrodes structured using multi-walled carbon nanotubes (CNTs) has been investigated for an amperometric biochip. A significant improvement in sensitivity has been achieved, indicating that carbon nanotubes are the right candidates to improve biosensing, as shown in figure 1(A). Also, first experiments on glucose and lactate detection in stem cells have been carried out.

Future projects originating from this study will be on the development of bio-chips to be integrated in Petri dishes for automatic stem cell culture monitoring.

Main Results on Nano-Bio-Chip for Drugs Detection

Personalized therapy requires accurate and frequent monitoring of the drug metabolic response of living organisms during drug treatments. In case of high-risk side effects, e.g. therapies with cocktails of interfering anti-cancer molecules, direct monitoring of the patient's drug metabolism is essential, as the efficacy of the metabolic pathways is highly variable on a patient-by-patient basis. Moreover, anti-cancer pharmacological treatments are often based on cocktails of different drugs. Currently, there are no fully mature biochip systems to monitor multi-panel drug amounts in blood or in serum. The aim of this task has been to investigate the complexity of multiple-drug detection for point-of-care and/or implantable systems to be used in personalized therapy. The probes investigated for the biochips are the P450 cytochromes, as they are key proteins in drug metabolism. Multiple-drug detection has been carried out both by simulations and by electrochemical experiments. Three different P450 isoforms (2C9, 3A4, and 2B4) have been considered to detect nine different commonly used drugs. Drug specificity enhancement has been investigated by decomposing the components of the peaks registered in cyclic voltammetry, as shown in figure 1(B).

Main Results on VLSI Design for Sensing Applications

Recent advances in bio-sensing technologies have led to the design of bio-sensor arrays for rapid identification and quantification of various biological agents such as drugs, gene expressions, proteins, cholesterol, fats, etc. Various dedicated sensing arrays are already available commercially to monitor some of these compounds in a sample. However, monitoring the simultaneous presence of multiple agents in a sample is still a challenging task. Multiple agents may often be detected by the same probes on an array, which makes it difficult to design a chip that can distinguish such agents (leading to low specificity). Thus, sophisticated algorithms for target identification need to be implemented in biochips in order to maximize the number of distinguishable targets in the samples. This also requires introducing sophisticated signal processing and more intelligence on-chip. Dealing with these new processing and information technology constraints leads to more innovative approaches in VLSI design.

To address such new demands, we have investigated in this task an innovative 3D-integrated biochip especially dedicated to label-free detection (figure 1(C)).
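In its simplest linear form, the target-identification problem described above can be posed as unmixing the measured array response with a known sensitivity matrix. The Python sketch below does this with non-negative least squares; the sensitivity matrix and concentrations are invented for illustration, and this is not the algorithm developed in the project.

import numpy as np
from scipy.optimize import nnls

# Rows: array probes; columns: target agents. Entry (i, j) is an assumed,
# purely illustrative sensitivity of probe i to agent j; off-diagonal
# entries model probes that respond to more than one agent.
S = np.array([[1.0, 0.3, 0.0],
              [0.2, 1.0, 0.4],
              [0.0, 0.5, 1.0],
              [0.6, 0.1, 0.2]])

true_conc = np.array([2.0, 0.0, 1.5])                    # hidden concentrations
measured = S @ true_conc + 0.02 * np.random.randn(4)     # noisy probe readings

# Non-negative least squares: concentrations cannot be negative
est, residual = nnls(S, measured)
print("estimated concentrations:", np.round(est, 2))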
