
RESEARCH METHODS IN COGNITIVE SCIENCE

SHORT ESSAY 2014-2018

1) Briefly describe brain imaging.


Neuroimaging or brain imaging is the use of various techniques to either directly or
indirectly image the structure, function, or pharmacology of the nervous system. Physicians who
specialize in the performance and interpretation of neuroimaging in the clinical setting
are neuroradiologists. Neuroimaging falls into two broad categories:

• Structural imaging, which deals with the structure of the nervous system and the diagnosis of gross (large-scale) intracranial disease (such as a tumor) and injury.
• Functional imaging, which is used to diagnose metabolic diseases and lesions on a finer scale (such as Alzheimer's disease) and also for neurological and cognitive psychology research and building brain-computer interfaces.
Functional imaging enables, for example, the processing of information by centers in the brain to be
visualized directly. Such processing causes the involved area of the brain to increase metabolism and
"light up" on the scan.

Brain imaging techniques


Computed axial tomography
Diffuse optical imaging
Event-related optical signal
Magnetic resonance imaging
Functional magnetic resonance imaging
Magnetoencephalography
Positron emission tomography
Single-photon emission computed tomography
Cranial ultrasound
Functional ultrasound imaging

2) Write about the scope of cognitive science.

Cognitive science is the interdisciplinary, scientific study of the mind and its processes. It examines
the nature, the tasks, and the functions of cognition (in a broad sense). Cognitive scientists study
intelligence and behavior, with a focus on how nervous systems represent, process, and
transform information. Mental faculties of concern to cognitive scientists
include language, perception, memory, attention, reasoning, and emotion; to understand these
faculties, cognitive scientists borrow from fields such as linguistics, psychology, artificial
intelligence, philosophy, neuroscience, and anthropology.
Scope:
Medical analysis
Data representation and retrieval
Intelligence analysis
Human factors engineering
Computer-human interaction
Artificial intelligence
Human performance testing
Speech synthesis and voice recognition
Multimedia design
Linguistic analysis

3) Write a short note on the issues in research design.


The basic issues in research design concern decisions about the purpose of the study (exploratory, descriptive, hypothesis testing), where the research will be conducted (the study setting), the type of investigation required, the extent to which the researcher manipulates and controls variables (the level of researcher interference), the temporal aspects of the research (the time horizon), and the level at which the data will be analyzed (the unit of analysis). All of these decisions are an integral part of the research design.

4) Write a short note on functional magnetic resonance imaging (fMRI).

Functional magnetic resonance imaging or functional MRI (fMRI) measures brain activity by
detecting changes associated with blood flow.[1][2] This technique relies on the fact that cerebral
blood flow and neuronal activation are coupled. When an area of the brain is in use, blood flow to
that region also increases.

fMRI measures the relative amount of oxygenated blood flowing to different parts of the brain.
More oxygenated blood in a particular region is assumed to correlate with an increase in neural
activity in that part of the brain. This allows us to localize particular functions within different brain
regions. fMRI offers good spatial resolution but relatively limited temporal resolution.

5) Explain the sleep cycle.

The sleep cycle is an oscillation between the slow-wave and REM (paradoxical) phases
of sleep. It is sometimes called the ultradian sleep cycle, sleep–dream cycle, or REM-NREM
cycle, to distinguish it from the circadian alternation between sleep and wakefulness. In humans
this cycle takes 70 to 110 minutes (90 ± 20 minutes).
Electroencephalography shows the timing of sleep cycles by virtue of the marked distinction
in brainwaves manifested during REM and non-REM sleep. Delta wave activity, correlating with
slow-wave (deep) sleep, in particular shows regular oscillations throughout a good night's sleep.
Secretions of various hormones, including renin, growth hormone, and prolactin, correlate
positively with delta-wave activity, while secretion of thyroid-stimulating hormone correlates
inversely. Heart rate variability, well known to increase during REM, predictably also correlates
inversely with delta-wave oscillations over the ~90-minute cycle.
To determine which stage of sleep a subject is in, electroencephalography is combined with other measures. EMG (electromyography) is a crucial method for distinguishing between sleep phases: for example, a decrease in muscle tone generally characterizes the transition from wakefulness to sleep (Kleitman, 1963; Chase & Morales, 1990), and during REM sleep there is a state of muscle atonia, resulting in an absence of signals in the EMG.
EOG (electrooculography), the measurement of eye movements, is the third method used in measuring sleep architecture; for example, REM sleep, as the name indicates, is characterized by a rapid eye movement pattern.

6) What are the criteria of good scientific research?


1. The purpose of the research should be clearly defined and common concepts be used.
2. The research procedure used should be described in sufficient detail to permit another
researcher to repeat the research for further advancement, keeping the continuity of what
has already been attained.
3. The procedural design of the research should be carefully planned to yield results that are
as objective as possible.
4. The researcher should report, with complete frankness, any flaws in procedural design and estimate their effects upon the findings.
5. The analysis of data should be sufficiently adequate to reveal its significance and the
methods of analysis used should be appropriate. The validity and reliability of the data
should be checked carefully.
6. Conclusions should be confined to those justified by the data of the research and limited
to those for which the data provide an adequate basis.
7. Greater confidence in research is warranted if the researcher is experienced, has a good
reputation in research and is a person of integrity.

7) Explain statistical decision theory.


Decision theory is the science of making optimal decisions in the face of uncertainty. Statistical
decision theory is concerned with the making of decisions when in the presence of statistical
knowledge (data) which sheds light on some of the uncertainties involved in the decision problem.
The generality of these definitions is such that decision theory (dropping the qualifier 'statistical' for
convenience) formally encompasses an enormous range of problems and disciplines.

Decision theory operates by breaking a problem down into specific components, which can be
mathematically or probabilistically modelled and combined with a suitable optimality principle to
determine the best decision.
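
As an illustration of breaking a decision problem into components, the short Python sketch below (all states, losses, and probabilities are invented for the example) chooses the action that minimizes expected loss under a distribution over unknown states:

```python
# Toy illustration of decision theory: choose the action that minimizes
# expected loss under a probability distribution over unknown states.
# States, actions, losses, and probabilities are invented examples.

states = ["disease", "healthy"]
actions = ["treat", "wait"]

# loss[action][state]: cost of taking `action` when the true state is `state`
loss = {
    "treat": {"disease": 1.0, "healthy": 5.0},   # unnecessary treatment is costly
    "wait":  {"disease": 20.0, "healthy": 0.0},  # missing the disease is worse
}

# Probability of each state, e.g. in light of some observed data
posterior = {"disease": 0.3, "healthy": 0.7}

def expected_loss(action):
    """Average the loss of `action` over the uncertainty about the state."""
    return sum(posterior[s] * loss[action][s] for s in states)

best = min(actions, key=expected_loss)
for a in actions:
    print(f"{a}: expected loss = {expected_loss(a):.2f}")
print("optimal decision:", best)  # 'treat' here: 0.3*1 + 0.7*5 = 3.8 vs 0.3*20 = 6.0
```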

8) Explain analogy and metaphor.

Analogy is a cognitive process of transferring information or meaning from a particular subject (the analog, or source) to another (the target), or a linguistic expression corresponding to such a
process. In a narrower sense, analogy is an inference or an argument from one particular to
another particular, as opposed to deduction, induction, and abduction, in which at least one of
the premises, or the conclusion, is general rather than particular in nature. The term analogy can
also refer to the relation between the source and the target themselves, which is often (though
not always) a similarity, as in the biological notion of analogy.

A metaphor is a figure of speech that, for rhetorical effect, directly refers to one thing by
mentioning another. It may provide (or obscure) clarity or identify hidden similarities between two
different ideas. Metaphors are often compared with other types of figurative language, such
as antithesis, hyperbole, metonymy and simile. One of the most commonly cited examples of a
metaphor in English literature comes from the "All the world's a stage" monologue from As You
Like It.

9) What is non-parametric research?


Nonparametric statistics is the branch of statistics that is not based solely
on parametrized families of probability distributions (common examples of parameters are the
mean and variance). Nonparametric statistics is based on either being distribution-free or having
a specified distribution but with the distribution's parameters unspecified. Nonparametric
statistics includes both descriptive statistics and statistical inference. Nonparametric tests are
often used when the assumptions of parametric tests are violated.
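
As a minimal illustration (the data are invented), the Python sketch below compares two samples with the nonparametric Mann-Whitney U test, which makes no normality assumption, alongside a parametric t-test:

```python
# Minimal sketch: comparing two small, invented samples with a
# nonparametric test (Mann-Whitney U), which is distribution-free,
# versus a parametric t-test, which assumes normality.
from scipy import stats

group_a = [1.2, 3.4, 2.2, 5.1, 2.8, 4.0]
group_b = [2.0, 6.3, 5.5, 7.1, 4.9, 6.0]

u_stat, p_np = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
t_stat, p_t = stats.ttest_ind(group_a, group_b)

print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_np:.3f}")  # no normality assumed
print(f"t = {t_stat:.2f}, p = {p_t:.3f}")                # assumes normal data
```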

10) Highlight briefly the neuropsychological tests used for assessment of executive functioning.

Executive function is an umbrella term for various cognitive processes and sub-processes.
The executive functions include: problem solving, planning, organizational skills, selective
attention, inhibitory control and some aspects of short term memory.[9]

• Behavioural Assessment of Dysexecutive Syndrome (BADS)
• CNS Vital Signs (Brief Core Battery)
• Continuous performance task (CPT)
• Controlled Oral Word Association Test (COWAT)
• d2 Test of Attention
• Delis-Kaplan Executive Function System (D-KEFS)
• Digit Vigilance Test
• Figural Fluency Test
• Halstead Category Test
• Hayling and Brixton tests
• Kaplan Baycrest Neurocognitive Assessment (KBNA)
• Kaufman Short Neuropsychological Assessment
• Paced Auditory Serial Addition Test (PASAT)
• Rey-Osterrieth Complex Figure
• Ruff Figural Fluency Test
• Stroop task
• Test of Variables of Attention (T.O.V.A.)
• Tower of London Test
• Trail-Making Test (TMT) or Trails A & B
• Wisconsin Card Sorting Test (WCST)
• Symbol Digit Modalities Test
• Test of Everyday Attention (TEA)

11) Briefly discuss the uses of fMRI.

Medical use
Physicians use fMRI to assess how risky brain surgery or similar invasive treatment is for a
patient and to learn how a normal, diseased or injured brain is functioning. They map the brain
with fMRI to identify regions linked to critical functions such as speaking, moving, sensing, or
planning. This is useful for planning surgery and radiation therapy of the brain. Clinicians also use fMRI to anatomically map the brain and detect the effects of tumors, stroke, head and brain injury, diseases such as Alzheimer's, and developmental disabilities such as autism.

Clinical use of fMRI still lags behind research use.[49] Patients with brain pathologies are more
difficult to scan with fMRI than are young healthy volunteers, the typical research-subject
population. Tumors and lesions can change the blood flow in ways not related to neural activity,
masking the neural HDR (hemodynamic response). Drugs such as antihistamines and even caffeine can affect the HDR.

Animal research
Research is primarily performed in non-human primates such as the rhesus macaque. These
studies can be used both to check or predict human results and to validate the fMRI technique
itself. But the studies are difficult because it is hard to motivate an animal to stay still and typical
inducements such as juice trigger head movement while the animal swallows it.

12) Explain pragmatic reasoning schemes vs. social contract theory.
Pragmatic reasoning is concerned with what follows from the premises in a given context. If the context changes, the pragmatic conclusions may change also. Pragmatic reasoning is thus context-dependent. It typically involves both reasoning in a context and reasoning about it.

In moral and political philosophy, the social contract is a theory or model that originated during
the Age of Enlightenment and usually concerns the legitimacy of the authority of the state over
the individual. Social contract arguments typically posit that individuals have consented, either
explicitly or tacitly, to surrender some of their freedoms and submit to the authority (of the ruler, or to
the decision of a majority) in exchange for protection of their remaining rights or maintenance of
the social order.

13) Explain the meaning of language structure.


Words and sentences have parts that combine in patterns, exhibiting the grammar of the language.
Phonology is the study of patterns in sound or gesture. Syntax and Semantics involve studying
patterns in sentence structure, from the vantages of form and meaning, respectively. The shared aim
is a general theory of human grammars, one that allows us to understand speakers' ability to use
language and the rapid development of language in every normal child.

14) Explain the block design test.


A block design test is a subtest on many IQ test batteries used as part of assessment of human
intelligence. It is thought to tap spatial visualization ability and motor skill. The test-taker uses hand
movements to rearrange blocks that have various color patterns on different sides to match a pattern.
The items in a block design test can be scored both by accuracy in matching the pattern and by speed
in completing each item.
Good performance on the block design test is indicative of appropriate functioning of
the parietal and frontal lobes. Head injury, Alzheimer's disease, and stroke can severely reduce the
performance of an individual on the block design test.

The block design test is also a relatively accurate measure of spatial ability and spatial visualization
ability used in daily life. The block design test is considered one of the best measures of spatial ability,
although it is subject to certain problems of administration, such as anxiety or over-cautious
responding.

15) Describe the types of variables in cognitive science experiments.


There are three main cognitive variables: memory, intelligence, and aptitude.

Memory is the storage of information and past experiences for present-day application or use. There are two kinds of memory: semantic memory and episodic memory.

Intelligence is often measured in terms of the Intelligence Quotient (IQ), a measure of the ability to solve problems and understand concepts. There is a strong correlation between a high IQ and academic success.

Aptitude generally refers to an individual's verbal, numerical, or abstract reasoning skills. For practical application, aptitude refers to a person's ability to learn or adapt to certain new skills.

16) Write a note on factor analysis and its applications.


Factor analysis is a statistical method used to describe variability among observed,
correlated variables in terms of a potentially lower number of unobserved variables called factors. For
example, it is possible that variations in six observed variables mainly reflect the variations in two
unobserved (underlying) variables. Factor analysis searches for such joint variations in response to
unobserved latent variables. The observed variables are modelled as linear combinations of the
potential factors, plus "error" terms.
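
The linear model described above can be sketched in a few lines of Python (synthetic data only: six observed variables are generated from two latent factors plus error, then the structure is recovered with scikit-learn's FactorAnalysis):

```python
# Minimal sketch of the factor model: six observed variables generated as
# linear combinations of two latent factors plus "error" terms (x = f L^T + e),
# then recovered with FactorAnalysis. All data here are synthetic.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 500
factors = rng.normal(size=(n, 2))            # two unobserved latent variables
loadings = np.array([[0.9, 0.0],             # variables 1-3 load on factor 1
                     [0.8, 0.1],
                     [0.7, 0.0],
                     [0.0, 0.9],             # variables 4-6 load on factor 2
                     [0.1, 0.8],
                     [0.0, 0.7]])
observed = factors @ loadings.T + 0.3 * rng.normal(size=(n, 6))

fa = FactorAnalysis(n_components=2).fit(observed)
print(np.round(fa.components_, 2))  # estimated loadings, ~two-factor structure
```
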
Applications in psychology
Factor analysis is used to identify "factors" that explain a variety of results on different tests. For
example, intelligence research found that people who get a high score on a test of verbal ability are
also good on other tests that require verbal abilities. Researchers explained this by using factor
analysis to isolate one factor, often called verbal intelligence, which represents the degree to which
someone is able to solve problems involving verbal skills.
Factor analysis in psychology is most often associated with intelligence research. However, it also has
been used to find factors in a broad range of domains such as personality, attitudes, beliefs, etc. It is
linked to psychometrics, as it can assess the validity of an instrument by finding if the instrument
indeed measures the postulated factors.

17) Contrast concept and category.


Categorization is an activity that consists of putting things, such as objects, ideas, or people, into categories (classes, types, indexes) based on their similarities or common criteria. It is sometimes considered synonymous with classification. Categorization and classification allow humans to organize things, objects, and ideas that exist around them and simplify their understanding of the world.

Category: A category is a collection of instances which are treated as if they were the same.

• Objects
• Natural kinds
• People
• Events
• Ideas
Concepts are defined as abstract ideas or general notions that occur in the mind, in speech, or
in thought. They are understood to be the fundamental building blocks of thoughts and beliefs.
They play an important role in all aspects of cognition. As such, concepts are studied by several
disciplines, such as linguistics, psychology, and philosophy, and these disciplines are interested
in the logical and psychological structure of concepts, and how they are put together to form
thoughts and sentences. The study of concepts has served as an important flagship of an
emerging interdisciplinary approach called cognitive science.
In contemporary philosophy, there are at least three prevailing ways to understand what a
concept is:

• Concepts as mental representations, where concepts are entities that exist in the mind (mental objects)
• Concepts as abilities, where concepts are abilities peculiar to cognitive agents (mental states)
• Concepts as Fregean senses (see sense and reference), where concepts are abstract objects, as opposed to mental objects and mental states

18) Explain language processing.


Language processing refers to the way humans use words to communicate ideas and feelings, and how such communications are processed and understood. Language processing is considered a uniquely human ability: it is not produced with the same grammatical understanding or systematicity even in humans' closest primate relatives.

Throughout the 20th century the dominant model for language processing in the brain was the Wernicke-Lichtheim-Geschwind model, which is based primarily on the analysis of brain-damaged patients. However, due to improvements in intra-cortical electrophysiological recordings of monkey and human brains, as well as non-invasive techniques such as fMRI, PET, MEG and EEG, a dual auditory pathway has been revealed. In accordance with this model, there
are two pathways that connect the auditory cortex to the frontal lobe, each pathway accounting
for different linguistic roles. The auditory ventral stream pathway is responsible for sound
recognition, and is accordingly known as the auditory 'what' pathway. The auditory dorsal
stream in both humans and non-human primates is responsible for sound localization, and is
accordingly known as the auditory 'where' pathway. In humans, this pathway (especially in the
left hemisphere) is also responsible for speech production, speech repetition, lip-reading, and
phonological working memory and long-term memory.

19) Identify, with examples, different areas of cognitive functioning that can be tested.

There are different types of cognitive tests. The most common tests are:

• Montreal Cognitive Assessment (MoCA). A 10-15 minute test that includes memorizing a short list of words, identifying a picture of an animal, and copying a drawing of a shape or object.
• Mini-Mental State Exam (MMSE). A 7-10 minute test that includes naming the current date, counting backward, and identifying everyday objects like a pencil or watch.
• Mini-Cog. A 3-5 minute test that includes recalling a three-word list of objects and drawing a clock.

20) What is EEG? How is it measured?


Electroencephalography (EEG) is an electrophysiological monitoring method to record electrical
activity of the brain. It is typically noninvasive, with the electrodes placed along the scalp, although
invasive electrodes are sometimes used, as in electrocorticography, sometimes called intracranial
EEG.
An EEG measures the electrical impulses in your brain by using several electrodes that are attached
to your scalp. An electrode is a conductor through which an electric current enters or leaves. The
electrodes transfer information from your brain to a machine that measures and records the data.
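
As an illustrative Python sketch (the signal here is synthetic, standing in for one electrode's voltage trace), a recorded EEG channel is commonly summarized by the power in standard frequency bands:

```python
# Sketch: estimating the alpha-band (8-12 Hz) power of an EEG channel.
# The signal is synthetic (a 10 Hz sine plus noise) standing in for a
# real scalp electrode's voltage trace.
import numpy as np
from scipy.signal import welch

fs = 250                                    # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)                # 10 seconds of data
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)   # power spectral density
alpha = (freqs >= 8) & (freqs <= 12)
print(f"alpha-band power: {psd[alpha].sum():.2f}")  # large, due to the 10 Hz component
```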

21) Describe gait analysis.


Gait analysis is the systematic study of animal locomotion, more specifically the study of human
motion, using the eye and the brain of observers, augmented by instrumentation for measuring
body movements, body mechanics, and the activity of the muscles.[1] Gait analysis is used to
assess and treat individuals with conditions affecting their ability to walk. It is also commonly
used in sports biomechanics to help athletes run more efficiently and to identify posture-related
or movement-related problems in people with injuries.
The study encompasses quantification (introduction and analysis of measurable parameters
of gaits), as well as interpretation, i.e. drawing various conclusions about the animal (health, age,
size, weight, speed etc.) from its gait pattern.

22) What are the applications of brain imaging techniques?

Computed tomography (CT) or computed axial tomography (CAT) scanning uses a series of X-rays of the head taken from many different directions. It is typically used for quickly viewing brain injuries.

Diffuse optical imaging (DOI) or diffuse optical tomography (DOT) is a medical imaging modality
which uses near infrared light to generate images of the body. HD-DOT has also been compared
to fMRI in terms of language tasks and resting state functional connectivity.[12]

Event-related optical signal (EROS) is a brain-scanning technique which uses infrared light
through optical fibers to measure changes in optical properties of active areas of the cerebral
cortex.

Magnetic resonance imaging (MRI) uses magnetic fields and radio waves to produce high quality
two- or three-dimensional images of brain structures without the use of ionizing radiation (X-rays)
or radioactive tracers.
Functional magnetic resonance imaging (fMRI) and arterial spin labeling (ASL) rely on the paramagnetic properties of oxygenated and deoxygenated hemoglobin to see images of changing blood flow in the brain associated with neural activity. This allows images to be generated that reflect which brain structures are activated (and how) during the performance of different tasks or at resting state. According to the oxygenation hypothesis, changes in oxygen usage in regional cerebral blood flow during cognitive or behavioral activity can be associated with regional neurons as being directly related to the cognitive or behavioral tasks being attended.

Positron emission tomography (PET) and brain positron emission tomography measure emissions from radioactively labeled, metabolically active chemicals that have been injected into the bloodstream. The greatest benefit of PET scanning is that different compounds can show blood flow and oxygen and glucose metabolism in the tissues of the working brain. These measurements reflect the amount of brain activity in the various regions of the brain and allow researchers to learn more about how the brain works.

23) Write short notes on any two cognitive architectures.

A cognitive architecture refers to both a theory about the structure of the human mind and to a
computational instantiation of such a theory used in the fields of artificial intelligence (AI) and
computational cognitive science.

ACT-R ("Adaptive Control of Thought—Rational") is a cognitive architecture mainly developed
by John Robert Anderson and Christian Lebiere at Carnegie Mellon University. Like any cognitive
architecture, ACT-R aims to define the basic and irreducible cognitive and perceptual operations
that enable the human mind. In theory, each task that humans can perform should consist of a
series of these discrete operations.
Most of ACT-R's basic assumptions are also inspired by the progress of cognitive
neuroscience, and ACT-R can be seen and described as a way of specifying how the brain itself
is organized in a way that enables individual processing modules to produce cognition.

Soar is a cognitive architecture,[2] originally created by John Laird, Allen Newell, and Paul
Rosenbloom at Carnegie Mellon University. (Rosenbloom continued to serve as co-principal
investigator after moving to Stanford University, then to the University of Southern California's
Information Sciences Institute.) It is now maintained and developed by John Laird's research
group at the University of Michigan.
The goal of the Soar project is to develop the fixed computational building blocks necessary for
general intelligent agents – agents that can perform a wide range of tasks and encode, use, and
learn all types of knowledge to realize the full range of cognitive capabilities found in humans,
such as decision making, problem solving, planning, and natural language understanding. It is
both a theory of what cognition is and a computational implementation of that theory. Since its
beginnings in 1983 as John Laird’s thesis, it has been widely used by AI researchers to create
intelligent agents and cognitive models of different aspects of human behavior.

24) List the activities involved in programming.


Tasks accompanying and related to programming include: testing, debugging, source
code maintenance, implementation of build systems, and management of derived artifacts, such
as the machine code of computer programs.

Software testing is an investigation conducted to provide stakeholders with information about
the quality of the software product or service under test.[1] Software testing can also provide an
objective, independent view of the software to allow the business to appreciate and understand
the risks of software implementation. Test techniques include the process of executing a program
or application with the intent of finding software bugs (errors or other defects), and verifying that
the software product is fit for use.
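
A minimal Python sketch of the idea (word_count is a hypothetical function invented only for the example): the test executes the program with chosen inputs and verifies the outputs against expectations.

```python
# Sketch of testing a program through its inputs and outputs:
# `word_count` is a hypothetical function written for this example,
# and the test checks expected results, including an edge case.

def word_count(text: str) -> int:
    """Count whitespace-separated words in a string."""
    return len(text.split())

def test_word_count():
    assert word_count("the quick brown fox") == 4
    assert word_count("") == 0            # edge case: empty string
    assert word_count("  spaced   out ") == 2

test_word_count()   # with pytest installed, `pytest file.py` would also run this
print("all tests passed")
```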
In computer programming and software development, debugging is the process of finding and
resolving bugs (defects or problems that prevent correct operation) within computer
programs, software, or systems. Debugging tactics can involve interactive debugging, control
flow analysis, unit testing, integration testing, log file analysis, monitoring at
the application or system level, memory dumps, and profiling. Many programming languages and
software development tools also offer programs to aid in debugging, known as debuggers.
In computing, source code is any collection of code, with or without comments, written using
a human-readable programming language, usually as plain text. The source code of a program is
specially designed to facilitate the work of computer programmers, who specify the actions to be
performed by a computer mostly by writing source code. The source code is often transformed
by an assembler or compiler into binary machine code that can be executed by the computer.
The machine code might then be stored for execution at a later time. Alternatively, source code
may be interpreted and thus immediately executed.

Build automation is the process of automating the creation of a software build and the
associated processes including: compiling computer source code into binary
code, packaging binary code, and running automated tests.
An artifact is one of many kinds of tangible by-products produced during the development of
software. Some artifacts (e.g., use cases, class diagrams, and other Unified Modeling
Language (UML) models, requirements and design documents) help describe the function,
architecture, and design of software. Other artifacts are concerned with the process of
development itself—such as project plans, business cases, and risk assessments.
25) Write a note on merits and demerits of case studies.

Advantages of Case Study Method

1. Intensive Study. The case study method enables an intensive study of a unit: the investigation and thorough, deep exploration of an event.
2. No Sampling. It studies a social unit in its entire perspective, meaning there is no sampling in the case study method.
3. Continuous Analysis. It is valuable for continuously analyzing the life of a social unit to dig out the facts.
4. Hypothesis Formulation. This method is useful for formulating hypotheses for further study.
5. Comparisons. It compares different types of facts about the study of a unit.
6. Increase in Knowledge. It increases a person's analytical power and knowledge about a social phenomenon.
7. Generalization of Data. The case study method provides grounds for generalizing data to illustrate statistical findings.
8. Comprehensive. It is a comprehensive method of data collection in social research.
9. Locating Deviant Cases. Deviant cases are those units which behave against the proposed hypothesis, and the case study method locates them. The tendency is to ignore them, but they are important for scientific study.
10. Framing a Questionnaire or Schedule. Through the case study method we can formulate and develop a questionnaire and schedule.

Disadvantages of Case Study Method

1. Limited Representativeness. Due to its narrow focus, a case study has limited representativeness, and generalization is impossible.
2. No Classification. Classification is not possible when studying a small unit.
3. Possibility of Errors. The case study method may suffer from errors of memory and judgment.
4. Subjective Method. It is a subjective rather than an objective method.
5. Not Easy or Simple. This method is very difficult, and no layman can conduct it.
6. Bias Can Occur. Due to the narrow focus of the study, discrimination and bias can occur in the investigation of a social unit.
7. No Fixed Limits. This method depends on the situation and sets no fixed limits on the researcher's investigation.
8. Costly and Time-Consuming. This method is more costly and time-consuming compared to other methods of data collection.

26) Describe the importance of correlational analysis.


Correlation analysis is a statistical method used to evaluate the strength of relationship between two
quantitative variables. A high correlation means that two or more variables have a strong relationship
with each other, while a weak correlation means that the variables are hardly related. In other words,
it is the process of studying the strength of that relationship with available statistical data.

Findings from correlational research can be used to determine prevalence and relationships among
variables, and to forecast events from current data and knowledge. In spite of its many uses, prudence
is required when using the methodology and analysing data. To assist researchers in reducing
mistakes, important issues are singled out for discussion and several options put forward for analysing
data.
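
A minimal Python sketch (with invented data) of quantifying the strength of such a relationship using Pearson's correlation coefficient:

```python
# Minimal sketch: measuring the strength of the relationship between two
# quantitative variables with Pearson's r. The data are invented
# (hours of sleep vs. a memory-test score, say).
from scipy import stats

sleep_hours = [5.0, 6.5, 7.0, 8.0, 6.0, 7.5, 4.5, 8.5]
test_score = [58, 64, 71, 78, 60, 73, 52, 80]

r, p = stats.pearsonr(sleep_hours, test_score)
print(f"r = {r:.2f}, p = {p:.4f}")  # r near +1/-1 = strong; near 0 = weak
```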
27) Compare and contrast analogy with metaphor.

Analogy, in its broadest definition, is an umbrella term for a cognitive process in which we transfer meaning or information from one particular subject to another. It is a similarity between the features of two things on which a comparison can be made.

In rhetoric (the study of communication), analogy is where we create reasoning or explanation from a parallel subject.

A classic example of an analogy is describing the human mind as being like a computer. The two are not identical, but considering the mind to be like a computer can create greater understanding of the human mind.

Metaphor
Metaphor is a type of analogy, but where analogy identifies two things as similar, a metaphor claims a comparison where there may not be one. It is then up to the listener to create meaning out of this comparison.

For example: “that sound goes through me like nails down a blackboard”. The sound may be very different from nails on a blackboard, but it creates a similar sensation or emotion.

28) Describe a research design which can explain learning as a cause and expertise as an effect.

29) Describe the neuropsychological approaches to understand human cognition.

Neuropsychology is a branch of psychology that is concerned with how the brain and the rest of
the nervous system influence a person's cognition and behaviors. More importantly,
professionals in this branch of psychology often focus on how injuries or illnesses of the brain
affect cognitive functions and behaviors.

Experimental neuropsychology is an approach that uses methods from experimental
psychology to uncover the relationship between the nervous system and cognitive function. The
majority of work involves studying healthy humans in a laboratory setting, although a minority of
researchers may conduct animal experiments. Human work in this area often takes advantage of
specific features of our nervous system (for example that visual information presented to a
specific visual field is preferentially processed by the cortical hemisphere on the opposite side) to
make links between neuroanatomy and psychological function.
Clinical neuropsychology is the application of neuropsychological knowledge to the assessment, management, and rehabilitation of people who have suffered illness or injury (particularly to the
brain) which has caused neurocognitive problems. In particular they bring a psychological
viewpoint to treatment, to understand how such illness and injury may affect and be affected by
psychological factors.
Cognitive neuropsychology is a relatively new development and has emerged as a distillation of
the complementary approaches of both experimental and clinical neuropsychology. It seeks to
understand the mind and brain by studying people who have suffered brain injury or neurological
illness. One model of neuropsychological functioning is known as functional localization.

Connectionism is the use of artificial neural networks to model specific cognitive processes using
what are considered to be simplified but plausible models of how neurons operate. Once trained
to perform a specific cognitive task these networks are often damaged or 'lesioned' to simulate
brain injury or impairment in an attempt to understand and compare the results to the effects of
brain injury in humans.
Functional neuroimaging uses specific neuroimaging technologies to take readings from the
brain, usually when a person is doing a particular task, in an attempt to understand how the
activation of particular brain areas is related to the task.
In practice these approaches are not mutually exclusive and most neuropsychologists select the
best approach or approaches for the task to be completed.

30) What are the components of ERP waveforms?


ERP waveforms consist of a series of positive and negative voltage deflections, which are related
to a set of underlying components.[7] Though some ERP components are referred to with
acronyms (e.g., contingent negative variation – CNV, error-related negativity – ERN), most
components are referred to by a letter (N/P) indicating polarity (negative/positive), followed by a
number indicating either the latency in milliseconds or the component's ordinal position in the
waveform. For instance, a negative-going peak that is the first substantial peak in the waveform
and often occurs about 100 milliseconds after a stimulus is presented is often called
the N100 (indicating its latency is 100 ms after the stimulus and that it is negative) or N1
(indicating that it is the first peak and is negative); it is often followed by a positive peak, usually
called the P200 or P2. The stated latencies for ERP components are often quite variable,
particularly so for the later components that are related to the cognitive processing of the
stimulus. For example, the P300 component may exhibit a peak anywhere between 250 and 700 ms.
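
The naming convention can be illustrated with a short Python sketch (the waveform is synthetic): a component such as the N100 is identified as the most negative deflection within a latency window after the stimulus.

```python
# Sketch: naming an ERP component from a trial-averaged waveform by finding
# the most negative deflection in a latency window (the N100, searched for
# in 80-150 ms here). The waveform below is synthetic, for illustration.
import numpy as np

t_ms = np.arange(0, 600)                    # 0-600 ms after stimulus, 1 ms steps
erp = (-3 * np.exp(-((t_ms - 100) / 20) ** 2)    # negative bump near 100 ms
       + 4 * np.exp(-((t_ms - 200) / 30) ** 2))  # positive bump near 200 ms (P200)

window = (t_ms >= 80) & (t_ms <= 150)       # search window for the N100
n100_latency = t_ms[window][np.argmin(erp[window])]
n100_amplitude = erp[window].min()
print(f"N100: {n100_amplitude:.1f} (arbitrary units) at {n100_latency} ms")
```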

31) Describe the different stages of human sleep.

Sleep Stage | Type of Sleep | Other Names                                        | Normal Length
Stage 1     | NREM          | N1                                                 | 1-5 minutes
Stage 2     | NREM          | N2                                                 | 10-60 minutes
Stage 3     | NREM          | N3, Slow-Wave Sleep (SWS), Delta Sleep, Deep Sleep | 20-40 minutes
Stage 4     | REM           | REM Sleep                                          | 10-60 minutes

Sleep Stages
There are two basic types of sleep: rapid eye movement (REM) sleep and non-REM sleep (which has
three different stages). Each is linked to specific brain waves and neuronal activity. You cycle
through all stages of non-REM and REM sleep several times during a typical night, with increasingly
longer, deeper REM periods occurring toward morning.
Stage 1 non-REM sleep is the changeover from wakefulness to sleep. During this short period
(lasting several minutes) of relatively light sleep, your heartbeat, breathing, and eye movements slow,
and your muscles relax with occasional twitches. Your brain waves begin to slow from their daytime
wakefulness patterns.

Stage 2 non-REM sleep is a period of light sleep before you enter deeper sleep. Your heartbeat and
breathing slow, and muscles relax even further. Your body temperature drops and eye movements
stop. Brain wave activity slows but is marked by brief bursts of electrical activity. You spend more of
your repeated sleep cycles in stage 2 sleep than in other sleep stages.

Stage 3 non-REM sleep is the period of deep sleep that you need to feel refreshed in the morning. It
occurs in longer periods during the first half of the night. Your heartbeat and breathing slow to their
lowest levels during sleep. Your muscles are relaxed and it may be difficult to awaken you. Brain
waves become even slower.

REM sleep first occurs about 90 minutes after falling asleep. Your eyes move rapidly from side to
side behind closed eyelids. Mixed frequency brain wave activity becomes closer to that seen in
wakefulness. Your breathing becomes faster and irregular, and your heart rate and blood pressure
increase to near waking levels. Most of your dreaming occurs during REM sleep, although some can
also occur in non-REM sleep. Your arm and leg muscles become temporarily paralyzed, which
prevents you from acting out your dreams. As you age, you sleep less of your time in REM sleep.
Memory consolidation most likely requires both non-REM and REM sleep.

32) Differentiate between functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG).

Functional magnetic resonance imaging or functional MRI (fMRI) measures brain activity by
detecting changes associated with blood flow.[1][2] This technique relies on the fact that cerebral blood
flow and neuronal activation are coupled. When an area of the brain is in use, blood flow to that region
also increases.[3]
Magnetoencephalography (MEG) is a functional neuroimaging technique for mapping brain activity
by recording magnetic fields produced by electrical currents occurring naturally in the brain, using
very sensitive magnetometers. Arrays of SQUIDs (superconducting quantum interference devices)
are currently the most common magnetometers. Applications of MEG include basic research into
perceptual and cognitive brain processes, localizing regions affected by pathology before surgical
removal, determining the function of various parts of the brain, and neurofeedback.

The difference between these two techniques predominantly lies in that fMRI measures blood flow
relying on the fact that cerebral blood flow and neuronal activation are coupled. MEG directly
measures brain activity through the magnetic field the neuronal activation produces. Due to these
different measurement methods, MEG has much higher temporal resolution than fMRI, meaning that
the measurement of the timing and location of brain activity is more precise with MEG.

33) Write a note on parallel distributed processing (PDP).


Parallel Distributed Processing (PDP) assumes that information processing takes place through the interactions of a large number of simple processing elements called units, each sending excitatory and inhibitory signals to other units. In some cases, the units stand for possible hypotheses about such things as the letters in a particular display or the syntactic roles of the words in a particular sentence. In these cases, the activations stand roughly for the strengths associated with the different possible hypotheses, and the interconnections among the units stand for the constraints the system knows to exist between the hypotheses. In other cases, the units stand for possible goals and actions, such as the goal of typing a particular letter, or the action of moving the left index finger, and the connections relate goals to subgoals, subgoals to actions, and actions to muscle movements. In still other cases, units stand not for particular hypotheses or goals, but for aspects of these things. Thus a hypothesis about the identity of a word, for example, is itself distributed in the activations of a large number of units.
It provided a general mathematical framework for researchers to operate in. The framework involved
eight major aspects:

• A set of processing units, represented by a set of integers.
• An activation for each unit, represented by a vector of time-dependent functions.
• An output function for each unit, represented by a vector of functions on the activations.
• A pattern of connectivity among units, represented by a matrix of real numbers indicating connection strength.
• A propagation rule spreading the activations via the connections, represented by a function on the output of the units.
• An activation rule for combining inputs to a unit to determine its new activation, represented by a function on the current activation and propagation.
• A learning rule for modifying connections based on experience, represented by a change in the weights based on any number of variables.
• An environment that provides the system with experience, represented by sets of activation vectors for some subset of the units.
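
A toy Python sketch of this framework (weights and inputs are invented values): a set of units, a pattern of connectivity, a propagation rule, and a logistic activation rule, iterated over a few time steps.

```python
# Minimal sketch of the PDP framework above: units, a connectivity matrix,
# a propagation rule (matrix-vector product), and an activation rule
# (logistic squashing). Weights and initial activations are toy values.
import numpy as np

weights = np.array([[0.0,  0.5, -0.3, 0.0],    # weights[i, j] = strength of the
                    [0.5,  0.0,  0.2, 0.1],    # connection from unit j to unit i;
                    [-0.3, 0.2,  0.0, 0.4],    # positive = excitatory,
                    [0.0,  0.1,  0.4, 0.0]])   # negative = inhibitory
activation = np.array([1.0, 0.0, 0.0, 0.0])    # initial activations of 4 units

for step in range(5):
    net_input = weights @ activation            # propagation rule
    activation = 1.0 / (1.0 + np.exp(-net_input))   # activation rule (logistic)
    print(f"step {step}: {np.round(activation, 3)}")
```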

34) Describe the contribution of artificial intelligence (AI) in computational modelling.

There are two main purposes for artificial intelligence: to produce intelligent behaviours regardless of the quality of the results, and to model intelligent behaviours found in nature. In the beginning, there was no need for artificial intelligence to emulate the same behaviour as human cognition. In the 1960s, economist Herbert Simon and Allen Newell attempted to formalize human problem-solving skills by using the results of psychological studies to develop programs that implement the same problem-solving techniques as people would. Their work laid the foundation for symbolic AI and computational cognition, and even for some advancements in cognitive science and cognitive psychology.

The field of symbolic AI is based on the physical symbol systems hypothesis of Simon and Newell, which states that aspects of cognitive intelligence can be expressed through the manipulation of symbols. However, John McCarthy focused more on the initial purpose of artificial intelligence, which is to break down the essence of logical and abstract reasoning regardless of whether or not humans employ the same mechanism.

35) Describe how analogy and metaphors are studied in cognitive science.

36) What is Bruner's approach to examine concept formation in humans?

Bruner, Goodnow, and Austin (1956) approach the problem of concept attainment in a manner similar
to the method of computer simulation of cognitive processes—but without the use of computers.
Using the term “strategy”—borrowed from the mathematical theory of games and functioning like
a computer program—they analyze the behavior of subjects categorizing a set of cards containing
geometrical figures. In different experiments the subjects were instructed to form categories that were
conjunctive, disjunctive, or relational. Interest was centered on the kind of strategies used to attain
these concepts; a strategy being defined as “a pattern of decisions in the acquisition, retention, and
utilization of information that serves to meet certain objectives …” (1956, p. 54). Their objectives were
to maximize the information obtained from each instance, to reduce “cognitive strain,” and to regulate
the risk. The authors analyzed the results primarily in terms of four ideal strategies, and in terms of the
advantages and disadvantages of each. Although the term “strategy” is rich in connotations, the
questions of how they develop and influence behavior are far from clear. The value of the construct
strategy would seem to depend on additional theoretical refinement and empirical data.

In closing this article it may be appropriate to offer a brief summary of the crucial problems confronting
investigators of concept formation. There is little doubt that the discrimination process is of primary
importance in concept formation. The best method of teaching a concept would be to arrange the
optimal conditions for discriminating between instances that belong to a concept and those that do
not. Although such a principle would be generally accepted, there would be much disagreement about
its specific interpretation. Whether optimal conditions for discrimination could be best arranged by
reinforcing correct habits and not reinforcing incorrect ones, by encouraging suitable mediational
responses, by training the organism to perceive crucial differences, by developing appropriate
cognitive systems, or by some favorable combination of all of these factors—all these issues would be
open to dispute. Basic to this disagreement are two related questions: Do these apparent differences
always represent real differences? If so, does their resolution depend upon their being cast in precise
mathematical language?

37) What are the neuropsychological tests to examine working memory?


Memory is a very broad function which includes several distinct abilities, all of which can be
selectively impaired and require individual testing. There is disagreement as to the number of memory
systems, depending on the psychological perspective taken. From a clinical perspective, a view of five distinct types of memory is in most cases sufficient: semantic memory and episodic memory (collectively called declarative or explicit memory); procedural memory and priming or perceptual learning (collectively called non-declarative or implicit memory), all four of which are long-term memory systems; and working memory or short-term memory. Semantic memory is memory for facts, episodic memory is autobiographical memory, procedural memory is memory for the performance of skills, priming is memory facilitated by prior exposure to a stimulus, and working memory is a form of short-term memory for information manipulation.

• Benton Visual Retention Test
• California Verbal Learning Test
• Cambridge Prospective Memory Test (CAMPROMPT)
• Gollin figure test
• Memory Assessment Scales (MAS)
• Rey Auditory Verbal Learning Test
• Rivermead Behavioural Memory Test
• Test of Memory and Learning (TOMAL)
• Mental Attributes Profiling System
• Wechsler Memory Scale (WMS)

38) What are the processes of measuring event-related potentials (ERPs)?
ERPs can be reliably measured using electroencephalography (EEG), a procedure that
measures electrical activity of the brain over time using electrodes placed on the scalp. The EEG
reflects thousands of simultaneously ongoing brain processes. This means that the brain response to
a single stimulus or event of interest is not usually visible in the EEG recording of a single trial. To see
the brain's response to a stimulus, the experimenter must conduct many trials and average the results
together, causing random brain activity to be averaged out and the relevant waveform to remain,
called the ERP.[6]
The random (background) brain activity together with other bio-signals (e.g., EOG, EMG, EKG) and
electromagnetic interference (e.g., line noise, fluorescent lamps) constitute the noise contribution to
the recorded ERP. This noise obscures the signal of interest, which is the sequence of underlying
ERPs under study. From an engineering point of view it is possible to define the signal-to-noise
ratio (SNR) of the recorded ERPs. Averaging increases the SNR of the recorded ERPs making them
discernible and allowing for their interpretation. This has a simple mathematical explanation provided
that some simplifying assumptions are made. These assumptions are:
1. The signal of interest is made of a sequence of event-locked ERPs with invariable latency and
shape
2. The noise can be approximated by a zero-mean Gaussian random process of variance σ² that is uncorrelated between trials and not time-locked to the event (this assumption can easily be violated, for example when a subject makes small tongue movements while mentally counting the targets in an experiment).
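
Under these assumptions, averaging N trials shrinks the noise standard deviation by a factor of √N, which the following Python sketch (entirely synthetic signals) demonstrates:

```python
# Sketch of why trial averaging raises SNR: with a fixed event-locked signal
# and zero-mean noise of variance sigma^2 that is uncorrelated across trials,
# the residual noise in the average falls as sigma / sqrt(N).
# All signals here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 0.6, 0.001)                         # one 600 ms epoch
true_erp = 5 * np.exp(-((t - 0.3) / 0.05) ** 2)      # invariant event-locked signal
sigma = 10.0                                         # single-trial noise std

for n_trials in (1, 10, 100):
    trials = true_erp + sigma * rng.normal(size=(n_trials, t.size))
    average = trials.mean(axis=0)                    # the estimated ERP
    residual_noise = (average - true_erp).std()
    print(f"N={n_trials:4d}: residual noise std ~ {residual_noise:.2f} "
          f"(theory: {sigma / np.sqrt(n_trials):.2f})")
```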

39) Describe Sternberg's scanning paradigm to examine memory.

Sternberg (1966; 1969) conducted a series of experiments aimed at determining
how we access the information in our short-term memory. To do this, he used a
simple memory task: participants saw a series of numbers presented one at a
time, and were asked to remember those numbers. After seeing a few numbers,
there was a brief pause, and then a test number would appear, and participants
were asked to determine whether or not the test number was one of the studied
numbers. Sternberg then examined the reaction times of responses to the test
number, comparing the time it takes to respond as more and more numbers were
studied.

Sternberg's results helped demonstrate several important concepts. First, he
found a nearly perfect linear relationship between reaction times and the number
of items studied: the more items studied, the longer it took to respond to the test
item. In fact, for every additional item the participants studied, they took about 38
ms longer to respond to the test number, suggesting that it takes about that long
to "scan" one item in short-term memory. Second, he found no difference in
reaction times between trials where the target had appeared in the study group
and those in which it had not, which suggests that scans of short-term memory
are exhaustive: even when the target item is found, other items are scanned as
well.

An underlying basis of cognitive psychology is the use of outward behaviors (like
reaction times) to infer mental events. Sternberg identified a linear relationship
between study group size and reaction time, which allowed him to infer that
scanning short-term memory for a target is serial, and that each item takes about
38 ms to scan. Equivalent reaction times for studied test items and novel test
items led him to infer that scans of short-term memory are exhaustive. In both
cases, an outward behavior led to conclusions about mental processing.
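
The inference from reaction times can be reproduced with a short Python sketch (the RTs below are invented to mimic the classic result): fitting a line to reaction time versus memory-set size gives the per-item scan time as the slope.

```python
# Sketch of Sternberg's logic: fit a line to reaction time vs. memory-set
# size. The slope estimates the per-item scan time; the intercept estimates
# fixed encoding/response time. The RTs here are hypothetical values chosen
# to mimic the classic ~38 ms/item finding.
import numpy as np

set_size = np.array([1, 2, 3, 4, 5, 6])              # items held in memory
rt_ms = np.array([435, 475, 512, 548, 590, 625])     # invented mean RTs (ms)

slope, intercept = np.polyfit(set_size, rt_ms, 1)    # least-squares line
print(f"scan time ~ {slope:.1f} ms/item, base time ~ {intercept:.0f} ms")
```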

40) Write a note on methods used in the analysis of human movement.

The most common methods for accurate capture of three-dimensional human movement require a
laboratory environment and the attachment of markers, fixtures or sensors to the body segments.
These laboratory conditions can cause unknown experimental artifacts.

The most frequently used method for measuring human movement involves placing markers or
fixtures on the skin's surface of the segment being analyzed [18]. The vast majority of current analysis
techniques model the limb segment as a rigid body, then apply various estimation algorithms to
obtain an optimal estimate of the rigid body motion. One such rigid body model formulation is given
by Spoor and Veldpaus [19]; they have described a rigid body model technique using a minimum mean
square error approach that lessens the effect of deformation between any two time steps. This
assumption limits the scope of application for this method, since markers placed directly on skin will
experience non-rigid body movement.
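
A minimal Python sketch of least-squares rigid-body motion estimation from marker coordinates (a standard SVD-based solution in the same spirit as, though not identical to, the Spoor and Veldpaus formulation; the marker data are invented):

```python
# Sketch: given marker positions at two time steps, find the rotation R and
# translation d minimizing mean squared error, via the standard SVD solution.
# Marker coordinates below are invented toy values.
import numpy as np

def rigid_motion(p, q):
    """Return R, d such that q ~ p @ R.T + d in the least-squares sense."""
    pc, qc = p.mean(axis=0), q.mean(axis=0)          # marker centroids
    h = (p - pc).T @ (q - qc)                        # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    r = vt.T @ u.T
    if np.linalg.det(r) < 0:                         # guard against reflections
        vt[-1] *= -1
        r = vt.T @ u.T
    return r, qc - r @ pc

p = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])   # markers, time 1
angle = np.deg2rad(30)
true_r = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
q = p @ true_r.T + np.array([0.1, 0.2, 0.0])                    # markers, time 2

r_est, d_est = rigid_motion(p, q)
print(np.round(r_est, 3))   # recovers the 30-degree rotation
print(np.round(d_est, 3))   # recovers the translation
```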

41) What is the importance of computers in cognitive science?

Computer science has been very important in cognitive science for two reasons. First, the notion of
computation has been invaluable for developing ideas about how thinking might be a natural process.
Previously, scientific theories of the mind relied on clumsy and unproductive analogies with
mechanical devices such as clocks and electronic switchboards. The advent of computer
programs made it possible to see how a mechanical device could solve complex problems by
manipulating symbols, or representations, according to algorithmic procedures (computations),
generating productive analogies for how minds might work in similar ways. Standard programming
languages, for example, allowed for sequences of “IF…THEN…” instructions, which suggest a model of how people make plans.
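
A toy Python sketch of the “IF…THEN…” idea (rules and facts are invented): a miniature production system whose rules fire when their conditions match working memory, producing a simple plan.

```python
# Toy production system: each rule fires when all its conditions are present
# in working memory, adding its effect; the cycle repeats until nothing new
# fires. All rules and facts are invented for illustration.

rules = [
    ({"hungry", "have_money"}, "go_to_shop"),
    ({"at_shop"}, "buy_food"),
    ({"have_food"}, "eat"),
]
effects = {"go_to_shop": "at_shop", "buy_food": "have_food", "eat": "full"}
working_memory = {"hungry", "have_money"}

fired = True
while fired:
    fired = False
    for conditions, action in rules:
        # IF all conditions hold and the effect is new, THEN fire the rule
        if conditions <= working_memory and effects[action] not in working_memory:
            print("IF", sorted(conditions), "THEN", action)
            working_memory.add(effects[action])
            fired = True
```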

Second, computers themselves have been useful for testing scientific hypotheses about mental
organization and functioning. A given hypothesis is modeled in a program by
constructing algorithms that mimic the entities and processes the hypothesis proposes. The program
is then run on a computer, and if the computer’s output is similar in appropriate ways to real human
performance, the hypothesis is considered to be supported.

42) Write a note on how computers function like a black box.

A Black box is a device, system or object which can be viewed in terms of its inputs and outputs
(or transfer characteristics), without any knowledge of its internal workings. Its implementation is
"opaque" (black). Almost anything might be referred to as a black box: a transistor, an engine,
an algorithm, the human brain, an institution or government.

In computer programming and software engineering, black box testing is used to check that the
output of a program is as expected, given certain inputs.[19] The term "black box" is used because
the actual program being executed is not examined.

In computing in general, a black box program is one where the user cannot see the inner
workings (perhaps because it is a closed source program) or one which has no side effects and
the function of which need not be examined, a routine suitable for re-use.

Also in computing, a black box refers to a piece of equipment provided by a vendor for the
purpose of using that vendor's product. It is often the case that the vendor maintains and
supports this equipment, and the company receiving the black box typically is hands-off.
