
Unit 2

Qualitative Research
1. The nature of qualitative research
Qualitative research methods are concerned with opinions, feelings and experiences. They describe social phenomena as they occur naturally - no attempt is made to manipulate the situation - only to understand and describe it. They take a holistic perspective rather than looking at a set of variables. Qualitative data are used to develop concepts and theories that help us understand the social world - an inductive approach to the development of theory, rather than the deductive approach quantitative research takes, i.e. testing theories that have already been proposed. Qualitative data are collected through direct encounters, i.e. through interviews or observation, and collection is rather time consuming.
Fraenkel (2007) defined qualitative research as research studies that investigate the quality of relationships, activities, situations, or materials. It is described by its aims, its methodology, and the kind of data collected to understand the different facets of social life. Qualitative approaches to research are universal and holistic. They follow these beliefs: no single reality can be observed; reality is based upon concepts that are distinct for each person and change over time; and what we perceive has meaning only within a given situation.
Qualitative research is an inquiry process of understanding based on distinct methodological traditions of inquiry that explore a social or human problem. The researcher builds a complex, holistic picture, analyzes words, reports detailed views of informants, and conducts the study in a natural setting. Qualitative methods are typically more flexible - that is, they allow greater spontaneity and adaptation of the interaction between the researcher and the study participant.
For example, qualitative methods ask mostly “open-ended” questions that are not necessarily
worded in exactly the same way with each participant. With open-ended questions, participants
are free to respond in their own words, and these responses tend to be more complex than simply
“yes” or “no.” In addition, with qualitative methods, the relationship between the researcher and
the participant is often less formal than in quantitative research. Participants have the opportunity
to respond more elaborately and in greater detail than is typically the case with quantitative
methods. In turn, researchers have the opportunity to respond immediately to what participants
say by tailoring subsequent questions to information the participant has provided.
Qualitative research has become a popular type of research nowadays. It mainly focuses on theory building, particularly in the academic world. It emerged after the Second World War to overcome the limitations of the positivist perspective widely used in quantitative research for the systematic investigation of research problems. The popularity of grounded theory, interpretivism, narrative analysis, meta-analysis, and related approaches has also contributed to the growing popularity of qualitative research, as have the wider perspective the researcher adopts to understand social reality and the freedom given to respondents to comment on social phenomena. Qualitative research is a process of naturalistic inquiry focusing on in-depth understanding of social phenomena using a systematic process. It depends on the direct experiences of human beings, focusing on the why rather than the what of the subject matter, and allows a research query to be understood through a humanistic or idealistic approach. Mostly, qualitative research is designed to understand people's beliefs, experiences, attitudes, behavior, and interactions in a given situation, using the non-numerical data supplied by them. Qualitative research can add a new dimension to interventional studies that cannot be obtained through measurement of variables alone. It was initially used in psychological studies, when researchers found it tedious to evaluate human behavior numerically; later it gained popularity in all disciplines as a way to explore the facts. It is conducted using logical, systematic procedures to generate knowledge. To generate that knowledge, qualitative researchers use multiple systems of inquiry for the study of human phenomena, including biography, case study, historical analysis, discourse analysis, ethnography, grounded theory, and phenomenology.
The word qualitative implies an emphasis on the qualities of entities and on processes and meanings that are not experimentally examined or measured [if measured at all] in quantitative terms. Qualitative researchers stress the socially constructed nature of reality, the intimate relationship between the researcher and what is studied, and the situational constraints that shape inquiry.
Qualitative researchers look at the socially constructed nature of reality, the intimate relationship between the researcher and the subject matter studied, and the situational constraints that shape inquiry. They seek answers to questions that stress how social experience is created and given meaning. Qualitative research thus draws, first, on naturalism, which seeks to understand social reality in its own terms; second, on ethnomethodology, which tries to show how social order is created through discussion and interaction; third, on emotionalism, which exhibits a concern with subjectivity and with gaining access to inside experience - the inner human reality; and lastly, on postmodernism, whose emphasis on "method talk" is sensitive to the different ways social reality can be constructed.
Qualitative research assumes that knowledge is subjective and that the researcher learns from the participants by understanding the meaning of their lives, using unstructured questionnaires, participant observation, or case study methods.
Features of Qualitative Research
Qualitative research methods have several distinctive features. Some of these features may also appear in quantitative research, but the following are the features that most clearly separate qualitative research from quantitative research:
Openness of Respondents: In qualitative research, mostly open-ended questions are used in interviews, with the aim of collecting data from participants on different dimensions. There is thus more room to explore the different dimensions of the subject matter in qualitative research than in quantitative research.
Variety of Approaches and Methods: Qualitative research is not based on a unified method in the way quantitative research is. Different methods of data analysis and interpretation are used; the researcher may employ ethnography, interpretivism, grounded theory, content analysis, narrative analysis, etc., as the study requires.
Flexibility for the Researcher: In qualitative research, the researcher has the flexibility to define the concepts and choose the methodology. While defining concepts and choosing research methods, the researcher must consider epistemology and ontology. Concepts are not defined in the rigid way characteristic of quantitative research, which forces a fixed description onto the information collected at the time of data collection.
Non-probability Sampling: Almost all qualitative research is conducted using non-probability sampling methods, most often purposive sampling with a relatively small sample. In quantitative research, by contrast, most samples are drawn using probability sampling methods with relatively large samples.
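The contrast between purposive and probability sampling described above can be sketched in code. The following Python sketch is purely illustrative: the participant pool, the `information_rich` tag, and both helper functions are hypothetical examples, not drawn from any study in this text.

```python
import random

# Hypothetical pool of potential participants, each tagged with whether the
# researcher judges them "information rich" for the research question.
participants = [
    {"name": "P1", "role": "community elder", "information_rich": True},
    {"name": "P2", "role": "new resident", "information_rich": False},
    {"name": "P3", "role": "local teacher", "information_rich": True},
    {"name": "P4", "role": "shop owner", "information_rich": False},
    {"name": "P5", "role": "health worker", "information_rich": True},
]

def purposive_sample(pool, k):
    """Non-probability sampling: deliberately pick the k most relevant cases."""
    rich = [p for p in pool if p["information_rich"]]
    return rich[:k]

def probability_sample(pool, k, seed=0):
    """Simple random sampling: every case has an equal chance of selection."""
    rng = random.Random(seed)
    return rng.sample(pool, k)

# A qualitative study would typically take the small purposive sample;
# a quantitative study would draw (a much larger) random sample.
qualitative_cases = purposive_sample(participants, 3)
quantitative_cases = probability_sample(participants, 3)
```

The purposive selection depends entirely on the researcher's judgement (encoded here as a simple flag), whereas the random draw depends only on chance - which is why purposive samples support depth rather than statistical generalization.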
Qualitative research addresses the question of "what?" Knowing what something is entails a conceptualization of the matter under investigation as a whole and in its various parts, the way these parts are related and organized as a whole, and how the whole is similar to and different from other things. Knowing what something is may also involve the conceptualization of its "how": its process and unfolding in time. Qualitative knowledge may also include an
understanding of the context, the consequences/outcomes, and even the significance of what is
investigated in the larger world. The construction of theories, hypothetical explanation,
prediction, and measurement of a subject matter presupposes qualitative knowledge that is,
knowledge of the basic characteristics of the subject matter. Knowledge of the “what” may be
implicit or explicit, uncritically assumed or carefully established, and informally or formally
acquired. In the history of the sciences that concern human mental life, great attention has been
devoted to the rigorous specification of procedures for measurement and quantitative analysis,
and the qualitative/descriptive procedures have received far less attention. However, in and of
itself, measurement tells us only magnitude, and even when many measurements are made with
the finest instruments and rationally analyzed with the most sophisticated statistical procedures,
they do not themselves provide qualitative knowledge of what is being measured. Therefore, a
different kind of research and analysis—research about what a subject matter is in all its real-
world complexity—is a necessary foundation and complement to quantitative research.
Qualitative knowledge is easily taken for granted. We are already familiar with “what things are”
through ordinary experience in everyday life. However, important basic qualitative work has
always been done in the physical sciences for instance, in charting the stars and planets in
astronomy, developing classification systems for plants in botany, describing the structure and
functions of organ systems, and the stages of embryonic development in biology. Perhaps such
human phenomena as learning, intelligence, emotion, the family, education, democracy, and the
Cold War era are so close to us that we can theorize, measure, explain, and even sometimes
successfully predict and control them without undertaking any methodical qualitative
investigations of them. However, given that qualitative questions concern the structure, the
process, and even the significance of such subject matter, careful, rigorous science may be
necessary in order to overcome the prejudices and limitations of uncritical experience and
assumptions, however well these may serve us in our everyday lives. After all, qualitative
questions about the nature of phenomena such as “learning” and “intelligence,” indeed of the
very nature of “human beings” themselves, continue to be matters of conflicting claims and
ongoing debate.
Qualitative Research and Science
Qualitative research does not follow the same approach used by the natural sciences, so the question arises of whether qualitative study counts as science. The answer turns on a prior question: what is considered science? In short, a study is considered science if it offers a logic of explanation and that logic can later be put to the test. Qualitative research does not start from theory as the natural sciences do; it typically begins with general questions that stimulate the research, and at a later stage the questions are specified after data have been gathered and interpreted, guiding further collection and interpretation of data. On these grounds one might argue that the method used in qualitative research is not appropriate to consider as science. But if the scientific method is defined as a method of logic, there is some confusion in refusing to consider qualitative research scientific. Using different approaches according to the situation of the research subject matter, and lacking a unified method of interpretation, are not in themselves grounds for judging research activity to be unscientific. The methodology and approach used in qualitative research vary with the situation and subject matter; the researcher carefully reports them, considering epistemology and ontology, while collecting data from a small group of people using purposive sampling. Qualitative research is thus context-based research conducted by following a logical sequence of activities.
In qualitative research, information about the background of the researcher, the connection with participants, and the situation in which the research was conducted is presented, since these may affect data collection and the interpretation used to draw conclusions. Qualitative research thus displays the credibility of the researcher by defining the concepts used and describing data collection and interpretation. The intellectual definition of concepts and the interpretation of data are reflected in qualitative research. The researcher's engagement with the data in constructing categories, themes, constructs, interpretations, and explanations shows the creativity of the research. All of this also counts toward judging whether the research uses scientific methods. In qualitative research the researcher must spend enough time becoming familiar with all aspects of the context and the themes developed in the research, and must identify contextual factors that influence the observations or themes of interest by collecting reliable data, which is difficult without rapport with the participants. To develop themes, the researcher relies on persistent observation, which allows the researcher to identify and focus on the most relevant characteristics of the situation or context.
The creativity of the research is also determined by the researcher's ability to triangulate the data. For this, the researcher uses multiple and different sources of data to provide reliability and validity in the research findings. In qualitative research, triangulation is used to reduce systematic bias in the data and involves checking findings against different sources and perspectives. The process guards the researcher against the accusation that the findings are simply the result of a single method, a single source, or a single researcher's personal bias. Thus, in qualitative research, multiple sources are used for data collection with a focus on reducing bias. From this point of view, is qualitative research a scientific study or not?
In qualitative research, the findings are not generalized; rather, conclusions are drawn from the conceptual framework or themes developed in the research. The conclusions of the research may be used by any other person for judgement, verification, or decision making. The conceptual framework has transferability and dependability for decision making and judgement, which is a feature of scientific methods.
The findings or conclusions of qualitative research are tested through experiment or logical reasoning. The triangulation methods used for data collection, the collection of further data, and its interpretation verify the theme or conceptual framework developed in the research. Thus, the conceptual framework of qualitative research has confirmability, which is also an element of scientific study.
Methods of Qualitative Research
Qualitative researchers use all five senses and their intelligence to collect views, opinions, perceptions, feelings, and descriptions of the subject matter from targeted people, using a variety of methods. The researcher may use the following tools for data collection:
Observation
In qualitative research, the researcher uses direct observation to study the reactions, behavior, and nature of people or a society without participating with the respondents. If there is a chance that the data will be distorted by the observer's presence, the researcher uses indirect observation instead. Generally, researchers prefer direct observation, because indirect observation involves higher cost and more time.
In some cases the researcher also participates in the actions or events under study, collecting data by observing as well as by gaining field experience, which helps the researcher think from the standpoint of the respondents. This is generally called participant observation and is popular in qualitative research.
In-depth observation is also done by the researcher using an emic perspective and deep involvement with respondents for data collection and interpretation; this is commonly called Ethnographic Observation. It is the most intensive observational method: the researcher fully immerses herself in the research setting and lives among the participants as one of them for anywhere from months to years. By doing this, the researcher attempts to see events and have experiences from the viewpoints of those studied in order to develop an in-depth and long-term account of the community, events, or trends under observation.
Surveys: In qualitative study, the researcher uses open-ended questions that allow for the generation and analysis of qualitative data. This reduces the restrictions on respondents in expressing their opinions to the researcher.
Focus Group: In qualitative research, the researcher may engage participants in a focus group discussion, which consists of a small group of 5 to 15 participants in a conversation designed to generate data relevant to the research question.
In-depth Interviews: Researchers may conduct in-depth interviews one to one with a respondent, spending long hours of inquiry on the subject matter using unstructured questions. Sometimes a researcher approaches the interview with a predetermined list of questions or topics for discussion, but this is only for limited cases. Usually in qualitative research the researcher has identified certain topics of interest but does not use a structured guide for the interview with participants.
Oral History: Historical research is generally qualitative in nature, and for this the researcher uses the oral history method to create a historical account of an event, group, or community; it typically involves a series of in-depth interviews conducted with one or multiple participants over an extended period of time. For example, if the research explores how democracy was restored in Nepal, the oral history interview is an appropriate tool of data collection.
Content Analysis: This method is used by sociologists to analyze social life by interpreting
words and images from documents, film, art, music, and other cultural products and media. The
researchers look at how the words and images are used, and the context in which they are used to
draw inferences about the underlying culture. In the last decade, content analysis of digital
material, especially that generated by social media users, has become a popular technique within
the social sciences. 
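As an illustration of how content analysis of digital material might be operationalized, the following Python sketch counts how often indicator terms for two analytic categories appear in a small set of texts. The example documents, the coding scheme, and the `code_frequencies` helper are all invented for demonstration; they are not drawn from any study described here, and real content analysis involves far richer, context-sensitive interpretation than simple term counting.

```python
from collections import Counter
import re

# Invented example documents (e.g., social media posts); in a real study these
# would be transcripts, articles, images' captions, or other cultural products.
documents = [
    "Community members describe the festival as a symbol of unity.",
    "The festival posters use images of unity and shared history.",
    "Critics say the festival hides divisions behind talk of unity.",
]

# Hypothetical coding scheme: map each analytic category to indicator terms.
codes = {
    "solidarity": {"unity", "shared", "community"},
    "conflict": {"divisions", "critics", "hides"},
}

def code_frequencies(texts, scheme):
    """Count how often each category's indicator terms appear across texts."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z]+", text.lower())
        for category, terms in scheme.items():
            counts[category] += sum(1 for w in words if w in terms)
    return counts

freqs = code_frequencies(documents, codes)
# freqs now holds the total hits per category across all three documents.
```

Such frequency tables are only a starting point: the qualitative analyst still has to interpret how and in what context the words are used before drawing inferences about the underlying culture.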
Pros and Cons of Qualitative Research
Qualitative research has both benefits and drawbacks. On the plus side, it creates an in-depth
understanding of the attitudes, behaviors, interactions, events, and social processes that comprise
everyday life. In doing so, it helps social scientists understand how everyday life is influenced by
society-wide things like social structure, social order, and all kinds of social forces. This set of
methods also has the benefit of being flexible and easily adaptable to changes in the research
environment and can be conducted with minimal cost in many cases.
The downsides of qualitative research are that its scope is fairly limited so its findings are not
always widely generalizable. Researchers also have to use caution with these methods to ensure
that they themselves do not influence the data in ways that significantly change it and that they
do not bring undue personal bias to their interpretation of the findings. Fortunately, qualitative
researchers receive rigorous training designed to eliminate or reduce these types of research bias.
Below are the three key elements that define a qualitative research study and the applied forms
each take in the investigation of a research problem.
The Design
Naturalistic -- refers to studying real-world situations as they unfold naturally; non-manipulative and non-controlling; the researcher is open to whatever emerges [i.e., there is a lack of predetermined constraints on findings].
Emergent -- acceptance of adapting inquiry as understanding deepens and/or situations change;
the researcher avoids rigid designs that eliminate responding to opportunities to pursue new
paths of discovery as they emerge.
Purposeful -- cases for study [e.g., people, organizations, communities, cultures, events, critical
incidences] are selected because they are “information rich” and illuminative. That is, they offer
useful manifestations of the phenomenon of interest; sampling is aimed at insight about the
phenomenon, not empirical generalization derived from a sample and applied to a population.
The Collection of Data
Data -- observations yield a detailed, "thick description" [in-depth understanding]; interviews
capture direct quotations about people’s personal perspectives and lived experiences; often
derived from carefully conducted case studies and review of material culture.
Personal experience and engagement -- researcher has direct contact with and gets close to the
people, situation, and phenomenon under investigation; the researcher’s personal experiences
and insights are an important part of the inquiry and critical to understanding the phenomenon.
Empathic neutrality -- an empathic stance in working with study respondents seeks vicarious
understanding without judgment [neutrality] by showing openness, sensitivity, respect,
awareness, and responsiveness; in observation, it means being fully present [mindfulness].
Dynamic systems -- there is attention to process; assumes change is ongoing, whether the focus
is on an individual, an organization, a community, or an entire culture, therefore, the researcher
is mindful of and attentive to system and situational dynamics.
The Analysis
Unique case orientation -- assumes that each case is special and unique; the first level of analysis
is being true to, respecting, and capturing the details of the individual cases being studied; cross-
case analysis follows from and depends upon the quality of individual case studies.
Inductive analysis -- immersion in the details and specifics of the data to discover important
patterns, themes, and inter-relationships; begins by exploring, then confirming findings, guided
by analytical principles rather than rules.
Holistic perspective -- the whole phenomenon under study is understood as a complex system
that is more than the sum of its parts; the focus is on complex interdependencies and system
dynamics that cannot be reduced in any meaningful way to linear, cause and effect relationships
and/or a few discrete variables.
Context sensitive -- places findings in a social, historical, and temporal context; researcher is
careful about [even dubious of] the possibility or meaningfulness of generalizations across time
and space; emphasizes careful comparative case analyses and extrapolating patterns for possible
transferability and adaptation in new settings.
Voice, perspective, and reflexivity -- the qualitative methodologist owns and is reflective about
her or his own voice and perspective; a credible voice conveys authenticity and trustworthiness;
complete objectivity being impossible and pure subjectivity undermining credibility, the
researcher's focus reflects a balance between understanding and depicting the world authentically
in all its complexity and of being self-analytical, politically aware, and reflexive in
consciousness.
Characteristics of Qualitative Research
 The direct source of data is the natural setting and the researcher is the key instrument in
qualitative research. Researchers go directly to the particular setting of interest to observe
and collect the needed data.
 Data collected are in the form of words or illustrations rather than numbers. The kinds of
data collected may include, but are not limited to, audio recordings, diaries, field notes, memorandums, official records, personal comments, photographs, textbook passages, transcripts of interviews, videotapes, and anything else that conveys the actual words or actions of people.
 Qualitative research is concerned with process as well as product. The primary interest of a
qualitative researcher is on how things happen and on people’s interaction with one another.
 Analysis of data is taken inductively. It starts with the specific concepts of the respondents
to draw out a general idea or theory. To do this, a considerable amount of time is spent in collecting data before the important questions are considered.
 Qualitative research deals with how people make sense out of their lives. The perspective of
the subjects of a study is a major concern.
Uses of Qualitative Research
 Draw meaningful information about beliefs, feelings, values, and motivations that support
behavior.
 Learn directly from people and what is important to them.
 Provide the context required to elicit qualitative results.
 Identify variables important for further studies.
 Determine one’s genre as a primary step to develop a quantitative survey.
 Assess the usability of websites, databases, or other interactive media/services.
Strengths and Weaknesses of Qualitative Research
Strengths of Qualitative Research
 Basically, its strength is its capacity to give rich information about the respondents.
 Provides in-depth information on individual cases.
 Unravels complex phenomena embedded in local context.
 Describes rich phenomena situated in some exceptional environments.
 Relays subjects’ experiences and perspectives in unusual details.
 Conveys setting factors related to the situation of interest.
 Allows flexibility in research-related processes.
 Enables data to be collected in natural setting.
 Determines possible causes of a particular event from a perspective other than that given by
quantitative research.
 Permits approaches that are responsive to local conditions and stakeholders’ needs.
 Presents several options in the conduct of the research.
 Tolerates shifts in focus based on research results.
 Accepts unstructured interpretation of the participants, respecting anything that is in the
participants’ context.
Weaknesses of Qualitative Research
 Data gathering is often time-consuming.
 Analysis of data takes longer than that in quantitative research.
 Interpretation of results is usually biased because it is influenced by the researcher’s
perspective.
 Conclusions are not generalizable because the subjects are few and sometimes possess
unique characteristics compared to typical respondents.
Stage of Qualitative Research
There are no universally accepted steps in the qualitative research process. Different scholars follow different processes, each with their own justification. Some scholars use qualitative research for theory building, and in modern times researchers use qualitative research for theory testing as well as theory building. The following steps are commonly suggested for qualitative research:
i. Developing General Research Questions: Qualitative research generally begins with little theoretical and conceptual background, so there is little knowledge about possible solutions to the problem. Thus, qualitative research begins with general questions.
ii. Selection of Sites and Subjects: After developing the research questions, the researcher must set the study area and the subjects for the study, selecting the research sites where the problem is most prevalent.
iii. Collection of Relevant Data: After the selection of sites and subjects, the researcher collects data using different methods of data collection such as unstructured interviews, observation, focus groups, oral history, etc.
iv. Interpretation of Data: The collected data are analyzed and interpreted using an appropriate approach. The knowledge and ability of the researcher, and the perspective adopted in the research, play a vital role in the interpretation of the data.
v. Conceptual and Theoretical Review: The conceptual and theoretical framework is developed from the interpretation of the data, where the variables of the theory are identified and the links between the variables are developed. The researcher may further specify the research questions if necessary, and collect and interpret further data to verify and fine-tune the theory.
vi. Report Writing: This is the final stage of research. In this stage the research report is prepared in a format as suggested by experts. It also acts as proof that the research was conducted.
Theory and Qualitative research
A theory is a statement that shows the relation between variables under certain assumptions, with its concepts defined in a rigid way; this limits the meaning of each concept and defines it within a closed frame. Because qualitative research is assumed to be inductive, it is usually conducted with the purpose of developing theory; grounded theory and ethnographic research are generally used for the development of social theory. But in recent times qualitative research has also been conducted for testing theory, and there is no reason to believe that qualitative research lacks the ability to test theory. The modern trend in qualitative research may use deductive reasoning to test a theory. Qualitative research gains the ability to test theory by using a comprehensive research process in which theory is built and, within the same research, tested.
Generally, in qualitative research, the general research question is developed first and the research site and subjects are determined. The appropriate data are then collected and interpreted using different methods to develop a conceptual framework, which is later converted into a theoretical framework. In the same research, further data are collected to fine-tune the raw theory and finalize it. If the researcher is still not convinced, or wishes to test the theory, data are again collected and, on their interpretation, the theory is tested. Thus, in qualitative research, both inductive and deductive methods can be used in the same study. Qualitative research is nevertheless generally designed for the development of theory through inductive methods, using unstructured interviews, participant observation, focus group discussions, etc.
Concept in Qualitative Research
Concepts are abstractions of the imagination that carry different meanings for different people and change according to place, time, and the knowledge gained by the individual. A concept can be multidimensional and carry multiple meanings according to the social phenomenon, and it is the central point of the research. In quantitative research, an operational definition is used to narrow down a concept's meaning and dimensions and to identify indicators for measurement. In qualitative research the concept is also defined, but not as in quantitative research, where participants have a limited area for response. The concept in qualitative research should not be defined so broadly that the researcher finds it very difficult to generalize or derive conclusions, nor so loosely that it fails to support a meaningful conclusion.
The concept in qualitative research is defined in a wider sense than in quantitative research, leaving an open area for respondents to provide the wide range of information needed for developing a theory or conceptual framework without imposing rigid boundaries. Observation, unstructured interviews, participant observation, and focus group discussions are used in such research.

2. Theoretical position: Structuralism – functionalism, symbolic interactionism, ethnomethodology, grounded theory, phenomenology
i. Structuralism – functionalism
Structuralism
In the early days, complex phenomena were explained in terms of the basic parts or elements of objects. This approach is known as elementism, which dominated in science, and the same approach was followed for the study of the mind. Elementism remained the popular method until the late 1800s. It was Wilhelm Wundt, considered the father of structuralism in psychology, who separated psychology from other disciplines by establishing the psychological laboratory in Leipzig, Germany. He believed that the mind could be studied effectively through the conscious thoughts of persons. Under structuralism, human behavior is analyzed by separating out its different elements or parts. Structuralism was later popularized by Wundt's student Edward Titchener, whose approach differed somewhat from, and improved on, the initial version. Structuralism focuses on the study of the elements of consciousness, in which conscious experience is broken down into basic elements. Consciousness, in psychology, is defined as the sum total of experiences at any given moment, and the mind is defined as the sum of experiences over the course of a lifetime; structuralism emphasizes breaking the mind down into its functional structure. Structuralism in psychology is concerned with examining how humans experience such events subjectively.
Structuralism, relying on introspection, is concerned with identifying the basic components of consciousness, the associations of one element with another and their relationships, and the underlying physiological conditions associated with the core elements, though most of its focus is on the identification of basic elements. The principle of introspection emphasizes the description of experience in its most basic terms, so that the researcher describes the experience itself rather than the object that produces it. Under structuralist theory, the researcher focuses on the study of an individual's mind by developing a demonstrable connection between various emotions, feelings, and sensations, commonly considered inner experiences. Consciousness is the sum of mental experiences a person has at a given moment, and the mind is believed to accumulate experience over a lifetime. Thus, to uncover the structure of mental processes and thinking, the basic components of the mind must be understood, defined, and categorized.
A primary goal of structuralism was to identify the basic elements of consciousness. Thus, if a researcher uses structuralism to study a behavior of the mind, the basic elements of that behavior are identified first, then the relations between those elements are established, and finally the reasons for the behavior are explained. Structuralism is not used only for exploratory study; it can serve other research purposes as well. Since there are three basic elements of consciousness (sensations, images, and feelings), the researcher also focuses on extracting these three elements of the mind for the subject under study. Here sensation is fundamental, serving as the building block of all of the respondents’ perceptions of the subject. Under structuralism, all feelings are viewed as reducible to experiencing a degree of pleasantness or unpleasantness, which, combined with certain sensations, can give rise to a complex emotional state such as love, joy, disgust, or fear.
The researcher must evaluate each element of consciousness along five basic dimensions: quality (what differentiates one sensation from another), intensity (the strength or magnitude of the quality), protensity (the duration or length of a sensory experience), attensity (the clarity or vividness of the experience, reflecting the process of attention), and extensity (its extent in space). Feelings are characterized only in terms of quality, intensity, and protensity.
When using structuralism, researchers relied on introspection as the research method and faced problems with its unreliability: there was no way of objectively verifying
the content of someone’s consciousness. One group of researchers, most notably a
former follower of Wundt, Oswald Külpe, at the University of Würzburg, concluded, using
introspective methodology, that some thoughts occurred in the absence of any mentalistic
sensations or images. This was completely at odds with structuralism, and researchers loyal to
the structuralist position were not able to replicate the findings. On the other hand, researchers
sympathetic to the Würzburg school were able to replicate the findings. Obviously, a theoretical
bias was driving the results. It was widely concluded that introspection was lacking the
objectivity needed to sustain a scientific discipline. Other methodologies were discouraged by
structuralists in part because of the limited scope of psychology they practiced. In essence,
structural psychology was limited to the study of the elements of consciousness in the healthy
adult human. There was no place for the use of nonhuman animals as subjects, no child
psychology, and no concern with the psychology of physical or mental illness. In addition,
Titchener was against applied research, that is, conducting research to help resolve practical
problems. He felt that this would detract from the objectivity of the study and that academic
researchers should be devoted to advancement of pure knowledge. Finally, structuralism was
criticized for focusing almost exclusively on the elements of consciousness without taking into
serious consideration the idea that consciousness is experienced as a unified whole, and that this
whole is different from the sum of the elements.
In the twenty-first century, two major contributions of structuralism are recognized. The first is
the strong emphasis that Titchener and his followers placed on rigorous laboratory research as
the basis for psychology. Although other methods are used by contemporary psychologists (such
as case studies and field research), the emphasis on experimentation in practice and training
remains dominant. Second, structuralism provided a well-defined school of thought and set of
ideas that others could debate and oppose, with the ultimate result being the development of new
and different schools of thought. The most prominent opposition to structuralism was
functionalism.
A final tradition that can be mentioned, but which we will not analyze in detail here, is the
tradition originating with structuralism in the first half of the twentieth century—the linguistics
of Ferdinand de Saussure and the structural anthropology of Claude Lévi-Strauss, for example,
which eventually developed into post-structuralism in the latter half of the century in the hands
of figures such as Michel Foucault, a French philosopher and historian of ideas, who is among
the most referenced authors in the social sciences as a whole. Structuralism was based on the idea
that language is a system of signs whose meaning is determined by the formal relations between
the signs (and not with reference to “the world”) and post-structuralism pushed this idea further
by arguing that the system is constantly moving and in flux, which is why, as Jacques Derrida
(the leading exponent of deconstruction) would say, meaning is endlessly “deferred.” In relation
to qualitative research, we should say that Foucault (and to a lesser extent Derrida) was a
significant inspiration for many forms of discourse analysis, which today exist in many different
variants. One variant is heavily inspired by Foucault and an awareness of power relations in
social worlds, while Discursive Psychology as another is not closely associated with Foucault or
post-structuralism, but originates in the aforementioned ethnomethodology and conversation
analysis.

Functionalism
Functionalism arose in response to the rigidity of structuralism; its earliest proponent was William James, who is considered the father of functionalist theory. Functionalism tries to understand the psychological capacities of humans and animals in terms of how they developed and what they currently do. In simple terms, the theory analyzes the mind through its functions, and it is thus grounded in the evolutionary theory propounded by Charles Darwin. Functionalism tries to explain consciousness in a more systematic and accurate manner, focusing on the purpose of consciousness and on the behavior of the person or animal by examining the way the mind adapts to changing environments and situations.
Unlike structuralism, functionalism was not a formal school of psychological thought. Rather, it
was a label (originally used by Titchener) applied to a general set of assumptions regarding the
province of psychology, and a loosely connected set of principles regarding the psychology of
consciousness. In many respects, functionalism was defined in terms of its opposition or contrast
to structuralism. For example, functionalists believed that psychology should focus on the
functions of mental life (in contrast to the structuralist focus on elemental components); be
concerned with using psychology for practical solutions to problems (structuralists were, at best,
indifferent to this concern); study not only healthy adult humans (the main focus of attention of
structuralists) but also nonhuman animals, children, and unhealthy individuals; employ a wide
range of methodologies to investigate psychological issues (structuralists relied almost totally on
introspection); and examine individual differences, rather than being solely concerned, like the
structuralists, with the establishment of universal (nomothetic) principles. Although structuralism
was imported to the United States by a British scholar (Titchener) who received his
psychological training in Germany (under Wundt), functionalism had a distinctly American flair.
The American Zeitgeist at the time emphasized pragmatism and individuality. Such qualities
made American psychologists especially receptive to the revolutionary work of Charles Darwin
on evolution and its subsequent application (as “social Darwinism”) by the philosopher Herbert
Spencer to education, business, government, and other social institutions. Other important
developments that influenced functionalism include work by Sir Francis Galton on individual
differences in mental abilities and the work on animal psychology by George Romanes and C.
Lloyd Morgan.
William James
William James is considered the most important direct precursor of functional psychology in the
United States, and one of the most eminent psychologists ever to have lived. James earned his
medical degree from Harvard University in 1869 and subsequently became keenly interested in
psychology. Despite his severe bouts with depression and other ailments, he accepted a post at
Harvard in 1872 to teach physiology. Shortly thereafter, in 1875, James taught the first
psychology course offered in the United States,
“The Relations Between Physiology and Psychology,” and initiated a classroom demonstration
laboratory.
James published the two-volume work The Principles of Psychology in 1890. This work was
immediately a great success and is now widely regarded as the most important text in the history
of modern psychology. Given the expansiveness of his work—more than thirteen hundred pages
arranged in twenty-eight chapters—it is impossible to summarize fully, but it includes such
topics as the scope of psychology, functions of the brain, habit, methods of psychology, memory,
the consciousness of self, sensation, perception, reasoning, instinct, emotions, will, and
hypnotism. James presented ideas that became central to functionalism. For example, in the
chapter “The Stream of Consciousness,” James criticized the postulate of structural psychology
that sensations constitute the simplest mental elements and must therefore be the major focus of
psychological inquiry. In contrast, James argued that conscious thought is experienced as a
flowing and continuous stream, not as a collection of frozen elements. With this new, expansive
conceptualization of consciousness, James helped pave the way for psychologists interested in
broadening the scope and methods of psychology. What was to emerge was the school of
functionalism, with prominent camps at the University of Chicago and Columbia University.

The Chicago School


The Chicago school of functionalism is represented by the works of American scholars John
Dewey , James Rowland Angell , and Harvey A. Carr. Functionalism was launched in 1896 with
Dewey’s Psychological Review article “The Reflex Arc Concept in Psychology.” Dewey argued
against reducing reflexive behaviors to discontinuous elements of sensory stimuli, neural
activity, and motor responses. In the same way that James attacked elementalism and
reductionism in the analysis of consciousness, Dewey argued that it was inaccurate and artificial
to do so with behavior. Influenced by Darwin’s evolutionary theory of natural selection, Dewey
asserted that reflexes should not be analyzed in terms of their component parts, but rather in
terms of how they are functional for the organism—that is, how they help an organism adapt to
the environment.
Angell crystallized the functional school in his 1907 Psychological Review paper “The Province
of Functional Psychology.” In this work, three characteristics of functionalism were identified.
Functional psychology is interested in discerning and portraying the typical operations of
consciousness under actual life conditions, as opposed to analyzing and describing the
elementary units of consciousness. Functional psychology is concerned with discovering the
basic utilities of consciousness, that is, how mental processes help organisms adapt to their
surroundings and survive. Functional psychology recognizes and insists on the essential
significance of the mind-body relationship for any just and comprehensive appreciation of
mental life itself.
Carr’s 1925 textbook Psychology: A Study of Mental Activity presents the most polished version
of functionalism. As the title suggests, Carr identified such processes as memory, perception,
feelings, imagination, judgment, and will as the topics for psychology. Such psychological
processes were considered functional in that they help organisms gain information about the
world, retain and organize that information, and then retrieve the information to make judgments
about how to react to current situations. In other words, these processes were viewed as useful to
organisms as they adapt to their environments.

The Columbia School


Another major camp of functionalism was at Columbia University and included such notable
psychologists as James McKeen Cattell , Robert S. Woodworth , and Edward L. Thorndike .
In line with the functionalist’s embrace of applied psychology and the study of individual
differences, Cattell laid the foundation for the psychological testing movement that would
become massive in the 1920s and beyond. Under the influence of Galton, Cattell stressed the
statistical analysis of large data sets and the measurement of mental abilities. He developed the
order of merit methodology, in which participants rank-order a set of stimuli (for instance, the
relative appeal of pictures or the relative eminence of a group of scientists) from which average
ranks are calculated. Woodworth is best known for his emphasis on motivation in what he called
dynamic psychology. In this system, Woodworth acknowledged the importance of considering
environmental stimuli and overt responses but emphasized the necessity of understanding the
organism (perceptions, needs, or desires), representing therefore an early stimulus-organism-
response (S-O-R) approach to psychology. Thorndike represented a bridge from functionalism to
behaviorism, a new school of thought that was led by John B. Watson and emerged around 1913.
Thorndike was notable for his use of nonhuman subjects, a position consistent with Darwin’s
emphasis on the continuity among organisms. He is also famous for his puzzle box research with
cats, which led to his law of effect, which states that when an association is followed by a
satisfying state of affairs, that association is strengthened. This early operant conditioning
research was later expanded on by the famous behaviorist psychologist B.F. Skinner.
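Cattell's order-of-merit method described above reduces to a simple computation: average each stimulus's rank across all participants, with a lower mean rank indicating higher merit. The sketch below illustrates only that averaging step in Python; the participant rankings and picture names are invented for the example and are not drawn from Cattell's data.

```python
def average_ranks(rankings):
    """Given each participant's ranked list (best first), return the
    mean rank of every stimulus; the lowest mean indicates highest merit."""
    totals = {}
    for ranked_list in rankings:
        for position, stimulus in enumerate(ranked_list, start=1):
            totals.setdefault(stimulus, []).append(position)
    return {s: sum(r) / len(r) for s, r in totals.items()}

# Three hypothetical participants each rank the same three pictures
# from most to least appealing.
rankings = [
    ["picture A", "picture B", "picture C"],
    ["picture B", "picture A", "picture C"],
    ["picture A", "picture C", "picture B"],
]
print(average_ranks(rankings))
# picture A: (1+2+1)/3; picture B: (2+1+3)/3; picture C: (3+3+2)/3
```

Averaging ranks rather than raw scores sidesteps the problem that participants use rating scales differently, which is one reason the method suited studies of relative eminence or appeal.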
Evaluation
Functionalism paved the way for the development of applied psychology, including
psychological testing, clinical psychology, school psychology, and industrial-organizational
psychology. Functionalism also facilitated the use of psychological research with a wide variety
of subjects beyond the healthy male adult, including infants, children, the mentally ill, and
nonhuman animals. Finally, functional psychologists used a wide variety of methods beyond that
of introspection, including field studies, questionnaires, mental tests, and behavioral
observations. These developments were responsible, in part, for the United States becoming the
world center for psychological study by 1920. The term functional psychology faded from usage
as it became clear that, by default, being simply a psychologist in the United States meant being
a functional psychologist. The shift in psychological thought instigated by functionalism set the
stage for the next major evolutionary phase in American psychology, behaviorism.
Structuralism is one of two opposing philosophies. The second introduces another figure in psychology: William James, the first American to teach psychology and to bring it to the United States as a major area of study. James is what we refer to as a functionalist. Functionalism, as opposed to structuralism, which tries to break down behavior and mental processes into their component parts, understands instead that mental experience is a stream, or flow, of consciousness, which is irreducible: it can't be broken down any further. Another way of looking at it is that functionalists emphasize the evolutionary, or naturally selected, character of human mental experience over time. Because of this, functionalism informed our use of animals in studies as a way of studying and understanding behavior, as well as current areas of study in educational and industrial psychology. And we'll be getting more into each of those areas as we move further through the course. To review, in this lesson we learned about Wilhelm Wundt, the father of psychology, who set up his laboratory in Germany in 1879. Wundt was also a structuralist. Structuralism, again, was the theoretical perspective that different mental phenomena can be broken down into their more basic component parts and studied in that way. It was opposed by functionalism, which says that different mental processes cannot be broken down and are instead a stream or flow of consciousness.
So these two theoretical perspectives will inform our further investigation into different
theoretical perspectives that you'll be seeing in the future.
ii. Symbolic interactionism
Symbolic interactionism is a micro-level theory that focuses on the relationships among individuals within a society. Communication, the exchange of meaning through language and symbols, is believed to be the way in which people make sense of their social worlds: the theory looks at how people navigate their interactions with others and assign meaning based on their interpretation of those interactions. It focuses on the individual and on how individuals interact with other people in society. George Herbert Mead is considered the founder of the theory. Symbolic interactionism focuses on the behaviour of the individual as opposed to the collective behaviour of people as a group. The theory looks at the actions of an individual person and searches for their meaning. That meaning may differ from person to person, and even for the same person it is not constant; it changes from time to time and from action to action. In simple terms, symbolic interactionism argues that the actions a person shows have meaning to them, that this meaning differs from person to person, and that it changes over time depending on the situation the person faces.
George Herbert Mead (1863–1931) is considered a founder of symbolic interactionism though he
never published his work on it. Mead’s student, Herbert Blumer, coined the term “symbolic
interactionism” and outlined these basic premises: humans interact with things based on
meanings ascribed to those things; the ascribed meaning of things comes from our interactions
with others and society; the meanings of things are interpreted by a person when dealing with
things in specific circumstances. If you love books, for example, a symbolic interactionist might
propose that you learned that books are good or important in the interactions you had with
family, friends, school, or church; maybe your family had a special reading time each week,
getting your library card was treated as a special event, or bedtime stories were associated with
warmth and comfort.
Social scientists who apply symbolic-interactionist thinking look for patterns of interaction
between individuals. Their studies often involve observation of one-on-one interactions. For
example, while a conflict theorist studying a political protest might focus on class difference, a
symbolic interactionist would be more interested in how individuals in the protesting group
interact, as well as the signs and symbols protesters use to communicate their message. [Image: janitors and supporters strike with signs in front of the MTV network in Santa Monica.] The focus on the
importance of symbols in building a society led sociologists like Erving Goffman (1922–1982)
to develop a technique called dramaturgical analysis. Goffman used theater as an analogy for
social interaction and recognized that people’s interactions showed patterns of cultural “scripts.”
Because it can be unclear what part a person may play in a given situation, he or she has to
improvise his or her role as the situation unfolds. Studies that use the symbolic interactionist
perspective are more likely to use qualitative research methods, such as in-depth interviews or
participant observation, because they seek to understand the symbolic worlds in which research
subjects live. The theory remains applicable in qualitative research today. Research that uses symbolic interactionism looks at the pattern of interaction between individuals; the study is based on in-depth interviews or participant observation to understand the symbolic worlds in which the research subjects live, and on the study of social behavior and the interaction of individuals. In qualitative research, a researcher using symbolic interactionism looks at the divergent meanings people place on objects, interactions, and people, and at the corresponding behavior that reflects this range of interpretations. The researcher tries to explain the behavior of each individual through symbols, by carefully studying the individual meanings of objects and the actions shown by the individual. The researcher also tries to explore the bases, such as social interaction, that influence the meanings participants give to things, and the events that change the meaning of an object. In-depth interviews with one or two people are used to explore the meanings these participants assign to objects and to the actions they perform; the participants’ history with those objects is traced, and the situations that changed their meanings are identified. The researcher begins with a general objective by setting general questions. For such a study, researchers use qualitative methodology, taking the emic perspective on objects as defined by participants. Individual perceptions are explored to find out the meaning and action of the person toward those objects. Research under symbolic interactionism can be used to compare findings with social theory or a macro view of behavior; it can also act as a basis for the development of new theory.
Building on the insights from the early Chicago School of sociology (often referred to as the
“first generation of Chicago Sociology”), several sociologists and social anthropologists—some
of whom were themselves students of the early Chicagoans— during the 1940s and onwards
began to develop the idea of symbolic interactionism, sometimes more broadly described as
interactionism. What began as a distinctly North American project later spread to the European
continent. Some of the early proponents of symbolic interactionist social science with a strong
emphasis on qualitative methods were Charles H.  Cooley, Everett C.  Hughes, Erving Goffman,
Howard S.  Becker, Herbert Blumer, and Norman K. Denzin—with Blumer responsible for
originally coining the term “symbolic interactionism,” which he admitted was a “barbarous
neologism”. Symbolic interactionism often refers to the social philosophy of George Herbert
Mead as the founding perspective, which was later developed, refined, and sociologized by
others. Mead was a central force in the development of pragmatism. Symbolic interactionism is
based on an understanding of social life in which human beings are seen as active, creative, and
capable of communicating their definitions of situations and meanings to others. According to
Blumer, there are three central tenets of symbolic interactionism: (1) humans act toward things
on the basis of the meanings that the things have for them, (2) the meaning of such things is
derived from or arises out of the social interaction that one has with one’s fellows, and (3) these
meanings are handled in and modified through an interpretive process used by the person in
dealing with the things he encounters. As is obvious from this, symbolic interactionists are
concerned with how humans create meaning in their everyday lives and in how, as the term
“symbolic interaction” indicates, this meaning is created and carved out through interaction with
others and by use of various symbols to communicate meaning. As Blumer stated on the
methodological stance of symbolic interactionism: Symbolic interactionism is a down-to-earth
approach to the scientific study of human group life and human conduct. Its empirical world is
the natural world of such groups and conduct. It lodges its problems in this natural world,
conducts its studies in it, and derives its interpretations from such naturalistic studies. If it wishes
to study religious cult behavior it will go to actual religious cults and observe them carefully as
they carry on their lives. If it wishes to study social movements it will trace carefully the career,
the history, and the life experiences of actual movements. If it wishes to study drug use among
adolescents it will go to the actual life of adolescents to observe and analyze such use. And
similarly with respect to other matters that engage its attention. Its methodological stance,
accordingly, is that of direct examination of the empirical social world. Blumer argued for the
development of “sensitizing concepts”—as opposed to “definitive concepts”—to capture social
life theoretically; such a concept “gives the user a general sense of reference and guidance in
approaching empirical instances”. Symbolic interactionism does not by definition privilege any
specific methods or research procedures—anything capable of capturing human meaning making
through symbolic interaction in everyday life and capable of providing sensitizing concepts will
do. However, historically, due to its close association with Chicago sociology, symbolic
interactionists have primarily worked with a variety of qualitative methods and used these to
discover, represent, and analyze the meaning-making processes involved in human interaction in
a variety of contexts. Although a branch of symbolic interactionism under the auspices of
Manford Kuhn began to develop at the University of Iowa (the “Iowa School” as opposed to the
“Chicago School” of Blumer and others) that prioritized more positivistic aspirations and used
quantitative methods and experimental research designs, symbolic interactionism is to a large
degree associated specifically with qualitative research, privileging the careful observation (and
particularly participant observation) of social life in concrete and often naturally occurring
circumstances. Today, symbolic interactionism is still very much alive and kicking—through
conferences, book series, and a journal devoted to studies in symbolic interaction—and is an
active part of American sociology and elsewhere, although the originality and initially
provocative ideas of the pioneering protagonists of symbolic interactionism have gradually
waned throughout the years.
One of the main proponents of interactionism was Erving Goffman, who throughout his career,
which started at the University of Chicago in the early 1950s, gradually developed a perspective
to study the minutiae of social life that still today is one of the most quoted and used within
contemporary social research. Goffman in many ways personified qualitative social science in
the mid-twentieth century due to his particular topics of interest as well as his specific means of
investigating them. Goffman’s main preoccupation throughout his career was to tease out the
many miniscule and often overlooked rituals, norms, and behavioral expectations of the social
situations of face-to-face interaction between people in public and private places—something
that at the time was often regarded with widespread skepticism by more rigorously oriented
social researchers. This was indeed a time when the center of intellectual development and
priority within the social sciences on the American continent had gradually shifted from the
University of Chicago in the earlier parts of the twentieth century to Harvard University and
Columbia University at mid-century with a concomitant shift from qualitative and particularly
ethnographic methods to much more experimental, quantitative, and statistical methods. Not
surprisingly, Goffman is often described as a maverick with his impressionistic and to some
extent obscure approach to research methodology and ways of reporting his findings. Like one of
his main sources of inspiration, Georg Simmel, Goffman keenly used the essay as a privileged
means of communicating research findings, just as other literary devices such as sarcasm, irony,
and metaphors were part and parcel of his methodological toolbox. Goffman was particularly
critical of the use of many of the methods prevalent and valorized in sociology at his time. For
instance, against the preference for statistical variable analysis and the privilege of quantitative
methodology, he once stated: The variables that emerge tend to be creatures of research designs
that have no substance outside the room in which the apparatus and subjects are located, except
perhaps briefly when a replication or a continuity is performed under sympathetic auspices and a
full moon. Concepts are designed on the run in order to get on with setting things up so that trials
can be performed and the effects of controlled variation of some kind or another measured. The
work begins with the sentence “we hypothesize that...,” goes on from there to a full discussion of
the biases and limits of the proposed design, reasons why these aren’t nullifying, and culminates
in an appreciable number of satisfyingly significant correlations tending to confirm some of the
hypotheses. As though the uncovering of social life were that simple. Fields of naturalistic study
have not been uncovered through these methods. Concepts have not emerged that re-ordered our
view of social activity. Understanding of ordinary behavior has not accumulated; distance has.
Instead, Goffman opted for an unmistakable and distinctive qualitative research strategy aimed at
charting the contours and contents of the all too ordinary and ever-present but nevertheless
scientifically neglected events of everyday life. His work was characterized by an apparent
methodological looseness that consciously and stylistically downplayed the importance of his
own findings but which covered over the fact that his work actually uncovered heretofore
empirically uncharted territory. Many of the titles of his books thus contained consciously
diminutive subtitles such as “reports,” “essays,” or “microstudies” that gave the impression,
however mistaken, that his work should not be taken all too seriously. Goffman willingly admitted
to what others might have regarded as a dubious research strategy: Obviously, many of these data
are of doubtful worth, and my interpretations—especially some of them— may certainly be
questionable, but I assume that a loose speculative approach to a fundamental area of conduct is
better than a rigorous blindness to it. In his work, Goffman relied heavily on all sorts of
empirical material. He conducted interviews with housewives; he explored an island community
through in-depth ethnography; he investigated the trials and tribulations of patient life at a
psychiatric institution by way of covert participant observation; he performed the role as a dealer
in a Las Vegas casino in order to document and tease out the gambling dimensions of human
interaction; he listened to, recorded and analyzed radio programs; and he more or less freely used
any kind of qualitative technique, official and unofficial, to access the bountiful richness of
social life. Despite his reliance on a varied selection of empirical input (or what he termed “slices
of social life”), throughout his career, Goffman gradually developed and refined a unique
research methodology by way of various metaphors intended to capture and highlight specific
features of everyday life interaction. Goffman’s perspective on qualitative research therefore is
often referred to as “dramaturgy” because his main and most popular metaphor was the
theatrical analogy, in which he—in The Presentation of Self in Everyday Life—described
social interaction in detail as if it were a performance made by actors on a stage. However,
Goffman’s metaphorical cornucopia was much more than mere dramaturgy. He also invented
and refined other metaphorical schemas: “The ritual metaphor” (looking at social life as if it was
one big ceremonial event), “the game metaphor” (investigating social life as if it was inhabited
by conmen and spies), and “the frame metaphor” (concerned with showing how people always
work towards defining and framing social situations in order to make them meaningful and
understandable). All these different metaphors concentrated on the very same subject matter—
patterns of human interaction, or, put in another way, social life at the micro level—and each
metaphor spawned a spectacular number of analytical terms and sensitizing concepts, many of
which today are household concepts in the social sciences (just think of “stigma,” “impression
management,” “labeling,” or “framing”). Moreover, they served as useful devices in which to
embed the aforementioned varied empirical material, thereby giving it shape, meaning, and
substance. Goffman’s perspective later inspired new generations of sociologists in particular and
qualitative researchers in general who have used him and his original methodology and colorful
concepts to study a variety of conventional as well as new empirical domains such as tourist
photography, mobile phone communication, and advertising.
iii. Ethnomethodology
The term "Ethnomethodology" was first coined by Harold Garfinkel. Ethnomethodology refers
to the analysis of the ways in which we actively make sense of what others mean by what they
say and do. Much of our everyday interaction occurs through talk - casual, verbal exchange -
carried out in informal conversations with others. Garfinkel analysed these conversations. He
showed how these conversations are based on shared understandings and knowledge which are
brought into play. He refers to these shared understandings and knowledge as "background
expectancies." These organize our conversations. Ethnomethodology is the study of how people
use social interaction to maintain an ongoing sense of reality in a situation. To gather data,
ethnomethodologists rely on conversation analysis and a rigorous set of techniques for
systematically observing and recording what happens when people interact in natural settings. It
is an attempt to classify the actions people take when they are acting in groups.
Ethnomethodology is another important tradition in the internal history of qualitative research
that simultaneously builds on and extends the perspective provided by pragmatism,
interactionism, and the dramaturgical work of Goffman. Like Goffman, ethnomethodologists
take an interest in studying and unveiling the most miniscule realm of human interaction, and
they rely on the collection of empirical data from a variety of sources in the development of their
situationally oriented sociology. Ethnomethodology was initially a project masterminded by
American sociologist Harold Garfinkel who in Studies in Ethnomethodology (1967) outlined the
concern of ethnomethodology as the study of the “routine actions” and the often-unnoticed
methods of meaning making used by people in everyday settings (hence the term
ethnomethodology, meaning “folk methods”). These routine activities and continuous
sense-making endeavors were part and parcel of the quotidian domain of everyday life
(described by Garfinkel, in the characteristically obscure ethnomethodological terminology, as
the “immortal ordinary society”) that rests on common-sense knowledge and practical rationality.
Inspired by the phenomenological sociology of Alfred Schütz as well as to some extent also the
functionalism of Talcott Parsons, Garfinkel concerned himself with a classic question in
sociology: how is social order possible? But instead of proposing abstract or philosophical
answers to this question or proposing “normative force” as the main arbiter between people,
Garfinkel, practicing a kind of “phenomenological empiricism”, set out empirically to discover and
document what people actually do whenever they encounter each other. True to the general
pragmatist and interactionist perspective, ethnomethodologists rely on an image of human actors
as knowledgeable individuals who through such activities as “indexicality,” the “etcetera
principle” and “accounts,” in Ludwig Wittgenstein’s terminology, “know how to go on.” Social
reality and social order are therefore not static or pre-given—rather they are the active outcome
or “accomplishment” of actors’ local meaning making amidst sometimes bewildering, confusing,
and chaotic situations. As Garfinkel stated on the purpose and procedures of ethnomethodology
phrased in typical tortuous ethnomethodological wording: Ethnomethodological studies analyze
everyday activities as members’ methods for making those same activities visibly rational and
reportable for all practical purposes, i.e. ‘accountable’, as organizations of commonplace
everyday activities. The reflexivity of that phenomenon is a singular feature of practical actions,
of practical circumstances, of common sense knowledge of social structures, and of practical
sociological reasoning ...I use the term ‘ethnomethodology’ to refer to the investigation of the
rational properties of indexical expressions and other practical actions as contingent ongoing
accomplishments of organized artful practices of everyday life.
According to ethnomethodologists, there are many different methods available to tease out the
situational and emerging order of social life that is based on members’ methods for making
activities meaningful. Ethnomethodology is, however, predominantly a qualitative tradition that
uses typical qualitative methods such as interviews and observation strategies for discovering
and documenting what goes on when people encounter everyday life, but they also like to
provoke our ingrained knowledge of what is going on. Thus, in classic Durkheimian-inspired
fashion, one particularly opportune ethnomethodological way to find out what the norms and
rules of social life really are and how they work is to break them. For example, Garfinkel
invented the “breaching experiments” aimed at provoking a sense of disorder in the otherwise
orderly everyday domain so as to see what people do to restore the lost sense of order. Of these
“breaching experiments” or “incongruence procedures”—that Garfinkel asked his students to
perform—he wrote: Procedurally it is my preference to start with familiar scenes and ask what
can be done to make trouble. The operations that one would have to perform in order to multiply
the senseless features of perceived environments; to produce and sustain bewilderment,
consternation and confusion; to produce the socially structured affects of anxiety, shame, guilt
and indignation; and to produce disorganized interaction should tell us something about how the
structures of everyday activities are ordinarily and routinely produced and maintained. Garfinkel,
his colleagues, and students throughout the years performed a range of interesting studies—of
courtroom interaction, jurors’ deliberations, doctors’ clinical practices, transsexuals’ attempts at
“passing” in everyday life, piano players’ development of skills and style, medical staffs’
pronunciation of patients’ deaths, police officers’ craft of peace keeping, pilots’ conversation in
the cockpit—aimed at finding out how everyday life (and particularly work situations) is
“ordinarily and routinely produced and maintained” by using breaching experiments, but also
less provocative methods. Later, ethnomethodology bifurcated into a “conversation analysis”
strand on the one hand and what has been termed “conventional ethnographical
ethnomethodology” on the other. Common to both strands has been a concern with uncovering
the most meticulous aspects of human interaction—non-verbal and verbal. Just as Garfinkel
studied the natural patterns of interaction in natural settings (the living room, the courtroom, in
the clinic or elsewhere), so conversation analysts studied natural language (but also professional
jargon) as used by people in ordinary circumstances. For instance, conversation analysts, such as
Harvey Sacks and Emanuel Schegloff, intimately studied and analyzed the minutiae of turn-
takings, categorizations, and sequences of verbal communication in order to see how people
through the use of language create meaning and a coherent sense of what is going on.
Characteristic of both strands of ethnomethodology is the strong reliance on qualitative research
methods aimed at capturing and describing in detail the situational and emerging character of
social order. In fact, ethnomethodologists strongly oppose positivistic research procedures aimed
at producing universal “truths” or uncovering “general laws” about society and instead opt for a
much more mundane approach to studying the locally produced orders and thoroughly episodic
and situational character of social life. In a typical provocative respecification of Schütz’s classic
dictum, Garfinkel thus suggested that we are all sociologists, because we constantly search for
meaning. The means and methods of inquiry of professional sociologists are thus not all that
different from the various ways ordinary people in everyday life observe, inquire, or talk to one
another. This is a principle shared with the hermeneutic strand, which was addressed earlier.
Most of the North American traditions mentioned here can be covered by the label of “creative
sociologies” because they first of all regard human beings as creative actors capable of and
concerned with creating meaning in their lives, and secondly because they emphasize creative
qualitative approaches to capture and analyze those lives. As Monica B. Morris summarized
these creative sociologies: The basic assumptions underlying the ‘creative’ approaches to
sociology are: that human beings are not merely acted upon by social facts or social forces; that
they are constantly shaping and ‘creating’ their own social worlds in interaction with others; and
that special methods are required for the study and understanding of these uniquely human
processes. These “special methods” have predominantly been varieties of qualitative methods.
Common to most of the North American creative sociologies is also a distinct microsociological
orientation aimed at mapping out and analyzing the distinctly quotidian dimensions of social life
and society. Besides the various traditions that we have chosen to delineate as part of the internal
story of qualitative research, we can also mention the important insights from social semiotics,
existentialism, critical everyday life sociology, cultural studies, sociology of emotions,
interpretive interactionism, and more recently actor-network theory that, however, will not be
presented here.
iv. Grounded Theory
The qualitative research approach ‘grounded theory’ was developed by two sociologists,
Barney Glaser and Anselm Strauss. They defined ‘grounded theory’ as ‘theory
that was derived from data, systematically gathered and analyzed through the research process’.
Grounded theory is all about data collection and analysis. In this approach the aim is to construct
a theory that is grounded in the data. According to Glaser (1992), grounded theory follows an
inductive rather than a deductive approach to inquiry. The grounded theory approach has further
been defined very briefly in these words: ‘Grounded theory is not a theory at all. It is a
method, an approach, a strategy. In my opinion, grounded theory is best defined as a research
strategy whose purpose is to generate theory from data. ‘Grounded’ means that the theory will be
generated on the basis of data; the theory will therefore be grounded in data. ‘Theory’ means that
the objective of collecting and analyzing the research data is to generate theory. The essential in
grounded theory is that theory will be developed inductively from data’. Charmaz (2014)
added that grounded theory relies on inductive strategies for data analysis: it moves from the
data toward abstract concepts that explain and illuminate them. The journey of theory development
in the grounded theory approach starts and ends with the data. This journey is best explained as
follows: ‘Data collection, analysis, and eventual theory stand in close relationship to one another...the
researcher begins with an area of study and allows the theory to emerge from the data...grounded
theories, because they are drawn from data, are likely to offer insight, enhance understanding,
and provide a meaningful guide to action’.
Grounded Theory is most accurately described as a research method in which the theory is
developed from the data, rather than the other way around. That makes it an inductive
approach, meaning that it moves from the specific to the more general. The method of study is
essentially based on three elements: concepts, categories and propositions, or what was originally
called “hypotheses”. However, concepts are the key elements of analysis since the theory is
developed from the conceptualization of data, rather than the actual data.
Strauss & Corbin, authors of “Basics of Qualitative research: Grounded Theory Procedures and
Techniques” are two of the model’s greatest advocates, and define it as follows: "The grounded
theory approach is a qualitative research method that uses a systematic set of procedures to
develop an inductively derived grounded theory about a phenomenon”. The primary objective of
grounded theory, then, is to expand upon an explanation of a phenomenon by identifying the key
elements of that phenomenon, and then categorizing the relationships of those elements to the
context and process of the study. In other words, the goal is to move from the specific to the
general without losing sight of what makes the subject of a study unique.
Grounded theory is often perceived as a method which separates theory and data but others insist
that the method actually combines the two. Data collection, analysis and theory formulation are
undeniably connected in a reciprocal sense, and the grounded theory approach incorporates
explicit procedures to guide this. This is especially evident in that according to grounded theory,
the processes of asking questions and making comparisons are specifically detailed to inform and
guide analysis and to facilitate the theorizing process. For example, it is specifically stated that the
research questions must be open and general rather than formed as specific hypotheses, and that
the emergent theory should account for a phenomenon that is relevant to participants.
There are three distinct yet overlapping processes of analysis involved in grounded theory from
which sampling procedures are typically derived. These are: open coding, axial coding and
selective coding. Open coding is based on the concept of data being “cracked open” as a means
of identifying relevant categories. Axial coding is most often used when categories are in an
advanced stage of development; and selective coding is used when the "core category", or central
category that correlates all other categories in the theory, is identified and related to other
categories.
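The three coding stages just described can be pictured, very loosely, as a data-organizing pipeline. The sketch below is purely illustrative and not part of the grounded theory literature: the excerpts, codes, and category names are all invented for the example, and in practice coding is an interpretive act, usually supported by qualitative data analysis software rather than ad hoc scripts.

```python
# Illustrative sketch of the three grounded theory coding stages.
# All excerpts, codes, and categories are invented for the example.

# Open coding: "crack open" raw interview excerpts by attaching
# initial codes to each relevant fragment.
excerpts = {
    "I never know which bill arrives first": ["uncertainty"],
    "My sister helps me sort the paperwork": ["family support"],
    "I put letters aside until I feel calm": ["coping strategy"],
    "Calling the office makes me anxious": ["uncertainty", "anxiety"],
}

# Axial coding: relate the initial codes to broader categories.
axial = {
    "uncertainty": "experienced stress",
    "anxiety": "experienced stress",
    "coping strategy": "managing stress",
    "family support": "managing stress",
}

def categorize(excerpts, axial):
    """Group each excerpt under the axial categories of its codes."""
    categories = {}
    for text, codes in excerpts.items():
        # De-duplicate so an excerpt lands in each category once.
        for category in {axial[code] for code in codes}:
            categories.setdefault(category, []).append(text)
    return categories

categories = categorize(excerpts, axial)

# Selective coding: choose the core category around which the
# emerging theory is organized and relate the others to it.
core_category = "managing stress"

print(sorted(categories))              # ['experienced stress', 'managing stress']
print(len(categories[core_category]))  # 2
```

The point of the sketch is only the shape of the process: each stage reorganizes the output of the previous one at a higher level of abstraction, which is why the stages overlap in real analysis rather than running strictly in sequence.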
Data collection is directed by theoretical sampling, which means that the sampling is based on
theoretically relevant constructs. Many studies, in their early stages, use open sampling
methods to identify individuals, objects or documents. This is so that the data’s relevance to
the research question can be assessed early on, before too much time and money has been
invested. In later phases, a systematic relational or variational sampling is frequently employed
with the objective of locating data that either confirms the relationships between categories, or
limits their applicability. The final phase generally involves discriminate sampling, which
consists of the deliberate and directed selection of individuals, objects or documents to verify the
core category and the theory as a whole, as well as to compensate for other, less developed
categories. Also included as necessary parts of the analysis are other procedures such as memo
writing and the use of diagrams, as well as procedures for identifying and incorporating
interaction and process.
Grounded theory contains many unique characteristics that are designed to maintain the
"groundedness" of the approach. Data collection and analysis are consciously combined, and
initial data analysis is used to shape continuing data collection. This is supposed to provide the
researcher with opportunities to increase the "density" and "saturation" of recurring categories, as
well as to assist in providing follow-up procedures in regard to unanticipated results. Interlacing
data collection and analysis in this manner is also designed to increase insights and clarify the
parameters of the emerging theory. At the same time, the method supports the actions of initial
data collection and preliminary analyses before attempting to incorporate previous research
literature. This is supposed to guarantee that the analysis is based in the data and that pre-existing
constructs do not influence the analysis and/or the subsequent formation of the theory. If existing
theoretical constructs are utilized, they must be justified in the data.
Grounded theory provides detailed and systematic procedures for data collection, analysis and
theorizing, but it is also concerned with the quality of emergent theory. Strauss & Corbin state
that there are four primary requirements for judging a good grounded theory: 1) It should fit the
phenomenon, provided it has been carefully derived from diverse data and is adherent to the
common reality of the area; 2) It should provide understanding, and be understandable; 3)
Because the data is comprehensive, it should provide generality, in that the theory includes
extensive variation and is abstract enough to be applicable to a wide variety of contexts; and 4) It
should provide control, in the sense of stating the conditions under which the theory applies and
describing a reasonable basis for action.
Grounded theory offers many advantages; however, because it is such a painstakingly precise
method of study, it requires high levels of both experience and acumen on the part of the
researcher. For this reason, novice researchers should avoid this method until they have
developed the skills needed to implement the approach effectively.
v. Phenomenology
Phenomenology simply stands for the study of the “phenomenon”. What does
“phenomenon” refer to? It denotes how things appear to one’s
consciousness. Consciousness is always intentional, which means that consciousness
is always of something and never of nothing. Hence, if someone is conscious of something,
he or she is also aware of that particular thing. This awareness can be attained
through various modes of consciousness. Whatever the mode of consciousness, the object in
consciousness always remains the same. The object of consciousness that remains
constant across different phenomena is termed the “noema”, and the ways of accessing the “noema” are
called “noesis”. With strong roots in a twentieth-century philosophical movement, phenomenology
is based on the work of the philosopher Edmund Husserl. Phenomenology is a
research tool, associated with the two basic academic disciplines of philosophy and
psychology, that has profound use in describing human experiences. It is a
qualitative research method which helps to outline human experience of a certain
phenomenon. In such a study, the researcher attempts to set aside biases,
pre-held beliefs and assumptions about human experiences, affections and reactions to the
phenomenon. This practice is known as “bracketing”. Phenomenology thus
states, investigates and interprets the experienced conditions and feelings of people regarding
a given situation as they appear in consciousness. This trait separates phenomenology from
all other research methods: it deals with purely mental phenomena,
with consciousness, while the natural sciences study the laws of nature,
time and space. The research is widely conducted through in-depth interviews with
usually small samples of participants or groups. After the interviews comes the
analysis of the various aspects of the interviewees’ accounts. The researcher tries to
generalize from the experiences of the interviewees: what does it actually feel like to encounter
such a phenomenon?
Phenomenology is different from all other research methods because its field of investigation is
different from theirs. What then is this difference? To understand it, first of
all, phenomenology should be differentiated from the research methods of the natural sciences. The first
difference is that the natural sciences study the world of nature, the physical nature that follows its own
laws, commonly known as the laws of nature. Physical nature exists in time and space and it
follows the laws of time and space, for example causality.
its subject matter, phenomenology studies the purely mental phenomenon. It studies
consciousness.
The subject matter of phenomenology is the structure of consciousness, while everything
pertaining to time and space, to physical nature, is eliminated from consciousness. How
does phenomenology attain this aim of eliminating everything bound by time and space,
everything physical, from its subject matter? It is done through a change in the attitude of the
researcher. Phenomenology asks a researcher to suspend all judgments about the physical world.
This is called epoché in the language of phenomenology: the suspension of judgment. A
researcher, while exercising epoché, holds back, or brackets, all judgments about
physical nature. A researcher has to bracket or hold back even the most basic beliefs about
nature. The most fundamental belief is the belief in existence. So, while doing a
phenomenological investigation, one is not concerned about the existence of an object of
consciousness. For example, if someone is doing research on the notion of the human soul, one
should not be concerned about the existence of the human soul; rather one has to think about the way
the respondents experience human soul in their consciousness. If a researcher is doing a research
about the magical practices of a community, he or she should not be concerned about the truth or
existence or veracity of such practices. All judgments are to be suspended and the only thing to
be considered is how people are experiencing the phenomenon in their consciousness. However,
the belief that things exist accompanies our consciousness, so this belief should also be included
in the phenomenological study. The belief in existence is an immanent part of our consciousness.
 
Phenomenology is also termed the study of the phenomenon. The word phenomenon
stands for the phrase ‘how things appear to one’s consciousness’. In order to understand the
meaning of phenomenon in the context of phenomenology one has to understand that things only
appear in consciousness and are never fully given. So, when you see a horse,
you only see a phenomenon of it: neither do you see the particular horse you are looking at, fully
and completely, in one act of perception, nor can you perceive the universal horse that you
mean when you say the word horse. You only see or are aware of or conscious of a single
perspective of a horse, and the inference that what you are looking at is a complete horse is to
transgress or transcend your immediate consciousness. So, if after looking at a perspective, a
phenomenon of a thing, one infers the whole thing from it, one violates the limits of
phenomenology. For, the inference that there is a horse standing in front of you is not immanent
in your single perception of a thing. If I see a profile of a picture and infer from this profile the
whole image, it means that I have transcended what is given to my consciousness and I have
brought my judgment, my learning from previous experiences, to this perception.
Phenomenology does not allow this inference.
The meaning of the word phenomenon is to be further elaborated. For this we have to understand
a basic property of consciousness. Consciousness is always intentional. This has a specialized
meaning; by saying that consciousness is always intentional, it means that consciousness is
always of something and never of nothing. Thus, consciousness is always of something, it is
always directed towards something, it always has an object. So while one is conscious of
something one is aware of that thing, and that awareness can be attained through various modes
of consciousness. You can be aware of a tree through perceiving the tree, through remembering
the tree, through imagining the tree, through performing an act on the tree, through making a
picture of the tree, through studying about the tree, through talking about the tree. These are
different modes of awareness through which you become conscious of an object, the tree.
However, in all modes of awareness, the object remains the same, the same tree is the object of
your consciousness in all these modes. That common tree is called noema in phenomenology and
the different modes of gaining awareness of this tree are called noesis. Thus, the object of
consciousness that remains the same across its different phenomena is called the noema, and the
modes of access to this noema are called noesis.
One can access this content of noema and noesis through reflection. So, if I am interested in
knowing how people are aware of a ‘mobile phone’, I have to consider certain examples of this
experience. These experiences cannot be called purely phenomenological, because a purely
phenomenological experience is purely mental. However, the purely phenomenological is
always embedded in concrete spatiotemporal events; therefore the search for the purely
phenomenological starts from these experiences. Once we have studied some examples, we can
imaginatively vary the experiences in our minds, to achieve a full range of all the possible
experiences of the type under consideration. This step is called imaginative variation. If I look at
a computer from four sides, I can think about the infinite possible perspectives from which I can
possibly see a computer. But I cannot actually perform these infinite number of perceiving
experiences. Thus, I have to rely on my imagination and perform such experiences in my
imagination. During these imaginative experiences I try to change certain details and see what is
that which remains selfsame in all the variations. I have to look for invariant attributes, things
that are essential to my perceiving of a computer. This is called determination of essences, or
eidetic reduction.
Eidetic reduction means to reduce an experience to ideas or essences. So, initially I look at a
house from say five perspectives, and then I repeat the experience of looking at the house from
other possible perspectives in my imagination. And in each of these experiences, carried out
imaginatively I will try to find out that which remained unchanged, or whose absence from my
awareness will not leave that particular house a house. These essential features will constitute the
object, the noema in its fullest sense. This noema will be different from the actual objects
because it will include in its description all possibilities of that object. Such an understanding of
an object is also called horizonal understanding. 
By horizonal understanding I mean the total understanding of an object, along with all its
possibilities; possibilities that are not given in a single perspective. For example, the fact that a
certain physical object, follows the law of inertia, is not given in any of the single perceptions of
that object. Rather, such an inference can only be achieved through imaginative variation, in
which I imaginatively think about a physical object as a self-moving object and then consider this
property as something accidental to the object or in fact an impossibility.
In practical applications of phenomenology (Husserl’s), one first has to identify a phenomenon to
be investigated. Then one has to sample different exemplary instances of that phenomenon.
These examples are to be studied in depth, and then one has to perform imaginative variation on
these examples to find out the invariant element. During imaginative variation one has to
consider almost all possible ways of looking at the phenomenon. Eidetic reduction will then yield
the essences.
 In social research, one finds a phenomenon to study, say how children of grade 10 experience
the learning of a certain mathematics topic. One then has to sample a certain number of
exemplary instances. This is usually done through purposive sampling. One selects four or five
respondents, or four or five carefully selected groups for group discussion. Then one conducts in-
depth interviews or focus group discussions, and tries to access the first-hand experience of the
respondents using unstructured questions. Interviews are usually very lengthy, and last
for an hour or two.
These interviews are recorded and transcribed, and then comes the phenomenological analysis.
In this analysis one finds certain themes or underlying currents that are to be focused on. The data in
each interview seems to cluster around these themes. This is called thematic analysis. Once
themes are determined, one has to put together or rearrange the data from all interviews under
the found themes, so that no bit of data is left out.
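The rearranging step just described, pooling data from all interviews under the identified themes so that no bit of data is left out, can be sketched as a small grouping routine. Everything here (the excerpts, themes, and keyword rules) is invented for illustration; real thematic analysis is interpretive and cannot be reduced to keyword matching.

```python
# Toy illustration of rearranging interview data under themes.
# Themes, excerpts, and keyword rules are invented for the example;
# actual thematic analysis is an interpretive judgment, not matching.

interviews = [
    "Fractions scare me, but drawing pictures helps a lot.",
    "I just memorize the steps without knowing why they work.",
    "When the teacher uses pizza slices I finally understand.",
]

# Themes identified during analysis, with indicative keywords.
themes = {
    "visual representation": ["drawing", "pictures", "pizza"],
    "rote learning": ["memorize", "steps"],
    "math anxiety": ["scare", "afraid"],
}

def arrange_by_theme(interviews, themes):
    """Place each excerpt under every theme it touches, and
    collect leftovers so that no bit of data is lost."""
    arranged = {theme: [] for theme in themes}
    leftovers = []
    for text in interviews:
        hit = False
        for theme, keywords in themes.items():
            if any(k in text.lower() for k in keywords):
                arranged[theme].append(text)
                hit = True
        if not hit:
            leftovers.append(text)
    return arranged, leftovers

arranged, leftovers = arrange_by_theme(interviews, themes)
print(len(arranged["visual representation"]))  # 2
print(leftovers)                               # []
```

Note that one excerpt may sit under several themes (the first excerpt touches both visual representation and math anxiety), and the leftover list makes visible any data the current themes fail to account for, prompting the analyst to revise the themes.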
This organized data is then to be subjected to imaginative variation. Each time a noema-noetic
structure is analyzed to find out its invariant elements, and thus essences are found. This reduces
the actual to the possible and ideal and a phenomenon is constructed through exposing the full
horizon of its possibilities. The methodology used in phenomenology differs than most other
research methodology because the goal is to describe a lived experience, rather than to explain or
quantify it in any way.  Phenomenology is solely concerned with the study of the experience
from the perspective of the participants, therefore, the methodology does not include a
hypothesis or any preconceived ideas about the data collected.  Phenomenology makes use of a
variety of methods including interviews, conversations, participant observation, action research,
focus meetings, analysis of diaries and other personal texts.  In general, the methodology is
designed to be less structured and more open-ended to encourage the participant to share details
regarding their experience. Surveys and questionnaires that are commonly used in other research
methods to gather information from participants would be too structured and would not allow the
participant to freely share.  In other words, phenomenology emphasizes subjectivity. The goal of
phenomenological research methods is to maximize the depth of the information collected and
therefore, less structured interviews are most effective.
Following is a list of principles and qualities applied to phenomenological methodology and data
collection:
 Phenomenology searches for the meaning or essence of an experience rather than
measurements or explanations.
 The researcher should begin with the practice of Epoche: he or she describes his or her own
experiences or ideas related to the phenomenon to increase awareness of underlying
feelings.
 Phenomenology is different in that the researcher is often participatory and the other
participants are co-researchers in many cases.
 This type of research focuses on the wholeness of the experience, rather than its
individual parts.
 Phenomenology differs from other research in that it does not test a hypothesis, nor is
there an expectation that the results will be predictive or reproducible. Additional studies
into the same phenomenon often reveal new and additional meanings.
 The study can be applied to a single case or deliberately selected samples.
A phenomenological research study typically follows the four steps listed below:
 Bracketing – The process of identifying, and keeping in check, any preconceived beliefs,
opinions, or notions about the phenomenon being researched. In order to understand the
sole perspective of the participants’ experiences of a phenomenon, the researcher has to set
aside his or her own experiences, biases, and preconceived beliefs, often by discussing and
noting them with colleagues. In this process, the researcher “brackets out” any
presuppositions in an effort to approach the study of the phenomenon from an unbiased
perspective. Bracketing is important to phenomenological reduction, which is the process
of isolating the phenomenon and separating it from what is already known about it.
 Intuition – This requires that the researcher become totally immersed in the study and the
phenomenon and remain open to the meaning of the phenomenon as described by those
who experienced it, setting personal bias aside and fully accepting the respondents’
accounts. The process of intuition results in an understanding of the phenomenon and may
require the researcher to vary the data collection methods or questions until that level of
understanding emerges.
 Analysis – The process of analyzing data involves the researcher becoming fully immersed
in the rich, descriptive data and using processes such as coding and categorizing to
organize it. The goal is to develop themes that can be used to describe the experience from
the perspective of those who lived it.
 Description – In the final step, the researcher interprets the phenomenon according to his
or her understanding of the data collected and analyzed, and then communicates that
description to others.
Phenomenology has its roots in a 20th century philosophical movement based on the work of the
philosopher Edmund Husserl. As a research tool, phenomenology is based on the academic
disciplines of philosophy and psychology and has become a widely accepted method for
describing human experiences.  Phenomenology is a qualitative research method that is used to
describe how human beings experience a certain phenomenon.  A phenomenological study
attempts to set aside biases and preconceived assumptions about human experiences, feelings,
and responses to a particular situation.  It allows the researcher to delve into the perceptions,
perspectives, understandings, and feelings of those people who have actually experienced or
lived the phenomenon or situation of interest.  Therefore, phenomenology can be defined as the
direct investigation and description of phenomena as consciously experienced by people living
those experiences.  Phenomenological research is typically conducted through the use of in-depth
interviews of small samples of participants.  By studying the perspectives of multiple
participants, a researcher can begin to make generalizations regarding what it is like to
experience a certain phenomenon from the perspective of those that have lived the experience.
Following is a list of the main characteristics of phenomenology research:
 It seeks to understand how people experience a particular situation or phenomenon.
 It is conducted primarily through in-depth conversations and interviews; however, some
studies may collect data from diaries, drawings, or observation.
 Small sample sizes, often 10 or fewer participants, are common in phenomenological
studies.
 Interview questions are open-ended to allow the participants to fully describe the
experience from their own view point.
 Phenomenology is centered on the participants’ experiences with no regard to social or
cultural norms, traditions, or preconceived ideas about the experience.
 It focuses on these four aspects of a lived experience: lived space, lived body, lived time,
and lived human relations.
 Data collected is qualitative and analysis includes an attempt to identify themes or make
generalizations regarding how a particular phenomenon is actually perceived or
experienced.
Researchers conducting phenomenological studies are interested in the life experiences of
humans. This type of research can be applied to a wide variety of situations and phenomena.
Below are just a few examples of topics that would lend themselves to phenomenological study:
 How do parents of an autistic child cope with the news that their child has autism?
 What is it like to experience being trapped in a natural disaster, such as a flood or
hurricane?
 How does it feel to live with a life-threatening aneurism?
 What is it like to be a minority in a predominantly white community?
 What is it like to survive an airplane crash?
 How do cancer patients cope with a terminal diagnosis?
 What is it like to be a victim of sexual assault?
Strengths of Phenomenology:
 It points to the universal nature of an experience and provides a deeper understanding
of it.
 Being qualitative in nature, it places no restriction on the interpretation of the results.
 The themes and meanings of an experience emerge from the data.
 It seeks a greater understanding of lived experiences, which has strong potential to
develop into new theories.
 Since bracketing is an important trait of this method, hearing the voices of participants can
change preconceived notions and help expose misconceptions, and prompt action can be
taken accordingly.
Limitations of Phenomenology:
 Participants may have difficulty expressing their experiences of a phenomenon because of
limited articulation, language barriers, age, cognition, embarrassment, and many other
factors.
 Recruiting participants who have experienced the specific phenomenon is often difficult in
this method.
 Bias can occur even with bracketing.
 Data collection and analysis are strenuous and time consuming.
 Findings tend to carry less weight with policy makers.
 The subjective nature raises issues of reliability and validity; thus, presentation of
findings may be difficult.
 We cannot really claim that the experience of a small number of people is what is
experienced by everyone.
 The data are statistically unreliable, which brings difficulties in generalization.
Example of a Phenomenological Study:
An exploration of how foster parents educationally assist foster children.
Foster children are academically at risk as a result of abuse, neglect and family disruptions. The
purpose of this qualitative phenomenological research study is to identify the skill sets foster
parents use to address the academic problems and educational needs of the children placed in
their homes. Data were collected through individual interviews. Apart from identifying skills
used to address academic problems, the interview findings revealed how the foster parent’s sense
of self-efficacy impacted their skill sets in addressing the academic problems and educational
needs of their foster children. Six themes resulted from the coding and analysis of the
participants’ responses: inspiration from others, internal resources, religion/spirituality, meeting
foster child’s educational needs, interaction with others, and psychosocial resources. The
findings of the study indicated the training programs for prospective foster parents did not
adequately equip them with the necessary competencies to address the academic problems of
foster children. The recommendation suggested revision of the training program and the related
materials and manuals to include a component addressing the educational needs of the children
in foster care and clarification of the accountability for monitoring the child’s academic progress.
The development of a collaborative relationship among the concerned adults, specifically foster
parents, caseworkers, and school personnel, was also part of the recommendation and was a
critical factor in helping foster parents to address the educational problems and academic needs
of their foster child.
3. Theory to text and text to theory in qualitative research
Theory to text (Producing/ selecting data/ making use of the literature review in qualitative
research)
The literature review
An early and essential step in doing a study is to review the accumulated knowledge on your
research question. This applies to all research questions and all types of studies. As in other areas
of life, it is wise to find out what others have already learned about an issue before you address it
on your own. Clichés reinforce this advice: Do not waste time “reinventing the wheel” and
remember to “do your homework” before beginning an endeavor. This holds true whether you
are a consumer of research or will be beginning a study yourself. We begin by looking at the
various purposes the review might serve. We will also discuss what the literature is, where to
find it, and what it contains. Next we will explore techniques for systematically conducting a
review. Finally, we will look at how to write a review and what its place is in a research report.
Doing a literature review builds on the idea that knowledge accumulates and that we can learn
from and build on what others have done. The review rests on the principle that scientific
research is a collective effort, one in which many researchers contribute and share results with
one another. Although some studies may be especially important and a few individual
researchers may become famous, one study is just a tiny part of the overall process of creating
knowledge. Today’s studies build on those of yesterday. We read studies to learn from, compare,
replicate, or criticize them. Literature reviews vary in scope and depth. Different kinds of
reviews are stronger at fulfilling one or another of four goals. Doing an extensive professional
summary review that covers all of the research literature on a broad question could take a skilled
researcher years. On the other hand, the same person could finish a narrowly focused review
in a specialized area in a week. To begin a review, you must pick a topic area or research question,
determine how much time and effort you can devote to the study, settle on the appropriate level
of depth, and decide on the best type of review for your situation. You can combine features of
each type in a specific review.
Goals of a Literature Review
 To demonstrate a familiarity with a body of knowledge and establish credibility. A review
tells a reader that the researcher knows the research in an area and knows the major issues.
A good review increases a reader’s confidence in the researcher’s professional competence,
ability, and background.
 To show the path of prior research and how a current project is linked to it. A review
outlines the direction of research on a question and shows the development of knowledge.
A good review places a research project in a context and demonstrates its relevance by
making connections to a body of knowledge.
 To integrate and summarize what is known in an area. A review pulls together and
synthesizes different results. A good review points out areas in which prior studies agree or
disagree and in which major questions remain. It collects what is known up to a point in time and
indicates the direction for future research.
 To learn from others and stimulate new ideas. A review tells what others have found so that
a researcher can benefit from the efforts of others. A good review identifies blind alleys
and suggests hypotheses for replication. It divulges procedures, techniques, and research
designs worth copying so that a researcher can better focus hypotheses and gain new
insights.
Six Types of Literature Reviews
1. Context review. A common type of review in which the author links a specific study to a
larger body of knowledge. It often appears at the beginning of a research report and introduces
the study by situating it within a broader framework and showing how it continues or builds on a
developing line of thought or study.
2. Historical review. A specialized review in which the author traces an issue over time. It can be
merged with a theoretical or methodological review to show how a concept, theory, or research
method developed over time.
3. Integrative review. A common type of review in which the author presents and summarizes
the current state of knowledge on a topic, highlighting agreements and disagreements within it.
This review is often combined with a context review or may be published as an independent
article as a service to other researchers.
4. Methodological review. A specialized type of integrative review in which the author compares
and evaluates the relative methodological strength of various studies and shows how different
methodologies (e.g., research designs, measures, samples) account for different results.
5. Self-study review. A review in which an author demonstrates his or her familiarity with a
subject area. It is often part of an educational program or course requirement.
6. Theoretical review. A specialized review in which the author presents several theories or
concepts focused on the same topic and compares them on the basis of assumptions, logical
consistency, and scope of explanation.
Literature Meta-Analysis
A literature meta-analysis is a special technique used to create an integrative review or a
methodological review. Meta-analysis involves gathering the details about a large number of
previous studies and synthesizing the results. A meta-analysis proceeds in five steps:
 Locate all potential studies on a specific topic or research question
 Develop consistent criteria and screen studies for relevance and/or quality
 Identify and record relevant information for each study
 Synthesize and analyze the information into broad findings
 Draw summary conclusions based on the findings
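As a rough illustration of the five steps, the sketch below screens and pools a handful of hypothetical quantitative study records. The field names, screening criteria, and numbers are all invented for the example, and the synthesis shown (a sample-size-weighted mean) is only one simple way step 4 might be done:

```python
# Hypothetical study records for a toy meta-analysis; all values invented.
studies = [  # Step 1: the located studies on one research question.
    {"id": "A", "peer_reviewed": True,  "effect": 0.30, "n": 120},
    {"id": "B", "peer_reviewed": False, "effect": 0.55, "n": 40},
    {"id": "C", "peer_reviewed": True,  "effect": 0.10, "n": 300},
]

# Step 2: screen studies against consistent relevance/quality criteria.
screened = [s for s in studies if s["peer_reviewed"] and s["n"] >= 50]

# Step 3: record the relevant information (here, effect size and sample size).
effects = [(s["effect"], s["n"]) for s in screened]

# Step 4: synthesize -- a simple sample-size-weighted mean effect.
total_n = sum(n for _, n in effects)
pooled = sum(e * n for e, n in effects) / total_n

# Step 5: draw a summary conclusion from the pooled estimate.
print(f"{len(screened)} studies retained, pooled effect = {pooled:.3f}")
```

A real meta-analysis would use far more studies and more careful weighting, but the division of labor across the five steps is the same.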
For a meta-analysis of quantitative studies, relevant information in step 3 often includes sample
size, measures of variables, methodological quality, and size of the effects of variables, and in
step 4, this information is analyzed statistically. A meta-analysis of qualitative studies is a little
different. The relevant information in step 3 includes qualitative descriptions that are coded into
a set of categories, and in step 4 the results are synthesized qualitatively to reveal recurrent
themes. In addition to using meta-analysis to identify major findings across many studies, we can
also use it to identify how contributors to a research area define and use major concepts. For
example, Fulkerson and Thompson (2008) examined the concept of “social capital” over 18
years. They identified 1,218 articles in 450 academic journals with the term social capital in the
title or abstract. They coded the articles in seven ways to define the concept and identified the
“founding scholar” on the concept that the article cited. They also used statistical techniques to
analyze the patterns that show use of definition across time and by specialty area.
Where to Find Research Literature
Researchers can find reports of research studies in several formats: books, scholarly journal
articles, dissertations, government documents, and policy reports. Researchers also present
findings as papers at the meetings of professional societies. This section discusses each format
and provides a simple road map on how to access them.
Periodicals
Study results appear in newspapers, in popular magazines, on television or radio broadcasts, and
in Internet news summaries, but these are not the full, complete reports of research you need to
prepare a literature review. They are selected, condensed summaries prepared by journalists for a
general audience. They lack many essential details that we require to seriously evaluate the
study. Textbooks and encyclopedias also present condensed summaries as introductions to
readers who are new to a topic. These too are inadequate for preparing a literature review
because many essential details about the study are absent. Navigating the world of published
scholarly articles can be intimidating at first. When asked to do a “literature review,” many
beginning students Google the topic on the Internet or go to familiar nonprofessional,
nonscholarly magazines or newspaper articles. Social science students need to learn to
distinguish between scholarly publications that report on research studies and popular or
layperson entertainment or news articles for the lay public. They need to move beyond lay public
sources and rely on serious scholarly publications written for a professional audience.
Professional researchers present the results of studies in one of several forms: academic research
books (often called monographs), articles in scholarly journals, chapters in edited academic
books, and papers presented at professional meetings. Simplified, abbreviated, and “predigested”
versions of articles appear in textbooks written for students who are first learning about a topic or
in journalistic summaries in publications for the public. Unfortunately, the simplified summaries
can give an incomplete or distorted picture of a complete study. Researchers must locate the
original scholarly journal article to see what the author said and the data show. Upper-level
undergraduates and graduate students writing a serious research paper should rely on the
academic literature, that is, original articles published in academic scholarly journals.
Unfortunately, students may find some of the scholarly articles too difficult or technical to
follow. The upside is that the articles are the “real McCoy,” or original reports, not another
person’s (mis)reading of the original.
Researchers also may find a type of non-research publication with commentaries on topics or
research questions. These are discussion-opinion magazines (e.g., American Prospect, Cato
Journal, Commentary, Nation, National Review, New Republic, New York Review of Books,
Policy Review, and Public Interest). In them, professionals write essays expressing opinions,
beliefs, value-based ideas, and speculation for the educated public or professionals. They do not
contain original empirical research or actual scientific studies. They may be classified as
“academic journals” (versus general magazines) and may be “peer reviewed,” but they do not
contain original reports of empirical research. For example, Policy Review covers many topics:
law enforcement, criminal justice, defense and military, politics, government and international
relations, and political science. The leading conservative “think tank,” the Heritage Foundation,
publishes material as a forum for conservative debate on major political issues. At times,
professors or professional researchers who also conduct serious research studies contribute their
opinions and speculation in such publications. These publications must be used with caution.
They present debates, opinions, and judgments, not the official reports of serious empirical
research. If you want to write a research paper based on empirical research (e.g., an experiment,
survey data, field research), you need to rely on specialized sources. If you use an opinion essay
article, you need to treat it as such and never confuse it with an empirical social science study.
Researchers use specialized computer-based search tools to locate articles in the scholarly
literature. They also must learn the specialized formats or citation styles for referring to sources.
Professional social scientists regularly use search tools to tap into and build on a growing body
of research studies and scientific knowledge. Knowing how to locate studies; recognize, read,
and evaluate studies; and properly cite scholarly sources is a very important skill for serious
consumers of research and researchers to master.
Scholarly Journals.
The primary source to use for a literature review is the scholarly journal. It is filled with peer-
reviewed reports of research. One can rarely find these journals outside of college and university
libraries. Recall that most researchers disseminate new findings in scholarly journals. They are
the heart of the scientific community’s communication system. Some scholarly journals are
specialized and have only book reviews that provide commentary and evaluations on academic
books (e.g., Contemporary Sociology, Law and Politics Book Review), or only literature review
essays (e.g., Annual Review of Sociology, Annual Review of Psychology, Annual Review of
Anthropology) in which researchers give a “state of the field” essay for others. Publications that
specialize in literature reviews can offer useful introductions or overviews on a topic. Many
scholarly journals include a mix of literature reviews, book reviews, reports on research studies,
and theoretical essays. No simple solution or “seal of approval” separates scholarly journals from
other periodicals or instantly distinguishes a research study report from other types of articles. To
identify a research study you need to develop judgment skills or ask experienced researchers or
professional librarians. Nonetheless, learning to distinguish among types of publications is an
essential skill to master. One of the best ways to distinguish among types of publications is to
read many articles in scholarly journals. The number of scholarly journals varies widely
according to academic field. Psychology has more than 400 scholarly journals, sociology has
about 250, political science and communication have fewer than sociology, anthropology-
archaeology and social work each has about 100, urban studies and women’s studies have about
50, and criminology has only about a dozen. The “pure” academic fields usually have more than
the “applied” or practical fields such as marketing or social work. Each journal publishes from a
few dozen to more than 100 articles each year. You may wonder whether anyone ever reads all
of these articles. One study found that in a sample of 379 sociology articles, 43 percent were
cited in another study in the first year after publication and 83 percent within 6 years. Scholarly
journals vary by prestige and acceptance rates. Prestigious journals accept only 10 percent of the
research reports submitted to them. Overall rejection rates are higher in the social sciences than
in other academic fields and have been rising. This does not mean that researchers are doing low-
quality studies. Rather, the review process is becoming more rigorous, standards are rising, and
more studies are being conducted. This means that the competition to publish an article in a
highly respected journal has increased. You can find the full text of many scholarly journal
articles on the Internet. Usually, to access them you need to go through libraries that pay special
subscription fees for online article searching services, or a source tool. Some journals or
publishers offer limited articles or sell them. For example, I was able to view current articles in
Social Science Quarterly (a respected scholarly journal) free on the Internet, but when I tried to
read an article in Politics and Society online, I was asked to pay $25 per article; however, if I had
access to it through my university library, the article was free. Article search services may have
full, exact copies of scholarly journal articles. For example, JSTOR and Project MUSE provide
exact copies but only for a limited number of scholarly journals and only for past years. Other
source tools, such as Anthrosource, Proquest, EBSCO HOST, or Wilson Web offer a full-text
version of recent articles. Most articles are in the same format as their print versions. In addition
to searching the database of articles using a source tool, you can also select a particular journal
and browse its table of contents for particular issues. This can be very useful for generating new
ideas for research topics, seeing an established topic in creative ways, or expanding an idea into
new areas. Each online source tool has its own search procedure and list of scholarly journals.
None has all articles from all journals for all years. Some recent Internet-only scholarly journals,
called e-journals (e.g., Sociological Research Online, Current Research in Social Psychology, and
Journal of World Systems Research), present peer-reviewed research studies. Eventually, the
Internet format may replace print versions. But for now, about 95 percent of scholarly journals
are available in print form and most are available in a full-text version over the Internet. Internet
access nearly always requires that you use an online service through a library that pays an annual
fee to use it. Certain journals and certain years are not yet available online. Once you locate a
scholarly journal that contains empirical research studies, you next locate specific articles. You
need to make sure that a particular article presents the results of a study because journals often
publish several other types of article. It is easier to identify quantitative studies because they
usually have a methods or data section as well as charts, statistical formulas, and tables of
numbers. Qualitative research articles are more difficult to identify, and many students confuse
them with theoretical essays, literature review articles, idea discussion essays, policy
recommendations, book reviews, and legal case analyses. To distinguish among these types
requires a grasp of the varieties of research and experience in reading many articles.
Most college libraries have a section for scholarly journals and magazines, or, in some cases,
they mix the journals with books. Look at a map of library facilities or ask a librarian to identify
this section. The most recent issues, which look like magazines, are often physically separate in a
“current periodicals” section where they are temporarily available until the library receives all
issues of a volume. Libraries place scholarly journals from many fields together with popular,
nonscholarly magazines. All are periodicals, or “serials” in the jargon of librarians. Thus, you
will find popular magazines (e.g., Time, Road and Track, Cosmopolitan, and The Atlantic) next
to journals for astronomy, chemistry, mathematics, literature, sociology, psychology, social
work, and education. Libraries list journals in their catalog system by title and can provide a list
of the periodicals to which they subscribe. Scholarly journals are published as rarely as once a
year or as frequently as weekly. Most appear four to six times a year. For example, Social
Science Quarterly, like other journals with the word quarterly in their title, is published four
times a year. To assist in locating articles, each journal issue has a date, volume number, and
issue number. This information makes it easier to locate an article. Such information—along
with details such as author, title, and page number—is called an article’s citation and is used in
bibliographies or lists of works cited. The very first issue of a journal begins with volume 1,
number 1. It continues increasing the numbers thereafter. Most journals follow a similar system,
but enough exceptions exist that you need to pay close attention to citation information. For most
journals, each volume includes one year of articles. If you see a journal issue with volume 52, it
probably means that the journal has been in existence for 52 years. Most, but not all, journals
begin their publishing cycle in January.
Most journals number pages by volume, not by issue. The first issue of a volume usually begins
with page 1, and page numbering continues throughout the entire volume. For example, the first
page of volume 52, issue 4, may be page 547. Most journals have an index for each volume and a
table of contents for each issue that lists the title, the author’s or authors’ names, and the page on
which the article begins. Issues contain as few as one or two articles or as many as fifty. Most
have eight to eighteen articles, each of which may be five to fifty pages long. The articles often
have abstracts, short summaries on the first page of the article or grouped together at the front of
the issue. Many libraries do not retain physical paper copies of older journals, but to save space
and costs they keep only electronic or microfilm versions. Because each field may have hundreds
of scholarly journals, with each costing the library $100 to $3,500 per year in subscription fees,
only the large research libraries subscribe to most of them. You can also obtain a copy of an
article from a distant library through an interlibrary loan service, a system by which libraries lend
books or materials to other libraries. Few libraries allow people to check out recent issues of
scholarly journals. If you go to the library and locate the periodicals section, it is fun to wander
down the aisles and skim what is on the shelves. You will see volumes containing many research
reports. Each title of a scholarly journal has a call number like that of a regular library book.
Libraries often arrange the journals alphabetically by title. However, journals sometimes change
titles, creating confusion if they have been shelved under their original titles. Scholarly journals
contain articles on research in an academic field. Thus, most mathematics journals contain
reports on new mathematical studies or proofs, literature journals contain commentary and
literary criticism on works of literature, and sociology journals contain reports of sociological
research. Some journals cover a very broad field (e.g., social science, education, public affairs)
and contain reports from the entire field. Others specialize in a subfield (e.g., the family,
criminology, early childhood education, or comparative politics).
Citation Formats.
An article’s citation is the key to locating it. Suppose you want to read the study by Pampel on
cultural taste, music, and smoking behavior. Its citation says the following: Pampel, Fred C.
2006. “Socioeconomic Distinction, Cultural Tastes, and Cigarette Smoking.” Social Science
Quarterly, 87(1):19–35. It tells you to go to an issue of the scholarly journal Social Science
Quarterly published in 2006. The citation does not provide the month, but it gives the volume
number (87), the issue (1), and the page numbers (19–35). Formats for citing literature vary in
many ways. The most popular format in the text is the internal citation format of using an
author’s last name and date of publication in parentheses. A full citation appears in a separate
bibliography or reference section. There are many styles for full citations of journal articles with
books and other types of works each having a separate style. When citing articles, it is best to
check with an instructor, journal, or other outlet for the required form. Almost all include the
names of authors, article title, journal name, and volume and page numbers. Beyond these basic
elements, there is great variety. Some include the authors’ first names while others use initials
only. Some include all authors; others give only the first one. Some include information on the
issue or month of publication; others do not. Citation formats can be complex. Two major
reference tools on the topic in social science are Chicago Manual of Style, which has nearly 80
pages on bibliographies and reference formats, and American Psychological Association
Publication Manual, which devotes about 60 pages to the topic. In sociology, the American
Sociological Review style, with two pages of style instructions, is widely followed.
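To make the basic citation elements concrete, here is a minimal Python sketch that pulls apart a citation written in the volume(issue):pages style shown above. It is purely illustrative and handles only this one pattern; real citation styles vary far too much for a single regular expression.

```python
import re

# Hypothetical one-pattern parser for an author-date journal citation,
# using the Pampel example from the text. Not a general citation parser.
CITATION = ('Pampel, Fred C. 2006. "Socioeconomic Distinction, Cultural '
            'Tastes, and Cigarette Smoking." Social Science Quarterly, '
            '87(1):19-35.')

PATTERN = re.compile(
    r'(?P<author>.+?)\.\s+'                # author name(s), up to a period
    r'(?P<year>\d{4})\.\s+'                # four-digit year of publication
    r'"(?P<title>.+?)"\s+'                 # article title in quotation marks
    r'(?P<journal>.+?),\s*'                # journal name, up to the comma
    r'(?P<volume>\d+)\((?P<issue>\d+)\):'  # volume(issue):
    r'(?P<pages>[\d-]+)\.'                 # page range
)

m = PATTERN.search(CITATION)
if m:
    print(m.group('author'))    # Pampel, Fred C
    print(m.group('journal'))   # Social Science Quarterly
    print(m.group('volume'), m.group('issue'), m.group('pages'))
```

A sketch like this only shows why the basic elements (author, year, title, journal, volume, issue, pages) matter: with any one of them missing, the article becomes much harder to locate.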
Books
Books communicate many types of information, provoke thought, and entertain. The many types
of books include picture books, textbooks, short story books, novels, popular fiction or
nonfiction, religious books, and children’s books. Our concern here is with those books
containing reports of original research or collections of research articles. Libraries shelve these
books and assign call numbers to them, as they do with other types of books. You can find
citation information on them (e.g., title, author, publisher) in the library’s catalog system.
Distinguishing a book reporting on research from other books can be difficult. You are more
likely to find such books in a college or university library. Some publishers, such as university
presses, specialize in publishing research reports. Nevertheless, there is no guaranteed method
for identifying one on research without reading it. Some types of research are more likely to
appear in book form than others. For example, studies by anthropologists and historians are more
likely to appear in book-length reports than are those of economists or psychologists. However,
some anthropological and historical studies are reported in articles, and some economic and
psychological studies appear as books. In education, social work, sociology, and political
science, the results of long, complex studies may appear both in two or three articles and in book
form. Studies that involve detailed clinical or ethnographic descriptions and complex theoretical
or philosophical discussions usually appear as books. Finally, an author who wants to
communicate to scholarly peers and to the educated public may write a book that bridges the
scholarly, academic style and a popular nonfiction style. Locating original research articles in
books can be difficult because no single source lists them. Three types of books contain
collections of articles or research reports. The first type, for teaching, called a reader, may
include original research reports. Usually, articles on a topic from scholarly journals are gathered
and edited to be easier for students to read and understand. The second type of collection gathers
journal articles or may contain original research or theoretical essays on a specific topic. Some
collections contain original research reports organized around a specialized topic in journals that
are difficult to locate. The table of contents lists the titles and authors. Libraries shelve these
collections with other books, and some library catalog systems include article or chapter titles.
Finally, annual research books that are hybrids between scholarly journals and collections of
articles contain reports on studies not found elsewhere. They appear year after year with a
volume number for each year. These volumes, such as the Review of Research in Political
Sociology and Comparative Social Research, are shelved with books. Some annual books
specialize in literature reviews (e.g., Annual Review of Sociology and Annual Review of
Anthropology). No comprehensive list of these books is available as there is for scholarly
journals. The only way to find them is by spending a lot of time in the library or asking a
researcher who is already familiar with a topic area. Citations or references to books are shorter
than article citations. They include the author’s name, book title, year and place of publication,
and publisher’s name.
Dissertations
All graduate students who receive the doctor of philosophy (Ph.D.) degree are required to
complete a work of original research, called a dissertation. The dissertation is bound and
shelved in the library of the university that granted the degree. About half of all dissertations are
eventually published as books or articles. Because dissertations report on original research, they
can be valuable sources of information. Some students who receive the master’s degree also
conduct original research and write a master’s thesis, but fewer master’s theses involve serious
research, and they are much more difficult to locate than unpublished dissertations. Specialized
indexes list dissertations completed by students at accredited universities. For example,
Dissertation Abstracts International lists dissertations with their authors, titles, and universities.
The index is organized by topic and includes an abstract of each dissertation. You can borrow
most dissertations via interlibrary loan from the degree-granting university if it permits this. An
alternative is to purchase a copy from a national dissertation microfilm/photocopy center such as
the one at the University of Michigan, Ann Arbor, for U.S. universities. Some large research
libraries contain copies of dissertations from other libraries if someone previously requested
them.
Government Documents
The federal government of the United States, the governments of other nations, state- or
provincial-level governments, the United Nations, and other international agencies such as the
World Bank, sponsor studies and publish reports of the research. Many college and university
libraries have these documents in their holdings, usually in a special “government documents”
section. These reports are rarely found in the catalog system. You must use specialized lists of
publications and indexes, usually with the help of a librarian, to locate these reports. Most
college and university libraries hold only the most frequently requested documents and reports.
Policy Reports and Presented Papers
If you are conducting a thorough literature review, you may look at these two sources. Some are
on the Internet, but most are difficult for all but the trained specialist to obtain. Research
institutes and policy centers (e.g., Brookings Institution, Institute for Research on Poverty, Rand
Corporation) publish papers and reports. Some major research libraries purchase these and
shelve them with books. The only way to be sure of what has been published is to write directly
to the institute or center and request a list of reports. Each year the professional associations in
academic fields (e.g., anthropology, criminal justice, geography, political science, psychology,
sociology) hold annual meetings. Thousands of researchers assemble to give, listen to, or discuss
oral reports of recent research. Most oral reports are also available as written papers. People who
do not attend the meetings but who are members of the association receive a program of the
meeting, listing each paper to be presented with its title, author, and author’s place of
employment. These people can write directly to the author and request a copy of the paper.
Many, but not all, of the papers later appear as published articles. Sometimes the papers are
available through online services.
How to Conduct a Systematic Literature Review
i. Define and Refine a Topic.
Just as you must plan and clearly define a topic and research question as you begin a research
project, you need to begin a literature review with a clearly defined, well-focused research
question and a plan. A good review topic should be in the form of a research question. For
example, “divorce” or “crime” is much too broad. A more appropriate review topic might be
“What contributes to the stability of families with stepchildren?” or “Does economic inequality
affect crime rates across nations?” If you conduct a context review for a research project, it
should be slightly broader than the specific research question being examined. Often, a
researcher will not finalize a specific research question for a study until he or she has reviewed
the literature. The review usually helps to focus on the research question.
ii. Design a Search
After choosing a focused research question for the review, the next step is to plan a search
strategy. You must decide on the type of review, its extensiveness, and the types of materials to
include. The key is to be careful, systematic, and organized. Set parameters on your search: how
much time you will devote to it, how far back in time you will look, the minimum number of
research reports you will examine, how many libraries you will visit, and so forth. Also decide
how to record the bibliographic citation for each reference and how to take notes (e.g., in a
notebook, on 3" × 5" cards, in a computer file). You should begin a file folder or computer file in
which you can place possible sources and ideas for new sources. As your review proceeds, you
should more narrowly focus on a specific research question or issue.
iii. Locate Research Reports
Locating research reports depends on the type of report or research “outlet” for which you are
searching. As a general rule, use multiple search strategies to counteract the limitations of a
single search method. (Discussed above in Where to Find Research Literature)
Evaluation of Research Articles
After you locate a published study, you need to read and evaluate it. At first, this is difficult but
becomes easier over time. Guidelines to help you read and evaluate reports you find and locate
models for writing your own research reports follow.
1. Examine the title.
A good title is specific, indicates the nature of the research without describing the results, and
avoids asking a yes or no question. It describes the topic, may mention one or two major
variables, and tells about the setting or participants. An example of a good title is “Parental
Involvement in Schooling and Reduced Discipline Problems among Junior High School Students
in Singapore.” A good title informs readers about a study whereas a bad title either is vague or
overemphasizes technical details or jargon. The same study could have been titled “A Three-Step
Correlation Analysis of Factors That Affect Segmented Behavioral Anxiety Reduction.”
2. Read the abstract.
A good abstract summarizes critical information about a study. It gives the study’s purpose,
identifies methods used, and highlights major findings. It avoids vague references to future
implications. After an initial screening by title, you should be able to determine a report’s
relevance from a well-prepared abstract. In addition to screening for relevance, a title and
abstract prepare you for examining a report in detail. I recommend a two-stage screening
process. Use the title and abstract to determine the article’s initial relevance. If it appears
relevant, quickly scan the introduction and conclusion sections to decide whether it is a real
“keeper” (i.e., worth investing in a slow, careful reading of the entire article). Most likely, you
will discover a few articles that are central to your purpose and many that are tangential. They
are only worth skimming to locate one or two specific relevant details. Exercise caution not to
pull specific details out of context.
3. Read the article.
Before reading the entire article, you may want to skim the first several paragraphs at the
beginning and quickly read the conclusion. This will give you a picture of what the article is
about. Certain factors affect the amount of time and effort and overall payoff from reading a
scholarly article. The time and effort are lower and results higher under three conditions:
• the article is of high quality, with a well-defined purpose, clear writing, and smooth,
logical organization,
• you are sharply focused on a particular issue or question, and
• you have a solid theoretical background, know a lot about the substantive topic, and are
familiar with research methodology.
As you see, a great deal depends on reader preparation. You can develop good reader preparation
to quickly “size up” an article by recognizing the dimensions of a study, its use of theory, and the
approach used. Also, be aware that authors write with different audiences in mind. They may
target a narrow, highly specialized sector of the scientific community; write for a broad cross-
section of students and scholars in several fields; or address policymakers, issue advocates, and
applied professionals. When you read a highly relevant article, begin with the introduction
section. It has three purposes:
• to introduce a broad topic and make a transition to a specific research question that will
be the study’s primary focus,
• to establish the research question’s significance (in terms of expanding knowledge,
linking to past studies, or addressing an applied concern), and
• to outline a theoretical framework and define major concepts.
Sometimes an article blends the introduction with a context literature review; at other times the
literature review is a separate section. To perform a good literature review, you must be
selective, comprehensive, critical, and current. By being selective, you do not list everything ever
written on a topic, only the most relevant studies. By being comprehensive, you include past
studies that are highly relevant and do not omit any important ones. More than merely recounting
past studies, you should be critically evaluative, that is, you comment on the details of some
specific studies and evaluate them as they relate to the current study. You will not know
everything about your study until it is finished, so plan to fine-tune and rewrite it after it is
completed.
You should include recent studies in your literature review. Depending on its size and
complexity, you may distinguish among theory, methods, findings, and evaluation. For example,
you might review theoretical issues and disputes, investigate the methods previous researchers
used, and summarize the findings, highlighting any gaps or inconsistencies. An evaluation of
past studies can help you to justify the importance of conducting the current study.
Depending on the type of research approach used in an article, a hypothesis or methods section
may follow the literature review. These sections outline specific data sources or methods of data
collection, describe how variables were measured, whether sampling was used, and, if so, the
details about it. You may find these sections tightly written and packed with technical details.
They are longer in quantitative than qualitative studies.
After a methods section comes the results section. If the study is quantitative research, it should
do more than present a collection of statistical tables or coefficients and percentages. It should
discuss what the tables and data show. If it is qualitative research, it should be more than a list of
quotations or straight description. The organization of data presentation usually begins simply by
painting a broad scope and then goes into complexities and specific findings. Data presentation
includes a straightforward discussion of the central findings and notes their significance. In
quantitative research, it is not necessary to discuss every detail in a table or chart. Just note major
findings and any unexpected or unusual findings. In a good article, the author guides the
reader through the data, pointing out what is most important, rather than showing every data detail. In qualitative
research, the organization of data often tells a story or presents a line of reasoning. Readers
follow the author’s story but are free to inquire about it. In some articles, the author combines the
discussion and results sections. In others, they are separate. A discussion section moves beyond
simple description. It elaborates on the implications of results for past findings, theory, or
applied issues. The section may include implications for building on past findings from the literature
review, and implications for the specific research question. The discussion section may also
include commentary on any unexpected findings. Most researchers include methodological
limitations of the study in the discussion. This often includes how the specific measures,
sampling, cases, location, or other factors restrict the generalizability of findings or are open to
alternative explanations. Full candor and openness are expected. In a good article, the author is
self-critical and shows an awareness of the study’s weaknesses. After you have read the
discussion and results sections, read the article’s conclusion or summary for a second time. A
good conclusion/summary reviews the study’s research question, major findings, and significant
unexpected results. It also outlines future implications and directions to take. You may want to
look for an appendix that may include additional study details and review the reference or
bibliography section. An article’s bibliography can give you leads to related studies or theoretical
statements.
Reading and critically evaluating scholarly articles takes concentration and time, and it improves
with practice. Despite the peer-review process and manuscript rejection rates, articles vary in
quality. Some may contain errors, sloppy logic, or gaps. Be aware that a title and introduction
may not mesh with specific details in the results section. Authors do not always describe all
findings. The reader with a clearly focused purpose may notice new details in the findings by
carefully poring over an article. For example, an author may not mention important results
evident in a statistical table or chart or may place too much attention on minor or marginal
results. As you evaluate an article, notice exactly how the study it reports was conducted, how
logically its parts fit together, and whether the conclusions really flow from all of the findings.
How to Take Notes
As you gather the relevant research literature, you may feel overwhelmed by the quantity of
information, so you need a system for taking notes. The old-fashioned note-taking approach was
to write the notes onto index cards and then shift and sort the note cards, place them in piles, and
so forth while looking for connections among them or developing an outline for a report or
paper. This method still works. Today, however, most people use word processing software and
gather photocopies or printed versions of many articles. As you discover new sources, you may
want to create two file types for note cards or computer documents, a source file and a content
file. Record all bibliographic information for each source in the source file even though you may
not use some of it. Do not forget anything in a complete bibliographic citation, such as a page
number or the name of the second author; if you do, you will regret it later. It is far easier to
erase a source you do not use than to try to locate bibliographic information later for a source
you discover that you need or from which you forgot one detail. I suggest creating two kinds of
source files, or dividing a master file into two parts: have file and potential file. The have file is
for sources that you have found and for which you have already taken content notes. The
potential file is for leads and possible new sources that you have yet to track down or read. You
can add to the potential file anytime you come across a new source or a new article’s
bibliography. Toward the end of writing a report, the potential file will disappear and the have
file will become your bibliography. The content file contains substantive information of interest
from a source, usually its major findings, details of methodology, definitions of concepts, or
interesting quotes. If you quote directly from a source or want to take some specific information
from it, you must record the specific page number(s) on which it appears. Link the files by
putting key source information, such as author and date, on each content file.
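The two-file system described above can be sketched as a simple data structure. The following Python fragment is only an illustration of the idea; the sources and notes shown are hypothetical.

```python
# Sketch of a source file (split into "have" and "potential" parts)
# and a content file linked to it by author-date keys.
source_file = {
    "have": {        # sources located, with content notes already taken
        "Pampel 2006": {
            "author": "Pampel, Fred C.",
            "year": 2006,
            "title": "Socioeconomic Distinction, Cultural Tastes, "
                     "and Cigarette Smoking",
            "journal": "Social Science Quarterly",
            "volume": 87, "issue": 1, "pages": "19-35",
        },
    },
    "potential": {   # leads not yet tracked down or read (hypothetical)
        "Smith 2004": {"note": "cited in Pampel 2006 bibliography"},
    },
}

# Content notes always carry the page number in case of direct quotation.
content_file = {
    "Pampel 2006": [
        {"page": 22, "note": "main finding: cultural taste and smoking"},
        {"page": 25, "note": "measurement of socioeconomic distinction"},
    ],
}

# Toward the end of a project, the "have" file becomes the bibliography:
bibliography = sorted(source_file["have"].keys())
print(bibliography)
```

The point of the sketch is the linkage: every content note points back to a complete source record, so no bibliographic detail has to be reconstructed later.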
i. What to Record.
You must decide what to record about an article, book, or other source. It is better to err in the
direction of recording too much rather than too little. In general, record the hypotheses tested, the
measurement of major concepts, the main findings, the basic design of the research, the group or
sample used, and ideas for future study. It is wise to examine the report’s bibliography and note
sources that you can add to your search. Photocopying all relevant articles or reports will save
you time recording notes and will ensure that you will have an entire report. Also, you can make
notes on the photocopy, but consider several facts about this practice. First, photocopying can be
expensive for a large literature search. Second, be aware of and obey copyright laws. U.S.
copyright laws permit photocopying for personal research use. Third, remember to record or
photocopy the entire article, including all citation information. Fourth, organizing a large pile of
articles can be cumbersome, especially if you want to use several different parts of a single
article. Finally, unless you highlight carefully or take good notes, you may have to reread the
entire article later.
ii. Organize Notes.
After you have gathered many references and notes, you need an organizing method. One
approach is to group various studies or specific findings by skimming notes and creating a
mental map of how they fit together. Try several organizational plans before you settle on a final
one. Organizing is a skill that improves with practice. For example, place notes into piles
representing common themes or draw charts comparing what different reports state about the
same question, noting any agreements and disagreements. In the process of organizing notes, you
will find that some references and notes do not fit anywhere. You should discard them as being
irrelevant. You may discover gaps or areas and topics that are relevant but you have not
examined yet. This necessitates return visits to the library. The best organizational method
depends on the purpose of the review. A context review implies organizing recent reports around
a specific research question. A historical review implies organizing studies by major theme and
by the date of publication. An integrative review implies organizing studies around core common
findings of a field and the main hypotheses tested. A methodological review implies organizing
studies by topic and, within each topic, by the design or method used. A theoretical review
implies organizing studies by theories and major thinkers.
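One organizing method mentioned above, placing notes into piles that represent common themes, can be sketched in a few lines of Python. The notes, themes, and sources below are hypothetical examples, not findings from any real study.

```python
from collections import defaultdict

# Hypothetical content notes, each tagged with a theme during skimming.
notes = [
    {"source": "Pampel 2006", "theme": "cultural taste",
     "finding": "taste patterns predict smoking"},
    {"source": "Smith 2004", "theme": "inequality",
     "finding": "inequality linked to higher crime"},
    {"source": "Lee 2005", "theme": "cultural taste",
     "finding": "taste varies by socioeconomic status"},
]

piles = defaultdict(list)   # one "pile" of sources per common theme
for note in notes:
    piles[note["theme"]].append(note["source"])

for theme, sources in sorted(piles.items()):
    print(theme, "->", sources)
```

Notes that end up in no pile are candidates for discarding as irrelevant; themes with only one or two sources may signal gaps that require a return visit to the library.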
Planning and Writing the Review
A literature review requires planning and clear writing, and it requires rewriting. All rules of
good writing (e.g., clear organizational structure, an introduction and conclusion, transitions
between sections) apply to writing a literature review. Keep your purposes in mind when you
write, and communicate clearly and effectively. You want to communicate a review’s purpose to
readers by the review’s organization. The wrong way to write a review is to list a series of
research reports with a summary of the findings of each. This fails to communicate a sense of
purpose. It reads as a set of notes strung together. When I see these, I think that the review writer
was sloppy and skipped over an important organizational step in writing the review. The correct
way to write a review is to synthesize and organize common findings together. A well-accepted
approach is to address the most important ideas first, logically link common statements or
findings, and note discrepancies or weaknesses.
Text to theory in qualitative research (Analyzing data)
In qualitative research, moving from text to theory involves the following steps:
• Documentation of Data
• Coding and Categorizing
• Analyzing Conversation, Discourse, and Genres
• Narrative and Hermeneutic Analysis
• Using Computers in Qualitative Analysis
• Text Interpretation: An Overview
1. Documentation of Data
The preceding chapters have detailed the main ways in which data are collected or produced in
qualitative research. However, before you can analyze the data you may have generated in these
ways, you have to document and edit your data. In the case of interview data, an important part
of this editing process is that you record the spoken words and then transcribe them. For
observations, the most important task is that you document actions and interactions. In both
cases, a contextual enrichment of statements or activities should be a main part of the data
collection. You can achieve this enrichment by documenting the process of data collection in
context protocols, research diaries, or field notes. With these procedures, you transform the
relations you study into texts, which are the basis for the actual analyses. In this chapter, I will
discuss the methodological alternatives for documenting collected data. The data you produce as
a result of this process are substituted for the studied (psychological or social) relations so that
you can proceed with the next stages of the research process (i.e., interpretation and
generalization). The process of documenting the data comprises three main steps: recording the
data, editing the data (transcription), and constructing a "new" reality in and by the produced
text. All in all, this process is an essential aspect in the construction of reality in the research
process.
New Ways and Problems of Recording Data
The more sophisticated (acoustic and audio-visual) possibilities for recording events have had an
essential influence on the renaissance of qualitative research over the last 20 years. One
condition for this progress was that the use of recording devices (tape, MP3, mini-disc, and video
recorders) has become widespread in daily life itself as well. To some extent, their prevalence
has made them lose their unfamiliarity for potential interviewees or for those people whose
everyday life is to be observed and recorded by their use. It is these devices alone that have made
possible some forms of analysis, such as conversation analysis and objective hermeneutics.
Acoustic and Visual Recordings of Natural Situations
Using machines for recording renders the documentation of data independent of perspectives—
those of the researcher as well as those of the subjects under study. It is argued that this achieves
a naturalistic recording of events or a natural design: interviews, everyday talk, or counseling
conversations are recorded on cassettes or videotape. After informing the participants about the
purpose of the recording, the researcher hopes that they will simply forget about the tape
recorder and that the conversation will take place "naturally"—even at awkward points.
Presence and Influence of the Recording
This hope of making a naturalistic recording will be fulfilled above all if you restrict the presence
of the recording equipment. In order to get as close as possible to the naturalness of the situation,
I recommend that you restrict the use of recording technology to the collection of data necessary
to the research question and the theoretical framework. "Where videotaping does not document
anything essential beyond that obtained with a cassette recorder, you should prefer the less
obtrusive machine. In any case, the researchers should limit their recordings to what is absolutely
necessary for their research question—in terms of both the amount of data that is recorded and
the thoroughness of the recording. In research about counseling, for example, you may ask the
counselors to record their conversations with clients by using a cassette recorder. In institutions
where these kinds of recordings are continuously made for purposes of supervision, recording
may have little influence on what is recorded. However, you should not ignore the fact that there
may be some influence on the participants' statements. This influence is increased if the
researchers are present in the research situation for technical reasons. The greater the effort in
videotaping and the more comprehensive the insight it permits into the everyday life under study,
the greater the possible skepticism and reservations on the part of participants in the study. This
makes the integration of the recording procedure in the daily life under study more complicated.
Skepticism about the Naturalness of Recordings
Correspondingly, you can find thoughtful reflections on the use of recording technology in
qualitative research. These forms of recording have replaced the interviewers' or observers'
notes, which were the dominant medium in earlier times. For Hopf, they provide "increased
options for an inter-subjective assessment of interpretations ... for taking into account interviewer
and observer effects in the interpretation ... and for theoretical flexibility" compared to "the
necessarily more selective memory protocols". This new flexibility leads "to a new type of
'qualitative data hoarding' owing to the delays in decisions about research questions and
theoretical assumptions which are now possible." New questions concerning research ethics,
changes in the studied situations caused by the form of recording, and a loss of anonymity for the
interviewees are linked to this. The ambivalence to the new options for recording qualitative data
suggests that it is important to treat this point not as a problem of technical detail, but rather in
the sense of a detailed "qualitative technology assessment." Also, you should include in your
considerations about the appropriate method for documentation "out-of-date" alternatives, which
were displaced by the new technologies.
i. Field Notes
The classic medium for documentation in qualitative research has been the researcher’s notes.
The notes taken in interviews should contain the essentials of the interviewee's answers and
information about the course of the interview. Participant observers repeatedly interrupt their participation to note important observations; this is the classic documentation technique, with its own problems and chosen solutions to them. Lofland and Lofland (1984) formulate as a general rule that such notes should be made immediately, or at least as soon as possible.
withdrawal necessary for this may introduce some artificiality in the relation to interaction
partners in the field. Especially in action research when the researchers take part in the events in
the field and do not merely observe them, it is additionally difficult to maintain this freedom for
the researchers. An alternative is to note impressions after ending the individual field contact.
Lofland and Lofland (1984, p. 64) recommend that researchers use a "cloistered rigor" in
following the commandment to make notes immediately after the field contact, and furthermore
that researchers allow as much time for carefully writing up their observations as they spent on the observation itself. It should be ensured that (maybe much) later a distinction can
still be made between what has been observed and what has been condensed by the observer in
his or her interpretation or summary of events. Researchers may develop a personal style of
writing notes after a while and with increasing experience.
The production of reality in texts starts with the taking of field notes. The researcher's selective
perceptions and presentations have a strong influence on this production. This selectivity
concerns not only the aspects that are left out, but also and above all those which find their way
into the notes. It is only the notation that raises a transitory occurrence out of its everyday course
and makes it into an event to which the researcher, interpreter, and reader can turn their attention
repeatedly. One way of reducing or at least qualifying this selectivity of the documentation is to
complement the notes by diaries or day protocols written by the subjects under study in parallel
with the researcher's note taking. Thus, their subjective views may be included in the data and
become accessible to analysis. Such documents from the subject's point of view can be analyzed
and contrasted with the researcher's notes. Another way is to add photos, drawings, maps, and
other visual material to the notes. A third possibility is to use an electronic notebook, a dictating
machine, or similar devices for recording the notes.
Correspondingly, Spradley (1980, pp. 69-72) suggests four forms of field notes for
documentation:
 the condensed accounts in single words, sentences, quotations from conversations, etc.;
 an expanded account of the impressions from interviews and field contacts;
 a fieldwork journal, which like a diary "will contain ... experiences, ideas, fears, mistakes,
confusions, breakthroughs, and problems that arise during fieldwork";
 some notes about analysis and interpretations, which start immediately after the field
contacts and extend until finishing the study.
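For researchers who keep their notes electronically, Spradley's four forms could be mirrored in a simple record type. The sketch below is purely illustrative; the field names and sample entries are invented, not part of any established tool.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative record for Spradley's four forms of field notes.
# All field names are hypothetical, chosen to mirror the list above.
@dataclass
class FieldNotes:
    condensed_account: List[str] = field(default_factory=list)  # single words, short quotations
    expanded_account: str = ""                                  # fuller impressions after the contact
    fieldwork_journal: List[str] = field(default_factory=list)  # diary-like: ideas, fears, mistakes
    analysis_notes: List[str] = field(default_factory=list)     # interpretations, up to the write-up

notes = FieldNotes()
notes.condensed_account.append('"too loud in here" (nurse, break room)')
notes.fieldwork_journal.append("Felt like an intruder during the shift handover.")
```

Keeping the four forms in separate fields preserves the distinction, stressed above, between what was observed and what the observer has already condensed or interpreted.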
ii. Research Diary
Especially if more than one researcher is involved, there is a need for documentation of, and
reflection on, the ongoing research process in order to increase the comparability of the empirical
proceedings in the individual notes. One method of documentation is to use continually updated
research diaries written by all participants. These should document the process of approaching a
field, and the experiences and problems in the contact with the field or with interviewees and in
applying the methods. Facts that seemed important, as well as matters of minor importance or facts that were lost in the interpretation, generalization, assessment, or presentation of the results, as seen from the perspective of the individual researcher, should also be incorporated. Comparing such
documentation and the different views expressed in them makes the research process more
intersubjective and explicit. Furthermore, they may be used as memos in the sense of Strauss for
developing a grounded theory. Strauss recommends writing memos during the whole research
process, which will contribute to the process of building a theory. Documentation of this kind is
not only an end in itself or additional knowledge but also serves in the reflection on the research
process. Several methods have been outlined for "catching" interesting events and processes,
statements, and proceedings. In the noting of interventions in the everyday life under study, the
researchers should be led in their decisions by the following rule of economy: record only as
much as is definitely necessary for answering the research question. They should avoid any
"technical presence" in the situation of the data collection that is not absolutely necessary for
their theoretical interests. Reducing the presence of recording equipment and informing the
research partners as much as possible about the sense and purpose of the chosen form of
recording make it more likely that the researchers will truly "catch" everyday behavior in natural
situations. In the case of research questions where "out-of-date" forms of documentation such as
preparing a protocol of answers and observations are sufficient, I highly recommend using these
forms. But you should produce these protocols as promptly and comprehensively as possible in order to record, above all, impressions of the field and the questions they raise.
iii. Documentation Sheets
For interviews, I find it helpful to use sheets for documenting the context of data collection.
What information they should include depends on the design of the study; for example, if several
interviewers are involved or if interviews are conducted at changing locations, which may have influenced the interview. In addition, the research questions determine what you
should concretely note on these sheets. The example comes from my study of technological
change in everyday life, in which several interviewers conducted interviews with professionals in
different work situations on the influences of technology on childhood, children's education in
one's own family or in general, and so on. Therefore, the documentation sheet needed to contain
explicit additional contextual information.
Example of a Documentation Sheet
Information about the Interview and the Interviewee
 Date of the interview:
 Place of the interview:
 Duration of the interview:
 Interviewer:
 Indicator for the interviewee:
 Gender of the interviewee:
 Age of the interviewee:
 Profession of the interviewee:
 Working in this profession since:
 Professional field:
 Raised (countryside/city):
 Number of children:
 Age of the children:
 Gender of the children:
 Special occurrences in the interview:
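If documentation sheets are kept electronically, the fields listed above can be mirrored in a simple record type. The following sketch is a hypothetical rendering of that sheet; all field names and sample values are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical structure mirroring the documentation sheet above;
# field names are illustrative, not a standard schema.
@dataclass
class InterviewContext:
    date: str
    place: str
    duration_minutes: int
    interviewer: str
    interviewee_id: str          # an indicator, not a real name (anonymization)
    gender: str
    age: int
    profession: str
    in_profession_since: int
    professional_field: str
    raised_in: str               # "countryside" or "city"
    number_of_children: int
    special_occurrences: Optional[str] = None

sheet = InterviewContext(
    date="2024-03-12", place="school office", duration_minutes=55,
    interviewer="I2", interviewee_id="P07", gender="f", age=41,
    profession="teacher", in_profession_since=2008,
    professional_field="primary education", raised_in="city",
    number_of_children=2, special_occurrences="interrupted by a phone call",
)
```

A structured record like this makes it easy to compare the contexts of interviews conducted by different interviewers or at different locations, which is exactly the purpose of the sheet.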
Transcription
If data have been recorded using technical media, their transcription is a necessary step on the
way to their interpretation. Different transcription systems are available which vary in their
degree of exactness. A standard has not yet been established. In language analyses, interest often
focuses on attaining the maximum exactness in classifying and presenting statements, breaks,
and so on. Here you can also ask about the procedure's appropriateness. These standards of
exactness contribute to the natural science ideals of precision in measurement and are imported
into interpretive social science through the back door. Also, the formulation of rules for
transcription may tempt one into some kind of fetishism that no longer bears any reasonable
relation to the question and the products of the research. Where linguistic and conversation
analytic studies focus on the organization of language, this kind of exactness may be justified.
For more psychological or sociological research questions, however, where linguistic exchange
is a medium for studying certain contents, exaggerated standards of exactness in transcriptions
are justified only in exceptional cases. It seems more reasonable to transcribe only as much and
only as exactly as is required by the research question. First, precise transcription of data absorbs
time and energy, which could be invested more reasonably in their interpretation instead.
Second, the message and the meaning of what was transcribed are sometimes concealed rather
than revealed in the differentiation of the transcription and the resulting obscurity of the
protocols produced. Thus Bruce holds:
The following very general criteria can be used as a starting point in the evaluation of a
transcription system for spoken discourse: manageability (for the transcriber), readability,
learnability, and interpretability (for the analyst and for the computer). It is reasonable to think
that a transcription system should be easy to write, easy to read, easy to learn, and easy to
search.
Beyond the clear rules of how to transcribe statements, turn taking, breaks, ends of sentences,
and so on, a second check of the transcript against the recording and the anonymization of data
(names, local, and temporal references) are central features of the procedure of transcription.
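The anonymization step mentioned above can be illustrated with a minimal sketch: replacing names and place references in a transcript with neutral placeholders. The replacement table and placeholder scheme here are assumptions for the example; a real project would need a more careful procedure, for instance consistent pseudonyms across all interviews.

```python
import re

# Minimal anonymization sketch: replace each listed name or place in a
# transcript line with a neutral placeholder. The mapping is illustrative.
def anonymize(transcript: str, replacements: dict) -> str:
    for original, placeholder in replacements.items():
        # \b word boundaries avoid replacing substrings of longer words
        transcript = re.sub(rf"\b{re.escape(original)}\b", placeholder, transcript)
    return transcript

line = "I: And then Maria moved to Hamburg with her sister."
anonymized = anonymize(line, {"Maria": "[name-1]", "Hamburg": "[city-1]"})
# anonymized == "I: And then [name-1] moved to [city-1] with her sister."
```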
Transcription in conversation analysis has often been the model for transcriptions in social
science.
Transcription is an integral process in the qualitative analysis of language data and is widely
employed in basic and applied research across several disciplines and in professional practice
fields. Due to financial constraints in educational institutions and for individuals, the transcription of audio and video materials that would otherwise benefit the research process is often given a backseat or offloaded to inexperienced interns or inferior outsourcing firms. But automatic transcription also has advantages that can help obtain a fine result.
Transcription process benefits in qualitative research
Transcription in qualitative research has made the overall process of interpreting data very
simple for researchers. It has helped interviewers by enabling them to read, analyse and interpret
information with ease, with text that is precise and concise as well as easily understandable.
Easy Interpretation of Data
Transcripts help in easy interpretation of data, as reading text related to the research/interview
makes it easy to collate data and structure it better.
Shared copies for future analysis
Transcription of interviews and qualitative research ensures that copies of the document can be
distributed to everyone in the team which would further help in future analysis of data. This also
helps fellow researchers to have direct access to data which is extremely valuable and useful.
Audio transcription has also helped enhance teamwork, as tasks are shared between coworkers.
Inclusion of verbatim comments in the report
Instead of having to listen, pause, and type, having the focus group or interview pre-transcribed helps generate good-quality, in-depth reports.
Enables a follow up and a detailed examination of the events
Taking notes during the event might result in the interviewer missing out on key pieces of
information being mentioned. The word-for-word transcription allows one to listen and interpret
what is being said more effectively.
Source of reference and Data interpretation
Provides a source of reference for the interviewer while conducting a follow up interview and
allows them to be able to go through data and use it at points when he/she could reach a standstill
while interpreting data.
This data can also be re-used in other investigations and can support new conclusions in later studies. This has extended the uses to which data can be put and the number of observations that can be made from the same piece of data.
Quick reporting/Browsing
The process of transcription of qualitative data has made browsing through data a much faster
process. A researcher looking for data can simply use commands like “CTRL-F” to find specific
information in the transcript without having to waste time going through the whole text or trying
to listen through long audio files to get certain information.
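The kind of quick browsing described above can also be sketched programmatically, for example as a small search function that returns each matching line together with its line number. The transcript excerpt and function name are invented for illustration.

```python
# Sketch of a programmatic "CTRL-F" over a transcript: find every line
# containing a search term, with its line number for quick reference.
def find_in_transcript(transcript: str, term: str):
    hits = []
    for number, line in enumerate(transcript.splitlines(), start=1):
        if term.lower() in line.lower():
            hits.append((number, line.strip()))
    return hits

transcript = """I: How do your children use the computer?
R: Mostly for games, sometimes homework.
I: Does the computer change family routines?
R: Dinner starts later on game nights."""

hits = find_in_transcript(transcript, "computer")
# hits contains lines 1 and 3, the two questions mentioning the term
```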
Transcribing is an arduous process, even for the most experienced transcribers, but it must be
done to convert the spoken word to the written word to facilitate analysis. And over the last
decade, advances in technology have brought about new possibilities in the field of transcription.
Clients have the option of automated transcription software like ‘Happy Scribe’, which will transcribe your audio or video file using voice recognition technology within a few minutes, after which you can access your transcript through your dashboard.
Reality as text, text as a new reality
Recording the data, making additional notes, and transcribing the recordings transform the
interesting realities into text. At the very least, the documentation of processes and the
transcription of statements lead to a different version of events. Each form of documentation
leads to a specific organization of what is documented. Every transcription of social realities is
subject to technical conditions and limitations and produces a specific structure on the textual
level, which makes accessible what was transcribed in a specific way. The documentation
detaches the events from their transience. The researcher's personal style of noting things makes
the field a presented field; the degree of the transcription's exactness dissolves the gestalt of the
events into a multitude of specific details. The consequence of the following process of
interpretation is that:
Reality only presents itself to the scientist in substantiated form, as text - or in technical terms -
as protocol. Beyond texts, science has lost its rights, because a scientific statement can only be
formulated when and insofar as events have found a deposit or left a trace and these again have
undergone an interpretation.
This substantiation of reality in the form of texts is valid in two respects: as a process that opens
access to a field and, as a result of this process, as a reconstruction of the reality, which has been
transformed into texts. The construction of a new reality in the text has already begun at the level
of the field notes and at the level of the transcript and this is the only (version of) reality
available to the researchers during their following interpretations. These constructions should be
taken into account in the more or less meticulous handling of the text, which is suggested by
each method of interpretation. The more or less comprehensive recording of the case, the
documentation of the context of origination, and the transcription organize the material in a
specific way. The epistemological principle of understanding may be realized by being able to
analyze as far as possible the presentations or the proceeding of situations from the inside.
Therefore, the documentation has to be exact enough to reveal structures in those materials and it
has to permit approaches from different perspectives. The organization of the data has the main
aim of documenting the case in its specificity and structure. This allows the researcher to
reconstruct it in its gestalt and to analyze and break it down for its structure—the rules according
to which it functions, the meaning underlying it, the parts that characterize it. Texts produced in
this way construct the studied reality in a specific way and make it accessible as empirical
material for interpretative procedures.
2. Coding and categorizing
The interpretation of data is at the core of qualitative research, although its importance is seen
differently in the various approaches. Sometimes, for example, in objective hermeneutics and
conversation analysis, research refrains from using specific methods for data collection beyond
making recordings of everyday situations. In these cases, the use of research methods consists of
applying methods for the interpretation of text. In other approaches, it is a secondary step
following more or less refined techniques of data collection. This is the case, for example, with
qualitative content analysis or with some methods of handling narrative data. In grounded theory
research, the interpretation of data is the core of the empirical procedure, which, however,
includes explicit methods of data collection. The interpretation of texts serves to develop the
theory as well as the foundation for collecting additional data and for deciding which cases to
select next. Therefore, the linear process of first collecting the data and later interpreting it is
given up in favor of an interwoven procedure. Interpretation of texts may pursue two opposite
goals. One is to reveal and uncover statements or to put them into their context in the text that
normally leads to an augmentation of the textual material; for short passages in the original text,
page-long interpretations are sometimes written. The other aims at reducing the original text by
paraphrasing, summarizing, or categorizing. These two strategies are applied either alternatively
or successively. In summary, we can distinguish two basic strategies in working with texts.
Coding the material has the aim of categorizing and/or theory development. The more or less
strictly sequential analysis of the text aims at reconstructing the structure of the text and of the
case.
i. Grounded Theory Coding
Grounded theory coding (Charmaz 2006 uses this generic term for covering the different
approaches) is the procedure for analyzing data that have been collected in order to develop a
grounded theory. As already mentioned, in this approach the interpretation of data cannot be
regarded independently of their collection or the sampling of the material. Interpretation is the
anchoring point for making decisions about which data or cases to integrate next in the analysis
and how or with which methods they should be collected (see also Chapter 32). In the years since
the publication of the first introductory text by Glaser and Strauss (1967), proliferation of the
approaches in the field has led to debates and distinctions about the right way to grounded theory
coding. Therefore it makes sense to briefly outline how coding proceeds in some of the different versions that exist.
Strauss and Corbin's Approach to Coding
In the process of interpretation, as Strauss (1987) and Strauss and Corbin (1990) characterize it, a
number of "procedures" for working with text can be differentiated. They are termed "open
coding," "axial coding," and "selective coding." You should see these procedures neither as
clearly distinguishable procedures nor as temporally separated phases in the process. Rather, they
are different ways of handling textual material between which the researchers move back and
forth if necessary and which they combine. But the process of interpretation begins with open
coding, whereas towards the end of the whole analytical process, selective coding comes more to
the fore. Coding here is understood as representing the operations by which data are broken
down, conceptualized, and put back together in new ways. It is the central process by which
theories are built from data.
According to this understanding, coding includes the constant comparison of phenomena, cases,
concepts, and so on, and the formulation of questions that are addressed to the text. Starting from
the data, the process of coding leads to the development of theories through a process of
abstraction. Concepts or codes are attached to the empirical material. They are formulated first as
closely as possible to the text, and later more and more abstractly. Categorizing in this procedure
refers to the summary of such concepts into generic concepts and to the elaboration of the
relations between concepts and generic concepts or categories and superior concepts. The
development of theory involves the formulation of networks of categories or concepts and the
relations between them. Relations may be elaborated between superior and inferior categories
(hierarchically) but also between concepts at the same level. During the whole process,
impressions, associations, questions, ideas, and so on are noted in memos, which complement
and explain the codes that were found.
Open Coding
This first step aims at expressing data and phenomena in the form of concepts. For this purpose,
data are first disentangled ("segmented"). Units of meaning classify expressions (single words,
short sequences of words) in order to attach annotations and "concepts" (codes) to them. This
procedure cannot be applied to the whole text of an interview or an observation protocol. Rather,
it is used for particularly instructive or perhaps extremely unclear passages. Often the beginning
of a text is the starting point. This procedure serves to elaborate a deeper understanding of the
text. Sometimes dozens of codes result from open coding (Strauss and Corbin 1990, p. 113). The
next step in the procedure is to categorize these codes by grouping them around phenomena
discovered in the data, which are particularly relevant to the research question. The resulting
categories are again linked to codes, which are now more abstract than those used in the first
step. Codes now should represent the content of a category in a striking way and above all should
offer an aid to remembering the reference of the category. Possible sources for labeling codes are
concepts borrowed from the social science literature (constructed codes) or taken from
interviewees' expressions (in vivo codes). Of the two types of code, the latter are preferred
because they are closer to the studied material. The categories found in this way are then further
developed. To this end the properties belonging to a category are labeled and dimensionalized
(i.e., located along a continuum in order to define the category more precisely regarding its
content): To explain more precisely what we mean by properties and dimensions, we provide another example using the concept of "color". Its properties include shade, intensity, hue, and so on. Each of these properties can be dimensionalized. Thus, color can vary in shade from dark to
light, in intensity from high to low, and in hue from bright to dull. Shade, intensity, and hue are
what might be called general properties.
Open coding may be applied in various degrees of detail. A text can be coded line by line,
sentence by sentence, or paragraph by paragraph, or a code can be linked to whole texts (a
protocol, a case, etc.). Which of these alternatives you should apply depends on your research
question, on your material, on your personal style as analyst, and on the stage that your research
has reached. It is important not to lose touch with the aims of coding. The main goal is to break
down and understand a text and to attach and develop categories and put them into an order in
the course of time. Open coding aims at developing substantial codes describing, naming, or
classifying the phenomenon under study or a certain aspect of it. Strauss and Corbin summarize
open coding as follows: Concepts are the basic building blocks of theory. Open coding in
grounded theory method is the analytic process by which concepts are identified and developed
in terms of their properties and dimensions. The basic analytic procedures by which this is
accomplished are: the asking of questions about the data; and the making of comparisons for
similarities and differences between each incident, event and other instances of phenomena.
Similar events and incidents are labelled and grouped to form categories.
The result of open coding should be a list of the codes and categories attached to the text. This
should be complemented by the code notes that were produced for explaining and defining the
content of codes and categories, and a multitude of memos, which contain striking observations
on the material and thoughts that are relevant to the development of the theory.
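The outcome of open coding described here (codes attached to text segments, grouped into categories, accompanied by memos) can be sketched as a simple data structure. All segments, codes, and category names below are invented examples, not results from an actual study.

```python
from collections import defaultdict

# Illustrative result of open coding: each text segment carries a code
# formulated close to the text; codes are then grouped into more
# abstract categories. Every label here is invented for the example.
codings = [
    ("I never know which button to press", "uncertainty with devices"),
    ("my son sets up the recorder for me", "delegating technology use"),
    ("the manual might as well be Greek", "uncertainty with devices"),
]

# Grouping codes around phenomena relevant to the research question
categories = {
    "uncertainty with devices": "technology anxiety",
    "delegating technology use": "coping strategies",
}

by_category = defaultdict(list)
for segment, code in codings:
    by_category[categories[code]].append((code, segment))

# Memos complement the codes with observations relevant to theory building
memos = ["Uncertainty seems tied to written instructions, not to devices as such."]
```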
For both open coding and the other coding strategies it is suggested that the researchers regularly
address the text with the following list of so-called basic questions:
 What? What is the issue here? Which phenomenon is mentioned?
 Who? Which persons, actors are involved? Which roles do they play? How do they
interact?
 How? Which aspects of the phenomenon are mentioned (or not mentioned)?
 When? How long? Where? Time, course, and location.
 How much? How strong? Aspects of intensity.
 Why? Which reasons are given or can be reconstructed?
 What for? With what intention, to which purpose?
 By which? Means, tactics, and strategies for reaching the goal.
By asking these questions, the text will be opened up. You may address them to single passages,
but also to whole cases. In addition to these questions, comparisons between the extremes of a
dimension ("flip-flop technique") or to phenomena from completely different contexts and a
consequent questioning of self-evidence ("waving-the-red-flag technique") are possible ways for
further untangling the dimensions and contents of a category.
Axial Coding
After identifying a number of substantive categories, the next step is to refine and differentiate
the categories resulting from open coding. As a second step, Strauss and Corbin suggest doing a
more formal coding for identifying and classifying links between substantive categories. In axial
coding, the relations between categories are elaborated. In order to formulate such relations,
Strauss and Corbin (1998, p. 127) suggest a coding paradigm model. This very simple and, at the
same time, very general model serves to clarify the relations between a phenomenon, its causes
and consequences, its context, and the strategies of those who are involved. This model is based
on two axes: one goes from causes to phenomena and to consequences, the other one links
context, intervening conditions, and action and interactional strategies of participants to the
phenomenon. The concepts included in each category can become
 a phenomenon for this category and/or
 the context or conditions for other categories, or, for a third group of categories,
 a consequence.
It is important to note that the coding paradigm only names possible relations between
phenomena and concepts and is used to facilitate the discovery or establishment of structures of
relations between phenomena, between concepts, and between categories. Here as well, the
questions addressed to the text and the comparative strategies mentioned above are employed
once again in a complementary way. The developed relations and the categories that are treated
as essential are repeatedly verified against the text and the data. The researcher moves
continuously back and forth between inductive thinking (developing concepts, categories, and
relations from the text) and deductive thinking (testing the concepts, categories, and relations
against the text, especially against passages or cases that are different from those from which
they were developed). Axial coding is summarized as follows:
Axial coding is the process of relating subcategories to a category. It is a complex process of
inductive and deductive thinking involving several steps. These are accomplished, as with open
coding, by making comparisons and asking questions. However, in axial coding the use of these
procedures is more focused, and geared toward discovering and relating categories in terms of
the paradigm model.
In axial coding, the categories that are most relevant to the research question are selected from
the developed codes and the related code notes. Many different passages in the text are then
sought as evidence of these relevant codes in order to elaborate the axial category on the basis of
the questions mentioned above. In order to structure the intermediate results (means-end, cause-
effect, temporal, or local) relations are elaborated between the different axial categories by using
the parts of the coding paradigm mentioned above. From the multitude of categories that were
originated, those are selected that seem to be most promising for further elaboration. These axial
categories are enriched by their fit with as many passages as possible. For further refining, the
questions and comparisons mentioned above are employed.
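The coding paradigm's two axes can be sketched as a small data structure linking a central phenomenon to its causes, consequences, context, intervening conditions, and strategies. The phenomenon and all category labels below are invented for illustration.

```python
# Illustrative rendering of the coding paradigm model around one
# (invented) central phenomenon from a study of everyday technology use.
paradigm = {
    "phenomenon": "avoiding new household technology",
    "causal_conditions": ["bad prior experience with a device"],
    "context": ["living alone", "limited budget"],
    "intervening_conditions": ["children living nearby"],
    "strategies": ["asking family members for help"],
    "consequences": ["growing dependence on relatives"],
}

# First axis: causes -> phenomenon -> consequences. The second axis would
# link context, intervening conditions, and strategies to the phenomenon.
def axis_cause_consequence(p):
    return p["causal_conditions"], p["phenomenon"], p["consequences"]
```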
Selective Coding
The third step, selective coding, continues the axial coding at a higher level of abstraction. This step elaborates the development and integration of the emerging theory in comparison with other groups and focuses
on potential core concepts or core variables. In this step you will look for further examples and
evidence for relevant categories. This then leads to an elaboration or formulation of the story of
the case. At this point, Strauss and Corbin conceive the issue or the central phenomenon of the
study as a case and not a person or a single interview. You should bear in mind here that the aim
of this formulation is to give a short descriptive overview of the story and the case and should
therefore comprise only a few sentences. The analysis goes beyond this descriptive level when
the story line is elaborated—a concept is attached to the central phenomenon of the story and
related to the other categories. In any case, the result should be one central category and one
central phenomenon. The analyst must decide between equally salient phenomena and weigh
them, so that one central category results together with the subcategories which are related to it.
The core category again is developed in its features and dimensions and linked to (all, if
possible) other categories by using the parts and relations of the coding paradigm. The analysis
and the development of the theory aim at discovering patterns in the data as well as the
conditions under which these apply. Grouping the data according to the coding paradigm
allocates specificity to the theory and enables the researcher to say, "Under these conditions
(listing them) this happens; whereas under these conditions, this is what occurs". Finally, the
theory is formulated in greater detail and again checked against the data. The procedure of
interpreting data, like the integration of additional material, ends at the point where theoretical
saturation has been reached (i.e., further coding, enrichment of categories, and so on no longer
provide or promise new knowledge). At the same time, the procedure is flexible enough that the
researcher can re-enter the same source texts and the same codes from open coding with a
different research question and aim at developing and formulating a grounded theory of a
different issue.
Juliet Corbin and Anselm Strauss have further developed the approach of grounded theory
coding and applied it in many studies in the context of nursing and medical sociology in the
1980s and since. In one of their more recent studies, Corbin and Strauss (1988) applied their
methodology to the study of how people experiencing a chronic illness and their relatives deal with this serious illness and manage to conduct their personal lives. The
empirical basis of this study is a number of intensive interviews with such couples at home and
at work. These were undertaken to identify the problems these couples face in their personal
lives in order to answer the question: "How can the chronically ill be helped to manage their
illnesses more effectively?". Different from early conceptualizations of grounded theory research
in which it was suggested not to develop a theoretical framework and understanding of the issue
under study (e.g., in Glaser and Strauss 1967), the authors here start with an extensive
presentation of the theoretical tools used in their study, which builds on previous empirical work
by the same researchers. The main concept in the research is trajectory. This refers to the course
of the illness as well as to the work of the people who attempt to control and shape this course.
Corbin and Strauss identify several stages—trajectory phases—that are labeled as acute,
comeback, stable, unstable, deteriorating, and dying stages of illness. In the theoretical
framework, the authors analyze how a chronically ill member of a family changes the life plans
of families and focus on biographical processes with which the victims try to manage and come
to terms with the illness. In the second part of their book, the authors use this theoretical
framework to analyze the various trajectory phases in greater detail.
This is not only one of the most important studies in the field of everyday management of
chronic illness. It is also very fruitful in developing and differentiating a theoretical framework
for this issue, which goes beyond existing concepts of coping, adjustment, and stress. Rather, the
authors develop from their empirical work a much more elaborate concept (trajectory) for
analyzing the experience of their research partners. They achieve this by analyzing the different
stages of trajectory by asking a set of questions: "What are the different types of work? How do
they get done? How do the central work processes and interactional developments enter into
getting the work done? What are the biographical processes that accompany and affect those
matters?". All in all, this study is a very good example of how the research strategy developed
by Glaser, Strauss, and Corbin in several steps can be used for analyzing a theoretically and
practically relevant issue.
Glaser's Approach: Theoretical Coding
In many parts, Glaser and Strauss go the same way when analyzing the material. Glaser (1992)
more recently has criticized the way Strauss and Corbin have elaborated their approach and in
particular the coding paradigm and the idea of axial coding. Mainly he sees this as forcing a
structure on the data instead of discovering what emerges as structure from the data and the
analysis. In his version, open coding is the first step, too. But then he advances in a different
way. As an instrument for coding material more formally and in a theoretically relevant way,
Glaser (1978) has suggested a list of basic codes, which he grouped as coding families. These
families are sources for defining codes and at the same time an orientation for searching new
codes for a set of data. The third step is again selective coding, although in his earlier textbook,
selective coding comes before theoretical coding based on the coding families. As Kelle (2007)
holds, the list of coding families can be a heuristic tool for advancing an understanding of the
material. However, he criticizes the lack of internal logic in the set of coding families and states
that there is a lot of background knowledge implicit to the families.
Charmaz's Approach to Coding in Grounded Theory Research
Kathy Charmaz is currently one of the leading researchers in the field of grounded
theory. She develops an alternative view of the procedure in the development of grounded
theory. Charmaz suggests doing open coding line by line, because it "also helps you to refrain
from imputing your motives, fears, or unresolved personal issues to your respondents and to your
collected data". After line-by-line coding at the beginning, she continues by exploring some of
the resulting codes more deeply. Charmaz's second step is called focused coding, in which the
most significant or frequent initial codes—for example, "avoiding disclosure" and "assessing
potential losses and risks of disclosing"—are used to sort and integrate larger amounts of data.
All three versions discussed here treat open coding as an important step. All see theoretical
saturation as the goal and end point of coding. They all base their coding and analysis on
constant comparison between materials (cases, interviews, statements, etc.). Glaser still holds the
idea of emerging categories and discovery as epistemological principle. Charmaz (2006) sees the
whole process more as "constructing grounded theory". All see a need for also developing formal
categories and links.
What Is the Contribution to the General Methodological Discussion?
This method aims at a consistent analysis of texts. The combination of open coding with
increasingly focused procedures can contribute to the development of a deeper understanding of
the content and meaning of the text beyond paraphrasing and summarizing it. The interpretation
of texts here is methodologically realized and manageable. This approach allows room for
maneuvering through the different techniques and flexibility in formulating rules. It differs from
other methods of interpreting texts because it leaves the level of the pure texts during the
interpretation in order to develop categories and relations, and thus theories. Finally, the method
combines an inductive approach with an increasingly deductive handling of text and categories.
How Does the Method Fit into the Research Process?
The procedure outlined here is the main part of the research process that aims at developing
theories. In terms of theoretical background, symbolic interactionism has very strongly
influenced this approach. The material is selected according to theoretical sampling. Research
questions and the development state of the emerging theory orient the selection of data collection
methods. Which methods should be used for collecting data is not determined beyond that. First,
generalization aims at grounded theories, which should be related directly to the data and finally
at formal theories that are valid beyond the original contexts. Integrating grounded theories
developed in other contexts in the study allows the testing of formal theories.
What Are the Limitations of the Method?
One problem with this approach is that the distinction between method and art becomes hazy.
This makes it in some places difficult to teach as a method. Often, the extent of the advantages
and strengths of the method only become clear in applying it. A further problem is the potential
endlessness of options for coding and comparisons. You could apply open coding to all passages
of a text, and you could further elaborate all the categories you found, which in most
cases are very numerous. Passages and cases could be endlessly compared to each other.
Theoretical sampling could endlessly integrate further cases. The method gives you few hints
about what the selection of passages and cases should be oriented to and what criteria the end of
coding (and sampling) should be based on. The criterion of theoretical saturation leaves it to the
theory developed up to that moment, and thus to the researcher, to make such decisions of
selection and ending.
One consequence is that often a great many codes and potential comparisons result. One
pragmatic solution for this potential infinity is to make a break, to balance what was found, and
to build a list of priorities. Which codes should you definitely elaborate further, which codes
seem to be less instructive, and which can you leave out with respect to your research question?
The further procedure may be designed according to this list of priorities. Not only for further
grounding such decisions, but also, in general, it has proved helpful to analyze texts with this
procedure in a group of interpreters. Then you can discuss the results among the members and
mutually check them.
ii. Thematic Coding
I have developed this procedure against the background of Strauss (1987) for comparative
studies in which the groups under study are derived from the research question and thus defined
a priori. The research issue is the social distribution of perspectives on a phenomenon or a
process. The underlying assumption is that in different social worlds or groups, differing views
can be found. In order to assess this assumption and to develop a theory of such groups' specific
ways of seeing and experiencing, it is necessary to modify some details of Strauss's procedure in
order to increase the comparability of the empirical material. Sampling is oriented to the groups
whose perspectives on the issue seem to be most instructive for analysis, and which therefore are
defined in advance and not derived from the state of interpretation, as in Strauss's procedure.
Theoretical sampling is applied in each group in order to select the concrete cases to be studied.
The collection of data is correspondingly conducted with a method which seeks to guarantee
comparability by defining topics, and at the same time remaining open to the views related to
them. For example, this may be achieved with the episodic interview in which topical domains
are defined, concerning the situations to be recounted, which are linked to the issue of the study,
or with other forms of interviews.
What Is the Procedure of Thematic Coding?
In the interpretation of the material, thematic coding is applied as a multi-stage procedure—
again, with respect to the comparability of the analyses. The first step addresses the cases
involved, which are interpreted in a series of case studies. As a first orientation, you will produce
a short description of each case, which you will continuously recheck and modify if necessary
during the further interpretation of the case. This case description includes several elements. The
first is a statement which is typical for the interview—the motto of the case. A short description
should provide information about the person with regard to the research question (e.g., age,
profession, number of children, if these are relevant for the issue under study). Finally, the
central topics mentioned by the interviewee concerning the research issue are summarized. After
finishing the case analysis, this case profile forms part of the results, perhaps in a revised form.
An example of such a thematic structure, taken from the study on technological change in
everyday life mentioned previously, is given below:
Example of the Thematic Structure of Case Analyses in Thematic Coding
 1 First encounter with technology
 2 Definition of technology
 3 Computer
 3.1 Definition
 3.2 First encounter(s) with computers
 3.3 Professional handling of computers
 3.4 Changes in communication due to computers
 4 Television
 4.1 Definition
 4.2 First encounter(s) with television
 4.3 Present meaning
 5 Alterations due to technological change
 5.1 Everyday life
 5.2 Household equipment
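To make comparisons across cases manageable, such a structure can also be kept in machine-readable form, for instance as a nested mapping under which coded passages are filed per case. This is only an illustrative sketch: the domain labels follow the example above, while the helper function and the passages checked against it are invented.

```python
# The thematic structure from the example above as a nested mapping of
# domains to subdomains (an empty list means the domain has no subdomains).
THEMATIC_STRUCTURE = {
    "First encounter with technology": [],
    "Definition of technology": [],
    "Computer": ["Definition", "First encounter(s) with computers",
                 "Professional handling of computers",
                 "Changes in communication due to computers"],
    "Television": ["Definition", "First encounter(s) with television",
                   "Present meaning"],
    "Alterations due to technological change": ["Everyday life",
                                                "Household equipment"],
}

def valid_domain(domain, subdomain=None):
    """Check that a coded passage refers to a domain/subdomain of the structure."""
    if domain not in THEMATIC_STRUCTURE:
        return False
    return subdomain is None or subdomain in THEMATIC_STRUCTURE[domain]

# A (hypothetical) coded passage must fit an existing slot of the structure:
print(valid_domain("Computer", "Definition"))   # an existing subdomain
print(valid_domain("Radio"))                    # not part of the structure
```

A check like this mirrors the requirement in the text that the structure be continually assessed against further cases: a passage that fits no slot signals that the structure may need modification.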
This structure was developed from the first cases and continually assessed for all further cases. It
is modified if new or contradictory aspects emerge. It is used to analyze all cases that are part of
the interpretation. For a fine interpretation of the thematic domains, single passages of the text
(e.g., narratives of situations) are analyzed in greater detail. The coding paradigm suggested by
Strauss (1987, pp. 27-28) is taken as a starting point for deriving the following key questions for
the analysis:
 Conditions: Why? What has led to the situation? Background? Course?
 Interaction among the actors: Who acted? What happened?
 Strategies and tactics: Which ways of handling situations, e.g., avoidance, adaptation?
 Consequences: What changed? Consequences, results?
The result of this process is a case-oriented display of the way the case specifically deals with the issue
of the study, including constant topics (e.g., strangeness of technology) that can be found in the
viewpoints across different domains (e.g., work, leisure, household). The developed thematic
structure also serves for comparing cases and groups (i.e., for elaborating correspondences and
differences between the various groups in the study). Thus, you analyze and assess the social
distribution of perspectives on the issue under study. For example, after the case analyses have
shown that the subjective definition of technology is an essential thematic domain for
understanding technological change, it is then possible to compare the definitions of technology
and the related coding from all cases.
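Comparing the distribution of codes across predefined groups, as described here, amounts to a simple cross-tabulation. The coded passages below are invented; only the counting logic is the point of the sketch.

```python
from collections import Counter

# Hypothetical (group, case, code) triples, as a researcher might have
# produced when coding the domain "definition of technology" per case.
codings = [
    ("engineers", "case01", "technology as tool"),
    ("engineers", "case02", "technology as tool"),
    ("teachers",  "case03", "technology as threat"),
    ("teachers",  "case04", "technology as tool"),
]

# Cross-tabulate codes by group to inspect the social distribution
# of perspectives on the issue under study.
by_group = {}
for group, _case, code in codings:
    by_group.setdefault(group, Counter())[code] += 1

for group in sorted(by_group):
    print(group, dict(by_group[group]))
```

Such a table only shows where group-specific correspondences and differences might lie; the interpretive work of elaborating them remains with the researcher.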
What Is the Contribution to the General Methodological Discussion?
This procedure specifies Strauss's (1987) approach to studies, which aim at developing a theory
starting from the distribution of perspectives on a certain issue or process. Group-specific
correspondences and differences are identified and analyzed. In contrast to Strauss's procedure,
conduct case analyses in the first step. Only in the second step will you undertake group
comparisons beyond the single case. By developing a thematic structure, which is grounded in
the empirical material for the analysis and comparison of cases, the comparability of
interpretations increases. At the same time, the procedure remains sensitive and open to the
specific contents of each individual case and the social group with regard to the issue under
study.
How Does the Method Fit into the Research Process?
The theoretical background is the diversity of social worlds as assumed in the concept of social
representations or more generally by constructivist approaches. Research questions focus on the
analysis of the variety and distribution of perspectives on issues and processes in social groups.
Cases are involved for specific groups. In addition, elements of theoretical sampling are used for
the selection in the groups. Data are collected with methods that combine structuring inputs and
openness with regard to contents. Generalization is based on comparisons of cases and groups
and aims at the development of theories.
What Are the Limitations of the Method?
The procedure is above all suitable for studies in which theoretically based group comparisons
are to be conducted in relation to a specific issue. Therefore, the scope for a theory to be
developed is more restricted than in Strauss's (1987) procedure. The analysis of texts consists of
coding statements and narratives in categories, which are developed from the material. It is
oriented to elaborating correspondences and differences between the groups defined in advance.
These correspondences and differences are demonstrated on the basis of the distribution of codes
and categories across the groups that are studied. The analysis plunges deep into text and case
studies in the first step. If the intermediate step is to be conducted consistently, the procedure
may become somewhat time-consuming.
3. Analyzing Conversation, Discourse, and Genres
If you want to understand and analyze statements, it is necessary to take into account the context
in which they occur. Context here refers to both the discursive context and the local context in
the interaction. This notion is more or less unarguable in qualitative research. For this reason, in
qualitative interviews open-ended questions are asked, which encourage the respondents to say
more rather than less and in doing so produce enough textual material for the researcher to
analyze in terms of contextual considerations. In analyzing data, coding is open for this reason, at
least in the beginning. The interpretative procedures discussed in the preceding chapter
increasingly strip away the gestalt of the text in the course of the rearrangement of statements
into categories. As an alternative to this approach, one finds approaches that pay more attention
to the gestalt of the text. Therefore, these approaches "let themselves be guided by the principle
of sequential analysis.... The sequential analysis puts the idea of social order, which reproduces
itself in the performance of the interaction, into methodological terms". Such approaches are
guided by the assumption either that order is produced turn by turn (conversation analysis), or
that meaning accumulates in the performance of activity (objective hermeneutics) and that
contents of interviews are presented in a reliable way only if they are presented in the gestalt of a
narrative (narrative analysis). In each case, a specific form of context sensitivity is the methodological
principle.
Conversation Analysis
Conversation analysis is less interested in interpreting the content of texts that have been
explicitly produced for research purposes, for instance interview responses. Rather it is interested
in the formal analysis of everyday situations. Bergmann outlines this approach, which may be
considered to be the mainstream of ethnomethodological research, as follows:
Conversation Analysis (or CA) denotes a research approach dedicated to the investigation, along
strictly empirical lines, of social interaction as a continuing process of producing and securing
meaningful social order. CA proceeds on the basis that in all forms of linguistic and non-
linguistic, direct and indirect communication, actors are occupied with the business of analyzing
the situation and the context of their actions, interpreting the utterances of their interlocutors,
producing situational appropriateness, intelligibility and effectiveness in their own utterances, and
coordinating their own dealings with the dealings of others. The goal of this approach is to
determine the constitutive principles and mechanisms by means of which actors, in the
situational completion of their actions and in reciprocal reaction to their interlocutors, create the
meaningful structures and order of a sequence of events and of the activities that constitute these
events. In terms of method CA begins with the richest possible documentation - with audio-
visual recording and subsequent transcription - of real and authentic social events, and breaks
these down, by a comparative-systematic process of analysis, into individual structural principles
of social interaction as well as the practices used to manage them by participants in an
interaction.
In this way, emphasis is placed less on the analysis of the contents of a conversation and more on
the formal procedures through which the contents are communicated and certain situations are
produced. One starting point was the work of Sacks, Schegloff, and Jefferson (1974) on the
organization of turn taking in conversations. Another point of departure was the work of
Schegloff and Sacks (1974) in explaining closings in conversations. First, conversation analysis
assumes that interaction proceeds in an orderly way and nothing in it should be regarded as
random. Second, the context of interaction not only influences this interaction but also is
produced and reproduced in it. And third, the decision about what is relevant in social interaction
and thus for the interpretation can only be made through the interpretation and not by ex ante
settings.
Drew (1995, pp. 70-72) has outlined a series of methodological precepts for conversation
analysis (CA) shown below:
Methodological Precepts for Conversation Analytic Studies
 Turns at talk are treated as the product of the sequential organization of talk, of the
requirement to fit a current turn, appropriately and coherently, to its prior turn.
 In referring ... to the observable relevance of error on the part of one of the participants ...
we mean to focus analysis on participants' analyses of one another's verbal conduct.
 By the "design" of a turn at talk, we mean to address two distinct phenomena: (1) the
selection of an activity that a turn is designed to perform; and (2) the details of the verbal
construction through which the turn's activity is accomplished.
 A principal objective of CA research is to identify those sequential organizations or
patterns ... which structure verbal conduct in interaction.
 The recurrences and systematic basis of sequential patterns or organizations can only be
demonstrated ... through collections of cases of the phenomena under investigation.
 Data extracts are presented in such a way as to enable the reader to assess or challenge
the analysis offered.
Research in conversation analysis was at first limited to everyday conversation in a strict sense
(e.g., telephone calls, gossip, or family conversations in which there is no specific distribution of
roles). However, it is now becoming increasingly occupied with specific role distributions and
asymmetries like counseling conversation, doctor-patient interactions, and trials (i.e.,
conversations occurring in specific institutional contexts). The approach has also been extended
to include analysis of written texts, mass media, or reports—text in a broader sense.
The Procedure of Conversation Analysis
Ten Have (1999, p. 48) suggests the following steps for research projects using conversation
analysis as a method:
 getting or making recordings of natural interaction;
 transcribing the tapes, in whole or in part;
 analyzing selected episodes; and
 reporting the research.
The procedure of conversation analysis of the material itself includes the following steps. First,
you identify a certain statement or series of statements in transcripts as a potential element of
order in the respective type of conversation. The second step is that you assemble a collection of
cases in which this element of order can be found. You will then specify how this element is
used as a means for producing order in interactions and for which problem in the organization of
interactions it is the answer. This is followed by an analysis of the methods with which those
organizational problems are dealt with more generally. Thus, a frequent starting point for
conversation analyses is to inquire into how certain conversations are opened and which
linguistic practices are applied for ending these conversations in an ordered way.
An essential feature of conversation analytic interpretation is the strictly sequential procedure
(i.e., ensuring that no later statements or interactions are consulted for explaining a certain
sequence). Rather, the order of the occurrence must show itself in understanding it sequentially.
The turn-by-turn production of order in the conversation is clarified by an analysis, which is
oriented to this sequence of turns. Another feature is the emphasis on context. This means that
the efforts in producing meaning or order in the conversation can only be analyzed as local
practices; that is, only related to the concrete contexts in which they are embedded in the
interaction and in which the interaction again is embedded (e.g., institutionally). Analyses always
start from the concrete case, its embedding, and its course to arrive at general statements. Ten
Have (1999, p. 104) suggests, in conjunction with Schegloff's work, three steps for the analysis of
repair in conversation, for example. Adjacency pairs mean that a specific contribution to a
conversation often has to be followed by another specific reaction— a question by an answer,
opening a telephone conversation by "hello" followed by a greeting from the other participant,
and so on. Repair means the way people start a repair organization in cases of comprehension
problems in a conversation. According to Ten Have, you should proceed in the following steps:
 Check the episode carefully in terms of turn taking: the construction of turns, pauses,
overlaps, etc.; make notes of any remarkable phenomenon, especially on any "disturbances"
in the fluent working of the turn-taking system.
 Then look for sequences in the episode under review, especially adjacency pairs and their
sequels.
 And finally, note any phenomena of repair, such as repair initiators, actual repairs, etc.
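As a toy illustration of the second step (looking for adjacency pairs), a transcript given as (speaker, utterance) turns can be scanned for question turns followed by a turn from a different speaker. The function and transcript are invented; real conversation analysis works on far richer transcripts that record pauses, overlaps, and intonation.

```python
def adjacency_pair_candidates(turns):
    """Pair each question turn with the next turn by a different speaker."""
    pairs = []
    for (spk1, u1), (spk2, u2) in zip(turns, turns[1:]):
        if u1.rstrip().endswith("?") and spk2 != spk1:
            pairs.append(((spk1, u1), (spk2, u2)))
    return pairs

transcript = [
    ("A", "Hello?"),                    # telephone opening
    ("B", "Hi, it's me."),              # greeting in return
    ("B", "Are you coming tonight?"),   # first pair part: question
    ("A", "Yes, around eight."),        # second pair part: answer
]
for first, second in adjacency_pair_candidates(transcript):
    print(first, "->", second)
```

The sketch only surfaces candidate pairs; deciding whether a pairing actually functions as an adjacency pair, and how repair is organized when the expected second part is missing, is the analytic work proper.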
What Is the Contribution to the General Methodological Discussion?
Conversation analysis and the empirical results that have been obtained by applying it explain
the social production of everyday conversations and specific forms of discourse. The results
document the linguistic methods that are used in these discourses. Furthermore, they show the
explanatory strength of the analysis of natural situations and how a strictly sequential analysis
can provide findings which accord with and take into account the compositional logic of social
interaction.
How Does the Method Fit into the Research Process?
The theoretical background of conversation analysis is ethnomethodology. Research questions
focus on members' formal procedures for constructing social reality. Empirical material is
selected as a collection of examples of a process to be studied. Research avoids using explicit
methods for collecting data in favor of recording everyday interaction processes as precisely as
possible.
What Are the Limitations of the Method?
Formal practices of organizing interaction remain the point of reference for analyses here.
Subjective meaning or the participants' intentions are not relevant to the analysis. This lack of
interest in the contents of conversations in favor of analyzing how the "conversation machine"
functions, which is at the forefront of many conversation analytic studies, has been repeatedly
criticized. Another point of critique is that conversation analytic studies often get lost in the
formal detail—they isolate smaller and smaller particles and sequences from the context of the
interaction as a whole. This is reinforced by the extreme exactness in producing transcripts.
Discourse Analysis
Discourse analysis has been developed from different backgrounds, one of which was
conversation analysis. There are different versions of discourse analysis available now.
Discursive psychology as developed by Edwards and Potter (1992), Harre (1998), and Potter and
Wetherell (1998) is interested in showing how, in conversations, "participants' conversational
versions of events (memories, descriptions, formulations) are constructed to do communicative
interactive work".
Although conversation analysis is named as a starting point, the empirical focus is more on the
"content of talk, its subject matter and with its social rather than linguistic organization". This
allows the analysis of psychological phenomena like memory and cognition as social, and above
all, discursive phenomena. A special emphasis is on the construction of versions of the events in
reports and presentations. The "interpretative repertoires", which are used in such constructions,
are analyzed. Discourse analytic procedures refer not only to everyday conversations, but also to
other sorts of data such as interviews or media reports. The research process in discursive
psychology starts with using naturally occurring text and talk, followed by a careful reading of
the transcripts. This is then followed by coding the material and analyzing it. According to Potter and
Wetherell, guiding questions are: Why am I reading this passage in this way? What features of
the text produce this reading? The analysis focuses on context, variability, and constructions in
the text and finally on the interpretative repertoires used in the texts. The last step, according to
Willig, is writing up discourse analytic research. Writing should be part of the analysis and
return the researcher back to the empirical material.
Note that there has been a differentiation in discourse analysis in the last few years. Parker (for
example, 2004) has developed a model of critical discourse analysis, built on the background
developed by Michel Foucault (e.g., Foucault 1980) which is why this is also referred to as
"Foucauldian Discourse Analysis" (e.g., in Willig 2003). Here issues of critique, of ideology, and
of power are more in focus than in other versions of discourse analysis. Parker suggests a
number of steps in the research process:
 The researcher should turn the text to be analyzed into written form, if it is not already.
 The next step includes free association to varieties of meaning as a way of accessing cultural
networks, and these should be noted down.
 The researchers should systematically itemize the objects, usually marked by nouns, in the
text or selected portion of text.
 They should maintain a distance from the text by treating the text itself as the object of the
study rather than what it seems to "refer" to.
 Then they should systematically itemize the "subjects"—characters, persona, role positions
—specified in the text.
 They should reconstruct presupposed rights and responsibilities of "subjects" specified in
the text.
 Finally, they should map the networks of relationships into patterns. These patterns in
language are "discourses," and can then be located in relations of ideology, power, and
institutions.
What is the Contribution to the General Methodological Discussion?
Discourse analytic studies analyze issues that are closer to the topics of social sciences than those
of conversation analysis. They combine language analytic proceedings with analyses of
processes of knowledge and constructions without restricting themselves to the formal aspects of
linguistic presentations and processes.
How Does the Method Fit into the Research Process?
The theoretical background of discourse analysis is social constructionism. Research questions
focus on how the making of social reality can be studied in discourses about certain objects or
processes. Empirical material ranges from media articles to interviews. Interpretations are based
on transcripts of those interviews or the texts to be found.
What Are the Limitations of the Method?
Methodological suggestions on how to carry out discourse analyses remain rather imprecise and
implicit in most of the literature. Theoretical claims and empirical results are dominant in the
works published up to now.
Genre Analysis
A second development coming from conversation analysis is called genre analysis.
Communicative genres are socially rooted phenomena. Communicative patterns and genres are
viewed as institutions of communication with which interactants communicate with others. The
methodological steps include the recording of communicative events in natural situations and
their transcription. The next step is that these data are hermeneutically interpreted and subjected
to a sequential analysis before a conversation analysis is done with the material in order to show
the language's level of organization. From these two steps of analysis, structural models are set
up that are then tested for their appropriateness with further cases, before, in the last step, finally
structural variants are considered that come about as a result of modalization (irony, pejorative
forms, etc.). Examples for such communicative genres are irony, gossip, and the like. The
analysis of the data focuses first on the internal structure of communicative genres including:
 Prosody: intonation, volume, speech tempo, pauses, rhythm, accentuation, voice quality.
 Language variety: standard language, jargon, dialect, sociolect.
 Linguistic register: formal, informal, or intimate.
 Stylistic and rhetorical figures: alliteration, metaphor, rhythm, and so on.
 "Small" and "minimal forms": verbal stereotypes, idiomatic expressions, platitudes,
proverbs, categorical formulations, traditional historical formulae, inscriptions, and
puzzles.
 Motifs, topoi, and structural markers.
Finally, the external structure of communicative genres and the communicative economy of their
use are analyzed.
What Is the Contribution to the General Methodological Discussion?
Genre analytic studies analyze larger communicative patterns than does conversation analysis,
but use similar principles. Contrary to discourse analysis, they keep the focus on formal patterns
of communication and on contents. So, they combine the methodological rigor of conversation
analysis with a more content-oriented approach.
How Does the Method Fit into the Research Process?
The theoretical background of genre analysis is again social constructionism. Research questions
focus on how the making of social reality can be studied in the patterns that are used to
communicate about certain objects or processes and their function. Empirical material consists of
recordings of communication. Interpretations are based on transcripts of those recordings.
What Are the Limitations of the Method?
The definition of a communicative genre is less clear than that of other units of qualitative analysis. The
methodology is more comprehensive and more rigorous than other analytic approaches in
qualitative research, but comprises several methodological approaches (hermeneutic and
conversation analytic methods), which make the analysis rather complicated and time
consuming.
4. Narrative and Hermeneutic Analysis
Analyzing Narratives
Narrative analyses start from a specific sequential order. The individual statement that you wish
to interpret is first considered in terms of whether it is part of a narrative and is then analyzed.
Narratives are stimulated and collected in the narrative interview in order to reconstruct
biographical processes. More generally, life is conceptualized as narrative in order to analyze the
narrative construction of reality without necessarily using a procedure of data collection
explicitly aimed at eliciting narratives.
Analysis of Narrative Interviews for Reconstructing Events
In the literature you can find several suggestions for analyzing narrative interviews. The
"first analytic step (i.e., formal text analysis) is to eliminate all non-narrative passages from the
text and then to segment the 'purified' narrative text for its formal sections". A structural
description of the contents follows, specifying the different parts of narratives, which are signaled
by markers such as "and then" or pauses. The analytic abstraction—as a third step—moves away from the specific details
of the life segments. Instead its intention is to elaborate "the biographical shaping in toto, i.e., the
life historical sequence of experience-dominant processual structures in the individual life
periods up to the presently dominant processual structure". Only after this reconstruction of
patterns of process do you integrate the other, non-narrative, parts of the interview into the
analysis. Finally, the case analyses produced in this way are compared and contrasted to each
other. The aim is less to reconstruct the narrator's subjective interpretations of his or her life than
to reconstruct the "interrelation of factual processual courses".
Haupert (1991) outlines a different procedure. In preparation for the actual fine analysis, he first
draws up the narrator's short biography. This includes a chronological display of the "events
identified as meaningful" in the life history. This is followed by the segmentation of the
interviews according to Schütze's method and by formulating headings for the single sequences.
The identification of the "sequential thematic," and the attachment of quotations explaining it, is
the next step. Finally, the core of the biography with the central statements of the interview is
formulated. Paraphrases of statements from the text and the explication of the contexts and
milieus of the interviews lead to further abstraction. After condensing the case stories to core
stories, they are classified into types of processes. These types are related to life-world milieus.
This procedure also reconstructs the course of the biography from the course of the narrative.
This reconstruction of factual courses from biographical narratives starts from the "assumption
of homology." According to Bude, this includes the premise that "The autobiographical
unprepared extempore narrative is seen ... as a truly reproductive recapitulation of past
experience" (1985, pp. 331-332). Recently, this premise has been questioned, not only by Bude.
The constructions involved in narratives are now attracting more attention.
The Analysis of Narrative Data as Life Constructions
Accordingly, Bude (1984) outlines a different view on narratives, the data contained in them, and
their analysis by suggesting the "reconstruction of life constructions." Here he takes into account
that narratives, like other forms of presentation, include subjective and social constructions in
what is presented—life constructions in narrative interviews, for example. In a similar way,
authors in psychology, such as Bruner (1987), understand life histories as social constructions. In
their concrete shaping, they draw on basic cultural narratives and life histories offered by the
culture. The goal of analyzing narrative data is more to disclose these constructive processes and
less to reconstruct factual processes. Rosenthal and Fischer-Rosenthal (2004) see a difference
between a life story told in the interview and the life history which was lived by the interviewee.
They analyze narrative interviews in five steps: analysis of the biographical data; thematic field
analysis (reconstruction of the life story); reconstruction of the life history; microanalysis of
individual text segments; and contrastive comparison of life history and life story.
The Sequence of Stages in the Practical Analysis
• Analysis of biographical data (data of events)
• Text and thematic field analysis (sequential analysis of textual segments from the self-presentation in the interview)
• Reconstruction of the case history (life as lived)
• Detailed analysis of individual textual locations
• Contrasting the life story as narrated with life as lived
• Formation of types
Denzin outlines the procedure for such an interpretation as follows:
(1) Securing the interactional text; (2) displaying the text as a unit; (3) subdividing the text into
key experiential units; (4) linguistic and interpretive analysis of each unit; (5) serial unfolding
and interpretation of the meanings of the text to the participants; (6) development of working
interpretations of the text; (7) checking these hypotheses against the subsequent portions of the
text; (8) grasping the text as a totality; and (9) displaying the multiple interpretations that occur
within the text.
For analyzing narratives of families and the processes of constructing reality that take place in
them, Hildenbrand and Jahn (1988, p. 208) suggest the following sequential analytic procedure.
First, the "hard" social data of the family (birth, marriage, educational situation, stages in
professional life, etc.) are reconstructed from the narrative. Then they are interpreted with respect
to the room for decisions, compared to the decisions actually made. Then a hypothesis is
generated. This is systematically tested during further interpretation of the case of the studied
family. Two components, (1) the opening sequence of the narrative and (2) the "members' self-
presentation" evident within it, provide the basis for the analytic procedure. Sampling of further
cases follows. The case structures elaborated in the analyses can be contrasted, compared, and
generalized. The inspiration behind this procedure was the objective hermeneutics that will be
discussed next.
What Is the Contribution to the General Methodological Discussion?
Common to all the procedures for analyzing narrative data presented here is that in the
interpretation of statements they start from the gestalt of the narrative. In so doing they view the
statements in the context of the way the narrative proceeds. Furthermore, they include a formal
analysis of the material, indicating which passages of the text are narrative passages, and which
other sorts of text can be identified. The procedures differ in how they view the role of the
narrative in the analysis of the studied relations. Schütze sees the narrative presented in the
interview as a true representation of the events recounted. The other authors see narratives as a
special form of constructing events. This form can also be found in everyday life and knowledge
and so this mode of construction is particularly suited to research purposes. A characteristic
feature of narrative analysis is the combination of formal analysis and sequential procedure in the
interpretation of constructions of experiences.
How Does the Method Fit into the Research Process?
The theoretical background is the orientation to the analysis of subjective meaning. For this
purpose, narrative interviews are used for collecting data. Research questions focus on the
analysis of biographical processes. Cases are usually selected gradually, and generalizations
made in order to develop theories. Therefore, case analyses are contrasted with one another.
What Are the Limitations of the Method?
Above all, analyses based on Schütze's method tend to overstate the degree to which narratives
reproduce reality. The influence of the presentation on what is recounted is underestimated; the possible
inference from narrative to factual events in life histories is overestimated. Only in very rare
examples are narrative analyses combined with other methodological approaches in order to
overcome their limitations. A second problem is the degree to which analyses stick to individual
cases. The time and effort spent analyzing individual cases restricts studies from going beyond
the reconstruction and comparison of a few cases. The more general theory of biographical
processes that was originally aimed at has yet to be realized, although there are instructive
typologies in particular domains.
Objective Hermeneutics
Objective hermeneutics was originally formulated for analyzing natural interactions (e.g., family
conversations). Subsequently the approach has been used to analyze all sorts of other documents,
including even works of art and photographs. Schneider (1988) has modified this approach for
analyzing interviews. The general extension of the domain of objects of inquiry based on
objective hermeneutics is expressed by the fact that authors characteristically understand the
"world as text." This is indicated by the title of a volume of theoretical and methodological works
in this field. This approach makes a basic distinction between (1) the subjective meaning that a
statement or activity has for one or more participants and (2) its objective meaning. The latter is
understood by using the concept of a "latent structure of meaning." This structure can be
examined only by using the framework of a multi-step scientific procedure of interpretation. Due
to its orientation to such structures, the label "structural hermeneutics" has also been used.
What Is the Procedure of Objective Hermeneutics?
In the beginning, the aim was focused on the "reconstruction of objective meaning structures" of
texts. Analysis in objective hermeneutics was not interested in what the text producers thought,
wished, hoped, or believed when they produced their text. Subjective intentions linked to text are
held to be irrelevant in this context. The only relevant thing is the objective meaning of the text
in a particular linguistic and interactive community. Later, the label "objective" was extended
beyond the issue of study: not only were the findings claimed to have (greater) validity, but also
the procedure was seen as a guarantor of objective research. Analyses in objective hermeneutics
must be "strictly sequential": one must follow the temporal course of the events or the text in
conducting the interpretation. They should be conducted by a group of analysts working on the
same text. First, the members define what the case to be analyzed is and on which level it is to be
located.
It could be defined as a statement or activity of a specific person, or of someone who performs a
certain role in an institutional context, or of a member of the human species. This definition is
followed by a sequential rough analysis aimed at analyzing the external contexts in which a
statement is embedded in order to take the influence of such contexts into account. The focus of
this rough analysis is mainly on considerations about the nature of the concrete action problem
for which the studied action or interaction offers a solution. First, case structure hypotheses,
which are tested and possibly falsified in later steps, and a rough structure of the text and of the
case are developed. The specification of the external context or the interactional embedding of the case
serves to answer questions about how the data came about: Under the heading of interactional
embedding, the different layers of the external context of a protocolled action sequence must be
specified with regard to possible consequences and restrictions for the concrete practice of
interaction itself, including the conditions of producing the protocol as an interactional
procedure.
The central step is sequential fine analysis. This entails the interpretation of interactions on nine
levels as given below. At levels 1 and 3 of the interpretation, an attempt is made to reconstruct
the objective context of a statement by constructing several possible contexts in thought
experiments and by excluding them successively. Here, the analysis of the subjective meanings
of statements and actions plays a minor role. Interest focuses on the structures of interactions.
The procedure at level 4 is oriented to interpretations using the framework of conversation
analysis, whereas at level 5 the focus is on the formal linguistic (syntactic, semantic, or
pragmatic) features of the text. Levels 6 to 8 strive for an increasing generalization of the
structures that have been found (e.g., an examination is made of whether the forms of
communication found in the text can be repeatedly found as general forms—i.e., communicative
figures—and also in other situations). These figures and structures are treated as hypotheses and
are tested step by step against further material.
Levels of Interpretation in Objective Hermeneutics
• 0 Explication of the context which immediately precedes an interaction.
• 1 Paraphrasing the meaning of an interaction according to the verbatim text of the accompanying verbalization.
• 2 Explication of the interacting subject's intention.
• 3 Explication of the objective motives of the interaction and of its objective consequences.
• 4 Explication of the function of the interaction for the distribution of interactional roles.
• 5 Characterization of the linguistic features of the interaction.
• 6 Exploration of the interpreted interaction for constant communicative figures.
• 7 Explication of general relations.
• 8 Independent test of the general hypotheses that were formulated at the preceding level on the basis of interaction sequences from further cases.
According to Schneider (1985), the elaboration of general structures from interaction protocols
proceeds in the following steps of sequential fine analysis. First, the
objective meaning of the first interaction is reconstructed (i.e., without taking the contextual
conditions into account). To do this, the research group tells stories about as many contrasting
situations as the statement would consistently fit. At the next step, the group compares general structural
features to the contextual conditions in which the analyzed statement occurred. The meaning of
an action can be reconstructed through the interplay of possible contexts in which it might have
occurred and the context in which it actually occurred. In thought experiments, the interpreters
reflect on the consequences that the statement they have just analyzed might have for the next
turn in the interaction. They ask: what could the protagonist say or do next? This produces a
variety of possible alternatives of how the interaction might proceed. Then the next actual
statement is analyzed. It is compared to those possible alternatives which might have occurred
(but which did not in fact do so). By increasingly excluding such alternatives and by reflecting
on why they were not chosen by the protagonists, the analysts elaborate the structure of the case.
This structure is finally generalized to the case as a whole. For this purpose, it is tested against
further material from the case—which means subsequent actions and interactions in the text.
More generally, Reichertz (2004, pp. 291-292) outlines three variants of text explication in
research using objective hermeneutics:
• The detailed analysis of a text at eight different levels, in which the knowledge and the external context, and also the pragmatics of a type of interaction, are explained in advance and are borne in mind during the analysis.
• The sequential analysis of each individual contribution to an interaction, step by step. This is done without clarifying in advance the internal or external context of the utterance. This is the most demanding variant of objective hermeneutics, since it is very strongly based on the methodological premises of the overall concept.
• The full interpretation of the objective social data from all those who participate in an interaction before any approach is made to the text to be interpreted. This variant handles the fundamentals of a theory of hermeneutic interpretation very flexibly and uses them in a rather metaphorical way.
Further Developments
This procedure was developed for analyzing everyday language interactions, which are available
in recorded and transcribed form as material for interpretation. The sequential analysis seeks to
reconstruct the layering of social meanings from the process of the actions. When the empirical
material is available as a tape or video recording and as a transcript, you can analyze the material
step by step from beginning to end. Therefore, always begin the analysis with the opening
sequence of the interaction. When analyzing interviews with this approach, the problem arises
that interviewees do not always report events and processes in chronological order. For example,
interviewees may recount a certain phase in their lives and then go on to refer during their
narrative to events that happened much earlier. In the narrative interview too (particularly in the
semi-structured interview), events and experiences are not recounted in chronological order.
When using a sequence analytic method for analyzing interviews, you first have to reconstruct
the sequential order of the story (or of the action system under study) from the interviewee's
statements. Therefore, rearrange the events reported in the interview in the temporal order in
which they occurred. Then base the sequential analysis on this order of occurrence, rather than
the temporal course of the interview: "The beginning of a sequential analysis is not the analysis
of the opening of the conversation in the first interview but the analysis of those actions and
events reported by the interviewee which are the earliest 'documents' of the case history." Other
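Purely as an illustration of the data flow involved, this rearrangement step can be pictured as a simple sort: events are captured in the order the interviewee happened to narrate them and are then reordered by when they occurred. The event list and years in the following sketch are invented for the example; identifying and dating the events is, of course, interpretive work on the transcript itself.

```python
# Hypothetical example: events recorded as (year_of_occurrence, description),
# in the order in which the interviewee happened to narrate them.
narrated_order = [
    (1998, "moved to another city for a new job"),
    (1985, "left school"),
    (1992, "first period of unemployment"),
]

# The sequential analysis is then based on the order of occurrence,
# not on the temporal course of the interview.
chronological = sorted(narrated_order, key=lambda event: event[0])
# chronological now begins with the earliest "document" of the case history.
```

The point of the sketch is only the reordering itself: the analysis starts from the earliest reconstructed event, not from the opening of the conversation.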
recent developments aim at deriving a hermeneutics of images from this approach. Starting from
a critique of the increasingly narrow concept of structure in Oevermann et al.'s approach, Lüders
(1991) attempts to transfer the distinction between subjective and social meaning to the
development of an analysis of interpretative patterns.
What Is the Contribution to the General Methodological Discussion?
A consequence of this approach is that the sequential analytical procedure has developed into a
program with clearly demarcated methodological steps. A further consequence of this is that it is
made clear that subjective views provide only one form of access to social phenomena: meaning
is also produced at the level of the social (on this in a different context, see Silverman 2001).
Finally, the idea of social sciences as textual sciences is preserved most consistently here.
Another aspect is the call for conducting interpretations in a group in order to increase the
variation of the versions and perspectives brought to the text and to use the group to validate
interpretations that have been made.
How Does the Method Fit into the Research Process?
The theoretical backgrounds of this approach are structuralist models. Research questions focus
on the explanation of social meanings of actions or objects. Sampling decisions are mostly taken
successively (step by step). Often, the researcher refrains from using explicit methods for
collecting data. Instead, everyday interactions are recorded and transcribed, although interviews
and, occasionally, field notes from observational studies are also interpreted using objective
hermeneutics. Generalization in this procedure starts from case studies and is sometimes
advanced using contrasting cases.
What Are the Limitations of the Method?
A problem with this approach is that, because of the great effort involved in the method, it is
often limited to single case studies. The leap to general statements is often made without any
intermediate steps. Furthermore, the understanding of the method as art, which can hardly be
transformed into didactic elaboration and mediation, makes it more difficult to apply generally
(for general skepticism, see Denzin 1988). However, a relatively extensive research practice
using this approach can be seen in German-speaking countries. The common feature of the
sequential methods discussed above is that they are based on the temporal-logical structure of the
text, which they take as a starting point for their interpretation. Thus, they follow the text more
closely than do categorizing methods. The relation of formal aspects and contents is shaped
differently. Conversation analysis is mainly interested in formal features of the interaction.
Narrative analyses start from the formal distinction between narrative and argumentative
passages in interviews. This distinction is used (1) for deciding which passages receive (more
extended) interpretative attention, which will generally be the narrative passages; and (2) for
assessing the credibility of what has been said—as narratives are usually regarded as more
credible than argumentative passages. In interpretations using objective hermeneutics, the formal
analysis of the text is a rather secondary level of interpretation. Sometimes, these methods
employ hypotheses derived from passages of the text in order to test them against others.
Social Science Hermeneutics and Hermeneutic Sociology of Knowledge
Recent approaches have taken up basic ideas of objective hermeneutics, but have developed a
different understanding of hermeneutics and of the issues of research. They no longer use the
term "objective", focusing instead on the social construction of knowledge. Again, non-
standardized data—protocols of interaction—are preferred to interview data. The researchers
should approach the field under study as naively as possible and collect unstructured data.
Interpretations follow a three-step procedure. First, open coding according to Strauss (1987) is
applied with a focus on the sequential structure of the document (line by line, sometimes word
by word). Then, researchers look for highly aggregated meaning units and concepts that bind
together the parts and units. In the third step, new data are sought with which the interpretation is
falsified, modified, and extended by means of the later data collection.
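The data flow of this three-step procedure is what qualitative data analysis software supports in more formalized ways. As a minimal sketch only, the following Python fragment mimics the first two steps; the keyword lexicon, code names, and concept groupings are all invented for the illustration, and real open coding is done by a human interpreter, not by keyword matching.

```python
# Illustrative sketch of line-by-line open coding and subsequent aggregation.
# LEXICON and CONCEPTS are invented placeholders, not part of any method.
from collections import defaultdict

# Step 1: assign open codes line by line (here crudely, via a keyword lexicon).
LEXICON = {
    "work": "employment",
    "job": "employment",
    "family": "family life",
    "mother": "family life",
    "ill": "health crisis",
}

def open_code(lines):
    """Return {line_index: set of codes} for each coded transcript line."""
    coded = {}
    for i, line in enumerate(lines):
        codes = {code for word, code in LEXICON.items() if word in line.lower()}
        if codes:
            coded[i] = codes
    return coded

# Step 2: bind codes together into more highly aggregated concepts.
CONCEPTS = {
    "employment": "biographical trajectory",
    "health crisis": "biographical trajectory",
    "family life": "social embedding",
}

def aggregate(coded):
    """Group coded line indices under higher-level concepts."""
    grouped = defaultdict(list)
    for i, codes in coded.items():
        for code in codes:
            grouped[CONCEPTS[code]].append(i)
    return dict(grouped)

transcript = [
    "I lost my job when the factory closed.",
    "My mother helped us through that winter.",
    "Then I fell ill and everything changed.",
]
coded = open_code(transcript)
themes = aggregate(coded)
```

The third step, confronting the interpretation with new data, has no mechanical counterpart: it means returning to the field and revising the category system by hand.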
Narrative and hermeneutic approaches take into account the structure of the text. The analysis
follows the structure of the text (sequentially) and sees the statements in this context.
Biographical texts are analyzed in the light of the sequence of the events that is reported so that
(1) the internal structure of the life history and (2) the external structure of the life reported in it
may be related to each other. Social science hermeneutics links such a sequential analysis with
open coding according to grounded theory research.
5. Using Computers in Qualitative Analysis
6. Text Interpretation: An Overview
Sooner or later in qualitative research texts become the basis of interpretative work and of
inferences made from the empirical material as a whole. The starting point is the interpretative
understanding of a text (an interview, a narrative, or an observation), whether in transcribed form
or in the form of other documents. In general, the aim is to
understand and comprehend each case. However, different attention is paid to the reconstruction
of the individual case. In content analysis, you work mainly in relation to categories rather than
to cases. For example, the approach adopted by Strauss does not make a principle of a
thoroughgoing case analysis. In a similar fashion conversation analyses restrict their focus to the
particular socio-linguistic phenomenon under study and dedicate their attention to collecting and
analyzing instances of this phenomenon as opposed to attempting analysis of complete cases. In
thematic coding, in the analysis of narrative interviews, and in objective hermeneutics, the focus
is on conducting case studies. Only at a later stage is attention turned to comparing and
contrasting cases. Global analysis aims at a rough editing of texts to prepare them for later case-
oriented and case-comparing analyses. The understanding of the case in the different
interpretative procedures can be located at various points in the range from a consequent
idiographic approach to a quasi-nomothetic approach. The first alternative takes the case as a case in its own right
and infers directly from the individual case (an excerpt of a conversation, a biography, or a
subjective theory) to general structures or regularities. A particularly good example of this
approach is objective hermeneutics and other related approaches of case reconstruction. In the
second alternative, several examples are collected and—hence, "quasi-nomothetic"—the single
statement is at least partly taken out of its context (the case or the process) and its specific
structure in favor of the inherent general structure. The procedures of text interpretation
discussed in the preceding chapters may be appropriate to your own research question. As an
orientation for a decision for or against a specific procedure, four points of reference can again
be outlined.
First Point of Reference: Criteria-Based Comparison of Approaches
The different alternatives for coding and sequential interpretation of texts may be compared. The
criteria I suggest for this comparison are as follows. The first is the degree to which precautions
are taken in each method to guarantee sufficient openness to the specificity of the individual text
with regard to both its formal aspects and its content. A second criterion is the degree to which
precautions are taken to guarantee a sufficient level of structural and depth analysis in dealing
with the text and the degree to which such structures are made explicit. Further criteria for a
comparison are each method's contribution to developing the method of text interpretation in
general and the main fields of application the methods were created for or are used in. The
problems in applying each method and each method's limitations mentioned in the preceding
chapters are again noted for each approach at the end. This display of the field of methodological
alternatives of text interpretation allows the reader to locate the individual methods in it.
Second Point of Reference: The Selection of the Method and Checking its Application
As with collecting data, not every method of interpretation is appropriate in each case. Your
decision between the methodological alternatives discussed here should be grounded in your own
study, its research question and aims, and in the data that you collected. You should review your
decision against the material to be analyzed. The evaluation of an interpretative method and the
checking of its application should be done as early as possible in the process of interpretation—
in case analyses no later than after finishing the interpretation of the first case. A central feature
of this evaluation is whether the procedure in itself was applied correctly; for example, whether
the principle of strict sequential interpretation was followed or whether the rules on content
analysis were applied. The specific problems that the individual interpreter has with the attitude
of interpretation demanded by the method should be taken into account. If any problems arise at
this level, it makes sense that you, in a group of interpreters, reflect on them and the way you
work with the text. If it is impossible to remedy the problems in this way you should also
consider changing the method. Another point of reference for assessing the appropriateness of an
interpretative procedure is the level at which you seek results. If you have to analyze large
amounts of text with regard to ensuring that your results are representative on the basis of many
interviews, approaches like objective hermeneutics may make the attainment of this goal more
difficult or even obstruct it. Qualitative content analysis, which would be a more appropriate
method for this type of analysis, would not be recommended for deeper case analyses.
The checklist below offers suggestions for deciding on a method of interpretation and for
checking the appropriateness of this decision.
Checklist for Selecting a Method of Interpretation and Evaluating its Application
• Research question: Can the method of interpretation and its application address the essential aspects of the research question?
• Interpretative procedure: The method must be applied according to the methodological precautions and targets. There should be no jumping between forms of interpretation, except when this is grounded in the research question or in theory.
• Interpreter: Are the interpreters able to apply the type of interpretation? What is the effect of their personal fears and uncertainties in the situation?
• Text(s): Is the form of interpretation appropriate to the text or the texts? How is their structure, clarity, complexity, and so on taken into account?
• Form of data collection: Does the form of interpretation fit the collected material and the method of data collection?
• Scope for the case: Is there room for the case and its specificity in the framework of the interpretation? Can this specificity become clear within the framework of the interpretation?
• Process of the interpretation: Did the interpreters apply the form of interpretation correctly? Did they leave enough scope for the material? Did they manage their roles? (Why not?) Was the way of handling the text clearly defined? (Why not?) Analyze any breaks in order to validate the interpretation(s) between the first and second case if possible.
• Aim of the interpretation: Are you looking for delimited, clear answers and their frequency and distribution, or for complex, multifold patterns, contexts, etc.? Or do you want to develop a theory or a distribution of viewpoints in social groups?
• Claim for generalization: At which level do you want to make statements?
  • For the single cases (the interviewed individuals and their biographies, an institution and its impact, etc.)?
  • Referring to groups (about a profession, a type of institution, etc.)?
  • General statements?
Third Point of Reference: Appropriateness of the Method to the Issue
The interpretation of data is often the decisive factor in determining what statements you can
make about the data and which conclusions you can draw from the empirical material. Here, as
with other procedures in qualitative research—despite all the rhetoric surrounding certain
approaches—no procedure is appropriate in every case. Procedures like objective hermeneutics
were originally developed for the analysis of a specific domain of issues (interaction in families
viewed from the perspective of socialization theory). Over time their field of application has
been increasingly extended both in terms of materials used for analysis (interviews, images, art,
television programs, etc.) and in terms of the issues and topics analyzed. Similarly, the approach of
Strauss and Corbin (1998) is marked by a claim to ever more general applicability, as made
clear by the formulation of a very general "coding paradigm". When the postulated applicability of
approaches is extended like this, the criterion of appropriateness to the issue again needs to be
taken into account. You should reflect on this in two respects: clarify not only for which issues
each method of interpretation is appropriate, but also for which issues it is not, so that you can
justify the concrete use of the method in a grounded way.

Fourth Point of Reference: Fitting the Method into the Research Process
Finally, you should assess the method you choose for its compatibility with other aspects of the
research process. Here you should clarify whether the procedure of interpreting data works well
with the strategy of data collection you used. If, for example, you paid great attention when
conducting an interview to the gestalt of the narrative in the interviewee's presentation, it makes
little sense to apply to the data a content analysis that uses only a few categories defined in
advance. Attempts to analyze field notes sequentially with objective hermeneutics
have proved impractical and unfruitful. Similarly, you need to examine whether the method of
interpreting data works well with the method of selecting the material. You should also consider
whether the theoretical framework of your study corresponds to the theoretical background of the
interpretative method and whether both understandings of the research process correspond. If the
research process is conceptualized in the classical linear way, much is determined at the
beginning of the interpretation—above all, which material was collected and how. In this case,
you should answer the question of selecting and evaluating an interpretative procedure with
regard to these parameters, to which it should correspond. In a research process conceptualized
in a more circular way, the method of interpretation may instead determine the decisions
made about procedure in the other steps. Here, the collection of data is oriented to the sampling,
and the choice of method to the needs that result from the type and state of the interpretation of the data.
At this point, it is clear that you should evaluate methodological alternatives, and decide
between them, with due consideration of the research process as a whole. Suggestions for
answering these questions are provided by the paragraphs on fitting each individual method into
the research process, and by the research questions and goals of the concrete empirical
application. None of the methods for analyzing data is the one and only method. Each of them
has strengths and weaknesses in relation to your own study. You should carefully consider which
method best fits your kind of data and your research question. Each method produces a specific
structure in the way it enables you to work with the data. Before and while applying a specific
method for answering your research question, assess whether the method you selected is
appropriate.
