ELPS 600 - LECTURE NOTES

LECTURE HIGHLIGHTS INDEX

DR. ROSALIND LATINER RABY

Copyright © 2005-2014

RESEARCH FOUNDATION

RELIABILITY & ETHICS

INTERVIEWS

PRINCIPLES OF EDUCATIONAL RESEARCH

THEORY IN EDUCATIONAL RESEARCH

QUANTITATIVE ETHNOGRAPHY

LITERATURE REVIEW

SURVEY/QUESTIONNAIRE

DRS METHOD

CULTURAL BOUNDARIES

PARTICIPANT OBSERVATION

EVALUATION

ACTION RESEARCH

RESEARCH DESIGN

FIELD NOTES

CODING & DATA ANALYSIS


BASIC PRINCIPLES OF EDUCATIONAL RESEARCH

SCIENCE, RESEARCH & METHODOLOGICAL PARADIGMS - WHAT IS SCIENCE?

Science involves the systematic investigation of phenomena in order to elucidate their nature, genesis and consequences

Science - to explain, predict and/or control phenomena

Depends on careful control of possible sources of bias or distortion in both collection and analysis;

Research must be supported adequately by evidence and explained logically.

Based on assumption that all behaviors and events are orderly and that they are effects which have discoverable causes

SCIENCE AS SOURCE OF KNOWLEDGE

Explain, predict and/or control phenomena - not absolute body of knowledge.

As method - empirical science relies on observation to develop and to validate explanatory systems called theories. Based on assumption that all behaviors and events are orderly and that they are effects which have discoverable causes

Empirical - refers to whether or not phenomena are capable of being found in the real world and assessed by means of the senses.

As attitude, consists of a willingness to examine and modify one's own beliefs, ideas, or methods

1. Control of Bias; 2. Quest for Precision; 3. Verification; 4. Empirical

RESEARCH STARTS WITH A PROBLEM

Gets perspective - set of prescribed notions

Need theory to provide outcomes

Paradigm - world view - picture of universe

All research fits into one or more paradigms - specific generalizations of the world

Generalizations then weave theory

Theory - generate answers to "So What?"

Paradigms - take theories and fit them into a methodological reality

Methodology is consciously theoretical - what studied, how and with what answers

Research starts with problem - gets perspective - set of prescribed notions - need theory to provide outcomes

Educational research - formal, systematic application of the scientific method to the study of educational problems. Goal is to explain, predict and/or control educational phenomena

Difficulties arise in that researchers cannot control people!

Difficult to generalize or replicate findings

* impossible to duplicate laboratory conditions

* observation - presence alters behavior

* Precise measurement is also difficult -

no instruments comparable to a barometer for measuring intelligence, achievement or attitude

Scientific Method - techniques by which to view the world and conduct research: involves

1) Explain; 2) Predict; 3) Control Phenomena

Recognition and definition of problem

Induction of hypotheses based on observation - set of scientific questions

Collection of data

Deduction of implications of the hypotheses (analysis of data)

Testing of the implications

Confirmation/dis-confirmation of the hypotheses - evaluation/conclusion

These methods are not exclusive and are often overlapping.

ETHNOGRAPHY

ETHNOSCIENCE - Scientific Method that attempts to gain an "emic" perspective (Spradley)

ETHNOLOGY - "science of peoples, their cultures and life histories as groups "(Kroeber:1948) - Comparative analysis of multiple entities

Ethnography - cyclical pattern (compared to scientific method) - way of studying life

Scope can range from macro to micro, from focused observations to selective observations, and from general questions to focused questions etc.

Product as well as a process

Use of qualitative and naturalistic methods/observations

All Ethnography IS non-manipulative

Not all naturalistic and phenomenological research is non-manipulative or ethnographic

Concerned with observation and recording of real-world phenomena which IS empirical and philosophy which IS NOT empirical.

Working with definitions of what is relevant taken from the existing conscious awareness of school practitioners and from literature

Helps discover new phenomena of functional relevance

Fieldwork - doing ethnography

Does NOT Study People - Rather - LEARNS From People

Degrees of learning

It is one thing to describe differences, another to account for them

ETHNOGRAPHIC RESEARCH - Embodies elements of ethnography, qualitative research, case-study research, field research etc. (holistic, multimodal, eclectic)

Observation by itself does not provide meaning.

Without asking questions and learning from "natives" you will not grasp their perspective

Essential core of Ethnography - concern with the meaning of actions and events to the people we seek to understand

Study lifeways of people in their own cultures - to understand cultural meanings in which human action is embedded and to relate these to the social contexts in which action occurs

Starts with a conscious attitude of almost complete ignorance!

ETHNOGRAPHIC METHOD - Selection of research project * asking ethnographic questions * collecting ethnographic data * making an ethnographic record * analyzing ethnographic data * writing an ethnography

Disciplined study of what the world is like to people who have learned to see, hear, speak, think and act in ways that are different. Concerned with the meaning of actions and events to the people we seek to understand.

Some meanings are directly expressed in language; many are taken for granted and communicated only indirectly through word and action.

Every social situation - people make constant use of these complex meaning systems to organize their behavior, to understand themselves and others and to make sense of the world in which they live.

Interprets culture accurately - but is aware that the researcher can see only a portion of a cultural reality. How much of whose reality is portrayed and how it can be portrayed and with what degree of adequacy are the important questions

Put puzzle together: part and whole analysis


No one method is all powerful - best to be multi-methodological

 

QUALITATIVE

Qualitative is sometimes used synonymously with phenomenological, naturalistic, non-manipulative, observational, and ethnographic. These descriptions refer not to design, but to different aspects of the inquiry process

(1) Naturalistic Research - study of individuals as they normally undertake and engage in activities of interest - normal activities

Watching people do what they "normally" do

Naturalistic Research MUST BE tied in with theory

* always define phenomenon

* need a working definition of phenomena

* pick a place where expect to find phenomenon

* know units of analysis and levels of analysis

* boundary of analysis

* sampling

* specification of methods

* aware of reactivity - personal bias - side effect that influences behavior

(2) Holistic - descriptions of total phenomena within various contexts - reality is multi-leveled and multi-causal - eclectic - variety of research techniques to amass their data

Emphasis is on "qualities of things" and "kind of things" - get whole picture behind "numbers"

(3) Phenomenological data - represent world view of participants being investigated

-- participant constructs are used to structure the research

(4) Empirical - Nothing to do with numbers or manipulation of variables. Refers to whether or not phenomena are capable of being found in the real world and assessed by means of the senses, through various forms of research.

7 TYPES OF QUALITATIVE RESEARCH VARIATIONS

CASE STUDY, COMPARATIVE AND CONTENT ANALYSIS - Appropriate for intensive, in-depth examination of one or a few aspects of given phenomenon

SURVEY/QUESTIONNAIRE ANALYSIS - Fewer individual aspects of phenomena, but across far more instances

EXPERIMENTATION AND SIMULATION - Tells if a specific program achieved its stated goals, but not HOW the program was implemented or to what to attribute its success; Testing and Quantitative Experiments

PARTICIPANT AND NON-PARTICIPANT OBSERVATION - Taking Field Notes, Observing Play

FORMAL AND INFORMAL INTERVIEWING: Asking Questions

HISTORICAL OR DOCUMENT ANALYSIS; Archival Records and other documents such as secondary sources, questionnaire responses, ethnographic depictions, organizational records etc.

STANDARDIZED OBSERVATIONAL RESEARCH: Asking Questions, Taking Field Notes, Observing Play; Simulation

QUANTITATIVE

Dominant Methodological Paradigm - considered "Scientific"

investigators follow a linear pattern of investigation

Define a research problem * formulate hypotheses * make operational definitions * design a research instrument * gather the data * analyze the data * draw conclusions * report the results

Sometimes a person may not formulate a hypothesis (thus excluding step two), preferring to generate testable hypotheses as a conclusion to the study

Criteria used to define and measure variables that are generally independent of sociocultural world under study

Assessment of frequencies of types of behavior is the basis upon which judgments are made (Wallace: 1970)

Consistency - reliability - distance - method objective

Naturalistic, phenomenological and holistic emphasis is qualitative

Controlled, positivistic and particularistic emphasis in experimentation is quantitative

Use quantitative to demonstrate validity of one's analytic models

Deductive Reasoning - arriving at specific conclusions based on generalizations

Strategies that secure data, measurable by an analytic set of criteria that can be rigorously tested for possible correlations by means of mathematical procedures;

Associated with Experimental - deductive analysis and verification when Outcome is determined

Questionnaire - emphasis on what people SAY not DO

Measurement, control, prediction - aims of research

Quantitative Distortions: no one method is all powerful

All methods are subjective - even "scientific" and "quantitative"

Abstract Empiricists - only one method (the scientific one) is correct

In order to be reliable - must be objective - must be quantitative

 

BRIDGING QUALITATIVE AND QUANTITATIVE: QUANTIFICATION OF QUALITATIVE RESEARCH

Bring two paradigms together to help cut down on distortion. But first need similar language so that both paradigms can talk to one another.

Much of Ethnography uses quantitative data and many quantitative studies are using qualitative techniques

 

QUANTIFICATION OF QUALITATIVE RESEARCH

Observation defines "variables" in research mode

Attributes - relations, purpose, prospects are observed -

continuous variation - precise height; categorical orientation - being tall
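The contrast between precise continuous variation and a categorical attribute can be sketched in a few lines of Python; the heights and the 175 cm cutoff are invented purely for illustration:

```python
# Hypothetical recoding of a continuous variable (height in cm) into a
# categorical attribute, mirroring the precise-height vs "being tall" contrast.
heights = [158, 171, 180, 165, 192]
THRESHOLD = 175  # assumed cutoff, chosen only for illustration

categories = ["tall" if h >= THRESHOLD else "not tall" for h in heights]
print(categories)
```

The recoding loses precision (192 cm and 180 cm both become "tall") but yields the "kind of thing" categories that qualitative observations can be counted under.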

Social facts make qualitative/naturalistic studies become attributable to quantitative code for "qualities of things, kind of things" and thus become scientific

Generation of Scientific Questions and Hypothesis

Hypothesis - generating; exploratory; testing; focused

Outer explanations - outcome is determinism

methods: questionnaires - what people SAY not do

particularistic - highly specific - mode of study

measurement, control, prediction - aims of research

operational - concepts


RESEARCH FOUNDATIONS

INFERENCES: TACIT KNOWLEDGE

We all know things that we cannot talk about or express in direct ways. Inferences are drawn from what people say, how they act and from the artifacts people use. At first, each cultural inference is only a hypothesis about what people know

These hypotheses must be tested over and over until the ethnographer becomes relatively certain that people share a particular system of cultural meanings

Inductive - find a theory that explains the data. Begins with collection of data - empirical observations or measurements - and builds theoretical categories and propositions from relationships discovered among the data

Deductive - begins with a theoretical system, develops operational definitions of propositions and concept of the theory and matches them empirically to some body of data. Find data to match a theory

OPERATIONAL RESEARCH QUESTIONS

Framed within a valid research purpose

Defines topic of interest and establishes parameters for research design

Relates Purpose/Goals - what is to be the overall, ultimate product of the research. Describes yet unknown.

Suggest how research results might be used

Spell out - where, with whom and how a study will be carried out

Ethnographic Methods: Generative, inductive, constructive and subjective ends of continuum

 

Subjective - self-conscious attempt to balance observer bias & reactivity of participants

Objective - control for biases with design and statistics.

Predictive - Focusing on examining of effects caused by a specific treatment. Measure precise impact a specific activity/treatment has on people and predict the chances of being able to duplicate that impact in future activities/treatments

Descriptive - Document exactly what happened. Ethnography is always descriptive - study of interplay among empirical variables as they occur naturally

Verificative Research - verifies or tests propositions developed elsewhere (deductive)

Generative Research - discovering constructs and propositions using one or more data bases as the source of evidence (inductive)

Constructive Strategy - discover what analytic constructs can be elicited from the stream of behavior; process of abstraction

Enumeration - process by which the researcher subjects previously derived or defined units of analysis to systematic counting or enumeration - preceded by the constructive process
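The constructive-then-enumerative sequence can be illustrated with a short Python sketch; the coding categories below are hypothetical, not taken from the lecture:

```python
from collections import Counter

# Hypothetical field-note segments already assigned analytic codes
# (the constructive step); enumeration then counts them systematically.
coded_segments = [
    "teacher-praise", "peer-help", "teacher-praise",
    "discipline", "peer-help", "teacher-praise",
]

counts = Counter(coded_segments)
total = sum(counts.values())
for code, n in counts.most_common():
    print(f"{code}: {n} ({n / total:.0%})")
```

The counting step only makes sense after the constructive step: the frequencies are frequencies of analyst-defined units, not raw behavior.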

CLASSIFICATION OF RESEARCH BY TYPE

BASIC RESEARCH - Conducted solely for purpose of theory development and refinement


Not concerned with practical applicability

Closely resembles laboratory conditions and controls (scientific research)

General principles of learning;

Provides theory that produces implications for solving educational problems

APPLIED RESEARCH - Conducted for purpose of applying or testing, theory and evaluating its usefulness in solving educational problems


Principles of learning as concerned with utility in educational setting;

Provides data to support, guide theory revision or suggest development of new theory

COMBINATION RESEARCH

Conduct controlled research in special or simulated classrooms: use school children and involving school-relevant topics and materials

EVALUATION RESEARCH - systematic process of collecting and analyzing data in order to make decisions. Purpose is to make a decision regarding the relative worth of two or more alternative actions:

1) Is special program worth what it costs?;

2) Is new, experimental reading curriculum better than the former curriculum?

RESEARCH AND DEVELOPMENT - not to formulate or test theory, but to develop effective products for use in schools including: teacher-training materials, learning materials, sets of behavioral objectives, media materials and management systems. R & D efforts are quite extensive in terms of objectives, personnel and time to completion.

ACTION RESEARCH - solve classroom problems through the application of scientific method. Local problem, conducted in local setting - solution of given problem, not contribution to science. Applicable only to local setting. Provides immediate answers to problems that cannot wait for theoretical solutions.

CLASSIFICATION OF RESEARCH BY METHOD

METHODS - Procedural strategies for securing and analyzing data

Type of analysis achieved depends on type of data collected and analytic procedures applied

Particularly significant in evaluating research results, as they constitute the main basis on which judgments of validity and verification are made.

Each method is designed to answer a different type of question

RESEARCH DESIGN: Decisions on what will be studied - explore the alternatives

TYPES OF DESIGNS

1) Research Design before entering field

2) conscious awareness of one's commitment while in field and the answering of questions while in the field

3) analysis after field experience has ended

 

CULTURE-BOUND - living inside a particular reality that is taken for granted as "the reality"

Cultural deprivation theories (1960s-1980s) saw educational failure of many minority children due to a lack of achievement and to their being "culturally deprived". Ethnographic research on "culturally deprived children" reveals they are not deprived, but have elaborate, sophisticated and adaptive cultures which are simply different from the ones espoused by the educational system

Before you impose your theories on people you study, find out how those people define the world

A-PRIORI DESIGN - attempts to nullify problems before the study is done

empirical - before the fact - deductive; skeptical of conventional boundaries and labels; pre-selection of site on pre-determined variables - could be misleading; enter field - not sure where phenomenon lies or with whom it exists; cannot assume all data are the same until after study; can't tell theoretical issue until after study is underway

A-POSTERIORI DESIGN - done in field and during write-up ; emphasis on placing data in a theoretical context; inductive - after the fact

CONTINUUM - QUALITATIVE TO QUANTITATIVE

QUALITATIVE

Historical Research

Descriptive Research

HISTORICAL RESEARCH: studying, understanding and explaining past events; conclusions concerning causes, effects or trends of past occurrences that may help to explain present events and anticipate future events

Primary sources - eyewitness reports and original documents. I interview someone who witnessed an accident

Secondary sources - secondhand information, description of an event by someone other than an eyewitness - I interview the person's friend, who did not witness the accident, but heard an account of it

External criticism - authenticity of data

Internal criticism - evaluates worth of data - degree of accuracy. This interpretation of meaning is debatable

Typical Studies:

1) Factors leading to the development and growth of individualized instruction

2) Effects of decisions of the United States Supreme Court on American education

3) Trends in reading instruction, 1875-1975

 

DESCRIPTIVE RESEARCH: collecting data in order to test hypotheses or answer questions concerning current status of the subject of the study; determines and reports the way things are, i.e. assessing attitudes or opinion, market research surveys. Typically collected through a questionnaire survey, an interview or observation

Collecting data that

* asks questions that have not been asked before

* seeks information that is not readily available

Requires development of a research instrument appropriate for obtaining the desired information

Criteria in evaluating the findings and conclusions of specific studies, reports etc.

Complexities involving instrumentation, skill, response and interpretation

Instruments usually have to be developed for specific studies

Typical Studies:

1) How do 2nd grade teachers spend their time? 2nd grade teachers would be observed and time presented as % - i.e. 60% of time lecturing; 20% answering questions; 10% administering discipline; 10% performing administrative duties etc.

2) How will citizens of Yortown vote in next presidential election? Survey (questionnaire or interview) and result presented as percentages

3) How do parents feel about split-shift school days? Parents would be surveyed and results presented in terms of percentages for, against, or undecided

4) Survey of teachers to determine how and to what degree they believe anxiety affects achievement
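A survey tally like study 3 above can be computed directly; the response counts below are invented for illustration:

```python
# Hypothetical parent survey on split-shift school days; counts are invented.
responses = ["for"] * 120 + ["against"] * 90 + ["undecided"] * 30

total = len(responses)
for option in ("for", "against", "undecided"):
    pct = responses.count(option) / total * 100
    print(f"{option}: {pct:.1f}%")
```

This is the whole analytic apparatus of much descriptive research: report the way things are as percentages of the surveyed group.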

CORRELATIONAL RESEARCH: determine whether, and to what degree, a relationship exists between two or more quantifiable variables; use these relationships to make predictions

Variables found not to be highly related are eliminated from further consideration

Variables that are highly related suggest causal-comparative or experimental studies to determine if relationships are causal

Just because there is a relationship between self-concept and achievement does not imply that self-concept "causes" achievement, or the reverse

The degree of relationship between two variables is generally expressed as a correlation coefficient, which is a number between -1.00 and +1.00.

TYPICAL STUDIES

1) Relationship between intelligence and creativity. Scores on an intelligence test and creativity test would be acquired from each member of a given group. 2 sets of scores correlated and resulting coefficient would indicate the degree of relationship

2) Relationship between anxiety and achievement (similar pattern)

3) Use of an aptitude test to predict success in an algebra course. Scores on an algebra aptitude test would be correlated with ultimate success in algebra as measured by final exam scores, for example. If resulting coefficient was high, aptitude test would be considered a good predictor

4) Study to determine the relationship between scores on an anxiety scale and scores on an achievement measure

EXPERIMENTAL RESEARCH: establish cause-effect relationship with group comparisons. Explores relationship between an independent and a dependent variable which are specifically manipulated for the experiment.

Independent Variable - "Cause" is manipulated and is believed to make a difference

Dependent Variable - "Effect" is determined to occur or not occur - the difference

A study which investigates a cause-effect relationship examines the effect of an independent variable on a dependent variable.

1. Operationalization of the variables is appropriate

2. Instrumentation used to measure the variables is fully described and is appropriate.

3. Treatments are fully documented and are replicable.

TYPICAL STUDIES

1) Comparative effectiveness of programmed instruction vs. traditional instruction on computational skill. Independent variable (cause) is type of instruction (programmed vs. traditional); dependent variable (effect) is computational skill. Two groups exposed to same experiences, except for method of instruction. Computational skill would be compared.

2) Effect of self-paced instruction on self-concept. Independent variable is pacing; dependent variable is self-concept

3) Effect of positive reinforcement on attitude toward school; independent variable is type of reinforcement; dependent variable is attitude toward school.

4) Study to compare achievement of 2 groups - one taught in an anxiety-producing environment; and another in an anxiety-reducing environment
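A two-group comparison like study 4 can be sketched with a Welch t-statistic; the achievement scores below are invented, and a real study would also examine the p-value and the design's assumptions:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical achievement scores for the two treatment groups
anxiety_reducing = [78, 85, 82, 90, 88, 84]
anxiety_producing = [70, 75, 72, 80, 74, 77]

m1, m2 = mean(anxiety_reducing), mean(anxiety_producing)
s1, s2 = stdev(anxiety_reducing), stdev(anxiety_producing)
n1, n2 = len(anxiety_reducing), len(anxiety_producing)

# Welch's t: size of the mean difference relative to its standard error;
# a larger |t| makes a chance difference less plausible.
t = (m1 - m2) / sqrt(s1**2 / n1 + s2**2 / n2)
print(f"mean difference = {m1 - m2:.1f}, t = {t:.2f}")
```

The statistic only licenses a causal reading because the independent variable (the anxiety environment) was deliberately manipulated, which is what separates experimental from causal-comparative designs.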

Clinical - observational studies

1. The phenomena under investigation are clearly identified

2. Interviews and observation guidelines are related to the key elements of the study

3. The methodology for recording the interviews is appropriate

Clinical - teaching experiments

1. The phenomena under investigation are clearly identified

2. Plans for observation are detailed and related to the key elements of the study

CAUSAL-COMPARATIVE RESEARCH - establish cause-effect relationship with group comparisons. Statistics must demonstrate covariation between cause and effect. When compared to experimental research, these are less expensive, take much less time to conduct and may lead to experimental studies. Randomization is a key element.

Alleged "cause" is not manipulated - it has already occurred (e.g. sex, brain damage; effect of broken home, intelligence) or chosen not to be manipulated - method of instruction

Dependent variables are intrinsically different - perhaps one group possesses a characteristic the other does not (SES level).

TYPICAL STUDIES

1) Effect of kindergarten attendance on achievement at end of the first grade

2) Effect of having a working mother on school absenteeism

3) Effect of sex on algebra achievement

4) Study to compare achievement of a group of students classified as high-anxious and group classified as low-anxious

RESEARCH CYCLES

Quantitative - investigators follow a linear pattern of investigation

Define a research problem * formulate hypotheses * make operational definitions * design a research instrument * gather the data * analyze the data * draw conclusions * report the results

Sometimes a person may not formulate a hypothesis (thus excluding step two), preferring to generate testable hypotheses as a conclusion to the study

Ethnography - cyclical pattern

Selection of research project * asking ethnographic questions * collecting ethnographic data * making an ethnographic record * analyzing ethnographic data * writing an ethnography

Scope can range from macro to micro, from focused observations to selective observations, and from general questions to focused questions etc.


THEORY IN EDUCATIONAL RESEARCH

No such thing as "pure" because all is interconnected. Perception is pre-conditioned and colored by implicit causal theory

Perceptions and knowledge code inter-rationalization of reality

Can observe - without recognition. Without partial recognition and understanding. Without full recognition and understanding.

Without full view (hidden) use deductions. Through intermediary source

Reality is multi-leveled, multi-causal, polythetic (numerical classification - can't classify in generalities, but define in statistical bundles of traits)

Discovering Grounded Theories: Theories grounded in empirical data of cultural description - Grounded Theory

Theory - akin to design. Way of looking at the world. Explain why things happen as they do. Help us sort our world

Help decide what things are important and hence, what data to collect. Understanding and interpretation of meaning of constructs. Theory clarifies, limits and defines a research problem

Each question forms the basis of a research problem that can be justified on the grounds that the proposed research will help clarify the meaning of the interference theory.

Theories can be complex. All are speculative to some extent - i.e. they generalize. Theories are statements about how things are connected. Some are bad - and do not contribute to practice. Others are good - but have no application

Various expressions: in mathematics, as related propositions (if . . . then . . .), in narrative prose in form of a statement describing a phenomenon followed by elaboration of examples, anecdotes or descriptions.

Theories can be Depictive - ex.: every action causes an equal and opposite reaction. Theories can be Descriptive - explain phenomena

Consequences of theory can be used to test indirectly the validity of the theory itself. Each implication is a research problem.

Research, in beginning stages of theory - provide information about the theory. Research, when theory is understood, provides information about the theory's consequences which have been tested

 QUESTIONS TO ASK

IS there anything going on out there?

WHAT is going on out there?

COMPONENTS OF THEORY

CONCEPTS - things to be connected

The classes of phenomena and their characteristics which humans use to organize their world. These are names, labels with definitions which we attach to things abstract or concrete

CONSTRUCTS - the connection itself and how it is linked or forms relationships.

Each empirical investigation adds further knowledge to theory development

Theory implies certain consequences, whose validation lends credence to the theory.

TYPES OF THEORIES: RESEARCH CAN BE DESIGNED TO

Correct faulty theory

Correct faulty methodology

Correct inappropriate use of statistics

Resolve conflicting opinion

Resolve practical field problems

 USING THEORY

 FORMULATION OF RESEARCH PURPOSE AND QUESTION

Established theories can be used to generate a research question.

Questions may be designed to amplify, refine or disconfirm or verify established theories.

Theories predispose researchers to approach a phenomenon differently, in terms of questions asked and research designs chosen.

Can develop hunch about way things are or why they occur and then find theory to match it, compare with it, counter it

Theory narrows focus from a general consideration to a specific concept or idea

CHOICE OF RESEARCH PARTICIPANTS AND SETTINGS

Populations and settings must have relevance to the theory that informs the research question

ETHNOGRAPHIC DESIGN AND THEORY

Addresses interaction of humans together and provides portrait of some group. Utilizes model to develop holistic depiction of uncontrived group interaction over time. Applies to theories that highlight group processes such as cultural transmission, culture and personality, socialization, acculturation and change. Emphasizes discovery of shared belief, practices, artifacts, folk knowledge and behavior and social mechanisms that facilitate these processes. Generates concrete empirical data to test theoretically derived constructs

Variations on Themes

(1) Social Learning Theory (commonly associated with experimental designs) focus on activities of individual learners.

(2) When added to theories of social class (done by survey analysis of groups) - it raises new questions and concerns about how individuals learn and accept their place in the system of social class stratification.

(3) When added to Critical Theory, it adds confrontation of those who live under oppressive conditions and those who perpetuate situations of subordination.

Educational Ethnography - developing and applying theories of educational change, schooling, race relations and instructional organization.

Methodology does not exist in a vacuum. It supports, challenges and revised theoretical paradigms

CHRONICLE OF EDUCATIONAL RESEARCH USE OF ETHNOMETHODOLOGY

Historical Foundations: Wax: 1971

Psychological Foundations: Cole & Scribner: 1974; Tharp & Gallimore: 1988

Sociological Foundations: Whyte: 1955; Agar: 1980; Goetz & LeCompte: 1984

Educational Ethnography: Spindler: 1982; Anderson: 1989; Apple: 1992; Quantz: 1992

Educational Sociology and Anthropology: Cicourel: 1974; Garfinkel: 1967; McDermott: 1977; Habermas: 1964;

Roberts & Akinsanya: 1976

Cross-Cultural Influences With Marxist Bent: La Belle & Verhine: 1975; Karabel & Halsey: 1978; Bowles & Gintis: 1976;

Masemann: 1982; 1990

Symbolic Interactionism: George Herbert Mead; Elizabeth Thomas

EDUCATIONAL PARADIGMS

Experiences of individuals must be represented in adequate theories of society and culture

 


CULTURAL BOUNDARIES

THERE ARE MANY DIFFERENT CULTURES, SUB-CULTURES ETC. EACH OF WHICH HAVE THEIR OWN WAYS IN WHICH TO TEACH AND TO LEARN.

CULTURE: A shared design for living; learned behavior. Rules for interacting and for why certain things are done in certain ways. Shared pictures people carry in their minds for perceiving, relating to and interpreting the world about them.

All knowledge that is learned and that is passed on from one generation to another.

Goodenough defines: "Culture consists of standards for deciding what is, standards for deciding what can be, standards for deciding how one feels about it, standards for deciding what to do about it and standards for deciding how to go about doing it."

Learned behavior - sets rules for behavior. Encompasses: kinship, marriage, political and economic organization, religion and social interactions. Explains why certain things are done in certain ways. Gives people a sense of identity. Facilitates intercultural communication and relations. Every individual has own version of culture. Mini-cultural groups that overlap.

Different models for different roles: I am - woman, daughter, sister, wife, mother, professor, friend etc.

VALUES: System of culturally acquired beliefs and habits that permit individuals and institutions to maintain a comprehensive cultural identity

VALUE ORIENTATIONS: Abstract categories which encompass a world view and a preference for a way of life. They are based on assumptions about the nature of human existence from which standards of right and wrong or good and bad are drawn.

1) Social Relations (independence, interdependence, individualism, competition)

2) Activity (being, doing, pragmatist, reflection)

3) Time (orientation towards past, present, future)

4) Relationships with nature or supernatural (fatalistic or technological orientation)

 IDEALS: Maintained through a complicated system of norms and values. The synthesis of which is found in behavior.

 Our world is comprised of many different cultures. Each of them being unique in the values they hold, the behavior they exhibit and the belief system which sustains them.

 RECOGNIZE A DIFFERENT SCALE OF VALUES

Differences are not barriers, but differences do cause difficulty in communicating

1) no two things are identical

2) no one thing stays the same: time and space

3) It is not possible to tell all about anything: all descriptions are open-ended

4) beware of stereotyping, ethnocentrism and biases

5) seek out commonalities among cultural diversities

6) recognize a different scale of values

7) Same word may be used to represent different "realities" while similar events or experiences are sometimes called by different names

8) statements of opinion are often confused with statements of fact

9) Use descriptive terms rather than ones which express approval or disapproval

10) Use phrases that indicate uniqueness of culture, i.e. from our point of view, in our culture

11) Become more alert to the ways in which cultural conditioning shapes our value judgments.

12) UNDERSTAND THAT THE COMPLEXITIES OF CULTURE REQUIRE EXPERIENCE AND TIME

CLASSIFY: To understand the world around us - DEFINE AND CLASSIFY

To cope with the environment and organize the vast quantities of data to which we are exposed, all people classify their experience according to categories derived from cultural conditioning.

Geography - use of different maps

classifications are necessary for communication and comprehension of our world

Remember that all classifications are subject to CULTURAL RESTRAINTS

Grouping of images/traits to facilitate communication, both positive and negative. All classifications are subject to cultural restraints. How we classify indicates how we relate to one another on different levels; this can itself be a form of communication, both positive and negative. Much of cross-cultural learning is based on assumptions about how we understand ourselves in relationship to others.

 ETHNOCENTRISM: Term coined by William Graham Sumner (early 1900s). Seeing one's own culture as the center of the universe. Believing that one's way is superior to others'. Loyalty to one's social group and traditional patterns of life

STEREOTYPES: In the process of classification, people inevitably group together things which are discriminably different (exaggerated belief). Respond to objects or people in terms of categories that they created (pre-judging) rather than in terms of individual uniqueness. Acts as a device both for justifying acceptance and for rejecting others. Pre-judging according to categories rather than reality; labeling new acquaintances (students)

 It is easier and quicker to stereotype. Thus a natural and inevitable cognitive process often leads to the harmful imposition of inaccurate attributes and stereotypical characterizations onto others.

CHECK LIST FOR CROSS CULTURAL SENSITIVITY

1) Are you familiar with the country's basic culture and history? Students family structure?

2) Are you aware of U.S. non-verbal forms of communication?

3) Are there non-verbal behavior patterns you use which may be interpreted as "offensive"?

4) Can you anticipate some possible miscommunication problems?

5) Are you aware of others' non-verbal behavior or communication patterns?

6) Do you know what others' culture dictates as being appropriate in a social context? In a work context? In the family context?

7) Visit the community; Familiarize self with literature written by culturally diverse authors

8) Subscribe to ethnically diverse publications; Enroll in race awareness or cross-cultural workshops; Form professional relations with culturally diverse colleagues

 INTERCULTURAL COMMUNICATION

Language, Thinking Patterns, Forms of Expression (verbal and nonverbal)

 The higher the degree of similarity of perception: a) the easier communication among them is likely to be; b) the more communication will occur; c) the more likely it is that this similarity will be recognized: identity groups will form.

Communication across cultural boundaries becomes all the more difficult. Differences in customs, behavior, values promote cultural misunderstanding. Cultural problems result when we fail to recognize that persons of other cultural backgrounds have different goals, customs, thought patterns and values from our own.

Communication is manifested through symbols that differ in their meaning according to time, place, culture, or person.

Cultural Differences promote cultural misunderstanding

Every person projects self into human communication

Every person is a medium/instrument of communication, not just a sender of messages

Every generation perceives life differently

No matter how hard one tries, one cannot avoid communication

Communication is irreversible - but it can be explained

Communication occurs in a context and is a dynamic process

Who you communicate with; role and status of person

Who you are: how you perceive yourself

When to communicate: appropriate time, place

 Non-Verbal Patterns: also culturally bound

1) Proximity/space

2) Time and Time consciousness/sense

3) Dress and Appearance: outward garments, body decorations

4) Food and Feeding Habits: manner in which food is selected, prepared, presented and eaten

5) Gestures, Grimaces, Signs

6) Roles and relationships: age, sex, status, family, wealth, power, wisdom etc.

7) Work habits and practices (rewards and recognition); definitions of work and practice (individual oriented or communal)

8) Others: shapes, colors, sounds, smells, art forms, body language

 Cultural learning is not designed to make all people in the world alike, nor is it intended to teach that one set of values is better than another. Through understanding from the point of view of another culture, you can make your own judgement about that culture with as little cultural preconception as possible.

 


LITERATURE REVIEW

Systematic identification, location and analysis of documents containing information related to the current status of research and its implications. "Literature" takes on a broader meaning - it refers to all sorts of written, oral and visual communication

Determine what has already been done - provides justification. Provide a rationale - an argument for your own work. References become evidence for your own work. Point out the value of a study that supports why your idea is also valid; this shows that others also think your topic is valid since they have also written on it. Show how the new study integrates with old ones. Indicate directions to which their work might point. Avoid unintentional duplication. Provides the understanding and insight necessary for development of a logical framework. Must convince the reader of the relevance and interest of the questions, and the adequacy and appropriateness of the population and research design. Framework into which your research fits. Working model of the "whole system"

 Do justice to work which has preceded your own

1) convince the reader of the relevance and interest of the questions and the adequacy and appropriateness of the choice of population and research design

2) anticipate and justify the results

3) where possible, support the interpretation of data and the conclusions reached

critical to a study because it is the place where investigators explain to the reader the theoretical underpinnings of the study.

Primary Sources - firsthand information. Original documents and reports by actual participants or direct observers

Secondary Sources - secondhand information - reference books, reports by relatives of actual participants or observers

 Later during the course of the study, the literature review becomes the reference point for retaining or changing the focus of the study

 WHEN TO REVIEW?

Some review materials broadly at first, leaving a more focused search for the end of the data analysis when emergent patterns provide guidance. Others do it in the beginning, some at the end and some throughout. Start with sources that most broadly relate to your topic/question

1) Indicate support for the importance of your topic/question from Professional or Scholarly literature

2) Clarify the important constructs of your study from Theoretical Literature

3) Explain how literature review recommends issues relating to your topic. What is the thinking of Theorists and findings of Researchers as it relates to your question

WHERE TO REVIEW: Periodicals, Abstracts, Reviews, Books, Research Reports, etc.

American Educational Research Journal

American Journal of Education

Annual Review of Educational Research

Anthropological Abstracts

Comparative Education Review

Comparative Education

Compare

Current Index to Journals in Education (CIJE)

Dissertation Abstracts International

Educational Index

Educational Evaluation and Policy Studies

Educational Researcher

Educational Administration Abstracts

Educational Leadership

Educational Resources Information Center (ERIC)

Elementary School Journal

Encyclopedia of Educational Research

Handbook of Research in Educational Administration

Handbook of Research in Teaching

Harvard Educational Review

International Review of Education

Journal of Research & Development in Ed.

Journal of Education for Teaching

Multicultural Education

Oxford Review of Education

Phi Delta Kappan

Psychological Abstracts

Readers Guide to Periodical Literature

Resources in Education (RIE)

Review of Special Education

Review of Educational Research

Smithsonian Science Information Exchange

Sociological Abstracts

Teachers College Record

THINGS TO KEEP IN MIND

1) coverage of literature is adequate; 2) review of literature is well organized; 3) studies are examined critically;

4) source of important findings is noted; 5) relationship of the problem to previous research is made clear

TYPES OF SOURCES

TECHNICAL- Technical Journal - Engineering

SCHOLARLY- Academic emphasis - Comparative Education Review - methods, literature review, findings etc.

PROFESSIONAL- Aimed at a specific profession - Elementary Education Review

LAY- Non-academic emphasis - no specifics on methods or even literature review - Parents

PRIMARY- research observes directly

SECONDARY- research observes through someone else - indirectly

TYPES OF REVIEW

SUBSTANTIVE REVIEW - compiles references to all prior empirical work done in an area. It summarizes the results of all studies done to date

METHODOLOGICAL REVIEW - how all prior studies were done

THEORETICAL REVIEW - looks at how the results of studies in the topic area were interpreted, what theoretical frames were used to inform the study and what implications were drawn.

WRITING THE LITERATURE REVIEW

With each article/book there should be: a) Complete bibliographic reference; b) Introduction; c) Critical Review of Articles (your opinions); d) Brief Summary of Major Themes of Articles; e) Relationship of Review of Literature to Your Research; f) Classify and code articles according to some system - key words help; g) Check the bibliography given at the end of each article - existing bibliographies are excellent sources of additional articles.

Be sure to have summaries and transitions between each section of your proposal to explain how that section relates to your research question/ and to explain how the sections relate to each other in terms of your research question.

Make a list of key words (search terms/descriptors) that relate to the topic you have chosen. For purposes of this class, try to limit the articles you will review to the last five years.

LIMITATIONS: Need to define the limitations of the study - what can't be done and why; what the perceived problems are and why;

what the perceived biases are and why. Define according to topic, to theory and to methodology

TECHNOLOGY - COMPUTERS: store data and files; search for data; assist with coding of data and collating of data.

Internet and literature review. Internet and data collection
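As a minimal sketch of how a computer can assist with collating data, the snippet below searches invented interview transcripts for a key word and gathers every matching line by informant. The transcripts, informant names and `collate` helper are all hypothetical, made up for illustration only.

```python
# Hypothetical interview transcripts keyed by informant.
transcripts = {
    "informant_1": [
        "The curriculum changed last year.",
        "Parents were not consulted.",
    ],
    "informant_2": [
        "We revised the curriculum together.",
    ],
}

def collate(term, transcripts):
    """Return (informant, line) pairs whose line mentions the term."""
    term = term.lower()
    return [
        (who, line)
        for who, lines in transcripts.items()
        for line in lines
        if term in line.lower()  # case-insensitive match
    ]

hits = collate("curriculum", transcripts)
```

The same pattern scales from a handful of field notes to a full corpus; the point is that the researcher, not the machine, still defines the search terms.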


RELIABILITY, VALIDITY AND ETHICS

Can any method exist without distortion? NO! But research methods help reduce distortion. Observation is an artifact of those tools. Facts are assumed to be discrete and unequivocal, independent of cultural beliefs and values - they are NOT. The scientific method is used to get at these facts

Objectivity is limited:

1) Perception itself is always filtered through conceptual processes

2) Observations are conditioned by the questions we seek to answer

3) The act of observation, whether mediated through mechanical devices (physical sciences) or through such tools as questionnaires, interviews or simply the presence of an observer (social sciences), may affect the subjects of observation in ways that modify the nature of the data sought

RELIABILITY - Replicability of data offered as evidence

1) To what extent are procedures used clear and repeatable?

2) To what extent are concepts used to analyze the observed flow of events: Will different observers, using the same procedures, identify the same items?

3) Scientific value for consensus is the cornerstone for credibility

Studies that follow the same procedures and achieve consistent results are less likely to be questioned. The more detailed the presentation of data and procedure, the more credible it is. Measures congruence between constructs of independent observers. No two observers (differing in space or time, cultural vantage point, knowledge of the informant, etc.) will view the same event in the same way. Since two observers can share similar biases, reliability is enhanced when observers with divergent procedures nonetheless agree in their descriptions. Differences are a matter of degree. With secondary data (census, tests, scores, ethnographic depictions), the prior process of selection, organization and formulation from which such data were constructed is usually obscured, and reliability at this level remains untested
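Congruence between the constructs of independent observers is often quantified as inter-rater agreement. Below is a minimal sketch of Cohen's kappa, a standard chance-corrected agreement statistic; it is not part of the original notes, and the "on-task"/"off-task" category labels are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two observers, corrected for chance.

    rater_a, rater_b: equal-length lists of category labels the two
    observers assigned to the same sequence of observed events.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of events coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement if each rater labeled independently
    # at their own marginal rates (undefined if p_e == 1).
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

kappa = cohens_kappa(
    ["on-task", "on-task", "off-task", "on-task"],
    ["on-task", "off-task", "off-task", "on-task"],
)  # 0.5
```

A kappa of 1.0 means perfect agreement; 0 means agreement no better than chance, which connects to the point above that raw agreement alone can reflect shared bias.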

WHAT CONSTITUTES RELIABILITY DIFFERS WITH TYPES OF RESEARCH

Quantitative/Experimental Design

external - would an independent researcher find similar results from similar situations

internal - degree of similarity to other researchers who have or will conduct similar research under the same construct

multiple researchers and content analysis

Emphasize careful methods and techniques in combination with rigorous definitions of concepts, measurements of variables and cleanly abstracted data

Qualitative/Ethnographic Research

external - similar methods and results - labeling factor

* multiple researchers

* peer examinations - feedback from peers

* mechanically recorded materials: video, tape - introduce a different kind of bias/reliability

internal - length of time in field increases reliability - time makes lasting impression

Emphasize validity of findings, sense of close fit between the theory developed and empirical world it is intended to explain

VALIDITY

Accuracy of fit, congruence, between the researcher's analytic constructs and events in the socio-cultural world experienced by participants. Analysis must account not simply for what we observe from our cultural-scientific frames of reference, but also for what actors observe from their vantage points. Includes providing the most complete picture possible of actual social processes. This requires assembling information from the perspectives of a variety of participants and ascertaining contextual factors that affect those processes. Ethnography is hampered by investigations that impose Western concepts onto non-Western cultures, thereby distorting the results. Comparison helps reveal differences/similarities to assist

external - General representation of place, setting; can legitimate cross-cultural research status - how one sets oneself up determines how one will be perceived. Social conditioning of observation. Is the data adequately descriptive of that reality? The informant chooses how much will be revealed and the truth of it

internal - extent to which observations are authentic representations of some reality. Cross-check with self over time, with others' perceptions of the same thing etc. Monitor one's own reactivity to the scene

MEMBER CHECKS: taking data and interpretations back to the people from whom they were derived and asking them if the results are plausible (do continuously throughout the study)

REPEATED OBSERVATION: Repeated observations of the same phenomenon - and gathering data over a period of time.

TRIANGULATION

Using multiple investigators (to establish validity through pooled judgment), multiple sources of data or multiple methods to confirm emerging findings. Cross-check the accuracy of data gathered in another way; prevents the investigator from accepting too readily the validity of initial impressions; enhances the scope, density and clarity of constructs developed during the course of the investigation; assists in correcting biases that occur when the ethnographer is the only observer of the phenomenon under investigation. Common in historical research, survey design and secondary analysis or meta-analyses of experimental results

Less frequent in single-study experimental research. This is because the precision required for isolating treatment effects mandates designs that eliminate other sources of information and converge on the specific effects at hand

SELECT A RESEARCH PROBLEM

1) Interest

2) Economics - can you afford to do the study

3) Researcher's ability and training

4) Criterion of Uniqueness - some things just can't be replicated -

RESEARCH DESIGN DEFINED BY

Type of site (field vs. lab)

Number of units selected (case study vs. survey)

Role of researcher (PO)

Number of terms studied (Quantitative)

Method of data collection (survey, participant)

Degree of control exercised by the researcher over participants (experiments, naturalistic research, simulation etc.)

 7 COMMON RESEARCH METHODOLOGIES - Variations and Combinations are Common

(1) CASE STUDY ANALYSIS - intensive, in-depth examination of one or a few aspects of a given phenomenon

(2) SURVEY ANALYSIS - addresses many instances in less depth

(3) INTERVIEW ANALYSIS - one-on-one, personal examinations of data

(4) EXPERIMENTATION - tells if a specific program achieved its stated goals, but does not give much information on how the program was implemented or to what to attribute its success

(5) STANDARDIZED OBSERVATIONAL RESEARCH

(6) SIMULATION

(7) HISTORICAL OR DOCUMENTATION ANALYSIS - rely on written artifacts

 

READ ARTICLES: DOING A CONTENT ANALYSIS

ABSTRACT

1) Is the basic idea a good and valid one? 2) Explains that this good study asks an important question: a) Why should I be interested? What is the point?; b) Will it fill a gap in existing literature?; c) Will it reconcile contradictory research results from studies already published?; d) Is it "newsworthy?"; e) Are all of the above done adequately in the first few paragraphs?

3) Study Boundaries: a) Where is the question?; b) Who is this study about?; c) Does the analysis illuminate rather than obfuscate?

 INTRODUCTION

1) Brief introduction states essence of issue in first or last sentence of the first or second paragraph; 2) Is the introduction as brief as possible given the topic of the article?; 3) Is there an explicit hypothesis that is correctly derived from theory that has been cited?; 4) General Research question: a) Stated clearly and serve as foundation for other specific questions generated for the study; b) Check congruence of specific questions against general question; c) Often the general question will be placed at the end of the review of literature. It will be stated as a question and prefaced with a lead-in like "the general purpose of this study" or "an important research question is."

 LITERATURE REVIEW

Needs to include: a) substantive review; b) theoretical review; c) methodological review

Primary sources - firsthand information - original documents and reports by actual participants or direct observers

Example - accounts from personal experience

Secondary sources - secondhand information -reference books (encyclopedias), reports by relatives of actual participants or observers. This type of data can substantiate other research methods.

Example: attendance records, minutes of citizen advisory board; teacher notes; textbooks etc.

 FORMULATE RESEARCH PURPOSE/PROBLEM AND QUESTIONS

1) Established theories (paradigms) - help generate research question. 2) How you study depends on your discipline (psychology, economics) and on sub-areas of your discipline.; 3) Explanations provided by a theory often serve as a kind of comparative case, corroborating researcher's findings. Theory affects : Choice of Design, Participants/Settings, Data Collection Strategies, Presentation, Interpretation and Application of Findings

 BUILDING A HYPOTHESIS OR RESEARCH QUESTIONS

1) defines topic of interest and establishes parameters for the research design; 2) statements of research purpose or goals: delineate what is the overall ultimate research product; 3) research questions - define how the purpose or goals will be carried out. They delineate the specific hypotheses or problems addressed in a study.; 4) Each hypothesis MUST BE TESTABLE

Can be put to empirical test - but not necessary. Can use other testing types

 Reason and deduction reduce the number of tentative explanations (hypotheses) to those which seem reasonable. This can be done through logic, through observation. Seeking "proof" by the method of elimination only establishes which possibilities are likely. Best are simple!

Spell out exactly where, with whom and how a study will be carried out.

a) Describe relationships sought or tested between two or more variables; b) Explains and predicts occurrences and events;

c) Lists facts discovered, proved or to be disproved; d) constructs or concepts generated.

What topics, problems or issues does the study address; What are its objectives; What is the scope of this investigation;

What is to be included and excluded?; Where is data to be found, and when is it accessible; How should it be recorded, collected and stored

 EVALUATING HYPOTHESES

a) Hypothesis must confirm or disconfirm; b) Hypothesis should be consistent with previously verified information or data;

c) Multiple hypotheses; d) Simplicity is best

 INDEFENSIBLE POSITION (something that cannot be tested) - state hypotheses in question form rather than as statements of expectations. Questions are useful in defining a research "problem" but fail to specify an expectation that can be confirmed or rejected

 Hypotheses conform to known principles (do not deal with the unknown)- this does not retard knowledge, but guards against the pursuit of unproductive hypothesis

 SINGLE HYPOTHESES - allows for only one implication to be confirmed or disconfirmed

 MULTIPLE HYPOTHESES - consider a multiplicity of plausible explanations for a given event and test these explanations empirically.

 PROBLEMS WITH HYPOTHESES CONSTRUCTION

Constructing hypotheses is tantamount to biasing the investigator

 Sometimes the problem is not that there are too many hypotheses, but that too few have been considered

 Hypotheses prevent the researcher from trying other and more fruitful approaches

 Hypotheses prevent unforeseen or accidental discoveries (serendipity)

 WRITING THE RESEARCH HYPOTHESIS

(1) Base the Hypothesis on a completed Literature Review. Need to know what was done before you start.

(2) Place Research Hypotheses in the first chapter. Write in statement form rather than as a question, unless, of course, a thorough review of the literature fails to provide any directions. These expectations should be derived from the literature review

(3) Researchers should plan on having more than one hypothesis to test.

(4) Do not word in null form. Should predict significant differences or significant relationships. The null form of the hypothesis is presented in the chapter on "procedures" and again in the chapter on "findings"

(5) Define all terms

FORMULATING RESEARCH QUESTIONS

Research question or problem defines the topic of interest and establishes parameters for the research design. Research questions do not need to be testable. Statements of research purpose or goals delineate what is to be the overall, ultimate product of the research. How goals will be carried out.

Describe the yet unknown; Fill gaps in existing knowledge base. Spells out exactly where, with whom and how a study will be carried out. Expand knowledge base; Initiate investigation; Facilitate integration of an emerging conceptual field; Suggest how research results might be used; Are phrased as concretely as possible in empirical or operational terms

What kind of data will you need and from whom; Where will the data be found and when is it accessible; How should it be recorded, collected or stored; Why is it needed anyway

METHOD

1) Study is free of design flaws; 2) Design is described so that replication is possible without further information; 3) When reading the Design and Analysis sections, can the reviewer keep an eye on the important variables? There is a need for a single sentence that gives the rationale for using quite sophisticated or new statistical and qualitative techniques; 4) Get a "picture" of subjects - better to over- rather than under-describe subjects/informants: a) go beyond breakdowns by age/grade/sex and ethnicity; b) consider subject volunteering and how bias affects results; c) provide context to subjects and to site; identify reasons why the author believes the subject(s) is/are representative of a larger group

DISCUSSION/CONCLUSIONS

1) Is the discussion properly confined to the findings or is it digressive?; 2) Has the author explicitly considered and discussed viable alternative explanations of the findings?; 3) Have nonsignificant trends in the data been promoted to "findings"?; 4) Are the limits of generalizations possible from the data made clear?; 5) Has the author considered possible methodological bases for discrepancies between the results reported and other findings in the literature?

RESEARCH FORMATION SPECIFIC TO TYPES OF RESEARCH

EXPERIMENTAL vs. ETHNOGRAPHY

PREDICTIVE Measure precisely the impact a specific activity/treatment has on people and predict chances of duplication. DESCRIPTIVE Document exactly what happened; describe with a natural or experimental emphasis.

VERIFICATIVE Verifies or tests (gives evidence to) propositions / given hypotheses developed elsewhere. Establishes not only the extent to which the tested propositions hold, but also the populations to which they are applicable. GENERATIVE Concerned with discovering constructs and propositions using one or more data bases as the source of evidence.

DEDUCTIVE Begins with a theoretical system, develops operational definitions and matches them empirically to some body of data. Find data to match a theory.

INDUCTIVE Begins with collection of data - empirical observations or measurements. Builds theoretical categories / propositions from relationships among data. Find theory to explain data.

ENUMERATION Process by which the researcher subjects previously derived or defined units of analysis to systematic counting or enumeration. CONSTRUCTIVE Strategy aimed at discovering analytic constructs or categories elicited from the stream of behavior, developed in the course of observation and description.

OBJECTIVE Precise construction of design that controls for biases with design and statistics, with null hypotheses and with elimination of element contamination. SUBJECTIVE Self-conscious attempt to balance observer bias and reactivity of participants.

GENERALIZE Generalize beyond scope of a single study to the larger whole. LOCALIZE Comparability and Translatability of results to parts.

DEFINITION OF TERMS

LEXICAL DEFINITIONS - those found in dictionaries, explain a given term as it is used by most people

OPERATIONAL DEFINITION - those in which a concept is defined by the operations used to measure it. They indicate the processes necessary to measure the concept. Define words by the operation that it takes to produce them - i.e. methodology. Definer stipulates what is meant by a given term and in so doing indicates the process of measuring the term itself. Terms that cannot be measured? They become tied to an observable situation and their relationship then is highlighted

ELEMENTS OF STYLE

No colloquial or slang expressions. Use correct grammar.; Be accurate; Be concise: ask if meaning is clear. Avoid verbosity; Avoid redundancy; Define all words; Tense - Try to use present tense to report well-accepted generalizations that apply to current situations, to the representation of data and to findings reported by others and self.; Use past and present tenses in discussing findings; Abbreviations should be used sparingly; Organize thoughts. Guide reader from one point to the next without being pushed, harangued, talked down to or confused

GRAMMAR HINTS

Each, every and everyone are singular and require a singular verb; Datum is singular; data is plural; Curriculum is singular; curricula is plural; Analysis is singular; analyses is plural; Criterion is singular, criteria is plural; Phenomenon is singular, phenomena is plural

WRITING TECHNIQUES

1) Organize research into sections, sections into sub-sections; have connecting thoughts between sections and sub-sections; 2) Organize notes/ideas; 3) Never use a metaphor or other figure of speech which you are used to seeing in print; 4) Never use a long word where a short one will do; 5) If it is possible to cut a word out, always cut it out; 6) Never use the passive where you can use the active; 7) Write clear sentences; 8) Never use a foreign phrase, a scientific word or a jargon word if you can think of an everyday English equivalent; 9) Moderation in all of the above is warranted; 10) Quote only when necessary and include within the context of the text - don't just add quotes for the fun of it; 11) Translations - who is doing the translating? How accurate? How can you be sure? Include as a limitation; 12) Does the manuscript conform to citation style? Is there citation continuity?; 13) Follow rules of citations and footnotes/endnotes

NOTE TAKING

Use a Uniform Bibliographical Card System:

Include: Library Call Numbers; Type of Reference; Topic; Sub-topics; Author(s); Sources; Date; Data - Author's purpose, methods and conclusions; Rephrase key sections to avoid plagiarism - but keep original quotes separate for reference.

Notes Should be Written for a Specific Purpose

 

ETHICS

Institutional Review Boards (IRBs) exist in all colleges: they review proposals, and check that proposed research ensures proper informed consent and safety for all participants

 

Ethical Principles of the APA - www.apa.org/ethics/code.2002.html

 

 


RESEARCH DESIGNS AND PROCEDURES

A-PRIORI Design - nullify problems before the study is done. Empirical - before the fact - deductive. Units of observation; sampling. Skeptical of conventional boundaries and labels. Pre-selection of a site on pre-determined variables - could be misleading. Enter the field not sure where the phenomenon lies or with whom it exists. Cannot assume all data is the same until after the study. Can't tell the theoretical issue until after the study is underway

A-POSTERIORI Design - done in the field and during write-up. Emphasis on placing data in a theoretical context. Inductive - after the fact

OPERATIONAL RESEARCH QUESTIONS

Framed within a valid research purpose. Defines topic of interest and establishes parameters for research design. Relates to Purpose/Goals - what is to be the overall, ultimate product of the research. Describes the yet unknown. Suggests how research results might be used. Spells out where, with whom and how a study will be carried out

Predictive - the more a question focuses on examining the effects caused by a specific treatment, the more credible it is

Measures congruence between constructs

Descriptive - Ethnography is always descriptive - study of interplay among empirical variables as they occur naturally

Ethnographic Methods: Generative, inductive, constructive and subjective ends of continuum

Deductive - begins with a theoretical system, develops operational definitions of propositions and concept of the theory and matches them empirically to some body of data. Find data to match a theory

Inductive - find a theory that explains the data. Begins with collection of data - empirical observations or measurements - and builds theoretical categories and propositions from relationships discovered among the data

Verificative Research - verifies or tests propositions developed elsewhere (deductive)

Generative Research - discovering constructs and propositions using one or more data bases as the source of evidence (inductive)

Constructive Strategy - discover what analytic constructs can be elicited from the stream of behavior; process of abstraction

Enumeration - process by which the researcher subjects previously derived or defined units of analysis to systematic counting or enumeration - preceded by the constructive process

CAUSALITY DESIGN: to be informative, the research must demonstrate co-variation between cause and effect; ex: randomization and random assignment - take multi-causality into account; rule out or minimize the effect of other extraneous or confounding variables;

confounding variables threaten internal validity: 1) selection (randomization decreases confounding); 2) mortality - people who drop out during the study must be accounted for in the reliability of the population; 3) history - events external to the experiment that affect participants

internal validity - make sure changes in the dependent variable can be attributed to the independent variable alone

6 RESEARCH APPROACHES AND STYLES

1) IN-PUT/OUT-PUT APPROACH: Input - process - output: see how resources affect outcomes

2) PROCESS APPROACH: look at resources - processes - students - outcomes

3) ORGANIZATIONAL APPROACH: look at: adaptation, responsiveness, innovation - outcomes (Holistic Perspective)

reflection of social demands and history of institutional practices - how flexible

4) EVALUATION: look at large-scale intervention - does the magnitude of the intervention affect achievement? What works?

5) CRITICS APPROACH: critical observer

6) ANALYTICAL LEVEL: Degree of inclusiveness which characterizes the unit being studied

Theories focused on level 1 - individual; level 2 - inter-group/ inter-individual (face-to-face); level 3 - inter-community/ inter-group; level 4 - inter-society/ inter-community; level 5 - inter-society

Observation increasingly is on settings rather than on sets of people. The meaning of "precise specification of relationships between phenomena" varies with the analytic level we focus upon. Intensive ethnographic study of a classroom may yield a body of detailed observations of classroom events and participation. Through analysis, the researcher identifies types of events, participants and situations and the types of relationships among them. Explanations are developed to account for the patterns and relationships delineated. Specification of relationships rests on recorded detail of event-sequences.

RESEARCH STYLES

Observation by itself does not provide meaning. Without asking questions and learning from "natives" you will not grasp their perspective. Essential core of Ethnography - concern with the meaning of actions and events to the people we seek to understand

ETHICS

Informants are human beings with problems, concerns and interests. Values of the ethnographer do not always coincide with ones held by informants. Tape record or take notes? How will I use the data collected, and will I tell informants how it will be used? Should I study kinship terms, or the tactics used by the government to keep informants oppressed?

RESEARCHER CONDUCT - When to intervene and When to not

If an informant engages in illegal behavior, should I make field notes accessible to police? If informants are children, should teachers or parents have access to my notes? Should I pay informants? Be sure of what the sponsoring agency expects from you when you accept its support, to be sure that this does not include, for example, quasi-spy roles. Informed Consent. Unobtrusive Public Observation - observing those who are not aware of you: if subjects are completely anonymous, it is acceptable not to get consent - but if they can in any way be identified, you must get consent. Freedom from Coercion. Honoring agreements with participants - participants must have a clear understanding of their respective roles, the agreement must be fair and the researcher must fulfill all commitments. Promises must be honored; individuals must not be deceived into participating in a study by promises of high rewards that may obscure any risks they may be taking. Clarification and Explanation of the Study. Anonymity and confidentiality. Acceptance of Research Funds. Potential Misuse of Research Funds

RESEARCHER SUBJECTIVITY- Being affected by personal issues, mental/physical conditions

LOYALTY and RESPONSIBILITY to those you are researching

OWNERSHIP OF INFORMATION - information belongs to researcher or to people being studied?

VULNERABILITY - characteristics of participants whose freedom to choose may be limited by age, by health (mental and physical disabilities), by social constraints (inmates of prisons, hospitals), or by being victims or perpetrators of crime

DISCLOSURE of who you are and what you are doing

1971, Council of the American Anthropological Association, Principles of Professional Responsibility: In a field of such complexity, choices among conflicting values are bound to arise and to generate ethical dilemmas. It is a prime responsibility of anthropologists to anticipate these and to plan to resolve them in such a way as to do damage neither to those whom they study, nor, in so far as possible, to their scholarly community. When these conditions cannot be met, the anthropologist would be well advised not to pursue the particular piece of research.

 

National Research Act of 1974 and Family Educational Rights and Privacy Act of 1974 (Buckley Amendment) - ensure protection of subjects

1) Consider Informants First; their interests may differ from those of your sponsor, from the gatekeepers (those who have the power to give or withhold permission to conduct interviews) or from significant others; 2) Safeguard Informants' Rights, Interests and Sensitivities; read your completed project with the informant; share royalties; ethnographies reveal information that can be used to affirm informant rights, interests and sensitivities or to violate them; all informants must have the protection of saying things "off the record"; 3) Communicate Research Objectives; informants have the right to know the ethnographer's aims; 4) Protect the Privacy of Informants; 5) Don't Exploit Informants; give "fair return": payment, the opportunity to tell "their story", or simply participation; 6) Make Reports Available to Informants

2) ROLE MANAGEMENT - How do you define yourself? What is your role? How do you manage your role to get the most information? Describe completely and fully your purpose. Present HONEST statements about the research. Make clear as soon as possible what the researcher can and cannot do for members of the society in which she is working (i.e. as student, faculty, government official etc.)

3) PARTICIPANT/RESEARCHER: Objectivity. Have an established role or define a new role. Dualistic goal - involved as a participant, but detached. Language/culture to learn. Risk - the possibility of any ill consequences of a research study. Reactivity - you react to what is taking place. Control for group perceptions. Sometimes exploitative. Know your perception - and the role perceptions others hold of you: how are you perceived by others? You may be assigned a role that is not compatible with your own perceived role - how do you get around it? Intimacy/intensity. Rapport - mutual understanding of roles to get accurate and honest observations. Strategy for balancing roles. Personal autonomy. Psyching up - to get everything!

 

DESIGN TYPES: CONTINUUM OF METHODOLOGY

Qualitative (one end); Ethnographic Analysis (middle); Quantitative (other end)

 QUALITATIVE

(1) DESCRIPTIVE METHOD Asks questions that have not been asked before; Seeks information that is not readily available; Restricts observer's access to perspectives of participants, to the cultural and personal meanings which motivate and inform action; Focus is on SUBSTANCE, not ARITHMETIC; Need is NOT for CODING - converting data to computable formulas; Need is for ANALYSIS - eliciting meaning from data; 1) separate data into categories for analytic purposes; 2) Emphasis is on acquiring the empirical meanings of cultural domains from a variety of perspectives (researcher, informant, etc.); 3) Review data collected according to proposal; 4) Relocate original research question (to help explain revisions); 5) Scan data (mapping procedures); a) check data for completeness; b) isolate the initially most striking; Reading - discover not only the explicit rules for the game, but also the tacit rules involved; Often combined with historical descriptive methods; Search for patterns in categories; Analysis occurs throughout the data collection; Analysis is linked with choices of theoretical frameworks, selection strategies and data collection methods; Conceptual techniques to analyze: a) theorizing - all modes of thinking upon which analysis is built: perceiving, comparing, contrasting, aggregating and ordering, establishing linkages and relationships and specializations; b) sequential selection strategies - formal operations designed to integrate data analysis with data collection: negative-case selection, discrepant-case selection, theoretical sampling, selection of theories relevant to various stages of the research; c) general analysis procedures - systematized means of manipulating data and constructs derived from data throughout the research process

(2) CONTENT ANALYSIS: LeCOMPTE'S THEORY - Interpret through any combination of four processes; 1) Theoretical consolidation; 2) Theoretical application; 3) Using metaphors; 4) Analogies and synthesis; Interpretation of data - specify what the data means for the questions asked in the study and why particular meanings are salient; Explanatory statement of cause and effect relationships

CONTENT ANALYSIS OF INFORMAL INTERVIEWS

Content analysis: B. N. Colby codes folktale and then does computer analysis of the theoretical structure, looking for eidons

 Louis Gottschalk, codes clinical interviews for affect, then does an analysis with interest in the relation between affect and content of talk

KEY STEPS: 1) Transcribe notes immediately after the interview; 2) Immerse yourself in the details, trying to get a sense of the interview as a whole; 3) Seek to categorize the different segments of talk - develop categories from the way the informants talked, rather than imposing a set from outside; 4) Mark off segments of similar content - look for clear transitions from one topic to another; recurrent topics are prime candidates for categories; 5) Once the material is categorized, apply the next analytical device - cut and copy according to the new topic-oriented codes; 6) You can then check for consistency within topics; 7) Begin building a map of the territory that will help you give accounts - subsequently begin to discuss what "those people" are like
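The categorize/cut-and-copy steps above can be sketched in a few lines of Python; the interview segments and topic codes here are hypothetical:

```python
from collections import Counter

# Hypothetical interview segments, each tagged with a topic code derived
# from how the informant talked (step 3 above).
segments = [
    ("my first year of teaching was hard", "workload"),
    ("the principal never visited my room", "administration"),
    ("grading took every evening", "workload"),
    ("we had no say in the schedule", "administration"),
    ("I loved the students", "students"),
]

# Step 4: recurrent topics are prime candidates for categories.
counts = Counter(code for _, code in segments)
candidates = [code for code, n in counts.items() if n > 1]

# Step 5: "cut and copy" - regroup segments under topic-oriented codes,
# so consistency within each topic can be checked (step 6).
by_code = {}
for text, code in segments:
    by_code.setdefault(code, []).append(text)
```

The regrouped `by_code` dictionary is the "cut and copied" material, one pile per candidate category.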

Diary-Interview Method - stresses daily routine on both typical and special days; the interviewer works with the informant to outline precisely the diary keeping

 PROBLEMS WITH RESEARCH SOLELY CONDUCTED ON CONTENT ANALYSIS

1) Universally attributable?; 2) Match those under study or explain why not; 3) Research constrained due to audience to whom research is addressed

(3) HISTORIOGRAPHY

Systematic collection and objective evaluation of data related to past occurrences in order to test hypotheses concerning causes, effects or trends of these events that may help to explain present events and anticipate future events.

 History - diffusion and ramifications of a single idea. How did it happen . . .How did it affect etc.

1) Descriptive - from the specific Time Period; 2) Explanative - from a global perspective during that specific Time Period; 3) Interpretive - from today's perspective (hindsight)

Purpose - EXPLAIN or PREDICT or CONTROL PHENOMENA - to objectively evaluate and weigh all evidence to arrive at a tenable conclusion; not to rehash, prove a point or support a position. Defining the problem should not mean: find out what is already known about a topic and retell it. One can easily verify almost any point of view by consciously or unconsciously "overlooking" evidence

CONCERN: the possibility that a problem will be selected for which insufficient data are available; the researcher cannot "create" data and is limited to data already available. Often criticized for excessive use of secondary sources, since the method is limited to whatever data are already available.

Archival records and other documents do not entail the interpersonal dynamics of interviewing, testing or participant observation - but require equal care in use as data sources.

Data they provide are dual constructs: 1) documents reflect the categories, intentions and interests of those who originally compiled them; 2) the researcher makes decisions so that information is compiled validly - as the basis for an analytic category or as an empirical example of a general concept

Both validity and reliability are problematic - and must be dealt with up-front

Critics - question validity of generalizations based on events that can never be exactly duplicated. Historiography maintains that the more similar a new situation is to a former situation, the more applicable generalizations based on the past situation will be.

Noninteractive strategies are more replicable because observer effects are controlled more easily

Disciplining the mind so that the writing of history and reporting of events that actually took place can occur

(4) CASE STUDIES

In-depth investigation, interview and/or observations of an individual, group, event, institution or community; Can stand on its own within ethnographic studies as an individualized case study; Can be "grounded" in the experience of a single individual

Purpose - determine the background, environment, characteristics, factors and relationships among factors that have resulted in the current behavior or status of the subject of the study. Uses objectivity to determine WHY, not just WHAT. Some case studies describe the event; others take the format of a cultural assimilator.

Case study is useful in demonstrating how a theoretical model can be exhibited in a concrete example; Can suggest hypotheses which can be tested later using other methods of research and with larger numbers of subjects; Can also be used to test hypotheses; Conventional single-site case study is complemented by a multi-site approach (R.C. Rist)

 General Research - nomothetic - knowledge that relates to larger numbers of persons, institutions or events.

Clinical case study - idiographic - attempts to understand the behavior and attitudes of an individual without attempting to generalize these findings to other persons/groups.

LIFE HISTORIES - details of a single person's life and in the process show important parts of the culture; Multiple life histories provide even greater in-depth orientations

LIMITATIONS/PROBLEMS

Possible observer bias (observer sees what he wants to see); Lack of generalizability; Difficult to determine which factors, historical or contemporary, are relevant to the phenomenon under investigation; Tendency to use the case study to select convenient cases rather than those which can either yield or test hypotheses.

ANALYZING CASE STUDIES

1) Find Case; 2) Gather information from variety of sources - interviews, educational and psychological tests, observation; 3) What happened?; Answer this from perspective of as many of the parties involved as possible; 4) What is the situation or problem?; Answer from perspective of people involved - does it involve values, language, nonverbal?; 5) How are the individuals in the case study related? Peers, guest-hosts?; 6) How can the situation be resolved? Consider as many courses of action as possible; 7) What are the probable results (or repercussions) of each "solution"; 8) Are the solutions at one extreme or the other, or is a compromise possible?; Is a solution even possible?; If there is no solution, how do Americans, who generally feel that problems must have solutions, deal with it?; Is additional information needed to make a decision

 (5) COMPARATIVE - METHOD OR CONTENT?

19th Century Intellectual roots of Comparative Education, legitimacy after WWII; Comparative "science of society"; Compare national systems of education for a multitude of purposes:; Explanation of national variance; international understanding; educational improvement or reform,; conducted either in one's own country or abroad; Applied - assist in educational reform domestically and internationally; Specifically: Demands of internal reform, foreign policy, patterns of funding research; Different perspectives of "Comparison"

MICRO - Forces and factors that shape education systems. Local level - cultural groups within cultural groups. Things inside school (ed administration, instructional methods, learning efficiency) are more important than things outside school; National education systems defined the field - concrete realities of school organization, curriculum, administration, finance (I. Kandel); Based on functional aspects of school - the role school systems play in contributing to cultural continuity and maintenance of the nation-state (N. Hans); Regional, local analysis - variation in educational practices and school/society relations is greater within singular nations than between nation-states

MACRO - World-systems analysis; Larger cultural groups, national, ethnic, racial; Things outside school are more important than things within school (M. Sadler); School/society relations and instructional practices cannot be understood solely with reference to a) broad educational policies concerning teacher training and certification; b) levels of resource allocations to schooling; c) score on standardized achievement tests without understanding "whole" picture - interactive and dynamic.

COMPARATIVE EDUCATION METHODOLOGY: METHODS: Emphasis on replicability - to develop data collection for systematic analysis

K. POPPER METHODOLOGY: 1) objectivity; 2) development of solid categories for comparison; 3) rigorous methods of data collection; 4) stringent analysis 5) replicable results

KANDEL METHODOLOGY: 1) description - collect data about a specific problem; 2) explanation/interpretation (historian perspective); 3) comparative analysis on levels 1 & 2; 4) derive basic principles/generalizations

PROBLEMS WITH KANDEL: 1) historical approach can be overwhelming because it is never ending; 2) plays too many roles - and touches all superficially; 3) massive works with no citations or footnotes; 4) rational character emphasis; 5) difficult to see where basic principles were derived from

BEREDAY METHODOLOGY: 1) Descriptive - justification; 2) Deductive - starts from the general and goes to specifics; 3) Data Collection: systematic collection of precise, similar data from each nation studied, after the researcher is well grounded in the languages, culture, social and historical backgrounds of those whose education they are seeking to study; 4) Procedure: a) collect data - description; b) categorization of data - interpretation (place in socio-cultural context); c) careful juxtaposition - place data side by side - similarities and dissimilarities; d) generation of hypotheses from data; e) comparison, which could be "illustrative" or "balanced"

PROBLEMS WITH BEREDAY

1) difficult to separate description from interpretation, and perhaps a waste of time - they are simultaneous stages; 2) impossible to specialize in all areas; 3) hypotheses should come before research is done; 4) doesn't allow for contextual differences; 5) labeled as an inductive approach: specific to general: starts with data, then builds hypotheses

 Hypothesis formation should precede data collection - not be a consequence of it (H. Noah, M. Eckstein)

Go from micro to macro (localized ed problem to national context) (B. Holmes)

Multidisciplinary perspective: Study both within school and out-of-school phenomenon to find complex patterns of interrelations between instructional outcomes (achievements) and social outcomes (social mobility) of schooling (C. Anderson)

Stress on relating elements of schooling to aspects of social and economic life and not the totality of the school system to all parts of a nation's life.

In-depth studies of education phenomena within a country identify school/society relationships

 

ETHNOGRAPHIC ANALYSIS

CRITICAL EXAMINATION OF DATA: Natives themselves can be biased

Researcher does not always have to adopt the natives' point of view - it is important just to understand that it exists, and that it is valid in its own right. Accordingly, natives' interpretations can be challenged, i.e. critically analyzed.

 External Criticism - determine authenticity of sources

 Internal Criticism - determine accuracy of sources

1) knowledge and competency of author - determine whether the person who wrote the document is, or was, a competent person and a person in a position to be knowledgeable concerning what actually occurred

2) time delay between the occurrence and recording of events

3) biased motives of author - distortions may be intentional or unintentional

4) consistency of data - not all records are "official"; those making statements or recordings might use their own opinions rather than citing "reality"

Ethnographic analysis - archival materials

Creates awareness by researcher of intentions and interests behind the creation of material being researched; Highlights unintentional bias that may occur when a researcher analyzes data from her own cultural perceptions (including time frames) rather than those of the subject's culture/time; Benefits - minimizes the effects of the observer on the events observed; Disciplining the mind so that the writing of history and reporting of events that actually took place can occur;

TECHNIQUES - Require that specific methods be identified that help distinguish between a) the event; b) the account of it; c) the means by which the account was prepared

ETHNOGRAPHIC APPROACH THROUGH NARRATIVE: Series of chronologically and lineally ordered events (early time vs. later time); Story Tellers - who did; what did; why did; and with what results; See how changes over time occurred - in practice, in ideals, in life, in society etc.; Avoid summarizing documents - WANT TO BE A STORY TELLER; Look at the data and see what study emerges

ETHNOGRAPHIC CONTENT ANALYSIS OF SPECIFIC TYPES OF DATA: Identification of criticism that stems from the researcher's own personal bias; Try not to be judgmental

Sometimes, such bias can be disguised as "logical forms of analysis"; the thin line between bias and critical analysis is often indistinct.

1) describe materials - concrete sensory terms; compare by class and category etc.

2) Analysis - who produced it? For whom was it made? When and where was it constructed? Under what circumstances and for what purpose was it produced?

3) Interpret and analyze - manifest and latent meanings

symbolic materials (types of dress of high school students)

Re-examine what has been identified and addressed and assess for authenticity, participant distortion, falsification and selection bias. Review artifacts within the context from which they were abstracted

 TRANSCRIPTS

1) Communication of meaning; 2) Expression of respondent interest; 3) Clarity of question and response; 4) Precision of interviewee intention; 5) Integration of intent within interviewee questions; 6) Interviewee management of potential respondent fabrication

RESEARCH TECHNIQUES

1) Finding the Facts; 2) Cross-Questioning the Book; 3) Verification: a) Collation, Matching Copies with Sources; b) Rumor, Legend and Fraud; 4) Attribution - Putting a Name to Source; 5) Explication - Getting Secrets Out of Manuscripts; 6) Disentanglement -Undoing the Knots in Facts; 7) Clarification - destroying myths; 8) Identification - Ascertaining worth through authorship; 9) Technique of Self-Criticism; 10) Knowledge of Fact and Knowledge of Causes

 

ETHNOGRAPHIC STEPS

1. Give ethnographic explanations: A Giving project explanations; B Giving question explanations; C Giving native language explanations; D Giving interview explanations

2. Ask ethnographic questions: A Asking project questions; B Asking question questions; C Asking descriptive questions; D Asking structural questions; E Asking contrast questions; F Expressing cultural ignorance; Feedback from the field redefines analysis questions

 Repeat different types of explanations/questions/explanations

A Restating informant's/local terms; B Incorporating informant's/local terms; C Distinguishing between informant/local bias through Critical Analysis

QUANTITATIVE ANALYSIS

 (1) CORRELATIONAL

Explores the relationship between an independent variable and a dependent variable; a relationship alone does not establish that the independent variable causes the dependent variable

1) pick independent variables - pick something interesting. Understand the treatment - what really will be affected. Operationalize - what is the treatment that you suspect to be the cause of the phenomenon. Range of variables (different levels). Be realistic - must show an effect - not off the wall and not too narrow. Don't overlap. Pilot study - look at everything to see if readjustment is necessary; needs extra time, money and a different population. The more you increase control in a study, the less generalizable the results (the paradox of the study); the more generalizable the study, the less likely it is that all variables were controlled.

2) pick dependent variable - operationalize - what is the outcome you suspect the treatment affects. Reliability - do the results repeat themselves over a number of trials. Validity - are you measuring what you proposed to measure. Group categories of items together with similar scores/answers. Systematic, quantitative description of the composition of the object of the study. Focus on subject. Hold constant differences between subjects. Vary people. Know the theory of the subjects and of the task. Methods - groups differ by nature. Random assignment to control for differences in subjects. Assumptions - know a lot about subjects - same cultural backgrounds. Quasi-Naturalistic - to make something happen that is infrequent, as unobtrusively as possible - to be able to see effects of acts that would otherwise be missed. Assumptions - know a little about subjects - different cultural backgrounds

PROBLEMS: Comparison groups are non-equivalent; Culturally different - can't be sure the treatment is similar for both groups

 CORRELATIONAL RESEARCH

In Education, there is movement away from the descriptive use of statistics (description of relationships with no concern about underlying theories of behavior). Here the investigator is not particularly concerned with educational or psychological theory's relationship to intelligence, achievement testing or discipline; rather, the emphasis is on answering a specific question of minor theoretical importance. These forms of correlational analysis are important tools in research - but not all problems employing these techniques are worthwhile. New TREND - structural use of correlational studies - analysis of behavior related to some theoretical system. Determine whether, and to what degree, a relationship exists between two or more quantifiable variables; use these relationships to make predictions. Variables found not to be highly related are eliminated from further consideration. Variables that are highly related suggest causal-comparative or experimental studies to determine if relationships are causal. Just because there is a relationship between self-concept and achievement does not imply that self-concept "causes" achievement, or the reverse. The degree of relationship between two variables is generally expressed as a correlation coefficient, a number between -1.00 and +1.00.
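The correlation coefficient can be computed directly from its definition; a small Python sketch, with hypothetical self-concept and achievement scores:

```python
import math

def correlation(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores: a high r shows a strong relationship, but - as
# noted above - it does not show that one variable causes the other.
self_concept = [2, 4, 5, 7, 8]
achievement = [50, 60, 65, 80, 85]
r = correlation(self_concept, achievement)
```

For these made-up values r is close to +1, i.e. a strong positive relationship.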

 3 TYPES OF CORRELATIONAL STUDIES

FACTOR ANALYSIS: Used to simplify and organize large numbers of correlations. Measures and identifies the factors common to all of the measurements under investigation

RELATIONSHIP ANALYSIS: Shows relationships of variables; "Whatever phenomenon varies in any manner whenever another phenomenon varies in some particular manner, is either a cause or an effect of that phenomenon, or is connected with it through some fact of causation."

PREDICTION ANALYSIS: Single-variable predictions - only one predictor is used to predict a criterion. Multiple-variable predictions - two or more predictor (independent) variables are used to predict a criterion (dependent variable). Regression equations are used to predict the most likely value of the dependent variable.
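A single-variable prediction can be sketched with a least-squares regression equation; the predictor (hours of study) and criterion (test scores) below are hypothetical:

```python
def fit_line(xs, ys):
    """Least-squares regression: returns intercept a and slope b for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical predictor and criterion values
hours = [1, 2, 3, 4, 5]
scores = [52, 55, 61, 64, 68]
a, b = fit_line(hours, scores)

# The regression equation predicts the most likely criterion value
# for a new predictor value (here, 6 hours of study).
predicted = a + b * 6
```

Multiple-variable prediction extends the same idea with one coefficient per predictor.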

(2) EXPERIMENTAL RESEARCH - 4 types of variables exist in experimentation

VARIABLE: Concept that can assume any one of a range of values. Examples include height, weight, income, achievement, motivation etc.

Establish cause-effect relationships; involve group comparisons. Manipulation of at least 1 independent variable and observation of effects on dependent variables (team playing; role playing etc.).

Independent Variable - "Cause" is manipulated by experimenter and is believed to make a difference. Constitutes the experimental treatment.

Dependent Variable - "Effect" is determined to occur or not occur - the difference

A study which investigates a cause-effect relationship investigates the effect of an independent variable on a dependent variable (the subject response variable).

Organismic variable - subject classifications (sex, IQ) that may be studied by the experimenter but are not manipulated by her

Extraneous variables - correlate with and could affect the dependent variable. Experimental designs are intended to control these extraneous variables and to permit the experimenter to observe the relationship between independent and dependent variables

Researcher creates the situation to be observed and tells subjects what activities they are to engage in (behavior that occurs infrequently in natural situations or not at all)

 CRITERIA FOR EVALUATING EXPERIMENTAL DESIGNS

Randomization - a means of reducing systematic error - eliminates differences between groups through this process
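Random assignment to groups can be sketched in a few lines; the subject IDs here are arbitrary:

```python
import random

def random_assignment(subjects, seed=None):
    """Randomly split subjects into treatment and control groups of equal size."""
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    pool = list(subjects)
    rng.shuffle(pool)
    half = len(pool) // 2
    return pool[:half], pool[half:]

# Assign 20 hypothetical subjects at random; each is equally likely to
# land in either group, so pre-existing differences spread evenly.
treatment, control = random_assignment(range(20), seed=1)
```

Because assignment ignores every subject characteristic, selection as a confound is removed by design.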

Sensitivity - precision of an experiment to detect difference between groups

DO NOT WANT THE FOLLOWING: One-shot case study - does not allow for comparison of changes/measurements

One-group pre/post test - likewise does not allow for adequate comparison; a series of such tests leads to a complex statistical analysis

 DO WANT THE FOLLOWING:

Equivalent time samples - generalization is only to other groups which are repeatedly tested; Equivalent materials design - generalization restricted to groups tested repeatedly

 Content Analysis of EXPERIMENTAL AND CORRELATIONAL STUDIES:

1) Under what circumstances was informed consent obtained for subjects/informants? 2) Biases in sampling? 3) Are figures/tables necessary, self-explanatory and clearly identified? 4) Do graphs correspond logically to the textual argument of the article? 5) Are the following sufficiently defined: control definitions; statistical definitions; dependent and independent variables? 6) Simplest statistics are the best: a) A good research question can be insightfully investigated with relatively simple analysis provided the assumptions are not too badly violated. The purpose of statistics is to summarize and clarify, not to fog.; b) key is to determine if the impetus for the study is a substantive research question or a fascination with the newest techniques; c) the study needs to be driven by its questions rather than its statistics; 7) Is the discussion properly confined to the findings or is it digressive? 8) Has the author explicitly considered and discussed viable alternative explanations of the findings? 9) Have nonsignificant trends in the data been promoted to "findings"? 10) Are the limits of generalization possible from the data made clear? 11) Has the author considered possible methodological bases for discrepancies between the results reported and other findings in the literature?

 PROBLEM: Not all behavior exhibited by subjects is "natural" and high risk of "faking it"

 DEVELOPMENTAL STUDIES: Development of biological, cognitive, emotional etc. characteristics; Longitudinal - individual/group - is followed over a period of time; Cross-sectional methodology - different people selected at each stage of development

 NATURALISTIC RESEARCH/OBSERVATION: Observed as they occur naturally. Observer purposely controls or manipulates nothing and works very hard at not affecting the observed situation; Intent is to record and study behavior as it normally occurs; You change the environment by your presence alone; Specialized techniques: diary - day-to-day written record of behavior of subjects by others or by subjects themselves; Episode sampling - study systematically a given episode or event as it naturally occurs without influencing it unduly by their presence.; Time sampling - describe all that a child does within some definite predetermined and representative period of time; Situational observations - control the setting in which the observations will take place, but no other restrictions or conditions are imposed

EXPERIMENTAL QUESTIONS

A) Under what circumstances was informed consent obtained for subjects/informants, sites?; B) Biases in sampling?; C) Are figures/tables necessary, self-explanatory and clearly identified?; D) Do graphs correspond logically to the textual argument of the article?; E) Sufficiently defined: Control definitions; Statistical definitions; Representative design; Measures - dependent and independent variables; F) Simplest statistics are the best: a) A good research question can be insightfully investigated with relatively simple analysis, provided the assumptions are not too badly violated. The purpose of statistics is to summarize and clarify, not to fog; b) key is to determine if the impetus for the study is a substantive research question or a fascination with the newest techniques; c) study needs to be driven by its questions rather than its statistics

(3) QUANTITATIVE CODING

Counting or enumerating items requires that the items be defined and located within the data record. Labeling given occurrences in social life does not by itself establish their relevance or significance; adding qualitative analysis helps to understand the NATURE of that commonality. Identification of specific items to be analyzed in similar ways. Items to be counted can be defined before, during or after data collection. Strategies that measure, analyze and use data. Stress on correlations derived from mathematical procedures. Gives the impression of making qualitative work more scientific. Can range from a simple count to a computer-assisted calculation of possible mathematical interrelationships among a host of variables derived from specific data analysis. Quantitative researchers recognize that not all significant information can be measured or categorized readily by a detached (universal) set of analytic procedures, but they concentrate on data amenable to this procedure
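A minimal sketch of the counting step in Python (the record IDs and code labels below are hypothetical, invented for illustration):

```python
from collections import Counter

# Hypothetical coded segments from field-note records: (record id, code label)
coded_segments = [
    ("T03", "activity/joking"), ("T03", "strategy/turn-taking"),
    ("T07", "activity/joking"), ("T07", "activity/joking"),
]

# Simple count: how often each defined code occurs across the data record
counts = Counter(code for _, code in coded_segments)
print(counts["activity/joking"])  # 3
```

The count alone says nothing about relevance; qualitative analysis of the tagged excerpts must supply that.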

CODING ANALYSIS

Primary purpose: to discard information that is no longer valid and to recast data for use in answering research questions

 


CODING & DATA ANALYSIS

CODING

Coding involves three steps:

a) search through data for regularities and patterns

b) write words and phrases to represent these topics/patterns

c) words and phrases become coding categories - they are a means to sort the descriptive data

Setting/Context Codes

Situation Codes

Perspectives Held by Subjects

Subjects' Ways of Thinking About People and Objects

Process Codes - words/phrases that facilitate categorizing sequences of events, changes over time, passages from one type of status to another

Activity Codes - regularly occurring kinds of behavior (student smoking, joking)

Event Codes

Strategy Codes - tactics, methods, techniques, maneuvers, ploys and other conscious ways people accomplish various things

Relationship and Social Structure Codes - regular patterns of behavior

Narrative Codes - describe structure itself - when does a story that is told to you start, what does it say and when does it end

Methods Codes - material relates to research procedures
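Sorting descriptive data under coding categories (step c) can be sketched as follows; the notes and code labels are hypothetical examples, not data from any actual study:

```python
from collections import defaultdict

# Hypothetical field-note excerpts, each already tagged with a coding category
notes = [
    ("activity", "student smoking behind the gym"),
    ("process", "newcomer gradually treated as a regular"),
    ("activity", "joking during roll call"),
    ("strategy", "teacher uses silence to regain attention"),
]

# Coding categories become the bins the descriptive data is sorted into
by_code = defaultdict(list)
for code, excerpt in notes:
    by_code[code].append(excerpt)

print(sorted(by_code))           # ['activity', 'process', 'strategy']
print(len(by_code["activity"]))  # 2
```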

Data Analysis

Qualitative

ongoing

models, themes, concepts

inductive

analytic induction

constant comparative method

Quantitative

deductive

occurs at conclusion of data collection

statistical

 

 

DATA ANALYSIS

Computer Assisted Qualitative Data Analysis Software: CAQDAS

helps organize or categorize - does not analyze.

Makes data more accessible

use to designate boundaries of data and attach code symbols to them

Example web-sites

http://caqdas.soc.surrey.ac.uk/

www.qualitative-research.net

 

Analysis of Variance - ANOVA -

statistical procedure that compares the amount of between-groups variance in individuals' scores with the amount of within-groups variance. If the ratio is sufficiently high, this indicates that there is more difference between the groups in their scores on a particular variable than there is within each group

If analysis of variance yields a non-significant F ratio (ratio of between-groups variance to within-groups variance), the computation of t tests to compare pairs of means is not appropriate

t test for multiple comparisons is a test of the significance of the differences between more than two sample means

EXAMPLE: Teacher efficacy in middle schools with interdisciplinary teams with common planning time vs. interdisciplinary teams without common planning time
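The F ratio can be computed directly from the between-groups and within-groups variances; a sketch with made-up scores (not real teacher-efficacy data):

```python
from statistics import mean

def anova_f(groups):
    """One-way ANOVA F ratio: between-groups variance / within-groups variance."""
    all_scores = [x for g in groups for x in g]
    grand = mean(all_scores)
    k, n = len(groups), len(all_scores)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)  # between-groups variance
    ms_within = ss_within / (n - k)    # within-groups variance
    return ms_between / ms_within

# Scores far apart between groups but tight within groups -> large F
print(anova_f([[1, 2, 3], [7, 8, 9]]))  # 54.0
```

Whether a given F is "sufficiently high" still depends on the F distribution for the relevant degrees of freedom.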

 

Analysis of Covariance (ANCOVA) - used to control for initial differences between groups before a comparison of the within-groups variance and between-groups variance is made

If a difference still remains between the 2 groups, we cannot use the control variable to explain the effect

Example: boys made significantly more grammatical errors than girls after using ANCOVA to control for initial differences in writing productivity. This helps support the conclusion that gender differences exist

 

Multivariate analysis of variance (MANOVA) - determines whether groups differ on more than one dependent variable; similar to the t test and to analysis of variance. Major difference is that the t test and analysis of variance can determine only whether several groups differ on one dependent variable.

MANOVA - each participant will have a score on 2 or more dependent variables, and MANOVA will determine whether there are statistically significant differences between the centroids of different groups. Purpose of MANOVA is to determine whether these two spaces differ significantly from each other

Ex: 2 groups of high and low achievers - measure on 2 dependent variables a) attitude toward their present school; b) attitude toward engaging in further schooling. Each score can be represented individually and then compared with others.
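The centroids MANOVA compares can be sketched in Python; the achievement-group scores below are hypothetical, and the actual significance test (e.g. Wilks' lambda) is left to a statistics package:

```python
def centroid(rows):
    """Mean vector of a group's scores across several dependent variables."""
    return tuple(sum(col) / len(col) for col in zip(*rows))

# Hypothetical (attitude-to-present-school, attitude-to-further-schooling) scores
high_achievers = [(8, 9), (7, 8), (9, 9)]
low_achievers = [(4, 3), (5, 4), (3, 5)]

print(centroid(high_achievers))
print(centroid(low_achievers))   # (4.0, 4.0)
```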


SURVEY AND QUESTIONNAIRES

SAMPLING

Process of selecting a number of individuals for a study in such a way that the individuals represent the larger group from which they were selected

Sample - limited number of elements selected from and representative of that population

Generalizing results to larger population - begin with sequential strategies to create a sample that resembles the larger population. Sampling is a necessary precursor to research: it is dynamic and sequential (rather than static). Ethnographic studies - selection is recursive, dynamic, and sequential (rather than static). Use statistically based sampling to indicate accumulation of many corroborative sources of data over time

 POPULATION

Group research is concentrated upon. A defined population has at least one characteristic that differentiates it from other groups

1) Develop a set of criteria that constitutes a portrait of the group under study - and find individuals with similar characteristics; 2) develop criteria and then advertise for willing participants; 3) participants search for researcher (in case of evaluation)

 SAMPLE SIZE

Determined by cost, time and type of research and emphasis of research; Larger is not always better. Accuracy is not based on size; The larger the sample - the more representative it is likely to be (OF THAT PARTICULAR GROUP) and more generalizable results of the study are likely to be; Homogeneity of population - helps determine how sample is formed

SOURCES OF SAMPLING BIAS

Use of volunteers - why did they volunteer?; Use of available groups - why are they available?; Don't be talked into using a biased sample for the sake of administrative convenience; Any sampling bias present in study should be fully described in final research report

SELECTING A SAMPLE

Regardless of specific techniques used, steps include: 1) identification of population; 2) determination of required sample size; 3) selection of sample type; Degree to which selected sample represents the population is the degree to which results are generalizable; Statistical inference - generalizing from known characteristics of a sample to unknown characteristics of a population. Provides an estimate of the parameters and amount of error which can be expected.

 COMPREHENSIVE SELECTION - examine every case/element in population

 QUOTA SELECTION - restricted to a representative subset of some larger population

 NETWORK OR SNOWBALL SELECTION- each successive participant/group is named by a preceding group or individual - participant referrals

 CRITERION-BASED SELECTION

Research based on pre-conceived set of criteria. During research, searches for examples that match the specified array of characteristics. Purposive - applies across selection and sampling procedures and should be contrasted only with completely haphazard means of selecting data or data sources. Use to choose groups or sites - derived via set of research problem/questions. As research unfolds, helps establish new sets of phenomena to examine

 EXTREME-CASE SELECTION - first identification of norm for a characteristic. Then extremes of that characteristic are defined and all potential cases are arrayed on a continuum. Extremes (poles) become study focus. Must be able to identify the mean first for this to work.

 TYPICAL-CASE SELECTION - Develop profile of attributes possessed by an average case and then seeks an instance of this case - strive to find "real-world match"

 UNIQUE-CASE SELECTION - cases that are unusual or rare on a dimension. Used when researcher wants to examine some dimension that functions as an experimental treatment that would not normally come from experimental studies - i.e. historical events or micro events. Also works with identification of a specific sub-population

 REPUTATIONAL-CASE SELECTION - variation of extreme case or unique case. Instances of a study population on recommendations of experts

 IDEAL-TYPICAL CASE SELECTION - develops a profile or model for the best, most efficient, effective or desirable example of the population and then finds a real-world case that most closely matches the profile

 COMPARABLE-CASE SELECTION - Form of replication. Used in multi-site studies

 CROSS-SECTIONAL SAMPLING - appropriate when interest is in the population at one point in time only

 LONGITUDINAL SAMPLING - over time - emphasis on life-cycle influences and those reflecting general historical trends

 NATURALLY BOUNDED - exist independently of researcher interest and are formed, or at least recognized, and confirmed by their constant participants

Share common geographical location - provide boundaries which are finite and discrete. Share common characteristics, not necessarily geographic boundaries (i.e. members of a national organization). Research methods are therefore limited by accessibility. Flexibility - people move into and out of groups frequently

 ARTIFICIALLY BOUNDED - identified by researchers, scholars or policy makers as collectives of individuals sharing common attributes; do not together form socially designated groups. Rigorous sampling strategies can be used with these groups only when they are clearly identified within some other bounded population

 RANDOM SELECTION - process of selecting a sample so that all individuals in the defined population have an equal and independent chance of being selected for the sample; No population element can either deliberately or inadvertently be omitted or excluded from the sample, except by chance. Mathematical base

1) defining the population; 2) identifying each member of the population; 3) selecting individuals for the sample on a completely chance basis - generally selected using a table of random numbers

 SIMPLE RANDOM - uses mathematical procedures to assure that no unit has any greater chance of inclusion in the subset than another. Requires that the population be selected, that every unit in that population be identified and be accessible to the researcher.

Requires that population be sampled/selected first; that every unit in population be identified; that each unit be accessible for study; and that each unit has an equal probability of being selected. Experimental studies claim that this is the single best way to obtain a representative sample. Representative samples are drawn in a random, unbiased manner. Random sampling refers to the process of selection and not necessarily to the extent to which samples approximate population characteristics, which are often unknown
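Given a fully identified, accessible population, the chance-based draw can be sketched in Python (the roster names and seed are hypothetical, for a reproducible illustration only):

```python
import random

# Hypothetical roster standing in for a fully identified population
population = [f"student_{i:03d}" for i in range(1, 201)]

random.seed(42)  # fixed seed so the illustration is reproducible
sample = random.sample(population, 20)  # each unit has an equal, independent chance

print(len(sample), len(set(sample)))  # 20 20 (no unit drawn twice)
```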

 STRATIFIED SELECTION - process of selecting a sample so that identified subgroups in the population are represented in the sample in the same proportion that they exist in the population; helpful when comparing subgroup populations - helps assure internal validity
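A proportional stratified draw can be sketched as follows; the strata names and sizes are hypothetical:

```python
import random

def stratified_sample(strata, total_n, seed=0):
    """Draw from each stratum in proportion to its share of the population."""
    rng = random.Random(seed)
    pop = sum(len(units) for units in strata.values())
    return {name: rng.sample(units, round(total_n * len(units) / pop))
            for name, units in strata.items()}

# Hypothetical population: 60 freshmen, 40 seniors; a sample of 10 keeps the 60/40 split
strata = {"freshmen": list(range(60)), "seniors": list(range(60, 100))}
picked = stratified_sample(strata, 10)
print(len(picked["freshmen"]), len(picked["seniors"]))  # 6 4
```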

 CLUSTER SELECTION - groups rather than individuals are randomly selected; any intact group of similar characteristics is a cluster; Where lists of individuals are unavailable or the characteristics of the population are not well known, it is possible for the researcher to sample areas or clusters of elements first, and then to sample individuals or elements within the clusters.

 SYSTEMATIC SELECTION - individuals are selected from a list by taking every Kth name, where K equals the number of individuals on the list divided by the number of subjects desired for the sample; helps assure internal validity. Even though choices are not independent, a systematic sample can be considered a random sample IF the first member is randomly selected (a relatively infrequent event). Ordering principles may be numerical, chronological, spatial, alphabetical

Problem - establishing a sampling interval unaffected by some confounding fluctuation or variation in the population
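The every-Kth-name rule with a random first selection can be sketched as follows (the roster is hypothetical; note the sketch does nothing to guard against the periodic-fluctuation problem just mentioned):

```python
import random

def systematic_sample(names, n, seed=0):
    """Every Kth name after a random start, where K = list size // sample size."""
    k = len(names) // n
    start = random.Random(seed).randrange(k)  # random first selection
    return names[start::k][:n]

roster = [f"name_{i:03d}" for i in range(100)]  # hypothetical ordered list
picked = systematic_sample(roster, 10)          # K = 10
print(len(picked))  # 10
```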

 PROBABILISTIC SAMPLING - extracting from an already well-defined population a subset for study approximating the characteristics of the group. Requires mathematical procedures for assuring that the smaller group is representative of the larger group

 Ethnographers use systematic and random sampling specifically to assure internal validity in their studies. When observations are planned, samples ensure representativeness of findings

 STATISTICAL SAMPLING - simple random or stratified - study small group possessing same distribution of characteristics as the larger population to which they intend to generalize

 Statistical sampling is inappropriate:

1) when characteristics of larger population have not been identified; 2) when groups possess no naturally occurring boundaries; 3) when generalizability is not a salient objective; 4) when populations are composed of discrete sets and characteristics are distributed unevenly among them; 5) when only one or a few subsets of characteristics are relevant to the research problem; 6) when some members of a subset are not attached to the population from which sampling is intended; 7) when researchers have no access to the whole population from which they wish to sample; 8) when excluding any member of a population is too risky

 Statistical sampling is irrelevant:

1) studies where initial description of previously little known or singular phenomenon is desired; 2) where social constructs - to be tested later in more stringently controlled designs; 3) where goal of research is explication of meaning or micro-social processes; 4) where subject of investigation is an entire population

QUESTIONNAIRE/SURVEY

The major differences between an interview study and a questionnaire study are: Nature of the instrument involved (an interview guide vs. a questionnaire); The need for human relations and communication skills; Methods for recording responses

 SURVEY: distributed to potential respondents for them to answer unsupervised by the researcher

 INTERVIEW: a survey of people administered face-to-face by the researcher

 Means of gathering information for specific purposes. For quality research: must show connection between information gathered and hypothesis/theories. Start with work-table containing a list of questions related to the hypotheses.  

For each of these questions, the investigator can justify the need for collecting that type of information by questionnaire. Literature Review can indicate variables to be included as well.

 ADVANTAGES - Economy. Expense and time involved in training interviewers and sending them personally to interviewees are diminished. Questionnaire can be sent almost anywhere. This increases the size of the sample. Each respondent receives the same set of questions, phrased in exactly the same way, as on standardized tests. Yields more comparable data than do interviews, if questions are structured. Yet highly structured interviews can easily become stilted

 DISADVANTAGES - Motivation of respondent is difficult to check, and validity of responses is difficult to judge; assumes that respondent is literate. No survey or questionnaire, no matter how carefully standardized, will be perfectly appropriate in every cultural situation.

 SURVEYS

Maps out and describes information from large groups of people that try to represent the population at large in relation to one or more variables. Attempts to collect data from members of a population in order to determine the current status of that population with respect to one or more variables. Involves careful design, formulation of hypotheses, description of variables, plus selection and definition of a population, etc. Infers information about a population based on responses of a sample drawn from that population (simple-random, stratified, etc.). Usually includes probabilistic samples (representative of whole population). Distributed to potential respondents for them to answer unsupervised by the researcher. Culturally determined and bound - both by ethnographer and by those being studied. May be conducted by mailed questionnaires or collected by single interviewers or teams

SURVEY TYPES

Census Survey - acquire data from each and every member in a population

 Public Opinion Polls - how members of population feel about socio-political, economic, educational issues. Results reported separately for each subgroup and for total group.

 School Survey - study of an individual school, or all schools in a district. Purpose of internal or external review, projection of needs, etc. Results often deal with current status of variables such as community characteristics, institutional and administrative personnel, curriculum and instruction, finance, physical facilities etc.

 Participant-construct survey - measure strength of feeling people have about phenomena. - elicit categories into which people classify items. - define "agreed upon"

 Confirmation Surveys - structured interviews/questionnaires verify applicability of key informant and other data to overall study group - assess extent to which participants hold similar beliefs, share common perspectives

Projective Surveys - When it is impossible to have individuals react to actual stimulus or context under study by using photos, drawings, games etc.

 CATEGORIZING SURVEYS - Categorized on basis of

1) type of person being questioned or the roles they occupy; 2) structure and administration of the interview

 PROBLEMS WITH SURVEYS

Inaccurate indicators of actual behavior (bias and distortion); Self-reports are useful for assessing how individuals make judgments about people and events, and what they think - but only when corroborated by observational data; Need to demonstrate equivalence of meaning between researcher and respondent

 STANDARDIZED QUESTIONNAIRES

Questions are written in advance and pre-arranged answers are envisioned. These answers help define what "words" are chosen for your questions. Requires less time than interviewing, is less expensive, and permits collection of data from a much larger sample than interviewing. Every item/question must highlight the problem and relate to the specific hypothesis. Can be mailed or personally administered. Personally administered: gives advantages of interviews - opportunity to establish rapport, explain purpose of study and clarify individual items - though time and availability eliminate this advantage in the long run. Topic should motivate response. Create own research instruments in the field - but can base on pre-existing sources. Questionnaire finds correlations: result of general hypothesis - "If X is present in this community, will we tend to find Y associated with it?" No questionnaire, even the most carefully standardized, is perfectly appropriate for all cultural situations

 SHOULD NEVER BE USED ALONE - this data only supplies a framework for suggesting explanations. Every item/question must highlight the problem under study and relate to the specific hypothesis

 QUESTIONNAIRE PROBLEMS: subject's true response not listed among alternatives

 QUESTIONNAIRE SAMPLING - biased sampling occurs with each questionnaire that is not returned.

Average rate of return is 75%; Increase % with 2nd and 3rd mailings and phone follow-ups

 Return rate depends on: A) length of questionnaire; B) reputation of sponsoring agency; C) complexity of questions asked; D) relative importance of study as judged by the respondent; E) extent to which the respondent believes that his or her responses are important; F) quality and design of the questionnaire itself.

 MEASUREMENT SCALES FOR QUESTIONNAIRES

 NOMINAL SCALE - Involves fewest assumptions. Names or labels for persons, objects, activities or beliefs form this type of scale. Purpose is to identify or categorize attributes

 ORDINAL SCALE - All characteristics of nominal scales plus attributes that may be ranked (highest to lowest, best to worst). Grading systems are examples

LIKERT SCALE: 5-point scale (strongly approve, approve, undecided, disapprove, strongly disapprove).

Assign points to each category. Ordinal scales involve ranking, but make no assumption concerning equality of differences between ranks
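Point assignment for a 5-point scale can be sketched as follows (the responses are hypothetical; remember the average of ordinal ranks should be interpreted cautiously, since distances between ranks are not assumed equal):

```python
# Map each Likert response category to points, then summarize
LIKERT = {"strongly disapprove": 1, "disapprove": 2, "undecided": 3,
          "approve": 4, "strongly approve": 5}

responses = ["approve", "strongly approve", "undecided", "approve"]  # hypothetical
scores = [LIKERT[r] for r in responses]
print(sum(scores) / len(scores))  # 4.0
```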

GUTTMAN'S SCALOGRAM TECHNIQUE - Makes use of ordinal measurement. This technique is not so much a method of constructing scales as one of determining whether existing scales are unidimensional (scalable), quasi-scalable or nonscalable. The extent to which the items are scalable is a function of the reproducibility of the item scores from the respondent's total scores.

 INTERVAL SCALE - Can be ranked. Differences between successive ranks will be equal.

Common method of scaling is THURSTONE AND CHAVE METHOD OF EQUAL- APPEARING INTERVALS.

 CONSTRUCTION OF QUESTIONNAIRE

SELECTION OF SUBJECTS: 1) those who have desired information; 2) are likely to be willing to give it

QUESTIONNAIRE DESIGN: Look - attractive and brief and easy to respond to

 Each question should deal with a single concept and be worded as clearly as possible. Any term/concept that might mean different things to different people should be defined. What does "a lot" mean?; Use appropriate jargon - the "slang" of those answering the questionnaire

 Structured Questionnaires - identify and list alternatives, each distinctly different from the rest. Structured items also facilitate data analysis; scoring is very objective and efficient. Disadvantage - subject's true response is not listed among the alternatives.

 Unstructured Questionnaires - complete freedom of response - permits greater depth of response and insight into reasons for responses. These are simpler to construct. Disadvantage - may provide information extraneous to the objectives of the study, responses are difficult to score and analyze and many subjects may not be happy with an instrument that requires written responses

 Some questionnaires contain both structured and unstructured items with emphasis on structured

 VALIDATION OF THE QUESTIONNAIRE: Does the questionnaire measure what it was developed to measure? A questionnaire developed to determine the classroom behavior of teachers, for example, might be validated by observing a sample of respondents to determine the degree to which their actual behavior is consistent with their self-reported behavior

 PREPARATION OF COVER LETTER: Every mailed questionnaire must have a cover letter. Explains what is being asked and why, to motivate the responder to fulfill the request. Address to the specific respondent if possible; explain purpose of study, emphasizing its importance and significance. Give responder a good reason for cooperation (your need for data for a thesis is not good enough). State a commitment to share results of study when completed; endorsement of organization, institution, group, etc. Provide a specific deadline - 2-3 weeks. Sign each letter individually. Stamped, return envelope, if feasible

 PRETESTING THE QUESTIONNAIRE: Includes comments concerning instrument deficiencies as well as suggestions for improvement. Having two or three available people complete the questionnaire first will result in identification of major problems; send cover letter and questionnaire to a small population. Encourage comments and suggestions concerning directions, recording procedures and specific items. If the percentage of returns is low - reformat

 FOLLOW-UP - In Experimental Type Methodology the claim is that if your % of return is not at least 70%, the validity of your conclusions will be weak. Aim for: 40% return (1st mailing); 30% return (2nd mailing). Send out a reminder postcard. Send a second questionnaire with a new cover letter - stressing urgency for response

NONRESPONSES - Significant in itself - try to determine if nonrespondents are different from respondents in some systematic way (phone, in person, interview)

 ANALYSIS OF RESULTS - Response rate for each item should be given as well as total sample size and overall % of returns, since all respondents may not answer all questions. "On item 4, 50% said yes, 30% said no, and 20% said neither"
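The per-item reporting described above can be sketched as follows (the responses to "item 4" are hypothetical, matching the quoted example):

```python
# Hypothetical responses to item 4; nonrespondents to this item are excluded,
# which is why the item's own response rate must be reported alongside it
answers = ["yes"] * 50 + ["no"] * 30 + ["neither"] * 20
n = len(answers)
for choice in ("yes", "no", "neither"):
    print(f"{choice}: {answers.count(choice) / n:.0%}")
# yes: 50%   no: 30%   neither: 20%
```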

 TYPES OF QUESTIONS

UNSTRUCTURED QUESTIONS- Starts with general questions going more and more specific.

Completely unstructured questions allow absolute freedom of response, can yield in-depth response, and provide otherwise unobtainable insights, but produce data that are very difficult to quantify and tabulate.

Facilitates explanation and understanding of the responses to structured questions

Semi-structured approach - structured questions followed by clarifying unstructured, or open-ended questions

Techniques involve attentive listening and questioning

 STRUCTURED QUESTIONS - Make use of a prepared "interview schedule" - a series of questions to which the interviewer requires specific answers. Questions that require the interviewee to select from alternatives (multiple choice) are easier to analyze, but defeat the purpose of an interview. Ask questions in a way the subject will understand and that will provide standardized information that can be compared to data collected in other tests.

 QUANTIFIER questions - check on informant's point of view; go deeper and get a precise connotation of what is being said. Need to define words like "none", "a few", "some", "many", "all".

CONTRAST questions - throw a switch that personalizes or depersonalizes a question, i.e., depersonalize a question that gets into a sensitive area. Use items similar enough to compare. You get two possible lessons with one question: Are the two items comparable? If so, then how do they differ?

 ENTAILMENT questions - guide the formation of other questions to check if you've put the new piece of information in the correct contextual place

 FRAME - statement with a hole in it that can be filled in a variety of ways: more general strategy than others: "Toys are used for . . ."

 SCRIPTING QUESTIONS - After the researcher decides WHAT TO ELICIT, WHAT QUESTIONS will guide the interview. Determines what questions are asked, in what order, and what additional prompting is needed

 ORDERING - Arranging, organizing and sequencing of questions to communicate to respondent the researcher's intent and direction

 BAITS - Type of leading question that strongly suggests an answer. The ethnographer takes a statement made by the informant, especially when it fits his idea of what is going on in the field, and then does everything possible to get the informant to modify, contradict, or weaken the statement

LEADING QUESTIONS - Many put this on their "Don't" List because it is claimed that these types of questions suggest that one response may be more appropriate than another (Patton and Lofland). Others emphasize whether or not the interviewee controls the conversation enough to know where it is going. The informant may or may not want to follow. Well-worded, leading questions can contain deliberate assumption or overstatement, provoking a complex or elaborate response which otherwise would be missed. Studies show that proportionately more leading questions elicit additional volunteered information and that the amount of misinformation was the same for leading and nonleading questions.

1) When question correctly anticipates the informant's answer, it gives an impression of friendliness and interest and encourages volunteered elaboration

2) If the question is clearly incorrect, it gives the impression of a misinformed ethnographer and elicits corrected information

3) If the question is partially correct or incorrect, it gives the impression of inattention and lack of comprehension, and the informant may fail to correct the ethnographer. This last type is the "bad" type, since you don't know what is coming back

 

TYPES OF QUESTIONS TO AVOID

1) Questions to which respondent might not reply honestly

2) Questions that assume a fact not necessarily in evidence

  


PARTICIPANT OBSERVATION

ETHNOGRAPHIC METHOD

Involves intensive data collection on many variables over an extended period of time in a naturalistic setting. Observing many aspects of the learning environment and attempting to identify factors associated with effective and ineffective environments. Does not want to be overtly influenced by plans, prior findings, methodological limitations and biases. Some observations build on working hypotheses and strategies for "in-field" use. Defines and/or refines the research problem.

This helps researchers make informed decisions regarding what type of observation to do, concerning the appropriate:

a) environment/setting; b) effective levels of participation; c) precise activities to be observed; d) analytic framework under which the study will be conducted and analyzed

Many (Rist, Spradley) maintain that all this should be conducted prior to the first site visit. The end result will be a structured and predetermined approach to data collection and analysis.

 TYPES OF OBSERVATION - OBSERVATIONAL RESEARCH

A tool to look anew at the world. However, one can observe: without recognition; with partial recognition and understanding; with full recognition and understanding; without full view (hidden), using deductions; through an intermediary source; not by asking, but by observing. Once a behavior/characteristic is defined, observations must be qualified and quantified so that all observations will be analyzed and coded in the same way. Observations are conditioned by the questions we seek to answer and by our conceptual process, and our presence itself affects situations. Pelto & Pelto emphasize the use of both verbal and nonverbal tools in observation: Wide-Angle Lens - holistic perspective; Insider/Outsider Experience - analysis from different perspectives; Introspection - use self as research instrument: how does it feel to be "objective"? A pre-determined scheme for direct observation helps when (Whiting): informants may be unable to give relevant information in the area of interest to the researcher; researchers may want some kind of standardized data from a variety of different settings
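The requirement above, that all observations be coded in the same way, can be checked numerically once two observers have coded the same events. A minimal sketch with hypothetical category codes, using chance-corrected agreement (Cohen's kappa):

```python
# Hypothetical example: two observers code the same 10 classroom events.
# Cohen's kappa checks that the coding scheme is applied consistently.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two observers' code lists."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    # Expected agreement if each observer coded at random with their
    # own marginal frequencies.
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

obs_a = ["question", "lecture", "lecture", "question", "groupwork",
         "lecture", "question", "groupwork", "lecture", "lecture"]
obs_b = ["question", "lecture", "question", "question", "groupwork",
         "lecture", "question", "groupwork", "lecture", "lecture"]
print(round(cohens_kappa(obs_a, obs_b), 2))
```

Values near 1.0 indicate the observers share the coding scheme; values near 0 mean agreement is no better than chance.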

TYPE OF FOCUSED OBSERVATIONS - what might you focus on?

1) Activities; 2) Types of acts; 3) Specific individuals; 4) One individual's life; 5) Role and statuses; 6) Dyads/triads; 7) A bounded setting or situation; 8) Times of the day, or periods in a routine; 9) Stages in the life cycle

 INTERVIEW/OBSERVATION

Relationship between what people SAY and what they DO. People do not always behave as they talk, nor talk about all that they do. Observations are important: informants giving accounts in interviews may leave things out, and this may happen for a variety of reasons. Never forget TIME AND MONEY

 Use observations, especially participant observation, to validate your interpretations

 1. Is there a common language or vocabulary available (similar folk terms)?

 2. Is there sufficiently clear patterning in behavior, or will questions be "over-contextualized"?

 3. Is the definition of the situation equivalent?

 4. Are people aware of the behaviors, events, beliefs or attitudes you want to find out about? Nonverbal or everyday behaviors may be unconscious

 5. Are you interested in things people do NOT do, rather than what they do?

 6. Are people defensive, nervous, too concerned about their behavior to report it well?

 7. Can you not ask about it for role or decorum reasons? Can you not observe it? Will people have to over-generalize or abstract from past events? Will people have to report frequencies or times to you? Do you care, for purposes of answering your focused questions, how reports and behaviors compare? Does it happen too rarely (wasted time)?

8) Do you want to COMPARE your data to other data using a specific type of measure? (systematize)

 

DATA SOURCES OTHER THAN INTERVIEWS AND OBSERVATIONS

ARTIFACTS - "Found objects"

Symbolic materials - writings, signs, diaries, journals and other participant-generated records; local literary products; artifacts obtained from archives and demographic data banks; logs kept by informants. Nonsymbolic materials - tools, furnishings, textbooks, etc. Photos, tapes, scrapbooks - visual aids to help jog memory. Participants as field workers - talk once a day to see how everything is going (with others doing similar work)

 NON-PARTICIPANT OBSERVATION

(A) NON-PARTICIPANT OBSERVATION - NO INVOLVMENT

Observer purposely controls or manipulates nothing and works very hard at not affecting the observed situation. Requires a detached, neutral and unobtrusive observer. The intent is to record and study behavior as it normally occurs. Complete non-participant observation - the researcher is completely hidden (another room, use of video, etc.). Semi-complete non-participant observation - the researcher positions self on the margins of social events, sees events through the eyes of videotape, etc. Observation alone, without anything else, provides comprehensive, detailed and representative accounts of behavior/interaction: noting arrangements of physical space in a classroom; monitoring the rhythm of interactions; observing children at play. Focus is on participant behavior rather than on participant meaning. Still includes what people say and how they say it, but does not interrupt to seek clarification (a procedure integral to more interactive methods). Rarely used as an initial, exploratory method in addressing problems, topics and settings; more appropriate for the refinement and verification stages of the research process. Used after more generative strategies have indicated which interactions are salient for observation/recording.

1) Units of analysis are specified prior to collection of data.

2) Means of recording observations are chosen for appropriateness to identified units

3) Strategies for selecting or sampling recorded units are chosen.

4) Trials or pilot studies are used to refine techniques.

 LIFE HISTORIES

Life cycle detailed and analyzed. Does not emphasize requirements of society, but rather individual experiences and requirements - how the individual copes and develops within society (David Mandelbaum: 1977). Autobiographies can be utilized in the analysis of the life history. Interviews form the foundation of the life history

 STREAM-OF-BEHAVIOR CHRONICLES

Minute-by-minute accounts of what a participant does and says: filmed, taped, or recorded by hand on the spot. These methods help nonparticipant observers to delineate categories of activities, study the use of time and motion, and map movement and the physical environment. Generates process data for investigations of how materials are manipulated or what styles teachers use

 PROXEMICS AND KINESICS - Social uses of space and bodily movement

PROXEMICS - study of how people use space

KINESICS - study of how people move

 PROXEMICS, KINESICS, STREAM-OF-BEHAVIOR AND LANGUAGE STUDIES - all are facilitated by the use of audio and videotaping equipment.

 COMPLETE NON-PARTICIPANT OBSERVATION - Researcher is completely hidden (another room, use of video etc.)

 SEMI-COMPLETE NON-PARTICIPANT OBSERVATION - Researcher positions self on the margins of social events and sees events through the eyes of videotape, etc. The observer purposely controls or manipulates nothing and works very hard at not affecting the observed situation; records behavior as it normally occurs. Focus is on participant behavior rather than on participant meaning; does not interrupt to seek clarification (a procedure integral to more interactive methods).

LOW INVOLVEMENT - MODERATIVE/PASSIVE

PASSIVE PARTICIPATION - bystander, spectator, in large public places. Interviews can supplement

 MODERATE PARTICIPATION - balance between active and passive. Example: CASE STUDY

 COMPLETE - ACTIVE PARTICIPANT OBSERVATION. Seeks to do - State of mind

Ethnographer is not merely a detached observer of the lives and activities of the people under study, but is also a participant in those rounds of activities - actually becomes part of the situation. There are degrees of participation (full, partial, etc.)

Neither recording everything nor getting it all down are attainable goals for participant observers; the interactive stream is too complex and too subtle to be captured completely, even by a team of observers. Study a situation in which there are already ordinary participants (studying one's own lifestyle). The researcher fully lives the life of those under study: socialized by the new "culture" and a friend to those in the culture; trying to blend in and take part in daily activities; collecting the viewpoints of others, stories, anecdotes, myths, materials, etc. Better when the researcher speaks the language of the people studied. Can have first-hand experiences of what these activities mean to the people themselves. Loses natural ethnocentric bias by gradually immersing in the day-to-day facts of life of the people with whom one is working. By becoming an active member of the community, the researcher is no longer a "scientific" stranger but an entrusted friend; others become willing to share some of their thoughts and reactions, and respond to the researcher as a familiar person rather than as an intruder or stranger. Better when the researcher remains an observer for sufficient time to sample a reasonably full range of situations and circumstances

 NATURALISTIC OBSERVATION - Behaviors are observed as they occur naturally. The observer purposely controls or manipulates nothing and works very hard at not affecting the observed situation. The intent is to record and study behavior as it normally occurs. Yet you change the environment by your presence alone

 SIMULATION OBSERVATION - Researcher creates situation to be observed and tells subjects what activities in which to engage. Observe behavior that occurs infrequently in natural situations or not at all. PROBLEM: Not all behavior exhibited by subjects will be "natural" and high risk of "faking it"

INFORMAL TO FORMAL COMBINATIONS

 SIMULATED SITUATION: Define a situation typical for group members and observe them interacting within that situation. Formal: all groups are guided by the same events, and verbal behavior is recorded and checked.

Informal: content and interpretation were left to group members

FRAME ELICITATION: intermediate method on formal-informal continuum

Tests conclusions drawn from informal data, but in a way that allows new information to appear. Standardized frames - with concern for an even sample along white/black lines.

 HYPOTHETICAL SITUATION: most formal

Restricted framework into which responses are forced, with clear outcomes that either support or refute ethnographic conclusions. The informal hypothesis is checked to see whether responses match what the ethnographer thinks is important; at the same time, there is room for new responses to appear. Sorting is guided by implicit hypotheses - although different sorters would differ, there would also be a shared structure implicit in the results. If you ask the right questions, this is all the quantification you will need to support or falsify your ethnographic statements!

 PARTICIPANT OBSERVATION PROBLEMS

Not all social situations allow complete participant observation. The greater the participation, the greater the potential bias. The greater the participation, the greater the potential ethical problems. Participant reports of activities and beliefs may not match their observed behavior. Participants respond to innovations in a variety of unintended ways

 GUIDELINES FOR DIRECTING OBSERVATION

1) Who is in the group? How many people are there, and what are their kinds, identities and relevant characteristics? How is membership in the group or scene acquired?

2) What is happening here? What are people doing and saying: a) which behaviors are repetitive, and which occur irregularly? b) how do people interact? c) what is the content of participants' conversations?

3) Where is group located?

4) When does group meet/interact? How long?

5) How are identified elements connected or interrelated?

6) Why does group operate as it does? History? symbols, traditions, values etc.

In your research: discuss up front the types of observations you have omitted and why

Observations are guided by:

1) Research topic;

2) Theoretical and conceptual frameworks informing a study;

3) Data that begin to emerge as participant observer interacts in the daily flow of events and activities

4) Intuitive reactions and hunches that participant observers experience

  


INTERVIEWS

Oral, in-person administration of a questionnaire to each member of a sample. Has flexibility - adaptation of the situation to each subject. Can get accurate and honest responses, since the interviewer can explain and clarify both the purpose of the research and individual questions. The interviewer can follow up on incomplete or unclear responses by asking additional probing questions, and reasons for particular responses can also be determined. You have a part in creating what comes out. Produces in-depth data not possible with questionnaires/surveys, but is expensive and time consuming and generally involves smaller samples. Learn what the significant details of an event are and how they are connected; learn how informants interpret the world through which they move. No-shows affect results, since the sample size is small. Interviewing strategies help you understand what you've done, even if you didn't know what you were doing at a conscious level; this often becomes apparent when analyzing the transcripts

 INTERVIEWS DIFFER FROM FRIENDLY CONVERSATIONS

 FRIENDLY CONVERSATIONS: Greeting, casual nature of encounter, speech used, certain cultural rules

 1) Turn taking is less balanced - informants talk more; 2) Repeating replaces the normal rule of avoiding repetition - builds a shared knowledge space; 3) Expressing interest and ignorance occurs more often, but only on the part of the ethnographer; 4) In place of the normal practice of abbreviating, the ethnographer encourages expanding on what each person says

 DISADVANTAGES

Flexibility generates difficulties, especially if the interview is unstructured. Problems may arise in summarizing, categorizing and evaluating responses to unstructured questions, since each respondent may interpret the meaning of questions differently

 Training of interviewers

 Inappropriate use - need to distinguish between facts and opinions

 TYPES OF QUESTIONS

LIMITED-RESPONSE QUESTION - Restricts length and scope of questions and responses: "Do you like school?"

 FREE-RESPONSE - Respondent is free to define the boundaries of the topic: "Tell me about school"

 DEFENSIVE-RESPONSE OR STRESS INTERVIEW - Ask intimate questions that may produce anxiety or hostility

 JAWBONING: Conversation-type interviewing - to gain rapport; to see what would make a good interview; to learn biases and factions; to introduce a topic of interest (less threatening, increases local acceptance); to pick up on local terminology; to learn taboo subjects. Information learned may be superficial

 TACIT KNOWLEDGE - We all know things that we cannot talk about or express in direct ways

1) From what people say; 2) from the way people act; 3) from artifacts people use

At first, each cultural inference is only a hypothesis about what people know. These hypotheses must be tested over and over until the ethnographer becomes relatively certain that people share a particular system of cultural meanings. Evaluate the adequacy of an inference by whether a stranger to the culture could use your statements as instructions for appropriately anticipating the scenes of the society

 ERRORS

1) In asking questions - when response to a question will not satisfy the objectives of the investigation

 2) In probing - the interviewer does not allow the respondent sufficient time to consider or complete a response

 3) Motivating - respondent's motivation may produce invalid responses

 4) Recording responses - the interviewer inaccurately records the respondent's replies

 

Biasing Factors in the Interview

Age; College major; Educational level; Experience in interviewing; Racial background; Religious background; Sex of interviewer; Socioeconomic level

 KINDS OF INTERVIEWS

KEY-INFORMANT INTERVIEWING - Key informants are individuals who possess special knowledge, status or skills who are willing to share that knowledge and skills with the researcher

 CAREER HISTORIES - LIFE-HISTORY - Elicitation of the life narratives of participants

 FORMAL INTERVIEWING - Based on a standardized schedule of questions - seeking out specific informants for the knowledge that they have. Key assumption: holding constant certain variables, such as question form and sequence, will reduce extraneous sources of bias (Cicourel: 1964). The interviewer, the interview situation, and the negotiations between interviewer and interviewee all influence the outcome in uncontrolled and usually unmonitored ways, and those involved may not interpret questions in the way intended by the interviewer. Allows for a maximal combination of reliability and validity via comparability of answers. Feasible with larger numbers; not as time consuming as "informal"

 PROBLEM

If the form or content of questions reflects the researcher's interpretive framework rather than the interviewee's, validity is also jeopardized

 INFORMAL INTERVIEWING - Helps avoid presupposing relevant categories for talking about events. This approach allows the interviewee greater freedom in setting the terms of the conversation and structuring its progress - non-directive; the informant takes the lead. Everything is negotiable; degrees of control can be exercised. Sometimes involves "chance encounters" and informal discussions as well. Integrate feedback and repetition to go in depth. Whyte offers a typology of interview directiveness: at the least-directed end of the scale are expressions of interest - you simply encourage the informant to keep talking by word or gesture. Next comes a simple reflection back of the informant's last statement. Then there is a "probe" on the last remark. At the other end of the scale are directed questions

ETHNOGRAPHIC INTERVIEW STEPS - This is a series of friendly conversations. The explicit purpose must be clear

1) Select and define a problem

 2) Formulate hypotheses - pursue specific topical areas

 3) Decide on type of Questions

 4) Samples of subjects who possess the desired information are selected

 5) Effort is made to get commitment of cooperation from selected subjects

 6) Start off with background information about the subject: a) Who is informant; b) Where does informant fit into the community

 7) Without being authoritarian, the interviewer gradually takes control of the talk, directing it into channels that lead to discovering cultural knowledge. Avoid leading questions and ignoring informants' perspectives. Relaxation is important

 8) Get the informant to tell anecdotes about experiences as they relate to the questions

 9) ETHNOGRAPHIC EXPLANATIONS - Build up knowledge by asking the same question in a variety of ways

 10) Repeatedly offer explanation to informant: 10.1 - General Statements - Describe project; 10.2 - Recording Explanations - include all statements about writing things down, reasons for video, etc.; 10.3 - Native Language Explanation - explain why you want definitions from their viewpoint; 10.4 - Interview Explanation - when something new is being introduced - explain the technique and the reason for it; 10.5 - Question explanations - when introducing new question topics or formats - explain

 11) Be aware of cultural rules for a) beginning and ending sessions; b) taking turns; c) asking questions; d) pausing; e) how close to stand to other people etc.

 12) Ethnographic Questions - introduced in stages: 12.1 - descriptive questions - "tell me what you do," "describe an event"; 12.2 - structural questions - enable the ethnographer to cover information about domains, the basic units in an informant's cultural knowledge: HOW informants have organized knowledge ("What are . . ."). Structural questions are often repeated so the informant identifies the six types of activities; 12.3 - contrast questions - find out meanings by discovering the dimensions of meaning which informants apply

 PRETESTING THE INTERVIEW PROCEDURE

The interview guide, procedures and analysis should be tried out before the main study begins. Use a small sample from the same or a very similar population. Feedback from a small pilot study can be used to revise questions that are unclear, do not solicit the desired information, or produce negative reactions. Insights into better ways to handle certain questions can also be acquired

 CONDUCTING INTERVIEWS

1) Duration; 2) Number -how many separate sessions; 3) Setting; 4) Identity of individuals involved (group or individual setting); 5) Respondent styles

 Don't try to get everything in one session - use multiple sessions to review notes, formulate more questions and refine answers

 COMMUNICATION DURING THE INTERVIEW

Effective communication is critical. First impressions are important. Establish rapport and put the interviewee at ease before the interview starts. Explain the purpose of the study, the strict confidentiality of responses, and the purpose of questions. Be sensitive to the reactions of the subject and proceed accordingly. If the interviewee feels threatened by certain questions, avoid them, return to them later, or explore the reasons, if appropriate. If the interviewee gets off the track, the interviewer can guide the conversation back. Avoid words or actions that may make the subject unhappy or feel threatened - frowns and disapproving looks have no place in an interview!

 TECHNIQUES FOR DOING FOCUSED INTERVIEWS

Multiple measures are often valuable; sometimes it's "good to count." Picture window method - narrow focus - will help count what happens. Spot observation method - sneak up and observe without disturbing - a quick glance

Shadowing Method - follow around - monitor individual

 Behavior checklists - record pre-ordained behaviors

 Focused event samples and Obtrusiveness judgments

 Observer ratings and other post-visit summaries - judgements and summaries

 Systematic domains and index categories used for notes

 Helps not to under-observe; check biases, and check on the effect of your role in instances

 INTERVIEW QUESTIONS

STRUCTURED QUESTIONS

Makes use of a prepared "interview schedule" - a series of questions to which the interviewer requires specific answers. Identify and list alternatives, each distinctly different from the rest. Must have a working knowledge of the group under study

 Ask questions in a way the subject will understand, so they provide standardized information that can be compared to data collected in other tests. Questions that require the interviewee to select from alternatives (multiple choice) are easier to analyze, but defeat the purpose of an interview.

 ORDERING

Arranging, organizing and sequencing of questions to communicate to respondent the researcher's intent and direction

1) Organize scripts topically so that questions addressing the same topic or culminating in a major idea are grouped together

 2) Questions on each topic should be ordered by complexity, beginning with the simplest eliciting the most familiar information - careful to avoid boredom (and ignoring of these sometimes important "easy" questions)

 UNSTRUCTURED QUESTIONS

Starts with general questions and moves to more and more specific ones. Permits greater depth of response and insight into the reasons for responses. Provides information extraneous to the objectives of the study; such data are difficult to analyze but insightful. Open-ended questions (use of digression methods). Completely unstructured questions, which allow absolute freedom of response, can yield in-depth responses and provide otherwise unobtainable insights, but produce data that are very difficult to quantify and tabulate. Facilitates explanation and understanding of the responses to structured questions. Techniques involve attentive listening and questioning. Semi-structured approach - structured questions followed by clarifying unstructured, or open-ended, questions. The interviewer guides the informant back to the topic if there is a digression - but the digression itself may provide even greater information

 LIFE-HISTORIES: encourages the informant to talk about his or her own life in a chronologically ordered way. Order is of cultural importance: in a culture with many single mothers, asking about marriage before children might not be appropriate

 RANDOM QUESTIONS - talk about an event, an interest, a particular topic

 FOLK ESTIMATES - When asking for a distribution check, or when asking other questions requiring estimates of proportion or value on the part of informants, an ethnographer should try to learn folk methods for giving those estimates

 ENTAILMENT - questions that guide the formation of other questions, to check whether you've put the new piece of information in the correct contextual place

 FRAME - a statement with a hole in it that can be filled in a variety of ways; a more general strategy than others: "Toys are used for . . .". Then take that frame in different directions. Form questions based on what the informant is saying

 LEADING BAIT QUESTIONS - Well-worded, leading questions can contain deliberate assumption or overstatement, provoking a complex or elaborate response which otherwise would be missed

Don't - suggests that one response may be more appropriate than another

Do - whether or not the interviewee controls the conversation enough to know where it is going. The informant may or may not want to follow

 QUANTIFIERS - check on the informant's point of view; go deeper and get a precise connotation of what is being said. Need to define words like "none," "a few," "some," "many," "all."

CONTRAST QUESTIONS - use items similar enough to compare: are the two items comparable? If so, then how do they differ?

SELF AND OTHER - a specific form of contrast question - it throws a switch that personalizes or depersonalizes a question; e.g., depersonalize a question that gets into a sensitive area

 DISADVANTAGES WITH THESE TYPES OF QUESTIONS

Responses may be biased and affected by interviewer's reaction (positive or negative)

Requires a level of skill usually beyond that of the beginning researcher.

 

Requires knowledge of sampling and instrument development as well as various forms of community and interpersonal relations skills

 

SCRIPTING QUESTIONS

After the researcher decides WHAT TO ELICIT, WHAT QUESTIONS will guide the interview. This determines what questions are asked, in what order, and what additional prompting is needed. Improvised or pre-designated protocols may be rehearsed or used while conducting interviews. Interview questions should be clearly meaningful to the respondent; word questions purposefully and effectively. For standardized, comparable data from each subject, all interviews must be conducted in essentially the same manner. Each question should relate to a specific study objective. A combination of objectivity and depth can be obtained, and results can be tabulated. Terms should be defined when necessary; points of reference given when appropriate

 RECORDING - Manual: leave room on paper for your responses (which you write after the interview is over); many subjects feel threatened if they see the interviewer writing a lot of information! Yet recording responses after the fact means you may not recall every response. Mechanical: recording helps with this, but may make the subject uneasy - hand counters; timers and beepers; tape-recorded reminders; the paper clip & gum wrapper method (use as a counting device without others knowing); still photos; Polaroids; film and videotape; direct-to-computer processors. Misuse, overuse and failure are greater risks than underuse. How much of a tape is transcribed is another issue - as is who does the transcribing

CHOOSING INFORMANTS - People used to gain information regarding specific data in ethnographic research. "Native speaker" - someone who knows the culture under study.

Research with Informants

1) What do my informants know about their culture that I can discover?; 2) What concepts do my informants use to classify their experience? 3) How do my informants define these concepts? 4) What folk theory do my informants use to explain their experience? 5) How can I translate the cultural knowledge of my informant s into a cultural description my colleagues will understand

GOOD INFORMANT CHARACTERISTICS

a) Experience - much of cultural knowledge is tacit (taken for granted); beware of scenes that are familiar, even if they are completely known to the ethnographer - you and your informant can miss things that are familiar. An informant from a familiar cultural scene creates problems for interviewing: at the same time you study, your informant is gathering information about what you know. They may feel you are asking dumb questions or trying to test them, and can become resentful. b) Knowledge - a natural process of learning in a particular culture; good informants know their culture so well they no longer think about it. c) Current involvement - when people are currently involved in a cultural scene, they use the knowledge to guide their actions. d) Articulate ability. e) Willingness. f) Availability of time - a series of ethnographic interviews: 6-7 one-hour interviews in total. Tandem informants - after talking to someone who has no more time, have them suggest another informant in a similar situation

 SUBJECTS - People used to gain information regarding specific data - experimental research. Specific Goal - test/seek to confirm or disconfirm hypothesis based upon subject's responses; Research with Subjects

1) What do I know about a problem that will allow me to formulate and test a hypothesis? 2) What concepts can I use to test this hypothesis; 3) How can I operationally define these concepts; 4) What scientific theory can explain the data? 5) How can I interpret the results and report them in the language of my colleagues?

 POPULATION - The group research is concentrated upon. A defined population has at least one characteristic that differentiates it from other groups

1) Develop a set of criteria that constitutes a portrait of the group under study, and find individuals with similar characteristics; 2) Develop criteria and then advertise for willing participants; 3) Participants search for the researcher (in the case of evaluation); 4) The aggregate of all observations of interest to the research - in this way you can talk about a population of responses, test scores, characteristics or traits.

 First task: CONCEPTUALIZE THE BOUNDARIES - i.e., identify the population. Descriptors constitute boundaries because they distinguish between the people to be studied and those to be excluded from consideration. Conceptual descriptors - concepts embedded in research questions (e.g., bilingual teachers). Logistical descriptors - conditions necessary to the feasibility of any study

 NATURALLY BOUNDED POPULATIONS - exist independently of researcher interest and are formed, or at least recognized and confirmed, by their constant participants. May share a common geographical location, providing boundaries which are finite and discrete, or share common characteristics without geographic boundaries (e.g., members of a national organization); research methods are therefore limited by accessibility. Flexibility - people move into and out of groups frequently. Examples: classrooms, villages, teams, etc.

 ARTIFICIALLY BOUNDED POPULATIONS - identified by researchers, scholars or policy makers as collectives of individuals sharing common attributes; although such individuals may recognize their commonality in occasional encounters, they do not together form socially designated groups (e.g., illiterate men - they are all illiterate but are not bonded together). Rigorous sampling strategies can be used with these groups only when they are clearly identified within some other bounded population. The major differences between an interview study and a questionnaire study are the nature of the instrument involved (an interview guide vs. a questionnaire), the need for human relations and communication skills, the methods for recording responses, and the nature of pretest activities.

  


QUANTITATIVE ETHNOGRAPHY

 QUANTITATIVE ANALYSIS

Strategies that secure data measurable by an analytic set of criteria, which can be rigorously tested for possible correlations by means of mathematical procedures. Specifying a concept's content precisely and using it to label given occurrences in social life does not by itself establish relevance or significance. Measurement, control, prediction - strategies that measure, analyze and use data, with stress on correlations derived from mathematical procedures

 Attributes - relations, purposes, prospects are observed. Continuous variation - height. Discontinuous/categorical variation - being tall. Analysis can range from a simple count to a computer-assisted calculation of possible mathematical interrelationships among a host of variables derived from specific data analysis.

 Quantitative researchers recognize that not all significant information can be measured or categorized readily by a detached (universal) set of analytic procedures, but they concentrate on data amenable to this treatment

 Quantification of Qualitative Research

Educational Research - can be both qualitative and quantitative

 Ethnographic Perspectives - Defines "variables" in research mode - stresses monitoring of phenomena over time

Cognitive Approach - Try to get participants to make categories themselves

 Quantitative assessment of frequencies of types of behavior is basic to qualitative description - it is partly on this basis that judgments of typicality are made (Wallace: 1970)

 Treating observations as social facts lets qualitative/naturalistic studies be expressed in the quantitative code for "qualities of things, kinds of things" and thus become scientific

 Adding Qualitative analysis helps to understand the NATURE of that commonality

EXPERIMENTATION AND CORRELATIONAL DESIGN UTILIZE

 Reliability and validity are used to select tests for inclusion in research project and to evaluate examinations prepared by the investigator

TEST ANALYSIS FORM - serves as guideline for selecting and evaluating tests

Reliability provides an estimate of how well measurements reflect true (non-random) individual differences. Obtained scores contain both true and error (random or chance) components. Reduction of random error increases reliability

 Number of conditions affect reliability of Tests

a) Group variability; b) Test length; c) Tests that are overly difficult or too simple; d) Reliability of instruments

 Disguised techniques can conceal the true purpose of an investigation if the study is controversial or potentially ego-threatening.

 SOCIOMETRY - designed by Moreno as a method of studying peer group interactions. Can be used to study student interaction and preferences.

 DATA PROCESSING - involves those methods used to convert raw data into a form capable of being interpreted.

Keep in mind: accuracy; completeness; simplicity; speed; cost and availability

TYPES: machine - scoring answers; edge-punched cards; computers

 CODING ANALYSIS

DISTRIBUTION - picture of variation in the sample: summarizes differences in informants' responses and gives an indication of how well the ideas you're checking are supported. The casual nature of the sample (not being random) loses its problematic nature: as long as there is a reasonable amount of variation in the sample, you can be confident that you have hit the proverbial nail. Such a test adds credibility to your eventual ethnographic report.

Maybe your information is correct for one subgroup and incorrect for another. In cases like this, you might also look at different parts of the sample. Maybe there is a need to break the table down into different categories, i.e. Old/Young, Male/Female. Such breakdowns will not always help, because sometimes the underlying pattern that explains the distribution is related in some more complicated way.

Breaking data down into statistical distributions - "mean", "median" and "mode" (normal curve, binomial distribution, chi-square distribution, etc.) - can obscure the specific pattern in the data that allowed the ethnographer to interpret it. Yet at other times, statistical distributions can help check whether or not what you have differs from what you might get by chance. The ethnographer's appreciation of the complexity of pattern often makes procedures focusing on the relationship of a few variables (mathematical statistics) look superficial at best, delusional at worst.

Testing and conducting experiments engage experimenter and subject in a relationship similar to that of interviewer and interviewee. Often testing is carried out by researchers who have no relationship to their subjects other than that of "tester" or "interviewer". The significance to subjects of the test/interview, and of the context in which it is carried out, must be taken into account explicitly in the analysis of results. The hazards to the reliability and validity of experimental research are documented
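One way to make the subgroup-breakdown idea concrete: tally responses by category, then compute a chi-square statistic by hand to see whether the pattern differs from chance. The counts and group labels below are invented for illustration; in practice they come from your own coded data.

```python
def chi_square(observed):
    """Chi-square statistic for a contingency table given as
    {(row, col): count}: compares observed counts with the counts
    expected if the row and column classifications were independent."""
    rows = sorted({r for r, _ in observed})
    cols = sorted({c for _, c in observed})
    row_tot = {r: sum(observed[(r, c)] for c in cols) for r in rows}
    col_tot = {c: sum(observed[(r, c)] for r in rows) for c in cols}
    grand = sum(observed.values())
    stat = 0.0
    for r in rows:
        for c in cols:
            expected = row_tot[r] * col_tot[c] / grand
            stat += (observed[(r, c)] - expected) ** 2 / expected
    return stat

# Hypothetical Old/Young breakdown of agree/disagree responses
table = {
    ("young", "agree"): 30, ("young", "disagree"): 10,
    ("old", "agree"): 10,   ("old", "disagree"): 30,
}
stat = chi_square(table)
```

Here stat works out to 20.0, well above the 3.84 critical value for one degree of freedom at p = .05, so a subgroup difference this sharp is unlikely to be chance.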

Cross-cultural research based on this type of method raises additional problems, such as the familiarity of subjects with the setting in which tests are conducted, with the form of the test procedures and the implicit rules of appropriate behavior, and with the content of the tests themselves.

Since all societies are multicultural to a degree, these problems may be equally salient for studies within our society. Where quantity appears to be significant in the sociocultural world being studied, qualitative researchers use quantification more extensively

STUDY CHECKS

A. Empirical - Experimental studies

1. Operationalization of the variables is appropriate

2. Instrumentation used to measure the variables is fully described and is appropriate.

3. Treatments are fully documented and are replicable.

B. Clinical - observational studies

1. The phenomena under investigation are clearly identified

2. Interviews and observation guidelines are related to the key elements of the study

3. The methodology for recording the interviews is appropriate

C. Clinical - teaching experiments

1. The phenomena under investigation are clearly identified

2. Plans for observation are detailed and related to the key elements of the study

D. Organizational Studies

1. The organizational pattern is clearly defined

2. The commitment of the institution involved is favorable

3. The researcher gives evidence of commitment to study the effects of the alternative organizational pattern in an evaluative manner.

 

QUASI-NATURALISTIC STUDIES

Experiment - a systematic/quantitative investigation to determine the effect of variables by controlling the setting in which these variables occur. Compare to see if the variables have an effect. Internal validity - a check on the validity of the design

 Normative Log for Experiments

1) Researcher specifies a phenomenon and locates it in a task; 2) Researcher assesses the effect other variables have on the phenomenon; 3) Researcher interprets any empirically demonstrated effects; 4) Researcher designs new experiments to eliminate rival plausible hypotheses; 5) Researcher develops an explicit model of task performance, e.g. flow diagrams

 RULES

1) can't fully control the circumstances of experiments; 2) can't always determine the real nature of the treatment - i.e. global treatment; 3) variables of interest do not equal variables being manipulated; uncertainty about the variables being measured - cultural differentiation; need for cultural knowledge - are subjects responding to the task the same way researchers intend them to respond?

 ANALYTICAL LEVEL

Degree of inclusiveness which characterizes the unit being studied. Theories focus on: level 1 - individual; level 2 - inter-individual (face-to-face); level 3 - inter-group; level 4 - inter-community; level 5 - inter-society

Observation increasingly is on settings rather than on sets of people

 The meaning of "precise specification of relationships between phenomena" varies with analytic level we focus upon.

Intensive ethnographic study of a classroom may yield a body of detailed observations of classroom events and participation. Through analysis, the researcher identifies types of events, participants and situations, and the types of relationships among them

 Explanations are developed to account for the patterns and relationships delineated; Specification of relationships rests on recorded detail of event-sequences.

 EXPERIMENTAL DESIGN

Explores the relationship between an independent variable and a dependent variable; a causal inference is drawn from the independent variable to the dependent variable

 1) Pick independent variables. Pick something interesting. Understand the treatment - what really will be affected. Operationalize - what is the treatment that you suspect to be the cause of the phenomenon? Range of variables (different levels). Be realistic - the treatment must show an effect - not off the wall and not too narrow. Don't let variables overlap. Pilot study - look at everything to see if readjustment is necessary; it requires extra time, money and a different population. Increasing control over the study limits how generalizable the results are (the paradox of experimental study); to generalize, it must be likely that all relevant variables were controlled.

2) Pick the dependent variable. Operationalize - how will you measure the effect you suspect the treatment to cause? Reliability - do the results repeat themselves over a number of trials? Validity - are you measuring what you proposed to measure?

 CAUSALITY DESIGN - to inform, the research must establish that the cause precedes the effect and that there is co-variation between cause and effect. Ex: randomization and random assignment - take multi-causality into account; rule out or minimize the effect of other extraneous or confounding variables. A confounding variable threatens internal validity. Internal validity - make sure the change in the dependent variable can be attributed to the independent variable alone. Confounding variables: 1) selection (randomization decreases confounding); 2) mortality - people who drop out during the study, which affects the reliability of the population; 3) history - events external to the experiment that affect people
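Random assignment, mentioned above as a control for confounding, can be sketched as follows. The subject pool and the "prior" confounder are invented for illustration: shuffling before splitting tends to balance the confounder across treatment and control.

```python
import random

def random_assignment(subjects, seed=0):
    """Shuffle the subject pool and split it into treatment and control,
    so potential confounders are balanced across groups on average."""
    rng = random.Random(seed)
    pool = list(subjects)
    rng.shuffle(pool)
    half = len(pool) // 2
    return pool[:half], pool[half:]

# 100 hypothetical subjects; half carry a confounding attribute (prior = 1)
subjects = [{"id": i, "prior": i % 2} for i in range(100)]
treatment, control = random_assignment(subjects)

# Proportion of the confounder in the treatment group; should sit near 0.5
balance = sum(s["prior"] for s in treatment) / len(treatment)
```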

 WHAT QUESTIONS SHOULD YOUR CODING SCHEME HELP ANSWER

What is your model of social behavior - what is important in determining behavior?

What is your level of analysis? individual, social or cultural?

What are your units of analysis? Descriptive or general - time constraints?

What is your theory of inference? - inference and judgements of non-observation - motives?

What kind of scale will you use: categorical; rates; proportions?

QUANTIFICATION

8 ways of thinking about quantitative data

1) Concept of "real rates" is a misnomer.

Rates and measures do not appear naturally in the world.

Rates and measures represent a point of view of the researcher and that varies with the individual

Example: rates of acts of violence in a school depend on who compiles the figures at a given time/place and how they define the phenomenon

No true measure
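The "no real rates" point can be demonstrated in a few lines: the same incident log yields different "violence rates" depending on how the compiler defines violence. The incident list and the two definitions are invented for illustration.

```python
# One term's incident log at a hypothetical school
incidents = ["shove", "verbal threat", "fight", "shove", "fight"]

# Two compilers, two definitions of "violence"
narrow = {"fight"}
broad = {"fight", "shove", "verbal threat"}

# Each definition produces a different "rate" from the same raw events
rate_narrow = sum(i in narrow for i in incidents) / len(incidents)
rate_broad = sum(i in broad for i in incidents) / len(incidents)
```

Same log, but the narrow definition gives 0.4 and the broad one gives 1.0: the figure reflects the compiler's point of view, not a "true measure".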

 

2) Singling out people objects and events to quantify changes their meaning

for example: requirements to keep statistics on race/ethnicity, may increase the attention people pay to children's race - positive or negative

 

3) Quantifying has a temporal dimension - numbers do not stand alone but are related to the social and historical contexts that generate them

 

4) Quantification involves many different participants and can only be understood as a multi-level phenomenon - changes over time/place and positionality

 

 5) Both the person counting and her motivation for counting affect the meaning, the process and the figures generated

 

6) Counting releases social processes within the setting where the counting takes place in addition to and beyond the activities directly tied to counting

Counting can shape what people consider important and meaningful and designate particular activities as expedient. For example, standardized tests may change the content of a course, and generating success rates can become the major activity of educational agencies

 

7) People who produce data in educational settings are subject to social processes and structural forces similar to those that affect other work groups. Fudge factors, the numbers game and massaging of the data all exist

 

8) Enumeration and its products have strong affective and ritualistic meaning in the US educational system

 

  


DRS METHOD - DEVELOPMENTAL RESEARCH SEQUENCE Developed by James Spradley (1979).

Some tasks are best accomplished before other tasks: locate a social situation before observation; conduct some interview questions/observations before others. Analysis is also composed of a series of cyclic sequences

 Data are gathered simultaneously through the generation of questions, followed by analysis of the responses, which leads to new questions. This process aids in the identification, description and ultimate understanding of the organization of the cultural patterns and cultural knowledge of the subject under study

 METHODS OF LEARNING AND DOING

SINGLE-TECHNIQUE PRINCIPLE - Select one technique (PO, Interview, Recording, etc.) - master one at a time

 TASK IDENTIFICATION PRINCIPLE - Identify basic task and specific objectives required by a particular field technique

 DEVELOPMENTAL SEQUENCE PRINCIPLE - a) enables person to improve basic research skills in systematic manner;

b) allows one to study a cultural scene in a way that is efficient and workable

 1. Conceptual problems: lack of understanding of fundamental concepts related to doing ethnography

2. Analysis problems - not knowing what to do with raw information gathered

 ORIGINAL RESEARCH PRINCIPLE - End product is goal

 PROBLEM-SOLVING PRINCIPLE - a) define problem, b) identify possible causes, c) consider possible solutions, d) select best solution, e) carry out plan and f) evaluate results

 DRS METHOD

 1) DOMAIN ANALYSIS

Summary of cultural scene.

 Cultural Domain - broad category of cultural meaning (cultural knowledge) that includes other smaller categories. All members share at least one common feature of meaning. Interpretations vary with the individual. Search for cultural symbols which are included in larger categories (domains) by virtue of some similarity

1) Informant's interpretation (folk perspective); 2) Researcher's interpretation (mixed perspective); 3) Analytic interpretation (scientific perspective)

 Preliminary overview of cultural scene on basis of domains you have identified in your preliminary search. Write in broad terms to describe the total scene or what you know about it. Categorization of symbolic meaning - how symbols relate to one another; search for relationships among symbols (folk, mixed and scientific perspectives). Explores how different perspectives are organized

 What makes up categories of cultural knowledge - the process

1) Cover terms (General term). 2) Included Terms (Other terms that apply, correlate, define). 3) Semantic Relationship

X is a part of Y. X is a way to Y. X is a kind of Y. Instead of reading the MEANING - focus on CONTENT

4) Domain Analysis Worksheet - SEE p. 94; 5) Formulate structural questions for each domain - SEE pp. 97-98; 6) Make a list of all hypothesized domains
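The cover term / included term / semantic relationship structure above can be recorded as simple triples. A minimal sketch; the domain and terms are invented, not drawn from Spradley:

```python
# Each entry: (included term, semantic relationship, cover term)
domain = [
    ("final exam", "is a kind of", "assessment"),
    ("portfolio", "is a kind of", "assessment"),
    ("pop quiz", "is a kind of", "assessment"),
]

# The cover term names the domain; the included terms fill it in
cover_terms = {cover for _term, _rel, cover in domain}
included_terms = [term for term, rel, _cover in domain
                  if rel == "is a kind of"]
```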

 (2) DESCRIPTIVE AND STRUCTURAL FOCUS ANALYSIS

Selected observation - dialogue on cultural domain. Select a domain you analyzed and create a meaningful dialogue between two people who know the culture. Describe the situation and get reactions. Focused - add important domains, revise style into more of an overview of the cultural scene. Locate, verify and elicit local terminology in a number of different domains. Repetition allows for formulation of structural questions which helps identify list of domains that ultimately provides overview of cultural scene. Review notes to search for cultural symbols and relationships among those symbols. Don't ask for MEANING, ask for USE

 Simultaneous use of Descriptive and Structural Questions

a) Explanation principle; b) repetition principle; c) context principle; d) cultural framework principle

 Kinds of Structural Questions

a) Verification Questions; b) Cover Term Questions; c) Included Term Questions; d) Substitution Frame Questions; e) Card Sorting Structural Questions (write terms on cards and have informants define them)

 The result is a long list of domains that gives an overview of the cultural scene and some idea of how the surface structure of that scene is organized

 (3) TAXONOMIC ANALYSIS

Describe cultural domain - Select a set of terms that make up one domain or are part of a larger domain - describe it. Search for ways cultural domains are organized; End-result: identifying contrast sets

 Shifting focus from surface to in-depth analysis - limiting the scope of analysis

Micro analysis - cultural meaning is complex - if only skim surface, will never know how informants understand things

Macro analysis - holistic terms define cultural meanings - need to see cultural scene in its entirety

Based on existing knowledge, contrast questions are applied in order to focus on similarities and differences of the domains

 Making a Taxonomic Analysis

a) begin with domain for which you have the most information

b) identify semantic relationship - substitute framework (based on primary relationships within this domain) - substitution frame

"is a kind of" - example: A drunk (is a kind of) inmate. Other underlying semantic relationships: X is a kind of Y; X is a way to Y; X is a part of Y; X is a reason for Y; X is a stage in Y; X is used for Y

c) search for possible subsets among included terms

d) search for larger, more inclusive domains that might include as a subset the one you are analyzing

e) construct a tentative taxonomy (box diagram, set of lines and nodes, outline)

f) formulate structural questions to verify taxonomic relationships and elicit new terms

g) conduct additional structural interviews to verify and embellish taxonomy

h) construct a completed taxonomy
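Step (e)'s tentative taxonomy - box diagram, lines and nodes, or outline - maps naturally onto a nested dictionary. The domain and terms here are invented for illustration:

```python
# Nested dict: each key is a term, each value its subset of included terms
taxonomy = {
    "kinds of students": {
        "full-time students": {"honors students": {}, "regular students": {}},
        "part-time students": {"evening students": {}, "working students": {}},
    },
}

def print_outline(node, depth=0):
    """Render the taxonomy in outline form, one level of indent per subset."""
    for term, subset in node.items():
        print("  " * depth + term)
        print_outline(subset, depth + 1)

print_outline(taxonomy)
```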

(4) ASK CONTRAST QUESTIONS/MAKE SELECTED OBSERVATIONS

Discover principles in the study of meaning. Use Principle - The meaning of a symbol can be discovered by asking how it is used rather than asking what it means. Similarity Principle - Meaning of symbol can be discovered by finding out how it is similar to other symbols. Contrast Principle - meaning of symbol can be discovered by finding out how it is different from other symbols. Structural questions Principles (concurrent, explanation, repetition, context oriented, culturally oriented)

 Shift from similarities to differences. Using discovery principle of contrast - discover numerous contrasts for a number of contrasts sets

 Kinds of Contrast Questions

1) contrast verification: types of decisions; 2) directed contrast: in-depth; 3) dyadic contrast: asks question without having any differences to suggest to the informant; 4) triadic: gives choice (multiple choice); 5) contrast set sorting; 6) 20 Questions game; 7) Rating questions

(5) COMPONENTIAL ANALYSIS

Systematic identification of components of meaning (attributes associated with cultural symbols). Search for attributes that signal differences among symbols in a domain. Search for attributes in terms of each domain. Organize information systematically. Develop paradigm: schematic representation of attributes that distinguish contrast set members. Whereas a taxonomy shows only a SINGLE relationship among a set of terms, paradigms show MULTIPLE semantic relationships

 Process

1) select a contrast set for analysis; 2) describe (inventory) all contrasts previously discovered via each cultural domain; 3) select different cultural domains; 4) prepare a paradigm worksheet - SEE pp. 132, 134, 136, 138; 5) identify dimensions of contrast which have BINARY values; 6) combine closely related dimensions of contrast into ones that have multiple values; 7) prepare contrast questions to elicit missing attributes and new dimensions of contrast; 8) conduct interviews/observations to elicit needed data; 9) prepare a completed paradigm; 10) write a formal description of those domains, making clear the meaning of terms and their relationships. Give specific examples to show some of the attributes that reveal contrast among the terms
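A paradigm worksheet is essentially a table of contrast-set members against dimensions of contrast. A minimal sketch with invented terms and binary attributes (none of them from an actual study):

```python
# Rows: members of a contrast set; columns: binary dimensions of contrast
paradigm = {
    "principal":         {"supervises": True,  "teaches": False, "tenured": True},
    "classroom teacher": {"supervises": False, "teaches": True,  "tenured": True},
    "student teacher":   {"supervises": False, "teaches": True,  "tenured": False},
}

def contrasts(term_a, term_b, table):
    """Dimensions on which two contrast-set members take different values."""
    return [dim for dim in table[term_a]
            if table[term_a][dim] != table[term_b][dim]]
```

Unlike a taxonomy's single semantic relationship, the table carries several dimensions at once: comparing "classroom teacher" with "student teacher" isolates "tenured" as the attribute that signals their difference.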

 (6) DISCOVERING CULTURAL THEMES (THEME ANALYSIS)

Search for relationships among domains and an understanding of how they are linked to the cultural scene as a whole. List of things people know, relationships, etc. Process:

a) Describe a cultural theme or make a list of cultural domains; b) make a list of possible unidentified domains; c) collect sketch maps; d) make list of examples; e) inventory miscellaneous data

Search for Universal Themes

Select one or more cultural themes and show how the theme connects several domains

a) social conflict; b) cultural contradictions; c) informal techniques of social control; d) managing impersonal social relationships; e) acquiring and maintaining status; f) solving problems

  


EVALUATION AND ACTION RESEARCH

DATA ANALYSIS

1) TIDYING UP - putting notes and materials in order - preparing for analysis (could take 1 month)

2) SCANNING - rereading the data - mapping stages, special things, noting events

ask questions of the data

3) SUMMARY of data

4) ESTABLISH CATEGORIES within which the data is organized

5) EMERGING PATTERNS within the categories

6) INTERPRETATION of data

7) INTEGRATION OF FINDINGS WITHIN BROADER AREAS

8) THEORIZING

Formalized and structured method for playing with ideas and data.

Cognitive process of discovering or manipulating abstract categories and the relationships among those categories

a) Comparing; b) Contrasting; c) Aggregating; d) Ordering; e) Establishing linkages and relationships; f) Speculating

 COMPUTERS - ideal for data collection and sorting

Numerical data can be processed by statistical programs; Narrative data can be entered into word processors or database managers; Data Storage; Research Memos; Segmenting - dividing data into analysis units; Coding varies depending on whether or not a classificatory system exists when the analysis begins; Collating Coded Text Segments; Sorting, Find/Replace
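Segmenting, coding and collating, as listed above, reduce to grouping text segments by their assigned codes. A minimal sketch; the codes and segments are invented:

```python
from collections import defaultdict

# Hand-coded field-note segments: (code, text) pairs
segments = [
    ("discipline", "Teacher pauses until the room is quiet."),
    ("peer_help",  "Two students compare answers before raising hands."),
    ("discipline", "A student is moved to the front row."),
]

def collate(coded_segments):
    """Collate coded text segments: gather every segment filed under each code."""
    by_code = defaultdict(list)
    for code, text in coded_segments:
        by_code[code].append(text)
    return dict(by_code)

collated = collate(segments)
```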

 NARRATIVE ANALYSIS (BASED ON LINGUISTICS)

ABSTRACT STATEMENTS Summaries

ORIENTATION STATEMENTS Time, place, Persons identified

COMPLICATING ACTIONS Describe what actually happened

RESULTS OR RESOLUTIONS Describe result of action

CODA Returns the speaker to the present situation

EVALUATION STATEMENTS Tell how narrator felt about what was happening, dramatize the action or compare the actual event judgementally with others which did or did not occur.

Good Source - David Madsen, Successful Dissertations and Theses. San Francisco: Jossey-Bass Publishers, 1983

 EVALUATIONS

 Various approaches to evaluation - different kinds of truth and different degrees of truth within and across kinds of information

 GENERAL QUESTIONS:

1) For whom is a study being evaluated? Ourselves - emphasis on completing task and doing it well. Coping with the impossibility of completeness by making tradeoffs. Note limitations up front. Scholarly Traditions. Community of Scholars. Participants. Various Publics. Funding Sources

 2) Who are the people who care about the legitimacy and worth of qualitative research?

 3) What do they want to know?

 WRITING GRANT PROPOSALS - GUIDELINES

1) Cover Page - Title; Responsible Party; Dates; Abstract

2) Narrative (3 - X pages) - Introduction

Background - Specify scholarly context (literature review)

3) Definition of group/organization applying for proposal

4) Project Description

Project Goals and Objectives

Project Audience

Significance

5) Program Plan (Scope of Program) - Who is doing what, where, and why (brief on the why)

6) Commitment - Level of commitment by participating group/organization

7) Program Management - Staff Titles and Duties

8) Program Implementation - Calendar

9) Activities Associated with Program

10) Possible Outcomes

11) Budget Page

12) Recommendations - Letters of Support from All groups of people associated with the project

13) Plans for Evaluation

 CRITERIA FOR THE EVALUATION OF EDUCATIONAL RESEARCH

 TITLE

1. Title is well related to content of article

 PROBLEM

2. Problem is clearly stated

3. Hypotheses are clearly stated

4. Problem is significant

5. Assumptions are clearly stated

6. Limitations of the study are stated

7. Important terms are defined

 REVIEW OF LITERATURE

8. Coverage of the literature is adequate

9. Review of literature is well organized

10. Studies are examined critically

11. Source of important findings is noted

12. Relationship of the problem to previous research is made clear

 RESEARCH PROCEDURES

13. Research design is described fully

14. Research design is appropriate to solution of the problem

15. Research design is free of specific weakness

16. Population and sample are described

17. Method of sampling is appropriate

18. Data gathering methods or procedures are described

19. Data gathering methods or procedures are appropriate to the solution of the problem

20. Data gathering methods or procedures are used correctly

21. Validity and reliability of data gathering procedures are established

 DATA ANALYSIS

22. Appropriate methods are selected to analyze data

23. Methods used in analyzing the data are applied correctly

24. Results of the analysis are presented clearly

25. Tables and figures are effectively used

 SUMMARY AND CONCLUSIONS

26. Conclusions are clearly stated

27. Conclusions are substantiated by the evidence presented

28. Conclusions are relevant to the problem

29. Conclusions are significant

30. Generalizations are confined to the population from which the sample was drawn

FORM AND STYLE

31. Report is clearly written

32. Report is logically organized

33. Tone of the report displays an unbiased, impartial, scientific attitude

CRITERIA FOR JUDGING RESEARCH REPORTS AND PROPOSALS

 GENERAL CRITERIA

A. THE PROBLEM

1. The problem is clearly stated and the rationale is logical

a) purpose is concisely stated

b) objectives are specified

c) procedures are specified

d) variables are identified and their relationship to theory or observation is explained (if variables are new, then evidence from a pilot study is presented)

e) Research hypotheses are concise

f) Research hypotheses are logically developed from some theory or related problem, and are clearly plausible

 2. The problem is significant

a) its relationship to previous research had been well established

b) the hypothesized research findings should be generalizable beyond the sample

c) the study will make a contribution to the advancement of knowledge

d) the results will contribute to the solution of some practical or theoretical problem

 B. Design and Procedures

1. The design of the study is appropriate to the solution of the problem

a) The research design is fully described

b) Assumptions are clearly stated

c) Delimitations are noted

d) The population and sample are described (geographical limits, time period covered, sociological description, sampling units)

e) The sampling method is appropriate and practical

f) Controls for sources of error are described and are appropriate (e.g. sampling error, nonresponse, interviewer bias, response error, response set, experimenter bias, teacher effect, control of variables, extraneous factors)

g) there is adequate provision for protection of human subjects

 2.The relationship of the procedures to the implementation of the design is appropriate

a) the data-gathering methods are clearly described and meet the requirements of the problem

b) the obtained sample is of sufficient size and is representative of the defined population

c) the measuring instruments are appropriate

d) the validity and reliability of the evidence are established, or a procedure for establishing the validity and reliability of the evidence is described

 C. Analysis and conclusions (for research reports)

1. The analysis of the data is appropriate

a) the results of the analysis are clearly presented

b) the analysis methods are valid, appropriate and properly applied

c) the assumptions behind the statistical tests are stated and the relationship of the test to the design is appropriate

 2. The conclusions are reasonable

a) the conclusions are clearly stated

b) the conclusions are substantiated by the evidence presented

c) interpretations and implications are impartial and scientific

d) a comprehensive discussion of the qualifications is given (methodological problems and errors, alternative explanations, other limitations)

 3. The research is adequately reported

a) the report is logically organized and clearly written

b) grammar and mechanics are adequate

 D. Personnel and facilities (for funding research proposals)

1. The qualifications of the investigator to conduct this study are adequate

a) competence in the techniques involved is demonstrated

b) the investigator has adequate experience and training for this research

c) the investigator is familiar with the pertinent literature

d) adequate time commitments are indicated

 2. The facilities for this study are adequate

a) requirements for equipment or personnel are realistic

b) the institutional setting is favorable (if applicable)

 3. The relationship between the cost of the study and the proposed activities is appropriate

a) estimates of anticipated costs are reasonable

b) number of personnel assigned to the project is reasonable

c) the relationship between the probable outcome in terms of its impact and the investment required is favorable

 RATING CRITERIA FOR EVALUATING RESEARCH ARTICLES

1. Significant contribution to knowledge about exercise science and/or human movement

2. Research plan developed within a reasonable theoretical framework

3. Citation of current and relevant research

4. Clear, concise, testable statement of the problem

5. Proper interpretation of current and relevant research

6. Problem derived from theory and research reviewed

7. Description of relevant subject characteristics

8. Use of appropriate subjects

9. Use of appropriate instrumentation

10. Sufficient description of testing/treatment procedures

11. Validity and reliability of dependent variable

12. Appropriateness of statistical analyses and research design

13. Release of results in light of problems

14. Completeness of presentation of results

15. Appropriateness of tables and figures

16. Discussion of results in light of problem, theory and previous findings

17. Minimal degree of speculation

18. Use of APA style

19. Citation of references in text

20. Accuracy of dates of references

21. Match between references and citations

22. Completeness of Abstract

23. Proper length of Abstract

24. Use of nonsexist language

25. Protection of human subjects

26. Appropriate labeling of human subjects

27. General ethics

  


 

ACTION RESEARCH

Systematic inquiry conducted by school personnel to gather information about the ways their particular school operates, how they teach and how well their students learn. Done by teachers, for themselves.

Combination of identification, collection of data, analyzing and interpreting data, and developing an action plan.

PRAXIS: liberate through knowledge gained; a process that concentrates on socially responsive forms of education and examines the everyday, taken-for-granted ways in which our profession is carried out.

 

 


FIELD NOTES

FIELDWORK - NOTETAKING

There is a tradition in cultural anthropology that one cannot be told how to do fieldwork

(on the lack of fieldwork training see Berreman (1962); Wax (1971); Adams and Preiss (1960))

Classic methods: Boas (1920); Radin (1933); Mead (1933); Malinowski (1961)

Histories of fieldwork - Wax (1971); Pelto and Pelto (1973)

Bibliography on field methods - Gutkind and Sankoff (1967) - abundance of fieldwork literature and lack of attempts to systematically discuss it

1) Time Table - no generalizations or observations, just depictions: a step-by-step account of what WAS DONE across the entire scene; an accounting of beginnings or key time periods at interacting moments. Time generated: a) longitudinal; b) shadow - moment-to-moment descriptions

 2) Forms of Note-Taking: Radical empiricist - stick just to the facts; Holistic ethnographic - sub-narrative - infer feelings; Theorist - theoretical statements. LABEL WORK CAREFULLY and MAKE EXTENSIVE, DETAILED NOTES

 3) Types of Note-Taking - Rough note - in field. Final notes - at home - detailed reflection

Notational strategies -

* labeling actors and setting with #s or Letters

* put information you forgot in the margins

* write sequentially - in order

* pause that refreshes - use lulls in the action to go back and fill in spaces

* speculate - to identify further questions and areas to focus on

* use items that will jog memory - to elaborate on when writing up field notes

Observational strategies

* build up the science as you go along

* ask questions

* speculate

* time sense

* stay with an area to get a hold on it - wait to sort and stay with the obvious

Develop format for notes

* way to identify each page to keep in order

* indexing of notes

* date and time on each note - when observed and when written up

* what is distinctive about this group
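The format checklist above can be sketched as a simple record structure. This is only an illustration; the field names and sample values below are hypothetical, chosen to mirror the checklist items.

```python
from dataclasses import dataclass, field

# Hypothetical field-note record mirroring the checklist above:
# a page identifier to keep notes in order, both dates, and index terms.
@dataclass
class FieldNote:
    page_id: str            # way to identify each page and keep notes in order
    date_observed: str      # when the behavior was observed
    date_written_up: str    # when the note was written up (both dates essential)
    index_terms: list = field(default_factory=list)  # indexing of notes
    distinctive: str = ""   # what is distinctive about this group
    text: str = ""          # the note itself

# Invented example entry.
note = FieldNote("2024-03-05-p1", "2024-03-05", "2024-03-05",
                 ["recess", "peer groups"], "mixed-age play",
                 "Rough notes from the playground observation ...")
print(note.page_id, note.index_terms)
```

Keeping the observation date and write-up date as separate fields enforces the "BOTH ESSENTIAL" rule above.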

4) Note-Taking Concentrations (look for the following)

* different perspectives (bias of observer)

* units of observation - focus : physical vs. interactive

* selective inattention

* novelty factor

* what the observer sees/thinks is noteworthy - but it may not be what the informant stresses

* question-asking

* role of observation affects focus of observation - participant vs. non-participant

* culture bias of observer

* type of event may be culturally more salient to observer - issue of salience

* importance of accounts

CHECK UP ON YOURSELF

 TRADITIONAL FIELD NOTES - 1) Plan; 2) Format; 3) Notational Strategy; 4) Observe; 5) Label; 6) Questions

 TRADITIONAL FIELD NOTES - PROBLEMS: 1) since you do not yet know what is significant, you don't know what to record; 2) problem with memory - best to write things down as quickly as possible - but while you are recording, other things are happening that represent the continuity between when you left and when you returned; 3) write things down at the end of the day - but contend with long-term memory problems; long-term memory often produces distorted results

 FIELD NOTES THEN BECOME WORKING NOTES. Begin with short periods of observation and recording - no more than an hour's interaction. Boredom/exhaustion - natural responses to prolonged engagement - produce thin records

 

NON-TRADITIONAL METHOD

1) Pre-note taking: pre-selected (and pre-examined) topics

2) Focused informal interviews, conversations and observations. When you are not learning anything new, you are ready to move on

3) Revised field notes consist of a re-evaluation of what you think you learned: A) some ideas from observation to follow up with interviews; B) some observations/questions to follow up that came from interviews; C) some things you've noticed that you want to be sure to get to eventually

1) Begin with short periods of observation and recording - no more than an hour's interaction. Boredom, exhaustion and other natural responses to prolonged engagement produce thin records

2) Include when, where and under what circumstances notes were obtained: a) use only one side of a piece of paper; b) use wide margins; c) create a personal shorthand system; d) write full accounts of observations before relating them to anyone; e) read notes regularly; f) include even casual encounters in notes; g) mark participants' quoted material differently from what is paraphrased of their words

3) Keep a separate record of alternatives for decisions and rationales for choices: how the research was designed and conducted

4) Verbatim principle - sometimes write verbatim; it changes the meaning if you don't. Both native and observer terms are part of field notes - it is important to carefully distinguish them

5) Condensed account - include phrases, single words, unconnected sentences in every interview

6) Expanded account - fill in details and recall things not recorded on the spot (as soon after the interview as possible)

7) Field work journal - contains a record of experiences, ideas, fears, mistakes, confusions, breakthroughs etc. Each journal entry should be dated. Provides an introspective record of the field work. Can be an index or chronology to the field notes proper

8) Diary - a personal diary is a different kind of record. These focus on the reactions of the ethnographer to the field setting and information, how the research is going etc. A running record of inferences, hunches and ideas as well. Also: letters, reports, and papers from the field

9) Analysis and interpretation - leave room for interpretation and insights - a place to "think on paper"

 FIELD NOTES - data collected by observers and some interviews: written accounts

Inscription - made in the midst of interaction

Transcription - writing something down as it occurs, copying as exactly as possible. Full observation and recording; little participation

Description - out of the flow of activity, a comprehensible account of what was observed

Scratch notes - rough inscription - serve to jog memory. Fragmented phrases, words and other symbols to recall what has happened

Field notes - summarized and polished - used in analysis. Records of collections; documents; questionnaires; public and private documents; censuses; maps etc.

Need to distinguish between inferential and noninferential judgements

 OBSERVATIONAL TECHNIQUES & SKILLS

1) Observe with all one's senses

2) Know what to look for

3) Seeing where to see - developmental skills: "What's going on here?" is never a straightforward set of "facts"; "What roles are being played out?"; "What verbal and non-verbal communication is occurring?"; "What rituals, ceremonies etc. are guiding actions?"

4) Accept responsibility as a member of a group (if doing participant observation)

5) Plan on strategy - how to record field notes

6) Use a balance between emic/etic

7) Field notes are NOT raw notes - they are a thorough, comprehensive record. Date of observation and date of write-up - BOTH ESSENTIAL

8) Drawings and maps are great

9) Use a code system when describing things in your notes

10) Make as explicit as possible the frames of reference that may affect his or her perception

11) Attempt to control possible sources of bias and distortion in recording observations

12) Secure data sufficiently rich and varied that it becomes possible to falsify hypotheses or confirm them by triangulation (see week 2 definitions)

13) Researcher's goal - sort out the varied views, discern the vantage points from which they emerge, the factors that affect them and their effect on the actions of those holding the views, and then use them to construct a more complete picture of the event in question

14) Objectivity lies in the special training and set of relevances of the researcher

15) Researcher is detached from immediate interest or stake in the outcome of events except as data

16) Supplementing participant observation with interviews, study of documents, and non-participant observation lends considerable credibility to findings

17) Be aware of bias of observer and bias of purpose

 

RECORDING OBSERVATIONS

MANUAL

Leave room on the paper for your responses (which you write after the interview is over). Many subjects feel threatened if they see the interviewer writing a lot of information! Yet if responses are recorded after the fact, you are not likely to recall every response.

Standardized observation forms add validity and reliability and save time, but do not capture unexpected details.

Pre-thought code - a set of symbols serving as a recording instrument. Use letters, symbols etc. for categorization. Make a grid for the weeks, leaving spaces for placing codes (an easy checklist).

Checklist - the observer can simply check each behavior as it occurs (good for quantification). Rating scales give similar standardized observations.

Flanders system (used for classroom observation) - classifies all teacher and student behavior into one of 10 categories, each of which is represented by a number. The researcher needs to recognize the behavior and write the number.
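As a sketch of how a pre-thought code system like the Flanders categories can be quantified afterward: the category numbers below stand in for codes recorded at fixed intervals during one observation. The specific sequence is invented for illustration, not taken from real classroom data.

```python
from collections import Counter

# Codes recorded at regular intervals during one classroom observation.
# Following the Flanders idea, each behavior category is represented by
# a number; this particular sequence is made up.
observed_codes = [5, 5, 4, 8, 5, 4, 8, 8, 5, 5, 4, 8]

tally = Counter(observed_codes)
total = len(observed_codes)

# Proportion of intervals spent in each coded category.
proportions = {code: count / total for code, count in tally.items()}
print(proportions)
```

This turns the raw checklist marks into the kind of standardized, quantifiable summary the notes above describe.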

GENERAL TIPS FOR BOTH MANUAL AND TECHNICAL

1) Observe only one behavior at a time - When observing more than one - alternate, i.e.: teacher (1,3,5,7 periods) and student (2,4,6,8) periods; 2) Record while behavior occurs; 3) Recording should be simplified as possible

 TECHNICAL

Mechanical recording helps - but may make subjects uneasy. Hand counters, timers and beepers. Paper clip and gum wrapper method - use as a counting device without others knowing. Tape-recorded reminders. Direct-to-computer processors. Still photos, Polaroids, film and videotape. Misuse and overuse are a greater danger than underuse. How much of a tape is transcribed is another issue - as is who does the transcribing

 

 


ACTION RESEARCH

Action Research Notes

 

Useful approach to solving practical everyday problems by applying group dynamics to facilitate change.

Local problem in local setting - solution of given problem

Planned, systematic research conducted for the purpose of applying or testing - and mostly for understanding the learning process

Participatory Research

Educational practice is the starting point to build theory

Madeline Hunter (director of the UCLA laboratory school) - did Action research on ways to improve classroom instruction. This led to the development of ITIP - Instructional Theory into Practice (Hunter 1994)

quality depends on how well the project serves local needs

study own students, own instructional methods, own assessments - to better understand them and improve their quality or effectiveness

 

Action research IS

a process for incorporating change that is practical and relevant

collaborative process involving educators working together to improve their own practices and to empower relationships

persuasive and authoritative, since it is done by and for teachers

participative - educators are not disinterested outsiders

a process that requires us to "test" our ideas about education and yet provides critical analysis of the educational work environment

cyclical process of planning, acting, developing and reflecting

involves teacher empowerment - when teachers collect their own data to make decisions - they become empowered

justification of one's teaching practices

 

Action Research is NOT

usual thing that teachers do - it is systematic and more collaborative

simply problem solving - involves specification of problem, development of something new and critical reflection

simple implementation of predetermined answers to educational questions - it explores, discovers and works to find creative solutions

conclusive - not right/wrong - but ongoing answers

8-steps - In Cyclic Pattern

1) Define the problem (Identify Themes)

2) select design (allow for frequent changes) - Role of Ethical Biases

3) select research participants

4) collect data

5) analyze data

a) Code - Process of finding patterns & meanings through data collection

b) What are the key questions? Have they changed?

c) Contextualize findings in literature (literature review)

6) interpret and apply findings

a) Organizational Review - focus on features of organization, vision administration, goals and objectives, operations, issues and concerns

b) concept mapping - visualize major influences and their consistencies and inconsistencies

c) by analyzing before, during and after - you avoid mistakes made by premature data interpretation

For steps 4, 5 and 6 - move quickly, in cycles (at least 3 cycles is ideal)

7) report findings - share with colleagues: connect personal experience

8) State What's Missing - new questions - revised questions
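Step 5a above (coding) can be sketched as grouping data excerpts under the codes assigned to them, so recurring patterns become visible. The codes and excerpts below are invented purely for illustration.

```python
from collections import defaultdict

# Invented example: (code, excerpt) pairs produced while reading field notes.
coded_excerpts = [
    ("engagement", "Students asked follow-up questions during the demo."),
    ("time_on_task", "Half the group finished the worksheet early."),
    ("engagement", "Two students volunteered to present their answers."),
]

# Group excerpts by code to see which patterns recur across the data.
by_code = defaultdict(list)
for code, excerpt in coded_excerpts:
    by_code[code].append(excerpt)

for code, excerpts in sorted(by_code.items()):
    print(f"{code}: {len(excerpts)} excerpt(s)")
```

Codes that accumulate many excerpts point to the key questions worth revisiting in step 5b and contextualizing in the literature in step 5c.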

 

Facilitating conditions for Action Research

1) Time - action research is time-consuming

2) Collaboration - work with mentors and colleagues and share results

Communicate results - locally and professionally - throughout the cycle

3) Openness - be open to modifying daily routines. Administration, staff and clients should feel sufficient trust and freedom to acknowledge and confront workplace practices that need improvement. Value data as a call for action

4) Qualitative methodologies - an analytical and interpretive lens to move beyond description of data and make sense of what you have learned

5) Quantitative methodologies - used to provide a picture of what is happening in a particular situation

Pre-Post Test Design (pre-test; treatment; post-test & compare scores)

Time-Series Design (examines groups over time - before and after treatment)
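A minimal sketch of the pre-post test design described above: paired scores for each student, with the per-student gains and their mean as the quantities of interest. The scores are invented for illustration.

```python
from statistics import mean

# Invented pre- and post-test scores for the same five students.
pre  = [62, 70, 55, 80, 68]
post = [71, 74, 60, 82, 75]

# Paired design: compare each student to him- or herself,
# so the gain isolates change between pre-test and post-test.
gains = [after - before for before, after in zip(pre, post)]
print("per-student gains:", gains)
print("mean gain:", mean(gains))
```

A time-series design would extend this by recording several such measurements before and after the treatment rather than just one of each.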

 

Action Research

people in the real-world can also conduct research

research that is practical, directed at their own concerns and, for those who wish it, a tool to bring about social change

Whyte (1991) - participatory action research

action becomes political action where citizens do research to work for social change with regard to issues of power

 

research for the purpose of advancing a social cause, to change existing practices of inequality, discrimination, or environmental endangerment. Also used in more general way to refer to research that leads to any kind of immediate change

 

action research IS objective, since objectivity rests on the researcher's integrity and the honesty with which you report what you find

Taylor (1980)

action research . . . "is not intended to yield an 'objective' view of a facility, if 'objective' means devoting equal attention to the positive and negative aspects of a facility" . . . need to report observations as honestly, completely and objectively as possible

Action - always includes reflection

 

Action - is cyclical: systematically collect information, then modify behavior/practice based on findings and reflections

 

personal purposes

develop a greater understanding; personal examination; self-awareness

professional purposes

staff development; legitimize role as producers of knowledge; develop network with other practitioners

Political Purposes

seek to make own teaching practices more human and just

provide full participation in research process of all those who are affected by it

embrace an overt agenda of social change with a commitment to promote economic and social justice through collaborative efforts to increase educational opportunities and outcomes - - emphasis on gender, class, cultural equity and voice in education

 

Jeffrey Glanz

1) selecting focus

2) collecting data

3) analyzing and interpreting data

4) taking action

5) reflection

6) continuing or modifying one's action, which in turn leads to a new focus for another round of action research

APPLIED

Evaluation & Policy Analysis

describe and assess a particular program of change in order to improve or eliminate it. Creates a written report.

 

Political Action Research

conducting research while acting as citizens attempting to influence the political process through collecting information - promotes social change and is consistent with the advocate's beliefs

For example Geraldo Rivera exposed conditions at a school to show how students with disabilities were treated . . . .

Practitioner Research

Teacher uses research to do what she does better - be more effective

 

TYPE

Evaluation and Policy Research

Political Action Research

Practitioner Research

WHO the researcher serves

contractor

social cause

learner or program

PURPOSE

describe, document, and assess a planned change. Provide information to decision makers

Promote social change

Promote individual or group change through education

FORM Of data presentation

Written reports / oral presentation

Pamphlet, press conference, congressional testimony, TV show, sociodrama, expose, report

Training programs, workshop, curriculum and plans for individual change

 

 

ACTION

1) Select Focus

2) Take Action

3) Collect Data

4) Analyze and Interpret data

5) Continue or Modify Action

6) Reflection

7) Reporting Results

 

Outcome Validity - extent to which actions occur that lead to a resolution of the problem under study or to the completion of a research cycle that results in actions

Process Validity - adequacy of the processes used in different phases of research, such as data collection, analysis and interpretation, and whether triangulation of data sources and methods is used to guard against bias

Democratic Validity - extent to which the research is done in collaboration with all parties who have a stake in the problem under investigation and in which multiple perspectives and interests are taken into account

Catalytic Validity - degree to which the action research energizes the participants so that they are open to transforming their views of reality in relation to their practice; highlights the emancipatory potential of practitioner research

Dialogic Validity - assessment of the degree to which the research promotes a reflective dialogue among the participants in the research, to generate and review the action research findings and interpretations

 

Guidelines used for conventional experimental research are not always appropriate for action research.

Action research tends to be...

 

1) No need to develop a metric that may be abandoned later if it doesn't fit the emerging situation

2) Must work in natural language, which is easier for informants.

3) Must have agreement and commitment of those being affected by research - get others involved in the research process as equal partners.

 

Use multiple cycles, with planning before action and critical analysis after it.

Within each cycle --

use multiple data sources;

and try to disprove the interpretations arising from earlier cycles.