Quantitative
Chapter 6
Data collection
Data collection is one of the most important activities in the
research process
Data can be collected in many ways,
using various types of methods
Data can be classified into
quantitative and qualitative
Primary sources
Data are collected by acquiring firsthand
information from the respondents
Analysis is conducted by the investigator(s) or
institution that collected the data
For instance, sending a mail survey to respondents
or interviewing them
Data can also be collected through observation:
the observations are documented in a form, and the
observation may be non-participatory or
participatory
Observation is mainly used in the qualitative approach
Observation
The researcher indicates the area that he
or she would like to observe
Takes note of important points related to the
observed area
The researcher can decide whether or not
to participate with the participants
For quantitative researchers, observation is
one way to gain an initial understanding
of the area to be studied
Questionnaire
Also known as a survey
It provides a quantitative or numeric
description of the trends, attitudes,
opinions or perceptions of a
population by studying a sample of
that population
An instrument of measurement
Interview
Quantitative data can also be
acquired from interviews, for instance
when asking participants about their
age or income level
Interviews can be categorised as
structured, semi-structured and
unstructured
Secondary sources
Any further analysis of an existing
dataset that produces results or
conclusions beyond those of the
first report on the inquiry
Often carried out by people other than
those who collected the data
Information is obtained from articles,
research reports, etc.
Quantitative data can be acquired from
databases developed by a company
Quantitative data
Data are used to classify groups.
Examples: numbers, quantity,
prevalence, incidence.
Variables can be classified as
physical (population, infrastructure),
social (poverty, slums), spatial (land
use, proximity) etc.
Measurement
Measurement encompasses the scales and scaling
used in measuring concepts, as well as the reliability
and validity of the measures used
Measurement in quantitative research should
fulfil:
1. Validity - are you measuring what you think you are measuring?
2. Objectivity - researchers stand outside the phenomena they study;
the data collected are free from bias
3. Reliability - does the instrument consistently measure what it intends to?
4. Accuracy
Measurement
Concerns how the variables in the
framework are measured
Operationalising the variables
Developing scales
Scaling
Scaling is the construction of instruments
for measuring abstract constructs,
e.g. using a Likert scale to measure
respondents' perception of the UUM
postgraduate programme
Example
Collect data on instruments yielding numeric scores
(John W. Creswell, Educational Research: Planning, Conducting, and
Evaluating Quantitative and Qualitative Research, 6.16)
Information to Collect:
Types of Data Measures
An instrument is a tool for
measuring, observing, or
documenting quantitative data.
Types of instruments
Performance measures (e.g., test
performance)
Attitudinal measures (measures feelings
toward educational topics)
Behavioral measures (observations of
behavior)
Factual measures (documents, records)
Locating or Developing an
Instrument for Data Collection
Look in published journal articles
Run an ERIC search with a descriptor for the
instrument you want in an online search to see if
there are articles that contain instruments
Check Tests in Print
Check Mental Measurements Yearbook published
by the Buros Center at the University of
Nebraska, Lincoln, NE (http:unl.edu/Buros)
Develop your own instrument
Variables
Quantitative Analysis involves the study
of variables.
Variables are attributes that vary across
cases, and/or within a case over time.
For example, gender, age, happiness,
political association, occupation,
number of students on a course
The process of going from a concept to
a variable involves operationalisation of
the research question.
When variables are produced in surveys,
they are often the product of closed
questions
Closed questions are questions in which
possible answers are given and the
respondent selects from them: all of
the questions in last week's survey
were closed, as you did not have the
option to write your own answers
Conceptualization
Conceptualization is the process of
specifying what we mean by a term: a
clear, verbal specification of your variable
(concept) so that others know what it is
and can place cognitive borders around it
Conceptualization
(continued)
If we hypothesized that lower social
status in college students directly
correlates with an increase in deviant
behavior, then what exactly is
"deviant behavior"? (the conceptual DV we
are measuring)
The Multi-Dimensions of
Conceptual Definitions
What is a Table ?
What is a Car ?
What is Religiosity ?
1. Ideological Dimension
2. Experiential Dimension
3. Ritualistic Dimension
4. Consequential Dimension
5. Intellectual Dimension
* What about TALL vs. ...?
Concepts in Research
Let's do a study to see if there is a direct
relationship between the social status of parents and
deviant behavior in college students.
Conflicts in Measurement Validity
1. All concepts are multi-dimensional
e.g. Social Status has the dimensions Power, Prestige
and Privilege, with indicators such as Income
The process of conceptualization and
operationalization links concepts to their indicators
Operationalization
In Social Research, there are
Operations of Measurement.
Operation: A procedure for identifying or indicating
the value of cases on a variable. (Instructions)
Operationalization: The process of specifying the
operations that will indicate the value of cases on a
variable
Example: a concept operationalized
as "days spent in the hospital"
Levels of Measurement
Nominal
Ordinal
Interval
Ratio
*CAUTION*
Levels of measurement can't be determined
without considering both the concept
and the measure (conceptualization and
operationalization)
Instrument/Measurement
Instrument development
~ adapted or adopted
~ adapted - an instrument acquired from a
publication (journal, book and so on) on a topic
related to the study and modified to suit it
~ adopted - based on an established
instrument, used without changing the original
content
Types of scales
Nominal scale
Ordinal scale
Interval scale
Ratio scale
Numerical rating scale - measuring
from poor to excellent: how would
you rate your supervisor from 0 to
10?
Thurstone
Rating the Potential Scale Items
Judges rate the items on an 11-point scale
1 = agreement indicates a very low
amount of the attribute
11 = agreement indicates a very high
amount of the attribute
Encourage judges to use the entire range of
the scale, assigning some statements to
each of the 11 values (sorting them into
11 piles)
Thurstone Scales:
Method of Equal-Appearing Intervals
Thurstone
Computing the Scale Score
Values for Each Item
Find the median and SD or inter-quartile
range of the judges' ratings for each item
Arrange the items in a table
Sort by median
Within items with the same median, sort by
SD or inter-quartile range
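The per-item step above - the median and inter-quartile range of the judges' ratings - can be sketched in Python. The ratings below are invented for illustration:

```python
from statistics import median

def item_stats(ratings):
    """Return (median, inter-quartile range) of one item's judge ratings (1-11)."""
    s = sorted(ratings)
    n = len(s)
    q1 = median(s[: n // 2])        # median of the lower half
    q3 = median(s[(n + 1) // 2:])   # median of the upper half
    return median(s), q3 - q1

# Nine hypothetical judges rating one candidate statement
med, iqr = item_stats([2, 3, 3, 4, 4, 4, 5, 5, 6])
print(med, iqr)  # 4 2.0
```

Sorting the candidate items by these two numbers (median first, then spread) gives the ordered table the slide describes.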
Thurstone
Select the Final Scale Items
1 (or 2 or 3) item(s) for each possible
scale score value
Prefer items with low variability among
judges
End up with 10-30 items
Items and scale score values are shown
on the next slide.
Thurstone
Administer the Final Scale
Randomize the order of the items
For each item, respondent chooses
Agree or Disagree
For each item the scale score is the
median from the judges ratings
Total Score = mean scale score for items
on which the respondent agreed.
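A minimal sketch of this scoring rule; the item names and scale values are hypothetical:

```python
def thurstone_score(scale_values, agreed):
    """Total score = mean of the scale-score values (judge medians)
    of the items the respondent agreed with."""
    chosen = [scale_values[item] for item in agreed]
    return sum(chosen) / len(chosen)

# Hypothetical scale-score values (median judge ratings on the 1-11 scale)
scale_values = {"q1": 2.5, "q2": 5.0, "q3": 8.5, "q4": 10.0}
print(thurstone_score(scale_values, ["q2", "q3"]))  # (5.0 + 8.5) / 2 = 6.75
```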
Thurstone
Thurstone scales are rarely used
these days
They are simply too much trouble to
create
Likert scales were developed in
response to this difficulty
Guttman Scaling
Define the Concept
Generate Potential Scale Items
Evaluating the Potential Items
For each item, judges are asked whether
someone high in the attribute would
agree with the statement (Yes or No)
Guttman
Conduct a Scalogram Analysis of
the Judges' Responses
Use special software to do this
If successful, it will create an ordered list
of items such that:
agreeing with the first item indicates you
have at least a little of the measured
attribute
agreeing with the second indicates you have
at least a little more of the attribute
etc.
Guttman
The scalogram analysis also computes a
scale-score value for each statement
See the example in Trochim's Internet
document Guttman Scaling (reproduced
on the next slide)
It is assumed that anybody who would
agree with the nth item would also agree
with all preceding items.
The order of the items may be
scrambled prior to administering the
scale.
Guttman
Administer the Final Scale
Respondents are asked to check the items
with which they agree
Respondent's score = sum of the scale-score
values of the checked items
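The scoring rule, plus a check of the cumulative assumption described above (agreeing with the nth item implies agreeing with all preceding items), can be sketched as follows; items and scale values are hypothetical:

```python
def guttman_score(scale_values, checked):
    """Respondent's score = sum of the scale-score values of checked items."""
    return sum(scale_values[item] for item in checked)

def fits_cumulative_pattern(ordered_items, checked):
    """True if the response pattern fits the Guttman assumption:
    no agreed item may follow a non-agreed one in the ordered list."""
    agreed = [item in checked for item in ordered_items]
    return all(a or not b for a, b in zip(agreed, agreed[1:]))

values = {"a": 1.0, "b": 2.0, "c": 3.0}  # hypothetical scale values
print(guttman_score(values, ["a", "b"]))                     # 3.0
print(fits_cumulative_pattern(["a", "b", "c"], ["a", "b"]))  # True
print(fits_cumulative_pattern(["a", "b", "c"], ["b"]))       # False
```

Real scalogram software also reports how far observed responses deviate from this perfect pattern.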
Likert Scales
Define the Concept
Generate Potential Items
About 100 statements.
On some, agreement indicates being
high on the measured attribute
On others, agreement indicates being
low on the measured attribute
Likert
Instead of a dichotomous response scale
(agree or disagree), use a multi-point
response scale (e.g. 1 = strongly disagree
to 5 = strongly agree)
Likert
Evaluating the Potential Items
Get judges to evaluate each item on a
5-point scale (1 to 5)
Likert
Alternate Method of Item
Evaluation
Ask some judges to respond to the items
in the way they think someone high in
the attribute would respond.
Ask other judges to respond as would
one low in the attribute.
Prefer items that best discriminate
between these two groups
Also ask judges to identify items that are
unclear or confusing.
Likert
Pilot Test the Items
Administer to a sample of persons from
the population of interest
Conduct an item analysis
Prefer items which have high item-total
correlations
Consider conducting a factor analysis
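The item analysis above (item-total correlations) can be sketched in pure Python. The pilot data are invented, and for simplicity each item's total includes the item itself; a stricter analysis would use the corrected, item-removed total:

```python
from statistics import mean, pstdev

def pearson(x, y):
    """Pearson correlation; assumes both x and y vary."""
    mx, my = mean(x), mean(y)
    cov = mean((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (pstdev(x) * pstdev(y))

def item_total_correlations(responses):
    """responses: one dict per respondent, mapping item -> score."""
    items = list(responses[0])
    totals = [sum(r.values()) for r in responses]
    return {i: pearson([r[i] for r in responses], totals) for i in items}

# Hypothetical pilot data from four respondents on three items
data = [
    {"q1": 1, "q2": 2, "q3": 5},
    {"q1": 2, "q2": 2, "q3": 4},
    {"q1": 4, "q2": 3, "q3": 2},
    {"q1": 5, "q2": 3, "q3": 1},
]
corrs = item_total_correlations(data)  # keep items with high values
```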
Likert
Administer the Final Scale
On each item, the response indicating the
least amount of the attribute is scored 1
The next-least response is scored 2,
and so on
Respondent's total score = sum of the item
scores or mean of the item scores
Deal with nonresponses on some
items
Reflect negatively worded items (reverse scoring)
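A sketch of the scoring steps above, including reverse scoring of negatively worded items; the item names and responses are hypothetical:

```python
def likert_total(responses, reverse_items, points=5):
    """Sum item scores; reverse-scored items are reflected (1<->5, 2<->4, ...)."""
    total = 0
    for item, value in responses.items():
        total += (points + 1 - value) if item in reverse_items else value
    return total

# Hypothetical 4-item scale in which q2 is negatively worded
raw = {"q1": 4, "q2": 2, "q3": 5, "q4": 3}
print(likert_total(raw, reverse_items={"q2"}))  # 4 + (6 - 2) + 5 + 3 = 16
```

Dividing the total by the number of items gives the mean-score variant mentioned above.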
Semantic Differential
Respondents rate a concept on scales anchored
by bipolar adjective pairs, e.g.:
Bad 1___2___3___4___5 Good
Goodness of Measurement
Instrument
To know whether the instrument
measures what it is supposed to
measure
To know whether it consistently
measures what it intends to measure
Can be tested through reliability and
validity tests
VALIDITY
How do I know that the method I am using really
measures what I want it to measure?
Face validity. Does your method appear appropriate, at
first glance, for measuring what you want it to measure?
Content validity. Similar to face validity, except that
it refers to the initial assessment from an expert's point of
view.
Predictive validity. Can your measures predict future
behaviour?
Construct validity. Do your data correlate with other
measures?
Reliability
Factor Analysis
Sensitivity
Exercise # 1
Based on the journal article, please
identify and explain:
a) The framework, including the variables
involved in the study (independent,
dependent and others)
b) The objectives or research questions
c) The research design
d) The data collection method, including
the sampling technique, sample size,
population and instrument
e) The data analysis used in the study
The End
Q & A Session