Research and Methods Outline

What is Research?

  • Evidence-based Argumentation: Make predictions based on evidence, as opposed to assumptions

  • Induction: Learning about the world through observation

Why?

  • Tools to conduct good research

  • Ability to evaluate the research of others

  • Basis for giving informed opinions

Scientific research is a strategy for generating reliable knowledge to address problems

  • In our case, the problems faced by public managers

  • “The best decisions made by public sector managers are based not on instinct, but on an informed understanding of what’s happening on the ground” (Eller, Gerber, & Robinson, 2013)

Toulmin Argument in Public Administration

  • Claim: the statement you seek to evaluate

    • Specific, falsifiable, relevant

  • Reason/evidence:

    • The data and reasons offered in support of the claim; gathering this evidence is why we do research

  • Warrant:

    • The link explaining how the evidence provides sufficient grounds for confidence in the claim

Descriptive Research: Who? What? When? How many?

  • Example: Is the water safe to drink in Flint, Michigan?

Explanatory Research: Why? How?

  • Example: Why does Flint river water have high levels of lead?

Causation & Correlation:

Correlation does not imply causation

Causal inference: X causes Y

Correlation: X and Y vary together in a pattern, which may or may not be causal
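
A minimal sketch in Python, using made-up numbers, of how a correlation might be computed; even a strong r establishes only a pattern, not a causal link:

```python
import numpy as np

# Hypothetical data for illustration only: water lead levels (x) and
# counts of reported health complaints (y) across six neighborhoods.
x = np.array([2.1, 3.4, 5.0, 6.2, 7.8, 9.1])
y = np.array([10, 14, 18, 22, 27, 31])

# Pearson's r measures the strength of the linear association between
# x and y (a pattern); it says nothing by itself about whether x causes y.
r = np.corrcoef(x, y)[0, 1]
print(f"correlation r = {r:.2f}")
```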

Understand the initial tasks of research and evaluation

3 stages of research:

  • Formulating Research Question and Hypothesis

  • Collecting Relevant Data

  • Analyzing Data

Writing a literature review

Purpose:

Substantive introduction

Explanation of importance

Context-setting


Creating an annotated bibliography: A comprehensive listing of major research articles, reports, books, and other similar sources of information about a specific topic, with a brief summary of each source

Understand the techniques used in obtaining general background information:

  • Conducting preliminary interviews with key informants or subject matter experts

  • Identifying and understanding the basic types of data-gathering techniques

Conducting Background Interviews with Subject Matter Experts (SMEs):

Persons with specialized knowledge, expertise, and experience within a particular policy domain or program area

The Value of SME Interviews:

Identify core policy issues and challenges

Identify core research challenges

Assess critical questions

Data Gathering:

Field Research

Surveys

Existing Data Sets

Creating an Original Data Set

Research Design

  • Basic challenge

    • Develop an understanding of exactly how to approach the data collection process when attempting to evaluate or assess some program or policy

  • Research design

    • Refers to the general process by which data gathering efforts are structured and defined; that is, what is to be studied and how, what variables are to be included in the study, how they are measured in relation to one another, and how those data are gathered

Basic Concepts for Research Design

  • Experiment: An activity where a researcher controls or manipulates the conditions under which some sort of subject is examined in order to observe and measure a specific cause-and-effect relationship

  • Treatment: A variable or condition that the researcher introduces into the experiment in order to see whether it has an effect on the subjects

  • Control group: Those subjects in the experiment that do not receive the treatment
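
A minimal simulated sketch of the three concepts above; the subject counts, baseline score, and treatment effect are all hypothetical:

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical experiment: 100 subjects, half randomly assigned to the
# treatment group and half to the control group (no treatment).
subjects = list(range(100))
random.shuffle(subjects)
treatment_group, control_group = subjects[:50], subjects[50:]

# Simulated outcomes: assume (for illustration) the treatment raises
# the outcome by about 5 points over a noisy baseline of ~50.
def measure_outcome(treated: bool) -> float:
    return random.gauss(50, 10) + (5 if treated else 0)

treated_scores = [measure_outcome(True) for _ in treatment_group]
control_scores = [measure_outcome(False) for _ in control_group]

# The difference in group means estimates the treatment's effect.
effect = statistics.mean(treated_scores) - statistics.mean(control_scores)
print(f"estimated treatment effect: {effect:.1f}")
```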

Ethics and Research Design

  • Issues of potential bias and conflicts of interest in the conduct of a research or evaluation study are of paramount concern

  • Issues of subject selection and subject inclusion

Ethics and the Research Process

It is critical to understand the importance of protecting the rights of other persons in any project that involves gathering any type of data from them.

Example: Tuskegee Syphilis Study (withholding treatment from subjects)

Protections:

  • Institutional Review Boards (IRB) for University Research

  • Federal Guidelines for biomedical research

Measurement:

Understanding the Basics of Measurement

  • Operationalization: translating a concept of interest into a form that can be measured (e.g., measuring "school quality" by graduation rates)

  • Understanding what to study and what to measure

Variables

Qualitative vs. Quantitative Methods

  • Qualitative - goal is to examine, understand, and describe a phenomenon

    • Differences between categories are non-numerical, usually designated by words or labels (e.g., gender)

  • Quantitative - goal is to analyze and represent the relationship mathematically through statistical analysis

    • Values consist of numbers, and differences between values can be expressed in numbers

Dependent vs. Independent variables

  • Dependent - the outcome being tested and observed

  • Independent - the presumed influence; stands alone and is not changed by the other variables in the study

Levels of Measurement

  • Nominal Variables: Names/ Labels

    • Race/ethnicity, gender, colors

  • Ordinal Variables: Order

    • Agree / disagree scales, ranked orders, variables with discrete categories

  • Interval Variables / Ratio Variables (ratio variables also have an absolute zero)

    • Temperature (interval); age, income, any continuous variable (ratio)
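
As a quick illustration in Python (the field names and values below are hypothetical), here is how the four levels might appear for a single survey respondent:

```python
# One hypothetical survey respondent, illustrating the levels of measurement.
respondent = {
    "race_ethnicity": "Black",  # nominal: a label with no inherent order
    "satisfaction": 4,          # ordinal: the 1-5 order matters, spacing does not
    "temperature_f": 68.5,      # interval: equal spacing, no true zero
    "income": 42000,            # ratio: absolute zero, so ratios are meaningful
}
print(respondent)
```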

Sampling Methods

  • Random Sampling - every person in a known population gets an even chance of being selected

  • Systematic Sampling - every Nth unit or element from the sample frame list gets selected

  • Stratified Sampling - ensures actual representation of the population: the researcher divides a heterogeneous population into several strata and then takes a sample from each stratum, usually with the size of each stratum's sample proportional to that stratum's share of the overall population

  • Snowball Sampling - non-probability sampling in which the researcher relies on referrals from initial subjects to identify other population members
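
A minimal sketch of the first three methods in Python; the sampling frame and neighborhood labels are made up, and snowball sampling is not simulated because it depends on human referrals:

```python
import random

random.seed(1)  # reproducible illustration

# Hypothetical sampling frame: 1,000 residents, each tagged with a
# made-up neighborhood label used for stratification.
frame = [{"id": i, "area": random.choice(["north", "south", "east"])}
         for i in range(1000)]
sample_size = 100

# Random sampling: every unit has an equal chance of selection.
simple = random.sample(frame, sample_size)

# Systematic sampling: pick a random start, then take every Nth unit.
step = len(frame) // sample_size
systematic = frame[random.randrange(step)::step]

# Stratified sampling: sample each stratum in proportion to its size
# (totals may be off by one or two due to rounding).
stratified = []
for area in ("north", "south", "east"):
    stratum = [u for u in frame if u["area"] == area]
    k = round(sample_size * len(stratum) / len(frame))
    stratified.extend(random.sample(stratum, k))

print(len(simple), len(systematic), len(stratified))
```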

Assessing Measurement Validity and Reliability

Validity assessment approach

  • Face validity - whether the test appears, on its face, to measure its stated aim

  • Content validity - how well a test measures the full range of the behavior it is intended to measure

Reliability assessment approach

  • Test-retest - a measure of a test's consistency: administer the same test to the same subjects at two points in time and compare the results
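
A minimal sketch of a test-retest check in Python; the respondents and scores are made up for illustration:

```python
import numpy as np

# Hypothetical test-retest check: the same scale is administered to the
# same seven respondents at two points in time.
time1 = np.array([12, 15, 9, 20, 17, 11, 14])
time2 = np.array([13, 14, 10, 19, 18, 12, 15])

# One common summary of test-retest reliability is the correlation
# between the two administrations; a high r suggests a consistent measure.
r = np.corrcoef(time1, time2)[0, 1]
print(f"test-retest reliability r = {r:.2f}")
```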

Issues of Validity and Inference:

Internal Validity:

Understanding the nature of internal validity, and the threats to it, is critical to drawing causal inferences

    • Did you actually measure (accurately and validly) the concepts and variables you intended to?

    • Did you account for, or rule out, competing explanations?

  • Threats to Internal Validity

    • Threat of history

    • Threat of maturation

    • Threat of testing

    • Threat of instrumentation

    • Threat of regression to the mean

    • Threat of selection bias

    • Threat of mortality

External validity:

Can the results be generalized to other settings beyond the specific sample gathered for the purposes of conducting a given study?

  • Threats to external validity:

    • Involve interaction of the nature of the study (and its treatment) with specific aspects of conducting an experiment (i.e., an interaction between treatment and testing)

Type of Experiment

True Experiment

  • Subjects are randomly assigned to treatment conditions

  • All conditions are completely controlled by the researcher

  • Excellent for showing cause and effect relationships

  • High on internal validity

Quasi-Experiment

  • Natural experiments

  • Subjects are already in the treatment condition; the experimenter has no control over assignment

Similarities

  • Study participants are subjected to some type of treatment or condition

  • Some outcome of interest is measured

  • The researcher tests whether differences in this outcome are related to the treatment