SURVEY RESEARCH DESIGN
"Advice is what we ask for when we already know the answer" (Erica Jong)
The basic idea behind survey methodology is to measure variables by asking people questions and then to examine relationships among the variables. In most instances, surveys attempt to capture attitudes or patterns of past behavior. About the only design choice is whether to ask people questions once or over time. The most common survey uses the cross-sectional design, which asks questions of people at one point in time. These kinds of surveys are highly fallible because the researcher may not be able to determine the direction of causal relationships. Adding retrospective (past behavior) and prospective (future propensity) items to a cross-sectional survey may help, but it is generally more useful to have a longitudinal design, which asks the same questions at two or more points in time. The three subtypes of longitudinal design are: the trend study, which is basically a repeated cross-sectional design, asking the same questions of different samples of the target population at different points in time; the cohort study, which is a trend study that tracks changes over time in cohorts (groups of people who share a life event, or who belong to the same organization or location); and the panel study, which asks the same questions of the same people time after time. Trend studies essentially look at how concepts change over time; cohort studies at how historical periods change over time; and panel studies at how people change over time.
Surveys vary widely in sample size and sampling design. A distinction can be made between large-scale, small-scale, and cross-cultural studies. Large-scale probability surveys are the ideal, especially where the target population is a whole country, like the United States. Typical large-scale surveys of a national population use a sample size of 1500-3000 respondents, but can run much larger. Small-scale surveys (also called microsamples) sometimes run the risk of sliding into nonprobability sampling, as with typical graduate student research, which usually uses a sample size of 200-300 respondents. Political opinion polling generally requires a minimum of 400 respondents for any kind of reliable survey. Comparative, or cross-cultural, surveys usually involve 3-6 nations, with sample sizes of about 1000 people per nation.
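The sample-size thresholds above track the statistical margin of error for an estimated proportion. As a rough illustration (my sketch, not from the original text), the worst-case 95% margin of error shrinks from about 5 points at 400 respondents to under 2 points at 3000:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a sample proportion (worst case p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# The thresholds discussed in the text:
for n in (400, 1000, 1500, 3000):
    print(f"n = {n:4d}: +/- {margin_of_error(n) * 100:.1f} points")
```

Note the diminishing returns: quadrupling the sample only halves the margin of error, which is why national polls rarely go far beyond a few thousand respondents.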
The term "survey" actually refers to one of two procedures, or some combination of them: questionnaires and interviews. A questionnaire is almost always self-administered, allowing respondents to fill it out themselves; all the researcher has to do is arrange delivery and collection. An interview typically occurs whenever a researcher and respondent are face-to-face or communicating via some technology, like telephone or computer. There are three subtypes of interview: unstructured, which allows spontaneous communication in the course of the interview or questionnaire administration; structured, where the researcher is highly restricted in what can be said; and semistructured, which restricts certain kinds of communication but allows freedom of discussion on certain topics.
Although surveys can be a cost-effective type of research, survey research design suffers from inherent weaknesses. The greatest weakness is probably that all surveys are basically exploratory. You can make inferences, but not at the level of cause-and-effect, and you cannot rule out rival hypotheses as you can with experimental or quasi-experimental research. Other survey weaknesses include:
Reactivity -- respondents tend to give socially desirable responses that make them look good or seem to be what the researcher is looking for
Sampling Frame -- it's difficult to access the proper number and type of people who are needed for a representative sample of the target population
Nonresponse Rate -- a lot of people won't participate in surveys, or drop out
Measurement Error -- surveys are often full of systematic biases, and/or loaded questions
Researchers who plan to use questionnaires usually start by writing the questions themselves. After a rough draft is created, the researcher then analyzes the questions to see which ones are related to the variables list. The variables list contains the key concepts or theoretical constructs contained in the research question and/or hypotheses. Care is taken to ensure that the questions cover every concept, and that there is no duplication or excessive coverage of any one concept. Terminology is important at this point, and some researchers try to match jargon with the operational definitions of their concepts. Generally, the more highly specialized your respondents, the more jargon the researcher can use; the less specialized, the more plain, everyday language is called for. A questionnaire, of course, can contain scales and indexes from the extant literature.
There are many guides, workshops, and seminars on question wording (see Internet Resources below), but the main issue in most social science research is how to ask incriminating questions. The situation is analogous to interrogation in law enforcement. For example, there are at least five different ways to ask somebody if they killed their wife:
Did you happen to have murdered your wife?
As you know, many people kill their wives nowadays. Did you happen to have killed yours?
Do you know about other people who have killed their wives? How about yourself?
Thank you for completing this survey, and by the way, did you kill your wife?
Three cards are attached to this survey. One says your wife died of natural causes; one says you killed her; and the third says Other (explain). Please tear off the cards that do not apply, leaving the one that best describes your situation.
These are rather extreme examples, but you can collect an amazing amount of self-incriminatory information with a well-worded questionnaire. This particular problem is referred to as the loaded question, and the classic example is "Do you still beat your wife?" Nobody, except maybe the most grandiose criminal, is going to admit to criminal activity, despite assurances of confidentiality by the researcher. Questions about deviant sexual activity and sources of illicit income also tend to produce a low response rate. However, there are things you can do to take some of the unreliability out of loaded questions. First of all, avoid any "guilt trips" or appeals to a sense of duty in the questions; avoid asking if respondents exercised their patriotic duty or did the "right" thing. Instead, do what most survey researchers do, and include an attractive cover sheet containing an official-looking logo or letterhead. Something along the lines of "Official Questionnaire: American Institute of Integrity in Public Opinion" is again an extreme example; your organization's own logo or letterhead will usually suffice. Studies have found that the more attractive, expensive, and official-looking the questionnaire, the higher the response rate. Colored paper, for example, is better than white paper for surveys.
Other ways to increase response rate involve timing and remuneration. Timing is the name for a variety of techniques involving pre-survey phone calls or postcards telling respondents that a survey is coming their way soon. After the survey has been mailed or delivered, timing also involves a follow-up "friendly" reminder to complete the survey. Sometimes respondents will admit to things on the survey just to make the reminders stop. Remuneration takes many forms. "In the name of science" and "help me out with my class research project" appeals do not usually increase response rates. Some respondents will also take you up on any offer to receive a copy of your finished research report. The best incentive is cash, attached to the questionnaire, so that respondents feel guilty about keeping the money and not answering the survey.
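Whether these techniques work can be checked by computing the response rate itself. As a rough sketch (a simplified version of the standard response-rate formula; the numbers are made up for illustration):

```python
def response_rate(completes, refusals, noncontacts):
    """Completed interviews divided by all eligible cases.

    A simplified minimum response rate: eligible cases are completes
    plus refusals plus noncontacts (known-ineligible cases excluded).
    """
    eligible = completes + refusals + noncontacts
    return completes / eligible

# Hypothetical mail survey: 1000 eligible addresses, 600 returns.
print(f"{response_rate(600, 250, 150):.0%}")
```

Full-dress formulas also apportion cases of unknown eligibility, but the core idea is the same ratio.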
Personalization also increases response rate. Handwritten P.S. messages, along with anything personal about the researcher's qualifications and previous publications, are the kinds of things that respondents like to read. Other personal touches include endorsements from prominent individuals, something like "I'm former President Clinton, and I can assure you this research is bona fide; I even admitted I had sex with that woman on this survey." The greater the visibility of any organization or individual sponsoring or endorsing your research, the more likely you'll get a high response rate.
The order of questions is an important consideration. Although it's commonplace, demographic information like age, sex, and race is best located in the middle or at the end of the questionnaire; people tire of seeing surveys ask for basic information up front. You should begin with some question that immediately captures interest. It doesn't even have to be a question dealing with your research question, just something topical. Say you're doing research on life imprisonment; a good starter (filler) question would ask about attitudes toward the death penalty, since that's a more popular topic. You also want to include reversal questions, which ask for the same information, but in reverse. For example, early in your questionnaire you ask "Do you feel the criminal justice system is fair?"; later, you ask "Do you feel the criminal justice system is unfair?" The responses to both questions should be roughly equivalent, although one should be Strongly Disagree while the other should be Strongly Agree. Reversal questions serve as a check on lying and complacency. There are also known lie scales you can include in your survey, with items such as "I always tell the truth" or "I never feel sad." Generally, you want to use a Guttman or index approach to criminal justice subject matter, building up to the things you're really interested in, as in:
How many times in the last year have you done the following to your wife?
1. Ignored her
2. Mistreated her
3. Had mild arguments with her
4. Had heated arguments with her
5. Slurred her reputation
6. Attempted to rape her
7. Brandished a weapon at her
8. Gave her a bruise
9. Knocked her unconscious
10. Killed her
You get the idea -- you surround your key question with build-up questions, fillers, filters, and distracters. The craft of questionnaire design is to do all this mixing up and still maintain what looks like a usable and consistent set of questions. In fact, you ought to provide respondents with short transition paragraphs when you switch gears, as in "Now you're going to be asked about a completely different topic...." There's much, much more to the art of questionnaire design, and you should avail yourself of a complete college course on the logic of survey design.
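The build-up list above is a Guttman-style (cumulative) item set: endorsing a severe item implies endorsing the milder ones. A minimal scoring sketch, assuming items are ordered least to most severe (my illustration, not a procedure from the text), assigns each respondent the count of endorsed items and flags deviations from the ideal cumulative pattern:

```python
def guttman_score(responses):
    """Score a Guttman-style item set ordered least -> most severe.

    responses: list of booleans, True = item endorsed.
    Returns (scale_score, errors): scale_score is the number of
    endorsed items; errors counts deviations from the ideal cumulative
    pattern (all True up to the score, all False after), a rough
    per-respondent reproducibility check.
    """
    score = sum(responses)
    ideal = [i < score for i in range(len(responses))]
    errors = sum(r != i for r, i in zip(responses, ideal))
    return score, errors

# A perfectly cumulative respondent endorses only the first 3 items:
print(guttman_score([True, True, True, False, False]))   # (3, 0)
# An inconsistent respondent skips item 2 but endorses item 4:
print(guttman_score([True, False, True, True, False]))   # (3, 2)
```

Aggregating the error counts across respondents gives the scale's coefficient of reproducibility, the usual test of whether the items really form a Guttman scale.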
The general rule for interviewing is to record responses verbatim. This usually means you should use some type of recording device or write down word-for-word what the respondent says. To get at incriminating information, you can shut off the recording device and try to write down what was said later. Structured interviews, of course, use precoded response categories (SA, A, D, SD), which you can tailor to more sophisticated responses depending upon feedback from your pretest (a lot, a little, hardly any, none at all). This requires you to be familiar with the terminology and jargon used in the population.
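Precoded categories like these are usually converted to numeric scores for analysis, with the reversal items described earlier scored in the opposite direction. A hedged sketch (the 4-point coding is my assumption, not specified in the text):

```python
# Map precoded Likert responses (SA, A, D, SD) to numeric scores.
CODES = {"SA": 4, "A": 3, "D": 2, "SD": 1}

def score_item(response, reversed_item=False):
    """Score one precoded response; flip the scale for reversal items,
    so Strongly Agree to a negatively worded question counts the same
    as Strongly Disagree to its positively worded twin."""
    value = CODES[response]
    return 5 - value if reversed_item else value

# "The system is fair": SA -> 4; "The system is unfair": SA -> 1
print(score_item("SA"))                      # 4
print(score_item("SA", reversed_item=True))  # 1
```

Comparing the scored fair/unfair pair then gives the lying-and-complacency check the reversal questions are designed for.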
Unstructured or semistructured interviews allow you to explore various issues in depth with respondents. If you start getting into life history, you're probably doing depth interviewing, which is something completely different. It is all right, however, for you, the interviewer, to talk about how you would answer a question, as long as this is to clarify the purpose of the question or set up an instructional pattern. Self-disclosure should be avoided if it seems to be leading to interviewer bias. Interviews are wonderful opportunities to impress the importance of confidentiality upon respondents.
A somewhat important issue with interviewing is time of day. Some people are diurnal and others are nocturnal, meaning they are more talkative during the day or at night, respectively. Many criminal justice populations are nocturnal, so you get the best information at night; however, safety issues must be kept in mind. Interviewers should be neither overdressed nor underdressed, and some time should be spent at the beginning building rapport with the respondent.
Be prepared to use probes. Probes, or probing questions, are follow-ups for responses like "Hmm" or "I guess so"; a good probe is "What did you mean by that?" Don't be satisfied with monosyllabic answers. Simple yes or no answers usually call for probing, unless the protocol suggests otherwise. Always exit the interview diplomatically, so that you haven't ruined it for others who might follow you.
Telephone interviews are usually better than computer interviews, although neither substitutes for the good observational opportunities of face-to-face interviewing. The most common sampling procedure with telephones is random digit dialing. The most common computer method is a web-based series of questions allowing for chat or bulletin-board posting. Various software programs exist that can be loaded onto laptops and used to guide face-to-face interviews, and other technology exists to content-analyze keywords captured by recording or computer devices.
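Random digit dialing can be sketched in a few lines: draw random numbers within chosen area codes so that unlisted numbers have the same chance of selection as listed ones. A toy illustration (the area codes are arbitrary examples; real RDD frames restrict to known working exchanges and number banks):

```python
import random

def random_digit_dial(area_codes, n, seed=None):
    """Generate n random 10-digit phone numbers within the given area codes.

    A toy random digit dialing frame: every possible line in the chosen
    area codes is equally likely, listed or not.
    """
    rng = random.Random(seed)  # seed for a reproducible call sheet
    sample = []
    for _ in range(n):
        area = rng.choice(area_codes)
        exchange = rng.randint(200, 999)  # exchanges don't start with 0 or 1
        line = rng.randint(0, 9999)
        sample.append(f"({area}) {exchange}-{line:04d}")
    return sample

for number in random_digit_dial(["615", "931"], 3, seed=42):
    print(number)
```

In practice the generated numbers are screened for nonworking and business lines before interviewers ever dial.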
1. Identify 3 things to generally avoid in question writing, and explain why, in your own words.
2. How does ordinary conversation differ from interviewing?
3. Explain why a web-based survey might not be a good idea.
1. What kind of survey method (cross-sectional, trend, cohort, panel) AND procedure (questionnaire, structured interview, unstructured interview, semistructured interview) would be appropriate for the following target populations?
A. Maximum-security adult lifers in prison
B. Homeless heroin users who are dying of AIDS
C. 18-24 year old traditional-aged college students
D. Medical professionals suspected of welfare fraud
E. Captured serial killers
F. Citizens who are subject to a new tax law
1. Design a questionnaire consisting of 10-15 questions which induces respondents to answer with some very personal information about themselves.
INTERNET RESOURCES
American Association for Public Opinion Research (2011). AAPOR website, last accessed July 17, 2011
Wikipedia entry on Statistical Survey
REFERENCES
Babbie, E. (1990). Survey Research Methods. Belmont, CA: Wadsworth.
Converse, J. & S. Presser. (1986). Survey Questions. Beverly Hills, CA: Sage.
Dillman, D. (1978). Mail and Telephone Surveys: The Total Design Method. NY: Wiley.
Fowler, F. (1993). Survey Research Methods. Beverly Hills, CA: Sage.
Hagan, F. (2000). Research Methods in Criminal Justice and Criminology. Boston: Allyn & Bacon.
Labaw, P. (1980). Advanced Questionnaire Design. Cambridge, MA: Abt Books.
Lasley, J. (1999). Essentials of Criminal Justice and Criminological Research. NJ: Prentice Hall.
Nachmias, D. & C. Nachmias. (1981). Research Methods in the Social Sciences. NY: St. Martin's.
Neuman, L. & B. Wiegand. (2000). Criminal Justice Research Methods. Boston: Allyn & Bacon.
Rosenberg, M. (1968). The Logic of Survey Analysis. NY: Basic.
Rossi, P. et al. (1983). Handbook of Survey Research. NY: Academic.
Senese, J. (1997). Applied Research Methods in Criminal Justice. Chicago: Nelson Hall.
Sudman, S. & N. Bradburn. (1983). Asking Questions: A Practical Guide to Questionnaires. San Francisco: Jossey-Bass.
Last updated: July 17, 2011
Not an official webpage of APSU, copyright restrictions apply, see Megalinks in Criminal Justice
O'Connor, T. (2011). "Survey Research Design," MegaLinks in Criminal Justice. Retrieved from http://www.drtomoconnor.com/3760/3760lect04.htm accessed on July 17, 2011.