Module 3 Assignment 1

Mixed Methods Research Designs in Counseling Psychology

William E. Hanson, University of Nebraska—Lincoln
John W. Creswell, University of Nebraska—Lincoln and University of Michigan
Vicki L. Plano Clark and Kelly S. Petska, University of Nebraska—Lincoln
J. David Creswell, University of California, Los Angeles

Journal of Counseling Psychology, 2005, Vol. 52, No. 2, 224–235. Copyright 2005 by the American Psychological Association. DOI: 10.1037/0022-0167.52.2.224

Author note: William E. Hanson and Kelly S. Petska, Department of Educational Psychology, University of Nebraska—Lincoln. John W. Creswell, Department of Educational Psychology, University of Nebraska—Lincoln; Office of Qualitative and Mixed Methods Research, University of Nebraska—Lincoln; and Department of Family Medicine, University of Michigan. Vicki L. Plano Clark, Department of Educational Psychology, University of Nebraska—Lincoln; Office of Qualitative and Mixed Methods Research, University of Nebraska—Lincoln; and Department of Physics and Astronomy, University of Nebraska—Lincoln. J. David Creswell, Department of Psychology, University of California, Los Angeles. An earlier version of this article was presented at the 111th Annual Convention of the American Psychological Association, Toronto, Ontario, Canada, August 2003. We thank Patricia Cerda and Carey Pawlowski, who assisted in identifying and locating published mixed methods studies, and Beth Haverkamp for her helpful conceptual feedback on this article. Correspondence concerning this article should be addressed to William E. Hanson, Counseling Psychology Program, 228 TEAC, University of Nebraska—Lincoln, Lincoln, NE 68588-0345, or to John W. Creswell, Department of Educational Psychology, 241 TEAC, University of Nebraska—Lincoln, Lincoln, NE 68588-0345. E-mail: [email protected] or [email protected]

With the increased popularity of qualitative research, researchers in counseling psychology are expanding their methodologies to include mixed methods designs. These designs involve the collection, analysis, and integration of quantitative and qualitative data in a single or multiphase study. This article presents an overview of mixed methods research designs. It defines mixed methods research, discusses its origins and philosophical basis, outlines the steps and procedures used in these designs, and identifies six different types of designs. Important design features are illustrated using studies published in the counseling literature. Finally, the article ends with recommendations for designing, implementing, and reporting mixed methods studies and with a discussion of their viability and continued usefulness in the field of counseling psychology.

Over the past 25 years, numerous calls for increased methodological diversity and alternative research methods have been made (Gelso, 1979; Goldman, 1976; Howard, 1983). These calls have led to important discussions about incorporating qualitative methods in counseling research and including qualitative studies in traditional publication outlets (Hoshmand, 1989; Maione & Chenail, 1999; Morrow & Smith, 2000). They have also led to discussions about integrating quantitative and qualitative methods, commonly referred to as mixed methods research.

In the social sciences at large, mixed methods research has become increasingly popular and may be considered a legitimate, stand-alone research design (Creswell, 2002, 2003; Greene, Caracelli, & Graham, 1989; Tashakkori & Teddlie, 1998, 2003). It may be defined as “the collection or analysis of both quantitative and qualitative data in a single study in which the data are collected concurrently or sequentially, are given a priority, and involve the integration of the data at one or more stages in the process of research” (Creswell, Plano Clark, Gutmann, & Hanson, 2003, p. 212). When both quantitative and qualitative data are included in a study, researchers may enrich their results in ways that one form of data does not allow (Brewer & Hunter, 1989; Tashakkori & Teddlie, 1998). Using both forms of data, for example, allows researchers to simultaneously generalize results from a sample to a population and to gain a deeper understanding of the phenomenon of interest.

It also allows researchers to test theoretical models and to modify them based on participant feedback. Results of precise, instrument-based measurements may, likewise, be augmented by contextual, field-based information (Greene & Caracelli, 1997).

Despite the availability of mixed-methods-related books, chapters, and journal articles, virtually nothing has been written about mixed methods research designs in applied psychology, generally, or in counseling psychology, specifically. A cursory examination of the three editions of the Handbook of Counseling Psychology (e.g., Brown & Lent, 2000), of popular research design texts (e.g., Heppner, Kivlighan, & Wampold, 1999), and of mainstream, peer-reviewed journals (e.g., Journal of Counseling & Development, The Counseling Psychologist) reinforces this assertion. The general absence of discussions on mixed methods research designs may be due to a number of factors, including the historical precedent of favoring quantitative and experimental methods in psychology (Gergen, 2001; Waszak & Sines, 2003), the difficulty in learning and applying both types of methods (Behrens & Smith, 1996; Ponterotto & Grieger, 1999), and the general lack of attention given to diverse methodological approaches in graduate education and training (Aiken, West, Sechrest, & Reno, 1990). However, with so few resources available, answers to the following types of questions remain elusive and somewhat difficult to find: What is mixed methods research? What types of mixed methods studies have been published in counseling? How should mixed methods studies be conducted and reported in the literature?

The purpose of this article is to help answer these questions by introducing mixed methods research designs to counseling psychologists. Our goal is to help counseling researchers and educators become more familiar with mixed methods terminology, procedures, designs, and key design features. Articles by Goodyear, Tracey, Claiborn, Lichtenberg, and Wampold (2005) and Beck (2005) introduce two specific methodological approaches—ideographic concept mapping and ethnographic decision tree modeling, respectively—and serve to further familiarize researchers and educators with mixed methods research designs.

The present article is divided into three sections. In the first section, we present an overview of mixed methods research, including its origins and philosophical basis, rationales, basic steps in designing a mixed methods study, and procedural notations. We also present a typology for classifying different types of mixed methods research designs. In the second section, we use mixed methods studies published in counseling to illustrate each of the designs and key design features discussed. In the third and final section, we offer recommendations for conducting and publishing mixed methods research.

Overview of Mixed Methods Research

The historical evolution of mixed methods research has not been traced completely by any one author or source, although Datta (1994) and Tashakkori and Teddlie (1998, 2003) have identified many of the major developmental milestones. The brief overview presented here attempts to incorporate and build on their analyses.

Origins and Philosophical Basis

The use of multiple data collection methods dates back to the earliest social science research. It was, however, Campbell and Fiske’s (1959) study of the validation of psychological traits that brought multiple data collection methods into the spotlight. In their classic study, the multitrait–multimethod matrix was designed to rule out method effects; that is, to allow one to attribute individual variation in scale scores to the personality trait itself rather than to the method used to measure it. Although Campbell and Fiske focused on collecting multiple quantitative data, their work was instrumental in encouraging the use of multiple methods and the collection of multiple forms of data in a single study (Sieber, 1973). Taken one step further, the term triangulation, borrowed from military naval science to signify the use of multiple reference points to locate an object’s exact position, was later used to suggest that quantitative and qualitative data could be complementary.

Each could, for example, “uncover some unique variance which otherwise may have been neglected by a single method” (Jick, 1979, p. 603).

Over time, mixed methods research has gradually gained momentum as a viable alternative research method. Over the past 15 years, at least 10 mixed methods textbooks have been published (Bamberger, 2000; Brewer & Hunter, 1989; Bryman, 1988; Cook & Reichardt, 1979; Creswell, 2002, 2003; Greene & Caracelli, 1997; Newman & Benz, 1998; Reichardt & Rallis, 1994; Tashakkori & Teddlie, 1998). Recently, the Handbook of Mixed Methods in Social and Behavioral Research was published (Tashakkori & Teddlie, 2003). In addition, journals such as Field Methods and Quantity and Quality are devoted to publishing mixed methods research. International online journals (see Forum: Qualitative Social Research at http://qualitative-research.net) and Web sites (e.g., http://www.fiu.edu/bridges/people.htm) provide easy access, resources, and hands-on experiences for interested researchers. Despite this growth and development, a number of controversial issues and debates have limited the widespread acceptance of mixed methods research.

Two important and persistent issues, the paradigm–method fit issue and the “best” paradigm issue, have inspired considerable debate regarding the philosophical basis of mixed methods research. The paradigm–method fit issue relates to the question “Do philosophical paradigms (e.g., postpositivism, constructivism) and research methods have to fit together?” This issue first surfaced in the 1960s and 1970s, primarily as a result of the increasing popularity of qualitative research and the identification of philosophical distinctions between traditional postpositivist and naturalistic research. Guba and Lincoln (1988), for example, identified paradigm differences between postpositivist philosophical assumptions and naturalistic assumptions in terms of epistemology (how we know what we know), ontology (the nature of reality), axiology (the place of values in research), and methodology (the process of research). This led to a dichotomy between traditional inquiry paradigms and naturalistic paradigms.

Some researchers have argued, for example, that a postpositivist philosophical paradigm, or worldview, could be combined only with quantitative methods and that a naturalistic worldview could be combined only with qualitative methods. This issue has been referred to as the “paradigm debate” (Reichardt & Rallis, 1994).

From this perspective, mixed methods research was viewed as untenable (i.e., incommensurable or incompatible) because certain paradigms and methods could not “fit” together legitimately (Smith, 1983). Reichardt and Cook (1979) countered this viewpoint, however, by suggesting that different philosophical paradigms and methods were compatible. In their article, they argued that paradigms and methods are not inherently linked, citing a variety of examples to support their position (e.g., quantitative procedures are not always objective, and qualitative procedures are not always subjective). Indeed, the perspective exists today that multiple methods may be used in a single research study to, for example, take advantage of the representativeness and generalizability of quantitative findings and the in-depth, contextual nature of qualitative findings (Greene & Caracelli, 2003).

The best paradigm issue relates to the question “What philosophical paradigm is the best foundation for mixed methods research?” This issue, like the paradigm–method fit issue, has multiple perspectives (Tashakkori & Teddlie, 2003). One perspective is that mixed methods research uses competing paradigms intentionally, giving each one relatively equal footing and merit. This “dialectical” perspective recognizes that using competing paradigms gives rise to contradictory ideas and contested arguments, features of research that are to be honored and that may not be reconciled (Greene & Caracelli, 1997, 2003). Such oppositions reflect different ways of making knowledge claims, and we advocate for honoring and respecting the different paradigmatic perspectives that researchers bring to bear on a study. In an earlier publication, we identified six different mixed methods research designs and discussed how the underlying theoretical lenses, or paradigms, may differ, depending on the type of design being used (Creswell et al., 2003). This perspective maintains that mixed methods research may be viewed strictly as a “method,” thus allowing researchers to use any number of philosophical foundations for its justification and use. The best paradigm is determined by the researcher and the research problem—not by the method.

Another perspective is that pragmatism is the best paradigm for mixed methods research (Tashakkori & Teddlie, 2003). Pragmatism is a set of ideas articulated by many people, from historical figures such as Dewey, James, and Peirce to contemporaries such as Murphy, Rorty, and West. It draws on many ideas, including using “what works,” using diverse approaches, and valuing both objective and subjective knowledge (Cherryholmes, 1992). Rossman and Wilson (1985) were among the first to associate pragmatism with mixed methods research. They differentiated between methodological purists, situationalists, and pragmatists. The purists believed that quantitative and qualitative methods derived from different, mutually exclusive, epistemological and ontological assumptions about research. The situationalists believed that both methods have value (similar to the dialectical perspective mentioned earlier) but that certain methods are more appropriate under certain circumstances. The pragmatists, in contrast, believed that, regardless of circumstances, both methods may be used in a single study. For many mixed methods researchers, then, pragmatism has become the answer to the question of what is the best paradigm for mixed methods research. Recently, Tashakkori and Teddlie (2003) have attempted to formally link pragmatism and mixed methods research, arguing that, among other things, the research question should be of primary importance—more important than either the method or the theoretical lens, or paradigm, that underlies the method. At least 13 other prominent mixed methods researchers and scholars also believe that pragmatism is the best philosophical basis of mixed methods research (Tashakkori & Teddlie, 2003).

Rationales, Basic Steps in Designing a Mixed Methods Study, and Procedural Notations

Rationales. In the mid-1980s, scholars began expressing concern that researchers were indiscriminately mixing quantitative and qualitative methods and forms of data without acknowledging or articulating defensible reasons for doing so (Greene et al., 1989; Rossman & Wilson, 1985). As a result, different reasons, or rationales, for mixing both forms of data in a single study were identified. Greene et al. (1989), for example, identified a number of rationales for combining data collection methods. These rationales went above and beyond the traditional notion of triangulation. Specifically, quantitative and qualitative methods could be combined to use results from one method to elaborate on results from the other method (complementarity), use results from one method to help develop or inform the other method (development; see Goodyear et al., 2005, and Beck, 2005), recast results from one method to questions or results from the other method (initiation), and extend the breadth or range of inquiry by using different methods for different inquiry components (expansion). Thus, they provided not only rationales for mixing methods and forms of data but also names for them.

Recently, mixed methods researchers have expanded the reasons for conducting a mixed methods investigation (Mertens, 2003; Newman, Ridenour, Newman, & DeMarco, 2003; Punch, 1998).

We agree with Mertens (2003) and Punch (1998), who suggested that mixed methods investigations may be used to (a) better understand a research problem by converging numeric trends from quantitative data and specific details from qualitative data; (b) identify variables/constructs that may be measured subsequently through the use of existing instruments or the development of new ones; (c) obtain statistical, quantitative data and results from a sample of a population and use them to identify individuals who may expand on the results through qualitative data and results; and (d) convey the needs of individuals or groups of individuals who are marginalized or underrepresented.

For a comprehensive, in-depth discussion of rationale issues, the reader is referred to Newman et al. (2003).

Basic steps in designing a mixed methods study. Designing a mixed methods study involves a number of steps, many of which are similar to those taken in traditional research methods. These include deciding on the purpose of the study, the research questions, and the type of data to collect. Designing a mixed methods study, however, also involves at least three additional steps. These include deciding whether to use an explicit theoretical lens, identifying the data collection procedures, and identifying the data analysis and integration procedures (Creswell, 1999; Greene & Caracelli, 1997; Morgan, 1998; Tashakkori & Teddlie, 1998).

These steps occur more or less sequentially, with one informing and influencing the others.

The first step involves deciding whether to use an explicit theoretical lens. As used here, the term theoretical lens refers to the philosophical basis, or paradigm (e.g., postpositivism, constructivism, feminism), that underlies a researcher’s study and subsequent methodological choices (Crotty, 1998). It is an umbrella term that may be distinguished from broader epistemologies (e.g., objectivism, subjectivism), from narrower methodologies (e.g., experimental research), and from, narrower still, methods (e.g., random sampling, interviews). Recognizing that all researchers bring implicit theories and assumptions to their investigations, researchers at this initial stage must decide whether they are going to view their study from a paradigmatic base (e.g., postpositivism, constructivism) that does not necessarily involve a goal of social change or from an advocacy-based lens such as feminism. Our use of the term advocacy is similar to what Ponterotto (2005) refers to as a “critical/emancipatory” paradigm. In any event, the outcome of this decision informs and influences the methodology and the methods used in the study, as well as the use of the study’s findings.

If, for example, a feminist lens is used in a mixed methods study, then the gendered perspective provides a deductive lens that informs the research questions asked at the beginning of the study and the advocacy outcomes advanced at the end (cf. Mertens, 2003). Within the field of counseling psychology, the research question might be “How does a counselor’s level of self-disclosure affect a client’s perception of empowerment?” Answering this question may lead to more empowering, research-informed, counselor–client interactions and to overt attempts to change how counselors are trained and supervised.

The second step involves deciding how data collection will be implemented and prioritized. Implementation refers to the order in which the quantitative and qualitative data are collected, concurrently or sequentially, and priority refers to the weight, or relative emphasis, given to the two types of data, equal or unequal (Creswell et al., 2003; Morgan, 1998). A counseling researcher could, in the example above, collect data sequentially, first collecting quantitative survey data related to clients’ postsession levels of perceived empowerment and then collecting qualitative interview data. The interview data could then be used to corroborate, refute, or augment findings from the survey data. As a result, priority in this hypothetical study would be unequal. Unequal priority occurs when a researcher emphasizes one form of data more than the other, starts with one form as the major component of a study, or collects one form in more detail than the other (Morgan, 1998).

Figure 1 shows many of the options related to this step.

The third step involves deciding the point at which data analysis and integration will occur. In mixed methods studies, data analysis and integration may occur by analyzing the data separately, by transforming them, or by connecting the analyses in some way (Caracelli & Greene, 1993; Onwuegbuzie & Teddlie, 2003; Tashakkori & Teddlie, 1998). A counseling researcher could, for example, analyze the quantitative and qualitative data separately and then compare and contrast the two sets of results in the discussion. As an alternative strategy, themes that emerged from the qualitative interview data could be transformed into counts or ratings and subsequently compared to the quantitative survey data.

Another option would be to connect the data analyses. To do this, the researcher could analyze the survey data, create a categorical variable that helps explain the outcome variance, and conduct follow-up interviews with individuals who were representative of each of the categories. For example, on the basis of results from the survey data, a typology of empowering and disempowering counselor self-disclosures, or levels of self-disclosure, could be developed. The researcher could then interview a subsample of clients (e.g., some who felt empowered and some who felt disempowered). In this way, results from the quantitative analysis would be connected to the qualitative data collection and analysis, primarily by aiding in the identification and selection of individuals to participate in the follow-up interviews.
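To make the transformation strategy more concrete, the following brief Python sketch is offered purely as an illustration; it is not drawn from any of the studies discussed in this article, and the theme labels, client identifiers, and scores are hypothetical. It simply counts coded interview themes ("quantitizing" the qualitative data) and sets those counts beside hypothetical survey scores:

from collections import Counter

# Hypothetical coded interview data: themes assigned to each client's
# transcript during a prior qualitative analysis.
coded_interviews = {
    "client_01": ["felt_empowered", "valued_disclosure"],
    "client_02": ["felt_disempowered"],
    "client_03": ["felt_empowered"],
}

# Hypothetical postsession empowerment survey scores (1-7 scale).
empowerment_scores = {"client_01": 6, "client_02": 2, "client_03": 5}

# Transform the qualitative themes into counts across the sample.
theme_counts = Counter(theme for themes in coded_interviews.values() for theme in themes)

def mean(xs):
    # Simple mean that avoids division by zero for an empty group.
    return sum(xs) / len(xs) if xs else float("nan")

# Compare survey scores for clients who did vs. did not mention a given theme.
mentioned = [c for c, ts in coded_interviews.items() if "felt_empowered" in ts]
not_mentioned = [c for c in empowerment_scores if c not in mentioned]

print("Theme counts:", dict(theme_counts))
print("Mean score, theme mentioned:", mean([empowerment_scores[c] for c in mentioned]))
print("Mean score, theme not mentioned:", mean([empowerment_scores[c] for c in not_mentioned]))

In an actual study, of course, the coding would come from a systematic qualitative analysis and the comparison would use appropriate statistics; the sketch only shows where the transformed qualitative data would sit alongside the quantitative results.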

Procedural notations. Reminiscent of the notation system developed by Campbell and Stanley (1966), which used Xs and Os to represent different experimental procedures, Morse (1991, 2003) developed a system for representing different mixed methods procedures. Instead of Xs and Os, however, her system uses plus (+) symbols and arrows (→) as well as capital and lowercase letters. A plus sign indicates that quantitative and qualitative data are collected concurrently (at the same time), and an arrow indicates that they are collected sequentially (one followed by the other). The use of capital letters indicates higher priority for a particular method. Lowercase letters, in turn, indicate lower priority. By displaying mixed methods procedures graphically, readers may identify, at a glance, the implementation and the priority of the data collection procedures (see Figure 1). For example, QUAN → qual indicates a quantitatively driven sequential study, where quantitative data collection is followed by qualitative data collection with unequal priority, and QUAL + QUAN indicates a qualitatively and quantitatively driven concurrent study, where qualitative and quantitative data collection occur at the same time and are given equal priority.

[Figure 1. Options related to mixed methods data collection procedures. QUAN = quantitative data prioritized; QUAL = qualitative data prioritized; quan = lower priority given to the quantitative data; qual = lower priority given to the qualitative data.]

Types of Mixed Methods Research Designs

Several authors have developed typologies of mixed methods research designs, drawing mostly from approaches used in evaluation (Greene et al., 1989), nursing (Morse, 1991), public health (Steckler, McLeroy, Goodman, Bird, & McCormick, 1992), and education research (Creswell, 2002). Classification systems that use acceptable, standardized names and descriptive categories are still being developed. As one example, Creswell et al. (2003) developed a parsimonious system for classifying mixed methods research designs. As shown in Figure 2, there are six primary types of designs: three sequential (explanatory, exploratory, and transformative) and three concurrent (triangulation, nested, and transformative). Each varies with respect to its use of an explicit theoretical/advocacy lens, approach to implementation (sequential or concurrent data collection procedures), priority given to the quantitative and qualitative data (equal or unequal), stage at which the data are analyzed and integrated (separated, transformed, or connected), and procedural notation. Because mixed methods designs are, generally speaking, complex, it is important to understand subtle differences and nuances between and among them. To facilitate this understanding, we next describe each of the six designs, beginning with sequential designs.

[Figure 2. Typology for classifying mixed methods research designs. QUAN = quantitative data prioritized; QUAL = qualitative data prioritized; quan = lower priority given to the quantitative data; qual = lower priority given to the qualitative data.]
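As a purely illustrative study aid (not part of the original article), the notation and the typical features of the six designs described below can be captured in a short Python sketch; the class and field names are hypothetical, and the feature values simply restate the design-by-design summaries that follow:

from dataclasses import dataclass

@dataclass
class MixedMethodsDesign:
    name: str
    notation: str        # Morse-style procedural notation ("+" concurrent, "->" sequential)
    advocacy_lens: bool  # does the design use an explicit theoretical/advocacy lens?
    integration: str     # stage at which the data are typically integrated

DESIGNS = [
    MixedMethodsDesign("Sequential explanatory", "QUAN -> qual", False, "interpretation/discussion"),
    MixedMethodsDesign("Sequential exploratory", "QUAL -> quan", False, "interpretation/discussion"),
    MixedMethodsDesign("Sequential transformative", "QUAN -> QUAL or QUAL -> QUAN", True, "interpretation/discussion"),
    MixedMethodsDesign("Concurrent triangulation", "QUAN + QUAL", False, "interpretation"),
    MixedMethodsDesign("Concurrent nested", "QUAN + qual or QUAL + quan", False, "analysis (via transformation)"),
    MixedMethodsDesign("Concurrent transformative", "QUAN + QUAL (priority varies)", True, "interpretation or analysis"),
]

for design in DESIGNS:
    lens = "advocacy lens" if design.advocacy_lens else "no advocacy lens"
    print(f"{design.name}: {design.notation} | {lens} | integrated at {design.integration}")

Representing the designs this way is only an aid to reading; the notation strings mirror Morse's system as summarized above, and the feature values follow the descriptions of the sequential and concurrent designs in the next two subsections.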

Sequential designs. There are three types of sequential designs: sequential explanatory, sequential exploratory, and sequential transformative. Sequential explanatory designs do not use an explicit advocacy lens. In these designs, quantitative data are collected and analyzed, followed by qualitative data. Priority is usually unequal and given to the quantitative data. Qualitative data are used primarily to augment quantitative data. Data analysis is usually connected, and integration usually occurs at the data interpretation stage and in the discussion. These designs are particularly useful for, as the name suggests, explaining relationships and/or study findings, especially when they are unexpected.

Sequential exploratory designs also do not use an explicit advocacy lens. In these designs, qualitative data are collected and analyzed first, followed by quantitative data. Priority is usually unequal and given to the qualitative data. Quantitative data are used primarily to augment qualitative data. Data analysis is usually connected, and integration usually occurs at the data interpretation stage and in the discussion. These designs are useful for exploring relationships when study variables are not known, refining and testing an emerging theory, developing new psychological test/assessment instruments based on an initial qualitative analysis, and generalizing qualitative findings to a specific population.

In contrast to the other two sequential designs, sequential transformative designs use an explicit advocacy lens (e.g., feminist perspectives, critical theory), which is usually reflected in the purpose statement, research questions, and implications for action and change. In these designs, quantitative data may be collected and analyzed, followed by qualitative data, or conversely, qualitative data may be collected and analyzed, followed by quantitative data. Thus, either form of data may be collected first, depending on the needs and preferences of the researchers. Priority may be unequal and given to one form of data or the other or, in some cases, equal and given to both forms of data. Data analysis is usually connected, and integration usually occurs at the data interpretation stage and in the discussion. These designs are useful for giving voice to diverse or alternative perspectives, advocating for research participants, and better understanding a phenomenon that may be changing as a result of being studied.

Concurrent designs. Similar to sequential mixed methods research designs, there are three types of concurrent designs: concurrent triangulation, concurrent nested, and concurrent transformative. In concurrent triangulation designs, quantitative and qualitative data are collected and analyzed at the same time.

Priority is usually equal and given to both forms of data. Data analysis is usually separate, and integration usually occurs at the data interpretation stage. Interpretation typically involves discussing the extent to which the data triangulate or converge. These designs are useful for attempting to confirm, cross-validate, and corroborate study findings.

In concurrent nested designs, like concurrent triangulation designs, quantitative and qualitative data are collected and analyzed at the same time. However, priority is usually unequal and given to one of the two forms of data—either to the quantitative or qualitative data. The nested, or embedded, forms of data are, in these designs, usually given less priority. One reason for this is that the less prioritized form of data may be included to help answer an altogether different question or set of questions. Data analysis usually involves transforming the data, and integration usually occurs during the data analysis stage. These designs are useful for gaining a broader perspective on the topic at hand and for studying different groups, or levels, within a single study.

In contrast to the other two concurrent designs, concurrent transformative designs use an explicit advocacy lens (e.g., feminist perspectives, critical theory), which is usually reflected in the purpose statement, research questions, and implications for action and change. Quantitative and qualitative data are collected and analyzed at the same time. Priority may be unequal and given to one form of data or the other or, in some cases, equal and given to both forms of data. Data analysis is usually separate, and integration usually occurs at the data interpretation stage or, if transformed, during data analysis. Similar to sequential transformative designs, these designs are useful for giving voice to diverse or alternative perspectives, advocating for research participants, and better understanding a phenomenon that may be changing as a result of being studied.

Illustration of Mixed Methods Research Designs and Key Design Features

In this section, we use studies published in the counseling literature to illustrate each of the six types of mixed methods research designs. In so doing, conceptual issues, such as implementation, priority, and data analysis and integration, may become more concrete and easier to understand. We also use these studies to highlight potential publication outlets and topics; the extent to which they include an explicit purpose statement, research questions, and rationale for using a mixed methods design; the data collection procedures; and the data analysis procedures. These design features are important ways of characterizing mixed methods studies. They offer insights into the complexities of this type of research and serve as signposts and markers for identifying, understanding, and evaluating the different types of designs.

To identify published mixed methods studies, we searched the PsycINFO computer database three times between August 2001 and May 2002, locating all counseling-related journal articles written in English. We then back-checked reference lists of the articles to identify other studies that may have been missed initially. This search procedure resulted in the identification of 22 studies. These studies were published between 1986 and 2000.

Table 1 lists the design features of each.

Table 1
Design Features of Mixed Methods Studies Published in Counseling

Study | Design | Topic | Purpose or RQs/rationale | Priority/analysis
Aspenson et al. (1993) | Concurrent nested | Training/supervision | Yes/yes | QUAL + quan / connected
Baker & Siryk (1986) | Concurrent nested | Assessment | Yes/no | QUAN + qual / connected
Balmer (1994) | Concurrent triangulation | Group counseling | No/yes | QUAN + QUAL / separate
Balmer et al. (1998) | Concurrent transformative | Group counseling | No/yes | QUAN + QUAL / separate
Balmer et al. (1996) | Concurrent triangulation | Individual counseling | No/yes | QUAN + QUAL / separate
Blustein et al. (1997) | Concurrent nested | Vocational/career | Yes/yes | QUAL + quan / CDT
Chusid & Cochran (1989) | Sequential explanatory | Vocational/career | Yes/yes | (qual →) quan → QUAL / connected
Daughtry & Kunkel (1993) | Sequential exploratory | Individual counseling | Yes/yes | qual → QUAN / connected
Gaston & Marmar (1989) | Concurrent nested | Individual counseling | Yes/yes | QUAN + qual / connected
Good & Heppner (1995) | Concurrent triangulation | Training/diversity | Yes/yes | QUAL + quan / SDT
Guernina (1998) | Concurrent nested | Individual counseling | Yes/yes | QUAN + qual / separate
Hill et al. (2000) | Concurrent triangulation | Individual counseling | Yes/yes | QUAN + QUAL / separate
Luzzo (1995) | Concurrent triangulation | Vocational/career | Yes/yes | QUAN + QUAL / separate
Martin et al. (1987) | Concurrent triangulation | Training/supervision | Yes/yes | QUAN + qual / SDT
Meier (1999) | Concurrent triangulation | Assessment/training | Yes/no | QUAN + QUAL / separate
Orndoff & Herr (1996) | Sequential explanatory | Vocational/career | Yes/yes | QUAN → QUAL / connected
Palmer & Cochran (1988) | Sequential explanatory | Vocational/career | Yes/no | QUAN → QUAL / separate
Paulson et al. (1999) | Sequential exploratory | Counseling process | Yes/yes | qual → QUAN / connected
Payne et al. (1991) | Sequential exploratory | Individual counseling | Yes/yes | (quan →) qual → QUAN / CDT
Poasa et al. (2000) | Sequential explanatory | Diversity | Yes/yes | quan → QUAL / separate
Wampold et al. (1995) | Sequential explanatory | Vocational/career | Yes/yes | QUAN → (quan + QUAL) / separate
Williams et al. (1997) | Concurrent nested | Training/supervision | Yes/yes | QUAL + quan / SDT

Note. Purpose or RQs (research questions)/rationale = whether or not the study included an explicit purpose statement, RQ, and/or rationale for using a mixed methods design. Priority/analysis = the weight, or relative emphasis, given to the quantitative and qualitative data / the point at which the data were analyzed and integrated. QUAL = qualitative data prioritized; QUAN = quantitative data prioritized; quan = lower priority given to the quantitative data; qual = lower priority given to the qualitative data; CDT = connected analyses with data transformation; SDT = separate analyses with data transformation.

Five of the six types of mixed methods research designs appeared in the counseling literature during the designated time period. Concurrent triangulation was the most common type of design used (32%, n = 7), followed by concurrent nested designs (27%, n = 6), sequential explanatory designs (23%, n = 5), sequential exploratory designs (14%, n = 3), and concurrent transformative designs (4%, n = 1). No sequential transformative designs were used, and none of the studies used procedural notations to depict their design.

Luzzo (1995) used a concurrent triangulation design to study gender differences in career maturity and perceived barriers to career development. Four hundred one undergraduate students participated in the quantitative part of the study, and 128 participated in the qualitative part. In this study, the author did not use an advocacy lens, stated the study’s purpose and rationale for using a mixed methods design, implemented data collection concurrently (QUAN and QUAL at the same time), prioritized the data equally, and integrated the data after analyzing them (during the interpretation phase). Specifically, quantitative data, in the form of scores on three different measures, and qualitative data, in the form of tape-recorded responses to open-ended questions, were collected to examine career-related gender differences. After analyzing the quantitative and qualitative data separately, the results were triangulated (i.e., integrated), and consistent/overlapping gender differences were identified. Balmer (1994), Balmer, Seeley, and Bachengana (1996), Good and Heppner (1995), Hill et al. (2000), Martin, Goodyear, and Newton (1987), and Meier (1999) are other examples of studies that used concurrent triangulation designs.

Williams, Judge, Hill, and Hoffman (1997) also used a concurrent mixed methods research design. However, they used a concurrent nested design to study “trainees’, clients’, and supervisors’ perceptions of the trainees’ personal reactions and management strategies during counseling sessions” (p. 391). Seven doctoral trainees, 30 volunteer clients, and 7 supervisors participated in the study. In this study, the authors did not use an advocacy lens, stated the study’s purpose and rationale for using a mixed methods design, reported three research questions (2 QUAL and 1 quan, which focused on different issues), implemented data collection concurrently (quan and QUAL at the same time), prioritized the qualitative data, and integrated the data after analyzing/transforming them (during the interpretation phase). Specifically, qualitative data, in the form of written responses to open-ended questions, were collected to examine two different issues: the kinds of personal reactions trainees have during counseling sessions and the strategies that they use to manage their reactions. Quantitative data, in the form of pre- and postchange scores, were nested and collected to examine changes in trainee anxiety, counseling self-efficacy, management of countertransference issues, and general counseling skills. After analyzing the qualitative and quantitative data separately, the results were used to help answer the three research questions. Aspenson et al. (1993), Baker and Siryk (1986), Blustein, Phillips, Jobin-Davis, Finkelberg, and Rourke (1997), Gaston and Marmar (1989), and Guernina (1998) are other examples of studies that used concurrent nested designs.

In contrast to Luzzo (1995) and Williams et al. (1997), Palmer and Cochran (1988) used a sequential mixed methods research design. They used a sequential explanatory design to provide “an empirical test of parent effectiveness in a structured career development program for their children” (p. 71). Forty volunteer families participated in their study. The experimental group completed a self-guided intervention program, which was compared to a control group on parent–child relationship measures and career development outcomes. In this study, the authors used Bronfenbrenner’s theory of human development and Super’s theory of career development as explicit theoretical lenses, stated the study’s purpose, implemented data collection sequentially (QUAN followed by QUAL), prioritized the data equally, and integrated the data after analyzing them (during the interpretation phase and in the discussion). Specifically, quantitative data, in the form of scores on three different measures, were collected and analyzed, followed by qualitative data, in the form of verbal responses to open-ended interviews. After the quantitative data were analyzed, parents were interviewed, either in person or by telephone, to “gain a narrative description of how the program went, with attention to problems and benefits. The questions were open-ended, intended to invite general comments rather than definitive answers” (Palmer & Cochran, 1988, p. 73). The qualitative data were used to augment the quantitative data. The authors noted that the “qualitative data from the interviews tended to support quantitative results” (p. 74). The authors did not report any research questions or specify a rationale for using a mixed methods design. Chusid and Cochran (1989), Orndoff and Herr (1996), Poasa, Mallinckrodt, and Suzuki (2000), and Wampold et al. (1995) are other examples of studies that used sequential explanatory designs.

Paulson, Truscott, and Stuart (1999) also used a sequential mixed methods research design. However, they used a sequential exploratory design to study clients’ perceptions of helpful experiences in counseling. Thirty-six clients and 12 counselors participated in the study. In this study, the authors did not use an advocacy lens, stated the study’s purpose and rationale for using a mixed methods design, reported one research question (combined qual and QUAN), implemented data collection sequentially (qual followed by QUAN), prioritized the quantitative data, and connected the data analysis. Specifically, qualitative data, in the form of transcribed responses to a single, open-ended question (i.e., “What was helpful about counseling?”), were collected and analyzed, followed by quantitative data, in the form of a sorting and rating task. Quantitative data were included to augment the qualitative data and to develop a concept map of clients’ responses to the open-ended question. Daughtry and Kunkel (1993) and Payne, Robbins, and Dougherty (1991) are other examples of studies that used sequential exploratory designs. The methodological approaches described by Goodyear et al. (2005) and Beck (2005) may also be considered examples of sequential exploratory designs.

In the only identified transformative mixed methods research design, Balmer, Gikundi, Nasio, Kihuho, and Plummer (1998) used a concurrent transformative design to “evaluate group counseling, based upon a unified theory, as an intervention strategy for men with an STD infection and to develop a more detailed understanding of sexual behavior that results in STD/HIV acquisition and transmission” (p. 34). Two hundred forty-two men who were Kenyan and infected with an STD and 6 counselors participated in this randomized clinical trial study. In this study, the authors used an explicit advocacy lens, stated the rationale for using a mixed methods design, implemented data collection concurrently (QUAN and QUAL at the same time), prioritized the data equally, and integrated the data after analyzing them (during the interpretation phase). Specifically, in terms of an advocacy (“participatory action research”) lens, “the qualitative assessment process allowed the counseled groups to become collaborators in a joint project and perhaps it increased their commitment” (Balmer et al., 1998, p. 42). Thus, the research participants’ perspectives were elicited and used to help validate the findings. Moreover, the authors reported that the participants changed as a result of their participation. In terms of implementation (data collection), quantitative data, in the form of pre- and postchange scores on five different measures and medical statistics, and qualitative data, in the form of observations, interviews, field notes, and documents, were collected simultaneously. After analyzing the quantitative and qualitative data separately, the results were triangulated (i.e., integrated) and compared to the existing literature in this area. The authors did not state the purpose explicitly or report any research questions. No other examples of concurrent transformative designs were identified in our search of the counseling literature.

No sequential transformative designs were identified either.

Consequently, to illustrate this design, a counseling-related study from the human development literature is described here. In this study, Tolman and Szalacha (1999) used a sequential transformative design to “understand the dimensions of the experience of sexual desire for adolescent girls” (p. 8). Thirty females who were in 11th grade and who attended an urban high school (n = 15) and a suburban high school (n = 15) participated in the study. In this study, the authors used an explicit advocacy lens, stated the rationale for using a mixed methods design, reported three research questions (2 QUAL and 1 quan), implemented data collection sequentially (QUAL followed by quan followed by QUAL), prioritized the qualitative data, and connected the data analysis.

Specifically, in terms of the advocacy lens, it was “explicitly feminist in nature,” using “a feminist organizing principle of listening to and taking women’s voices seriously . . . particularly in data collection and data reduction, as well as in data analysis and interpretation” (p. 11). Thus, a mixed methods design was used to create “an opportunity for girls to put into words and to name their experience in and questions about a realm of their lives that remains unspoken in the larger culture” (p. 13). Data were collected and analyzed in three sequential phases. In the first and third phases, qualitative data, in the form of transcribed narratives of private, one-on-one, semistructured interviews, were collected and analyzed. In the second phase, quantitative data, in the form of coded frequency data, were collected and analyzed. Results from the first analysis were used to inform the second phase of data collection, and similarly, results from the second analysis were used to inform the third phase of data collection. In the end, the results from the three analyses were triangulated and used to help answer the three research questions.

Journals, Purpose Statements, Research Questions, and Rationales

Mixed methods studies have been published in at least seven counseling-related journals: Counselling Psychology Quarterly (CPQ); Counselor Education and Supervision (CES); Journal of Counseling & Development (JCD); Journal of Counseling Psychology (JCP); Professional Psychology: Research and Practice (PPRP); Psychotherapy: Theory, Research, Training, Practice; and The Counseling Psychologist (TCP). The investigations have targeted a range of topics of interest to the field (e.g., individual counseling, vocational/career, training/supervision; see Table 1).

A particularly important design feature of mixed methods studies is the extent to which they include an explicit purpose statement, research questions (RQs), and rationale for using both quantitative and qualitative methods and data in a study (Creswell et al., 2003). As alluded to previously, purpose statements and research questions serve as signposts and markers for identifying, understanding, and evaluating the different types of mixed methods research designs. They also shape the analyses and integration of the results. Having a well-conceived rationale is also important because it indicates to the reader that the quantitative and qualitative methods and data were mixed intentionally and for defensible reasons.

In our sample, purpose statements, RQs, and rationales were included in 19 (86%), 11 (50%), and 19 (86%) studies, respectively. All 19 studies that stated a purpose stated it explicitly. For example, Wampold et al. (1995), in a two-part study of differences in social skills across Holland types (Study 1) and of how people who are task-oriented (e.g., C, R, and I types) construct their social/work environments (Study 2), stated, “The purpose of Study 1 was to test the hypotheses about relative strengths and weaknesses in specified social skills for various types of people” (p. 368) and “Study 2 was a qualitative study designed to examine the density and nature of social interactions produced by chemists in an academic setting” (p. 371). Three studies (14%) did not include purpose statements.

Across the 11 studies that included RQs, the number of RQs ranged from one to five, with a mean of 2.64 RQs (SD = 1.36). Five studies (45%) included both quantitative and qualitative RQs. Three (27%) included only quantitative RQs, one (9%) included only qualitative RQs, and two (18%) included only combinations of quantitative and qualitative RQs.

Across the 19 studies that stated a rationale for mixing methods and quantitative and qualitative data, 16 (84%) stated it explicitly. For example, Gaston and Marmar (1989), in a time-series study of therapeutic change events, mentioned specifically the importance of including both forms of data:

The main thesis of this article is that quantitative and qualitative knowledge are both essential for the understanding of the change process in psychotherapy. Ideally, information from both paradigms should be acquired within single investigations. With the use of a study example, we attempt to illustrate the dual advantages of richer process-outcome findings provided by combining quantitative and qualitative approaches. (p. 169)

Three (16%) of the 19 studies that reported a rationale did not state it explicitly. In these studies, it was implied and had to be inferred from the text. Three studies (14%) did not indicate a rationale.

Data Collection Procedures

Fourteen mixed methods studies implemented data collection procedures concurrently (64%), and 8 implemented them sequentially (36%). Priority was distributed more or less evenly across studies, with 7 prioritizing quantitative data (32%), 6 prioritizing qualitative data (27%), and 9 prioritizing both equally (41%).

Quantitative data consisted primarily of self-report, instrument-based data (n = 20; 91%), followed by rating tasks (n = 5; 23%) and by observation-based (n = 1; 4%) and physiology-based data (n = 1; 4%). Qualitative data consisted primarily of data based on individual or group interviews (n = 17; 77%), followed by observations/field notes (n = 9; 41%) and by data based on existing materials (n = 4; 18%), including official records, personal documents, and archival data.

Data Analysis Procedures

Ten mixed methods studies (45%) analyzed quantitative and qualitative data separately, before all of the data were collected or analyzed. Data analysis was connected in 7 studies (32%), separated and transformed (e.g., qualitative data were transformed into quantitative scores) in 3 studies (14%), and connected and transformed in 2 studies (9%). Quantitative data analysis consisted primarily of descriptive, or exploratory, procedures (n = 20; 91%), followed by inferential, or confirmatory, procedures (n = 19; 86%). Qualitative data analysis consisted primarily of the identification of themes and relationships (n = 17; 77%), using, for example, grounded theory (Strauss & Corbin, 1990) and consensual qualitative research (CQR; Hill, Thompson, & Williams, 1997), followed by thick description (n = 8; 36%; Wolcott, 1994).

Twenty (91%) of the studies integrated the data at the interpretation stage, and 2 (9%) integrated the data at the analysis stage.

In considering the 22 studies cited in this section, a number of general observations may be made. First, mixed methods studies have indeed been published in counseling journals, the majority of which were published in CPQ, JCP, JCD, or TCP during the 1990s. Second, concurrent designs, where quantitative and qualitative data are collected at the same time, were the most common type of design used. Third, researchers who published mixed methods studies tended to include purpose statements, research questions, and rationales for using these designs. None of the studies, however, used procedural notations to depict the design.

Fourth, the priority for data collection was distributed equally between quantitative and qualitative data across the studies. Fifth, data analysis tended to occur separately, and integration of the results (i.e., triangulation) tended to occur at the interpretation stage and in the discussion—approaches to analysis and integration that are consistent with concurrent triangulation designs, the single most popular type of design that was used.

We are well aware that these observations are primarily descriptive in nature. In reviewing the studies, we did not attempt to critique or rate the quality of any of them. As descriptive categories and standardized evaluative criteria continue to evolve, it may become easier to offer more formal strengths- and weaknesses-based observations. We are also aware that, despite our systematic, 9-month-long literature search, it is quite likely that we missed a few studies, especially ones that have been published within the past few years. Despite these limitations, we hope that this section of the article is of heuristic value to readers.

Recommendations

The primary purpose of this article was to introduce mixed methods research to counseling researchers and educators. On the basis of our understanding of mixed methods procedures and designs, as well as the general observations noted above, we offer the following recommendations for designing, implementing, and reporting a mixed methods study.

1. We recommend that researchers attend closely to theoretical/paradigmatic issues. Attention should be paid to the theoretical lens that informs the investigation and to the priority that is assigned to the quantitative and qualitative data. Explicit statement of the researcher’s lens is informative. A postpositivist lens would, for example, be appropriate for a sequential explanatory design that prioritized the quantitative data, whereas a constructivist lens would be appropriate for a sequential exploratory design that prioritized the qualitative data. For transformative designs, an advocacy-based or transformative-emancipatory lens would be required, regardless of whether the quantitative or qualitative data were prioritized.

2. We recommend that researchers also attend closely to design and implementation issues, particularly to how and when data are collected (e.g., concurrently or sequentially). The study’s purpose plays an important role here (Creswell, 1999). If, for example, the purpose is to triangulate or converge the results, then the data may be collected concurrently. However, elaboration of the results would require a sequential design.

3. In mixed methods studies, data analysis and integration may occur at almost any point in time (Creswell et al., 2003). As noted by Onwuegbuzie and Teddlie (2003), “The point at which the data analysis begins and ends depends on the type of data collected, which in turn depends on the sample size, which in turn depends on the research design, which in turn depends on the purpose” (p. 351). We recommend that researchers familiarize themselves with the analysis and integration strategies used in the mixed methods studies cited in this article as well as with those recommended by Caracelli and Greene (1993) and Onwuegbuzie and Teddlie (2003).

4. Because mixed methods studies require a working knowledge and understanding of both quantitative and qualitative methods, and because they involve multiple stages of data collection and analysis that frequently extend over long periods of time, we recommend that researchers work in teams. Working in teams allows researchers with expertise in quantitative methods and analyses, qualitative methods and analyses, and/or both to be involved directly in designing and implementing a mixed methods study.

5. In preparing a mixed methods manuscript, we recommend that researchers use the phrase mixed methods in the titles of their studies. We also recommend that, early on, researchers foreshadow the logic and progression of their studies by stating the study’s purpose and research questions in the introduction. Clear, well-written purpose statements and research questions that specify the quantitative and qualitative aspects of the study help focus the manuscript.

6. We recommend that, in the introduction, researchers explicitly state a rationale for mixing quantitative and qualitative methods and data (e.g., to triangulate the results, to extend the study’s results). It is best to specify the advantages, for the specified research questions, that accrue from using both methods and data.

Examples of good rationales may be found in Gaston and Marmar (1989) and Hill et al. (2000).

7. We recommend that, in the methods, researchers specify the type of mixed methods research design used (e.g., sequential explanatory mixed methods design) and include procedural notations such as those shown in Figures 1 and 2. By doing this, the field will be able to build a common vocabulary and shared understanding of the different types of designs available.

8. Finally, we recommend that counseling researchers and educators continue having candid discussions about the legitimacy and viability of mixed methods research. As one anonymous reviewer noted,

researchers [should] openly discuss their views on the integration of potentially distinct epistemological issues in using mixed designs. This may not always be necessary when the methods are relatively close with respect to assumptions about the nature of knowledge. However, when the methods are quite far apart . . . some exploration of the complexities of merging methodological perspectives would be quite helpful.

We strongly agree. Discussions of this nature may stimulate additional interest and future advancements in this emerging form of inquiry.

Many scholars have begun to describe mixed methods research as a legitimate, stand-alone research design ready to stand beside time-honored designs such as experiments, surveys, grounded theory studies, and ethnographies (Datta, 1994; Tashakkori & Teddlie, 1998, 2003). Despite numerous challenges and obstacles, it has emerged as a viable alternative to purely quantitative or qualitative methods and designs. With studies available in the literature, and in this issue, to serve as models, and with the recommendations included here, counseling researchers and educators may be on the verge of a new generation of thinking about method and methodology.

References

Aiken, L. S., West, S. G., Sechrest, L., & Reno, R. R. (1990). Graduate training in statistics, methodology, and measurement in psychology: A survey of PhD programs in North America. American Psychologist, 45, 721–734.

Aspenson, D. O., Gersh, T. L., Perot, A. R., Galassi, J. P., Schroeder, R., Kerick, S., Bulger, J., & Brooks, L. (1993). Graduate psychology students’ perceptions of the scientist-practitioner model of training. Counselling Psychology Quarterly, 6, 201–215.

Baker, R. W., & Siryk, B. (1986). Exploratory intervention with a scale measuring adjustment to college. Journal of Counseling Psychology, 33, 31–38.

Balmer, D. H. (1994). The efficacy of a scientific and ethnographic research design for evaluating AIDS group counselling. Counselling Psychology Quarterly, 7, 429–440.

Balmer, D. H., Gikundi, E., Nasio, J., Kihuho, F., & Plummer, F. A. (1998). A clinical trial of group counselling for changing high-risk sexual behaviour in men. Counselling Psychology Quarterly, 11, 33–43.

Balmer, D. H., Seeley, J., & Bachengana, C. (1996). The role of counselling in community support for HIV/AIDS in Uganda. Counselling Psychology Quarterly, 9, 177–190.

Bamberger, M. (Ed.). (2000). Integrating quantitative and qualitative research in development projects. Washington, DC: World Bank.

Beck, K. A. (2005). Ethnographic decision tree modeling: A research method for counseling psychologists. Journal of Counseling Psychology, 52, 243–249.

Behrens, J. T., & Smith, M. L. (1996). Data and data analysis. In D. Berliner & B. Calfee (Eds.), The handbook of educational psychology (pp. 945–989). New York: Macmillan.

Blustein, D. L., Phillips, S. D., Jobin-Davis, K., Finkelberg, S. L., & Rourke, A. E. (1997). A theory-building investigation of the school-to-work transition. The Counseling Psychologist, 25, 364–402.

Brewer, J., & Hunter, A. (1989). Multimethod research: A synthesis of styles. Newbury Park, CA: Sage.

Brown, S. D., & Lent, R. W. (2000). Handbook of counseling psychology (3rd ed.). New York: Wiley.

Bryman, A. (1988). Quantity and quality in social research. London: Routledge.

Campbell, D. T., & Fiske, D. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56, 81–105.

Campbell, D. T., & Stanley, J. C. (1966). Experimental and quasi-experimental designs for research. In N. L. Gage (Ed.), Handbook of research on teaching (pp. 1–76). Chicago: Rand McNally.

Caracelli, V. J., & Greene, J. C. (1993). Data analysis strategies for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 15, 195–207.

Cherryholmes, C. C. (1992). Notes on pragmatism and scientific realism. Educational Researcher, 21, 13–17.

Chusid, H., & Cochran, L. (1989). Meaning of career changes from the perspective of family roles and dramas. Journal of Counseling Psychology, 36, 34–41.

Cook, T. D., & Reichardt, C. S. (Eds.). (1979). Qualitative and quantitative methods in evaluation research. Beverly Hills, CA: Sage.

Creswell, J. W. (1999). Mixed method research: Introduction and application. In G. J. Cizek (Ed.), Handbook of educational policy (pp. 455–472). San Diego, CA: Academic Press.

Creswell, J. W. (2002). Educational research: Planning, conducting, and evaluating quantitative and qualitative approaches to research. Upper Saddle River, NJ: Merrill/Pearson Education.

Creswell, J. W. (2003). Research design: Quantitative, qualitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage.

Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 209–240). Thousand Oaks, CA: Sage.

Crotty, M. (1998). The foundations of social research: Meaning and perspective in the research process. London: Sage.

Datta, L. (1994). Paradigm wars: A basis for peaceful coexistence and beyond. In C. S. Reichardt & S. F. Rallis (Eds.), The qualitative-quantitative debate: New perspectives (pp. 53–70). San Francisco: Jossey-Bass.

Daughtry, D., & Kunkel, M. A. (1993). Experience of depression in college students: A concept map. Journal of Counseling Psychology, 40, 316–323.

Gaston, L., & Marmar, C. R. (1989). Quantitative and qualitative analyses for psychotherapy research: Integration through time-series design. Psychotherapy, 26, 169–176.

Gelso, C. J. (1979). Research in counseling: Methodological and professional issues. The Counseling Psychologist, 8, 7–36.

Gergen, K. J. (2001). Psychological science in a postmodern context. American Psychologist, 56, 803–813.

Goldman, L. (1976). A revolution in counseling psychology. Journal of Counseling Psychology, 23, 543–552.

Good, G. E., & Heppner, M. J. (1995). Students’ perceptions of a gender issues course: A qualitative and quantitative examination. Counselor Education and Supervision, 34, 308–320.

Goodyear, R. K., Tracey, T. J. G., Claiborn, C. D., Lichtenberg, J. W., & Wampold, B. E. (2005). Ideographic concept mapping in counseling psychology research: Conceptual overview, methodology, and an illustration. Journal of Counseling Psychology, 52, 236–242.

Greene, J. C., & Caracelli, V. J. (Eds.). (1997). Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms (New Directions for Evaluation, No. 74). San Francisco: Jossey-Bass.

Greene, J. C., & Caracelli, V. J. (2003). Making paradigmatic sense of mixed methods practice. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 91–110). Thousand Oaks, CA: Sage.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11, 255–274.

Guba, E. G., & Lincoln, Y. S. (1988). Do inquiry paradigms imply inquiry methodologies? In D. M. Fetterman (Ed.), Qualitative approaches to evaluation in education (pp. 89–115). New York: Praeger Publishers.

Guernina, Z. (1998). Adolescents with eating disorders: A pilot study. Counselling Psychology Quarterly, 11, 117–124.

Heppner, P. P., Kivlighan, D. M., Jr., & Wampold, B. E. (1999). Research design in counseling (2nd ed.). Belmont, CA: Wadsworth.

Hill, C. E., Thompson, B. J., & Williams, E. N. (1997). A guide to conducting consensual qualitative research. The Counseling Psychologist, 25, 517–572.

Hill, C. E., Zack, J. S., Wonnell, T. L., Hoffman, M. A., Rochlen, A. B., Goldberg, J. L., et al. (2000). Structured brief therapy with a focus on dreams or loss for clients with troubling dreams and recent loss. Journal of Counseling Psychology, 47, 90–101.

Hoshmand, L. L. S. T. (1989). Alternate research paradigms: A review and teaching proposal. The Counseling Psychologist, 17, 3–79.

Howard, G. S. (1983). Toward methodological pluralism. Journal of Counseling Psychology, 30, 19–21.

Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in action. Administrative Science Quarterly, 24, 602–611.

Luzzo, D. A. (1995). Gender differences in college students’ career maturity and perceived barriers in career development. Journal of Counseling & Development, 73, 319–322.

Maione, P. V., & Chenail, R. J. (1999). Qualitative inquiry in psychotherapy: Research on the common factors. In M. A. Hubble, B. L. Duncan, & S. D. Miller (Eds.), The heart and soul of change: What works in therapy (pp. 57–88). Washington, DC: American Psychological Association.

Martin, J. S., Goodyear, R. K., & Newton, F. B. (1987). Clinical supervision: An intensive case study. Professional Psychology: Research and Practice, 18, 225–235.

Meier, S. T. (1999). Training the practitioner-scientist: Bridging case conceptualization, assessment, and intervention. The Counseling Psychologist, 27, 846–869.

Mertens, D. M. (2003). Mixed methods and the politics of human research: The transformative-emancipatory perspective. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 135–164). Thousand Oaks, CA: Sage.

Morgan, D. L. (1998). Practical strategies for combining qualitative and quantitative methods: Applications to health research. Qualitative Health Research, 8, 362–376.

Morrow, S. L., & Smith, M. L. (2000). Qualitative research for counseling psychology. In S. D. Brown & R. W. Lent (Eds.), Handbook of counseling psychology (3rd ed., pp. 199–230). New York: Wiley.

Morse, J. M. (1991). Approaches to qualitative-quantitative methodological triangulation. Nursing Research, 40, 120–123.

Morse, J. M. (2003). Principles of mixed methods and multimethod research design. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 189–208). Thousand Oaks, CA: Sage.

Newman, I., & Benz, C. R. (1998). Qualitative-quantitative research methodology: Exploring the interactive continuum. Carbondale: Southern Illinois University Press.

Newman, I., Ridenour, C. S., Newman, C., & DeMarco, G. M. P., Jr. (2003). A typology of research purposes and its relationship to mixed methods. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 167–188). Thousand Oaks, CA: Sage.

Onwuegbuzie, A. J., & Teddlie, C. (2003). A framework for analyzing data in mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 351–383). Thousand Oaks, CA: Sage.

Orndoff, R. M., & Herr, E. L. (1996). A comparative study of declared and undeclared college students on career uncertainty and involvement in career development activities. Journal of Counseling & Development, 74, 632–640.

Palmer, S., & Cochran, L. (1988). Parents as agents of career development. Journal of Counseling Psychology, 35, 71–76.

Paulson, B. L., Truscott, D., & Stuart, J. (1999). Clients’ perceptions of helpful experiences in counseling. Journal of Counseling Psychology, 46, 317–324.

Payne, E. C., Robbins, S. B., & Dougherty, L. (1991). Goal directedness and older-adult adjustment. Journal of Counseling Psychology, 38, 302–308.

Poasa, K. H., Mallinckrodt, B., & Suzuki, L. A. (2000). Causal attributions for problematic family interactions: A qualitative, cultural comparison of Western Samoa, American Samoa, and the United States. The Counseling Psychologist, 28, 32–60.

Ponterotto, J. G. (2005). Qualitative research in counseling psychology: A primer on research paradigms and philosophy of science. Journal of Counseling Psychology, 52, 126–136.

Ponterotto, J. G., & Grieger, I. (1999). Merging qualitative and quantitative perspectives in a research identity. In M. Kopala & L. Suzuki (Eds.), Using qualitative methods in psychology (pp. 49–62). Thousand Oaks, CA: Sage.

Punch, K. F. (1998). Introduction to social research: Quantitative and qualitative approaches. Thousand Oaks, CA: Sage.

Reichardt, C. S., & Cook, T. D. (1979). Beyond qualitative versus quantitative methods. In T. D. Cook & C. S. Reichardt (Eds.), Qualitative and quantitative methods in evaluation research (pp. 7–32). Beverly Hills, CA: Sage.

Reichardt, C. S., & Rallis, S. F. (Eds.). (1994). The qualitative-quantitative debate: New perspectives. San Francisco: Jossey-Bass.

Rossman, G. B., & Wilson, B. L. (1985). Numbers and words: Combining quantitative and qualitative methods in a single large-scale evaluation study. Evaluation Review, 9, 627–643.

Sieber, S. D. (1973). The integration of field work and survey methods. American Journal of Sociology, 78, 1335–1359.

Smith, J. K. (1983). Quantitative versus qualitative research: An attempt to clarify the issue. Educational Researcher, 12, 6–13.

Steckler, A., McLeroy, K. R., Goodman, R. M., Bird, S. T., & McCormick, L. (1992). Toward integrating qualitative and quantitative methods: An introduction. Health Education Quarterly, 19, 1–8.

Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage.

Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.

Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.

Tolman, D. L., & Szalacha, L. A. (1999). Dimensions of desire: Bridging qualitative and quantitative methods in a study of female adolescent sexuality. Psychology of Women Quarterly, 23, 7–39.

Wampold, B. E., Ankarlo, G., Mondin, G., Trinidad-Carrillo, M., Baumler, B., & Prater, K. (1995). Social skills of and social environments produced by different Holland types: A social perspective on person–environment fit model. Journal of Counseling Psychology, 42, 365–379.

Waszak, C., & Sines, M. C. (2003). Mixed methods in psychological research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 557–576). Thousand Oaks, CA: Sage.

Williams, E. N., Judge, A. B., Hill, C. E., & Hoffman, M. A. (1997). Experiences of novice therapists in prepracticum: Trainees’, clients’, and supervisors’ perceptions of therapists’ personal reactions and management strategies. Journal of Counseling Psychology, 44, 390–399.

Wolcott, H. F. (1994). Transforming qualitative data: Description, analysis, and interpretation. Thousand Oaks, CA: Sage.

Received October 27, 2004
Revision received December 6, 2004
Accepted December 10, 2004