
Journal of Counseling & Development ■ April 2013 ■ Volume 91 ■ 184
© 2013 by the American Counseling Association. All rights reserved.
Received 07/31/11; Revised 04/13/12; Accepted 09/04/12. DOI: 10.1002/j.1556-6676.2013.00085.x

For practitioners in the field of counseling, the combining or mixing of qualitative and quantitative methodologies is not a new or unique phenomenon. In fact, as surmised by Powell, Mihalas, Onwuegbuzie, Suldo, and Daley (2008), "by definition, assessment, whether for purposes of program planning or treatment, necessitates the consideration of multiple sources of data" (p. 293). In addition, as stated in Section E (i.e., Evaluation, Assessment, and Interpretation) of the ACA Code of Ethics (American Counseling Association, 2005), counselors use both quantitative and qualitative assessments in practice. Counselor researchers and counselors-as-practitioners routinely collect and analyze qualitative and quantitative data as a necessary part of their profession. Therefore, the purpose of this article is threefold: (a) to demonstrate that, regardless of philosophical stance, collecting quantitative data via psychometrically sound quantitative instruments during the qualitative interview process enhances interpretations by helping researchers better contextualize qualitative findings; (b) to explain the concept of the mixed methods interview; and (c) to provide an example demonstrating this strategy, whereby a baseline was established using a quantitative scale and normative data as a mixed research approach.

On the basis of definitions provided by 19 leading methodologists of mixed methods research, or, more aptly, mixed research, a term denoting the fact that more than methods typically are mixed (e.g., philosophical assumptions and stances, research questions), Johnson, Onwuegbuzie, and Turner (2007) defined mixed research as an intellectual and practical synthesis based on qualitative and quantitative research; it is the third methodological or research paradigm (along with qualitative and quantitative research).

Rebecca K. Frels, Department of Counseling and Special Populations, Lamar University; Anthony J. Onwuegbuzie, Department of Educational Leadership and Counseling, Sam Houston State University. The authors would like to acknowledge and thank John Harris, Applied Research Consulting, and Michael Nakkula, University of Pennsylvania, for the use of the Match Characteristics Questionnaire. The authors also thank Michael Karcher, University of Texas San Antonio, for his guidance in selecting the quantitative measure and locating other resources on school-based mentoring. Correspondence concerning this article should be addressed to Rebecca K. Frels, Department of Counseling and Special Populations, Lamar University, 223 Education Building, Beaumont, TX 77710 (e-mail: [email protected]).

Administering Quantitative Instruments With Qualitative Interviews: A Mixed Research Approach

Rebecca K. Frels and Anthony J. Onwuegbuzie

The authors demonstrate how collecting quantitative data via psychometrically sound quantitative instruments during the qualitative interview process enhances interpretations by helping researchers better contextualize qualitative findings, specifically through qualitative dominant crossover mixed analyses. They provide an example of this strategy, whereby a baseline was established using a quantitative scale and normative data to help interpret qualitative interviews, resulting in what they call a mixed methods interview. Philosophical and practical implications are discussed.

Keywords: qualitative interviews, qualitative dominant mixed analysis, crossover mixed analysis, mixed methods research, mixed research

It recognizes the importance of traditional quantitative and qualitative research but also offers a powerful third paradigm choice that often will provide the most informative, complete, balanced, and useful research results. Mixed methods research is the research paradigm that (a) partners with the philosophy of pragmatism in one of its forms (left, right, middle); (b) follows the logic of mixed methods research (including the logic of the fundamental principle and any other useful logics imported from qualitative or quantitative research that are helpful for producing defensible and usable research findings); (c) relies on qualitative and quantitative viewpoints, data collection, analysis, and inference techniques combined according to the logic of mixed methods research to address one's research question(s); and (d) is cognizant, appreciative, and inclusive of local and broader sociopolitical realities, resources, and needs. (p. 129)

If we take into account the integrative nature of counseling, it is surprising that relatively few counseling researchers combine or mix qualitative and quantitative data in their studies.

Ray et al. (2011), who recently reviewed 4,457 articles from 1998 to 2007 in 15 ACA division-affiliated journals, identified only 171 mixed research articles, which represented only 3.84% of the total number of articles published in these journals. In fact, this finding is consistent with other researchers' studies examining counseling journals that documented the lack of mixed research articles for either empirical research articles or nonempirical research articles (e.g., theoretical/conceptual articles; Hanson, Creswell, Plano Clark, Petska, & Creswell, 2005; Leech & Onwuegbuzie, 2011). Similar to other fields and disciplines, the low prevalence rates of mixed research articles published in counseling journals have occurred despite the exponential increase in the number of methodologically based mixed research articles that have been published in the literature (Ivankova & Kawamura, 2010), including two handbooks (i.e., Tashakkori & Teddlie, 2003, 2010) and numerous books (i.e., Andrew & Halcomb, 2009; Bergman, 2008; Collins, Onwuegbuzie, & Jiao, 2010; Creswell & Plano Clark, 2010; Greene, 2007; Hesse-Biber, 2010; Morse & Niehaus, 2009; Newman & Ridenour, 2008; Onwuegbuzie, Jiao, & Bostick, 2004; Plowright, 2011; Teddlie & Tashakkori, 2009) on mixed research, as well as guidelines written directly for counseling researchers that were published in ACA's flagship journal (i.e., Leech & Onwuegbuzie, 2010). These and other methodological works have demonstrated the utility of conducting mixed research. Moreover, although quantitative research is particularly useful for "answering questions of who, where, how many, how much, and what is the relationship between specific variables" (Adler, 1996, p. 5), it is not optimal for answering why and how questions.

The converse is true for qualitative research. In contrast, mixed research can address both sets of questions within a single research study. Alternatively stated, mixed research has been shown to be useful for addressing simultaneously both quantitative-based questions that deal with prevalence rates (i.e., descriptive research), relationships (i.e., correlational research, causal-comparative, quasiexperimental research), and cause-and-effect relationships (i.e., experimental research) and qualitative-based questions that lead to the examination of local processes, experiences, and perceptions of individuals such as counselees (e.g., biography, autobiography, life history, oral history, autoethnography, case study) and groups such as cultural groups (e.g., phenomenology, ethnography, grounded theory). Greene, Caracelli, and Graham (1989) identified five different research purposes for mixing quantitative and qualitative data: (a) triangulation (the intent is to seek convergence in data); (b) complementarity (the intent is to measure overlapping but different facets of a phenomenon); (c) development (the intent is to help develop or inform the other method); (d) initiation (the intent is to discover paradox and contradiction, new perspectives of frameworks, and the recasting of questions or results); and (e) expansion (the intent is to extend the breadth and range of inquiry). In addition, mixed research can be used to address a broader range of research questions than can monomethod studies (i.e., quantitative research alone or qualitative research alone).
For example, as identified by Plano Clark and Badice (2010), mixed research can be used to address the following types of research questions: separate research questions (i.e., one or more quantitative research questions coupled with one or more qualitative research questions); general overarching mixed research questions (i.e., broad questions that are addressed using both quantitative and qualitative approaches); hybrid mixed issue research questions (i.e., one question with two distinct parts such that a quantitative approach is used to address one part and a qualitative approach is used to address the other part); mixed procedural/mixing research questions (i.e., narrow questions that direct the integration of the qualitative and quantitative strands of the study); combination research questions (i.e., at least one mixed research question combined with separate quantitative and qualitative questions); independent research questions (i.e., two or more research questions that are related, with each question not depending on the results of the other question[s]); dependent research questions (i.e., questions that depend on the results stemming from addressing another question); predetermined research questions (i.e., questions based on literature, practice, personal tendencies, and/or disciplinary considerations that are posed at the beginning of the study); and emergent research questions (i.e., new or modified research questions that arise during the design, data collection, data analysis, or interpretation phase). It is probable that many researchers might not conduct mixed research because of a lack of training or, as noted by Frels, Onwuegbuzie, Leech, and Collins (2012) and Onwuegbuzie, Frels, Leech, and Collins (2011), a lack of pedagogical information in published form.
Other reasons for the dearth of mixed research studies published in counseling journals might be philosophical (i.e., researchers’ beliefs about the nature of knowledge, objectivity–subjectivity dualism), axiological (i.e., researchers’ beliefs about the role of values and ethics), or ontological (i.e., researchers’ beliefs about the nature of reality).

In particular, at least some researchers mistakenly believe that the philosophical assumptions and stances of their quantitative-based research (e.g., postpositivism) or their qualitative-based research (e.g., constructivism, critical theory) prevent them from mixing quantitative and qualitative approaches. However, as demonstrated by Onwuegbuzie, Johnson, and Collins (2009), the ontological, epistemological, and methodological assumptions and stances representing the major research paradigms do not prevent researchers from collecting and analyzing both quantitative and qualitative data, at least to some degree. Table 1, which was created using Onwuegbuzie, Johnson, and Collins's (2009) comparison of paradigms, displays major characteristics associated with three qualitative-based paradigms (i.e., constructivism, critical theory, and participatory), one quantitative-based paradigm (i.e., postpositivism), and one mixed research-based paradigm (i.e., pragmatism-of-the-middle) with respect to three axiomatic components (i.e., ontological, epistemological, and methodological foundations). It can be seen from the table that the philosophical assumptions and stances underlying postpositivism allow postpositivist researchers to utilize some qualitative analysis techniques, especially those that yield frequency data such as word count (i.e., counting the number of times particular words are used) and classical content analysis (i.e., counting the codes).
At the other end of the research paradigmatic continuum, philosophical assumptions and stances associated with qualitative inquiry, specifically constructivism, allow constructivist researchers (e.g., radical constructivists, cognitive constructivists, social constructivists/constructionists) to use, at the very least, descriptive statistics (i.e., measures of central tendency [e.g., mean, median, mode, proportion]; measures of variation/dispersion [e.g., variance, standard deviation]; measures of position [e.g., percentile rank, z score]; and measures of distributional shape [e.g., skewness, kurtosis]). As also seen in Table 1, philosophical assumptions and stances underlying both critical theory and participatory paradigms allow critical theorist researchers and participatory researchers, respectively, to use descriptive statistics and many forms of inferential statistics.
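To make the paragraph above concrete, here is a minimal sketch (ours, not the authors') of the two frequency-yielding qualitative analyses named for postpositivists, word count and classical content analysis, followed by the descriptive statistics available even to constructivist researchers. The interview excerpts and codes are invented for illustration.

```python
from collections import Counter
import statistics

# Hypothetical interview excerpts (illustrative data, not from the study)
responses = [
    "mentoring builds trust and trust takes time",
    "my mentee needs trust and encouragement",
    "time together builds the relationship",
]

# Word count: how often each word appears across all responses
words = " ".join(responses).split()
word_counts = Counter(words)

# Classical content analysis: count occurrences of researcher-assigned codes
codes_per_response = [["trust", "time"], ["trust", "support"], ["time", "relationship"]]
code_counts = Counter(code for codes in codes_per_response for code in codes)

# Descriptive statistics on the code frequencies: measures of central
# tendency and of variation/dispersion, as permitted under constructivism
freqs = list(code_counts.values())
mean_freq = statistics.mean(freqs)
median_freq = statistics.median(freqs)
sd_freq = statistics.pstdev(freqs)
```

The point of the sketch is only that these analyses produce numbers from qualitative material, which is what places them on the quantitative side of the crossover boundary.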

In the mixed research tradition, the philosophical assumptions and stances that underlie mixed research represent the commensurability of paradigms. Johnson and Gray (2010) explained mixed methods thinking as considering how conflicting positions illuminate new learning. Onwuegbuzie, Johnson, and Collins (2009) described the coming together of philosophical assumptions through 11 philosophical stances. These stances fall within various points on a continuum.

TABLE 1
Contemporary Research Paradigms and Characteristics

Postpositivism
  Ontology: Social science inquiry should be objective.
  Epistemology: Researchers are neutral, emotionally detached, and should eliminate biases; empirically justify stated hypotheses.
  Methodology: Generalizations are time- and context-free; real causes of social scientific outcomes can be determined reliably and validly via quantitative (and sometimes qualitative) methods.
  Rhetoric: Use of formal, impersonal, passive voice and technical terminology; focuses on social laws.
  Qualitative analysis: Some qualitative analyses that generate numbers as part of the findings (e.g., word count, classical content analysis).
  Quantitative analysis: All forms of descriptive and inferential statistics for making external statistical generalizations.(a)

Pragmatism-of-the-Middle
  Ontology: Traditional dualisms are rejected; high regard for the influence of the inner world of human experiences in action.
  Epistemology: Knowledge is based on the reality of the world and constructed through experience; justification comes via warranted assertability.
  Methodology: Thoughtful/dialectical eclecticism and pluralism of methods and perspectives; determines what works and solves individual and social problems.
  Rhetoric: Use of both impersonal passive voice/technical terminology and empathetic, rich and thick descriptions.
  Qualitative analysis: All forms of qualitative analyses.
  Quantitative analysis: All forms of descriptive and inferential statistics.

Critical Theory
  Ontology: Social, political, cultural, ethnic, racial, economic, and gender values that evolve over time affect reality.
  Epistemology: Transactional/subjectivist; value-mediated findings.
  Methodology: Use of a dialogue or dialectical approach.
  Rhetoric: Use of critical discourse.
  Qualitative analysis: All forms of qualitative analyses.
  Quantitative analysis: Descriptive statistics; most forms of inferential statistics for internal statistical generalizations(b) and external statistical generalizations.

Constructivism
  Ontology: Multiple contradictory, but equally valid, accounts of the same phenomenon represent multiple realities.
  Epistemology: Co-created findings/meaning; knowledge is subjective and not separable from the knower.
  Methodology: Dialectical and impossible to differentiate fully causes and effects; uses inductive reasoning; time- and context-free generalizations are neither desirable nor possible.
  Rhetoric: Use of empathetic descriptions that are informal, detailed, rich, and thick.
  Qualitative analysis: All forms of qualitative analyses.
  Quantitative analysis: Descriptive statistics; some inferential statistics for internal statistical generalization but not external statistical generalization.

Participatory
  Ontology: The mind and given world order are co-created through subjective–objective reality.
  Epistemology: Experiential and practical for practice; co-created findings.
  Methodology: Political participation for collaborative action research; emphasizes the practical.
  Rhetoric: Use of language based on shared experiential context.
  Qualitative analysis: All forms of qualitative analyses.
  Quantitative analysis: Descriptive statistics; inferential statistics for both internal statistical generalizations and external statistical generalizations.

Note. This table was created based on definitions found in The Sage Dictionary of Qualitative Inquiry (3rd ed.), by T. A. Schwandt, 2007, Thousand Oaks, CA: Sage; and information from "Toward a Philosophy of Mixed Data Analysis," by A. J. Onwuegbuzie, R. B. Johnson, and K. M. T. Collins, 2009, International Journal of Multiple Research Approaches, 3, pp. 122–123.
(a) External statistical generalizations involve making generalizations, judgments, inferences, or predictions on data stemming from a representative statistical (i.e., optimally random and large) sample of the population from which the sample was drawn (Onwuegbuzie, Slate, Leech, & Collins, 2009).
(b) Internal statistical generalizations involve making generalizations, judgments, inferences, or predictions on data obtained from one or more representative or elite participants, such as key informants, subsample members, or politically important cases, of the sample from which the participant(s) was selected (Onwuegbuzie, Slate, et al., 2009).
As seen through the pragmatism-of-the-middle paradigm in Table 1, the major philosophical assumptions and stances do not prevent researchers from using one or more analysis types associated with one tradition (e.g., quantitative analysis) to analyze data associated with a different tradition (e.g., qualitative data), a concept Onwuegbuzie and Combs (2010) called crossover mixed analyses.

Crossover Mixed Analyses

As a mixed research approach, crossover mixed analyses can be used for the following:

• Reduce: to condense the dimensionality of qualitative data/findings using quantitative analysis (e.g., exploratory factor analysis of qualitative data) and/or quantitative data/findings using qualitative techniques (e.g., thematic analysis of quantitative data; Onwuegbuzie, 2003; Onwuegbuzie & Teddlie, 2003)
• Display: to present visually qualitative and quantitative results within the same display (Onwuegbuzie & Dickinson, 2008)
• Transform: to convert quantitative data to be analyzed qualitatively (i.e., qualitizing data) and/or qualitative data into numerical codes that can be analyzed statistically (i.e., quantitizing data; Tashakkori & Teddlie, 1998)
• Correlate: to associate qualitative data with quantitized data and/or quantitative data with qualitized data (Onwuegbuzie & Teddlie, 2003)
• Consolidate: to merge multiple data sets to create new or consolidated codes, variables, or data sets (Onwuegbuzie & Teddlie, 2003)
• Compare: to examine side-by-side qualitative and quantitative data/findings (Onwuegbuzie & Teddlie, 2003)
• Integrate: to incorporate qualitative and quantitative data/findings either into a coherent whole or into two separate sets (i.e., qualitative and quantitative) of coherent wholes (Onwuegbuzie & Teddlie, 2003)
• Assert: to review all qualitative and quantitative data to yield meta-inferences (Smith, 1997)
• Import data: to use follow-up findings from qualitative analysis to inform the quantitative analysis (e.g., qualitative contrasting case analysis, qualitative residual analysis, qualitative follow-up interaction analysis, and qualitative internal replication analysis) or follow-up findings from quantitative analysis to inform the qualitative analysis (e.g., quantitative extreme case analysis, quantitative negative case analysis; Onwuegbuzie & Teddlie, 2003)

Therefore, quantitative researchers and qualitative researchers can use mixed research techniques without contradicting their underlying research philosophical belief systems by conducting what they refer to as quantitative dominant crossover mixed analysis and qualitative dominant crossover mixed analysis, respectively.
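Of the operations in the list above, Transform is the easiest to show in miniature. The following sketch is ours, with fabricated themes, scores, and cut points (none come from the article): quantitizing turns the presence of a coded theme into 0/1 indicators per participant, while qualitizing bins numeric scale scores into narrative categories.

```python
# Quantitizing: convert qualitative theme presence into 0/1 codes per participant
participants = ["P1", "P2", "P3"]
themes = {"trust": {"P1", "P3"}, "growth": {"P2", "P3"}}  # hypothetical coding
quantitized = {
    p: {t: int(p in members) for t, members in themes.items()} for p in participants
}

# Qualitizing: convert quantitative scale scores into narrative categories
def qualitize(score, low=2.5, high=4.0):
    """Bin a 1-5 Likert mean into a verbal profile (cut points are illustrative)."""
    if score < low:
        return "weak match"
    return "strong match" if score >= high else "moderate match"

scores = {"P1": 4.2, "P2": 3.1, "P3": 2.0}  # hypothetical subscale means
profiles = {p: qualitize(s) for p, s in scores.items()}
```

The quantitized matrix can then feed statistical analyses (Correlate, Reduce), while the qualitized profiles can be woven back into the narrative interpretation.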

Quantitative Dominant Crossover Mixed Analysis

According to Onwuegbuzie, Leech, and Collins (2011), who expanded the concept, quantitative dominant crossover mixed analysis is used when the researcher seeks to answer research questions through a postpositivist (quantitative) stance while also believing that qualitative data and analysis help address the research question(s) to a greater extent. This occurs at various levels. At one end of the spectrum (the highest level), integration involves combining one or more sets of inferential analyses with other types of qualitative analyses "for the purpose of integrated data reduction, integrated data display, data transformation, data correlation, data consolidation, data comparison, data integration, warranted assertion analysis, and/or data importation" (p. 377). At the lowest end of the spectrum, quantitative dominant crossover mixed analysis is the combination of one or more sets of inferential analyses with qualitative analyses that generate some frequency data (e.g., word count) because these data are closer to statistical data than are the data that would be generated by other qualitative analyses (e.g., constant comparison analysis, discourse analysis). Regardless of the level of integration, the quantitative strand would represent the dominant strand, with the qualitative strand being incorporated to address one or more of Greene et al.'s (1989) five purposes for mixing (i.e., triangulation, complementarity, development, initiation, and expansion), using one or more of the nine crossover analysis types.

Qualitative Dominant Crossover Mixed Analysis

According to Onwuegbuzie, Leech, and Collins (2011), qualitative dominant mixed analysis involves a philosophical stance whereby the researcher assumes a (qualitative) constructivist, critical theorist, or any stance that is associated with the qualitative research paradigm and also believes that the addition of quantitative data and analysis would address in more detail the research question(s). Building on this idea, Ross and Onwuegbuzie (2011) categorized the array of established quantitative analysis techniques into the following eight levels of complexity: Level 1, descriptive analyses (e.g., measures of central tendency, dispersion, position); Level 2, univariate analyses (e.g., independent samples t test, dependent samples t test, one-way analysis of variance); Level 3, multivariate analyses (e.g., multivariate analysis of variance, multivariate analysis of covariance, discriminant analysis, canonical correlation analysis); Level 4, analyses of group membership (e.g., exploratory factor analysis, cluster analysis, correspondence analysis, multidimensional scaling); Level 5, measurement techniques (e.g., confirmatory factor analysis, item response theory); Level 6, analyses of time and/or space (e.g., autoregressive models, integrated models, moving average models, geocoding, geostatistics, cartography); Level 7, multidirectional or multilevel analyses (e.g., structural equation modeling, hierarchical linear modeling); and Level 8, multidirectional and multilevel analyses (e.g., multilevel structural equation modeling, multilevel item response theory, multivariate hierarchical linear modeling). Thus, at the lowest level of integration, the qualitative dominant crossover mixed analysis would involve combining one or more sets of qualitative analyses with descriptive statistics (i.e., Level 1 quantitative analysis).
At a higher level of integration, the qualitative dominant crossover mixed analysis would involve combining one or more sets of qualitative analyses with exploratory analysis techniques (i.e., Level 4 quantitative analysis), such as by subjecting the emergent themes to an exploratory factor analysis (i.e., integrated data reduction; see Onwuegbuzie, 2003; Onwuegbuzie & Teddlie, 2003).
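A compressed illustration of that integrated data reduction step may help. The sketch below is ours, with invented binary theme codes (a real study would use dedicated exploratory factor analysis software): emergent themes are first quantitized into a participant-by-theme matrix, and the eigenvalues of the inter-theme correlation matrix then suggest how many underlying meta-themes to retain.

```python
import numpy as np

# Rows = participants, columns = emergent themes (1 = theme present in interview).
# These binary codes are invented for illustration only.
theme_matrix = np.array([
    [1, 1, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [1, 0, 0, 0],
])
theme_names = ["trust", "time", "structure", "growth"]

# Inter-theme correlation matrix: the raw material of an exploratory factor analysis
R = np.corrcoef(theme_matrix, rowvar=False)

# Principal-axis shortcut: eigenvalues of R indicate how many factors
# (meta-themes) to retain, e.g., via the Kaiser criterion (eigenvalue > 1)
eigenvalues = np.linalg.eigvalsh(R)[::-1]  # sorted largest first
n_factors = int(np.sum(eigenvalues > 1.0))
```

With realistic sample sizes, the retained factors would be rotated and interpreted as meta-themes, closing the loop back to the qualitative strand.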

At the highest level of integration, the qualitative dominant crossover mixed analysis would involve combining one or more sets of qualitative analyses with inferential statistics (i.e., Levels 2–3, 5–8). Whatever the level of integration, the qualitative strand would represent the dominant strand, with the quantitative strand being used in an attempt to fulfill one or more of Greene et al.'s (1989) five purposes for mixing. As Greene (2008) surmised, the combining of quantitative and qualitative analysis techniques "has not yet cohered into a widely accepted framework or set of ideas" (p. 14).

Furthermore, Bazeley (2010) concluded that "there are surprisingly few published studies reporting results from projects which make more than very elementary use of the capacity to integrate data and analyses using computers" (p. 434). Therefore, the concept of crossover analysis has great potential for advancing the process of combining quantitative and qualitative data collection and data analysis techniques within the same framework. Indeed, Teddlie and Tashakkori (2009) declared that this concept represents "one of the most fruitful areas for the further development of MM [mixed methods] analytical techniques" (p. 281). To this end, the purpose of the remainder of this article is to advance further one of the two components of crossover analyses, namely, the qualitative dominant crossover analysis. Specifically, we illustrate how a qualitative dominant crossover analysis can enhance the quality of interpretation of interview data.

Qualitative Interviews

Interviews represent one of the most common ways of collecting data in qualitative research because they provide opportunities for the researcher to collect rich and meaning-making data (e.g., Roulston, 2010). Because of the therapeutic relationship and the role of a counselor, there is no doubt that qualitative interviews are more relevant for the field of counseling than for other fields. In fact, certain types of interviews, to a certain degree, can resemble the counseling interview process (e.g., Chenail, 1997; Ortiz, 2001). Thus, in many counseling specialties, including the field of marriage and family therapy, interviews have been the most utilized qualitative method (Gehart, Ratliff, & Lyle, 2001). As Chenail (1997) declared, interviewing is a natural form of inquiry in the field of counseling because "it is so similar to the way in which counselors and therapists interact with their clients in therapy sessions" (Abstract).

We contend that the interview, as a natural mode of inquiry, can be enhanced when researchers/interviewers collect quantitative data alongside qualitative responses. Teddlie and Tashakkori (2009) referred to this strategy as within-strategy mixed methods data collection. We call it a mixed methods interview. Some examples of studies can be found in the literature wherein the researcher(s) developed and utilized interview formats that contained both open-ended and closed-ended items (e.g., Brannen, 2005). However, our call is for an even more rigorous process of combining qualitative open-ended interview questions with items from one or more relevant (standardized) quantitative instruments (e.g., Likert-format scales, rating scales) that possess adequate psychometric properties (i.e., adequate score reliability; adequate score validity stemming from adequate content-related, criterion-related, and construct-related validity), whenever available, which allow the researcher(s) to contextualize further the qualitative interview responses.

Extracting standardized quantitative information, which represents only Level 1 complexity on Ross and Onwuegbuzie's (2011) quantitative analysis continuum, alongside qualitative information from qualitative interviews enhances both representation and legitimation of the phenomenon of interest. Representation refers to the ability to extract an adequate amount of relevant information from each participant, optimally under the conditions of saturation (Morse, 1995), particularly data saturation (i.e., when information emerges so repeatedly that the researcher can expect it and wherein the collection of more data appears to have no additional interpretive worth; Sandelowski, 2008) and theoretical saturation (i.e., when the researcher can assume that her or his emergent theory is adequately developed to fit any future data collected; Sandelowski, 2008).
Information gleaned from the quantitative instrument(s) also lends clarity to the voice of a participant. More specifically, in Greene et al.'s (1989) typology, representation would be increased via enhanced complementarity, expansion, and development. Thus, by incorporating additional sources of information, qualitative researchers would obtain richer interpretations. In contrast, legitimation refers to the validity of interpretations that stem from the interview data. Indeed, legitimation would be increased via the ability to compare and to contrast the qualitative and quantitative data extracted from the interview(s), again using Greene et al.'s (1989) triangulation and initiation. And by increasing both representation and legitimation by administering one or more standardized quantitative instruments, increased verstehen (i.e., understanding) would ensue. What follows is a heuristic example to illustrate, using a real study, the benefit of administering a standardized quantitative instrument as part of the qualitative interview process. It is our belief that our exemplar also will serve as a model for understanding the mixed research concepts previously discussed.

Heuristic Example

The example we provide here was written by Frels (2010), a professional school counselor (also the first author of the present article), using a qualitative dominant crossover mixed analysis within the context of a qualitative study wherein interviews represented the main data collection tool. The purpose of Frels's (2010) study was to explore selected mentors' perceptions and experiences of the dyadic mentoring relationship in school-based mentoring (SBM).

A second purpose was to build on the qualitative body of research (Spencer, 2004, 2007) for understanding roles, purposes, approaches, and experiences of the relationship process with mentees (the dyadic relationship). The research explored SBM as a type of helping relationship facilitated by a mentor, involving the untapped resources of the psychotherapy literature and described by Spencer (2004), specifically, the dyadic relationship itself as the facilitator of change to affect both the mentor and the mentee. Frels's research questions were as follows:

Research Question 1: What are the experiences and perceptions of selected school-based mentors regarding roles, purposes, and approaches of mentoring within the dyadic relationship with elementary school students?

Research Question 2: What are the differences and similarities in experiences and perceptions of selected school-based mentors working with elementary school students as a function of ethnicity of the mentor?

As noted by Plano Clark and Badice (2010), three elements are key in the focus of any study: the content area, the purpose, and the research questions. Even though Frels's (2010) research was a qualitative study, the research questions also might be considered as representing general overarching mixed research questions (i.e., broad questions that are addressed using both quantitative and qualitative approaches) for driving the data collection methods. Consequently, as explained by Plano Clark and Badice, research questions are inherently linked to environmental contexts that include theories and beliefs. Therefore, we explored belief systems and philosophies at the onset of the study to recognize the lens from which data would be collected.

As a result, the driving research paradigm was determined to be what Johnson (2011, 2012) labeled dialectical pluralism, which refers to an epistemology that requires the researcher to incorporate multiple epistemological perspectives. This philosophical stance lends itself to the use of a crossover mixed analysis, and we combined epistemological perspectives to include pragmatism-of-the-middle (Onwuegbuzie, Johnson, & Collins, 2009) and social constructionism (Schwandt, 2000).

Data Collection

To address the research questions, Frels (2010) conducted a multiple case study with 11 adult mentors (four men, seven women), with ages ranging from 28 to 70 years and ethnicities of African American (n = 5), Hispanic (n = 2), and White (n = 4). Each mentor was paired with a mentee, and the pairings varied along two dimensions: same-gender versus different-gender and same-ethnicity versus different-ethnicity mentee–mentor pairings. Although many forms of data were collected with regard to all dyad interactions—including observations, descriptive case notes, reflexive data, and debriefing data—interviews represented the major data collection technique for exploring the phenomenon of dyadic mentoring relationships. This mode of inquiry resonated with the first author’s identity and relational approach in research as a professional school counselor. Each mentor was interviewed separately on multiple occasions. Each interview, which lasted between 20 minutes and 60 minutes, was semistructured, with questions purposefully created to gain insight into the experience of the dyadic relationship. Examples of interview questions include the following: What are your beliefs, thoughts, and opinions about the purpose of mentoring? What words, phrases, or images come to mind to describe the time you spend with your mentee? When you feel challenged in your relationship, what are some thoughts or beliefs that help? In addition to the in-depth interviews, the 11 mentors completed a standardized quantitative instrument, the 62-item Match Characteristics Questionnaire (MCQ; Harris & Nakkula, 2008), which measures the quality of matching between mentors and mentees. The MCQ yielded good psychometric properties, with score reliability for some of the subscales (e.g., Growth Focus, Support-Seeking From the Mentee) ranging in the .90s.
The MCQ subscale scores were used to contextualize the position of each mentor relative to the others and to obtain a richer description of each of the 11 participants. The subscale scores informed both the ensuing cross-case analyses and the within-case analyses.
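As a rough sketch of how such contextualization can work, the snippet below converts raw subscale scores into percentile ranks against a reference group and assigns each mentor a percentile band. All names and scores here are invented for demonstration; they are not the MCQ data from Frels’s (2010) study, nor the actual MCQ norms.

```python
# Hypothetical illustration of contextualizing mentors via percentile ranks.
# All names and scores are invented; this is NOT the MCQ data from Frels (2010).

def percentile_rank(value, reference):
    """Percentage of reference scores at or below `value`."""
    at_or_below = sum(1 for r in reference if r <= value)
    return 100.0 * at_or_below / len(reference)

# Invented reference scores for one subscale (say, Satisfaction).
norms = [55, 60, 62, 65, 68, 70, 72, 75, 78, 80, 85, 88, 90, 92, 95]

# Invented scores for three pseudonymous mentors.
mentors = {"Savannah": 58, "Chad": 66, "Dana": 91}

for name, score in mentors.items():
    pr = percentile_rank(score, norms)
    band = "below 50th" if pr < 50 else "50th-75th" if pr < 75 else "above 75th"
    print(f"{name}: score {score} -> {pr:.0f}th percentile ({band})")
```

A profile of such bands across subscales is what permits statements of the form "this mentor scored below the 50th percentile in all three categories" to sit alongside that mentor’s interview data.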

Data Analysis

Cross-case analysis. The following segments are excerpts from Frels’s (2010) report that provide examples of how the MCQ subscale scores were used to enhance the richness of interpretations stemming from the cross-case analysis:

To legitimate the metathemes and themes, scores from the MCQ (Harris & Nakkula, 2008) and selected subscales were analyzed. Because norms are in the process of being established, scores have been established as percentiles by the authors of the MCQ (J. T. Harris, personal communication, June 2, 2010). On the whole, the selected mentors in my study scored high on every subscale of the MCQ. (pp. 197–198)

To explore relationship behaviors in each dyad and perceived program support (e.g., relating to Research Question 2), the following three subscales were examined for each mentor: (a) program support subscale (i.e., the degree to which mentors feel that the program is providing effective training, supervision, and support); (b) support-seeking subscale (i.e., the degree to which mentors feel that their mentees seek their support in relation to personal issues and academics); and (c) mentor satisfaction subscale (i.e., the degree to which mentors feel that their match is growing stronger and producing good results for the mentee).

Interestingly, Savannah and Chad [pseudonyms of two participants of the study] scored lowest in the area of support-seeking, as they did in the area of sharing. In addition, Savannah scored below the 50th percentile in all three categories: program support, support-seeking behavior, and mentor satisfaction. Of the 11 mentors, seven scored above the 50th percentile for all three categories. Two of 11 mentors scored higher than the 75th percentile for two of the three categories; and only one mentor (Savannah) scored low for all three categories. (pp. 200–201)

Comparing the MCQ subscale scores with qualitative responses involved the following crossover analyses: integrated data reduction, data transformation, data consolidation, data comparison, data integration, and warranted assertion analysis. Indeed, the selected mentors expressed satisfaction in the interviews (with the exception of the participant, Savannah, who scored lower on the MCQ), and the use of the MCQ enriched this finding.

Within-case analysis. The profiles of MCQ subscale scores played an important role in Frels’s (2010) decision to select Savannah for a follow-up, in-depth within-case analysis. The following excerpt from Frels’s (2010) report distinguishes a unique profile for Savannah:

Savannah’s profile on the MCQ indicated an equal to or higher score than the 75th percentile on three (comfort, fun focus, future outlook) of the 10 subscales utilized to measure relationship characteristics. On three subscales (closeness, character development, relating focus), Savannah scored above the average range but below the 75th percentile. Additionally, Savannah scored particularly low in the areas of sharing (lower than average) and satisfaction (at the 25th percentile). Figure 32 [Figure 1 in the present article] depicts the scores of Savannah as they relate to the MCQ averages, the 75th percentile, and the 25th percentile.

Interestingly, Savannah scored highest in the area of focusing on the future (a score of 96, above the 75th-percentile value of 80). As seen in [Figure 1 in the present article], Savannah’s profile on the MCQ was very high or very low on various subscales. Hence, her profile is deemed highly fluctuating. Table 27 [Table 2 in the present article] provides statements (i.e., qualitative data that support the MCQ responses). (p. 211)

[Figure 1. Averages of Selected Subscales From the Match Characteristics Questionnaire and the Profile of Savannah. The figure plots Savannah’s scores against the group average, the 75th percentile, and the 25th percentile for the subscales Closeness, Discomfort, Character Development, Fun Focus, Sharing Focus, Academic Focus, Relating Focus, Growth Focus, Future Outlook Focus, and Satisfaction. Note. Adapted from “The Experiences and Perceptions of Selected Mentors: An Exploratory Study of the Dyadic Relationship in School-Based Mentoring,” by R. K. Frels, 2010, unpublished doctoral dissertation, pp. 212–213. Copyright 2010 by R. K. Frels.]

As a result of the analysis, the table described in the excerpt (see Table 2) was a reference point that aligned the quantitative instrument with some of the qualitative findings. For example, as seen in Table 2, Savannah scored in the high fourth quartile for the MCQ subscale Future Outlook Focus. This technique of correlating scores with qualitative information involved the following crossover analyses: integrated data reduction, integrated data display, data transformation, data correlation, data consolidation, data comparison, data integration, and warranted assertion analysis. During the interview, Savannah disclosed that she was unhappy with the progress that she and her mentee were making. Correspondingly, Savannah scored in the low third quartile of the MCQ subscale Satisfaction.
As seen in Table 2, the data were integrated by displaying this MCQ quantitative score aligned with an example quotation from the interview. Savannah’s high expectations and unrealistic goals resulted in her frustration and her decision to discontinue mentoring (Frels & Onwuegbuzie, 2012a, 2012b).
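The kind of integrated display described above can be sketched as a simple alignment of subscale, quartile band, and representative quotation. The list structure below is our own illustration; the two subscale–quartile pairings and the short excerpts echo Frels’s (2010) Table 2 as reproduced in this article.

```python
# Minimal sketch of an integrated data display: each MCQ subscale is paired
# with its quartile band and a representative interview excerpt.
# The structure is illustrative; the entries echo Table 2 of the article.

integrated = [
    ("Future Outlook Focus", "high fourth", "I wanted to give her a head start."),
    ("Satisfaction", "low third", "Some days, it was hard."),
]

# Print one aligned row per subscale.
for subscale, quartile, excerpt in integrated:
    print(f'{subscale:<22} {quartile:<12} "{excerpt}"')
```

Placing the quantitative band and the qualitative voice side by side is the point of the display: a reader can see at a glance where the numbers and the narrative converge or diverge.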

This concept resonates with other SBM literature (Karcher, Herrera, & Hansen, 2010), whereby goal-oriented interactions (e.g., focusing on the future) often are not sufficient indicators of relationship closeness in SBM. Furthermore, revealing the relationship between Savannah’s intent to leave mentoring and the MCQ Satisfaction subscale provided evidence of triangulation, or convergence in data. Finally, with reference to Greene et al.’s (1989) five different research purposes for mixing quantitative and qualitative data, themes from the constant comparison analysis of interview data and the MCQ scores were mapped and identified with one or more purposes. This type of data correlation map can provide further evidence of how a researcher can integrate qualitative findings with a quantitative instrument.
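One way to picture such a data correlation map is as a theme-to-purpose lookup, sketched below. Only the theme Too Many Questions and its pairing with complementarity come from the article; the second theme and its purposes are invented placeholders for illustration.

```python
# Hypothetical data correlation map: each qualitative theme is tagged with the
# Greene et al. (1989) mixing purpose(s) it evidences. Only "Too Many Questions"
# paired with complementarity comes from the article; the other entry is invented.

purpose_map = {
    "Too Many Questions": ["complementarity"],
    "Shared Activities": ["triangulation", "expansion"],  # invented example
}

# Collect the themes that evidence a given purpose, here complementarity.
complementary = [theme for theme, purposes in purpose_map.items()
                 if "complementarity" in purposes]
print(complementary)  # ['Too Many Questions']
```

Inverting the map by purpose, as the list comprehension does, is what lets a researcher display which themes support each of the five mixing purposes.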

For example, Frels (2010) presented the purpose of complementarity—to measure overlapping but different facets of a phenomenon (Greene et al., 1989)—with the case of Savannah. The constant comparison analysis of qualitative data yielded the theme of Too Many Questions, which seemed to hinder the dyadic relationship. Frels presented the subscale Satisfaction in relation to the theme Too Many Questions to contextualize how the use of questions was inherent in Savannah’s relating style. For example, Savannah described her own disappointment with the mentoring experience through questions during an interview: “Why are you [myself] here?

. . . you know . . . why did they [the mentees] want you to come? And was it their idea? Was it their parents’ idea? Was it their teacher’s idea?” Thus, with the use of Greene et al.’s (1989) purposes for mixing as a frame for a visual display, data integration in the qualitative dominant crossover mixed analysis was evident.

TABLE 2
Correlating Match Characteristics Questionnaire Subscale Scores With Qualitative Statements

Academic Focus (low fourth quartile): “I mean you have to have something—like if her teacher wanted her to work on fractions—she could have sent maybe something for her to work on with us, to work on together.”

Relating Focus (low fourth quartile): “Well, we played a couple of games but I think talking more connecting . . . because when we played the game—we didn’t talk so much.”

Future Outlook Focus (high fourth quartile): “And um, I [mentor] . . . really studied. I worked hard but I did well. And I thought ‘wow—wow’ and I just want to encourage her ’cause things didn’t come easy to me; they don’t come easy to her. And I wanted to give her a head start. Don’t wait till you’re in college or until after your kids are born to learn how to study.”

Satisfaction (low third quartile): “The fact that she [mentee] was shy and the fact that she still struggles in school, I wanted to help but those two things really kind of made me decide, you know, on what I did [to quit]. Some days, it was hard.”

Note. Examples are from Savannah’s interview. Adapted from “The Experiences and Perceptions of Selected Mentors: An Exploratory Study of the Dyadic Relationship in School-Based Mentoring,” by R. K. Frels, 2010, unpublished doctoral dissertation, pp. 212–213. Copyright 2010 by R. K. Frels.

Conclusion

As we have shown in this article, supplementing open-ended interview responses with quantitative data from one or more psychometrically sound (standardized) quantitative instruments can increase the rigor of qualitative studies; this practice is consistent with many philosophical paradigms. In addition, by recognizing the value of crossover mixed analyses, researchers might view philosophical integration much as they would view the concept of theoretical integration in counseling. Oftentimes, counselors adhere to one guiding theory, which can be integrated with other theoretical concepts at points in common, including underlying philosophy, values, and data collection (Kottler & Montgomery, 2011). Because incorporating information from standardized quantitative instruments into the analysis of qualitative interview data represents the use of quantitative analysis techniques that are classified only as Level 1 complexity (Ross & Onwuegbuzie, 2011), this strategy should not contradict the philosophical assumptions and stances of any of the major qualitative-based research paradigms (e.g., constructivism, critical theory, participatory). Thus, even though supplementing qualitative interview data with quantitative data—what we call a mixed methods interview—leads to a mixed analysis, the resultant mixed analysis would be qualitatively dominant. Furthermore, as Guba and Lincoln (2011) wrote,

Are paradigms commensurable? Is it possible to blend elements of one paradigm into another, so that one is engaging in research that represents the best of both worldviews? The answer, from our perspective, has to be a cautious yes. This is so if the models (paradigms, integrated philosophical systems) share axiomatic elements that are similar, or that resonate strongly between them. (p. 117)

Therefore, our call for an even more rigorous process of combining qualitative open-ended interview questions with items from standardized quantitative instruments, via a mixed methods interview, represents the blending of elements of one paradigm into another that provides qualitative researchers from the field of counseling the best of both worldviews.

Most important, the collection of quantitative data during the qualitative interview process allows researchers to compare each interviewee with extant normative data, including international norms, national norms, regional norms, local norms, and relevant cultural norms. Thus, we encourage qualitative researchers, whenever appropriate, to administer one or more quantitative instruments that tap the construct of interest to increase verstehen.

References

Adler, L. (1996). Qualitative research of legal issues. In D. Schimmel (Ed.), Research that makes a difference: Complementary methods for examining legal issues in education (NOLPE Monograph Series No. 56, pp. 3–31). Topeka, KS: National Organization on Legal Problems of Education.

American Counseling Association. (2005). ACA code of ethics. Alexandria, VA: Author.

Andrew, S., & Halcomb, E. J. (Eds.). (2009). Mixed methods research for nursing and the health sciences. Chichester, England: Wiley-Blackwell.

Bazeley, P. (2010). Computer-assisted integration of mixed methods data sources and analysis. In A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research (2nd ed., pp. 431–467). Thousand Oaks, CA: Sage.

Bergman, M. (Ed.). (2008). Advances in mixed methods research: Theories and applications. Thousand Oaks, CA: Sage.

Brannen, J. (2005). Mixing methods: The entry of qualitative and quantitative approaches into the research process. International Journal of Social Research Methodology, 8, 173–184.

Chenail, R. J. (1997). Interviewing exercises: Lessons from family therapy. The Qualitative Report, 3. Retrieved from http://www.nova.edu/ssss/QR/QR3-2/chenail.html

Collins, K. M. T., Onwuegbuzie, A. J., & Jiao, Q. G. (Vol. Eds.). (2010). The research on stress and coping in education series: Vol. 5. Toward a broader understanding of stress and coping: Mixed methods approaches. Greenwich, CT: Information Age Publishing.

Creswell, J. W., & Plano Clark, V. L. (2010). Designing and conduct- ing mixed methods research (2nd ed.). Thousand Oaks, CA: Sage.

Frels, R. K. (2010). The experiences and perceptions of selected mentors: An exploratory study of the dyadic relationship in school-based mentoring (Unpublished doctoral dissertation). Sam Houston State University, Huntsville, TX.

Frels, R. K., & Onwuegbuzie, A. J. (2012a). The experiences of selected mentors: A cross-cultural examination of the dyadic relationship in school-based mentoring. Mentoring & Tutoring: Partnership in Learning, 20, 1–26. doi:10.1080/13611267.2012.679122

Frels, R. K., & Onwuegbuzie, A. J. (2012b). Principles of play: A dialogical comparison of two case studies in school-based mentoring. International Journal of Play Therapy, 21, 131–148. doi:10.1037/a0028536

Frels, R. K., Onwuegbuzie, A. J., Leech, N. L., & Collins, K. M. T. (2012). Challenges to teaching mixed research courses. Journal of Effective Teaching, 12, 23–44.

Gehart, D. R., Ratliff, D. A., & Lyle, R. R. (2001). Qualitative research in family therapy: A substantive and methodological review. Journal of Marital and Family Therapy, 27, 261–270. doi:10.1111/j.1752-0606.2001.tb01162.x

Greene, J. C. (2007). Mixed methods in social inquiry. San Francisco, CA: Jossey-Bass.

Greene, J. C. (2008). Is mixed methods social inquiry a distinctive methodology? Journal of Mixed Methods Research, 2, 7–22. doi:10.1177/1558689807309969

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11, 255–274. doi:10.3102/01623737011003255

Guba, E. G., & Lincoln, Y. S. (2011). Paradigmatic controversies, contradictions, and emerging confluences, revisited. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (4th ed., pp. 97–128). Thousand Oaks, CA: Sage.

Hanson, W. E., Creswell, J. W., Plano Clark, V. L., Petska, K. S., & Creswell, J. D. (2005). Mixed methods research designs in counseling psychology. Journal of Counseling Psychology, 52, 224–235. doi:10.1037/0022-0167.52.2.224

Harris, J. T., & Nakkula, M. J. (2008). Match Characteristic Questionnaire (MCQ). Unpublished measure, Harvard Graduate School of Education.

Hesse-Biber, S. N. (2010). Mixed methods research: Merging theory with practice. New York, NY: Guilford Press.

Ivankova, N. V., & Kawamura, Y. (2010). Emerging trends in the utilization of integrated designs in the social, behavioral, and health sciences. In A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research (2nd ed., pp. 581–611). Thousand Oaks, CA: Sage.

Johnson, R. B. (2011). Dialectical pluralism: A metaparadigm to help us hear and “combine” our valued differences. In S. J. Hesse-Biber (Chair), Addressing the credibility of evidence in mixed methods research: Questions, issues and research strategies. Symposium conducted at the meeting of the Seventh International Congress of Qualitative Inquiry, University of Illinois at Urbana-Champaign.

Johnson, R. B. (2012). Dialectical pluralism and mixed research. American Behavioral Scientist, 56, 751–754. doi:10.1177/0002764212442494

Johnson, R. B., & Gray, R. (2010). A history of philosophical and theoretical issues for mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research (2nd ed., pp. 581–611). Thousand Oaks, CA: Sage.

Johnson, R. B., Onwuegbuzie, A. J., & Turner, L. A. (2007). Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1, 112–133. doi:10.1177/1558689806298224

Karcher, M. J., Herrera, C., & Hansen, K. (2010). “I dunno, what do you wanna do?”: Testing a framework to guide mentor training and activity selection. New Directions for Youth Development, 126, 51–69. doi:10.1002/yd.349

Kottler, J. A., & Montgomery, M. J. (2011). Theories of counseling and therapy: An experiential approach. Thousand Oaks, CA: Sage.

Leech, N. L., & Onwuegbuzie, A. J. (2010). Guidelines for conducting and reporting mixed research in the field of counseling and beyond. Journal of Counseling & Development, 88, 61–69. doi:10.1002/j.1556-6678.2010.tb00151.x

Leech, N. L., & Onwuegbuzie, A. J. (2011). Mixed research in counseling: Trends in the literature. Measurement and Evaluation in Counseling and Development, 44, 169–180. doi:10.1177/0748175611409848

Morse, J. M. (1995). The significance of saturation. Qualitative Health Research, 5, 147–149. doi:10.1177/104973239500500201

Morse, J. M., & Niehaus, L. (2009). Mixed method design: Principles and procedures. Walnut Creek, CA: Left Coast Press.

Newman, I., & Ridenour, C. R. (2008). Mixed methods research. Chicago, IL: Southern Illinois University Press.

Onwuegbuzie, A. J. (2003). Effect sizes in qualitative research: A prolegomenon. Quality & Quantity: International Journal of Methodology, 37, 393–409. doi:10.1023/A:1027379223537

Onwuegbuzie, A. J., & Combs, J. P. (2010). Emergent data analysis techniques in mixed methods research: A synthesis. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (2nd ed., pp. 397–430). Thousand Oaks, CA: Sage.

Onwuegbuzie, A. J., & Dickinson, W. B. (2008). Mixed methods analysis and information visualization: Graphical display for effective communication of research results. The Qualitative Report, 13, 204–225. Retrieved from http://www.nova.edu/ssss/QR/QR13-2/onwuegbuzie.pdf

Onwuegbuzie, A. J., Frels, R. K., Leech, N. L., & Collins, K. M. T. (2011). A mixed research study of mixed research courses: Experiences and perceptions of instructors. International Journal of Multiple Research Approaches, 5, 169–202.

Onwuegbuzie, A. J., Jiao, Q. G., & Bostick, S. L. (2004). Library anxiety: Theory, research, and applications (Research Methods in Library and Information Studies, No. 1). Lanham, MD: Scarecrow Press.

Onwuegbuzie, A. J., Johnson, R. B., & Collins, K. M. T. (2009). A call for mixed analysis: A philosophical framework for combining qualitative and quantitative approaches. International Journal of Multiple Research Approaches, 3, 114–139.

Onwuegbuzie, A. J., Leech, N. L., & Collins, K. M. T. (2011). Toward a new era for conducting mixed analyses: The role of quantitative dominant and qualitative dominant crossover mixed analyses. In M. Williams & W. P. Vogt (Eds.), The Sage handbook of innovation in social research methods (pp. 353–384). Thousand Oaks, CA: Sage.

Onwuegbuzie, A. J., Slate, J. R., Leech, N. L., & Collins, K. M. T. (2009). Mixed data analysis: Advanced integration techniques. International Journal of Multiple Research Approaches, 3, 13–33.

Onwuegbuzie, A. J., & Teddlie, C. (2003). A framework for analyzing data in mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 351–383). Thousand Oaks, CA: Sage.

Ortiz, S. M. (2001). How interviewing became therapy for wives of professional athletes: Learning from a serendipitous experience. Qualitative Inquiry, 7, 192–220. doi:10.1177/107780040100700204

Plano Clark, V. L., & Badice, M. (2010). Research questions in mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research (2nd ed., pp. 275–304). Thousand Oaks, CA: Sage.

Plowright, D. (2011). Using mixed methods: Frameworks for an integrated methodology. Thousand Oaks, CA: Sage.

Powell, H., Mihalas, S., Onwuegbuzie, A. J., Suldo, S., & Daley, C. E. (2008). Mixed methods research in school psychology: A mixed methods investigation of trends in the literature. Psychology in the Schools, 45, 291–309. doi:10.1002/pits.20296

Ray, D. C., Hull, D. M., Thacker, A. J., Pace, L. S., Swan, K. L., Carlson, S. E., & Sullivan, J. M. (2011). Research in counseling: A 10-year review to inform practice. Journal of Counseling & Development, 89, 349–359. doi:10.1002/j.1556-6678.2011.tb00099.x

Ross, A., & Onwuegbuzie, A. J. (2011, February). Complexity of quantitative analyses used in mixed research articles from the field of mathematics education. Paper presented at the annual meeting of the Eastern Educational Research Association, Sarasota, FL.

Roulston, K. (2010). Considering quality in qualitative interviewing. Qualitative Research, 10, 199–228. doi:10.1177/1468794109356739

Sandelowski, M. (2008). Theoretical saturation. In L. M. Given (Ed.), The Sage encyclopedia of qualitative methods (Vol. 1, pp. 875–876). Thousand Oaks, CA: Sage.

Schwandt, T. A. (2000). Three epistemological stances for qualitative inquiry: Interpretivism, hermeneutics, and social constructionism. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed., pp. 189–215). Thousand Oaks, CA: Sage.

Schwandt, T. A. (2007). The Sage dictionary of qualitative inquiry (3rd ed.). Thousand Oaks, CA: Sage.

Smith, M. L. (1997). Mixing and matching: Methods and models. In J. C. Greene & V. J. Caracelli (Eds.), Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms (New Directions for Evaluation No. 74, pp. 73–85). San Francisco, CA: Jossey-Bass.

Spencer, R. (2004). Studying relationships in psychotherapy: An untapped resource for youth mentoring. New Directions for Youth Development, 103, 31–42. doi:10.1002/yd.89

Spencer, R. (2007). “It’s not what I expected”: A qualitative study of youth mentoring relationship failures. Journal of Adolescent Research, 22, 331–354. doi:10.1177/0743558407301915

Tashakkori, A., & Teddlie, C. (1998). Applied Social Research Methods Series: Vol. 46. Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.

Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.

Tashakkori, A., & Teddlie, C. (Eds.). (2010). Sage handbook of mixed methods in social and behavioral research (2nd ed.). Thousand Oaks, CA: Sage.

Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative techniques in the social and behavioral sciences. Thousand Oaks, CA: Sage.