
Toward Development of a Generalized Instrument to Measure Andragogy

Elwood F. Holton III, Lynda Swanson Wilson, Reid A. Bates

Andragogy has emerged as one of the dominant frameworks for teaching adults during the past 40 years. A major and glaring gap in andragogy research is the lack of a measurement instrument that adequately measures both andragogical principles and process design elements. As a result, no definitive empirical test of the theory has been possible. The purpose of this article is to report on initial attempts to develop a survey instrument that corrects this shortcoming in the andragogy research literature. The instrument developed for this study was part of a comprehensive examination of andragogical principles and process design elements and their effect on student satisfaction and learning outcomes in a postsecondary education setting. It was administered to 404 adults enrolled in an adult-oriented postgraduate degree program. Exploratory factor analysis revealed promising scales to measure five of the six andragogical principles and six of the eight process design elements. This instrument is the most successful attempt to date to measure andragogical principles and elements. It holds promise for advancing research on andragogy, and subsequently advancing the field of HRD, by explaining affective and cognitive responses to andragogical instructional strategies across a spectrum of learning environments. Additional implications for future research to strengthen the instrument are also discussed.

Adult learning has long been a core component of HRD practice and research.

Although HRD has expanded well beyond adult learning alone, there are few HRD practitioners or researchers who don’t also work in the adult learning realm. Furthermore, there are few if any HRD academic programs that don’t include adult learning as part of their curriculum. Thus, research about and within adult learning remains timely and relevant to HRD.

HUMAN RESOURCE DEVELOPMENT QUARTERLY, vol. 20, no. 2, Summer 2009 © Wiley Periodicals, Inc. Published online in Wiley InterScience (www.interscience.wiley.com) • DOI: 10.1002/hrdq.20014

Andragogy has emerged as one of the dominant frameworks for teaching adults during the past 40 years. Defined as the “art and science of helping adults learn” (Knowles, 1990, p. 54) and “an intentional and professionally guided activity that aims at change in an adult person” (Knowles, Holton, & Swanson, 1998, p. 60), andragogy has become “synonymous with the education of adults” (Pratt, 1998, p. 160). Andragogy is viewed by some in the field as “the theory of adult education” (Merriam & Brockett, 1997, p. 135), due in part to its “significant influence exercised on the practice of adult education” (Pratt, 1998, p. 160). It has been described as “the preeminent and persistent practice-based, instructional method” (Rachal, 2002, p. 211), a “guiding principle on how best to educate adults” (Beder & Carrea, 1988, p. 75), a “set of guidelines for effective instruction of adults” (Feuer & Gerber, 1988, p. 35), and “a way of thinking about working with adult learners” (Merriam & Brockett, 1997, p. 135).

According to Lawson (1997), “the paradigm of andragogy continues to be a powerful influence in the field (of adult education) by its influence on shaping how we think about the delivery of services to adults” (p. 10). As a matter of fact, many educators wear andragogy as a “badge of identity because it grants them a sense of their distinct professional identity” (Brookfield, 1986, p. 91). Feuer and Gerber (1988) appeared to agree when they discussed the andragogical badge’s uniqueness for both educators and trainers as a way to “carve out a specific content domain, a formal, theory-based body of knowledge to be nurtured and cultivated” (p. 32). Some educators who subscribe to andragogical principles feel the most appropriate way to design learning is to keep the adult learner at the center or the focus of the learning experience by using instructional strategies that best meet adult learner needs. Similarly, other educators subscribe to a “Knowlesean” view of adult learning and thus approach the learning process by using adult learner characteristics and providing an educational experience that is respectful, cooperative, and self-directed (Strawbridge, 1994). Therefore it is not surprising that andragogy’s impact on adult learning has been noted as groundbreaking and revolutionary; it is perhaps the best-known theory of adult learning (Knowles et al., 1998; Merriam, 1987).

Andragogy’s six basic principles of adult learning differentiate what educators must do to successfully teach learners. These principles shift the focus of learning needs analysis, curriculum design, delivery, and assessment from being teacher-centered to learner-centered. In addition to andragogical principles, andragogical process design elements influence the adult learning experience. The eight design elements encompass a wide range of activities that occur before, during, and after the learning experience, including: preparing the learners, climate setting, mutual planning, diagnosis of learning needs, formulation of learning objectives, learning plan design, learning plan execution, and evaluation (Knowles, 1984). Adoption and integration of andragogical process design element strategies in adult learning settings can, as noted by Darkenwald and Merriam (1982), lead to an “understanding of the learning process which could enhance the practice of adult education” (p. 99).

When andragogical principles and design elements are adequately considered, andragogy has the “ability to address the differences of learning needs between adults and children via sharply differentiated instructional methods” (Brookfield, 1986, pp. 96, 125). Darkenwald and Merriam (1982) stated that gaining an “understanding of the learning process could enhance the practice of adult education” (p. 99). Knowles et al. (1998) pointed out that in a learning setting the andragogical model is appropriate because it addresses characteristics of the learning situation.

Regardless of restraints and challenges, andragogical principles have found their way into all levels of formal education (Knowles, 1990). Their influence has reached beyond traditional education to influence professional training such as nursing, social work, business, religion, agriculture and law (Davenport & Davenport, 1985; Knowles, 1980); workforce development efforts; and higher education academic counseling (Espinoza, 2001).

The fact of the matter is that andragogy has challenged and reshaped design and execution of adult education. Its influence has resulted in adaptation of long-held education theories and practices, and it has prompted scholars and practitioners alike to question the assumption of pedagogical approaches to education delivery (Knowles, 1990). Feuer and Gerber (1988) noted that few in adult education would argue with the fact that Knowles’s ideas “sparked a revolution in adult education” (p. 31). However, even with accolades for andragogy and its influence in adult education, debate persists.

Studies have produced more questions than answers, mainly because of conflicting research findings (Strawbridge, 1994) and a void of research (Weinstein, 2002). Today, anecdotal or descriptive research continues to dominate in the field, which has prompted some researchers to question unequivocal adoption of the theory without a clear explanation as to how it affects learning (Merriam & Brockett, 1997). One complaint levied against andragogy is that it has “caused more controversy, philosophical debate and critical analysis than any other concept proposed in adult learning” (Merriam & Caffarella, 1991, p. 250).

One probable cause of the confusion is its myriad conceptual interpretations. Andragogy has been classified as a science, a philosophy, a set of assumptions, a set of guidelines, as well as an art (Knowles et al., 1998). Rachal (2002) suggested that failure to reach a consensus was partly due to an “elasticity of meaning” (p. 211). Davenport and Davenport (1985) observed that the adult education literature classifies andragogy as a theory, a method, and a technique of adult education as well as a set of assumptions. Merriam (1987) seemed to agree when she stated that there was no clear academic agreement on how best to identify andragogy.

Another reason for persistent debate in the field is the limited number of empirical investigations that have been conducted (Merriam & Caffarella, 1991). As noted by Strawbridge (1994), there may be a need to “narrow research questions to achieve empirical testability” (pp. 13, 73). Williams (2001) also expressed concern regarding the lack of aggressive investigative designs. This lack of aggressiveness will continue to perpetuate what Rachal (2002) noted as “failed efforts to move the andragogical debate to the next level beyond extensive anecdotal writing on the subject” (p. 211). It appears from the current state of research that the field is struggling with “inconclusive, contradictory and limited or insufficient empirical examinations” (Brookfield, 1986, p. 91; Rachal, 2002) as well as a “paucity of empirical research” (Beder & Carrea, 1988, p. 75).

Unfortunately, indications are that sophisticated research designs are few and far between. Survey designs are by far the most widely used research approach in the adult learning field (Williams, 2001), and descriptive and qualitative research methods consistently dominate the adult learning literature (Long, Hiemstra, & Associates, 1980; Williams, 2001). Rachal (2002) argued that “the art of andragogy may be dominant over the science” (p. 212).

Davenport (1984) claimed that the field practices “an act of educational faith rather than an act of educational science” (p. 10). The results of flawed and limited research include (1) emphasizing practice over theory and research, (2) failure to produce credible outcome measurements, (3) studies limited in scope, (4) not following a systematic strategy, and (5) leaving unanswered questions about program effectiveness and accountability as well as future program planning and improvement (Beder & Carrea, 1988; Brockett, 1987).

A major and glaring gap in andragogy research is the lack of a measurement instrument that validly measures andragogical principles and process design elements. Only a handful of instruments have been created, mainly for dissertation work. Unfortunately, research has yet to produce an instrument with sound psychometric qualities that validly measures both andragogy’s six principles and its eight process design elements.

Several measurement instruments have appeared in the literature over the past 30 years, and each has contributed to the body of knowledge (Christian, 1982; Colton & Hatcher, 2004; Conti, 1978; Hadley, 1975; Kerwin, 1979; Knowles, 1987; Perrin, 2000; Suanmali, 1981). However, each instrument had its own flaws and limitations, particularly in its inability to completely isolate (1) adult learners, (2) the six andragogical principles, or (3) the eight andragogical process elements.

Clearly, no strong empirical research on the theory of andragogy is possible if researchers don’t have a tool to measure it in field studies. We suggest that a valid and reliable measurement instrument is essential to advancing our understanding of andragogy and must include evaluation of both andragogical principles and process design elements. Additionally, establishing an instrument’s validity is critical for its use in empirical examinations. It seems odd that despite almost 50 years of use, andragogy remains more art than science. We suggest that it is critical for andragogy to move beyond philosophical or practice-based rhetoric to strong empirical testing so practitioners in the field of HRD will more thoroughly understand knowledge acquisition, which leads to design and delivery of adult-appropriate curriculums.

This is not just an issue for researchers. Adult learners are not a monolithic group, and educational settings span a wide contextual spectrum.

For HRD practitioners, it is vital that there be a stronger empirical base for identifying and developing best practices in adult learning instructional strategies. Indeed, andragogy may be the most widely practiced approach to adult education, and its tenets are well known to most adult educators. Yet the truth is that such practice is based on shaky empirical evidence at best. Our long-term goal therefore is to provide adult educators with stronger evidence-based theory and principles.

Thus, the purpose of this article is to report on the creation of a measurement instrument that was used as part of a study that measured student satisfaction with andragogical teaching methods and subsequent evidence of learning in a facilitative postsecondary educational setting. The study was an initial attempt to address the problem of a lack of instruments that measure andragogy comprehensively, both its principles and process design elements. The study developed a survey instrument that corrected this shortcoming and answered the research question: Can an instrument with sound psychometric qualities be developed that is valid and reliable, and that measures an instructor’s andragogical behaviors on the basis of andragogy’s six principles and the eight process design elements?

Review of Andragogical Measurement Research

The most significant first step in the study of andragogy was the development of the Educational Orientation Questionnaire (EOQ) by Hadley (1975). It measured differences in beliefs among adult educators as well as effective learning strategies, including both pedagogical and andragogical orientations to learning. Knowles and Associates (1984) noted that the contribution of the EOQ was its ability to offer a way for teachers to examine their approach to adult education. Evidence from the study indicated that teachers tend to view themselves as more andragogical than their students do (Hadley, 1975).

Kerwin (1979) remarked that the EOQ was “the first instrument to empirically study teaching behaviors of andragogically- and pedagogically-oriented educators” (p. 3). Hadley (1975) described his study as an “operational hypothesis based upon a theoretical construct that andragogy-pedagogy differences in attitudes toward adult education can be operationalized in terms of respondents’ agreement or disagreement with relevant statements” (p. 99). To that end, he created a 60-item questionnaire designed to “discriminate among adult educators with respect to their andragogical-pedagogical orientation” (p. 127).

Hadley (1975) solicited feedback on the instrument’s design from Malcolm Knowles, a member of his doctoral committee, and the questionnaire was administered to 409 teachers/educators from public and private educational institutions as well as from business, religious institutions, and government agencies. Of the 60 items on the questionnaire, 30 were described as likely to be favored by pedagogically-oriented educators and 30 as likely to be favored by andragogically-oriented educators. Hadley (1975) stated the questionnaire’s underlying constructs, or subdimensions along the pedagogy-andragogy continuum, to be (1) philosophy of education, (2) purpose of education, (3) nature of learners, (4) characteristics of learning experience, (5) management of learning experience, (6) evaluation, and (7) relationships between educator and learner and among learners. Through factor analysis, eight factors emerged: (1) pedagogical orientation, (2) andragogical orientation, (3) competitive motivation, (4) pedagogical teaching, (5) social distance, (6) student undependability, (7) standardization, and (8) self-directed change.
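The exploratory factor analysis step that yields such a factor count can be illustrated in miniature. The sketch below is not Hadley’s analysis; it uses synthetic data and the simple Kaiser criterion (retain factors with eigenvalues greater than 1) to show how a factor count is extracted from an item correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic questionnaire data: 200 respondents, 6 items driven by
# 2 latent factors (illustrative stand-in for real survey responses)
latent = rng.normal(size=(200, 2))
loadings = np.array([[1.0, 0.0], [0.9, 0.1], [0.8, 0.0],
                     [0.0, 1.0], [0.1, 0.9], [0.0, 0.8]])
items = latent @ loadings.T + 0.3 * rng.normal(size=(200, 6))

# Factor the item correlation matrix
corr = np.corrcoef(items, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending order

# Kaiser criterion: retain factors with eigenvalue > 1
n_factors = int((eigenvalues > 1.0).sum())
```

A full analysis would also rotate the retained factors and inspect item loadings before naming factors as Hadley did; dedicated packages handle those steps.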

The EOQ was found to be reliable, with a test-retest reliability of 0.89 and a coefficient alpha of 0.94 (Hadley, 1975).
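Coefficient alpha of the kind Hadley reported can be computed directly from a respondents-by-items score matrix. A minimal sketch with made-up Likert-type responses (the data are illustrative, not the EOQ’s):

```python
import numpy as np

def cronbach_alpha(scores):
    """Coefficient alpha for a respondents-by-items score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative data: 5 respondents answering 4 Likert-type items
responses = [
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 5, 4, 4],
    [1, 2, 1, 2],
    [3, 3, 3, 4],
]
alpha = cronbach_alpha(responses)  # high, since these items covary strongly
```

Values near 1, like Hadley’s 0.94, indicate that the items behave as a consistent scale.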

The EOQ has been used or slightly modified by other researchers since its introduction (Christian, 1982; Kerwin, 1979; Smith, 1982). Davenport (1984) noted that the EOQ had become the primary instrument for measuring the construct of educational orientation and was useful because it demonstrated that educational orientations of instructors vary by gender, department, institutional setting, and academic discipline, but more research was needed to identify if these variances were related to important variables such as achievement. However, it is important to note that andragogical orientation loaded on a single factor and the other factors were not part of the andragogical model.

Two other instruments have been used by the adult learning community as tools to measure the theory of andragogy. They are based on the Hadley instrument: Kerwin’s Educational Description Questionnaire (EDQ; 1979) and Christian’s Student Orientation Questionnaire (SOQ; 1982). Kerwin (1979) modified Hadley’s Educational Orientation Questionnaire (EOQ) and created his EDQ as a means to examine “if students perceived any differences between the teaching behavior of andragogically- and pedagogically-oriented educators to determine in what ways the student-perceived teaching behavior of andragogically-oriented educators differ from that of pedagogically-oriented educators” (p. 3). Additionally, the instrument examined whether a significant difference between andragogical and pedagogical orientations toward education existed.

The EDQ was created by “converting Hadley’s instrument about education or effective learning situations to statements describing educator behavior” (Kerwin, 1979, p. 35). It measured behaviors or conditions that occurred in the classroom. The factorial categories were (1) student involvement, (2) control, (3) distrust and detachment, (4) professionalism, (5) counseling, (6) individual inattention, and (7) organization. The instrument was initially tested on 74 instructors and 961 students at two community colleges (one rural and one urban) along with two technical institutes (one rural and one urban). Kerwin (1979) noted that of all the factors extracted from the EDQ, only one, student involvement, corresponded to a factor identified in Hadley’s EOQ. In comparing the two instruments’ factors, Kerwin (1979) stated that andragogical orientation and student involvement were similar.

Davenport (1984) noted that by identifying the factor of student involvement, the instrument was successful in reinforcing Knowles’s concept of andragogy. However, the study’s instrument, like the EOQ, failed to adequately measure the six principles of andragogy, thus limiting its ability to provide adult education researchers with the adequate data needed to move the theory to the next level of development. Nevertheless, the Hadley (1975) and Kerwin (1979) instruments have been important to the field.

Christian’s 50-item SOQ was a measurement tool for identifying student preferences, attitudes, and beliefs about education. His instrument was created by modifying both the Hadley and Kerwin instruments. He studied 300 military and civilian personnel enrolled in mandatory management training at a U.S. military base, as well as adults attending voluntary education programs being conducted on the base by a local university. Findings revealed that military personnel preferred more andragogical teaching methods compared to civilian personnel. The researcher noted that his study was the first to isolate and examine military personnel preferences in the learning environment.

Christian (1982) suggested that his instrument was significant because it aided adult education instructors in identifying the most appropriate instructional strategies according to the preferences indicated by students’ responses. However, the lingering problem with the Christian instrument’s ability to advance the theory of andragogy is that, like the Hadley and Kerwin instruments, it, too, failed to measure all six principles of andragogy.

The Andragogy in Practice Inventory (API), created by Suanmali (1981) as part of a doctoral study, examined leading adult educators and their beliefs regarding conceptual approaches in the andragogical process. The self-reported instrument examined conceptual agreement with the principles of andragogy held by members of the Commission of the Professors of the Adult Education Association. It measured instructor acceptance of, and agreement with, andragogical concepts, specifically the concept of self-directed learning. Suanmali’s 10-item inventory (1981) also examined the role of the educators, especially in terms of their contribution to helping adults become self-directed learners.

Although it attempted to specifically examine congruence with andragogical principles, the API was “an instrument designed to test the presence of effective facilitation in practice, rather than providing empirical measures of forms of adult learning—or in other words whether or not teachers are behaving as effective facilitators” (Brookfield, 1986, p. 34). Suanmali (1981) concluded that “there was a low degree of agreement among professors of adult education regarding the relative importance of the concepts used in andragogy” and indicated wide variance in degree of agreement among respondents regarding andragogy’s significance in the adult learning environment. This variance may be due in part to the multiple disciplines among respondents and the breadth of populations adult education serves. There was some degree of agreement with five inventory items reflecting andragogy’s impact in the learning setting:

(1) decrease learners’ dependency; (2) help learners use learning resources; (3) learners define their own learning needs; (4) assist learners to define, plan, and evaluate their own learning; and (5) reinforce self-concept as a learner (Suanmali, 1981).

Knowles (1987) created his own andragogical measurement instrument, the Personal HRD Style Inventory, as a way to aid instructors and trainers in their general orientation to adult learning. As Knowles described it, the inventory was designed as a self-assessment tool that would “provide insight into an instructor’s general orientation to adult learning, program development, learning methods, and program administration” (p. 1). The instrument has yet to undergo academic testing; its potential to further our understanding of the theory of andragogy and andragogy’s impact on adult learning remains unknown.

The Principles of Adult Learning Scale (PALS) was developed as a 44-item instrument that “measured the degree to which adult education practitioners accept and adhere to the adult education learning principles that are congruent with the collaborative teaching-learning” (Conti, 1978, p. 2). Its focus was on teaching styles, not an examination of andragogy per se. However, it can be considered one of the best instruments in the field from a psychometric quality perspective. Even though it was not created as a way to directly measure andragogy, it measures teaching methodologies closely associated with the principles of the theory. According to the instrument’s creator, teaching styles are not randomly selected, do not change over time, and are linked to an instructor’s educational philosophy (Conti, 1991). Scores on the PALS indicate the extent of a learner-centered versus teacher-centered approach to teaching.

Several factors were embedded in the instrument. The first, learner-centered activities, evaluated preference for standardized testing, exercising control over the learning environment, determining educational objectives for each student, supporting collaboration, and encouraging students to take responsibility for their own learning. Personalizing instruction, the second factor, included limiting lecturing, supporting cooperation rather than competition, and applying a number of methods, materials, and types of assignments. The third factor, relating to experience, included planning learning activities that encourage students to relate their new learning to experiences, make learning relevant, and organize learning episodes according to real-life problems. The fourth factor assessed student needs, which included the extent to which an instructor assists students in assessing short- and long-term objectives through student conferences and formal as well as informal counseling.

The fifth factor, climate building, included ways in which instructors eliminate learning barriers, propose dialogue, encourage interaction in the classroom, and facilitate student exploration and experimentation related to their self-concept and problem-solving skills via a friendly and informal setting. The sixth factor, participation in the learning process, included the extent to which instructors encourage adult-to-adult relationship building between teacher and student, involve students in developing criteria for assessing classroom performance, and allow students to determine the nature of content material.

The seventh and final factor, flexibility for personal development, included the extent to which an instructor facilitates learning as opposed to being a provider of knowledge to students, the instructor’s level of rigidity and sensitivity to students, and openness to adjusting the classroom environment and curriculum to meet the changing needs of students.

The Adapted Principles of Adult Learning Styles (APALS) was adapted from the Principles of Adult Learning Scale (PALS) and measured student perceptions of their instructors’ teaching styles (McCollin, 1998). Results indicated a learner-centered approach was positive for students. Additionally, findings suggested that the PALS may not be applicable in a higher education research setting because of a strong preference for teacher-centered instructional methods in postsecondary education (McCollin, 1998, p. 110).

Perrin (2000) created an instrument as part of his doctoral study that examined the extent to which adults prefer educators who subscribe to an andragogical teaching style and the extent to which andragogy adequately reflects the learning characteristics of adults. The study resulted in the creation of a seven-item, self-report instrument that was derived “directly from Knowles’ 1984 final statements of descriptions of adult learners” (p. 10). The study’s findings supported only a few of the seven adult learner assumptions, among them a desire for self-directed learning and skill enhancement. Unfortunately, this instrument had no demonstrated psychometric validity.

Examining previous instruments used in adult learning research makes it evident that the field continues to fall short of developing a psychometrically sound instrument that directly measures the six andragogical principles and eight process design elements. The inability to isolate andragogical elements in a single instrument remains problematic and thus hampers efforts to improve adult learning. Without a valid measure of andragogy, it is impossible to conduct predictive studies, and the theory of andragogy remains at a philosophical and theoretical level. Furthermore, without rigorously designed studies, the field of adult education will continue to rely on intuition and anecdotal evidence rather than empirical foundations for critical curriculum design and instructional delivery strategies.

More recently, an Online Adult Learning Inventory was developed to measure andragogical principles in Web-based instruction (Colton & Hatcher, 2004). This study used an innovative Delphi research process to address adult learning principles. Although the results of the study were an excellent contribution to the field, they were not designed to measure andragogy alone.

Computer-based instructional design and delivery will become more commonplace in HRD in the 21st century. Therefore the development of valid measurement instruments that examine the theory of andragogy specifically in computer-based adult learning settings will become more necessary in the field.

However, such instruments will need to be developed that isolate and measure andragogy in terms of both principles and process design elements.

Method

Andragogical assumptions imply that adults differ from children in educational settings. The theory of andragogy is grounded in six distinct adult learner principles or assumptions, as detailed in Table 1; all six principles were included in this study.

In addition to the six andragogical assumptions or principles, the theory posits the importance of designing an educational experience specific to the adult learner’s needs. These eight process design elements, all key components in the development of classroom strategies, are shown in Table 2.

Unfortunately, andragogical process design elements have attracted far less attention in the literature, compared to andragogical principles, even though they may exert as much influence on adult learning outcomes as andragogical principles. Therefore, an examination of andragogical process design elements coupled with andragogical principles was timely and warranted in the present study.

Sampling Strategy. The convenience sample used in the study included graduate students enrolled at a large for-profit, private university with campuses located throughout the United States. The university’s aim was to cater to the adult postsecondary student. Students in the sample were enrolled in one of five core MBA courses: (1) organizational behavior, (2) business law, (3) business management, (4) economics, or (5) finance.

Twenty-one percent of the university’s entire population, or approximately 40,000 students, were enrolled in the university’s MBA program in 33 states, Puerto Rico, Canada, and one European country. Students in the MBA program in Puerto Rico, Canada, and Europe were excluded from the frame on account of the potential influence of cultural and language differences.

Table 1. Six Principles or Assumptions of Andragogy

Assumption or Principle: Andragogical Approach to Learning
Concept of the learner, level of self-directedness: Increasingly self-directed
Readiness to learn: Develops from life tasks and problems
Experience: A rich resource for learning by self and others
Orientation: Task- or problem-centered
Motivation: Internal incentives, curiosity
Need to know: Learner’s perception of the what and why of learning is important to the overall learning experience

It would have been more advantageous to draw a random sample of individual students enrolled in the university’s MBA program, but logistically it was impossible to impose random sampling techniques within intact classes. However, the sample was stratified by campus age. Newer campuses were expected to have less-experienced instructors who might be less likely or able to exhibit andragogical behaviors in the classroom, which would affect student outcomes.

Campus age strata were (1) mature, (2) established, and (3) new.

The mature campus identifier signified that the campus was established between 1978 and 1997. The established campus identifier signified that the campus was established between 1998 and 2001. The new campus identifier signified that the campus was established between 2002 and 2004. A frame obtained by the researcher indicated a total of 17 mature campuses, 15 established campuses, and 14 new campuses. The researcher randomly selected campuses for the study using a random number generator.
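The selection procedure described above, a fixed number of campuses drawn at random within each age stratum, can be sketched in a few lines. The frame sizes (17, 15, and 14 campuses) come from the study; the function name and campus labels are illustrative, not the researchers' actual procedure.

```python
import random

# Hypothetical frame matching the study's strata: 17 mature, 15 established, 14 new campuses.
frame = {
    "mature": [f"mature_{i}" for i in range(1, 18)],
    "established": [f"established_{i}" for i in range(1, 16)],
    "new": [f"new_{i}" for i in range(1, 15)],
}

def select_campuses(frame, per_stratum, seed=42):
    """Randomly select a fixed number of campuses from each age stratum."""
    rng = random.Random(seed)
    return {stratum: rng.sample(campuses, per_stratum)
            for stratum, campuses in frame.items()}

selection = select_campuses(frame, per_stratum=3)  # three per stratum, nine in total
```

Stratifying before drawing guarantees each campus-age group is represented, which simple random selection over the pooled 46 campuses would not.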

Originally, the study's campus-age design was to include nine campus locations, three per campus-age category. However, during the sampling process it was discovered that campuses identified as new would not produce enough class options because of their smaller student populations and the fewer courses offered during the study's timetable. The resulting lack of classes would have reduced the study's sample size and its statistical power.

Therefore, two additional campuses classified as new were included in the study.

Table 2. Eight Process Design Elements of Andragogy

Process Design Element: Andragogical Approach to Learning
Preparing the learner: Supply information, prepare students for participation, develop realistic expectations, begin thinking about content
Climate: Relaxed, trusting, mutually respectful, informal, collaborative, supportive
Planning: Mutually by learners and facilitator
Diagnosis of needs: By mutual assessment
Setting of objectives: By mutual negotiation
Designing learning plans: Learning contracts, learning projects, sequenced by readiness
Learning activities: Inquiry projects, independent study, experiential techniques
Evaluation: By learner-collected evidence validated by peers, facilitator, experts; criterion-referenced

According to Hair et al. (2005), sample size is "perhaps the most influential single element under the control of the researcher, which has the most direct impact on the statistical power of the multiple regression" (p. 164). Statistical power provides a rational basis for sample size (Locke, Silverman, & Spirduso, 1998). Hair et al. (2005) suggested adherence to a 5:1 ratio rule of at least five observations for each instrument item being factored, with 10:1 being preferable. A total of 404 students were included in the sample, which exceeded the power criterion.
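The ratio guideline is easy to verify against the study's numbers. The helper below is an illustrative sketch, with the two item counts taken from the API's sections (35 principle items and 42 process design items, factored separately).

```python
def obs_per_item(n_respondents, n_items):
    """Ratio of observations to instrument items being factored."""
    return n_respondents / n_items

n = 404                  # students in the sample
principles_items = 35    # Section 1: andragogical principles
process_items = 42       # Section 2: process design elements

# Hair et al.'s (2005) guideline: at least 5 observations per item, 10 preferred.
assert obs_per_item(n, principles_items) > 10   # roughly 11.5 observations per item
assert obs_per_item(n, process_items) > 5       # roughly 9.6 observations per item
```

Both factored item sets clear the 5:1 minimum, and the principles set clears the preferred 10:1 ratio as well.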

The Andragogical Practices Inventory. The andragogical measurement instrument created for this study was entitled the Andragogical Practices Inventory (API). The design process for the API included, in order: (1) a thorough review of instruments from past research, (2) development of a survey item pool based on specific andragogical principles and design elements, (3) development of a draft API, (4) review of the API by a panel of experts to establish content validity, (5) revision of the survey based on the panel review, (6) finalization of the survey instrument, (7) use of the instrument in data collection, and (8) statistical analysis. The process is outlined in greater detail below.

The first step in the API creation process was a thorough review of the literature. As reported above, the literature review revealed no available instrument that had successfully measured the constructs in the theory of andragogy.

Although the theory of andragogy posits eight process design elements, the researcher determined that in a postsecondary educational setting mutual planning could not be measured. Planning for learning at the university in this study occurs at the organization level and is performed by central administration personnel, including the dean and curriculum developers. Therefore, this process design element was eliminated from the survey, and the instrument developed for this study attempted to measure only seven of the eight process design elements.

The second step was to draft the initial instrument. The authors worked as a team to analyze each andragogical construct and draft items for each one.

The third step in the survey design process was review by a four-person panel of Ph.D. holders, chosen as the best technique for establishing content validity. All four panel members were considered experts in the field of adult education and had published in the area of adult learning. In addition to their intimate knowledge of the theory of andragogy, all panel members understood research rigor as it pertains to measurement instrument creation. The panel members individually reviewed the instrument and provided feedback on the items, which was incorporated in the final version of the API. Comments were solicited and submitted electronically from all panel members during the study. In particular, the panel of experts was asked to respond to six validity questions: (1) In your opinion, does the API adequately incorporate the two construct domains of the theory of andragogy (andragogical principles and andragogical process design elements)? (2) Do the instrument's items adequately describe the content of each of andragogy's dimensions? (3) What changes would you make to the test items? (4) Is the rating scale appropriate for the items being measured? (5) Are there other changes you would make to the instrument? (6) In your opinion, are the test items clearly written? The panel indicated that the instrument was valid, and modifications were minor (mostly semantic rather than content changes).

The final version of the API was a six-page, 77-item, Likert-scale measurement instrument. The 77-item survey solicited student responses in two areas: (Section 1) agreement with andragogical principles, for a total of 35 items, and (Section 2) perception of the instructor's andragogical behaviors and learning design process, for a total of 42 items. Student responses were rated as "strongly agree," "agree," "neither agree nor disagree," "disagree," and "strongly disagree." The API included both positively and negatively worded items. Participants were asked to mark responses to the best of their ability.

Data Collection and Tracking. All data collection took place during the final class of the course. Assessment of student perceptions of andragogy via the API took place at the beginning of the final workshop. The university's provost supported this research project and personally asked for assistance from each campus selected for the study. His letters to the academic affairs representative at each campus, along with his letter to the faculty teaching the classes, were key components of the overall success of the study. His endorsement of the study and solicitation of support resulted in total participation at all 11 campuses, including all of the administrative staff and faculty involved.

Campus personnel and faculty played a major role in the study’s success; they were critical to the data collection process.

Students in each of the 36 intact classes were informed that their class had been selected to take part in a universitywide research project. They were informed only on the day of data collection so as not to bias the results. A letter explaining the project and the university's support of the study was attached to the research instruments.

Data Analysis Procedures. Cone and Foster (1993) noted that factor analysis is useful for summarizing patterns of correlations among a set of variables and for reducing survey items to homogeneous subscales in order to examine between-group differences. Common factor analysis using oblique rotation was employed to establish construct validity, and an eigenvalue threshold of 1 was used to determine retention of factors. Items with a cross-loading of .30 or greater were eliminated. The Kaiser-Meyer-Olkin (KMO) Measure of Sampling Adequacy and Bartlett's Test of Sphericity were used to assess the suitability of the data for factor analysis.
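Both diagnostics have closed-form definitions, sketched below in numpy/scipy; the function names and the simulated call in the test are illustrative, not the authors' code. Bartlett's statistic is χ² = −(n − 1 − (2p + 5)/6)·ln|R| with p(p − 1)/2 degrees of freedom, and the KMO index compares squared zero-order correlations with squared anti-image (partial) correlations.

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data):
    """Bartlett's test that the correlation matrix is an identity matrix.

    Returns the chi-square statistic and its p-value; a small p-value
    indicates the items are correlated enough to factor analyze.
    """
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    dof = p * (p - 1) / 2
    return chi2, stats.chi2.sf(chi2, dof)

def kmo(data):
    """Kaiser-Meyer-Olkin measure of sampling adequacy (0 to 1; higher is better)."""
    corr = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(corr)
    # Anti-image (partial) correlations derived from the inverse correlation matrix.
    partial = -inv / np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    np.fill_diagonal(corr, 0)       # keep only off-diagonal correlations
    np.fill_diagonal(partial, 0)    # keep only off-diagonal partials
    r2, a2 = (corr ** 2).sum(), (partial ** 2).sum()
    return r2 / (r2 + a2)
```

When items share strong common variance, partial correlations shrink relative to zero-order correlations, pushing KMO toward 1, which is the pattern the study's values of .975 and .950 reflect.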

Results

First, sample descriptive statistics are discussed. Second, results of the factor analysis are reviewed.

Sample Descriptive Statistics. This study examined graduate students enrolled in an MBA program in one of five core MBA courses: organizational behavior, business law, business management, economics, or finance. A total of 36 intact class groups, dispersed among 11 campus locations throughout the university's system, were included in the study. The final sample comprised 404 graduate students who completed the survey instrument.

Student Descriptives. MBA students involved in the study were asked to complete a self-report, voluntary demographic profile. Descriptive statistics for participants in the study are shown in Table 3.

The institution involved in the study provided the researcher with statistical data on three student characteristics that are routinely measured. The students in this study were, on average, the same age as the university average. However, the sample had a higher percentage of males (60% versus 44%) and a substantially lower percentage of Caucasians (26% versus 61%).

Factor Analysis. The items for the six andragogical principles were factor analyzed separately from the andragogical process design elements because they represented two distinct construct domains. Factors with eigenvalues greater than 1 were retained. Items that were either substantially cross-loaded or that exhibited low loadings of .30 or less were excluded (note that the factor loading tables herein show the loadings for all items, not just those retained). The KMO Measure of Sampling Adequacy indicated that factor analysis was appropriate for both the andragogical principle items (KMO = .975) and the andragogical process design elements (KMO = .950). Bartlett's Test of Sphericity was highly significant (p < .001) for both sets of items, which also indicated the appropriateness of the data for factor analysis.

The final results of the statistical analysis reduced the number of items from the original 77 (35 for andragogical principles and 42 for andragogical process design elements) to 43 (21 for andragogical principles and 22 for andragogical process design elements).

Of the theory’s original six andragogical principles examined, five factors emerged explaining 60.6% of the variance (see Table 4).
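The percentages in Table 4 follow directly from the eigenvalues once total variance is taken as the 35 standardized items, as this short check (illustrative, not the authors' code) shows.

```python
# Eigenvalues for the five retained principle factors, as reported in Table 4;
# 35 standardized items were factored, so total variance is 35.
eigenvalues = [15.691, 1.633, 1.513, 1.257, 1.103]
n_items = 35

pct = [100 * ev / n_items for ev in eigenvalues]
cumulative = sum(pct)
# pct[0] is roughly 44.83 for Motivation; the cumulative total is roughly 60.56,
# matching Table 4's reported 60.6% of variance explained.
```

The first factor alone accounts for nearly three quarters of the total explained variance, which foreshadows the dominance of the Motivation factor discussed below.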

Results indicated that the motivation factor was by far the strongest, alone accounting for 44.8% of the variance. One factor, Orientation to Learning, was not retained because it contained only one item, which was cross-loaded. The complete factor loadings for andragogical principles are shown in Table 5.

The study attempted to measure seven of the eight andragogical process design elements. Findings revealed that six factors emerged. The one construct not to emerge from the data was Diagnosis of Learning Needs. The six factors explained 63.4% of the variance (see Table 6).

The complete factor loadings for andragogical process design elements are shown in Table 7. The researcher was concerned that two factors contained only two items each. However, because this was the first test of the instrument, a decision was made to retain the factors.

Cronbach's alpha was employed to establish scale reliability. All but two of the scales exhibited strong initial reliabilities. Two of the scales, Readiness and Learning Activities, contained reverse items that appeared to have contributed to weak reliability. The Readiness scale contained one negative item, and deletion of that reverse item improved the scale's reliability substantially, from .401 to .811; the item was therefore deleted from the Readiness scale. The reliability of Learning Activities was .682, slightly below the desired threshold of .70. Because there were only two items in the factor, deletion of an item was not an option. There was speculation that the reverse wording of both items could have contributed to the reliability statistics. However, this study was the first to evaluate the validity and reliability of the API and the items seemed appropriate, so the scale was retained in this first test of the questionnaire.

Table 3. Student Characteristics in the Sample

Age: Mean 35
Gender: Male 60%; Female 40%
Ethnicity: Caucasian 26%; African American 35%; Hispanic 9%; Asian 16%; Other 14%
Number of courses completed in current MBA program: Mean 6
Number of years between undergraduate and graduate school: Mean 7
Undergraduate degree or program of study: Business 42%; Engineering/computer science 16%; Social science 15%; Law/political science 8%; Health/sciences 13%; Education 1%; Other 5%
Current job: Business 75%; Government 3%; Education 4%; Law/security 3%; Health care 8%; Other 7%
Current job related to course: 33% yes; 67% no
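The reliability computation and the reverse-scoring step behind these results can be sketched as follows; the function names are illustrative, not the authors' code.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def reverse_score(item, scale_max=5, scale_min=1):
    """Recode a negatively worded Likert item (e.g., 5 becomes 1, 4 becomes 2)."""
    return scale_max + scale_min - np.asarray(item)
```

Left un-recoded, a reverse-worded item correlates negatively with its scale-mates and drags alpha down; recoding (or deleting) it restores the scale's internal consistency, the same pattern seen in the Readiness scale's jump from .401 to .811.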

Sample andragogical principle survey items included in each scale along with scale reliability statistics are described in Table 8.

Sample andragogical process design survey items along with scale reliability statistics are detailed in Table 9.

Conclusions and Discussion

Because andragogy has not been adequately tested, the adult education community has continued to question the unequivocal adoption of andragogy without a clear explanation as to how it affects learning (Merriam & Brockett, 1997). The one-size-fits-all adult learner approach has been challenged (Pratt, 2002), and it has been suggested that it may be next to impossible for one overarching theory of adult learning to emerge that is applicable to all adult learning situations (Merriam, 1987).

This study's research question asked, Could an instrument with sound psychometric qualities be developed that is valid and reliable and that measures an instructor's andragogical behaviors on the basis of the six principles and the eight process elements of andragogy? Although only five of the six andragogical principles were uncovered, and only six of the seven andragogical process design elements examined in the study were extracted, this study was more successful than any previous study in measuring andragogical constructs (principles and process design elements). The findings presented here illustrate that most of the theory's constructs were effectively captured and measured in the instrument.

Table 4. Variance Explained by Retained Factors for Andragogical Principles

Factor: Eigenvalue / Percent of Variance Explained / Cumulative Percentage
Motivation: 15.691 / 44.831 / 44.831
Experience: 1.633 / 4.667 / 49.498
Need to know: 1.513 / 4.324 / 53.822
Readiness: 1.257 / 3.590 / 57.413
Self-Directedness: 1.103 / 3.150 / 60.563

Conti (1978) noted that a key prerequisite to growing the body of knowledge in the field of adult education was development of measurement instruments.

Table 5. Andragogical Principles Factor Loadings

Question No. / Loadings on Factors 1–6
32 .840
34 .841
35 .786
31 .780
29 .751
33 .668
27 .662
21 .580
17 .530 .239
25 .525 .294
28 .493
11 .248
18 .841 .226
19 .837
30 .778
22 .754 .289
20 .501 .318
23 .470 .313
16 .214 .411
24 .228 .409 .270
6 .618
10 .546
5 .432
9 .409 .230
13 .640
7 .360 .621
14 .230 .578
8 .411 .470
15 .261 .219 .468
4 .225 .415 .253
12 .339 .370
2 .933
3 .366 .466
1 .264 .403
26 .274 .216 .604

The Andragogical Practices Inventory (API) created in this study was the first instrument with sound psychometric qualities to successfully measure most of the andragogical principles and process design elements. Therefore, its creation and subsequent availability for future research should be considered a significant advancement for the field of adult education.

Regarding andragogical principles, results of the factor analysis indicated that the API measured five andragogical principles. Two of the principles, motivation and orientation to learning, factored together; this factor was labeled Motivation. The Motivation scale was highly reliable, with a Cronbach's alpha of .933. The Experience scale was also highly reliable, with a Cronbach's alpha of .839. The scales for the three remaining principles, (1) need to know, (2) readiness, and (3) self-directedness, were somewhat weaker but still usable and showed promise for future research: need to know had a Cronbach's alpha of .760, readiness .811, and self-directedness .739.

The reduction of andragogical principles from the original six presented by Knowles (Knowles & Associates, 1984) to the five indicated in this study was interesting. When designing the instrument, the researchers speculated that students might be unable to differentiate between the motivation and orientation-to-learning constructs. On the one hand, it is possible that the two principles factored together because of the instrument's inability to effectively differentiate them. On the other hand, it is possible that the theory does not support six distinct principles. In fact, the andragogical literature notes that adults are most motivated to learn when learning addresses their life and work needs (their orientation to learning). Thus it may well be that the motivation and orientation-to-learning principles should be combined in the theory. Future research involving another attempt to design appropriate measurement scales is needed to establish whether theory modification is warranted.

Table 6. Variance Explained by Retained Factors for Andragogical Design Elements

Variable: Eigenvalue / Percentage of Variance Explained / Cumulative Percentage
Setting of learning objectives: 17.381 / 41.384 / 41.384
Climate setting: 3.140 / 7.477 / 48.861
Evaluation: 1.817 / 4.326 / 53.187
Preparing the learner: 1.527 / 3.636 / 56.823
Designing the learning experience: 1.481 / 3.525 / 60.348
Learning activities: 1.290 / 3.071 / 63.419

Table 7. Andragogical Process Design Elements Factor Loadings

Question No. / Loadings on Factors 1–7
52 .944
57 .907
51 .755 .214
65 .750
56 .745
53 .641 .214
66 .612 .208
61 .575 .456
62 .545 .436
72 .545 .454
71 .472 .243
54 .449
67 .327 .251
63 .312 .214 .233
42 .316 .932 .237
44 .822
48 .789
50 .718
45 .710
47 .637
43 .632
46 .233 .603 .328
36 .575
49 .378 .402 .211
77 .362 .368 .216
41 .236 .243
75 .885
74 .871
76 .828
73 .245 .524
38 .779
39 .617
40 .609
37 .362 .531
64 .211 .295
59 .970
55 .952
69 .731
70 .694
68 .345 .408 .629
60 .315 .376
58 .349

Eight andragogical process design elements are included in the theory of andragogy. However, as discussed earlier, mutual planning was eliminated from this study because of the learning setting and students' inability to participate in planning activities. Therefore, the factor analysis attempted to measure seven andragogical process design elements and extracted six: (1) setting of learning objectives, (2) climate setting, (3) evaluation, (4) preparing the learner, (5) designing the learning experience, and (6) learning activities. The only process design element not extracted was diagnosis of learning needs. The failure of a scale to emerge for diagnosing learning needs presents future research opportunities to either find items that effectively differentiate the construct or investigate whether the construct in the theory is valid.

Additionally, Learning Activities, with a Cronbach's alpha of .682, came in slightly under the normally accepted threshold of .700 and could have been discarded from the study. However, the researcher retained it because this was the first testing of the API.

An examination of the study's results indicated a potential problem with reverse-coded items. For instance, Readiness contained one reverse item; after that item was discarded from the scale, Cronbach's alpha increased from .401 to .811, well above the .70 reliability threshold. Additionally, the scale for Learning Activities contained only two items, both reverse-coded, a concern for the researchers. This raises the possibility of a measurement artifact that must be closely investigated in future research.

Table 8. Scale Descriptives of Andragogical Principles

Motivation (α = .933)
31) This learning experience tapped into my inner drive to learn.
33) This learning experience motivated me to give it my best effort.

Experience (α = .839)
18) I felt my prior life and work experiences helped my learning.
19) My life and work experiences were a regular part of the learning experience.

Need to Know (α = .760)
5) It was clear to me why I needed to participate in this learning experience.
9) The life or work issues that drove me to this learning experience were understood.

Readiness (α = .811)
14) The life or work issues that motivated me for this learning experience were respected.
15) This learning experience was just what I needed given the changes in my life or work.

Self-Directedness (α = .739)
2) I was satisfied with the extent to which I was an active partner in this learning experience.
3) I felt I had control over my learning in this learning experience.

In sum, previous instrument development attempts had failed to fully isolate and measure andragogical constructs (Christian, 1982; Hadley, 1975; Kerwin, 1979; Knowles, 1987; Perrin, 2000; Suanmali, 1981). The major contribution of this study is that the API more successfully isolated and measured andragogy's principles and process design elements than any previous study.

Clearly this instrument will require further development, although it is a significant advancement even in its current state. We believe that with only modest enhancement the API can be a sound measurement tool to empirically test andragogy and subsequently give HRD practitioners significant findings so that the appropriateness and applicability of andragogy in their learning settings can be evaluated.

Table 9. Scale Descriptives of Andragogical Process Design Elements

Setting of learning objectives (α = .903)
51) The facilitator/instructor and the learners negotiated the learning objectives.
52) Learners were encouraged to set their own individual learning objectives.

Climate setting (α = .910)
47) The climate in this learning experience can be described as collaborative.
50) The facilitator/instructor developed strong rapport with the learners in this learning experience.

Evaluation (α = .863)
74) The methods used to evaluate my learning in this learning experience were appropriate.
75) Evaluation methods used during this learning experience met my needs.

Preparing the learner (α = .875)
38) Sufficient steps were taken to prepare me for the learning process.
39) The way learner responsibilities were clarified was appropriate for this learning experience.

Designing the learning experience (α = .943)
59) There were mechanisms in place to collaboratively design which learning activities would be used.
55) Assessment tools were used that helped the facilitator and me work together to identify my learning needs.

Learning activities (α = .682)
69) The facilitator/instructor relied too heavily on lecture during the learning experience.
70) The way the learning experience was conducted made learners passive learners.

Future Research. First, future research should strive to find ways to strengthen the API's scales, particularly by enhancing the reliability of the scales for experience, need to know, readiness, and self-directedness. The construct orientation to learning, which factored with motivation, should be revisited and its survey items amended to further investigate whether the construct exists.

Reliability statistics from the andragogical process design element scales were more encouraging. The researcher's major concern was the lower-than-expected reliability of the learning activities scale (α = .682). Findings demonstrate that there is room to strengthen the API and, in particular, to improve the reliability of the learning activities scale.

Future research using and refining the API is recommended. One possible area of refinement is word choice. The wording of the items was constructed to effectively capture student attitudinal and behavioral perceptions of andragogical principles and process design elements across an array of adult learning settings. However, it is plausible that students in postsecondary educational settings expect language and terminology consistent with that used in collegiate educational settings. For example, the term learning experience could have been misunderstood by students expecting the term classroom on the survey instrument. Additionally, the term professor was eliminated from the survey instrument, in part because the theory of andragogy adopted the term facilitator as more appropriate for adult learning settings. Even though faculty at the university in this study are discouraged from using the terms professor and doctor, subjects taking the API could have been confused by the instrument's wording.

Having a measurement instrument opens the door to an array of empirical research to strengthen the empirical research base on andragogy. Perhaps most important are the predictive studies that can now be conducted to examine the effect of andragogical practices on critical adult learning outcomes such as learning and student satisfaction. In addition, this instrument opens the door to testing more complex structural models that examine the process by which andragogy affects outcomes. As stated earlier, andragogy has suffered as a theory because of the lack of empirical tests. The API and its future iterations are a promising first step toward clearer validation of andragogy.

Implications for Practice. Having a valid measurement instrument is vitally important for the practice of HRD as well as for the HRD and adult learning research community. Indeed, we reject the notion that measurement instruments are just for researchers. Inventories are widely used in adult education and human resource development. We have always argued that any practitioner using an inventory has an obligation to select those that have established evidence of validity, so that the field of HRD moves away from instructional practices based solely on "an act of educational faith rather than an act of educational science" (Davenport, 1984, p. 10).

At one level, we would argue that simply enabling researchers to empirically study andragogy would be a tremendous aid to practitioners. We suspect that most if not all adult educators have been trained in andragogy even though the empirical evidence is weak. It seems only logical that practitioners should prefer an approach to adult learning that has been validated by research. Thus, the fact that the API will enable a strong research evidence base for andragogy to develop should represent enough value for it to be respected by practitioners.

On top of this, though, we envision the API as a tool for practitioners to use in their learning settings. The API is a way for practitioners to assess whether students perceive them as adhering to the principles of andragogy.

Every HRD practitioner who is trying to implement andragogical approaches should want feedback from students as to whether he or she is being successful as a facilitator, and whether student achievement and satisfaction result from the learning experience. The andragogical approach demands that teachers enter into a relationship with their students, with frequent opportunities for student input and dialogue. The API can thus be used by HRD practitioners to diagnose the learning climate, create points of discussion with students, and give teachers a means to self-evaluate their teaching, just as andragogy advocates.

In summary, then, the API has the potential to significantly advance the science and practice of andragogy. It may well be the key that enables future generations of adult education researchers and HRD practitioners to unlock the deeper secrets of andragogy. Whether andragogy is proven or rejected is not the point; what matters is having the tool to answer the question. The API is this tool, and we are committed to an extended research program and collaboration with HRD practitioners to build the tool so that the questions about andragogy can be answered.

References Beder, H., & Carrea, N. (1988). The effects of andragogical teacher training on adult students’ attendance and evaluation of their teachers. Adult Education Quarterly,38(2), 75–87.

Brookfield, S. D. (1986). Understanding and facilitating adult learning. San Francisco: Jossey-Bass.

Christian, A. (1982). A comparative study of the andragogical-pedagogical orientation of mili- tary and civilian personnel. Ann Arbor: Dissertations Abstract International (UMI No.

8315684).

Colton, S., & Hatcher, T. (2004). The development of a research instrument to analyze the applica- tion of adult learning principles to online learning.Academy of Human Resource Development International Research Conference, Austin, TX, February 29–March 4, 2004.

Cone, J. D., & Foster, S. L. (1993). Dissertation and theses from start to finish: Psychology and related fields. Washington, DC: American Psychological Association.

Conti, G. J. (1978). Principles of adult learning scale: An instrument for measuring teacher behav- ior related to collaborative teaching-learning mode. Ann Arbor: Dissertations Abstract International (UMI No. 7912479).

Toward Development of a Generalized Instrument to Measure Andragogy 191 HUMAN RESOURCE DEVELOPMENT QUARTERLY • DOI: 10.1002/hrdq 192 Holton, Wilson, Bates Darkenwald, G., & Merriam, S. B. (1982). Adult education: Foundations of practice. New York:

Harper & Row.

Davenport, J. (1984, Spring). Adult educators and andragogical-pedagogical orientations: A review of the literature. MPAEA Journal.

Davenport, J., & Davenport, J. (1985). A chronology and analysis of the andragogy debate. Adult Education Quarterly,35,152–159.

Espinoza, L. D. (2001). Advising Learning Method of Andragogy (ALMA): Or university soul.

Ed.D. dissertation, The University of Arizona, United States – Arizona. Retrieved May 7, 2009, from Dissertations & Theses: Full Text database. (Publication No. AAT 3010259).

Feuer, D., & Gerber, B. (1988). Uh-oh: Second thoughts about adult learning theory. Training, 25(12), 31–39.

Hadley, H. (1975). Development of an instrument to determine adult educators’ orientation:

Andragogical or pedagogical. Ann Arbor: Dissertation Abstracts International (UMI No. 75- 12, 228).

Hair, J. F., Black, B., Babin, B., Anderson, R. E., & Tatham, R. L. (2005). Multivariate data analysis (6th ed.). Upper Saddle River, NJ: Prentice-Hall.

Kerwin, M. (1979). The relationship of selected factors to the educational orientation of andragogically- and pedagogically-oriented educators teaching in four of North Carolina's two-year colleges. Ann Arbor: Dissertation Abstracts International.

Knowles, M. S. (1980). The modern practice of adult education: From pedagogy to andragogy. New York: Cambridge Books.

Knowles, M. S. (1987). Personal HRD Style Inventory. King of Prussia, PA: Organizational Design and Development.

Knowles, M. S. (1990). The adult learner: A neglected species. Houston: Gulf.

Knowles, M. S., & Associates (1984). Andragogy in action: Applying modern principles of adult learning. San Francisco: Jossey-Bass.

Knowles, M. S., Holton, E. F., & Swanson, R. A. (1998). The adult learner: The definitive classic in adult education and human resource development. Houston: Gulf.

Lawson, G. (1997). New paradigms in adult education. Adult Learning, 8(3), 10.

Locke, L. F., Silverman, S. J., & Spirduso, W. W. (1998). Reading and understanding research. Thousand Oaks, CA: Sage.

Long, H., Hiemstra, R., & Associates (1980). Changing approach to studying adult education. San Francisco: Jossey-Bass.

McCollin, E. D. (1998). Faculty and student perceptions of teaching styles: Do teaching styles differ for traditional and nontraditional students? Ann Arbor: Dissertation Abstracts International (UMI No. 9901343).

Merriam, S. B. (1987, Summer). Adult learning and theory building. Adult Education Quarterly, 37(4), 187–198.

Merriam, S. B., & Brockett, R. (1997). The profession and practice of adult education: An introduction. San Francisco: Jossey-Bass.

Merriam, S. B., & Caffarella, R. S. (1991). Learning in adulthood: A comprehensive guide. San Francisco: Jossey-Bass.

Perrin, A. L. (2000). The fit between adult learner preferences and the theories of Malcolm Knowles. Ann Arbor: Dissertation Abstracts International (UMI No. 9998105).

Pratt, D. D. (1998). Andragogy as a relational construct. Adult Education Quarterly, 32(3), 160–181.

Pratt, D. D. (2002, Spring). Good teaching: One size fits all? New Directions for Adult and Continuing Education, 93.

Rachal, J. R. (2002). Andragogy's detectives: A critique of the present and a proposal for the future. Adult Education Quarterly, 52(3), 210–227.

Strawbridge, W. G. (1994). The effectiveness of andragogical instruction as compared with traditional instruction in introductory philosophy course. Ann Arbor: Dissertation Abstracts International (UMI No. 9509004).

Suanmali, C. (1981). The core concepts of andragogy. Ann Arbor: Dissertation Abstracts International (UMI No. 8207343).

Wedeking, L. (2000). The learning styles of public health nurses. (UMI No. 995507).

Weinstein, M. (2002). Adult learning principles and concepts in the workplace: Implications for training in HRD. AHRD 2002 Proceedings.

Williams, H. (2001). A critical review of research and statistical methodologies within Human Resource Development Quarterly, Academy of Management Journal, and Personnel Psychology, 1995–1999. In O. Aliaga (Ed.), AHRD 2001 Conference Proceedings.

Young, S., & Shaw, D. (1999). Profiles of effective college and university teachers. Journal of Higher Education, 70(6), 670–686.

Elwood F. Holton III is the Jones S. Davis Distinguished Professor of Human Resource, Leadership and Organization Development at Louisiana State University.

Lynda Swanson Wilson is at University of Phoenix.

Reid A. Bates is Professor of Human Resource and Leadership Development at Louisiana State University.

HUMAN RESOURCE DEVELOPMENT QUARTERLY • DOI: 10.1002/hrdq