
Corporation for National and Community Service
Toolkit for the Evaluation of Service-Learning Programs
Contract # CNSHQ09A0010
September 16, 2011

Prepared for:
Corporation for National and Community Service
1201 New York Avenue, NW
Washington, DC 20525

Submitted by:
Abt Associates Inc.
55 Wheeler St
Cambridge, MA 02138

In Partnership with:
RMC Research Corporation
633 17th Street, Suite 2100
Denver, CO 80202

Dillon-Goodson Research Associates
409 Montgomery Road
Westfield, MA 01085

This document was prepared by Abt Associates Inc., with RMC Research Corporation and Dillon-Goodson Research Associates, for the Corporation for National and Community Service (CNCS), under contract number CNSHQ09A0010.

Corporation for National and Community Service
Office of Strategy and Special Initiatives
September 2011

The mission of the Corporation for National and Community Service (CNCS) is to improve lives, strengthen communities, and foster civic engagement through service and volunteering. Each year, CNCS engages more than four million Americans of all ages and backgrounds in service to meet local needs through three major programs: Senior Corps, AmeriCorps, and Learn and Serve America. CNCS contracted with Abt Associates Inc., an independent and nonpartisan research firm, to produce this document.

This document is in the public domain. Authorization to reproduce it in whole or in part is granted. The suggested citation is: Caswell, L., Billig, S., Goodson, B., Gan, K., Levin, M., and Unlu, F. Toolkit for the Evaluation of Service-Learning Programs. Prepared for the Corporation for National and Community Service. Cambridge, MA: Abt Associates Inc.

Upon request, this material will be made available in alternative formats for people with disabilities.

Toolkit for the Evaluation of Service-Learning Programs

Table of Contents

1. Introduction to the Toolkit
   1.1 Contents of the Toolkit
      1.1.1 General Evaluation Guidelines for Service-Learning
      1.1.2 Developing a Rigorous Evaluation Design for Service-Learning
      1.1.3 Instruments and Recruitment Materials Developed for the National Evaluation of School-based Learn and Serve America Programs
      1.1.4 Annotated Bibliography Based on Literature Reviews Conducted for the National Evaluation
   1.2 Users of the Toolkit
2. General Evaluation Guidelines
   2.1 Introduction
   2.2 Why Evaluate Service-Learning Projects?
   2.3 Characteristics of Effective Evaluations
   2.4 Evaluation Questions
   2.5 Using a Logic Model as a Guide
   2.6 Evaluation Designs
      2.6.1 Experimental Designs
      2.6.2 Quasi-Experimental Designs
      2.6.3 Pre/Post Designs
      2.6.4 Case Studies
   2.7 Data Collection Methods
      2.7.1 Surveys
      2.7.2 Interviews
      2.7.3 Focus Groups
      2.7.4 Observations
      2.7.5 Secondary Analysis of Existing Data
      2.7.6 Knowledge Assessments
      2.7.7 Multi-Method Approaches
      2.7.8 Data Collection Procedures
   2.8 Sample Survey Subscales
      2.8.1 Measure of Responsibility for Community Issues and Social Problems: Social Responsibility
      2.8.2 Measure of Responsibility for Community Issues and Social Problems: Neighborhood Obligations
      2.8.3 Measure of Personal Efficacy and Empowerment
      2.8.4 Measure of Sense of Belonging to School
      2.8.5 Measure of Academic Engagement
      2.8.6 Measure of School Engagement
   2.9 Sampling
   2.10 Human Subjects Protection
   2.11 Data Analysis
   2.12 Drawing Conclusions
   2.13 Elements of a High-Quality Report
   2.14 Using Evaluation Results for Improvement
   2.15 Evaluation Resources
      2.15.1 Evaluation Toolkits
      2.15.2 Methods
3. Developing a Rigorous Evaluation Design for Service-Learning: Alternatives and Considerations
   3.1 Introduction
   3.2 Random Assignment Options
      3.2.1 Considering the Level of Random Assignment
      3.2.2 Random Assignment of Schools
      3.2.3 Random Assignment of Teachers within Schools
      3.2.4 Random Assignment of Classes within Teachers
      3.2.5 Random Assignment of Students to Teachers
   3.3 Considerations for Each Design Option
      3.3.1 Primary Research Question Answered by Each Design Option
      3.3.2 Feasibility of Recruitment for Each Design Option
      3.3.3 Power Associated with Each Design Option
   3.4 General Considerations for All Design Options
      3.4.1 Aligning Data Sources, Analytic Approach, and Outcomes with Research Questions
      3.4.2 Developing a Multiple Comparison Strategy
      3.4.3 Methods to Maximize Response Rates and Deal with Issues of Nonresponse
4. Instruments and Recruitment Materials Developed for the National Evaluation of School-based Learn and Serve America Programs
   4.1 Introduction
      4.1.1 Overview of the NELSAP Study Design
   4.2 Instruments to Measure Service-Learning
      4.2.1 Teacher Information Form and Instructions
      4.2.2 Teacher Log and Instructions
      4.2.3 Teacher Interview on Service-Learning Activities in the Classroom
   4.3 Instruments to Measure Students' Academic and Civic Engagement
      4.3.1 Student Survey, Crosswalk and Sources
   4.4 Instruments for Recruitment of Districts, Schools and Teachers
      4.4.1 School Districts: Superintendents and Service-Learning Coordinators
      4.4.2 Presentation of General Study Information
      4.4.3 Introductory Letters and Call Topic Guides
      4.4.4 Memorandum of Understanding (MOU) and Other Documents of Agreement
5. Annotated Bibliography of Literature Reviews for the National Evaluation of School-based Learn and Serve Programs
   5.1 Review of Potential Scales to Measure Students' Academic Achievement and Civic Engagement
   5.2 Review of Potential Moderators for Service-Learning
   5.3 Review of the Impacts of High-Quality Service-Learning
   5.4 Review of Studies that Use Within-Teacher Random Assignment
   5.5 Review of Studies Using Student-Level Random Assignment
   5.6 Review of Characteristics of Effective Teachers
   5.7 References
Glossary
1. Introduction to the Toolkit

From 2008 to 2010, Abt Associates (Abt) and its partners, RMC Research Corporation (RMC) and Dillon-Goodson Research Associates, under contract to the Corporation for National and Community Service (CNCS), designed the National Evaluation of School-based Learn and Serve America (LSA) Programs (NELSAP or "National Evaluation"). CNCS commissioned the National Evaluation to provide a rigorous experimental test of the impacts of LSA grantees' high-quality service-learning (SL) activities on students' academic achievement and civic and academic engagement in core academic subject areas. This would have been the first rigorous impact study of service-learning and was therefore considered to have important policy relevance for the field of service-learning specifically and for youth development more generally. Although changes in funding and research priorities meant that the National Evaluation was not conducted, the substantial work on design and instrumentation for the evaluation represents a contribution to researchers who may conduct research on service-learning in the future.

This Toolkit presents the products that were developed as part of the design and instrumentation work on the National Evaluation. The products in the Toolkit are specific to a design that CNCS presented to the Office of Management and Budget (OMB). It is the hope of CNCS that other researchers will benefit from a discussion of design alternatives for experimental studies and from access to instruments that measure students' academic and civic engagement, implementation measures of service-learning in the classroom, and sample recruitment materials.

1.1 Contents of the Toolkit

Following this Introduction, the Toolkit has four main sections, each authored by different members of the National Evaluation study team:

- General Evaluation Guidelines for Service-Learning (RMC Research Corporation)
- Developing a Rigorous Evaluation Design for Service-Learning (Abt Associates and Dillon-Goodson Research Associates)
- Instruments and Recruitment Materials from the National Evaluation of School-based Learn and Serve America Programs (Abt Associates, RMC Research Corporation, and Dillon-Goodson Research Associates)
- Annotated Bibliography Based on Literature Reviews Conducted for the National Evaluation (Abt Associates and The Center for Information and Research on Civic Learning and Engagement)

A brief description of the contents of each of these four sections is provided below.

1.1.1 General Evaluation Guidelines for Service-Learning

This section provides an overview of the varying objectives for evaluation of a service-learning program and the range of different approaches to evaluation that could be adopted, depending on the research questions to be addressed. This section provides a "walk-through" of the key steps in designing a meaningful evaluation, starting with the development of research questions and a study logic model and ending with using the data for program improvement and to demonstrate impact to external audiences.

1.1.2 Developing a Rigorous Evaluation Design for Service-Learning

This section includes documents that were developed during the design phase of the National Evaluation.
The National Evaluation was intended to have an experimental random assignment design, and these documents discuss several alternative approaches to random assignment. In particular, this section discusses the ramifications of the different designs, including what research question(s) each design is meant to answer, their sample size and power requirements, and implications for recruitment of schools, teachers, and students.

1.1.3 Instruments and Recruitment Materials Developed for the National Evaluation of School-based Learn and Serve America Programs

This section presents the set of instruments and recruitment materials that were developed for the National Evaluation. The instruments include surveys and interview protocols to measure the implementation of service-learning as well as student and classroom outcomes. The recruitment materials include documents for the recruitment of districts, schools, teachers, and students for participation in the study. Although the student and classroom measures were designed for particular use in the National Evaluation, they measure student and classroom outcomes that would be relevant to other research on service-learning. Similarly, although the recruitment materials are particular to the National Evaluation study design, they include materials for the multiple levels of recruitment needed for an evaluation and may be adapted by researchers for other studies. The materials were designed to be informative, easy to understand, and persuasive about the importance of the study.

Exhibit 1.1: Key Design Features of the National Evaluation

A full overview of the National Evaluation is available in Section 4, but key features are presented here for context.

- Broad research question: What is the impact of participation in service-learning activities on students' outcomes?
- Who: 5,660 students in the 9th or 10th grade nested within 139 teachers. Each teacher has two classes in the evaluation, for a total of 278 core academic classes (math, English/language arts, social studies, and science).
- What: High-quality service-learning funded by Learn and Serve America grants in either the 2009-2011 or 2012-2014 grant cycles.
- When: 2011-12 school year.
- Where: Public schools in a mix of rural and urban districts within approximately nine states across the US, balanced by region.
- How: Within-teacher random assignment. Within each teacher's pair of classes, one is randomly assigned to treatment (business as usual, implementing service-learning) and one is randomly assigned to control (forgoing service-learning).
- Evaluators: Abt Associates Inc., RMC Research Corporation, and Dillon-Goodson Research Associates
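The within-teacher assignment described in Exhibit 1.1 is straightforward to carry out once each teacher's pair of classes has been identified. The sketch below is a minimal illustration only, not part of the original study materials; the class identifiers and the fixed random seed are hypothetical and would be replaced by the evaluation's own roster and procedures.

```python
import random

# Hypothetical roster: each teacher contributes exactly two eligible classes.
teacher_classes = {
    "teacher_01": ["period_2_biology", "period_5_biology"],
    "teacher_02": ["period_1_english", "period_4_english"],
    "teacher_03": ["period_3_civics", "period_6_civics"],
}

def assign_within_teacher(roster, seed=20110916):
    """Randomly assign one class per teacher to treatment and the other to control."""
    rng = random.Random(seed)  # fixed seed so the assignment can be reproduced and audited
    assignments = {}
    for teacher, classes in roster.items():
        treatment = rng.choice(classes)
        control = [c for c in classes if c != treatment][0]
        assignments[teacher] = {"treatment": treatment, "control": control}
    return assignments

if __name__ == "__main__":
    for teacher, arms in assign_within_teacher(teacher_classes).items():
        print(teacher, arms)
```

Documenting the seed and the roster used for the lottery makes it possible to show later that the assignment was carried out as planned.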
1.1.4 Annotated Bibliography Based on Literature Reviews Conducted for the National Evaluation

As part of the development of the design and the instruments for the National Evaluation, literature reviews were conducted in the following areas: 1) research on definitions and measures of student civic and academic engagement; 2) potential moderators for service-learning; 3) studies that examine the impact of high-quality service-learning; 4) studies that have used a within-teacher random assignment design; 5) studies that use student-level random assignment; and 6) research on the characteristics of effective teachers. Given the substantial amount of work that went into these reviews, it is hoped that other researchers will benefit from the annotated bibliography that the study team put together for each topic.

1.2 Users of the Toolkit

The four main sections of the Toolkit are likely to be useful to different groups of researchers.

Section 2.0, General Evaluation Guidelines, provides general information about how to develop an appropriate evaluation approach based on the study goals. This section will be particularly useful to those who are new to evaluation research or who are looking for a step-by-step guide to the elements of an evaluation. This section also provides useful information for non-experimental evaluations, such as descriptive or implementation studies, or pre/post designs that collect data from program participants only.

Section 3.0, Developing a Rigorous Evaluation Design, will be of interest to researchers who want to conduct a random assignment study to answer questions about the impacts of service-learning programs. The design alternatives discussed in this section are variations on random assignment.

Section 4.0, Instruments and Recruitment Materials Developed for the National Evaluation of School-based Learn and Serve Programs, provides student survey measures of academic and civic engagement, classroom instruction measures of service-learning and project-based learning more generally, and teacher surveys on instructional practice in the area of service-learning. Any one of these measures could be useful to all of the types of research: descriptive, quasi-experimental, and experimental. This section also contains samples of recruitment materials that could be useful in recruiting participants for an evaluation of service-learning.

Section 5.0, An Annotated Bibliography Based on Literature Reviews Conducted for the National Evaluation, will be of interest to researchers who desire a quick reference on articles in any of the areas that were reviewed for NELSAP. The annotated bibliography could serve as a convenient guide to relevant research findings for researchers, students, and practitioners.

2. General Evaluation Guidelines

2.1 Introduction

This chapter provides general guidance for those who wish to undertake an evaluation of K-12 service-learning projects. The chapter has 14 sections following this introduction:

Section 2.2 addresses why project leaders, staff, and community partners should evaluate their programs. Section 2.3 has information on the characteristics of effective evaluations. Section 2.4 contains a discussion of typical questions asked by evaluators of school-based service-learning programs. Section 2.5 gives advice on how to develop and use a logic model to guide the evaluation process and provides a sample from a State Learn and Serve program evaluation. Section 2.6 presents options for evaluation designs, including discussions of experimental, quasi-experimental, pre/post, and case study designs. Section 2.7 discusses common methods used for evaluation purposes, including surveys, interviews, focus groups, observations, collection of secondary data such as test scores, development of knowledge assessments, the use of multi-method approaches, and some information on data collection.
Section 2.8 displays sample survey subscales that measure constructs associated with common outcomes, such as academic and civic engagement. Section 2.9 offers guidance on sampling procedures. Section 2.10 discusses human subjects protection to preserve respondents' privacy, a required component of most evaluations. Section 2.11 briefly considers typical types of data analysis undertaken by evaluators and when to use some of the more prevalent types of statistical analysis. Section 2.12 discusses how to draw conclusions effectively and notes some of the criticisms that have been leveled at service-learning evaluations in this area. Section 2.13 shows the elements of a good evaluation report. Section 2.14 has a very brief discussion of using evaluation results for improvement. Finally, Section 2.15 provides a list of potentially useful evaluation resources.

Throughout the chapter, examples are provided that illustrate the ways in which researchers have applied the information to service-learning evaluation projects. Most often, the projects used for illustration are State Learn and Serve evaluations conducted by RMC Research during 2009-2011. These projects agreed to use the same evaluation approach, and many useful lessons can be derived from their experiences.

Information presented in this chapter is provided under the assumption that evaluators have fairly limited amounts of funding, perhaps in the $15,000 to $100,000 range, for annual evaluations. The information is intended to help those who are unlikely to be able to conduct evaluations using experimental designs, but who are interested in conducting the most rigorous and effective evaluations possible given time and funding limitations. However, as will be discussed, evaluators are urged to try to implement designs that feature random assignment by school, classroom, or student. Those who are able to conduct evaluations at this level of rigor will find in-depth guidance in Chapters 3 and 4.

2.2 Why Evaluate Service-Learning Projects?

According to the Learn and Serve America (LSA) website:

Service-learning offers a unique opportunity for America's young people—from kindergarten to college students—to get involved with their communities in a tangible way by integrating service projects with classroom learning. Service-learning engages students in the educational process, using what they learn in the classroom to solve real-life problems. Students not only learn about democracy and citizenship, they become actively contributing citizens and community members through the service they perform. Service-learning can be applied across all subjects and grade levels; it can involve a single student or group of students, a classroom or an entire school.

Students build character and become active participants as they work with others in their school and community to create service projects in areas like education, public safety, and the environment.[1]

[1] Retrieved from http://www.learnandserve.gov/about/service_learning/index.asp

The website defines the general parameters of service-learning as an academic or instructional approach and identifies many possible outcomes as illuminated by evaluation reports and anecdotes from participants. However, research reviews, including the one presented in Chapter 5 of this document, show that the outcomes are essentially untested, since there are few rigorous studies of K-12 service-learning. In these times of high educational accountability, anecdotes and suggestive evaluation reports are not enough: rather, educational decision makers, practitioners, and advocates for and critics of service-learning need well-designed studies to identify the outcomes that service-learning can reliably achieve. People want to invest in strategies that predictably have the results they desire.

Documenting outcomes, though, is only one reason to evaluate service-learning projects. Effective evaluations can also be used to determine if the project has met its goals and objectives and to assess the quality of the processes being used for implementation and their relationship to results. Evaluations can illuminate those project characteristics associated with stronger outcomes and provide valuable information for project improvement purposes. Strong evaluation results can also be leveraged to secure additional funding, promote passage of supportive policies, eliminate barriers to adoption and implementation, nurture promising practices, attract community partners, and buttress sustainability.

However, service-learning is not an easy approach to evaluate. While service-learning is often referred to as a project, as the preceding paragraphs indicated, many practitioners conceptualize service-learning as inclusive of much more than the community service that participants provide. Rather, service-learning is conceptualized as involving investigation, planning, action in the form of service, demonstration, and celebration. Service-learning also is intended to have reflection activities woven throughout each of its components and ought to address an authentic community need. Further, service-learning should incorporate the K-12 standards and indicators of quality (National Youth Leadership Council, 2008), which include sufficient duration and intensity, opportunities for meaningful service, cognitively challenging reflection activities, a strong link to the academic curriculum or other learning objectives, mutually beneficial partnerships between schools/programs and community organizations/members, respect for diversity, youth voice, and progress monitoring.

Beyond these general issues of impact and quality, service-learning can include a large variety of other measurable constructs. Issue areas and content of the project can vary widely, and service-learning can be directed to people of nearly all ages and used by a range of different organizations such as schools, youth groups, and philanthropic partners.
For example, service-learning can address issues of homelessness, animal shelters, challenges faced by senior citizens and the disabled, environmental concerns, disaster preparedness and other safety concerns, tutoring and mentoring youth, educating adults, childhood obesity and other health concerns, transportation challenges, and nearly any other social issue that can be identified. The type, length, and characteristics of projects are not pre-defined; the critical characteristic is that the projects are authentic. While this openness to issues and the youth-directed, adult-facilitated approach is very attractive to practitioners and particularly to many youth, it makes evaluation a challenge. Many service-learning "projects" do not have definable characteristics until they are well underway. In addition, service-learning has properties that are extraordinarily difficult to define and measure, and many feel that it cannot be considered an "intervention" for that reason. Rather, it is an instructional or teaching and learning approach, much like project-based or place-based learning. As such, evaluations can be planned and be rigorous, but they face challenges in terms of generalizability and determination of effects.

Nonetheless, service-learning must be evaluated so that we can learn what works (and what doesn't work) and what reasonable expectations for outcomes should be, especially given the passion of the practitioners and the many testimonials about its transformative powers. The potential is very great, but the evidence of efficacy is thin.

2.3 Characteristics of Effective Evaluations

An evaluation is a systematic assessment of the processes and/or outcomes of a project, program, or approach. The explicit intent of an evaluation is to understand what the "intervention" is about and its consequences. Evaluations are valuable when they are well designed and executed. While this statement seems self-evident, the literature is rife with examples of service-learning evaluations that have not met these expectations.

What are the characteristics of effective evaluations? First, the evaluation should be designed to answer specific evaluation questions. As will be explained in the next section of this chapter, at a minimum the questions should examine whether the goals and objectives of a program or practice are being met. Because of this, evaluation is not the same as research, though evaluators typically use the same methods as researchers. Unlike most research, evaluation is designed to provide timely and constructive information for decision making about a particular program or practice. As such, evaluation is client-focused. Research, on the other hand, tends to be designed to answer broader questions to advance a theory or to investigate specific phenomena and is typically not designed to meet a specific client need for program or practice information.

Second, evaluations should be valid and reliable. Good evaluations have both strong internal and external validity. Internal validity refers to the extent to which the design allows the evaluator to make causal claims, that is, to attribute changes in outcomes to an intervention or treatment.
In the case of service-learning, this would mean that the evaluator has used an evaluation design that has the properties of a true experiment and that has strong controls on other sources of influence on the outcome. More information about this is presented in the discussion of experimental designs in Section 2.6. External validity has to do with the ability to generalize findings from the study to a larger population. The sample needs to be selected and described in such a way as to identify the populations, occasions, and programs/approaches to which the findings can be said to apply.

Third, evaluations should use measures that are valid and reliable. Reliability refers to the extent to which measuring the same construct in the same way will consistently yield the same results. For example, your bathroom scale should show the same weight if you step on it several times within a short span of time, because your weight is unlikely to fluctuate broadly within minutes. Similarly, measures of constructs such as "ethic of service" should remain the same if the individual consistently expresses a strong desire to volunteer. Lack of reliability is typically associated with lack of clarity in the questions or answer categories. While some error in both validity and reliability is expected, likely errors need to be discussed. This idea will also be discussed more fully in the instrument development section presented later in this chapter.

Validity generally refers to "face validity," which is the idea that the measure actually measures what it intends to measure and that there is an adequate sample of the types of attitudes or behaviors that represent what is being measured. For example, tests with knowledge questions of interest to service-learning evaluators, such as questions about how specific government entities make decisions, are supposed to measure the extent to which students have learned the content, that is, whether they know how government works. Good tests measure multiple sub-skills associated with the overall skill being measured. Measuring one aspect of knowledge of government decision making, for example how a bill becomes a law, is not enough. As another example, evaluations with survey items that measure self-reported engagement in content should represent whether students are actually interested in, enjoy, pay attention to, and want to learn the content. More will be said about selecting valid measures in the instrument development section of this chapter.

Fourth, effective evaluations are objective, meaning that the conclusions drawn as a result of the study are independent of the analyst drawing the conclusions. Evaluators' predispositions, or any aspect of the subjects of the studies, should not bias the results. While some believe it is difficult if not impossible to achieve full objectivity, evaluators should strive to be as objective as possible and to illuminate any sources of bias that may be present. Any reader of an evaluation report should be able to see the relationship between the findings and the conclusions.

Fifth, effective evaluations should be well organized and feature clear communication of purpose, theoretical foundation, design, methodology, sampling, analysis techniques, conclusions, and study limitations.
For information to be clearly conveyed and understood, evaluators should write to their audiences (typically program leaders and staff) and sometimes to multiple audiences, such as policymakers, educators, community organization staff, parents, and the public at large. While different briefs or versions of the report may be needed, such as a technical report for researchers or a public information brief for widespread distribution to decision makers, any evaluation report should be clear and easy to follow.

Finally, evaluations should be useful. Results should help program designers understand the outcomes experienced by participants and the conditions under which those outcomes have been achieved, the limitations of their program designs, and other information that allows them to engage in continuous improvement.

2.4 Evaluation Questions

The typical purpose of evaluating service-learning is to determine whether the service-learning program, project, or approach is meeting its goals, that is, whether the measured outcomes for a given set of activities match the intended (pre-specified) outcomes. This purpose presupposes that the service-learning program, project, or approach has stated its goals and objectives in the form of intended outcomes and, further, that the outcomes are defined in terms of a benchmark for success. Effective service-learning programs should identify outcomes in advance and consider outcomes in different areas, such as addressing a community need, building community capacity, and developing participants academically and civically. Programs describe the need or issue that the measure will address; the activities to be conducted to meet the need; and the intended outputs, intermediate outcomes, and outcomes to be achieved by the end of the project. For the outputs and outcomes, program leaders should provide a statement showing their intended results, measurement types, and the data/instrument used to measure progress. These outcome statements then become the starting point from which evaluation questions can be developed.

The evaluation questions about goal attainment are relatively easy to derive when program leaders have done a good job specifying their intentions. For example, service-learning program leaders often specify outcomes such as "participating students will exhibit improved civic engagement" with a benchmark that reads something like, "Over 50% of participating students will show an increased score on measures of civic engagement over time." Other outcomes and benchmarks are even simpler, stating as an example, "Over 300 students will participate in service-learning projects." Evaluation questions, then, address either whether, or the extent to which, these outcomes have been attained.

In other cases, the evaluator may have to work with project leaders to develop outcome statements and benchmarks for success that lead to the development of the goal attainment evaluation question. Evaluators may find it most helpful to have a conversation about what it is that the program is designed to do and for whom. Oftentimes, program leaders define outcomes for each of the program participant types. For example, leaders may define a set of outcomes for students, another set for teachers, community members, or partners, and another for community impact. When working with project leaders, it is important to help them clearly define realistic expectations.
The outcomes that they specify should be able to be translated into operational terms with linked measures. For example, rather than saying that participants will learn more, evaluators should help leaders determine what the participants will learn, by when, and how the learning will be measured. It may also be helpful to discuss why they think the outcomes will occur and what activities they will conduct that are likely to produce these outcomes. Literature reviews can be very helpful in this regard, and this process is described in more detail in the Logic Model section below. Outcomes could be stated in the form of: specific knowledge and skills acquired (such as learning the knowledge associated with understanding sources of air pollution, or learning how to construct a persuasive argument); general knowledge or skills (such as developing stronger abilities to solve problems or draw inferences); changing attitudes (such as motivation to learn, respect for diverse opinions, tolerance of ambiguity, or desire to serve the unfortunate); and/or behaviors (such as attending school more regularly, turning in homework on time, or following directions).

Beyond a determination of outcome attainment, many evaluation questions also address issues such as the quality of the project, implementation facilitators and impediments, ways in which challenges were overcome, and progress toward sustainability. Other common evaluation questions address whether there were differential outcomes for participants based on participant characteristics such as demographics, achievement levels, previous experiences, and other factors that potentially serve as moderators or mediators of success. A moderator of success is a variable that affects the direction and/or strength of the relationship between the independent and dependent variable. For example, the relationship between socioeconomic status of participants (level of affluence) and outcomes in the area of civic engagement may be moderated by age, in that older students from less affluent homes may not be able to engage in civic activities after school because they are more likely to have jobs. A mediator refers to a variable that accounts for the relationship between the independent and dependent variable. For example, students who participate in service-learning may become more academically engaged because they are more interested in the subject matter. Becoming more academically engaged may lead to stronger levels of academic performance. In this case, the reason why students perform better may be that they are more academically engaged, and thus their levels of engagement may mediate the outcomes.

Other frequently asked evaluation questions have to do with the number of participants and hours of service, the long-term sustainability of service-learning, the extent to which projects are led and managed well, and the financial value of the service-learning effort. Evaluation questions should be posed in such a way that they guide the evaluation. Some evaluators prefer definitive yes/no evaluation questions, while others devise questions about the extent of change. The literature does not provide a preferred way of posing questions; however, there are many useful resources that can help you develop good questions, listed in the resource section of this chapter.
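Once outcomes are stated with benchmarks, checking goal attainment is largely arithmetic. The sketch below is a minimal, hypothetical illustration of the benchmark quoted above ("Over 50% of participating students will show an increased score on measures of civic engagement over time"); the score values are placeholders, not data from any actual program.

```python
# Hypothetical pre/post civic engagement scale scores for participating students.
pre_scores = [2.8, 3.1, 3.4, 2.5, 3.0, 3.6, 2.9, 3.2]
post_scores = [3.2, 3.0, 3.7, 2.9, 3.4, 3.6, 3.3, 3.1]

BENCHMARK = 0.50  # "over 50% of participating students will show an increased score"

# Count students whose post score exceeds their pre score.
gains = [post > pre for pre, post in zip(pre_scores, post_scores)]
share_improved = sum(gains) / len(gains)

print(f"{share_improved:.0%} of students improved "
      f"(benchmark {'met' if share_improved > BENCHMARK else 'not met'})")
```

A real goal-attainment check would run the same comparison on the full participant file and report the count of non-respondents alongside the percentage.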
Research Questions for a Study of Service-Learning

The service-learning cluster evaluation conducted by RMC Research reflects some of the more common evaluation questions used by service-learning evaluators. The questions were developed in partnership with Learn and Serve grantee leads, based on the outcomes they identified for their subgrantees. Questions 1 and 2 are outcome questions, questions 3 and 4 examine moderators and mediators, and question 5 looks at alignment to a framework, based on the research literature, of factors highly associated with sustainability. The guiding evaluation questions were as follows:

1. What is the impact of participation in service-learning on the student participants in the following areas:
   a. Academic engagement?
   b. Academic performance/achievement?
   c. Likelihood of dropping out of school?
   d. Acquisition of science, technology, engineering, and mathematics (STEM)-related skills?
   e. Attitudes and behaviors associated with environmental stewardship? and
   f. Civic engagement?
2. What is the impact of participation in service-learning on the community or those receiving service?
3. Are there differences in impact based on participant characteristics such as demographics, student achievement levels at entry to the program, and teacher experience?
4. What program design factors (such as quality of program design and delivery and quality/amount of professional development provided to program facilitators) serve to influence impacts?
5. To what extent have programs addressed factors associated with sustainability?

2.5 Using a Logic Model as a Guide

Logic models are visual displays that represent a program. Typically a logic model shows the relationship between a program's activities, intended outcomes, and factors that may explain or influence outcomes. There are many ways to develop and convey a logic model. Most logic models, though, have the following parts:

- Inputs, which detail program resources, such as staff time and expertise, funding levels, facilities, materials, and other factors that "drive" the program.
- Major activities or processes that define the program, such as participant activities/learning opportunities and program components. This part also often describes the participants. In essence, it is the "black box," that is, a thick description of the intervention.
- Outputs, which refer to the program's reach, typically presented as measurable units such as hours, numbers of people, or completed actions.
- Outcomes, typically defined as the knowledge, skills, attitudes, behaviors, or status changes that the program leaders hope will change as a result of program participation. Outcomes may be short-term, intermediate, or long-term. There is no common definition of short-term, intermediate, or long-term, so logic model developers will need to specify the expected time frame.
- Implementation factors, which include program design characteristics or any other variable associated with program execution. If included, this is where the moderators and mediators often appear.
- Context, typically variables that the program leaders, staff, or evaluators determine may affect program outcomes. Contextual variables in education often include changes in administrative leadership, funding, accountability pressures, and other factors that are not under the control of the program.
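For evaluation planning it can help to record these components in a structured form alongside the diagram. The sketch below is only one illustrative way of doing so; the component names follow the list above, and every entry is a hypothetical example rather than content from an actual program.

```python
# Hypothetical, abbreviated logic model recorded as a structured document.
logic_model = {
    "inputs": ["teacher time", "LSA grant funds", "community partner contributions"],
    "activities": ["service-learning projects linked to core academic classes"],
    "outputs": ["number of students served", "hours of service completed"],
    "outcomes": {
        "short_term": ["academic engagement", "civic engagement"],
        "long_term": ["academic performance", "school attachment"],
    },
    "implementation_factors": ["implementation quality", "professional development"],
    "context": ["accountability pressures", "changes in school leadership"],
}

for component, entries in logic_model.items():
    print(component, "->", entries)
```

Keeping such a record next to the one-page diagram makes it easier to revisit and revise the model as the program evolves.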
Many evaluators find it very useful to work with program leaders and staff to develop a logic model to clarify exactly what it is that they hope to accomplish and why they think they will obtain results. When articulated clearly, the logic model clarifies program activities and intentions, focuses work, helps staff develop more realistic outcomes, illuminates assumptions and relationships between activities and intended results, and guides the evaluation. It also can help explain why a program is effective or not.

Logic models typically look like flow charts, with boxes and arrows, and conventionally fit onto one page. Simple models show inputs, activities, outputs, and outcomes that clearly convey key program aspects, with arrows that show the relationships between the boxes, such as the relationship between activities and outcomes. More complicated models tend to be associated with programs that have a theory of action or wish to explore more complex relationships between variables. A good rule of thumb is that logic models should not be laden with too many details, but rather should represent only the key characteristics of the program. A logic model usually does not provide numeric targets or measurements, but it does have broad descriptions of intended outcomes.

Because they are visual and show relationships between project components, logic models are useful for illuminating the thinking behind a project, but they are not great at conveying the messiness of program implementation or even the complexity of a project. Logic models developed at the beginning of projects often fail to identify the right resources, activities, and outcomes, and typically do not show the possible negative outcomes that could derive from implementation. However, while they run the risk of being perceived as a too simplified, too linear, or too static view of a program, most people find logic models to be incredibly useful for planning program activities, developing the evaluation questions and design, and engaging in more thoughtful program refinement and improvement, particularly if leaders view the logic model as a dynamic document that should be revisited on a routine basis.

The sample logic model presented in Exhibit 2.1 was developed for the Texas Learn and Serve State evaluation, but it represents a fairly typical logic model for service-learning programs. The logic model shows inputs in the form of partner contributions and other supports for service-learning, describes typical activities for teachers and students, and then specifies student outcomes and partner/community outcomes. The role of context is briefly indicated, along with the efforts to develop a sustainable model. Please note that this logic model was intended to cover one year, and thus it has only one set of outcomes rather than short-, intermediate-, and long-term outcomes. It also does not specify outcomes for teachers, schools, or service recipients. In Chapter 3, the logic model for the National Evaluation of Learn and Serve America Programs is presented, and it features a more fully developed description of various outcome areas.

Exhibit 2.1: Sample Logic Model for a Service-Learning Project

[One-page flow chart. Inputs: Support for Service-Learning (training/technical assistance; recognition of service-learning as an improvement strategy; teacher buy-in) and Partner Contributions (LEA, K-12 campus, and IHE investment; contributions by community partners). Activities: the Service-Learning Experience (frequency and type of project; implementation quality; linkage to subject matter; effectiveness of partnerships; number and type of students participating; intensity and duration of participation). Outcomes: Student Outcomes (academic performance; civic knowledge, skills, dispositions, and behavior; attitudes toward diversity; S.T.A.R.S. outcomes such as leadership skills and ethic of service; workforce preparation; school engagement; school and community attachment) and Partner/Community Outcomes (volunteer participation; benefits for LEAs, IHEs, and community partners; social capital/community capacity). The chart also notes Context (e.g., student, K-12 campus, IHE, and community characteristics; support for service-learning; accountability pressures) and Efforts to Develop Replicable Partnership Models for Rural and Underserved Communities.]

2.6 Evaluation Designs

The evaluation design presents the blueprint for the ways in which the evaluation will be conducted and reported. Designs vary in terms of purpose, rigor, data collection burden, and cost.
While there are many possible designs that can be used to evaluate service-learning programs, projects, or approaches, only four will be discussed here: experimental, quasi-experimental, pre/post, and case study designs. These designs are most commonly used for summative evaluations, that is, evaluations designed to document program impact. While these designs can also be used for formative evaluations, that is, evaluations whose purpose is primarily program improvement, there are many other formative evaluation designs that could be discussed. Interested readers can review the resource list for more information on formative evaluation designs.

The evaluation designs briefly discussed here are presented to illuminate what they entail and their primary advantages and disadvantages. They are presented in descending order of rigor, in terms of the certainty one has that the results are actually related to the intervention and in the generalizability of the results, meaning that the results apply to other service-learning programs with similar characteristics. As will be seen, certainty can be increased through the use of control or comparison groups, ensuring that both the student groups and the teachers being compared are equivalent. Generalizability can be increased by selecting a sample of students or sites to study that represents the general population that participates in service-learning; selecting teachers for the evaluation who are representative of most teaching staff; and keeping the intervention as "normal" as possible and not informing students that they are subjects of study.

Each of the designs below may also have different units of analysis for evaluation purposes. Service-learning evaluators may be examining impacts on students at the classroom level, grade level, building or school level, or district level. While the unit of analysis may vary, the general design approach remains the same.

2.6.1 Experimental Designs

An experimental design requires the evaluator to randomly assign subjects (or units, such as classrooms or schools) to conditions so that all other sources of influence are theoretically randomly distributed across the conditions.
Experimental designs operate on the assumption that random assignment allows one to conclude that the most likely source of differences in outcomes between groups is the treatment, which in this case is service-learning. Experimental evaluation designs are considered the most rigorous of all of the design choices because of the level of certainty one can have in the findings.

Random assignment can occur at the student, classroom, school, or district level (or with any other unit of analysis). For example, students in a grade level could be randomly assigned to teachers who will use service-learning as an instructional approach or to teachers who will not. Because the assignment of the students to classrooms is random, the two groups of students should be relatively well matched in terms of their demographic characteristics, previous achievement, and other variables that could potentially affect outcomes. The evaluation calls for either pre/post or post-only measures for the two groups, treatment and control, to see whether the intervention, service-learning, made any difference in intended outcomes, such as academic or civic engagement, academic performance, or others described in the logic model or performance measures.

In some cases, because of potential "contamination" of the sample, schools are used as the unit of random assignment rather than classrooms. Contamination concerns are about students or adults in the treatment condition talking to or otherwise influencing others so that the others somehow obtain the benefit of the treatment. (In some service-learning evaluations, this type of contamination has occurred. One teacher who was assigned to the control condition really liked the idea of service-learning and implemented it in a small way, even though she was not supposed to do so.) When schools are the unit of random assignment, evaluators often try to find pairs of schools matched on their demographic and achievement profiles, and then randomly assign one school in each pair to treatment and the other to control.

Evaluators need to recognize that experimental designs require a sufficient sample size to detect potentially small effects. As will be discussed in later chapters of this volume, the current research on service-learning suggests that it may have an effect size that is very low. In order to be able to demonstrate the effect, very large numbers of student participants may need to be studied.
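The sample size point can be made concrete with a standard power calculation. The sketch below uses the usual two-group normal approximation; the effect sizes, 80% power, and 5% significance level are illustrative assumptions rather than figures from the National Evaluation, and a design that assigns whole classrooms or schools would need a further adjustment for clustering.

```python
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sided test of a mean difference."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2

# Illustrative effect sizes, including the small values the service-learning literature suggests.
for d in (0.15, 0.25, 0.50):
    print(f"effect size {d:.2f}: about {n_per_group(d):,.0f} students per group")
```

At an assumed effect size of 0.15 standard deviations, the approximation calls for roughly 700 students per group before any clustering adjustment, which is why the design work in Chapter 3 devotes a section to power.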

Experimental designs are considered to be the "gold standard" for educational evaluations by the U.S. Department of Education's Institute of Education Sciences. As such, service-learning evaluators should strive to implement this design to the degree possible. However, many evaluators have faced strong challenges when trying to implement experimental designs. Some families do not like having their children either "forced" into a program or having a desired program withheld. Educators may also resist the mandatory nature of the design and willingly or unwillingly undermine it. Experiments also may be more costly to implement than other designs, since much more time is typically needed to identify and secure the agreement of participating individuals and sites. The experimental design that was prepared for the National Evaluation of Learn and Serve America is presented in detail in Chapters 3 and 4. These chapters illustrate the benefits and challenges of designing experiments at the appropriate level of rigor, and they provide strong guidance and advice for the evaluator willing to undertake this desirable approach.

2.6.2 Quasi-Experimental Designs

A quasi-experimental design is one that utilizes matched treatment and comparison groups. Quasi-experimental designs differ from experimental designs in that participants are not randomly assigned; rather, groups of participants that closely resemble the treatment group are recruited to participate in the evaluation. For example, if the subjects to be studied are classrooms of students of a biology teacher who is using service-learning as a primary instructional approach, then the evaluator would try to identify biology teachers who do not use service-learning, perhaps from the same or a neighboring school, who would be willing to participate in the study. Characteristics of students from the matched classrooms are compared to ensure that the students do not differ in ways thought to influence outcomes. For example, an appropriate comparison classroom for a service-learning class of gifted students would not be a traditional class. Rather, traditional classes should be compared with other traditional classes, and classes for gifted students should be compared with classes of other gifted students. In addition to basic achievement levels, classrooms should also be matched in terms of demographics such as gender balance, percent of English language learners, or percent of students from various ethnic groups, since service-learning outcomes are known to be influenced by such demographics and achievement levels.

Generally with a quasi-experimental design, evaluators use pre/post measures. The evaluators should examine the pre-test to ensure that the groups are equivalent before the treatment begins. Sources of non-equivalence may be statistically controlled in the analysis as needed. Change over time is measured for both the treatment and comparison groups, and differences are compared. If the service-learning group (treatment) outperforms or underperforms the comparison group in statistically significant ways with reasonably high effect sizes (discussed in the analysis section of this chapter), then conclusions are drawn about the influence of service-learning.
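As a concrete illustration of that comparison, the sketch below contrasts pre-to-post gains for a treatment and a comparison classroom and reports a standardized effect size alongside the significance test. All of the scores are made-up placeholder values; a real analysis would also adjust for baseline differences and for the clustering of students within classrooms.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post scores on an academic engagement scale.
treat_pre = np.array([3.0, 2.7, 3.2, 2.9, 3.4, 2.6, 3.1, 3.3])
treat_post = np.array([3.4, 3.1, 3.5, 3.3, 3.6, 3.0, 3.4, 3.6])
comp_pre = np.array([3.1, 2.8, 3.0, 2.9, 3.3, 2.7, 3.2, 3.0])
comp_post = np.array([3.2, 2.9, 3.1, 3.0, 3.3, 2.8, 3.2, 3.1])

treat_gain = treat_post - treat_pre
comp_gain = comp_post - comp_pre

# Two-sample t-test on the gains, plus a standardized mean difference (Cohen's d).
t_stat, p_value = stats.ttest_ind(treat_gain, comp_gain)
pooled_sd = np.sqrt((treat_gain.var(ddof=1) + comp_gain.var(ddof=1)) / 2)
cohens_d = (treat_gain.mean() - comp_gain.mean()) / pooled_sd

print(f"treatment gain = {treat_gain.mean():.2f}, comparison gain = {comp_gain.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, Cohen's d = {cohens_d:.2f}")
```

Reporting the effect size alongside the p-value, as the text recommends, keeps the focus on whether the difference is large enough to matter and not only on whether it is statistically detectable.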
To illustrate, if the biology teacher who used service-learning in his or her classroom had students who reported much higher rates of academic engagement than the students of the biology teacher who did not use service-learning, the evaluator may be able to conclude that service-learning influenced academic engagement. The phrase "may be able to conclude" reminds us that other sources of influence may explain differences, since they were not as tightly controlled. For example, in this case, even though students were well matched and the teachers were using the same district biology curriculum, it could be that the service-learning teacher has traditionally been more effective than the other teacher, perhaps because of experience, creativity, or other characteristics of the teacher that have nothing to do with service-learning. Once again, in the best of all worlds, evaluators using the quasi-experimental approach should try to closely match the teacher and student characteristics, the curriculum being used, and other potentially influential variables to eliminate alternative explanations for any differences that may be found. Similar to experimental designs, evaluators using quasi-experimental designs may need a large sample size to detect the effects of service-learning. An advantage of quasi-experimental designs is that they tend to be easier and more practical to implement than experimental designs. However, quasi-experimental designs also are subject to threats to validity through contamination of comparison groups. Careful sample selection, memos of understanding, explanations of all protocols, and implementation tracking can help to address these concerns.

2.6.3 Pre/Post Designs

In a pre/post design, the evaluator measures variables of interest before and after the treatment. In the case of service-learning, evaluators might administer a survey before the service-learning projects begin in the fall of the school year, and then again after the service-learning projects are over the next spring. While this design is very commonly used in service-learning evaluations, it has many disadvantages that limit its utility. The primary disadvantage is that differences from pretest to posttest cannot be reliably attributed to the intervention. There are simply too many other uncontrolled sources of influence that may account for the increases or decreases that were found. How does the evaluator know that the growth or decline was associated with service-learning and not something else? Further analyses of data from many such service-learning evaluations have shown that other students in the same school had just as much growth (or decline) as the service-learning group. For this reason, simple pre/post designs are considered weak and should not be used for evaluation purposes without introducing suitable comparison groups.

2.6.4 Case Studies

Case studies generally refer to descriptive research using qualitative data collection methods to examine an individual or group of participants. Qualitative data collection methods may include observations, interviews, focus groups, document analysis, and analysis of other artifacts. A case study is often used to develop a more complete understanding of a treatment and its implementation in terms of its rationale, context, facilitators of and impediments to success, and the meaning given to various activities and situations by its participants.
Evaluators often develop what are referred to as "thick descriptions" of a site, which, in the case of service-learning, may include an in-depth analysis of all of the components of the service-learning experiences being evaluated, the educational and perhaps historical context for its use, the characteristics of the individuals (teachers, students, community partners, and others) involved in its planning and implementation, the nature and needs of the community in which service-learning takes place, and the characteristics of service recipients. Case studies may also discuss and interpret cultural norms, community values, participant motives, interpretations of experience, and other variables related to the service-learning intervention. As with other designs, the evaluation questions will dictate the areas being explored. While other designs tend to answer questions of "who, what, when, where, how much, or how many," case studies are particularly useful for "how" and "why" evaluation questions. They tend to be used when evaluators wish to explore a phenomenon in detail, when a holistic understanding of a treatment is desired, or when investigation of the ways in which participants understand their experiences is warranted. Some believe that case studies are best used for exploration and to generate hypotheses, while other designs such as experiments or quasi-experiments are best used to test hypotheses. In the field of service-learning, evaluators have used case studies to understand how the service was perceived by service providers and recipients, to delineate the internal dynamics of service-learning programs, to illuminate the ways in which some students experience service-learning as a transformative experience, to elucidate differences in the meanings of the experiences across participants, and to tease out the various types of impacts that service-learning has had.

Rigor in case studies is just as important as rigor in experimental, quasi-experimental, and other evaluation designs. Qualitative rigor tends to be defined in terms of the clarity of the questions, the opportunities to triangulate data (using at least three sources of data for the same topic) and therefore validate the findings, the skill of the evaluator in probing answers to illuminate the range of possibilities for interpretation, and the types of in-depth analysis techniques that are used. More information on how to improve the rigor of these methods will be presented in the next section of this chapter. The clear advantage of case studies over other designs is that case studies are more likely to yield in-depth understandings and insights into the range of experiences and impacts that service-learning may have on its participants. Case studies allow for more flexibility and innovation. However, case studies may not be generalizable across contexts, are more likely to be subject to bias, may be imprecise, and can be very time intensive, difficult, and costly to execute well. Evaluators need to be cautioned that extrapolating findings based on input from a few individuals may not be warranted. Instead, evaluation results should be considered suggestive and grist for further study. Nonetheless, case studies are ideal for exploring the range of outcomes that service-learning may produce and the optimal conditions for results to be obtained.
The design that you use for your evaluation should be the best possible one to answer your evaluation questions within your time frame and funding. Each design has strong advantages and disadvantages, and whichever you use should be discussed in terms of its benefits and limitations. The limitations identified should also be delineated in your evaluation report, which will be discussed later in the chapter.

The RMC Research cluster evaluation of Learn and Serve America state programs used quasi-experimental designs, matching classrooms based on demographic and achievement profiles. Key challenges arose in recruiting comparison teachers and classrooms, since the potential study participants had a hard time seeing any benefit to them for participating. In most states, this was resolved by providing an incentive to the teachers and students to participate, typically in the form of a gift certificate for the teacher or school. Other challenges associated with the quasi-experimental designs had to do with the previous service-learning experience of comparison participants, which was determined to affect findings, and the difficulty in describing the differences between what the service-learning teachers and the comparison teachers actually did to deliver the curriculum in their classrooms. The latter point illuminates the limitations of survey designs and promotes the use of multiple qualitative and quantitative methods. However, as with other evaluation projects, this evaluation was limited in its scope by its funding, and therefore raised as many questions as it answered, as will be shown in the discussion of this project throughout the remainder of this chapter.

2.7 Data Collection Methods

Data collection methods and tools measure the outcomes and implementation variables identified in the evaluation questions and logic model. The evaluation design typically specifies whether the data collection will involve quantitative methods, such as surveys, knowledge assessments, or other numeric data, or qualitative methods, which are narrative based and include data such as responses to interviews, focus groups, or observations. Qualitative data can be coded to become numeric. Service-learning evaluators, like other program evaluators, typically use a combination of methods to collect data to answer their evaluation questions.

Whenever possible, evaluators should seek existing tools to measure the variables and constructs identified in the evaluation questions and logic model. The advantage of using existing tools is that most have been tested for validity and reliability, and the evaluator saves valuable time and other resources by adopting them if they fit. Existing tools are relatively easy to locate, and can be found on the Internet, in educational and other journals that publish service-learning studies, and through books or websites with lists and summaries of survey subscales, observation protocols, and other data collection tools. Six methods are discussed in this section of Chapter 2: surveys, interviews, focus groups, observations, secondary data analysis (e.g., analysis of test scores and other existing data), and knowledge assessments such as specially constructed tests and essays.
Once again, the description of these methods is brief, and interested readers are encouraged to learn more by perusing additional, more detailed resources. At the end of this section, the practice of using multiple methods is also discussed, along with brief guidance on data collection strategies to use.

2.7.1 Surveys

Surveys can be used to collect data for descriptive, exploratory, or explanatory purposes. For service-learning evaluations, surveys are typically administered to students and teachers, but may also be administered to community partners, service recipients, and other stakeholders such as administrators or parents. Surveys are the best method to use to describe a population too large to observe directly. They are appropriate for measuring attitudes and self-reported behaviors, but are limited in terms of their accuracy in determining actual acquisition of knowledge and skills or actual displays of behaviors.

Most surveys take the form of questionnaires that prompt participants to respond to a series of items, often presented as scales, which measure the constructs of interest. The service-learning cluster evaluation, for example, featured surveys that first asked student respondents to identify their grade level, age, race/ethnicity, language spoken in the home, and previous experiences in providing service and/or engaging in service-learning. The survey then asked the students to rate themselves on a series of scales that measured academic engagement, civic engagement, educational aspirations, interest in STEM-related topics, and other areas of interest as dictated by the evaluation questions. Each of the scales that appeared on the survey had already been used in other studies or piloted for this study, so that their validity and internal reliability properties were known. (Some sample survey subscales from the study are presented in section 2.8 of this chapter.)

Surveys are one of the most efficient methods of data collection available, allowing evaluators to receive large quantities of information on predetermined questions in a relatively short period of time. Surveys can be administered online, by e-mail or regular mail, by telephone, or in person. Most surveys yield data that are easily quantified, though some surveys use a combination of closed-ended (forced choice) and open-ended questions. The greatest challenge associated with surveys is to construct them so that they validly and reliably measure the outcomes and relationships that the evaluation questions identify. A good rule of thumb is to review many ways to measure the constructs that underlie the variable being measured (e.g., academic or civic engagement, aspirations, acquisition of 21st century skills) and select the one(s) that best match the intent of the service-learning program. For example, 21st century skill acquisition could be defined in multiple ways, and could include many diverse constructs such as leadership, ability to work on a team, persistence, acquisition of workplace literacy skills and dispositions, and so forth. Discussing the definition with program leaders and staff, then selecting an existing scale that measures the dimension that best fits the intended outcome of the program, may be the best and most efficient way to identify the most appropriate measure to add to your survey.
If you choose to adopt an existing scale or subscale, look for one with an internal reliability coefficient of .8 or higher, if possible. This will help to ensure that the items consistently measure the construct of interest. (A brief sketch showing how such a coefficient can be computed appears at the end of this subsection.) Also, as a reminder, make sure that the tool that you adopt was designed for a population similar to the one that you will evaluate. It is not a good idea, for example, to use a tool designed for college students as a measure for middle school students, since the vocabulary may be too difficult or the concepts too sophisticated for the younger students to understand.

Evaluators who develop their own survey items are urged to pilot them to ensure that they measure single, discrete constructs clearly and are interpreted by respondents in consistent ways. The items should be at the readability level of the respondents. Response categories should also be carefully constructed. Many survey items rely on interval scales to measure frequency (e.g., daily, weekly, or monthly) or levels of agreement (e.g., strongly disagree, disagree, agree, strongly agree). The number of responses on the scale, the clarity of the response categories, and the availability of a "don't know" or "not applicable" category all affect the type of statistical analysis to be conducted and the ability of the evaluator to detect small changes over time. However, having more response categories is not always better, since it may be very difficult to interpret what the respondent meant, for example, when he or she checked "mildly disagree" versus "moderately disagree." The more precision in the categories, the better, particularly when discrete intervals are important, such as those related to the duration and intensity of service-learning projects or the extent to which various aspects of youth voice were present. In addition, it is useful to have consistent types of response categories within the survey so that the respondent is less likely to be confused and so that the survey can be completed more efficiently. If you use an agreement scale, then, you should consistently use the same response categories (e.g., strongly agree/agree/disagree/strongly disagree/don't know) every time you ask the respondent about their agreement, rather than sometimes providing a four-point scale and sometimes providing a six-point scale.

Survey instructions should also be clear, and should convey to respondents how they are to answer the questions, how terms are being defined, and what to do if they are unsure about how to answer a question. Items should be worded so that they are clear and precise, address only one idea, and avoid technical jargon and emotionally charged words. Items on the survey should be brief and written in such a way that all responses are equally acceptable. Questions should avoid double negatives and are best when written in the active voice. Remember that questions that appear at the beginning of the survey should be easy to answer so that the respondent does not get discouraged, and that the sequence of questions should be logical and clear. Later responses should not be biased by earlier questions, and related questions should be grouped together. Include skip patterns as appropriate when a question does not apply to a respondent.
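The sketch below shows one way to check the internal reliability mentioned above, assuming the item responses have been exported to a student-level CSV file; the file name and column names are hypothetical, and the function simply implements the standard Cronbach's alpha formula.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of scale items (one column per item,
    one row per respondent, all items scored in the same direction)."""
    items = items.dropna()                      # listwise deletion for this sketch
    k = items.shape[1]                          # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical export of nine academic engagement items (columns eng1 ... eng9)
responses = pd.read_csv("student_survey.csv")
alpha = cronbach_alpha(responses[[f"eng{i}" for i in range(1, 10)]])
print(f"Cronbach's alpha = {alpha:.2f}")        # look for roughly .8 or higher
```

If the coefficient falls well below .8, examine whether dropping a weak item or revising its wording improves consistency before the survey is fielded again.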
2.7.2 Interviews

Interviews are typically conducted in person, and are designed to elicit information when details about implementation or outcomes are desired, when outcomes are not easily observed, or when evaluators want to know more about how and why decisions were made or activities were executed. Interviews provide more in-depth information than surveys, but they take longer to administer and analyze, and are less likely to yield responses that can be generalized to a larger population. Most service-learning evaluators who use this type of qualitative approach use structured interview protocols directly related to the evaluation questions to collect information from key stakeholders. In some cases, however, evaluators may prefer to conduct less formal conversations for exploratory purposes.

Generally, interviews are constructed by determining the subjects and respondents for the interviews and then developing the interview guide, which lists the sequence of questions to be asked and possible probes to use if the respondent does not provide a full answer to the question of interest. As with surveys, many evaluators design interviews to ask easy questions first so that the respondent becomes comfortable with the interview. Many interview protocols are constructed to obtain background information first and then to pose questions that are more evaluative in nature. For example, service-learning evaluators may first ask the respondents to describe the history of service-learning in the school setting and then why they chose the particular service-learning approaches being implemented. The evaluator may then ask a series of interview questions that probe the sequence of activities, which activities were most and least effective and why, perceptions of impact, factors that served to facilitate or impede progress, and activities underway to engage in continuous improvement and promote sustainability.

It is important to provide training for interviewers so that they know their information targets and avoid bias in the ways they pose the questions. It is also a good idea to pilot interview questions in advance to ensure that they are being interpreted as intended and measuring what the evaluator intends to measure. Evaluators should also inform their respondents about the time needed to answer the questions so that the interview is not rushed or incomplete. Interested evaluators are urged to consult the additional resources listed in the resource section of this chapter on constructing and implementing effective interview protocols.

2.7.3 Focus Groups

Focus groups are a form of group interview that uses structured protocols to probe answers to key questions related to the evaluation. Focus groups are becoming an increasingly popular method for collecting qualitative data on program impacts and implementation because they allow the evaluator to determine, at least to some extent, the convergence and divergence of responses to a particular issue, or to establish an in-depth understanding of a project. Focus groups are particularly good when the evaluator wants to answer the "how" and "why" questions associated with the evaluation.
A focus group has an advantage over an interview when interaction among participants is desired, as when, for example, the evaluator wants to acquire different perspectives on what aspects of service-learning were easy or challenging to implement, when comments from one respondent will help to trigger ideas from others, or when the juxtaposition of perspectives facilitates insight. Focus groups also allow for more efficient data gathering, since the groups typically contain 8 to 10 respondents. Interviews, on the other hand, may be better when the evaluator needs a "full story" from each individual or when the questions are of a potentially sensitive nature. Most evaluators find that focus groups are most valuable when they are composed of homogeneous respondents, such as teachers or administrators or students. Mixing respondents, particularly when they are together with those who have a different position in a hierarchy (such as teachers and administrators), may inhibit open and accurate responses, since individuals may not want to "look bad" in front of their supervisors. As with interviews, focus group moderators (those leading the focus groups) should be trained in advance so that they do not bias the responses and so that they have a good understanding of the types of information that they are to acquire. Other helpful focus group guidelines include the following advice:

- Identify and invite all participants in advance, informing them of the purpose of the focus group, its length, and any expectations you have for participation. If you are offering an incentive to participate, a description of the incentive should be provided. Identify alternatives to the original invitation list in case a respondent you invite cannot attend.
- Encourage all focus group participants to participate, and control any member who tries to dominate the conversation.
- Frame questions so that they do not bias the responses, and do not comment on or judge responses. Rather, remain neutral and welcoming of all responses.
- Probe responses as needed to ensure that you have a good understanding of what the respondent meant when giving the response.
- Feel free to gauge how many of the respondents feel the same or differently about an issue, and probe differences as appropriate.
- Moderate the group so that you have time to get answers to all of the questions that you have.

More detail and advice on constructing focus group questions, preparing a setting for a focus group, and moderating effectively may be found in the resources listed at the end of this chapter.

2.7.4 Observations

Observations of a project, program activities, or the classrooms of teachers who implement service-learning can be helpful in understanding the ways that service-learning activities are conducted, responses to the activities, and impacts on participants. Observations can be informal or structured, using a pre-determined protocol. Most of the time, observers watch a setting, record what they see, and then code their observations. Observations may be made of settings, behaviors, verbiage, relationships, instructional styles, participation rates, levels of engagement, student groupings, and much more. Service-learning evaluators often use observations to illustrate findings, provide insights into implementation or student reactions and impacts, or show exactly what a particular practice looks like.
For example, evaluators may wish to illustrate the ways in which teachers have encouraged student voice by observing and coding classroom interactions to report the relative percentage of time that students versus teachers are talking, who directs the conversations, what choices are provided, and how many students participate in decision making. Observations can also be used to help determine fidelity to program design, duration and intensity of activities, alignment to standards, and other important information targets. When observations are being used for summative purposes, evaluators may wish to have two observers in the room and ensure that there is sufficient inter-rater reliability, meaning that the two observers are watching the same thing and coding the observation data in exactly the same ways. Inter-rater reliability should reach the 85% level of agreement. Many evaluators provide training to observers and give them opportunities to practice so that higher levels of agreement are reached. It is also good practice to debrief with the teacher or project facilitator after the observation to determine whether what was observed was representative of typical practices. Evaluators can conduct observations at frequent intervals to obtain a record over time, or they can conduct a "point in time" observation that presents a snapshot of a particular activity or event. Evaluators should be careful not to overgeneralize when they collect point-in-time data, since the data may not be representative of the project as a whole. In addition, evaluators should be aware that their very presence may influence those being observed, particularly if the subjects of the observation are young children. Additional information on effective observational approaches may be found in the resources listed at the end of the chapter.

2.7.5 Secondary Analysis of Existing Data

Many service-learning evaluators have become interested in examining the impact of participation in service-learning on areas related to student academic performance, such as achievement test scores, dropout/graduation rates, absenteeism, tardiness, truancy, and disciplinary referrals. These academic performance measures already exist and thus do not need to be developed, but rather only need to be collected. The evaluator conducts secondary analysis (rather than primary analysis) because the data were already collected and analyzed for another purpose. Collection of such data, though, is easier said than done. Many schools and school districts will not allow evaluators access to the data without strong justifications and approvals by district- or state-level research and evaluation committees. The data may also exist in a variety of forms that require re-entry of data, compilation of data from various data sets, or recoding of data. In addition, different sites may have different definitions for their categories. For example, service-learning evaluators have learned through experience that states calculate dropout rates and graduation rates in different ways, and set different benchmarks to identify student proficiency or advanced-level work. Even absenteeism can be calculated in different ways, with different sites counting "partial" attendance, such as attending some classes but not others, as absent or present.
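When site records do arrive in different shapes, a small amount of recoding can put them on a common footing before analysis. The sketch below is purely illustrative, with made-up site extracts and an assumed common rule (a partially attended day counts as present); real district files and definitions will differ.

```python
import pandas as pd

# Hypothetical extracts: Site A flags full, partial, and fully absent days;
# Site B reports only days enrolled and full days missed.
site_a = pd.DataFrame({"student": ["A1", "A2"],
                       "full_days": [170, 150],
                       "partial_days": [5, 20],
                       "absent_days": [5, 10]})
site_b = pd.DataFrame({"student": ["B1", "B2"],
                       "days_enrolled": [180, 175],
                       "days_missed": [12, 3]})

# Common definition assumed for this sketch: the absence rate is the share of
# enrolled days that were missed entirely, so partial days count as present.
site_a["absence_rate"] = site_a["absent_days"] / (
    site_a["full_days"] + site_a["partial_days"] + site_a["absent_days"])
site_b["absence_rate"] = site_b["days_missed"] / site_b["days_enrolled"]

combined = pd.concat([site_a[["student", "absence_rate"]],
                      site_b[["student", "absence_rate"]]], ignore_index=True)
print(combined)
```

Whatever common definition is chosen, it should be documented in the evaluation report so that readers can see how site-level differences were reconciled.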
Evaluators will need to work with the specific sites being studied to determine the data that are available, how the data may be accessed and interpreted, and how "missing data" are determined. In addition, evaluators will need to be very thoughtful about aggregation across sites. Care must also be taken in attributing change to service-learning, since these particular measures are influenced by many different factors both inside and outside of the classroom environment.

Another form of data that some service-learning evaluators tap is existing records of voter participation, incidence of violence or bullying, incidence of vandalism, rates of visits to medical facilities for diseases (e.g., sexually transmitted infections), overall utilization of exercise facilities, enrollment in tutoring courses, number of website hits, and a variety of other data related to community impact. These sources of data can be very useful in tracking change over time that may be at least partially attributable to the service-learning effort. Many service-learning evaluators have found that collecting secondary data is much more time-consuming and difficult than they anticipated. However, while challenging, this endeavor is important to many clients and has been accomplished by several experienced service-learning evaluators.

2.7.6 Knowledge Assessments

The final method to be discussed in this section is the use of constructed knowledge assessments. As the name implies, these "tests" are typically closed- or open-ended questions or essay prompts that measure the extent to which students (or teachers or other respondents) have acquired the specific knowledge and skills that are the target of the intervention. For example, some service-learning evaluators design assessments that measure the extent to which students have learned how government makes decisions, strategies being used to decrease levels of pollution, policies governing transportation systems, or the steps needed to plan and implement a service-learning project, such as a health fair or a demonstration event. Good closed-ended knowledge assessments should be developed using guidelines for constructing effective tests and assessments, and multiple choice items should feature the right types of distractors and other errors that typically occur in students' thinking. In addition, knowledge assessments need to be appropriate for the level of the knowledge domain being measured. Some assessments have issues with accuracy or fairness because they measure either very broad domains of knowledge that have multiple types of correct responses and are difficult to score (e.g., What would you do to address the problem of water pollution in our community?), or very narrow domains of knowledge that comparison students may not know without having experienced the service-learning project (e.g., What does the city do to clean the dirty water that goes down your sink? or Who is your Congressional district representative?). Because of this, many evaluators find it difficult to create effective knowledge assessments. Service-learning evaluators who choose to use knowledge assessments typically develop assessments directly tied to service-learning projects and administer them only to the service-learning students at the beginning and end of a project.
These evaluators draw conclusions about whether and how much students learned based on those assessments. Broader knowledge assessments could also be administered as pre- and post-assessments for both treatment and control/comparison groups for topics covered by the general curriculum, and differences in group responses could be assessed. An example of the latter that has been used by service-learning evaluators is to ask both service-learning and comparison students to discuss multiple solutions to a social issue and present their thinking on which solution is best, and why, as a measure of cognitive complexity and problem-solving ability. Evaluators who use knowledge assessments should be aware of their strengths and limitations and discuss challenges and how they were resolved in their reports on findings.

2.7.7 Multi-Method Approaches

Some of the most effective evaluation designs use a mix of quantitative and qualitative methods. The combination of data sources and methods can lead to richer and more detailed information about project implementation and impacts, and greater confidence in the results, since there are several sources of information to answer each question. Consistency in responses across methods helps to ensure that the data are accurate and reliable. If the data do not converge, the evaluator should collect additional information to understand the source of and reasons for the differences. A typical constellation of methods for a service-learning evaluation may include student and teacher surveys; interviews with project leaders, school administrators, and community partners; and focus groups with a randomly selected group of participating students and with adult facilitators. While it is desirable to implement multiple data collection methods, there are time and cost considerations incurred with each additional method. The "right" mix will depend upon the evaluation questions and the resources that are available for the evaluation.

2.7.8 Data Collection Procedures

Data collection procedures vary by method and should be outlined as part of the evaluation design and project plan. The plan should specify whether the data are to be collected by the evaluator, by onsite personnel, by program staff, or through self-report. Part of the communication with respondents, both in the letters explaining the study and securing assent to participate (explained in section 2.10) and in planning conversations with project leaders, is information on the methods being used, the length of time needed for any given data collection event, and how to handle any data collection issues that may arise. For example, if surveys are being administered, survey administration protocols should be shared. The protocols should identify specifically who will be collecting the data, how the data will be collected (e.g., print, online), and how the surveys can be accessed. The typical amount of time to complete the surveys should be provided. If interviews or focus groups are being conducted, the plan should specify who is to participate and how long the data collection event will take. A quiet place to conduct the focus group or interview should be reserved, and equipment that may be used, such as digital audio or video recorders, should be tested in advance to ensure good receptivity and working order.
If refreshments are to be made available, arrangements must be made in advance for delivery and clean-up. Evaluations that use observations should specify who is to be observed, the length of the observation, and what type of notification is to be given. If the plan calls for videotaping the observations, prior arrangements must be made and the equipment should be tested. Evaluators who plan to analyze documents should specify the nature of the documents, when the documents are expected to be available, and the form they should take (electronic or print). If tests or assessments are to be administered, evaluators should make arrangements for copying and test administration. If records are to be accessed, evaluators should specify who is to extract the information, along with when, where, and how. The plan should also specify the ways in which data will be collected from participants who are not fluent in English.

Evaluators should anticipate anomalies and the ways they will be addressed in advance. For example, evaluators should know what they will do if there is adult interference while students are answering questions, how they will handle students who want to ask questions about the meanings of survey items, and so forth. Evaluators should also know in advance how data will be transmitted. If print surveys are administered, for example, how will they be treated so that confidentiality is preserved? How will audio or video recordings be protected so they are not erased? How will long documents be accessed? All of these considerations illuminate the need to identify quality controls well before the evaluation begins, and to conduct frequent checks to ensure that all protocols are being followed. Any challenges that occur should be reported immediately to project directors and evaluators and noted in the evaluation report.

2.8 Sample Survey Subscales

As explained previously, it is important to identify and adopt or adapt survey subscales that have appropriate levels of validity and reliability. Sources of information on subscales are relatively easy to find both on the Internet and in published journals and books. Chapter 4 presents the specific survey items that were intended to be used for the National Evaluation of the Learn and Serve America Programs. In this section, a few other examples are provided. The examples here represent measures of outcomes often identified as important by K-12 service-learning practitioners. These samples identify the construct to be measured, the source, the intended survey population, validity, reliability, the stem, the items, and the response categories used. Readers who would like to use these subscales should contact the source for permission. The examples provide descriptive and psychometric information on six attitude scales that may be relevant to service-learning outcomes.

2.8.1 Measure of Responsibility for Community Issues and Social Problems: Social Responsibility

Construct: Social Responsibility
Source: RMC Research (2007). Survey of social responsibility. Denver, CO: Author.
Population: Grades 6-12
Validity: Face and content
Reliability: Alpha = .83 / .84 (pretest / posttest)
Stem: Please indicate how much you agree or disagree with each of the following statements.
Items:
a. Students my age can do things to make the world better.
b. I can make a difference in my neighborhood or town.
c. I feel responsible for helping others.
d. I often think about the needs of others.
e. Helping to solve community problems is something everyone should do.
f. I intend to volunteer throughout my whole life.
Response Categories: 1 = Strongly Disagree; 2 = Disagree; 3 = Uncertain; 4 = Agree; 5 = Strongly Agree

2.8.2 Measure of Responsibility for Community Issues and Social Problems: Neighborhood Obligations

Construct: Neighborhood Obligations
Source: Corporation for National and Community Service, Office of Research and Policy Development. (2008, May). Still serving: Measuring the eight-year impact of AmeriCorps on alumni. Washington, DC: Author.
Population: Participants in AmeriCorps between 1999 and 2001
Validity: Face and content
Reliability: Alpha = .77
Stem: Do you feel that each of the following is not an important obligation, a somewhat important obligation, or a very important obligation that a citizen owes to the country?
Items:
a. Reporting a crime you may have witnessed.
b. Participating in neighborhood organizations.
c. Helping keep the neighborhood safe.
d. Helping keep the neighborhood clean and beautiful.
e. Helping those who are less fortunate.
Response Categories: 1 = Not Important; 2 = Somewhat Important; 3 = Very Important

2.8.3 Measure of Personal Efficacy and Empowerment

Construct: Personal Growth through Community Service
Source: Corporation for National and Community Service, Office of Research and Policy Development. (2008, May). Still serving: Measuring the eight-year impact of AmeriCorps on alumni. Washington, DC: Author.
Population: Participants in AmeriCorps between 1999 and 2001
Validity: Face and content
Reliability: Alpha = .81
Stem: Thinking of all your voluntary community service or volunteer activities over the past 12 months, please indicate how much you agree with the following statements.
Items:
a. I re-examined my beliefs and attitudes about myself.
b. I was exposed to new ideas and ways of seeing the world.
c. I learned about the 'real' world.
d. I did things I never thought I could do.
e. I changed some of my beliefs and attitudes.
Response Categories: 1 = Strongly Disagree; 2 = Disagree; 3 = Neither Agree nor Disagree; 4 = Agree; 5 = Strongly Agree

2.8.4 Measure of Sense of Belonging to School

Construct: Sense of Belonging to School
Source: RMC Research Corporation. (2006). Public Achievement evaluation report. Denver, CO: Author.
Population: Grades 9-12
Validity: Face
Reliability: Alpha = .89
Stem: For the next set of statements, think about this school and select the answer that best describes how you feel about each statement.
Items:
a. I feel like I belong to this school.
b. I contribute to this school.
c. I am viewed by teachers as a valued part of this school.
d. I have a responsibility for the welfare of this school.
e. I feel proud of this school.
f. I do things to make this school a better place.
Response Categories: A Lot; Some; A Little; Not at All

2.8.5 Measure of Academic Engagement

Construct: Academic Engagement (includes affective, behavioral, and cognitive engagement)
Source: RMC Research Corporation. (n.d.; used in multiple evaluations). Survey of academic engagement. Denver, CO: Author.
Population: Grades 6-12
Validity: Face
Reliability: Cronbach's alphas:² .87 (Oregon Learn and Serve evaluation report, 2008); .84 (Wisconsin Learn and Serve evaluation report Year 2, 2008); .83 (Texas Learn and Serve evaluation report, 2006); .82 pre / .85 post (Learn and Serve Michigan: 2006–2007 school year, 2008)
Stem: How much do you agree with each of the following statements?
Items:
a. I like being in school.
b. I am interested in the work at school.
c. I pay attention in class.
d. Time seems to pass quickly when I am doing schoolwork.
e. I like schoolwork best when it is challenging.
f. I feel that the schoolwork I am assigned is meaningful and important.
g. My courses are interesting to me.
h. I think that the things I am learning in school will be important for my future.
i. I feel that school is worthwhile.
Response Categories: 1 = Strongly Disagree; 2 = Disagree; 3 = Agree; 4 = Strongly Agree

² Cronbach's alpha is a measure of internal reliability or consistency.

2.8.6 Measure of School Engagement

Construct: School Engagement (includes behavioral, emotional, cognitive, and sense of belonging items)
Source: National Center for School Engagement. (2006). Merrill Middle School: School engagement and staff attendance efforts: School year 2005–2006. Denver, CO: Author.
Population: Middle school students
Validity: Criterion, construct
Reliability: Not available in this report, but in another NCSE study* with emotional, cognitive, and behavioral scales from which items for this study were drawn (and for which response categories were not reported), Cronbach's alphas ranged from .88–.90 (emotional engagement), .88–.92 (cognitive engagement), and .49–.80 (behavioral engagement). Sample sizes in that study ranged from 39–57 (emotional engagement), 41–66 (cognitive engagement), and 46–72 (behavioral engagement), and included students from the Gulfton neighborhood in Houston, Texas; Kent County in Seattle, Washington; and Jacksonville, Florida.
*National Center for School Engagement. (2006, December). Quantifying school engagement: Research report. Denver, CO: Author. Retrieved from http://www.schoolengagement.org/TruancypreventionRegistry/Admin/Resources/Resources/QuantifyingSchoolEngagementResearchReport.pdf
Stem: How often are the following statements true for you? (Put an X in the box.)
Items (E = emotional, B = behavioral, C = cognitive, as specified in the NCSE Quantifying School Engagement report scales; N = seems to be related to sense of belonging. The N items were not in the Quantifying School Engagement report, but were in the survey instrument for the Merrill Middle School study.):
a. When I am in class, I just pretend I am working. (B)
b. I follow the rules at school. (B)
c. I get in trouble at school. (B)
d. I feel excited by the work in school. (E)
e. I am interested in the work I get to do in my classes. (C)
f. My classroom is a fun place to be. (E)
g. When I read a book, I ask myself questions to make sure I understand what it is about. (C)
h. I study at home even when I don't have a test. (C)
i. I try to watch TV shows about things we are doing in school. (C)
j. I talk with people outside of school about what I am learning in class. (C)
k. I check my schoolwork for mistakes. (C)
l. If I don't know what a word means when I am reading, I do something to figure it out, like look it up in the dictionary or ask someone. (C)
m. I read extra books to learn more about things we do in school. (N)
n. If I don't understand what I read, I go back and read it over again. (C)
o. Most of my teachers praise me when I work hard. (N)
p. I try my best at school. (C)
q. I skip (cut) the entire school day. (B)
r. I get good grades in school. (C)
s. I try to stay home from school. (B)
t. I enjoy the work I do in class. (E)
Response Categories: Always; Often; Sometimes; Seldom; Never

Stem: How much do you agree with each of the following statements?
Items:
a. I feel close to people at my school. (N)
b. I feel like I belong in my school. (N)
c. I am happy to be at my school. (E)
d. The teachers at my school treat students fairly. (E)
e. I feel safe in my school. (N)
f. I like most of my teachers at school. (E)
g. The students at this school don't like students who are different. (N)
h. I am getting a good education at my school. (C)
i. I will fail no matter how hard I try. (N)
j. I will graduate from high school. (C)
k. I want to go to college. (C)
l. I am not interested in school. (N)
m. The discipline at my school is fair. (E)
n. Most of my classes are boring. (C)
o. Most of my teachers care about how I'm doing. (E)
p. Most of my teachers know the subject matter well. (C)
q. I learn a lot from my classes. (C)
r. There is an adult at school that I can talk to about my problems. (E)
s. I respect most of my teachers. (E)
t. School is a waste of my time. (N)
u. Most of my teachers are always telling me what to do. (N)
v. Most of my teachers understand me. (E)
w. Most of my teachers expect too much of me. (N)
Response Categories: Strongly Agree; Agree; Disagree; Strongly Disagree

2.9 Sampling

Based on their experiences in the field, service-learning evaluators often expect a low response rate or are unwilling to exclude any respondents from their studies at the risk of having others find out and fail to respond themselves. Rather than sampling their respondents, then, these evaluators use a census approach, and involve everyone who is eligible in the study as the "universe" for the survey. A more efficient approach is to sample from the population being served, taking care to select participants who represent the entire population of the project being evaluated.

Selecting a sample for either quantitative or qualitative data collection typically starts by defining the sampling frame. The sampling frame is the list of "units" (in the case of service-learning, typically individuals, classrooms, schools, or districts) that comprise the study population. If the service-learning evaluation is of a single class, then the sampling frame would consist of the class roster. If the evaluation is of a school, the frame consists of all of the classrooms. An effective sample can be randomly drawn from the sampling frame if all of the units have similar characteristics. A random draw could be accomplished by using a computer to select units (as in the sketch below) or by using a table of random numbers found in most statistics books. However, most evaluators find that there are variables that they want to take into account when selecting their samples. For example, service-learning evaluators may wish to be sure that all grade levels, or a variety of content areas such as English language arts, mathematics, and science, are represented.
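The minimal sketch below shows what a computer-generated draw might look like, assuming the sampling frame has been assembled as a simple student-level file; the file name, column names, and sample sizes are hypothetical. The stratified draw anticipates the approach described next, in which the frame is first sorted into categories and units are drawn from each.

```python
import pandas as pd

# Hypothetical sampling frame: one row per student in the project
frame = pd.read_csv("sampling_frame.csv")    # assumed columns: student_id, grade, subject

# Simple random sample of 200 students from the whole frame
simple_sample = frame.sample(n=200, random_state=2011)

# Stratified random sample: draw 10 percent within each grade level so that
# every grade is represented in proportion to its size in the frame
stratified_sample = frame.groupby("grade").sample(frac=0.10, random_state=2011)

print(len(simple_sample), len(stratified_sample))
```

Recording the random seed and the version of the frame used for the draw makes the selection reproducible and easier to document in the evaluation report.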
Evaluators may wish to draw samples that represent different teacher experience levels, different types of service-learning projects, varying durations of projects, or any of a myriad of other interesting variables to explore. In these cases, the entire population is stratified first; that is, the population is sorted and categorized by the variables in question. Representatives of each category, or stratum, are then randomly drawn. There are many other types of sampling procedures that may apply depending upon the purpose of the study and the degree to which specific types of analysis and generalization are desired. As can be seen in Chapter 3, sampling procedures can become very complicated, and can involve using formulas for weighting and other procedures to ensure representativeness.

Similarly, there are formulas to use to decide how many individuals or other units should be in the sample. Most of these formulas have to do with the levels of confidence and the extent of sampling error that evaluators are willing to tolerate. Levels of confidence refer to the degree of certainty evaluators want that an observed result did not arise by chance. Confidence levels are typically expressed as a percentage. For example, testing at the .05 significance level means accepting a 5 percent risk of attributing a result to service-learning when the difference was actually due to chance; this corresponds to a 95 percent confidence level. Sampling error generally refers to the possible differences between the sample selected and the population as a whole. No matter what sampling approach is used, the evaluation report should describe the sample that actually responded to the study. The description should include information about the demographics of the sample and any other pertinent characteristics. If samples of treatment and control or comparison groups are used, the description should also show the extent to which these samples are similar to and different from each other. Sampling is an important aspect of any evaluation study, and interested readers are strongly encouraged to pursue additional resources to gain a better understanding of this topic.

2.10 Human Subjects Protection

All evaluations should include highly specified steps to protect the human subjects who participate, and should pay particular attention to the protection of children and youth. In some cases, evaluators may be required to obtain Institutional Review Board (IRB) approval before they may implement the study. Typical steps that must be undertaken for this purpose address two main areas: obtaining informed consent and assent to participate, and following appropriate requirements for protecting participants' identities and individual responses.

Ethically and legally, all participants in a study must give their permission or "assent" to participate in the evaluation. Their agreement should be based on a clear understanding of the purpose of the study, how data will be collected and used, and their rights as participants. Typically, participants are informed about these aspects of the evaluation in writing, often in the form of a letter from the project director or principal investigator.
According to most prevailing regulations, the letters should include the following:

- The purpose of the evaluation, stated in clearly understood terms;
- The evaluation procedures that will be undertaken and the timelines to be followed;
- Potential benefits and risks of participation;
- The fact that participants can withdraw from the study for any reason at any time, and how they signal that they want to do so;
- How confidentiality will be strictly maintained;
- The project director's or lead evaluator's name and how to contact them;
- How copies of the results may be obtained;
- The voluntary nature of their participation; and
- A place to sign, which indicates that they understand and agree to participate.

In addition to obtaining assent from each participant, youth under the age of 18 must have the permission of a parent or guardian in order to participate. Evaluators or program staff must distribute a permission form, called a parent consent form, containing the same information as an assent form, to the students' parents or guardians. There are two types of parent/guardian consent forms. An active consent form requires written consent from the parent/guardian for the youth to participate. If the form is not returned with the signature of the parent/guardian, then the student may not participate. Passive consent, on the other hand, does not require a parent/guardian signature. Rather, the letter is provided to the parents or guardians of the youth, and the letter is returned with a signature only if the parent or guardian does not want their child to participate. Some school districts and programs require active parent consent and will not allow any evaluation to be performed without it. Others only require passive consent. The evaluator who works with a school or school district must check and abide by the rules of the district and the funders of the project.

Data that are collected under conditions of confidentiality must also be carefully handled. The names of those who participate may not be stored with the data they provide, and names or other information should not be provided that would allow a reader to identify the respondent. In the latter case, if there is only one administrator in a district, that person cannot be quoted without permission because it would be clear who provided the statement. In addition, data must be carefully stored so that strict confidentiality is maintained. Other restrictions also apply. For example, the federal government will not allow data from fewer than 10 survey respondents to be reported, since it would be too easy to figure out what each respondent said if the numbers are small. The number of applicable regulations for any given evaluation varies, but typically includes quite a few cautionary measures that must be put into place.

There are many websites, IRBs, and other sources of information available on this topic, and evaluators are urged to become very familiar with and follow the appropriate regulations and recommendations. Following appropriate rules and conventions for protecting human subjects is often time-consuming and has associated costs that are sometimes unanticipated by evaluators. Forms must be copied, distributed, collected, and tracked. Checks must be conducted to ensure that no one without the appropriate form is included in the study.
Storage of the forms may also incur costs in terms of software or files. Follow-up to ensure that data collectors are following appropriate protocols can also incur expenses. Evaluators should be aware of these time and cost factors and build them into the evaluation plan.

2.11 Data Analysis

The types of data analysis to be used are related to the questions posed, the evaluation design, and the methods developed. Since there are so many variations, specific data analysis guidance will not be presented here. However, there are some general rules that should be followed. First, all data should be prepared in advance. The data should be well organized and should have been "cleaned," that is, checked for errors, missing data, and other data-related problems. If data include surveys, focus groups, or interviews and there is a separate data entry or coding procedure, at least 5% of all data should be checked to ensure there are no errors. If there are a significant number of errors, data may need to be re-entered or recoded.

Data analysis, of course, should be directed toward answering the evaluation questions. The evaluation design typically specifies the procedures to be used to analyze the specific types of quantitative and/or qualitative data that were collected. If the data are quantitative, specific types of statistical analysis should have been predetermined. If the data are qualitative, data coding, reduction, and summarizing protocols should also have been specified in advance. Analysts should follow the protocols and conduct their analysis at the levels of depth specified in the questions. Since many evaluators triangulate their data sources, data should be checked for consistency or divergence, and reasons for any divergent findings should be investigated.

Evaluators also typically review the data to identify any factors related to the evaluation itself that may have affected the findings. Evaluators should report the response rate and determine whether the evaluation participants resemble and represent all of the program participants. Program and evaluation participant attrition rates should be noted, and the analyst should take appropriate steps to examine attrition to see whether dropouts differ at pretest from those who remain, as in the sketch below. Response bias in the way that participants answer surveys should also be examined (e.g., whether the respondents always use the left-hand side of the response categories or always respond in a pattern, like abcd, abcd, abcd).
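One way to carry out that attrition check is sketched below, assuming a student-level file with a pretest score and a flag for whether the posttest was completed; the file name and column names are hypothetical, and the standardized difference shown is simply one common way to express the size of any gap.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical analysis file: one row per student, with a pretest score and a
# 0/1 flag indicating whether the student also completed the posttest.
df = pd.read_csv("student_level_file.csv")   # assumed columns: pretest, completed_post
stayers = df.loc[df["completed_post"] == 1, "pretest"]
dropouts = df.loc[df["completed_post"] == 0, "pretest"]

# Two-sample t-test: do dropouts differ from completers at baseline?
t_stat, p_value = stats.ttest_ind(stayers, dropouts, equal_var=False)

# Standardized difference (in pooled standard deviation units) to gauge the gap
pooled_sd = np.sqrt((stayers.var(ddof=1) + dropouts.var(ddof=1)) / 2)
std_diff = (stayers.mean() - dropouts.mean()) / pooled_sd
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, standardized difference = {std_diff:.2f}")
```

A large or statistically significant gap suggests that those who left the study differ from those who stayed, and the evaluation report should describe how this was handled.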
2-23 Exhibit 2.2: Typical T ypes of Statistical Analyses Number of Dependent Variables Nature of Independent Variables (IVs) Nature of Dependent Variable(s) Test(s) 1 0 IVs (1 population) interval and normal one -sample t test ordinal or interval one -sample median categorical (2 categories) binomial test categorical Chi -square goodness -of-fit 1 IV with 2 levels (independent groups) interval and normal 2 independent sample t test ordinal or interval Wilcoxon Mann -Whitney test Categorical Chi -square test Fisher‘s ex act test 1 IV with 2 or more levels (independent groups) interval and normal one -way ANOVA ordinal or interval Kruskal Wallis Categorical Chi -square test 1 IV with 2 levels (dependent/matched groups) interval and normal paired t test ordinal o r interval Wilcoxon signed ranks test Categorical McNemar 1 IV with 2 or more levels (dependent/matched groups) interval and normal one -way repeated measures ANOVA ordinal or interval Friedman test Categorical repeated measures logistic regressi on 2 or more IVs (independent groups) interval and normal factorial ANOVA ordinal or interval generalized estimating equation Categorical factorial logistic regression 1 interval IV interval and normal correlation simple linear regression ordinal or interval non -parametric correlation Categorical simple logistic regression 1 or more interval IVs and/or 1 or more categorical IVs interval and normal multiple regression analysis of covariance Categorical multiple logistic regressio n discriminant analysis 2 or more 1 IV with 2 or more levels (independent groups) interval and normal one -way MANOVA 2 or more interval and normal multivariate multiple linear regression Adapted from Leeper, J. D. Choosing the correct statistical t est. Retrieved from http://bama.ua.edu/~jleeper/627/choosestat.html As has been stated many other times in this chapter, readers interested in quantitative data analysis should consult with statistics textbooks and other resources for additional informati on. Readers interested in qualitative analysis should refer to the several resources listed at the end of the chapter, along with the many excellent resources available on analyzing data from focus groups, interviews, observations, and other qualitative da ta sources. 2.12 Drawing Conclusions Many evaluators do a great job in designing their evaluations, collecting and analyzing their data, and presenting their findings, but still make errors in drawing conclusions. The service -learning field as a Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 2-24 whole has often been criticized in this regard, with some reviewers charging that the field is rife with ―overclaiming‖ the results of participation. This is a serious concern and thus should be taken into account by all service -learning evaluators. Put simply, the data should not be stretched beyond what the findings show. Findings should not be applied to populations that were not studied, should not be generalized to include any type or form of service -learning, and should not be disseminated without appropriate cautio ns about their interpretation. Evaluators may find a statistically significant difference between treatment and control or comparison groups, but that does not mean that the difference is meaningful. 
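To make the distinction between statistical significance and practical importance concrete, the sketch below uses simulated scores only (not study data); the group means, standard deviation, and sample sizes are illustrative assumptions chosen so that a trivially small difference still reaches significance.

```python
# Illustration: with a large enough sample, a difference of a few hundredths
# of a standard deviation can still be "statistically significant."
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 20_000                                             # students per group (illustrative)
treatment = rng.normal(loc=50.5, scale=10, size=n)     # true mean only 0.05 SD higher
control   = rng.normal(loc=50.0, scale=10, size=n)

t_stat, p_value = stats.ttest_ind(treatment, control)

# Cohen's d: mean difference divided by the pooled standard deviation
pooled_sd = np.sqrt((treatment.var(ddof=1) + control.var(ddof=1)) / 2)
cohens_d = (treatment.mean() - control.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.2g}")   # p will typically be far below 0.05
print(f"Cohen's d = {cohens_d:.3f}")            # yet the effect is only about 0.05 SD
```

Reporting the effect size alongside the p-value makes clear that such a difference, while statistically reliable, may not be educationally meaningful.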
Instead, for example, there may be a statistically signi ficant difference in the academic performance of two groups, but the effect size 3 may be so small (such as a finding that translates into the higher performing group getting just one test item correct more often than the other group) that no one could conc lude that a real difference exists. Evaluators must also be true to the data in that they should draw conclusions that are both positive and negative as warranted. There should be no cover up of the outcomes that do not turn out as expected. Finally and o bviously, all of the conclusions should be justified. They should be stated in such a way that any reviewer would come up with the same conclusions when he/she reviewed the data. If warranted, alternative explanations of the data should be presented rather than a single conclusion drawn. 2.13 Elements of a High -Quality Report Evaluation reports should be clear and easily understood by service -learning stakeholders, including program leaders and staff, participants, community partners, policymakers, parents, and the public at large. Most stakeholders will appreciate language that is not technical in nature, though technical information should be included to ensure that sophisticated readers understand the contents of the report. Some evaluators address this issue by providing various types of report summaries. A typical report has the following sections: Executive Summary The Executive Summary usually portrays the big ideas in the evaluation. Often paralleling the report itself, the Executive Summary typically incl udes a few paragraphs on the background and purpose of the service -learning program, project, or approach; a short description of the evaluation design and methodology; and then a series of bulleted findings, followed by a short discussion showing how to interpret the findings and/or a list of conclusions. Most Executive Summaries also include recommendations for program improvement. 3 Effect size (ES) is a name given to a family of indices that measure the magnitude of a treatment effect, represen ted by differences in outcomes across groups. Unlike significance tests, these indices are independent of sample size. Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 2-25 Introduction The Introduction usually presents information about the program purpose, pertinent history, program participat ion rates, and descriptions of program implementation. Many evaluators include the logic model in this section of the report and some report the name of the evaluation funder. Methodology The methodology portion of the report typically has the evaluation q uestions, a description of the evaluation design, the measures being used and their validity and reliability, and the characteristics of the evaluation sample and their representativeness of the population being served. Many evaluators also include a discu ssion of study limitations in this section and some add this section to the conclusions. Findings This section is the heart of the evaluation. The findings section often has both a summary and an analysis of the findings, typically organized by evaluation question or topic. Data are reported using both narratives and data displays, such as tables, pie charts, and/or bar or line graphs. Cautions about data interpretation are frequently presented in this section. 
Conclusions The conclusion section succinctly summarizes evaluation results and typically provides conclusions related both to implementation and impacts. Conclusions are also often mapped back to the logic model. Recommendations Recommendations are sometimes combined with the conclusions section and sometimes stand alone. The Recommendations section is usually very specific: it provides a set of suggestions derived from the findings, along with a justification for the suggestions being provided and/or details on what the recommendations mean. Appendi x/Appendices Evaluators frequently append copies of the instruments for data collection that they use. Appendices also often include tables with individual item analysis. Many evaluators provide a draft report to program leaders for their review. The purpo se of this review is to help the evaluator ensure accuracy of ―facts‖ and to discuss data interpretation. While findings should not be changed, the program staff may identify areas that need further explanation or areas where wording changes are desired. The report should then be revised as needed and finalized for distribution. 2.14 Using Evaluation Results for Improvement Most program leaders agree that a key purpose of conducting an evaluation is to provide information to them for improvement purposes. Evalua tors should go beyond the presentation of findings and recommendations and hold conversations with program leaders and staff about results and the various Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 2-26 mediators and moderators that were found to influence the results. Evaluators should be open to quest ions and should promote deep understandings of the findings and their implications. An example of this in the field of service -learning is the use of several evaluations that tested various aspects of program quality to ensure that they were indeed associ ated with better outcomes. Several evaluations and research studies showed that some practitioner wisdom simply was not supported by data. Aggregation of these types of findings and widespread discussion and dissemination can improve the practice of servic e-learning everywhere. Many evaluators promote the use of evaluation results for improvement by tracking the extent to which programs changed during the next year based on the recommendations that were made. In a second year report, data are collected and reported on the improvements that were made and the apparent yield that the improvements had. Evaluators also use various publication and presentation forums for this purpose. This is especially important for the field of service -learning since so few good studies are widely distributed and cited. 2.15 Evaluation Resources 2.15.1 Evaluation Toolkits Applied Environmental Education Program Evaluation Designed to help online course participants evaluate their education and outreach programs, and provides participants wit h an overview of evaluation and an opportunity to practice skills designing and using evaluation tools for environmental education and outreach programs. https://www.uwsp.edu/natres/eeta p/aeepe_course_page.aspx Ecological Understanding as a Guideline for Evaluation of Nonformal Education (EUGENE) Easy -to-use, practical instrument that can help users assess baseline knowledge of ecological principles, and assess knowledge gain in those s ame principles at the end of programs. 
Through the Web site, users can select which ecological principles are appropriate to assess, add up to four customized questions, print an instrument for pre - and post -testing, enter data following instrument adminis tration, and analyze results. https://projecteugene.org/cgi -bin/eugene Educators’ Guide to Service -Learning Program Evaluation Provides introductory information for youth development program staff on how to evaluate programs that feature service -learning as an instructional approach. www.servicelearning.org/evaluationguide/html Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 2-27 Educators’ Guide to Collecting and Using Data: Conducting Focus Group Research; Conducting Surveys; Conducting Classroom Observations Three RMC Research booklets that provide specific guidance on how to develop protocols and conduct focus groups, interviews, and classroom observations. http://www.rmcdenver.com/Default.aspx?DN=29e6b628 -bd88 -457f -840a -0d95b21908d9 Evaluating Your Environmental Education Programs: A Workbook for Practitioners Walks users through how to desig n and conduct an evaluation. A case study of one program demonstrates how to use each chapter to conduct an evaluation. http://www.naaee.org/publications Evaluation Assessment: Examining the Readiness of a P rogram for Evaluation An Office of Juvenile Justice and Delinquency Prevention resource from the program evaluation briefing series to help users decide when to evaluate a program. Other papers discuss hiring and working with an outside evaluator, cost ben efit analysis, incorporating evaluation into the request for proposal (RFP) process, and strategies for evaluating small juvenile justice programs. www.jrsa.org/jjec Evaluation Toolkit for Magnet Schools A toolkit wi th information, interviews, glossaries, and presentations to show how to evaluate magnet school programs. http://evaluationtoolkit.org Mobilizing for Evidence -Based Character Education A booklet produced by the U .S. Department of Education for evaluating character education programs. www.ed.gov/about/offices/list/osdfs/ndex.html My Environmental Education Evaluation Resource Assistant (MEERA) MEERA is an online "evaluation consultant" created to assist you with your evaluation needs. It will point you to resources that will be helpful in evaluating your environmental education program. MEERA can help you: Learn more about evaluation and its importan ce; move through the evaluation process step -by -step, with tips and pitfalls to avoid; obtain suggestions on important evaluation topics, for example, on how to find, select, and work with an external evaluator; search through example environmental educati on evaluations and obtain detailed insights about these evaluations; find additional evaluation resources such as ―how -to‖ guides and links to evaluation tools; and identify and learn about related professional development opportunities. http://meera.snre.umich.edu/ Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 2-28 Needs Assessment in Environmental Education and Interpretation (EE/I) Presents a basic, practical approach to needs assessment in an EE/I context to help users develop a plan for carrying out a needs assessment. 
https://www.uwsp.edu/natres/eetap/naeei_course_page.aspx Teacher's and Practitioner's Professional Development Needs Identifies 89 professional development needs for the fi eld of environmental education, and presents the specific priorities of educators who work with pre -kindergarten through college -age students in formal education systems and practitioners who work as informal or nonformal educators outside of these systems . http://www.eetap.org/pages/dynamic/web.page.php?page_id=150&topology_id=1&eod=1 The 2002 User Friendly Handbook for Project Evaluation. A National Science Found ation publication explaining the main components of evaluation, evaluation issues and concerns, and the complexity of being culturally responsive in evaluation. www.nsf.gov/pubs/2002/nsf02057/ nsf02057.pdf User -Friendly Handbook for Mixed Methods Evaluations A National Science Foundation publication to help people learn about evaluations using both quantitative and qualitative data, and which methods to use for which purposes. http://www.nsf.gov/pubs/1997/nsf97153/start.htm W.K. Kellogg Foundation Evaluation Handbook Offers a blueprint for conducting project -level evaluations. http://www.wkkf.org/~/media/10BF675E6D0C4340AE8B038F5080CBFC.ashx Logic Models Developing a Logic Model: Teaching and Training Guide A booklet that describes and provides training materials to help individuals learn how to develop a logic model. www.uwex.edu/ces/pdande Logic Models A Web site by the Office of Juvenile Justice Programs that provides information and templates on what should be included in a logic model. www.ojjdp.gov/grantees/pm/logic_models.html Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 2-29 W.K. Kellogg Foundation Evaluation Handbook/Logic Model Development Guide CD A handbook showing why logic models are important and how to construct logic models. http://www.wkkf.org/knowledge -center/resources/2005/10/WK -Kellogg -Foundation -Eval 2.15.2 Methods RMC Research Corporation (1999). Educators’ Guide to Collecting and Usin g Data: Conducting Classroom Observations. Denver, CO: Author. Booklet to help evaluators design observation protocols and collect and analyze classroom observation data. www.rmcdenver.com/products.html RMC Research Corporation (1999). Educators’ Guide to Collecting and Using Data: Conducting Focus Group Research . Denver, CO: Author. Booklet to help evaluators create focus group and interview protocols, facilitate focus groups and one -or-one interviews, and conduct qualitative data analysis. www.rmcdenver.com/products.html RMC Research Corporation (1999). Educators’ Guide to Collecting and Using Data: Conducting Surveys. Denver, CO: Author. Booklet to help evaluators design, administer, and analyze survey data. www.rmcdenver.com/products.html Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-1 3. Developing a Rigorous Evaluation Des ign for Service - Learning : Alternatives and Considerations 3.1 Introduction This section of the Toolkit includes material s that were developed during the design phase of the National Evaluation. Because there was a mandate that the National Evaluation employ an experimental random assignment design, several alternative approaches to random assignment were considered. 
The section includes a discussion of the ramifications of the different designs for the study sample – sample sizes and power of the designs – and implications for the recruitment of schools, teachers, and students. Wh ere useful, we use the National Evaluation as a n example of rigorous evaluation design (see Exhibit 1.1 and Section 4 for an overview of key design features of the National Evaluation). Note that all of the design options discussed here aim to estimate the effects of high quality service - learning, relative to the alternative of teaching without service -learning . Because that decision is specific to the National Evaluation, we discuss the focus on high -quality service learning in detail in Section 4 rather t han in this more general design section. 3.2 Random Assignment O ptions For the rigorous evaluation of LSA -funded service -learning, the Abt team , in collaboration with CNCS, considered a range of possible design options . All of the options were based on random assignment designs. The focus on random assignment wa s motivated by the fact that random assignment is the gold standard in evaluation research. Most other study designs cannot rule out the possibility that the study findings were due to some form of sel ection bias. Random assignment helps to ensure that the difference in outcomes between groups can only be attributed to the treatment — or more specifically, to the difference between the treatment and the counterfactual conditions. While a random assignmen t study offers the strongest evidence of impact, it is also the most complex and (typically) most expensive design option . As such it may not be practical or feasible for smaller programs and organizations. In addition to ensuring that they have the necess ary resources, organizations considering a random assignment study of a service -learning program should assess whether the program has the size and maturity to allow for the successful implementation of the evaluation. 3.2.1 Considering the Level of Random As sig nment In developing the NELSAP design options, we first consider ed the broad question of the level (s) at which randomization should occur : district, school, teacher, classroom, and/or student levels . In an experimental study, one unit at the level of rando mization would be assigned at random to implement/receive high -quality service -learning, and the other would not. For example, if teachers were the level of randomization, one teacher, Ms. Jones, might be randomly assigned to use service - learning, while an other teacher, Ms. Smith, might be randomly assigned to not implement service - learning in her class. Below we describe options for random assignment focusing on random assignment at the school, teacher, class, and/or student levels. It should be noted that it is optimal to Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-2 include student -level random assignment in design options that are based on random assignment at the class or teacher level . This ―double random assignment‖ ensures that the students in the treatment and control groups are the same. Howev er, for ease of exposition, we describe random assignment at each level separately. For th e National Evaluation, we considered four different options for the level of randomization . The se options are presented in Exhibit 3.1 and discussed below. 
Exhibit 3.1: Four Design Options for a Random Assignment Evaluation of High Quality Service-Learning

A. School-level randomization
   Treatment condition: LSA teachers in school A implement high quality SL.
   Control condition: LSA teachers in school B do not implement service-learning.
B. Teacher-level randomization
   Treatment condition: Teacher A implements high quality SL in all sections of a class in which s/he had previously demonstrated high quality SL.
   Control condition: Teacher B does not implement service-learning in any section of a class in which s/he had previously demonstrated high quality SL.
C. Class-level randomization
   Treatment condition: The teacher implements high quality SL in one section/semester of class X.
   Control condition: The teacher does not implement service-learning in the other section/semester of class X.
D. Student-level randomization (contrasts student learning with and without high quality SL in the same subject)
   Treatment condition: Students are randomly assigned to Teacher A. Teacher A, who already implements high quality SL (as assessed by the study team), continues to do so during the study year.
   Control condition: Students are randomly assigned to Teacher B. Teacher B, who teaches the same subject in the same grade in the same school as Teacher A and has never implemented SL, continues not to do so during the study year.

Note that Options A-C could also include randomization at the student level.

Another possible level of random assignment is random assignment of districts. In district-level randomization, districts that had received or were receiving LSA funds would be randomly assigned to implement or not implement service-learning. Randomization at the district level was never considered a viable option because it would have been difficult and extremely costly to recruit the number of districts needed to ensure sufficient power to detect an effect. Given these challenges, district-level randomization is not discussed further.

3.2.2 Random Assignment of Schools

In school-level randomization, schools with teachers who had received or were receiving LSA funds would be randomly assigned to implement or not implement service-learning. This would require that LSA teachers who were randomly assigned to the control group redesign their courses to not incorporate service-learning activities. School-level randomization would effectively control for between-classroom "contamination": teachers in treatment schools are unlikely to be affected by the activities of teachers in control schools, and vice versa. However, it would seem to require that all of the teachers implementing high quality service-learning in control schools replace service-learning with other instructional techniques.

3.2.3 Random Assignment of Teachers within Schools

In teacher-level random assignment, some teachers would be assigned to treatment and asked to continue with their current instructional approach with high quality service-learning, and the others would be asked to replace high quality service-learning with an instructional approach that does not involve service-learning. In other words, this option would involve finding pairs of teachers who are implementing high quality service-learning in the school for the same course (e.g., Ms. Jones for regular 9th grade English in first period and Ms.
Smith for regular 9 th grade English in second period) and randomly assign ing one of the teachers to the treatment group and one to the control group . Control teachers would need to redesign their course to exclude service -learning , while treatment teachers would provide instruction according to their usual course plan, which involv es high quality service -learning. Students could also be randomly assigned to teachers in this design. Preferably, in randomizing teachers, we would block on the school 4 and randomly assign teachers within schools. 3.2.4 Random Assignment of Classes within Teach ers In class -level random assignment, high quality service -learning would be implemented in the treatment class but not in the control class. Preferably, in randomizing classes, we would block on the teacher and randomly assign pairs of classes for each pa rticipating teacher. Under this design, we would identify eligible teachers who are implementing high quality service -learning in at least one course, and who are teaching two or more classes or sections of this course (e.g., regular 9 th grade English in b oth second period and sixth period) . More specifically, evaluators would identify: eligible tea chers (i.e., teachers who have demonstrated high quality service -learning in at least one course), eligible courses (i.e., courses in which eligible teacher s dem onstrated high quality service - learning), and; pairs of classes or sections in eligible courses (e.g., regular 9 th grade English taught by Ms. Jones in both second period and sixth period). Then evaluators would randomize one class/section to treatment and one class to control. In this case, since "b usiness as usual" in this course involves high quality service -learning, evaluators would randomly assign treatment classes to "business as usual" high quality service -learning and control classes to a revised c ourse plan that removes the service -learning component and replaces it with an alternative strategy for providing instruction in that class. Teachers would be given discretion on how to design the control class, subject to some constraints. Students could also be randomly assigned to classes in this design. 3.2.5 Random Assignment of Students to Teachers In student -level random assignment, students could be randomly assigned to one of two classes , one with high -quality service learning or one without service -lear ning . Random assignment of students to teachers would involve finding pairs of teachers who teach the same course in the same school and in which one teacher of the pair is implementing high quality service -learning while the other teacher is not . The eval uator would randomly assign students to the two teachers. No teachers would nee d to 4 As explained by Raudenbusch et al. (2008), blocking entails grouping units to be randomized into subclassess or ―blocks‖ such that wi thin the blocks, units are exp ected to have similar outcomes. Random assignment of units to the treatment and control conditions is then conducted within the blocks, which eliminates the variation between the blocks when estimating the precision of the imp acts. Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-4 redesign their classes. However, schools would need to allow the evaluator to randomly assign stud ents who enroll in the course to one of the two teachers. 
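As a concrete illustration of blocked random assignment as described above, the sketch below randomly assigns one class section per teacher to treatment and the remaining section(s) to control; the teacher and section names are hypothetical, and the same function could be used with schools as blocks and teachers as units.

```python
# Minimal sketch of blocked random assignment: within each block (here, a
# teacher), one unit (here, a class section) is assigned to treatment and the
# rest to control. All names are hypothetical examples.
import random

def assign_within_blocks(blocks: dict[str, list[str]], seed: int = 2011) -> dict[str, str]:
    """Randomly assign one unit per block to 'treatment' and the rest to 'control'."""
    rng = random.Random(seed)
    assignment = {}
    for block, units in blocks.items():
        shuffled = list(units)
        rng.shuffle(shuffled)
        # the first unit in the shuffled order is treated; the others serve as controls
        for i, unit in enumerate(shuffled):
            assignment[unit] = "treatment" if i == 0 else "control"
    return assignment

if __name__ == "__main__":
    sections_by_teacher = {
        "Ms. Jones": ["9th grade English, period 2", "9th grade English, period 6"],
        "Ms. Smith": ["9th grade English, period 1", "9th grade English, period 4"],
    }
    for section, condition in assign_within_blocks(sections_by_teacher).items():
        print(f"{section}: {condition}")
```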
3.3 Considerations for Each Design O ption 3.3.1 Primary Research Question An swered by Each Design O ption The design of the evaluation should be driven by the primary research question of interest. In this section, we explain the research questions answered by each of the four des igns that the study team considered when selecting a design for the National Evaluation. It is hoped that this discussion will demonstrate the importance of keeping the research questions at the forefront when designing an evaluation and of recognizing tha t not all designs answer the same research questions. Three of the four design options that we considered – random assignment at the school, teacher and class levels – would have allowed us to answer the following research question: What is the impac t of p articipation in high -quality LSA-funded service -learning activities on student outcomes ? In addition, with the class -level random assignment option in which the same teacher would teach both treatment and control classrooms , we would able to separate the i mpact of service -learning from teacher characteristics, thereby testing the impact of service -learning independent of teacher quality. In other words, since the same teacher is teaching both treatment and control classrooms, teacher quality is the same acr oss conditions and is therefore ―controlled for‖ in the design. This is an attractive option when there is concern about the effect of teacher quality for an intervention, but the downside is that there may be concerns with contamination across treatment a nd control conditions. Many researchers are not convinced that a teacher can ―divide herself in two‖ and truly teach differently in the treatment and control classrooms. If there is contamination, then the treatment - control contrast is not maintained and t he rigor of the design is compromised. For the fourth design option – assigning students to teachers – the research question is different because with this design, we would not be able to separate teacher effects from service -learning effects. We could, ho wever, use post -hoc exploratory subgroup analyses to examine differences in teacher quality. The research question answered by this design is: What is the impact on student outcome s of an LSA -funded teacher (both the SL activities and a teacher who chooses SL)? Furthermore, b oth the class - and student -level random assignment designs were based on recruiti ng experienced teachers who had received Learn and Serve grant funds in the recent past and allowing them to utilize service -learning as they normally woul d in at least some of their classes . As a result, the study would be of Learn and Serve -supported service -learning, as opposed to service -learning in general or a specific model of service -learning. 3.3.2 Feasibility of Recruitment for Each Design Option Each o f the design options entails recruitment challenges, which we considered in our deliberations on the appropriate design for NELSAP. For recruitment in general, we were concerned that: Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-5 The sample be both r epresentative (i.e. , have face validity) and take ad vantage of potential cooperation at the state level; Study schools should be selected from those meeting the accepted definitions of high quality service -learning (i.e. 
, implementing ―high -quality‖ LSA programs); and To the extent possible, the school sele ction process should make use of the clustered structure of schools within school districts and states to minimize the costs of site visits and data collection activities. In Exhibit 3.2 below, we list some of the recruitment challenges we identified for r andom assignment designs at each of the four levels – school, teacher, class, and student –for a study in which we were constrained to a sample of schools and teachers who had received LSA funds and were implementing high -quality service -learning with thos e funds. Of particular concern for the feasibility of recruitment are 1) whether the eligible population is large enough to support the research design and 2) whether potential respondents are willing to participate. For the second, respondents will be mos t willing to participate if doing so minimizes disruption to their normal operations. For example, schools might hesitate to allow for random assignment of students since doing so may affect their standard scheduling procedures. Exhibit 3.2 : Feasibility Co nsiderations for the Four Design Options for a Random Assignment Evaluation of High Quality Service -Learning Design Option Feasibility A. School -level randomization School -level random assignment is most effective when a curriculum is a dded to treatment schools and the control schools can continue with business as usual. In our case, we would be asking the control schools to withhold service -learning and this would be difficult to maintain and document at a school level. Because randomization is at the sc hool level, we would have to recruit a substantial number of schools, thus raising recruitment costs and the time needed to recruit the sample. Would not need to worry about within school contamination. B. Teacher -level randomization Easier to find indiv idual teachers implementing service -learning than whole schools, Since random assignment would occur within teacher pairs, it may be challenging to find two paired teachers in the same school. Would need to worry about contamination across teachers within schools. If control and treatment teachers were in a grade -level or subject ―team‖, they may share their instructional approaches. Would need to find similar -enough pairs of teachers to control for teacher effects. C. Class -level randomization Need a sma ller sample size than school or teacher level random assignment. Requires that teachers ―give up‖ SL in their control class (es), which some teachers might not be willing to do. Controlling contamination d epends on teachers‘ abilities to maintain separate c urricula for the ir treatment and control classes. D. Student -level randomization Depends on finding similar -enough pairs of classes, and schools willing to allow for random -assignment of students. Unlike the other designs, a considerable number of schools may be reluctant or unable to allow for this approach, i.e., random assign ment of students to classrooms , because it unduly disrupts their normal processes and student scheduling. While the number of schools need ed for the study may be small enough to mak e th is feasible, some outreach to schools would be needed before be ing certain that the sample requirements could be met . 
It tends to be more feasible to randomly assign students in elementary and middle schools than in high schools since students in lowe r grades tend to follow all similar schedules, while high school schedules are likely to be based more on interests and abilities. Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-6 Additionally, we discuss the feasibility of each design for maintaining a clear contrast between treatment and control condi tions. A chief concern here is the issue of ―contamination‖, i.e. if teachers share practices between treatment and control classes. If there is contamination, then the treatment - control contrast is not maintained and the rigor of the design is compromised . 3.3.3 Power Associated with Each Design Op tion Statistical power analyses are used to determine the sample size required to detect a given effect size. While statistical power should not be determinative of the evaluation design chosen, it does significantly i mpact the cost of any evaluation (through the sample size). In general, the higher the level at which random assignment occurs, the less power a design has because additional levels of variability are included. For instance, school -level random assignment must take into account variation in student achievement associated with school, teacher, classroom, and student level factors. In contrast, randomly assigning classrooms within teacher (implicitly at the same school) controls for all school and teacher -level variance; all variation is at the classroom and student levels. Below, we provide two examples of how the study team calculated power for an evaluation of service -learning: one for a within -teacher random assignment design and one for a student -level random assignment design. Both have the same first step: selecting the target effect size – the Minimum Detectable Effect (MDE), i.e., the smallest true impact that an experiment has a chance of detecting (Bloom 1995) 5. They differ in their calculations of the implications of that MDE on required sample sizes, and associated data collection burdens and costs.Both Selecting the target effect size Since the primary outcome of the National Evaluation design wa s student academic achievement, we focused on devel oping expectations about annual grade -to-grade gains in achievement levels for the target population, 9 th-10 th grade students. Bloom, Hill, Black and Lipsey (2008) utiliz e national norming samples of seven standardized tests in reading, math, science, and social studies, to provide estimates of annual gains from kindergarten through 12 th grade in effect sizes. Those corresponding to the grades of interest for the National Evaluation are presented in Exhibit 3.3 : Exhibit 3.3 : Annual Gains in Reading, Math, S cience, and Social Studies Achievement Reported in Effect Sizes (from Bloom et al., 2008) Transition Reading Math Science Social Studies Grade 9 -10 0.19 0.25 0.19 0.19 Grade 10 -11 0.19 0.14 0.15 0.15 The target effect size in academic achievement for t he National Evaluation was set to be 0.10 (or 10% of a standard deviation of the test scores in the student population of interest), which is roughly half of the annual gain realized between 9 th and 10 th grade and roughly two -thirds of the gain between 10 th and 11 th grade. 
However, we acknowledged that this target effect size assumes that service -learning is a transformational educational strategy and that if service -learning ―helps,‖ but is not 5 Bloom, Howard S (1995) ―Minimum Detectable Effects: A simple way to report the statistical power of experimental designs,‖ in Evaluation Review 19(5): 547 -556. Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-7 ―transformational‖ (i.e., if the effect is smaller than 0.10), we will be unable to detect these ―less than transformational‖ effects. We also acknowledged that the power requirements for detecting a smaller MDE are not feasible and, further, that a smaller MDE might not constitute strongly convincing evidence to the field about the value of service -learning in promoting academic achievement. We then calculate d the sample size necessary to detect the MDE of 0.10 for each of the design options. In doing so we considered t hree sources of variat ion in the outcome of inte rest. The first source wa s the teacher -level variance , which corresponds to the variation in student achievement that lies across teachers and is associated with teacher characteristics such as the academic content area matter being taught, the grade level , and teac her quality. The second source wa s the classroom -level variance, which corresponds to the outcome variation that lies within teachers but across classrooms , i.e., the variation in student achievement that is associated with characteristics of cla ssrooms taught by the same teacher, such as the types of students in the two classrooms taught by each teacher. The third source wa s the student -level variance , which captures the variation in student achievement within a given classroom. Example Power Ca lculation: Random assignment of classrooms within teachers In the power calculations for classroom -level random assignment , we assumed that all teacher -level variance i s effectively controlled by the within -teacher design. That is, ―teacher‖ is held consta nt across the treatment and control classrooms. Therefore, the critical paramet ers for the power analyses (discussed in detail below) are: σc2 : proport ion of the outcome variance that lies within teachers but across classrooms; σs2: proportion of the outco me variance that lies across students within classrooms; Rc2 : propor tion of the classroom -level variance that can be explained by covariates such as baseline measures of the outcome and student and teacher characteristics; and Rs 2: proportion of the stud ent -level variance that is explained by covariates. Ideally, we would be able to achieve balance in the set of classrooms selected for each teacher, e.g. , classrooms that are similar in terms of grade level, academic content area , and student characterist ics (such as Advanced Placement classrooms or regular classrooms or levels of civic and academic engagement ). Having ―unbalanced‖ classrooms within teachers will increase the variation associated with classroom -to-classroom differences (i.e., classroom -lev el variance) , and detecting a specified effect size will require a larger sample if classrooms are not balanced within teachers. Since we could not know in advance about the level of balance in the recruited sample, our power analysis wa s based on the wors t case scenario of having unbalanced classro oms within all study teachers. 
For plausible values of the classroom and student -level variances (σ c2 and σ s2), we utilize d the classroom -level intra -class correlation (ICC) values reported in the literature. For example , Schochet (2008) reports ICC values around 0.15 at the middle scho ol level. It is important to note that these ICC values generally combine the outcome variance that lies across teachers ( σt2) and that lies across classrooms but within teachers (σc2) into a combined classroom -level variance ( σt2+σ c2; i.e., σ t2+σ c2 = 0.15 ). When classrooms within teachers are unbalanced, we expect σ c2 to be much larger than σ t2. Noting the lack of any parameter values reported specifically for this case in the literature, we assumed σ c2 to be four times as large as σ t2 and set σ c2 to equal 0.12. Further n ote that the ICC value of 0.15 implies that 85% of the overall outcome variance is at the student -level so we set σ s2 to 0.85. Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-8 Therefore, σ c2=0.12 and σ s2=0.85 enter into the power calculation while σ t2 is set to zero since we assume d that the random assignment of classrooms within teachers explain ed all of the teacher -level variance. When selecting plausible values of the proportion of classroom and student -level variances explained by covariates (R c2 and R s2), we chose not to rely on value s reported in the literature becaus e of a special situation we would encounter. Note that baseline measure of the outcome of interest (also called ―pre -test‖) is widely accepted as the most important covariate since it explains much of the variance in the outcome measure (Raudenbush, 1997; Raudenbush & Lui, 2000) . Further note that in most s tates, our outcome measures would come from standardized tests administered specifically for this study while ba seline achievem ent measures would come from state tests, creating a ―mismatch‖ between the outcome and baseline measure. 6 Therefore, we expect ed to have lower values for R c2 and Rs2 than those reported in the literature (see Bloom, Richburg -Hayes, & Black, 2005; Schochet , 2008; and Hedges and Hedberg, 2008 for examples), which are generally obtained from different administrations of the same test. 7 To our knowledge, there we re not any R 2 estimates readily available from previous studies addressing this issue. Therefore, w e reanalyzed data from three studies that employed outcome and baseline measures from different tests. These analyses yielded, as expected, lower R 2 estimates than what is repo rted in the literature. We chose the median of the three R c2 and Rs2 estimates, setting R c2 to 0.54 and R s2 to 0.25. Power analyses based on these assumptions and parameter values suggest that we would need 139 teachers (278 classrooms) to detect the effect size of 0.10. Based on our past experience implementing similar designs, we e xpected teacher attrition of 25% between recruitment and final outcomes. To account for 25% of teachers dropping out of the study, we estimated that the National Evaluation would need to recruit 185 teachers to have a final sample size of 139 and the conse quent MDE of .10. If the assumptions end ed up being too conservative, e.g., if we could identify and randomly assign ba lanced classrooms for most teachers or there was less attrition , our MDEs would be lower. As mentioned above, we would strive to select at least two classrooms per teacher . 
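The calculation above can be approximated with the standard minimum detectable effect formula for a balanced cluster-randomized design (Bloom, 1995). The sketch below uses the parameter values stated in the text and assumes roughly 20 students with valid outcomes per classroom, a value not given explicitly for this example but consistent with the class sizes used in the next example; the study team's own computations may have used different multipliers or software, so the result is approximate.

```python
# Approximate reproduction of the classroom-level power calculation above,
# using the standard MDE formula for a balanced two-level randomized design.
# This is a sketch with the parameter values stated in the text.
from math import sqrt
from scipy.stats import norm

def mde_classrooms_within_teachers(
    n_teachers: int,
    classes_per_teacher: int = 2,
    students_per_class: int = 20,   # assumed number of students with valid outcomes
    var_class: float = 0.12,        # classroom-level variance (unbalanced worst case)
    var_student: float = 0.85,      # student-level variance
    r2_class: float = 0.54,         # classroom-level variance explained by covariates
    r2_student: float = 0.25,       # student-level variance explained by covariates
    alpha: float = 0.05,
    power: float = 0.80,
) -> float:
    """Minimum detectable effect size, in standard deviation units."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)   # roughly 2.80
    n_classes = n_teachers * classes_per_teacher             # half treatment, half control
    var_impact = (4.0 / n_classes) * (
        var_class * (1 - r2_class)
        + var_student * (1 - r2_student) / students_per_class
    )
    return multiplier * sqrt(var_impact)

print(round(mde_classrooms_within_teachers(139), 3))
```

With 139 teachers (278 classrooms) the function returns approximately 0.099, in line with the target MDE of 0.10; fewer teachers would yield a larger MDE.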
The goal would be to select two classrooms that maximize balance in terms of grade, academic content area , and entering stu dent ability. One classroom would be randomly assigned to treatment ( service -learning ) and one t o control (no service -learning ). Finally, it is important to note that number of teachers per school does not have any effect on the sample size requirements since we assume d that random assignment of classrooms within teachers would explain teacher and hi gher -level ( school, district, state, etc .) va riance in the outcome measures. 6 Based on state proficiency test data, w e estimated that ther e would be a mismatch be tween baseline (8 th grade state proficiency test) and post -test (study -administered norm -referenced test) for approximately half of the sample. However, because we would not know the exact perc entage of mismatch until after recruit ment , we utilize d the mos t conservative estimate (100% mismatch) in our power calculations . 7 Different tests, even in the same content area, are often designed to capture different concepts; therefore using outcome and baseline measures from different tests is expected to yield lower R 2 values than using outcome and baseline measures from different administrations of the same test. Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-9 Example Power Calculation: Random assignment of students This second example calculates the number of schools that would be needed to detect an effect size of 0.1 in the new desig n that entails (i) matching teacher who use SL with teachers who teach the same subject area and class type in the same school but do not use SL and (ii) randomly assigning students to the matched teachers. In these analyses, we have employed the followin g parameters: 1) Two -sided hypothesis test with the significance level = 0.05. 2) Statistical power = 0.80. 3) Number of teachers per school : We consider three scenarios: (i) 1 service -learning and 1 control teacher (total: 2 teachers), (ii)1 service -learning teach er and 2 control teachers (total: 3 teachers), and (iii) 1 service -learning teacher and 3 control teachers (total: 4 teachers) 4) Number of classrooms per teacher = We consider two cases: one classroom per teacher and two classrooms per teacher 5) Number of stud ents in a classroom with a valid outcome = 20, assuming an average class size of 25 and a non -response rate of 20%. 6) Balanced allocation of units to treatment and control. 7) Target minimum detectable effect size = 0.1 standard deviation. 8) Cluster - and student -level variances : In educational settings, we usually consider the potential clustering of students at three levels: school, teacher, and classroom; therefore, the outcome variance that lies in each of these levels enter into the power calculations. In this design, school -level variance will be controlled for selecting the treatment and control teachers within a given school and classroom -level variance will be explained by the random assignment of students to classrooms/teachers. Therefore, our analysis nee ds to account for only the teacher -level variance. 
In order to find an estimate of this parameter, we start with a school-level intraclass correlation (ICC) of 0.15, which implies that 15% of the outcome variance lies at the school, teacher, and classroom levels while the remaining 85% lies at the student level (this is in line with values reported by Hedges and Hedberg, 2007, and Schochet, 2008). We further assume that half of this total cluster-level variance will be at the teacher level (Kane and Staiger, 2000; Nye, Konstantopoulos, and Hedges, 2004), as teachers in the treatment and control groups are expected to be systematically different (the effect of matching on this parameter is considered subsequently). Therefore, we conduct a set of analyses for a teacher-level variance estimate of 0.075 and a student-level variance estimate of 0.85. This implies that we expect 7.5% of the outcome variance, the portion that lies at the school and classroom levels, to be explained by blocking on schools and the random assignment of students, while 7.5% of the outcome variance lies at the teacher level and the remaining 85% lies at the student level. Using data from Florida and North Carolina, a recent study by Xu and Nichols (2010) reported larger intraclass correlation values than were found by earlier studies at the high school level. Taking this finding into account, we also conducted a second set of analyses for a larger school-level ICC of 0.24, which implies a teacher-level variance of 0.12 and a student-level variance of 0.76. We refer to this ICC value as the "conservative ICC," while the former ICC value is referred to as the "lenient ICC."

Proportion of the teacher- and student-level variance explained by covariates (e.g., pre-test), R2: Because we are not matching on teacher characteristics, we consider only how much of the student-level variance could be explained by student-level characteristics such as baseline measures of outcomes and demographics. Taking into account the possibility of a mismatch between the pre-test and post-test measures of outcomes, we set the R2 value at the student level to 0.25.

The results of these analyses, presented in Exhibit 3.4 below, show the number of schools needed to detect an MDE of 0.1. As mentioned above, we consider several scenarios that correspond to different combinations of the two ICC values (lenient and conservative), three values for the number of teachers per school (two, three, and four), and two values for the number of classrooms per teacher (one and two). For example, the estimates in the first column of Exhibit 3.4 suggest that in the most conservative case of two teachers per school (one SL and one non-SL) and one classroom per teacher, we would need 170 schools (340 teachers) under the lenient ICC and 235 schools (470 teachers) under the more conservative ICC. However, if we are able to identify two comparison teachers for each service-learning teacher within a school, we would need only between 127 schools (lenient ICC) and 176 schools (conservative ICC). A rough calculation that approximately reproduces these school counts is sketched below, followed by the full set of estimates in Exhibit 3.4.
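The sketch below treats each school as a block containing one service-learning teacher and one or more matched comparison teachers, uses the variance components stated above, and applies the covariate adjustment only at the student level; small differences from Exhibit 3.4 are expected because of rounding and the exact multipliers used in the original calculations.

```python
# Approximate sketch of the school-count calculation for the student-level
# design. Teacher-level variance is not reduced by covariates; only the
# student level has an R-squared (0.25), as described in the text.
from math import ceil
from scipy.stats import norm

def schools_needed(
    var_teacher: float,            # 0.075 (lenient ICC) or 0.12 (conservative ICC)
    var_student: float,            # 0.85 (lenient) or 0.76 (conservative)
    control_teachers: int = 1,     # control teachers matched to the one SL teacher per school
    classes_per_teacher: int = 1,
    students_per_class: int = 20,  # 25 enrolled, less 20% nonresponse
    r2_student: float = 0.25,
    mde: float = 0.10,
    alpha: float = 0.05,
    power: float = 0.80,
) -> int:
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)   # roughly 2.80
    n_treat = classes_per_teacher * students_per_class       # students under the SL teacher
    n_ctrl = control_teachers * n_treat                      # students under control teachers
    # variance of the treatment-control difference within one school (block)
    var_diff = var_teacher * (1 + 1 / control_teachers) \
        + var_student * (1 - r2_student) * (1 / n_treat + 1 / n_ctrl)
    return ceil((multiplier / mde) ** 2 * var_diff)

print(schools_needed(0.075, 0.85))                       # about 168 vs. 170 in Exhibit 3.4
print(schools_needed(0.12, 0.76))                        # about 234 vs. 235
print(schools_needed(0.075, 0.85, control_teachers=2))   # about 126 vs. 127
```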
Exhibit 3 .4: Number of Schools Needed to Detect an MDE of 0.1 Lenient ICC Conservative ICC Number of Teachers per School (1 SL and 1, 2, or 3 Ctrl Teachers) 2 3 4 2 3 4 Number of Classrooms Per Teacher 1 2 1 2 1 2 1 2 1 2 1 2 Number of Schools 170 145 127 108 112 96 235 213 176 159 156 141 3.4 General Considerations for All Design O ptions 3.4.1 Align ing Data Sources, Analytic App roach and Outcomes with Research Qu estions As can be seen in Exhibit 3.5 below, the data sources, analytic approach and outcomes of interest were a ligned with the evaluation questions for the National Evaluation. We include this sample exhibit to demonstrate the alignment between these research design elements and to encourage all researchers to adopt such a tool, which helps in ensuring a consistent ly aligned evaluation through all phases of evaluation. Developing such an exhibit in the early stages also helps all stakeholders to clarify their goals and to facilitate a common understanding among stakeholders. Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-11 Exhibit 3.5 : Evaluation Questions, Data Sources, Analytic Approach, and Outcomes of Interest Primary Impact Questions Data Sources Analytic Approach Outcomes of Interest Short -Term Impacts What is the impact of participation in Learn and Serve America -funded service - learning on 9th and 10 th grade students‘ academic achievement in the service - learning core content area at the end of the class ? Content area tests at end of year (state proficiency tests or standardized tests administered by study team) School records data Impact analysis (HL M regression) Sample size: 5660 students nested in 278 classes Content area test scores Course completion Grade completion Expected c redit accrual What is the impact of participation in Learn and Serve America -funded service - learning activities in a cor e content area on 9 th and 10 th grade students‘ class - and school -level academic engagement at the end of the class ? Student baseline and post -program surveys (self -report scale) School records data Impact analysis (HLM regression) Sample size: 5560 students nested in 278 classes Global academic engagement rating Attendance/truancy Disciplinary actions What is the impact of participation in Learn and Serve America -funded service - learning activities in a core content area on 9 th and 10 th grade studen ts‘ civic engagement at the end of the class ? Student baseline and post -program surveys (self -report scale) Impact analysis (HLM regression) Sample size: 5560 students nested in 278 classes Global civic engagement rating Medium -Term Impacts What is the impact of participation in Learn and Serve America -funded service - learning activities in a core content area on 9th and 10 th grade students‘ overall academic achievement one year after the end of the class ? Student baseline, post -program and follow -up surveys (self -report scale) School records data Impact analysis (HLM regression) Sample size: 5660 students nested in 278 classes Composite state test scores on ELA, Math (and Science and Social Studies, as available) Course completion Grade completi on Expected c redit accrua l What is the impact of participation i n Learn and Serve America -funded service - learning activities in a core content area on 9 th and 10 th grade students‘ academic engagement one year after the end of the class ? 
Student baseline, post -program and follow -up surveys (self -report scale) School records data Impact analysis (HLM regression) Sample size: 5660 students nested in 278 classes Global academic engagement rating Attendance/truancy Disciplinary actions Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-12 Primary Impact Q uestions Data Sources Analytic Approach Outcomes of Interest What is the impact of participation in Learn and Serve America -funded service - learning activities in a core content area on 9 th and 10 th grade students‘ civic engagement one year after the end o f the class ? Student baseline, post -program and follow -up surveys (self -report scaleF Impact analysis (HLM regression) Sample size: 5660 students nested in 278 classes Global civic engagement rating Secondary Impact Questions Data Sources Analytic App roach Possible M oderators (for discussion) Variation in Student Impacts (Subgroups) Exploratory/Descriptive Questions and Secondary Impact Questions Data Sources Analytic Approach Characteristics of I nterest (Outcomes or Possible Moderators ) Student Outcomes What is the impact of participation in Learn and Serve America -funded service - learning activities in a core content area on 9th and 10th grade students‘ 21st century skills at the end of the class? Student baseline, post —program and follow -up surveys (self report scales) Impact analysis (HLM regression) Sample size: 5660 students nested in 278 classes Global 2N st century skills rating What is the impact of participation in Learn and Serve America -funded service - learning activities in a core content area on predictors of dropout at the end of the class? School records data Impact analysis (HLM regression) Sample size: 7400 students nested in 370 classes Failure in core courses (ELA, Math) Absenteeism Grade retention Disciplinary referrals Vari ation in Student Impacts Is the impact of participation in Learn and Serve America - funded service -learning activities in a core content area on 9 th and 10 th grade students different for groups of students as defined by their baseline characteristics? a School records data Student baseline survey Impact analysis (HLM regression) Sub -sample sizes vary Gender Race/ethnicity English Language Learner (ELL) Prior achievement Prior volunteering experience Prior SL experience Volunteering with family Is the impact of participation in Learn and Serve America - funded service -learning activities in a core content area on 9 th and 10 th grade students different for students depending on the baseline characteristics of their teachers? a Teacher Information Form Impa ct analysis (HLM regression) Sub -sample sizes vary Years implementing SL Content area certification Participation in SL professional development Service -Learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evalu ating Service -Learning Programs ▌pg. 3-13 Secondary Impact Questions Data Sources Analytic Approach Possible M oderators (for discussion) Is the impact of participation in Learn and Serve America - funded service -learning activities in a core content area on 9th and 10th grade students different for students depending on the baseline characteristics of their schools? 
• Data sources: Common Core of Data (CCD) (b)
• Analytic approach: Impact analysis (HLM regression); sub-sample sizes vary
• Possible moderators: Persistently dangerous designation; AYP improvement status; percent low-SES; percent minority

Service-learning Implementation

Did the service-learning and control classrooms differ in terms of the presence of key characteristics of service-learning?
• Data sources: Teacher interviews and logs
• Analytic approach: Impact analysis (OLS regression); sample size: 278 classrooms
• Characteristics of interest: Presence of the five core components of service-learning (investigation, planning, action, reflection, demonstration, or IPARD); differences in instructional environments

Did the service-learning classrooms in the study represent high-quality service-learning?
• Data sources: Teacher interviews and logs
• Analytic approach: Descriptive analysis; sample size: 139 SL classrooms
• Characteristics of interest: National Youth Leadership Council (NYLC) quality standards and related indicators

What is the relationship between service-learning quality and student outcomes at the end of the class?
• Data sources: Teacher interviews and logs; student baseline and post-program surveys (self-report scale); school records data; content area tests at end of year (state proficiency tests or study-administered, norm-referenced tests)
• Analytic approach: Relational analyses; sample size: 5,560 students nested in 278 classes
• Characteristics of interest: NYLC quality standards and related indicators; global civic engagement rating; global academic engagement rating; attendance/truancy; disciplinary actions; credit accrual; grade retention; content area test scores

(a) Subgroup analyses based on student, teacher, and school characteristics are not considered part of the main impact analyses, but are instead considered exploratory.
(b) School characteristics will be obtained from the Common Core of Data, available from the U.S. Department of Education's Institute of Education Sciences, National Center for Education Statistics (http://nces.ed.gov/ccd/).

3.4.2 Developing a Multiple Comparison Strategy

Accounting for multiple comparisons (or multiple hypothesis testing) in evaluations designed to produce multiple impact estimates is critical because, as the number of comparisons or hypothesis tests increases, the probability of making a Type I error (finding an effect when in fact there is none) also increases. For example, suppose that we conduct 5 independent hypothesis tests at the usual p = 0.05 significance level and the null hypothesis is true for all 5 tests; that is, there are no true differences between the conditions contrasted. In this case, the probability of finding at least one significant test result (rejecting the null hypothesis in at least one of the tests) is 23 percent. If the number of tests increases to 20, the likelihood of finding at least one significant result rises to 64 percent. As this simple example indicates, multiple hypothesis testing is an important analytic issue that has to be accounted for in every evaluation.

In the past few years, there have been major advances in education research regarding the multiple comparisons problem. A technical report published by the Institute of Education Sciences provides a set of conceptual and practical guidelines for addressing this issue (Schochet, 2008). In the rest of this section, we present a concise summary of these guidelines.
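The familywise error rates cited above follow directly from the expression 1 - (1 - alpha)^k for k independent tests conducted at significance level alpha when all null hypotheses are true. The short sketch below, written in Python purely for illustration, reproduces the 23 percent and 64 percent figures under those assumptions.

    # Familywise error rate for k independent hypothesis tests at level alpha,
    # assuming all null hypotheses are true:
    #   P(at least one false positive) = 1 - (1 - alpha)^k
    alpha = 0.05
    for k in (5, 20):
        fwer = 1 - (1 - alpha) ** k
        print(f"{k} tests: chance of at least one false positive = {fwer:.0%}")
    # 5 tests  -> roughly 23%
    # 20 tests -> roughly 64%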
Guidelines to Address Multiple Comparisons

Schochet (2008) recommends addressing the multiple comparisons issue using a theoretically and empirically driven framework, the elements of which are ideally specified before conducting any analyses. Schochet describes these elements as follows:

Group outcomes into conceptual domains: Outcomes of interest should be grouped into domains such that each domain corresponds to a global construct addressing a research question. The creation of outcome domains should be driven by the theory of change that connects the intervention to outcomes. Each domain could include measures that are highly correlated or hypothesized to reflect the same latent construct.

Specify confirmatory and exploratory analyses: All hypothesis tests should be classified as either confirmatory or exploratory. Confirmatory analyses are conducted to test the central research questions and hypotheses of the study. They include main impact analyses and analyses of meaningful, pre-determined subgroups that have been found to demonstrate impacts in prior research. A confirmatory analysis should have sufficient statistical power to detect reasonable effects. It is also important to note that some confirmatory analyses pertain to domain-specific research questions while others are conducted to answer research questions that span multiple domains; evaluators should develop separate strategies for these two types of confirmatory analyses. Exploratory analyses, on the other hand, are conducted to generate hypotheses to be tested more rigorously in future studies. They are not necessarily driven by the theory of change and can be conducted to identify subgroups or post-hoc outcomes that demonstrate impacts. Whether a particular analysis is deemed confirmatory or exploratory has important implications when reporting results. In particular, only confirmatory analyses should be considered when assessing the overall effectiveness of the intervention; hence, executive summaries or report abstracts should include only results from confirmatory analyses. Results from exploratory analyses should be classified as preliminary and presented in a separate report chapter that includes only similar analyses.

Develop a strategy to address multiple comparisons for confirmatory analyses of outcomes within a domain: There are essentially three approaches one could take to control for multiple comparisons within a domain. The first two approaches described below reduce the need to conduct multiple hypothesis tests by using a single hypothesis test.

1. The first approach entails creating a domain-specific composite construct by combining the multiple outcome measures in that domain (see the first sketch following this list). Various weighting schemes can be employed to create this construct, including natural or unit weights; expert judgment or subjective weights; maximum reliability weights; equal correlation weights; and factor analysis weights (see Appendix C in Schochet 2008 for more details on these options). The domain construct is then used as the single outcome to estimate the impact of the program or intervention on the whole domain. Impacts on the separate outcomes in a domain can be tested only in post-hoc exploratory analyses, and only if the impact estimate for the construct is significant. Such exploratory analyses are not subject to any multiple comparisons adjustments.
2. The second approach entails conducting a joint F-test on the individual impact estimates for the outcomes in a domain (see the second sketch following this list). If the result of this test is statistically significant, the intervention is deemed to have an impact on that domain, and one could then examine the individual impact estimates without any adjustments to help interpret the significant effect on that domain. If the F-test is not significant, examining the individual outcomes is not recommended. This approach is also known as Fisher's least significant difference and can be carried out using seemingly unrelated regression (SUR) models or multivariate analysis of variance (MANOVA) techniques.

Although both of these approaches reduce the need to conduct multiple hypothesis tests, they can lead to different conclusions, as explained by Schochet (2008). Suppose that a domain includes five individual outcomes, one of which exhibits a large treatment-control difference by chance while the treatment-control differences for the other four outcomes are negligible. In this case, the joint F-test conducted on the five individual impact estimates could yield a statistically significant result, driven by the outcome with the large treatment-control difference. This leads to the erroneous conclusion that the intervention has had a significant impact on that domain. This would not necessarily be the finding from the first approach if the weights used to create the single composite were determined by conceptual or theoretical reasoning. The second approach would be analogous to the first approach if the weights were determined by the statistical significance of the individual impact estimates, maximizing the likelihood of finding a significant effect.

3. As an alternative to searching for a global impact, a third approach entails keeping the individual outcomes in a domain separate and testing them separately, which yields as many impact estimates as there are domain outcomes. In this case, a multiplicity adjustment has to be applied. Various methods can be employed for this adjustment, such as the Bonferroni or Benjamini-Hochberg corrections (see Appendix B in Schochet 2008 for details on these and other multiplicity adjustment methods). Among these methods, we propose using the Benjamini-Hochberg correction, which has been adopted by the What Works Clearinghouse as the primary method for adjusting for multiple comparisons (see the third sketch following this list). As an example of the ramifications of applying this correction, if five hypothesis tests are being conducted on five outcomes within a domain, the critical significance level for the impact estimate with the smallest unadjusted p-value (i.e., the most significant impact estimate) is set to 0.05/5 = 0.01. This means that this outcome would have to achieve a significance level of 0.01 to be considered significant. Similarly, the critical alpha values for the 2nd, 3rd, 4th, and 5th most significant impact estimates are set to 0.02, 0.03, 0.04, and 0.05, respectively.

Schochet (2008) argues that the first two methods are preferable to the third approach, since the latter suffers from a loss of statistical power that can be severe when the number of hypothesis tests conducted is large. Between the first two approaches, Schochet (2008) recommends the first as the most suitable, since it is less sensitive to outlier impact estimates.
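To make the first approach concrete, the sketch below forms a unit-weighted, domain-specific composite by standardizing each outcome and averaging the resulting z-scores. It is a minimal illustration in Python using pandas; the outcome names are hypothetical rather than the evaluation's actual measures, and any of the alternative weighting schemes noted above would simply replace the unweighted mean.

    import pandas as pd

    def domain_composite(df: pd.DataFrame, outcome_cols: list) -> pd.Series:
        # Unit-weighted composite for one outcome domain: each outcome is
        # standardized (z-scored) and the composite is the mean of the
        # available z-scores for each student.
        z = (df[outcome_cols] - df[outcome_cols].mean()) / df[outcome_cols].std()
        return z.mean(axis=1)

    # Hypothetical use: three academic-engagement measures per student.
    # students["engagement_composite"] = domain_composite(
    #     students, ["valuing_school", "content_interest", "postsec_aspirations"])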
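The second approach, a joint test across all outcomes in a domain, can be approximated with a multivariate test of the treatment indicator. The sketch below assumes the MANOVA interface in statsmodels and hypothetical column names, and it ignores the nesting of students within classrooms and schools, so it should be read as a simplified illustration of the idea rather than the analysis an evaluation would actually report.

    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    def joint_domain_test(df: pd.DataFrame):
        # Joint (multivariate) test of the treatment effect on all outcomes in
        # one domain; individual estimates are examined only if this test is
        # significant, following the Fisher least-significant-difference logic.
        mv = MANOVA.from_formula("out1 + out2 + out3 ~ treatment", data=df)
        return mv.mv_test()

    # Hypothetical use, with one row per student:
    # df = pd.DataFrame({"treatment": ..., "out1": ..., "out2": ..., "out3": ...})
    # print(joint_domain_test(df))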
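The third approach keeps the outcomes separate and applies a multiplicity adjustment such as the Benjamini-Hochberg step-up procedure described above. The sketch below is a bare-bones implementation with invented p-values shown only for illustration; in practice an existing routine (for example, statsmodels' multipletests with the fdr_bh method) would typically be used, but the logic is the same.

    import numpy as np

    def benjamini_hochberg(p_values, alpha=0.05):
        # Benjamini-Hochberg step-up procedure. With five tests at alpha = 0.05,
        # the ranked p-values are compared to 0.01, 0.02, 0.03, 0.04, and 0.05,
        # as in the example above. All hypotheses up to the largest ranked
        # p-value that falls at or below its critical value are rejected.
        p = np.asarray(p_values, dtype=float)
        m = p.size
        order = np.argsort(p)
        critical = alpha * np.arange(1, m + 1) / m
        below = p[order] <= critical
        reject = np.zeros(m, dtype=bool)
        if below.any():
            last = np.max(np.where(below)[0])   # largest rank meeting its threshold
            reject[order[: last + 1]] = True
        return reject

    # Illustrative p-values only (not study results):
    # benjamini_hochberg([0.004, 0.018, 0.030, 0.041, 0.200])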
Develop a strategy to address multiple comparisons for confirmatory analyses across multiple domains: Many studies are designed to address confirmatory analyses that represent multiple domains. These analyses may be conducted to address two different types of research questions regarding the overall impact of the intervention. In the first type, the test of whether an intervention is effective is based on evidence that it has an impact on each of the domains. In this case, there is no need to apply an adjustment, since the intervention would be deemed ineffective unless the impact estimate for every domain were statistically significant. In the second type, the test of whether the intervention is effective is based on evidence that the intervention has an impact on any of the domains. In this case, the multiple testing across domains needs to be controlled for, and any of the three approaches described above could be employed for this purpose. That is, for each domain, an overall composite would be constructed. Then, in the first approach, a higher-order global construct could be created across the multiple domain-specific constructs and used to estimate the overall impact of the intervention. In the second approach, a joint F-test could be employed to qualify the individual impact estimates on the domain-specific constructs. Finally, in the third approach, the individual composites could be tested individually, which would require that a multiplicity adjustment be applied. Schochet (2008) argues that the third approach is more appropriate when testing across domains, since the first two approaches rest on the principle that the domain-specific constructs represent a higher-level global latent construct, which could be less plausible because the domain-specific constructs are likely to represent different latent constructs (otherwise they would have been placed in a single domain).

Multiple hypothesis testing can be ignored for exploratory analyses: Exploratory analyses are considered preliminary; hence they are subject to less scrutiny and do not require addressing the multiple comparisons issue.

Each subgroup analysis should be classified as confirmatory or exploratory: Subgroup analyses are also subject to the multiple comparisons issue, since they add to the number of hypothesis tests conducted. Subgroup analyses are generally considered exploratory for two compelling reasons. First, evaluations are generally designed to have sufficient statistical power to detect an overall impact; because subgroup analyses are conducted with only a portion of the full sample, they tend to have less statistical power. Second, subgroup analyses that are specified as confirmatory should rely on a priori theoretical reasoning and/or empirical evidence suggesting differential impacts for the subgroups being considered. The scarcity of such evidence supports classifying subgroup analyses as exploratory.

In summary, based on the current standards of practice in the field for accounting for multiple comparisons, we recommend the following procedures for an evaluation of service-learning: (a) group the outcomes into a small number of "domain families" based on an a priori understanding of their underlying relationships; (b) for each domain, create an overall construct that combines the
individual outcomes in that domain; and (c) test for impacts on the overall constructs, adjusting for multiple comparisons across domain constructs where necessary. For example, the National Evaluation (a) prioritized three key outcomes (academic achievement, academic engagement, and civic engagement) as primary, confirmatory outcomes; (b) created an overall construct from the individual outcomes for each domain; and (c) was designed to first test for impacts on the overall constructs and adjust for multiple comparisons across those domain constructs.

3.4.3 Methods to Maximize Response Rates and Deal with Issues of Nonresponse

In this section, we describe the strategies and methods that we planned to use to maximize response rates and deal with nonresponse during the recruitment phase of the evaluation, which would have included recruiting additional states, school districts, high schools, and teachers and determining the eligibility of interested teachers through completion and review of the Teacher Information Form. Based on our extensive experience conducting large-scale evaluations (e.g., the Reading First Impact Study, the Evaluation of the U.S. Department of Education's Student Mentoring Program), we have found the following strategies successful in facilitating communication with district and school respondents during recruitment activities and in maximizing response rates for telephone calls, on-site meetings, and the completion of study forms:

• use senior-level staff for recruitment and refusal conversion;
• provide a sufficient timeframe for recruitment activities (i.e., over the course of several months) so that the busy schedules of district and school administrators and teachers can be accommodated;
• provide sufficient information about the study design, objectives, and methodology so that potential participants have an informed basis for their decision to participate;
• provide potential participants with a realistic appraisal of the contributions in time, information, space, and human resources they will be expected to invest in the study, along with a statement of anticipated benefits (including honoraria and incentives);
• maintain regular contact among study team members to monitor response rates, identify non-respondents, and resolve problems quickly;
• use follow-up and reminder calls and e-mails to district and school staff who have not responded to outreach efforts or who have not returned study forms;
• hire a district or school staff member to serve as an on-site study liaison at each school, responsible for overseeing data collection activities;
• demonstrate knowledge and understanding of service-learning and sensitivity to the issues facing district and school administrators and high school teachers as they carry out their day-to-day activities; and
• obtain the endorsement and support of state agencies (e.g., the State Education Agency) and other professional associations for the objectives of the study.

These strategies have proven to foster honest and collaborative relationships between the research team and study participants, which in turn lead to high participation rates in telephone conversations and on-site meetings and high response rates on study surveys and forms.

References for Section Three

Bloom, H. S., Richburg-Hayes, L., and Black, A. R. (2005).
Using Covariates to Improve Precision: Empirical Guidance for Studies that Randomize Schools to Measure the Impacts of Educational Interventions. MDRC Working Paper.

Bloom, H. S., Hill, C. J., Black, A. R., and Lipsey, M. W. (2008). Performance Trajectories and Performance Gaps as Achievement Effect-Size Benchmarks for Educational Interventions. MDRC Working Paper. New York, NY: MDRC.

Billig, S. H. (2009). Does quality really matter? Testing the new K-12 service-learning standards for quality practice. In B. E. Moely, S. H. Billig, & B. A. Holland (Eds.), Advances in service-learning research: Vol. 9. Creating our identities in service-learning and community engagement (pp. 131-157). Charlotte, NC: Information Age.

Hedges, L. V., & Hedberg, E. C. (2007). Intraclass correlation values for planning group randomized trials in education. Educational Evaluation and Policy Analysis, 29, 60-87.

National Youth Leadership Council. (2008). K-12 Service-Learning Standards for Quality Practice. Retrieved June 4, 2010 from http://www.nylc.org/objects/publications/StandardsDoc.pdf

Northup, J. (2010). Evaluation of the Oregon Learn and Serve Program. Denver, CO: RMC Research Corporation.

Raudenbush, S. W. (1997). Statistical analysis and optimal design for cluster randomized trials. Psychological Methods, 2(2), 173-185.

Raudenbush, S. W., & Liu, X. F. (2000). Statistical power and optimal design for multisite randomized trials. Psychological Methods, 5(2), 199-213.

RMC Research Corporation. (2009). K-12 Service-Learning Project Planning Toolkit. Scotts Valley, CA: National Service-Learning Clearinghouse.

Schochet, P. Z. (2008). Statistical Power for Random Assignment Evaluations of Education Programs. Journal of Educational and Behavioral Statistics, 33(1), 62-87.

4. Instruments and Recruitment Materials Developed for the National Evaluation of School-based Learn and Serve America Programs

4.1 Introduction

This section of the Toolkit provides sample instruments and recruitment materials that were developed by Abt Associates and its subcontractors, RMC Research Corporation and Dillon-Goodson Research Associates, for use in the National Evaluation. The teacher and student instruments were developed in consultation with experts in service-learning, based on prior research, and pilot tested with service-learning administrators, teachers, and students. As required, all instruments, recruitment materials, and consent forms were reviewed and approved by Abt Associates' Institutional Review Board (IRB).

In the introduction to this section, we provide an overview of the evaluation's design, particularly the research questions and recruitment approach, in order to provide context for the instruments and materials. The remainder of the section is devoted to the sample instruments and recruitment materials, including:

• instruments to measure teachers' implementation of service-learning;
• instruments to measure students' academic and civic engagement and their service-learning experience; and
• materials for recruiting districts, schools, teachers, and students to participate in the evaluation.

Although the teacher and student instruments were developed for this particular study, the measurement instruments may be relevant to other research on service-learning.
Any of these measurement instruments could be useful across research designs, whether descriptive, quasi-experimental, or experimental. The recruitment materials are less easily transferred to another evaluation without substantial modification, but they are presented as examples of the range of recruitment materials required for a national study involving recruitment at multiple levels. They were designed to be informative, easy to understand, and persuasive about the importance of the evaluation.

4.1.1 Overview of the NELSAP Study Design

NELSAP was designed as a random assignment evaluation of the impacts of high-quality Learn and Serve America-funded service-learning activities.8 It was designed to test the program logic model of this evaluation (see Exhibit 4.1). The primary evaluation questions addressed short- and medium-term impacts on 9th and 10th grade students' academic achievement, academic engagement, and civic engagement in core academic areas.9 Secondary impact questions were designed to investigate variation in student impacts for student subgroups and to test the effects of the experimental intervention on classroom instruction. Exploratory/descriptive questions were designed to examine the quality of service-learning that teachers implemented in the intervention classrooms, explore the relationship between service-learning quality and student outcomes, and investigate subgroups based on teacher and school characteristics.10 Finally, the design included questions about the implementation of service-learning, including whether teachers successfully refrained from using service-learning in their control classes, the level of implementation of the components of service-learning and the quality of the service-learning in the treatment classrooms, and the relationship between implementation, service-learning quality, and student outcomes.

We describe the research questions in more detail below. However, because NELSAP was designed as an evaluation of "high-quality" service-learning, we first discuss our reasons for emphasizing quality and the implications of that choice. The decision to focus on high-quality service-learning was based in part on prior quasi-experimental research indicating that an association between service-learning and student outcomes appears only when the service-learning is of high quality (e.g., Billig, 2009). Maintaining a focus on high quality was desirable because results demonstrating the impacts of high-quality service-learning might serve as an impetus for schools to adopt high-quality standards. Conversely, if this focus were not maintained, the study might risk not finding an effect because of the lower quality of the service-learning programs included in the study.

This design decision had implications for the evaluation. First, by prioritizing a subset of service-learning programs, results from the National Evaluation would be generalizable only to high-quality service-learning programs. Second, on the assumption that service-learning, as an instructional approach, develops with time and experience, the National Evaluation sample would include only teachers who had prior experience implementing (high-quality) service-learning; no teachers new to service-learning would be included. This also meant that the research design was atypical for a random assignment evaluation: the treatment condition would be business as usual and the control condition would entail foregoing service-learning.11

Footnote 8: CNCS's Learn and Serve America Program encouraged civic participation and volunteerism throughout the country by supporting service-learning programs that helped more than one million young people each year meet community needs while improving their academic skills and learning the habits of good citizenship. For more than a decade, Learn and Serve America funds supported service-learning activities, distributing approximately $38 million in grants annually that reached approximately 1,800 schools, higher education institutions, and community-based organizations nationwide. The largest portion of Learn and Serve America funds (60 percent) was designated for K-12 school-based service-learning. All 50 states, the District of Columbia, and the territory of Puerto Rico were eligible for these funds, which were allocated according to a formula based on school-age population and Title I allotment. State Education Agencies (the grantees) then provided sub-grants to school districts, regional entities, and schools to implement service-learning activities.
Footnote 9: In schools that offer service-learning activities, 52 percent of social studies classrooms have service-learning as part of their curriculum, followed by 42 percent of science classrooms, 34 percent of English language arts classrooms, and 15 percent of math classrooms (CNCS, 2008).

Footnote 10: These outcomes were considered exploratory either because the measures themselves had not been tested before, because there was scant research evidence on the question, or because the study was not powered to reliably detect the hypothesized effect sizes.

Footnote 11: In more typical evaluations, the intervention is "added" to the treatment group while the control condition is business as usual.

Exhibit 4.1: Logic Model

Inputs
• LSA funding: state-offered professional development in service-learning; grants to districts and schools (for materials, transportation, teacher stipends)
• Community partner supports: willingness to provide meaningful service opportunities for students
• District and/or school supports: resources and time for implementing SL activities
• Other funding sources for SL: resources from private foundations and/or individuals

Activities
• Teachers: develop curriculum that integrates content standards and service activities; clearly articulate both academic and service goals; help students learn how to transfer knowledge and skills from one setting (service or academic) to the other
• Students: deliberate to choose a social issue; research the issue; collectively plan activities linking service and academic content; perform hands-on service; reflect on service and its relationship to broader social issues; demonstrate/celebrate what they have learned

Outputs: Student Outcomes (Impacts)
• Short-term: academic achievement (achievement in SL content area, course completion, attendance during course term); academic engagement (valuing school/SL class, interest in core content area, postsecondary aspirations); civic engagement (civic responsibility, civic efficacy, involvement with community); 21st century skills (problem solving, teamwork); predictors of dropout (failure in core courses (ELA, math), absenteeism, grade retention, disciplinary referrals)
• Medium-term: academic achievement (overall academic achievement, grade completion, attendance); academic engagement (valuing school, interest in core content area, postsecondary aspirations); civic engagement (civic responsibility, civic efficacy, involvement with community); 21st century skills (problem solving, teamwork); predictors of dropout (failure in core courses (ELA, math), absenteeism, grade retention, disciplinary referrals)
• Long-term: academic achievement (high school completion, college attendance, college graduation rate, other post-secondary learning opportunities such as the military, AmeriCorps, trade school, and apprenticeship programs); civic engagement (ethic of service, political participation)

External factors: student and teacher background characteristics; service-learning characteristics and quality; school background characteristics and civic culture; overall classroom pedagogy
Confirmatory Evaluation Questions: Overall Student Impacts12

The first three NELSAP research questions reflect hypotheses about short-term impacts on three outcomes for 9th and 10th grade students, measured at the end of their service-learning class:

• What is the impact of participation in Learn and Serve America-funded service-learning activities on 9th and 10th grade students' academic achievement in the service-learning core content area at the end of the class?
• What is the impact of participation in Learn and Serve America-funded service-learning activities in a core content area on 9th and 10th grade students' class- and school-level academic engagement at the end of the class?
• What is the impact of participation in Learn and Serve America-funded service-learning activities in a core content area on 9th and 10th grade students' civic engagement at the end of the class?

In addition, the study was designed to measure and test impacts on these three confirmatory outcomes again at the one-year follow-up, contingent upon finding statistically significant effects at post-program.

Exploratory/Descriptive Questions and Secondary Impact Questions

Student outcomes

• What is the impact of participation in Learn and Serve America-funded service-learning activities in a core content area on 9th and 10th grade students' 21st century skills at the end of the class?
• What is the impact of participation in Learn and Serve America-funded service-learning activities in a core content area on predictors of dropout at the end of the class?

Variation in Student Impacts

• Is the impact of participation in Learn and Serve America-funded service-learning activities in a core content area on 9th and 10th grade students different for different groups of students as defined by their baseline characteristics?
• Is the impact of participation in Learn and Serve America-funded service-learning activities in a core content area on 9th and 10th grade students different for students depending on the baseline characteristics of their teachers?
• Is the impact of participation in Learn and Serve America-funded service-learning activities in a core content area on 9th and 10th grade students different for students depending on the baseline characteristics of their schools?

Footnote 12: These questions were considered central because they were of particular interest to CNCS and because prior quasi-experimental studies suggested that participation in service-learning was associated with the corresponding outcomes. The question on the impacts on students' academic achievement at post-program was considered primary, since this outcome was of greatest policy interest to CNCS. Secondary questions addressed the impacts of SL on students' academic and civic engagement at post-program.

Service-learning Implementation

• Did the service-learning and control classrooms differ in terms of the presence of key characteristics of service-learning?
• Did the service-learning classrooms in the study represent implementation of the five components of service-learning?
• What is the relationship between service-learning quality and student outcomes at the end of the class?

To answer these questions, the study was designed to employ within-teacher random assignment during the 2010-11 school year. The evaluation was designed to include high school teachers who received funds in the 2009-12 and 2006-09 federal Learn and Serve America funding cycles. Within-teacher random assignment was selected as the optimal design for this evaluation because it effectively controls for teacher effects and reduces sample size requirements, thereby allowing evaluators to detect reasonably small effects with fewer teachers than would be required under other random assignment options. In addition, the design required teachers who had at least one year of experience implementing service-learning, based on advice from experts in the service-learning field, who found that teachers need time to learn how to integrate service-learning practices into their instruction effectively. Therefore, for each eligible teacher who agreed to participate in the study, two eligible classrooms in the same subject area would be identified. Each pair of classrooms would be randomly assigned, one to treatment (service-learning) and the other to control (no service-learning). During the study year (the 2011-12 school year), participating teachers would then continue to implement service-learning in treatment classrooms, but would agree to refrain from using service-learning in the classrooms randomly assigned to the control condition.
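As an illustration of the within-teacher random assignment just described, the sketch below takes the pair of eligible class sections identified for each teacher and randomly assigns one to the service-learning (treatment) condition and the other to the control condition. It is written in Python with hypothetical teacher and section identifiers; the actual study would have followed its own documented randomization protocol.

    import random

    def assign_within_teacher(pairs, seed=2011):
        # `pairs` maps a teacher identifier to the two eligible class sections
        # identified for that teacher. A fixed seed makes the assignment
        # reproducible for documentation purposes.
        rng = random.Random(seed)
        assignments = {}
        for teacher, (section_a, section_b) in pairs.items():
            treatment = rng.choice([section_a, section_b])
            control = section_b if treatment == section_a else section_a
            assignments[teacher] = {"treatment": treatment, "control": control}
        return assignments

    # Hypothetical use:
    # assign_within_teacher({"T001": ("Period 2 Biology", "Period 5 Biology")})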
While teachers would have been screened prior to participation to ensure a history of teaching high - quality service -learning, t he study w ould not provide direction to t eachers on the implementation of service -learning during the study year ; each teacher would be allowed to continue with his/her usual approach to service -learning . This mirrors the reality of Learn and Serve America -funded service - learning activities, sinc e funded teachers do not follow a prescribed curriculum, but create service - learning activities that are adapted to their local contexts. Teachers w ould , however, be given guidance as to which activities should not be implemented in the control classrooms during the study year. Two types of documents were developed for the NELSAP study: data collection instruments and recruitment materials. While recruitment comes chronologically before data collection , we present data collection instruments first, anticip ating that they may be of most interest to service -learning researchers . Data collection instruments for NELSAP were intended to : 1) measure student -level impacts of participation in service -learning at course completion (post -program) and at the end of th e following school year (follow -up); 2) examine differences in student impacts according to students’ baseline characteristics; 3) describe service -learning and non -service -learning classrooms, and the differences between them; 4) examine differences in st udent impacts according to the quality of service -learning, teacher characteristics, and school characteristics. Data collection from students and teachers in the sample were designed to occur prior to a service -learning course (baseline) and immediately a fter the conclusion of the course (post -program) and, if warranted, one year later (follow -up). In the sections that follow, we present instruments by respondent, first teachers then students. Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-3 In order to ensure a sufficient pool of study respondents, the NELSAP team also created a recruiting strategy and related instruments. Recruitment for NELSAP involve d identifying and obtaining the cooperation of eligible teachers to participate in the evaluation . Key to the recruitment for this design would be securi ng the agreement of 185 high school teachers who met the eligibility criteria, and that of their high schools and districts, to take part in the evaluation. The study team planned to determine t eacher eligibility based on the teacher’s prior experience in service -learning, the quality of the teacher’s approach to service - learning, the academic content area in which service -learning wa s being used by the teacher, the grade level of students being taught, and the teacher’s willingness to participate in the st udy (See Text Box ). Secondary to this recruitment effort would be obtaining the assent of students (and consent of their parents) in those teachers’ classes. 
4.2 Instruments to Measure Service -Learning The NE LS AP study design called for measuring service -lea rning , both to determine teacher eligibility and to evaluate the implementation of the ―intervention .‖13 Additionally, the study was designed to explore the quality of the service -learning that was implemented and the extent to which the service - learning en compassed the five key components of service -learning: Investigation, Planning, Action, Reflection, Demonstration/Celebration (IPARD/C) (RMC Research, 2009) . Three types of instruments were developed to measure service -learning from the teachers’ perspecti ve :14 1. A survey of teachers’ prior experiences with service -learning: the Teacher Information Form (TIF) . 15 The TIF was develo ped specifically for this study to determine the eligibility of all potent ially eligible teachers and asks for information on ser vice -learning experience; previous and current service -learning classes (including the academic area, grade level, student ability level, length, and number of classrooms that the teacher taught in the past); 13 The validity of the NELSAP design rests on the assumption that experienced service -learning teachers can selectively change their instructional practices to eliminate service -learning in some classrooms in which they typically use service -learning, while continuing to implement service -learning in other classrooms. Thus, evidence of the degree to which teachers maintain the critical diffe rences between service -learning and control classes will allow the study to ass ess whether observed differences in the outcomes of treatment and control students can be reasonably attributed to service -learning. The study will attempt to ensure such treatment -control differences through high -quality site recruiting practices and clea r guidance for the study participants. Data collection will be used to monitor teaching practices in the treatment and control classrooms. 14 Note that question 14 in the student survey asks about service -learning implementation from the students’ perspec tive . 15 It was necessary to develop a new instrument to measure service -learning quality for this study because no such instrument that has proven psychome tric properties existed at the time . NELSAP Teacher Eligibility Criteria 1) had recently received school -based LSA funding 2) planned to implement service -learning in at least two classes of a core academic area (or areas) for 9 th or 10 th grade students 3) had at least one year of experience utilizing service -learning in a similar population 4) met a minimum level of service -learning quality practices Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-4 plans for service -learning classes in the 2011 -12 school year; and service -learning quality indicators. Additional items request teacher contact information and basic information on teaching background and qualifications. The TIF was originally intended to be administered on paper but was adapted to be completed online by teachers. 2. A short weekly survey of teachers’ implementation of service -learning during the study period: the Classroom Activities Log. The Log is a short online teacher survey developed specifically for NELSAP by the study team . Th e Log ’s items on implementation draw primarily from the five components of service -learning. 
Items on quality are based on the K - 12 Service -Learning Standards for Quality Practice (NYLC, 2008), which are generally accepted service -learning quality standard s in the field. 3. A post -program teacher interview about the quality of service -learning activities during the study period: the Teacher Interview. The Teacher Interview is a structured protocol that includes questions about the q uality of the service -lea rning corresponding to the eight quality standards in the K -12 Service -Learning Standards for Quality Practice (NYLC, 2008 ). The interview includes follow -up questions about individual indicators subsumed by each standard. Each question asks the teacher to describe the degree to which her service -learning class represents each of the indicators and to provide specific examples. The items on service -learning quality were developed for the NELSAP study based on four sources: (1) the National Youth Leadership Council’s (2008) K-12 Service -Learning Standards for Quality Practice ; (2) the five core components of a service -learning project (RMC Research Corporation, 2009); (3) consultation with members of the study’s Technical Work Group (TWG); and (4) other surve ys conducted by RMC as part of two evaluations on service -learning quality (Billig 2009; Northup, 2010). These standards are integrated into Learn and Serve America grant programs and delineated in the K -12 School -Based Formula Notices of Funding Opportuni ty (NOFO). The sample instruments are all researcher -developed measures for the study and, as such, do not have evidence on psychometric properties. Each of these measures underwent cognitive testing with service -learning teachers. References for Section F our Billig, S. H. (2009). Does quality really matter? Testing the new K -12 service -learning standards for quality practice. In B. E. Moely, S. H. Billig, & B. A. Holland (Eds.), Advances in service - learning research: Vol. 9. Creating our identities in serv ice -learning and community engagement (pp. 131 -157). Charlotte, NC: Information Age. National Youth Leadership Council. (2008). K-12 Service -Learning Standards for Quality Practice . Retrieved June 4, 2010 from http://www.nylc.org/objects/publications/StandardsDoc.pdf Northup, J. (2010). Evaluation of the Oregon Learn and Serve Program . Denver, CO: RMC Research Corporation. RMC Research Corporation. (2009). K-12 Service -Learning Proje ct Planning Toolkit . Scotts Valley, CA: National Service -Learning Clearinghouse. Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-5 4.2.1 Teacher Information Form and Instructions OMB No. 3045 -xxxx NATIONAL EVALUATION OF SCHOOL -BASED — DRAFT — Approval expires: xx/xx/20xx LEARN AND SERVE AMER ICA PROGRAMS Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-6 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serv e America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. Teacher Information Form Teacher name : School state: _________________ School district: School name: Phone number (where you are most easily reached): E-mail address (where you are most easily reached): I prefer to be contacted by:  phone  email This information will be detached from your survey responses. 
We want to assure you that all responses to this survey wil l be kept confidential to the maximum extent allowed by law. Any personally identifiable information will be removed from your responses, all of which will be encoded with a unique identification number to be used only by persons engaged in the research. W e will report information in the aggregate only; your school and district will not have access to the completed surveys at any time. If you have questions or comments about the survey, or would like assistance completing it, please contact the study team by emailing (EMAIL), or by calling (toll -free) XXX -XXX -XXXX. In order to streamline the process, you will also have the opportunity to provide informed consent when you complete the survey. Informed consent indicates that you are willing to participate i n the study if you are selected. You are allowed to withdraw your consent and cease participation at any time, even if you have previously provided consent. Further details on your rights are provided on the informed consent page. According to the Paperw ork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information is XXXX -XXXX. The average estimated burden time for the survey is XX minutes/hours. This estimate includes the time to review instructions, search existing data resources, gather the data needed , and complete and review the information collected. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please contact Corporation for National and Community Service, 1201 New York Avenue, NW, Washington, DC 20525. OMB No. 3045 -xxxx NATIONAL EVALUATION OF SCHOOL -BASED — DRAFT — Approval expires: xx/xx/20xx LEARN AND SERVE AMER ICA PROGRAMS Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 7 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associa tes. Teacher Information Form The purpose of this form is to gather information about teachers who are intere sted in participating in the National Evaluation of School -Based Learn and Serve America Programs. The form asks teachers to describe the approach to service -learning they have implemented in past courses or are implementing currently. Information from thi s form will help determine teacher eligibility for the study. Completion of these forms is voluntary. Thank you for your time. For the purposes of this study, service -learning is defined as students engaging in activities to meet a genuine community need while simultaneously learning and applying important knowledge and skills from the academic curriculum. All service -learning must involve the entire class. Students may work on activities in small groups or as a whole class, but for this study, no individu al projects will be allowed. Part I: My Service -Learning Experience and Future Plans 1. Prior to this school year, have you used service -learning in any core academic classes in grades 9 through 12 (core academic subjects include math, science, English/l anguage arts, and social studies/history)?  Yes  No a. How many core academic classes have you taught in grades 9 -12 using service -learning? 
Please count each course or section per school year separately (e.g., if you taught English using service -learn ing in 2008 -2009 and 2009 -2010, please count those as two classes.) # of classes :________________ b. In what grades were the students in the core academic classes in which you have taught using service -learning? (Consider all service -learning classes and check all that apply)  9th  10 th  11 th  12 th 2. This school year, are you using service -learning in any core academic classes in grades 9 -12? Please consider completed classes this school year as well as those in progress.  Yes  No a. How ma ny core academic classes are you teaching (or have you taught) using service - learning in grades 9 through 12? Please count each course or section separately ( e.g. , if you are teaching two sections of English using service -learning, please count those as tw o classes). # of classes:________________ OMB No. 3045 -xxxx NATIONAL EVALUATION OF SCHOOL -BASED — DRAFT — Approval expires: xx/xx/20xx LEARN AND SERVE AMER ICA PROGRAMS Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 8 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associa tes. b. What grades are the students in the core academic classes in which you are (or were) using service -learning? (Consider all classes and check all that apply)  9th  10 th  11 th  12 th c. Have you finis hed a service -learning project in any class this year? ( Across the entire course period and considering all of the service -learning activities planned for any class. These activities may include investigation, planning, action, reflection, demonstration/ce lebration)  Yes  No AFTER ANSWERING QUESTIONS 3 AND 3.A , CONTINUE TO FORM A O F SECTION II - AFTER ANSWERING QUESTIONS 3 AND 3.A , CONTINUE TO FORM B O F SECTION II 3. In the upcoming school year (2011 -12) , do you expect to implement service -learning in a core academic class for students in the 9 th or 10 th grade? (choose one resp onse )  Yes - CONTINUE  Not sure - CONTINUE  Definitely not - Thank you for your interest. However, the study will include only those teachers who will be teaching with service -learning in the 2011 -12 school year. There is no need for you to provide m ore information or to complete the study consent. Thank you, again!. OMB No. 3045 -xxxx NATIONAL EVALUATION OF SCHOOL -BASED — DRAFT — Approval expires: xx/xx/20xx LEARN AND SERVE AMER ICA PROGRAMS Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 9 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associa tes. Classes In Which I Expect to Implement Service -Learning Next Year a. Please provide the following information about classes in the 2011 -12 school year in which you expect to implement service -learning . Only include classes in core academic subjects. Subject Grade Level(s) (check all that apply) Are there any special designations for this class (check all that apply)?

Special education, English language learners; honors, college prepa ratory, other – specify______ _________; No special designation Number of Classrooms/ Sections 9th 10th OMB No. 3045 -xxxx NATIONAL EVALUATION OF SCHOOL -BASED — DRAFT — Approval expires: xx/xx/20xx LEARN AND SERVE AMER ICA PROGRAMS Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 10 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. FORM A (COMPLETED CLASS) Part II: My Most Recent Service -Learning Class The next set of questions (4-13 ) relates to one of the core academic classes in which you implemented service -learning. Please choose one core academic class in which you implemented service -learning with 9th-12 th grade students. - Please pick a class in which you have recently completed teaching using service -learning. For the purposes of this study, service -learning is defined as students engaging in activities to meet a genuine community nee d while simultaneously learning and applying important knowledge and skills from the academic curriculum. All service -learning must involve the entire class. Students may work on activities in small groups or as a whole class, but for this study, no indivi dual projects will be allowed. 4. What subject(s) were you teaching in this class? (choose one)  English/Language Arts  Math  Science  Social Studies/History  Other (specify) _______________ a. During which school -year did you teach this [CLASS] class? School year: 20________ - 20_______ b. In what grades were the students in this [CLASS] class? (Check all that apply)  9th  10 th  11 th  12 th PLEASE ANSWER QUESTI ONS 5 -13 BELOW ABOUT THE CLASS LIST ED IN QUESTION 4. Please complete questions 5-13 about your activities during the full semester or year of the course. 5. Across the entire course period (semester or school year), a. How many weeks did this [CLASS] meet during the school year? # of weeks________________________(a) b. How many hours per week did this [CLASS] class meet? # hours per week__________________(b) Across the entire course period and considering all of the service -learning activities included in this class (investigation, planning, action, reflection and demonstration /celebration): c. Of the [a] weeks that the class met, during how many weeks did a ny service -learning activities occur in this [CLASS] class)? # of weeks of service -learning________(c) OMB No. 3045 -xxxx NATIONAL EVALUATION OF SCHOOL -BASED — DRAFT — Approval expires: xx/xx/20xx LEARN AND SERVE AMER ICA PROGRAMS Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 11 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. d. During the [c] weeks of service -learning, during how many hours per week did any service -learning activities occur in this CLASS] class? # of hours o f service -learning per week _________________________________(d) e. During the entire course period, how many total hours of service were performed as part of service -learning in this [CLASS] class? (If students did service at different times, add all the times together.) 
# of hours of service ________________ (e)

6. How closely were the service-learning activities in this [CLASS] class aligned with academic content standards for the subject area (e.g., district, state, or national standards)? (choose one response)
1 = Not aligned   2   3 = Moderately aligned   4   5 = Very aligned
              

7. Did students in this [CLASS] class conduct an assessment of community needs before selecting the service project?
 Yes    No

8. In this [CLASS] class, how involved were students in:
a. the selection of their service project(s)? (choose one response)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved
              
b. generating ideas and making decisions related to planning, throughout the service-learning process? (choose one response)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved
              
c. generating ideas and making decisions related to action or service, throughout the service-learning process? (choose one response)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved
              
d. generating ideas and making decisions related to evaluation, throughout the service-learning process? (choose one response)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved
              

9. In this [CLASS] class, did students collaborate with a community partner or partners as part of the service-learning?
 Yes    No
a. If yes, in which way(s) did students collaborate with the community partner(s)? (Check all that apply)
 Investigation (e.g., sharing knowledge of school or community assets or needs, or collaborating to investigate or research community needs)
 Planning (e.g., collaborating to establish a shared vision or set common goals to address community needs)
 Action (e.g., collaborating in service)
 Reflection (e.g., collaborating to think deeply about the community issue and alternative solutions)
 Demonstration of results or Celebration (e.g., collaborating to share what has been learned or to celebrate the results)
 Other ____________________________________________________

10. Did the students engage in any reflection related to their service-learning in this [CLASS] class?
 Yes    No
a. If yes, when did the reflection take place? (check all that apply)
 before service, as part of investigation or planning
 during service
 after service, as part of demonstration or celebration
b. What type(s) of activities did the students do as any part of the reflection? (check all that apply)
 written products
 oral presentations or discussions
 other (e.g., dance, drama)
c. To what extent did the reflection activities include discussion of the larger social or civic issues related to students' service-learning experience?
1 = Not at all   2   3 = Moderate amount of discussion   4   5 = Great amount of discussion
              

11. Which of the following topics/activities were addressed during service-learning in this [CLASS] class? (check all that apply)
 understanding multiple perspectives
 looking at service from the perspective of those served
 recognizing or overcoming stereotypes
 how to resolve conflict(s) or group decision-making
 none of the above

12. In this class, did students engage in any of the following? (choose one response for each item)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved   Not part of this class
a. Collecting evidence toward meeting specific service goals or learning outcomes?       
b. Collecting evidence of the quality of service-learning?       
c. Using evidence to improve service-learning experiences?       
d. Communicating evidence of progress towards goals and outcomes with the larger community?       

13. Which kinds of events did students engage in to demonstrate the impact of their service to others?

(check all that apply)
 classroom event
 school event
 community event
 reports/articles in the media
 other (specify) ______________________________
 not part of this class

Continue to Question 14

FORM B (Service-learning in progress for first time)
Part II: My Most Recent Service-Learning Class

The next set of questions (4-13) relates to one of the core academic classes in which you are implementing service-learning. Please choose one core academic class in which you are implementing service-learning with 9th-12th grade students.
- Please pick a class in which you have completed the greatest proportion of planned service-learning activities. Please provide estimates for the full course term (either the semester or year in which the course is held), including completed activities and planned activities that are not yet completed.

For the purposes of this study, service-learning is defined as students engaging in activities to meet a genuine community need while simultaneously learning and applying important knowledge and skills from the academic curriculum. All service-learning must involve the entire class. Students may work on activities in small groups or as a whole class, but for this study, no individual projects will be allowed.

4. What subject(s) are you teaching in this class? (choose one)
 English/Language Arts
 Math
 Science
 Social Studies/History
 Other (specify) _______________

a. During which school year did you teach this [CLASS] class?
School year: 20________ - 20_______

b. In what grades are the students in this [CLASS] class? (Check all that apply)
 9th    10th    11th    12th

PLEASE ANSWER QUESTIONS 5-13 BELOW ABOUT THE CLASS LISTED IN QUESTION 4.
Please complete questions 5-13 with your best estimate of planned activities for the full semester or year of the course.

5. Across the entire course period (semester or school year),
a. How many total weeks does this [CLASS] class meet during the entire school year? # of weeks ________________________ (a)
b. How many hours per week does this [CLASS] class meet? # hours per week __________________ (b)

Across the entire course period and considering all of the service-learning activities included in this class (investigation, planning, action, reflection, and demonstration/celebration):
c. Of the [a] weeks that the class meets during this school year (the total weeks listed in a above), during how many weeks are service-learning activities occurring in this [CLASS] class? # of weeks of service-learning ______________________ (c)
d. During the [c] weeks of service-learning, during how many hours per week are service-learning activities occurring in this [CLASS] class? # of hours of service-learning per week _____________________________ (d)
e. During the entire course period, how many total hours of service are performed as part of service-learning in this [CLASS] class? (If students perform service at different times, add all the times together.) # of hours of service _____________ (e)

6. How closely are the service-learning activities in this [CLASS] class aligned with academic content standards for the subject area (e.g., district, state, or national standards)? (choose one response)
1 = Not aligned   2   3 = Moderately aligned   4   5 = Very aligned
              

7. At any time during the entire course period, are students in this [CLASS] class conducting an assessment of community needs before selecting the service project?
 Yes    No

8. At any time during the entire course period for this [CLASS] class, how involved are students in:
a. the selection of their service project(s)? (choose one response)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved
              
b. generating ideas and making decisions related to planning, throughout the service-learning process? (choose one response)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved
              
c. generating ideas and making decisions related to action or service, throughout the service-learning process? (choose one response)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved
              
d. generating ideas and making decisions related to evaluation, throughout the service-learning process? (choose one response)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved
              

9. At any time during the entire course period for this [CLASS] class, are students collaborating with a community partner or partners as part of the service-learning?
 Yes - continue    No - Go to Q10
a. If yes, in which way(s) are students collaborating with the community partner(s)? (Check all that apply)
 Investigation (e.g., sharing knowledge of school or community assets or needs, or collaborating to investigate or research community needs)
 Planning (e.g., collaborating to establish a shared vision or set common goals to address community needs)
 Action (e.g., collaborating in service)
 Reflection (e.g., collaborating to think deeply about the community issue and alternative solutions)
 Demonstration of results or Celebration (e.g., collaborating to share what has been learned or to celebrate the results)
 Other ____________________________________________________

10. At any time during the entire course period, are the students engaging in any reflection related to their service-learning in this [CLASS] class?
 Yes - continue    No - Go to Q11
a. If yes, when is the reflection taking place? (check all that apply)
 before service, as part of investigation or planning
 during service
 after service, as part of demonstration or celebration
b. What type(s) of activities are the students doing as any part of the reflection? (check all that apply)
 written products
 oral presentations or discussions
 other (e.g., dance, drama)
c. To what extent are the reflection activities including discussion of the larger social or civic issues related to students' service-learning experience?
1 = Not at all   2   3 = Moderate amount of discussion   4   5 = Great amount of discussion
              

11. At any time during the entire course period, which of the following topics/activities are addressed during service-learning in this [CLASS] class? (check all that apply)
 understanding multiple perspectives
 looking at service from the perspective of those served
 recognizing or overcoming stereotypes
 how to resolve conflict(s) or group decision-making
 none of the above

12. At any time during the entire course period of this class, are students engaging in any of the following? (choose one response for each item)
1 = Not involved   2   3 = Moderately involved   4   5 = Very involved   Not part of this class
a. Collecting evidence toward meeting specific service goals or learning outcomes?       
b. Collecting evidence of the quality of service-learning?       
c. Using evidence to improve service-learning experiences?       
d. Communicating evidence of progress towards goals and outcomes with the larger community?       

13. At any time during the entire course period, which kinds of events are students engaging in to demonstrate the impact of their service to others? (check all that apply)
 classroom event
 school event
 community event
 reports/articles in the media
 other (specify) ______________________________
 not part of this class

Continue to Question 14

Part III: My Background and Teaching Experience

14. Including this school year, how many school years of teaching experience do you have at any school? # of school years: ________________

15. Please list the subject areas that you have taught at any school and indicate whether you are certified in that area:
Subject / Certified?
a. ________________    Yes    No
b. ________________    Yes    No
c. ________________    Yes    No
d. ________________    Yes    No
e. ________________    Yes    No
f. ________________    Yes    No
g. ________________    Yes    No
h. ________________    Yes    No

16. Your education:
a. What is the highest degree you have achieved? (choose one response)
 Bachelor's degree
 Master's degree
 Professional school degree (for example: MPH, MSW)
 Doctorate (for example: PhD, EdD)
 Other (specify): __________________________________
b. What was your area of study for that degree?
Area of study: _____________________________________

17. During the last five years (since September 2006), did you participate in any professional development activities in service-learning? (check all that apply)
 Yes – a short training or workshop about service-learning (e.g., after school, less than 8 hours)
 Yes – a conference about service-learning (e.g., off-site, one day or more)
 Yes – a professional learning group or inquiry group meeting
 No  End

Please count each workshop, conference, or group meeting only once.
How many different: / Total hours
a. A short training or workshop about service-learning (e.g., after school, less than 8 hours)    Workshops? #_______    ____ hours
b. A conference about service-learning (e.g., off-site, one day or more)    Conferences? #_______    ____ hours
c. A professional learning group or inquiry group meeting    Groups? #_______    ____ hours

END OF SURVEY

Thank you for completing the Teacher Information Form! We will review your information and notify you by the end of the Spring 2011 semester about your eligibility for the study. The study will select randomly from among all eligible teachers. If you are one of the teachers chosen to participate, you will be contacted in Spring/Summer 2011 to confirm your interest in and availability for the study, and to collect information about the courses you will be teaching in the 2011-12 school year. In order to streamline the information collection process, you may now choose to complete consent to participate in the study. This consent is contingent on your being selected for the study. Participation in the study is voluntary and you may withdraw your consent at any time, even if you have previously given consent.

Teacher Information Form - Instructions

All responses to this survey will be kept confidential to the maximum extent allowed by law. Any personally identifiable information will be removed from your responses. All responses will be encoded with a unique identification number to be used only by persons engaged in the research. We will report information in the aggregate only; your school and district will not have access to the completed surveys at any time.
If you have questions or comments about the survey, or would like assistance completing it, please contact the study team by emailing (EMAIL) or by calling (toll-free) XXX-XXX-XXXX.

In order to streamline the process, we are also giving teachers the opportunity to provide informed consent when they complete the TIF. Informed consent indicates that you are willing to participate in the study if you are selected. You are allowed to withdraw your consent and cease participation at any time, even if you have previously provided consent.

Contact information
Contact information will be used to follow up with you about study eligibility and participation. All contact information will be detached from your survey responses. Your contact information will be kept confidential to the maximum extent allowable by law.
Name: Your name. If you filled out the sign-in sheet at the recruitment meeting, please use the same version of your name (or nickname) that you provided at the meeting.
School state: Use the drop-down menu to select your school's state.
District: After choosing your state, use the drop-down menu to select your school's district.
School name: Provide the full name of your school.
Phone number: Provide the phone number (XXX-XXX-XXXX) at which you are most easily reached.
Email address: Provide the email address at which you are most easily reached.
Preferred contact: Check how you prefer to be contacted by study staff.

Definitions
The following terms are used throughout the survey.
Service-learning: For the purposes of this study, service-learning is defined as students engaging in activities to meet a genuine community need while simultaneously learning and applying important knowledge and skills from the academic curriculum. All service-learning must involve the entire class. Students may work on activities in small groups or as a whole class, but for this study, no individual projects will be allowed.
Service-learning activities: Service-learning activities include investigation, planning, action, reflection, demonstration, and celebration.
Authentic (community need): Need is relevant and important to the community.
Class: One classroom or one section of a subject for the full semester or school year in which students are enrolled in that course.
Core academic subjects: Math, science, English/language arts, and social studies/history.
Investigation: Process of identifying community needs of interest and beginning research to assess the needs by designing a survey, conducting interviews, using varied media including books and the Internet, and drawing from personal experience and observation. Students may document the extent and nature of the problem and establish a baseline for monitoring progress. Community partners may be identified.
Planning: Selecting the service activity and developing an action plan for the service activity. Outlining varied ways to meet the community need or contribute to improving the situation.
Planning may include: clarifying roles and responsibilities, developing a common vision for success, deciding what will occur and who will do each part of the work, creating a timeline, listing materials and costs, and overseeing any logistics and approvals that must be obtained.
Action: Implementation of the plan to address an authentic community need. Can include direct service, indirect service, or research or advocacy with the community in which the need exists.
Direct service: Students respond to a community need by interacting with and impacting the service recipient or site.
Indirect service: Students build infrastructure or capacity to respond to the community need; for example, students pack food boxes at the local food bank.
Research and advocacy: Students find, gather, and report on information to raise awareness of a problem and/or advocate for change in the condition underlying the community need; for example, students meet with elected officials to urge support for additional food subsidies for low-income families.
Reflection: Students consider how the experience, knowledge, and skills they hope to acquire relate to their own lives, their community, and/or their academics. Students engage in varied activities to think about the needs, their actions, and their potential or actual impact. This process includes both analytical and affective responses.
Demonstration: Students provide evidence to others of their influence and accomplishments. They showcase what and how they have learned and their acquired skills and knowledge. In this context of demonstration, along with their partners, students may also plan and carry out a celebration of what they have gained and contributed, including both the learning and the service.

Part I: My Service-Learning Experience and Future Plans

1 Past experience
Select YES if you implemented service-learning in a core academic subject prior to this school year for students in grades 9 through 12. CONTINUE TO QUESTION 1a
A "class" refers to one classroom or one section of a subject for the full semester or school year in which students are enrolled in that subject.
Core academic subjects are math, science, English/language arts, and social studies/history.
Service-learning may have occurred at any time prior to this (2010-2011) school year.
Service-learning must have occurred in classes with students in the 9th, 10th, 11th, or 12th grade.
Service-learning may have occurred at any school, and is not limited to your current school.
Select NO if you did not implement service-learning in a core academic subject prior to this school year for students in grades 9 through 12. SKIP TO QUESTION 2
A "class" refers to one classroom or one section of a subject for the full semester or school year in which students are enrolled in that subject.
Core academic subjects are math, science, English/language arts, and social studies/history.
Service-learning may have occurred at any time prior to this (2010-2011) school year.
Service-learning must have occurred in classes with students in the 9th, 10th, 11th, or 12th grade.
Service-learning may have occurred at any school, and is not limited to your current school.

1a # of classes
Enter, as an integer, the number of different core academic classes in which you implemented service-learning prior to this school year for 9th-12th grade students. CONTINUE TO QUESTION 1b
Each classroom or section in each school year should count as a separate course. Please include both semester-long and year-long courses.
Core academic subjects are math, science, English/language arts, and social studies/history.
Classes may have occurred at any time prior to this (2010-2011) school year.
Service-learning must have occurred in classes with students in the 9th, 10th, 11th, or 12th grade.
Classes may have occurred at any school, and are not limited to your current school.

1b Grade levels
Check the grade level(s) of any students in any of those courses. Check all that apply. CONTINUE TO QUESTION 2
Please include both semester-long and year-long courses.
Core academic subjects are math, science, English/language arts, and social studies/history.
Classes may have occurred at any time prior to this (2010-2011) school year.
Service-learning must have occurred in classes with students in the 9th, 10th, 11th, or 12th grade.
Classes may have occurred at any school, and are not limited to your current school.

2 Current experience
Select YES if you are currently implementing, or have implemented, service-learning in a core academic subject this school year for students in grades 9 through 12. CONTINUE TO QUESTION 2a
Core academic subjects are math, science, English/language arts, and social studies/history.
Service-learning may occur at any time this (2010-2011) school year, in either the Fall 2010 semester, the Spring 2011 semester, or both.
Service-learning must occur in classes with students in the 9th, 10th, 11th, or 12th grade.
Select NO if you are not intending to implement or did not implement service-learning in a core academic subject this school year for students in grades 9 through 12. SKIP TO QUESTION 3
Core academic subjects are math, science, English/language arts, and social studies/history.
Service-learning may occur at any time this (2010-2011) school year, in either the Fall 2010 semester, the Spring 2011 semester, or both.
Service-learning must occur in classes with students in the 9th, 10th, 11th, or 12th grade.

2a # of classes
Enter, as an integer, the number of different core academic classes in which you are implementing, or have implemented, service-learning this school year for 9th-12th grade students. CONTINUE TO QUESTION 2b
Each classroom or section this school year should count as a separate class.
Core academic subjects are math, science, English/language arts, and social studies/history.
Classes may occur at any time this (2010-2011) school year, either the Fall 2010 semester, the Spring 2011 semester, or both.
Classes must include students in the 9th, 10th, 11th, or 12th grade.
Please include both semester-long and year-long classes.

2b Grade levels
Check the grade level(s) of any students in any of the core academic classes in which you are implementing, or have implemented, service-learning this school year. Check all that apply. CONTINUE TO QUESTION 2c if this is your first year of service-learning. Otherwise, CONTINUE TO QUESTION 3 and use FORM A OF SECTION II.
Core academic subjects are math, science, English/language arts, and social studies/history.
Classes may occur at any time this (2010-2011) school year, either the Fall 2010 semester, the Spring 2011 semester, or both.
Classes must include students in the 9th, 10th, 11th, or 12th grade.
Please include both semester-long and year-long classes.

2c Completed service-learning project
If this is your first year implementing service-learning, you will be asked whether you have finished the service-learning in any class.
Select YES if you have completed implementing service-learning in any core academic class in grades 9 through 12. CONTINUE TO QUESTION 3. USE FORM A OF SECTION II.
Core academic subjects are math, science, English/language arts, and social studies/history.
Service-learning must occur in classes with students in the 9th, 10th, 11th, or 12th grade.
Select NO if you have never completed implementing service-learning in any core academic class in grades 9 through 12. CONTINUE TO QUESTION 3. USE FORM B OF SECTION II.
Core academic subjects are math, science, English/language arts, and social studies/history.
Service-learning must occur in classes with students in the 9th, 10th, 11th, or 12th grade.

3 Plans for 2011-12 school year
Select YES if you think you will be implementing service-learning in any core academic class for students in the 9th or 10th grade in school year 2011-12. CONTINUE TO 3a
Planned or expected classes refer to those in which you are planning to implement, or believe you have a good chance of implementing, service-learning.
Core academic subjects are math, science, English/language arts, and social studies/history.
Classes may be planned for any time in the 2011-12 school year (Fall 2011, Spring 2012, or both).
Classes may be semester-long or year-long.
Classes must include students in the 9th or 10th grade.
Select NOT SURE if you are not sure whether you will be implementing service-learning in any core academic class for students in the 9th or 10th grade in school year 2011-12. CONTINUE TO 3a
Planned or expected classes refer to those in which you are planning to implement, or believe you have a good chance of implementing, service-learning.
Core academic subjects are math, science, English/language arts, and social studies/history.
Classes may be planned for any time in the 2011 -12 school year (Fall 2011, Spring 2012 or both), Classes may be semester -long or year -long. Classes must include students in the 9 th or 10 th grade. Select DEFINITELY NOT if you do not think you will be implementing service -learning in any core academic class for students in the 9 th or 10 th grade in sch ool year 2011 -12. END OF SURVEY, THANK YOU FOR YOUR INTEREST Planned or expected classes refer to those in which you are planning to implement, or believe you have a good chance of implementing service - learning. Core academic subjects are math, science, E nglish/language arts, and social studies/history. Classes may be planned for any time in the 2011 -12 school year (Fall 2011, Spring 2012 or both), Classes may be semester -long or year -long. Classes must include students in either the 9 th or 10 th grade. 3a Future plans Provide the following information about the number of core academic courses with 9 th or 10 th grade students in which you may implement, or believe you have a good chance of implementing, service -learning in the 2011 -12 school year: Each line corresponds to a course, as denoted by a common curriculum and lesson plan. Choose the course subject : English/Language arts, Math, Science, or Social studies/History. If the course combines subjects, select ―Other‖ and specify the subjects. Check the gra de level(s) of any students in that course. List the type of course: remedial, special education, pre -honors, honors, AP, NAT IONAL EVALUATION OF SCHOOL -BASED — DRAFT — LEARN AND SERVE AMER ICA PROGRAMS Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-27 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Servic e. Prime contractor: Abt Associates. IB. If ―other‖ please specify. Indicate the number of classrooms or sections of that class you expect to teach using service -learning. Classes may be planned for any time in the 2011 -12 school year (Fall 2011, Spring 2012 or both), Classes may be semester -long or year -long. Part II. My Most Recent Service -Learning Class. This section asks questions about how you implemented s ervice -learning in a particular class. You are asked to pick one of your core -academic classes in which you have implemented service -learning. The first question in this section asks about the identity of that class, the remaining questions are concerned with the service -learning in that class and your approach to service -learning, Please provide estimates for the entire course period i.e. If you choose a year -long course, please provide estimates for the full school year. If you choose a semester -long cou rse, please provide estimates for the full semester. If you are in FORM A, provide estimates of completed activities. If you are in FORM B, provide estimates for the full course period, even if all activities have not been completed yet. 4 Subject of mos t recent service - learning class Choose the subject(s) of your one chosen core academic class with 9 th-12 th graders and in which you implemented service -learning. THIS IS YOUR ―SELECTED CLASS‖. CONTINUE TO QUESTION 4a If possible, pick a service -learning class which you recently completed . If you have never completed a class with service -learning, pick the class in which you are currently implementing service -learning. 
Part II. My Most Recent Service-Learning Class
This section asks questions about how you implemented service-learning in a particular class. You are asked to pick one of your core academic classes in which you have implemented service-learning. The first question in this section asks about the identity of that class; the remaining questions are concerned with the service-learning in that class and your approach to service-learning.
Please provide estimates for the entire course period. If you choose a year-long course, please provide estimates for the full school year. If you choose a semester-long course, please provide estimates for the full semester.
If you are in FORM A, provide estimates of completed activities. If you are in FORM B, provide estimates for the full course period, even if all activities have not been completed yet.

4 Subject of most recent service-learning class
Choose the subject(s) of your one chosen core academic class with 9th-12th graders in which you implemented service-learning. THIS IS YOUR "SELECTED CLASS." CONTINUE TO QUESTION 4a
If possible, pick a service-learning class which you recently completed. If you have never completed a class with service-learning, pick the class in which you are currently implementing service-learning.
Please provide estimates for the entire course period.
The class must include students in the 9th, 10th, 11th, or 12th grade(s).

4a School year of most recent service-learning class
Indicate the school year in which you taught the selected class, in 20XX – 20XX format. CONTINUE TO QUESTION 4b.

4b Grades in most recent service-learning class
Indicate the grades of any students in the selected class. Check all that apply. CONTINUE TO QUESTION 5a.

5a Weeks per class
Report, as an integer, the number of weeks for which the selected class was scheduled to meet. IF PAPER TIF, CONTINUE TO QUESTION 5c. IF ONLINE TIF, SKIP TO QUESTION 5d.
- For a semester-long class, report the number of weeks in the semester.
- For a year-long class, report the number of weeks in the school year.

5b Hours per week
Report the hours of instructional time per week for which the selected class was scheduled to meet. CONTINUE TO QUESTION 5c.
For example, if the class met for 1 hour on each of Monday, Wednesday, and Friday, the total number of hours per week was 1 hour/day x 3 days/week = 3 hours per week.

5c S-L weeks per class
Report, as an integer, the number of weeks any service-learning occurred in the selected class. CONTINUE TO QUESTION 5e.
This number should be no greater than the total number of weeks in which the class was held (a).

5d S-L hours per week of class
Report, as an integer, the average number of hours in a given week that any service-learning occurred in the selected class, during the weeks of service-learning. CONTINUE TO QUESTION 5e.
For the average, please include only those weeks in which service-learning occurred. Do not include those weeks in which there was no service-learning.
Service-learning activities include investigation, planning, action/service, reflection, demonstration, and celebration.
This number should be no greater than the total number of hours per week that the class was held (b).

5e Service hours per class
Report, as an integer, the number of hours devoted to service activities as part of service-learning during the entire course period for the selected class. CONTINUE TO QUESTION 6.
If students did service at different times, add all hours together. This number should be no greater than the total number of hours. (The range checks for items 5c-5e are summarized in the sketch following item 6.)

6 Alignment with standards
On a scale of 1 to 5, indicate how aligned the service-learning activities in the selected class were with the academic standards for the subject area. CONTINUE TO QUESTION 7.
Standards may be at the district, state, or national level.
1 is not aligned or least aligned. 5 is perfectly aligned or very aligned.
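Items 5c-5e state simple range relationships among the item 5 responses. The fragment below is a minimal, hypothetical sketch of those checks, assuming the responses are held in plain numeric variables; the names (total_weeks, class_hours_per_week, sl_weeks, sl_hours_per_week) are illustrative and are not TIF field names.

# Illustrative consistency checks for the item 5 responses described above.
def check_item_5(total_weeks: int, class_hours_per_week: float,
                 sl_weeks: int, sl_hours_per_week: float) -> list:
    """Return a list of inconsistencies among the item 5 responses."""
    problems = []
    if sl_weeks > total_weeks:  # item 5c must not exceed item 5a
        problems.append("Weeks of service-learning (c) exceed the weeks the class met (a).")
    if sl_hours_per_week > class_hours_per_week:  # item 5d must not exceed item 5b
        problems.append("Service-learning hours per week (d) exceed class hours per week (b).")
    return problems

# Example using the arithmetic from item 5b: a class meeting 1 hour on Monday,
# Wednesday, and Friday meets 1 hour/day x 3 days/week = 3 hours per week.
assert check_item_5(total_weeks=36, class_hours_per_week=3, sl_weeks=10, sl_hours_per_week=2) == []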
7 Investigation
Select Yes if selection of the service-learning project was based (in part) on a student-conducted assessment of community needs in the selected class. CONTINUE TO QUESTION 8.
Select No if selection of the service-learning project was not based on students' community needs assessment in the selected class. CONTINUE TO QUESTION 8.

8a Meaningful service – students
On a scale of 1 to 5, indicate the extent to which students were involved in the selection of their service-learning project(s) in the selected class. CONTINUE TO QUESTION 8b.
1 is not involved. 5 is very involved.

8b Student voice – selection
On a scale of 1 to 5, indicate the extent to which students were involved in generating ideas and making decisions related to selecting their service-learning project(s) in the selected class. CONTINUE TO QUESTION 8c.
1 is not involved. 5 is very involved.

8c Student voice – planning
On a scale of 1 to 5, indicate the extent to which students were involved in generating ideas and making decisions related to planning throughout the service-learning process in the selected class. CONTINUE TO QUESTION 8d.
1 is not involved. 5 is very involved.

8d Student voice – evaluation
On a scale of 1 to 5, indicate the extent to which students were involved in generating ideas and making decisions related to evaluation throughout the service-learning project(s) in the selected class. CONTINUE TO QUESTION 9.
1 is not involved. 5 is very involved.

9 Community partner(s)
Select Yes if students collaborated with at least one community partner as part of service-learning in the selected class. CONTINUE TO QUESTION 9a.
Select No if students did not collaborate with any community partners as part of service-learning in the selected class. SKIP TO QUESTION 10.

9a Community partner(s) – participation
Check any of the ways in which community partners participated in service-learning in the selected class. Check all that apply. CONTINUE TO QUESTION 10.

10 Reflection
Select Yes if students engaged in any reflection activities as part of service-learning in the selected class. CONTINUE TO QUESTION 10a.
Select No if students did not engage in any reflection activities as part of service-learning in the selected class. SKIP TO QUESTION 11.

10a Reflection – timing
Check any of the times in which students engaged in reflection as part of service-learning in the selected class. Check all that apply. CONTINUE TO QUESTION 10b.

10b Reflection – type
Check any of the ways in which students engaged in reflection as part of service-learning in the selected class. Check all that apply. CONTINUE TO QUESTION 10c.

10c Reflection – depth
On a scale of 1 to 5, indicate the extent to which reflection activities in the selected class included discussion of the larger social or civic issues related to students' service-learning activities. CONTINUE TO QUESTION 11.
1 is no discussion. 5 is great amount of discussion.
11 Diversity
Check any of the ways in which diversity was addressed during service-learning in the selected class. Check all that apply. CONTINUE TO QUESTION 12a.

12a Progress monitoring – investigation
On a scale of 1 to 5, indicate the extent to which students in the selected class were involved in collecting evidence toward meeting specific service goals or learning outcomes. CONTINUE TO QUESTION 12b.
1 is not involved. 5 is very involved.

12b Progress monitoring – reflection
On a scale of 1 to 5, indicate the extent to which students in the selected class were involved in collecting evidence on the quality of service-learning. CONTINUE TO QUESTION 12c.
1 is not involved. 5 is very involved.

12c Progress monitoring – planning
On a scale of 1 to 5, indicate the extent to which students in the selected class were involved in using evidence to improve the service-learning experience. CONTINUE TO QUESTION 12d.
1 is not involved. 5 is very involved.

12d Progress monitoring – demonstration
On a scale of 1 to 5, indicate the extent to which students in the selected class were involved in communicating evidence of progress towards goals or outcomes with the larger community. CONTINUE TO QUESTION 13.
1 is not involved. 5 is very involved.

13 Demonstration
Check any of the forums in which students in the selected class engaged to demonstrate the impact of their service to others. Check all that apply. CONTINUE TO QUESTION 14.

Part III. My Background and Teaching Experience

14 Years of teaching experience
Report, as an integer, your number of school years of teaching experience. Include this school year as a full year of experience. CONTINUE TO QUESTION 15.

15 Subjects taught
Choose the broad subject areas that you have taught at any point in your teaching career, at any school. CONTINUE TO QUESTION 16a.
Begin the list with core academic subject areas (math, science, English/language arts, social studies/history). End with non-core academic subject areas, selecting OTHER and listing the subject area.
For each subject, select Yes if you are certified in that subject area.
For each subject, select No if you are not certified in that subject area.

16a Education – degree
Indicate your highest degree. Check one box only. CONTINUE TO QUESTION 16b.

16b Education – area of study
Indicate the area of study for your highest degree from 16a. CONTINUE TO QUESTION 17.

17 Professional development in service-learning
Select Yes if you have participated in any professional development activities related to service-learning since September 2006. CONTINUE TO QUESTION 17a.
Select No if you have not participated in any professional development activities related to service-learning since September 2006. SKIP TO END. Your survey is complete.

17a Short training or workshop
Indicate the number of different training sessions or workshops about service-learning that you have participated in that were on-site at your school and lasted less than 8 hours.
Indicate the number of hours since September 2006 you have spent in any training sessions or workshops about service-learning that were on-site at your school and lasted less than 8 hours. CONTINUE TO QUESTION 17b.

17b Conference
Indicate the number of different conferences about service-learning that you have participated in that were either off-site or lasted at least 8 hours.
Indicate the number of hours since September 2006 you have spent in any conferences about service-learning that were either off-site or lasted at least 8 hours. CONTINUE TO QUESTION 17c.

17c Group meetings
Indicate the number of different groups (not short training sessions, workshops, or conferences) about service-learning that you have participated in.
Indicate the number of hours since September 2006 you have spent in any groups (not short training sessions, workshops, or conferences) about service-learning. END OF SURVEY.

END OF SURVEY

4.2.2 Teacher Log and Instructions

CLASSROOM ACTIVITY LOG

I. For the week ending on DAY/DATE, did any students in the class participate in these activities?
1. Investigation of an authentic community need? No    Yes  Part A
2. Planning or preparation for a service activity*? No    Yes  Part B
3. Participation in a service activity*? No    Yes  Part C
4. Demonstration of the impacts of a service activity*? No    Yes  Part D
5. Celebration of a service activity*? No    Yes  Part E

* For the purposes of this study, service-learning is defined as students engaging in activities to meet a genuine community need while simultaneously learning and applying important knowledge and skills from the academic curriculum. All service-learning must involve the entire class. Students may work on activities in small groups or as a whole class, but for this study, no individual projects will be allowed.

II. For the week ending on DAY/DATE, did you use ANY of the following approaches to teach students content area knowledge, not including any service-learning activities indicated in Part I? No    Yes  Part F
6a. Collaborative learning: Groups of students have joint responsibility for understanding course content, developing solutions to problems, or demonstrating what they have learned (e.g., collaborative writing, group projects, joint problem-solving, debates, study teams).
6b. Cooperative learning: Groups of students have specific and distinct responsibilities for understanding course content, developing solutions to problems, or demonstrating what they have learned (e.g., each member of the group is responsible for one element of a group project or assignment).
6c. Project-based learning: Students learn essential knowledge and life-enhancing skills through an extended, student-influenced inquiry process structured around complex, authentic questions and carefully designed products and tasks.
6d. Problem-based learning: Content learning involving active problem-solving about an issue or situation that simulates the kinds of problems students are likely to face in the real world.
6e. Inquiry-based learning: Inquiry-based learning is based around student questions. Students work independently to solve problems rather than receiving direct instructions from the teacher on what to do.

Part A. Investigation of a community need
1a. In the last week, approximately how much class time was spent on investigating a community need? # minutes
1b. Who helped identify the community need? (select all that apply) Teacher   Students   Community Partners
1c. Who had primary responsibility for identifying the community need? (select one) Teacher   Students   Community Partners
1d. Did students document the need in any way? Y N
1e. Was the community need linked to what students are learning in class? Y N
1f. Was the community need linked to academic content standards? Y N
1g. Did students engage in reflection as part of investigating the community need? Y N
1gi. Did students reflect on how the community need was connected to their lives outside of school? Y N
1gii. Did students reflect on how the community need was connected to what they are learning in class? Y N

Part B. Planning/preparation for a service activity
2a. In the last week, approximately how much class time was spent on planning/preparation for a service activity? # minutes
2b. Did students prepare/plan a service activity that involves direct service, indirect service, or research and advocacy? (select all that apply) Direct   Indirect   Research & Advocacy
2c. Who selected the service activity? (select all that apply) Teacher   Students   Community Partners
2d. Who had primary responsibility for selecting the service activity? (select one) Teacher   Students   Community Partners
2e. Who prepared/planned the action plan for doing the service activity? (select all that apply) Teacher   Students   Community Partners
2f. Who had primary responsibility for preparing/planning the action plan? (select one) Teacher   Students   Community Partners
2g. Was the planning/preparation activity linked to what students are learning in class? Y N
2h. Was the planning/preparation activity linked to academic content standards? Y N
2i. Did students engage in reflection about the planning/preparation process? Y N
2ii. Did students reflect on how the planning/preparation was connected to their lives outside of school? Y N
2iii. Did students reflect on how the planning/preparation was connected to what they are learning in class? Y N

Part C. Participation in a service activity
3a. In the last week, approximately how much class time was spent participating in the service activity? # minutes
3b. Did the service activity meet an authentic community need? Y N
3c. Did the service activity involve a community partner? Y N
3d. Did students evaluate the impact of the service activity on the community? Y N
3e. Did students evaluate the impact of the service activity on their own learning? Y N
3f. Was the service activity linked to what students are learning in class? Y N
3g. Was the service activity linked to academic content standards? Y N
3h. Did students engage in reflection about the service activity? Y N
3hi. Did students reflect on how the service activity was connected to their lives outside of school? Y N
3hii. Did students reflect on how the service activity was connected to what they are learning in class? Y N

Part D. Demonstration of the impact of a service activity
4a. Approximately how much class time was spent on preparing and delivering the demonstration? # minutes
4b. Did students make presentations about a service activity to: their own class, other school members (including parents), another school, the local community, or a broader audience? (select all that apply) Own class   Other school members   Another school   Local community   Broader audience
4c. Who participated in the demonstration? (select all that apply) Teacher   Students   Community Partner
4d. Who had primary responsibility for the demonstration? (select one) Teacher   Students   Community Partner
4e. Did students present data about the impact of a service activity on the community? Y N
4f. Was the demonstration linked to what students are learning in class? Y N
4g. Was the demonstration linked to academic content standards? Y N
4h. Did students engage in reflection about the demonstration activities? Y N
4hi. Did students reflect on how the demonstration was connected to their lives outside of school? Y N
4hii. Did students reflect on how the demonstration was connected to what they are learning in class? Y N

Part E. Celebration of a service activity
5a. Approximately how much class time was spent on celebration activities? # minutes
5b. Who participated in the celebration? (select all that apply) Teacher   Students   Community Partner
5c. Who had primary responsibility for the celebration? (select one) Teacher   Students   Community Partner
5d. Was the celebration linked to what students are learning in class? Y N
5e. Was the celebration linked to academic content standards? Y N
5f. Did students engage in reflection about the celebration activities? Y N
5fi. Did students reflect on how the celebration was connected to their lives outside of school? Y N
5fii. Did students reflect on how the celebration was connected to what they are learning in class? Y N

Part F. Collaborative/Cooperative/Project-based/Problem-based/Inquiry-based Activities
6a. In the last week, approximately how much time did students spend in any of these activities? # minutes
6b. Who developed/chose the activity? (select all that apply) Teacher   Students   Community Partner
6c. Who had primary responsibility for developing/choosing the activity? (select one) Teacher   Students   Community Partner
6d. Did students engage in reflection as part of the activity? Y N
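Each Yes in the weekly screener (Part I items 1-5 and Part II item 6) opens the corresponding detail part of the log. The sketch below illustrates that branching only; the dictionary keys and function name are hypothetical and are not taken from the log itself.

# Illustrative sketch of the weekly log's branching described above; names are hypothetical.
SCREENER_TO_PART = {
    "investigation": "Part A",
    "planning": "Part B",
    "service": "Part C",
    "demonstration": "Part D",
    "celebration": "Part E",
}

def parts_to_complete(screener: dict, used_other_approach: bool) -> list:
    """Return the detail parts a teacher is prompted to complete for the week."""
    parts = [part for key, part in SCREENER_TO_PART.items() if screener.get(key)]
    if used_other_approach:  # Part II item 6: collaborative/cooperative/project/problem/inquiry-based
        parts.append("Part F")
    return parts

# Example: a week with planning and a service activity, plus a group project,
# would prompt the teacher for Parts B, C, and F.
assert parts_to_complete({"planning": True, "service": True}, True) == ["Part B", "Part C", "Part F"]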
Classroom Activities Log
Instructions for Completing the Web-based Log

Logging On
Logon page
TeacherID: Unique 5-digit ID that has been assigned to you by the study. TeacherID is pre-populated on each Log. Teacher will be prompted to confirm that the TeacherID is correct by using the checkbox.
ClassID: Unique number (1-5) assigned to each of your classes that is involved in the study. ClassID is pre-populated on each Log. Teacher will be prompted to confirm that the ClassID is correct by using the checkbox.
Log#: Number of this Log in the sequence of weekly Logs. Log# is pre-populated on each Log.
LogStartDate, LogEndDate: These dates bracket the 5-day reporting period for the Log (typically one school week). LogStartDate and LogEndDate are pre-populated on each Log.
Date of Entry: Date that the teacher fills out the Log. Teacher will be prompted to enter the date of data entry in mm/dd/yyyy format.
Days in reporting period: Number of days that the class met during the 5-day reporting period. Teacher will be prompted to select a number between 1 and 5 from a drop-down menu.
Number of students in class during reporting period: Average number of students attending class during the 5-day reporting period. Teacher will be prompted to enter a whole number.
Next page: Checkbox to indicate that entry/confirmation of identifying information is complete. Once the teacher has entered a date and checked the two boxes confirming the pre-populated IDs, the teacher is prompted to check the Next box to be directed to the Log items.
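The logon-page fields above carry simple format rules (a 5-digit TeacherID, a ClassID from 1 to 5, an mm/dd/yyyy entry date, 1-5 days in the reporting period, and a whole-number student count). The following is a hedged sketch of those checks; the function and argument names are illustrative, and the actual web log may enforce the rules differently (for example, through the pre-populated values and drop-down menus described above).

# Illustrative validation of the logon-page fields described above; names are hypothetical.
import re
from datetime import datetime

def validate_logon_entry(teacher_id: str, class_id: int, entry_date: str,
                         days_in_period: int, avg_students: int) -> list:
    """Return any problems with the identifying information entered on the logon page."""
    errors = []
    if not re.fullmatch(r"\d{5}", teacher_id):        # TeacherID: unique 5-digit study ID
        errors.append("TeacherID must be a 5-digit number.")
    if class_id not in range(1, 6):                   # ClassID: 1-5, one per study class
        errors.append("ClassID must be between 1 and 5.")
    try:
        datetime.strptime(entry_date, "%m/%d/%Y")     # Date of Entry: mm/dd/yyyy
    except ValueError:
        errors.append("Date of Entry must be in mm/dd/yyyy format.")
    if days_in_period not in range(1, 6):             # Days the class met: 1-5
        errors.append("Days in reporting period must be between 1 and 5.")
    if avg_students < 0:                              # whole number of students
        errors.append("Number of students must be zero or more.")
    return errors

# Example: a complete, well-formed entry produces no error messages.
assert validate_logon_entry("12345", 2, "10/14/2011", 5, 24) == []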

Section I: Investigation

1. Investigation of a community need
Process of identifying community needs of interest and beginning research to assess the needs by designing a survey, conducting interviews, using varied media including books and the Internet, and drawing from personal experience and observation. Students may document the extent and nature of the problem and establish a baseline for monitoring progress. Community partners may be identified.
Select Yes if, during the reporting period, any class time was spent on investigation of a community need. If Yes is selected, Teacher will be prompted to answer items 1a – 1gii.
Select No if, during the reporting period, no classroom time was spent on investigation by pairs or groups of students, or the whole class.
Teacher will be prompted to select Yes or No.

1a. Time spent in investigation of a community need
Number of minutes of classroom time that was spent in investigation activities during the reporting period.
Teacher will be prompted to enter a whole number. If the teacher selects Yes for item 1, time spent in investigation should always be > 0 minutes.

1b. Groups helping in investigation activities
Three possible groups -- students, teacher, community partner(s) -- who might have been involved during the reporting period in the investigation activities.
Teacher will select one, two, or all three of the groups.

1c. Group with primary responsibility in investigation activities
Which group -- students, teacher, community partner(s) -- had the primary responsibility for the investigation activities during the reporting period. This includes choosing the community need(s) to be investigated and selecting the method(s) of investigation.
Teacher will be prompted to select only one of the three groups.

1d. Activities to document the community need
Whether or not, during the reporting period, students conducted activities to document the community need being investigated, for example, through displays of data, citations, or reports.
Select Yes if, during the reporting period for the Log, one or more students spent any time documenting the community need.
Select No if, during the reporting period for the Log, no classroom time was spent on documenting the community need.
Teacher will be prompted to select Yes or No.

1e. Link between community need being investigated and class content
Whether or not the community need that students are investigating during the reporting period is related or connected to the class content.
Select Yes if the community need being investigated is connected to the class content. The link does not have to be to the specific class content that was covered during the reporting period. The link can be to any class content regardless of when it was or will be covered.
Select No if the community need being investigated is not connected to the class content.
Teacher will be prompted to select Yes or No.

1f. Link between community need being investigated and academic standards
Whether or not the community need that students are investigating during the reporting period is related or connected to academic content standards for the subject area. These standards may be national, state, or district standards.
Select Yes if the community need being investigated is connected to academic content standards.
Select No if the community need being investigated is not connected to academic content standards.
Teacher will be prompted to select Yes or No.
Reflection: Students consider how the experience, knowledge, and skills they hope to acquire relate to their own lives, their community, and/or their academics. Students engage in varied activities to think about the needs, their actions, their potential or actual impact. This process includes both analytical and affective response.

Select Yes if, during the re porting period, students engaged in reflection about the community need being investigated during. If Yes is selected, teacher will be prompted to answer items 1gi - 1gii. Select No if, during the reporting period, students did not engage in any reflectio n about the community need being investigated. Teacher will be prompted to select Yes or No . 1gi. Reflection about connection of community need to students’ lives Whether or not students engaged in reflection about how the community need being investig ated is connected to their lives outside school. For example, students could engage in reflection about whether they/their family/their friends or neighbors are personally affected by the community need. Select Yes if, during the reporting period, studen ts engaged in reflection about the connection between the community need and their lives. Select No if, during the reporting period, students did not engage in reflection about the connection between the community need and their own lives. Teacher will be prompted to select Yes or No. 1gii. Reflection about connection of community need to course content Whether or not students engaged in reflection about how the community need being investigated is connected to what they are learning in the course. For example, students could engage in reflection about how the community need intersects with a concept or lesson in the curriculum. Select Yes if, during the reporting period, students engaged in reflection about the connection between the community need a nd what students are learning in class. Select No if, during the reporting period, students did not engage in reflection about the connection between the community need and what students are learning in class. Teacher will be prompted to select Yes or No. Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-41 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Servi ce. Prime contractor: Abt Associates. Section I: Planning and Preparation 2. Preparation and Planning for a service activity Selecting the service activity and developing an action plan for the service activity. Outlining varied ways to meet the community need or contribute to improving the situation. Planning may include: clarifying roles and responsibilities, developing a common vision for success, deciding what will occur and who will do each part of the work, creating a timeline, listing materials and costs, and overseeing any logist ics and approvals that must be obtained. Select Yes if, during the reporting period, any class time was spent on preparation and planning for a service activity. If Yes is selected, Teacher will be prompted to answer items 2a – 2iii. Select No if during the reporting period, no classroom time was spent on preparation and planning by individual students, groups of students, or the whole class. Teacher will be prompted to select Yes or No . 2a. Time spent in preparation/planning for service activity Num ber of minutes of classroom time that was spent in preparation/planning for the service activity during the reporting period. Teacher will be prompted to enter a whole number. If teacher selects Yes for item 2, time spent in preparation/planning should always be > 0 minutes. 2b. 
Preparation/ planning for direct service, indirect service, or research & advocacy Whether the service activity that students are preparing/planning for involves direct service with the community in which the need exists (s tude nts respond to a community need by interacting with and impacting the service recipient or site); indirect service (students build infrastructure or capacity to respond to the community need, for example, students pack food boxes at the local Food Bank); o r research and advocacy (students find, gather and report on information to raise awareness of a problem and/or advocate for change in the condition underlying the community need, for example, students meet with elected officials to urge support for additi onal food subsidy for low -income families). Teacher will be prompted to select one or more of the three types of service: Direct, Indirect, or Advocacy. 2c. Groups helping to select the service activity Three possible groups --students, teacher, commu nity partner(s) --who might have been involved during the reporting period in the selection of the community need to be addressed by the class service activity. Teacher will select one, two, or all three of the groups. 2d. Group with primary responsibil ity for selecting the service activity Which group --students, teacher, community partner(s) —had the primary responsibility during the reporting period for selecting the community need to be addressed by the class service activity. Teacher will be prompt ed to select only one of the three groups. 2e. Groups helping to prepare/plan the action plan for the service activity Three possible groups --students, teacher, community partner(s) --who might have been involved during the reporting period in developin g the action plan for the service activity. Teacher will select one, two, or all three of the groups. 2f. Group with primary responsibility in preparation/planning for a service activity activities Which group --students, teacher, community partner(s )— had the primary responsibility during the reporting period in developing the action plan for the service activity. Teacher will be prompted to select only one of the three groups. Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-42 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Servi ce. Prime contractor: Abt Associates. 2g. Link between community need being investigated and class content Whether or not the planning/preparation is related or connected to the class content . Select Yes if the preparation/planning is connected to the class content. The link does not have to be to the specific class content that was covered during the reportin g period. The link can be to any class content regardless of when it was or will be covered. Select No if the preparation/planning is not connected to the class content. Teacher will be prompted to select Yes or No . 2h. Link between community need bei ng investigated and academic standards Whether or not the planning/preparation is related or connected to academic content standards for the subject area. These standards may be national, state, or district standards. Select Yes if the planning/preparati on is connected to academic content standards. Select No if the planning/preparation is not connected to academic content standards . 
Teacher will be prompted to select Yes or No.

2i. Reflection: Students consider how the experience, knowledge, and skills they hope to acquire relate to their own lives, their community, and/or their academics. Students engage in varied activities to think about the needs, their actions, their potential or actual impact. This process includes both analytical and affective response. Select Yes if, during the reporting period, students engaged in reflection about the planning/preparation for the service activity. If Yes is selected, teacher will be prompted to answer items 2ii – 2iii. Select No if, during the reporting period, students did not engage in any reflection about the planning/preparation for the service activity. Teacher will be prompted to select Yes or No.

2ii. Reflection about connection of planning for the service activity to students' lives: Whether or not students engaged in reflection about how the planning/preparation for the service activity is connected to their lives outside school. Select Yes if, during the reporting period, students engaged in reflection about the connection between the planning/preparation for the service activity and their own lives. Select No if, during the reporting period, students did not engage in reflection about the connection between the planning/preparation for the service activity and their own lives. Teacher will be prompted to select Yes or No.

2iii. Reflection about connection of planning for the service activity to course content: Whether or not students engaged in reflection about how the planning/preparation for the service activity is connected to the course content. Select Yes if, during the reporting period, students engaged in reflection about the connection between the planning for the service activity and the course content. Select No if, during the reporting period, students did not engage in reflection about the connection between the planning for the service activity and the course content. Teacher will be prompted to select Yes or No.

Section I: Action

3. Action (service activity) to address a community need: Implementation of the plan to address a community need. Select Yes if, during the reporting period, any class time was spent on service activities to address a community need. If Yes is selected, Teacher will be prompted to answer items 3a – 3hii. Select No if, during the reporting period, no time was spent on service activities to address a community need. Teacher will be prompted to select Yes or No.

3a. Time spent in service activity: Number of minutes of classroom time that was spent in the service activity during the reporting period. Teacher will be prompted to enter a whole number. If teacher selects Yes for item 3, time spent in the service activity should always be > 0 minutes.

3b. Service to address authentic community need: Whether the service activity to address a community need has been documented as being relevant and important to the community. Select Yes if the service activity addresses a documented and real community need.
Select No if there is no evidence that the community need being addressed has been documented. Teacher will be prompted to select Yes or No . 3c. Involvement of community partner(s) in the service activity Whether or not one or more community partners collaborates with students in the service activity, e.g., by working alongsid e students in the community, by helping distribute or collect neighborhood surveys, etc. Select Yes if, during the reporting period, one or more community partners has collaborated with students on the service activity. Select No if, during the reportin g period, there was no involvement of a community partner in the service activity. Teacher will be prompted to select Yes or No . 3d. Evaluating impact of service activity on community Whether students participate in activities to evaluate the impact o f the service activity on the community, by measuring goals and results using methods such as surveys, discussions, other data collection. Select Yes if, during the reporting period, students participated in evaluation of the impact of the service activi ty on the community. Select No if during the reporting period, there was no student participation in evaluation of the impact of the service activity on the community. Teacher will be prompted to select Yes or No . 3e. Evaluating impact of service acti vity on student learning Whether students participate in activities to evaluate the impact of the service activity on their learning of the course material, by measuring goals and results using methods such as discussions, assessments, self -evaluation. Select Yes if, during the reporting period, students participated in evaluation of the impact of the service activity on their own learning. Select No if during the reporting period, there was no student participation in evaluation of the impact of the ser vice activity on students' own learning. Teacher will be prompted to select Yes or No . Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-44 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Servi ce. Prime contractor: Abt Associates. 3f. Link between service activity and class content Whether or not the service activity is related or connected to the class content . Select Yes if the service act ivity is connected to the class content. The link does not have to be to the specific class content that was covered during the reporting period. The link can be to any class content regardless of when it was or will be covered. Select No if the service a ctivity is not connected to the class content. Teacher will be prompted to select Yes or No . 3g. Link between service activity and academic standards Whether or not the service activity is related or connected to academic content standards for the subj ect area. These standards may be national, state, or district standards. Select Yes if the service activity is connected to academic content standards. Select No if the service activity is not connected to academic content standards. Teacher will be p rompted to select Yes or No. 3h. Reflection Students consider how the experience, knowledge, and skills they hope to acquire relates to their own lives, their community, and/or their academics. Students engage in varied activities to think about the need s, their actions, their potential or actual impact. 
This process includes both analytical and affective response. Select Yes if, during the reporting period, students engaged in reflection about the service activity. If Yes is selected, teacher will be p rompted to answer items 3hi – 3hii. Select No if, during the reporting period, students did not engage in any reflection about the service activity. Teacher will be prompted to select Yes or No . 3hi. Reflection about connection of service activity to students’ lives Whether or not students engaged in reflection about how the service activity is connected to their lives outside school. Select Yes if, during the reporting period, students engaged in reflection about the connection between the service a ctivity and their own lives. Select No if, during the reporting period, students did not engage in reflection about the connection between the service activity and their own lives. Teacher will be prompted to select Yes or No. 3hii. Reflection about c onnection of service activity to course content Whether or not students engaged in reflection about how the service activity is connected to the course content. Select Yes if, during the reporting period, students engaged in reflection about the connecti on between the service activity and the course content. Select No if, during the reporting period, students did not engage in reflection about the connection between the service activity and the course content. Teacher will be prompted to select Yes or No. Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-45 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Servi ce. Prime contractor: Abt Associates. Section I: Demonstration 4. Demonstration of the impact of the service activity Students provide evidence to others of their influence and accomplishments. They showcase what and how they have learned and their acquired skills and knowledge. In this context of demonstration, along with their partners, students may also plan and carry out a celebration of what they have gained and contributed including both the learning and the service. Select Yes if, during the reporting period, any class time w as spent on demonstration of the impact of the service activity. If Yes is selected, Teacher will be prompted to answer items 4a – 4hii. Select No if during the reporting period, no time was spent on demonstration of the impact of the service activity. Teacher will be prompted to select Yes or No. 4a. Time spent in demonstration of the impact of the service activity Number of minutes of classroom time that was spent in demonstration of impact of the service activity during the reporting period. Teac her will be prompted to enter a whole number. If teacher selects Yes for item 4, time spent in preparation/planning should always be > 0 minutes. 4b. Groups to which students demonstrated the impact of the service activity Groups to whom students demonst rated the impact of the service activity. Alternatives include other members of the class, another class in the school, the entire school community, members of the community, or a broader audience . Teacher will select one, two, three or all four of the groups. 4c. 
Groups helping in demonstration of the impact of the service activity Three possible groups --students, teacher, community partner(s) --who might have been involved in demonstrating the impact of the service activity during the reporting per iod. Teacher will select one, two, or all three of the groups. 4d. Group with primary responsibility for demonstrating the impact of the service activity Which group --students, teacher, community partner(s )— had the primary responsibility for demonst rating the impact of the service activity during the reporting period. Teacher will be prompted to select only one of the three groups. 4e. Student presentation of data on the impact of the service activity Whether as part of the demonstration of the impact of the service activity students presented data on the results or outcomes of the service activity. Select Yes if, during the reporting period, students presented data on the impact of the service activity. Select No if, during the reporting peri od, there was no student participation in presentation of data on the impact of the service activity. Teacher will be prompted to select Yes or No . Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-46 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Servi ce. Prime contractor: Abt Associates. 4f. Evaluating impact of service activity on student learning Whether students participate in activit ies to demonstrate the impact of the service activity on their learning of the course material. Select Yes if, during the reporting period, students participated in demonstration of the impact of the service activity on their own learning. Select No if, during the reporting period, there was no student participation in demonstration of the impact of the service activity on their own learning. Teacher will be prompted to select Yes or No . 4f. Link between demonstration of impact of the service activi ty and class content Whether or not the demonstration of the impact of the service activity is related or connected to the class content . Select Yes if, during the reporting period, the demonstration of the impact of the service activity is connected to the class content. The link does not have to be to the specific class content that was covered during the reporting period. The link can be to any class content regardless of when it was or will be covered. Select No if, during the reporting period, the d emonstration of the impact of the service activity is not connected to the class content. Teacher will be prompted to select Yes or No . 4g. Link between demonstration of impact of the service activity and academic standards Whether or not the demonstra tion of the impact of the service activity is related or connected to the academic content standards for the subject area. These standards may be national, state, or district standards. Select Yes if, during the reporting period, the demonstration of the impact of the service activity is connected to academic content standards. Select No if, during the reporting period, the demonstration of the impact of service activity is not connected to academic content standards. Teacher will be prompted to selec t Yes or No. 4h. 
Reflection about the demonstration of the impact of the service activity Students consider how the demonstration of the impact of the service activity relates to their own lives, their community, and/or their academics. Select Yes if, during the reporting period, students engaged in reflection about the demonstration of impact on the service activity. If Yes is selected, teacher will be prompted to answer items 4hi – 4hii. Select No if, during the reporting period, students did not eng age in any reflection about the demonstration of impact of the service activity. Teacher will be prompted to select Yes or No . 4hi. Reflection about connection between the demonstration of the impact of the service activity and students’ lives Whether or not students engaged in reflection about how the service activity is connected to their lives outside school. Select Yes if students engaged in reflection about the connection between the service activity and their own lives. Select No if students di d not engage in reflection about the connection between the service activity and their own lives. Teacher will be prompted to select Yes or No. Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-47 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Servi ce. Prime contractor: Abt Associates. 4hii. Reflection about connection between the demonstration of the impact of the service activity and cour se content Whether or not students engaged in reflection about how the demonstration of the impact of the service activity is connected to the course content. Select Yes if, during the reporting period, students engaged in reflection about the connection between the demonstration of the impact of the service activity and the course content. Select No if, during the reporting period, students did not engage in reflection about the connection between the demonstration of the impact of the service activity and the course content. Teacher will be prompted to select Yes or No. Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-48 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Servi ce. Prime contractor: Abt Associates. Section I: Celebration 5. Celebration Select Yes if during the reporting period, any classroom time was spent on celebration of the class activity about a community need, by any in dividual student, groups of students, or the whole class. If Yes is selected, Teacher will be prompted to answer items 5a – 5fii. Select No if during the reporting period for the Log, no classroom time was spent on celebration of the service activity. Teacher will be prompted to select Yes or No . 5a. Time spent in celebration of the service activity Number of minutes of classroom time that was spent in celebration of the service activity during the reporting period. Teacher will be prompted to enter a whole number. If teacher selects Yes for item 5, time spent in preparation/planning should always be > 0 minutes. 5b. 
Groups helping in celebration of the service activity Groups who participated in the celebration of the service activity --students, t eacher, community partner(s) during the reporting period. Teacher will select one, two, or all three of the groups. 5c. Group with primary responsibility in celebration of the service activity Which group --students, teacher, community partner(s ) — had the primary responsibility in celebration of the service activity during the reporting period. Teacher will be prompted to select only one of the three groups. 5d. Link between celebration of the service activity and class content Whether or not the celebration of the service activity is related or connected to the class content . Select Yes if, during the reporting period, the celebration of the service activity is connected to the class content. The link does not have to be to the specific clas s content that was covered during the reporting period. The link can be to any class content regardless of when it was or will be covered. Select No if, during the reporting period, the celebration of the service activity is not connected to the class con tent. Teacher will be prompted to select Yes or No . 5e. Link between celebration of the service activity and academic standards Whether or not the celebration of the service activity is related or connected to the academic content standards for the sub ject area. These standards may be national, state, or district standards. Select Yes if, during the reporting period, the celebration of the service activity is connected to academic content standards. Select No if, during the reporting period, the cele bration of the service activity is not connected to academic content standards. Teacher will be prompted to select Yes or No. Service -learning Evaluation Toolkit Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-49 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Servi ce. Prime contractor: Abt Associates. 5f. Reflection about the celebration of the service activity Whether or not students engaged in reflection about the celebra tion of the service activity and its relation to their own lives, their community, and/or their academics. Select Yes if, during the reporting period, students engage in reflection about the celebration of the service activity. Select No if, during the reporting period, students did not engage in reflection about the celebration of the service activity. Teacher will be prompted to select Yes or No . 5hi. Reflection about connection between the celebration of the service activity and students’ lives Wh ether or not students engaged in reflection about how the celebration of the service activity is connected to their lives outside school. Select Yes if, during the reporting period, students engaged in reflection about the connection between the celebrat ion of the service activity and their own lives. Select No if, during the reporting period, students did not engage in reflection about the connection between the celebration of the service activity and their own lives. Teacher will be prompted to selec t Yes or No. 5hii. 
Reflection about connection between the celebration of the service activity and course content: Whether or not students engaged in reflection about how the celebration of the service activity is connected to the course content. Select Yes if, during the reporting period, students engaged in reflection about the connection between the celebration of the service activity and the course content. Select No if, during the reporting period, students did not engage in reflection about the connection between the celebration of the service activity and the course content. Teacher will be prompted to select Yes or No.

II. Approaches to Learning

6. Other approaches to learning: Select Yes if, during the reporting period, any of the 5 approaches to learning were used in the class. This includes collaborative learning, cooperative learning, project-based learning, problem-based learning, or inquiry learning. If Yes is selected, Teacher will be prompted to answer items 6a – 6d. Select No if, during the reporting period, no classroom time was spent on any of the 5 approaches to learning. Teacher will be prompted to select Yes or No.

6a. Time spent in other approaches to learning: Number of minutes of classroom time that was spent in any of the 5 other approaches to learning during the reporting period. Teacher will be prompted to enter a whole number. If teacher selects Yes for item 6, time spent in the other approaches to learning should always be > 0 minutes.

6b. Groups involved in developing the focus of the other approaches to learning: Groups who participated in developing/selecting the focus of the other approaches to learning during the reporting period, including students, the teacher, and community partner(s). Teacher will select one, two, or all three of the groups.

6c. Group with primary responsibility in developing the focus of the other approaches to learning: Which group (students, teacher, community partner(s)) had the primary responsibility in developing/selecting the focus of the other approaches to learning during the reporting period. Teacher will be prompted to select only one of the three groups.

6d. Reflection about the other approaches to learning: Whether or not students engaged in reflection about how the other approaches to learning in which students were involved during the reporting period were related to students' own lives, their community, and/or their academics. Select Yes if, during the reporting period, students engaged in reflection about the other approaches to learning. Select No if, during the reporting period, students did not engage in reflection about the other approaches to learning. Teacher will be prompted to select Yes or No.
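The consistency rules built into the log described above (pre-populated IDs confirmed by checkbox, a 1 to 5 range for days in the reporting period, and the requirement that any section marked Yes report more than zero minutes of class time) lend themselves to automated data checks as completed logs are received. The following is a minimal sketch in Python of such checks. It is illustrative only and is not the NELSAP web system; the field names (teacher_id, class_id, the section flags and minutes) are hypothetical stand-ins for however an evaluation team chooses to store weekly log records.

import re

# The six log sections that pair a Yes/No flag with a minutes item
# (1 Investigation, 2 Planning/Preparation, 3 Action, 4 Demonstration,
#  5 Celebration, 6 Other approaches to learning).
SECTIONS = ["investigation", "planning", "action",
            "demonstration", "celebration", "other_approaches"]


def validate_log(record: dict) -> list:
    """Return a list of data-quality problems for one weekly log record."""
    problems = []

    # Identifying information: 5-digit TeacherID, ClassID between 1 and 5.
    if not re.fullmatch(r"\d{5}", str(record.get("teacher_id", ""))):
        problems.append("TeacherID must be a 5-digit study ID")
    if record.get("class_id") not in range(1, 6):
        problems.append("ClassID must be between 1 and 5")

    # Reporting period: class met on 1-5 days; entry date in mm/dd/yyyy format.
    if record.get("days_in_period") not in range(1, 6):
        problems.append("Days in reporting period must be between 1 and 5")
    if not re.fullmatch(r"\d{2}/\d{2}/\d{4}", record.get("date_of_entry", "")):
        problems.append("Date of entry must use mm/dd/yyyy format")

    # Skip logic: a section marked Yes must report minutes > 0;
    # a section marked No should not report any minutes.
    for section in SECTIONS:
        flag = record.get(f"{section}_yes")          # True / False
        minutes = record.get(f"{section}_minutes")   # whole number or None
        if flag and not (isinstance(minutes, int) and minutes > 0):
            problems.append(f"{section}: marked Yes but minutes not > 0")
        if not flag and minutes:
            problems.append(f"{section}: minutes reported but section marked No")

    return problems


# Example: a record that violates the Action-section rule.
example = {"teacher_id": "10234", "class_id": 2, "days_in_period": 5,
           "date_of_entry": "10/14/2011", "action_yes": True, "action_minutes": 0}
print(validate_log(example))   # ['action: marked Yes but minutes not > 0']

Running checks like these as each weekly log is submitted allows a study team to follow up with teachers about inconsistencies while the reporting period is still recent, rather than discovering them at analysis time.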
4.2.3 Teacher Interview on Service-Learning Activities in the Classroom (DRAFT)

National Evaluation of School-Based Learn and Serve America Programs
Teacher Interview (End-of-Course)

Introduction

Thank you again for participating in the National Evaluation of School-Based Learn and Serve America Programs. We are grateful to you and will use the information both to document impacts of the program and to identify program design characteristics that appear to be associated with the strongest outcomes.

To help us to identify those characteristics, we have some questions for y ou. It is important for you to provide as much detail as possible and to be very candid about the strengths and challenges associated with each activity. We want to assure you that all of your responses will be kept confidential to the maximum extent all owed by law. Any personally identifiable information will be removed from your responses. We will report information in the aggregate only; your school and district will not have access to your interview data at any time. Do you have any questions before w e begin? Control Class(es) First, we’d like to ask you to describe the class(es) assigned to the control condition. ( If the teacher had more than one control class, add: We will ask these questions for each of the control classes, starting with the (xx x period and content area) class.) General 1. Please describe any of the following approaches that you used in the control class over the last year (or semester if class was one semester) (Interviewer should review the approaches the teacher reported on th e log): a. Collaborative learning b. Cooperative learning c. Problem -based d. Project -based e. Inquiry -based 2. Please describe any instructional activities in your control class that took the place of service -learning activities in your service -learning class? Service -Learning Class(es) Next, we’d like to ask you to describe the class(es) that employed service -learning and not the control class. ( If the teacher had more than one service -learning class, add : We will ask these questions for each of the classes for which y ou had service -learning, starting with the (xxx period and content area) class.) — DRAFT — Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-53 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serv e America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. General 1. What motivates you to implement service -learning? 2. Please describe the service -learning activities that you conducted over the course of the year (or semester if clas s was one semester) in this class. 3. Before doing any service activities, did students investigate a community need? If yes, a Did students conduct investigation of a community need: as a whole class, in small groups, or individually? b Was the community need being investigated linked to academic content standards? c Did the investigation involve library or internet research? d Did the investigation involve direct contact with the community? e Did the investigation involve collecting baseline data about the extent of the community need? f Did the investigation involve working with community partners? Link to Curriculum 1. How did you link your service -learning activities to the state or district content standards for the class? Why did you select those particular standar ds for linkage? a. In what ways, if any, did you help students to transfer the knowledge and skills acquired in the classroom to their service projects? b. (SKIP this question when doing this interview for teacher’s second class ) In your district, is service -lea rning formally recognized in school board policies and/or student records? If yes, please explain how ? Community Partnerships 2. 
Please describe the community partnerships that you formed during your service -learning activities for each phase of the activiti es, including the name of the partner and the role that each served. a. How often did you communicate with the partner(s) and what was the general content of the communication? i. (IF MORE THAN ONE PARTNER) Was the communication process and content the same wit h all partners or did it vary? Please explain. ii. (PROBE, if not answered) Did any of the communication involve sharing knowledge and understanding of school and community assets and needs? If so, please describe. b. Did you work with community partners to e stablish a shared vision and common goals? (If NO, skip to 3c. If YES: Please describe how you established the vision and common goals.) c. What role did partners play in developing and implementing action plans? Meaningful Service 3. Please describe the ways i n which you tried to facilitate experiences that were meaningful to the students. a. What did you do to promote personal relevance for the students? b. How did you connect the experiences to social, political, or environmental issues? c. What did you do, if anythin g, to ensure that the experiences were developmentally appropriate for the students? — DRAFT — Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-54 Measures and documents were developed as part of the National Evaluation of School -Based Learn and Serv e America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. d. What did you do, if anything, to raise the level of interest and engagement of students? e. Please describe the ways in which those being served viewed their experience. (PRO BE: What indicators do you have, if any, that show their value of the service being offered?) Youth Voice 4. From your logs, we are able to determine the extent to which you have been able to engage your students in planning and implementing various compone nts of service -learning. Please describe how you involved the students and where, in your judgment, it was better for adults to make decisions. a. (PROBE): Please describe the ways, if any, in which you tried to nurture youth leadership and decision -making. W hat specific types and forms of leadership were you trying to help students develop? b. Please describe the ways, if any, in which you tried to create an atmosphere that nurtured open expression of ideas. What sorts of factors facilitated and impeded the dev elopment of trust and open expression? c. To what extent were students involved in planning their service -learning experience? Please describe. Specifically, i. Did students plan for a service activity: as a whole class, in small groups, or individually? ii. As part of planning, did students brainstorm multiple solutions to address the community need? iii. Did students use a planning process that included assignment of roles and timelines? iv. As part of planning, did students discuss how they might measure the impact of the service activity? d. To what extent were students involved in evaluating the quality and effectiveness of their service -learning experience? Please explain. Diversity 5. In what ways, if any, did you address diversity in your service -learning approach? (PROBE S, use if did not already answer): a. What activities, if any, did you provide to help students gain an understanding of multiple perspectives? b. 
What activities, if any, did you provide to help students develop skills in conflict resolution?
c. What activities, if any, did you provide to help students understand and value the backgrounds and perspectives of those receiving service?
d. What activities, if any, did you provide to help students address the issue of stereotyping?

Reflection
6. From the logs, we know when and how often you were able to include reflection activities within your service-learning program. Could you briefly describe the types of reflection activities you used?
a. Were you able to include reflection activities that prompted students to think about community problems and alternative solutions? (If YES, please describe.)
b. Were you able to include reflection activities that encouraged students to examine their preconceptions and assumptions so they could better understand their roles as citizens?

(If YES, please describe.)
c. Were you able to include reflection activities that asked students to understand the connection between their service experiences and public policy and/or civic life? (If YES, please describe.)
d. Were you able to include reflection activities that allowed students to consider the value of their individual and group contribution to service recipients? (If YES, please describe.)
e. Were you able to include reflection activities that allowed students to demonstrate an understanding of how their knowledge, skills, and/or attitudes had changed?

Progress Monitoring
7. From the logs, we are able to determine the extent to which students collected data. Will you please briefly describe the nature of the data collected?
a. From whom were the data collected?
b. How were the data used?
i. (PROBE if not answered): Were the data used to examine progress toward meeting goals? If yes, please explain how.
ii. (PROBE if not answered): Were the data used to measure quality? If yes, please explain how.
iii. (PROBE if not answered): Were the data used to improve the service-learning experience? If yes, please explain how.
iv. (PROBE if not answered): Were the data presented to anyone in the community outside of school? If yes, please explain what was presented and how.
v. (PROBE if not answered): Were the data used to help others understand service-learning? If yes, please explain how.

Both Class(es)
1. Were you able to cover the full scope and sequence in both classes? If no, please describe any differences and the reasons for not being able to cover the full scope and sequence in either (or both) class(es).

Conclusion
2. Is there anything else you would like to add?

4.3 Instruments to measure students' academic and civic engagement

In this section, we present a draft instrument designed to measure student outcomes in civic and academic engagement and students' perspectives on their service-learning. Recall that the primary goal of the NELSAP evaluation was to test the impact of service-learning on three key student outcomes: 1) academic achievement, measured by state proficiency or standardized achievement tests in core content areas [16] and school records data (for grade and course completion, and expected credit accrual); 2) academic engagement, measured by student self-report on standard questionnaires and school records data; and 3) civic engagement, measured by student self-report on standard questionnaires. Additionally, two exploratory student outcome domains were specified: 21st century skills, measured by student self-report on standard questionnaires, and predictors of dropout, measured by school records data (for failure in core subjects, absenteeism, grade retention, and disciplinary referrals).

Below, we present the Student Survey, designed to measure students':
Academic Engagement: valuing school, valuing the study class, interest in the core content area, and postsecondary aspirations
Civic Engagement: involvement with the community, sense of civic responsibility, civic efficacy
21st Century Skills: problem solving skills, teamwork skills
Service-learning: experience, service-learning characteristics and quality

The Student Survey was designed for NELSAP by the study team based on scales from other service-learning studies where possible. For evaluators interested in how the instrument was created, we also include a table relating survey items to the constructs they are measuring and a list of sources. The Student Survey Crosswalk lists the original sources for each survey item and the scale reliabilities from prior research. Original scales are highly reliable and appropriate for students. Scales were adapted by the study team to apply to the research questions and the study sample (9th and 10th graders in service-learning and control classrooms in core academic subjects in the 2011-12 school year). Student survey questions were cognitively tested with nine students in the 9th or 10th grade.

[16] If relevant state test scores were not available or were not administered at the appropriate time during the school year, the design called for the study team to administer norm-referenced achievement tests in core content areas.
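Evaluators who adapt these scales for a new study will typically want to confirm that the adapted items still hold together in their own sample and to compute scale scores before estimating impacts. The following is a generic sketch in Python of that step, not part of the NELSAP analysis plan: it assumes items are coded 1 through 6 to match the survey's six response options, scores a scale as the mean of its items, and computes Cronbach's alpha, the internal-consistency statistic reported in the crosswalk. The column names are hypothetical placeholders rather than the study's actual variable names.

import pandas as pd

# Hypothetical responses to a four-item civic responsibility scale,
# coded 1 = Strongly disagree ... 6 = Strongly agree.
items = ["civic_resp_1", "civic_resp_2", "civic_resp_3", "civic_resp_4"]
data = pd.DataFrame({
    "civic_resp_1": [5, 4, 6, 2, 3],
    "civic_resp_2": [4, 4, 5, 2, 3],
    "civic_resp_3": [5, 3, 6, 1, 4],
    "civic_resp_4": [6, 4, 5, 2, 3],
})


def cronbach_alpha(item_scores: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


# Scale score = mean of the items each student answered.
data["civic_responsibility"] = data[items].mean(axis=1)

print(round(cronbach_alpha(data[items]), 2))        # 0.96 for this toy data
print(data["civic_responsibility"].tolist())

A common rule of thumb treats an alpha of roughly .70 or higher as adequate for group-level comparisons, although the standard that matters is whatever the evaluation plan specifies.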
4.3.1 Student Survey, Crosswalk and Sources

National Evaluation of Learn and Serve America Programs

Thank you, in advance, for participating in this research. This survey should take approximately 40 minutes to complete. Your responses are confidential, and will only be seen by the research staff. Your responses will not be shared with your parents, your teachers, or anyone at your school.

Survey Instructions: The survey questions ask about your attitudes and opinions, so there are no right or wrong answers. The survey is not a test. Do your best, and read all instructions carefully. Don't spend too long on any one question; just answer as honestly as possible.

The sections in the survey are:
o You and your community
o Your education
o Your service-learning class (For students in the treatment classrooms at post-test only.)

Fill out the survey in pencil. The questions in these sections are multiple choice. Read each question carefully, and then look at the scale provided. Answer each question by completely filling in the circle that best describes your opinions and behaviors. If you wish to change the answer you picked, completely erase your first answer and fill in the circle for your new answer.

Your email address will be used to contact you about a final survey in Spring 2013. We will not share your email address outside of the study team or use your email address for anything besides the survey and sending you the $20 gift card after the final survey.

Email address: ________________________________________________________________

If you have any questions while completing this survey, ask the person administering the survey, or call the study's project director, (NAME), at Abt Associates toll-free at XXX-XXX-XXXX. THANK YOU!

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such a collection displays a valid OMB control number. The valid OMB control number for this information collection is XXXX.
The time required to complete this information collection is estimated to average 45 minutes per response, including the time to review instructio n, search existing data Completely fill in the circle for your answer: Right Wrong — DRAFT — Student Survey Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 59 Measures and documents were developed as part of the National Evaluatio n of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. Abt ID // barcode resources, gather the data needed, and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate or suggestions for improving this form, please write t o: XXXX. You and your community In this section, you will answer questions about you and your community. This section will include questions about: a) Your skills b) Your community c) Volunteering or community service — DRAFT — Student Survey Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 60 Measures and documents were developed as part of the National Evaluatio n of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. Abt ID // barcode Part a: Your Skills 1) In the past twelve months, how oft en did you do the following? (Mark only one response for each statement. ) Never A few times a year Almost once a month Almost once a week Almost everyday Everyday I talked about my ideas in front of other people.       I wrote about my ideas.       I found ways to solve problems.       I figured out how to make a good decision.       I came up with new ideas.       I was the leader of a group.       I listened to other people’s ideas even if they were different f rom mine.       I asked others to explain their ideas or points of view.       I compromised with other people to reach a common goal.       — DRAFT — Student Survey Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 61 Measures and documents were developed as part of the National Evaluatio n of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. Abt ID // barcode Part b: Your Community In this section, you will answer questions about your community. Th ink about the community as the agencies, businesses, and neighborhoods outside your school. 2) If you found out about a problem in your community that you wanted to do something about (for example, an increase in the number of local homeless people who coul d not shelter or high levels of lead were discovered in the local drinking water), how well do you think you would be able to do each of the following? ( Mark only one response for each statement. ) I definitely can’t I can’t I probably can’t I probably can I can I definitely can Create a plan to address the problem.       Get other people to care about the problem.       Organize and run a meeting about the problem.       Express your views about the problem in front of a group of peo ple.       Identify individuals or groups who could help you with the problem.       Express your views about the problem to others in writing. 
      Develop a webpage, newsletter, or blog about the problem.       Contact an exp ert that you had never met before to get their help with the problem.       Fundraise or collect donations to address the problem.       — DRAFT — Student Survey Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 62 Measures and documents were developed as part of the National Evaluatio n of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. Abt ID // barcode Think about the community as the agencies, businesses, and neighborhoods outside your school. 3) In t he past twelve months, how often did you… (Mark only one response for each question .) Never Just a few times Almost once a month Almost once a week Almost everyday Everyday Do things to make the community a better place?       Pay attention to new s that affects the community?       Talk with my friends about community problems?       Help to address community problems?       Encourage others to work on community problems?       Work with others to address a community iss ue?       Discuss how national issues affect the community?       Think about the community as the agencies, businesses, and neighborhoods outside your school. 4) How strongly do you disagree or agree with each statement? ( Mark only one resp onse for each statement .) Strongly disagree Disagree Slightly disagree Slightly agree Agree Strongly agree I am aware of the important needs in the community.       It is my responsibility to help improve the community.       I am aware of what can be done to meet the important needs in the community.       Helping other people is something that I am personally responsible for.       It is easy for me to put aside my self -interest in favor of a greater good.       Becom ing involved in social issues is a good way to improve the community.       Being concerned about community issues is an important responsibility for everybody.       Being actively involved in community issues is everyone’s responsibility, i ncluding mine.       — DRAFT — Student Survey Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 63 Measures and documents were developed as part of the National Evaluatio n of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. Abt ID // barcode Part c: Volunteering or Community Service 5) In the past 12 months, how often did you participate in any volunteering or community service activities (tutoring, working in a soup kitchen, working in a community garden, visitin g the elderly, etc.)? Do NOT include any community service or volunteering that was part of one of your classes. (Mark only one response) Never Just a few times Almost once a month Almost once a week Almost everyday Everyday       — DRAFT — Student Survey Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-64 Measures and documents were developed as part of the National Evaluatio n of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. 
Abt ID // barcode Your Ed ucation In this section, you will answer questions about your education. This will include questions about: a) your future b) your school c) your current class *at post -program only d) your interests e) service -learning f) service -learning class * at post -program, treatm ent only — DRAFT — Student Survey Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-65 Measures and documents were developed as part of the National Evaluatio n of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. Abt ID // barcode Part a: Your Future 6) Below are some statements about your goals for the future. (Mark only one response for each statement .) Very sure I won’t Sure I won’t Probably won’t Probably will Sure I will Very sure I will Not sure one way or the oth er I will graduate from high school.        I will continue my education beyond high school.        7) What education will you complete? (Mark all that apply )  Less than high school diploma/GED  High school diploma/GED  Technical school or certification (for example, mechanics certificate, cosmetology license)  Some college (community college, university, or 4 year college)  2 year college degree (Associates degree or AA)  4 year college or university degree (Bachelors, BA, or BS degre e)  More than college (for example, law degree, medical doctor, masters degree, etc)  Other (specify)____________________ — DRAFT — Student Survey Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-66 Measures and documents were developed as part of the National Evaluatio n of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. Abt ID // barcode Part b: Your School Baseline only: In this section, you will answer questions about the school you were in last school year. Th ink about your experiences in all of your classes at that school during last school year. 8) How much do you agree or disagree with each of the following statements? (Mark only one response for each statement ). Strongly disagree Disagree Slightly disagree Slightly agree Agree Strongly agree I liked being in school.       I was interested in the work at school.       I felt that the schoolwork I was assigned was meaningful.       My courses were interesting to me.       I though t that the things I learned in school would be important for my future.       I felt that school was worthwhile.       9) Last school year, how often did you… (Mark only one response for each question ). Never Just a few times Almost once a mo nth Almost once a week Almost everyday Everyday Have difficulty paying attention in school?       Have difficulty getting your homework for any class done?       Skip school?       — DRAFT — Student Survey Abt Associates Inc. Toolkit for Evaluating Service -Learning Programs ▌pg. 4-67 Measures and documents were developed as part of the National Evaluatio n of School -Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates. Abt ID // barcode Post -program and follow -up. In this section, you will answer questions about your current school. 
Think about your experiences in all of your classes at your school this school year.

8) How much do you agree or disagree with each of the following statements? (Mark only one response for each statement: Strongly disagree / Disagree / Slightly disagree / Slightly agree / Agree / Strongly agree.)
I like being in school.
I am interested in the work at school.
I feel that the schoolwork I am assigned is meaningful.
My courses are interesting to me.
I think that the things I am learning in school will be important for my future.
I feel that school is worthwhile.

9) This school year, how often did you… (Mark only one response for each question: Never / Just a few times / Almost once a month / Almost once a week / Almost every day / Every day.)
Have difficulty paying attention in school?
Have difficulty getting your homework for any class done?
Skip school?

Part c: Your [CONTENT AREA] Class

Post-program only: Now you will answer questions about the [CONTENT AREA] class you are currently in. Think about your experiences in this class only.

10) How much do you agree or disagree with each of the following statements? (Mark only one response for each statement: Strongly disagree / Disagree / Slightly disagree / Slightly agree / Agree / Strongly agree.)
I like being in this class.
I am interested in the work in this class.
I feel that the work I am assigned in this class is meaningful.
This course is interesting to me.
I think that the things I am learning in this class will be important for my future.
I feel that this class is worthwhile.

11) Consider your experience in this class. Since this class started, how often did you… (Mark only one response for each question: Never / Just a few times / Almost once a month / Almost once a week / Almost every day / Every day.)
Have difficulty paying attention in this class?
Have difficulty getting your homework for this class done?
Skip this class?

Part d: Your Interests

In this section, you will answer questions about CONTENT AREA.

12) How much do you agree or disagree with each of the following statements? (Mark only one response for each statement: Strongly disagree / Disagree / Slightly disagree / Slightly agree / Agree / Strongly agree.)
I am interested in CONTENT AREA.
I am good at CONTENT AREA.
I intend to take advanced courses in CONTENT AREA.
I am interested in careers that require CONTENT AREA skills.
Part e: Your Service-Learning

Baseline only:
13) Before this school year, did you participate in service-learning as part of any class? Service-learning means that the class includes some kind of community service or volunteer activity that is related to the subject of the class. (Mark only one response.)
No
Yes
13a) If Yes, in what subject was the service-learning? (Mark all that apply.)
English/Language Arts
Social Studies
Foreign Language
Science
Math
Other (specify) ________________

Post-program only:
13) During this school year, did you participate in service-learning as part of any class? Service-learning means that the class includes some kind of community service or volunteer activity that is related to the subject of the class. (Mark only one response.)
No
Yes
13a) If Yes, in what subject(s) was the service-learning? (Mark all that apply.)
English/Language Arts
Social Studies
Foreign Language
Science
Math
Other (please specify) ________________

Part f: Your Service-Learning Class
(For students in service-learning classrooms at end of course only)

14) Answer these questions about the [CONTENT AREA] class you are currently in. In thinking about your service-learning experience in this class, indicate how much you disagree or agree with each of the following statements. Service-learning means that the class includes some kind of community service or volunteer activity that is related to the content of the class. (Mark only one response for each statement: Strongly disagree / Disagree / Slightly disagree / Slightly agree / Agree / Strongly agree.)
My service-learning activities were meaningful to me.
My service-learning activities were important to me.
I helped provide ideas for my service-learning activities.
I helped make decisions about my service-learning activities.
I see direct connections between my service-learning activities and what I learned in class.

NELSAP Crosswalk for Student Survey (DRAFT)
(For each construct, the crosswalk lists the source scale, the number of items administered to the service-learning and control groups at baseline, post-program, and one-year follow-up, the scale's original reliability (α), and the corresponding item number(s) on the survey; item numbers in parentheses appear on the post-program survey only.)

A. Volunteering
Prior/current volunteering outside of service-learning. Source: Adapted from Evaluation of the TASC after-school program (Reisner, White, Russell, & Birmingham, 2004). Items: 1 per wave for both groups. Original α: NA. Survey item: 5.
B. Academic Engagement
Postsecondary aspirations. Source: Postsecondary Aspirations Scale (RMC Research, 2009). Items: 3 per wave for both groups. Original α: .86. Survey items: 6, 7.
Valuing school. Sources: Survey of academic engagement, school level (RMC Research, 2005), 6 items per wave for both groups, α = .82-.83, survey item 8; AddHealth (Johnson, Crosnoe, & Elder, 2001), 3 items per wave for both groups, α = .77, survey item 9.
Content area interests. Source: Adapted from Interest in STEM Subjects (RMC Research, 2009). Items: 4 per wave for both groups. Original α: .64. Survey item: (12).
Valuing the class. Sources: Adapted from Survey of academic engagement, class level (RMC Research, 2005), 6 items at post-program for both groups, α = .82-.83, survey item (10); adapted from AddHealth (Johnson, Crosnoe, & Elder, 2001), 3 items at post-program for both groups, α = .77, survey item (11).

C. 21st Century Skills
Problem-solving and teamwork skills. Source: Adapted from 21st Century Skills Acquisition Scale (RMC Research, 2009). Items: 9 per wave for both groups. Original α: .83. Survey item: 1.

D. Civic Engagement
Involvement with community. Source: Survey of Community Engagement (RMC Research, 2007). Items: 7 per wave for both groups. Original α: .82-.84. Survey items: 3 and beginning of 4.
Sense of civic responsibility. Source: Civic Awareness Scale (Furco, Muller, & Ammon, 1998). Items: 8 per wave for both groups. Original α: .85. Survey items: 4 and end of 3.
Civic efficacy. Source: Competence for Civic Action (Flanagan, Syvertsen, & Stout, 2007). Items: 9 per wave for both groups. Original α: .78-.87. Survey item: 2.

E. Service-Learning
Prior/current service-learning experience. Source: New. Items: 2 per wave for both groups. Original α: NA. Survey item: 13 (13a).
Service-learning characteristics and quality. Source: Adapted from Quality of Service-Learning Practice Student Survey Scale (RMC Research, 2003). Items: 5 at post-program for the service-learning group only. Original α: .91. Survey item: (14).

TOTAL number of items: service-learning group 53 at baseline, 68 at post-program, and 53 at one-year follow-up; control group 53 at baseline, 62 at post-program, and 53 at one-year follow-up.

Student Survey References

Flanagan, C. A., Syvertsen, A. K., & Stout, M. D. (2007, May). Civic measurement models: Tapping adolescents' civic engagement (CIRCLE Working Paper 55). College Park, MD: Center for Information and Research on Civic Learning and Engagement.

Furco, A., Muller, P., & Ammon, M. S. (1998). Civic responsibility survey for K-12 students engaged in service-learning. Berkeley, CA: Service-Learning Research & Development Center, University of California, Berkeley.

Johnson, M. K., Crosnoe, R., & Elder, G. H. (2001). Students' attachment and academic engagement: The role of race and ethnicity. Sociology of Education, 74, 318-340.

National Youth Leadership Council (2008). K-12 service-learning standards for quality practice. Saint Paul, MN: National Youth Leadership Council.

Reisner, E., White, R., Russell, C., & Birmingham, J. (2004). Evaluation of the TASC after-school program, high school student survey, 2001-2001. Washington, DC: Policy Studies Associates. Retrieved on 27 August 2010 from http://www.policystudies.com/studies/youth/Evaluation%20TASC%20Programs.html/highschoolsurvey

RMC Research Corporation (2010). Quality of service-learning practice student survey scale. Denver, CO: RMC Research Corporation.

RMC Research Corporation (2009). 21st Century Skills Acquisition Scale. Denver, CO: RMC Research Corporation.

RMC Research Corporation (2009). Interest in STEM Subjects. Denver, CO: RMC Research Corporation.

RMC Research Corporation (2009). Postsecondary Aspirations Scale. Denver, CO: RMC Research Corporation.

RMC Research Corporation (2007). Survey of community engagement. Denver, CO: RMC Research Corporation.

RMC Research Corporation (2005). Survey of academic engagement: School level. Denver, CO: RMC Research Corporation.

RMC Research Corporation (2005). Survey of academic engagement: Class level. Denver, CO: RMC Research Corporation.

Zimmerman, B. J., Bandura, A., & Martinez-Pons, M. (1992). Self-motivation for academic attainment: The role of self-efficacy beliefs and personal goal setting. American Educational Research Journal, 29, 663-676.
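The crosswalk above reports an internal-consistency reliability (Cronbach's α) for most scales. For evaluators who adapt these instruments and want to check reliability in their own samples, the sketch below shows one standard way to score a multi-item scale and estimate α from item-level responses. It is a minimal illustration written in Python, not the scoring code used by the study team; the eight-item agreement-scale example, the 1-6 coding of the response anchors, and the sample responses are all hypothetical.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scale item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                               # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)       # variance of each item across respondents
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical responses from five students to an eight-item agreement scale,
# coded 1 = Strongly disagree ... 6 = Strongly agree.
responses = np.array([
    [5, 6, 5, 6, 4, 5, 6, 5],
    [3, 2, 3, 3, 2, 3, 2, 3],
    [4, 4, 5, 4, 4, 4, 5, 4],
    [6, 6, 6, 5, 5, 6, 6, 6],
    [2, 3, 2, 2, 3, 2, 2, 3],
])
scale_scores = responses.mean(axis=1)            # one scale score per student
print(round(cronbach_alpha(responses), 2))       # reliability estimate for the scale

In practice, analysts would also reverse-code any negatively worded items and decide how to handle missing item responses before computing scale scores or reliability estimates.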
4.4 Instruments for Recruitment of Districts, Schools, and Teachers

In this section, we present the recruitment materials and consent/assent forms developed for the National Evaluation of School-Based Learn and Serve America Programs. The recruitment materials include documents that were to be used by the study team to recruit potential participants at all levels (district, classroom, student) and to gain their agreement to participate in the study. All of these documents are identified in bold text and are presented at the end of this section.

Because no master list existed of the population of interest (i.e., eligible teachers), the design called for a top-down recruitment approach, relying on state and regional information networks and professional ties to identify and recruit potential participants for the study. For instance, school district superintendents and/or regional and district service-learning coordinators were to be recruited to facilitate data collection activities, to provide lists of schools or teachers in each district that received LSA funding, and to facilitate site visits to the districts and schools by members of the study team. Through this approach, when the study contacted schools, principals could be assured that study participation had been cleared by the district. The exhibit below illustrates the recruiting strategy graphically. The goals of each contact and their customized materials are described briefly below.

[Exhibit: Recruitment Process]

4.4.1 School Districts: Superintendents and Service-Learning Coordinators

The study team had planned to begin recruitment at the school district, regardless of whether Learn and Serve America funding was provided directly to districts, schools, or teachers, in order to obtain district cooperation for schools to participate in the study. To facilitate contact with the districts, the study team would request that the state Learn and Serve America director send an email to the superintendents of districts identified as having been funded directly by the state and of districts associated with high schools or teachers that had received school-based Learn and Serve America funding. The purpose of this email was to introduce the National Evaluation, express support for the study from the state Learn and Serve America director, and alert the district to expect a call from a member of the study team.

The study team would then follow up by sending a District Introductory Letter (with Study Brochure and Study Fact Sheet) to the superintendent (with a copy to the district service-learning coordinator, if applicable) of each identified district. The letter provides information about the study design and data collection activities and notifies the district administrators that a member of the study team will be contacting them to arrange a conference call to discuss the study and identify potentially eligible teachers in district high schools.[17] The Study Brochure and Study Fact Sheet provide general information about the study that could be distributed to district and school administrators, teachers, and parents.
The district mailing would be followed by a telephone call to the district superintendent to introduce the study team, provide additional information about the study, answer any questions they may have, confirm the district's willingness for its high schools and teachers to participate, and obtain permission to contact the principals of high schools with eligible teachers. To facilitate this call, the study team member would use the protocol designed for these calls, the Topic Guide: District Introductory Call. In districts with a district-level service-learning coordinator, we would suggest that this individual be included in the call.

High Schools

Once district agreement to participate had been obtained, a senior member of the study team would contact each high school with at least one Learn and Serve America-funded teacher. To facilitate communication with high schools in the sample, the design calls for district superintendents to send an email to the principal of each school in their district with potentially eligible teachers to tell them about the evaluation, express their support, and ask for their cooperation when contacted by a member of the study team.

After principals had received the superintendent's email, the study team would initiate communication with the principal of each high school by sending the School Introductory Letter (with Study Brochure and Study Fact Sheet). This letter provides information about the study design and data collection activities and notifies the principal that a member of the study team will be contacting them to arrange a conference call to discuss the study and identify potentially eligible teachers in the school.

Letters to school principals would be followed by a telephone call to the principal from a member of the study team to provide additional information about the study, answer questions, discuss the school's participation in the study, and confirm whether the school has teachers who meet the study eligibility criteria (as described above). Study team members would use the Topic Guide: School Introductory Call to facilitate this introductory call.

Teachers

When at least one teacher who met the three initial eligibility criteria was identified in a school and the principal was willing for the teacher(s) and the school to participate in the National Evaluation, the principal and eligible teacher(s) would be asked to participate in a half-day site visit with a member of the study team. These meetings would be facilitated using the Topic Guide: School Principal and Teacher Meeting.

[17] During the final recruitment phase, the fourth criterion for teacher eligibility (uses an approach to service-learning that represents at least minimal standards of quality) would be confirmed through the Teacher Information Form.
The purpose of the meetings was: (1) to discuss with teachers the study design, including the random assignment of classrooms, student and teacher data collection activities, compensation plans, and the use of a site liaison; (2) to confirm information learned during the introductory call with the principal about the eligibility of teachers; (3) to discuss the logistics of how random assignment could be implemented at the school; and (4) to answer questions about the study design and secure the school's and eligible teachers' interest in and willingness to participate in the study upon confirmation of teacher eligibility.

Interested teachers would then be directed to the online Teacher Information Form (see above) to collect information on the fourth teacher eligibility criterion, the quality of service-learning practices. They would also sign the Teacher Consent Form, formally agreeing to participate in the study.

When the schools and teachers in a district had agreed to participate in the study and teacher eligibility was confirmed, district and school administrators would be asked to formally indicate their agreement to participate in the study by signing a Memorandum of Understanding (MOU). The MOU contains information about the study design and the expectations for the study team and for the districts and schools participating in the study.

To facilitate data collection and reduce burden on school administrators and teachers, the National Evaluation included funds for a site liaison to help coordinate study activities in each site (see Site Liaison Job Description and Agreement). Plans would be made for a member of the study team to work with the district or school to identify an individual in the community (who could be a district or school staff person) to be hired by the study team to act as the site liaison. The site liaison could be hired at the district or school level, depending upon the number of teachers and schools in the district participating in the study and the needs of the site.

Parents and Students

Because the National Evaluation design focused on underage students (9th and 10th graders), agreement was needed from parents (see Parental Consent Form) of students in classrooms being recruited to participate in the research. In addition, student assent would be solicited from students prior to completing the student survey (see Student Assent Form).

4.4.2 Presentation of General Study Information

Study Brochure

BENEFITS FOR PARTICIPANTS
Schools, teachers, and students participating in the study will be making an important contribution to the largest and most comprehensive study ever conducted of service-learning. This study will provide rigorous evidence about the effectiveness of service-learning programs and contribute valuable research to benefit schools, teachers, students, service-learning practitioners, and policy-makers.

PAYMENT
Districts, schools, teachers, and students will receive honoraria for study participation.

QUESTIONS?
If you have questions, or would like to learn more about the study, please contact us! (800) XXX-XXXX

RESEARCH TEAM
(NAME) Corporation for National & Community Service
(NAME) Abt Associates Inc.
(NAME) RMC Research Corporation

NATIONAL EVALUATION OF SCHOOL-BASED LEARN AND SERVE AMERICA PROGRAMS
The Corporation for National and Community Service is funding a NATIONAL study of service-learning.

WHAT IS SERVICE-LEARNING?
Service-learning is a teaching and learning strategy that integrates meaningful community service with instruction and reflection to enrich the learning experience, teach civic responsibility, and strengthen communities. By combining service objectives and learning objectives, service-learning aims to affect students' academic and civic outcomes.

PURPOSE OF THE STUDY
The purpose of this study is to measure the effect of service-learning activities on 9th and 10th grade students. The study will look at whether service-learning helps improve students' academic achievement and academic and civic engagement.

PARTICIPATION IS VOLUNTARY
Participation in the study by schools, teachers, and students is voluntary.

TEACHER PARTICIPATION
Teachers who agree to participate in the study will be asked to do the following:
Implement service-learning in some of their classrooms and not in others.
Complete one pre-study survey, two telephone interviews, and logs on classroom activities.
Participate in webinars on study activities (as necessary).
Work with study liaisons at their schools to coordinate data collection activities.

STUDY ACTIVITIES
The study will take place during the 2011-2012 school year in 9th and 10th grade classrooms in nine states across the country. Teachers participating in the study will teach some of their classes with service-learning and some without service-learning. Students will be asked to complete surveys at the beginning and at the end of the semester or school year, among other activities.

CONFIDENTIALITY
There is a minimal risk of breach of confidentiality. However, the study team follows strict rules to protect the confidentiality of the information that schools, teachers, and students share with us. Data will only be collected from teachers and students who agree to be in the study, and parents of students must also provide written permission. The information we collect through the study will be kept confidential and used for study purposes only. Names of individual schools, teachers, and students will not appear in any reports produced for this study.

The results of this study will help us better understand the impact of service-learning on students.
DRAFT – JULY 27, 2010

Study Fact Sheet

NATIONAL EVALUATION OF SCHOOL-BASED LEARN AND SERVE AMERICA PROGRAMS
Study Fact Sheet

The Corporation for National and Community Service (CNCS), the federal agency that oversees Learn and Serve America, the largest funder of service-learning programs, has contracted with Abt Associates and RMC Research Corporation to conduct a national evaluation of the impact of its K-12 school-based service-learning program.
The purpose of the study is to conduct the most rigorous possible evaluation to measure the effect of high-quality service-learning activities on 9th and 10th grade students who participate in service-learning in an academic course. The study will look at impacts on students' academic achievement and academic and civic engagement. The study will be based on a national sample of teachers who are experienced, high-quality service-learning teachers. The study will compare students taught by these teachers in similar classes with and without service-learning activities.

Participation in this study is voluntary and will not affect schools', districts', or teachers' Learn and Serve America funding opportunities.

Study Design
The study will be conducted in high schools that are currently receiving Learn and Serve America funds or that received these funds during the 2006-2009 Learn and Serve America funding cycle. Nine states (Arizona, California, Florida, Massachusetts, Michigan, New York, Ohio, Texas, and Virginia) have been selected to participate in the evaluation based on geographic diversity and Learn and Serve America grant funding levels. A sample of approximately 185 teachers will be selected for the study. To be eligible for the study, a teacher must:
Have received Learn and Serve America funding for their service-learning activities at least once since 2006;
Have demonstrated the use of high-quality service-learning approaches; and
Plan to implement service-learning in at least two classes of a core academic area with 9th or 10th grade students in the 2011-12 school year.
The study will select two classrooms from each teacher. One classroom will be randomly selected as the classroom in which the teacher will continue to implement service-learning; in the second classroom, the teacher will not implement service-learning in the 2011-12 school year.

Timeline and Data Collection
School and teacher recruitment for the study will begin in fall 2010 for participation during the 2011-2012 school year. For teachers, data collection activities will involve reporting on the qualities of the instruction and student experiences in their two participating classrooms and keeping logs over the year on classroom activities. Students will be asked to complete surveys about their involvement with the school and with the community before and after the class and one year later. Students' academic achievement will be measured by state test scores in the service-learning content area, or by a standardized test administered at the end of the semester or school year (if agreed upon by all parties). The study team will also collect school record data on demographic characteristics of participating students. No student data will be collected without parental permission for the student to participate.

Benefits to Study Participants
Study participants will be making an important contribution to the largest and most comprehensive study ever conducted of service-learning. This study will provide rigorous evidence about the effectiveness of service-learning and contribute valuable evidence-based research to benefit all service-learning practitioners and policy-makers.

Compensation
Districts, schools, teachers, and students participating in the study will be compensated for their time and expenses related to study activities.

Study Contacts
(NAME) Project Director, Abt Associates Inc.
(xxx-xxx-xxxx) (email address)
(NAME) Principal Investigator, RMC Research Corporation (xxx-xxx-xxxx) (email address)
(NAME) Federal Project Officer, Corporation for National and Community Service (xxx-xxx-xxxx) (email address)

4.4.3 Introductory Letters and Call Topic Guides

District Introductory Letter and Call Guide

District Introductory Letter

DATE
«Salutation» «FirstName» «LastName»
«Title»
«Address1»
«City», «Abbr» «ZipCode»

Dear _________,

We are writing in reference to an important national study of service-learning that is being funded by the Corporation for National and Community Service (CNCS) and carried out by Abt Associates Inc. and RMC Research Corporation. CNCS is a federal agency that was created in 1993 to provide opportunities for Americans of all ages and backgrounds to give back to their communities and their nation. As the nation's largest grantmaker supporting service and volunteering, CNCS provides funds to K-12 districts and schools to implement service-learning through its Learn and Serve America Program. As service-learning has become a prevalent educational practice (it is estimated that service-learning currently is practiced in over a third of public secondary schools nationally), there have been calls for more rigorous research on the impacts of service-learning on students.

We are hoping that you, along with schools and teachers in your district who are currently utilizing or have utilized Learn and Serve America grant funds for service-learning activities, will be interested in participating in this study. The study will examine the impacts of service-learning in core academic content areas on the academic achievement and the academic and civic engagement of 9th and 10th grade students. Participation in this study is voluntary and is in no way tied to the district's current or future Learn and Serve America funding opportunities. Your district's decision whether or not to participate will not affect your relationship with the state or CNCS.

Your state is one of nine states selected by CNCS for the evaluation, based on factors that include the amount of Learn and Serve America funding received by the state and its geographic location. Across the nine selected states, we are seeking to recruit up to 185 teachers in approximately 93 school districts, with each teacher contributing at least two classrooms to the study. Your district was identified by your state Learn and Serve America director as having received state Learn and Serve America funds to implement high-quality service-learning. We would like the opportunity to contact high school principals and their teachers in your district who have experience implementing service-learning and who plan to do so in 9th and 10th grade classes in a core academic content area in the next school year. When we contact them, we will explain the study and determine if they are interested in participating. If so, we will arrange a visit to the schools to talk further with teachers who are implementing service-learning.
The study team will work with schools and teachers to determine which teachers and classes are eligible for participation in the study. Our aim is to implement a random assignment study that has the capacity to assess whether service-learning produces positive academic and civic outcomes in students. Therefore, for each teacher who agrees to participate, our researchers will randomly select which classrooms will receive service-learning and which will not. Teachers will continue to implement their service-learning curriculum in treatment classrooms and will refrain from implementing service-learning in control classrooms for one school year (2011-12). At the end of the year, the study will compare the outcomes for students in all of the classrooms in the study. This design will allow us to attribute differences in student outcomes to service-learning.

A liaison will be identified in each district or school to help facilitate data collection activities. The liaison will be compensated for assisting with study activities. Districts, schools, and teachers will also be compensated for their time spent on study activities. Data collection activities will consist of the following:
Two teacher interviews, midway through the semester or school year and at the end of the semester or school year.
Teacher logs of classroom activities throughout the semester or school year.
Student surveys on service-learning experiences in the classroom and on academic and civic engagement (at the beginning and the end of the semester or school year, and again one year later).
One standardized test of students in the service-learning content area of the participating classes near the end of the semester or school year, if state test scores are not available in the service-learning content area at the necessary time (or even if state test data are available but a study-administered test is agreed upon by all parties).
School records data for participating students.

All of the data collected for the study will be kept confidential to the maximum extent allowed by law. Any personally identifiable information will be removed from participants' responses, all of which will be encoded with a unique identification number to be used only by persons engaged in the research. We will report information in the aggregate only; no student, teacher, school, or district names will appear in any publicly released reports. All data collected for this study by Abt Associates and RMC, including personally identifiable information, will be provided to CNCS. However, CNCS will not release these data or use them for purposes other than this study.

We believe that this study will provide crucial and credible evidence about the effectiveness of Learn and Serve America programs and deepen and strengthen the available research about the benefits of service-learning on key academic and civic outcomes for students. We are hoping you will consider participating.

A member of the study team will be contacting you by telephone within the next week to provide additional details about the study, answer any questions you may have, and discuss the possible participation of eligible teachers in your district. Any further questions you may have about the study can be directed to (NAME) at Abt Associates, (NAME) at RMC, or (NAME) at CNCS. Contact information is provided below. Thank you for your consideration and cooperation.

Sincerely,

(NAME) Project Director, Abt Associates Inc., 55 Wheeler Street, Cambridge, MA 02138 (xxx-xxx-xxxx) (email address)
(NAME) Principal Investigator, RMC Research Corporation, 633 17th Street, Suite 2100, Denver, CO 80202 (xxx-xxx-xxxx) (email address)
(NAME) Federal Project Officer, Office of Research and Policy Development, Corporation for National and Community Service, 1201 New York Ave, NW, Washington, DC 20525 (xxx-xxx-xxxx) (email address)

Topic Guide: District Introductory Call

This topic guide will be used by study staff to facilitate telephone calls to district administrators overseeing or supporting high schools and/or teachers implementing service-learning in the district.

A. Introductions and Overview of Study
Review objectives of the study and importance to CNCS, the federal government, and the field
Review key features of the study design, including random assignment of classrooms, data collection requirements (student surveys, student testing (if necessary), teacher interviews, teacher logs), and schedule of study activities
Review honoraria and incentives to compensate schools and teachers for burden
Discuss the role of the onsite school/district liaison in coordinating data collection activities at the school, and the planned compensation
Discuss the district administrator's questions/concerns about the study design
Note that participation in this study is voluntary and is not tied to the district's current or future Learn and Serve America funding opportunities.

B. Service-Learning Activities in High Schools
Explain that we would like to learn more about service-learning activities in the district's high schools in order to confirm their eligibility for the study, and confirm that they agree to continue the conversation
Confirm the number of high schools implementing service-learning activities (based on LASSIE data) in the district
Confirm the number of 9th and 10th grade teachers implementing service-learning and discuss teachers' previous experience
Discuss the core academic area of classes with service-learning in 9th and 10th grades (math, science, social studies, English)
If the district does not have 9th or 10th grade high school teachers with experience implementing service-learning in core academic areas, thank them for their time and interest but explain that they are not eligible for the study.
If the district has high schools with teachers who appear eligible for the study, continue to site visit planning.

C. Site Visit Planning
Explain that we would like to contact the principals at the eligible high schools to discuss the study and a possible site visit, and ask for their permission for us to contact the principals.
Discuss the district administrator's role in school site visits (if necessary to facilitate recruitment)
Obtain or confirm contact information for the principals at the eligible high schools

School Introductory Letter and Call Topic Guide

School Introductory Letter

DATE
«Salutation» «FirstName» «LastName»
«Title»
«Address1»
«City», «Abbr» «ZipCode»

Dear _________,

We are writing in reference to an important national study of service-learning that is being funded by the Corporation for National and Community Service (CNCS) and carried out by Abt Associates Inc. and RMC Research Corporation. CNCS is a federal agency that was created in 1993 to provide opportunities for Americans of all ages and backgrounds to give back to their communities and their nation. As the nation's largest grantmaker supporting service and volunteering, CNCS provides funds to K-12 districts and schools to implement service-learning through its Learn and Serve America Program. As service-learning has become a prevalent educational practice (it is estimated that service-learning currently is practiced in over a third of public secondary schools nationally), there have been calls for more rigorous research on the impacts of service-learning on students.

We are hoping that your school and teachers who are utilizing or have utilized Learn and Serve America grant funds for service-learning activities will be interested in participating in this study. The study will examine the impacts of service-learning in core academic areas on the academic achievement and the academic and civic engagement of 9th and 10th grade students. Participation in this study is voluntary and is in no way tied to your school's current or future Learn and Serve America funding opportunities. Your school's decision whether or not to participate will not affect your relationship with the state or CNCS.

Your state is one of nine states selected by CNCS for the evaluation, based on factors that include the amount of Learn and Serve America funding received and its geographic location. Across the nine selected states, we are seeking to recruit up to 185 teachers in approximately 93 school districts, with each teacher contributing at least two classrooms to the study. Your high school was identified by your state Learn and Serve America director as having received state Learn and Serve America funds to implement high-quality service-learning. We would like the opportunity to come to your school to talk to you and the teachers in your school who have experience implementing service-learning and who plan to do so in 9th or 10th grade classes in a core academic content area in the 2011-12 school year. During our visit, we will explain the study further and determine if teachers are interested in participating. The study team will work with schools and teachers to determine which teachers and classes are eligible for participation in the study.
Our aim is to implement a random assignment study that has the capacity to assess whether service-learning produces positive academic and civic outcomes in students. Therefore, for each teacher who agrees to participate, our researchers will randomly select which classrooms will receive service-learning and which will not. Teachers will continue to implement their service-learning curriculum in treatment classrooms and will refrain from implementing service-learning in control classrooms for one school year (2011-12) or one semester, depending upon class length. At the end of the year, the study will compare the outcomes for students in all of the classrooms in the study. This design will allow us to attribute differences in student outcomes to service-learning.

A liaison will be identified in each district or school to help facilitate data collection activities. The liaison will be compensated for assisting with study activities. Schools and teachers will also be compensated for their time spent on study activities. Data collection activities will consist of the following:
Two teacher interviews, midway through the semester or school year and at the end of the semester or school year.
Teacher logs of classroom activities throughout the semester or school year.
Three student surveys on service-learning experiences in the classroom and on civic and academic engagement (at the beginning and the end of the semester or school year, and again one year later).
One standardized test of students in the service-learning content area of the participating classes near the end of the semester or school year, if state test scores are not available in the service-learning content area at the necessary time (or even if state test data are available but a study-administered test is agreed upon by all parties).
School records data for participating students.

All of the data collected for the study will be kept confidential to the maximum extent allowed by law.

Any personally identifiable information will be removed from participants' responses, all of which will be encoded with a unique identification number to be used only by persons engaged in the research. We will report information in the aggregate only; no student, teacher, school, or district names will appear in any publicly released reports. All data collected for this study by Abt Associates and RMC, including personally identifiable information, will be provided to CNCS. However, CNCS will not release these data or use them for purposes other than this study.

Along with this letter, we are sending copies of a one-page study fact sheet and a study brochure with additional information about the study. We would like to ask you to distribute these to any teachers in your school who you believe may be eligible for participation. To be eligible, teachers must meet the following criteria: 1) have received school-based Learn and Serve America funding in the previous (2006-09) or current (2009-12) Learn and Serve America funding cycle; 2) plan to implement service-learning in the 2011-12 school year in at least two classes of a core academic area (or areas) for 9th or 10th grade students (these two classes do not have to be in the same core academic area, and classes can be semester-long or year-long); 3) have at least one year of experience, which can include the 2010-11 school year, utilizing service-learning in a core academic area with 9th or 10th grade students; and 4) use an approach to service-learning that represents at least minimal standards of quality. The evaluation of service-learning quality will be based on a teacher's previous service-learning classes or, if the teacher is in the first year of service-learning, on a current service-learning class.

A member of the study team will be contacting you by telephone within the next week to provide additional details about the study, answer any questions you may have, and discuss study participation of eligible teachers in your school. We believe that this study will provide crucial and credible evidence about the effectiveness of Learn and Serve America programs and deepen and strengthen the available research about the benefits of service-learning on key academic and civic outcomes for students. We are hoping you and eligible teachers in your school will consider participating. Any additional questions you may have about the study can be directed to (NAME) at Abt Associates, (NAME) at RMC, or (NAME) at CNCS. Contact information is provided below. Thank you for your consideration and cooperation.

Sincerely,

(NAME) Project Director, Abt Associates Inc., 55 Wheeler Street, Cambridge, MA 02138 (xxx-xxx-xxxx) (email address)
(NAME) Principal Investigator, RMC Research Corporation, 633 17th Street, Suite 2100, Denver, CO 80202 (xxx-xxx-xxxx) (email address)
(NAME) Federal Project Officer, Office of Research and Policy Development, Corporation for National and Community Service, 1201 New York Ave, NW, Washington, DC 20525 (xxx-xxx-xxxx) (email address)
Topic Guide: School Introductory Call

A. Overview of Study
Review objectives of the study and importance to CNCS, the federal government, and the service-learning field
Note that participation in this study is voluntary and is not tied to the district's current or future Learn and Serve America funding opportunities.
Review key features of the study design, including random assignment of classrooms, data collection requirements (student surveys, student testing (if necessary), teacher interviews, teacher logs), and schedule of study activities
Review honoraria and incentives to compensate the school and teachers for burden
Review confidentiality procedures for data collected
Discuss the role of the onsite school/district liaison in coordinating data collection activities at the school, and the planned compensation
Discuss the principal's questions/concerns about the study design

B. Service-Learning Activities in School
Discuss the core academic classes with service-learning in the ninth and tenth grades at the school
Identify the teachers implementing these service-learning classes and their experience

C. Determine Teachers Who Are Potentially Eligible to Participate in Study
Confirm which teachers in the school meet the initial three criteria for participation in the study: 1) have received school-based Learn and Serve America funding in the previous (2006-09) or current (2009-12) Learn and Serve America funding cycle; 2) plan to implement service-learning in the 2011-12 school year in at least two classes of a core academic area (or areas) for 9th or 10th grade students (these two classes do not have to be in the same core academic area, and classes can be semester-long or year-long); and 3) have at least one year of experience, which can include the 2010-11 school year, utilizing service-learning in a core academic area with 9th or 10th grade students.

D. Site Visit Planning
Determine whether the principal is willing to participate in the study
If the principal is not interested, thank the principal for his/her time
If the principal is interested:
Arrange to send additional materials (e.g., study brochure, one-page fact sheet) to be distributed to eligible teachers
Discuss arrangements for a site visit to the school to meet with the principal and eligible teacher(s)

Topic Guide: School Principal and Teacher Meeting
A. Introductions and Overview of Study
Introductions among teachers, other school staff attending the meeting, and study team members
Review objectives of the study and importance to CNCS, the federal government, and the service-learning field
Review key features of the study design, including random assignment of classrooms, data collection requirements (student surveys, student testing (if necessary), teacher interviews, teacher logs), and schedule of study activities
Review guidance that will be provided on what teachers will need to eliminate from the control classroom
Review honoraria and incentives to compensate the school and teachers for the burden of being in the study
Discuss the role of the onsite school/district liaison in coordinating data collection activities at the school, and the planned compensation
Review confidentiality procedures for study data
Discuss questions/concerns about the study design

B. Process for Implementing Random Assignment
Discuss the procedure and timing of class assignments for the following school year to determine the best time for random assignment to take place

C. Determine Initial Eligibility of Teachers to Participate in Study
Ask the teachers interested in the study to confirm that they meet the initial three criteria for participation in the study: 1) have received school-based Learn and Serve America funding in the previous (2006-09) or current (2009-12) Learn and Serve America funding cycle; 2) plan to implement service-learning in the 2011-12 school year in at least two classes of a core academic area (or areas) for 9th or 10th grade students (these two classes do not have to be in the same core academic area, and classes can be semester-long or year-long); and 3) have at least one year of experience, which can include the 2010-11 school year, utilizing service-learning in a core academic area with 9th or 10th grade students.
Determine that there is at least one teacher eligible based on these criteria and interested in participating
If no, thank the principal and teachers for their time
If yes, continue to the topics below

D. Administer Teacher Information Form
Explain the two-stage process of determining eligibility: eligibility on the three initial criteria, and the final (fourth) eligibility criterion based on responses to the Teacher Information Form (TIF)
Ask any teachers who meet the first three eligibility criteria to complete the TIF online

E. Next Steps
Thank teachers for their time and explain that they will be contacted after the study team has completed recruitment
If time permits and the principal is attending the meeting and interested, review the draft Memorandum of Understanding (MOU) with the principal
F. Follow-up (To Occur after the Visit)
Determine the total number of teachers who will be asked to participate in the study and contact the school to notify them of which teachers have been selected for participation
Request that eligible teacher(s) who agree to participate in the study complete the online Teacher Consent Form(s)
Arrange to obtain a signed MOU from the principal (with the signature of a district administrator)
Identify a site liaison (district or school staff) who will be paid a stipend by the study team to facilitate on-site data collection during the student and teacher data collection phase

4.4.4 Memorandum of Understanding (MOU) and Other Documents of Agreement

School/District MOU

Memorandum of Understanding (MOU)
Between the Learn and Serve America Study Team & [School and District Name]

This Memorandum of Understanding (MOU) is between the Learn and Serve America Study Team and [School & District Name] concerning participation in the Corporation for National and Community Service's (CNCS) National Evaluation of School-Based Learn and Serve America Programs. This MOU describes the terms and conditions associated with the participation of [School & District Name] in the study. Each party is signing this agreement in good faith and with the expectation of fulfilling its obligations as described in the MOU. This agreement is contingent on CNCS exercising its authority to approve continued implementation of this study. Participation in this study is voluntary and is in no way tied to the school's or district's current or future Learn and Serve America funding opportunities.

Study Background
The purpose of the National Evaluation of School-Based Learn and Serve America Programs is to evaluate the effect of service-learning in core academic areas on 9th and 10th grade students. The study will examine student outcomes in academic achievement and academic and civic engagement. CNCS has contracted with Abt Associates Inc. and its subcontractor, RMC Research Corporation, to conduct this study.

Approximately 185 high schools that have received or are receiving Learn and Serve America grant funds in 9 states across the country will be selected to participate in this evaluation on the basis of having teachers in the school who meet the eligibility criteria for the study. Teachers in these schools have voluntarily agreed to participate in the study, including following the study protocol, accepting the randomization of classrooms, and participating in the data collection (the participating teachers and schools from your school and district are listed in Attachment A).

Participation in this study by districts, schools, and teachers is voluntary and is not tied to Learn and Serve America program funding. Furthermore, CNCS has stipulated that grant-related student and teacher survey requirements (e.g., surveys for grantee performance measurement) for program year 2011-2012 will be waived for districts and schools participating in this study. However, districts and schools will still be required to complete any other reporting requirements related to LASSIE and the grantee progress reports.

The Study Team, led by Abt Associates, will randomly assign participating teachers' classrooms to treatment (service-learning) and control (no service-learning) groups. The Study Team will collect data using student surveys, teacher interviews, teacher logs, student standardized tests (if state test scores are not available in the service-learning content area at the necessary time, or even if state test data are available but a study-administered test is agreed upon by all parties), and individual student school records.

Roles and Responsibilities of the Study Team and Participating Schools and Districts
Major responsibilities and expectations for both the Study Team and participating schools and districts are listed in this section.

THE STUDY TEAM AGREES TO:

Recruit Teachers for the Study
The Study Team will identify which schools and teachers are eligible for the study and willing to participate. From among the eligible and interested teachers, one or more teachers at each school will be asked to be in the study for the 2011-12 school year.

Conduct Random Assignment
The Study Team will conduct random assignment of participating classrooms for each participating teacher to either the treatment (service-learning) or control (no service-learning) condition. Random assignment will be conducted after participating schools have assigned students to the participating teachers' classes using regular school processes. Shortly after random assignment has been conducted, the Study Team will inform participating schools, the site liaison(s), and participating teachers of the results of random assignment.

Collect Data
The Study Team will collect data from participating students and teachers in the participating schools. Data will be collected only from students who have signed parent permission forms and who assent to participate. The Study Team will work with the school or district site liaison to coordinate and/or implement data collection activities in each school. Attachment B summarizes the schedule for data collection activities; specific arrangements for data collection activities will be made at times agreed to by the Study Team, the site liaison, and participating teachers in each school. The following is a description of these activities.

Student Surveys.
Students who have parent permission and who themselves have assented to participate in the study will be surveyed three times: (1) at the beginning of the study course in fall 2011; (2) at the end of the course, either at the end of the semester or school year (depending on the length of the class); and (3) one year after the end of the semester or school year (depending on the length of the class). The site liaison (and/or teacher) will be responsible for distributing surveys and other forms (including Parent Permission Forms) to the participating students; ensuring that students and parents complete the forms; and collecting and returning the completed forms to the Study Team. The site liaison will work with the participating teacher(s) and other school staff, as necessary, to identify a time and location for students to complete their surveys (preferably in class). The site liaison or other school staff may be called upon to assist the Study Team in distributing the one-year follow-up surveys and/or locating students who have left the school.

Student Testing (if necessary). If state test data are not available in the service-learning content area at the necessary time (or if state test data are available but a study-administered test is agreed upon by all parties), students will be asked to take a standardized test near the end of the semester or school year (depending on the length of the class). The test will be administered by the Study Team. The Study Team will work with the site liaison, participating teacher(s), and other school staff, as necessary, to identify a time and location for students to complete the test (preferably in class).

Student Enrollment and Status Updates. The Study Team will request and collect information about students in the study who move and/or change schools during the study period.

Teacher Interviews. Teachers who have consented to participate in the study will be interviewed by telephone midway through the semester (for semester-long classes) or midway through the year (for year-long classes) and at the end of the semester (for semester-long classes) or school year (for year-long classes).

Teacher Logs. Teachers who have consented to participate in the study will be asked to document their classroom activities in both the treatment and control classrooms using weekly teacher logs.

Student Record Data. The Study Team will collect student records data for all participating students at three points in time: baseline, post-program, and follow-up. The Study Team will work with the site liaison to obtain these files from the school or district.

Ensure Confidentiality
All members of the Study Team will sign data confidentiality agreements. Student and teacher data will be used only by Abt Associates and its subcontractors for research purposes. This includes study data collected directly by the Study Team, site liaison, or classroom teacher and any administrative data provided to the Study Team by the district and participating schools.

Parents will receive information about Abt Associates' confidentiality policies prior to providing permission for their children to participate in the evaluation. The Study Team will keep all personally identifiable information confidential, to the extent allowed by law. Data collected on individuals as part of this study will be included in study reports only in aggregate form so they cannot be tied to individual students or teachers; no district or school names will appear in any reports. The Study Team is required to provide all data collected for this study to CNCS. However, CNCS will not release these data or use them for purposes that are not related to this study.

Provide Honorariums and Incentives to Participating Districts, Schools, Teachers and Students
Honoraria and incentives to be provided for study participation are as follows:
Districts will receive an honorarium of $1000 to provide the Study Team with student records data (electronic if possible) for the students in participating classrooms. These data include information on attendance, discipline, credit accrual, and state proficiency test scores. These data files will be provided for each round of data collection: (1) baseline, (2) post-program, and (3) follow-up. Half of the honorarium ($500) will be provided to districts at the end of the 2011-12 study year, and the other half ($500) will be provided upon completion of the second study year.
Schools will receive an honorarium of $500 at the beginning of the 2011-12 school year for participation in the study, including facilitating data collection activities and allowing study activities to be conducted during the school year.
Teachers will receive an honorarium of $2000 for participation in the study, including all activities related to curriculum planning time, student and teacher data collection, and study communication and training. Half of the honorarium ($1000) will be provided to teachers at the beginning of the 2011-12 study year, and the other half ($1000) will be provided upon completion of the first study year.
Students will receive a $20 incentive prior to completing the one-year follow-up survey.
Study classrooms will receive a $50 incentive at the beginning of the 2011-12 study year for participating in the study. This incentive is intended to be used for an activity of the teacher's or students' choice.

[School and District Name] AGREES TO:

Participate for the Duration of the Study
School and district participation in the National Evaluation of School-Based Learn and Serve America Programs will begin in approximately May 2011 and extend through September 2013. [School and District Name] agrees to support the data collection activities that will be conducted as part of this study as outlined above.

Cooperate With Random Assignment Procedures
The school and district agree to support participating classrooms' assignment, either to the treatment or control group, for the duration of the study. Each of the participating schools will assign students to the participating teachers' classes using their regular processes.
The Study Team will then conduct random assignment of two or more of each teacher's classrooms selected to be part of the study to either the treatment (service-learning) or control (no service-learning) condition. The participating teacher(s) will implement service-learning only in treatment classrooms. To maintain the integrity of the random assignment, the participating teacher(s) must agree not to implement service-learning in the control classroom(s). Students in either classroom (treatment or control) may not enroll in any other service-learning classes at their schools during the 2011-12 school year. However, these students are not excluded from participating in other community service activities in the district, school, or community. A simple illustration of the classroom lottery appears below.
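Evaluators adapting this design may find it helpful to see how the classroom lottery described above could be carried out. The following is a minimal, illustrative sketch in Python, assuming exactly two eligible classrooms per participating teacher; the roster, field names, and seed are hypothetical placeholders, not part of the Study Team's actual procedure.

import random

# Hypothetical roster: each participating teacher mapped to the two classrooms
# selected for the study (identifiers are placeholders).
roster = {
    "teacher_01": ["period_2", "period_5"],
    "teacher_02": ["period_1", "period_4"],
}

def assign_classrooms(roster, seed=2011):
    """Return {teacher: {"treatment": classroom, "control": classroom}}."""
    rng = random.Random(seed)  # fixed seed so the lottery can be documented and re-run
    assignments = {}
    for teacher, classrooms in roster.items():
        shuffled = classrooms[:]   # copy so the original roster is left unchanged
        rng.shuffle(shuffled)      # the "lottery": random order within each teacher
        assignments[teacher] = {"treatment": shuffled[0], "control": shuffled[1]}
    return assignments

if __name__ == "__main__":
    for teacher, groups in assign_classrooms(roster).items():
        print(teacher, groups)

Fixing the random seed allows the assignment to be documented and verified later; a teacher with more than two eligible classrooms would require the logic to be extended accordingly.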

Facilitate Data Collection Activities
The school and district agree to facilitate data collection activities as necessary, and to allow the Study Team to conduct the activities outlined in this agreement for the duration of the study. This includes allowing the site liaison or Study Team to distribute parent permission forms to students in the study classes to take home to be signed. The parent permission forms will include permission for the student to participate in the student surveys and in any standardized testing that is required, as well as permission for the Study Team to collect data from their child's student record. Schools will be responsible for notifying parents and students of the results of random assignment, which will determine which study class the students have been assigned to and whether the class will include service-learning.

Provide Student Data
After the site liaison has collected parent permission forms, the district or school will provide the Study Team with a classroom roster for each participating class, including student names and ID numbers. Information on students not participating in the study will be redacted. The school or district will provide student records data for students in the study classrooms at three points during the study: baseline, post-program, and follow-up.

Identify and Work with a Site Liaison
The school and district will assist the Study Team in identifying one or more site liaisons to facilitate data collection activities at participating schools within the district. The site liaisons will be compensated separately (in addition to the other district or school honorariums listed above) by the Study Team for assisting with study activities.

CHANGES TO THE AGREEMENT:
We anticipate that over the course of the study, some modifications or additions to this agreement may be necessary. It is understood that the terms may be adjusted with written amendments as agreed upon by both parties.

Signatures
The following people have read this Memorandum of Understanding and acknowledge the terms and conditions regarding participation in the Corporation for National and Community Service's National Evaluation of School-Based Learn and Serve America Programs by the [School and District Name].

[Name], Project Director    Date
Principal, [School Name]    Date
Superintendent, [District Name]    Date

Study Team Contact Information
For further information about the National Evaluation of School-Based Learn and Serve America Programs, please contact:
[Name], Project Director, Abt Associates, Inc., 55 Wheeler Street, Cambridge, MA 02138, (xxx) xxx-xxxx, [email address]
[Name], Principal Investigator, RMC Research Corporation, 633 17th Street, Suite 2100, Denver, CO 80202, (xxx) xxx-xxxx, [email address]
[Name], Federal Project Officer, Office of Research and Policy Development, Corporation for National and Community Service, 1201 New York Ave, NW, Washington, DC 20525, (xxx) xxx-xxxx, [email address]
For questions about your rights as a study participant, please contact:
[Name], Institutional Review Board Administrator, Abt Associates, Inc., (xxx) xxx-xxxx (toll-free)
Attachment A
Participating Teachers & Schools in [School & District Name]
Teacher Name    School Name

Attachment B
Data Collection Activity: Date*
Conduct random assignment of classrooms: July 2011 to August 2011 (will depend upon school schedule)
Conduct student surveys: September 2011 (baseline); June 2012 (post-program); June 2013 (follow-up). Note: the schedule for post-program and follow-up surveys will vary for semester-long classes.
Collect teacher logs: September 2011 to June 2012 (every week)
Conduct teacher interviews: January 2012 (mid-year); June 2012 (post-program)
Administer student achievement tests: May 2012
Collect student records (including state test scores): September 2011 (baseline); September 2012 (post-program); September 2013 (follow-up)
*Dates will be revised for semester-long classes prior to finalizing the MOU.

Site Liaison Job Description and Agreement

Measures and documents were developed as part of the National Evaluation of School-Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates.

National Evaluation of School-Based Learn and Serve America Programs
Site Liaison Job Description & Agreement

The Corporation for National and Community Service (CNCS), the federal agency that oversees Learn and Serve America and the largest funder of service-learning programs, has contracted with Abt Associates Inc. and RMC Research Corporation to conduct a national evaluation of its school-based service-learning program. The purpose of the study is to measure the effect of high-quality service-learning activities on 9th and 10th grade students who participate in service-learning in a core academic course during the 2011-2012 school year. The study will include a national sample of 185 experienced service-learning teachers across multiple states. Prior to the start of the 2011-2012 school year, two 9th or 10th grade core academic classes of each teacher in the study will be randomly assigned to implement (a) curriculum with service-learning or (b) curriculum without service-learning. The study includes three student surveys: two will be conducted in class, one at the beginning of each study class and the second at the end of the class; the final student survey will be conducted online in spring 2013. We will also collect student records such as academic achievement scores and attendance. Teachers in the study will complete brief weekly classroom activity logs and one telephone interview at the end of the study class.
To facilitate the data collection and reduce the burden on school staff, a site liaison will be hired in each site to work with the study team to oversee all study activities in that site. The responsibilities of the site liaison may vary slightly by site but will likely include the following key activities:
Coordinating on-site procedures related to the random assignment of classrooms: working with the study team to implement the classroom random assignment process (summer of 2011).
Managing the parent permission process: distributing and collecting parent permission forms for all students in the study classes (fall 2011).
Facilitating the student survey data collection: (1) arranging for study staff to meet on-site with all students in the study classes (whose parents provided permission) to conduct the student survey and achievement tests, if necessary (fall 2011 and spring 2012), and (2) providing contact information and outreach for students during the follow-up student survey (spring 2013).
Assisting with student record collection: helping to obtain student records for the students in the study by identifying the appropriate district IT or data person and facilitating the contact between the study team and this person (fall 2011).
To fulfill the responsibilities of the site liaisons, we are seeking local individuals who are familiar with the schools and districts in the study. Possible candidates for these positions include current or retired teachers or school administrators, substitute teachers, or school or district support staff. We estimate that the site liaison's study-related responsibilities in year one will begin just prior to the start of the 2011-2012 school year and will require a total of approximately 30 hours through the end of the 2011-2012 school year. For this work, we will pay the site liaisons a total of $1000 in two installments of $500 each, upon the completion of each student survey data collection in the 2011-2012 school year. In year two (2012-13 school year), site liaison tasks will focus primarily on outreach and follow-up with students for the follow-up student survey. For this work, site liaisons will be paid $500 in two installments ($250 each), with the second installment contingent upon completion of study duties such as documentation of follow-up to all survey nonrespondents. Site liaisons will be encouraged to retain the position for two years. However, we recognize that school staff and individual circumstances change from year to year, so the site liaison agreement is non-binding and there will be no consequences for site liaisons who choose to withdraw from the position at the end of the first year. Please contact (NAME) at (EMAIL ADDRESS) or (XXX-XXX-XXXX) for further information about the study and/or the responsibilities of the site liaison.
National Evaluation of School-Based Learn and Serve America Programs
Site Liaison Agreement

This memorandum serves as an agreement between the site liaison and Abt Associates Inc. for services provided in connection with the National Evaluation of School-Based Learn and Serve America Programs during the 2011-2012 and 2012-13 school years. Tasks and specific responsibilities include, but are not limited to, the following:
Coordinating on-site procedures related to the random assignment of classrooms: working with the study team to implement the classroom random assignment process (summer of 2011).
Managing the parent permission process: distributing and collecting parent permission forms for all students in the study classes (fall 2011).
Facilitating the student survey data collection: (1) arranging for study staff to meet on-site with all students in the study classes (whose parents provided permission) to conduct the student survey and achievement tests, if necessary (fall 2011 and spring 2012), and (2) providing contact information and outreach for students during the follow-up student survey (spring 2013).
Assisting with student record collection: helping to obtain student records for the students in the study by identifying the appropriate district IT or data person and facilitating the contact between the study team and this person (fall 2011).
In year one, the site liaisons will receive a total of $1000 in two installments of $500 each, upon the completion of each student survey data collection in the 2011-2012 school year. In year two, site liaisons will receive a total of $500 in two installments ($250 each), with the second installment contingent upon completion of study duties such as documentation of follow-up to all survey nonrespondents.
Please complete the contact information below, make a copy of this document for your records, and return the original, signed agreement to:
ABT STAFF MEMBER, Abt Associates Inc., 55 Wheeler Street, Cambridge, MA 02138 (or fax to Abt staff name at #####)

Name of School & District: __________________________________    Date: ____________
Site Liaison Name (please print): ________________________________________________
Site Liaison Signature: ________________________________________________________
Address (to send payment): ____________________________________________________
____________________________________________________
_____________________________________________________
Social Security Number (needed for payment): _____________________________________
Telephone Number: _____________________    Fax Number: __________________________
Email Address: ______________________________________________________________

Teacher Informed Consent Form

Measures and documents were developed as part of the National Evaluation of School-Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates.
TEACHER INFORMED CONSENT FORM

The Corporation for National and Community Service (CNCS) is funding a study about service-learning in high schools. Abt Associates Inc., a research firm in Cambridge, Massachusetts, and its partner RMC Research in Denver, Colorado, have been hired to conduct this study. The goal of the study is to examine how service-learning affects the academic achievement, academic engagement, and civic engagement of 9th and 10th grade students. This form provides additional details about the study and asks for your consent to participate should you be selected. Selection for participation in the study will be based on your implementation of service-learning and your willingness to participate in the requirements of the study.

Study Design and Participation
A total of up to 185 teachers in approximately nine states will be included in the study. To be able to rigorously test the effects of service-learning, the study team will use a within-teacher random assignment design, meaning that we will select two of the classes in which you were planning to use service-learning in the 2011-12 school year and randomly assign which one will include service-learning (treatment class) and which will not (control class). You will not be able to choose which of your classes will use service-learning and which will not. To implement the random assignment design, the study team will ask the participating high schools to assign students to their classes for fall 2011 just as they usually do. After all of the students have been assigned to their classes for the fall, the study team will use a lottery to decide which class will be the treatment class and which will be the control class. We will notify you of the results of the random assignment as early as possible in order to allow you to plan your coursework accordingly. If you have any questions or comments about the National Evaluation of Learn and Serve America Programs study, please contact the study team by emailing (STUDY EMAIL ADDRESS), or by calling (toll-free) XXX-XXX-XXXX.

Is Participation Voluntary?
Yes, taking part in the study is voluntary. Your decision whether or not to participate will have no effect now or in the future on your receipt of funding from your district, state, CNCS, or Learn and Serve America, or on your employment status. You may change your mind and withdraw from this project at any time without penalty. Teachers who complete all study activities will be compensated $2,000 for their time and expenses related to study activities.

What are the Benefits and Risks?
Teachers participating in the study will be making an important contribution to the largest and most comprehensive study ever conducted of service-learning. This study will contribute valuable evidence about the effectiveness of service-learning programs for students that will benefit service-learning practitioners and policy-makers. There is minimal risk from participation in the study. There is a minimal risk of breach of confidentiality, but we have many procedures in place to minimize this risk. All information obtained from schools, teachers, and students as a result of this study will be kept confidential, to the extent allowed by law. Your name will be kept separate from your interview and log responses and will not be included in any reports about what we learn from the study.
School staff, district staff, and parents will not be allowed to see any of your interview or log responses. We are required to provide the information we collect from you to CNCS, including your name and other personal identifiers, along with the results of the study. CNCS will not use this information in any way to determine whether you or your school or district will receive future Learn and Serve America grants or other funding. CNCS will not disclose your personal information to any other parties not engaged in the evaluation, unless required by law.

Honorarium
Teachers will receive an honorarium of $2000 for participation in the study, including all activities related to curriculum planning time, student and teacher data collection, and study communication and training. Half of the honorarium ($1000) will be provided to teachers at the beginning of the 2011-12 study year, and the other half ($1000) will be provided upon completion of the first study year.

Additional Questions
If you have any questions about the study, you can contact (NAME), Abt Study Director, at XXX-XXX-XXXX (toll call). For questions about your rights with regard to the study, you may call (NAME), Abt Institutional Review Board Administrator, toll-free at XXX-XXX-XXXX.

To agree to participate in the National Evaluation of School-Based Learn and Serve America Programs if you are selected, please complete the information below and check the box indicating that you agree to participate. Thank you.

Name: _________________________________    School Name: ________________________________

To agree to participate in the National Evaluation of Learn and Serve America Programs (NELSAP) study if you are selected, please read your responsibilities as a study participant and, if you agree, select "I AGREE…" below.

I agree to participate in the National Evaluation of Learn and Serve America Programs study if I am selected. This will include: (1) accepting the treatment or control assignments of my classes that are participating in the study; (2) using service-learning activities in my class assigned to the treatment group and withholding service-learning activities from my class assigned to the control group; (3) working with the study team and my school/district liaison to schedule student data collection activities during three class periods, including one student survey during the first week of class, another near the end of the semester or school year (depending on the length of the class), and one testing session near the end of the semester or school year (if state test data are not available in the service-learning content area at the necessary time, or if state test data are available but a study-administered test is agreed upon by all parties); (4) participating in one telephone interview with study team members near the end of the semester or school year; (5) completing logs every week on activities for both of my classes participating in the study; and (6) participating in up to four training calls and webinars.

1. I AGREE to participate in the National Evaluation of Learn and Serve America Programs study.
Thank you! We will contact you in the spring of 2011 if you are selected for the National Evaluation of Learn and Serve America Programs study. Please print a copy of this informed consent form from the NELSAP study home page.

2. I DO NOT AGREE to participate in the National Evaluation of Learn and Serve America Programs study.

Thank you for your interest in the National Evaluation of Learn and Serve America Programs study. If you change your mind and would like to participate, or have any questions or comments about the study, please contact the study team by emailing (STUDY EMAIL ADDRESS), or by calling (toll-free) XXX-XXX-XXXX. This form is designed to be submitted online.

Parental Permission Form

Measures and documents were developed as part of the National Evaluation of School-Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates.

Parent Permission Form for Student Participation in the National Evaluation of School-Based Learn & Serve America Programs

We would like your child to participate in the National Evaluation of Learn and Serve America Programs. The purpose of this study is to look at how service-learning affects 9th and 10th grade students in terms of their performance in school. The Corporation for National and Community Service (CNCS) is the federal agency responsible for running national service programs. CNCS has hired Abt Associates, Abt SRBI, and their partner RMC Research Corporation to carry out this study. Your child's classroom is one of about 400 classrooms across the country that have agreed to help us with this study. Students in about 400 classrooms (including your child's classroom) are expected to participate in this study. Participation in this study is voluntary. Your child's answers are very important to help us understand the effects of service-learning. We will protect your child's confidentiality, and all of his or her answers will be confidential to the extent provided for by law.

What is the study about?
Service-learning is a way of teaching that connects what children are learning in school with service projects in the community. This study is being done with about 185 teachers in 9 states. Each teacher in the study is teaching two classes that will be part of this study. The teacher will do service-learning in one class, but not in the other. Because we do not yet know if service-learning is a better way to teach than other ways, your child is not better or worse off in either class. We used a lottery to decide which classes in the study would or would not have service-learning. This meant that all students had the same chance of being in either class. Your child is in the class in which the teacher will [NOT] be doing service-learning.

What does it mean to participate in the study?
If you agree that your child can be in the study, we will ask your child to do three surveys and to take one test. The first survey will be at the beginning of the course. The second survey will be just before the course ends. The last survey will be one year after the course ends. Each survey will be about 45 minutes long. Your child will do the first two surveys in class. Your child will do the last survey online, and we will give him/her a $20 gift card to do it. The surveys will have questions about your child's feelings about the course, the school, and volunteering. If we are not able to use your child's scores on the state test to measure what your child knows about the subject of the course, we will give him/her a test just before or after the term ends. The test will take place during the class and will last about 45 minutes. We will also collect your child's school records, including demographic information, his/her number of credits and attendance, whether he/she completed the course and the grade received, and whether he/she had any discipline problems. We may also collect state test scores and administer a standardized achievement test. We will collect your child's school records for this year and the next two school years (2010-2011, 2011-2012, and 2012-2013).

Potential benefits. There are no direct benefits to you or your child. However, your child's participation will help us learn more about the impact of service-learning on students' school performance and their sense of responsibility to their community.

Risks. There is very little risk for your child to participate in this study. There are no penalties for refusing to be in the study. Your child may refuse to answer any question on the survey or the test that he or she does not want to answer. Your child is not required to be in the study in order to take any of his/her classes. There is a minimal risk of breach of confidentiality, but we have many procedures in place to minimize this risk.

Compensation. Your child will receive a $20 gift card for filling out the follow-up survey in spring 2013.

Being part of the study is your choice. Your decision to allow your child to participate is voluntary. Refusing to participate will not negatively affect you, your child, or your relationship to your child's school. Your child may refuse to answer any question on the survey. Your child may stop taking the survey at any time. There are no penalties for leaving the study either now or in the future.

Confidentiality. We will keep your child's survey responses and test scores confidential to the extent provided for by law. We train all study staff to follow strict rules to protect confidentiality. Survey staff also sign a confidentiality pledge. We will not allow teachers, school staff, or parents to see any of the completed student surveys or the test scores. The study team will never write the names of students, teachers, or schools in the study in any report. We will store the completed surveys in a locked file cabinet at Abt Associates. Only the study team will have access to them. They will be kept until 2017, when they will be destroyed.
The study team will also not give the names to anyone other than CNCS. CNCS will keep this information confidential and may use this information to track students during the study. After the completion of the study, CNCS will destroy this information. Your child's individual survey responses or test scores will not be shared with anyone outside of the study team. Teachers, school staff, and parents will not see any of your child's individual survey responses or test scores. Your child's information will be combined and reported with information from many students across classrooms. Additionally, your child's information may be combined and reported with information from other students in his/her classroom or school. This class-level data may be reported to your child's teacher, school, or district.

If you have any questions about the study, please contact (NAME), Abt Study Director, at xxx-xxx-xxxx (toll call). For questions about your rights or your child's rights in the study, please call (NAME) at Abt Associates at XXX-XXX-XXXX (toll-free call).

If you give permission for your child to be in the study described above, please print your child's name, print your name, sign your name, and write the date below in the space provided. Please return the completed form to your child's teacher. Thank you for your cooperation in this important study.

Please Return This Form To Your Child's Teacher

I have read and understood the description of the National Evaluation of Learn and Serve America Programs, being conducted by Abt Associates. I understand that the information will be used ONLY for the purpose of the study and will be kept strictly confidential, to the extent provided for by law.

Yes, I agree to allow my child to participate in the Service-Learning Impact Study. I allow the researchers conducting this study to collect survey, test, and school records information from my child and the school/district.

No, I do NOT agree to allow my child to participate in the Service-Learning Impact Study. I do not allow researchers conducting this study to collect survey, test, or school records information from my child and the school/district.

Print YOUR CHILD'S Name:

_______________________________________________________________________________
First                                Last

Print YOUR Name: ______________________________________________________________
First                                Last

Your Signature: ________________________________    Date: __________________

Student Assent Form

Measures and documents were developed as part of the National Evaluation of School-Based Learn and Serve America Programs under contract CNSHQ09A0010, as administered by the Corporation for National and Community Service. Prime contractor: Abt Associates.

National Evaluation of School-Based Learn and Serve America Programs
Student Assent Form

Your class is one of about 400 classrooms in the country participating in a new study about service-learning. As you know, your parents or guardians have signed a permission form allowing you to be a part of this important study. The final decision about whether to participate is yours. Your answers as a student in this class are very important to the success of this study.

If you decide you do not want to participate in the study, simply notify your teacher.

What this means is…
You will be asked to complete three surveys (one at the beginning of your course, one at the end, and one a year later) and may be asked to take an achievement test at the end of your course. The questions on the surveys will be about you, your class, and your school. The achievement test will be on the subject matter that you have learned in your class.

Potential benefits. There are no direct benefits to you for participating. Your participation in the study will help us to test the effectiveness of service-learning as a way to help students do better in school.

Potential risks. There is a minimal risk of breach of confidentiality, but we have many procedures in place to minimize that risk. For example, your name will not be on the survey you complete.

Compensation. You will receive a $20 gift card for your time when completing the final survey.

You should know…
Everything you write is confidential. It won't be shared with anyone outside of the study team, not even your teacher. Nothing you write will affect your grades, your relationship with your teacher, or your relationship with your school. Your answers are very important to us, and participation in the study is voluntary. If you are uncomfortable answering any question, you can skip it. You are free to withdraw from the study at any time without penalty.

Any questions…
If you have any questions about your rights as a participant in the study, you should ask the person who is giving you the survey or the test. Or, your parent or guardian can contact (NAME) toll-free at XXX-XXX-XXXX or (NAME) toll-free at XXX-XXX-XXXX.

Would you like to participate?
Again, your participation is completely voluntary and you can withdraw from the study at any time. Completing the survey or the test will let us know that you are willing to participate in the study. Again, if you decide you do not want to participate in the study, simply notify your teacher.

5. Annotated Bibliography of Literature Reviews for the National Evaluation of School-based Learn and Serve Programs

This annotated bibliography covers literature reviews conducted for NELSAP. For each review, we provide an explanation of how the information gleaned from the literature review was used in the evaluation. A full list of references cited in the reviews is included at the end of this section.

5.1 Review of Potential Scales to Measure Students' Academic Achievement and Civic Engagement

This review was used to inform the development of the NELSAP logic model and decisions about scales to include in the student survey.

(Billig et al., 2005) This study compared more than 1,000 high school students who participated in service-learning programs with those who did not. The study suggests that service-learning is effective when it is implemented well, but it is no more effective than conventional social studies classes when the conditions are not optimal.

(Battistich et al., 1995) This study examines the relationship between students' sense of school community, poverty level, and student attitudes, motives, beliefs, and behavior.

Within schools, individual students' sense of school community was significantly associated with almost all of the student outcome measures. Between schools, school-level community and poverty were both significantly related to many of the student outcomes (the former positively, the latter negatively).

(Bridgeland et al., 2010) Students, parents, and teachers have perspectives that exhibit significant disconnects that, if not more fully understood and bridged, will continue to set back efforts to keep more young people in school and on track to graduate prepared for postsecondary education. This report describes conversations involving all three groups at four different schools and contains guidelines to facilitate future conversations.

(CIRCLE, 2010) Educational programs and other government-supported initiatives have been shown to enhance Americans' civic skills and their levels of engagement. But these programs and other opportunities are scarce and unequal, often provided to people who are already the most likely to be engaged. A lack of civic learning opportunities not only inhibits Americans' civic participation, but also has harmful consequences for their academic and economic progress.

(Connors and Walters, 2007) This report describes a project to develop a rubric of indicators of high-quality service-learning in schools, identify schools with exemplary service-learning programs, and study the impact of service-learning at schools with exemplary programs.

(Davila and Mora, 2007b) Female high school students tend to be more civically engaged than males in the same race/ethnic group. Asian students have the highest participation rates in civic activities out of the four race/ethnic groups considered here (non-Hispanic whites, African Americans, Hispanics, and Asians); Hispanics tend to be the least involved. Students who perform community service during high school are more likely to graduate college.

(Eccles and Gootman, 2002) This report focuses broadly on community-based programs for youth and examines what is known about their design, implementation, and evaluation. These are programs located in the communities in which the youth live. In the context of this report, communities may include neighborhoods, block groups, towns, and cities, as well as non-geographically defined communities based on family connections and shared interests or values.

(Furco, 2002) Unlike volunteering, service-learning involves active learning of content knowledge and skills while helping others.

(Goodenow, 1993) This article presents a scale used to measure adolescent students' perceived belonging or psychological membership in the school environment.

(Harter, 1982) This paper describes the Perceived Competence Scale for Children. This scale assesses a child's sense of competence across different domains: (a) cognitive, (b) social, and (c) physical. A fourth subscale, general self-worth, independent of any particular skill domain, is also included.

(Hawkins et al., 1999) This report summarizes the long-term effects of an intervention combining teacher training, parent education, and social competence training for children during the elementary grades on adolescent health-risk behaviors at age 18 years. Fewer students receiving the full intervention than control students reported violent delinquent acts, heavy drinking, or sexual activity.
The full intervention student group reported more commitment and attachment to school and better academic achievement.

(Hutchens and Eveland Jr., 2009) This paper uses data from a longitudinal study of high school students to examine the effects of exposure to various elements of a civics curriculum on civic participation. Both stimulating political communication by discussing media sources and engaging in political debate and rote learning of traditional civics content are correlated negatively with civic outcomes.

(Johnson, 2001) There are differences across racial-ethnic groups in school attachment and engagement. The racial-ethnic composition of schools is related to student attachment but not to student engagement.

(Kahne and Sporte, 2008) Prior large-scale studies that found limited impact from school-based civic education often did not focus on the content and style of the curriculum and instruction. A set of specific kinds of civic learning opportunities fosters improvements in students' commitments to civic participation.

(Kim and Billig, 2003) This study of the impact of the Colorado Learn and Serve program examined 35 classrooms and 761 students, about half of whom participated in service-learning and half of whom did not. Results for these students showed a statistically significant difference in connection to community, connection to school, and civic responsibility for those participating in service-learning relative to their nonparticipating peers.

(Larson, 2000) This article analyzes the development of initiative as an exemplar of one of many learning experiences that should be studied as part of positive youth development. The context best suited to the development of initiative appears to be that of structured voluntary activities, such as sports, arts, and participation in organizations, in which youths experience the rare combination of intrinsic motivation and deep attention.

(Martin et al., 2006) Service-learning has been shown to benefit the youths and communities who participate. However, there is still a need for additional data about the relationship between service-learning and youth-adult transitions. This report summarizes results from a survey of young adults with a range of experience providing direct or indirect service: those with service-learning experience, those with service experience that does not qualify as service-learning, and those with no service experience at all.

(Melchior et al., 1995) Higher-rated service-learning programs improve outcomes more than lower-rated programs. Outcomes include personal and social responsibility and maturity.

(Meyer et al., 2004) Relates service-learning to various outcome measures, including academic achievement and student engagement. The results are mixed.

(Midgley et al., 2000) The manual for the Patterns of Adaptive Learning Scales (PALS).

These scales use goal orientation theory to examine the relation between the learning environment and students' motivation, affect, and behavior. Student scales assess (1) personal achievement goal orientations; (2) perceptions of teachers' goals; (3) perceptions of the goal structures in the classroom; (4) achievement-related beliefs, attitudes, and strategies; and (5) perceptions of parents and home life. Teacher scales assess their perceptions of the goal structure in the school, their goal-related approaches to instruction, and personal teaching efficacy.

(Morgan and Streb, 2001) Student voice in service-learning projects is positively correlated with improved self-concept, political engagement, and tolerance.

(Muller, 2001) This study uses information from both teachers and students to explore how the perceptions of each other's investment in the relationship affect the productivity of the relationship. Teachers' perceptions that the student puts forth academic effort and students' perceptions that teachers are caring are each weakly associated with mathematics achievement for most students. For students who are judged by their teachers as at risk of dropping out of high school, however, the value for math achievement of having teachers who care is substantial and mitigates the negative effect of having been judged as at risk.

(RMC, 2006) This is a literature review of the impacts of service-learning. Several studies have been conducted showing promising results for the academic impact of service-learning.

(Scales and Leffert, 2004) This book reviews the literature on adolescent developmental assets such as positive relationships, opportunities, skills, values, and self-perceptions.

(Shouse, 1996) This paper examines tensions between two visions of schooling. One stresses social cohesion (i.e., common beliefs, shared activities, and caring relations between members). The other emphasizes a strong academic mission (i.e., values and practices that reinforce high standards for student performance). Though not incongruous, numerous organizational studies reveal the potential for social cohesion and communality to be achieved at the expense of academic demand or "press." This study finds that, in fact, for most schools, academic "press" serves as a prerequisite for the positive achievement effects of communality.

(Solomon et al., 2000) A comprehensive elementary school program, the Child Development Project, was conducted in two schools in each of six school districts over a three-year period. Two additional schools in each district served as a comparison group. The program attempts to create a "caring community of learners" in school and classroom through classroom, school-wide, and parent involvement components. Results showed positive student results in the five program schools that made significant progress in implementation. Schools that progressed in implementation showed gains, relative to their comparison schools, in students' personal, social, and ethical attitudes, values, and motives.

(Torney-Purta and Wilkenfeld, 2009) Students are divided into four groups based on the style of their civic education (lecture, interactive, both, or neither).
Students who experience interactive discussion-based civic education (either by itself or in combination with lecture-based civic education) score the highest on 21st Century Competencies, including working with others (especially in diverse groups) and knowledge of economic and political processes.

(Yamauchi et al., 2006) This study examined student outcomes associated with weekly service-learning activities. The service-learning activities were part of the Hawaiian Studies Program, a culturally relevant academic high school program.

(Zaff et al., 2010) Presents a measure of civic engagement that goes beyond civic behaviors.

This measure, an integrated construct of civic engagement termed active and engaged citizenship, includes behavioral, cognitive, and socioemotional constructs.

5.2 Review of Potential Moderators for Service-Learning

This review was used to inform the development of the NELSAP logic model and decisions about data collection of potential moderators.

(Allen and Philliber, 2001) Reports results from a study of the "Teen Outreach Program." Finds the largest effects for high-risk youth.

(Allensworth and Easton, 2005) Students are considered on-track if they have completed enough credits by the end of the school year to be promoted to tenth grade and have failed no more than one semester of a core subject area. On-track status is a better predictor of high school graduation than eighth-grade test scores or students' background characteristics.

(Balfanz, 2008) It is possible to identify students at risk of dropping out early. This study presents a set of off-track indicators, such as attending school less than 80% of the time and failing math or English.

(Billig et al., 2005) See above.

(Campbell, 2005) Civic education is at the root of the historical rationale for the massive investment made in the nation's schools, but little is known about how schools foster civic engagement. This paper focuses on the quality of civics instruction, in particular on the impact of how political and social issues are handled in the classroom.

(Davila and Mora, 2007b) See above.

(Gastic, 2010) This article discusses the Unsafe School Choice Option (a rarely used provision of the No Child Left Behind Act of 2001), which allows students who attend "persistently dangerous" schools or who have been the victims of violent crime at school to transfer to another public school.

(Hutchens and Eveland Jr., 2009) See above.

(Hyman and Levine, 2008) It is important that service programs reach all of America's diverse populations, particularly those that are relatively disadvantaged. This is both a matter of equality of opportunity and of program efficiency, in that the biggest gains will probably be experienced by volunteers who are at greatest risk of dropping out of school or committing crimes. This paper offers concrete proposals for strategies that might enhance the diversity and equity of participation in CNCS-supported programs.

(Jennings and Stoker, 2004) This paper explores how social trust and civic engagement have evolved across generations in the United States.

(Kahne and Westheimer, 2006) This article presents findings from a study of 10 nationally recognized programs that engaged youth in community-based experiences and aimed to develop democratic values. Many, but not all, of these initiatives employed service-learning activities. Data from the two-and-a-half-year study lead the authors to question the common assertion that efficacious community experiences will necessarily prepare youth for participation in the democratic life of the community.

(Kahne and Middaugh, 2008), (Kahne and Middaugh, 2009) A student's race and academic track, and a school's average socioeconomic status (SES), are determinants of the availability of school-based civic learning opportunities. High school students attending higher-SES schools, those who are college-bound, and white students get more of these opportunities than low-income students, those not heading to college, and students of color.
(Kahne and Sporte, 2008) See above.

(Lay et al., 2003) Drawing on extensive interviews with high school students from a variety of socioeconomic backgrounds, this paper investigates the determinants of attitudes towards government and politics. The authors conclude that while formal education is important, political socialization is also shaped by the social messages presented to citizens by others.

(McIntosh and Youniss, 2010) Political participation is fundamentally public; it is necessary not only to hold beliefs but also to contend with disagreement and form alliances. Young people learn political participation through actual political participation, but early experiences can be supported with scaffolding (training, access to a real political system, and support while participating in that system).

(Metz and Youniss, 2003), (Metz and Youniss, 2005) These studies compare changes in civic engagement of students who do and do not have a community service requirement. The requirement made no difference for students who were inclined to serve at the start of the study. However, students who were less inclined to serve showed larger gains on several measures of civic engagement at the schools with a community service requirement.

(Plutzer, 2002) Most citizens are habitual voters or habitual nonvoters, and most young citizens start as habitual nonvoters and at some point transition to voting. This paper presents an empirical analysis of the timing of the transition, specifically examining the roles of aging, parenthood, partisanship, and geographical mobility.

(Scales and Roehlkepartain, 2005) Service-learning may be particularly beneficial educationally for low-income students and schools, making it an important, though overlooked, strategy for closing the achievement gap in American schools.

(Shingles, 1981) Black Americans are more politically active than whites of similar socioeconomic status. This article theorizes that black consciousness contributes to political mistrust and a sense of internal political efficacy, which in turn encourages policy-related participation.

(Silver et al., 2008) This study tracks the educational progress of all first-time 2001-02 9th graders in the Los Angeles Unified School District from the 6th grade through to their expected graduation in the spring of 2005. Transcript records, standardized test scores, and a broad database of student and school characteristics are analyzed to measure what middle and high school factors are related to school persistence and graduation.

(Torney-Purta and Wilkenfeld, 2009) See above.

(Wilkenfeld, 2009) This study examines the family, peer, school, and neighborhood contexts for the development of civic engagement using the 1999 Civic Education Study and local demographic, social, and economic data from the U.S. Census.

5.3 Review of the Impacts of High-Quality Service-Learning

This review was conducted to inform decisions about whether to limit eligibility to only teachers implementing high-quality service-learning.

(Billig et al., 2005) This study compared more than 1,000 high school students who participated in service-learning programs with those who did not. The study suggests that service-learning is effective when it is implemented well, but it is no more effective than conventional social studies classes when the conditions are not optimal.
(Billig and Root, 2006) This article describes two classrooms in which civic engagement was particularly strong. Students engaged in research, action, and advocacy that resulted in acquisition of civic knowledge, skills, and dispositions at levels higher than their non-participating peers.

(Billig et al., 2008) Service-learning may be an effective tool for achieving character development. Data from a four-year grant in Philadelphia show significant differences between participants and non-participants.

(Billig, 2009) Students in high-quality service-learning programs show improved academic achievement and behavior.

(Bradley et al., 2007) Students whose service-learning experiences involved them in the design and presentation of materials showed improved community engagement and seat belt awareness.

(Melchior et al., 1995) See above.

(Meyer et al., 2004) See above.

(Northup, 2010) Students in service-learning classrooms showed increases in civic engagement and skills from the beginning of the school year to the end of the school year.

(RMC, 2010) Elementary and high school students in service-learning programs have higher ratings of civic skills and dispositions.

(Spring et al., 2006) Students with poor academic performance are less likely to participate in service-learning. There is also a correlation between participation in service-learning and interest in current events.

5.4 Review of Studies that Use Within-Teacher Random Assignment

This review was used to develop responses to OMB's questions about the original design (based on within-teacher random assignment) that was proposed for NELSAP.

(August et al., 2009) This study assessed the effectiveness of Quality English and Science Teaching (QuEST) on the science knowledge and language acquisition of middle school English Language Learners. The study design involved forty middle school science classes, taught by ten teachers. Each teacher had two classes randomly assigned to the QuEST treatment group and two classes randomly assigned to the control, which was the district curriculum. Teachers were observed teaching each treatment and control section twice. Results indicated that there was more variability across teachers in terms of fidelity of implementation and less in terms of quality of instruction.

(Dede et al., 2010) This study employs within-teacher randomization to evaluate technology-based Strategies for Enhancing Student Interest in STEM Careers through Algebra Curricula in Grades 5-9. Teachers will be comprehensively trained to deliver a four-day technology-infused lesson exposing the treatment classes to STEM careers and authentic algebra problems.

(Hedges and Hedberg, 2007a) This article provides a compilation of intra-class correlation values for academic achievement and related covariate effects that could be used for planning group-randomized experiments in education.

(Herlihy, 2007) As many as 40 percent of students fail to get promoted from ninth to tenth grade on time, and fewer than 20 percent of those students recover from failure and go on to graduate.
Nationally, a recent study of public school enrollment patterns shows that (1) there has been a sharp increase in the number of students enrolled in ninth grade over the last 30 years, indicating that an increasing number of students are being retained, and (2) the rate at which students disappear between ninth and tenth grade has tripled over the same time period. This study presents strategies used to address retention.

(NYLC, 2008) Standards for high-quality service-learning.

(OII, 2010) This study evaluates the impact of Collaborative Strategic Reading (CSR) in Denver Public Schools and will involve within-teacher random assignment of four middle school science and social studies classes per teacher. Each teacher will teach two treatment and two control sections.

(Pane et al., 2010) For this evaluation of the Cognitive Tutor Geometry Curriculum, school personnel identified two teachers who both taught geometry at least two periods a day and who taught two of those classes during the same period. The evaluators then randomly assigned one of these teachers to the intervention curriculum during the first shared period of the day and the other teacher to the control (standard district geometry) curriculum during that period. In the second shared period the teachers switched assignments. In this design, therefore, each teacher taught two sections of geometry, one using the intervention curriculum and one using the standard district geometry curriculum. Researchers conducted three site visits per year and rated both treatment and control classrooms using a rubric that covered important elements of the instructional design. Teachers were also interviewed during these visits. Results indicated that little contamination occurred between treatment and control classrooms.

(Raudenbush et al., 2007) Random assignment of classrooms or schools to interventions eliminates selection bias, but unless expected impacts are large, this kind of design can be quite expensive because the number of units required is large.

(Spring et al., 2008) Service-learning is most prevalent in high schools, with approximately 35% of all public high schools implementing service-learning.

(Vaughn et al., 2009) Both of these experiments involved the random assignment of seventh-grade students to social studies sections, and the subsequent random assignment of these sections to treatments within teacher. Teachers were provided coaching and professional development to support proper implementation of the intervention in treatment classes, and all classes were subject to fidelity checks over the course of the 12-week study. Researchers conducted four observations in control sections to determine if there was any contamination of the treatment into the control sections. Results of these observations and data from teacher reports confirmed that neither the materials nor the instructional practices designed for the treatment classes were being used in the control classes.

5.5 Review of Studies Using Student-Level Random Assignment

This review of studies using student-level random assignment was conducted as CNCS considered switching design options for NELSAP.

(Decker et al., 2004) In this study, students were randomly assigned to classrooms staffed by either a Teach for America teacher or whoever else was teaching in the same school and grade.
Math performance was slightly better for students of TFA teachers; reading performance was the same. (Fox, 2008) notes that it was initially difficult to persuade principals to randomize students to classrooms. (GiveWell, 2008) summarizes the study as well as other evidence on the effectiveness of TFA. (Matthews, 2004) notes that while students of TFA teachers perform as well as students of other teachers, performance is very low for both groups. (Miner, 2009) criticizes TFA's focus on fund-raising and media instead of children. (Rotherham, 2004) summarizes and defends the report.

(Constantine et al., 2009) In this study, students were randomly assigned to classrooms staffed by either teachers with traditional certification or teachers with alternative certification. There was no statistically significant difference in the test scores of students taught by the two types of teachers. (Corcoran and Jennings, 2009) criticize the study as underpowered and of limited generalizability (only schools that hire many alternatively certified teachers from non-selective programs participated). The authors also claim that the report fails to acknowledge the many analyses from the study finding that traditionally trained teachers outperformed alternative-route teachers in both math and reading. (Darling-Hammond, 2009) has a similar critique: the study schools are the hardest-to-staff schools in jurisdictions with the least selective hiring standards, and while spring test scores were the same, gain scores were slightly larger for traditionally certified teachers despite randomization. (What Works Clearinghouse, 2010) concluded that this is a well-implemented randomized controlled trial, but cautions that the study is not designed to answer the question of whether a teacher would be more effective if he or she attended a traditional certification program or an alternative certification program. Instead, it examines whether teachers who choose to attend AC programs are generally more or less effective than teachers who choose to attend a TC program.

(Max et al., 2007) This is a feasibility study examining the possibility of recruiting high-performing "star" teachers to work in high-needs schools. It addresses the costs and benefits of randomization within school (comparing the performance of students assigned to "star" teachers to students assigned to other teachers) versus randomizing across schools (comparing the performance of schools with funds and authority to hire "star" teachers to schools without "star" teachers).

5.6 Review of Characteristics of Effective Teachers

This review informed decisions about which teacher characteristics to include as covariates in impact models and/or to use in post-hoc subgroup analyses.

(Aaronson et al., 2007) There are large differences in performance across teachers. Experience improves teacher performance, especially in the first few years. Few other observables (such as the quality of the teacher's college) are strongly related to performance.

(Angrist and Lavy, 2001) In-service teacher training improves student outcomes in some types of schools.

(Boyd et al., 2007) There is not enough evidence to tell whether it is better to tighten or loosen teacher preparation and certification requirements.
Highly selective alternative-route programs can produce effective teachers who perform about the same as teachers from traditional routes after two years on the job, and teachers who score well on certification exams can improve student outcomes somewhat.

(Constantine et al., 2009) See above.

(Clotfelter et al., 2006) More highly qualified teachers tend to be matched with more advantaged students, both across schools and in many cases within them. This matching biases estimates of the relationship between teacher characteristics and achievement. If the authors focus only on schools where this is not the case, teacher experience is consistently associated with achievement, and teacher licensure test scores are associated with math achievement.

(Clotfelter et al., 2007a), (Clotfelter et al., 2007b), (Clotfelter et al., 2007c) Experience improves teacher performance, especially in the first few years. Teachers with an MA perform no better than teachers without an MA. Higher certification test scores are associated with good performance, as is National Board Certification.

(Dee, 2004) In the Tennessee Project STAR class-size experiment, assignment to an own-race teacher significantly increased the math and reading achievement of both black and white students.

(Decker et al., 2004) See above.

(Ehrenberg and Brewer, 1994) Teacher credentials and demographics have mixed effects on student achievement.

(Ehrenberg et al., 1995) The match between teacher and student race and gender is not related to achievement, but is related to the teacher's subjective evaluation of the student.

(Goldhaber and Brewer, 1997) Some school resources (in particular, teacher qualifications) are significant in influencing tenth-grade mathematics test scores. Unobservable school, teacher, and class characteristics are important in explaining student achievement but do not appear to be correlated with observable variables.

(Goldhaber and Brewer, 2000) Teachers who have a standard certification have a statistically significant positive impact on student test scores in math relative to teachers who either hold private school certification or are not certified in their subject area. Contrary to conventional wisdom, mathematics and science students who have teachers with emergency credentials do no worse than students whose teachers have standard teaching credentials.

(Goldhaber, 2007) This paper explores the relationship between teacher testing and teacher effectiveness. Some teachers whom we might wish were not in the teacher workforce, based on their contribution toward student achievement, are eligible to teach based on their performance on the tests; other individuals who would be effective teachers are ineligible.

(Goldhaber and Anthony, 2007) National Board Certified Teachers are generally more effective than teachers who never applied to the program. However, the NBPTS certification process itself does not increase teacher effectiveness.

(Gordon et al., 2006) This paper outlines a policy proposal wherein the federal government would pay for bonuses to highly rated teachers willing to teach in high-poverty schools.
In return for federal support, schools would not be able to offer tenure to new teachers who receive poor evaluations during their first two years on the job without a waiver, and states would open further the door to teaching for those who lack traditional certification but can demonstrate success on the job.

(Hanushek, 1971) One of the first studies to address which characteristics of teachers and classrooms are important, using student-level data to construct value-added scores.

(Hanushek, 1986) Differences in school quality do not seem to reflect variations in expenditures, class sizes, or other commonly measured attributes of schools and teachers. But there do appear to be significant differences in skill level across teachers.

(Harris and Sass, 2009b) Teacher value-added and principals' subjective ratings are positively correlated, and principals' evaluations are better predictors of a teacher's value-added than traditional approaches to teacher compensation focused on experience and formal education. While past teacher value-added predicts future teacher value-added, the principals' subjective ratings provide additional information and substantially increase predictive power.

(Harris and Sass, 2009a) National Board for Professional Teaching Standards certification provides a positive signal of a teacher's contribution to student achievement only in a few isolated cases. The process of becoming NBPTS certified does not increase teacher productivity.

(Hedges and Hedberg, 2007a) See above.

(Jacob and Lefgren, 2004) Marginal increases in in-service training have no statistically or academically significant effect on either reading or math achievement, suggesting that modest investments in staff development may not be sufficient to increase the achievement of elementary school children in high-poverty schools.

(Jacob and Lefgren, 2008) Principals can generally identify teachers who produce the largest and smallest standardized achievement gains but have far less ability to distinguish between teachers in the middle of this distribution.

(Jepsen, 2005) Experience improves teacher performance, especially in the first few years.
Attainment of a master's degree does not strongly predict performance.

(Kane et al., 2006) On average, the certification status of a teacher has at most small impacts on student test performance. However, among those with the same certification status, there are large and persistent differences in teacher effectiveness.

(Monk, 1994) The amount of course work a teacher has completed in math and physical sciences is positively related to student achievement.

(Rivkin et al., 2005) Teachers have powerful effects on reading and mathematics achievement, though little of the variation in teacher quality is explained by observable characteristics such as education or experience.

(Rockoff, 2004) Experience improves teacher performance, especially in the first few years.

(Strauss and Sawyer, 1986) Teachers who perform well on certification exams also have higher performing students.

(Schochet, 2005), (Schochet, 2008a) These articles examine theoretical and empirical issues related to the statistical power of impact estimates for experimental evaluations of education programs.

(Summers and Wolfe, 1977) Teachers who perform well on certification exams have lower performing students. Teachers who attend strong undergraduate institutions have higher performing students.

(Wayne and Youngs, 2003) This study is a literature review on the relationship between student achievement gains and the characteristics of teachers.

(Xu and Nichols, 2010) This study aims to provide empirical information needed to design adequately powered studies that randomize schools.

5.7 References

Aaronson, D., Barrow, L., and Sander, W. (2007). Teachers and student achievement in the Chicago public high schools. Journal of Labor Economics, 25:95–135.

Allen, J. P. and Philliber, S. (2001). Who benefits most from a broadly targeted prevention program? Differential efficacy across populations in the Teen Outreach Program. Journal of Community Psychology, 29(6):637–655.

Allensworth, E. M. and Easton, J. Q. (2005). The On-Track Indicator as a Predictor of High School Graduation. Consortium on Chicago School Research at the University of Chicago, Chicago, IL.

Angrist, J. D. and Lavy, V. (2001). Does teacher training affect pupil learning? Evidence from matched comparisons in Jerusalem public schools. Journal of Labor Economics, 19(2):343–369.

August, D., Branum-Martin, L., Cardenas-Hagan, E., and Francis, D. J. (2009). The impact of an instructional intervention on the science and language learning of middle grade English language learners. Journal of Research on Educational Effectiveness, 2(4):345–376.

Balfanz, R. (2008). Early warning and intervention systems: Promise and challenges for policy and practice. National Academy of Education and National Research Council Workshop on Improved Measurement of High School Drop Out and Completion Rates. Retrieved 06/03/10 from http://www7.nationalacademies.org/BOTA/Paper%20by%20R.%20Balfanz.pdf.

Battistich, V., Solomon, D., Kim, D., Watson, M., and Schaps, E. (1995). Schools as communities, poverty levels of student populations, and students' attitudes, motives and performance: A multilevel analysis. American Journal of Educational Research, 32:627–658.

Billig, S., Root, S., and Jesse, D. (2005). The Impact of Participation in Service-Learning on High School Students' Civic Engagement.
The Center for Information & Research on Civic Learning & Engagement. Working Paper 33.

Billig, S. H. (2009). Does Quality Really Matter? Testing the New K-12 Service-Learning Standards for Quality Practice. Chapter 6 in B. E. Moely, S. H. Billig, & B. A. Holland (Eds.), Advances in Service-Learning Research: Vol. 9. Creating Our Identities in Service-Learning and Community Engagement, pages 131–157. Information Age, Charlotte, NC.

Billig, S. H., Jesse, D., and Grimley, M. (2008). Using service-learning to promote character development in a large urban district. Journal of Research in Character Education, 6(1):21–34.

Billig, S. H. and Root, S. (2006). Maximizing Civic Commitment Through Service-Learning: Case Studies of Effective High School Classrooms. Chapter 3 in K. McKnight Casey, G. Davidson, S. H. Billig, & N. C. Springer (Eds.), Advances in Service-Learning Research: Vol. 6. Advancing Knowledge in Service-Learning: Research to Transform the Field, pages 45–63. Information Age, Greenwich, CT.

Boyd, D., Goldhaber, D., Lankford, H., and Wyckoff, J. (2007). The effect of certification and preparation on teacher quality. The Future of Children, 17(1):45–68.

Bradley, R., Eyler, J., Goldzweig, I., Juarez, P., Schlundt, D., and Tolliver, D. (2007). Evaluating the Impact of Peer-To-Peer Service-Learning Projects on Seat Belt Use Among High School Students. Chapter 5 in S. Gelmon & S. Billig (Eds.), Advances in Service-Learning Research: Vol. 7. From Passion to Objectivity: International and Cross-Disciplinary Perspectives on Service-Learning Research, pages 89–110. Information Age, Charlotte, NC.

Bridgeland, J., Balfanz, R., Moore, L., and Friant, R. (2010). Raising their voices: Engaging students, teachers, and parents to help end the high school dropout epidemic. Center for Information and Research on Civic Learning and Engagement, Washington, DC.

Campbell, D. E. (2005). Voice in the Classroom: How an Open Classroom Environment Facilitates Adolescents' Civic Development. The Center for Information & Research on Civic Learning & Engagement. Working Paper 25.

CIRCLE (2010). Civic skills and federal policies. Center for Information and Research on Civic Learning and Engagement.

Clotfelter, C. T., Ladd, H. F., and Vigdor, J. L. (2006). Teacher-student matching and the assessment of teacher effectiveness. The Journal of Human Resources, 41(4):778–820.

Clotfelter, C. T., Ladd, H. F., and Vigdor, J. L. (2007a). How and why do teacher credentials matter for student achievement? Working Paper 12828, National Bureau of Economic Research.

Clotfelter, C. T., Ladd, H. F., and Vigdor, J. L. (2007b). Teacher credentials and student achievement in high school: A cross-subject analysis with student fixed effects. Working Paper 13617, National Bureau of Economic Research.

Clotfelter, C. T., Ladd, H. F., and Vigdor, J. L. (2007c). Teacher credentials and student achievement: Longitudinal analysis with student fixed effects. Economics of Education Review, 26:673–682.

Connors, S. and Walters, B. (2007). A report of evaluation studies submitted to Learn and Serve Colorado (2006-7). University of Colorado, Denver, CO.

Constantine, J., Payer, D., Silva, T., Hallgren, K., Grider, M., Deke, J., and Warner, E. (2009). An Evaluation of Teachers Trained Through Different Routes to Certification: Final Report.
Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Princeton, NJ. Prepared under contract for the U.S. Department of Education, NCEE 2009-4043.

Corcoran, S. P. and Jennings, J. L. (2009). Review of an evaluation of teachers trained through different routes to certification: Final report. Education and the Public Interest Center & Education Policy Research Unit. Retrieved 12/13/10 from http://epicpolicy.org/thinktank/review-evaluation-of-teachers.

Darling-Hammond, L. (2009). A SCOPE policy brief: Education opportunity and alternative certification: New evidence and new questions. Stanford Center for Opportunity Policy in Education.

Dávila, A. and Mora, M. T. (2007a). Civic engagement and high school academic progress: An analysis using NELS data. The Center for Information & Research on Civic Learning & Engagement. Working Paper 52.

Dávila, A. and Mora, M. T. (2007b). Do Gender and Ethnicity Affect Civic Engagement and Academic Progress? The Center for Information & Research on Civic Learning & Engagement. Working Paper 53.

Decker, P. T., Mayer, D. P., and Glazerman, S. (2004). The effects of Teach for America on students: Findings from a national evaluation. Mathematica Policy Research, Inc.

Dede, C. J., Star, J., and Dukas, G. (2010). Studying Technology-Based Strategies for Enhancing Student Interest in STEM Careers Through Algebra Curricula in Grades 5-9 (Award Abstract 0929575). National Science Foundation. Retrieved 11/12/10 from http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0929575.

Dee, T. S. (2004). Teachers, race, and student achievement in a randomized experiment. The Review of Economics and Statistics, 86(1):195–210.

Eccles, J. and Gootman, J. (2002). Community Programs to Promote Youth Development. National Research Council and Institute of Medicine, Board on Children, Youth, and Families, Committee on Community-Level Programs for Youth. National Academies Press.

Ehrenberg, D. G. and Brewer, D. J. (1994). Do school and teacher characteristics matter? Evidence from High School and Beyond. Economics of Education Review, 13.

Ehrenberg, R. G., Goldhaber, D. D., and Brewer, D. J. (1995). Do teachers' race, gender, and ethnicity matter? Evidence from the National Educational Longitudinal Study of 1988. Industrial and Labor Relations Review, 48(3):547–561.

Fox, B. F. (2008). What's new in Princeton and Central New Jersey?: Mathematica's new CEO. US 1.

Furco, A. (2002). Is service learning really better than community service? A study of high school service. Chapter in A. Furco & S. Billig (Eds.), Advances in Service-Learning Research: Vol. 1. Service Learning: The Essence of the Pedagogy, pages 23–50. Information Age, Greenwich, CT.

Gastic, B. (2010). Student safety and the reauthorization of No Child Left Behind. Educational Researcher, 39(5):423–424.

GiveWell (2008). Teach for America (TFA) July 2008 review. Retrieved 12/10/10 from http://www.givewell.org/united-states/charities/tfa.

Goldhaber, D. D. (2007). Everyone's doing it, but what does teacher testing tell us about teacher effectiveness? The Journal of Human Resources, 42(4):765–794.

Goldhaber, D. D. and Anthony, E. (2007). Can teacher quality be effectively assessed? National Board Certification as a signal of effective teaching. Review of Economics and Statistics, 89:134–150.

Goldhaber, D. D. and Brewer, D.
J. (1997). Why don't schools and teachers seem to matter? Assessing the impact of unobservables on educational productivity. The Journal of Human Resources, 32(3):505–523.

Goldhaber, D. D. and Brewer, D. J. (2000). Does teacher certification matter? High school teacher certification status and student achievement. Educational Evaluation and Policy Analysis, 22(2):129–145.

Goodenow, C. (1993). The psychological sense of membership among adolescents: Scale development and educational correlates. Psychology in the Schools, 30:79–90.

Gordon, R., Kane, T. J., and Staiger, D. O. (2006). Identifying effective teachers using performance on the job. Discussion paper, The Brookings Institution.

Hanushek, E. (1971). Teacher characteristics and gains in student achievement: Estimation using micro data. The American Economic Review, 61(2):280–288.

Hanushek, E. (1986). The economics of schooling: Production and efficiency in public schools. Journal of Economic Literature, 24(3):1141–1177.

Harris, D. N. and Sass, T. R. (2009a). The effects of NBPTS-certified teachers on student achievement. Journal of Policy Analysis and Management, 28(1):55–80.

Harris, D. N. and Sass, T. R. (2009b). What makes for a good teacher and who can tell? Working Paper 30, National Center for Analysis of Longitudinal Data in Education Research.

Harter, S. (1982). The perceived competence scale for children. Child Development, 53(1):87–97.

Hattie, J. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Student Achievement. Routledge Press, New York, NY.

Hawkins, J., Catalano, R., Kosterman, R., Abbott, R., and Hill, K. (1999). Preventing adolescent health-risk behaviors by strengthening protection during childhood. Archives of Pediatrics & Adolescent Medicine, 153(3):226–234.

Hedges, L. V. and Hedberg, E. C. (2007a). Intra-class correlation values for planning group randomized trials in education. Educational Evaluation and Policy Analysis, 29(1):60–87.

Hedges, L. V. and Hedberg, E. C. (2007b). Intra-class correlations for planning group randomized experiments in rural education. Journal of Research in Rural Education, 22(10).

Herlihy, C. (2007). State and District-Level Supports for Successful Transition into High School. National High School Center, Washington, DC. Retrieved 05/21/10 from http://www.betterhighschools.org/docs/NHSC_PolicyBrief_TransitionsIntoHighSchool.pdf.

Hutchens, M. J. and Eveland Jr., W. P. (2009). The Long-Term Impact of High School Civics Curricula on Political Knowledge, Democratic Attitudes and Civic Behaviors: A Multi-Level Model of Direct and Mediated Effects Through Communication. The Center for Information & Research on Civic Learning & Engagement. Working Paper 65.

Hyman, J. B. and Levine, P. (2008). Civic Engagement and the Disadvantaged: Challenges, Opportunities and Recommendations. The Center for Information & Research on Civic Learning & Engagement. Working Paper 63.

Jacob, B. A. and Lefgren, L. (2004). The impact of teacher training on student achievement: Quasi-experimental evidence from school reform efforts in Chicago. The Journal of Human Resources, 39(1):50–79.

Jacob, B. A. and Lefgren, L. (2008). Can principals identify effective teachers?
Evidence on subjective performance evaluation in education. Journal of Labor Economics, 26(1):101–136.

Jennings, M. K. and Stoker, L. (2004). Social trust and civic engagement across time and generations. Acta Politica, 39(4):342–379.

Jepsen, C. (2005). Teacher characteristics and student achievement: Evidence from teacher surveys. Journal of Urban Economics, 57(2):302–319.

Johnson, M. K., Crosnoe, R., and Elder, G. H. (2001). Students' attachment and academic engagement: The role of race and ethnicity. Sociology of Education, 74:318–340.

Kahne, J. and Middaugh, E. (2008). Democracy for Some: The Civic Opportunity Gap in High School. The Center for Information & Research on Civic Learning & Engagement. Working Paper 59.

Kahne, J. and Middaugh, E. (2009). Democracy for Some: The Civic Opportunity Gap in High School. Chapter 2 of Engaging Young People in Civic Life, James Youniss and Peter Levine, editors. Vanderbilt University Press, Nashville.

Kahne, J. and Westheimer, J. (2006). The limits of political efficacy: Educating citizens for a democratic society. PS: Political Science & Politics, 39(2):289–296.

Kahne, J. E. and Sporte, S. E. (2008). Developing citizens: The impact of civic learning opportunities on students' commitment to civic participation. American Educational Research Journal, 45(3):738–766.

Kane, T. J., Rockoff, J. E., and Staiger, D. O. (2006). What does certification tell us about teacher effectiveness? Evidence from New York City. Working Paper 12155, National Bureau of Economic Research.

Kim, W. and Billig, S. (2003). Colorado Learn and Serve Evaluation. RMC Research Corporation, Denver, CO.

Larson, R. W. (2000). Toward a psychology of positive youth development. American Psychologist, 55(1):170–183.

Lay, J. C., Gimpel, J. G., and Schuknecht, J. E. (2003). Cultivating Democracy: Civic Environments and Political Socialization in America. Brookings Institution Press, Washington, DC.

Martin, S., Neal, M., Kielsmeier, J., and Crossley, A. (2006). The impact of service-learning on transitions to adulthood. Chapter in J. Kielsmeier, M. Neal, and A. Crossley (Eds.), Growing to Greatness 2006: The State of Service-Learning Project. National Youth Leadership Council, Saint Paul, MN.

Matthews, J. (2004). Class struggle: When good isn't good enough: Study highlights success and failure of Teach for America program. Washington Post.

Max, J., McKie, A., and Glazerman, S. (2007). Feasibility of a star teacher demonstration. Mathematica Policy Research, Inc.

McIntosh, H. and Youniss, J. (2010). Toward a Political Theory of Political Socialization of Youth. Chapter 1 of Handbook of Research and Policy on Civic Engagement in Youth. Lonnie R.
Sherrod, Judith Torney-Purta, and Constance A. Flanagan, Editors. Wiley, Hoboken, NJ.

Melchior, A., Orr, L., Bloomquist, J., Leiter, V., Berg, J., Grobe, T., and Nahas, J. (1995). Final Report: National Evaluation of Serve-America (Subtitle B-1). Abt Associates, Cambridge, MA.

Metz, E. and Youniss, J. (2003). A demonstration that school-based required service does not deter - but heightens - volunteerism. PS: Political Science and Politics, 36(2):281–286.

Metz, E. C. and Youniss, J. (2005). Longitudinal gains in civic development through school-based required service. Political Psychology, 26(3):413–437.

Meyer, S., Billig, S., and Hofschire, L. (2004). The Impact of K-12 School-Based Service-Learning on Academic Achievement and Student Engagement in Michigan. Chapter 4 in M. Welch and S. Billig (Eds.), Advances in Service-Learning Research: Vol. 4. New Perspectives in Service-Learning: Research to Advance the Field, pages 61–85. Information Age, Charlotte, NC.

Midgley, C., Maehr, M. L., Hruda, L. Z., Anderman, E., Anderman, L., Freeman, K. E., Gheen, M., Kaplan, A., Kumar, R., Middleton, M. J., Nelson, J., Roeser, R., and Urdan, T. (2000). Manual for the Patterns of Adaptive Learning Scales (PALS). Ann Arbor, MI: University of Michigan.

Miner, B. (2009). Looking past the spin: Teach for America. Rethinking Schools. Retrieved 12/10/10 from http://normsnotes2.blogspot.com/2010/03/barbara-miners-article-on-teach-for.html.

Monk, D. H. (1994). Subject area preparation of secondary mathematics and science teachers and student achievement. Economics of Education Review, 13(2):125–145.

Morgan, W. and Streb, M. (2001). Building citizenship: How student voice in service learning develops civic values. Social Science Quarterly, 82:155–169.

Muller, C. (2001). The role of caring in the teacher-student relationship for at-risk students. Sociological Inquiry, 71:241–255.

Northup, J. (2010). Evaluation of Oregon Learn and Serve Program. RMC Research Corporation, Denver, CO.

NYLC (2008). K-12 Service-Learning Standards for Quality Practice. National Youth Leadership Council, Saint Paul, MN. Retrieved 11/12/10 from http://www.nylc.org/pages-resourcecenter-downloads-K_12_Service_Learning_Standards_for_Quality_Practice?emoid=14:803.

OII (2010). Denver Public Schools' Investing in Innovation Validation Grant Application on Collaborative Strategic Reading. Office of Innovation and Improvement. Retrieved 11/12/2010 from http://www2.ed.gov/programs/innovation/2010/narratives/u396b100143.pdf.

Pane, J. F., McCaffrey, D. F., Slaughter, M. E., Steele, J. L., and Ikemoto, G. S. (2010). An experiment to evaluate the efficacy of Cognitive Tutor Geometry. Journal of Research on Educational Effectiveness, 3(3):254–281.

Plutzer, E. (2002). Becoming a habitual voter: Inertia, resources, and growth. The American Political Science Review, 96(1):41–56.

Raudenbush, S. W., Martinez, A., and Spybrook, J. (2007). Strategies for improving precision in group-randomized experiments. Educational Evaluation and Policy Analysis, 29(1):5–29.

Rivkin, S. G., Hanushek, E. A., and Kain, J. F. (2005). Teachers, schools, and academic achievement. Econometrica, 73(2):417–458.

RMC (2006). Impacts of service-learning on participating K-12 students.
National Service-Learning Clearinghouse, Scotts Valley, CA.

RMC (2010). Evaluation Report: Learn and Serve Michigan 2008-2009 School Year. RMC Research Corporation, Denver, CO.

Rockoff, J. E. (2004). The impact of individual teachers on student achievement: Evidence from panel data. The American Economic Review, 94(2):247–252.

Rotherham, A. (2004). Eduwonk's cliffs notes to the new Teach for America evaluation. Retrieved 12/10/10 from http://www.eduwonk.com/2004/06/eduwonks-cliffs-notes-to-the-new-teach-for-america-evaluation.html.

Scales, P. and Leffert, N. (2004). Developmental assets: A synthesis of the scientific research on adolescent development. Search Institute, Minneapolis.

Scales, P. C. and Roehlkepartain, E. C. (2005). Can Service-Learning Help Reduce the Achievement Gap? Chapter 2 in Growing to Greatness 2005: The State of Service-Learning Projects. National Youth Leadership Council, Saint Paul, MN.

Schochet, P. Z. (2005). Statistical Power for Random Assignment Evaluations of Education Programs. Mathematica Policy Research, Princeton, NJ.

Schochet, P. Z. (2008a). Statistical power for random assignment evaluations of education programs. Journal of Educational and Behavioral Statistics, 33(1):62–87.

Schochet, P. Z. (2008b). Technical Methods Report: Guidelines for Multiple Testing in Impact Evaluations. National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education, Washington, DC.

Shingles, R. D. (1981). Black consciousness and political participation: The missing link. The American Political Science Review, 75(1):76–91.

Shouse, R. (1996). Academic press and sense of community: Conflict, congruence, and implications for student achievement. Social Psychology of Education, 1:47–68.

Silver, D., Saunders, M., and Zarate, E. (2008). What Factors Predict High School Graduation in the Los Angeles Unified School District. The University of California, Santa Barbara. California Dropout Research Project Report 14.

Solomon, D., Battistich, V., Watson, M., Schaps, E., and Lewis, C. (2000). A six-district study of educational change: Direct and mediated effects of the Child Development Project. Social Psychology of Education, 4:3–51.

Spring, K., Grimm Jr., R., and Dietz, N. (2006). Educating for Active Citizenship: Service-Learning, School-Based Service and Youth Civic Engagement. Corporation for National and Community Service, Office of Research and Policy Development, Washington, DC.

Spring, K., Grimm Jr., R., and Dietz, N. (2008). Community Service and Service-Learning in America's Schools. Corporation for National and Community Service, Office of Research and Policy Development, Washington, DC.

Strauss, R. P. and Sawyer, E. A. (1986). Some new evidence on teacher and student competencies. Economics of Education Review, 5(1):41–48.

Summers, A. A. and Wolfe, B. L. (1977). Do schools make a difference? American Economic Review, 67(4):639–652.

Torney-Purta, J. and Wilkenfeld, B. S. (2009). Paths to 21st Century Competencies Through Civic Education Classrooms: An Analysis of Survey Results from Ninth-Graders. Campaign for the Civic Mission of Schools and American Bar Association Division for Public Education, Washington, DC.

Vaughn, S., Martinez, L. R., Linan-Thompson, S., Reutebuch, C. K., Carlson, C. D., and Francis, D. J. (2009).
Enhancing social studies vocabulary and comprehension for seventh-grade English language learners: Findings from two experimental studies. Journal of Research on Educational Effectiveness, 2(4):297–324.

Wayne, A. J. and Youngs, P. (2003). Teacher characteristics and student achievement gains: A review. Review of Educational Research, 73(1):89–122.

What Works Clearinghouse (2010). Quick review of the report "An Evaluation of Teachers Trained Through Different Routes to Certification." Retrieved 12/13/10 from http://www.ies.ed.gov/ncee/wwc/PDF/quickreviews/altcert_072809.pdf.

Wilkenfeld, B. S. (2009). Does Context Matter? How the Family, Peer, School, and Neighborhood Contexts Relate to Adolescents' Civic Engagement. The Center for Information & Research on Civic Learning & Engagement. Working Paper 64.

Xu, Z. and Nichols, A. (2010). New estimates of design parameters for clustered randomization studies: Findings from North Carolina and Florida. Technical Report 43, The Urban Institute:
National Center for Analysis of Longitudinal Data in Education.

Yamauchi, L., Billig, S., Meyer, S., and Hofschire, L. (2006). Student outcomes associated with service-learning in a culturally relevant high school program. Journal of Prevention & Intervention in the Community, 32:149–164.

Zaff, J., Boyd, M., Li, Y., Lerner, J. V., and Lerner, R. M. (2010). Active and engaged citizenship: Multi-group and longitudinal factorial analysis of an integrated construct of civic engagement. Journal of Youth and Adolescence, 39:736–750.

Glossary

Active consent: Requirement for written documentation from the respondent, or the parent/guardian if the respondent is a minor, to participate in a study.

Case study evaluation design: A type of evaluation design using data collection methods that involve in-depth studies of specific cases or projects within a program. The method itself consists of one or more data collection methods such as observations, interviews, focus groups, document analysis, and analysis of other types of data.

Closed-ended questions: Questions that provide a fixed list of alternative responses and ask the respondent to select one or more of the alternatives as indicative of the best possible answer.

Confirmatory analysis: A type of analysis used in multiple comparisons testing to assess how strongly the study's pre-specified hypotheses are supported by the data.

Contamination: The absorption of elements of the program being studied by members of the comparison or control group during the evaluation. Contamination is a threat to validity because the group is no longer untreated for comparative purposes.

Control condition: In an experimental design, the condition that exists when there is a randomly assigned group from the same population that does not receive the treatment or intervention that is the subject of the evaluation. It is a stand-in for what the program group would have looked like if it had not received the program.

Correlation: In statistics, correlation is the degree to which two or more attributes or measurements on the same group of elements show a tendency to vary together.

Dependent variable: The variable that is being studied, explained, or is dependent on another variable. It is a measure of the presumed effect in a study. In evaluation, it is a data item that represents an expected outcome of the program.

Effect size: In statistics, an effect size is a measure of the strength of the relationship between two variables in a statistical population, or a sample-based estimate of that quantity. An effect size calculated from data is a descriptive statistic that conveys the estimated magnitude of a relationship without making any statement about whether the apparent relationship in the data reflects a true relationship in the population. (See the illustrative sketch following the Experimental evaluation design entry below.)

Evaluation design: The conceptual framework for determining whether an intervention or program has an effect on participants.

Experimental evaluation design: Requires the evaluator to randomly assign subjects to treatment or control conditions so that all other sources of influence are theoretically randomly distributed across the conditions. Experimental evaluation designs are considered the most rigorous of all of the evaluation design choices because of the level of certainty one can have in the findings.
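To make the effect size and experimental design entries concrete, the minimal sketch below (illustrative only) computes a standardized mean difference (Cohen's d, one common effect size) from hypothetical treatment and control group scores; the data are simulated and the variable names are not drawn from the toolkit instruments.

    # Illustrative effect size calculation (Python with numpy); data are simulated.
    import numpy as np

    rng = np.random.default_rng(seed=0)
    treatment = rng.normal(loc=52.0, scale=10.0, size=200)  # hypothetical outcome scores
    control = rng.normal(loc=50.0, scale=10.0, size=200)

    # Cohen's d: difference in means divided by the pooled standard deviation.
    pooled_sd = np.sqrt((treatment.var(ddof=1) + control.var(ddof=1)) / 2)
    cohens_d = (treatment.mean() - control.mean()) / pooled_sd
    print(f"Estimated effect size (Cohen's d): {cohens_d:.2f}")

Because the groups were formed by random assignment, the difference in means is an unbiased estimate of the program impact; the effect size simply expresses that difference in standard deviation units so it can be compared across outcomes measured on different scales.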
Exploratory analysis: A type of analysis used in multiple comparisons testing to identify hypotheses that could be subject to future rigorous testing.

External validity: The extent to which a finding applies (or can be generalized) to persons, objects, settings, or times other than those that were the subject of study.

Face validity: The extent to which the evaluation actually measures what it intends to measure; the quality of an indicator that makes it seem a reasonable measure of some variable.

Focus groups: A method of data collection in which a small group of individuals (typically 6-12) is convened to discuss and provide data on particular issues or questions related to the evaluation. Focus groups allow the evaluator to determine, at least to some extent, the convergence and divergence of responses to a particular issue and/or to establish an in-depth understanding of a project.

Formative evaluation: A type of evaluation conducted during the course of program implementation whose primary purpose is to provide information to improve the program's effectiveness.

Hypothesis tests: Procedures for deciding if a null hypothesis (i.e., the proposition that there is no relationship between an intervention and specified outcomes) should be accepted or rejected in favor of an alternate hypothesis. A statistic is computed from a survey or test result and is analyzed to determine if it falls within a preset acceptance level. If it does, the null hypothesis is accepted. If it does not, the null hypothesis is rejected in favor of an alternate hypothesis.

Informed assent: A term used to express willingness to participate in research by minors or persons who are by definition too young to give informed consent but who are old enough to understand the proposed research in general, its expected risks and possible benefits, and the activities expected of them as subjects. The permission of a parent or guardian, using a parental permission form, must be obtained as well.

Informed consent: A term used to express the voluntary agreement of an individual or participant in an evaluation, or his or her authorized representative, who has the legal capacity to give consent, and who exercises free power of choice, without undue inducement or any other form of constraint or coercion, to participate in research. The individual must have sufficient knowledge and understanding of the nature of the proposed research, the anticipated risks and potential benefits, and the requirements of the research to be able to make an informed decision.

Internal validity: The value of a study or a set of studies for concluding that a causal relationship exists between variables, that is, that one variable affects another.

Interview: A method of data collection that involves face-to-face situations or telephone contacts in which the researcher orally solicits responses to questions. Interviews often can provide more in-depth information than surveys.

Intraclass correlation (ICC): A statistic that is used when measurements are made on units that are organized into groups and that describes how strongly units in the same group resemble each other. (The ICC enters the illustrative power calculation at the end of this glossary.)

Institutional Review Board (IRB): A committee or organization formed by hospitals or other institutions that is charged with reviewing and approving the use of human participants in research and evaluation projects.
The IRB serves as a compliance or ethics committee and is responsible for reviewing research protocols involving humans in order to determine the safety and ethical nature of the proposed study.

National Youth Leadership Council's K-12 standards and indicators of quality: Evidence-based standards and indicators for high-quality service-learning programs for students in Grades K-12 that were developed by the National Youth Leadership Council in 2008. The standards include sufficient program duration and intensity, opportunities for meaningful service, cognitively challenging reflection activities, strong link to academic curriculum or other learning objectives, mutually beneficial partnerships between schools/programs and community organizations/members, respect for diversity, youth voice, and progress monitoring.

Knowledge assessment: Closed- or open-ended questions or essay prompts that measure the extent to which students (or teachers or other respondents) have acquired specific knowledge and skills that are the target of the intervention.

Level of confidence (or confidence level): The degree to which evaluators can be certain that it was the intervention that influenced the result. Confidence levels are typically expressed as an approximate percentage. For example, if p = .05, then the evaluator is saying that he/she is 95% sure that the intervention (e.g., service-learning) was associated with the result that was found.

Logic model: A systematic and visual way to present the perceived relationships among a program's resources, activities, intended outcomes, and factors that may explain or influence outcomes.

Mediator: A variable that accounts for the relationship between the independent and dependent variable.

Moderator: A variable that affects the direction and/or strength of the relationship between the independent and dependent variable.

Minimum detectable effect (MDE): The smallest program impact that could be measured with confidence given random sampling and estimation error.

Multiple comparisons (or multiple hypothesis testing): The conduct of more than one hypothesis test to address key evaluation questions. For example, studies that examine the impacts of education interventions on key student, teacher, and school outcomes typically collect data on large samples and on many outcomes. Tests are conducted to assess intervention effects for multiple outcomes, for multiple subgroups of schools or individuals, and sometimes across multiple treatment alternatives.

Observations: A type of data collection method whereby observers watch a setting, record what they see, and then code their observations. Observations may be made of settings, classes, behaviors, verbiage, relationships, instructional styles, participation rates, levels of engagement, student groupings, and much more. Observations can be informal or structured, using a pre-determined protocol.

Office of Management and Budget (OMB): The largest component of the Executive Office of the President, whose predominant mission is to assist the President in overseeing the preparation of the federal budget and to supervise its administration in Executive Branch agencies. OMB is responsible for approving requests from federal agencies to solicit and collect information from the public.
The purpose is to ensure the quality and usefulness of the information collected from the public (respondents) and to minimize the burden placed on the public by the data collection process.

Open-ended questions: Questions on a survey or questionnaire that have no preexisting response categories but allow the respondent to answer in his or her own words.

Passive consent: A type of consent for an individual (or parent/guardian of a minor) to participate in a study that does not require the individual or parent/guardian to provide signed consent to participate. Rather, the letter or form is provided to the individual or parent/guardian with the stipulation that it be returned with a signature only if the individual does not want to participate (or the parent/guardian does not want his or her child to participate) in the study.

Point in time observation: An observation conducted at one point in time that presents a snapshot of a particular activity or event.

Pre/post evaluation design: A type of evaluation design where surveys or tests are first administered prior to a treatment or intervention and then again following the treatment or intervention to determine the effects of the intervention.

Qualitative methods: Research methods that involve detailed verbal descriptions or observations of characteristics, cases, and settings. Qualitative analysis can be conducted on data collected from observations, interviews, and documents.

Quantitative methods: Research methods that involve examination of phenomena through the numerical representation of data and statistical analysis. Examples of quantitative data include responses to closed-ended survey questions, test scores, attendance rates, and graduation rates.

Quasi-experimental evaluation design: A type of evaluation design that utilizes matched treatment and comparison groups. Quasi-experimental designs differ from experimental designs in that participants are not randomly assigned; rather, groups of participants that closely resemble the treatment group are recruited to participate in the evaluation.

Random assignment: The process of assigning individuals or groups (e.g., classrooms) to the experimental and control treatments such that each individual or group has an equal chance of being in each treatment. (See the illustrative sketch following this group of definitions.)

Reliability: The extent to which measuring the same construct in the same way will consistently yield the same results.

Response categories: Predetermined categories, typically found on surveys or questionnaires with closed-ended questions, that respondents can check off. For example, in response to a question about their opinion on an issue, respondents could check "Strongly Disagree," "Disagree," "Agree," or "Strongly Agree."

Response rate: The percentage of potential respondents who were initially contacted who actually completed a survey, questionnaire, or other type of study instrument.

Sampling: The process by which some portion of the population is selected for study so as to represent the larger population.

Sampling error: The likelihood that any scientifically drawn sample will contain certain unavoidable differences from the true population of which it is a part.

Sampling frame: A list of "units" or members (in the case of service-learning, typically individuals, classrooms, schools, or districts) from which the actual sample is eventually drawn.
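As a minimal illustration of random assignment (not a prescription for any particular NELSAP procedure), the sketch below randomly assigns a hypothetical list of classrooms to treatment and control conditions so that each classroom has an equal chance of being in either group.

    # Illustrative random assignment of classrooms (Python); classroom IDs are hypothetical.
    import random

    classrooms = ["Room 101", "Room 102", "Room 103", "Room 104", "Room 105", "Room 106"]

    random.seed(20110916)       # fixing the seed makes the assignment reproducible
    random.shuffle(classrooms)  # put the classrooms in a random order

    half = len(classrooms) // 2
    treatment_group = classrooms[:half]  # first half receives the program
    control_group = classrooms[half:]    # second half serves as the control condition

    print("Treatment:", treatment_group)
    print("Control:", control_group)

In practice an evaluator would document the seed and the assignment procedure in advance, and would often block the assignment (for example, within schools or within teachers) before shuffling.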
Secondary analysis: Analysis of data that have already been collected by someone other than the investigator conducting the research. Common sources of secondary data in service-learning include state achievement test scores, school accountability reports, and attendance records.

Service-learning: Service-learning is an experiential teaching and learning strategy that integrates meaningful community service with the learning objectives of academic curricula. Service-learning is unique among experiential learning pedagogies in that it seeks to simultaneously enhance students' academic and civic outcomes. Service-learning can be applied across all subjects and grade levels; it can involve a single student or group of students, a classroom, or an entire school.

Standard deviation: A measure of the variability (dispersion or spread) of any set of numerical values from the average (mean, or expected value). A low standard deviation indicates that the data points tend to be very close to the mean, whereas a high standard deviation indicates that the data are spread out over a large range of values.

Statistical power: A gauge of the likelihood that a true effect will be detected. In general, statistical power is increased by including more cases in the sample. (See the illustrative sketch at the end of this glossary.)

Summative evaluation: Evaluation designed to present conclusions about the merit or worth of an intervention and recommendations about whether it should be altered or eliminated.

Survey: A method of data collection that allows evaluators to gather information about individuals, schools, programs, etc. Surveys can be administered online, by e-mail or regular mail, by telephone, or in person. Most surveys yield data that are easily quantified, though some surveys use a combination of closed-ended (forced choice) and open-ended questions. A survey may focus on factual information about individuals or entities, or it might seek to obtain the opinions of the survey takers.

Treatment condition: The group that receives the program, intervention, or services being studied.

Validity: In measurement, validity refers to the extent to which a measure captures the dimension of interest. In analysis, validity refers to the close approximation of study conclusions to the "true" situation.

Variance: A measure of the spread, or dispersion, of the values or scores in a distribution. The larger the variance, the further the individual cases are from the group mean.
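To connect the statistical power, minimum detectable effect, and intraclass correlation entries, the minimal sketch below (illustrative only; the effect size, ICC, and cluster size are invented for the example) first finds the number of students per group needed to detect a given effect size with 80% power in a student-level randomized design, and then inflates that number by a simple design effect, 1 + (m - 1) x ICC, to approximate the cost of randomizing intact classrooms of m students instead.

    # Illustrative power calculation (Python with statsmodels); all inputs are hypothetical.
    from statsmodels.stats.power import TTestIndPower

    effect_size = 0.20   # minimum detectable effect, in standard deviation units
    alpha = 0.05         # significance level
    power = 0.80         # desired statistical power

    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(effect_size=effect_size, alpha=alpha, power=power)
    print(f"Students per group (individual random assignment): {n_per_group:.0f}")

    # Approximate adjustment for clustering when classrooms, not students, are randomized.
    icc = 0.15           # intraclass correlation for the outcome (hypothetical)
    m = 25               # students per classroom (hypothetical)
    design_effect = 1 + (m - 1) * icc
    print(f"Students per group after the clustering adjustment: {n_per_group * design_effect:.0f}")

As the Hedges and Hedberg (2007a) and Schochet (2005, 2008a) entries above suggest, realistic ICC values and covariates can change these numbers substantially, so a real design would rely on published parameter estimates rather than the invented values used here.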