10 Article Critiques

Critiquing Tips

Introduction
Reason for the assignment. Nature of the EBSCO search. What databases were used. Nature of the findings.

For this assignment, the articles you review must be peer-reviewed articles on evaluations of specific programs (e.g., an after-school tutoring program) or articles that explain the implementation of specific evaluation methods (e.g., participatory evaluation). It is important that you frame the critique of your articles in relation to their value in strengthening your understanding of your own evaluation project. You must review 7 different articles.

Article Summary Table
 * **Summary Questions** || **Article 1:** Bauer, S. M., & Arthanat, S. (2010) || **Article 2:** Belzer, A. (2005) || **Article 3:** Edmondson, L., & Hoover, J. (2008) || **Article 4:** Fullmer, P. (2009) || **Article 5:** Hanley, P., Maringe, F., & Ratcliffe, M. (2008) || **Article 6:** Shapley, K. S., Sheehan, D., Maloney, C., & Caranikas-Walker, F. (2010) || **Article 7:** Singh, A. S., Chinapaw, M. M., Brug, J. J., & van Mechelen, W. W. (2009) ||
 * What program was evaluated? || Small Business Innovation Research (SBIR) and Small Business Technology Transfer Research (STTR) grant programs (from 5 federal agencies) || Pennsylvania Bureau of Adult Basic and Literacy Education, a professional development system || Bullying Prevention Program in 4 Midwestern schools over a 3-year period || Tutoring services in the areas of reading, writing, and math in the Learning Resource Center (LRC) of Lincoln University, Pennsylvania || The development process of a professional program for supporting teachers' expertise in science teaching || Pilot study of the Technology Immersion model in 21 immersion schools in Texas || Dutch Obesity Intervention in Teenagers (DOiT) ||
 * What was the purpose of the evaluation? || To evaluate the impact of the grant programs on the development of assistive technology (AT) devices || In two phases, the study 1) looked at the relationships between program planners' vision for professional development and the actual experience of the program by participants; and 2) investigated the impact of the program. || To measure the impact of the bullying program; to determine whether the program was properly implemented; and to measure the quality of the program || To improve program effectiveness and services, as required by the standards of the Council for the Advancement of Standards in Higher Education (CAS) || To evaluate the success of the program development process || To determine the extent to which the model was properly implemented, and how proper implementation affected outcomes || To evaluate the "reach, implementation, satisfaction and maintenance of a school-based program aimed at the prevention of excessive weight gain among adolescents" (Singh, Chinapaw, Brug, & van Mechelen, 2009, p. 772) ||
 * Was the evaluation formative or summative? || Formative. Among the intended outcomes for the evaluation is this: "findings will allow federal agencies to compare and optimize the makeup and foci of their SBIR and STTR grant portfolios. In particular, the findings will reduce portfolio similarities, accentuate portfolio differences, and allow appraisal of SBIR and STTR programs with regard to mission fulfillment" (Bauer & Arthanat, 2010, p. 40). || Formative. The evaluation seeks to offer recommendations for the improvement of the program. || Ultimately summative, although it employed formative assessment || Formative. Results were used to develop an action plan to improve future effectiveness. || Summative || Summative || Formative. This is a process evaluation of the obesity prevention program. ||
 * Who were the stakeholders? || people with disabilities who need assistive technology, companies that invest in (design, manufacture, sell) assistive technology devices, the 5 federal agencies who have SBIR and STTR grants, state and federal legislative leaders, disability advocates, and taxpayers || program planners, professional development students, staff, instructors, state policymakers, taxpayers || student victims, student bullies, teachers, parents, administrators, Mega || tutors, program administrators, faculty, students who use the tutoring service, Mega || science teachers, program planners, K-12 students, taxpayers || teachers, students, parents, community, Mega, administrators, policymakers, Dell, Apple, Microsoft || children under study, children who might benefit from results, teachers, parents, program implementers, policymakers ||
 * What were the evaluation questions? || The study had objectives but no questions:

"1. Identify the Phase I and Phase II SBIR (for five agencies) and STTR (for two agencies) awards and funding for ATD development for the period 1996 through 2005. Classify the awards and funding using an ICF-based taxonomy. 2. Evaluate Phase I and Phase II SBIR and STTR awards and funding on a yearly and aggregate basis by: (a) types of ATDs funded (component and category); (b) agencies (number of awards, funding levels and award portfolios); (c) inter-agency comparisons (award numbers, funding levels and award portfolios); and (d) inter-program comparisons (SBIR and STTR programs) and trends 3. Interpret data and draw conclusions regarding SBIR and STTR award and funding trends for companies, agencies, across-agencies, across programs and across-technology domains (industry segments). Analysis will especially include longitudinal trends and a comparison of award portfolios." (Bauer & Arthanat, 2010, pp. 48-49). || What is the relationship between the vision for professional development held by planners and facilitators and the ways in which participants actually experienced the range of activities offered?

What is the impact of the system? || What is the impact of the bullying program?

Was the program properly implemented?

What was the quality of the program? || What are the strengths, weaknesses, opportunities, and threats related to the implementation of the tutoring services at the university? || This evaluation uses the "change transition model" to evaluate the program at 4 layers:

Trigger Layer What did the developers see as the trigger for the project? What triggered teachers’ involvement?

Vision Layer

What are the intentions of the development in terms of teachers’ expertise? How does this development meet with the individual needs of the teachers? What challenges might be met in working with teachers and how can these be resolved? What strategies can be used to motivate people to sign up to these changes?

Conversion Layer Questions How much empathy exists between teachers’ own aspirations and the project goals and processes? To what extent do teachers feel ownership of the project? How much shared understanding of the change exists between programme developers and the teachers?

Maintenance and Renewal Layer What strategies are designed to maintain the teachers at the level of experts once this has been attained? How will the programme evolve beyond the initial vision to reflect new conditions?

(Hanley, Maringe, & Ratcliffe, 2008, pp. 713-714) || To what extent did each of the 21 treatment schools implement the Technology Immersion model as designed?

Given variations in implementation, what is the relationship between implementation strength (at the school, teacher, and student levels) and students’ reading and mathematics achievement as measured by scores on the state’s criterion-referenced assessment—the Texas Assessment of Knowledge and Skills (TAKS)? || What is the reach of the program?

What is the nature of program implementation?

How satisfied are the participants?

How can/should such a program be maintained? ||
 * What methods were used? What types of data collection were used? || The methods address these issues: "(a) gathering of SBIR and STTR award data, (b) construction of an ICF-based classification system, (c) inclusion and exclusion criteria for ATDs, and (d) assignment heuristics to place ATDs into the ICF-based classification system." (Bauer & Arthanat, 2010, p. 49).

These data were collected on every award granted and entered into a Microsoft Access database: award, title, year, type, Phase, amount, abstract, principal investigator, organization information, and funding agency. || Qualitative research methods, which employed these data collection mechanisms: open-ended interviews, observations, collection and review of key documents || Using the //Teacher Use of the Bullying Prevention Project// tool, teachers were surveyed on 1) their use of the anti-bullying curriculum; 2) their perception of changes in school climate; and 3) their perception of how the program was received by students and parents.

The instrument contained a mix of open-ended and closed-ended questions. || SWOT analysis was used. Four teams of LRC staff each evaluated one aspect of the SWOT framework.

Mixed surveys of faculty and students; analysis of improvement from pretest to posttest; a review of course grades; and measures of student persistence and retention. || Semi-structured interviews of program developers and program participants were conducted. Documentary evidence was collected.

Interviews were transcribed and analyzed using grounded theory. || Data collection "focuses on the second (2005–06), third (2006–07), and fourth (2007–08) implementation years. Measures included teacher and student surveys completed at the end of each school year (April to May), and students’ TAKS [the state standardized assessment of student knowledge and skills] scores from annual administrations in April". (Shapley, Sheehan, Maloney, & Caranikas-Walker, 2010, p. 14)

Researchers used the teacher and student survey data to come up with "implementation scores for indicators that measured progress toward immersion standards" (Shapley, Sheehan, Maloney, & Caranikas-Walker, 2010, p. 20). || Quasi-experimental design with treatment and control groups.

Data collection: questionnaires administered to students, teachers, school board, and site staff ||
 * What results were reported? || For SBIR grants, results were reported with tables presenting the data relationships of interest and accompanying text. The following represent the nature of results presented (titles taken from results tables in study):
 * Awards and Funding for ATD Development by Agency and Year
 * Ratio of Phase II/Phase I Awards
 * Number of Phase I and Phase II SBIR Awards by ICF Component and Agency
 * Percent of Phase I Awards by ICF Component and Agency
 * SBIR Phase I and Phase II Awards by Agency and by BFS Domain
 * SBIR Phase I Awards by Agency and by Activity Categories
 * SBIR Phase I Awards by Agency and by Participation Categories
 * SBIR Phase I Awards by Agency and by Context Categories
 * STTR Phase I Awards by Component
 * STTR Phase II Awards by Component
 * Comparison of STTR and SBIR Awards by Phase || Results were reported as a qualitative write-up that included data on participation in the professional development programs and 5 aspects of impact: "impact on practice, on thinking about teaching and learning and professional knowledge, on professional attitude, on program structures and procedures, and on the field" (Belzer, 2005, p. 41). || Results included a report of student behavior changes at the end of Years 2 and 3; the number of lessons taught at the end of Years 2 and 3; and the teachers' perception of school atmosphere at the end of Years 2 and 3.

Student pre-program data and post-program data also showed that feelings of safety increased. || Overall, SWOT analyses of the different tutoring programs (reading, math, and writing) revealed:

Strengths: certified tutors, positive interactions with students, positive lab climate

Weaknesses: difficulty with online interface, low staffing, lack of faculty involvement, low improvement rates for students

Opportunities: increased collaboration with other agencies on campus, increased training and certification of lab tutors

Threats: lack of funding and staffing, lack of buy-in from faculty and students || This evaluation uses the "change transition model" to evaluate the program at 4 layers:

Trigger Layer Interest in learning and marketplace demand for new learning were the two main triggers found. Vision Layer Teachers want to impact student outcomes, and they valued most those things they could carry into their classrooms. Program developers valued the teachers' buy-in of portfolio development.

Conversion Layer The use of portfolio evidence was initially unclear but became clearer among program developers as the project progressed.

Maintenance and Renewal Layer Little data were returned on this layer (as stated by the investigators). || "Using hierarchical linear modeling, [researchers] found that teacher-level implementation components (Immersion Support, Classroom Immersion) were inconsistent and mostly not statistically significant predictors of student achievement, whereas students' use of laptops outside of school for homework and learning games was the strongest implementation predictor of achievement" (Shapley, Sheehan, Maloney, & Caranikas-Walker, 2010, p. 4). || There were immense difficulties in the recruitment phase and therefore a low reach at the school level. However, among adolescents at the schools that participated, the reach was high (84%). Furthermore, the classroom intervention was implemented successfully based on the number of lessons taught. Most teachers rated the DOiT intervention positively; students rated the intervention 6.6 on a scale of 1-10. ||
 * What recommendations were reported? || Because this and other studies suggest that policymakers lack the data needed to make evaluative decisions about funding in the area of AT, the recommendations point to interventions that would improve data collection and oversight. First, the researchers recommend the creation of a single classification system based on the ICF-based system. Second, all federal agencies must use this system. Third, all small businesses awarded an STTR or SBIR grant must report Phase III outcomes, which relate to commercialization. As yet, they are only required to report Phase I and Phase II outcomes. Fourth, all performance data related to these recommendations must be available in a single online web interface. || Create a shared vision for professional development, improve participation, use needs assessment to better respond to the practitioners who use the program, strike an appropriate balance between breadth and depth of courses, and improve the continuous improvement process for the program. || Because this was a process evaluation, many changes were made based on incoming data analyzed during the course of the evaluation. Some such changes were having teachers implement the anti-bullying training rather than outside trainers, increasing the amount of anti-bullying resources in the school library, and creating //The Bullying Prevention Manual//. || Align the lab work with the course work, provide frequent and positive feedback, and collaborate with other departments on campus. || Establish a shared vision of goals and outcomes. Be flexible when implementing the program. Negotiate common understanding between participants. Ensure that collaboration is taking place among participants. || In line with its summative nature, this study did not provide recommendations, but it did provide this conclusion: "This study confirms that fundamental school change is difficult and requires long-term commitment at all levels of the school system (board members, superintendent, principals, teachers, students, and parents). Given the challenges and costs of implementing Technology Immersion, statewide implementation of the model may not be feasible." (Shapley, Sheehan, Maloney, & Caranikas-Walker, 2010, pp. 51-52). || The report offers these recommendations: "-examination of reasons why schools refuse to participate in the RCT/implement an intervention aimed at the prevention of obesity and thereby increasing the odds of adequate dissemination of the intervention; - decreasing time teachers spend on the implementation of the intervention (e.g. shorter duration of the intervention, fewer lessons and instead of a 1-year intervention period a 2-year intervention period); it is, however, not sure that such a less teacher-intensive implementation will lead to similar results; - embedding the intervention in local and/or national school health policy initiatives in order to ensure a stronger support for schools in their role as health-enhancing institution." (Singh, Chinapaw, Brug, & van Mechelen, 2009, p. 777) ||

Article Analysis

In what ways are the articles similar or different? (e.g., focus, population, research question, research design, data collection, and data analysis – you may address each of these in a paragraph or two) Are the similarities or differences limited to only a certain number of articles? In what way?

Article Critique Following your summary or table, write a critique and reaction to your collection of articles by considering the following:
 * Provide your personal opinion about this collection.
 * Explain why the information you gathered is important.
 * Discuss how the information will be relevant to your future professional practice.
 * Describe who may benefit from reading this collection of articles.

Quality of Collection (personal opinion): variable. It is challenging for the novice researcher to evaluate quality; it is beneficial to do a rapid comparison of articles.

Why Important: because it informed me about ways to structure an evaluation report, verbiage to use in describing an evaluation, what information is important to include, and what types of information precede the study report (e.g., the literature review).

Who may benefit from reading this collection of articles: educators learning to conduct a program evaluation; any program developers, planners, implementers, or stakeholders who are involved in similar programs.

Article Critique Synthesizing Questions How can the articles support or strengthen your understanding of the evaluation project? Are they making you feel confident that you are headed in the right direction? Are they giving you other ideas that make you pursue things differently from your initial thoughts? What key ideas presented in these articles did you find valuable? How will you integrate these ideas into your writing about this evaluation project? Again, the articles showed me how I can do my own evaluation: what works and what doesn't.

Conclusion--Relevance to future professional practice: I found a possible model for use in writing up a program evaluation of the planning process for migrating SI from f2f to online delivery: the change transition model. I shared it with co-researchers, who were enthusiastic.

I could also use this article, or one like it, as a required reading in the course I teach; it gives students an in-depth case study of a school implementation. http://www.eric.ed.gov/PDFS/EJ873678.pdf

References About Program Evaluation
Newton, X. A., & Llosa, L. (2010). Toward a More Nuanced Approach to Program Effectiveness Assessment: Hierarchical Linear Models in K-12 Program Evaluation. //American Journal of Evaluation//, 31(2), 162-179. Retrieved from EBSCO//host//.

For Fun Gallagher, J. J. (2006). According to Jim Gallagher: How to Shoot Oneself in the Foot with Program Evaluation. //Roeper Review//, 28(3), 122-124. Retrieved from EBSCO//host//.

Haertel, E. H., & Herman, J. L. (2005). //Yearbook of the National Society for the Study of Education//, 104(2), 1-34.