
Discussion Prompt -- D3

Chapters: 11--Experimental Research; 12--Quasi-Experimental and Single-Case Designs; 13--Nonexperimental Quantitative Research; 14--Qualitative Research; 15--Historical Research; 16--Mixed Research

**PART ONE.** Based on the topic/issue and research problem you identified and described in Discussions 1 & 2, you need to access the UWG library database to search for two refereed or peer-reviewed articles published after 2004. This will bring your article count to six. Summarize each article's general contents and share the summaries with your peers in this discussion board (150-300 words each). Don't forget to identify and describe the approach used to determine the solution to the problem based on your readings.

Make sure you include an accurate APA (6th ed.) citation of each article in your posting. This will be your first initial posting. A PDF copy of the actual article should be attached to this initial posting.

Huett-Approaches Used in the Articles

Article 1: In //Research and practice in K-12 online learning: A review of open access literature//, Cavanaugh, Barbour, and Clark (2009) share the results of their review of the research literature on K-12 online learning. In conducting their study, they examined literature published between 1997 and July 2008. The purpose was to identify the emphases of the body of research during that period as well as areas for future study. In organizing their review of the literature, they used a methodology called template analysis, which is a type of meta-synthesis. Meta-synthesis, not mentioned in our textbook, seems similar to the idea of meta-analysis, which is "a quantitative technique that is used to integrate and describe the results of a large number of studies" (Johnson & Christensen, 2007, p. 592). One key difference is that the two methodologies operate in different research paradigms: meta-analysis is quantitative, and meta-synthesis is qualitative. The authors explain that "content analysis of the documents, such as meta-synthesis, reveals the values and needs that dominate a field in its early stages" (Cavanaugh, Barbour, & Clark, 2009, p. 6).

In conducting their template analysis, the authors began by examining more than 500 literature sources. They narrowed this number down to 226 items that met their "criteria of relating directly to K-12 online learning and being openly Internet-accessible" (Cavanaugh et al., 2009, p. 7). They identified broad themes in the field and created a spreadsheet against which to note the presence or absence of those themes as they worked through the 226 pieces of literature. Three coders worked through the literature, and each piece was coded by two of the three.

In their results, Cavanaugh et al. (2009) presented five dominant thematic areas as follows: type of virtual school, professional roles, benefits and challenges of virtual schooling, standards of quality in online courses (based on the iNACOL instrument //National Standards for Quality in Online Courses//), and standards of teaching online courses (based on the iNACOL instrument //National Standards for Quality Online Teaching//).

In their conclusion, the authors explain that early research in the field focused on studies that compared virtual schooling to face-to-face schooling. Early studies were attempting to justify and rationalize the move to the online environment. Later studies (post-2000) have branched out somewhat into other areas, but research is needed to fill in some high-needs areas, as identified by the authors: "best practices for online teaching strategies" (p. 12), "identification of characteristics that are necessary for adolescents to be successful in online learning environments" (p. 12), "encouraging interaction between in-school and online classmates" (p. 12), and "examining the quality of student learning experiences in virtual school environments" (p. 13).

Article 2: In //Predicting success for virtual school students: Putting research-based models into practice//, Roblyer and Davis (2008) push for the development of prediction models that could be used to "identify students who may need additional assistance in order to be successful in virtual environments" (Roblyer & Davis, 2008). This article puzzled me because it reads like an addendum to an article published earlier that year in the //American Journal of Distance Education//. In that study, titled //Toward practical procedures for predicting and promoting success in virtual school students//, the authors explored "whether a combination of learner characteristics and learning environment variables derived from past research could predict success in one kind of distance learning population (virtual school students) and how organizations that offer distance courses might use findings from such a model to facilitate online learning success for future students" (Roblyer, Davis, Mills, Marshall, & Pape, 2008, p. 91).

(To be clear, I did not read the study upon which my selected article is based, although I intend to read it for future related research. I am simply commenting here that I was puzzled by the way in which my selected article seemed like a continuation of an article already published--and by several of the same authors, no less. Shouldn't the authors have included this information from the Predicting Success article in the original article? -- musings from a novice researcher)

Roblyer and Davis (2008) introduce the article with a review of the literature, showing how the early promise of the virtual school movement in the 1990s was to increase educational opportunities for less advantaged students, such as rural and at-risk populations. As with postsecondary online learning populations, virtual school courses have higher failure and dropout rates than their face-to-face counterparts. Roblyer and Davis (2008) cite the example of a Florida virtual school in which minority students were found to enroll less and drop out more than other students. As the problems of success and retention have become apparent, researchers have explored a number of avenues to explain why the problem persists. One argument is that a poorly designed learning environment may create "psychological distance" that impacts student performance. Even so, argue the authors, "in light of the fact that so many students are successful in the same courses in which others drop out, it seems likely that some students require even more facilitation and monitoring than others in virtual courses" (Roblyer & Davis, 2008).

While some virtual schools may have methods for filtering out at-risk students who may be prone to dropping out, there is an increasing need to identify and help students who come to the online environment with fewer advantages, such as the innate ability to self-regulate. "Equal opportunity and equity requirements will make it impossible for most schools to select only certain students to take online courses, so the emphasis will be on strategies to support students in ways that help promote retention and success in virtual courses" (Roblyer & Davis, 2008). An important consideration in designing a prediction model to identify and assist at-risk students is that the model be efficiently implemented and responsive to students so that just-in-time interventions may be applied.

Looking through the various types of research methods described in the textbook, the research presented in this study (or, should I say, this study of the other study) falls most in line with the nonexperimental quantitative explanatory research described on pp. 379-380. In particular, I think this may be an example of //causal modeling//, "a form of explanatory research in which the researcher hypothesizes a causal model and then empirically tests the model" (Johnson & Christensen, 2007). What Roblyer et al. seem to be attempting is not making predictions themselves (which might place this research under nonexperimental quantitative predictive research), but rather building a model to help make predictions. I would say model building serves an explanatory (or theory-explaining) purpose; others can then use the model to aid in making predictions.

Roblyer and Davis (2008) explain the quantitative methodology underlying the Roblyer et al. (2008) study. (To clarify: in //this// article, they are explaining the methodology put forth in //the other// article.) The study subjects were 4,110 secondary students enrolled in 196 Virtual High School Global Consortium (VHS) courses in the spring of 2006. An electronic version of the instrument was provided to students in their online courses, and students were offered 10 extra credit points for completing the survey. The survey, based on the Educational Success Prediction Instrument (ESPRI) used in earlier studies, consisted of 60 Likert-based questions, and it addressed these "hypothesized factors: organization, achievement beliefs, responsibility, risk-taking, and technology skills/access" (Roblyer & Davis, 2008). Of the 4,110 students invited to participate in the study, 2,162 (53%) responded. A significant proportion of respondents (more than 80%) were from rural and suburban schools, and 27% were from Title I schools.

Students earning a grade of A, B, or C were considered successful; and students earning a grade of D, F, I, or W were considered failing. Upon completion of the semester under study, 75% of students passed their courses, and 25% failed.

Reading the data analysis section of this study was difficult for me, as it used terminology that I have either forgotten, never heard of, or never really understood. I will include some highlights of the data analysis: factor analysis was used on the ESPRI to reduce the number of questions; logistic regression analysis was conducted in an attempt to "obtain a combination of factors that yielded the best prediction of success and failure" (Roblyer & Davis, 2008); and odds ratios for predicting the probability of success were employed and explained.
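For readers who, like me, are rusty on this terminology, here is a minimal sketch of how logistic regression coefficients relate to odds ratios and predicted probabilities. The coefficient values below are invented purely for illustration; they are not taken from the ESPRI study.

```python
import math

# Hypothetical coefficients from a fitted logistic regression model
# (illustrative values only; not the actual ESPRI results).
intercept = 0.5
coef_organization = 0.8  # log-odds change per unit of an "organization" factor

def probability_of_passing(organization_score):
    """Convert the model's log-odds into a probability via the logistic function."""
    log_odds = intercept + coef_organization * organization_score
    return 1.0 / (1.0 + math.exp(-log_odds))

# An odds ratio is exp(coefficient): the multiplicative change in the
# odds of passing for a one-unit increase in that predictor.
odds_ratio = math.exp(coef_organization)

print(round(odds_ratio, 2))                    # ≈ 2.23
print(round(probability_of_passing(1.0), 2))   # ≈ 0.79
```

In other words, under these made-up numbers, each one-unit increase in the "organization" factor roughly doubles a student's odds of passing.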

The authors argue that the procedures shared in this article can be used by virtual school administrators to determine students' Probability of Passing (POP) score. Schools would identify the ranges of POP score for which intervention would be necessary. Prior to the commencement of a given semester, a virtual school could employ interventions to assist students most at-risk, based on the POP.
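As a toy illustration of how a school might act on POP scores, the sketch below maps score ranges to intervention levels. The cutoff values and level names are hypothetical assumptions of mine, not figures from the article.

```python
# Hypothetical mapping from a student's Probability of Passing (POP) score
# to an intervention level; the cutoffs are invented for illustration and
# are not taken from Roblyer and Davis (2008).
def intervention_level(pop_score: float) -> str:
    if not 0.0 <= pop_score <= 1.0:
        raise ValueError("POP score must be a probability between 0 and 1")
    if pop_score < 0.40:
        return "intensive support"   # e.g., contact before the semester begins
    if pop_score < 0.70:
        return "monitoring"          # e.g., periodic check-ins during the course
    return "no intervention"

print(intervention_level(0.35))  # intensive support
print(intervention_level(0.85))  # no intervention
```

The point of the article's proposal is exactly this kind of triage: computing POP scores before the semester starts so that just-in-time support can be targeted at the students most at risk.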

In terms of future studies, the authors urge other researchers to create prediction models in other settings. The population on which this study was conducted was 77% Caucasian, and there is little doubt that other populations would yield other results and considerations in building a prediction model. On a final note, the authors push for expanding access to online courses for all learners, and discourage the pursuit of research that would limit online course offerings to only those students with such advantages as self-organization, self-regulation, and access to computers at home. They argue that "with functional strategies in place to identify and assist at-risk virtual learners, virtual schools can better fulfill their early promise of becoming an education equalizer" (Roblyer & Davis, 2008).

References

Cavanaugh, C., Barbour, M., & Clark, T. (2009). Research and practice in K-12 online learning: A review of open access literature. //International Review of Research in Open and Distance Learning, 10//(1).

Johnson, B., & Christensen, L. (2007). //Educational research: Quantitative, qualitative, and mixed approaches// (3rd ed.). Boston: Allyn and Bacon.

Roblyer, M. D., & Davis, L. (2008). Predicting success for virtual school students: Putting research-based models into practice. //Online Journal of Distance Learning Administration, 6//(4).

Roblyer, M. D., Davis, L., Mills, S., Marshall, J., & Pape, L. (2008). Toward practical procedures for predicting and promoting success in virtual school students. //The American Journal of Distance Education, 22//(2), 90-109.


**PART TWO.** Based on these readings and your thinking at this point about your potential research project, identify an approach to finding a solution to your research problem that you would like to pursue. Please explain your rationale and the appropriateness of using it (150-300 words). Make sure that you include at least a reference to key ideas from chapters or articles you read. This will be your second posting.

Huett-Planned Approach

Background info before responding to the prompt requirements (in case you don't remember where I'm coming from): My topic is K-12 online learning, and I am looking at the problem of how secondary schools can implement quality control measures in the design, development, implementation, and evaluation of their online course offerings. In terms of research purpose, the focus of the present study is to explore ways in which a mutually beneficial relationship could be established between graduate students in an online distance education course and K-12 teachers of online/blended courses in a Georgia local education agency (LEA). Graduate students conducted course reviews of 10 K-12 courses from grades 6-12 at a distance, providing feedback about curriculum content alignment to state standards as well as feedback related to instructional design, assessment, technology use, and "21st Century Skills" (//National Standards of Quality for Online Courses//). Graduate students would gain field experience in distance education working with practicing teachers in the field. Teachers would gain feedback from graduate students to improve their classes. It is hoped that through iterative revisions of the project, a course review model for university/K-12 partnerships may be developed.

Methods: My research design will employ mixed methods, in which "approaches from quantitative and qualitative research are combined or mixed in a research study" (Johnson & Christensen, 2007, p. 441). My sources of information will include not only refereed journal articles (as required by this class) but also books, government and think tank reports, interviews, and other sources. In attempting to categorize my research, I would describe it as exploratory, in that it seeks to explore the impact of a given intervention but not necessarily explain it. According to Cavanaugh, Barbour, and Clark (2009), "it is important to know how students in virtual schools engage in their learning in this environment prior to conducting any rigorous examination of virtual schooling" (pp. 2-3). Through exploration in the first round of research, we will be able to firm up our methods and identify variables of interest for future study. The research will also have quantitative aspects.

To measure the extent to which we provided instructional technology graduate students with authentic experiences evaluating real online courses, we will employ an attitudinal survey that includes both quantitative and qualitative (open-ended) questions. To measure the extent to which our intervention gave teacher designers feedback on how well their courses covered state curriculum guidelines (GPS) and on the design and interaction of their online courses, we conducted a qualitative pre-survey and will conduct an attitudinal post-survey (again with both quantitative and qualitative questions). In addition, the feedback created by the graduate students--quantitative data on the iNACOL instrument, mixed data on the GPS instrument, and the mixed feedback presented through the final //VoiceThreads//--will be included in the final results.

References

Cavanaugh, C., Barbour, M., & Clark, T. (2009). Research and practice in K-12 online learning: A review of open access literature. //International Review of Research in Open and Distance Learning, 10//(1).

Johnson, B., & Christensen, L. (2007). //Educational research: Quantitative, qualitative, and mixed approaches// (3rd ed.). Boston: Allyn and Bacon.

Roblyer, M. D., & Davis, L. (2008). Predicting success for virtual school students: Putting research-based models into practice. //Online Journal of Distance Learning Administration, 6//(4).


**PART THREE.** As part of the collaborative learning process, you are expected to discuss with your peers the ideas and summaries shared in this discussion board. Your active participation should be characterized by the following:

3 focusing on SIMILARITIES between peer postings and mine -k checkwood renfro -ronald -

3 focusing on DIFFERENCES between peer postings and mine -elizabeth -carlene bailey -stiefel

1 focusing on the lessons learned from the shared peer postings to the student's approach in finding a solution to his/her research problem. -jennifer -jennifer 2

1 including a question about the articles shared by another student related to that student's approach in finding a solution to his/her research problem - -

References

**United States General Accounting Office. (1996). //Content analysis: A methodology for structuring and analyzing written material// (PEMD-10.3.1). Retrieved from http://archive.gao.gov/f0102/157490.pdf**

Content analysis is "a systematic research method for analyzing textual information in a standardized way that allows evaluators to make inferences about that information" (General Accounting Office, 1996). When using content analysis, researchers attempt to identify important categories into which to sort a body of content. Researchers code by marking text passages with short alphanumeric codes, which "creates 'categorical variables' that represent the original, verbal information and that can then be analyzed by standard statistical methods" (General Accounting Office, 1996). The content under study can include a variety of sources, including print, audiovisual, and other media. Content analysis may consist of simply identifying themes and categories in a given body of content. However, it can sometimes go further and provide insight into the attitudes and beliefs held by the authors of the content. Cite AGAIN?


**Cavanaugh, C., Barbour, M., & Clark, T. (2009). Research and practice in K-12 online learning: A review of open access literature. //International Review of Research in Open and Distance Learning, 10//(1). (wiki page)**

In many ways, this is indicative of the foundational descriptive work that often precedes experimentation in any scientific field. In other words, it is important to know how students in virtual schools engage in their learning in this environment prior to conducting any rigorous examination of virtual schooling (pp. 2-3).

"The literature has not yet addressed the relative efficacy of teacher-developed, school-developed, and vendor-developed courses" (p. 8).

"Indeed, the success of any school hinges on the educators who are in direct contact with students and on the administrators who support them (Darling-Hammond, 2000). Other support personnel including media specialists and site facilitators are pivotal to the success of schools (Lance & Loertscher, 2005; Kleiman, 2007) but have a less central role (Cavanaugh & Cavanaugh, 2007). Therefore the roles of teachers and administrators received the majority of the scrutiny, while the impact of other professionals was just beginning to be explored" (p. 9).

"So although virtual schools may facilitate better instruction than the traditional classroom, there is no guarantee that this will happen" (Cavanaugh, Barbour, and Clark, 2009, p. 10). (comment--great quote for the rationale for projects such as ours!)

"Rather than using the individual standards as variables, we chose to code the standard areas. ... While it may be revealing to explore the presence of each individual standard in the literature, the body of literature appeared too limited for such examination at this time" (p. 10). (comment--we had our graduate students discuss the course reviews they conducted around those broad "standard areas" as well.)

"Across virtual schools, course-level decisions are not made in uniform ways or in ways that resemble such decision-making in physical schools. A continuum of course development responsibility is evident in virtual schooling. At one end, teachers and/or designers make all content and design decisions at the school level. At the other end, vendors make all content and design decisions, and the role of the schools is to purchase and distribute courses to students. Schools select their level of involvement in course development based on personnel, funding, time, and other factors" (Cavalluzzo, 2004). (comment--my heart started pounding in excitement when I read this paragraph. Be still, my beating heart.)

Conclusion--Areas of Future Research
--research into best practices in online teaching, particularly asynchronous teaching
--identification of characteristics that are necessary for adolescents to be successful in online learning environments, and remediation for students who lack these characteristics

"The second area is to improve upon the identification of characteristics that are necessary for adolescents to be successful in online learning environments and to provide remediation for students who are lacking these characteristics. The range of students enrolling in online learning opportunities is expanding (Barbour & Mulcahy, 2007; Cavanaugh, 2007). Yet the ability of virtual schools to support a broad range of student abilities appears to be limited. After describing the promising results associated with the use of the Educational Success Prediction Instrument (ESPRI), Roblyer (2005) stated that the next step in this line of inquiry is to create materials to assist in the remediation of those students whose ESPRI results indicated potential for problems. Rice (2006) also suggested that researchers need to continue the research into and development of prediction tools, such as the ESPRI" (p. 12). (comment--in establishing our context, we need to point out that in the case of LMS, there is no luxury to screen students; everyone is involved. This may create unique challenges for stakeholders. Possibility for future research topics in this area.)

Finally, the fourth area is to examine the quality of student learning experiences in virtual school environments, especially those of lower performing students. As stated earlier, the range of students enrolling in online learning opportunities is expanding. Scherer (2006) indicated that as the range of students with new and different needs expands, research is required to ensure that online learning is a realistic and accessible opportunity. Research studies investigating the online learning experience for lower performing students will assist personnel to design appropriate supports as this particular population of students continues to grow within virtual schools. (p. 13)


**Roblyer, M. D., & Davis, L. (2008). Predicting success for virtual school students: Putting research-based models into practice. //Online Journal of Distance Learning Administration, 6//(4).**

"As the virtual schooling movement gains momentum and states increase their virtual schooling offerings, virtual school populations will increase in both size and diversity of students. Equal opportunity and equity requirements will make it impossible for most schools to select only certain students to take online courses, so the emphasis will be on strategies to support students in ways that help promote retention and success in virtual courses." (contextual, background, rationale section)


**Suber, P. (2004). Very brief introduction to open access. Retrieved from http://www.earlham.edu/~peters/fos/brief.htm**

According to Peter Suber (2004), professor of philosophy at Earlham College, "Open-access (OA) literature is digital, online, free of charge, and free of most copyright and licensing restrictions." Why Cavanaugh et al. chose to look at OA literature is beyond me. Ideas, anyone?


===Research to Pull as a Result of This Activity===

Cavalluzzo, L. (2004). //Organizational models for online education//. Alexandria, VA: CNA Corporation. Retrieved from http://www.cna.org/sites/default/files/P&P109.pdf

Darling-Hammond, L. (2000). Teacher quality and student achievement: A review of state policy evidence. //Education Policy Analysis Archives, 8//(1).

Lance, K., & Loertscher, D. (2005). //Powering achievement: School library media programs make a difference: The evidence// (3rd ed.). Salt Lake City, UT: Hi Willow Research and Publishing. (The Cavanaugh article didn't mention Loertscher in the citation above.)

O'Dwyer, L., Carey, R., & Kleiman, G. (2007). A study of the effectiveness of the Louisiana Algebra I online course. //Journal of Research on Technology in Education, 39//(3), 289-306. (The Cavanaugh article above didn't mention the first two authors here. Hopefully, this is the correct citation and they didn't leave off the one to which they are referring.)

DiPietro, M., Ferdig, R. E., Black, E. W., & Preston, M. (2008). Best practices in teaching K-12 online: Lessons learned from Michigan Virtual School teachers. //Journal of Interactive Online Learning, 7//(1). Retrieved May 30, 2008, from http://www.ncolr.org/jiol/issues/getfile.cfm?volID=7&IssueID=22&ArticleID=113

Roblyer, M. D. (2005). Who plays well in the virtual sandbox? Characteristics of successful online students and teachers. //SIGTel Bulletin, 2//.