
Discussion Prompt -- D1

**Topic**

Broadly, my topic is K-12 online learning. As I narrow it, the topic will focus on how to design, develop, implement, and evaluate quality online (including hybrid) courses for secondary students (grades 6-12). Below, I provide some insight into why I am interested in this topic.

**Context and Rationale**

As K-12 schools shift to delivering instruction online, the need to gauge the quality of course design and delivery increases. In the case of Luella Middle School (Henry County Schools, Georgia), the principal has mandated that all teachers, grades 6 through 8, migrate the delivery of their face-to-face courses to an increasingly online format via the //Angel// learning management system (LMS). Teachers use their professional development and post-planning days to learn the LMS, and they build content on a weekly basis throughout the school year. Across the school, teachers' technical skills vary, as do their perceptions of the value of teaching in a hybrid format. As a result of these and other factors, the quality of online content across courses is inconsistent.

The same school district offers supplemental online courses through its Henry County Online Academy. County high school students can enroll in these courses to supplement their current school regimen, with options such as Spanish 1, English 2, and US History. As with Luella Middle School, Henry County Online Academy administrators have identified a need for gauging the quality of the online courses they are offering.

Through a partnership between the university and the school district, graduate students enrolled in MEDT 7472: Intro to Distance Education reviewed nine Henry County online or hybrid courses during the spring and summer 2010 semesters. They reviewed the courses against the //National Standards of Quality in Online Courses// instrument, as well as an instrument designed to gauge coverage of the Georgia Performance Standards.

**Connection to Reading**

I had my topic formed before I completed the reading, so here I would like to share some insights I gleaned from it. First, from the textbook, I took great comfort in the idea that we can learn how to research if we systematically apply ourselves to studying the toolset. Sometimes we allow a concept or body of knowledge (in this case, research) to tower over us and intimidate us, when really, if we just break it down into steps, it becomes something we can learn, given that we take the time. So I am glad to have broken the ice, and I am excited about this course of study. It will be an opportunity to ask myself whether I see myself working in a research environment as I move forward professionally.

Speaking of things professional, I found Huff's idea that my chosen conversation should fit my "scholarly identity and career trajectory" (Huff, 2009, p. 2) to be compelling and slightly disturbing. It sounds nice, and I think I agree with the idea, but my scholarly identity is pre-nascent at best, and I don't picture a "trajectory" when I think about my work. It feels as if each year I work on different projects, making up a patchwork quilt of things, but cohesion of purpose may be lacking. I know that in her book, Huff is coaching us on how to pull it all together better, and while at times I think she's speaking to the folks upstairs (the doctoral students and the newly minted Ph.D.s), we Ed.S. students have a lot to learn from her messages. As we identify the conversation in which we would like to take part, Huff (2009) warns against making too many connections to other works and references. Knowing myself and the global nature of my brain, which sees linkages to everything, I realize I will have to self-monitor closely this semester as I pull articles for our weekly discussions. There simply won't be time for tangential matters, and that is something I will keep in the front of my mind.

Perhaps my favorite point from Huff's (2009) discussion of the literature review was when she states "It is very important to recognize that a literature review is not a report on what //you// know but what the //audience// of the output you are preparing needs to know. This is an important point of transition from being a student to being a scholar" (p. 150). When we join a "conversation," we need to remember this. We're not talking to our professors anymore. We're talking with scholars in a field of study. What do they need to know?

The final reading, Donaldson and Knupfer's (2002) book chapter, was an excellent overview of technology integration history, theory, and current issues. I enjoyed the section on constructivism, as that is an area I find intriguing. How do we build constructivist environments online? I will continue to use this resource both this semester in //Research// and in the courses I teach on technology integration (the list of references is very useful, too!).

Thanks for reading.

Kim

P.S., The subject heading should read "Lastname-Topic/Issue." My original interpretation of that was that mine should be "Huett-K12 Online Learning," but I opted to follow the example set by the rest of you ("Huett-Topic/Issue"). I figure the second way is correct...please let me know otherwise. Thanks!

**References**

Donaldson, J. A., & Knupfer, N. N. (2002). Education, learning, and technology. In P. Rogers (Ed.), //Designing instruction for technology-enhanced learning// (pp. 19-54). Retrieved from http://www.netlibrary.com/summary.asp?id=66126

Huff, A. (2009). //Designing research for publication//. Thousand Oaks, CA: Sage Publications, Inc.

Johnson, B., & Christensen, L. (2007). //Educational research: Quantitative, qualitative, and mixed approaches// (3rd ed.). Boston: Allyn and Bacon.

**Article #1** In //Establishing a quality review for online courses//, Chao, Saj, & Tessier (2006) detail a 5-month pilot study carried out by the Centre for Teaching and Educational Technologies (CTET) at Royal Roads University (RRU) in British Columbia, Canada, in the fall of 2004. The university, established in the mid-1990s, serves adult professionals, offering graduate degrees in a hybrid format. The pilot sought a way for CTET to ensure quality standards in its online course offerings, and to do so across an institutional landscape of changing curriculum and personnel (Chao, Saj, & Tessier, 2006).

The authors suggest that a comprehensive framework is needed when measuring online course quality. Their proposed framework consists of six elements: curriculum design, teaching and facilitation, learning experience, instructional design, web design, and course presentation (Chao et al., 2006). In the pilot study, however, they focused solely on the last three: instructional design, web design, and course presentation. Working in teams of three, reviewers evaluated 18 courses; on average, a team needed ten hours to complete the review of a single course. A 4-point scale was used to rate course quality, allowing a range of responses from unsatisfactory to very satisfactory.
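To make the rating scheme concrete, here is a minimal sketch of how a team's ratings on the 4-point scale might be recorded and averaged per element. The function and variable names are my own illustration, not taken from Chao et al. (2006); the scale labels and the three piloted elements are from the article.

```python
from statistics import mean

# 4-point scale from the pilot: 1 = unsatisfactory ... 4 = very satisfactory
SCALE_LABELS = {1: "unsatisfactory", 2: "needs improvement",
                3: "satisfactory", 4: "very satisfactory"}

# The three elements reviewed in the pilot study
ELEMENTS = ("instructional design", "web design", "course presentation")

def summarize_review(team_ratings):
    """Average each element's 1-4 ratings across a review team.

    team_ratings: dict mapping element name -> list of ratings,
    one rating per reviewer (three reviewers per team in the pilot).
    """
    return {element: round(mean(ratings), 2)
            for element, ratings in team_ratings.items()}

# One course, rated by a three-person team on each element.
review = {
    "instructional design": [4, 3, 4],
    "web design": [2, 2, 3],
    "course presentation": [3, 3, 3],
}
print(summarize_review(review))
```

A breakdown like this mirrors the pilot's finding that a single course can rate well on one element and poorly on another.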

Results showed that courses varied across the three elements under review: one course might rate well on instructional design but poorly in another area, such as web design. Another course might excel in two areas and need improvement in the remaining one. Breaking down the reviews allowed CTET to prescribe customized revisions to each course, pulling in only those personnel needed to help make changes. The overall pilot effort was fruitful in that courses were improved. The pilot was limited, however, because reviews were conducted on static (i.e., not live) courses, which prevented reviewers from seeing the course elements in action with teachers and students. Chao et al. (2006) also point out that the pilot would have been more robust had it also reviewed curriculum design, teaching and facilitation, and learning experience. Finally, the authors conclude that the process was very time-consuming and that a new model allowing for more efficient course review is needed.

**Article #2** In //Getting it right gradually: An iterative method for online instruction development//, Kranch (2008) offers an instructional design model called the Individual Iterative Instructional Design (ID3M) model. The main audience for this model is the "individual developer/presenter" (p. 30). An individual developer/presenter is a person who designs, develops, and delivers (i.e., teaches) the instruction. It is assumed that this person works in a context with little support for instructional design tasks, and that it is necessary for such a person to be able to design instruction concurrently with teaching tasks. This suggests the model could be useful in a K-12 setting, where teachers work as individual developer/presenters with little in the way of outside support.

The model comprises four iterations. In the first iteration, instruction is developed, beginning with analysis of learning aims and development of assessments. The next part of the first iteration involves analyzing learners and developing instruction. The final part involves delivering and assessing the instruction. The second, third, and fourth iterations follow the same series of steps as the first, but the workload lessens as the instruction, assessed at the end of each iteration, becomes better developed.
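The repeating structure described above can be laid out as a simple schedule. This is only my shorthand sketch of the process as Kranch (2008) describes it; the function name and step wording are mine, not the author's.

```python
# The three steps that repeat within each ID3M iteration,
# paraphrased from Kranch's (2008) description.
STEPS = (
    "analyze learning aims and develop assessments",
    "analyze learners and develop instruction",
    "deliver and assess the instruction",
)

def id3m_schedule(iterations=4):
    """Lay out the ID3M process as (iteration, step) pairs.

    Each iteration repeats the same three steps; in practice the
    workload lessens as the instruction matures from one iteration
    to the next.
    """
    return [(i, step)
            for i in range(1, iterations + 1)
            for step in STEPS]

schedule = id3m_schedule()
# 4 iterations x 3 steps = 12 scheduled activities
assert len(schedule) == 12
```

For an individual developer/presenter, such a schedule makes explicit that design work is spread across teaching cycles rather than completed up front.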

Having read this article, I am not sure I want to use it for my research proposal. It was a good article, and I learned a few things about instructional design, but I realize now that it is a bit off topic. I will keep it for now, but as I continue researching, I will look for alternatives.

**References**

Chao, T., Saj, T., & Tessier, F. (2006). Establishing a quality review for online courses. //EDUCAUSE Quarterly//, //29//(3), 32-39.

Kranch, D. (2008). Getting it right gradually: An iterative method for online instruction development. //Quarterly Review of Distance Education//, //9//(1), 29-34.


**Research I Didn't Use but Should Look At**

Abdous, M., & He, W. (2008). Streamlining the online course development process by using project management tools. //Quarterly Review of Distance Education//, //9//(2), 181-188.
