
Discussion Prompt -- D6

**Starts September 23, Thursday and ends September 29, Wednesday. Initial postings are due September 25, Saturday.**

For this discussion, first you need to read chapters 17, 18 and 19.

**PART ONE. Huett -- Articles**

Summarize each article for its general contents and share them with your peers in this discussion board (150-300 words each). Identify, describe, and discuss the data collection strategy or data collection tool/instrument used in these research articles.

**Article 1**

In //Developing a survey to measure best practices of K-12 online instructors//, Black, DiPietro, Ferdig, and Polling (2009) detail the creation of a quantitative survey, developed for another study (see my Discussion 4 post related to DiPietro et al.), to measure online teacher quality. By identifying and gathering data on the qualities and practices of 16 successful (as the authors operationally defined the term) online teachers in Michigan, they had a basis for conducting a qualitative analysis. In particular, the qualitative analysis took the form of "constructivist grounded theory," which entailed the techniques of "coding data, using a constant comparative method, and data synthesis" (Black, DiPietro, Ferdig, & Polling, 2009). From their efforts, they identified a list of twelve personal characteristics and twenty-three pedagogical strategies on which to base their quantitative survey. "The goals of such a survey would be to: a) validate the characteristics developed from a smaller sample size; and b) to use an instrument to be able to assess current professional development needs of existing virtual school teachers" (Black et al., 2009).

The researchers surveyed 53 virtual school teachers (from the same school from which the earlier "successful" 16 instructors were sampled, and not including those 16). Using SPSS, "Cronbach alpha procedures and descriptive statistics were calculated" (Black et al., 2009).
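The article names the Cronbach alpha procedure without showing it. As a rough from-scratch illustration of what that reliability calculation involves (using made-up Likert-style scores, not the study's data or SPSS's output):

```python
def cronbach_alpha(items):
    """Estimate internal consistency (Cronbach's alpha) for a survey scale.

    items: one inner list per survey item, each holding that item's
    scores across all respondents (same respondent order in every list).
    """
    k = len(items)                      # number of items in the scale
    n = len(items[0])                   # number of respondents

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across all items.
    totals = [sum(item[r] for item in items) for r in range(n)]

    item_variance_sum = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_variance_sum / variance(totals))

# Hypothetical scores: 3 survey items answered by 4 respondents.
scores = [
    [4, 3, 5, 2],
    [4, 2, 5, 3],
    [5, 3, 4, 2],
]
print(round(cronbach_alpha(scores), 3))  # → 0.892
```

An alpha near 1 means the items vary together across respondents (they appear to measure the same construct); values around 0.7 or higher are conventionally treated as acceptable.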

Interestingly, the results showed that the most pressing professional development needs identified by the respondents were "the development of new technology-based skills, new methods for finding and evaluating resources for use with online classes, and content-based technology integration" (Black et al., 2009).

**Article 2**

In //Online learning and quality assurance//, Zygouris-Coe, Swan, and Ireland (2009) describe the implementation of a program called Florida Online Reading Professional Development (FOR-PD) and the way in which quality assurance checks (QACs) improved learning for its students (practicing teachers undergoing professional development). The use of QACs was time-consuming and expensive, but the researchers argue that it has been worth it in terms of program improvements. From fall 2003 to the time of the article's publication, more than 37,000 students had enrolled in the FOR-PD program. The researchers used outcomes data related to participants' performance, end-of-course student evaluations of facilitator effectiveness, and data from the QACs to iteratively improve the program.

To put all of this in plainer language, lots of Florida teachers had to get a reading endorsement through this online program. The online courses were taught by "facilitators." To add another layer to this, a person called a "quality assurance check specialist" monitored the course as well, providing a more data-driven perspective on what was going on in the class. The QAC specialist would alert the facilitator with performance feedback and make recommendations on how to run the class better. In some semesters, the QAC specialists were more involved than in others. What the researchers learned from constant collection and analysis was that at times the specialists were a little //too// involved in the classes: high-performing facilitators misunderstood the involvement of the QAC specialists and began to question their own value to the FOR-PD program. So, in following semesters, QAC specialist oversight was reduced. Again, the program is constantly tweaked based on continuous data analysis and the use of the QACs.

**References**

Black, E., DiPietro, M., Ferdig, R., & Polling, N. (2009). Developing a survey to measure best practices of K-12 online instructors. //Online Journal of Distance Learning Administration, 12//(1).

Zygouris-Coe, V., Swan, B., & Ireland, J. (2009). Online learning and quality assurance. //International Journal on E-Learning, 8//(1).

**PART TWO. Huett -- Analysis to be Collected**

Share the data collection strategy/tool/instrument you plan to use and explain your rationale for choosing it (150-300 words).

For data collection, I plan to use the graduate student reviewers' //VoiceThreads//, which were created in spring and summer 2010. First, to transcribe the data, I will create a two-column table. In the first column, I will place the audio text (spoken by the graduate student reviewers). In the second column, I will place the text copied from the presented //VoiceThread// slides. Then, along with at least two other coders, I will segment the data into units. Next, I will code the data "with symbols, descriptive words, or category names" (Johnson & Christensen, 2007, p. 534). As we code, we will generate a "master list," which is "a list of all the codes used in a research study" (Johnson & Christensen, 2007, p. 535). As new codes are added to the master list, they should be reapplied to the rest of the data. As we code, we will want to ensure intercoder reliability, "consistency among different coders" (Johnson & Christensen, 2007, p. 536), as well as intracoder reliability (perhaps the more difficult one: consistency with myself!). I suspect that we will use inductive codes rather than a priori codes.
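To make the intercoder reliability check concrete, here is a minimal sketch (with hypothetical codes and coders, not data from my study) of two common agreement measures: simple percent agreement, and Cohen's kappa, which corrects for agreement expected by chance:

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Fraction of units to which two coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected intercoder agreement (Cohen's kappa)."""
    n = len(coder_a)
    observed = percent_agreement(coder_a, coder_b)
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    # Expected agreement if both coders assigned codes independently
    # at their observed rates.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes applied by two coders to eight data units.
coder_1 = ["praise", "question", "praise", "direction",
           "praise", "question", "direction", "praise"]
coder_2 = ["praise", "question", "direction", "direction",
           "praise", "praise", "direction", "praise"]

print(percent_agreement(coder_1, coder_2))  # → 0.75
print(cohens_kappa(coder_1, coder_2))       # → 0.6
```

In this toy example the coders agree on 6 of 8 units, but kappa discounts that to 0.6 because some matches would happen by chance; disagreements like these would send us back to refine the code definitions on the master list.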

For contextual background, please visit http://medt8484.wikispaces.com/Topic.

**References**

Johnson, B., & Christensen, L. (2007). //Educational research: Quantitative, qualitative, and mixed approaches// (3rd ed.). Boston: Allyn and Bacon.

**NOTE: Both initial postings are due September 25, Saturday at 11:59 pm.**


**PART THREE.** As part of the collaborative learning process, you are expected to discuss with your peers the ideas and summaries shared in this discussion board. Your active participation should be characterized by the following:

1. Each student should post MORE THAN SEVEN (7) comments or replies to their peers’ postings.

2. At least 3 of these comments or replies should focus on SIMILARITIES between peer postings and those posted by the student.

3. At least 3 of these comments or replies should focus on DIFFERENCES between peer postings and those posted by the student.

4. At least one comment or reply should focus on lessons learned from peers' postings that apply to the student's approach to finding a solution to his/her research problem.

5. At least one comment or reply should include a question about how the articles shared by another student relate to the student's approach to finding a solution to his/her research problem.

6. Each comment or reply to a posting should address by name the peer who is receiving it. This is important for generating a sense of community by addressing by name those who are involved in the online conversation.

**NOTE: This discussion closes on Wednesday, September 29 at 11:59 pm.**

Expected Outcomes

1. Included the following in the subject heading: "Lastname-Articles" for the first initial posting; "Lastname-Analysis to be collected" for the second initial posting.

2. Posted two initial postings.

3. Posted more than 7 comments or replies: at least 3 focused on similarities, another 3 focused on differences, and at least one focused on the value of peers' postings to the student's approach to finding a solution to his/her research problem.

4. Included PDF copies of the articles reviewed.

5. Used proper citations in APA format (6th edition).

**Finally, please check the assessment form for this discussion that is located in the appropriate learning module.**
