Quality indicators for evaluating distance education programs at community colleges

by Hirner, Leo J., Ph.D., UNIVERSITY OF MISSOURI - COLUMBIA, 2008, 451 pages; 3371066

Abstract:

The continued rapid growth of online courses and programs in higher education has brought concerns regarding support services, learning resources, and effectiveness of instruction, as well as how institutions monitor the quality of online programs. These concerns have prompted questions about how well online instruction works and how participants perceive online learning. Such questions led Phipps and Merisotis (1999) to question the methodology of the body of research on online programs and to call for a process by which programs and institutions could be compared by academics or prospective students. Unfortunately, the concerns first identified by Phipps and Merisotis continue to persist (Hannafin, Oliver, Hill, Glazer, & Sharma, 2003; Sherlock & Pike, 2004).

These issues provided the impetus for this study, the goals of which were to identify quality indicators specific to community college online programs and to determine stakeholders' perceived importance of those indicators. A literature review identified common standards and best practices for online courses and programs developed by accrediting organizations and policy groups. The terms best practices, criteria, and standards are used interchangeably in the literature when discussing recommendations regarding practices and policies institutions should adopt for distance learning programs (Twigg, 1999a). Because one goal of the present study is to identify a set of indicators, the best practices, criteria, and standards from the literature provide a starting point for identifying possible indicators of quality.

Synthesizing these sources yielded five categories: institutional support, curriculum and instruction, faculty support, student support, and evaluation and assessment. A case was made for adding technical support as a sixth category. These categories guided the development of a Delphi study to identify potential indicators. Twenty distance education program administrators from community colleges and four-year institutions agreed to participate in the study; fifteen completed the initial survey, and thirteen completed the full process.

The potential items identified through the Delphi process were used to create a three-part stakeholder survey designed to collect perceived levels of importance for each potential indicator using the magnitude estimation technique. Participants could also recommend indicators not included in the survey, and demographic data were collected. The stakeholder survey was then distributed to students, faculty, technical support staff, and program administrators participating in online courses offered by a community college system in the Midwest.
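As an aside on the method: magnitude estimation asks each respondent to assign any positive number to an item in proportion to its perceived importance relative to a reference item, and analyses typically normalize each respondent's numbers before comparing items. The study's actual scoring procedure is not reproduced here; the sketch below is only a minimal illustration of that general idea, and the item names, sample ratings, and geometric-mean normalization are assumptions made for the example.

    import math

    # Hypothetical magnitude-estimation responses: each respondent assigns a
    # positive number to every potential indicator, proportional to its
    # perceived importance relative to a reference item.
    responses = {
        "respondent_1": {"24/7 helpdesk": 100, "faculty training": 150, "course evaluation": 80},
        "respondent_2": {"24/7 helpdesk": 10,  "faculty training": 20,  "course evaluation": 5},
    }

    def geometric_mean(values):
        """Geometric mean of a list of positive numbers."""
        return math.exp(sum(math.log(v) for v in values) / len(values))

    # Normalize each respondent's ratings by that respondent's geometric mean,
    # so that raters who use large numbers and raters who use small numbers
    # become comparable.
    normalized = {}
    for person, ratings in responses.items():
        gm = geometric_mean(list(ratings.values()))
        normalized[person] = {item: value / gm for item, value in ratings.items()}

    # Summarize each item across respondents with a geometric mean of the
    # normalized ratings: higher values indicate higher perceived importance.
    items = next(iter(responses.values())).keys()
    for item in items:
        score = geometric_mean([normalized[person][item] for person in normalized])
        print(f"{item}: {score:.2f}")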

The perceived importance ratings collected through the stakeholder survey did not suggest that any Delphi item should be eliminated, and the relatively equal perceptions of importance across the stakeholder groups provide validation for the results of the Delphi study.

A third research step was added to refine the results of the Delphi process, which included a mix of potential indicators, factors, and other measures. A group of distance learning experts, identified through their scholarly research and professional activity, was asked to review the Delphi items and classify each as a factor or an indicator according to the following definitions: indicators are outputs that an organization can point to as signs of success, and factors are inputs the institution consciously provides in support of its program.
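To make the indicator/factor distinction concrete, the sketch below shows one way such an expert review round could be tallied, assuming a simple majority rule over the experts' labels. The item names, sample judgments, and majority threshold are hypothetical and are not taken from the study.

    from collections import Counter

    # Hypothetical expert judgments: each expert labels every Delphi item as an
    # "indicator" (an output the institution can point to as a sign of success)
    # or a "factor" (an input the institution provides in support of its program).
    judgments = {
        "course completion rate": ["indicator", "indicator", "factor", "indicator"],
        "24/7 technical support": ["factor", "factor", "factor", "indicator"],
        "student satisfaction":   ["indicator", "factor", "indicator", "factor"],
    }

    for item, labels in judgments.items():
        counts = Counter(labels)
        label, votes = counts.most_common(1)[0]
        if votes / len(labels) > 0.5:
            print(f"{item}: classified as {label} ({votes}/{len(labels)} experts)")
        else:
            print(f"{item}: no majority, flagged for further review")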

Results from this study identify where and how an institution might look for data when measuring the effectiveness of its online programs and services. The potential indicators and factors identified in these three studies represent parameters for examining how an institution supports its programs, or for comparing programs across institutions. What these items do not address is how an institution uses the data it collects on its programs.

Adviser: Thomas Kochtanek
School: UNIVERSITY OF MISSOURI - COLUMBIA
Source Type: Dissertation
Subjects: Community college education; Educational administration; Educational technology; Higher education
Publication Number: 3371066
