Author: Alexandra Pickett

Adapting seminar courses

Getting started

In a small seminar, learners read or watch assigned materials before class, reflect on them, and connect them to the major themes of the course. Learners then come to class prepared to make sense of what they have read with the group, engaging in deep reflection and discussion. The instructor might clarify misconceptions through lectures.

Add learning content

Readings and other media

List required readings. You can upload a file to a course module in your LMS listing its objectives and required readings, or you can create a course page and list the information there.

Incorporate electronic options when available. Check with your instructional designer (ID) or campus librarian to learn more about electronic materials from academic publishers that can be integrated into your course.

Think beyond readings. There might be podcasts, documentaries, YouTube videos or TED Talks that support your learning goals. If prominent scholars or professional societies in your field are on social media, you might ask learners to “follow” them for additional insights.

Use library resources. Avoid uploading PDFs of assigned journal articles to your modules. Not only does this risk copyright infringement, it also deprives the library of information about which journals or databases are being accessed, which could ultimately cause those journals to be dropped from the subscription list. Instead, provide complete citation information so learners can retrieve the articles themselves.

You can also create a reading list in your LMS and/or work with a librarian to set up electronic reserves, which can be made available to course participants.


Create activities and assessments

Once you have established what learners need to learn and provided them with learning materials, the next step is to add activities to support them in learning and assessments that allow learners to demonstrate what they have learned.

There is overlap between “activity” and “assessment” in an online course, as the activities that support learning can also be used as low-stakes assessments.

Discussions

As with lectures, online class discussions can be synchronous or asynchronous.

Enhancing discussions with technology

Use the following web-conferencing features to keep synchronous sessions interactive:

  • Breakout rooms allow students to discuss topics in smaller groups. You can create the rooms after the session has started, or pre-assign them.
  • Polls provide course participants a chance to respond quickly and anonymously to questions posed during the session.
  • Enable screen-sharing during the session so students can share their own work.
  • Allow students to annotate the whiteboard, or a slide you’ve shared. For example, share a slide of a chart or graphic organizer, and ask students to use the web-conferencing platform annotation tools to complete it.

LMS Discussion tools also have features that can add new dimensions to the conversation:

  • Use the rich content editor in your LMS to format the text of your discussion prompt, add links (to other course pages or external webpages), and embed videos.
  • Encourage learners to use these formatting tools as well.
  • Embed a media prompt (a diagram, video, etc.) for learners to respond to.
  • Give learners the option of using the media tools in the rich content editor to post responses.
  • For learners who are more comfortable speaking than writing, this provides a means for them to respond more fluently. In a language course, this allows you to assess learners’ pronunciation, grammar, etc.

Designing discussions for conversation, reflection and critical thinking

Regardless of modality, plan discussions that learners enjoy participating in—and that you enjoy assessing.

Make sure discussion prompts don’t have right answers. If all learners can post the same “correct” response to your prompt, this is not actually a discussion.

Ask learners to respond to assigned readings. They can share their takeaways and how they might apply what they learned to future research or practice. They can share any questions that remain or any “ah-ha!” moments that occurred. If appropriate, they might even share emotional responses—did the reading inspire them to some action, or anger them?

Consider using social bookmarking/annotation tools like Diigo or Hypothes.is, which can be integrated into your LMS and instruction. You can ask course participants to share their notes on an assigned course reading, which can support them in understanding challenging material. You can also use these tools in conjunction with any of these 10 Activities to Foster Deep Reading in Digital Environments.

Present a case study and ask learners how they might respond. You could even create a case that unfolds over the course of the semester.

Have learners “test” each other on the material. One student starts the discussion by posing a question. The next student answers it and then poses a new question. The first student follows up by giving feedback on the second student’s response, providing references as appropriate.

Visit Project Zero’s Thinking Routines (from Harvard University) for many more creative ideas.

Papers

From short essays to multi-page research papers, writing assignments are often at the heart of a college course syllabus.

Using rubrics with written assignments can help clarify expectations for the assignment. For instructors, rubrics streamline the grading process and help ensure objectivity.
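
To make this concrete, here is a minimal sketch of a rubric represented as a small data structure, with a function that combines per-criterion ratings into a total score. The criteria, point values, and ratings are invented for illustration and are not drawn from any particular course or LMS:

```python
# A hypothetical writing rubric: criterion -> maximum points.
# Criteria and weights are illustrative only.
rubric = {
    "thesis and argument": 10,
    "use of evidence": 10,
    "organization": 5,
    "grammar and citations": 5,
}

def score_submission(ratings):
    """Combine per-criterion ratings (0.0 to 1.0) into a total score."""
    return sum(points * ratings.get(criterion, 0.0)
               for criterion, points in rubric.items())

total = score_submission({
    "thesis and argument": 0.9,
    "use of evidence": 0.8,
    "organization": 1.0,
    "grammar and citations": 0.8,
})
print(total)  # 9 + 8 + 5 + 4 = 26.0 out of a possible 30
```

Spelling out the criteria and weights this explicitly is what lets a rubric both set expectations for students and keep grading consistent across submissions.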

Provide feedback. Use the tools in your LMS for viewing student assignment submissions and providing feedback. You can read written submissions in the LMS and use the annotation tools in the platform to provide feedback within the document. Then, give feedback comments—written, multimedia, or as a file attachment—on the student’s work as a whole.

Presentations

Giving an effective presentation is a critical academic and professional skill: planning the content of the talk, designing slides or other visual aids, and speaking to an audience.

Asking learners to present in a live session is one way to provide the opportunity for an authentic experience and allows classmates to respond in the moment.

VoiceThread is a tool that provides an opportunity for learners to present asynchronously. They still must prepare and give a talk, but there is less pressure than presenting in front of a live audience, and the artifact persists beyond the end of the course, allowing for asynchronous peer review and interaction.

You could also ask course participants to record screencasts of their presentations. Recordings can easily be shared with you (as an assignment) or with the class (embedded in a discussion reply). Check with your campus ID to learn which tools are supported for this purpose.

The presentations could follow a format, such as Ignite or Pecha Kucha, that constrains the length and the number of slides permitted.

Electronic portfolios

Gathering course work into a portfolio can be an opportunity for online learners to reflect on their growth and learning during the course. For example, you could ask them to create a portfolio of their discussion contributions during the semester and ask them to find patterns, identify ideas that have evolved since the original posting, and so forth.

Check with your campus ID to see if your campus provides and supports a portfolio platform, or how you might achieve something similar in your course using LMS tools.

Other assessments

See the Alternative Assessments page for more ideas on how to assess your online learners.

Many faculty are concerned that online learners are more likely to cheat. See Tips for designing authentic online assessments to learn about teaching practices that can support academic honesty.


Build a learning community

Many faculty who move from face-to-face teaching to online instruction worry that their students will not develop a sense of community. However, there are steps you can take in your course design and teaching to cultivate student interaction and cohesion.

Instructor presence

If you want to build a learning community with your online students, the first step is cultivating instructor presence—being visible to and available to your students, and interacting with them as a person, not as disembodied chunks of text on a screen.

Some strategies for developing your instructor presence:

Introduce yourself to the class. Share a few personal details (family, pets, hobbies) in addition to your academic experience. Upload a photo to your LMS profile and in your welcome and introductory course pages.

Welcome learners to your course. Create a brief video (or VoiceThread) to go over the course requirements, and convey your own enthusiasm for the course and what you expect them to learn.

Regular announcements using the features of your LMS are a simple way to reach out to your course learners. Use announcements to provide a recap of content that was covered in the previous week and/or a preview of what’s coming in the week ahead. Remind them of upcoming due dates and events. Share relevant news items and current events. Definitely offer words of encouragement! Provide a mechanism for questions associated with announcements in case learners want to follow up on your remarks.

Create short, informal videos to make announcements or introduce lessons. You can use the rich content editor to record your video directly into your LMS. You can also create audio or video feedback using the tools in your LMS.

Communicate your availability to your students. In your syllabus and course information documentation, include your preferred means of contact and expected time frame for replying to questions and returning feedback on learner work. Host office hours in your web conferencing platform. Leverage groups and/or the LMS calendar, or other tools, to provide easy ways for learners to sign up for available meeting times with you.

Bring your voice and experience to the class. Most online communication is still written. It’s easy to fall into a formal, academic, third-person style when writing announcements and messages to students. Be professional but conversational, and use the second person (“you”) as often as possible.

Don’t be afraid to use yourself as an example. Talk about a discipline-related decision you had to make. Talk about how you applied a particular research method—why did you choose it for that particular study? How did it work? What mistakes did you make along the way, and how did you learn from those mistakes? When you connect the student learning to your own experiences, you show how it applies in the real world.

Learner engagement and connection

One element that differentiates a seminar course from a typical class is a sense of community. With intentional choices about your course design and activities, you can create the same community in an online class.

Set the ground rules for engagement. Make sure students understand that thoughtful, constructive debate is allowed (even encouraged!), but insults are not.

Start with an introduction or “ice breaker” activity. This can be as simple as a discussion thread in your LMS in which learners share a bit about themselves and why they are taking the course. You could also ask them to respond to a fun or thought-provoking question; for example, ask each learner to state two truths about themselves and one falsehood, inviting the rest of the class to guess the lie. Course participants could be asked to include a meaningful photo or to use the media recorder in the rich content editor to post a brief video.

In a web-conferencing session, you might ask each student to choose a thematic background picture (“a place you’d like to travel to,” or “something that begins with the letter M”).

Include a general discussion board open to learners for social conversation. You might invite (not require) them to share personal news on “What’s New Wednesday” or something motivational on “Feel Good Friday.”

Plan collaborative and peer-reviewed work. For group assignments, you will need to lay the groundwork:

  • Give group members a chance to get to know each other before beginning the actual project.
  • Consider asking each group to complete a contract: Preferred means of communication, available time to work on the project (especially if synchronous time is needed), how group members will address conflict or social loafing, and so forth.
  • If the task requires students to assume roles, ask students to establish those roles at the outset (e.g., “leader,” “reporter”).

As long as learners in a group reach a consensus on how to collaborate, give them flexibility. Make the LMS group tools available to them (including announcements, file sharing, and discussions), but let them know they can also meet via web-conferencing platforms, or even text each other (if all members agree to give out their mobile numbers).

When introducing an assignment with peer review, be sure to include a checklist or rubric to guide learners through the specifics of what they need to look for in the review.

Please see the following articles for additional tips:

Adapted from a resource originally found at https://keeplearning.umsystem.edu/instructors/adapting-seminar-courses (https://web.archive.org/web/20210530012023/https://keeplearning.umsystem.edu/instructors/adapting-seminar-courses)

OLC 2015 Effective Practice Award: THE OPEN SUNY COTE QUALITY REVIEW (OSCQR) PROCESS AND RUBRIC

Author Information

Author(s): Alexandra M. Pickett
Institution(s) or Organization(s) Where EP Occurred:

State University of New York, Open SUNY Center for Online Teaching Excellence

  • Broome Community College
  • Corning Community College
  • Dutchess Community College
  • Finger Lakes Community College
  • Fulton-Montgomery Community College
  • Herkimer County Community College
  • Jamestown Community College
  • Monroe Community College
  • Niagara County Community College
  • Onondaga County Community College
  • Rockland Community College
  • Schenectady County Community College
  • State University of New York at Stony Brook
  • State University of New York at Oswego
  • SUNY Brockport
  • SUNY Buffalo State
  • SUNY Delhi
  • SUNY Maritime College
  • SUNY Plattsburgh
  • SUNY Polytechnic Institute
  • SUNY Sullivan County Community College
  • University at Albany
  • University at Buffalo

Effective Practice Abstract/Summary

The Open SUNY Center for Online Teaching Excellence (COTE) has developed an openly licensed online course design rubric and process, free for anyone to use and adapt, that addresses both the instructional design and the accessibility of an online course. The aim of the Open SUNY COTE Quality Review (OSCQR) Rubric and Process is to support continuous improvement of the quality and accessibility of online courses, while also providing a system-wide approach to collecting data across campuses, institutions, departments, and programs that can inform faculty development and support large-scale online course design review and refresh efforts systematically and consistently.

Description of the Effective Practice:

The Open SUNY Center for Online Teaching Excellence (COTE) has developed an openly licensed online course design rubric and process, free for anyone to use and adapt, that addresses both the instructional design and the accessibility of an online course. The aim of the Open SUNY COTE Quality Review (OSCQR) Rubric and Process is to support continuous improvement of the quality and accessibility of online courses, while also providing a system-wide approach to collecting data across campuses, institutions, departments, and programs that can inform faculty development and support large-scale online course design review and refresh efforts systematically and consistently. The OSCQR rubric and process are currently being used by 23 SUNY institutions.

Working with multi-institutional teams of SUNY online instructional designers, librarians, distance learning directors, and technologists, Open SUNY COTE staff started with the Chico rubric, 20 years of SLN research-informed best online practices, and the SUNY Office of General Counsel’s memorandum on accessibility considerations, and conducted a gap analysis against Quality Matters, iNACOL, and Bb exemplary courses. The resulting rubric was also informed by the Community of Inquiry model (Garrison, Anderson, & Archer, 2000), the Seven Principles for Good Practice in Undergraduate Education (Chickering & Gamson, 1987), The Adult Learner (Malcolm Knowles, 1973), Bloom’s Taxonomy (Bloom et al., 1956), and How People Learn (Bransford et al., 1999), and mapped to the Open SUNY COTE fundamental competencies for online teaching. We are continuing to build out the citations and annotations associated with each standard in the rubric in the OSCQR Rubric Annotations and Refresh resources, where each standard is explained and supported by citations from the literature. A formal literature review is in progress.

There are two components to OSCQR:

  1. The OSCQR Process provides a Framework and Dashboard that supports a campus-tailored and scalable approach to improving the instructional design of online or blended courses.
    • The Framework includes:
      1. Course Review that results in an action plan to improve the design of the online course.
      2. Course Refresh that addresses the items targeted for improvement by the course review.
      3. Learning Review that identifies and prioritizes the next set of improvements for continuous quality improvement.
    • The campus Dashboard is the tool from which all the rubrics at a given campus (or in a program, or department) can be generated, customized, and managed.
      1. It provides automations for campus-level management of all course reviews, custom prioritization of the standards, and incorporation of additional standards. It can be used with any rubric and can assign different rubrics to different courses.
      2. Analytics are built in to aggregate and track all course review progress, identify and document course design issues and trends, and to aggregate information across courses to inform faculty development initiatives and course development planning.
  2. The OSCQR Rubric has 37 online course design standards and 37 accessibility standards. The Rubric is flexible and designed to be used in a variety of course quality assurance approaches.
    • By instructors and instructional designers in faculty development and course design professional development activities to inform and influence the design of new online courses.
    • By an individual instructor to self-assess and prioritize design improvements; to continuously review, revise and improve the instructional design of their existing online courses.
    • By an instructional designer to conduct a formal course review of an online course as part of an online course quality review process at the program, department, or institutional level.
    • As a peer review process, by a team of instructors interested in a peer-review model of online course review and continuous improvement (the teams can be made up of inter or intra disciplinary teams).
    • In a collaborative team model made up of a group of at least 3 people approaching the course review from their own specialized perspectives, e.g., instructional designer, course author, and external reviewers who might include other subject matter experts (faculty), an online librarian, a student, an instructional technologist, a multimedia designer, or other faculty.

A course review with the OSCQR Rubric produces an Action Plan that is framed from the perspective of the Community of Inquiry (CoI) model, to help reviewers assess and target opportunities to improve the course’s social presence, cognitive presence, and teaching presence, in addition to the overall online course educational experience. It substantively addresses accessibility. The OSCQR Accessibility Rubric is based on the recommendations of SUNY’s Office of General Counsel in their 2013 memo, “Accessibility Considerations in the wake of SUNY’s Online Initiatives.” The rubric has been reviewed by members of the FACT2 Accessibility Task Force, and addresses the legal considerations required for compliance with Section 508 of the Rehabilitation Act of 1973, New York State Enterprise IT Policy NYS-P08-005, Section 504 of the Rehabilitation Act, and the Americans with Disabilities Act.

The OSCQR Rubric is unique and differs from other online course quality rubrics in several ways. It is not restricted to mature online courses: the Rubric can be used formatively with new online faculty to help guide, inform, and influence the design of their new online courses. It is non-evaluative: conceptually, the rubric and process approach course review and refresh as a professional development exercise that guides faculty in improving course design from an effective-practices perspective, rather than as a course evaluation or quality assurance procedure. It prioritizes changes: the Action Plan, which is automatically generated by the course review process, presents recommendations for course design improvements based on the review and assists in prioritizing course revisions based on the estimated time to make those improvements. The rubric also provides suggestions for course design improvements for each standard that each reviewer can select from a menu of options to supplement reviewer feedback. The OSCQR Rubric can be customized, i.e., standards can be added, edited, and/or eliminated.
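
As a rough sketch of that prioritization idea, flagged standards can simply be ordered by the estimated time needed to address them, so that quick wins surface first. The standards and time estimates below are invented for the example and are not taken from the actual OSCQR tool:

```python
# Hypothetical course-review findings: standard -> estimated hours to fix.
# Not the real OSCQR implementation; values are invented.
flagged = [
    {"standard": "Re-record an inaccessible video lecture", "hours": 8.0},
    {"standard": "Add alt text to module images", "hours": 1.0},
    {"standard": "State feedback response times in the syllabus", "hours": 0.5},
]

def action_plan(items):
    """Order recommended revisions by estimated effort, smallest first."""
    return sorted(items, key=lambda item: item["hours"])

for item in action_plan(flagged):
    print(f'{item["hours"]:>4} h  {item["standard"]}')
```

An ordering like this lets an instructor make small improvements immediately while budgeting larger revisions for a future course refresh.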


Supporting Information for this Effective Practice

Evidence of Effectiveness:

The use of the Rubric informs the quality of the instructional design of online courses. It is used formatively to guide online course design, as well as summatively to continuously improve the quality and accessibility of online courses. High levels of adoption (23 campuses so far) and consistently high levels of satisfaction, confidence, and reported learning from both faculty and students are evidence of our effectiveness in the systematic use of online course quality standards in professional development activities, as well as course quality review processes.

Open SUNY COTE Spring 2015 online student survey
2469 student respondents from 42 SUNY campuses

  • Employment status
    • 71% were employed full or part time.
  • Online Experience
    • 32% were taking an online course for the first time.
    • 30% had taken 4 or more online courses.
    • 67% would take additional online courses.
  • Satisfaction
    • 76% were satisfied or very satisfied with their online course.
    • 76% indicated that they learned a great deal in their online course.

Open SUNY Spring 2015 online faculty survey
402 respondents to the Spring 2015 Faculty Survey.

  • Online courses can achieve student learning outcomes that are at least equivalent to those of in-person courses at MY institution.
    • 37% of SUNY faculty strongly agreed with this statement, and 66% indicated some level of agreement.
  • Online courses can achieve student learning outcomes that are at least equivalent to those of in-person courses in my discipline or department.
    • Overall, 44% of SUNY faculty expressed strong agreement with this statement, with another 22% expressing some form of agreement.
  • Online courses can achieve student learning outcomes that are at least equivalent to those of in-person courses in the classes I teach.
    • Over 52% of SUNY faculty expressed strong agreement, with another 19% expressing some form of agreement.
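
To put those percentages in absolute terms, they can be converted back into approximate headcounts. The arithmetic below uses the respondent totals reported above; the results are estimates because the published percentages are rounded:

```python
# Convert reported survey percentages into approximate respondent counts.
student_respondents = 2469  # Spring 2015 online student survey
faculty_respondents = 402   # Spring 2015 online faculty survey

satisfied_students = round(student_respondents * 0.76)    # satisfied or very satisfied
first_time_students = round(student_respondents * 0.32)   # first online course
strong_agree_faculty = round(faculty_respondents * 0.37)  # strongly agreed (institution item)

print(satisfied_students, first_time_students, strong_agree_faculty)
# roughly 1876, 790, and 149 respondents
```
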

How does this practice relate to pillars?

This practice relates to pillars in the following ways.

Access: Open SUNY is a system-wide initiative of the State University of New York (SUNY) designed to expand access to online learning and meet the increasing demands of the workforce throughout New York State and the world. Open SUNY launched in January 2014 with an initial wave of 6 SUNY campuses and 8 Open SUNY+ online degree programs. The second wave of Open SUNY+ programs launched in January 2015 with 17 campuses and 56 Open SUNY+ online degree programs. Central to Open SUNY are a set of signature elements based in quality assurance for online learning and a series of system-level supports for faculty teaching online, student access to NYS high needs online degree programs and student success, campus and system-wide infrastructure, and innovative instructional models.

Learning Effectiveness: Open SUNY Course Supports, a pillar of the Open SUNY Center for Online Teaching Excellence, has been designed to support learning effectiveness in online courses and degree programs with a flexible and comprehensive online course quality review and refresh process that is required of all online courses in Open SUNY+ online degree programs. The OSCQR rubric and process were developed to establish online course quality standards and a process to scale continuous online course quality improvement. Both students and faculty report high levels of engagement and learning in our annual surveys.

Faculty Satisfaction: When online faculty are well prepared to teach in well-designed online learning environments, and have opportunities to continuously improve both their online teaching practices and their online course designs, they have positive, satisfying experiences teaching online. Open SUNY COTE (formerly the SUNY Learning Network) is well known for success in large-scale online faculty development and high levels of online faculty satisfaction (Fredericksen et al., 2000). Today we are using the OSCQR Rubric in new online faculty development. We are also training online instructional designers and online faculty peer course reviewers to use it as a tool to review existing online courses and identify areas that need improvement, and we are issuing badges to online practitioners who receive training, who conduct reviews as a self-assessment or peer review, and whose online courses have been reviewed. Our annual faculty surveys continue to show high levels of faculty satisfaction.

Student Satisfaction: Open SUNY COTE (formerly the SUNY Learning Network) is well known for high levels of online student satisfaction and reported learning.

OLC 2015 Effective Practice Award: THE OPEN SUNY COTE QUALITY REVIEW (OSCQR) PROCESS AND RUBRIC

Author Information

Author(s): Alexandra M. Pickett
Institution(s) or Organization(s) Where EP Occurred:

State University of New York, Open SUNY Center for Online Teaching Excellence

  • Broome Community College
  • Corning Community College
  • Dutchess Community College
  • Finger Lakes Community College
  • Fulton-Montgomery Community College
  • Herkimer County Community College
  • Jamestown Community College
  • Monroe Community College
  • Niagara County Community College
  • Onondaga County Community College
  • Rockland Community College
  • Schenectady County Community College
  • State University of New York at Stony Brook
  • State University of New York At Oswego
  • SUNY Brockport
  • SUNY Buffalo State
  • SUNY Delhi
  • SUNY Maritime College
  • SUNY Plattsburgh
  • SUNY Polytechnic Institute
  • SUNY Sullivan County Community College
  • University at Albany
  • University at Buffalo

Effective Practice Abstract/Summary

The Open SUNY Center for Online Teaching Excellence (COTE) has developed an online course design rubric and process that addresses both the instructional design and accessibility of an online course that is openly licensed for anyone to use and adapt. The aim of the Open SUNY COTE Quality Review (OSCQR) Rubric and Process is to support continuous improvements to the quality and accessibility of online courses, while also providing a system-wide approach to collect data across campuses, institutions, departments, and programs that can be used to inform faculty development, and support large scale online course design review and refresh efforts systematically and consistently.

Description of the Effective Practice:

The Open SUNY Center for Online Teaching Excellence (COTE) has developed an online course design rubric and process that addresses both the instructional design and accessibility of an online course that is openly licensed for anyone to use and adapt. The aim of the Open SUNY COTE Quality Review (OSCQR) Rubric and Process is to support continuous improvements to the quality and accessibility of online courses, while also providing a system-wide approach to collect data across campuses, institutions, departments, and programs that can be used to inform faculty development, and support large scale online course design review and refresh efforts systematically and consistently. The OSCQR rubric and process are currently being used by 23 SUNY institutions.

Working with multi institutional teams of SUNY online instructional designers, librarians, distance learning directors, and technologists, Open SUNY COTE staff started with the Chico rubric, 20 years of SLN research-informed best online practices, the SUNY office of general counsel’s memorandum on accessibility considerations, and conducted a gap analysis with Quality MattersiNACOL, and Bb exemplary courses. The resulting rubric was also informed by the Community of Inquiry model (Garrison, Anderson, and Archer, 2000), The 7 Principles for Good practice in Undergraduate Education (Chickering & Gamson, 1987), The Adult Learner(Malcom Knowles, 1973), Bloom’s Taxonomy (Bloom et al., 1956) and How People Learn (Bransford et al., 1999), and mapped to the Open SUNY COTE fundamental competencies for online teaching. We are continuing the process of building out the citations and annotations associated with each standard in the rubric here: OSCQR Rubric Annotations and Refresh resources – each standard is explained and supported by citations from the literature. A formal literature review is planned and in progress.

There are two components to OSCQR:

  1. The OSCQR Process provides a Framework and Dashboard that supports a campus-tailored and scalable approach to improving the instructional design of online or blended courses.
    • The Framework includes:
      1. Course Review that results in an action plan to improve the design of the online course.
      2. The Course Refresh based on the things targeted for improvement by the course review.
      3. Learning Review that identifies and prioritizes the next set of improvements for continuous quality improvement.
    • The campus Dashboard is the tool from which all the rubrics at a given campus (or in a program, or department) can be generated, customized, and managed.
      1. It provides automations for campus-level management of all course reviews, custom prioritization of the standards, and incorporation of additional standards. It can be used with any rubric and can assign different rubrics to different courses.
      2. Analytics are built in to aggregate and track all course review progress, identify and document course design issues and trends, and to aggregate information across courses to inform faculty development initiatives and course development planning.
  2. The OSCQR Rubric has 37 online course design standards and 37 accessibility standards. The Rubric is flexible and designed to be used in a variety of course quality assurance approaches.
    • By instructors and instructional designers in faculty development and course design professional development activities to inform and influence the design of new online courses.
    • By an individual instructor to self-assess and prioritize design improvements; to continuously review, revise and improve the instructional design of their existing online courses.
    • By an instructional designer to conduct a formal course review of an online course as part of an online course quality review process at the program, department, or institutional level.
    • As a peer review process, by a team of instructors interested in a peer-review model of online course review and continuous improvement (teams can be interdisciplinary or intradisciplinary).
    • In a collaborative team model made up of a group of at least 3 people approaching the course review process from their own specialized perspectives, i.e., the instructional designer, the course author, and external reviewers that might include other subject matter experts (faculty), an online librarian, a student, an instructional technologist, or a multimedia designer.

A course review with the OSCQR Rubric produces an Action Plan that is framed from the perspective of the Community of Inquiry (CoI) model, to help reviewers assess and target opportunities to improve the course’s social presence, cognitive presence, and teaching presence, in addition to the overall online course educational experience. It substantively addresses accessibility. The OSCQR Accessibility Rubric is based on the recommendations of SUNY’s Office of General Counsel in their 2013 memo, “Accessibility Considerations in the wake of SUNY’s Online Initiatives.” The rubric has been reviewed by members of the FACT2 Accessibility Task Force, and addresses the legal considerations required for compliance with Section 508 of the Rehabilitation Act of 1973, New York State Enterprise IT Policy NYS-P08-005, Section 504 of the Rehabilitation Act, and the Americans with Disabilities Act.

The OSCQR Rubric is unique and differs from other online course quality rubrics in several ways. It is not restricted to mature online courses: the Rubric can be used formatively with new online faculty to help guide, inform, and influence the design of their new online courses. It is non-evaluative: conceptually, the rubric and process approach course review and refresh as a professional development exercise that guides faculty in improving course design from an effective-practices perspective, rather than as a course evaluation or quality assurance procedure. It prioritizes changes: the Action Plan, which is automatically generated by the course review process, presents recommendations for course design improvements based on the review and assists in prioritizing course revisions based on the estimated time to make those improvements. The rubric also provides suggestions for course design improvements for each standard, which each reviewer can select from a menu of options to supplement reviewer feedback. Finally, the OSCQR Rubric can be customized: standards can be added, edited, and/or eliminated.
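To make the prioritization idea concrete, here is a minimal sketch of how an Action Plan might order revisions by estimated effort, quickest fixes first. The standard names, ratings, and hour estimates are hypothetical illustrations, not drawn from the actual OSCQR tooling (which is implemented as a Google spreadsheet application).

```python
# Hypothetical sketch of Action Plan prioritization, in the spirit of OSCQR:
# standards that do not yet meet the rubric are sorted by estimated fix time.

# Each reviewed standard: (description, meets_standard, estimated_hours_to_fix)
review = [
    ("Course includes welcome and getting-started content", True, 0),
    ("Links to privacy policies for required tools are provided", False, 1),
    ("Text is formatted for readability (headings, lists)", False, 4),
    ("All course videos are accurately captioned", False, 12),
]

def action_plan(review):
    """Return the standards needing improvement, quickest fixes first."""
    todo = [(std, hrs) for std, ok, hrs in review if not ok]
    return sorted(todo, key=lambda item: item[1])

for standard, hours in action_plan(review):
    print(f"{hours:>3} h  {standard}")
```

Sorting by estimated time mirrors the rubric's intent of letting instructors bank quick wins before scheduling larger refresh work such as captioning.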


Supporting Information for this Effective Practice

Evidence of Effectiveness:

The use of the Rubric informs the quality of the instructional design of online courses. It is used formatively to guide online course design, as well as summatively to continuously improve the quality and accessibility of online courses. High levels of adoption (23 campuses so far) and consistently high levels of satisfaction, confidence, and reported learning from both faculty and students are evidence of our effectiveness in the systematic use of online course quality standards in professional development activities, as well as course quality review processes.

Open SUNY COTE Spring 2015 online student survey
2469 student respondents from 42 SUNY campuses

  • Employment status
    • 71% were employed full or part time.
  • Online Experience
    • 32% were taking an online course for the first time.
    • 30% had taken 4 or more online courses.
    • 67% would take additional online courses.
  • Satisfaction
    • 76% were satisfied or very satisfied with their online course.
    • 76% indicated that they learned a great deal in their online course.

Open SUNY Spring 2015 online faculty survey
402 respondents to the Spring 2015 Faculty Survey.

  • Online courses can achieve student learning outcomes that are at least equivalent to those of in-person courses at MY institution.
    • 37% of SUNY faculty strongly agreed with this statement, and 66% indicated some level of agreement.
  • Online courses can achieve student learning outcomes that are at least equivalent to those of in-person courses in my discipline or department.
    • Overall 44% of SUNY faculty expressed strong agreement with this statement with another 22% expressing some form of agreement.
  • Online courses can achieve student learning outcomes that are at least equivalent to those of in-person courses in the classes I teach.
    • Over 52% of SUNY faculty expressed strong agreement with another 19% expressing some form of agreement.

How does this practice relate to pillars?:

This practice relates to pillars in the following ways.

Access: Open SUNY is a system-wide initiative of the State University of New York (SUNY) designed to expand access to online learning and meet the increasing demands of the workforce throughout New York State and the world. Open SUNY launched in January 2014 with an initial wave of 6 SUNY campuses and 8 Open SUNY+ online degree programs. The second wave of Open SUNY+ programs launched in January 2015 with 17 campuses and 56 Open SUNY+ online degree programs. Central to Open SUNY are a set of signature elements based in quality assurance for online learning and a series of system-level supports for faculty teaching online, student access to NYS high needs online degree programs and student success, campus and system-wide infrastructure, and innovative instructional models.

Learning Effectiveness: Open SUNY Course Supports, a pillar of the Open SUNY Center for Online Teaching Excellence, has been designed to support learning effectiveness in online courses and degree programs with a flexible and comprehensive online course quality review and refresh process that is required of all online courses in Open SUNY+ online degree programs. The OSCQR rubric and process were developed to establish online course quality standards and a process to scale continuous online course quality improvement. Both students and faculty report high levels of engagement and learning in our annual surveys.

Faculty Satisfaction: When online faculty are well prepared to teach online in well designed online learning environments, and have opportunities to continuously improve their online teaching practices as well as their online course designs, they have positive, satisfying experiences teaching online. Open SUNY COTE (formerly the SUNY Learning Network) is well known for success in large-scale online faculty development and high levels of online faculty satisfaction (Fredericksen et al., 2000). Today we are using the OSCQR Rubric in new online faculty development. We are also training online instructional designers and online faculty peer course reviewers to use it as a tool to review existing online courses and identify areas that need improvement, and we are issuing badges to online practitioners who receive training, who conduct reviews as a self-assessment or as a peer review, and whose online courses have been reviewed. Our annual faculty surveys continue to show high levels of faculty satisfaction.

Student Satisfaction: Open SUNY COTE (formerly the SUNY Learning Network) is well known for high levels of online student satisfaction and reported learning (Fredericksen et al., 2000). Today, by formalizing online course quality standards in the form of the OSCQR Rubric, and systematically applying instructional design and accessibility standards to all Open SUNY+ online courses and degree programs, we have continued to see high levels of student satisfaction and reported learning in our annual surveys.

Scale (institutional commitment to achieve capacity enrollment via cost effectiveness): Now fully operationalized, Open SUNY has scaled and institutionalized the Open SUNY+ signature elements and supports and designed an online course review/refresh process that is open to all SUNY campuses, online programs, courses, and faculty to ensure that all SUNY online students have access to high quality accessible online courses.

Equipment necessary to implement Effective Practice:

No special equipment is necessary to implement OSCQR. What you will need are:

  • Copies of the rubric (available for download as a PDF)
  • A Google account (to manage courses or use the interactive rubric online)
  • Access to the OSCQR Dashboard (optional)

Estimate the probable costs associated with this practice:

The costs associated with the OSCQR Rubric and process are minimal at small scale, apart from any costs of training faculty or incentivizing course reviews. Conducting course reviews and making online course improvements to refresh course designs takes time, which can be significant depending on many factors. We estimate that a course review takes an individual 6-10 hours. The review and refresh of one online course is estimated at 20% of an FTE for one academic term (16 weeks). Conducting team course reviews with collaborative course refreshes that include librarians, technologists, and instructional designers, as well as the subject matter expert (SME), is resource intensive. OSCQR has been designed to flexibly support various implementation models, including use as a self-assessment in online faculty professional development, online instructional designer course review, peer faculty/SME course review, and a collaborative multidisciplinary team approach to online course review and refresh. The learning review part of the process, which reviews and plans the next set of course design improvements, involves the online instructor and instructional designer.
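As a rough worked example of these estimates, the 20%-of-an-FTE figure can be converted to hours. The 40-hour workweek used below is an assumption; the original does not specify one.

```python
# Rough effort arithmetic for the estimates above.
# Assumes a 40-hour workweek, which the source text does not state.

review_hours = (6, 10)    # individual course review, per the estimate above
fte_fraction = 0.20       # review + refresh, share of one FTE
term_weeks = 16           # one academic term
hours_per_week = 40       # assumed workweek

refresh_hours = fte_fraction * hours_per_week * term_weeks
print(f"Review alone: {review_hours[0]}-{review_hours[1]} hours")
print(f"Review + refresh: about {refresh_hours:.0f} hours over the term")
```

Under that assumption, the full review-and-refresh cycle works out to roughly 128 hours, which underscores why the refresh, not the review itself, dominates the cost.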

There is no license fee for use of the rubric. It is shared under a Creative Commons license: CC BY-NC-SA 3.0 US. Because the OSCQR Rubric is licensed under Creative Commons and the Dashboard is licensed under the LGPL, the entire process can be shared, used by anyone at no cost, and customized to address individual campus environments.

Technical Costs to self-host the OSCQR Rubric and Dashboard:
With a regular Google account, there is a one-hour computing time limit and a cap of 250 spreadsheets created per day. We suggest using two Google Apps for Business accounts – one for running the Dashboard, and one for the Self Serve Rubrics. With the limits of two separate business accounts, running out of computing time or hitting the daily spreadsheet maximum is unlikely.
Link for Google Apps for Business
Quotas for Google Apps for Business

We have one person responsible for the technical application design of the rubric and dashboard. Changing or improving the application’s design, analytics, or reports requires someone with Google spreadsheet programming skills. Help desk and troubleshooting support are also necessary for those accessing the interactive online rubric and dashboard.


References, supporting documents:

 Useful Links:


Other Comments:

We welcome inquiries.


Additional Contacts:
Alexandra M. Pickett (conceptual designer)
alexandra.pickett@suny.edu
Dave Ghidiu (application developer)
Dave.Ghidiu@suny.edu
Robert Piorkowski (operational architect)
Robert.Piorkowski@suny.edu
Dan Feinberg (project manager and accessibility rubric designer)
Dan.Feinberg@suny.edu
Erin Maney (course reviewer)
Erin.Maney@suny.edu


Contact(s) for this Effective Practice
Effective Practice Contact:

Alexandra M. Pickett (conceptual designer)

Email this contact:

alexandra.pickett@suny.edu

Effective Practice Contact 2:

Dave Ghidiu (application developer)

Email contact 2:

Dave.Ghidiu@suny.edu

Effective Practice Contact 3:

Robert Piorkowski (operational architect)

Email contact 3:

Robert.Piorkowski@suny.edu


Originally posted at: http://olc.onlinelearningconsortium.org/effective_practices/open-suny-cote-quality-review-oscqr-process-and-rubric
Retrieved from: https://web.archive.org/web/20160506030554/http://olc.onlinelearningconsortium.org/effective_practices/open-suny-cote-quality-review-oscqr-process-and-rubric
