Monday, June 02, 2014

course evaluation rubrics, blended learning, in Open SUNY, in the library, with a lead pipe

The last webinar in BlendKit2014 was ably co-facilitated by my FLCC colleague Dr. Trista Merrill. During the webinar, Trista referred to her campus's decision to adopt a forthcoming course evaluation rubric as a "mandate" from SUNY System Administration as part of Open SUNY. What follows is my opinion (and mine only; not yours, nor anyone else's) on this particular topic, in an attempt to clarify some details for those not deeply familiar with the deus ex machina that is SUNY.

First, a few words to clarify what SUNY is all about. Founded in 1948 (and incorporating many institutions of higher learning in existence well before that date), SUNY can best be thought of as a confederation. Now, what little you and I remember from US History I tells us that the Articles of Confederation ultimately failed due to (among other things) the lack of a strong centralized governance structure. Coincidentally, it also makes for a fun trick question in pub trivia games ("name the 'first president' of the newly formed United States").

Using the word "confederation" is a deliberate choice on my part to describe the level of local campus autonomy that exists within the system. "Mandates" (i.e., "thou shalt" decrees from the Chancellor) are fairly few and far between. To paraphrase Captain Jack Sparrow, there is no online learning code; it's more like a list of suggestions. Over half of SUNY's 64 campuses belong to (the artist formerly known as) the SUNY Learning Network (or SLN, or sometimes SNL if someone's typing in a hurry). Campuses may opt to work collaboratively with a team of instructional designers housed in SLN (from here on out, we're supposed to refer to this collection of support services as Open SUNY, but that'll likely muddle this posting even more, so hang in there, pilgrim). Conversely, many campuses such as mine are fully staffed and capable of providing faculty development support on their own.

So... now to the rubric in question. I will (im)modestly state that many years ago I did my best to draw our collective attention to the Quality Matters Project (still in its FIPSE-funded stage at that time). Eventually, System saw the wisdom in trying to bring QM to all of SUNY as a universal means of ensuring online course quality.

Any discussion you have with any ID anywhere, when asked about using QM, usually starts with "well... we have a modified version...". That's because 51 (or 53? I forget) points of quality is just a wee bit overwhelming to chuck at noob online faculty. So we end up with several localized versions in use, and we still get a very hefty annual invoice from QM to access and use their materials.

Now... pause, draw a breath, and consider Open SUNY. I'm not going to attempt to explain what "open" means in this particular context, so we'll skip ahead to the concept of what we're calling "plus" online programs. The "plus" programs in our first wave are drawn from campuses long established in fully online learning in SUNY (Empire State College, Delhi, SUNY Broome, FLCC, Stony Brook, and SUNY Oswego). Part of our charge is to review and potentially partially or fully redesign courses within these campuses' online degree portfolios (eight online programs from the six campuses named).

To that end, we're piloting what we're calling the "course refresh rubric," which draws on sources such as Chickering and Gamson, the CSU Chico rubric, and Judith Boettcher's research, among others. I want to be clear that in this pilot phase, the rubric is going to be deployed against fully online courses. It may well prove beneficial for evaluating blended courses as well, but the initial focus is measuring specific attributes of fully online courses.

And now you know...and knowing is half the battle.