
Making the Pedagogy Visible with Morea


Photo of a commercial building under construction.

Early on in my teaching position, before I was directly involved with the maintenance of the LMS, I found myself reworking the content in a course that I suspect had become a catch-all for any computer-related skills someone thought first-year engineering students ought to have. There were effectively no course objectives. The official catalog listed something, of course, but I doubt anyone had paid much attention to it in years.

As a new faculty member in a unit that used its own LMS, I was making significant changes, to both content and structure, for the first time. The content-management side of our LMS was, and still is, lacking, especially for widespread edits across multiple learning topics and for moving topics around. I was looking for a better authoring environment that could publish pages alongside the existing administrative pages. I was also looking for some kind of structure to attach the reworked content to.

I discovered the Morea framework through a CS education listserv that I had forgotten I was on. It offers both a way of thinking about curricular content and a static-site publishing system built around Jekyll. While I appreciated the static-site approach, it did not mesh well out of the box with our existing system and workflow; the way of thinking about and structuring content, however, was something I wanted to work with in our own system.

Diagram: a Morea module and its associated outcomes, readings, experiences, and assessments.

Morea structures instruction around modules, which are suggested to be more-or-less self-contained and to span one to two weeks. Each module has associated outcomes, readings (passive learning material), experiences (active learning material), and assessments. Each type of material is labeled as such, and thus a Morea course "makes the pedagogy visible" to learners as well as instructors. That labeling also provides a helpful way to assess the balance of a curricular unit: for example, we would want to make sure there is sufficient active content interspersed with the passive content, and we would hope both content and assessments were aligned with the published outcomes.
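To make that structure concrete, here is a minimal sketch of how the labeled content types might be represented outside of Morea itself. The class and field names are hypothetical; they simply mirror Morea's vocabulary and are not part of the framework or its actual file format.

```python
from dataclasses import dataclass, field
from enum import Enum

class ContentType(Enum):
    READING = "reading"        # passive learning material
    EXPERIENCE = "experience"  # active learning material
    ASSESSMENT = "assessment"

@dataclass
class ContentItem:
    title: str
    content_type: ContentType
    outcomes: list[str] = field(default_factory=list)  # outcome ids this item supports

@dataclass
class Module:
    title: str
    outcomes: list[str] = field(default_factory=list)  # published learning outcomes
    items: list[ContentItem] = field(default_factory=list)
```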

Without using the technical framework, I didn't have the generated summary views to get a sense of how the active and passive content mingled, or the generated tag pages that, for example, would list all of the content for a particular learning outcome, but it was still a useful way to think about structuring material. It is an idea I would like to integrate into our current LMS. I envision some sort of soft validation or engagement metric that would notify instructors when certain modules might be weak in active content, or when published assessments didn't line up with their associated outcomes. The idea of making the module the fundamental unit of instruction also has promise, but we would still need to specify an ordering of modules, since much of our current content builds on earlier content.
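The soft validation I have in mind might look something like the sketch below, which builds on the hypothetical Module representation above: flag a module when the share of active (experience) content falls below some threshold, or when an assessment references an outcome the module never published. The threshold and the wording of the warnings are placeholders, not a worked-out design.

```python
def validate_module(module: Module, min_active_ratio: float = 0.3) -> list[str]:
    """Return a list of soft warnings for an instructor; an empty list means no concerns."""
    warnings = []

    readings = [i for i in module.items if i.content_type is ContentType.READING]
    experiences = [i for i in module.items if i.content_type is ContentType.EXPERIENCE]
    assessments = [i for i in module.items if i.content_type is ContentType.ASSESSMENT]

    # Balance check: is there enough active content mixed in with the passive content?
    instructional = len(readings) + len(experiences)
    if instructional and len(experiences) / instructional < min_active_ratio:
        warnings.append(f"'{module.title}' may be weak in active content "
                        f"({len(experiences)} experiences vs. {len(readings)} readings).")

    # Alignment check: does every assessment map back to a published outcome?
    for a in assessments:
        unmatched = [o for o in a.outcomes if o not in module.outcomes]
        if unmatched:
            warnings.append(f"Assessment '{a.title}' references outcomes not published "
                            f"by the module: {unmatched}.")

    return warnings
```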

I am not sure to what extent I will have the opportunity to go down this path. As I mentioned in an earlier post, there's a possibility we move away from the content-management side of our current LMS and that my work becomes focused just on the assessment system. If we do move to the university-supported LMS, we could certainly still follow the Morea principles, but any kind of automated curricular validation feedback would likely not be feasible.