When are they learning? A #Moodle activity calendar heatmap

As a precursor to looking at withdrawal and drop-out points, I wanted to visualise learner activity across an academic year. This could help determine patterns of behaviour for different individuals: the idea is that you can see at a glance which days each learner is using the system. Extending last week's exploration with activity heatmaps, I came across the lattice-based time series calendar heatmap, which provides a nice way of plotting this for exploration. It is quite a simple process that requires some date manipulation to create extra calendar classifications. I then changed the facet to show each row as a different user rather than a year.
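To make the calendar classification step concrete, here is a minimal sketch of the idea in R. It uses ggplot2 rather than the lattice version mentioned above, and assumes a data frame `daily` of per-learner daily event counts (columns `userid`, `date`, `events`) that is not shown here.

```r
# Sketch only: derive calendar fields from daily event counts, then facet one
# row per learner. Assumes `daily` has columns: userid, date (Date), events.
library(dplyr)
library(lubridate)
library(ggplot2)

calendar <- daily %>%
  mutate(month       = month(date, label = TRUE),
         weekday     = wday(date, label = TRUE, week_start = 1),
         # approximate week-of-month so each month lays out as a small grid
         weekofmonth = ceiling(day(date) / 7))

ggplot(calendar, aes(x = weekday, y = weekofmonth, fill = events)) +
  geom_tile(colour = "white") +
  scale_y_reverse() +
  facet_grid(userid ~ month) +
  labs(x = NULL, y = NULL, fill = "Daily events")
```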

Calendar Heatmap


In the calendar heatmap each row is a learner and the grid shows the spread of daily activity across each month in a familiar calendar format. The visualisation quickly reveals patterns such as activity tailing off in the final months for Student4, the extended Easter holiday during April for Student8, and the late starter or crammer that is Student10. A couple of students also broke the usual Christmas learning abstinence and logged in during the holidays. There are a few possible variants: playing with the facet, or applying it to different summaries of the log data, for example a facet on activity types within a course, or activity participation for a single learner. I may explore these in future.


Assignment engagement timeline – starting with basics @salvetore #mootau15 #moodle #learninganalytics

Having joined the assessment analytics working group for Moodle Moot AU this year, I thought I'd have a play around with the feedback event data and its relation to future assignments. The simplified assumption to explore is that learners who view their feedback are enabled to perform better in subsequent assignments, which may be a reduction of potentially more complex 'distance travelled' style analytics. To get started exploring the data I have produced a simple timeline that shows the frequency of assignment views within a course, based on the following identified statuses of the submission:

  1. Pre-submission includes activities when the learner is preparing a submission
  2. Submitted includes views after submission but before receiving feedback (possibly anxious about results)
  3. Graded includes feedback views once the assignment is graded
  4. Resubmission includes activities that involve the learner resubmitting work if allowed

The process I undertook was to sort the log data into user sequences and use a function to set the status based on preceding events: for example, once the grade is released, subsequent views are counted as 'graded'. This gives an idea of the spread and frequency of assignment engagement.
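As a rough illustration of that labelling step, the sketch below orders each learner's events per assignment and assigns a status from the earliest submission and grading events. The event name strings and column names are illustrative assumptions, not the exact Moodle log values.

```r
# Sketch: label each assignment event with a status based on preceding events.
# Assumes `events` has columns: userid, assignmentid, timecreated, eventname,
# where the event name strings below stand in for the real Moodle ones.
library(dplyr)

events_status <- events %>%
  arrange(userid, assignmentid, timecreated) %>%
  group_by(userid, assignmentid) %>%
  mutate(
    submitted_at = min(timecreated[eventname == "submission_created"], Inf),
    graded_at    = min(timecreated[eventname == "submission_graded"],  Inf),
    status = case_when(
      timecreated >= graded_at &
        eventname == "submission_created" ~ "Resubmission",
      timecreated >= graded_at            ~ "Graded",
      timecreated >= submitted_at         ~ "Submitted",
      TRUE                                ~ "Pre-submission"
    )
  ) %>%
  ungroup()
```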

Timeline


The timeline uses days on the x-axis and users on the y-axis. Each point represents events logged for a learner on a given day – coloured by the status and sized according to the frequency on that day. There are a few noticeable vertical blue lines which correspond to feedback release dates (i.e. many learners view feedback immediately on its release), and you start to get an idea that some learners view feedback much more than others. The pattern of yellow points reveals learners who begin preparing for their assignment early, contrasted with those who cram a lot of activity closer to deadlines. I have zoomed into a subset of the learners below to help show this.

Timeline (zoomed to a subset of learners)
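For anyone wanting to reproduce something similar, a rough ggplot2 equivalent of the timeline is sketched below. It assumes the labelled events from the previous step (`events_status`, a hypothetical name) with one row per logged event and a Unix `timecreated` column.

```r
# Sketch: one point per learner per day, coloured by status, sized by frequency.
library(dplyr)
library(ggplot2)

events_status %>%
  mutate(date = as.Date(as.POSIXct(timecreated, origin = "1970-01-01"))) %>%
  count(userid, date, status, name = "freq") %>%
  ggplot(aes(x = date, y = factor(userid), colour = status, size = freq)) +
  geom_point(alpha = 0.7) +
  labs(x = "Day", y = "Learner", colour = "Status", size = "Events")
```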

Having put this together quickly, I am hoping I will have some time to refine the visualisation to better identify some of the relationships between assignments. I could also bring in some data from the assignment tables to enrich this, having limited myself to event data in the logs thus far. Some vertical bars showing deadlines, for example, might be helpful, as might timelines for individual users with assignments on the y-axis, to see how often users return to previous feedback across assignments, as shown below. Here you can see the very distinct line of a feedback release; for formative assessment it may have been better learning design to release feedback more regularly and closer to the submission.

Timeline for an individual learner


Can activity analytics support understanding engagement as a measurable process? Inspiration from @birdahonk

I was pleased to find out my revised paper on this topic was accepted for publication in the September issue of the Journal of Applied Research in Higher Education. The basic premise of the paper is that engagement can be measured as a metric through the appropriation of ideas commonly used in social marketing metrics. For this post I’ll briefly discuss how I approached this by presenting engagement as a learning theory using the ideas of Freire and Vygotsky, as a process, and as a metric. I’ll also share my workshop slides from the conference if you want to try and create your own learner engagement profile. While I’ve started looking into different approaches, this post summarises some of the key principles developed throughout the paper that have guided my thinking of engagement.

Engagement as a learning theory

The paper proposes a concept of engagement that draws on the work of Paulo Freire and Lev Vygotsky and the evolution of the learner voice. The first aspect is to re-position the learner as the subject within education rather than the object of education, supplanting previous models which portray the learner as a passive recipient of pre-packaged knowledge. The second aspect is understanding the learner voice as a creative (Freire) and spontaneous (Vygotsky) expression within a socialised teaching-learning process that supports dialectical interactions between learner and teacher curiosity. This positions engagement as the process of recognising and respecting the learner's world, which as Freire reveals is after all the 'primary and inescapable face of the world itself', in order to support the development of higher-order thinking skills. The repression of this voice is likely to result in patterns of inertia, non-engagement and alienation that are discussed widely in the motivation and engagement literature. This triangulation between motivation and engagement remains a theme central to a range of learning analytics research, and the correlation between learning and autonomy remains an interesting area of research.

Engagement as Process

For the paper I used Haven's engagement process model and overlaid it with concepts from the engagement literature reviews by Fredricks, Blumenfeld, & Paris (2004) and Trowler (2010). Haven posits that engagement is the new metric that supersedes previous linear metaphors, encompassing the quantitative data of site visits, the qualitative data of surveys and performance, as well as the fuzzy data in between that represents social media. Haven and Vittal elaborate this into an expansive process that links four components of engagement – involvement, interaction, intimacy, and influence – through the key stages of discovery, evaluation, use, and affinity (see below). To map this onto the educational research of Fredricks et al., one can treat involvement and interaction as behavioural engagement, intimacy as emotional engagement, and influence as cognitive engagement. Furthermore, when considering whether engagement is high or low in each component, Trowler's categorisation of negative engagement, non-engagement, and positive engagement can be adopted.

Engagement Process

Engagement as a metric

Learner Dashboard

The goal of positioning this as a metric was to create a learner engagement profile, similar to Haven's engagement profile for marketing. I used Stevenson's (2008) Pedagogical Model and Conole and Fill's (2005) Task Type Taxonomy as ways of classifying log data, and social network analysis to understand interactions between the different course actors. These were used to form dashboards, such as the example above, that could then be used to understand profiles such as the one below (name fictionalised). One insight is that where simple raw VLE data might have suggested an engaged learner who is regularly online and features centrally in discussions, the engagement profile reveals the possibility of a learner who may lack academic support during their time online (evenings) and who demonstrates a pattern of alienation based on an apparently strategic approach within an environment that is heavily structured through teacher-led inscription. Given the number of users who have not logged in or have yet to post to the discussions, it might also seem sensible to target other learners for engagement interventions; however, this would miss the opportunities, revealed in the engagement profile, to provide useful support interventions targeting an improved learner voice.

Engagement Profile


Learning logs: how long are your users online? Analytics Part 2 #moodle #learninganalytics

How long users spend on Moodle (or e-learning more generally) is another common question worth some initial exploration as part of my broader goal towards the notion of an engagement metric. This article discusses an approach to defining and obtaining insights from the idea of a session length for learning. It is mostly a data wrangling exercise to approximate duration from event logs, which will tell us that while all events are born equal, some are more equal than others. The algorithm should prove useful when I progress to course breakdowns, in identifying particularly dedicated or struggling students who are investing larger amounts of time online, or those at risk who aren't spending enough. These are questions I will return to in a future post as part of the project.

Learning Duration

This works on the same data as last week’s look at some basic distribution analysis which contains extraction SQL.
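For reference, a common way to approximate session length from raw event logs is to order events per user, start a new session whenever the gap between consecutive events exceeds a timeout, and then sum the within-session time. The sketch below assumes a data frame `log_events` with `userid` and `timecreated` (Unix timestamp) columns and a 30-minute timeout; both the name and the threshold are assumptions rather than fixed choices.

```r
# Sketch: sessionise event logs with an inactivity timeout, then compute
# per-session durations. Assumes log_events has columns userid, timecreated.
library(dplyr)

timeout <- 30 * 60  # seconds of inactivity before a new session starts

sessions <- log_events %>%
  arrange(userid, timecreated) %>%
  group_by(userid) %>%
  mutate(gap         = timecreated - lag(timecreated),
         new_session = is.na(gap) | gap > timeout,
         session_id  = cumsum(new_session)) %>%
  group_by(userid, session_id) %>%
  summarise(events   = n(),
            duration = max(timecreated) - min(timecreated),
            .groups  = "drop")
```

One caveat with this approach is that the final event in a session contributes no time, so durations are a lower bound; some analyses add a fixed allowance per session to compensate.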

Event-duration Correlation

Duration distribution

Session spread


Scratching the surface: Moodle analytics in Rstudio Part 1 #moodle #learninganalytics

At some point I always come back to the question of how we understand use of the VLE/LMS, something I've theorised about a lot. As part of an interest in learning about Data Science I've signed up to Sliderule (@MySlideRule) and am being mentored through a capstone project with some Moodle data. The main goal is for me to learn R, which I'd never touched until two weeks ago, but hopefully the data can tell me something about Moodle at the same time. Feedback or advice on techniques is welcomed.

Exploratory Data Analysis on mdl_logstore_standard

For this part I am going to focus on producing some simple two-dimensional analysis. This assumes you have MySQL access to your Moodle database and RStudio.
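As a starting point, something like the following can pull the standard log into R. This is a sketch only: the credentials and date filter are placeholders to adapt to your own install, and it queries mdl_logstore_standard_log, the table behind the standard log store.

```r
# Sketch: read the standard log into a data frame via DBI/RMariaDB.
library(DBI)
library(RMariaDB)

con <- dbConnect(RMariaDB::MariaDB(),
                 host     = "localhost",
                 dbname   = "moodle",
                 username = "readonly",
                 password = "********")

log_events <- dbGetQuery(con, "
  SELECT id, userid, courseid, eventname, component, action, target, timecreated
  FROM   mdl_logstore_standard_log
  WHERE  timecreated >= UNIX_TIMESTAMP('2014-08-01')
")

dbDisconnect(con)
```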

The plots produced were:

  • Daily logins
  • Hourly access
  • Module use
  • Day of week
  • Frequency distribution
  • Activity distribution
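As an example of the first of these, a quick sketch of the daily logins plot from the extracted log data (assuming the `log_events` data frame from the query above):

```r
# Sketch: count login events per calendar day and plot them.
library(dplyr)
library(ggplot2)

log_events %>%
  filter(eventname == "\\core\\event\\user_loggedin") %>%
  mutate(date = as.Date(as.POSIXct(timecreated, origin = "1970-01-01"))) %>%
  count(date) %>%
  ggplot(aes(date, n)) +
  geom_col() +
  labs(x = NULL, y = "Logins per day")
```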


the purpose of education is to delight (#purposedu #500words)

Learning begins in delight and ends in wisdom – Gardner Campbell

With its simplicity and panache, the purpose of delight is offered in a similar spirit to those so far: hope, independence, curiosity, magical experience, connection, confidence, enthusiasm, optimism, preparation … and to the seemingly over-arching theme of  helping people become what they are capable of becoming.

However this made me wonder how education actually supports a journey from delight to wisdom; a seemingly different journey than that from the classroom to the exam hall.

An education provider, as social institution, implies a certain structure of time and space and creates its own social system of relationships – perhaps most commonly being one of categorisation:

  • Categorisation of learning providers through league tables;
  • Categorisation of learning through curriculum;
  • Categorisation of learners through standardised testing.

For Foucault such classifications operate a disciplinary function that constitutes the individual as effect and object of power (pouvoir) and knowledge (connaissance). This produces the individual ‘case’ – learner as UCAS points, bachelor, master, drop-out, failure – the examination fixing individual differences and the commonality of potential.

Such systems may well suggest that the end of education is the dawn of learning.

An alternative to categorisation is sense-making, a process based on exploration rather than exploitation. In other words, education must shift from instruction to discovery; from boring to building. Taking this further, McLuhan suggests that anyone who makes a distinction between education and entertainment doesn't know the first thing about either. This should shift interest to the territory of knowledge (savoir) to be explored rather than domains of knowledge (connaissance) being imposed.

Following Deleuze & Guattari this introduces an alternative concept of power (puissance) as a range of potential or ‘capacity for existence’.  This power resides with the learner – the power to be, so beautifully presented already as burning brightly, building minds, or the magical key to unlocking potential.

For me, delight is the interest that can spark a connection with the world. Interest-driven learning and rhizomatic learning provide examples where there aren't 'things people should know' but rather 'new connections to be made'. The purpose of education, and the importance of teachers, becomes to help learners follow their delights and make new connections; the community becomes the curriculum.

As a technologist my interest has been in understanding how personalised learning might see systems adapt to the learner rather than learners to systems. One need only look at the impact of the long tail or the social network as ways of piquing personal interest and connecting people with shared interests.

While possibilities exist for improving learning, technology itself cannot act as an isolated catalyst for these changes which is why debates like this are so important. The classification model of education will resist such changes: rather than allow learners to explore through technology, a computer curriculum was developed … before the computer could change School, School changed the computer.

To reverse this, the purpose of delight seeks to help learners perceive education as something they’re participants in rather than recipients of. 

Expansive Analytics & Engagement #LAK12

[Learning] begins in delight and ends in wisdom.

Gardner Campbell gave an interesting talk about the danger of reductive models of learning that reduce the scope of education by limiting views of adjacent possibilities. Gardner argues that learning is one of the most complex processes one could study where no single theory is a good representation of all observations.

A potential issue with the traditional VLE/LMS model is that it has nothing to do with the self, identity or complexity that form learning. As such, any analytics that relate to individual performance (i.e. behavioural) are uninteresting for learning. Identity is not just about the self, but also about the other sets of selves that we interact with. Interesting analytics would help understand and encourage connections with the world rather than attempt to control them (e.g. the filter bubble effect that reduces exposure to challenging viewpoints).

The issue is that measuring what you get leads to getting what you measure. Measuring one-dimensional models of student performance (as success) and at-risk status (as the point of intervention) reduces learning to behaviourist approaches. The structures being measured are inscrutable methods of getting from one point to another, and Taylorist models can be applied to test the effectiveness of each station. However, complexity is the new reality: learning is non-linear and unpredictable.

Instead, models of analytics that measure student contributions and how they link to the world are more interesting, and should encourage further contributions and connections. Here, connections as engagement can provide dynamic indications of success. Also, rather than looking for the moment a learner is 'about to fail', one might look for moments of 'beginning to learn' as the point of intervention in a system that is able to learn.

Similar to my previous post, measuring the 'beginning to learn' moment hints towards Vygotsky's Zone of Proximal Development and further prompts investigations utilising Engeström's Expansive Cycles. Learning analytics should drive new models of understanding for personalised learning, for which engagement may be the new metric.