Inside forum posts – politics, networks, sentiment and words! Inspired by @phillipdawson, @shaned07, and @indicoData #moodle #learninganalytics

Enhanced communication has long been championed as a benefit of online learning environments, and many educational technology strategies include statements about increased communication and collaboration between peers. So, in thinking towards an engagement metric for my current project, and needing to get inside activities for my in-progress PhD proposal, forum use stands out as one of the more interesting analytics spaces within the LMS. I’ve used three techniques for my initial analysis: (1) a look at post and reply counts inspired by @phillipdawson and his work on the Moodle engagement block; (2) social network analysis inspired by a paper by @shaned07 on teacher support networks; and (3) sentiment and political view analysis provided by @indicoData as an introduction to text mining.

I’ll start by sharing the visualisations and where these might be useful, and then finish with details of how I coded them.

Forum posts

Total weekly forum posts by student

Following Phillip Dawson’s work on the engagement block for Moodle, I decided to look into two posting patterns: (1) posts over time; and (2) average post word count. The over-time analysis (above) compares the weekly posting pattern of each student in a group. For most students, replies to peers and teachers are “in phase”, suggesting that when they are active they discuss with the entire group, and so learning design might focus on keeping them active. One can also notice that those who only reply to peers have much lower overall post activity, which in the original engagement block would place them at-risk – learning design may consider teacher-led interventions to understand whether discussions with the teacher affect their overall activity. The average word count analysis (below) reinforces the latter case, demonstrating that those who only reply to peers post infrequently and write shorter replies. Conversely, those who post infrequent, lengthy posts tend to target the teacher and do not follow up with many further replies. There is some suggestion of an optimal word count of around 75-125 words for forum posts that might warrant further investigation.

Forum Posts
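
For those interested in the mechanics, here is a minimal sketch of how the two summaries might be produced with pandas, assuming a flat CSV export of mdl_forum_posts with hypothetical columns userid, created (a Unix timestamp) and message (the post HTML) – the column names are placeholders rather than my exact export.

    # Weekly post counts and average word count per student.
    # Assumes a CSV export of mdl_forum_posts with hypothetical columns:
    # userid, created (Unix timestamp) and message (post HTML).
    import pandas as pd

    posts = pd.read_csv("forum_posts.csv")
    posts["created"] = pd.to_datetime(posts["created"], unit="s")
    posts["week"] = posts["created"].dt.to_period("W")

    # strip HTML tags before counting words
    posts["words"] = (posts["message"]
                      .str.replace(r"<[^>]+>", " ", regex=True)
                      .str.split().str.len())

    # (1) posts per student per week
    weekly_posts = posts.groupby(["userid", "week"]).size().unstack(fill_value=0)

    # (2) average post word count per student
    avg_words = posts.groupby("userid")["words"].mean().sort_values(ascending=False)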

Social Network Analysis

Social Network Analysis

The network diagram (above) confirms what was emerging in the post analysis: a smaller core of students (yellow circles) is responsible for the majority of the posts. It further reveals the absolute centrality of the teacher (blue circle), which highlights how important teacher-led interventions may be for this group. This is probably not surprising, although the teacher may use it to consider how they might respond more evenly across the group – here the number of replies is represented by the increasing thickness of the grey edges, and the teacher appears to favour conversations in the lower left of the network. A similar theme is explored by Shane Dawson (2010) in “‘Seeing’ the learning community”. One can understand this further by plotting eigenvector centrality against betweenness centrality (below), where a student with high betweenness and low eigenvector centrality may be an important gatekeeper to a central actor, while a student with low betweenness and high eigenvector centrality may have unique access to central actors.

Centrality
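
The network itself is straightforward to assemble. Below is a sketch using networkx, continuing from the posts frame above and assuming each row also carries an id and a parent column (the id of the post being replied to) – again, hypothetical column names.

    # Directed reply network: an edge from the replier to the author being
    # replied to, weighted by the number of replies (drawn as edge thickness).
    import networkx as nx
    import matplotlib.pyplot as plt

    author_of = posts.set_index("id")["userid"].to_dict()

    G = nx.DiGraph()
    for _, row in posts.iterrows():
        parent = row["parent"]
        if parent in author_of:                      # skip thread-starting posts
            replier, target = row["userid"], author_of[parent]
            if G.has_edge(replier, target):
                G[replier][target]["weight"] += 1
            else:
                G.add_edge(replier, target, weight=1)

    # Centrality measures for the scatter plot below
    betweenness = nx.betweenness_centrality(G)       # unweighted
    eigenvector = nx.eigenvector_centrality_numpy(G.to_undirected(), weight="weight")

    nodes = list(G.nodes())
    plt.scatter([betweenness[n] for n in nodes], [eigenvector[n] for n in nodes])
    plt.xlabel("Betweenness centrality")
    plt.ylabel("Eigenvector centrality")
    plt.show()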

Content Analysis

Sentiment analysis

Text analysis of forums provides a necessary complement to the above analysis, exploring the content within the context. I have used the Indico API to aid my learning of this part of the field rather than trying to build it from scratch. The sentiment analysis API determines whether a piece of text is positive or negative in tone and rates this on a scale from 0 (negative) to 1 (positive). Plotting this over time (above) provides insight into how different topics might have been received, with this group showing generally positive participation, although with two noticeable troughs that might be worth further exploration. The political opinion API scores political leaning within a text on a scale of 0 (neutral) to 1 (strong). Plotting this for each user (below) shows that the more politicised posts tend to be conservative (unsurprising), although there is a reasonable mix of views across the discussion. What might be interesting here is how different students respond to different points of view and whether a largely conservative discussion, for example, might discourage contribution from others. Plotting sentiment against libertarian leaning (second plot below) shows that participants are, at least, very positive when leaning towards libertarian ideology, though this is not the only source of positivity. Exploring text analysis is fascinating, and if projects such as Cognitive Presence Coding and the Quantitative Discourse Analysis Package make it more accessible then there are some potentially powerful insights to be had here. I had also hoped to analyse the number of external links embedded in posts, following a talk by Gardner Campbell I heard some years ago about making external connections of knowledge; however, the dataset I had yielded zero links, which, while informative to learning design, is not well represented in a visual (code is included below).
Political leaning

Libertarian sentiment
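
The Indico calls themselves are thin wrappers around their API. The sketch below uses the indicoio Python client as I remember its interface – the sentiment and political functions, their batch behaviour, and the keys in the returned dictionaries are assumptions to check against the current documentation – together with the external link count that came back empty for this dataset.

    # Sentiment, political leaning and external link counts per post.
    # NOTE: function names and return shapes below are my recollection of the
    # indicoio client (sentiment -> float 0-1; political -> dict of
    # Libertarian/Green/Liberal/Conservative scores); treat them as assumptions.
    import indicoio

    indicoio.config.api_key = "YOUR_API_KEY"

    clean = posts["message"].str.replace(r"<[^>]+>", " ", regex=True).tolist()

    posts["sentiment"] = indicoio.sentiment(clean)     # batch call, one score per post
    political = indicoio.political(clean)
    posts["conservative"] = [p["Conservative"] for p in political]
    posts["libertarian"] = [p["Libertarian"] for p in political]

    # External links embedded in each post (this dataset yielded zero)
    posts["links"] = posts["message"].str.count(r"href=[\"']https?://")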


Educational Data Mining Taxonomy (#LAK12)

Learning Analytics – Week 1

Baker and Yacef (2009) recognise educational data as different from other data sets due to its multi-level hierarchy and non-independence. In their brief introduction to the state of Educational Data Mining (EDM), they describe a rapidly growing research field that originally emerged from the exploration of student-computer interactions but has diversified into a broad spectrum of activities. The following taxonomy is presented as a summary of key activities in the field.

  1. Prediction
    • Classification
    • Regression
    • Density estimation
  2. Clustering
  3. Relationship mining
    • Association rule mining
    • Correlation mining
    • Sequential pattern mining
    • Causal data mining
  4. Distillation of data for human judgment
  5. Discovery with models
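
To make the first item concrete against the forum data above, here is a purely illustrative sketch of prediction by classification with scikit-learn – the features (total posts, average word count) and the at-risk label are placeholders of my own, not something taken from Baker and Yacef.

    # Illustrative only: item 1 (prediction via classification).
    # Features and labels are hypothetical, not from Baker and Yacef (2009).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # rows: students; columns: [total posts, average word count]
    X = np.array([[24, 110], [3, 40], [15, 95], [1, 20], [30, 130], [2, 35]])
    y = np.array([0, 1, 0, 1, 0, 1])        # 1 = flagged at-risk later in the course

    model = LogisticRegression().fit(X, y)
    print(model.predict_proba([[5, 60]])[0, 1])   # predicted probability of at-risk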

While items 1-4 are common to classical data mining, the last item is unusual and has gained significant prominence within EDM. Item Response Theory, Bayes Nets, and Markov Decision Processes increasingly enter the field as psychometrics and student models merge into EDM.

Not being overly familiar with classical data mining, I am drawn to learning analytics from Activity Theory, in particular Engeström’s expansive research cycles. I am interested in data mining as a way to analyse activity in order to form new instruments for understanding that activity. My initial impression is that items 1-3 would form part of the analysis, item 4 would be instrument formation, and item 5 their application. My interest is in how one can feed this back into the student learning experience, which seems to fit nicely with the key applications identified by Baker and Yacef:
  1. Improvement of student models, such as the student’s current knowledge, motivation, meta-cognition, and attitudes;
  2. Improving models of a domain’s knowledge structure;
  3. Discovering which types of pedagogical support are most effective, either overall or for different groups of students or in different situations;
  4. Looking for empirical evidence to refine and extend educational theories and well-known educational phenomena.

Readings

Baker, R. S. J. D., & Yacef, K. (2009). The State of Educational Data Mining in 2009 : A Review and Future Visions. JEDM – Journal of Educational Data Mining, 1(1). Retrieved from http://www.educationaldatamining.org/JEDM/index.php?option=com_content&view=category&layout=blog&id=36&Itemid=55

Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research. Helsinki: Orienta-Konsultit. Retrieved from http://communication.ucsd.edu/MCA/Paper/Engestrom/expanding/toc.htm