Expansive Analytics & Engagement (#LAK12)

[Learning] begins in delight and ends in wisdom.

Gardner Campbell gave an interesting talk about the danger of reductive models of learning, which narrow the scope of education by limiting our view of adjacent possibilities. Campbell argues that learning is one of the most complex processes one could study, and that no single theory is a good representation of all observations.

A potential issue with the traditional VLE/LMS model is that it has nothing to do with the self, identity or complexity that form learning. As such, any analytics that relate only to individual performance (i.e. behavioural analytics) are uninteresting for learning. Identity is not just about the self, but also about the other selves we interact with. More interesting analytics would help us understand and encourage connections with the world rather than attempt to control them (e.g. avoiding the filter bubble effect, which reduces exposure to challenging viewpoints).

The issue is that measuring what you get leads to getting what you measure. Measuring one-dimensional models of student performance (as success) and at-risk status (as the point of intervention) reduces learning to a behaviourist approach. The structures being measured become fixed routes from one point to another, and Taylorist models can be applied to test the effectiveness of each station along the way. However, complexity is the new reality: learning is non-linear and unpredictable.

Instead, models of analytics that measure student contributions and how they link to the world are more interesting, and should encourage further contributions and connections. Here connections, as engagement, can provide dynamic indications of success. And rather than looking for the moment a learner is ‘about to fail’, one might look for moments of ‘beginning to learn’ as the point of intervention in a system that is itself able to learn.

As in my previous post, measuring the ‘beginning to learn’ moment hints towards Vygotsky’s Zone of Proximal Development, and further prompts investigations utilising Engeström’s expansive cycles. Learning analytics should drive new models of understanding for personalised learning, for which engagement may be the new metric.


Analysing Social Learning (#LAK12)

Dawson (2010) concludes that the value of learning analytics lies in providing relevant, real-time visualisations that can assist staff in aligning their pedagogical practices more fully with the learning needs of all their students.

In an interesting webinar, Shane Dawson suggested that the VLE can be generalised into two tools: content pages and discussions. Content is typically analysed via session counts, dwell time, and downloads, while discussions provide metrics relating to posts, replies, and views, modelled via Social Network Analysis (SNA). I would adapt this to suggest that all activities in the VLE be viewed as discussions. Giest (2010) argues that interaction can be concrete and direct, as in an online forum, but can also be abstract and indirect, where the interactions or cultural tools of activity are represented in materialised forms such as content. In the hypermedia learning environment, seeing activity as interactive (social) allows consistent analysis: while traditional hand-outs might show teacher-centric interactions, student-created content or collaborative wikis may reveal different learning networks.
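To make the discussion side concrete, here is a minimal sketch of that kind of reply-network analysis, assuming a hypothetical export of (author, replied-to) pairs from the VLE forum; the names and the choice of metrics are illustrative, not Dawson's actual pipeline.

```python
# Minimal sketch: build a reply network from forum data and compute the
# kinds of SNA metrics mentioned above. The (author, replied_to) pairs
# are invented; a real VLE export would supply them.
import networkx as nx

replies = [
    ("alice", "tutor"), ("bob", "alice"), ("tutor", "bob"),
    ("carol", "dave"), ("dave", "carol"), ("alice", "bob"),
]

# Directed graph: an edge a -> b means "a replied to b".
G = nx.DiGraph()
G.add_edges_from(replies)

centrality = nx.betweenness_centrality(G)
for node in G.nodes:
    # In-degree: how often a participant's posts attract replies.
    # Betweenness: who brokers between otherwise separate clusters.
    print(node, "replies received:", G.in_degree(node),
          "brokerage:", round(centrality[node], 2))
```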

Among Dawson's (2010) interesting findings was that students tend to interact with those of similar ability, forming clusters of high- and low-performing learners. One proposed explanation is that this represents an effect of the Vygotskian zone of proximal development: learners gravitate within their zone and so engage with discussions at similar developmental levels. One might see this as an example of interaction creating the conditions to identify zones (Chaiklin, 2003), where it is instruction that should create them (Giest, 2010). Dawson’s analysis, however, finds a phenomenon inconsistent with this: staff interventions tend to gravitate towards the high-performing networks.

Teaching staff were positioned in 81.7% of the high-performing and 34.61% of the low-performing student networks (p. 746), and thus may be further restricting educational opportunities. One proposed reason is that, in pursuing the values of a learning community, it was often assumed that low-end questions would be answered by other students, whereas tutors focused on the high performers because their questions were more difficult and so staff felt more inclined to intervene. Rather than a shared project of community-wide learning, students seem motivated to form networks that best enhance their individual grade performance (an ego-centric strategy).

One solution might be to use the SNA to identify peer-mediated instruction interventions (Fuchs & Fuchs, 2009). In practice, where peer groups contain a range of competency levels, ideas, conceptions, misconceptions and areas of expertise, learners can benefit from both giving and receiving ideas, a benefit embodied in different ways in both the Vygotskian and Piagetian perspectives on social constructivism (Webb & Mastergeorge, 2003).

When designing social interactions and considering the relation of technology to culture, it has been suggested that the only way to define the technological effects of the Internet is to build the Internet (Poster, 1995). Responsive feedback via SNA shows how effective good visualisations of data can be in revealing these effects and allowing tutors to rebuild interactions or interventions.
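As an illustration of what such responsive visualisation might look like, the sketch below draws a toy reply network with staff highlighted, so a tutor can see at a glance which clusters their interventions reach; the roles, data and layout choices are my own assumptions, not the resource described by Dawson.

```python
# Illustrative only: visualise a toy reply network with staff highlighted.
import networkx as nx
import matplotlib.pyplot as plt

G = nx.DiGraph([
    ("alice", "tutor"), ("bob", "alice"), ("tutor", "bob"),
    ("carol", "dave"), ("dave", "carol"),
])
staff = {"tutor"}  # hypothetical role information from the VLE

pos = nx.spring_layout(G, seed=42)  # force-directed layout groups clusters
colours = ["tomato" if n in staff else "lightblue" for n in G.nodes]
nx.draw(G, pos, node_color=colours, with_labels=True, arrows=True)
plt.show()  # staff sitting in only one cluster is immediately visible
```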

Readings

Chaiklin, S. (2003). The Zone of Proximal Development in Vygotsky’s Analysis of Learning and Instruction. In A. Kozulin, B. Gindis, V. S. Ageyev, & S. M. Miller (Eds.), Vygotsky’s Educational Theory in Cultural Context (pp. 39-64). Cambridge: Cambridge University Press.

Dawson, S. (2010). “Seeing” the learning community: An exploration of the development of a resource for monitoring online student networking. British Journal of Educational Technology, 41(5), 736-752.

Fuchs, D., & Fuchs, L. S. (2009). Peer-mediated instruction. Better: Evidence-based Education, 1(1), 18-19.

Giest, H. (2010). The Formation Experiment in the Age of Hypermedia and Distance Learning. In B. van Oers, W. Wardekker, E. Elbers, & R. van der Veer (Eds.), The Transformation of Learning [Kindle Edition]. Cambridge: Cambridge University Press.

Poster, M. (1995). Postmodern Virtualities. The Second Media Age. Blackwell.

Webb, N., & Mastergeorge, A. (2003). Promoting effective helping behavior in peer-directed groups. International Journal of Educational Research, 39(1-2), 73-97.

Learning Analytics and Leading Activity (#LAK12)

For the last couple of weeks I have been pondering what types of data or patterns one might look for from learning analytics. Lots of different projects are emerging, some of which I have seen or been involved in:

  • Retention focused – identifying learners at-risk of drop-out from the course;
  • Performance focused – predicting final exam success;
  • Activity focused – quantitative views of activity;
  • Course focused – usually linked to benchmarking of staff performance;
  • Engagement focused – what types of things people are doing.

A further idea emerged for me when reading the NY Times article ‘How Companies Learn Your Secrets‘, which details how supermarkets might gather information about you. The main goal of shopper analytics is to identify approaching periods when consumers’ patterns are subject to change. The number one such period is when a new baby is born – or, in marketing terms, when parents are ‘exhausted, overwhelmed and their shopping patterns and brand loyalties are up for grabs’. The ‘approaching’ aspect is crucial: the earlier an intervention occurs, the more likely it is to beat competing interventions – in other words, the more likely shoppers are to switch to your store.
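Translated to a learning context, the equivalent is watching for a learner’s activity pattern beginning to shift against their own baseline. Here is a toy sketch of the idea, with invented numbers and an arbitrary threshold:

```python
# Toy illustration of "approaching change" detection: compare a recent
# window of weekly activity against the learner's own baseline.
# The data and the 50% threshold are invented.
weekly_logins = [14, 15, 13, 16, 14, 15, 7, 5, 4]  # last 9 weeks

baseline = sum(weekly_logins[:-3]) / len(weekly_logins[:-3])
recent = sum(weekly_logins[-3:]) / 3

if recent < 0.5 * baseline:
    print("pattern shifting: the earliest intervention is the likeliest to stick")
```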

A similar model emerges in Vygotsky’s (1978) concept of the Zone of Proximal Development. Vygotsky proposes that for effective pedagogical interventions one must calculate at least two development levels: actual development – that which the learner can achieve unaided (e.g. in tests) – and potential development – that which the learner can achieve with support. The zone of proximal development is the difference between the two. Vygotsky argues that ‘by using this method we can take account of not only the cycles and maturation processes that have already been completed but also those processes that are currently in a state of formation, that are just beginning to mature and develop‘ (p. 87).
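Read crudely as arithmetic (and this is only a sketch – Vygotsky’s ‘calculation’ is qualitative, and the scores below are invented for illustration), the zone is the gap between assisted and unaided performance:

```python
# Toy reading of Vygotsky's two development levels. All numbers invented.
learners = {
    "alice": {"unaided": 40, "assisted": 75},  # wide zone: functions maturing
    "bob":   {"unaided": 70, "assisted": 78},  # narrow zone: already matured
}

for name, levels in learners.items():
    zone = levels["assisted"] - levels["unaided"]
    # A wide zone suggests processes "in a state of formation" --
    # on this reading, the richest point for intervention.
    print(name, "zone width:", zone)
```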

The zone of proximal development refers to the maturing functions that are relevant for the next developmental period, and Chaiklin (2003) further clarifies that social interaction does not create these functions but provides the conditions for identifying their existence and the extent to which they are developed. Interaction and social network analysis, alongside behavioural psychology, may offer insights here, while prompting some further thought on the design of interactions.

Vygotsky further suggests this requires a re-examination of formal subject disciplines and their relation to overall development. For Vygotsky this cannot be solved by a single formula, and diverse research around the zone of proximal development is required. Fortunately learning analytics isn’t about a single formula either (despite its philosopher’s stone appeal): semantically linked data allows diverse data sets to be explored, and may be directed at revealing the ‘ripeness’ of maturing functions. It seems such an approach to learning analytics (one focused on development) may require new approaches to designing learning environments, in which analytics are an integral tool rather than a retrospective analysis of existing data.

Readings

Chaiklin, S. (2003). The Zone of Proximal Development in Vygotsky’s Analysis of Learning and Instruction. In A. Kozulin, B. Gindis, V. S. Ageyev, & S. M. Miller (Eds.), Vygotsky’s Educational Theory in Cultural Context (pp. 39-64). Cambridge: Cambridge University Press.

Vygotsky, L. (1978). Mind in Society. Cambridge, MA and London: Harvard University Press.

Analytics Case Studies: Perceiving Tools (#LAK12)

I have been watching some interesting case studies of learning analytics, from the Signals project at Purdue and Check My Activity at UMBC, and thinking about parallels with some projects I am working on.

When discussing the impact for learners at Purdue, John Campbell mentioned that the traffic lights led students to perceive that their tutors were there to support them more. This is similar to a conversation I had at the University of Greenwich about the experience of a self-service tutorial registration tool: students perceived that the LMS (Moodle) was offering them more choice because of it. I like this subtle impact of technology. I have no doubt the tutors at Purdue have always been supportive, in the same way that tutorial selection was always permitted; however, the introduction of technology makes the process visible and, to some extent, more tangible. To me this is how good technologies make for what John described as ‘actionable intelligence’.

What appeals to me about learning analytics in this sense is captured in the following quote, which you’ll see me use at conferences even though I have lost the source:

There’s probably a long way to go with learners generally to get them to perceive education as something they’re participants in rather than recipients of … I think it’s simply that we haven’t got far enough down the line yet with the whole situation.

This leads me to two examples from projects I have been involved in that cross into the domain of learning analytics, although I realise there is still much more to be done:

Engagement Tracking

Manchester Metropolitan University

This development is in its infancy and shares a starting point with the UMBC project: looking at quantitative use of the LMS, initially for comparison with grades, satisfaction surveys and so on as we extract the data. It examines overall LMS activity over the past 4 weeks for students in a course group to identify users who may be showing signs of disengagement, colour coding each student based on inter-quartile ranges for the group.
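A minimal sketch of that colour coding, assuming a hypothetical mapping of student to 4-week activity count (the banding rules here are illustrative, not the production code):

```python
# Sketch of quartile-based RAG banding over 4-week LMS activity counts.
# Student names, counts, and banding rules are illustrative.
import statistics

activity = {"alice": 120, "bob": 4, "carol": 45, "dave": 60, "eve": 12}

q1, _, q3 = statistics.quantiles(activity.values(), n=4)

def band(count):
    if count < q1:
        return "red"    # bottom quartile: possible disengagement
    if count > q3:
        return "green"  # top quartile: highly active
    return "amber"      # inter-quartile range: typical for the group

for student, count in sorted(activity.items()):
    print(student, count, band(count))
```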

Personal Development Plans

Lewisham College

This is part of a wider development for learning plans, but includes student status elements, attendance, and other indicators on a single dashboard available to both tutor and student. Similar to Purdue, this approach, embedded into student processes, saw retention increase: in the evaluated courses retention rose from 62% to 92% and achievement from 69% to 73% in the first year of this transformation project.

What are Learning Analytics? (#LAK12)

The terms “academic analytics” and “learning analytics” seem to be used interchangeably in some articles. The outline below, adapted from Siemens and Long (2011), gives an idea of their differences: each level or object of analysis is listed with who benefits.

Learning Analytics
  • Course-level: social networks, conceptual development, discourse analysis, “intelligent curriculum” (benefits: learners, faculty)
  • Departmental: predictive modelling, patterns of success/failure (benefits: learners, faculty)

Academic Analytics
  • Institutional: learner profiles, performance of academics, knowledge flow (benefits: administrators, funders, marketing)
  • Regional (state/provincial): comparisons between systems (benefits: funders, administrators)
  • National and international (benefits: national governments, education authorities)

Educational Data Mining Taxonomy (#LAK12)

Learning Analytics – Week 1

For Baker and Yacef (2009), educational data differs from other data sets due to its multi-level hierarchy and non-independence. Their brief introduction to the state of Educational Data Mining (EDM) describes a rapidly growing research field that originally emerged from the exploration of student-computer interactions but has since diversified into a broad spectrum of activities. The following taxonomy is presented as a summary of key activities in the field.

  1. Prediction
    • Classification
    • Regression
    • Density estimation
  2. Clustering
  3. Relationship mining
    • Association rule mining
    • Correlation mining
    • Sequential pattern mining
    • Causal data mining
  4. Distillation of data for human judgment
  5. Discovery with models

While items 1-4 are common to classical data mining, the last item is unusual and has gained significant prominence within EDM. Item Response Theory, Bayes Nets, and Markov Decision Processes increasingly enter the field as psychometrics and student models merge into EDM.
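As a toy instance of the first category (prediction via classification), one might fit a simple model over activity features to flag at-risk students; the features, labels and library choice below are illustrative assumptions, not Baker and Yacef’s method:

```python
# Toy EDM prediction sketch: classify at-risk students from invented
# activity features [logins last month, forum posts, mean quiz score].
from sklearn.linear_model import LogisticRegression

X = [[30, 12, 0.8], [2, 0, 0.4], [15, 5, 0.7], [1, 1, 0.3], [25, 9, 0.9]]
y = [0, 1, 0, 1, 0]  # 1 = dropped out, from a hypothetical earlier cohort

model = LogisticRegression().fit(X, y)

# Estimated drop-out probability for a new, low-engagement student.
print(model.predict_proba([[3, 1, 0.5]])[0][1])
```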

Not being overly familiar with classical data mining, I am drawn to learning analytics from Activity Theory, in particular Engeström’s expansive research cycles. I am interested in data mining as a way to analyse activity in order to form new instruments for understanding that activity. My initial impression is that items 1-3 would form part of the analysis, item 4 would be instrument formation, and item 5 their application. My interest is in how one can feed this back into the student learning experience, which seems to fit nicely with the key applications identified by Baker and Yacef:
  1. Improvement of student models, such as the student’s current knowledge, motivation, meta-cognition, and attitudes;
  2. Improving models of a domain’s knowledge structure;
  3. Discovering which types of pedagogical support are most effective, either overall or for different groups of students or in different situations;
  4. Looking for empirical evidence to refine and extend educational theories and well-known educational phenomena.

Readings

Baker, R. S. J. D., & Yacef, K. (2009). The State of Educational Data Mining in 2009: A Review and Future Visions. JEDM – Journal of Educational Data Mining, 1(1). Retrieved from http://www.educationaldatamining.org/JEDM/index.php?option=com_content&view=category&layout=blog&id=36&Itemid=55

Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research. Helsinki: Orienta-Konsultit. Retrieved from http://communication.ucsd.edu/MCA/Paper/Engestrom/expanding/toc.htm