What self-regulation can tell us about analytics

Learners who see a technology as beneficial to their learning are likely to use it less, while those with poor learning strategies tend to use it more. This may appear counterintuitive, and it challenges the metrics commonly used to judge the success of learning technologies, but it essentially points to quality rather than quantity of interaction, which makes sense. The findings emerge from the literature on self-regulated learning and cognitive engagement, which highlights several non-linear relationships between perception, learning strategies and tool use. In this article I summarise three pieces of research and discuss their implications for the metrics commonly used to evaluate learning technologies.

Winne and Hadwin (1998) argue that studying can be distinguished from learning and ‘compels students to engage in complex bundles of goal-directed cognitive and motivational processes’ (p. 278). Self-regulated studying is shown to be more effective than unregulated studying, where self-regulation involves a four-stage process: (1) defining the task; (2) setting goals and planning; (3) enactment; and (4) adaptation. The stages are recursive – products and evaluations from one stage feed into the next – and weakly sequenced – stages can overlap, be skipped, or be revisited – and while each stage is expected to leave traces in the form of products, the recursive nature of the system makes this complex to research. Davidson and Sternberg (1998) argue that learners, or experts, who apply effective metacognitive strategies spend more time in the planning stage and consequently less time in the enactment stage; they are also likely to make more effective evaluations and adaptations because of the standards identified during planning. Less skilled problem solvers spend more time in the enactment stage due to reduced planning, often caused by insufficient domain or metacognitive knowledge. Unless the novice quits the task out of frustration, expert problem solvers are therefore likely to spend less time on the problem overall and achieve better results.

[Figure: Approaches to study and teaching (Richardson) – Winne & Hadwin SRL (Winne and Hadwin, 2008)]

Greene (2015) is initially interested in the relationship between motivation and performance, mediated by engagement, and suggests that perceived instrumentality – the view that current engagement aligns with future goals – may be easier to foster than, and act as a proxy for, intrinsic motivation. Greene identifies non-linear relationships: high self-efficacy or perceived instrumentality tends to result in mastery goals, which in turn promote deeper learning strategies, while lower self-efficacy prompts performance goals, which lead to shallower learning strategies. However, when achievement is the dependent variable there is a general indication that deeper learning leads to better outcomes, yet there are many examples of shallow learners achieving at the same level. What seems to explain this is domain and task specificity: different areas of a subject may favour different strategies – a problem-solving schema may require a mix of strategies, whereas surface learning can still be successful, especially in more procedural subjects. What Greene reveals is that students who can vary their strategy according to the problem at hand tend to outperform those who stick to a single-strategy approach, irrespective of whether that approach is deep or surface. Winne and Hadwin’s model of self-regulated learning might explain this through the planning and evaluation of learning, while Davidson and Sternberg’s work on problem-solving metacognition might point to an inaccurate encoding of the problem at hand resulting in a poor choice of strategy.

[Figure: Approaches to study and teaching (Richardson) – Greene Cognitive Engagement (Greene, 2015)]

Clarebout, Elen, Juarez Collazo, Lust, and Jiang (2013) identify a non-linear relationship between self-regulation and tool use. Three correlation patterns appear in their findings: (1) learners reporting high use of self-regulating strategies tend to perceive tools or technologies as more useful, but use them less; (2) learners reporting low self-regulation tend to use tools more; and (3) learners reporting low help-seeking behaviour tend to use systems less. A possible explanation is that self-regulation – in particular the planning and adaptation stages of Winne and Hadwin’s model – results in more selective use of tools: they are perceived as more useful because the learner gets more out of them through careful planning and evaluation. Learners with low self-regulation may follow circuitous paths or use every feature, possibly through trial and error, which may reflect inexperience with metacognitive or domain-specific strategies. Learners with low help-seeking behaviour will tend to use the tool for help rather than a human source – Schraw (2007) finds that a human mediator is more effective, so this may disadvantage these learners. These results appear consistent with Davidson and Sternberg’s finding that this pattern of behaviour occurs at all educational levels.

[Figure: Approaches to study and teaching (Richardson) – Clarebout Tool Use (Clarebout et al., 2013)]
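As a rough illustration – not part of Clarebout et al.'s own analysis – these three patterns could be probed in a dataset that pairs self-report survey scores with activity counts extracted from LMS logs. The file name and column names below are hypothetical; the sketch simply shows the kind of rank correlations one might expect under each pattern.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical dataset: one row per learner, combining self-report survey
# scores with counts of tool use extracted from LMS logs.
df = pd.read_csv("learners.csv")  # columns: srl_score, help_seeking_score,
                                  # perceived_usefulness, tool_use_count

# Pattern 1: higher self-regulation ~ higher perceived usefulness, lower use
print(spearmanr(df["srl_score"], df["perceived_usefulness"]))  # expect positive
print(spearmanr(df["srl_score"], df["tool_use_count"]))        # expect negative

# Pattern 2 is the mirror image of pattern 1: low SRL scores sit at the high
# end of tool_use_count, so the same negative correlation captures it.

# Pattern 3: low help-seeking ~ lower system use
print(spearmanr(df["help_seeking_score"], df["tool_use_count"]))  # expect positive
```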

Implications for Analytics

Reflecting on some of the metrics readily obtained from learning technologies, the literature on self-regulated learning and cognitive engagement suggests rethinking the assumptions we bring to this data. First, tool use, probably the most common category of LMS data, includes metrics such as the number of log entries, number of resources accessed, time on task, and number of posts. If one assumes that high values on these metrics indicate a good learning experience, this fails to recognise learners whose heavy use is caused by low self-regulation (e.g. poor planning), as well as patterns of focused use based on strong planning or intentional surface learning, both of which are likely to result in low use yet high success. Second, achievement, usually captured via gradebooks, will remain a key dependent variable, but its relationship with activity is likely to be weaker than expected. These kinds of irregularities appear often in learning technology discussions, where students perhaps do not access the resources made available, or the expected correlation between activity and achievement is weak or missing. Self-regulation and cognitive engagement may provide some explanation as to why this is the case.
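To make the point concrete, here is a minimal sketch of the naive check that often disappoints: correlating raw activity counts with final grades. The file and column names are hypothetical, standing in for a typical LMS export. Under the self-regulation account above, weak or even negative correlations here would not necessarily mean the technology is failing – they may simply reflect selective use by well-regulated learners.

```python
import pandas as pd

# Hypothetical export: one row per student with raw LMS activity metrics
# and a final grade taken from the gradebook.
df = pd.read_csv("lms_export.csv")  # columns: logins, resources_accessed,
                                    # time_on_task_mins, posts, final_grade

activity = ["logins", "resources_accessed", "time_on_task_mins", "posts"]

# The naive assumption: more activity, better grades. The SRL literature
# suggests these correlations may be weak, absent, or negative, because
# well-regulated learners can be highly selective users.
print(df[activity + ["final_grade"]].corr(method="spearman")["final_grade"])
```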

References

Clarebout, G., Elen, J., Juarez Collazo, N. A., Lust, G., & Jiang, L. (2013). Metacognition and the Use of Tools. In V. Aleven & R. Azevedo (Eds.), International Handbook of Metacognition and Learning Technologies (pp. 187–195). New York: Springer. http://doi.org/10.1007/978-1-4419-5546-3

Davidson, J. E., & Sternberg, R. J. (1998). Smart Problem Solving: How Metacognition Helps. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Metacognition in Educational Theory and Practice (pp. 47–68). Mahwah, NJ: Lawrence Erlbaum.

 

Schraw, G. (2007). The use of computer-based environments for understanding and improving self-regulation. Metacognition and Learning, 2(2-3), 169–176. http://doi.org/10.1007/s11409-007-9015-8

Winne, P. H., & Hadwin, A. F. (1998). Studying as Self-Regulated Learning. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Metacognition in Educational Theory and Practice (pp. 277–304). Mahwah, NJ: Lawrence Erlbaum.
