Published: Automated Analysis of Reflection in Writing

The International Journal of Artificial Intelligence in Education just published my paper on the Automated Analysis of Reflection in Writing. It is currently freely available at Springer. Find it here:
https://doi.org/10.1007/s40593-019-00174-2

The paper brings together much of the research in this emerging area, which seeks to understand whether automated methods can be used to analyze writing (e.g. student essays) for reflective thinking.

Reflective practice is common at most universities. Writing down your thoughts can help you deepen your reflective thinking about an experience. Everyone thinks reflectively, but not everyone has fully developed this skill. The good news is that reflective thinking can be taught, and a common method for teaching it is reflective writing.

There are many models of reflective thinking. Teachers and researchers use these models (or assessment rubrics, frameworks, and coding schemes) to analyze and assess students' writing. The paper gives an overview of the many models that have been used to manually analyze reflective writings.

Manual analysis takes time, and this is a barrier we need to overcome. It delays the feedback we give students about their writing. Students may also feel uncomfortable with others reading their reflective thoughts. And it makes large-scale research on reflective writing costly. Automated methods, on the other hand, are immediate, non-judgemental, and operate at scale.

So automated methods have some benefits, but are they any good? After all, reflective thinking is quite complex and may be hard to detect. Even the manual rating of such essays is not easy, so why would a machine be good at it?

For this paper, I evaluated the performance of machine learning on previously rated texts. I have written before about other automated methods, such as dictionary-based and rule-based approaches; the paper gives a nice overview of these as well.
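To give a flavour of what a dictionary-based approach looks like, here is a minimal sketch: a hand-made list of cue phrases is matched against each sentence. The cue words and threshold below are illustrative assumptions for this post, not the actual dictionaries used in the research.

```python
# A minimal sketch of a dictionary-based approach (illustrative only):
# count hits from a hand-made cue list per sentence. The cue phrases and
# threshold are assumptions for illustration, not taken from the paper.

REFLECTION_CUES = {"realized", "learned", "felt", "wondered", "assumed",
                   "in hindsight", "next time", "i think", "i believe"}

def reflection_score(sentence: str) -> int:
    """Count how many cue phrases occur in the lower-cased sentence."""
    text = sentence.lower()
    return sum(1 for cue in REFLECTION_CUES if cue in text)

def is_reflective(sentence: str, threshold: int = 1) -> bool:
    """Flag a sentence as reflective if it contains at least `threshold` cues."""
    return reflection_score(sentence) >= threshold

print(is_reflective("In hindsight, I realized my assumption was wrong."))  # True
print(is_reflective("The lecture started at nine."))                       # False
```

The appeal of such an approach is transparency: every decision can be traced back to a cue word. Its weakness, of course, is that reflection is rarely signalled by surface vocabulary alone, which is one motivation for trying machine learning instead.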

One of the tricky bits was to define what constitutes reflective thinking in writing. As outlined above, there are many different models of reflective writing, and I cite many of them in my paper. A close look at all those models showed that many share similarities, often masked behind technical terms. The common constituents of those models formed the model against which I put machine learning to the test. It is a complex model of reflective thinking built from the most relevant categories of reflective writing. Have a look at the paper to read more about this model for reflection detection.

Surprisingly, the evaluation of the model for reflection detection on thousands of sentences showed that machine learning actually does a good job on this complex construct.
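As an illustration of how such an evaluation works in principle, the sketch below trains a tiny bag-of-words Naive Bayes classifier on labelled example sentences and measures its accuracy on held-out ones. The sentences, labels, and classifier here are toy assumptions for this post; the paper's dataset and machine learning setup are different.

```python
# Toy sketch of training and evaluating a sentence classifier for reflection.
# All data below is made up for illustration; it is not the paper's corpus.
from collections import Counter
import math

train = [
    ("i realized that my approach was flawed", 1),
    ("in hindsight i would plan the experiment differently", 1),
    ("i learned a lot about my own assumptions", 1),
    ("the session covered three chapters of the textbook", 0),
    ("we met in the lab at ten on monday", 0),
    ("the report lists the results of the survey", 0),
]
test = [
    ("i realized i should plan differently", 1),
    ("the lab session covered the textbook", 0),
]

def train_nb(data):
    """Collect per-class word counts, class counts, and the vocabulary."""
    word_counts = {0: Counter(), 1: Counter()}
    class_counts = Counter()
    vocab = set()
    for text, label in data:
        words = text.split()
        word_counts[label].update(words)
        class_counts[label] += 1
        vocab.update(words)
    return word_counts, class_counts, vocab

def predict(model, text):
    """Pick the class with the highest smoothed log-probability."""
    word_counts, class_counts, vocab = model
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for label in class_counts:
        lp = math.log(class_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.split():
            lp += math.log((word_counts[label][w] + 1) / denom)  # Laplace smoothing
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = train_nb(train)
accuracy = sum(predict(model, t) == y for t, y in test) / len(test)
print(f"accuracy: {accuracy:.2f}")
```

The real evaluation in the paper works on thousands of human-rated sentences rather than a handful, but the basic loop is the same: learn from rated text, predict on unseen text, and compare against the human ratings.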

In the paper, I give a sense of how good the machine learning results are and also outline where research funding should go to enable more work in this important area.