Using Qualitative Data for Driving Decision-Making at Scale QAA Event

On behalf of QAA Scotland, the University of Edinburgh invited all Scottish universities to present at and provide input to the ‘Using Qualitative Data for Driving Decision-Making at Scale’ event on 28 November 2018.

Professor Tina Harrison, Assistant Principal for Academic Standards and Quality Assurance at the University of Edinburgh, opened the well-attended event. The event was structured around four presentations, which served as triggers for discussion and as starting points for the activities planned by the event coordinator, Gillian Mackintosh.

At the invitation of the Open University in Scotland, I presented ‘Understanding student experience comments at scale: Insights from an exploratory study’. I showed how automated text analytics methods can mimic the typical tasks of manual content analysis to make sense of student comments. Drawing on an exploratory study, I demonstrated how keywords can be used to find topics in student comments, how these topics can be refined and converted into a dictionary, and how that dictionary can then be used to annotate student comments by topic. I then showed how these annotations can be enriched with sentiment analysis, and how all of this information can be used to empirically detect significant shifts in the student experience over time. In my conclusions I aimed to raise awareness of both the drawbacks and the benefits of the presented methods. The presentation is based on the recently published Scholarly Insight Report Autumn 2018.
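The dictionary-based annotation step described above can be sketched roughly as follows. The topics and keywords here are invented for illustration; in the study, they were derived empirically from the comments and then manually refined, and the actual implementation may differ.

```python
import re

# Hypothetical topic dictionary: each topic maps to a set of keywords
# (in practice, refined from keywords found in the comments themselves).
TOPIC_DICTIONARY = {
    "assessment": {"exam", "assignment", "marking"},
    "workload": {"workload", "pace", "hours"},
    "materials": {"materials", "textbook", "website"},
}

def annotate(comment):
    """Return the set of topics whose keywords occur in the comment."""
    words = set(re.findall(r"[a-z]+", comment.lower()))
    return {topic for topic, keywords in TOPIC_DICTIONARY.items()
            if words & keywords}

# Illustrative comments, not real survey data.
for comment in [
    "The workload was heavy and the exam came too soon.",
    "Excellent materials on the website.",
]:
    print(comment, "->", sorted(annotate(comment)))
```

Once every comment carries topic labels, sentiment scores and counts per topic can be aggregated and compared across presentations of a module.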

This is the full abstract of my presentation:

Each year, students contribute tens of thousands of comments about their student experience via the Student Experience on a Module survey (SEaM survey) of the Open University (OU). It remains a challenge to use this data effectively for understanding module performance and planning module revisions. This presentation reports on an exploratory study that took a big data perspective, analysing tens of thousands of comments. It uses automated empirical text analysis methods to detect the hot topics students talk about during an academic year, and it evaluates the sentiment that students express towards these topics. The presentation shows results from a recent OU scholarly insight report and related works.
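The shift detection mentioned above could, for example, be framed as a two-proportion z-test on the share of negative comments a topic attracts in two consecutive module presentations. This is a minimal sketch under that assumption, with invented counts; it is not necessarily the exact test used in the report.

```python
import math

def two_proportion_z(neg_a, total_a, neg_b, total_b):
    """z-statistic for the difference in the proportion of negative
    comments between two module presentations (illustrative only)."""
    p_a, p_b = neg_a / total_a, neg_b / total_b
    p_pool = (neg_a + neg_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    return (p_b - p_a) / se

# Hypothetical counts of negative 'workload' comments in two years.
z = two_proportion_z(120, 800, 190, 850)
print(f"z = {z:.2f}")  # |z| > 1.96 would suggest a shift at p < .05
```

A per-topic test like this makes it possible to flag, at scale, which aspects of the student experience have changed significantly between years.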

I believe that my presentation struck a nerve: from the feedback of the participants who approached me, it became evident that many Scottish universities are interested in these automated methods, as they would allow them to overcome the barriers of time-consuming manual content analysis.

Other presentations included one by Jill Mackay, which emphasised the importance of method triangulation in the evaluation of the lecture recording system at the University of Edinburgh. Paula Webster, the Head of Student Data and Surveys at the University of Edinburgh, spoke about what works, and what does not, when using qualitative data from student surveys for enhancement.

The presentation by Stef Black from sparqs (student partnerships in quality Scotland) asked the participants of the event to provide direct steering on two high-profile projects: one about the student voice, especially NSS question 25 (‘Is it clear how students’ feedback on the course has been acted on’), and the other about SLTAs (student-led teaching awards).

After the main block of presentations, the facilitator ran an activity to gather input directly for the QAA: what works in the qualitative analysis of student data, what the challenges are, and where we need to be in the future. The output of this activity will feed directly into QAA Scotland’s strategic project ‘Evidence for Enhancement: Improving the Student Experience’.


Ullmann, T. D., Lay, S., Cross, S., Gaved, M., Jones, E., Hidalgo, R., … Rienties, B. (2018). Scholarly insight Spring 2018: a Data wrangler perspective (Scholarly Insight Series). Milton Keynes: Institute of Educational Technology, The Open University.
Ullmann, T. D. (2015). Keywords of written reflection – a comparison between reflective and descriptive datasets. In Proceedings of the 5th Workshop on Awareness and Reflection in Technology Enhanced Learning (Vol. 1465, pp. 83–96). Toledo, Spain.
Coughlan, T., Ullmann, T. D., & Lister, K. (2017). Understanding Accessibility as a Process through the Analysis of Feedback from Disabled Students. In W4A’17 International Web for All Conference. New York, USA: ACM.
Ullmann, T. D. (2017). Reflective Writing Analytics: Empirically Determined Keywords of Written Reflection. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 163–167). New York, NY, USA: ACM.