Text Analytics to Improve the Student Experience

It takes about two hours to skim-read all the student comments on a single popular Open University module. But reading alone is not enough to make sense of those comments; analysing them properly takes much longer. Everyone who has ever done a content analysis knows how much effort it takes to analyse texts. The OU has effective mechanisms in place to act upon student comments at module level, but what if we want to analyse larger aggregates, for example at qualification or even faculty level, for quality enhancement purposes? At that scale, the analysis of student comments quickly becomes an intractable problem.

The Open University’s Text Analytics of Student Comments Initiative seeks a solution to this problem, and in my role as academic lead on this project I was recently invited to speak at the QAA Scotland Enhancement Themes event ‘Improving the Student Experience’, held on 30 April 2019 in Glasgow. The event brought together more than 100 delegates from mostly Scottish universities, industry, the QAA, AdvanceHE, and student associations. Clare Parks and her team did a tremendous job preparing for this event. To give you a sense of the day, they gathered images and tweets from it: Wakelet on ‘Exploring Student Surveys event’.

Presenting the work of the OU Text Analytics Initiative

In my presentation, I had time to highlight results from two pieces of work, drawn from a larger body of work that has been in the making for several years.

Recently, I have been collaborating with the Open Programme team, which offers the OU Open Degree qualification. The Open Degree is a fantastic concept: a student can essentially design their own qualification. It is all about openness, extending the range of possibilities for our students. The programme clearly resonates with our students, which can be seen in the number taking it: it is our biggest qualification.

Being the biggest qualification also means tons of student comments, which makes it a perfect example of the kind of scenario we want to explore within the Text Analytics of Student Comments Initiative. Based on technology I have been developing, we looked at differences between students studying towards an Open Degree and students studying towards another named degree. The analysis confirmed that Open Degree students emphasise certain topics in their comments more often than the rest of the students. They are, for example, very interested in research, which seems plausible for students who design their studies around their interests. They also mention their readings and course materials more often, and they are very positive about them. The other students are also interested in these topics; Open Degree students simply mention them significantly more often. I touched on other topics, which I will not outline in detail here, because you can read about them in our latest Data Wrangler report.
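To give a flavour of what such a comparison involves, here is a minimal sketch in Python. It is not the actual technology behind the initiative; the topic counts, the group sizes, and the choice of a two-proportion z-test are illustrative assumptions.

```python
# Minimal sketch: test whether one student group mentions a topic
# significantly more often than another. All numbers are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts of comments mentioning the "research" topic,
# out of all comments analysed per group.
mentions = [180, 900]    # Open Degree students, other students
totals = [1000, 9000]    # comments analysed per group

z_stat, p_value = proportions_ztest(count=mentions, nobs=totals)
rate_open = mentions[0] / totals[0]
rate_other = mentions[1] / totals[1]

print(f"Open Degree: {rate_open:.1%}, others: {rate_other:.1%}, p = {p_value:.4f}")
# A small p-value indicates that the difference in mention rates is
# unlikely to be due to chance alone.
```

In a real analysis one would run such a test per topic and correct for multiple comparisons, but the basic logic of comparing mention rates between groups stays the same.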

The second example came from my collaboration with the Open University’s Securing Greater Accessibility Initiative. The OU has more students with disabilities (currently about 25,000) than the entire student body of many universities. The analysis showed that students who declared a disability talk about many aspects of the general student experience, but also about disability-specific topics such as our DAISY books, screen reader technology, and comb-bound books. Details of this study can be found in our contribution to the International Web for All conference, where it was a best paper candidate.
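One simple way to surface such disability-specific topics is keyword spotting over the comments. The sketch below is an illustrative assumption, not the method of the published study; the keyword patterns are hypothetical examples built from the topics named above.

```python
# Minimal sketch: flag comments that mention disability-specific topics
# via keyword patterns. The patterns are illustrative, not exhaustive.
import re
from collections import Counter

TOPIC_PATTERNS = {
    "daisy_books": [r"\bdaisy\b"],
    "screen_reader": [r"\bscreen ?readers?\b"],
    "comb_binding": [r"\bcomb[- ]?(bound|binding)\b"],
}

def topics_in(comment):
    """Return the set of topics whose patterns match a comment."""
    text = comment.lower()
    return {topic for topic, patterns in TOPIC_PATTERNS.items()
            if any(re.search(p, text) for p in patterns)}

comments = [
    "The DAISY books made revision so much easier.",
    "My screen reader struggled with some of the PDF materials.",
]
print(Counter(t for c in comments for t in topics_in(c)))
# Counter({'daisy_books': 1, 'screen_reader': 1})
```

Keyword spotting is only a first pass; in practice one would combine it with the same kind of frequency comparison sketched above, to see which topics students with a declared disability raise more often than other students.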

You can download the slides of my presentation here: https://www.enhancementthemes.ac.uk/en/current-enhancement-theme/student-engagement-and-demographics/national-student-survey-(nss)-analysis. They contain many pointers to the reports and publications produced from this line of research.

A big shout-out to my colleagues Heather Gibson and Shona Littlejohn from the Open University in Scotland, who made all of this possible. It is a fantastic feeling to be part of this four-nation OU family. Heather spotted this opportunity to continue the conversation from our previous presentation at the QAA event in Edinburgh, which looked more at the underlying technology.
