A comparable reflective writing corpus of multiple languages

Wouldn’t it be great to have a large data set of annotated reflective writings written in various languages? We might get there, but let’s start with a bit of brainstorming.

At first sight, creating a reflective writing corpus in multiple languages looks relatively straightforward. For example, we could take a set of English reflective writings and have them translated into several languages. The problem here is that we would not learn anything about the essence of reflection, for example, whether there is anything language- or culture-specific to reflection. Another idea that is quick to execute is that everyone interested could contact friends (or friends of friends) from university and ask whether they would provide, for research purposes and in exchange for a small payment, a reflective writing from their study time. The problem here is that we would not know whether the outcomes of analysing these reflective writings tell us more about our sample choice than about reflection.

These two examples already show that the task is tricky. There is much to consider and many decisions to be made.

We could all go out and just collect data, but with some planning and collaboration we might get better data. I therefore believe it is important to bring together many potential stakeholders and to brainstorm what we need to consider to create a corpus of comparable reflective writings in multiple languages. By multiple languages I mean that the aim is a corpus of reflective writings in two or more languages, with each language set built according to the same standards. Overall, the aim is a data set whose language subsets are similar, or comparable.

This brings me to the brainstorming task. Imagine that we all received a good amount of time and money to study reflective thinking across the world, in order to better understand the variety of reflection expressed in writing.

“What do we need to consider to create and annotate a publicly available data set of comparable HE students’ reflective writings of multiple languages?”

Think about this question. Then jot down a list of your considerations (keep them short and add an explanation to each). Once you have created your list, sort the items by priority. But don’t stop there: send them to me. I am more than happy to discuss your points with you. And remember, brainstorming is about creativity, so there are no right or wrong answers.

Such a data set would be of great value for various purposes. We may discover that we all reflect in the same way; if instead we discover differences, that should help us learn from each other. Personally, I would be very interested to know whether feelings play more of a role in some languages than in others, and how important the expression of a personal perspective is.

I set out this idea of a data set of reflective writings in many languages in my thesis a while ago. It was one of its limitations, as in that work I only focused on English student writings. My hope was and is ‘that this research can inspire further research to evaluate the potential of automated detection of reflection across languages’. In my 2019 paper, I proposed that the specific method I used there can guide the setting of standards for annotating the data set (method and theory). The paper presented several ideas: the use of a ‘standardized evaluation method, its proposal of reflective writing categories that are common to many models, its focus on model validity, and its reliability’.

I summarised some of the problems that I had when creating my large data set of reflective writings in this Google group post a while ago here: https://groups.google.com/d/msg/wred-general/9Fpbp2sQ0K8/m8y-GbioBgAJ. This was in response to the blog post of Ming Liu and Simon Buckingham Shum of the UTS team calling for contributions to a reflective writing data set for machine learning. See here: http://wa.utscic.edu.au/2018/09/14/building-a-reflective-writing-corpus-for-analytics-research/

I have updated my list of problems and considerations when creating a reflective data set, taking especially my work in the 2019 paper into account. They are:

  • What is the use case of the data set? My research was about the analysis of texts. Manual content analysis is the number one method for analysing reflective writing, and from manual content analyses of reflective writings I developed the coding scheme for my data set. Although analysis and assessment are close, they do differ, and thus a data set for the assessment of reflective writings may look different from one for analysis.
  • This also raises questions regarding what the central constituents of a reflective writing are. This relates very much to the theory of reflection.
  • What unit of analysis is useful?
  • What is the right size?
  • What standards should be followed?
  • What languages would be included?
  • Are there important subject differences that need to be considered?
  • Are there any demographic variables to consider?
  • What research would need to be carried out first in order to create a sound data set?
  • Once there is a data set, how can others use it? 

This list gives a good summary of points to consider when creating a large data set suitable for machine learning. Now, in addition to this, what do we need to consider when creating a comparable data set of reflective writings in multiple languages?


Ullmann, T. D. (2019). Automated Analysis of Reflection in Writing: Validating Machine Learning Approaches. International Journal of Artificial Intelligence in Education, 29(2), 217–257. https://doi.org/10.1007/s40593-019-00174-2
Ullmann, T. D. (2015). Automated detection of reflection in texts. A machine learning based approach (PhD Thesis). The Open University. Retrieved from http://oro.open.ac.uk/45402/

Text Analytics to Improve the Student Experience

It takes about two hours to skim-read all student comments of a single popular module of the Open University. But reading alone is not enough to make sense of all those comments; that process takes much longer. Everyone who has ever done a content analysis knows how much effort it takes to analyse texts. The OU has effective mechanisms in place to act upon student comments at module level, but what if we want to analyse larger aggregates, such as at qualification level or even faculty level, for quality enhancement purposes? The large-scale analysis of student comments quickly becomes an intractable problem.

The Open University’s Text Analytics of Student Comments Initiative seeks solutions to this problem, and in my role as academic lead on this project I was recently invited to speak at the QAA Scotland Enhancement Themes event ‘Improving the Student Experience’, held on 30/04/2019 in Glasgow. The event brought together more than 100 delegates from mostly Scottish Universities, industry, the QAA, AdvanceHE, and student associations. Clare Parks and her team did a tremendous job preparing for this event. To give you a sense of the day, they gathered images and Tweets: Wakelet on ‘Exploring Student Surveys event’.

Presenting work of the OU Text Analytics Initiative

For my presentation, I had time to highlight results from two pieces of work out of a larger body of work that has been in the making for a few years.

Recently, I have been collaborating with the Open Programme team, which offers the OU Open Degree qualification. The Open Degree is a fantastic concept: a student can essentially design their own qualification. It is all about openness, extending the range of possibilities for our students. The programme resonates with our students, as can be seen from the number taking it. It is our biggest qualification.

Being the biggest qualification also means a very large volume of student comments, which makes it a perfect example of the types of scenarios we want to explore within the Text Analytics of Student Comments Initiative. Based on technology that I have been developing, we looked at differences between students studying towards an Open Degree and students studying towards another named degree. The analysis confirmed that Open Degree students emphasise certain topics in their comments more often than the rest of the students. They are, for example, very interested in research, which seems plausible for students who design their studies around their interests. They also mention their readings and course materials more often, and they are very positive about those. The other students are interested in these topics too; Open Degree students simply mention them significantly more often. I touched on other topics, which I will not outline in detail here, because you can read about them in our latest Data Wrangler report.
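To make the idea of "mentioning a topic significantly more often" concrete, here is a minimal sketch of the kind of significance test behind such a claim. The counts are invented for illustration, and this is not necessarily the statistic used in the actual study; see the Data Wrangler report for the real analysis.

```python
from math import sqrt

def two_proportion_z(k1, n1, k2, n2):
    """z-statistic comparing two groups' topic-mention rates."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)                  # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Invented counts: 'research' mentioned in 120 of 1,000 Open Degree
# comments vs 60 of 1,000 comments from students on other degrees.
z = two_proportion_z(120, 1000, 60, 1000)
significant = abs(z) > 1.96                    # 5% level, two-sided
```

With these invented counts the difference comes out as clearly significant; in practice one would also correct for testing many topics at once.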

The second example was about my collaboration with the Securing Greater Accessibility Initiative of the Open University. The OU has more students with disabilities than the entire student body of many Universities (currently about 25,000 students). The analysis showed that students who declared a disability talk about many aspects of the student experience, but also about disability-specific topics such as our DAISY books, screen reader technology, or comb-bound books. Details of this study can be found in our contribution to the International Web for All conference, which was also a best paper candidate.

You can download the slides for my presentation here: https://www.enhancementthemes.ac.uk/en/current-enhancement-theme/student-engagement-and-demographics/national-student-survey-(nss)-analysis. It has many pointers to the reports and publications that I produced from this line of my research.

A big shout-out to my colleagues Heather Gibson and Shona Littlejohn from the Open University in Scotland, who made all of this possible. It is a fantastic feeling to be part of this four-nation OU family. Heather spotted this great opportunity to further the discourse from our previous presentation at the QAA event in Edinburgh, which looked more at the underlying technology.


Clow, D., Coughlan, T., Cross, S., Edwards, C., Gaved, M., Herodotou, C., … Ullmann, T. (2019). Scholarly insight Winter 2019: a Data wrangler perspective (Scholarly Insight Series) (pp. 1–47). Milton Keynes: Institute of Educational Technology, The Open University. Retrieved from http://oro.open.ac.uk/59646/1/DW_Scholarly_Insight_Report_Winter_4oro.pdf
Coughlan, T., Ullmann, T. D., & Lister, K. (2017). Understanding Accessibility as a Process through the Analysis of Feedback from Disabled Students. In W4A’17 International Web for All Conference. New York, USA: ACM. Retrieved from http://oro.open.ac.uk/48991/
Richardson, J. T. E. (2005). Instruments for obtaining student feedback: a review of the literature. Assessment & Evaluation in Higher Education, 30(4), 387–415. https://doi.org/10.1080/02602930500099193

Published: Automated Analysis of Reflection in Writing

The International Journal of Artificial Intelligence in Education has just published my paper on the Automated Analysis of Reflection in Writing. It is currently freely available from Springer. Find it here: https://doi.org/10.1007/s40593-019-00174-2

The paper brings together much of the research in this new area, which seeks to understand whether automated methods can be used to analyse writings (e.g. student essays) with regard to reflective thinking.

Reflective practice is a common educational practice at most Universities. Writing down your thoughts can help you to deepen your reflective thought process about your experience. Everyone thinks reflectively, but not everyone has this skill fully developed. The good news is that reflective thinking can be taught and a common method for this is reflective writing.

There are many models of reflective thinking. Teachers and researchers use these models (or assessment rubrics, frameworks, or coding schemes) to analyse and assess the writings of students. The paper includes an overview of the many models that have been used to manually analyse reflective writings.

The manual analysis tends to take time, and this is a barrier we need to overcome. It delays the feedback we give to our students about their writing. Students may feel uncomfortable with others reading their reflective thoughts. And it makes large-scale research on reflective writings costly. Automated methods, on the other hand, are immediate, non-judgemental, and operate at scale.

So, automated methods have some benefits, but are they any good? After all, reflective thinking is quite complex and may be hard to detect. The manual rating of such essays is not easy so why would a machine be good at it?

For this paper, I used machine learning and evaluated its performance on previously rated texts. I have written about other automated methods, such as the dictionary-based approach and the rule-based approach; the paper has a nice overview of these too.
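To give a flavour of the dictionary-based approach, here is a minimal sketch. The cue phrases below are invented for illustration; real reflection dictionaries are derived empirically from annotated writings, not hand-picked like this.

```python
# Invented cue phrases for illustration only; empirically derived
# dictionaries are far larger and validated against annotated data.
REFLECTION_CUES = {
    "i felt", "i realised", "i learned", "in hindsight",
    "next time", "i wonder", "made me think",
}

def is_reflective(sentence, threshold=1):
    """Flag a sentence as reflective if it contains enough cue phrases."""
    text = sentence.lower()
    return sum(cue in text for cue in REFLECTION_CUES) >= threshold

print(is_reflective("In hindsight, I realised my plan was flawed."))  # True
print(is_reflective("The module covers statistics."))                 # False
```

A machine learning approach replaces the fixed cue list with features learned from previously rated sentences, which is what the paper evaluates.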

One of the tricky bits was to define what constitutes reflective thinking in writings. As outlined above, there are many different models of reflective writing, and I cite many of them in my paper. A close look at all those models showed that many have similarities, often masked behind technical terms. The common constituents of those models formed the model that I put to the test with machine learning. It is a complex model of reflective thinking formed from the most relevant categories of reflective writing. Have a look at the paper to read more about the model for reflection detection.

Surprisingly, the evaluation of the model for reflection detection on thousands of sentences showed that machine learning actually does a good job on this complex construct.

In the paper, I give a sense of how good the machine learning results are and also outline where we should put research funding to enable more research into this important area.

PhD Studentships in the Faculty of Wellbeing, Education and Language Studies

My institute – the Institute of Educational Technology – and the other Schools within the Faculty of Wellbeing, Education and Language Studies (WELS) are offering full-time funded PhD studentships for an October 2019 start.

The PhD programme is located in an environment that supports world-leading quality research. In the latest Research Excellence Framework (REF 2014) the UK government endorsed our research as overwhelmingly world-leading and internationally excellent, with significant and wide-reaching impact. Research takes place across the three Schools of WELS and the Institute of Educational Technology.

Research is conducted in the following areas which adopt innovative methodological approaches:

  • Educational Technology
  • Children, Young People & Families
  • Health, Wellbeing and Social care
  • Language and Literacies
  • Transformative Education

What unites our research approach is a strong commitment to interdisciplinarity and a social justice ethos. Our research enables communities and individuals to thrive in an increasingly demanding and unequal world. We have the flexibility to support truly interdisciplinary and innovative research that makes a difference to people’s lives. We strongly encourage you to look at our research web pages http://wels.open.ac.uk/research and https://iet.open.ac.uk/research to see that we provide the right combination of substantive expertise and methodological experience to support your doctoral research proposal.

Our large, international group of PhD students are an essential part of our research community. If you feel you have the drive and intellectual curiosity to pursue postgraduate research as part of that community and you have a great idea for a doctoral study, then we want to hear from you! We welcome proposals for interdisciplinary research within or across all of the areas outlined above.

Funding is available for UK, EU and international students. Fully funded PhD studentships include fees and maintenance for three years, subject to satisfactory progress. The anticipated stipend for 2019-20 is £14,777.

Closing date: Monday 4th March 2019.
Interviews will commence late March – April 2019.

For detailed information and how to apply for the studentships go to www3.open.ac.uk/employment, or e-mail wels-student-enquiries@open.ac.uk

We promote diversity in employment and welcome applications from all sections of the community.

PhD grants: Technology enhanced learning and learning analytics

Apply for one of the Grand Union Innovation in Learning Doctoral Training Partnerships PhD Studentships with the Institute of Educational Technology and the Faculty of Wellbeing, Education, and Language Studies at The Open University in the UK.

This programme is dedicated to supporting research into innovative approaches to learning, teaching and assessment. Its focus is on the use of digital technology to strengthen openness, inclusion and wellbeing, particularly where these cross the boundary between formal and informal learning.

The programme welcomes and encourages interdisciplinary research in areas as diverse as openness and digital inclusion, the uses of learning analytics, MOOCs and the educational mainstream, open educational practices, and the uses of social media and mobile devices for learning.

Applications are open until 13 December 2019. Details are here: https://www.jobs.ac.uk/job/BOD277/the-grand-union-innovation-in-learning-doctoral-training-partnerships-phd-studentships and here: http://www.open.ac.uk/about/employment/vacancies/grand-union-excellence-and-innovation-social-science-research-training-11828

Using Qualitative Data for Driving Decision-Making at Scale QAA Event

On behalf of the QAA in Scotland, the University of Edinburgh invited all Universities of Scotland to present and provide input to the ‘Using Qualitative Data for Driving Decision-Making at Scale’ event on the 28th of November 2018 (https://www.ed.ac.uk/academic-services/quality/enhancement-themes-overview/evidence-based-enhancement).

Professor Tina Harrison, the Assistant Principal Academic Standards and Quality Assurance of the University of Edinburgh, opened the well-attended event. The event was structured around four presentations, which served as triggers for discussion and to bootstrap the activities planned by the event coordinator Gillian Mackintosh.

Following the invitation of the Open University in Scotland, I presented ‘Understanding student experience comments at scale: Insights from an exploratory study’. I showed how automated text analytics methods can mimic the typical tasks of manual content analysis to make sense of student comments. Based on an exploratory study, I showed how keywords can be used to find topics in student comments, and how these topics can be further refined and converted into a dictionary that can be used to annotate student comments with topics. I then showed how these annotations can be enriched with sentiment analysis, and how all this information can be used to empirically determine significant shifts in the experience of students over time. In my conclusions, I tried to raise awareness of the drawbacks of the presented methods, but also of their benefits. The presentation is based on the recently published Scholarly Insight Report Autumn 2018.
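The pipeline described above can be sketched in a few lines. The topic dictionary and sentiment lexicon below are invented stand-ins; in the study both were refined from empirically determined keywords.

```python
from collections import Counter

# Hypothetical topic dictionary and sentiment lexicon for illustration;
# the real ones were derived empirically from the student comments.
TOPICS = {
    "tutorials": {"tutorial", "tutor"},
    "workload": {"workload", "deadline", "time"},
}
POSITIVE = {"helpful", "great", "enjoyed"}
NEGATIVE = {"confusing", "stressful", "late"}

def annotate(comment):
    """Tag a comment with matching topics and a crude sentiment score."""
    words = set(comment.lower().split())
    topics = [t for t, keys in TOPICS.items() if words & keys]
    sentiment = len(words & POSITIVE) - len(words & NEGATIVE)
    return topics, sentiment

comments = [
    "The tutorial was helpful and I enjoyed it",
    "The workload was stressful and the deadline confusing",
]
# Aggregate topic frequencies; comparing these counts across time
# periods is what reveals shifts in the student experience.
topic_counts = Counter(t for c in comments for t in annotate(c)[0])
```

Tracking `topic_counts` and the per-topic sentiment per survey period is the basis for spotting significant shifts over time.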

This is the full abstract of my presentation:

Each year, students contribute tens of thousands of comments about their student experience via the Student Experience on a Module Survey (SEaM survey) of the Open University (OU). A challenge remains as to how best to utilize this data effectively for understanding module performance and planning module revisions. This presentation reports on an exploratory study that took a big data perspective, analysing tens of thousands of comments. It uses automated empirical text analysis methods to detect hot topics students talk about during an academic year, and it evaluates the sentiment that students express towards these topics. The presentation shows results from a recent OU scholarly insight report and related works.

I believe my presentation struck a nerve: from the feedback of participants who approached me, it became evident that many Scottish Universities are interested in these automated methods, as they would help overcome the barriers of time-consuming manual content analysis.

Other presentations included one by Jill Mackay, which emphasised the importance of method triangulation for the evaluation of the lecture recording system at the University of Edinburgh, and one by Paula Webster, Head of Student Data and Surveys at the University of Edinburgh, about what works and what does not when using qualitative data from student surveys for enhancement.

The presentation by Stef Black from sparqs (student partnerships in quality Scotland, https://www.sparqs.ac.uk/) asked the participants to directly provide steering on two high-profile projects: one about student voice, especially NSS question 25 (‘Is it clear how students’ feedback on the course has been acted on’), and the other about student-led teaching awards (SLTA).

After the main block of presentations, the facilitator ran an activity to directly gather input for the QAA on what works in the qualitative analysis of student data, what the challenges are, and where we need to be in the future. The output of this activity will feed directly into the QAA Scotland strategic project ‘Evidence for Enhancement: Improving the Student Experience’ (http://www.enhancementthemes.ac.uk/current-enhancement-theme).


Ullmann, T. D., Lay, S., Cross, S., Gaved, M., Jones, E., Hidalgo, R., … Rienties, B. (2018). Scholarly insight Spring 2018: a Data wrangler perspective (Scholarly Insight Series). Milton Keynes: Institute of Educational Technology, The Open University. Retrieved from http://oro.open.ac.uk/56732/
Ullmann, T. D. (2015). Keywords of written reflection - a comparison between reflective and descriptive datasets. In Proceedings of the 5th Workshop on Awareness and Reflection in Technology Enhanced Learning (Vol. 1465, pp. 83–96). Toledo, Spain: CEUR-WS.org. Retrieved from http://ceur-ws.org/Vol-1465/paper8.pdf
Coughlan, T., Ullmann, T. D., & Lister, K. (2017). Understanding Accessibility as a Process through the Analysis of Feedback from Disabled Students. In W4A’17 International Web for All Conference. New York, USA: ACM. Retrieved from http://oro.open.ac.uk/48991/
Ullmann, T. D. (2017). Reflective Writing Analytics: Empirically Determined Keywords of Written Reflection. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 163–167). New York, NY, USA: ACM. https://doi.org/10.1145/3027385.3027394

7th Workshop on Awareness and Reflection in Technology Enhanced Learning

Accepted papers

Published at CEUR: http://ceur-ws.org/Vol-1997/

  • Alicja Piotrkowicz, Vania Dimitrova, Tamsin Treasure-Jones, Alisdair Smithies, Pat Harkin, Jane Kirby and Trudie Roberts: Quantified Self Analytics Tools for Self-regulated Learning with myPAL
  • Angela Fessl, Viktoria Pammer, Michael Wiese and Stefan Thalmann: Improving Search Strategies of Auditors – A Focus Group on Reflection Interventions
  • Svenja Neitzel, Christoph Rensing and Henrik Bellhäuser: Concept, Design and First Evaluation of a Mobile Learning Diary Application with Access to a Learning Record Store
  • Milos Kravcik, Carsten Ullrich and Christoph Igel: Supporting Awareness and Reflection in Companies to Move towards Industry 4.0
  • Mathieu D’Aquin, Alessandro Adamou, Stefan Dietze, Besnik Fetahu, Ujwal Gadiraju, Ilire Hasani-Mavriqi, Peter Holtz, Joachim Kimmerle, Dominik Kowald, Elisabeth Lex, Susana Lopez Sola, Ricardo Maturana, Vedran Sabol, Pinelopi Troullinou and Eduardo Veas: AFEL: Towards Measuring Online Activities Contributions to Self-directed Learning
  • Francesca Dagnino, Francesca Pozzi, Donatella Persico, Flavio Manganello and Andrea Ceregini: Supporting teachers’ self-reflection and professional development with gamification
  • Tom Broos, Laurie Peeters, Katrien Verbert, Carolien Van Soom, Greet Langie and Tinne De Laet: Dashboard for Actionable Feedback on Learning Skills: How Learner Profile Affects Use

Call for papers

The 7th Workshop on Awareness and Reflection in Technology Enhanced Learning (ARTEL 2017) will be held in the context of the EC-TEL 2017, Tallinn, Estonia: 12 September 2017.
Workshop webpage: http://teleurope.eu/artel17
Twitter hashtag: #artel17



Awareness and reflection are viewed differently across the disciplines informing Technology-Enhanced Learning (CSCW, psychology, educational sciences, computer science and others). The ARTEL workshop series brings together researchers and professionals from different backgrounds to provide a forum for discussing the multi-faceted area of awareness and reflection. 2017 will be the 7th workshop in the series.


Through the last ARTEL workshops at EC-TEL (2011-2016), the topic has gained maturity, and the questions addressed are converging towards the usage of awareness and reflection in practice, its implementation in modern organisations, its impact on learners, and questions of feasibility and sustainability for awareness and reflection in education and work. To reflect the growing maturity of research in ARTEL over the years, in conjunction with the latest trends in TEL, this year’s topic particularly invites contributions that deal with moving from awareness and reflection to action. Changing individual behaviour and collaborative practice is very challenging, and we invite research that particularly deals with technology’s role in helping users take this step.


The workshop will include a paper session, a demo and prototype slam as well as an interactive session. The workshop aims at:

  1. Providing a forum for presenting and discussing research on awareness and reflection in TEL.
  2. Creating an interactive experience that connects participants’ research, current tools or latest prototypes and models with real end users’ learning experiences and requirements regarding reflection technology.
  3. Creating an agenda for future ARTEL research and development.


Proceedings of the predecessor workshops are available via http://ceur-ws.org/Vol-790/ (2011), http://ceur-ws.org/Vol-931/ (2012), http://ceur-ws.org/Vol-1103/ (2013), http://ceur-ws.org/Vol-1238/ (2014), http://ceur-ws.org/Vol-1465/ (2015), and http://ceur-ws.org/Vol-1736/ (2016).
Topics of interest

Reflective learning is a mechanism to turn experience into learning. Boud, Keogh & Walker (1985) phrase this as “those intellectual and affective activities in which individuals […] explore their experiences in order to lead to new understandings and appreciations”. As a mechanism that is suitable for self-directed learning, reflective learning has been found to be critical for success at work (Eraut, 2004; Knipfer et al., 2012). Reflection might be seen as a cognitive process as well as a social, collaborative process of learning (Prilla, Balzert & Pammer, 2012). Technology has the potential to support breakdown in the sense of creating awareness of a potential reflection opportunity, to support inquiry into the object of reflection, and to support transformation in the sense of creating new knowledge, i.e. meaning (Baumer, 2015). Concretely, technology can support, e.g., data collection and analysis, sensemaking, discussion, or the reflection process itself.

Furthermore, the previous six ARTEL workshops show a wide variety of challenges and solution ideas when designing technology for reflective learning. Additionally, the IJTEL Special Issue on Awareness and Reflection in Technology Enhanced Learning captures various theoretical and empirical works that deal with aspects such as reflection guidance, reflection in blended learning, reflective learning processes in the workplace, reflection interventions in online courses, digital storytelling for developing reflection and digital skills in professional education, system architecture discussions, visualisations for awareness raising, and the impact of awareness and reflection tools in a virtual laboratory setting.

Considering the multitude of views on awareness and reflection distributed over a wide range of disciplines (CSCW, psychology, educational sciences, computer science…) the workshop’s general theme is encapsulated in the following questions:
  • How can awareness and reflection support learning in different settings (work, education, continuing professional development, lifelong learning, etc)?
  • What are the role(s) that technology can play in these contexts?
For ARTEL 2017 we particularly invite contributions that address the theme of moving from awareness and reflection to action. To answer the above and related questions, we are looking for contributions that address the following aspects:
  • Theoretical discussion of awareness and reflection in TEL and related concepts (e.g., collaborative learning, creativity techniques, experiential learning, etc.)
  • Methodologies to identify, study and analyse awareness and reflection in the context of (technology-enhanced) learning (quantitative and qualitative methods, learning analytics, visualisations etc.)
  • Empirical studies about technology support for awareness and reflection
  • Technology (design, application, evaluation) supporting awareness and reflection
  • Designing awareness and reflection in TEL applications and processes
  • Using awareness and reflection support to enhance the learning experience
  • Awareness of social context, knowledge, artefacts and processes
  • Awareness and reflection in specific contexts, such as higher education, work-integrated learning, learning networks, etc.
  • Challenges and solution ideas to help users move from awareness and reflection to action, i.e. to changing individual behaviour and collaborative practice


  • Full papers: Description of novel theoretical, empirical or development work on awareness and reflection in TEL, including a substantial contribution to the field (up to 15 pages).
  • Work in progress: Ongoing research and current approaches on investigating the field, with initial insights for the community (up to 7 pages).
  • Demos: Prototypes, design studies and tools for the support of awareness and reflection in TEL, which can be demoed and discussed (up to 3 pages).

All contributions will be peer reviewed by at least two members of the programme committee evaluating their originality, significance, and rigour. The papers will be published in the CEUR workshop proceedings. Submissions should use the Springer LNCS template. Please submit your paper via EasyChair: https://easychair.org/conferences/?conf=artel2017

Important dates
30.06.2017 Submission Deadline (extended from 22.06.2017)

15.07.2017 Notification of Acceptance
30.08.2017 Camera-Ready Papers
12.09.2017 Workshop
31.10.2017 Publication of Workshop Proceedings

ARTEL format
ARTEL 2017 is targeted at research and development on awareness and reflection in TEL across disciplines (CSCW, psychology, educational science, computer science) and across European TEL projects. The target audience of ARTEL 2017 are researchers and practitioners in the field of TEL. The workshop will include a paper session, a demo and prototype slam as well as an interactive session. The workshop will last for a full day (around 7 hours of working time) and will consist of four main parts of 1.5h-2h each, bookended with introductions and summaries. Besides the discussion of novel research and work in progress, this year’s ARTEL edition will contain a session of demoing and discussing tools, especially prototypes and cutting-edge development, in which researchers and practitioners can discuss their ideas for the support of awareness and reflection in learning.

The workshop format will be a mixture of paper and demo presentations, and a discussion session in which we will link theory and existing prototypes (both from the involved projects and from the presented papers and demos) to practical needs in educational and professional settings (research agenda). This link is expected to be of value to both practitioners (in that research insights become more tangible), and to researchers (in that insights and research prototypes become grounded in practice). There will be an overall narrative trajectory throughout the day moving from theory at the beginning of the workshop to pragmatic implementations at the close.

Organisers
Milos Kravcik, DFKI GmbH, Germany

Alexander Mikroyannidis, The Open University, United Kingdom
Viktoria Pammer, Graz University of Technology and Know-Center, Austria
Michael Prilla, Clausthal University of Technology, Germany

Programme Committee (tentative)
Sven Charleer, KU Leuven, Belgium
Philippe Dessus, Université Pierre-Mendès-France, France
Ines Di Loreto, Université de Technologie de Troyes (UTT), France
Eva Durall, Aalto University, Finland
Angela Fessl, Know-Center, Austria
Denis Gillet, École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
Michael Kickmeier-Rust, Graz University of Technology, Austria
Effie Law, University of Leicester, UK
Elvira Popescu, University of Craiova, Romania
Carsten Ullrich, DFKI GmbH, Germany

Dominique Verpoorten, University of Liège, Belgium
Riina Vuorikari, Institute for Prospective Technological Studies (IPTS), Spain

Supporting projects

Employ ID – Scalable & cost-effective facilitation of professional identity transformation in public employment services: http://employid.eu

SlideWiki – Collaborative OpenCourseWare Authoring: https://slidewiki.eu

ADAPTION – Migration to Cyber-Physical Production Systems: http://www.adaption-projekt.de

Social platform
To stay tuned, consider joining the ARTEL social space on TELeurope: http://teleurope.eu/artel
It is easy to join. First register an account on http://teleurope.eu and then join the group on Awareness and Reflection in Technology-Enhanced Learning.

The workshop will take place as part of the 12th European Conference on Technology Enhanced Learning (EC-TEL), which will be held in Tallinn, Estonia, September 12-15, 2017. The overall theme of EC-TEL 2017 is Data Driven Approaches in Digital Education, focusing on the new possibilities and challenges brought by the digital transformation of education systems. For more information visit http://www.ec-tel.eu


Research Evidence on the Use of Learning Analytics: Implications for Education Policy

The EU published our report on ‘Research Evidence on the Use of Learning Analytics: Implications for Education Policy’: https://ec.europa.eu/jrc/en/publication/eur-scientific-and-technical-research-reports/research-evidence-use-learning-analytics-implications-education-policy

From the abstract: Learning analytics is an emergent field of research that is growing fast. It takes advantage of the last decade of e-learning implementations in education and training, as well as of research and development work in areas such as educational data mining, web analytics and statistics. In recent years, increasing numbers of digital tools for the education and training sectors have included learning analytics to some extent, and these tools are now in the early stages of adoption. This report reviews early uptake in the field, presenting five case studies and an inventory of tools, policies and practices. It also provides an Action List for policymakers, practitioners, researchers and industry members to guide work in Europe.

How to cite: Ferguson, R., Brasher, A., Clow, D., Cooper, A., Hillaire, G., Mittelmeier, J., Rienties, B., Ullmann, T., Vuorikari, R. (2016). Research Evidence on the Use of Learning Analytics – Implications for Education Policy. R. Vuorikari, J. Castaño Muñoz (Eds.). Joint Research Centre Science for Policy Report; EUR 28294 EN; doi:10.2791/955210.