This article is based on the paper “Learning Analytics Dashboards to Support Adviser-Student Dialogue”, published in IEEE Transactions on Learning Technologies in June 2017. I recently presented this work at the Dataviz Belgium Spring Meetup 2019 in Mechelen, Belgium. It was originally published on Medium, but deserves a spot on my own blog…
During my PhD I mainly focused on Learning Dashboards. Most of the research on these dashboards focuses on teachers and researchers as users, but my goal was to empower students by giving them insights into their own learning process through data visualisation. I designed and evaluated a number of dashboards, but one stood out: LISSA, or “Learning dashboard for Insights and Support during Study Advice”. LISSA supports the conversation between study advisers and students, and has been deployed across campuses of KU Leuven.
We involved 17 study advisers in our study. These study advisers are responsible for both study advice and content-related support for first-year students in a particular program. They are experts in the content of the first-year courses, the current organisation of the program, and the regulations, both program-specific and university-wide.
The study advisers helped us gain insight into how they work and what their requirements were during these advising sessions. These sessions are private conversations with a student (occasionally with parents) taking place in an office environment. The students typically do not have a flawless study career: they have trouble studying, would benefit from a personalised program plan, did not achieve enough credits during the year, or simply wish to re-orient towards a new program.
The study advisers have multiple tools and websites at their disposal. But combining and interpreting these channels of information for each specific student requires effort and time, and is error-prone. In addition, the data is often incomplete.
Using a user-centred, rapid-prototyping design approach, we started off with Sketch for our initial designs and D3.js / Meteor to create our final interactive prototype. Here are two screenshots of the final design (in the meantime, current PhD students have redesigned the interface):
LISSA provides an overview of every key moment in chronological order up until the period in which the advising sessions are held: the grades of the positioning test (a type of entry-exam without consequence), mid-term tests, January exams, and June exams. A general trend of performance is visualised at the top: the student path consists of histograms showing the position of the student among their peers per key moment.
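The per-key-moment positioning described above boils down to binning peer grades and marking the bin the student falls into. A minimal sketch of that computation (the bin width and function name are my own illustrative choices, not the dashboard's actual code):

```python
def histogram_position(peer_grades, student_grade, n_bins=10):
    """Bin peer grades (out of 20) into n_bins equal-width bins and
    return (counts, student_bin), so the student's own bin can be
    highlighted (black on the dashboard)."""
    width = 20 / n_bins
    counts = [0] * n_bins
    for grade in peer_grades:
        # Clamp the top grade (20) into the last bin.
        counts[min(int(grade / width), n_bins - 1)] += 1
    student_bin = min(int(student_grade / width), n_bins - 1)
    return counts, student_bin
```

The returned counts drive one histogram; repeating this per key moment yields the student path.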
Every course is represented by its name and grade (out of 20). A green, orange, and red colour coding represents successful exams, tolerable grades (students can request to pass a limited number of 8–9/20 grades) and failed courses. The course is accompanied by a histogram visualising the performance of peers and the position of the student among them (black highlight).
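The colour coding follows directly from the grading rules above: pass at 10/20 or higher, tolerable at 8–9/20, failed below that. As a small sketch (the function name is mine; the thresholds are as described above):

```python
def grade_category(grade):
    """Map a grade out of 20 to the dashboard's colour coding:
    green  = passed (10/20 or higher)
    orange = tolerable (8-9/20; a limited number can be converted to a pass)
    red    = failed (below 8/20)
    """
    if grade >= 10:
        return "green"
    if grade >= 8:
        return "orange"
    return "red"
```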
For June sessions, it is important to plan the re-sits in September. Too few exams result in a credit-threshold issue, while too many will most likely result in failure. Check-boxes next to the failed exams let the adviser and student select courses to plan re-sits for. The “re-sit exam success rate graph” uses historical data to show how many past students passed that number of re-sit exams.
The stacked prediction bar shows historical data of students with a profile similar to the current student (based on the number of exams passed or failed): the distribution of the duration of the bachelor program (three, four, or five years, or drop-out / “NIET”) among past students with similar September re-sits.
To visualise the key moments, data regarding student grades is required. This includes the grades of all first-year students of the current year, to populate the courses, the course histograms, and the student path. All grades for the January, June, and September periods are available in the KU Leuven data warehouse.
The stacked prediction bar is based on the first-year grades of students from previous years. This provides the data needed to show whether similar students finished their bachelor degree in three, four, five, or more years.
We created a data-processing pipeline using Python scripts to convert the different files and formats into a simple representation that is imported into a MongoDB database.
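A minimal sketch of such a conversion step, assuming a CSV grade export (the field names and CSV layout here are hypothetical; the real exports use KU Leuven's data-warehouse formats):

```python
import csv
import io

def rows_to_documents(csv_text, key_moment):
    """Convert one grade export (CSV) into simple per-student documents
    ready for import into MongoDB."""
    docs = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        docs.append({
            "student_id": row["student_id"],
            "course": row["course"],
            "key_moment": key_moment,    # e.g. "january", "june"
            "grade": int(row["grade"]),  # grade out of 20
        })
    return docs

# With pymongo installed, the documents could then be imported with e.g.:
#   MongoClient().lissa.grades.insert_many(docs)
```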
The role of the Learning Analytics data
LA dashboards are often developed for specific institutions with certain data requirements. The Learning Analytics data necessary to deploy LISSA is very basic: grades of students across key moments and data regarding student success (derived from historical grade data). This data is usually available in most higher education institutions, but access is limited to staff. Yet we have shown that this data, placed in a student-advising context, can help support students, provide insight into their progress, and help plan their future.
LISSA is based on factual data. The exam success rate and bachelor duration views show what has happened historically and provide no calculated estimations. This reliable way of visualising the data reassures both study advisers and students about the advice they are giving and receiving.
Personal background data regarding socio-economic status, parents’ education, gender, and high school achievement can provide further insights and help the study adviser understand the student’s situation better. However, this unalterable data does not provide the students with actionable insights. It is therefore important to investigate how to integrate this data in an ethical manner.
The role of the Study Adviser
LISSA facilitates insights at multiple levels, but these insights benefit from guidance by the study adviser. Even though the data is objective, there is still a need for critical and reflective interpretation by domain experts. Overconfident students might interpret an overall negative result as a surmountable problem, whereas the study adviser could advise and plan a more achievable program, preventing the student from wasting years on incorrect choices. LISSA can portray a student in a negative way, while a discussion with the student might reveal problems that are easily resolved, e.g. a change in study method, a new program, or a change in attitude. Without the study adviser’s guidance, such students might choose not to continue their bachelor program.
LISSA still leaves room for personal opinions and tacit experience, which continue to play an important role during advising sessions by allowing study advisers to, for example, emphasise certain results to steer students onto the right path. Many external factors, such as information gathered through discussion and previous experiences with students, influence the decision to deviate from the factual data or interpret it differently.
During the semi-structured interviews and workshops, ethical issues arose regarding confronting the students with the data.
Some study advisers did not show LISSA to students with a very high number of failed courses. While some students might benefit from an “eye-opener”, study advisers prefer to use LISSA as a motivational tool.
An important role of LISSA is the ability to position a student among peers. In general, the use of histograms was considered very useful, and positioning had positive effects such as motivating a student when a bad grade is still relatively good among peers, or when a course’s failure rate is very high. Some study advisers worried about demotivation: a student at the low end of the histogram might see succeeding in the course as an unachievable goal.
There is, however, a demand from students to gain access to LISSA outside the sessions. But study advisers fear that the visualised data can be greatly misinterpreted without their guidance: students with fear of failure, or over-confident students, might interpret the data incorrectly. Parents can play a negative role by either pushing too hard or interpreting mediocre results as insurmountable. Parents without a degree may lack knowledge about higher education, which might stop a student from pursuing an achievable degree. These problems might result in wrong decisions regarding exam and study-career planning. Study advisers do see potential in providing reduced information, but what this reduction entails must be further discussed and evaluated.