
ETEC565A - Assignment 1: Analysis of the Brightspace Student Success System

Submitted By: Marc Tavares


Date: November 10, 2019

The learning analytics tool I have chosen for analysis is the Student Success System (S3) by
Brightspace, which uses learning analytics to predict student success and identify at-risk
students. I chose this tool for its relevance to my professional context, since my institution has
previously discussed the potential of using analytics to help academic advisors identify at-risk
learners. Our use of learning analytics is underdeveloped and further examination of this tool
could strengthen the case for its exploration and potential adoption at our institution,
especially since we are already on the Brightspace LMS.

An evaluation of the Student Success System was conducted using the Evaluation Framework
for Learning Analytics (EFLA):

For Teachers

Data:

The S3’s data collection methods received a high score in this evaluation. Brightspace provides
an instructor guide for the tool that breaks down what data is collected, categorizing it under
five distinct domains:

• Course access: Tracks the frequency of learner logins and course homepage access.
• Content access: Measures engagement by tracking how frequently a learner accesses course
content and modules.
• Social learning: Measures learner engagement in discussion forums.
• Assessments: Displays how well learners have performed on assessments such as quizzes.
• Preparedness: Data collected from the student information system (SIS), such as demographic
data or admission scores.

Brightspace makes it clear why this data is collected: to generate weekly learner success
predictions and visualizations.
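
To make these domains concrete, the minimal Python sketch below models one week of per-learner signals as a single record. The field names are my own illustrative assumptions, not Brightspace’s actual schema.

    from dataclasses import dataclass

    @dataclass
    class WeeklyLearnerRecord:
        """One week of signals for one learner, one field per S3 data domain."""
        student_id: str
        week: int
        course_logins: int         # Course access: logins and homepage visits
        content_views: int         # Content access: topic and module views
        discussion_posts: int      # Social learning: forum participation
        assessment_average: float  # Assessments: mean score so far (0-100)
        admission_score: float     # Preparedness: SIS data, e.g., admission score

    # Example record for a hypothetical learner in week 3 of a course.
    record = WeeklyLearnerRecord(
        student_id="s001", week=3, course_logins=5, content_views=22,
        discussion_posts=2, assessment_average=71.5, admission_score=82.0,
    )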

Awareness and Reflection:

Instructors are made aware of their learners’ current and future learning situations (if
behaviour remains unchanged) through several predictive charts and visualizations:

• Success Index: A course timeline chart displays a student’s weekly success index and
academic trends. A risk quadrant chart visually categorizes learners as on-track/not at risk,
under-engagement risk, withdrawal/dropout risk, or academic performance risk (a toy sketch of
this quadrant logic follows the chart descriptions below). A student at-risk widget can also be
added to the home page and is updated weekly to display up to five students deemed at risk.

[Figures: course timeline and risk quadrant charts. Sources: Essa & Ayad, 2012; Brightspace by D2L, 2016]

• Win-loss Chart: Displays a learner’s predicted success level for the week for each domain.
This prediction is based on analyzing correlations between learner performance and behaviours
in past courses, and comparing the behaviour of past learners with current learners.

[Figure: win-loss chart. Source: Essa & Ayad, 2012]

• Social Learning: A visualization of a learner’s social network based on discussion posts.

[Figure: social network visualization. Source: Essa & Ayad, 2012]



• Assessments: Displays a learner’s performance compared to their peers.

[Figure: assessments comparison chart. Source: Brightspace by D2L, 2016]

Brightspace refers to all of these charts as predictive, but it is hard to determine what
predictions the social learning or assessment charts are actually making, as opposed to simply
displaying current performance.
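
To illustrate the quadrant logic referenced above, the toy Python sketch below maps two normalized scores, such as the engagement and performance signals these charts visualize, to the four categories S3 displays. The 0.5 threshold and the scoring itself are assumptions for illustration; S3’s actual model is proprietary.

    def risk_quadrant(engagement: float, predicted_performance: float,
                      threshold: float = 0.5) -> str:
        """Map two normalized scores (0-1) to one of the four quadrant labels."""
        if engagement >= threshold and predicted_performance >= threshold:
            return "on-track / not at risk"
        if engagement < threshold and predicted_performance >= threshold:
            return "under-engagement risk"
        if engagement >= threshold and predicted_performance < threshold:
            return "academic performance risk"
        return "withdrawal/dropout risk"

    # An engaged learner with weak predicted performance lands in the
    # academic performance risk quadrant.
    print(risk_quadrant(engagement=0.8, predicted_performance=0.3))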

Impact:

The S3 tool may prompt teachers to teach more effectively and efficiently based on learner
engagement and interaction data, but it does not provide any specific feedback on the quality
of course content or instruction.

For Learners

Little to no information is available on the tool from a student perspective, including
whether learners can see the same predictions or data about themselves that their instructors
or course administrators do.

Brightspace offers a guide for instructors, but I was unsuccessful in finding a guide for learners.
This resulted in a failing EFLA score from the learner perspective.

Critique

Predictive Modeling & Interventions


Brightspace’s Student Success System is based on predictive models of learner performance
and risk levels. Although S3 is not the only student alert and intervention tool available,
Brightspace suggests its models are far superior. Essa & Ayad (2012) claim that S3’s predictive
modeling solves two key problems with other predictive models: they cannot be generalized
across courses or institutions, and they generate a risk signal devoid of meaningful data for
instructors. On the risk signals front, S3 allows instructors to click on an at-risk student to see
their learning profile and obtain meaningful information about that student, but it still uses
traffic signal colours to indicate a student’s status. The colour red, which usually implies danger,
is used to indicate an at-risk student and could leave an instructor with preconceived notions
about a student (Ballard, 2013; Sclater, 2017).
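
To make the traffic signal convention concrete, the toy mapping below converts a predicted risk probability into a colour label. The cut-offs are my own assumptions for illustration, not S3’s documented thresholds.

    def traffic_light(risk_probability: float) -> str:
        """Map a predicted risk probability (0-1) to a traffic signal colour."""
        if risk_probability >= 0.7:
            return "red"     # flagged as at-risk
        if risk_probability >= 0.4:
            return "yellow"  # potential risk
        return "green"       # on track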

S3’s predictive model is based on data from previous courses and learners, but can be applied
to different courses at different institutions (Essa & Ayad, 2012). This claim particularly stood
out to me considering that different courses can have different learning conditions. Contextual
factors, such as consistency in course LMS features, characteristics of students in the class, and
instructional factors like who is teaching the course, could have an impact on the model
(Gašević, Dawson, Rogers, & Gašević, 2016). Gašević et al. (2016) argue that predictive models
of academic success require careful interpretation and analysis if differences in instructional
contexts are not factored into the model. Slade & Prinsloo (2013) also argue that the context of
data should be evaluated critically before making a prediction, since it is not always
transferable. Essa & Ayad (2012) claim that S3 utilizes “An adaptive framework and a stacked-
generalisation modeling strategy, whereby intelligent data analysis can be applied at all levels
and graciously combined to express higher-level generalisations” (p. 61). Without a data mining
background, it was difficult for me to assess if this modeling technique factors in differences in
learning contexts to make accurate predictions.
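
For readers in a similar position, the sketch below shows generic stacked generalization using scikit-learn: base learners are trained on historical data and a meta-learner combines their predictions. This illustrates the general technique Essa & Ayad name, not S3’s actual implementation, and the features and labels here are synthetic stand-ins.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for engagement/assessment features and pass/fail
    # labels drawn from past course offerings.
    X, y = make_classification(n_samples=500, n_features=8, random_state=0)
    X_past, X_current, y_past, y_current = train_test_split(X, y, random_state=0)

    # Stacked generalization: the base learners' predictions become inputs
    # to a higher-level meta-learner.
    stack = StackingClassifier(
        estimators=[
            ("forest", RandomForestClassifier(random_state=0)),
            ("logit", LogisticRegression(max_iter=1000)),
        ],
        final_estimator=LogisticRegression(),
    )
    stack.fit(X_past, y_past)                    # train on historical learners
    risk = stack.predict_proba(X_current)[:, 1]  # scores for current learners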

Impact of Interventions

Providing instructors with information about at-risk students gives them a chance to intervene
and personally reach out to the student, hopefully resulting in a positive impact on the
student’s behaviour and subsequent performance (Sclater, 2017). This can be helpful given the
positive correlation between student engagement on the LMS and academic achievement
(Macfadyen & Dawson, 2012; Sclater, 2017); however, I could not find any research on the
impact of these interventions, an observation also made by Sclater (2017). Brooks & Thompson
(2017) offer many examples of predictive analytics in practice, but the results of interventions
are not discussed. Evidence of the positive impact of interventions resulting from the Student
Success System could strengthen Brightspace’s claims.
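
As a simple illustration of the kind of correlation cited above, the snippet below computes a Pearson correlation between weekly login counts and final grades. The numbers are invented for illustration only, not real institutional data.

    import numpy as np

    logins = np.array([3, 10, 5, 18, 9, 14])     # hypothetical weekly LMS logins
    grades = np.array([52, 78, 60, 91, 74, 83])  # hypothetical final grades (%)
    r = np.corrcoef(logins, grades)[0, 1]
    print(f"Pearson r = {r:.2f}")  # a positive r echoes the cited correlation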

Lack of Learner Visualization Information


The lack of available information on the learner’s view of their analytics is one of my biggest
concerns regarding the Student Success System. Students should have access to detailed
analytics about themselves (Slade & Prinsloo, 2013), especially if the focus is on contributing
to their learning and development. This resulted in a low For Learners score on the EFLA.

Recommendation

In order to advance, or even just explore, learning analytics and its impact on student success in
my own institutional context, I think it is important to experiment with available tools. Although
S3’s predictive models and visualizations may not be fully accurate, the tool can still be a
valuable resource for our instructors and advisors (Macfadyen & Dawson, 2010) while we
attempt to evaluate its impact on the academic success of our students. I would personally
use the tool to gain a useful snapshot of how students are engaging with the course content
and their classmates, or to notify academic advisors to reach out to students who may be
having difficulty.

References & Consulted Sources

Ballard, C. (2013). Data Visualisation with Predictive Learning Analytics. Retrieved from
https://www.slideshare.net/ChrisBallard/data-visualisation-with-predictive-learning-analytics

Brightspace. (n.d.-a). About Brightspace student success system. Retrieved from
https://documentation.brightspace.com/EN/insights/s3/instructor/about_s3_1.htm?tocpath=Administrators%7CBrightspace%20Performance%20Plus%20Analytics%7CAbout%20Brightspace%20Student%20Success%20System%7C_____0

Brightspace. (n.d.-b). Brightspace student success system. Retrieved from
https://www.d2l.com/products/student-success-system/

Brightspace by D2L. (2016). Brightspace student success system: Instructor guide. Retrieved from
https://community.brightspace.com/servlet/fileField?entityId=ka56100000090DsAAI&field=Attachment__Body__s

Brooks, C., & Thompson, C. (2017). Predictive modelling in teaching and learning. In C. Lang,
G. Siemens, A. F. Wise, & D. Gašević (Eds.), The Handbook of Learning Analytics (1st ed.,
pp. 61-68). Alberta, Canada: Society for Learning Analytics Research (SoLAR).

D2L. (2019, May 29). How to support at risk students [Video file]. Retrieved from
https://www.youtube.com/watch?v=6hbCsh-DUcY

ECAR-ANALYTICS Working Group. (2015). The Predictive Learning Analytics Revolution: Leveraging
Learning Data for Student Success. ECAR working group paper. Louisville, CO: ECAR.

Essa, A., & Ayad, H. (2012). Improving student success using predictive models and data
visualisations. Research in Learning Technology, 20(sup1), 19191. doi:10.3402/rlt.v20i0.19191

Gašević, D., Dawson, S., Rogers, T., & Gašević, D. (2016). Learning analytics should not promote one size
fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher
Education, 28, 68-84. doi:10.1016/j.iheduc.2015.10.002

LACE. (n.d.). Evaluation framework for learning analytics. Retrieved from
http://www.laceproject.eu/evaluation-framework-for-la/

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using
learning analytics: The ethical dilemmas of intervention strategies in a higher education
institution. Educational Technology Research and Development, 64(5), 957-968.
doi:10.1007/s11423-016-9459-0

Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for
educators: A proof of concept. Computers & Education, 54(2), 588-599.
doi:10.1016/j.compedu.2009.09.008

Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough: Why e-learning analytics failed to
inform an institutional strategic plan. Journal of Educational Technology & Society, 15(3), 149-163.

Sclater, N. (2017). Learning Analytics Explained (pp. 79-87). New York, USA: Taylor & Francis.

Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral
Scientist, 57(10), 1510-1529. doi:10.1177/0002764213479366
