Establishing an Ethical Literacy for Learning Analytics

Jenni Swenson
Lake Superior College 2101 Trinity Road Duluth, MN 55811

j.swenson@lsc.edu
ABSTRACT

This paper borrows multiple frameworks from the field of technical communication in order to review the theory, research, practice, and ethics of the Learning Analytics and Knowledge (LAK) discipline. These frameworks also guide a discussion of the ethics of learning analytics "artifacts" (data visualizations, dashboards, and methodology) and the ethical consequences of using learning analytics (classification, social power moves, and absence of voice). Finally, the author suggests a literacy for learning analytics that includes an ethical viewpoint.

Categories and Subject Descriptors

J.1 [Administrative Data Processing]: Education
K.7.4 [The Computing Profession]: Professional Ethics: codes of ethics, ethical dilemmas

General Terms

Design, Human Factors, Legal Aspects

Keywords

Ethics, higher education, learning analytics, literacy

1. INTRODUCTION

Researchers and practitioners from many disciplines (e.g., statistics, behavioral science, cognitive psychology, education, and computer science) have worked to define and establish Learning Analytics and Knowledge (LAK) as a discipline. However, because work within each contributing field tends toward a narrow view, establishing the identity of a new discipline, both holistically and in terms of its purpose, can be difficult and complex. In her paper, Mapping the Research Questions in Technical Communication, Carolyn Rude proposed a method to strengthen disciplinary identity and provide direction for a maturing field of study [11]. Her motivation for such an undertaking was that "the technical communication field lacks the status, legitimacy, and power of mature professions" and that "we cannot be recognized by others if we cannot even recognize ourselves." Her purpose was to define the field of technical communication both internally and externally, to determine what was unique to technical communication, and to identify knowledge gaps.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org. LAK '14, March 24 - 28 2014, Indianapolis, IN, USA Copyright 2014 ACM 978-1-4503-2664-3/14/03… $15.00. http://dx.doi.org/10.1145/2567574.2567613

Applying a modified version of Rude's methodology to the emerging field of LAK can help identify areas of strength as well as provide guidance for future research and activities. While Rude examined four areas of inquiry to unbundle technical communication (disciplinarity, pedagogy, practice, and social change), this paper will review LAK theory, research, practice, and ethics as guided by a modified subset of Rude's questions:

Theory: How shall we know ourselves? What is unique to learning analytics that defines it as a field of study?

Research: What are our definitions, history, status, possible future, and research methods in learning analytics?

Practice: What should be the content of our courses and curriculum to teach students best practices, history, and possibilities?

Ethics: Why is learning analytics important, and how are we maintaining an ethical viewpoint as it relates to learning analytics?

This paper will also discuss the ethics of learning analytics "artifacts," review the consequences of learning analytics use, and suggest an ethical literacy for learning analytics. Why choose a technical communication perspective? Technical communication exists at the intersection of multiple disciplines and facilitates information flow from practitioner to public, a situation and purpose it has in common with the LAK discipline.

2. MATERIALS AND ANALYSIS

LAK conference abstracts (2011-2013) were used as the body of knowledge for this review. Although the abstracts are a limited resource, the motivation for focusing on them is two-fold. First, those outside the discipline, or new to the field, would certainly refer to the conference proceedings as a comprehensive body of knowledge when searching for information on learning analytics. Second, the review is therefore a test of the accuracy and quality of information in the conference abstracts: a way to determine whether LAK conference authors describe their work in a way that provides clarity and focus.
2.1 Theory

To understand which theories LAK conference authors referenced, a search on the terms theory, theoretical, theories, and framework was conducted and produced 30 occurrences across 18 unique abstracts. Many abstracts spoke of "borrowing" theory from another discipline for theoretical guidance. Borrowed theoretical frameworks included cultural, postmodern, identity, ecological system, item response, reflective learning, social networked learning, sociocultural, technology enhanced learning, and learning sciences frameworks. Other abstracts did not use the specific search terms but were clearly trying to define the discipline theoretically. "Framework" was also a common term throughout the abstracts, with 27 mentions across 16 unique abstracts, typically in the sense of borrowing theory to establish a framework for analyzing and discussing learning analytics. From this search, it appears common for the emerging discipline of learning analytics to borrow theories from other disciplines in order to "test the waters."

How shall we know ourselves? What is unique to learning analytics that defines it as a field of study? One abstract, by Tempelaar, Heck, Cuypers, Kooij, and Vrie [16], described its research as building on the previous learning analytics framework of Shum and Crick [13]. It is this type of testing and retesting of methods that allows a discipline to develop and strengthen its own unique disciplinarity and theoretical frameworks, whether borrowed, enhanced, or proposed as new. For now, practitioners and researchers in the field of learning analytics could reduce confusion for newcomers by clearly identifying their research as empirical or theoretical.

2.2 Research

To explore current research topics in the LAK conference abstracts, a search was conducted on the terms research, empirical, and analysis, with the following results:

Research: 83 matches, 46 unique abstracts
Empirical: 12 matches, 10 unique abstracts
Analysis: 78 matches, 50 unique abstracts

Rather than list specific abstracts here, Figure 1 reveals where learning analytics research is engaged: Teaching and Learning (65%), Methodology (27%), Visualizations (5%), and Policy (3%). Figure 2 separates out Communication Analysis (abstracts with discourse, linguistics, or textual analysis) from Teaching and Learning, as Communication Analysis was the largest component of Teaching and Learning.

What are our definitions, history, status, possible future, and research methods in learning analytics? Definitions and history of learning analytics have been well defined. As a discipline, learning analytics is clearly focused on the research of teaching and learning, and specifically on communication analysis within teaching and learning research. The strength of teaching and learning research is not surprising: a quick word count within the LAK abstracts reveals a predominant occurrence of terms such as learning and learners, students, course, environments, online, and support. It is interesting that methodology comprises a smaller portion of the research topics. As the earlier search for the term "theory" determined, the discipline is still looking to other theoretical models to inform its own theory. More research on methodology will develop and strengthen LAK's disciplinarity. One possible focus for future research in learning analytics is discussed in Section 3 of this paper.
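The term searches reported above amount to counting, for each search term, the total number of matches and the number of unique abstracts containing that term. A minimal sketch of such a count is shown below; the sample abstracts and whole-word matching are hypothetical illustrations for this sketch, not data or stated methods from the LAK proceedings.

```python
import re

def term_counts(abstracts, terms):
    """For each term, return (total matches, number of unique abstracts
    containing the term), matching whole words case-insensitively."""
    results = {}
    for term in terms:
        pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
        total = 0
        unique = 0
        for text in abstracts:
            hits = pattern.findall(text)
            if hits:
                unique += 1       # one more abstract contains the term
                total += len(hits)  # all occurrences within this abstract
        results[term] = (total, unique)
    return results

# Hypothetical sample abstracts, for illustration only.
abstracts = [
    "This empirical study presents an analysis of discussion forums.",
    "We propose a theoretical framework to guide future research.",
    "Research methods and research questions are surveyed here.",
]

print(term_counts(abstracts, ["research", "empirical", "analysis"]))
```

Whether counts should be whole-word (so that "research" does not also match "researchers") is a design choice the paper does not specify; this sketch opts for whole-word matching.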
[Figure 1 (pie chart): Teaching and Learning 65% (78 abstracts); Methodology 27% (32 abstracts); Visualizations 5% (6 abstracts); Policy 3% (3 abstracts)]

[Figure 2 (pie chart): Teaching and Learning 45% (54 abstracts); Methodology 27% (32 abstracts); Communication Analysis 20% (24 abstracts); Visualizations 5% (6 abstracts); Policy 3% (3 abstracts)]

Figure 2. Detailed Topics of Research at LAK

2.3 Practice

Tools and methods of practice define a discipline's pedagogy. To better understand LAK practice, the search terms pedagogy and pedagogical were used, revealing a variety of abstracts related to teaching methods and tools. There were two types of abstracts: those that sought to teach instructors how to use learning analytics tools in the classroom (SNAPP, AAT, and Sherpa, to name a few), and those that offered some level of theoretical pedagogy. The LAK conference abstracts are a robust source of cutting-edge pedagogical practices, but the depth and breadth of pedagogy beyond the conference abstracts cannot go without mention. LAK pedagogy has become firmly established through the innovative Massive Open Online Courses (begun in 2011) and the subsequent educational opportunities provided by the Society for Learning Analytics Research (SoLAR), such as the Learning Analytics Summer Institute and the distributed research lab PhD training [see: http://www.solaresearch.org].

What should be the content of our courses and curriculum to teach students best practices, history, and possibilities? From tools and methodology to assessment and principles of use, leaders in the LAK discipline are continually and clearly defining and redefining pedagogy for the discipline, not only in terms of what instructors need to know, but also in terms of the content and outcomes of a learning analytics classroom.

2.4 Ethics

For the purpose of this paper, ethics is defined as ethical action: guiding learning analytics by the philosophy of use, the motivation for use, and working towards a desire to "do better" by the institution, the instructor, and the discipline as a whole. A search for the terms ethics and ethical revealed four abstracts related to the ethical use of learning analytics, from ethical issues for the institution, to ethical design, to ethics as taught or learned in the classroom [3, 5, 10, 15].
These four abstracts are certainly a starting point for discussing ethics in learning analytics, but they are no more than a starting point. Why is learning analytics important, and how are we maintaining an ethical viewpoint as it relates to learning analytics? Disrupting and transforming education are key goals of learning analytics. At a time when our education models are described as failing, this alone elevates the importance of learning analytics. However, conversations about maintaining an ethical viewpoint within the discipline have been slow to develop. To begin that conversation, the rest of this paper will discuss the ethics of learning analytics "artifacts," review the consequences of using learning analytics, and propose an ethical literacy for LAK.

Figure 1. Combined Topics of Research at LAK

3. ARTIFACTS OF LAK

For those in the field of technical communication, artifacts are products of a genre and are grouped by content similarities or context relatedness in order to study rhetorical action. Two tangible artifacts of learning analytics are data visualizations and interactive data dashboards used by faculty and students. A more esoteric artifact is the process of learning analytics itself (gather, predict, act, measure, and refine). To better understand how LAK practices mediate knowledge, values, and action, the question becomes: Who generates these artifacts, how, and for what purpose, and are these artifacts produced and presented ethically? Before tackling this question, a brief description of the types of learning analytics is necessary.

In their paper, Social Learning Analytics: Five Approaches, Shum and Ferguson [14] identify learning analytics as using dynamic and behavioral data in the classroom (e.g., attendance, grades, quantified participation). They view learning analytics as a social process that builds on knowledge created through cultural settings, and they describe five categories of learning analytics:

Social Network (analyzes relationships)
Discourse (analyzes language)
Content (analyzes user-generated content)
Disposition (analyzes motivation)
Context (considers formal and informal learning)

Anyone with access to the Internet and open source software can create data visualizations for social network analytics, content analytics, and discourse analytics where the data are not protected within a Learning Management System. Disposition analytics can be self-reported or gathered through a more formal survey and, in each case, faculty and/or administration would have access to the data. Currently, context analytics are less commonly used (but gaining in popularity), and this highly personal data may be available publicly through mobile technology and applications.
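The five-stage process named above (gather, predict, act, measure, and refine) can be read as a feedback loop. The sketch below is a hypothetical illustration of that loop; the record fields, the score threshold, and the intervention are invented for this example and do not come from the paper or any real system.

```python
def gather(records):
    # Gather: keep only records with usable behavioral data.
    return [r for r in records if r.get("score") is not None]

def predict(data, threshold):
    # Predict: label each student relative to a (hypothetical) score threshold.
    return {r["student"]: ("at-risk" if r["score"] < threshold else "on-track")
            for r in data}

def act(predictions):
    # Act: attach a (hypothetical) intervention to each at-risk student.
    return {s: "advisor outreach" for s, label in predictions.items()
            if label == "at-risk"}

def measure(interventions, outcomes):
    # Measure: count intervened students who later passed.
    return sum(1 for s in interventions if outcomes.get(s) == "passed")

def refine(threshold, passed, intervened):
    # Refine: adjust the threshold if interventions are not helping.
    return threshold - 5 if intervened and passed == 0 else threshold

records = [{"student": "A", "score": 45},
           {"student": "B", "score": 82},
           {"student": "C", "score": None}]
data = gather(records)                       # C is dropped: no usable data
predictions = predict(data, threshold=60)
interventions = act(predictions)
passed = measure(interventions, {"A": "passed"})
new_threshold = refine(60, passed, len(interventions))
```

Even this toy loop surfaces the ethical questions raised below: the threshold silently classifies students, the dropped record excludes a voice, and the intervention is chosen without student input.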
3.1 Artifact: Visualization

Because learning analytics visualizations can be easily created and posted publicly, situating these artifacts within Laura Gurak's framework for cyberliteracy is a good place to begin identifying ethical issues [6]. Gurak's work describes the speed, reach, anonymity, and interactivity of information available through the Internet and, when applied to learning analytics, clear consequences are revealed for each attribute (Table 1):

Table 1. Attributes and Potential Consequences

Speed: Real-time data allows predictive modeling to occur without trained analysis by experts. Consequence: lack of relationship context between individual and data, and of historical context of the individual other than "data."

Reach: Inaccurate or incomplete data may be used in the predictive model. Consequence: lack of accountability when predictions are created without verification of statistical methods or of the accuracy and completeness of the data.

Anonymity: Private data may be used without permission or de-anonymized by either party. Consequence: lack of privacy if data may be harvested without permission and can be de-anonymized through implied relationships.

Interactivity: Artifacts are pushed one-way (institution to student). Consequence: lack of interactivity if artifacts are not interactive or do not allow for feedback.

When viewed through Gurak's framework, the ethical issues for those working with the artifacts of learning analytics become a lack of context, accountability, privacy, and interactivity. Furthermore, the framework as applied to learning analytics either magnifies or reverses the effect of information:

Lack of context magnifies the problem of speed.
Lack of accountability magnifies the problem of reach.
Lack of privacy reverses the problem of anonymity.
Lack of interactivity reverses the problem of interactivity.

3.2 Artifact: Dashboard

In The Evolving Face of Ethics in Technical and Professional Communication: Challenger to Columbia, Paul Dombrowski [2] provides a framework for addressing ethics theory, one that raises awareness of social and historical context to provide a variety of approaches to dealing with ethical issues. Specifically, Dombrowski believes that both ethics and rhetoric must be considered in "[how] technology is designed, the way it is actually used, and to some degree even the shape of the technology itself." Applying Dombrowski's work to learning analytics and, specifically, to a learning analytics dashboard provides ethical questions that can be used to consider the design, use, and shape of technology of a learning analytics artifact:

Design: Which design or visual elements of dashboards are persuasive, and how might they affect the viewer? How might design or visual elements assign meaning?

Use: Where is meaning created beyond the dashboard's label of at-risk? How might the student respond to being labeled without context?

Shape of Technology: How does the dashboard function as an ethical document (power, status, labels, and data)? Is there a way to elevate student over data rather than view student as data?
3.3 Artifact: Methodology

Katz and Rhodes [7] examine ethical issues in relation to human-computer interaction and identify "ethical frames" as a "set of philosophical assumptions, ideological perceptions, and normative values underlying and/or guiding how people relate to and exist with technology." Describing ethics as "socially dynamic and constructed," Katz and Rhodes reveal that, when viewed through ethical frames, technology constructs social values. Situating the five stages of learning analytics (gather, predict, act, measure, refine) within Katz and Rhodes's ethical frames indicates that each stage of learning analytics has potential consequences for affecting social value (Table 2):

Table 2. Ethical Frames and Potential Consequences

False frame (nothing of value): Learning analytics may be harmful if the predictive category is wrong (predict).

Tool frame (a means): Prediction is only as good as the data are accurate and complete (gather, measure, and refine).

Means-end frame (a means and an end): A suggested intervention strategy may or may not increase student success (act).

The above examples describe how the artifacts of learning analytics have ethical implications. However, a broader perspective on ethics in learning analytics must also include a discussion of the effects of learning analytics on individuals, faculty, and institutions; this discussion is necessary before establishing a "literacy" for learning analytics that includes an ethical viewpoint.

4. EFFECTS OF LEARNING ANALYTICS

Certainly, the topic of ethics in learning analytics has been raised at previous LAK conferences [3, 5, 10, 15]. Willis, Campbell, and Pistilli [17] connect ethics to learning analytics by maintaining that the institution is responsible for analyzing the data and fulfilling an "obligation of knowing" by providing students with tools for success, faculty with training to use the prediction models, and a campus climate that enhances student success. Slade and Galpin [15] specifically discuss ethical issues related to learning analytics, from the responsibility of the institution for student success to the rights of a student to remain an individual. Technical communication can build on this previous work by providing much-needed frameworks to discuss an ethical literacy in terms of the consequences of classification, social power moves, and the absence of voice.

4.1 Consequences of Classification

In Sorting Things Out: Classification and Its Consequences, Bowker and Star [1] focus on the ambiguous process of classification and the "invisible forces of categories and standards." Bowker and Star explain that, for the most part, we are trained to accept classification systems as fact even though classification is subject to data entry errors, data storage limitations, data "cleaning," and data revision (due to economic, social, and political pressures).
Classifying people involves generalizing and/or stereotyping in order to create a data profile that "fits" into categories and, in doing so, "existing differences are covered up, merged, or removed." As classifying is the work of learning analytics, the biggest setback to the process may be, as Bowker and Star explain, that when put into a category people "bend and twist their reality to fit" into a more desirable category or, even more problematic, "socialize to their category." Being labeled "at-risk" may serve as an incentive for those who are motivated and fortunate enough to have life circumstances that enable them to move to a more desirable category but, for others, being labeled at-risk may diminish their outlook for success and serve to deflate untapped potential.

4.2 Identifying Power Moves

In Multiliteracies for a Digital Age, Stuart Selber [12] proposes an ethical approach to computer literacy and carefully dissects where institutional technology regularization (required use of hardware or software) imposes a social power differential on individuals ("power moves"). For learning analytics, power moves can be revealed by considering the following questions:

Exclusion/Compartmentalization/Segregation: Do intervention strategies favor campus-based, face-to-face activities, prevent benefit to some, or prove difficult to obtain?

Deflection: Is there transparency regarding the benefits a college may receive through increased retention and tuition?

Differential incorporation: Do predictive categories reinforce status when differentiating by academic achievement?

Centralization: Who owns, shares, maintains, and is responsible for data accuracy?

Standardization/Delegation: Are intervention strategies specific enough to be effective? How does an institutional assignment of at-risk fit the reality of a student being at-risk or feeling at-risk?

Without deeper discussion and reflection, learning analytics may unintentionally reinforce unfortunate classifications and social power differentials and exclude stakeholder voices.

4.3 Considering Voice

In Spurious Coin, Bernadette Longo [8] maintains that those in power marginalize, intentionally or unintentionally, certain knowledge in their efforts to legitimize other knowledge, and that the only way to change this practice is to fully explore how and why this marginalization and associated legitimization occur within a system. In another work, Human+Machine Culture: Where We Work, Longo [9] examines the choices that create both culture and community. For learning analytics, this process would start by considering who has the power to:

make decisions about the learning analytics model and data,
legitimize some student knowledge or data and not others,
focus on certain potential intervention strategies and not others,
give voice to certain students and not others, and
validate some student stories and not others.

Being aware that the learning analytics process legitimizes preferential selection of data, knowledge, actions, and opinions allows the LAK community to consider how the process marginalizes certain voices (and at which stage), and how it affects the social relationships between students, between student and faculty, and between student and institution.

Sections 3 and 4 above have outlined ethical issues for learning analytics related to artifacts (visualization, dashboard, process) and the consequences of learning analytics applications (classification, power moves, and voice).
Each section also provided information or questions that can be used to guide discussion on the ethics of learning analytics. While identification and acknowledgement are a first step, how might the LAK community begin to establish an ethical literacy for learning analytics, one promoted by LAK theory, research, and practice?

5. AN ETHICAL LITERACY

Returning to Selber's work on multiliteracies, Selber suggests a three-fold approach to literacy that includes functional literacy, critical literacy, and rhetorical literacy. Applied to learning analytics, a LAK literacy could include:

Functional Literacy: Using software to create visualizations, or to access and navigate a learning analytics dashboard;

Critical Literacy: Understanding the data limitations in predictive modeling, and the limitations of visualizations and dashboards in terms of context, accountability, privacy, and interactivity; and

Rhetorical Literacy: Analyzing visualizations and dashboards rhetorically (producing knowledge) in terms of persuasion, power, and ownership.

Selber maintains that all three literacies are needed to fully account for an ethical viewpoint, and he provides four categories (persuasion, deliberation, reflection, and social action) to reveal ethical issues for each literacy. For learning analytics, this review process answers the following questions:

Persuasion: How are the predictive model, visualizations, and student dashboard persuasive? Where might there exist potential for harm?

Deliberation: Who has control of the learning analytics conversation (and data sharing) at each stage of the process? Who is left out of the conversation, and why?

Reflection: How can the learning analytics process be improved upon? Are we following our code of ethics?

Social Action: How are members of the community (institution and student) benefiting from the relationship or having a beneficial effect on the community? Who has power and who does not?

When guided by this framework, an ethical literacy for learning analytics will answer the question: Who generates these artifacts, how, and for what purpose, and are these artifacts produced and presented ethically? The answer begins with the LAK community articulating ethical goals, aligning values with practice, and adopting conversations that are inclusive and transparent.

6. CONCLUSION

LAK is a theoretically young discipline, yet one with already strong and complex research activities and practices. LAK practitioners will encounter obvious (and not so obvious) ethical dilemmas while working in this field. Adopting an ethical literacy for learning analytics, such as the one outlined in this paper, is a first step toward maintaining an ethical viewpoint and fully incorporating ethics into the theory, research, and practice of the LAK discipline.

7. REFERENCES

[1] G. Bowker and S. Star. Sorting Things Out: Classification and Its Consequences. MIT Press, 2000.
[2] P. Dombrowski. The Evolving Face of Ethics in Technical and Professional Communication: Challenger to Columbia. IEEE Transactions on Professional Communication, 52(1), pages 306-319, 2007.
[3] H. Fournier, R. Kop and H. Sitlia. The value of learning analytics to networked learning on a personal learning environment. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK '11). ACM, New York, NY, USA, pages 104-109, 2011.
[4] A. Wise, Y. Zhao and S. Hausknecht. Learning analytics for online discussions: a pedagogical model for intervention with embedded and extracted analytics. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK '13), Dan Suthers, Katrien Verbert, Erik Duval, and Xavier Ochoa (Eds.). ACM, New York, NY, USA, pages 48-5, 2013.

[5] S. Graf, C. Ives, L. Lockyer, P. Hobson and D. Clow. Building a data governance model for learning analytics. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK '12), Simon Buckingham Shum, Dragan Gasevic, and Rebecca Ferguson (Eds.). ACM, New York, NY, USA, pages 21-22, 2012.
[6] L. Gurak. Cyberliteracy: Navigating the Internet with Awareness. Yale University Press, 2002.
[7] S. Katz and V. Rhodes. Beyond Ethical Frames of Technical Relations: Digital Being in the Workplace World. In Digital Literacy for Technical Communication: 21st Century Theory and Practice. Rachel Spilka, Ed. Routledge, 2009.
[8] B. Longo. Spurious Coin. State University of New York Press, 2000.
[9] B. Longo. Human+Machine Culture: Where We Work. In Digital Literacy for Technical Communication: 21st Century Theory and Practice. R. Spilka, Ed. Routledge, 2009.
[10] P. Prinsloo and S. Slade. An evaluation of policy frameworks for addressing ethical considerations in learning analytics. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK '13), Dan Suthers, Katrien Verbert, Erik Duval, and Xavier Ochoa (Eds.). ACM, New York, NY, USA, pages 240-244, 2013.
[11] C. Rude. Mapping the Research Questions in Technical Communication. Journal of Business and Technical Communication, 23(2), pages 174-201, 2009.
[12] S. Selber. Multiliteracies for a Digital Age. NCTE Studies in Writing and Rhetoric, 2004.
[13] S. Shum and R. Crick. Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK '12), Simon Buckingham Shum, Dragan Gasevic, and Rebecca Ferguson (Eds.). ACM, New York, NY, USA, pages 92-101, 2012.
[14] S. Shum and R. Ferguson. Social Learning Analytics. Learning, 1(26), 2011.
[15] S. Slade and F. Galpin. Learning analytics and higher education: ethical perspectives. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK '12), Simon Buckingham Shum, Dragan Gasevic, and Rebecca Ferguson (Eds.). ACM, New York, NY, USA, pages 16-17, 2012.
[16] D. Tempelaar, A. Heck, H. Cuypers, H. Kooij and E. Vrie. Formative assessment and learning analytics. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK '13), Dan Suthers, Katrien Verbert, Erik Duval, and Xavier Ochoa (Eds.). ACM, New York, NY, USA, pages 205-209, 2013.
[17] J. Willis, J. Campbell and M. Pistilli. Collegiate Administration and the Obligation of Knowing: An Essay on Practical Ethics in an Era of Big Data. Educause, 2013.
