This is an author-prepared version of accepted peer-reviewed work. The definitive Version of Record for this publication can be found in the ACM Digital Library: http://dl.acm.org/citation.cfm?id=3027403. DOI: 10.1145/3027385.3027403

This article can be cited as: Bodily, R., & Verbert, K. (2017, March). Trends and issues in student-facing learning analytics reporting systems research. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 309-318). ACM.

Copyright © 2016 by the Association for Computing Machinery, Inc. (ACM). Permission to make digital or hard copies of portions of this work for personal or classroom use is granted without fee provided that the copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page in print or the first screen in digital media. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Send written requests for republication to ACM Publications, Copyright & Permissions at the address above, or fax +1 (212) 869-0481, or email [email protected]. For other copying of articles that carry a code at the bottom of the first or last page, copying is permitted provided that the per-copy fee indicated in the code is paid through the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923.

Trends and Issues in Student-Facing Learning Analytics Reporting Systems Research

Robert Bodily, Brigham Young University
[email protected]

Katrien Verbert, University of Leuven
[email protected]

ABSTRACT

We conducted a literature review of systems that track learning analytics data (e.g., resource use, time spent, and assessment data) and provide a report back to students in the form of visualizations, feedback, or recommendations. This review included a rigorous article search process: 945 articles were identified in the initial search, and after filtering out articles that did not meet the inclusion criteria, 94 articles were included in the final analysis. Articles were coded on five categories chosen based on previous work in this area: functionality, data sources, design analysis, perceived effects, and actual effects. The purpose of this review is to identify trends in the current student-facing learning analytics reporting system literature and to provide recommendations for learning analytics researchers and practitioners for future work.

CCS Concepts

• Information systems ~ Decision support systems • Human-centered computing ~ Visualization • Information systems ~ Data mining • Information systems ~ Web mining

Keywords

Learning analytics; learning analytics dashboards; educational recommender systems; student-facing systems; literature review

1. INTRODUCTION

As online learning continues to grow, it becomes increasingly important to identify design and teaching strategies that improve student success in online and technology-mediated environments [1]. Learning analytics (LA) is commonly defined as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" [2], and could be used to help improve student success in online environments. Within the LA process, a number of stages have been identified: select, capture, predict, use, refine, and report [3]. This article focuses on the reporting stage of the LA process.
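The five coding categories named in the abstract (functionality, data sources, design analysis, perceived effects, actual effects) can be thought of as a simple record per reviewed article. The sketch below is purely illustrative: the field names, example values, and aggregate are our assumptions about how such coded data might be represented, not the authors' actual coding instrument.

```python
from dataclasses import dataclass

# Hypothetical record for one coded article. Fields mirror the five
# review categories; the example values are illustrative, not real data.
@dataclass
class CodedArticle:
    citation: str
    functionality: list       # e.g., visualizations, feedback, recommendations
    data_sources: list        # e.g., resource use, time spent, assessment data
    design_analysis: bool     # was the system's design analyzed and reported?
    perceived_effects: bool   # were perceived effects on students measured?
    actual_effects: bool      # were actual effects on students measured?

corpus = [
    CodedArticle("Example A (2015)", ["visualization"], ["resource use"],
                 True, True, False),
    CodedArticle("Example B (2016)", ["recommendation"], ["assessment data"],
                 False, True, True),
]

# Trend summary: fraction of coded articles that measured actual effects.
actual_share = sum(a.actual_effects for a in corpus) / len(corpus)
print(actual_share)  # 0.5
```

Coding each article into a uniform record like this is what makes cross-system comparisons (e.g., "how many systems measured actual effects?") straightforward to compute.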
Learning analytics dashboards, educational recommender systems, intelligent tutoring systems, and automated feedback systems are commonly used in the reporting stage to close the feedback loop and provide stakeholders with information that can be easily understood in a short period of time. Previous literature reviews in this area (see [4], [5], [6], and [7]) have focused on learning analytics dashboards for all stakeholders (e.g., administrators, instructors, students). In order to enable student autonomy and compare student-facing reporting systems across disciplines, we focus exclusively on student-facing systems (systems that collect student data and report those data back to students) that deliver the data through a learning analytics dashboard, educational recommender system, educational data mining system, intelligent tutoring system, or automated feedback system. This review has implications for the learning analytics community because student-facing reporting systems close the feedback loop and, in the best case, give students real-time access to their data to increase awareness, reflection, and achievement. This review identifies research trends and issues related to designing, developing, and evaluating student-facing reporting systems. Based on our analysis, we provide recommendations to (1) help researchers conduct more rigorous research in this area and (2) enable practitioners to increase the impact of their systems on student success.

2. PREVIOUS LITERATURE REVIEWS

This review builds on four literature reviews conducted within the past four years. Verbert et al. [2013] selected noteworthy dashboard articles and provided a framework for coding various types of systems [4]. Their framework included what types of data were tracked, which stakeholder the dashboard was intended for, and whether the system was evaluated.
That article did not include a systematic search of the literature, so it is difficult to make comprehensive statements about learning dashboards from it alone. It is, however, valuable as the first review article on learning analytics dashboards. Verbert et al. [2014] built on the previous review by including a few additional systems and by expanding the article categorization framework discussed in Verbert et al. [2013]. The expanded framework included what kind of technology was used to track the data, additional evaluation categories, and the presentation medium (tablet, cell phone, computer, etc.). This study was a good follow-up to Verbert et al. [2013], but in order to generalize across learning dashboards, a comprehensive literature review was still needed [5].

Yoo et al. [2015] used the Verbert et al. [2014] review to find articles about learning analytics dashboards that conducted system evaluations. They excluded articles that did not conduct an evaluation, leaving 10 articles in their final analysis. The purpose of that article was to identify learning analytics dashboard articles with evaluations in order to develop an evaluation framework, which the authors provided at the end of their article to guide future evaluations of dashboard systems. Our literature review categories include elements from this evaluation framework [6].

[LAK '17, March 13-17, 2017, Vancouver, BC, Canada. © 2017 ACM. ISBN 978-1-4503-4870-6/17/03$15.00. DOI: http://dx.doi.org/10.1145/3027385.3027403]

Schwendimann et al. [2016] conducted the first systematic search of the literature for learning analytics dashboard articles. Their final analysis included 53 articles. They analyzed all types of learning analytics dashboards, including administrator-, counselor-, instructor-, and student-facing systems. Among their findings: most dashboard systems are developed predominantly for instructors and exist mainly in higher education [7]. In addition, most articles do not report on research experiments to determine effects on students.

The majority of analytics systems focus on providing teacher- or administrator-facing views [7]. These can help teachers or administrators accomplish their goals; however, such approaches generally increase teacher control and decrease student autonomy. Ryan and Deci [2000] suggest, from a self-determination theory perspective, that students have greater intrinsic motivation to succeed in their coursework when they have greater autonomy [8]. Student-facing reporting systems enable, rather than inhibit, student autonomy, and could increase student motivation in ways that teacher- or administrator-facing systems cannot. Additionally, many articles use different terminology and are presented in different venues (e.g., automated feedback systems, educational recommender systems, intelligent tutoring systems, or educational data mining systems). The goal of these student-facing systems, however, is the same: to provide some kind of feedback to learners to improve teaching and learning.
Because each of these systems has a common purpose, we wanted to review all systems pursuing the same goal in order to better compare the strengths and weaknesses of each type of system. To enable student autonomy and compare student-facing reporting systems across disciplines, we build on previously conducted research by reviewing student-facing learning analytics reporting systems. Our review builds on the previous reviews in the following ways: (1) we use the evaluation framework proposed by Yoo et al. [2015] in creating the categories for this literature review; (2) we use the categorization frameworks from Verbert et al. [2013] and Verbert et al. [2014] as part of our literature review categories; (3) we build on the work of Schwendimann et al. [2016] by enlarging the search criteria from learning analytics dashboards to all learning analytics reporting systems; and (4) we narrow our scope by focusing exclusively on student-facing reporting systems. Instead of focusing on the tool (learning analytics dashboards), we focus on the stakeholder (students) in order to provide practical suggestions for all student-facing learning analytics reporting systems (see Figure 1).

The research questions addressed in this review include the following:

1. What types of features do student-facing learning analytics reporting systems have?
2. What are the different kinds of data collected in these systems?
3. How are the designs of these systems analyzed and reported on?
4. What