How did our students use Canvas to facilitate their final exam preparation?

Dartmouth has been using Canvas as an extension of face-to-face (F2F) classes for course material delivery, quiz administration, and asynchronous discussion and communication. As such, we are particularly interested in learning how our students use Canvas to prepare for their final exams.

We gathered students’ Canvas page view activity data during the final exam period, along with their final exam scores. Since each course applied its own grading scheme to the final exam, each score was rescaled by the points possible assigned to that exam. We examined both the number of page views made by individual students and the time they spent on each page; because the two measures are highly correlated, we kept only one variable for the subsequent analysis.
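As a minimal sketch of these two preprocessing steps, the rescaling and the correlation check can be done in a few lines of pandas. The column names and values below are hypothetical, not the actual Canvas export schema:

```python
import pandas as pd

# Hypothetical records: one row per student, with raw final exam points,
# the course's points-possible, page view counts, and time on page.
df = pd.DataFrame({
    "student_id":      [1, 2, 3, 4, 5],
    "final_points":    [45.0, 88.0, 70.0, 30.0, 95.0],
    "points_possible": [50.0, 100.0, 100.0, 50.0, 100.0],
    "page_views":      [120, 340, 210, 15, 400],
    "minutes_on_page": [95.0, 260.0, 180.0, 10.0, 310.0],
})

# Rescale each final exam score to a common 0-1 scale by dividing by
# the points possible for that course's exam.
df["score_pct"] = df["final_points"] / df["points_possible"]

# Check how strongly page views and time spent move together; a high
# Pearson r justifies keeping only one of the two as a predictor.
r = df["page_views"].corr(df["minutes_on_page"])
print(f"views vs. time correlation: r = {r:.2f}")
```

With real data, a correlation this strong would make the two predictors largely interchangeable, which is why only one was carried forward.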

The residual and Q-Q plots suggested that the simple linear regression assumptions were not satisfied, so we applied a natural logarithm transformation to the predictor, drew a LOWESS line on the scatterplot, and identified one point of interest (87%). In addition, outliers were identified and removed by examining both leverage and Cook’s distance. Although the log transformation of the predictor helped satisfy the regression assumptions, the model still explained less than 1% of the variation, so we decided not to report the model and to visualize the results in a scatterplot instead.
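The diagnostics described above (log-transforming the predictor, fitting the regression, and screening points by leverage and Cook’s distance) can be sketched with plain NumPy. The data here are simulated stand-ins, and the 4/n cutoff is just one common rule of thumb, not the threshold used in the original analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: raw page-view counts (the predictor) and rescaled
# final exam scores in [0, 1], with only a weak true relationship.
views = rng.integers(1, 500, size=80).astype(float)
score = 0.7 + 0.0001 * views + rng.normal(0, 0.1, size=80)

# Natural-log transform of the predictor, as in the analysis above.
x = np.log(views)
y = score

# Ordinary least squares for y = b0 + b1 * x.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Leverage (hat values) and Cook's distance for a two-parameter model.
n, p = len(x), 2
hat = 1 / n + (x - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2)
s2 = np.sum(resid ** 2) / (n - p)
cooks = resid ** 2 / (p * s2) * hat / (1 - hat) ** 2

# Flag points exceeding the common 4/n rule of thumb.
outliers = np.flatnonzero(cooks > 4 / n)
print(f"flagged {outliers.size} potential outliers")

# R^2: with data this noisy, the model explains little of the variation,
# mirroring the <1% figure reported above.
r2 = 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```

In practice one would refit after dropping the flagged points and re-check the residual and Q-Q plots before deciding whether the model is worth reporting.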

  • Overall, students’ performance (points received/points possible) on the final exam ranged from very low to very high, and quite a few students achieved fair performance without significant Canvas page view activity.
  • On average, students who received grades above 87% on their final assignments/exams tended to spend more time navigating Canvas during final exam week.
  • In contrast, the nearly horizontal regression curve for students who scored below 88% indicates that saturation is reached quickly: for this group, reviewing materials during final exam week does not correlate with better final exam performance.
  • In other words, navigating Canvas had only a marginal effect on the final exam preparation of students who performed at the B- level or lower (grades below 88%).
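The contrast described in these findings amounts to fitting separate regression slopes above and below the point of interest. The toy numbers below are invented purely to illustrate the split, with a steeper slope in the high-scoring group and a nearly flat one below it:

```python
import numpy as np

def slope(x, y):
    """OLS slope of y on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

# Hypothetical (log page views, rescaled score) pairs.
log_views = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 2.5, 3.5, 4.5, 5.5, 6.5])
score     = np.array([0.60, 0.62, 0.61, 0.63, 0.62,
                      0.88, 0.90, 0.93, 0.95, 0.97])

high = score >= 0.87   # the threshold suggested by the LOWESS curve
s_hi = slope(log_views[high], score[high])
s_lo = slope(log_views[~high], score[~high])
print(f"slope (>= 87%): {s_hi:.3f}")
print(f"slope (<  87%): {s_lo:.3f}")
```

A clearly positive slope in the high group and a near-zero slope in the low group is the pattern the scatterplot above describes.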

This analysis is derived from the Canvas page view activity of students who used Canvas as an extension of F2F learning. The data suggest that many students used Canvas to review course materials during final exam week, but we have very limited data on their study activities beyond Canvas and the classroom, even though students’ final exam performance can be significantly affected by studying done offline and outside class. It is therefore instrumental to identify new learning behaviors taking place across a variety of digital platforms when we try to understand how our students learn.

Changes in Canvas course content over time

By Winter 2015, Dartmouth had completely transitioned out of Blackboard and begun using Canvas as its primary LMS. We are interested in learning how Canvas has been utilized as an extension of the face-to-face learning experience. The following chart suggests:

  • A&S undergraduate and graduate level courses show different changes over the three terms,
  • more SP15 undergraduate level courses adopted a module-based design compared to the WI15 term,
  • compared to WI15, fewer SP15 graduate level courses used either pages or modules to deliver content; instead, graduate level courses tended to use Canvas to administer more quizzes and facilitate more discussions,
  • among A&S undergraduate level courses, the chart reveals that as the number of published Canvas courses grows, the average number of assignments per course goes down.
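The per-term averages behind a chart like this come from a simple group-by aggregation over per-course content counts. The rows below are made-up examples, not the actual Dartmouth figures:

```python
import pandas as pd

# Hypothetical content counts, one row per published Canvas course.
courses = pd.DataFrame({
    "term":        ["WI15", "WI15", "SP15", "SP15", "SP15"],
    "level":       ["UG", "G", "UG", "UG", "G"],
    "assignments": [10, 6, 4, 6, 8],
    "quizzes":     [2, 1, 0, 1, 4],
    "discussions": [3, 0, 1, 2, 5],
})

# Number of published courses and average content counts per course, by term.
summary = courses.groupby("term").agg(
    n_courses=("term", "size"),
    avg_assignments=("assignments", "mean"),
    avg_quizzes=("quizzes", "mean"),
    avg_discussions=("discussions", "mean"),
)
print(summary)
```

In this toy example the term with more published courses has a lower average number of assignments per course, the same shape of pattern noted in the last bullet.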

We plan to consistently collect a similar set of descriptive data and compare the results to examine whether and how this pattern evolves over time. We are also in the process of gathering more data for diagnostic analysis, in an attempt to identify the elements that contributed to the changes.

[Chart: CanvasCourses — changes in Canvas course content over terms]

Average counts of course content items (assignments, quizzes, and discussions) by term:

[Chart: CourseContents]

User Path – Flow Visualization

By Google’s definition, a flow visualization demonstrates a route and reveals the actual path as a user explores the possible routes.

“Flow reports in Google Analytics illustrate the paths that users take through content. In a single graphic, you can see how users enter, engage, and exit your content. You can also use these reports to troubleshoot your content by finding any unexpected place users exit or loop back.” (https://support.google.com/analytics/answer/2519986?hl=en)

http://cutroni.com/blog/2011/10/19/path-analysis-in-google-analytics-with-flow-visualization/
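At its core, a flow diagram of this kind encodes counts of page-to-page transitions, which can be derived directly from ordered page-view logs. The session data and page names below are hypothetical:

```python
from collections import Counter

# Hypothetical ordered page-view logs, one list of page names per session.
sessions = [
    ["home", "syllabus", "modules", "quiz"],
    ["home", "modules", "quiz"],
    ["home", "syllabus", "home"],
]

# Count how often users moved from one page directly to the next;
# these edge counts are what a flow visualization draws as weighted links.
edges = Counter()
for path in sessions:
    for src, dst in zip(path, path[1:]):
        edges[(src, dst)] += 1

for (src, dst), n in sorted(edges.items(), key=lambda kv: -kv[1]):
    print(f"{src} -> {dst}: {n}")
```

Loops back to an earlier page (here, syllabus -> home) show up as edges pointing upstream, which is exactly the “loop back” behavior the Google Analytics documentation mentions.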

Student assignment submissions in relation to assignment due dates

We are interested in learning about our students’ assignment submission activity: when students tend to work on assignments, and whether adjusting assignment due dates could better support their submission behavior. We harvested student assignment submission data generated in A&S Spring 2015 Canvas courses during the week of April 26–May 4. To examine whether there is a relationship between due dates and submissions, we also gathered the due dates of the corresponding assignments in the A&S Spring 2015 courses. The infographic shows students’ assignment submission activity by hour over the week, and the timing of submissions relative to the due dates. The graph reveals that in that particular week there was a submission peak each day, and the peak always corresponded to a due date; in other words, the majority of students tended to submit their assignments right at the due date/time.
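The two summaries behind this chart, submissions binned by hour of day and submission times measured as offsets from the due date, can be sketched with the standard library. The timestamps below are invented and assume a single shared timezone:

```python
from datetime import datetime, timedelta
from collections import Counter

# Hypothetical submission timestamps for one assignment and its due date.
due = datetime(2015, 4, 27, 23, 59)
submissions = [
    due - timedelta(hours=30),
    due - timedelta(minutes=40),
    due - timedelta(minutes=10),
    due - timedelta(minutes=5),
    due + timedelta(minutes=2),   # a late submission
]

# Count submissions by hour of day to find the daily peak.
by_hour = Counter(ts.hour for ts in submissions)
peak_hour = max(by_hour, key=by_hour.get)
print("peak hour:", peak_hour)

# Offset from the due date in minutes (negative = before the deadline).
offsets = [(ts - due).total_seconds() / 60 for ts in submissions]
median_offset = sorted(offsets)[len(offsets) // 2]
print("median offset (min):", median_offset)
```

A peak hour that coincides with the due time, and a median offset of only a few minutes before the deadline, is the “submit right at the due date” pattern the paragraph above describes.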

https://public.tableau.com/profile/publish/CanvasCourses/NumberofAssignmentSubmissionsbyTimeofDay

[Chart: studentsparticipationinaweek]

The Application of Learning Analytics

Learning Analytics (LA) is a field of research that aims to predict and advise on learning, and further to support faculty in identifying students’ learning needs and improving pedagogical strategies (Siemens, 2012; Verbert, Manouselis, Drachsler & Duval, 2012; Greller & Drachsler, 2012).

Verbert and his associates identified six highly interrelated objectives relevant to existing learning and knowledge analytics research (Verbert, Manouselis, Drachsler & Duval, 2012):

  • Predicting learner performance and modeling learners
  • Suggesting relevant learning resources
  • Increasing reflection and awareness
  • Enhancing social learning environments
  • Detecting undesirable learner behaviors
  • Detecting affect (emotional states) of learners

Results from LA research suggest that learning data from students enrolled in competence-based programs can inform core curriculum design. Under that assumption, I came up with a framework that illustrates how Learning Analytics can contribute to positive outcomes at the level of individual students, courses, and departments.

  • Analyzing data on students’ learning effort and outcomes can target learners’ metacognition and foster awareness of, and reflection on, their own learning processes.
  • Analysis of student-level data can help instructors implement targeted interventions and enhance their teaching practices (Greller & Drachsler, 2012).
  • Departments and programs can monitor student performance in terms of retention and achievement in a discipline. Furthermore, they can evaluate course offerings within the discipline and improve program outcomes.

[Figure: LAFramework]

REFERENCES:

Ali, L., Hatala, M., Gasevic, D., & Jovanovic, J. (2012). Computers & Education, 58(1), 470-489. Available at http://www.sciencedirect.com/science/journal/03601315/58/1

Greller, W., & Drachsler, H. (2012). Translating Learning into Numbers: A Generic Framework for Learning Analytics. Educational Technology & Society, 15 (3), 42-57.

Siemens, G. (2012). Learning Analytics: Envisioning a Research Discipline and a Domain of Practice. LAK ’12: Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 4-8. Available at http://dl.acm.org/citation.cfm?id=2330605

Verbert, K., Manouselis, N., Drachsler, H., & Duval, E.(2012). Dataset-Driven Research to Support Learning and Knowledge Analytics. Educational Technology & Society, 15 (3), 133-148.