Archives for category: learning

I really like Avinash Kaushik’s Web Analytics: An Hour a Day! (His blog is Occam’s Razor)

In thinking about ways that we can evaluate the effectiveness of learning via our multimedia course redesigns, the following comes to mind:

  1. Web analytics is an excellent, unobtrusive way to learn more about our students’ learning, and it is particularly well suited to the medium.
  2. Analytics can tell us what and how: what students are doing online (while logged into the content) and how they are using that content. We can “see” without actually observing them while they click.
  3. Because our work is for required courses, students have an incentive to master the content in order to pass the course as well as end-of-program qualifying/certifying exams, so “intent” is not an issue in the way it is currently discussed in the analytics literature. Likewise, “conversion rates” here map to achievement of learning outcomes on short- and longer-term exam and program outcomes, not to short-term purchasing decisions.
  4. We can use the Trinity framework to assess our multimedia course designs and possibly (I hope) student learning outcomes.

Other ideas:

  • If we can provide students with the ability to monitor their learning through analytics reporting, they may be more open to the idea of pre- and post- “quiz” assessments. To be credible, this research must include pre/post measurements as baseline evidence of learning (a small pre/post gain calculation is sketched after this list).
  • Qualitative data is essential to complement analytics. We need to determine the best strategy for collecting students’ reactions to the multimedia as well as their own assessments of their learning. We would prefer to capture that feedback anonymously and use it in aggregate.
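
One common way to summarize such a pre/post comparison is the normalized gain, (post - pre) / (max - pre). Here is a minimal sketch under that assumption; the function names and the 0-100 score scale are placeholders, not our actual instruments.

```python
# Minimal sketch (assumed names and 0-100 scale): pre/post "quiz" comparison
# as baseline evidence of learning, using the normalized gain.
def normalized_gain(pre, post, max_score=100):
    """Fraction of the possible improvement actually achieved."""
    if pre >= max_score:
        return 0.0  # already at ceiling; no room left to gain
    return (post - pre) / (max_score - pre)

def class_gains(pre_scores, post_scores):
    """pre_scores/post_scores: {student_id: score}. Returns {student_id: gain}."""
    return {sid: normalized_gain(pre_scores[sid], post_scores[sid])
            for sid in pre_scores if sid in post_scores}

# Example: a student who moves from 40 to 85 realizes 75% of the possible gain.
# normalized_gain(40, 85) == 0.75
```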

In summary, clickstream, time-on-task, and other data gleaned from logs will tell us how students are using the multimedia (which we need to know). Linking behavior to assessment scores (learning outcomes) will help us understand correlations between behavior and achievement (maybe). Connecting behavior, outcomes, and students’ own assessments of the impact of multimedia on their learning can help us develop “actionable insights” to change/improve the media and its educational impact for future students.
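
As a rough illustration of that linking step, here is a small Python sketch that totals time-on-task per student from a clickstream export and correlates it with exam scores. The CSV layout, column names, and function names are assumptions for illustration, not our actual log format.

```python
# Sketch (assumed CSV layout and names): link time-on-task from a clickstream
# log to assessment scores and report the correlation between the two.
import csv
from collections import defaultdict

def pearson(xs, ys):
    """Pearson correlation coefficient for two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def time_on_task(log_path):
    """Total seconds each student spent in the module, summed across log rows."""
    totals = defaultdict(float)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):  # assumed columns: student_id, page, seconds
            totals[row["student_id"]] += float(row["seconds"])
    return totals

def behavior_vs_outcome(log_path, exam_scores):
    """exam_scores: {student_id: score}. Correlates time-on-task with scores."""
    totals = time_on_task(log_path)
    shared = [sid for sid in exam_scores if sid in totals]
    return pearson([totals[s] for s in shared], [exam_scores[s] for s in shared])
```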

Next step to figure out: how to design reports for students and reports for instructors, with the aim of helping students target their studying and improve mastery, and helping instructors make changes/improvements in classroom instruction.
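
As one possibility, a first pass at a student-facing report could simply summarize where time went and what has not been visited yet. The section names and event structure below are invented for illustration.

```python
# Sketch (invented section names and event structure): a plain-text report a
# student could use to see where their time went and what they have skipped.
from collections import Counter

ALL_SECTIONS = ["intro", "half-life", "clearance", "dosing", "self-check"]  # hypothetical

def student_report(student_id, events):
    """events: iterable of (student_id, section, seconds) tuples from the log."""
    mine = [(section, secs) for sid, section, secs in events if sid == student_id]
    visits = Counter(section for section, _ in mine)
    minutes = sum(secs for _, secs in mine) / 60
    unvisited = [s for s in ALL_SECTIONS if s not in visits]
    lines = [f"Report for {student_id}",
             f"  Total time in module: {minutes:.1f} min"]
    lines += [f"  {section}: {count} visit(s)" for section, count in visits.most_common()]
    if unvisited:
        lines.append("  Not yet visited: " + ", ".join(unvisited))
    return "\n".join(lines)
```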


What a treat to be able to exchange information and ideas related to…data! Others we met are engaged in similar efforts. Most notably, a team is working on CANSAWARE to track, mine, and present data from Sakai.

The clickstream is one “easy” way to implement an imperfect method of assessment that aligns with the technology used, but the true power seems to come from:

  1. the meaning individuals (faculty) or groups (faculty, students, and support staff) attribute to the data and the action that results, and
  2. the ability to display trends in behavior over time (a small aggregation sketch follows this list).
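
One low-tech way to get at those trends is to roll raw log events up by week; the event fields below are assumptions, not a real schema.

```python
# Sketch (assumed event fields): roll raw clickstream events up into a weekly
# count per section, the kind of series an instructor could watch over a term.
from collections import defaultdict

def weekly_trend(events):
    """events: iterable of dicts with 'timestamp' (datetime) and 'section' keys."""
    counts = defaultdict(int)
    for event in events:
        year, week, _ = event["timestamp"].isocalendar()
        counts[(year, week, event["section"])] += 1
    return dict(counts)  # {(iso_year, iso_week, section): number_of_views}
```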

It seems that we are on a useful path but would be wise to clearly articulate what data is tracked, why, and how it is used. Helping students reflect on their learning (metacognition) is one of the best immediate uses of the data.

In the end, it’s about trust, just like any instructor/student interaction.

We need to conduct experimental research in Fall 2008 classes and try to identify and publish results and best practices associated with this methodology as it pertains to education. Bring on the IRB (institutional review board) paperwork…

In preparation for our ELI presentation, here are a few thoughts and resources to share with anyone interested in this topic. We will demo a section of an online module designed to present foundational concepts in pharmacokinetics to first-year PharmD students.

Caveats:

  • Students log in to the module outside normal class time with their unique ID
  • The module supports and enhances in-class interactions between students and with the instructor
  • Only students’ behavior inside the instructional module is tracked

Why use clickstream data? What are the drivers?

  • Accountability (accreditation; institutional accountability efforts)
  • Affordances of the technology (easy to implement, unobtrusive)
  • Research (scholarship of teaching and learning)

What factors raise concerns?

  • Privacy issues. Do students know that their behavior online is tracked (similar to Blackboard page tracking)?
  • The overall weight given to clickstream data as an assessment form
  • How the resulting data will be used (high-stakes decision-making regarding individual students v. assessing overall class progress at a point in time)

What opportunities are presented?

  • Improving individual and group learning outcomes by immediately identifying misconceptions and problems, and addressing them in class
  • Modifying in-class instruction and activities to meet individual and class needs at the point in time in which they arise
  • Discovering how students actually use interactive content (research)
  • Improving the instructional impact of interactive content & online learning environments based on students’ behavior, feedback, and other assessments
  • Mapping content, activities, and assessments into a connected whole rather than disparate parts
  • Co-opting a method typically used by advertising/marketing to predict who will buy something and instead using it to better understand learners’ behavior and potentially, predict how people with different learning styles will use online environments (research)
  • Providing students’ own clickstream data to help them reflect on their learning, progress, and strategies

Questions:

  • Can a student turn off tracking or is it “always on”? Can a student choose to have his or her tracking be anonymous?
  • How long is data stored? Where? What trends can reasonably be deduced over time?
  • What is the best that tracking can tell us about learning? Where are the boundaries and what are the limits?

Next steps:

  • Reports (sense-making from raw data to inform instructors, students, and designers)
  • Visualization (graphs, charts, the elusive “dashboard” of learning outcomes); a toy plotting example follows this list
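
As a placeholder for that dashboard, even a single chart is a start. Here is a toy matplotlib example with invented class averages, just to show the shape of the thing.

```python
# Toy dashboard panel (invented numbers): average minutes per module section.
import matplotlib.pyplot as plt

avg_minutes = {"intro": 4.2, "half-life": 11.8, "clearance": 9.1,
               "dosing": 15.3, "self-check": 6.7}  # hypothetical class averages

fig, ax = plt.subplots()
ax.bar(list(avg_minutes.keys()), list(avg_minutes.values()))
ax.set_ylabel("Average minutes per student")
ax.set_title("Time on task by module section")
plt.tight_layout()
plt.show()
```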

Resources:

Attended some excellent sessions today — most notably Henry Jenkins’ keynote, “What Wikipedia Can Teach Us About New Media Literacies” and George Siemens’ presentation on “Connectivism.” More details and reflections to follow later — however, there are some good resources I’d like to remember:

Complexity, education, networks, participation & learning ecologies — loved it!

Yesterday I finished Made to Stick: Why Some Ideas Survive and Others Die. I loved it! For me, it reinforced how difficult it is for all of us to communicate clearly. And further, how clear and simple communication can be achieved, but in order to do so, one must really step out of self and into others’ perspectives. This was a very bloomish book and I look forward to trying to implement the strategies the authors outline. See for yourself: madetostick.com.