This session featured two speakers: Dr. Robert Appelman, faculty member in Indiana University’s Instructional Systems Technology department, and Dr. Sonny Kirkley, CEO and researcher at Information in Place, Inc. and adjunct professor in IU’s School of Informatics.
In this session, there was a call for game designers and instructional designers to converge on a common language and approach to design. Pure game (entertainment-only) designers find it challenging to add instruction and assessment pieces to games, and pure instructional designers find it challenging to make content fun.
Dr. Kirkley reviewed several projects his company has developed. In all of them, the challenges of managing stakeholders (who want a say in the project’s direction) and external vendors/developers were greater than the challenges of actually making the game.
One example he discussed was the Virtual Astronaut project, a 10-year effort of which the first year is now complete (multiple modules will be added to the core game over time). It is designed for middle school students and is built on problem-based learning, using first-person exploration of STEM curricula. They collaborated with Virtual Heroes on this project.
What I took away from this presentation are the following:
- The struggles we have right now for simple instructional multimedia are likely to only increase with games (bring it on!). One reason is that funding (especially iterative funding) can force problematic processes. We’ve already seen this one…
- A well-defined process (including detailed game design documents) is essential once the project officially begins. See Gamasutra’s article on this topic.
- Authentic assessment within the game is the goal. If we can be smart about assessment up front, we will have a good chance of being successful.
- The design team needs to articulate their own theories of learning and ideas about what makes games fun — at the outset — in order to minimize conflict later and better communicate from the start.
In this session, three different paper authors shared an overview of their work. Debra Lieberman conducted research on video games designed specifically to help kids manage asthma (Bronkie) and diabetes (Packy and Marlon). The research used an experimental design with clinical trials and found that children’s visits to the emergency room to treat these problems decreased while their self-efficacy increased.
I couldn’t help but think of the book Persuasive Technology which is really at the core of all of these gaming sessions. I’m a bit surprised that no one has mentioned it or cited it…?
In any case, her site www.healthgamesresearch.org will soon send out an RFP (January 2009) that we should check out.
The next paper was given by John Richardson, a college senior and video game designer/researcher who has a form of cerebral palsy. He made a strong case for the need to ensure all of our games are accessible to people with disabilities: accessibility serves as a means of empowerment — of gaining control over one’s body. This is complex and challenging because one person’s disability may change over time (even congenital disabilities), and the discussion about disabilities tends to focus on inputs rather than a standard platform.
Is there an organization dedicated to accessibility in games? We need to do more testing; seek out feedback from our current designs…
The last presenter was Moses Silbiger whose paper and research focused on using games to help people move through the human development/self-actualization process. (I didn’t really understand this one…?)
Wow! How fun does this game look? Made by two former students (see 2D Boy) of Carnegie Mellon’s Entertainment Technology Center — on a shoestring budget with smart marketing. I’ll buy it!
Drew Davidson, the Director of the ETC, walked us through parts of this game, showing how the attention to detail in this physics puzzle game — graphic style, sound effects, and smart, satiric writing — combines to make a great gameplay experience.
The Tower of Goo came out of the Experimental Gameplay Project in which each game must be made in less than 7 days by 1 person, and show off something never seen before. Another game from this project is Crayon Physics.
The World of Goo should be available on WiiWare and PC on 10.13.08. It is available for pre-order at Amazon.
The Meaningful Play conference began with a great keynote by Richard Hilleman, the Chief Creative Officer of EA. There were several take-aways for those of us who don’t have to turn a profit but do face significant challenges in staffing and development, in moving educational content from experts into interactive experiences for learners, and in general investments in R&D in terms of people and products.
Selling is the hardest part of game development. Hilleman’s advice: Begin by figuring out how to sell what you’re making. Be clever! The better the narrative (about what you’re making) and the more provocative the content, the easier it is to sell. For those of us providing fee-based development services, this is useful advice…
Developing creative leaders is essential to success. At EA they have developed internal education programs designed to prepare employees to take on greater leadership roles — if they show the aptitude. Programs are selective, and people can fail. But overall, they have proven to be a resounding success to the company. Here are some of the first programs Hilleman developed:
- The Gong Show – a 2-day rapid-prototyping program in which the team creates a game concept, financial model, marketing strategy, and pitch. It not only brings new ideas to light and solidifies team collaboration, but also gives team members confidence that game development doesn’t have to take years…
- Seeing is Believing – a workshop focused on managing Art Directors. Too often Executive Producers instruct the Art Director (i.e., “use this image and turn it blue”) rather than giving Art Directors the creative freedom and support they need to make the game successful.
- Roaring Silence – a workshop focused on game audio and audio design
- Listening Out Loud – Listening to audiences and opinion leaders
- It’s All Nuts & Bolts – Game design skills workshop targeted to non-programmers. In these sessions, teams use Lego Mindstorms robots to complete challenging tasks
What qualities does EA look for in candidates? For Designers, they look for people who like people: empathy and respect for the player are key. For Producers, they look for selflessness: the ability to amplify the efforts of others.
Parting comment: Flash is the most ubiquitous game platform. As of the end of September 2008, there were over 1 BILLION Flash installations. Yowza.
I really like Avinash Kaushik’s Web Analytics: An Hour a Day! (His blog is Occam’s Razor)
In thinking about ways that we can evaluate the effectiveness of learning via our multimedia course redesigns, the following comes to mind:
- Web analytics is an excellent, unobtrusive way to learn more about our students’ learning, and it is particularly suited to the medium.
- Analytics can tell us what and how: what students are doing online (while logged into the content) and how they are using that content. We can “see” without actually observing them while they click.
- Because our work is for required courses and students have an incentive to master the content in order to pass the course as well as end-of-program qualifying/certifying exams, “intent” is not an issue – as is currently discussed in the analytics literature. And “conversion rates” are more about achievement of learning outcomes on short and longer term exam/program outcomes – not on short-term purchasing decisions.
- We can use the Trinity framework to assess our multimedia course designs and possibly (I hope) student learning outcomes.
- If we can provide students with the ability to monitor their learning — through analytics reporting — they may be more open to the idea of pre- and post- “quiz” assessments. In order to be credible, this research must have pre/post measurements as baseline evidence of learning.
- Qualitative data is essential to complement analytics. We need to determine the best strategy for collecting students’ reactions to the multimedia as well as their own assessments of their learning. Would prefer to capture that feedback anonymously…and use in aggregate.
In summary, clickstream, time-on-task and other data gleaned from logs will tell us how students are using the multimedia (which we need to know). Linking behavior to assessment scores (learning outcomes) will help us understand correlations between behavior and achievement (maybe). Connecting behavior, outcomes, and students’ own assessments of the impact of multimedia on their learning can help us develop “actionable insights” to change/improve the media and educational impact for future students.
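The behavior-to-outcomes linking described above can be sketched in a few lines. This is a hypothetical illustration — the student data, field names, and helper function are mine, not part of the actual study — showing how per-student time-on-task pulled from clickstream logs might be correlated with assessment scores:

```python
# Hypothetical example: correlate time-on-task (from clickstream logs)
# with assessment scores. All data and names below are illustrative.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Minutes spent in the module per student, and their exam scores.
time_on_task = [35, 52, 18, 47, 60, 25]
exam_scores = [72, 85, 58, 80, 90, 65]

r = pearson_r(time_on_task, exam_scores)
print(f"correlation between time-on-task and score: r = {r:.2f}")
```

A correlation like this is only suggestive (hence the “maybe” above); students’ own assessments and other qualitative data would still be needed to interpret it.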
Next step to figure out: how to design reports for students and reports for instructors — with the aim of helping students target their studying/improvement toward mastery, and helping instructors change/improve classroom instruction, respectively.
What a treat to be able to exchange information and ideas related to…data! Others we met are engaged in similar efforts. Most notably, a team is working on CANSAWARE to track, mine, and present data from Sakai.
The clickstream is one easy way to implement an imperfect method of assessment that aligns with the technology used — but it seems that the true power comes from:
- the meaning individuals (faculty) or groups (faculty, students, and support staff) attribute to the data and the action that results, and
- the ability to display trends in behavior over time
It seems that we are on a useful path but would be wise to clearly articulate what data is tracked, why, and how it is used. Data to help students reflect on their learning (metacognition) is one of the best immediate uses of the data.
In the end, it’s about trust — just like any instructor/student interaction.
We need to conduct experimental research in Fall 2008 classes and try to identify and publish results and best practices associated with this methodology — as pertains to education. Bring on the IRB (institutional review board) paperwork…
In preparation for our ELI presentation, here are a few thoughts and resources to share with anyone interested in this topic. We will demo a section of an online module designed to present foundational concepts in Pharmacokinetics to first-year PharmD students.
- Students log in to the module outside normal class time with their unique ID
- The module supports and enhances in-class interactions between students and with the instructor
- Only students’ behavior inside the instructional module is tracked
Why use clickstream data? What are the drivers?
- Accountability (accreditation; institutional accountability efforts)
- Affordances of the technology (easy to implement, unobtrusive)
- Research (scholarship of teaching and learning)
What factors raise concerns?
- Privacy issues. Do students know that their behavior online is tracked (similar to Blackboard page tracking)?
- The overall weight given to clickstream data as an assessment form
- How the resulting data will be used (high-stakes decision-making regarding individual students v. assessing overall class progress at a point in time)
What opportunities are presented?
- Improving individual and group learning outcomes by immediately identifying misconceptions and problems, and addressing them in class
- Modifying in-class instruction and activities to meet individual and class needs at the point in time in which they arise
- Discovering how students actually use interactive content (research)
- Improving the instructional impact of interactive content & online learning environments based on students’ behavior, feedback, and other assessments
- Mapping content, activities, and assessments into a connected whole rather than disparate parts
- Co-opting a method typically used by advertising/marketing to predict who will buy something and instead using it to better understand learners’ behavior and potentially, predict how people with different learning styles will use online environments (research)
- Providing students’ own clickstream data to help them reflect on their learning, progress, and strategies
What questions remain?
- Can a student turn off tracking or is it “always on”? Can a student choose to have his or her tracking be anonymous?
- How long is data stored? Where? What trends can reasonably be deduced over time?
- What is the best that tracking can tell us about learning? Where are the boundaries and what are the limits?
What outputs will we need?
- Reports (sense-making from raw data to inform instructors, students, and designers)
- Visualization (graphs, charts, the elusive “dashboard” of learning outcomes)
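As a rough sketch of the report-building step — moving from raw data toward something instructors and students can scan — the following aggregates raw clickstream events into a simple per-student summary. The event format, page names, and durations are all hypothetical; a real implementation would depend on the actual log schema:

```python
from collections import defaultdict

# Hypothetical raw clickstream events: (student_id, page, seconds_on_page).
# The schema and values are illustrative only.
events = [
    ("s01", "intro", 120),
    ("s01", "half-life", 300),
    ("s02", "intro", 60),
    ("s02", "practice-quiz", 450),
    ("s01", "practice-quiz", 200),
]

# Aggregate events into distinct pages visited and total time-on-task.
report = defaultdict(lambda: {"pages": set(), "seconds": 0})
for student, page, seconds in events:
    report[student]["pages"].add(page)
    report[student]["seconds"] += seconds

# A crude text "report" — a dashboard would visualize the same totals.
for student in sorted(report):
    info = report[student]
    print(f"{student}: {len(info['pages'])} pages, "
          f"{info['seconds'] / 60:.1f} min on task")
```

The same per-student totals could feed trend charts over time, which is where the second bullet under “true power” above points.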
Attended some excellent sessions today — most notably Henry Jenkins’ keynote, “What Wikipedia Can Teach Us About New Media Literacies” and George Siemens’ presentation on “Connectivism.” More details and reflections to follow later — however, there are some good resources I’d like to remember:
Complexity, education, networks, participation & learning ecologies — loved it!
Attending the New Media Consortium conference in Indianapolis. It is great to see old friends, meet new ones, and get new ideas. Spent most of the day learning tips and tricks for Final Cut Pro. It was great and the Apple training team was fabulous. Hopefully today will bring some Motion and Aperture.
We also had the opportunity to meet Lance Speelmon, IU’s Manager of Online Development / Sakai Release Manager. He was gracious enough and crazy enough to answer our 400 Sakai/LMS questions.
And the Penn State contingent arrived in full force. Don’t ask about “group hugs.”
Today the conference officially begins…more later.
Yesterday I finished Made to Stick: Why Some Ideas Survive and Others Die. I loved it! For me, it reinforced how difficult it is for all of us to communicate clearly. And further, how clear and simple communication can be achieved, but in order to do so, one must really step out of self and into others’ perspectives. This was a very bloomish book and I look forward to trying to implement the strategies the authors outline. See for yourself: madetostick.com.