It's been grindingly slow, but I have finally written a program to analyse the log files we gathered in our recent school study. I actually really like doing this kind of thing. I have been tormenting the rest of the AA team with my graphs for weeks. My poor husband has been roped in to help me with some of the database code (thank you, dear!).
We had a class of 25 pupils work with our Adventure Author game making software for around 6 weeks. As they used the software, it logged what they were doing, for example adding an idea to the Fridge Magnets, writing conversation, or playing their game. The result of this was a huge number of time-stamped log files, totalling around 234,000 actions.
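In case anyone is curious about the mechanics, the core of this kind of processing is very simple: read each time-stamped line and tally actions by type. Here's a minimal sketch; the log format (tab-separated timestamp and action name) and the action names are invented for illustration, not the real Adventure Author format.

```python
# Tally actions from time-stamped log lines.
# NOTE: the log format here is hypothetical (ISO timestamp, tab, action name);
# the real Adventure Author logs will differ.
from collections import Counter
from datetime import datetime

def count_actions(lines):
    counts = Counter()
    for line in lines:
        timestamp, action = line.strip().split("\t", 1)
        datetime.fromisoformat(timestamp)  # just validates the timestamp
        counts[action] += 1
    return counts

log = [
    "2008-03-04T10:02:11\tAddedIdea",
    "2008-03-04T10:05:47\tOpenedConversation",
    "2008-03-04T10:06:03\tAddedIdea",
]
print(count_actions(log))  # Counter({'AddedIdea': 2, 'OpenedConversation': 1})
```

With 234,000 actions this runs in well under a second; the slow part is always deciding which actions belong in which category.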
Why on earth did we bother? Well, in Adventure Author we are interested in the creative process of game design. We want to know which sequences of actions in the software tend to produce good games in the end. Cathrin has been busy analysing the games themselves to get an estimate of their quality, and I have been automatically processing these log files to try to build up a picture of the sequence of activities the children typically do. Eventually we will match them up.
The first step is to make simple counts of how the children spent their time. The pie chart above shows the average proportion of time spent in each category across the whole class. I also have pie charts for each child, and a bar chart for each child which shows exactly how they spent their time each session. That will help me understand their process better.
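Turning the raw counts into a pie chart is just a matter of converting per-category durations into percentages. A small sketch of that step, with made-up durations (the category names roughly match the ones discussed below, but the numbers are illustrative only):

```python
# Convert per-category durations (seconds) into the percentages
# a pie chart would show. The durations below are invented examples,
# not real data from the study.
def time_proportions(durations):
    total = sum(durations.values())
    return {cat: 100.0 * secs / total for cat, secs in durations.items()}

child = {"Blueprints": 2700, "Conversation": 2300, "Testing": 2200,
         "Terrain": 1700, "Ideas": 600, "PeerReview": 400, "Other": 100}
props = time_proportions(child)
print(round(props["Blueprints"]))  # 27
```

The class average is then just the mean of these per-child proportion dictionaries, category by category.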
But in the meantime, let's take a look at this pie chart of average time spent in activities. It shows the main categories of things the children can do when using the software. You can see that the largest proportion of time (27%) was spent working with blueprints. This means adding characters, plot items and placeable objects such as houses or castles. Working with blueprints tends to belong either in the exploration or problem solving phases of the creative process. Next most common (23% of the time) was making conversations. Here the children were writing dialogue for the characters to say during the game. There is quite a high variance for conversation time, because some of the girls spent up to 50% of their time writing dialogue, whereas some boys spent less than 10% of their time on it. Also note that the time spent on it doesn't tell you how much dialogue they wrote, as some children evidently find it much harder than others. Conversation tends to be in the problem solving or exploration phases too.
Right next to conversation time, we have game testing time (22%). This is an interesting one because of course you don't want the kids to spend all their time playing the game, and no time refining it. But equally you want the children to test out their work as part of the internal validation phase in the creative process.
At 17% we have terrain time: when the children change the landscape of their game by adding water or hills and so on. Cathrin is learning some interesting things about how the children's use of the visual layout of their games contributes to storytelling. This usually relates to problem solving or exploration phases too.
What about the other stages of the creative process? Well, we have problem finding. This is represented by idea time in the graph at about 6%. This was when the children used the fridge magnets facility in Adventure Author. A lot of the children only used the magnets when we first introduced them, but some used the magnets to remind them of ideas they had previously had, and one child spent a while rearranging them to help her structure ideas for a written story. I would say that the fridge magnets were helpful when the kids actually used them, but most of the time they didn't remember to.
Lastly, we have the external validation phase. This is the peer review category at 4% of time when the children used Comment Cards. We only had time for one session of formal peer evaluation, but it's important to note that our observation notes show a lot of spontaneous peer collaboration which we like to call "brush fire". This happens a lot in the exploration phase.
So what? This graph gives us an idea of how the kids spent their time when they were using the software. (It doesn't show anything about how much time was spent in discussion with peers and the teacher.) It shows us that the children spent roughly as much time writing conversation as they did "just" playing their games. It shows that they spent a lot of time creating and populating their areas, and there is other evidence from the games themselves that this resulted in games with strong storylines. My analysis tells us how the children spent their time. Cathrin's will tell us whether they put their time to good use.
I have various tasks to do now, but one is to look into the peer review session more deeply. What kind of advice did they give each other? Did they choose to take each other's advice? I also need to bite the bullet and learn how to do sequential analysis.
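One simple first step toward sequential analysis is a first-order transition table: for each child's sequence of activity categories, count how often one category follows another. A sketch, using an invented sequence (the category names are illustrative, and real sequential analysis would go well beyond raw counts):

```python
# Count first-order transitions between activity categories.
# The sequence below is made up for illustration, not real study data.
from collections import Counter

def transition_counts(sequence):
    # Pair each category with its successor and tally the pairs.
    return Counter(zip(sequence, sequence[1:]))

seq = ["Ideas", "Blueprints", "Conversation", "Testing",
       "Blueprints", "Conversation", "Testing"]
counts = transition_counts(seq)
print(counts[("Conversation", "Testing")])  # 2
```

Normalising each row of this table gives transition probabilities, which is roughly where the more formal sequential analysis techniques pick up.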