In-class WOD results

FizzBuzz (8/26/13)

This WOD occurred on the first day of class. The goal was to implement the FizzBuzz program on a piece of paper, with a maximum of two mistakes allowed. The results were self-reported, and the code was hand-checked. The purpose was simply to give everyone a sense of coding under a time constraint.


The results indicate that many students need to brush up on their Java coding skills. Note that over half of the class DNF’d.
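For readers unfamiliar with the exercise, here is a minimal Java sketch of the kind of solution being asked for (the class and method names are my own, not part of the WOD specification):

```java
public class FizzBuzz {

  /**
   * Returns "Fizz" for multiples of 3, "Buzz" for multiples of 5,
   * "FizzBuzz" for multiples of both, and the number itself otherwise.
   */
  public static String valueFor(int i) {
    if (i % 15 == 0) {
      return "FizzBuzz";
    }
    if (i % 3 == 0) {
      return "Fizz";
    }
    if (i % 5 == 0) {
      return "Buzz";
    }
    return Integer.toString(i);
  }

  public static void main(String[] args) {
    // Print the sequence for 1..100, the traditional form of the exercise.
    for (int i = 1; i <= 100; i++) {
      System.out.println(valueFor(i));
    }
  }
}
```

Simple as it is, writing this correctly by hand in a few minutes trips up anyone whose Java is rusty.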

Eclipse Startup (9/4/13)

This WOD was designed to give everyone feedback on how well they were able to install and configure Eclipse. The WOD timed how long it took Eclipse to start up on each student's laptop. The goal is to have the IDE ready for work within 10 seconds. The Eclipse Configuration Guide provides ideas on how to improve startup time.


This time, about half of the class DNF’d, even though this wasn’t even a programming problem. (There was a homework assignment that required the students to configure their environment so that Eclipse would boot within 10 seconds.) My hope is that those students whose Eclipse did not boot within the desired time limit will reconfigure their systems for improved startup time.

CharFrequency (9/11/13)

CharFrequency is the first “real” WOD of the semester. Students were asked to implement a Java program called CharFrequency, which differs only slightly from the last Java Practice WOD.


Half the class completed the WOD successfully, which is a reasonable starting point.  My hope is that the percentage of DNFs will decrease as the semester continues and everyone gets more experienced.
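I have not reproduced the WOD specification here, but assuming the task is essentially to count how often each character occurs in a string, a sketch of a solution might look like the following (the class name matches the WOD; the method name is my own):

```java
import java.util.Map;
import java.util.TreeMap;

public class CharFrequency {

  /** Returns a map from each character in the input to its number of occurrences. */
  public static Map<Character, Integer> frequencies(String input) {
    Map<Character, Integer> counts = new TreeMap<>();
    for (char c : input.toCharArray()) {
      Integer current = counts.get(c);
      counts.put(c, (current == null) ? 1 : current + 1);
    }
    return counts;
  }

  public static void main(String[] args) {
    System.out.println(frequencies("aabbbc"));  // {a=2, b=3, c=1}
  }
}
```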

BrowserHistory4 (9/18/13)

BrowserHistory4 asked the students to create a simple four-page web site by reorganizing the data in the BrowserHistory3 practice WOD.


Once again, only about half the class finished the WOD successfully. In contrast to last week, when the DNFers did not turn anything in at all, this week all but two students turned in their code prior to the cut-off time. This means that in seven cases, students submitted solutions that, upon review, were not correct.

In some of these cases, the difference between the submission and the reference screenshot shown in the WOD was so obvious that I wonder if those students simply “gave up” once it got close to the DNF time. In other cases, it could be that the students simply overlooked a problem with their approach before turning it in.

Responsive Hokulani (9/25/13)

Due to poor Internet connectivity, this WOD was cancelled.

Multi-Page Kamanu (10/2/13)

This WOD required the students to use the Play framework to create a multi-page web site based upon the Kamanu Composites site design.


For the first time this semester, the number of DNFs fell substantially below 50%.  In addition, more students obtained Rx time on this WOD than ever before.  These appear to be encouraging trends.

On the other hand, I provided substantially more time for this WOD than previously. While I did so because of the increased complexity of the task, the extra time may also have positively influenced the number of Rx submissions.

Surferpedia (10/9/13)

This was the first group WOD of the semester: students added a new page to a partner’s Surferpedia website. To make matters more complicated, they could not speak to their partner: all interactions had to occur through Google Hangout, and they shared their code through GitHub repositories. The goal of this WOD was to test both website manipulation skills and software engineering collaboration skills.


Like last week, there were relatively few DNFs in this WOD. Also like last week, I provided significantly more time to work before the DNF cut-off, which might account for the larger number of finishers. Of course, the task was complex, so the additional time appears warranted.

Digits-Delete (10/16/13)

This was a solo WOD in which students updated a dynamic web application written in Play to incorporate “Delete” functionality. (The application already had Create, Read, and Update, so this WOD completed a CRUD application.) The WOD required them to update the model, view, and controller sections of the web app. While the base application code needs to be touched in several places to implement Delete, it is possible to do so quite quickly if you know what you are doing.


Performance on this WOD was interesting in that there were no Sd times: either the students finished in less than 20 minutes or they didn’t finish at all.  Also, all of the finishers had a correctly functioning application. Out of the 14 students, there were only 2 DNFs, which is a strong showing from the class.
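To give a sense of the model-layer portion of the change, here is a plain-Java sketch of a Delete operation on an in-memory store. This is hypothetical illustration code, not the actual Digits/Play source; in the real WOD the routes file, controller, and view also need corresponding edits.

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical in-memory model illustrating the Delete half of CRUD. */
public class ContactDB {

  /** A minimal record standing in for the application's model class. */
  public static class Contact {
    public final long id;
    public final String name;

    public Contact(long id, String name) {
      this.id = id;
      this.name = name;
    }
  }

  private final List<Contact> contacts = new ArrayList<>();

  public void add(Contact contact) {          // Create
    contacts.add(contact);
  }

  public List<Contact> all() {                // Read
    return contacts;
  }

  /** Delete: removes the contact with the given id; returns true if one was found. */
  public boolean deleteById(long id) {
    for (Contact contact : contacts) {
      if (contact.id == id) {
        return contacts.remove(contact);
      }
    }
    return false;
  }
}
```

The model-layer change is small; most of the WOD time goes into wiring the controller action and the Delete button in the view correctly.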

Digits-Hidden (10/23/13)

This was a solo WOD in which the students had to design a Scala template to abstract away some raw HTML code in their view. The WOD required them to touch just two files in their Digits application.


This WOD showed more variability than I expected; I had assumed it would be very easy for the class. Finish times ranged from 5:08 for the best finisher to a DNF at 21:00. The DNF percentage continues to fall, although five students have now dropped the class, which may partly account for the low percentage.

Surferpedia Footstyle (10/30/13)

The goal of this WOD was to add a pair of radio buttons to the Surferpedia application to indicate the surfer’s footstyle (regular or goofy). The students were required to create a branch of their application in GitHub in which to commit their changes, then deploy the working application to CloudBees. This required modifications to 8 files and about 100 lines of changed code.


Most students actually reported this to be an “easy” WOD, even though the majority spent between 30 and 60 minutes on it. Only 2 students DNF’d.

Surferpedia Login (11/13/13)

This was a fairly complicated WOD that required the students to add authorization and authentication to their Surferpedia system.  It required changes to around 14 files and about 300 added lines of code.


This turned out to be a (relatively) harder WOD: four students DNF’d.

WOD Performance Trends

A central question for this approach to education is whether the “WOD” approach actually works:  do students “get better” over time at software development as a result of this kind of pedagogy?

Gaining empirical insight into whether the students are improving in performance over time is challenging due to the following issues:

  • The WODs vary in difficulty, so an increase or decrease in WOD time does not necessarily indicate a change in performance capability; it may simply reflect the difficulty of the task.
  • My assignment of Rx, Sd, and Av times is arbitrary. It is quite possible that increases in (for example) the number of Rx performances over time are due not to actual performance improvement, but merely to my assigning an easier threshold for Rx as the course goes on.

Box Plot Visualization

To attempt to gain some preliminary insight into performance trends over time, I have constructed a box plot showing the performance times (in seconds) for all students who completed a given WOD successfully (i.e. did not DNF). The following chart shows the minimum, maximum, first and third quartiles, and median values for the non-DNF WOD times.


Here are some things to note about this visualization:

  • It removes the “bias” present in assigning Rx, Sd, and Av to WOD times, as only the actual performance time is represented.
  • There appears to be a relatively consistent ratio of approximately 3:1 between the fastest and slowest students who successfully finish the WOD.   This can be observed by looking at the minimum and maximum values. I believe 3:1 to be a conservative estimate of productivity differences, since this does not account for students who get cut off at the DNF time and who might have ultimately completed the WOD with significant additional effort.
  • The most hopeful indication of performance improvement is the shrinking of the interquartile box relative to the overall range. This means that instead of students being spread out across all times, more students are finishing in approximately the same time, with fewer students doing substantially better or worse.
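For concreteness, the five numbers behind each box can be computed as follows (a sketch; I am assuming the common “median of halves” quartile convention here, which may differ slightly from whatever the charting tool uses):

```java
import java.util.Arrays;

public class FiveNumberSummary {

  /** Returns {min, Q1, median, Q3, max} for the given non-DNF WOD times (seconds). */
  public static double[] summarize(double[] times) {
    double[] sorted = times.clone();
    Arrays.sort(sorted);
    int n = sorted.length;
    return new double[] {
      sorted[0],                           // minimum
      medianOf(sorted, 0, n / 2),          // Q1: median of the lower half
      medianOf(sorted, 0, n),              // overall median
      medianOf(sorted, (n + 1) / 2, n),    // Q3: median of the upper half
      sorted[n - 1]                        // maximum
    };
  }

  /** Median of the sorted slice a[from..to). */
  private static double medianOf(double[] a, int from, int to) {
    int len = to - from;
    int mid = from + (len / 2);
    return (len % 2 == 1) ? a[mid] : (a[mid - 1] + a[mid]) / 2.0;
  }
}
```

For example, non-DNF times of 60, 120, 180, 240, and 300 seconds yield a box from 90 to 270 with the median at 180.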

DNF trends

The box plot visualization omits data regarding students who DNF, but that is obviously an important aspect of performance.  To provide some preliminary insight, here is a simple chart showing the percentage of the class that DNF’d for the WODs in chronological sequence:


Clearly, the percentage DNF shows a precipitous decline after the first two WODs.  I believe that the decline in DNFs is a result of: (a) students learning how to use the homework to prepare for the WOD;  (b) students becoming accustomed to “programming under pressure”, and not having it impede their ability to accomplish the task at hand; and (c) several of the poorer performing students dropping the class over the course of the first six WODs. (Although note also that one of the highest performing students also dropped the class, so attrition did not occur from the bottom only.)

My belief is that this decline in DNFs cannot be attributed to the WODs becoming easier. For example, the Digits-Delete WOD is substantially more complex than the CharFrequency WOD, even though the Rx times are the same.

Taken together, the decline in DNF along with the reduction in variability of performance for the second and third quartiles appears to provide evidence that more students are performing “adequately”, and that “adequate” is becoming less variable over time.