We released RC5 early last night. The first thing you will notice is that this release is much lighter on features than our previous releases and, amazingly, the interface looks exactly the same. This is because we worked on most of the features over the holiday break and, because of that, we wanted to take it easy. So, RC5 is about doing things we wanted to do in RC4 but didn't have time for. This includes:

  • Default actions for forms such as the feedback form and the login form.
  • Asynchronous loading from the server so that the application doesn't lock up.
  • Testing! Testing! Testing!

We also researched some other features (such as HTML5 file uploading) that never made it into this release because of roadblocks.

Asynchronous loading

While writing the code to make our loading asynchronous, there were only a handful of files that I needed to change. This was amazing to me. I've worked on projects where changing the data access layer from synchronous to asynchronous would bring the entire architecture down. However, our team has been diligent about practicing good design.

Of the handful of files that needed to be changed, none of them was view code. First, I needed to change our data access class (OLActiveRecord) and then, because the API changed, I had to fix the controllers that were calling the old API. In no time, I loaded up the app to find that it worked flawlessly. Boy did my eyes light up!
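The post doesn't show the actual code (and the OL prefix on OLActiveRecord suggests it wasn't Python), so here is a hypothetical Python sketch of the shape of that refactor: the data access class grows an asynchronous API that does the server round-trip on a background thread and reports back through a completion callback, only the controllers calling the old API change, and the view code is untouched. All names besides OLActiveRecord are my invention.

```python
import threading


class OLActiveRecord:
    """Hypothetical stand-in for the post's data access class."""

    def _fetch(self, record_id):
        # Placeholder for the blocking server round-trip.
        return {"id": record_id, "name": "example"}

    # Old synchronous API: callers block until the server responds.
    def load(self, record_id):
        return self._fetch(record_id)

    # New asynchronous API: the round-trip happens on a background
    # thread, and the caller supplies a completion callback.
    def load_async(self, record_id, on_complete):
        worker = threading.Thread(
            target=lambda: on_complete(self._fetch(record_id))
        )
        worker.start()
        return worker


class FeedbackController:
    """Only call sites like this one needed updating."""

    def __init__(self, store):
        self.store = store
        self.record = None

    def show(self, record_id):
        # Before: self.record = self.store.load(record_id)  # UI locks up
        return self.store.load_async(record_id, self._did_load)

    def _did_load(self, record):
        # The view re-renders from here; the view code itself is unchanged.
        self.record = record
```

The point of the sketch is why so few files changed: the callback only alters the boundary between controller and data layer, so everything above the controllers keeps working as before.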

Default actions

When Kyle said that he had committed default actions for all known, programmatic forms, I was glad that someone had figured out how, and had the time, to implement the sure-to-be-crazy amount of work involved. He had pulled the wool over my eyes. In most cases, it was two lines of standard code, and none took more than four. Kyle has shifty eyes... never believe a word he says!
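The post doesn't show those two lines, so here is a hypothetical Python sketch of why a default action can be that cheap: if the framework already routes the return key to whatever handler the form registers as its default, each form only needs to register one. The Form/LoginForm API below is entirely my invention.

```python
class Form:
    """Hypothetical form base class with framework-style default routing."""

    def __init__(self):
        self.default_action = None
        self.submitted = []

    def set_default_action(self, handler):
        self.default_action = handler

    def press_return(self):
        # The framework routes the return key to the default action.
        if self.default_action is not None:
            self.default_action()


class LoginForm(Form):
    def __init__(self):
        super().__init__()
        # The whole per-form cost: register the submit handler as default.
        self.set_default_action(self.submit)

    def submit(self):
        self.submitted.append("login")
```

With that routing in place, pressing return on any form invokes its registered default, which is roughly the "two lines of standard code" per form the post describes.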

Lessons learned

While this was a short release, I still learned some nuggets of information.

First, holiday releases are hard to get through. In part, this is due to the dispersal of the team; in part, it is due to the amazing number of things that don't get done over a holiday break. Our team had planned to release on Sunday, but we hadn't put together enough work to make it a viable release, so we pushed it to today and did some more work. Don't do holiday releases!

Second, holiday releases keep the ball rolling. Coming off of Christmas break has been an entirely different experience for our team than Thanksgiving break. This could be due to the fact that we are still in the middle of a term but I think a lot of it is about the fact that we were thinking about our project over break. Maybe you should do holiday releases!

Experimental changes are both fun and frustrating. When all goes well, it is extremely satisfying (asynchronous loading), but when all goes wrong, it is extremely frustrating (file uploading). While the result (nailing two of three highly volatile, unknown-scope features) was really good for our release, I think that throwing in two or three more stable tasks would have helped the sanity of the team as a whole. This is a project-management observation about keeping a team together, not a reflection of the team's skill: stable tasks let developers stay in the groove while the unstable tasks proceed in parallel.


Metrics time! For this release, our goal wasn't to add a huge chunk of functionality, as we normally do, but instead to add a couple of experimental features and write tests. So, we expected to see our tests-per-method ratio go up dramatically.

Release  LOC   Classes  Files  Methods  Tests  Methods/Class  LOC/Class  Tests/Method
1        1424  27       32     137      21     5.07           52.7       0.153
2        2551  35       39     271      26     7.74           72.9       0.096
3        2578  37       40     260      41     7.02           69.7       0.158
4        3436  43       47     338      49     7.86           79.9       0.145
5        3813  53       58     351      92     6.62           71.9       0.262

Our LOC went up from RC4, which is mostly test code LOC.

Our Classes and Files increased, which is mostly test code.

Our tests nearly doubled, which surprises me: while testing, I thought we had covered most of the major areas of the code, yet the raw number of tests is still very low compared to the raw number of methods. There may be a counting error in the metrics tool (I'll talk to Chandler about it).

Methods/Class went down because of the increase in Classes from tests (this also makes me think we are counting tests as methods).

LOC/Class went down for the same reason as above.

Tests/Method went up, which was a goal, but it didn't go up enough. Again, if my hypothesis is correct and tests are being counted as methods, we should be calculating 92/(351-92)=0.355 instead of 92/351=0.262. That is still not great, but it's a more reasonable number, in my opinion. It would be really nice to have a code coverage tool to see which methods we are failing to test.
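The adjustment is simple enough to spell out. Using the RC5 row of the table above, and assuming the hypothesis that the tool counts test methods in the Methods column:

```python
# RC5 figures from the metrics table.
methods_reported = 351
tests = 92

# If tests are being counted as methods, subtract them out first.
true_methods = methods_reported - tests        # 259 real methods

ratio_raw = tests / methods_reported           # what the tool reports
ratio_adjusted = tests / true_methods          # ratio over real methods

print(round(ratio_raw, 3))       # 0.262
print(round(ratio_adjusted, 3))  # 0.355
```

Either way the ratio stays well under one test per method, which is why a coverage tool would help more than the raw counts do.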