Preparation
When I was asked if I’d be interested in judging the Software Testing World Cup Oceania event this year, I said “Sure, why not?”. I actually hadn’t heard of it before. From the little information available on the website at the time, I didn’t have a good understanding of the sheer scale of this event.
When I received the first group email from Maik, I realised that preparations were already well underway. I was overwhelmed at first by the sheer number of judges involved, and impressed when I recognised some of their names from the online software testing community.
Once we had all agreed on dates for the regional competitions, it seemed like no time at all until the first competition was held in North America. Immediately after the competition, there were group emails flying back and forth about test reports, bug details, reproducibility, product owner engagement… I read every email, but most of them went over my head. It didn’t occur to me that I could ask Maik & Matt for access to the test reports and the bug tracking tool in order to follow the process more closely; I assumed the product owners wouldn’t allow it. I also know now that a recording of each competition’s YouTube stream is available for public viewing, which I didn’t realise at the time. Mostly I was just feeling overwhelmed and thinking, “We’re next”.
A few days later, with the Oceania competition approaching, I asked whether the North America judges had any blog posts or ‘Lessons Learned’ to share with the judges of upcoming events. But of course they were still in the full swing of actual judging, on top of their day jobs, and hadn’t had a chance to put anything together yet.
Competition
On the night of the competition, I sent out an email 1 hour before the start time basically asking, “What should I be doing?”.
That led to some emails from Matt, and 30 minutes later I had about 7 tabs open in Firefox on my laptop and was feeling very confused. We were using Skype, Twitter, Google Drive, YouTube, email, Google Hangouts, HP Agile Manager and the SUT (system under test) concurrently. I started familiarising myself with SocialText, our SUT. I hadn’t realised I’d be on camera and recorded on YouTube, but luckily I wasn’t in my pyjamas.
Due to some technical issues (read: user error – long story), I was hearing the audio live from Google Hangouts and with a 5-second delay in stereo from YouTube, and I wasn’t able to mute one without also muting the other. When I explained the problem, Sigge sent me a link to view comments on the live stream without viewing the video, and then I could start to participate properly. Unfortunately, I’d lost more than 20 minutes’ worth of product explanation from the product owner. The other regional judge, Dean, joined just after I did; I think he was having his own technical issues.
For the remainder of the competition, I was Alt-Tabbing my way through Firefox:
– Monitoring YouTube comments and reporting participant questions to the product owners verbally
– Typing up answers to the questions back on YouTube for quick reference
– Constantly switching YouTube comments view from Top Comments to Newest First so I could see if any new questions had been asked (grumble grumble)
– Reviewing the judging categories in Google Drive
– Posting to Twitter, just for fun
– Watching the product owner video stream and chatting to other judges in Google Hangouts
– Reading the bugs that were being raised
– Trying to repro some of the more interesting bugs in the SUT
– Emailing participants who were having technical issues
– Sending the occasional one-on-one chat message to other judges via Skype
About 2 hours into the competition, the participant questions had settled down while the teams got on with raising defects and writing their reports. As soon as we had a moment to think, all of the regional judges in Oceania were asking ourselves the same question:
If this was so complicated and hectic for 3 regional judges with a cup load of teams, how will the 3 regional judges for Asia cope with a bucket load of teams?
Judging
From the outset we had two goals for judging the competition:
1. Judge the participating teams and agree on a winner for the region.
2. Think of ways to make the Asia competition manageable for judges and participants.
Every bug was read by one or more judges. Like the other judges, I tried to reproduce the interesting bugs on my own systems. Before even reading the test reports, I could get a feel for which areas each team had focussed their time on.
Every team’s entry was judged by at least three of the five judges, usually four. If the discrepancy between a team’s scores was above average, we’d judge that team’s entry again until we agreed on the overall score. This is a very fair approach, and also very time-consuming.
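To illustrate that re-judging rule, here’s a minimal sketch assuming a hypothetical set of scores. The teams, the numbers and the max-minus-min “discrepancy” measure are all invented for this example; they’re not the actual STWC scoring sheet.

```python
# Hypothetical sketch of the re-judging rule: flag any entry whose spread of
# judges' scores exceeds the average spread across all entries.
from statistics import mean

# Example scores: {team: [score per judge]} -- invented numbers, not real STWC data.
scores = {
    "Team A": [72, 75, 70],
    "Team B": [60, 82, 65, 70],
    "Team C": [88, 85, 90, 87],
}

# Discrepancy per team, here taken as max score minus min score.
spreads = {team: max(s) - min(s) for team, s in scores.items()}
average_spread = mean(spreads.values())

# Entries whose discrepancy is above average get judged again.
needs_rejudging = [team for team, spread in spreads.items() if spread > average_spread]

print(f"Average spread: {average_spread:.1f}")
print("Entries to judge again:", needs_rejudging)
```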
Each day after work, the local judges were online judging entries. We didn’t coordinate to judge at the same time; we all just tried to complete the task expediently, around our existing schedules. We could see each other online in Skype, updating the judges’ notes, creating strange things in the SUT and updating the status of defect reports. I found it helpful to be able to jump on Skype and ask for opinions and clarifications in real time.
During the judging I found more than a few bugs in the defect management tool. As a thank-you for their sponsorship (read: because the bugs were annoying me so much), I was planning to report them to HP somehow. But now that I’ve finished using the tool, I’ve lost the motivation to do so.
I’d rather not say too much more about the judging publicly until after the final STWC event in November. I’m happy to talk it through with other judges in our Google Group in the meantime.
After the judging I put together some metrics to show myself, once again, just how useful metrics are. Suffice it to say, there was no metric I could find that correctly predicted the winning team. I didn’t try very hard, though. Add a comment below if there’s a metric you’d like me to generate from our competition data, and I’ll let you know whether it would have proven useful for judging the quality of the teams’ work.
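To give a feel for the kind of check I mean, here’s a minimal sketch. The metric (most bugs raised), the team names and every number below are invented for illustration; they’re not the real competition data.

```python
# Hypothetical check: does a simple metric (bugs raised) predict the winning team?
# All numbers below are invented for illustration, not real STWC results.
teams = {
    "Team A": {"bugs_raised": 42, "final_score": 61},
    "Team B": {"bugs_raised": 18, "final_score": 78},  # fewer bugs, higher score
    "Team C": {"bugs_raised": 35, "final_score": 55},
}

# Winner according to the naive metric vs. winner according to the judges' scores.
predicted_winner = max(teams, key=lambda t: teams[t]["bugs_raised"])
actual_winner = max(teams, key=lambda t: teams[t]["final_score"])

print("Metric 'most bugs raised' predicts:", predicted_winner)
print("Judges' scores say:                ", actual_winner)
print("Metric useful?", predicted_winner == actual_winner)
```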
Tips for Judges
1. Like most volunteer work, this is very rewarding, particularly when you find people in your region who are as passionate about testing as you are.
2. This is an excellent learning experience, as you would expect. As with participants, this is your chance to show your professionalism, work with peers in your region, and learn about different approaches to testing.
3. You don’t need to be in the same region as the competition in order to judge it. There are many teams registered for the upcoming Asia competition and I see 3 local judges listed on the website. Are you free to assist with judging? Why not contact Maik or Matt today and offer to help? (Perhaps include a link to your LinkedIn profile or Twitter account.)
4. Why not write a blog post about your experience? It helps you to capture and remember what you’ve learned, and it may help a judge of the next competition who is feeling unprepared and unsure of what to expect.
5. Have some drinks and snacks within reach (or in my case, a great partner within earshot).
6. This is your regional competition. Be actively involved during the competition, speak up during the live chat, ask clarifying questions…
7. Set aside a few days/evenings after the competition to focus on judging.
8. If you have an idea to improve the judging in any way, speak up to Matt and Maik. They’re experienced in this process now, and will be able to discuss it with you further.
9. Sigge, Dean and I have come up with some ideas for processes and improvements, and fed those back to Matt and Maik for consideration.
Tips for Participants
1. This competition is intensive, and participants’ attention is pulled in multiple directions for the duration. None of the Oceania teams with 1 member completed their Test Report, and most of the 2-member teams also failed to complete the competition. In my opinion, if you are a team of 1 or 2 people, please consider joining up with others to form a team of 3 or 4.
2. If at all possible, co-locate with team members for the competition. Or if you have video conferencing available at the office, use it. There is a lot going on at once, and ease of communication with team members is key.
3. Bug quality is so much more important than quantity. I could write a whole blog post about this point alone, but I shouldn’t really have to for this target audience 🙂 Brush up on your bug advocacy skills.
4. Have multiple monitors available if you can. At a minimum, try to have one screen where the whole team can see and hear the live stream video of the product owner, in addition to everyone having their own screen to work on.
5. Get familiar with the defect reporting tool in advance. Learn how to attach screenshots to issues, for example.
6. The competition has a strict finish time. That’s the deadline by which all bugs and test reports must be submitted; not 30 mins later, and not the next day 🙂
7. You can find additional tips on Twitter if you search for #STWC2014. (If you’re not on Twitter yet, read this and then join.)
8. Have fun with your team! This is an opportunity to work closely with colleagues and peers. While it’s a simulation of a high-pressure project with a very short timeline, it is just a simulation.
Good luck!