Monday, October 26, 2015

Testing uTest: or How I Learned to Stop Worrying and Love the Gig Economy

I’m a member of an online community of testers called uTest (the practitioner-facing side of Applause). The company hosts a social network for testers and offers short-term gigs to its users (the clichéd “Uber for testers”). I was called on this weekend for my first gig: testing a payment method in taxis. It was pleasant, if stressful, and it made me think of ways a company could take advantage of such a service to expand its perspective on its products.

After the initial invite to the test cycle, I communicated with a project manager via Skype to ensure I was able to carry out the test scenarios. I brought to the table a specific model of phone and a verified ability to pull debug logs from it (thanks, Verizon, for turning off the op codes on my S4; I resorted to adb logcat). They provided technical assistance and reimbursement for the transactions, but the primary incentive was a reward for test cases completed.
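(For the curious: pulling logs this way requires nothing exotic, just the Android SDK’s adb tool and USB debugging enabled on the phone. The file name below is my own placeholder; the commands themselves are the standard ones.)

    adb devices                        # confirm the phone is connected and authorized
    adb logcat -v time > launch.log    # stream timestamped logs to a file (Ctrl-C to stop)
    adb logcat -d -v time "*:W"        # or dump a one-shot snapshot of warnings and errors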

Throughout, I felt like a skilled secret shopper rather than a functional tester. I was asked more about the non-software components of the launch than about the app or phone functionality. I reported on the advertisement of the feature, the education of the employees, and the capability of the equipment that supports the feature. The participating companies expected that advertising, hardware, and training would all match; in practice this happened 0% of the time, and no employee had been told why their equipment had been updated. I wasn’t the only tester on the ground for this project; the other testers saw related issues, and no location had all its ducks in a row. In all, the information they were most excited about was the boots-on-the-ground report I provided. It was fascinating to see a live product launch from that perspective, and doubly so considering my history in this product space.

The final bulk of the time I spent on this gig went into an experience report. Complete with detailed feedback, photos of the equipment, and other evidence, this is where I was able to comment on the process as a whole. From a testing perspective, I was able to provide detailed UI/UX feedback, evaluate the consistency of the process, and help them understand how much of their plan actually came off during the launch period. There was some repetition in these reports: one was a document with pictures, another was a survey that linked to the doc, and a third was a JIRA-in-spreadsheets for tracking testers. These reports were all submitted to the project manager for feedback, and I received an up/down "accepted" in less than 24 hours. While there is definitely room for improvement on the platform that would reduce tester overhead, it wasn't enough of a burden to keep me away from future gigs.

Participation in this project taught me that coordination is hard, educating low-skill workers is even harder, and a successful, coordinated nationwide launch like this one is next to impossible. This mirrored my experience with other companies. There are always bits and pieces of a project that do not make their way down the chain to the workers who are face-to-face with customers, so having a boots-on-the-ground perspective is vital.

Overall, a company can leverage uTest services to add layers of testing to its launch activities over and above internal testing. Specific, targeted verification is where the service is strongest: it can tell you whether an uneducated but motivated customer can make it through the new process right now. Specific error checks and third-party verification can be farmed out through such a relationship: things the company wants to do but can’t do efficiently given a small staff or tight timelines. This can give a company an enhanced view of its product and help challenge its assumptions before its customers do.

Note: edited to be a little more coy about who I was testing for and what I was testing.