Upgrading to a User Research Centre of Excellence (for under £30)

Published in Songkick · 5 min read · Jun 4, 2015

Here at Songkick HQ we’re big advocates of design research, which includes welcoming users into our home to collect feedback and insight on our ideas, prototypes and products. This year, we’ve significantly enhanced our research setup — here’s how.

Our in-house research was made possible by the creation of our Research Centre V1.0, crafted by the awesome Jo Packer and Tomasz Zbikowski a few years ago. They literally drilled massive holes in the wall, installed a one-way mirror and kitted out an observation room with the technology needed to run this kind of research.

Old setup

Above: Research setup, pre-2015

1 — A desktop machine, running Windows XP and the screen recording software.

2 — Two webcams: one to record the user at the computer, the other sometimes used to record paper prototyping or mobile sessions.

3 — A microphone feeds audio into the observation room.

4 — A basic LCD monitor with VGA cabling shows the observers the user’s desktop and face camera.

5 — Four pairs of headphones connect to a pre-amp (TUBE MP Personal Processor), allowing team members to listen in and take notes.

The setup had served us well for the past few years, but it was starting to show its drawbacks given our current needs:

  • Multiple device types: We now develop products for a wide range of devices, and the existing setup made it hard to observe users on smartphones, tablets and watches.
  • Formal environment: The previous lab felt a bit formal; participants sit at a desk, which doesn’t mirror the way most of us use technology at home.
  • WMV video files: The Windows machine recorded video as WMV, which isn’t easy to convert into shareable, editable files.

The Upgrade

We decided it was about time to bring the lab up to date, and put together a lean list of items for the upgrade.

We bought the following bits and pieces:

  • 3m Lightning cable (£14.99), so we could record QuickTime video from the iPhone.
  • 2m USB extender (£4), giving an enormous 5m of iPhone cable.
  • 5m Mini DisplayPort to DisplayPort cable (£10.85), to feed the MacBook’s output to a new HD widescreen display.

That was all we had to buy (total cost: £29.84).

We moved a sofa and side table from our reception area into the lab to make a more comfortable lounge. I decided to use my MacBook Pro going forward, as it normally already has the prototypes (or whatever we’re researching) set up. For screen recording I chose Silverback 2.0, with multiple QuickTime instances streaming the cameras. Although it’s unsupported now, it’s free and works pretty well (and Silverback 3 hadn’t been released at the time of writing).
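Silverback grabs the whole desktop, but getting two live camera previews on screen means running more than one copy of QuickTime Player, which macOS won’t normally do on its own. Below is a rough sketch of how that can be scripted (it isn’t part of our actual setup; the open -n approach and the helper name are just illustrative). You still point each instance at its source by hand via File > New Movie Recording.

    # Sketch: launch extra QuickTime Player instances on macOS.
    # `open -n` starts a new instance of an app even if one is already running.
    import subprocess

    def open_quicktime_instances(count=2):
        """Open `count` separate QuickTime Player instances.

        Each one is then pointed at a different source by hand:
        File > New Movie Recording, choosing the gesture webcam or the
        connected iPhone as the camera.
        """
        for _ in range(count):
            subprocess.run(["open", "-n", "-a", "QuickTime Player"], check=True)

    if __name__ == "__main__":
        # One instance for the gesture camera, one mirroring the iPhone screen.
        open_quicktime_instances(2)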

New setup

Above: Research setup, post-upgrade, 2015

1 — The iPhone is connected to the laptop via the Lightning cable. A QuickTime instance streams the phone’s display.

2 — MacBook Pro running Silverback. This records the desktop viewport, which includes the gesture camera (QuickTime), face camera (Silverback) and screen camera (QuickTime).

3 — Gesture Camera: A webcam on a tripod is positioned to the side of the participant so we can observe their gestures.

4 — Bigger widescreen HD display: uses the 5m DisplayPort cable, so clarity is significantly improved.

5 — The audio setup was unchanged.


Above: A sketch showing one of the sessions. Participants can now sit comfortably on the sofa to use any mobile devices. The laptop is on a coffee table with castors, so it can be easily moved in or out of reach.

Observation Room Experience

The display upgrade made a huge difference: observers can now see the interactions and the iPhone screen with much greater clarity. The gesture camera lets us capture participants’ other interactions (sometimes they try to interact with non-interactive items, which is usually a design problem!).


Above: View of the observation room. The chairs face a one-way mirror into the lab. Each chair has its own set of headphones.


Above: View of the screen in the observation room. In this example, the observers can see whatever the participant is seeing on the phone, their gestures and face.


Above: When the researcher switches to the browser, the observers can see the whole screen with greater clarity than in the earlier setup.

Summary & Improvements

The new setup made a world of difference in helping us meet the objectives of our research. On this occasion we were running through user journeys across several device types, much more seamlessly than was previously possible.

Having the video files in a more flexible format has made them easy to upload to Google Drive, and they can be linked to from any documentation or prototypes (with timestamp bookmarks, YouTube-style).

The observers are able to see what’s going on much more clearly.

Whilst this setup was a definite improvement, there are still things we hope to address for our next batch of in-house research.

  • Disk space: Silverback 2.0 (which is free and unsupported, so you can’t grumble) uses a LOT of disk space: 20–30GB per hour. This caught me out after a couple of sessions. We’ll probably resolve it with a portable or network drive next time; a quick pre-session check like the sketch after this list would also help.
  • Audio: This can definitely be improved. Now that we’ve moved from the desk to the sofa, we’ll probably need a more suitable microphone.
  • Webcam glitches: Our Logitech webcam (recording the gestures) intermittently started flashing and stuttering. All we could do was refresh the QuickTime stream, which isn’t ideal in the middle of a session. We need to play around with the setup a bit more.
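On the disk space point, a quick check before each session would have saved me some pain. Here’s a rough Python sketch (not something we actually run; the 30GB-an-hour figure is simply the worst case we’ve seen with Silverback 2.0) using shutil.disk_usage to warn when the laptop is running low:

    # Sketch: warn before a session if the laptop is short on disk space,
    # assuming Silverback 2.0 chews through roughly 20-30GB per hour.
    import shutil

    GB = 1024 ** 3
    USAGE_PER_HOUR_GB = 30  # worst case we've seen

    def check_free_space(session_hours=1.0, path="/"):
        """Return True if there's comfortably enough space for the session."""
        free_gb = shutil.disk_usage(path).free / GB
        needed_gb = session_hours * USAGE_PER_HOUR_GB
        if free_gb < needed_gb:
            print(f"Only {free_gb:.0f}GB free; a {session_hours:g}h session could "
                  f"need ~{needed_gb:.0f}GB. Clear space or plug in a drive first.")
            return False
        print(f"{free_gb:.0f}GB free: room for roughly "
              f"{free_gb / USAGE_PER_HOUR_GB:.1f}h of recording.")
        return True

    if __name__ == "__main__":
        check_free_space(session_hours=1.5)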

Let us know if you have any questions!

Karim

@karimtoubajie
