When Google Consumer Surveys launched last year, I took an immediate interest. As a digital marketer, I love finding new tools to test. I’m also a big fan of most Google products, and this one seemed to have opened up some very interesting news and reviews around the credibility of its survey results in comparison to more traditional approaches to market research.
In November 2012, I saw a story that declared Google Consumer Surveys “blew the lid off” one established polling company and scored the highest among the many research firms running surveys during the U.S. election. To summarize, Google Consumer Surveys came the closest to predicting Obama’s win, and the story noted some “pitiful” success rates by others. Only a few days earlier, the Pew Research Center for the People & the Press had published a detailed account of how Google Consumer Surveys test results compared against other forms of research, such as telephone-based polling. Of course, discrepancies were noted, and you can read all about it here. But hey, when your new product outperforms traditional solutions in a way that’s both impressive and easy to remember – a U.S. election prediction – it sounds good and kind of sticks to the brain.
So I kept thinking about it, and I was convinced that I wanted to test this tool and maybe even use it in a professional setting. Convincing others that it would be worthwhile was easy, but I still wanted to give it a test drive myself with no risk whatsoever. I got busy, but at the end of January 2013, the opportunity to play with Google Consumer Surveys came up, and I tried it out with no official goal but to satisfy my own personal curiosity.
Signed into my Google account, I landed at the Google Consumer Surveys page and clicked around. Within seconds, Google offered me a coupon worth $75 toward my first survey. For a single question asked of 1,000 members of the general population, it would only cost me $25. So I decided to cook at home that night instead of eating out, and clicked through to activate my question.
What to ask 1,000 people? Well, Google Android vs. Apple iOS was on my mind. I’d made the switch from iPhone to Android about one year earlier. My brother was playing with a new Android, and loving it, having just switched from a Blackberry touchscreen. And at the software company where I work, we had Apple and Android fans in the office and regularly blogged about which operating system was in the lead – in terms of app downloads or units sold or whatever the latest counts indicated.
So the question I chose was fully loaded and pure opinion: “Which mobile operating system is better?” Again, it was chosen simply to get the tool running and satisfy my own curiosity about what the tool would do with it. Click to submit, yes, thank you. The service charged my credit card, and the whole process took less than five minutes.
For a few hours, my single-question survey was “pending approval.” I checked in a few times the same day, but it hadn’t moved. I mentioned my test to a co-worker, and we joked about Google disallowing me due to the possibility that my question might paint Android as the loser. They’d never do that, he said. And I said, if they do, then maybe I can blog about that. And then we joked that the survey might proceed but that there’s no way Google would allow Apple to win, even if it was just a single-question survey with no hint about how the results might be used. Also, he seemed to think the tool would just burn through a thousand answers in minutes. No way. Well, possible, I guess. Probably not.
About a half day later, the Google Consumer Surveys tool was reporting that I’d received a little over 200 responses, that the answer itself was “too close to call,” but that it was “trending towards” a winner.
On Saturday, January 26, 2013 – almost 48 hours from the time I’d given permission to charge my card – I got the results notification from Google – subject line: “Your Google Consumer Survey ‘Android vs. iOS’ is complete.” They gave me a link to the results, and a coupon for $15 off if I chose to run another survey within seven days. They also attached a spreadsheet of my data, but clicking view from Gmail gave me a 24-page preview of the file and a notification that the information was just too large to display in full without downloading. It was a nice touch, but the spreadsheet was a little less interesting than checking out the results within Google’s own interface.
To make this long story shorter, Google Android won.
But if you’ve ever worked with a research company on a survey project, you’ll know the data for a single question goes a lot deeper than the overall response. You get spreadsheets full of data from which you can pull insights about certain demographics, geographic differences in your responses, and often some pretty summaries and charts to help highlight the things that really stood out. My experience with Google Consumer Surveys offered some great data digging deeper into gender differences, age groups, geographies, urban density, income levels and other details about the people who’d clicked in favour of either Android or Apple. But, as you’d expect, all of it was interactive. The simple bars on the chart would grow or shrink to satisfy my clicks. I could also see the average response times, which hours of the day recorded the most responses, and a graphical comparison of my survey’s sample distribution to US Census data. There’s also an insights tab, which for me only offered a single and uninteresting “insight” relating to geographical differences, but I imagine a larger, more robust project would populate that tab a little more fully.
Overall, a very slick but simple experience. Will I use Google Consumer Surveys again? Yes. I had a great time playing with the data Google gave me around a single question. Some people might call the interface a little dull. I’ve read that Google isn’t known for great design. But I like the simplicity. And it beats the slide presentations and spreadsheets I’ve seen from past projects.
Now, if only I could pull up this data on an Android app…