Last week I was working on a web-usability research project. Looking at the pros and cons of the various methodological approaches, I was struck, shocked even, by the cost differences between DIY, unmoderated software driven solutions and the consultative lab-based approaches.

The usability software products invariably offered scale – the ability to talk to larger numbers of users – flexibility, and speed, all at a very low price point. $1,000 would buy quite a lot of raw recorded user experience data. Compare that with a lab-based, moderated qualitative approach: 8 to 10 in-person sessions could easily cost many times that amount.

"Data" versus "insights" – that seemed to me the dichotomy, with data abundant and immensely cheap, even free, while "insights" appear extremely expensive by comparison.

It's extremely easy to side with the cheaper option if it seems halfway viable, and is extremely fast.

This to me is one of the central if problematic paradigms facing Research right now. In so many areas there are newish options or approaches, often IT-powered, that are radically cheaper than legacy MR approaches: DIY software, crowd-sourcing, open innovation, micro-surveys to mention a few.

Can MR – the human part – compete? How do we continue to generate value, justify consultative fees, without opening ourselves to the charge of over-engineering?

Innovation is as ever key – but not just in the sense of developing a new, better methodology, with the risk of offering only marginal added benefit. We need to innovate the way we deliver insights.

Researchers are not traditionally strong at either the back or front end of a project – reports are often cautious and lengthy rather than concise and well-informed, and pre-project consultative discussions are often non-existent.

It's the pre- and post-analysis phases and insights-generation part of a project that can add serious value, that can help shift away from a focus on cost, towards value perceptions. We need to get better at both – start thinking more like business analysts, less like traditional researchers. Here's my take.

1. Ask Business Questions First.

In the face of the mass of data available, the knee-jerk reaction of the knowledge worker – the Market Researcher – will be to worry about quality, comparability, validity, reliability. All good stuff, but secondary questions in my view.

Interrogating data is about solving a business question – it's something we need to approach with commercial acumen first, and methodological rigour second. It also helps us avoid getting lost in time-sinks, buried in heaps of analyses, drifting into ever more abstruse interrogations that are simply fuelled by our natural curiosity.

2. MR Myopia is Career-Threatening.

Too often we are led to look at just one data source to address a business or marketing problem.

From a supplier MR perspective, this is natural: the Research Design is geared to solving a problem with a particular approach, almost always with a limited budget and constraints on multi-modality.

In addressing the "so what" question at the insights-end of a project, we need to get into the habit of thinking broadly.

Taking a 360-degree approach brings weight and authority: accessing diverse data sources, case studies, historical insights, macro-economic data, competitive intelligence, syndicated reports – anything that will illuminate the problem differently.

Using multiple data points helps give insights bite, leads to business impact – and begins to equate to a senior management perspective.

3. Sector Knowledge Can Help.

Being knowledgeable about a particular industry sector remains a powerful way to generate respect and, potentially, ongoing revenue. Some Agencies execute on this; many don't.

Offering sector expertise – knowledge, experience of competitors and market movements across the globe – can be a very powerful differentiating argument and one that a computer will struggle to replicate.

In summary, all the innovations in MR over the past 5-8 years – including "in the moment" mobile MR* approaches, biometric measurements, predictive analytics, MROCs – potentially help deliver insights that are much richer than a narrow survey Q&A approach.

Better tools on their own aren't enough. It's our ability to make business sense of all the data at our disposal that will count. We may not wish to compare ourselves with Business Consultants, but we could certainly benefit from becoming more analytically effective – impact-focussed, drawing on multiple data streams, and strong in communicating our indicated actions.

Curious, as ever, as to others' views.

* I've just completed a book on Mobile MR – "In the Moment. Perspectives on Mobile Research" – will be publishing in the next few days. Keep your eyes peeled.