“Sweet Spot” Pricing Research: Too Much, Reasonable, Bargain and Steal.

Do you need to know the “best” price to charge for a product or service? Here is a practical way to get after this issue without spending a mint.

It is simplistic and misleading to list a series of attributes and prices and ask respondents to rate each on a scale of 1 to 7 by importance. All items will likely skew toward the higher ratings, because the person doing the rating does not have to trade anything off, so she loses nothing by rating every attribute high.

On the other hand, going into advanced “conjoint analysis” or “max diff” analysis, where trade-offs are analyzed ad infinitum to reach a price point for various mixes of product attributes, has its shortcomings as well. It is a costly research maneuver requiring specialists and expertise beyond most clients. It can consume all the time and energy of a survey, crowding out other important questions for this audience, and it can still be misleading.

More reasonably, if you can explain a product fairly simply with its main attributes, you can ask a question that tries NOT to find out which attributes are most desired, but at what package price people “tune out”, “tune in”, are “attracted”, or are “smitten”. These are human dimensions that we all experience when we shop. These reactions trigger (or suppress) the release of dopamine at the synapses. (See our “Brainy Decisions” blog.)

Buyers almost always rank “cost” among the most important considerations in a purchase. In our 20 years of research experience, however, they almost never rank it as the single most important one. It usually comes in third or fourth, after attributes like “quality”, “performance”, “style”, “durability”, “reliability”, and “brand”.

Respondents in real buying situations can seriously consider only four or five key attributes of any product or service. After that, differences in attributes start to blend together or fade completely. So one way of keeping pricing questions real, and keeping the respondent from low-balling them, is to ask the following type of question.

1. For (product A), at what price would you consider this product…
a. To be too much or “overpriced” and would definitely not purchase this product $_____
b. To be a “reasonable” price and would likely consider purchasing this product $_____
c. To be a “bargain” price and would definitely consider purchasing this product $_____
d. To be a no-brainer “steal” at a price that is almost too good to be true $_____

2. For (product B), at what price would you consider this product…
a. To be too much or “overpriced” and would definitely not purchase this product $_____
b. To be a “reasonable” price and would likely consider purchasing this product $_____
c. To be a “bargain” price and would definitely consider purchasing this product $_____
d. To be a no-brainer “steal” at a price that is almost too good to be true $_____

This style of questioning, applied to two or more combinations of attributes, can provide a very accurate picture of what the audience values. It also fits the respondent’s own way of thinking about pricing: most people apply these categories (overpriced, reasonably priced, bargain, steal) in almost any buying situation where they have done even minimal shopping.
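To illustrate how answers to these four questions might be summarized, here is a minimal sketch in Python. The response data and field names are invented for the example, and reading the “sweet spot” as the range between the median bargain price and the median reasonable price is one simple interpretation, not a prescribed method.

```python
from statistics import median

# Hypothetical responses for Product A: each respondent names four
# price points (overpriced, reasonable, bargain, steal), in dollars.
responses = [
    {"overpriced": 120, "reasonable": 90, "bargain": 70, "steal": 45},
    {"overpriced": 140, "reasonable": 100, "bargain": 75, "steal": 50},
    {"overpriced": 110, "reasonable": 85, "bargain": 65, "steal": 40},
    {"overpriced": 130, "reasonable": 95, "bargain": 80, "steal": 55},
]

# The median of each category is a robust summary of where the
# audience's mental thresholds sit.
thresholds = {
    category: median(r[category] for r in responses)
    for category in ("overpriced", "reasonable", "bargain", "steal")
}

# One simple reading: the "sweet spot" lies between the median
# bargain price and the median reasonable price.
sweet_spot = (thresholds["bargain"], thresholds["reasonable"])
print(thresholds)
print("Sweet spot range:", sweet_spot)
```

With real data you would also want to look at the spread of answers, not just the medians, and compare products A and B side by side.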

Bottom Line:
In designing questions to understand the “sweet spot” in a market (not too much and not too little), we consider first how most people think when they purchase products or services. “Price” is usually not the first or foremost attribute people consider (“quality”, “performance”, “reliability”, “durability”, “style”, and “brand” are often rated higher), but it almost always ranks among the top four. When buyers do get down to price, they are turned on or off by their own internal human measuring system of “too much, about right, bargain, or steal.”

(See also our blog “Brainy Decisions”, our web page on Startups where we have actually used these techniques, and our page on the Data Robotics Drobo startup.)

For more information on pricing, tell us about your project, or email Larry@wilsonresearch.com for a complimentary discussion of your pricing research needs.


Brainy Decisions

I just finished a book I highly recommend, Jonah Lehrer’s “How We Decide”, something that market researchers, among others, should look into. There is good stuff in there on focus groups, on how stressful situations demand prefrontal-lobe concentration, on how emotions can sometimes interfere with and other times aid decision making, on how complex decisions that might overload the prefrontal lobes are helped by “time outs” that let the emotional brain catch up and provide support, and on how a world-class gambler hones his intuitions in making betting decisions. So when a tough decision is literally freezing you into decide-a-phobia, go take a shower, or a nap, or a walk and give it a rest. That is, unless you are in a jet plane at 36,000 feet with no way to steer.

Sully glided his jetliner gently onto the Hudson River, his prefrontal lobes taking over to great effect. A miracle in itself. Lehrer describes how, years ago, a DC-10 jumbo jet flying from Denver to Chicago suffered a broken disc in its tail engine, severing all three redundant hydraulic systems and leaving the pilots with no means of steering the plane. Yet somehow they found a unique decision-making path through this unprecedented situation (nothing in the pilots’ manual covered it, and no expert pilot on the phone could help). Although 112 lives were tragically lost, the pilots’ decisions saved 184 people almost certain to have perished otherwise. The captain was on his own, as was Sully, and his decisions led to new training to help pilots cope with this harrowing situation.

It turns out that focus group research for shows like Seinfeld, Hill Street Blues, and The Mary Tyler Moore Show all indicated flops in the making. How did this research go wrong? You have to understand what it is that is happening. Remember Bob Dylan’s “Ballad of a Thin Man”? He could have been talking about researchers (Mr. Jones) when he sang “You walk into a room and you know there’s something happening, but you don’t know what it is, DO YOU, Mr. Jo-o-o-ones?” You really have to understand exactly what you are asking, of whom, and whether you are missing the point by asking someone to tell you why they like the taste of this better than that. When you start focusing on one trait or another, you can skew or muddle the overall experience, the big picture, the most important picture.

’Nuff said. Related to this decision making is research on the brain that is now illuminating all kinds of interesting things. Go to www.CharlieRose.com and watch the first four episodes (of what will be twelve hour-long segments when completed) of his Brain Series with Eric Kandel, Nobel Laureate for neurological work on memory. (Lehrer was a student in Kandel’s lab.) This is truly a great use of the airwaves for educational advancement in understanding the neurological workings of our moods, social behavior, empathy toward others, visual fields, memory, facial recognition, and much more.

Here is a blog by Jonah Lehrer that I also highly recommend. Personal disclosure: None! I have no relation to Jonah whatsoever. He just does good stuff worth reading about and thinking about. http://scienceblogs.com/cortex/2009/12/free_will_and_ethics_1.php#more


Americans Are Giving, Research Shows

As of January 18, 2010, nearly two-thirds of U.S. adults (64 percent), and an even higher percentage of African-Americans (81 percent), have given or plan to give to relief efforts following the earthquake in Haiti, according to a survey conducted by Zogby Interactive, a Utica, N.Y., research company.

Interesting aside: Jonah Lehrer in “How We Decide” (see next blog) shows through the “ultimatum game” that decisions researchers might expect to be “selfish” are actually based on a kind of moral sense of fairness. Our brain synapses get boosts of dopamine when we receive rewards or gifts, but we get almost as much of a boost when we give to charities and participate in altruistic causes. So to some degree we have a “hardwired” predisposition to give aid to others.

Don’t let this take away from your own personal decision and free will to give. It is just that biology and neurology help us become what we really want to be anyway. Two of my year-in, year-out favorite charities, always hard at work, are www.oxfam.org and www.doctorswithoutborders.org, but I am sure you have your own.


Communicating With Designers

As a former product marketing manager (acquisitions editor) in the publishing world, I have worked with many a designer on book jackets, book design, and artwork. I always wondered why it was so difficult to get a designer to see what I wanted in a design. Of course, I was usually trying to convey functional ideas about what the design should do, and designers always seemed to speak a different kind of language. Eventually, over many hands-on experiences, I learned a whole lot from the designers I worked with. I am not sure they learned anything from me.

The following link is to a post by a designer who appreciates the fact that designers (including himself) have some quirky ways. Getting the communication channels uncluttered and working between quirky designers and geeky developers is an interesting challenge, and there are some good suggestions and thoughts at this blog. I hope they smooth the waters between two worlds that sometimes seem like two unbridgeable brane universes, and two types of brains.



Sample Size Calculator

We have a new sample size calculator up on the site. It is an Excel spreadsheet that you can download and use as you wish. You can find it at http://www.wilsonresearch.com/main/market_research_company_articles.php
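As a rough sketch of what a calculator like this typically computes, here is the standard sample size formula for estimating a proportion, with a finite-population correction. The function name and defaults (95% confidence, a ±5% margin of error, and the worst-case assumption p = 0.5) are illustrative, not a description of the spreadsheet itself.

```python
import math

def sample_size(population, confidence_z=1.96, margin_of_error=0.05, p=0.5):
    """Sample size needed to estimate a proportion at a given
    confidence level and margin of error. z = 1.96 corresponds to
    95% confidence; p = 0.5 is the most conservative assumption."""
    n0 = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite-population correction: smaller populations need fewer
    # respondents for the same precision.
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

print(sample_size(10_000))     # about 370 respondents
print(sample_size(1_000_000))  # about 385 for very large populations
```

Note how the required sample barely grows past a few hundred even as the population grows into the millions, which is why national polls can get by with roughly a thousand respondents.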


New Article

I got my latest article published on EzineArticles. Here it is if you are interested; the original link is http://ezinearticles.com/?id=2552698

Top 10 Mistakes in Conducting Online Market Research

1. Not knowing what you don’t know

It’s easy to do online surveys these days. Too easy. It may be so cheap and easy that you do it without understanding the basics and end up with misleading answers that send your business down the wrong path. This is worse than never doing any research at all. Spend a little time and get to know what you don’t know about market research. A basic review of the following topics is a great start.

  • Sampling and sampling error
  • Quantitative vs. qualitative research
  • Question bias / question design
  • Response rates / confidence levels
  • Questionnaire coding
  • Why people take surveys (social contract)

Some great books on these subjects are:
“Mail and Internet Surveys: The Tailored Design Method” by Don A. Dillman
“Asking Questions: A Definitive Guide to Questionnaire Design” by Norman Bradburn, Seymour Sudman, and Brian Wansink
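As a quick illustration of the first topic above, sampling error: for a simple random sample, the margin of error around an observed proportion can be sketched as below. The z value of 1.96 (95% confidence) and the worst-case p = 0.5 are assumptions chosen for the example.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a proportion from a simple random sample
    of size n, at 95% confidence (z = 1.96), worst case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# A sample of 400 gives roughly +/- 4.9 percentage points.
print(round(margin_of_error(400) * 100, 1))
```

In other words, if 60% of a 400-person sample prefers your product, the true figure is very likely somewhere between about 55% and 65%, and that assumes the sample was truly random in the first place.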

2. Not eliminating sampling errors

Now that you know what sampling error is, you can understand why controlling it is critical to conducting meaningful market research. Many of the online surveys you see today are full of potential sampling errors; don’t add to them. Take the time to develop a good sample and then make sure you get as many of those people as possible to your survey. This is probably the biggest difference between professional market research and the do-it-yourselfers. The pros take the time and money to develop good samples and then make sure they get good response rates. You can too if you put in the effort.

  • Always use a true random sample
  • Track your respondents (PINs)
  • Program the survey to eliminate duplicates and respondents with bad intentions
  • Check the data for oddities (clean the data of illegitimate records)
  • Use incentives (they do not have to be monetary; see social contract)
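The PIN tracking, duplicate elimination, and data-cleaning steps above can be sketched as follows. The record format and the two-minute "speeder" threshold are invented for illustration; a real survey platform would supply its own fields and a threshold tuned to the questionnaire's length.

```python
# Hypothetical raw records keyed by respondent PIN. Records with a
# duplicate PIN or an implausibly short completion time are dropped.
raw = [
    {"pin": "A001", "seconds": 312, "answers": {"q1": 4}},
    {"pin": "A002", "seconds": 45,  "answers": {"q1": 7}},  # too fast: likely speeder
    {"pin": "A001", "seconds": 290, "answers": {"q1": 5}},  # duplicate PIN
    {"pin": "A003", "seconds": 410, "answers": {"q1": 3}},
]

MIN_SECONDS = 120  # assumed minimum plausible completion time

seen = set()
clean = []
for record in raw:
    if record["pin"] in seen:
        continue  # keep only the first submission per PIN
    if record["seconds"] < MIN_SECONDS:
        continue  # drop likely speeders / bad-faith responses
    seen.add(record["pin"])
    clean.append(record)

print([r["pin"] for r in clean])  # → ['A001', 'A003']
```

The same pass is a natural place to flag other oddities, such as straight-lined answer grids or contradictory responses.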

3. Making decisions with inaccurate information

If you never understood any of #1 and #2, it is a good bet your survey is useless. Worse, you may think it is telling you what to do with your important business decisions. Making decisions with inaccurate information is worse than taking a guess.

4. Writing bad questionnaires

You might get everything else right and then go and write a bad questionnaire. Lots of online surveys have at least one bad question. What is a bad question? It’s any of the following:

  • Biased questions
  • Unanswerable questions (impossible to know the answer)
  • Questions with two meanings
  • Hard-to-understand questions (way too long, strange use of words)
  • Dumb questions (asking about something the researcher should already know, or has already asked)

5. Programming a hard-to-take survey

After you have spent all that time creating a good sample and writing good questions, don’t ruin it by programming a hard-to-use survey. One of my top gripes is forcing respondents to answer every question. Too much of this will get you either a contrived answer or a respondent who leaves. Neither is good.

  • Don’t force non-critical questions
  • Don’t have non-standard buttons
  • Don’t use non-standard technologies (Java applets, etc.)

6. Going cheap

The good and the bad of online market research is that it can be much less expensive than in the past. The bad part is that this makes it just too easy to conduct flawed research. Many of the items above cost time and money (sampling, questionnaire design, etc.). Spend the time and money to do it right. Even better, hire a quality market research firm like Wilson Research Group to do it for you. Either way, you will save money in the long run by conducting quality market research.

7. Confusing social networking with quantitative market research

Talking with lots of people (social networking) might gain you valuable qualitative information, but it is not quantitative market research. The difference: qualitative information rarely represents your whole audience; it gives you individual opinions and ideas. Quantitative research, on the other hand, is designed to represent your whole audience and gives you answers you can know reflect all of your customers. Don’t confuse the two. Social networking can be useful, but understand its limitations.

8. Being overly “cute” with the survey tool

Your market research is supposed to gather meaningful information about your target audience, not impress respondents with all the high technology you can master. Keep your survey technology as simple as possible to avoid excluding respondents who are not up to speed with the latest and greatest.

  • Keep Flash and JavaScript to a minimum (use them, but not in critical areas, and always provide alternatives)
  • Use tried and true web technologies

9. Relying on only one source of information

Market research is a snapshot of opinions at a certain time. If your research yields wildly different answers from what you anticipated, it is wise to confirm the conclusions with more data.

  • Conduct another survey
  • Look for corroborating data

10. Ignoring your market research

If you go to all the trouble to conduct a good study then have a plan to do something with that information. Too many organizations will conduct market research for one reason or another and when they get information back just sit on it. Don’t be the one who ends up saying “Wow, if we had just done what our market research told us we wouldn’t be in this bad position”. Before you conduct any online research have a plan as to what you will do with it.