
Brand Awareness Measurement: What's On Your Mind?


What's on your mind? Awareness is at the top of my mind at the moment. Measuring awareness of companies, brands, products, people or advertisements has never been easier with the advent of online survey tools. Awareness exercises present an interesting opportunity for both consumer and B2B marketing researchers. There are two components of awareness that researchers are interested in: unaided (also known as top-of-mind awareness) and aided.

Unaided awareness is measured by asking respondents which company(s) or brand(s) they have heard of. This is done with an open-ended question. The typical question asks for the single brand that comes to mind. Alternatively, this can be expanded to capture two or three companies or brands. When creating these questions, make sure to include a 'none' option so that respondents who cannot bring a company or brand to mind still have something to select. This mitigates any sense of failure at not being able to recall a name.

If you use an open-ended question with more than one write-in box, the 'top of mind' response is the company or brand in the first position. From a survey analysis standpoint it is interesting to calculate the number of brands recalled in the unaided exercise; the companies or brands named in this portion of the exercise form the evoked set. It is also insightful to cross-tabulate the number of brands mentioned by other key variables such as product usage levels, demographics or engagement with the brand category.
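As a rough illustration of that analysis step, here is a minimal pandas sketch with made-up column names and data; it pulls the top-of-mind brand from the first position, counts the size of each respondent's evoked set, and cross-tabulates that count against a usage variable.

```python
import pandas as pd

# Hypothetical respondent-level data: up to three unaided brand mentions
# plus a usage-level variable (all names and values are illustrative).
df = pd.DataFrame({
    "mention_1": ["Visa", "Amex", None, "Visa"],
    "mention_2": ["Mastercard", None, None, "Discover"],
    "mention_3": [None, None, None, None],
    "usage_level": ["Heavy", "Light", "Light", "Heavy"],
})

# Top-of-mind brand is whatever sits in the first write-in box.
df["top_of_mind"] = df["mention_1"]

# Evoked set size = number of non-empty mentions per respondent.
mention_cols = ["mention_1", "mention_2", "mention_3"]
df["evoked_set_size"] = df[mention_cols].notna().sum(axis=1)

# Cross-tabulate evoked set size by usage level.
print(pd.crosstab(df["usage_level"], df["evoked_set_size"]))
```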

An alternative method involves two screens, the first with a single open-ended box for top of mind, followed by a second screen asking respondents to add other companies or brands they are aware of (see example survey question below).

Example Survey Question for Brand Awareness Measurement: Again, thinking of credit cards, what other credit cards come to mind?

The second half of the exercise involves aided awareness, where a list of key competitive brands or companies is presented. The list itself acts as an 'aid' to the respondent, so expect the awareness percentage to increase across the brands listed. The question can be formatted in two ways: as a multiple response where respondents select any or all brands they have heard of, or with categories such as 'have used', 'aware of, but have not used' and 'unaware of'. The second method allows researchers to estimate usage, overall awareness and levels of unawareness.
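A quick sketch of how those three categories might be tabulated to back out usage, total awareness and unawareness, again with purely illustrative data:

```python
import pandas as pd

# Hypothetical aided-awareness responses for a single brand.
aided = pd.Series([
    "Have used", "Aware of, but have not used", "Unaware of",
    "Have used", "Aware of, but have not used",
])

shares = aided.value_counts(normalize=True) * 100
usage = shares.get("Have used", 0)
total_awareness = usage + shares.get("Aware of, but have not used", 0)

print(f"Usage: {usage:.0f}%  Total awareness: {total_awareness:.0f}%  "
      f"Unaware: {shares.get('Unaware of', 0):.0f}%")
```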

One-time snapshots of awareness are not as useful as measures over time. Tracking studies allow the researcher to measure the awareness impact of changes to the marketing mix. For example, additional media spend in a market should increase awareness, and hopefully lead to additional sales and profit. Having a benchmark to compare against is critical to optimizing marketing efforts.
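For the wave-to-wave comparison itself, one reasonable approach is a two-proportion test; the sketch below uses invented counts and assumes statsmodels is available.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical tracking data: respondents aware of the brand and total
# respondents in each wave (before and after additional media spend).
aware = [120, 156]   # wave 1, wave 2
asked = [400, 400]

stat, p_value = proportions_ztest(aware, asked)
print(f"Wave 1: {aware[0]/asked[0]:.0%}  Wave 2: {aware[1]/asked[1]:.0%}  "
      f"p = {p_value:.3f}")
```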


90% of Businesses Don't Know What Customers Want



Do you know what your customers want? Jaynie L. Smith, author of Relevant Selling, says that 90% of businesses do not. Smith believes that the businesses that focus on finding out what the customer wants, and then providing it, are those that will fare best in the current economic recession.

To obtain these statistics, Smith analyzed 150 customer surveys to determine why customers make certain purchases at certain stores. After analyzing ten years' worth of market research for over 100 businesses, she discovered that 90% of those businesses did not know what their customers valued most.

In order for you to better know your customer base and ensure you maintain relevance, Smith recommends the following tactics:

  • Customers care how you are selling, not what—Most products can be purchased at any number of stores, so why do customers ultimately choose one retailer over another? They search for an added value, such as customer service or reliability.
  • Your current customers probably want something different from your future customers—70% of the time, current and potential customers differ in what they value most, so try to target a different marketing message to each of these customer bases.
  • Use your new-found knowledge—If your customers are most interested in how quickly you solve help-desk requests, concentrate your efforts on minimizing hold-time, for example. Once you achieve brag-worthy statistics about an area your customers care about, be sure they know how great you're doing ("Voted best customer service in the tri-state area three quarters in a row!")
  • Invest in reliable customer surveying methods—You've read how important it is to know about your customers' needs and values, so go out and do something about it!

Cvent can help you figure out what kind of survey works for your organization, and put that information into action so you can be part of that elite 10% of businesses that know what their customer wants. Register for an upcoming webinar to learn more about Cvent's offerings.

Market Research & Gamification: Let the Games Begin


Let's play a game. Seems harmless, but games and the concept of gamification are working their way into our world, especially the realm of consumer marketing research. Have you ever wondered how video games can enthrall and engage participants for hours? Having two teenagers myself, it amazes me how they can occupy the couch playing a game for hours without flinching.

It is this unyielding level of engagement that fills survey researchers' dreams. Is it the holy grail of survey design? I doubt we can ever approach the level that game designers achieve, yet we can learn a few tricks and bring them into our online survey design process.

What gamers go after is a form of achievement motivation. They are motivated to achieve a certain level, defeat an opponent, or uncover hidden treasure. This keeps them engaged long after the initial thrill has worn off. Gamification is about creating a framework to increase respondent focus on the task at hand. Increasing survey respondent focus has the potential to yield higher quality data (our true holy grail).

One way to accomplish this is to set up a game. For example, you can ask respondents to describe a brand or an ideal product using a limited number of words, say 10. This leads to a higher level of concentration and ultimately a payoff in achievement and a feeling of competence. Alternatively, you can create a competition. Set the framework that the respondent has a limited amount of time, perhaps 30 seconds, to list their favorite brands. This stimulates concentration by increasing tension, and the payoff is a sense of success, much like a runner crossing the finish line.

Other options include questionnaire design that allows respondents to compare themselves to others. This promotes relatedness, autonomy and brings some fun to the survey.

To wrap up, gamification can instill a bit of fun and friendly competition and increase engagement levels in our surveys. Yet we must be cautious not to insert bias into the equation. Although we are not in the entertainment business, we can make our surveys more entertaining.

Survey Sampling: Combating Non-Response Errors by Comparing Respondents to Invitees


By their very nature, surveys are subject to certain types of error. Only a census reaches the total population, and even then it can be subject to errors associated with question construction. With the ability to link survey data to your CRM, you can assess the potential for non-response error. Let's take a quick look at how that might work.

Using your CRM tool to select a sample should allow you to incorporate metrics that are critical to your business success. These might include profitability, sales within the last six months, most recent transaction, geographic region, number of customer complaints, or a segment value. These can vary depending upon your focus – consumer or B2B marketing research. Without going deeply into sample design, if your process includes a random selection then you can be reasonably assured your sample is an appropriate representation of the elements within your CRM. Take the time to validate that your sample mirrors your customer or prospect base by comparing the percentages of each file.

These key metrics will be used again in comparing those who responded to your survey with the sample that was invited to participate. The table below shows an example where respondents differed from invitees on two dimensions (region and tenure). You can incorporate as many metrics as needed, but typically 4–5 is a workable number.

[Table: example comparison of invitee vs. respondent distributions by region and tenure]
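For readers who want to run this comparison themselves, here is a rough sketch, using invented counts, of how invitee and respondent distributions might be compared on one metric; a chi-square test is one reasonable way to judge whether the gap is larger than chance.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical counts by region for invitees vs. respondents
# (your own comparison would use the key metrics from your CRM).
counts = pd.DataFrame(
    {"invited": [400, 300, 200, 100], "responded": [120, 60, 30, 40]},
    index=["Northeast", "Southeast", "Southwest", "West"],
)

# Compare the percentage distributions side by side.
pct = counts.div(counts.sum(axis=0), axis=1) * 100
print(pct.round(1))

# A chi-square test gives a rough read on whether respondents
# differ from invitees beyond what chance alone would produce.
chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```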

As the researcher, it is your call whether the results seen above are too far afield. You do have options, however. First, you can look to your non-responders in these quota cells and send additional follow-up invitations. If you have exhausted your number of contacts, you can re-sample your CRM and select a random sample using your key metrics as filters. For example, we might re-sample customers from the Southwest region or those with 6–12 months of tenure to increase their share of responses.

Alternatively you can go ahead with the data you have and document the differences clearly in your analysis, or you can apply weights that compensate for the respondent distribution. This process, in short, assigns weights to the actual data to make it more representative of the population. More on working with weights in another post.
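As a simple illustration of that weighting idea (a sketch, not a full weighting methodology), cell weights can be formed as the ratio of the target share to the respondent share; the shares below are made up.

```python
import pandas as pd

# Hypothetical distributions: share of the invited sample (the target)
# vs. share of actual respondents, by region.
target_share = pd.Series({"Northeast": 0.40, "Southeast": 0.30,
                          "Southwest": 0.20, "West": 0.10})
respondent_share = pd.Series({"Northeast": 0.48, "Southeast": 0.24,
                              "Southwest": 0.12, "West": 0.16})

# Cell weight = target share / respondent share: respondents from
# under-represented cells count for more than 1, over-represented for less.
weights = target_share / respondent_share
print(weights.round(2))
```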

5 Secrets to Training People...Learned from Horses


I love horses. They have been among my cherished friends and teachers for as long as I can remember. When I tell many of my old friends that "I'm a trainer," they automatically assume I'm a horse trainer because they know my affinity for and experience with equines. I have to explain that I actually train people more often than horses!

As I was contemplating what training topic to blog about this week, horses kept crossing my mind.  I took this as a sign and started thinking of the many parallels that exist between training horses and people.  Yes, there are many things that are similar (and many that are VERY different)! 

Do you love training animals and people? Here are a few thoughts about the lessons I apply in the classroom that I learned in the arena.

  1. Horses are curious creatures. Engaging their curiosity vs. using brute force, or the "because I said so" approach, is always best. Makes sense with people, right? Structuring activities that tap into curiosity and creativity makes learning fun and fast!
  2. Positive first experiences are key to success.  Burn a horse with a bad experience once, and they might not be all that open to trying again, no matter what you do.  It can take a long time to gain trust and willingness so you have to do everything you can to set them up for success.  With people, this is critical…small wins will help to build confidence and willingness to keep learning and trying new skills.  
  3. Smooth motions vs. jerky motions will get you where you want to be, faster.  Sudden, jerky movements cause stress and can move into fear.  How would you define this one with people in the classroom?  When coaching/providing feedback, transitioning to new topics, when you’re asked questions that take you outside your comfort zone, etc?  What small shifts can you make to make your classroom even more consistent and smooth?
  4. Repetition leads to consistency.  With horses you can’t just try a new movement once and expect perfection the next time you try.  It takes many hours in the saddle to perfect skills and if you get things right in one direction, it doesn’t always mean in a new direction you will get the same results.  For example, you may get a horse to spin to the right with great ease…but the same move in the opposite direction can cause a melt-down.  The skill is the same, but the situation is different.  The lesson here…try skills in lots of different ways.  Teaching sales?  Your student might master the skills with one product, but make sure you have exercises that take the same skills, but applied to a different product.  And then repeat until it feels more like a habit. 
  5. Fear often drives behavior. Find out where the fear is coming from and do your best to get rid of it. If fear is present, you're not going to be moving forward, period.

What have you learned about training people from animals? 

Happy trails my training friends! 

Make Business Dreams Reality & Avoid [Costly] Mistakes


According to a recent survey, asking for input from your target audience can save businesses, particularly start-ups, big bucks. Because new businesses are the most susceptible to failure, it is especially important for them to survey customers before making any big business decisions. Some entrepreneurs are hesitant to enlist the help of surveying software because they believe it is too time- and money-intensive, but avoiding a huge mistake caused by poor market knowledge is worth it.

So how can you best go about surveying your target customers? Here are a few tips to get started:

  1. Define your assumptions. Businesses, both new and established, always make assumptions about their target audience: Who are they? What do they want? Write those assumptions down so you can test them.
  2. Verify (or disprove) those assumptions with a well-thought-out plan and specific goals. What do you want to get out of the surveying project? Do you want to confirm that Mommy Bloggers are interested in your new product? Or find a way for your marketing department to create content that drives value faster? Maybe you want to understand whether what's already in the market is meeting consumers' needs, or if there's a big pain point that your business could be addressing.
  3. Consult best practices for survey design. Question wording, flow, and question types can all impact the results you get, so check out published best practices for more information.
  4. Do the analysis. The biggest surveying mistake any organization can make is asking for opinions and then ignoring the results. Don't think you have the expertise or time needed to analyze your results, but don't want to spend a ton of money on a third party market research firm? Cvent's Web Survey Professional Services team can do it for you!
  5. Select the right DIY tool. Look for automation, variety of question types, email marketing (plus corresponding tracking), contact management/segmentation options, text analysis and other out-of-the-box reporting.

Ready to get started? Sign up for an upcoming demo and learn more about Cvent's key online survey features that will help you start being confident that your decisions will drive value.

The Screen Door in Market Research Surveys


Marketing researchers have insight to share. Yet if you are involved in research and answer screener questions truthfully, you will, more likely than not, be screened out of most online survey opportunities. In full disclosure, on a few occasions I have not listed myself as a research guy because I was genuinely interested in the topic of the survey. I did this not to be secretive but because I felt my opinions would be of use; after all, I am a shopper, a parent and an owner of various technological devices. In theory this makes me qualified for many surveys.

The concept of screening questions (aka screeners) has a significant impact on our work. After all, we should be conscious of maximizing the value of the data we collect. In truth this is a two-way street: we have a clear picture of who the survey is targeted to, and the prospective respondent wants to invest their time in a survey of interest to their life stage.

Screeners are typically found at the beginning of the survey and are designed to weed out those who do not qualify for the study, either because of a disconnect with the topic (e.g. accountants would generally not be interested in a study of marketing practices) or a quota (e.g. no more than 20% of respondents should be male and under age 30). Screeners serve a valid purpose as gatekeepers, letting in only those for whom the study is designed.
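To make the gatekeeping idea concrete, here is a generic, platform-agnostic sketch of screener logic; the field names, criteria and quota figures are invented for illustration.

```python
# Running totals for one hypothetical quota cell: males under 30,
# capped at 20% of completes.
quota_limit = 0.20
quota_count = 45       # completes already in the cell
total_completes = 240  # completes so far overall

def passes_screener(respondent: dict) -> bool:
    # Topic screen: e.g. only people involved in the purchase decision.
    if not respondent.get("involved_in_purchase", False):
        return False
    # Quota screen: admit only if the cell stays under its cap.
    if respondent["gender"] == "male" and respondent["age"] < 30:
        share_if_admitted = (quota_count + 1) / (total_completes + 1)
        if share_if_admitted > quota_limit:
            return False
    return True

print(passes_screener({"involved_in_purchase": True, "gender": "male", "age": 27}))
```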

Depending on your focus (either consumer or B2B marketing research) screeners may look different. You will typically find age, gender and state of residence early in the process for consumer research. In B2B marketing studies you may find industry, company size or job role as screeners. Involvement in the purchase decision is also a commonly used screening question.

The one that invariably gets me is the question, "Do you or anyone in your household work in marketing, advertising or marketing research?" Which begs the question: for a survey on grocery store selection, why would my opinions be any different from those of someone with a similar educational and demographic background?

Screening questions are invaluable in making sure researchers get qualified respondents into the funnel and that those for whom the survey would not be of interest are kept out. If you use screeners, think about the impact of weeding people out. Best practice includes thanking them for their 15 seconds and letting them know why they were screened out of the process. This will help keep their interest up so the next survey that comes their way is received and not sent to the trash file.

Survey Says: Too Many Surveys



Are you suffering from "opinion overload"? Are you tired of having to opt out of surveys from your magazine subscription, your insurance company, and even that random website you were browsing on your lunch hour? Most likely the answer here is YES. So why would you want to make your customers take yet another survey for your organization? Don't you think they are just as "over" surveys as you are? I know what you're thinking, "But surveys can be such a valuable customer feedback tool! How will I know what my customer wants or doesn't want without surveys?!"

We agree with you: customer surveys are incredibly important to providing your client base with the products and services they want. However, it's incredibly important for you to properly disseminate and administer customer surveys to avoid burnout among your most valuable input-providers. Here are a few ways to maximize the benefit while minimizing the annoyance to your customer.

  • Be clear what your goals are. What do you want to find out with your survey? How are you going to use the information you receive? Set specific goals for what you hope to achieve, and follow through with them.
  • Identify your target audience. Are you interested in customers who have just made a purchase? Or maybe you are curious about someone who utilized your services a few months ago and hasn't been back since? Use the goals you identified earlier to figure out who can best help you answer your questions.
  • Use short, concise surveys. Better to give a ten-question survey to ten people it really applies to, than a 100-question survey to a bunch of people who aren't relevant (AKA no shooting in the dark). Again, use your answer to the "target audience" question to determine who is important to poll so that you can direct these questions to that specific group of people.

 


Are You Ready to Make Lemonade? Steps to Turn Customer Complaints Around


Talk doesn't cook rice, or so an ancient Chinese philosopher said. Add the old advice about turning lemons into lemonade, and in the realm of customer satisfaction both sayings hold sway. I was reminded of this by a recent post from Dana Vaille of Chadwick Martin Bailey. Dana was expressing her discontent about a recent purchase she had made. The item failed to live up to expectations, and she took the time to provide feedback to the company from a place of genuine concern. What the company did in light of this situation is the moral of today's story.

When customer feedback is received many organizations seem grateful, but they do not always have a process in place to act upon that feedback. The company in question not only responded to Dana's comments, they thanked her, informed her that because of such feedback they were switching manufacturers and lastly offered to replace her item when the new and improved product became available.


This is about taking the lemons and making the best lemonade possible.

From a best-practice standpoint, there are a few things that need to be done to develop best-in-class service. First, implement a voice of the customer (VOC) program that provides customer insight to ALL departments that touch the customer; yes, even accounting can be a touchpoint. That VOC program should focus on capturing insight that can be used to directly impact the customer experience in the short run and provide input into broader, more strategic initiatives.

Second, the program should be designed to interact with the consumer in the ways they want to be communicated with. This might include phone surveys after transactions, web surveys after online purchases, and the opportunity to provide customers a feedback method after in-person transactions.

Text analysis allows researchers the opportunity to glean insight from unstructured data. Much of the data coming in from social media, survey comments, or even online chat logs is unstructured. There is a wealth of insight waiting to be mined.

Most importantly, we need to close the circle with the customer by sharing what we have learned and showing them what we are doing with the insight. This may come from a personal response or via broader approaches such as newsletters and emails. As individuals we want to feel valued. Acknowledging, with gratitude, the time that a consumer gives when sharing their opinions is how we make the best lemonade.

 

If you're ready to start making lemons into lemonade when it comes to negative customer feedback, register for an upcoming demonstration to find out how the Cvent Web Surveys solution could help!

Respondents Want to have Fun and 7 Other Reasons People Complete Surveys


Incentives are commonplace in the market research industry. This is true for both consumer and B2B marketing research. Often we, as survey providers, offer tokens of our appreciation ranging from branded items to gift cards (can we say Starbucks, please!) to charitable donations. Yet incentives are not the only reason people take the time to share their opinions. There are two types of motivation for participants to respond to our surveys: intrinsic and extrinsic.

Intrinsic motivation, as the name implies, comes from within the participant. It is driven by enjoyment of the task itself as opposed to working through a task for a reward; the task itself is the reward. We can help support intrinsic motivation by keeping our surveys short, engaging respondents through a multi-media survey experience and visual questions, and keeping the survey focused on the interests of the target population. The latter point requires that we know something about the interests of the population, perhaps through a screening survey.

Supporting intrinsic motivation requires the author to keep the survey timely and relevant. It also provides us the opportunity to acknowledge the respondent's efforts, show respect for their time and opinions, and provide feedback on how the survey will be used. A common practice that supports intrinsic motivation involves asking respondents about their survey experience. To close the loop, we should incorporate this feedback into future survey designs and inform respondents that we are listening to what they say.

Extrinsic motivation is external to the respondent and could take the form of an incentive. According to research, respondents rarely mention cash as the reason they like surveys. What respondents actually want from a survey experience is:

  1. Ease of use
  2. Respect for their time
  3. Relevance
  4. Opportunity to learn
  5. Good survey design
  6. Time to pause and reflect
  7. A break in their daily routine
  8. And lastly...fun

Surveys, if conducted well, provide respondents a chance to break away for a few minutes and share their opinions. It gives them the opportunity to stop and think about something other than their routine. So when contemplating your next employee satisfaction survey or bank customer survey take a minute to put yourself in the respondent’s shoes and do what you can to make the experience a fun one that they will talk about.

Want more articles like this delivered to your inbox? Subscribe to Cvent's Monthly Survey Newsletter to get the most popular posts delivered via email every month.

What's In Your Customer Relationship Management Goodie Bag?



Josiane Feigon, president of TeleSmart Communications, Inc., used to love planning all the intricacies of her kids' birthday parties: the food, the games, the activities, and of course, the goodie bags. Goodie bags are a great way for your child to say "Thanks for coming, and here's something for you to take away to remember how much fun we had."

Companies can use their own so-called "nurturing goodie bags" to help retain customer interest after an interaction without inundating the client with information. As Josiane says, a goodie bag of tactics for reaching out to customers gives them a little "nurturing love" that lets them know you have their backs. Here is her three-month strategy for concocting the ultimate goodie bag:

  • Week 1: Send a cool video. Pretty self-explanatory. Got an interesting YouTube video that showcases what you do? Put it in the goodie bag.
  • Week 2: Solution deep dive. What is your solution? What does it do? How can you demonstrate what it does and how it can help your customers achieve their goals? Find a cool way to illustrate it, and put it in the goodie bag.
  • Week 3: Industry research/surveys. What are people saying about your solution, your company, your line of work? Let Cvent help you put together a survey for your market research.
  • Week 5: Customer testimonials. What are your current customers saying about your product? Find out by administering a customer satisfaction survey, then let your potential customers know how awesome you are.
  • Week 6: Press releases. Did you recently receive some good press on your solution? Put it in the goodie bag!
  • Week 7: Competitive climate. How do you compare to your competitors? Survey markets you want to target and find out what they think of you and others like you.
  • Week 8: Webinars. Can you host a webinar demonstrating your product? See which webinars are most popular, and figure out why.
  • Week 9: Roadshow/executive briefing. Find upcoming events, send the most charismatic member of your team, and network, network, network!
  • Week 11: Awards. You can't have a goodie bag without some kind of trophy or prize. Advertise your accolades in your goodie bag! For example, Cvent's goodie bag might mention our recent selection for the OnDemand 100 list for game-changing technology. See how I did that?
  • Week 12: ROI calculator. Give your bottom-line folks some numbers to crunch. This is what we can offer you, this is what it'll do for you, please sign here.

 

Customer Insights Come from 3 Streams: Tips for Measuring Social Media


At one point, not too long ago, marketing research involved the collection of quantitative survey data and in-depth opinions through qualitative focus groups and interviews. Oh, how that has changed! The second stream in this river of consumer insight came when data mining became economically feasible, and market researchers were asked to analyze transactional data and supplement it with insight from traditional methods (surveys and focus groups).

Now a third stream has come along: data scraped from the Internet in the form of Facebook likes and comments, blogs, chats and a wealth of reviews provides another source of relevant insights. Taken together these three streams give a deep and wide view of the consumer. Do these streams ever cross over? Certainly they do!

Those of us in the research world have a big part in supporting effective outbound marketing. Surveys are an ideal place to start to gain understanding of the social media habits of our customers and ultimately our prospects. The table below provides a solid example of how consumer and B2B market researchers can provide our internal clients with a measure of customer interest in social media.

Example Consumer Survey Question for Measuring Social Media

This multiple response question allows the respondent to select any or all of the social media venues they participate in. In analyzing the survey data, we can create a composite measure by summing up the number of venues respondents utilize. This allows us the option to create a usage segmentation scheme. If our context is B2B marketing then we may want to follow up this question with queries into how they use social media (personal, professional or both). It is also useful to ascertain if any social media sites are blocked on their work computers. For example, some companies prohibit the use of Facebook on work machines.
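Here is a small, illustrative sketch of that composite measure and one possible segmentation scheme, using made-up indicator data from a multiple-response question.

```python
import pandas as pd

# Hypothetical 0/1 indicators, one column per social media venue.
df = pd.DataFrame({
    "facebook": [1, 0, 1, 1],
    "twitter":  [1, 0, 0, 1],
    "linkedin": [0, 0, 1, 1],
    "youtube":  [1, 0, 0, 1],
})

# Composite usage = number of venues selected.
df["venues_used"] = df.sum(axis=1)

# Simple usage segmentation based on the composite score.
df["segment"] = pd.cut(df["venues_used"], bins=[-1, 0, 2, 4],
                       labels=["Non-user", "Light", "Heavy"])
print(df)
```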

Measuring social media usage is critical to supporting acquisition and retention marketing efforts. It seems like data is released daily showing the increasing impact of this stream on consumer behavior. Get your measures ready, because the water is rising.

Successfully Summarizing Your Survey Results



Everyone knows what an executive summary is: it's where you summarize the key points of your sales proposal. Right? Wrong. According to Geoffrey James, author of the most visited sales blog in the world, an executive summary should tell the customer why they should buy from you. Not only that, but the executive summary should be written first, setting the tone for and laying out the key points of your proposal.

So why should you care about how to put together an effective executive summary for a sales proposal? After all, this is a blog about surveying, not selling. However, James' tips for writing an executive summary that sells can (and should) also be applied to putting together a summary of your research findings. An executive sales summary suggests an action to be taken (Buy our stuff!); your research results should also clearly indicate some action on the part of your organization (Focus on this target audience! Advertise more for product X! Change ABC about the Service Portal!). This is the part of survey projects that is most often fumbled, and it is arguably the most important part. An incredibly boring, unnecessarily wordy report on your research findings will go straight into the trash can, and the research you completed will be rendered worthless.

Since not everyone is a trained market researcher and knows how to craft a compelling report, take a look at these five recommendations on creating an effective executive summary that can showcase your survey results:

  1. The Problem, Need, or Goal. Define the current business situation, showing that you understand exactly what questions the survey is supposed to be answering. Example: We don't know who our current customers are, so we don't know what they want.
  2. Expected Outcome. What, specifically, do you anticipate happening based on your survey findings? Example: Once we know more about our customers, we can create products and advertising targeted toward their interests, and potentially increase sales by 25%.
  3. Solution Overview. What's your suggestion, based on your research? Example: Increase marketing to small business owners because they make up 38% of our target market.
  4. Evidence. Back up your claims with proof. Example: Pull out the relevant research from your report and put it into a summary or chart that is easily understood.
  5. Call to Action. You just delivered all this great research and results. Now tell the reader what to do with it. Example: Create a marketing plan for reaching your target audience that incorporates what you learned. 

 

Advice for Writing Survey Questions: Choose Your Words Wisely


Like it or not, as researchers we are in the wordsmithing business. Either we are developing questionnaires to probe into constructs such as advertising awareness or customer satisfaction, or we are analyzing respondent comments for depth and tonality. At the end of the day we take these words and translate them into thoughts that our clients and executives can easily understand. The words used to convey thoughts are as important to a marketing researcher as are the statistics we use to turn raw data into customer insight.

It should come as no surprise that words can have different meanings depending on the country they are used in, or the audience we are speaking to. Although we are typically not linguists, it is important to take the time to craft survey questions that will be understood by the audience we are trying to reach. Ignore this rule and you expose yourself to systematic bias.

If you are conducting a global survey, it is worth the time and effort to create multiple versions of your survey. Many platforms have translation services, but this may not always be sufficient. I have seen errors creep into translated surveys (English to Spanish and French) because of social context. Phrases in English do not always have the same contextual meaning in other languages. The same can be said for English spoken outside the United States. The example below came from a survey for a well-known auto maker. The word 'advert' is common outside the United States; within the US, such paid media messages are more commonly known as advertisements or ads.

[Example survey question using the word 'advert']

This may be seen as nitpicking, but I have also seen the same problems arise when Baby Boomers create surveys for Generation Y. The takeaway here is: if we are to generate reliable and valid data, then we need to ensure our questions are contextually and socially relevant to the audiences we are trying to elicit opinions from. This may require having input early in the process from native language speakers or from members of the subgroup we want to survey.

Digging Deeper into the Mindset


Seldom is a consumer purchase simply a one-off experience. The amount of time and energy, not to mention financial resources, increases exponentially when we move up the ladder to the category of ‘shopping goods’. This type of purchase includes the potential for significant risk, not only financial, but social as well. High price tag items such as cars, homes, jewelry and even insurance fall into this category.

Marketers of these products know there is significant risk and will do what they can to minimize the cognitive dissonance, or post-purchase regret, that consumers may feel. Marketing research can provide insight into the types of consumers who purchase a company’s products and services. In addition to demographics, purchase experiences, and customer satisfaction levels we also want to know the attitudes that drive the purchase. The table below comes from a panel profiling survey for a national insurance company.

[Table: 14 lifestyle and attitude statements from the panel profiling survey]

The 14 items provide insight into the beliefs of the panelists, which in theory can be translated to the broader market. These items may be measuring a smaller set of underlying constructs, such as creativity and self-expression, or practicality versus dreaming. Factor analysis can be used to assess whether these Likert scale items are one-dimensional or in fact measure multiple constructs.
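As one possible illustration (not the analysis behind the table above), a factor analysis on simulated Likert responses might look like the sketch below; with real data you would inspect eigenvalues and rotated loadings before settling on the number of factors.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulated matrix of 200 panelists x 14 Likert items (1-5 agreement);
# purely illustrative stand-in for the real item responses.
rng = np.random.default_rng(0)
likert = rng.integers(1, 6, size=(200, 14)).astype(float)

# Fit a two-factor solution and inspect the loadings to see whether the
# items hang together on one dimension or split across constructs.
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(likert)
loadings = fa.components_.T  # rows = items, columns = factors
print(np.round(loadings, 2))
```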

The purpose for including these types of questions in your panel survey has everything to do with the creative process. This information provides advertising agencies and their creative staff with the fuel they need to develop more persuasive communications. This type of data can also be used for creating market segments, especially when tied with purchase history or capacity to make future purchases.

Our attitudes drive our beliefs and subsequently our actions. It is vital to understand not just the demographics, geographics and purchase history of our customers, but also the mindset that motivates their behavior.


Becoming a "Best Place to Work"


Do your employees think your company belongs on a best places to work list?

It's that time of the year again. Allergy season? March madness? Yes, those too, but more importantly it's time for the Best Places to Work survey results to come out! Some believe that getting on lists from publications like AdAge or Washingtonian is about frivolous self-promotion and is a waste of time. But those articles continue to be hugely popular for a reason, so it's certainly worth a shot to make use of a potentially powerful recruiting tool.

What are the common characteristics of those companies that repeatedly top the lists? Here are a few, based on results from AdAge, Washingtonian, Glassdoor, and CNNMoney:

  • Personal touches, such as hand-written birthday cards and one-on-one meetings with executives make employees feel valued by upper level management. 
  • Who said there's no such thing as a free lunch? Some offices offer monthly catered lunch, stocked refrigerators, happy hours and homemade breakfasts. Just like Mom's house!
  • Family-friendly companies offer flexible work schedules, childcare and breastfeeding rooms to encourage work-life balance.
  • Workers who are in shape and healthy are less likely to use sick leave and run up high health insurance bills. So offering gym membership and well-being subsidies, like many "best of" companies do, is a win-win.
  • Employees realize that they're at work to do work, so make it worth their while. The companies at the top of the list tend to focus on training and development, challenging their people often.
  • And of course, who doesn't like a raise or bonus? Don't forget the value of a great compensation package.

You don't have to wait till these articles come out to find out if you'll be on the list. Be proactive! Start polling current employees to see if they think you would make the cut by sending out a firm-wide employee satisfaction survey. If the results aren't terribly flattering, figure out why and what you can do to fix them. If your employees think your company is a great place to be, don't be afraid to advertise those selling points to prospective employees and, if it's something you think your organization would benefit from, encourage your employees to officially vote for you. Finally, don't forget to ask your former employees to fill out a survey as well—they're more likely to give you honest answers and can be particularly helpful if you have a high turnover rate.

 

Learn more about how a professional online survey solution could help you get on the right track by registering for an upcoming demo.

Redirecting Survey Respondents


Knowing where to direct respondents after completion is a critical task with several options available to the survey author. This takes on a higher level of importance if outside sample is being used. Our options are two-fold if we are using internal data for survey invitations. After completion respondents can be shown a thank you page or directed to a website, perhaps your company’s home page or a special landing page for survey respondents.

If our survey involves screening questions and/or outside sample, then we should consider how to handle those participants that do not meet screening criteria. Not all online consumer surveys are meant for all respondents. Screening questions are designed to allow only respondents who meet specified criteria (e.g. age and gender, income, job role, company size, etc.) into the survey. If a respondent doesn’t meet the criteria, we can use the ‘Thank You’ page in Cvent to redirect them to another survey, website or simply say thank you for their time but let them know they don't qualify for this particular survey. If done gently we can minimize any negative feelings the participant may have developed toward the online data collection process.

Each screening question should have a termination sequence established. The thank you page allows the author to select a question and values for that question that need to be selected in order for the respondent to continue. If those values are not selected then the respondent is redirected out of the study. With internal respondents we can specify a website to redirect to or a thank you note to be received.

Participants from outside sample providers will be redirected back to the panel. This goes for participants that have completed the survey, been screened out or have exceeded a particular quota. Panel companies will provide you a list of specified redirect links for each of the above scenarios. This serves two purposes: 1) it allows the sample provider to compensate the respondent and/or offer them another survey, and 2) it provides a measure of anonymity for their panelists.

An ID field will be included as part of the redirect link. In the Cvent online survey platform this field is the reference ID (Ref ID). To ensure the handoff between the sample provider and the survey platform works correctly, as the survey author you will create a linkage between the parameter (provided by the sample provider) and the Cvent field (Ref ID).
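As a rough, platform-agnostic illustration of that handoff, the sketch below builds redirect links with the respondent's ID appended as a query parameter; the parameter name and URLs are invented, and the real values always come from your sample provider and survey platform setup.

```python
from urllib.parse import urlencode

# Hypothetical redirect templates supplied by a panel provider.
redirect_urls = {
    "complete":   "https://panel.example.com/complete",
    "screen_out": "https://panel.example.com/terminate",
    "over_quota": "https://panel.example.com/quotafull",
}

def build_redirect(status: str, respondent_id: str) -> str:
    """Append the panel's respondent ID to the appropriate redirect link."""
    base = redirect_urls[status]
    return f"{base}?{urlencode({'rid': respondent_id})}"

# e.g. a screened-out respondent whose ID was passed in on the way in
print(build_redirect("screen_out", "ABC12345"))
```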

Redirecting the respondent after completion, or due to a screen-out, is an important part of the engagement process. Allow time to test this process before you launch your survey to ensure each response is accounted for properly.
 

Employee and Customer Satisfaction Correlation


Is customer satisfaction strictly a function of the product or service the consumer purchased? The short answer is no. Is it solely dependent upon the context of the purchase situation? Again, no. From the depth of customer satisfaction research we can glean that customer satisfaction is a multi-dimensional construct; in other words, several variables feed into the overall satisfaction calculus. Several of these are within the control of the marketer, while others are internal to the company but outside of marketing's control (e.g. billing, customer service, shipping). And lastly, there are exogenous variables outside of the company's control, ranging from the weather and the economy to the consumer's state of mind.

One area that should not be overlooked, but frequently is, is employee satisfaction. Again, there is a wealth of academic and practitioner-based literature on the connection between happy employees and satisfied customers, so the link between the two constructs has been well vetted. Without going into academic rigor, I can say that steps taken to measure and increase employee satisfaction, if it is below the norm, can produce gains in customer satisfaction, all else being equal.
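For readers curious what that connection can look like in the data, here is a quick sketch of a unit-level correlation between the two measures; the scores are invented purely for illustration.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical unit-level averages: employee satisfaction and customer
# satisfaction scores (1-5 scale) for eight business units.
units = pd.DataFrame({
    "employee_sat": [3.9, 4.2, 3.5, 4.6, 3.8, 4.1, 3.4, 4.4],
    "customer_sat": [4.0, 4.3, 3.6, 4.7, 3.9, 4.0, 3.5, 4.5],
})

r, p = pearsonr(units["employee_sat"], units["customer_sat"])
print(f"r = {r:.2f}, p = {p:.3f}")
```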

Just as customer satisfaction is multi-dimensional so is employee satisfaction. Surveys designed to track employee attitudes typically include measures of interaction with customers, management and vendors, satisfaction with job tasks and roles, feelings on compensation, corporate culture, work environment and opportunities for growth and advancement, to name a few.

Things to consider when looking at employee satisfaction:

  1. Ensure anonymity
  2. Keep surveys focused
  3. Provide feedback by letting employees know their opinions are being heard

The first point, anonymity, often requires that human resources departments look to an outside vendor to conduct the survey. Alternatively, online survey tools such as Cvent can be used to measure employee attitudes while offering a high degree of discretion. Best practice in survey design includes keeping surveys as brief and focused as possible; this ensures engagement and a higher likelihood of cooperation and completion. Lastly, feedback is currency. If someone shares their opinion with you, they want to know they are being heard. Put another way, why ask for employee sentiment if you are not planning to do something with it?

Satisfied employees typically think in terms of doing what is best for ‘their company’ not ‘the company’. This translates to going the extra mile for the customer. A winning solution for all.

 

Get started tracking employee attitudes and improving customer satisfaction by participating in an upcoming webinar on how technology can help you achieve these goals.

How to Slam Dunk Customer Satisfaction (Hint: Not with Call Resolution)



Increasing customer satisfaction (CSAT), and thereby your organization's Net Promoter Score (NPS), isn't always an easy task. But there is one sure-fire way to make your customers positively giddy: first call resolution!

If you read this post's title, you know that this is blatantly incorrect. Some companies think that resolving the customer's concerns on the first phone call to your help center is the "slam dunk" of satisfying your clients. In reality, it's the "Michael Jordan sinking a free throw" of customer service: it's not above and beyond, it's exactly what they expect. The only thing a first call resolution does is meet your customers' expectations. On the flip side, if Jordan shoots an airball, you're going to boo him from here to half court and back. When you don't solve a problem on the first try, your customer service score and corresponding NPS actually decrease.

"Alright," you say, "thanks for bursting my customer service bubble [or basketball, to poorly continue the analogy]. What can I do to make these folks happy?" Glad you asked! Here are a few tried and true methods for improving customer satisfaction:

  • Hire the right people. Your staff, particularly your management team, should be knowledgeable and helpful. (More on that from this past post: Hire Customer Service Naturals — and Keep Them)
  • Customize your solution to your client. Know who your VIPs are and cater to them.
  • Cut costs by giving your client what they want, not a bunch of superfluous stuff you think they want.
  • Personalize the way you interact with clients using customer relationship management tools.
  • Fix it before the customer realizes it's broken. Acknowledge your mistakes, and make up for them immediately.
  • Make your support easily accessible. This means literally (a clear link to your customer service hotline and email account) and figuratively (local customer service representatives are often seen as more relatable).
  • Keep in touch. It's called customer relationship management, and good relationships shouldn't be a one-way street.
  • Let them quit you. This might seem counter-intuitive, but making clients jump through hoops to get rid of your services only turns an unhappy customer into a disgruntled ex-customer, who, in the days of anonymous online reviews, can become an even bigger problem than a lost sale.

So what did we learn from Jordan today? Meeting expectations is not going to WOW the crowd. Predicting their move and driving to the basket before they've had a chance to blink, well, that might just win you some points.

Survey Design: Removing Cultural Bias


Do all cultures and ethnic groups respond equally to marketing research? Nope. Growing up in Southern California, I was exposed to Hispanic culture from an early age and have been an active observer of cultural differences since I began my career in research many years ago. Hispanics in the US, especially those of Mexican origin, are acculturated to avoid speaking poorly of people, products and brands. According to author Jeffry Savitz, in his article in the December issue of Quirks, Data Use: Reconciling Hispanic Product Evaluation Ratings, this acculturation process leads to a philosophy that favors praise over criticism. This could lead to inflated brand and customer satisfaction scores, compared to non-Hispanics, on the basis of culture alone. At first glance this appears to be mainly a consumer research issue, but it could have implications for B2B market research as well.

Savitz's article illustrates this issue and provides insight on how best to merge Hispanic data with that of non-Hispanics to produce comparisons that are as close to culture-neutral as possible. In this post we will examine, in brief, his core approach. It is a fascinating one that combines a metric scale with a verbal scale.

Respondents to the study were asked to rate a well-known beverage on a 0 to 100 point scale, where 0 is the worst possible rating and 100 is the best. Hispanics rated the beverage six points higher than non-Hispanics (80.8 vs. 74.8), a significant difference. The second step involved having respondents assign numerical points to a verbal scale, e.g. "What points on the 0–100 scale would you assign to a product that was excellent, very good, good, etc.?" This second step was 'product agnostic' in order to remove cultural bias toward any specific product.

Lastly, respondents were asked to estimate the percentage of products that fall into each of the verbal categories (excellent, very good, good, neither good nor poor, and poor). Multiplying these percentages by the average scale points derived in step two allows the researcher to create a weighted average that is culturally unbiased. The average for Hispanics was recalculated to be 74.9, which was no longer significantly different from non-Hispanics.
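To make the arithmetic concrete, here is a small sketch of that weighted-average step; the scale points and category shares below are invented for illustration, since the article does not publish the underlying figures.

```python
# Step 2 (hypothetical): average scale points a group assigns to each verbal label.
scale_points = {"excellent": 95, "very good": 85, "good": 70,
                "neither good nor poor": 50, "poor": 25}

# Step 3 (hypothetical): share of products the group places in each category.
category_share = {"excellent": 0.15, "very good": 0.30, "good": 0.35,
                  "neither good nor poor": 0.15, "poor": 0.05}

# Weighted average = sum of (share x scale point) across categories,
# yielding a culturally calibrated benchmark for that group.
weighted_avg = sum(category_share[k] * scale_points[k] for k in scale_points)
print(round(weighted_avg, 1))  # 73.0 with these made-up inputs
```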

Culture has a significant impact on the consumption of goods and services. This post illustrates that Hispanics of Mexican origin tend to rate products and services more favorably based on their cultural upbringing. There are methods that can be employed to remove that cultural bias, allowing for more accurate and comparable consumer insights.
