Channel: Cvent Survey | Inquisium

The Key to Database Marketing


Key to Database Marketing | Image by renjith krishnan

This post applies to those who have an active CRM (customer relationship management) program in place, or are considering adopting one. This might include programs such as Microsoft CRM, Surado, Sales Logix or Salesforce.com. As tools, these systems offer the user the ability to track contact information, outbound sales and marketing activities, and inbound inquiries from the client or prospect. The data stored in your CRM provides an ever-evolving picture of the health of your business.

As such, it is a critical component of competing in a market where growth is likely to come primarily from gains in market share. In a slow-moving economy, companies are less likely to see growth simply because the economy is expanding. In short, the spoils of competition will go to those who can outmaneuver their rivals.

How does this apply to consumer or B2B marketing research in general, and use of Cvent in particular? Let’s examine that further.

It is no wonder that the Cvent online survey tool is integrated with Salesforce.com and can be adapted to other CRM systems as well. Adept marketers realize that communication is a two-way street. On occasion the customer or prospect will offer information voluntarily, perhaps through comments on the website, social media or by connecting with your call center. In this ‘Big Data’ world we play in, these comments have value and should be stored, analyzed and leveraged where appropriate. CRM systems are a perfect place to house such data. Text analytics tools are well suited for the task of mining these comments for nuggets.

Adept marketers also know that customers are not homogeneous and have different characteristics which, if understood, can be utilized to create and execute more effective marketing programs. For example, new customers bring with them the potential for ‘lifetime value’. It is worth your time to survey new customers, if for no other reason than to find out how they heard about your offerings. You can also survey customers who are ‘missing in action’, high-value customers, and those who you want to move up the value chain.

Both consumer and B2B marketers must understand the principle that success in the marketplace requires an open and honest two-way dialog between marketer and the customer or prospect. This is the key to data-based marketing.


Using Technology to Effectively Conduct Text Analysis

Product Consultant: Jonathan Wilson

My name is Jonathan Wilson and I am a Product Consultant supporting the Web Survey tool.

The Open Ended - Comment Box question can be one of the most effective means of collecting respondent feedback. However, analyzing these comments can often be an overwhelming process for survey authors, especially for larger surveys.  My best practice tip this month is to utilize the new Text Analysis functionality to streamline your reporting process and easily target common responses.

Text Analysis is available for any Open Ended - Comment Box question.  The first step on your path to carefree text analysis is to select the comment question from your survey that you would like to analyze and name the analysis page.

Survey Text Analysis Settings

Now that you have created the analysis, the next, and most important, step is to configure the output settings to customize the parameters of the results. I recommend keeping the phrase length at 8 words or fewer, as this will cast a wider net and return the maximum number of responses.  If you are looking for a specific phrase length in your analysis, use the preferences section to further narrow your search.

Survey Text Analysis Phrase Settings

Once you have configured the Output Settings, it is time to view the results!  We offer two different displays for Text Analysis: Phrases and Word Cloud.  The Phrases view will display the results in a table format, listing the most common phrases at the top.  The Word Cloud view provides a more visually engaging display, using size and font color to indicate which phrases occurred most frequently.  Another tip is to assign categories to phrases so you can easily group like phrases such as "Customer Service", "Good Customer Service", and "Rep," as in the example below.
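Under the hood, the Phrases view amounts to counting word n-grams across comments. The sketch below is a minimal, hypothetical illustration of that idea in Python; it is not Cvent's implementation, and the `common_phrases` helper and sample comments are invented for illustration.

```python
from collections import Counter

def common_phrases(comments, max_len=8, top=5):
    """Count word n-grams (1 to max_len words long) across free-text comments."""
    counts = Counter()
    for comment in comments:
        words = comment.lower().split()
        for n in range(1, max_len + 1):
            for i in range(len(words) - n + 1):
                counts[" ".join(words[i:i + n])] += 1
    return counts.most_common(top)

comments = [
    "good customer service",
    "customer service was slow",
    "friendly customer service rep",
]
print(common_phrases(comments, top=3))
```

Grouping like phrases such as "Customer Service", "Good Customer Service", and "Rep" into a category, as described above, would then simply be a matter of mapping each extracted phrase to a shared label before tallying.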

Survey Text Analysis Categories

Survey Text Analysis Results

Text Analysis is a great way to quickly and easily analyze the open-ended feedback from your survey and identify common trends.  If you have any questions about adding Text Analysis to your survey, please contact our Survey Support team at 866-318-4357, Option 3, or surveysupport@cvent.com

To validate your knowledge of the Cvent Web Surveys Application, register to become Cvent Certified.  The Cvent Web Surveys Professional Certification Program consists of two exams: Survey Creation and Survey Reporting. After completing and passing both, you will receive a certificate acknowledging your aptitude with the Cvent solution. As Cvent becomes more widely used, proficiency with the system is a must for professionals seeking to showcase their skill with leading feedback collection tools.

Gaining Market Research Value: You Must Spread the Word


Image By Tina Phillips

Shhhh, I have a secret to tell you. Are you listening? The secret is this: if your research is going to be effective at supporting marketing, then you have to get the word out. Your customer feedback program must use its voice!

As marketing researchers, we often forget that we are marketers at heart. Customer segmentation studies are more valuable when they are used by brand managers, product development teams and advertising directors to create effective products and messaging. Let’s just say I have seen segmentation studies used as doorstops, and they don’t really work that well.

So how can you get the word out that your data is available and ready for business? There are many ways, but technology can be your friend in helping to socialize your results. Research departments large and small are using desktop publishing and presentation tools to create newsletters, stand-alone emails and portals. If your company has a SharePoint or a collaborative drive then you can house your studies in these locations.

There are a few questions to ask before launching down this path. First, can your data be used to support more than one internal client? For example, does the analysis from your most recent customer satisfaction study have implications for sales, customer service, finance, marketing, or other departments? If so, then you have good reason to present your findings to these groups. This can be done in a single presentation (often harder than it sounds) or in smaller team meetings. The latter approach allows you to tailor your story to meet specific team needs. This may entail more work; however, it will go further toward having your data used to make strategic decisions. Ask yourself whether the additional workload is worth the payoff.

An effective tool I have used frequently is the email with condensed findings targeted toward specific audiences. If you have web support then you can add links in your email that take recipients to more detailed findings.

This is certainly not an exhaustive treatise, but you should take away the need to socialize your findings (which adds to your personal value), and the fact that there are many ways to get your word out.

 

Overworked and Understaffed: Tips for Prioritizing Market Research Projects


Overworked and Understaffed | Image by Ambro

If you are like me, then you are overworked and understaffed. This has become the mantra in these lean times since the Great Recession. You may feel underpaid as well, but we will leave that for another discussion.

There has been no escaping this trend; I have heard it from consumer and B2B marketing researchers, as well as those in agencies and on the client side. What this means is that if we are to maintain our sanity and deliver on time, we need to develop a priority scheme. And yes, I know, everybody says their project is the top priority. In their minds that may be true. However, the reality is this: as researchers, we need to keep our work moving through the pipeline, and if we give all projects top priority then nothing will be completed, at least not on time.

To get started on the path to peace, tranquility and completed assignments, you will need to secure guidance from your organization’s senior management. By understanding their priorities you can better align incoming requests so they match more closely with management’s view of the world.

A second tool is a knowledge gap analysis. This approach compares the incoming request against a standard of what is known and what is unknown. Projects that help to expand the company’s marketing knowledge base with new customer insight can be given a higher priority, especially if this information fits with the aforementioned management view.

The important takeaway is that tools exist, well beyond the two mentioned here, that can assist in the process of prioritizing your research agenda. Adding a level of prioritization will increase your effectiveness at delivering projects on time, and maybe even under budget. These factors alone will score points with your internal and external clients.

 

What methods have you used to help prioritize projects?

How to Decide Whether to Brand a Survey


To Brand or not to Brand Surveys | Image by Naypong

To brand or not to brand, that is the question. As marketers and researchers, we need to consider whether or not our surveys should be branded or unbranded. Each option has its place in our toolkit.

First, let’s look at what branding a survey means and why it may be appropriate. When we think of branding, a few things come to mind. Color palette, logo and typography are all associated with your company or organizational brand. Incorporating these into your survey will serve to reinforce the connection respondents have with your brand. The development of a brand identity is a lengthy and potentially costly process. If your organization has gone down the path of brand development, then your surveys, as an extension of that brand, should incorporate the brand’s standards.

Taglines are another element of brand identity, and are often more powerful than the logo itself. Most survey platforms, including Cvent, can incorporate tagline text either as a graphic or as plain text if SEO considerations are in play.

Like music, brands have a tone and emotion. This tone can be used as a guide for question development. If your brand targets a younger demographic, then your questions should incorporate language that resonates with that demographic. Conversely, if your brand speaks to an older, more sophisticated audience, use this knowledge in framing your survey. For example, financial services or insurance firms will typically take a more conservative tone. By contrast, brands that appeal to teens, Hot Topic for example, will create surveys with a looser, more hip appeal.

Imagery and photography can make even the most arduous customer satisfaction study more appealing. When selecting imagery, keep in mind your brand’s target audience.

When is it best not to brand your survey? Two scenarios come to mind:

Scenario 1: Measuring Brand Awareness
If you are measuring brand awareness or market position, it is best not to brand your survey. Employing brand standards in an awareness study will unduly influence respondent perception and can skew your results.

Scenario 2: Measuring in Left Field
The second scenario is one where your brand is considering something from left field, perhaps a new product that appeals to a different market segment. Use of brand elements here could also skew the results.

Brands are important. They have their place in differentiating your company’s offering in a crowded marketplace. In most cases, it is advisable to create surveys that are consistent with your brand and that utilize established standards. With this in mind, there are also compelling arguments, in both consumer and B2B marketing research, for stepping away from the brand and approaching the market with a blind survey.

Digging Deeper into Customer Segments: Combine Transactional & Survey Data


Dig into Customer Segments

Marketers, like businesses in general, find themselves awash in data. It is the savviest and most successful marketers who can utilize this data to the fullest to gain market share in an otherwise languid economy. Predictions that the economy will return to a robust growth mode are few and far between, which means sales are not going to benefit from economic trends. Sales and revenue growth will have to be earned the hard way: in the trenches.

A recent article in Quirk’s brought this thought back to life. The authors conducted a study comparing two forms of customer segmentation analysis and showed the benefit of incorporating multiple data sources. Market researchers have long used attitudes as a segmentation basis. This is a valid method and allows the marketers we partner with to develop a deeper level of customer insight and awareness. On the other side of the fence, if you have spent any time around database marketers, then you know there is a wealth of actual transactional data to be analyzed. Some would say: who needs attitudes or demographics when you have actual behavioral data to segment on?

I say you need both and then some!

Understanding the key drivers of behavior is critical, but then again so is knowing a thing or two about the behaviors themselves. In this world of data overload, we have to leverage all of our resources including survey data, transactional behaviors, media usage and geo-demographics. This is the key to successful acquisition and customer retention.

For practical application, start by analyzing your transactional data with the goal of forming behavioral segments. This will provide you with a basis of understanding: who contributes most to profits, what the behavior of new customers looks like, what percentage of the customer base leaves the fold, and whether there are clues to that exit. Follow this up with a behavior survey of these segments to uncover the reasons behind their actions. Make sure to include demographic components (either via survey or from your CRM) to allow for tying these attitudes and behaviors to the ground. Having a strong geo-demographic link allows you to purchase prospecting lists and link to data in syndicated surveys.
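As a minimal illustration of the first step, forming behavioral segments from transactional summaries, here is a Python sketch. The thresholds, segment names, and customer records are all invented assumptions, not figures from this post:

```python
def behavioral_segment(total_spend, orders_last_year):
    """Assign a coarse behavioral segment from transactional summaries.
    Thresholds are illustrative only."""
    if total_spend >= 1000 and orders_last_year >= 4:
        return "high value"
    if orders_last_year == 0:
        return "lapsed"          # a candidate for win-back surveying
    return "core"

# Hypothetical (total spend, orders in the last year) per customer ID.
customers = {
    "C001": (2500, 6),
    "C002": (300, 2),
    "C003": (800, 0),
}
segments = {cid: behavioral_segment(*v) for cid, v in customers.items()}
print(segments)  # {'C001': 'high value', 'C002': 'core', 'C003': 'lapsed'}
```

A follow-up attitude survey would then be fielded per segment, with the segment label carried into the survey data for cross-analysis.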

One can segment the data along a singular line, but in this day and age my question to you is why would you ignore the possibilities of a more robust solution?

 

Download 5 Steps for Driving Sales with Customer Insights for more tips on how to use data to drive revenue.

Surveys Meet Sales: Lead Generation Tactics


Surveys as lead generation tools? This sounds like an abomination. However, they can be conducted in the spirit of both research and sales. In the old days of marketing research, when phone calls were one of the primary tools for data collection, unscrupulous salespeople would lead with a survey and try to close with a sales pitch. This technique is known as ‘sugging’ and has been frowned upon by all of the major industry trade groups.

Now let’s fast forward to the days of online surveys. If you are in the B2B marketing world, then you realize that surveys can provide a valuable service to the sales qualification process. It is also understood that if you are downloading content, such as a white paper, podcast or webinar, then there is a trade-off. That trade-off comes in the form of information. Most organizations will allow you access to their branded content in exchange for information about yourself. This data might include contact information, industry, job role, purchase likelihood and role in the purchase decision.

The rise of customer relationship management systems (CRM) in modern marketing has helped to foster this practice. Data that you provide in exchange for the content will be used in the sales process to qualify you as a potential customer. Low likelihood prospects may be diverted into a ‘drip system’ that provides a stream of information released over a defined period of time. Higher likelihood prospects will be diverted to an inside or outside sales team for follow up.

Quizzes are a perfect example. In my day job, we frequently use ‘knowledge checks’ which are nothing more than short assessments as a lead generation tool. The exchange happens when respondents provide their contact information. They are then allowed to take the assessment on a particular knowledge area, such as networking or project management. When complete they receive their score and suggested training areas, based on their score.

The short quizzes can be expanded into larger needs assessments. Again these are nothing more than online surveys which provide the respondent and sales team a deeper look into the situation. The respondent receives a summary and the sales team receives information which allows them to custom tailor a solution.

Revisited: 5 Email Marketing Tips to Increase Online Survey Responses


Tips for increasing survey response rates are always popular; everyone wants to increase their participation! There are a variety of methods that will help you reach your response goals, and changing your email marketing is one of the easiest ways to give your response rates a boost. Let's look back at a post written in 2009 about email marketing best practices.

5 Email Marketing Tips To Increase Online Survey Responses

FROM FIELD

The from field is the name you see in your inbox identifying who sent the email. If you don't specify a from name, many clients will default to the email address, and no one opens emails from survey@orangecorp.com. When deciding on a from field, you have three main choices: an individual's name, a brand/organization name, or a combination. Some people prefer to use the survey administrator's name, while others prefer the organization or department. Arguments can be made for either method. I personally prefer a combination.

Competition for space in the inbox has increased significantly over the last three years, and the from name is still a critical piece of the puzzle. Make sure to pick a name that your recipients will recognize. If you're conducting a customer satisfaction survey, the CEO, the head of Customer Service or the account rep would be good options. For employee surveys, the CEO or director of HR are often good choices. Think about your audience and how they interact with your organization.

SUBJECT LINE

A good subject line may be the most critical part of the formula for a high open rate. It's important not to be deceptive or spammy in your subject line; either can result in a lower open rate. If people don't open your email, they're never going to click through to the survey. Here are a couple of example survey subject lines that have proven to work for clients in the past: Your Feedback Needed or Help Us Serve You Better. Remember that just because a subject line works well the first time, it may not work the second. Always review your subject lines with each survey.

If you're not sure what will work best, and most people don't, it's a good idea to do a few tests. Take a small segment of your list (10%) and break it into small batches. Send a different subject line to each batch, see what the open rate is, then use the winner to send the email to the remaining 90%.
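The batching step above can be sketched in a few lines of Python. The recipient list, the 10% test fraction, and the subject lines below are hypothetical; in practice you would send each batch its subject line, measure open rates, and then send the winner to the holdout:

```python
import random

def subject_line_test(recipients, subjects, test_fraction=0.10, seed=42):
    """Split a test slice of the list evenly across candidate subject lines.
    Returns (batches, holdout): send each batch its subject line, measure
    open rates, then send the winning subject to the holdout."""
    rng = random.Random(seed)          # fixed seed so the split is repeatable
    shuffled = recipients[:]
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    test_slice, holdout = shuffled[:test_size], shuffled[test_size:]
    batches = {s: test_slice[i::len(subjects)] for i, s in enumerate(subjects)}
    return batches, holdout

recipients = [f"user{i}@example.com" for i in range(1000)]
batches, holdout = subject_line_test(
    recipients, ["Your Feedback Needed", "Help Us Serve You Better"]
)
print(len(holdout))  # 900
```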

FIRST SENTENCE 

When a respondent opens an email, the sender has approximately 8 seconds to capture the reader's attention. Give deep thought to the first line of your email.

COMPELLING MESSAGING

When writing survey invitations be compelling. Keep the reader engaged with relevant, targeted messaging. If you're sending the same survey to different types of customers, consider segmenting your list for a more personal, targeted message. Don't forget to tell the reader that they are part of a small group of selected respondents. These tips can also be used when creating email reminders.

Your survey invitations should include these key components: 

  • Introduce the survey
  • Explain why it's important to participate
  • Give a brief overview of the survey topic
  • Set expectations for the length of the survey

EMAIL REMINDERS

Email survey reminders have been proven to boost response rates. Using a system that will automatically send reminders to individuals who have not yet completed your survey is a huge time saver, but remember to alternate the time you're sending reminders. Perhaps your first invitation got buried in their inbox because of the time of day. 

In addition to alternating the time of day you're sending reminders, you should also revise your email subject line and the email copy itself. There are many reasons why a survey will go unanswered: the person never saw the invitation, the person meant to respond and forgot, or they weren't compelled by the invitation copy. I always suggest using different copy in the reminders, because if the copy was the reason for the non-response, the same email won't be compelling the second or third time.

See the original post from May 2009

 

What subject lines have worked well for you in the past? 

 


Leverage Reports to Manage the Survey Experience


One of the keys to developing and maintaining effective customer feedback surveys is the awareness of which questions trigger respondents to exit the survey. If you are administering an extended survey then this becomes critical to managing the completion process. Longer surveys have lower completion rates; this is as close to a truth as we can get in our business.

If you have the luxury of time to fully pre-test and make course corrections to your survey before a full launch, then the Cvent Web Survey platform offers several reporting tools to help you gain insight into the completion process. These can be leveraged to make the respondent experience as effective and pleasing as possible.

I am always concerned about the time it takes to complete a survey. You should be aware of this metric and use it to your advantage. Utilizing a realistic time-to-completion measure in your survey invitation and/or welcome page will help to frame the participant’s expectation. Follow the rule of under promise and over deliver by setting the time expectation a bit longer than you actually anticipate the respondent will need (e.g. set the expectation at five minutes when the average to complete is closer to four minutes). The ‘Response Time’ report provides guidance on the amount of time needed to complete your survey. I encourage the use of the median value for completion time as it avoids the pitfalls of outliers (e.g. the respondent who started a survey before lunch and didn’t finish until she returned over an hour later).
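The median-versus-mean point is easy to demonstrate. In the hypothetical completion times below, a single lunch-break outlier drags the mean well above the typical experience, while the median stays representative:

```python
from statistics import mean, median

# Hypothetical completion times in minutes; 65.0 is the lunch-break outlier.
times = [3.5, 4.0, 4.2, 3.8, 4.1, 3.9, 65.0]

print(round(mean(times), 1))  # 12.6, skewed by the single outlier
print(median(times))          # 4.0, a realistic figure to quote in invitations
```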

Another critical tool, useful in managing respondent experience, is the ‘Last Question Answered’ report. This report provides a visual on the exit points respondents are using. Ideally we would like to see respondents exiting after completing the final question, but that is seldom the case. The table below shows that 23 participants exited the survey after the second question. It is actually typical to see a large drop-off in the first few questions followed by a downward-sloping tail. Not all participants have the time, interest or the intent to complete a full survey. Large spikes later in the survey could be indicative of quota stops or potential confusion on the part of the respondent.
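A ‘Last Question Answered’ style tally can also be recreated from raw exit data with a simple frequency count. The exit records below are hypothetical, echoing the 23-exits-after-question-two example:

```python
from collections import Counter

# Hypothetical export: last question answered for each incomplete response.
last_answered = [2] * 23 + [1] * 9 + [3] * 5 + [7] * 4 + [12] * 2

dropoff = Counter(last_answered)
for question, exits in sorted(dropoff.items()):
    print(f"Q{question}: {exits} exits")
```

A spike at a late question number, like Q12 here, would be the place to check for quota stops or confusing wording.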

With that said, we should be cognizant of question wording, and how we set up expectations for the flow of the survey. The best we can do is to select a sample for which, in theory, the survey is relevant, word our questions and introductory text as clearly as possible, and use visuals to create an engaging experience. Making use of reports such as these will help you to manage that experience more effectively.

4 Ideas for Creating & Refining Leadership Development Programs


Leadership Development can come in many different shapes and sizes.

Leadership Development…what an interesting segment of the training world!  Ask five people what their definition of leadership development is or what it looks like, and you’ll walk away with five different responses.  Although there is much variety in how organizations execute their leadership development strategies, I believe there is at least one thing those who focus on continuous leadership development have in common:  higher employee satisfaction and engagement.

Consider these ideas when creating or refining your company’s leadership development program:

1. Ask your organization’s key stakeholders to clearly define leadership development…don’t stop asking questions until you have a crystal clear vision.

Yes, this is the traditional “assessment” phase of the training design process (ADDIE).  This includes asking your CEO, VPs, and of course your mid-level and front-line leaders what they think.  The biggest mistake you can make is to create the definition based on your assumptions.  Could their definition change over time?  Maybe!  As organizations mature, or key shifts happen, needs will change.  Keep your ears open and be flexible…it will help you keep the program fresh and relevant.  

2. Keep things simple by focusing on two areas: Leadership of Self and Leadership of Others.

It’s the inside-out approach…never forget it.  If a leader is to be effective, she must “walk the talk” and lead herself in a way that demonstrates strong balance, continuous improvement, and unwavering commitment to a clear personal mission.  Only after she’s mastered Leadership of Self will she be a GREAT leader of others.  Leadership of Others involves relationship building, trust, open communication, and creating and supporting systems which lead to amazing results.

3. Don’t preach.

Involve people in leadership activities and create an environment in which they can receive valuable feedback from their peers and skilled mentors.   Assign individual and group projects and critique results and relationships more than methods.  Experiential learning is critical to leadership development.  Habits won’t change unless an emotional shift occurs and that’s hard to achieve if you’re only being lectured to or taught leadership theory. 

4. Don’t forget to include special opportunities for your emerging leaders.

Build a pipeline of strong leaders.  Identify the emerging rock stars in your organization and assign them a mentor who can help to grow their skills and provide them special opportunities to develop their Leadership of Self and Leadership of Others skills.  Your investment in them will be realized ten-fold in the future.
 

"The leader’s job is not to cover all the bases—
it is to see that all bases are covered."

~ James Crupi

What leadership development tips do you think are critical?  Can't wait to hear your thoughts!

 

Survey Analysis Shouldn't be a Fishing Expedition


Survey Analysis shouldn't be a fishing expedition

As a market research professional and long-time survey guy, I have come across many clients who could be classified as unwieldy. Although I have nothing against them personally, their thoughts about survey development and the kind of information needed are enough to make one flinch.

A few unmistakable signs of an unwieldy client:

  1. They believe you should be able to ask an unlimited number of questions, as if all respondents have an unlimited amount of time and desire to perform brand awareness exercises.
  2. There is an insatiable desire to follow every fixed question with an open-ended question.
  3. Their approach to analyzing survey data is one of ‘everything by everything to find anything’.

To answer the first point, we have to turn the tables back to the decision makers and ask how willing they would be to dedicate 15 to 45 minutes of their own time to completing a survey. If you are surveying highly engaged fans, you might achieve a decent completion rate. However, most customers, prospects, constituents and donors have a limited amount of time to complete countless Likert scales.

To the second point, even though tools such as Cvent offer text analytics, there is little need to convert your survey into a focus group. Certainly, follow-up questions to those who give you a poor Net Promoter Score are a best practice, but seldom is there a need to go beyond a limited number of open-ended questions. These should be strategically selected and placed where they will not impede overall completion.

Lastly, there is little need to go on an analysis fishing expedition. If your survey is focused and you have hypotheses established, then you can limit your analysis to answering those key questions.

In short, before you launch a survey project, take the time to engage your client with a consultative approach. As the survey author, you will need to guide the client toward a compromise, one that balances their information needs with respect for the respondent’s time and opinions. Deploying surveys without agreed-upon goals for analysis, cost and timeframe will create large headaches. Alternatively, if these three points describe you, the Cvent Professional Services group would be happy to give you guidance in questionnaire writing, survey analysis or report creation.

Brand Survey Analysis: Turning Down the Noise


Sometimes as market researchers we get lucky and find the answers with little effort. Then there is the rest of the time in which we have to dig, look to the left, look to the right and hope we can find data that supports or refutes our hypotheses. Analyzing a brand survey is one of those areas where digging is most certainly required.

A recent article in Quirk’s, Data Use: Seeing Beyond the Brand Halo by James Rohde, brought this to light and provided a simple technique for removing the noise inherent in surveys where we measure brands across several dimensions. The author speaks to the concept of the brand halo. Rohde defines brand halo as a propensity to rate attributes of a brand either more positively or negatively based on our feelings toward that brand.

A second effect can also be found when making brand comparisons. The attribute effect occurs when respondents rate an attribute higher or lower based upon their feelings toward the attribute in question. Price typically gets beat up pretty hard, since it is an attribute that many customers have negative feelings toward, in comparison to other aspects such as value, quality or durability. We will look at this further in a separate post.

Removing the noise, also known as ‘centering’ the data, is a function of the survey analysis phase. The table below shows average scores for three brands across several dimensions, including an overall customer satisfaction score. The data, as is, shows that Manny’s is the overall leader, followed by Moe’s and Jack’s. In comparison, though, there are some areas where Moe’s or Jack’s does better than Manny’s. We could accept the data and be done, but good analysts are always looking for another angle from which to view the data.

Row and Center 1

The second table accounts for the brand halo effect by subtracting each brand’s overall mean from its score on each dimension. This serves to separate respondent bias toward a brand from their feelings toward the attributes we are using to measure it. We can now clearly see that certain dimensions are more favorable (e.g. quality vs. price or environment vs. menu variety). The key takeaway is that we can remove a brand’s popularity or negativity from our analysis. This allows us to drive more focused efforts for improving our brand’s position.

Row and center tbl 2
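The centering arithmetic itself is straightforward to reproduce. The scores below are invented for illustration (the post's actual table values are not reproduced here); each attribute score has its brand's overall mean subtracted, so every brand's centered scores sum to roughly zero:

```python
# Illustrative brand-by-attribute mean scores (not the article's data).
brands = {
    "Manny's": {"quality": 8.6, "price": 6.1, "environment": 8.2, "menu variety": 7.5},
    "Moe's":   {"quality": 7.9, "price": 6.8, "environment": 7.1, "menu variety": 7.8},
    "Jack's":  {"quality": 7.2, "price": 6.5, "environment": 6.9, "menu variety": 6.6},
}

centered = {}
for brand, scores in brands.items():
    brand_mean = sum(scores.values()) / len(scores)  # the brand-halo component
    centered[brand] = {attr: round(v - brand_mean, 2) for attr, v in scores.items()}

print(centered["Manny's"])
```

The centered scores let you compare attribute strengths and weaknesses across brands without a popular brand's halo inflating every row.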

The next post will finish up the process by removing respondent feelings toward the attributes. This allows the ability to examine the data clear of any biasing influence.
 

As We Go Jetsons, Still Act Like the Flintstones

Do you remember the world pre-internet? Remember how we used to do business? At the Inc. 500|5000 Conference last year, Gary Vaynerchuk made a bold statement that really caught my attention when I watched a clip from his speech: our grandparents are more prepared to be successful in the next decade than we are, because business in the old days was built on "small town rules," and the internet is taking us back to those rules. It's about building relationships and personalizing each experience. It's not about going out en masse anymore. The future is all about one-on-one marketing. The stakes are high.

Gary goes on to talk about how most companies are treating the internet (and social media) like a one-night stand. We're all acting like 19-year-old dudes, trying to close the deal on the first transaction. Ask yourself: are you going to buy something because someone walks up to you at a cocktail party and starts pitching you? Probably not, yet that's what so many people are doing on social media. We need to personalize our approach and build relationships, one interaction at a time!

But don't think this just applies to social media. It doesn't. It applies to everything, including your emails and surveys. You need to build on existing customer relationships and individualize the (survey) experience. Whenever you write an email or create a survey, keep in mind what you already know about the respondent. Include information that triggers that additional level of trust, and use what you know in the analysis to gain deeper customer insights. Of course we're going Jetsons, but with the help of Big Data we can still be very Flintstone in our interactions with customers and prospects. I believe, like Gary, that the organizations that can do that most effectively throughout all of their marketing (not just social) are going to be the ones poised for success.

I invite you to listen to this short clip from Gary's speech, and then as you head off into the age of the Jetsons, don't forget about Fred and Wilma Flintstone!

Download our eBook, Harnessing the Conversation, for more tips on how to drive value from feedback collected through multiple sources.

Have a Laugh: Where does Humor Belong in Survey Design?

Laughter | Image by: photostock “Does anyone remember laughter?” said one Robert Plant in the immortal song Stairway to Heaven. Does laughter (once described as the "best medicine") have a place in survey research? Some might argue we should follow the route of the dismal science, but I heartily disagree.

As a construct humor has been used countless times to convey messages in both consumer and B2B marketing. No doubt many of those advertising campaigns were tested before they were launched, while in market, and after conclusion to see if they effectively moved the needle of consumer sentiment toward the brands in question.

So if humor can be leveraged as an arrow in the advertising quiver, should it be brought into the world of survey research? Why not! Maybe it's the early spring, or the fact that I just completed a survey from the Federal Reserve Bank of St. Louis that made me chuckle. Although research is serious business, and even in the DIY world resources such as time and money are spent cautiously, humor and its good friend laughter do have a place.

The aforementioned website satisfaction survey from the not-so-jolly bankers was concise and made good use of humor to increase engagement with the survey. Placing a well-crafted funny in the midst of a question, as a category, or as part of a closing 'thank you for responding' message can improve the overall survey experience.

If you are the worrisome type, then you might craft two versions of your next brand awareness or employee satisfaction survey, one with a bit of humor and one more sedate, and test the differences in response patterns. Make sure to randomly assign prospects to their survey in order to minimize potential selection bias. Just for fun, include an open-ended question asking respondents what they thought of the survey. This alone should make for interesting reading.
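A random split like that takes only a couple of lines. The recipient list below is hypothetical; the point is that each person's version comes from the random generator, not from anything about the person:

```python
import random

# Hypothetical invitation list; in practice this comes from your contact database
recipients = [f"person{i}@example.com" for i in range(100)]

random.seed(42)  # fixed seed so the assignment is reproducible and auditable
assignment = {email: random.choice(["humor", "sedate"]) for email in recipients}

humor_group = [e for e, v in assignment.items() if v == "humor"]
sedate_group = [e for e, v in assignment.items() if v == "sedate"]
```

Because assignment is random, any difference in response patterns between the two groups can be attributed to the survey version rather than to who happened to receive it.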

Brand Survey Analysis: Noise Removal Part 2

In a previous post I spoke to removing noise from brand awareness data. Noise comes in the form of two effects. First we removed noise created by the halo effect. This effect is common, especially in measuring brand performance along key dimensions. A halo exists when respondents rate attributes either more positively or negatively based upon their overall brand preference instead of on the brand's performance on those attributes. When analyzing survey data, we can control for this effect by subtracting the brand's overall average score from each of its attribute scores. This levels the playing field.

But we shouldn't stop there. In his recent Quirk's article, "Data Use: Seeing Beyond the Brand Halo," James Rohde speaks of a second effect, the attribute effect. A respondent can allow their feelings toward an attribute itself to unduly influence their ratings. Our feelings about price and quality come to mind here. Quality scores can eclipse price satisfaction scores simply because 'quality' is viewed more favorably than 'price' as a brand attribute.

Table 1 below illustrates removing the halo effect by subtracting each brand's column mean from its score on a particular attribute (e.g. for Manny's overall satisfaction, 7.10 – 7.54 = -0.44). We then remove the attribute effect, completing our row-and-column centering process, by taking these adjusted scores and subtracting each row's mean (e.g. for Moe's price satisfaction, -1.55 – (-0.99) = -0.56). This is shown in Table 2.
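The full double-centering pass can be sketched in plain Python. The matrix below is hypothetical (the posts only quote a few cells), but the two subtraction steps are the ones described above:

```python
# Hypothetical attribute x brand score matrix; rows are attributes, columns are
# brands (Manny's, Moe's, Jack's). Values are illustrative, not the article's.
scores = [
    [7.9, 7.2, 6.8],  # Overall satisfaction
    [6.1, 5.5, 5.9],  # Price
    [8.2, 7.0, 6.5],  # Quality
    [7.6, 7.9, 6.2],  # Staff
]
n_rows, n_cols = len(scores), len(scores[0])

# Step 1 - remove the brand halo: subtract each brand's (column) mean
col_means = [sum(row[j] for row in scores) / n_rows for j in range(n_cols)]
halo_removed = [[scores[i][j] - col_means[j] for j in range(n_cols)]
                for i in range(n_rows)]

# Step 2 - remove the attribute effect: subtract each attribute's (row) mean
row_means = [sum(row) / n_cols for row in halo_removed]
centered = [[halo_removed[i][j] - row_means[i] for j in range(n_cols)]
            for i in range(n_rows)]
```

After both passes every row and every column averages to roughly zero, so the values that remain reflect only each brand's relative strengths and weaknesses on each attribute.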

Table 1

Table 2

What remains is a reduced data set that is sans noise. If we are managing the marketing efforts for Jack’s then we know our strength areas are in staff and checkout, but we need to increase our likelihood to recommend and perceptions of our menu. The owners at Moe’s should focus on staff training as consumers perceive their staff to be far from friendly. Oddly, though, they are doing better in terms of satisfaction than the competition. Manny’s needs to improve their checkout process and bring greater variety to the menu.

Removing the noise from data is applicable to many of the tasks facing both consumer and B2B marketing research professionals. Taking the time to do so will pay dividends.
 


Consumers Trust Online Reviews as much as Personal Recommendations, Survey Says

Consumers Trust Reviews as much as Personal RecommendationsTrust is one of those constructs that underlies all relationships from personal to professional. If you cannot trust your spouse or partner, who can you trust? The same goes for professional relationships. In this era of expeditious means of conveying both positive and negative feelings to the masses, we as consumer and B2B market researchers need to be aware of the impact of reviews on brand awareness and affinity.

The 2012 Local Consumer Review Survey, part of a three-year study covering online reviews, found that 72% of consumers surveyed trust online reviews as much as personal recommendations. The study included panelists (n = 2,862) from the US, Canada and the UK. In the same study, over half of respondents indicated that positive online reviews increase their likelihood of using local businesses.

The sample is large enough and broad enough geographically to provide consumer insights useful for the reader to implement into their marketing strategy. Remember though, trust is not built overnight, but can be lost overnight. For consumer and B2B marketers, especially those with local market responsibilities, the key takeaway is that online reviews have the ability to impact the decision making process. We should encourage customers to leave reviews, monitor them regularly, and be keen to respond to both the negative and the positive review.

Let’s look at more of the results. One in four respondents uses the Internet at least once per week to find local businesses. This is up eight points from the 2010 study. Over one-third use it at least once per month to locate a business while eight percent are daily users. This tells me that we have to be vigilant in updating our online presence to keep our messaging current.

Does quantity matter? When it comes to reviews, it appears respondents are reading fewer reviews before making a decision about a business. Even so, it is important for marketers to pursue reviews, as a larger number of reviews instills a greater sense of confidence in consumers. Nearly six out of ten respondents indicated that positive online reviews increase their trust in a business.

Food for thought: How can you increase your online reviews, ensure they are authentic in nature, and keep them current?

Level I Evaluations: Item Selection and Piloting

In the last blog on Level I evaluations, we looked at different scales that can be used in creating a survey. Here, let us consider some of the characteristics the scale items need to have so that respondents can answer the survey effectively.

“Items” means the “questions” or “statements” in the survey. It is very important that items be worded and designed carefully in any survey. The following characteristics need to be considered to create an effective survey instrument:

ORDER: The items have to be placed in a proper order. If you are collecting data on different aspects of training, group items according to those aspects. For example, all the items related to the design, the facilitator's skills, administrative support, etc. should be neatly separated into these different categories.

===================

The items in a survey have to be worded effectively. The following is a list of errors that can be made while designing a survey.

DIRECTIONAL QUESTIONS: Items such as “The facilitator was good” or “The content was relevant” are directional. These are also called “loaded” questions. They are worded positively, and this may influence the respondents. Items should be worded in a neutral manner. For example, the item can say “style of the facilitator,” and “Good” can be one of the points on a rating scale that ranges from, say, “Very Poor” to “Very Good.”

AMBIGUOUS QUESTIONS: The items on the rating scale should not be ambiguous. Here is an example of an ambiguous item: "Was the facilitator good or bad?" This question cannot be answered on a rating scale or a Yes or No scale! It would need to have a qualitative answer.

DOUBLE NEGATIVES: Do not word your questions using double negatives. For example, “The session did not start late” is a badly worded survey statement. The direct question would be whether the session started on time.

USING ABBREVIATIONS: Do not use abbreviations in your survey. For example: “How do you rate the training delivered by AGC?” Is AGC a company, a department, or something else? This is especially true when feedback is being collected from samples drawn from different populations, where the reference and context may not be easily discernible. If an abbreviation must be used, spell out its first reference with the abbreviation in parentheses; in the following sections, the abbreviation can be used.

USE OF JARGON: The questions should be worded simply. Here is an example of a feedback item with jargon: “How did you find the andragogy of the facilitator?” While andragogy is the correct term in an adult training context, the participants may not understand what it means; if a word like methods is used instead, they may understand it better. If a technical term has to be used, its meaning can be given in parentheses for better understanding.

USE OF SLANG: Avoid using slang in designing surveys. For example, use of the words like “guys”, “fellow” etc., should be avoided.

GENDER BIAS: Sentences should be free of gender bias. For example: “Trainer was well versed in his area of work.” Apart from being directional, this statement also has a gender bias. A correct sentence would be “The trainer was well versed in his/her area of expertise.” A better item would simply ask respondents to rate the “Expertise of the Trainer.”
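A quick automated pass can catch some of these wording problems before a draft goes out for review. The lint function below is a hypothetical sketch, not a feature of any survey tool, and its pattern lists are illustrative; it supplements, but never replaces, a human read-through:

```python
import re

# Illustrative red-flag patterns keyed by the error categories above
CHECKS = {
    "directional wording": [r"\bwas (good|great|excellent|bad|poor)\b"],
    "double negative":     [r"\bnot\b.*\bno(t|ne)?\b"],
    "slang":               [r"\bguys\b", r"\bfellow\b"],
    # lookarounds skip the acceptable "his/her" form
    "gender bias":         [r"\bhis\b(?!/)", r"(?<!/)\bher\b"],
}

def lint_item(text):
    """Return the wording-error categories a draft survey item trips."""
    lowered = text.lower()
    return [category
            for category, patterns in CHECKS.items()
            if any(re.search(p, lowered) for p in patterns)]

flags = lint_item("The facilitator was good and the guys enjoyed his session")
```

Run against the sample item, the function flags directional wording, slang, and gender bias, which is exactly the kind of shortlist a reviewer can then rewrite by hand.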

===================

Item design is a very important step in collecting data. The relevance and validity of the data gathered depends on the design of the questionnaire.

PILOT THE INSTRUMENT: It is a good idea to pilot the questionnaire to see the type of data it yields. Also get inputs from the pilot group about any problems with each of the items.

  • Did the pilot respondents think that the item is relevant?
  • Was it clear?
  • Did the rating scale match the item? For example if the item asks for “Frequency” and the Rating Scale is that of “Quality” there would be a mismatch.
  • How much time did it take? Was it justified?
  • Did it have enough scope for a qualitative aspect as well (if required)
  • What were the problems that the respondents faced while answering the survey?
  • What is the suitable design for administering such a survey? Hands on after the session, web survey, etc?

The pilot would help you to find out the challenges that may lie in scoring the surveys too. For example:

  • How much time does it take to score the survey?
  • Can it be machine scored?
  • What is the software requirement?
  • What kinds of scores does the survey yield?
  • What kind of analysis is possible with the data?
  • What meaningful conclusions can you draw from the data?

===================

Successful Level I evaluations will help you redesign your training interventions in terms of the training content, method, participant levels, facilitators, and training administration and coordination support services. Get started with a free online survey trial and start evaluating your trainings at Level I.

Survey Design: Slowing Things Down

Let’s take a break. Now doesn’t that help to relieve a bit of the tension? As market researchers, we create tension by default in the design of our surveys. One of my key points is doing what I can to minimize the cognitive load placed on respondents.

Why is this a good thing, you ask?

Well, let’s start with completion rates. Countless studies report that the longer your survey, the less likely a respondent is to complete it. Isn’t our goal to generate meaningful completions? Cvent’s reporting tools provide insight into respondent exit points. If the exit point isn’t the end of the survey or a logical checkpoint, then we have lost the game. Steps taken to reduce the cognitive load facing respondents increase the likelihood they will make it to the end of the survey. We can do this through effective targeting (matching surveys to those known to have an interest in the topic), employing visual stimulation, and keeping our focus so surveys do not run long.

Let’s take a look at the concept of meaningful completes. The literature is rife with stories of speeding and straight-lining. Both of these issues cause problems when we analyze survey data. Speeding, as the name implies, occurs when respondents complete the survey without focusing on the questions at hand. This is where it is handy to have a benchmark for completion time: if a respondent’s time is significantly less than the benchmark, chances are he or she did not give the survey its due.
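Flagging likely speeders against a benchmark is a one-line filter. The completion times and the one-half cutoff below are hypothetical; pick a threshold that fits your own survey's benchmark:

```python
# Hypothetical completion times in seconds, keyed by respondent ID
times = {"r1": 95.0, "r2": 310.0, "r3": 120.0, "r4": 280.0}

benchmark = 300.0            # e.g. median completion time from earlier waves
threshold = 0.5 * benchmark  # flag anyone finishing in under half the benchmark

speeders = [rid for rid, t in times.items() if t < threshold]
# flagged respondents are set aside for review before analysis
```

Rather than dropping flagged cases automatically, review them first; a fast time on a short routing path can be perfectly legitimate.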

Straight-lining is a common problem when large matrix questions are used. These matrices create an environment where the respondent can find it easy to go straight down one side or the other. Best practices involve breaking the matrices into smaller sub-tables and using negatively worded questions where appropriate. The thinking is this will slow respondents down and provide them the impetus to read the questions and think more carefully about their responses.

Another trick of the trade is to use check questions and reminders. A check question, usually inserted between lengthy sets of tasks, instructs the respondent to select a specific answer. Respondents who do not select the check answer can be screened out of the analysis. A sample reminder, as seen below, gives the respondent a mental palate cleanser.

pause and reflect
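Screening on a check question is then a simple filter at analysis time. The response records and the instructed answer below are hypothetical:

```python
# Hypothetical records; the check item told respondents to pick "Somewhat agree"
INSTRUCTED_ANSWER = "Somewhat agree"
responses = [
    {"id": "r1", "check": "Somewhat agree", "q10": 4},
    {"id": "r2", "check": "Strongly agree", "q10": 5},  # failed the check
    {"id": "r3", "check": "Somewhat agree", "q10": 2},
]

# Keep only respondents who followed the check-question instruction
valid = [r for r in responses if r["check"] == INSTRUCTED_ANSWER]
```

Keeping the failed cases in a separate bucket, rather than deleting them, lets you report how many respondents were screened out and why.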

Lightening the load on respondents increases the chance they will complete the survey and provide you with well-thought-out answers. After all, that is our reward.

5 Qualities of Remarkable Bosses

Qualities of a Remarkable Boss

Inc., a website dedicated to helping small businesses flourish, recently published the top qualities great bosses have that can motivate their employees to work hard and grow their business. Jeff Haden cites these five characteristics that a remarkable boss should embody:

  1. Develop every employee: Focus on helping your employees develop skills through training and mentoring, rather than requiring them to meet generic progress report metrics. 
  2. Deal with problems immediately: Even small problems are important to address, as they can become big problems quickly.
  3. Rescue your worst employee: When an employee has a public failure at work, they often become the black sheep of the organization. Don't allow them to fall by the wayside; step up the mentoring and training to help them get back on track.
  4. Serve others, not yourself: Put your employees in the spotlight, not yourself. Remember that if you are a remarkable boss, others will notice—you shouldn't have to tell them.
  5. Always remember where you came from: A great boss is something of a rockstar in the workplace. Don't blow off the lower-level employees in your organization, no matter how inconsequential their request may be. 

How can you find out if you would qualify as one of Haden's remarkable bosses? Asking your personnel point-blank really isn't going to get you an honest answer. But a great way to measure how you're doing is with an employee survey. An online employee satisfaction survey can help gauge how you're performing as a leader, what management might need to work on, and what the company could be doing to better engage and grow employees. Consider utilizing a variety of types of questions to achieve the most well-rounded, honest and thorough responses, such as:

  • Drop-down options: "How long have you been with our company?"
  • Likert Scale Ranges: "On a scale of 1 to 5, please rate your satisfaction with your compensation package."
  • Open-ended questions: "What is your ideal management style?"

By using Cvent's popular online survey tools, you can discover how your employees feel about your company and use the results to make improvements that boost morale and your bottom line simultaneously.

Hear, See and Do!

On occasion I come across a post that really moves me to action. In a recent Greenbook newsletter Edward Appleton raised two strong points about market research’s role. I strongly suggest reading his post if time permits, but in summary he raises a few questions that we need to be able to answer.

First, as a profession do we need to worry about the reputation of the market research industry? Short answer is yes! There are numerous pundits saying that marketing research is dead, that surveying customers and prospects is so yesterday, and respondents will invariably say one thing and do another. Are these points true? Again the answer is yes. Does it mean we should all be looking for new jobs? Short answer is no!

As a profession market research has undergone numerous tidal waves of change. The mere fact that phone and mail surveys have been largely replaced by online or mixed-mode surveys is proof that we can adapt and evolve. The fact that many of us are now using text analytics and social media tracking tools to dive into that stream of data is another point of proof.

Edward raises a second point: how can research be saying one thing while actual events portray another? When we isolate ourselves behind computers and stacks of statistics books, it is easy to lose sight of what is important to the customer. We must get out into the field, observe, and ask direct questions of customers as they stand in line or browse the aisles. Customer satisfaction surveys and brand awareness trackers can be powerful tools, but they can also be locked into measuring constructs that are not of import to the customer, prospect or end-user of the data.

One of the most challenging, but ultimately rewarding experiences a researcher can have is to motivate his or her executives to get out into the trenches and see what is going on first hand. The insight gained can provide valuable feedback to our work and allow us to keep it current with the times.

Confucius said it best:

I hear, I forget.
I see, I remember.
I do, I understand.
