The Consumer Picks survey, now in its fifth year, continues to grow and reveal new insights about brand strength and consumer preferences in the restaurant industry.
The 2015 report includes more than 42,000 vetted consumer ratings and analyzes consumer perceptions of 172 restaurant brands, which is 10 more brands than last year and 33 more than were rated in the debut survey in 2011. As a result, this year’s report offers the most complete picture of restaurant brand health the survey has ever revealed.
Among the new brands included this year are:
• Six Limited-Service chains: Marco’s Pizza, Red Mango, Peet’s Coffee & Tea, Penn Station East Coast Subs, Raising Cane’s and Which Wich.
• Two Family-Dining chains: First Watch and Shari’s Cafe and Pies.
• Two Casual-Dining brands: Beef O’Brady’s and Genghis Grill.
Interestingly, two of these newcomers took the top honors in their competitive sets, with First Watch taking the top score in the 16-brand Family-Dining category, and Penn Station East Coast Subs leading the 16 Limited-Service sandwich brands.
One question I am often asked is how brands’ Overall Scores are calculated, specifically why they are not simply an average of each brand’s attribute scores. The answer is that each brand’s Overall Score is weighted by how much consumers in that segment value the various attributes. This produces scores that more accurately reflect how well brands deliver on the expectations of consumers in their segments.
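For readers who want to see the mechanics, here is a minimal sketch of an importance-weighted score. The attribute names match the survey, but the scores and weights shown are placeholders invented for illustration, not Consumer Picks data; in the actual survey, the weights come from respondents’ importance ratings within each segment.

```python
# A minimal sketch of an importance-weighted Overall Score.
# The numbers below are placeholders, not Consumer Picks data.

attribute_scores = {        # a brand's top-two-box percentages (illustrative)
    "Food Quality": 62.0,
    "Service": 55.0,
    "Value": 48.0,
}

# Hypothetical importance weights: the share of respondents in the segment
# who rated each attribute "essential" or "very important."
importance_weights = {
    "Food Quality": 0.80,
    "Service": 0.70,
    "Value": 0.65,
}

def overall_score(scores, weights):
    """Average of attribute scores, weighted by segment-level importance."""
    total_weight = sum(weights[attr] for attr in scores)
    return sum(scores[attr] * weights[attr] for attr in scores) / total_weight

print(round(overall_score(attribute_scores, importance_weights), 1))
```

Because the weights differ by segment, the same set of attribute scores can yield different Overall Scores in different segments, which is why cross-segment comparisons introduce error.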
It should be noted that closely ranked brands should be considered statistically equivalent. In Consumer Picks, as in any statistical survey of this type, small differences in ratings are not material. For reporting purposes, brands are listed by Overall Score, and when two or more brands appear tied in the report, the order of listing is determined by extending the score to additional decimal places.
As in prior years, brands are grouped into four major categories: 111 Limited-Service chains, 40 Casual-Dining chains, 16 Family chains, and five Fine-Dining chains. The Limited-Service and Casual-Dining categories are further divided by menu type to allow for comparisons of brands against their direct competitors. Brands can be compared within the four major segments, but comparisons across these primary categories will introduce some error, because Overall Scores are weighted by the attribute importance ratings of the segment in which the brand appears.
That said, the 2015 survey format is consistent with previous years in order to facilitate year-to-year comparisons of how a brand performs against its competitors. These comparisons are an insightful and common use of the data, but must be made carefully to ensure accuracy. For that reason, I offer a few suggestions and warnings.
First, do not draw conclusions based purely on the difference between a brand’s score last year and this year. That approach does not account for the time elapsed between the two studies or for social factors during the year that may have influenced consumers’ responses. A better way to track a brand’s year-over-year performance is to compare its score against the average score of an identical group of competitors for both this year and prior years, and then look at how the gap between the subject brand and its group of competitors has changed.
WD Partners has created a blank Excel file to aid in making such comparisons; it is available for download. Another useful way to measure brand performance over time is to look for changes in ranking, keeping in mind that changes in the total number of brands in a group from year to year may affect rankings.
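As a rough illustration of the relative comparison described above, the sketch below uses invented brand names and scores. The point is simply to hold the competitor group constant across years and track the brand’s gap to that group, rather than its raw score.

```python
# A rough sketch of a relative year-over-year comparison.
# Brand names and scores are invented for illustration only.

scores_prior = {"Brand A": 61.2, "Brand B": 58.9, "Brand C": 63.4}
scores_current = {"Brand A": 60.1, "Brand B": 59.5, "Brand C": 62.0}

def gap_to_group(brand, scores):
    """Brand's score minus the average score of the full brand group."""
    group_average = sum(scores.values()) / len(scores)
    return scores[brand] - group_average

prior_gap = gap_to_group("Brand A", scores_prior)
current_gap = gap_to_group("Brand A", scores_current)

# A positive change means the brand gained ground on its competitive set,
# even if its raw score declined.
print(round(prior_gap, 2), round(current_gap, 2), round(current_gap - prior_gap, 2))
```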
WD Partners and NRN hope you find Consumer Picks 2015 to be a useful and valuable tool.
Methodology
The Consumer Picks survey was developed by WD Partners and is designed to provide relative benchmarks on attribute ratings across restaurant brands.
The survey was conducted online, and respondents were given a list of 178 restaurant chains organized into groups depending on the chain’s service model as well as the respondent’s location. Respondents were asked to identify which restaurants they had patronized in the last six months, or since July 15, 2014, and how many times they had visited each during that period. Respondents were then asked to rate their experiences at up to eight of the restaurants they had patronized. The six-month time frame is designed to increase the likelihood of recall and to ensure that respondents represent a broad range of consumers.
Responses were screened for inconsistent answers; “straight lining,” or selecting the same response repeatedly; and excessive haste in completing the questionnaire. In addition, six of the 178 restaurants were not included in the report because they had fewer than 100 ratings.
That process yielded 42,196 acceptable ratings for 172 restaurant chains.
Twenty-seven of the 172 restaurant chains had between 100 and 149 ratings and are included in this report, but are marked with a double asterisk to indicate they fell below the desired threshold of 150 ratings. The remaining 145 restaurant chains were rated by at least 150 respondents, and more than half, or 85 brands, were rated by more than 200 respondents.
The questionnaire was developed and tested by WD Partners. To administer it, the firm worked with Survey Sampling International, which supplied panel respondents, and SurveyGizmo, which provided survey software.
The survey addresses ten attributes: Atmosphere, Cleanliness, Food Quality, Likelihood to Recommend, Likelihood to Return, Menu Variety, Reputation, Service, Value and Craveability. Results for individual attributes are presented as the percentage of top-two-box ratings received, based on a standard five-point scale. For all attributes except Likely to Recommend and Likely to Return, the scores are the percentage of respondents who said a chain was “outstanding” or “above average” in that area. For Likely to Recommend and Likely to Return, the scores are the percentage of respondents who said that they would “definitely” or “probably” recommend the brand or visit it again.
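The sketch below shows how a top-two-box percentage is derived from a five-point scale. The rating counts are hypothetical, not survey data.

```python
# Illustrative top-two-box attribute score from a five-point scale.
# Rating counts are hypothetical, not survey data.

ratings = {
    "outstanding": 48,
    "above average": 71,
    "average": 60,
    "below average": 14,
    "poor": 7,
}

def top_two_box(counts):
    """Percentage of respondents who chose one of the top two ratings."""
    total = sum(counts.values())
    return 100.0 * (counts["outstanding"] + counts["above average"]) / total

print(round(top_two_box(ratings), 1))  # 59.5 with these hypothetical counts
```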
Brands’ Overall Scores are an average of those scores, weighted by the importance of each attribute to that segment’s customers, or the percentage of respondents who said an attribute was “essential” or “very important.” Likely to Return is not included in importance rankings and is not factored into the Overall Scores.
Demographic information on the respondents was obtained to align the survey results with the U.S. population based on 2014 Bureau of Labor Statistics reports.
Additional data and custom analysis are available for purchase. Contact WD Partners’ Dennis Lombardi at [email protected].
Dennis Lombardi is executive vice president of foodservice strategies for WD Partners.
WD Partners is a Dublin, Ohio-based customer experience firm that helps global food and retail brands innovate through strategy and design. Research conducted by WD Partners’ Insights group is part of the company’s approach to enhancing the performance of foodservice brands.