Join Us for a Deep Dive into J.D. Power’s 2017 Bank Ratings
If I utter the phrase “J.D. Power Ratings”, I’ll bet you a dollar I can guess what pops into your mind. It’s probably a memory of the last time you bought a new or used car. Maybe you were trying to figure out whether people are happier with their Toyota Highlanders or their Nissan Pathfinders, or which sedan is more dependable, the Ford Fusion or the Honda Accord. Whatever car model you’re considering, J.D. Power can ply you with a wide array of metrics to help (or perhaps complicate?) your decision.
But the common thread that runs through J.D. Power’s work is not an automotive focus – it’s customer satisfaction research, spanning almost a dozen consumer-oriented industries and numerous subsegments within each. For a typical study, J.D. Power surveys 50,000-100,000 consumers in 11 regions across the nation.
Of greatest interest to us here at DepositAccounts is their annual spring release of the U.S. Retail Banking Satisfaction Study. J.D. Power says that the study, now in its 12th year, is the “longest running and most in-depth survey” of the U.S. retail banking landscape. Each year’s final report synthesizes customer satisfaction findings across six banking factors – account information, channel activities, facilities, fees, problem resolution, and product offerings – into a single indexed rating score for dozens of the country’s largest brick-and-mortar banks.
With the 2017 results freshly released, we decided to take a close look at who the winners and losers are this year, both nationally and in each of the 11 segmented regions; how J.D. Power’s rankings compare with aggregated reviews from DA readers; how the very biggest banks fared; and whether any of these large brick-and-mortar banks have rates worthy enough to attract your business.
So without further delay, let’s start the slice ‘n dice…
J.D. Power’s 11 regional rankings
In its methodology, J.D. Power asks consumers about the banks with locations in their particular region, then calculates and reports its findings region by region, resulting in 11 different ranking lists. The press release touts that the 2017 survey included 136 of the nation’s largest banks, but because some banks appear in more than one region, some presumably didn’t receive a statistically significant number of responses for reporting, and some have since been merged into other banks, the number of unique banks in our analysis is 72.
In the chart below, you can see the top-rated bank, the lowest-rated bank, and the average rating in each of the 11 geographical regions. Each score is based on J.D. Power’s 1,000-point index that combines all six areas of consumer satisfaction.
From this layout, you can observe a few things. First, customers in some regions are noticeably more satisfied than in others. For instance, the average rating in both the South Central states and in Texas is more than 30 points above the average in New England, where the mean score almost dips into the 700s. No one can say whether that tells us Texans and Alabamians are easier to please than perhaps more-demanding New Englanders, or whether the customer service ethic or training among employees in these regions is what differs. Quite possibly, it’s both. But we can’t know that from this data.
The score spectrums for each region also make it easy to see which banks had the absolute worst and best scores among all those in the survey. We see that no one scored higher than Frost Bank did among its Texas customers, with BancFirst in the Southwest scoring just a point lower.
We can also note that, while Banner Bank won the blue ribbon in its Northwest division, it is at the very best an 11th-ranked bank overall, given it has the lowest score of any of the winners. But it’s likely lower-ranked nationally than that, since there could be non-first-place banks in the other regions that scored higher than Banner’s 840 points.
At the other end of the spectrum, we see that Old National Bank in the North Central states holds the unfortunate distinction of receiving the worst overall score, with TCF National Bank close behind in the Midwest. HSBC in the Mid-Atlantic and Santander in New England are also sitting fairly low in the cellar.
How we boiled 11 rankings down to one
While these 11 regional rankings can be interesting, what I really wanted to see was how all these banks stacked up in one nationwide list. So where a bank’s customers were surveyed in more than one region, I calculated their average (mean) J.D. Power score, leading to a single score per institution. Of the 72 banks, 21 were scored in multiple regions, with 16 of them appearing 2-5 times. PNC and U.S. Bank were scored in a half dozen segments, and three of the “Big Four” banks (Chase, Wells Fargo, and Bank of America) were surveyed in all 11 regions. Citibank, though the fourth largest bank by assets, has a much smaller geographic footprint than the top three banks, and was only scored in four regions.
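For the numerically inclined, here’s a minimal sketch of that averaging step. The banks and scores in it are hypothetical placeholders, not J.D. Power’s actual figures.

from collections import defaultdict

# Hypothetical (bank, region, score) records -- placeholders, not J.D. Power's data.
regional_scores = [
    ("Example Bank A", "Texas", 850),
    ("Example Bank A", "Southwest", 830),
    ("Example Bank B", "New England", 801),
]

scores_by_bank = defaultdict(list)
for bank, region, score in regional_scores:
    scores_by_bank[bank].append(score)

# One national score per bank: the mean of its regional scores.
national_score = {bank: sum(s) / len(s) for bank, s in scores_by_bank.items()}
print(national_score)  # {'Example Bank A': 840.0, 'Example Bank B': 801.0}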
With a singular score now calculated for each bank, I used J.D. Power’s published methodology for identifying four scoring tiers: Best of the Bunch, Better Than Most, About Average, and The Rest. For our purposes, though, I’ve changed “The Rest” to “Below Average” because I find it more descriptive.
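The article doesn’t reproduce J.D. Power’s tier cutoffs, so the sketch below buckets banks by score quartile purely for illustration; it’s an assumption, not the published methodology.

import statistics

def tier_by_quartile(national_score):
    """Bucket banks into four tiers by score quartile (illustrative assumption only)."""
    q1, q2, q3 = statistics.quantiles(national_score.values(), n=4)
    tiers = {}
    for bank, score in national_score.items():
        if score >= q3:
            tiers[bank] = "Best of the Bunch"
        elif score >= q2:
            tiers[bank] = "Better Than Most"
        elif score >= q1:
            tiers[bank] = "About Average"
        else:
            tiers[bank] = "Below Average"
    return tiers

# Hypothetical scores, just to show the bucketing.
print(tier_by_quartile({"Bank A": 849, "Bank B": 840, "Bank C": 828, "Bank D": 801}))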
Here are our results, which nationalize J.D. Power’s regional rankings and, to my knowledge, have not been synthesized and published anywhere else.
These tables pretty much speak for themselves: find your favorite big bank and see how others have experienced it, then see which banks consumers have liked more and less. My most notable observation is that I might declare a different bank a bigger winner than the top scorers. True, Frost Bank and BancFirst deserve kudos for achieving the highest customer satisfaction scores in the survey.
But I’d venture that Huntington National Bank has earned a special shout-out. The top four scorers were all rated in just one region, and it’s a bit easier to score one stand-out rating than two. Huntington is the only bank to win top honors in two regions – Mid-Atlantic and North Central – receiving a customer satisfaction score of 849 in both regions.
How do these align with DA reader reviews?
As with all customer satisfaction surveys, the study findings may not jibe with your personal experience. Perhaps a bank you’ve come to loathe due to multiple bad experiences head-scratchingly appears in the “Better Than Most” tier. Or your hands-down favorite bank was inexplicably deemed “Below Average” by J.D. Power respondents. This of course is inevitable because we usually don’t know if our personal experience is an outlier or part of a common theme.
In any case, we were curious to see how J.D. Power’s scores align – or don’t – with the scores that DA readers have given these banks over time. Several words of caution are in order here, though.
First, the number of DepositAccounts reader reviews can’t compare to the breadth of J.D. Power’s surveys for each bank. While we don’t know the minimum number of respondents per bank, we know the full survey tapped 78,000 banking customers, and I’d assume J.D. Power set a minimum sample size before it would score a bank. Perhaps it’s the market research industry’s rule-of-thumb number of 400 completed surveys, or perhaps a little less or more. I don’t know. But among these 72 banks, only three have more than 200 reviews registered on our site, and even those have fewer than 300.
Second, J.D. Power’s survey draws a statistically random sample of banking customers across the country. This differs significantly from the pool of DA reviewers in a couple of ways. For one, DepositAccounts reviewers, or at least a subset of them, may be more banking-engaged (and presumably more knowledgeable and rate-savvy) than the average U.S. consumer.
Additionally, DA reviewers have self-selected to participate, presumably because they feel strongly enough about a bank to want to share their good or bad experience with others. This is quite different from the random selection process J.D. Power employs. It’s fairly defensible, for instance, to assume a self-selecting survey will naturally draw more respondents who have a strong opinion and few who feel neutral. It might also be argued that readers with a strongly negative experience with a bank are the most likely of all to write a review (anger tends to spur more action in human beings than happiness or contentment does), and therefore might depress scores more than a random sample would.
Still, it’s fun to mash up the J.D. Power scores alongside the DA reader reviews, keeping in mind that this is not a scientific comparison. We did this by compiling all of the J.D. Power banks for which we had at least ten DA reviews – 36 banks in total. Here’s what we found.
What we see are some definite instances of correlation. For instance, the three banks with the highest DepositAccounts review scores in this group appear within the top four J.D. Power slots. And if you compare where the five blue circles (DA’s best-reviewed banks) sit from top to bottom against the five red circles (DA’s worst-reviewed banks), the blues tend to sit higher in the J.D. Power list and the reds lower. (One note about Capital One: Since it merged its physical and internet operations, only a single review score is available for the entire bank.)
Put a little more numerically: if you average the J.D. Power rank of DA’s five best-reviewed banks (the blue circles), you get an average rank of about 8. Do the same for DA’s five worst-reviewed banks and the average J.D. Power rank is 18. So in general terms, higher DepositAccounts reviews correlate with higher J.D. Power scores, and vice versa.
You can also see that in the “Below Average” section of the J.D. Power rankings, all of the DA review scores sit below 3, the midpoint of our 5-star scale. So DA readers, too, collectively deemed these banks below average.
Of course, there is some dissonance between the rating systems when you look closely at individual scores. For instance, J.D. Power’s highest-scored bank in this group – United Community Bank – gets a below-average rating from DA readers. But notice that this is based on just 12 individuals’ opinions. The same is true at the other end of the spectrum for Fulton Bank: it has the second-lowest DA score in this list, yet lands in “Better Than Most” in J.D. Power’s study. Again, this is based on just 12 reader reviews.
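Here’s that back-of-the-envelope calculation as a tiny sketch; the rank positions are hypothetical stand-ins that happen to average to the same values, since the chart’s exact placements aren’t listed in the text.

# Hypothetical J.D. Power rank positions (1 = best) within the 36-bank mash-up
# for DA's five best- and five worst-reviewed banks; not the actual chart data.
top_da_ranks = [2, 3, 4, 11, 20]
bottom_da_ranks = [9, 14, 19, 22, 26]

avg_rank_top = sum(top_da_ranks) / len(top_da_ranks)           # 8.0
avg_rank_bottom = sum(bottom_da_ranks) / len(bottom_da_ranks)  # 18.0
print(avg_rank_top, avg_rank_bottom)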
As a foil to this, look at the four large banks where we have over a hundred reviews. Here, our reviews correlate fairly closely with J.D. Power’s assessments: Capital One is the best of these big banks, Chase is about average, and Bank of America and Wells Fargo are in the “below average” tier. We agree!
As I outlined earlier, there are various structural problems with comparing J.D. Power’s scores side-by-side with DA reviews, so it’s not surprising we don’t see lock-step alignment across all 36 banks. But we’re encouraged at how much correlation the analysis did turn up.
Focusing the lens on the biggest of the big banks
We also took a look at the Top 10 largest retail banks, to see which ones customers like most and least, according to their J.D. Power scores. You can see below that the Top 10 bank appearing in the smallest number of regions – Capital One, in three regions – earned the highest average score. Meanwhile, the nation’s second- and third-largest banks – Wells Fargo and Bank of America, respectively – were scored in all 11 regions, and earned the lowest average scores of the Top 10.
While there is certainly some downward pressure on scores when a bank is rated by, say, 50,000 customers versus 5,000, you can compare apples to apples by looking at Chase’s average J.D. Power score. It, too, appears in all 11 regions of the study, yet still scored 12 points higher than Wells Fargo and Bank of America. Similarly, PNC earned an average of 828 from six regional scores, beating out four Top 10 banks that were scored in only four or five regions.
Another thing that’s hard not to notice is how poorly Bank of America and Wells Fargo rate, landing in the “Below Average” tier, and that no Top 10 bank appears in J.D. Power’s “Best of the Bunch” tier.
The elephant in the room… Rates
Regular DepositAccounts readers will be able to anticipate the general outcome of the last facet we looked at in this analysis: whether any of these big banks have competitive rates. I say it’s predictable because these are 1) very large banks, and 2) brick-and-mortar operations, and regulars here know that online banks generally have much better rates, as do credit unions. And among brick-and-mortar banks, smaller and medium-sized institutions generally trump large ones on rates. (You can see our analysis of this here, which we will soon be updating with current rates.)
But whether or not we can guess the outcome, we pulled the best available rates from each of these 72 brick-and-mortar banks anyway, for three products: their best savings or money market account, their standard 1-year CD, and their standard 5-year CD. The findings did not buck our expectations.
In short, none of these banks offer much in the way of rates to recommend you open an account with them. Even considering the respectable 5-year CD rates, a search through DepositAccounts’ CD rate tables will lead you to more than a handful of nationally available banks – either online banks or smaller institutions – that pay upwards of 2.25% APY, making 2.00% APY a losing proposition.
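To put that gap in dollar terms, here’s a quick sketch comparing 2.00% and 2.25% APY on a hypothetical $10,000 five-year CD (since APY already reflects compounding, the balance is simply grown by the APY each year):

def cd_balance(principal, apy, years):
    # APY already accounts for compounding, so apply it once per year.
    return principal * (1 + apy) ** years

deposit = 10_000  # hypothetical deposit amount
low = cd_balance(deposit, 0.0200, 5)   # about $11,040.81
high = cd_balance(deposit, 0.0225, 5)  # about $11,176.78
print(round(high - low, 2))            # roughly $136 more over five years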
One important note is required here, though. Some brick-and-mortar banks do not publish their rates online, so we could not assess those. Also, some physical banks occasionally offer special promotional rates that can be a worthy deal, and savers should always keep their eyes open for those. But in terms of their general menu of standard rates, few of these 72 banks with published rates listed anything worth a savvy saver’s attention.
Final take-aways
Landing on the note of rates is perhaps not the best endpoint to this analysis, as there are other worthwhile reasons to hold an account at one of these large banks. Personally, I typically keep one Chase account open, though I keep virtually none of my money there. Having the Chase account gives me a local branch I can visit should I need one, and it’s usually the byproduct of scoring a cash bonus for opening the account in the first place.
Even the smartest capital-preserving savers, who keep all of their deposit funds in internet banks and credit unions at the highest rates they can find, can benefit from holding an account with one large bank that operates a physical branch in their community. And now you know which of those large institutions across the country make their customers happiest.
Reader comments

They receive hidden contributions from many institutions in exchange for favorable ratings. The best rating goes to the highest bidder, so I stopped following their ratings and awards; they’ve become biased and overrated.
They belonged to The McGraw-Hill Companies, Inc., are driven by profit now, and the impartiality is gone forever.
https://www.pacbiztimes.com/2016/04/18/j-d-power-sold-for-1-1-billion-to-xio-investment-firm/
But thanks for the effort. :-)
Rhett
Same with any consumer goods: if a company offers half price on a product, we couldn’t care less about the ratings of the company that sells it.
Ann brings up a good point too: “No methodology is perfect,” and any survey can be skewed to suit a desired outcome. The Federal Reserve and its inflation rate are a prime example!
I moved down on the page and it is about other stuff too!
Land-line phone polls: OK, how many folks aged 18-34 have landlines to begin with? Inherent selection bias.
Land-line phone polls for folks with landlines: OK, we’re mainly talking about older folks, who might (or might not) answer their phones. I know we routinely route any calls to our answering machine.
Land-line phone polls for folks who pick up their phones: seriously, folks who respond are truly dumb, bored, or need company. Folks who respond are especially vulnerable to “push polls,” e.g., “Don’t you agree Democrats who advocate killing babies are evil?” Problem being, pro-choice folks are seldom evil.
I gave up on polls long ago.
An example of skewing polls to suit the outcome and make great headlines:
The Democratic party had Hillary all but crowned President right up to the eve of the general election last November. Well, when Democrats poll Democrats, what else does a person expect the polls to indicate?
I took a look at the 5-year CD rates of their best-rated institutions, and only Capital One had any reasonable CD rates. The other four were abysmal. Capital One’s brick-and-mortar branches are quite convenient for me, and I have had accounts with them in the past. However, there previously was always a 6-month rate guarantee on their money market account, and now their rate of 1.0% for a minimum of $10K is too low and can change at any time. With regard to CDs, I have found more competitive rates either from the blog or by researching on my own.
Unfortunately, this "survey" appears to be more of an advertisement for the institutions that garner its top ratings.