
Join Us for a Deep Dive into J.D. Power’s 2017 Bank Ratings


If I utter the phrase “J.D. Power Ratings”, I’ll bet you a dollar I can guess what pops into your mind. It’s probably a memory of the last time you bought a new or used car. Maybe you were trying to figure out whether people are happier with their Toyota Highlanders or their Nissan Pathfinders, or which sedan is more dependable, the Ford Fusion or the Honda Accord. Whatever model you’re considering, J.D. Power can ply you with a wide array of metrics to help (or perhaps complicate?) your decision.

But the common thread that runs through J.D. Power is not an automotive focus – it’s customer satisfaction research, spanning almost a dozen consumer-oriented industries and numerous subsegments within each. For a typical study, J.D. Power surveys 50,000-100,000 consumers in 11 regions across the nation.

Of greatest interest to us here at DepositAccounts is their annual spring release of the U.S. Retail Banking Satisfaction Study. J.D. Power says that the study, now in its 12th year, is the “longest running and most in-depth survey” of the U.S. retail banking landscape. Its final report every year synthesizes customer satisfaction findings for six different banking factors – account information, channel activities, facilities, fees, problem resolution, and product offerings – into a single indexed rating score for dozens of the country’s largest brick-and-mortar banks.

With the 2017 results freshly released, we decided to take a close look at who the winners and losers are this year, both nationally and in each of the 11 segmented regions; how J.D. Power’s rankings compare with aggregated reviews from DA readers; how the very biggest banks fared; and whether any of these large brick-and-mortar banks have rates worthy enough to attract your business.

So without further delay, let’s start the slice ‘n dice…

J.D. Power’s 11 regional rankings

In its methodology, J.D. Power asks consumers about the banks with locations in their particular region, and it reports its findings compartmentalized by area, resulting in 11 different ranking lists. The press release touts that the 2017 survey included 136 of the nation’s largest banks. But because some banks appear in more than one region, some presumably didn’t receive a statistically significant number of responses for reporting, and some have since been merged into other banks, the number of unique banks in our analysis is 72.

In the chart below, you can see the top-rated bank, the lowest-rated bank, and the average rating in each of the 11 geographical regions. Each score is based on J.D. Power’s 1,000-point index that combines all six areas of consumer satisfaction.

JD Power Rankings

From this layout, you can observe a few things. First, customers in some regions are noticeably more satisfied than in others. For instance, the average rating in both the South Central states and in Texas is more than 30 points above the average in New England, where the mean score almost dips into the 700s. No one can say whether that tells us Texans and Alabamians are easier to please than perhaps more-demanding New Englanders, or whether the customer service ethic or training among employees in these regions is what differs. Quite feasibly, it could be both. But we can’t know that from this data.

The score spectrums for each region also make it easy to see which banks had the absolute worst and best scores among all those in the survey. We see that no one scored higher than Frost Bank did among its Texas customers, with BancFirst in the Southwest scoring just a point lower.

We can also note that, while Banner Bank won the blue ribbon in its Northwest division, it is at the very best an 11th-ranked bank overall, given it has the lowest score of any of the winners. But it’s likely lower-ranked nationally than that, since there could be non-first-place banks in the other regions that scored higher than Banner’s 840 points.

At the other end of the spectrum, we see that Old National Bank in the North Central states holds the unfortunate distinction of receiving the worst overall score, with TCF National Bank close behind in the Midwest. HSBC in the Mid-Atlantic and Santander in New England are also sitting fairly low in the cellar.

How we boiled 11 rankings down to one

While these 11 regional rankings can be interesting, what I really wanted to see was how all these banks stacked up in one nationwide list. So where a bank’s customers were surveyed in more than one region, I calculated their average (mean) J.D. Power score, leading to a single score per institution. Of the 72 banks, 21 were scored in multiple regions, with 16 of them appearing 2-5 times. PNC and U.S. Bank were scored in a half dozen segments, and three of the “Big Four” banks (Chase, Wells Fargo, and Bank of America) were surveyed in all 11 regions. Citibank, though the fourth largest bank by assets, has a much smaller geographic footprint than the top three banks, and was only scored in four regions.
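For anyone who wants to replicate the aggregation, it boils down to a simple mean per bank. Here’s a minimal Python sketch; the bank names and regional scores below are hypothetical placeholders, not J.D. Power’s actual figures:

```python
from statistics import mean

# Hypothetical regional scores for a few banks.
# These names and numbers are placeholders, NOT J.D. Power's data.
regional_scores = {
    "Bank A": [812, 798, 805],       # surveyed in 3 regions
    "Bank B": [840],                 # surveyed in 1 region
    "Bank C": [790, 801, 788, 795],  # surveyed in 4 regions
}

# One nationwide score per bank: the mean of its regional scores.
national_scores = {bank: mean(scores) for bank, scores in regional_scores.items()}

# Rank banks from highest to lowest average score.
ranking = sorted(national_scores.items(), key=lambda kv: kv[1], reverse=True)
for bank, score in ranking:
    print(f"{bank}: {score:.1f}")
```

Sorting those per-bank means from highest to lowest is all it takes to turn 11 regional lists into one nationwide one.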

With a single score now calculated for each bank, I used J.D. Power’s published methodology for identifying four scoring tiers: Best of the Bunch, Better Than Most, About Average, and The Rest. For our purposes, though, I’ve changed “The Rest” to “Below Average” because I find it more descriptive.

Here are our results, which present a nationalization of J.D. Power’s regional rankings and, to my knowledge, have not been synthesized and published anywhere else.

Best of the Bunch | Better Than Most | About Average | Below Average

These tables pretty much speak for themselves: find your favorite big bank and see how others have experienced it, and then see who consumers have liked more and less. My most notable observation is that I might declare one bank a bigger winner than the top-scoring bank. True, Frost Bank and BancFirst deserve well-earned kudos for achieving the highest customer satisfaction scores in the survey.

But I’d venture that Huntington National Bank has earned a special shout-out. The top four scorers were all rated in just one region, and it’s a bit easier to score one stand-out rating than two. Huntington is the only bank to win top honors in two regions – Mid-Atlantic and North Central – receiving a customer satisfaction score of 849 in both regions.

How do these align with DA reader reviews?

As with all customer satisfaction surveys, the study findings may not jibe with your personal experience. Perhaps a bank you’ve come to loathe due to multiple bad experiences head-scratchingly appears in the “Better Than Most” tier. Or your hands-down favorite bank was inexplicably deemed “Below Average” by J.D. Power respondents. This of course is inevitable because we usually don’t know if our personal experience is an outlier or part of a common theme.

In any case, we were curious to see how J.D. Power’s scores align – or don’t – with the scores that DA readers have given these banks over time. Several words of caution are in order here, though.

First, the number of DepositAccounts reader reviews can’t compare to the breadth of J.D. Power surveys conducted for each bank. While we don’t know the minimum number of respondents per bank, we know the full survey tapped 78,000 banking customers, and I’d assume J.D. Power set a minimum sample size before it would score a bank. Perhaps it’s the market research industry’s rule-of-thumb number of 400 completed surveys, or perhaps a little less or more; I don’t know. But in the case of DA reader reviews, only three of these 72 banks have more than 200 reviews registered on our site, and none has more than 300.

Second, J.D. Power’s survey is statistically and randomly sampled from banking customers across the country. This differs significantly from the pool of DA reviewers in a couple of ways. For one, DepositAccounts reviewers, or at least a subset of them, may be more banking-engaged (and presumably more knowledgeable and rate-savvy) than the average U.S. consumer.

Additionally with DA reviews, individuals have self-selected to participate, presumably because they feel strongly enough about a bank to want to share their good or bad experience with others. This type of review is quite different from the random selection process J.D. Power employs. It’s fairly defensible, for instance, to assume a self-selecting survey will naturally draw more respondents who have a strong opinion on the issue, and few who feel neutral. It might also be argued that readers with a strong negative experience with a bank are the most likely of all to write a review (anger tends to spur more action in human beings than does happiness or contentment), and therefore might depress scores more than a random sample would.

Still, it’s fun to mash up the J.D. Power scores alongside the DA reader reviews, keeping in mind this is not a scientific comparison. We did this by compiling all of the J.D. Power banks for which we had at least ten DA reviews, for a total of 36 banks. Here’s what we found.

JD Power scores vs DepositAccounts.com reviews

What we see are some definite instances of correlation. For instance, the top three DepositAccounts reviews in this group appear within the top four J.D. Power slots. And if you look at how the five blue circles appear from top to bottom versus where the red circles appear in the vertical, DA’s top-reviewed banks do sit higher in the list and the reds, lower. (One note about Capital One: Since it merged its physical and internet operations, only a single review score is available for the entire bank.)

Put a bit more numerically, if you average the J.D. Power ranks of DA’s five blue circles, you get an average rank of about 8. Do the same with DA’s five worst-reviewed banks and the average J.D. Power rank is 18. So in general terms, higher DepositAccounts reviews correlate with higher J.D. Power scores, and vice versa.
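That back-of-the-envelope comparison is just two averages. As a sketch (the ranks below are invented purely to mirror the roughly 8-versus-18 result described above, not the actual data):

```python
from statistics import mean

# Hypothetical J.D. Power ranks (1 = best of the 36 banks) for DA's five
# best-reviewed and five worst-reviewed banks. Invented for illustration.
top_reviewed_ranks = [2, 4, 7, 11, 16]       # DA's five best-reviewed banks
bottom_reviewed_ranks = [9, 14, 19, 22, 26]  # DA's five worst-reviewed banks

print("Average J.D. Power rank, DA's best-reviewed:", mean(top_reviewed_ranks))
print("Average J.D. Power rank, DA's worst-reviewed:", mean(bottom_reviewed_ranks))
```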

You can also see that in the “Below Average” section of the J.D. Power rankings, all of the DA review scores are below 3, which is considered the average rating on a 5-star scale. So DA readers, too, collectively deemed these banks as “below average”.

Of course, there is some dissonance between the rating systems when you look closely at individual scores. For instance, J.D. Power’s highest-scored bank in this group – United Community Bank – gets a below average rating from DA readers. But notice that this is based on just 12 individuals’ opinions. Same thing at the other end of the spectrum for Fulton Bank: it has the second-lowest DA score in this list, yet appears as “better than most” in J.D. Power’s study. Again, this is based on 12 reader reviews.

As a foil to this, look at the four large banks where we have over a hundred reviews. Here, our reviews correlate fairly closely with J.D. Power’s assessments: Capital One is the best of these big banks, Chase is about average, and Bank of America and Wells Fargo are in the “below average” tier. We agree!

As I outlined earlier, there are various structural problems with comparing J.D. Power’s scores side-by-side with DA reviews, so it’s not surprising we don’t see lock-step alignment across all 36 banks. But we’re encouraged at how much correlation the analysis did turn up.

Focusing the lens on the biggest of the big banks

We also took a look at the Top 10 largest retail banks, to see which ones customers like most and least, according to their J.D. Power scores. You can see below that the Top 10 bank appearing in the smallest number of regions – Capital One, in three regions – earned the highest average score. Meanwhile, the nation’s second- and third-largest banks – Wells Fargo and Bank of America, respectively – were scored in all 11 regions, and earned the lowest average scores of the Top 10.

While there is certainly some downward pressure on scores from being surveyed by, say, 50,000 customers versus 5,000 customers, you can compare apples-to-apples by looking at Chase’s average J.D. Power score. It, too, appears in all 11 regions of the study and still scored 12 points higher than Wells Fargo and Bank of America. Similarly, PNC earned an average of 828 from six regional scores, beating out four Top 10 banks that only competed in four or five regions.

satisfaction with largest banks

Another thing that’s hard not to notice is how poorly Bank of America and Wells Fargo rate, ranking in the “Below Average” tier. Also notable: no Top 10 bank appears in J.D. Power’s “Best of the Bunch” tier.

The elephant in the room… Rates

Regular DepositAccounts readers will be able to anticipate the general outcome of the last facet we looked at in this analysis, and that’s whether any of these big banks have competitive rates. I say it’s predictable because these are 1) very large banks, and 2) brick-and-mortar operations, and regulars here know that online banks generally have much better rates, as do credit unions. And among brick-and-mortar banks, smaller and medium-sized banks generally trump large institutions on rates. (You can see our analysis of this here, which we will soon be updating with current rates.)

But whether or not we can guess the outcome, we pulled the best available rates from each of these 72 brick-and-mortar banks anyway, for three products: their best savings or money market account, their standard 1-year CD, and their standard 5-year CD. After a rate review, the findings did not buck our expectations.

rates at largest banks

In short, none of these banks offer much in the way of rates to recommend you open an account with them. Even considering the respectable 5-year CD rates, a search through DepositAccounts’ CD rate tables will lead you to more than a handful of nationally available banks – either online banks or smaller institutions – that pay upwards of 2.25% APY, making 2.00% APY a losing proposition.
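To put that quarter-point gap in dollar terms, here’s a quick compound-interest sketch in Python. The $10,000 deposit is a hypothetical amount; the 2.00% and 2.25% APYs are the figures compared above:

```python
# Dollar impact of a 0.25-point APY gap on a 5-year CD.
# The $10,000 deposit is hypothetical; the APYs come from the comparison
# of typical large-bank rates vs. readily available online-bank rates.

def cd_value(principal: float, apy: float, years: int) -> float:
    """Future value of a CD, compounding the quoted APY annually."""
    return principal * (1 + apy) ** years

deposit = 10_000.00
big_bank = cd_value(deposit, 0.0200, 5)  # 2.00% APY
online = cd_value(deposit, 0.0225, 5)    # 2.25% APY

print(f"2.00% APY after 5 years: ${big_bank:,.2f}")
print(f"2.25% APY after 5 years: ${online:,.2f}")
print(f"Difference: ${online - big_bank:,.2f}")
```

On these assumptions, the higher APY earns roughly $136 more over the five years, which is why settling for the lower rate is a losing proposition.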

One important note is required here, though. Some brick-and-mortar banks do not publish their rates online, so we could not assess those. Also, some physical banks occasionally offer special promotional rates that can be a worthy deal. Savers should always keep their eyes open for those. But in terms of their general menu of standard rates, few of these 72 banks with published rates listed anything worth a savvy saver’s attention.

Final take-aways

Landing on the note of rates is perhaps not the best endpoint to this analysis, as there are other worthwhile reasons to hold an account at one of these large banks. I know that, personally, I typically keep one Chase account open, though I hold virtually none of my money there. Having the Chase account provides me with a local branch I can access should I need it, and the account itself was often the result of scoring a cash bonus for opening it.

Even for the smartest capital-preserving savers, who keep all of their deposit funds in internet banks and credit unions at the highest rates they can find, holding an account with one large bank that operates a physical branch in their community can provide some benefits. And now you know which of those large institutions across the country make their customers happiest.

Related Pages: banking tools and data
Previous Comments
Comment #1
The only car I drive is a Ford Explorer. I simply adore cars! Thanks for writing this article about them. It's simply refreshing to hear about something different (I came on here to read about banking stuff :)
Comment #2
Thanks Sabrina, I read it all. However, J.D. Power is no longer as independent as they claim to the public.
They receive hidden contributions from many institutions for favorable ratings. The best rating goes to the highest bidder, so I stopped following their ratings and awards; they became biased and overrated.
They belong to The McGraw-Hill Companies, Inc. and are driven by profit now, and the impartiality is gone forever.
Comment #3
I forgot to mention that the global arm of J.D. Power was sold last year to XIO Group for $1.1 billion. They control J.D. Power reports on a global basis, and every award or review must be approved by the new owners before publishing.

Comment #4
All three of our banks are in the bottom tier. We have checking, savings and direct deposits at one, CDs at all of them at one time or another, and have never lost a dime or a minute of time with any bank. Where do these ratings come from?
Bottom Tier Poster
Comment #7
Perhaps the dings come from the lending side rather than the savings side?
Comment #10
Many people never have any problems with their bank, and thus have few or no complaints. For the unlucky people who do run into one or more of the many possible kinds of bank-related problems, that's when they find out just how helpful or unhelpful the bank's customer service really is.
Sabrina Karl
Comment #13
Assets1, it is sometimes hard to reconcile our personal experience with a business against what a large group of others might contradictorily be saying about it. Sounds like that's happened here for you. But one thing I want to point out is that these scores are on a 1,000-point scale, so a score of 800, while on the low end of the rankings, still represents an 80% score. So mostly, people are reasonably satisfied. They just happen to feel a lot better about their interactions with Frost Bank (85.2% score) than they do with Old National (76.6%). Or put another way, Frost Bank has far fewer dissatisfied customers than Old National, but Old National still has more happy customers than angry ones.
Comment #5
Making a profit is all I care about (for profit and profit only), whether it be Rockland Trust (I forget why I signed up), Wells Fargo (5% G/G/D credit card), Webster Bank (sign-up bonus), or BofA (HELOC). The rest (e.g., rankings) doesn't mean much to me.

But thanks for the effort. :-)

Comment #8
Two words: CREDIT UNION! :)
Comment #9
So far as I am concerned, J.D. Power has no credibility. Any rating of banks which takes no account of internet banks is deficient.
Sabrina Karl
Comment #12
I'm with you MidAtlantic, on internet banks (or credit unions) being superior from a rates perspective. But JD Power's ratings are a broad customer satisfaction survey, not a rates survey. Its study digs into what consumers' feelings are about the brick-and-mortar banks they do business with, and how satisfied they are with customer service, including inside the branch. Internet banks couldn't be fully scored using their long-running 1,000-point index, because it includes questions that wouldn't pertain to a bank with no branches. So we take it for what it is--an assessment of how happy American consumers are with various brick-and-mortar banks, which can still be interesting.
Comment #15
Nobody cares about or reads J.D. Power ratings if a bank offers a 5% CD rate; we will all run there without reading or thinking about any ratings of the bank.
Same with any consumable goods: if a company offers half price on any product, we couldn't care less about the ratings of the company that sells it.
Comment #17
The only reports about cars, banks, or other products that I trust come from Consumer Reports. Totally unbiased, factual reports, and every product tested is purchased just as you and I would from the store or dealer (no samples allowed). Questionnaires are sent to subscribers asking for their opinion of their purchase, without any form of monetary compensation or promises. JDP accepts contributions and product samples. That is a biased and flawed system that guarantees favorable outcomes for the contributors.
Comment #18
No methodology is perfect. ;-) http://www.allpar.com/cr.html
Comment #31
Really? So a site that exclusively peddles Chrysler products is a valid source, reference standard, and authority for judging a highly respected nonprofit organization whose sole mission is to look after the best interests of consumers? Amazing and mighty gullible.
Comment #32
If you actually bother to read it, it makes multiple valid points regardless of its motivation.
Comment #33
If you would actually stop making assumptions, you would realize that I read it entirely. A site that promotes mediocre Chrysler products will be biased when those products consistently get bad reviews.
Comment #19
While Consumer Reports "may" be unbiased, they are not the deciding factor whether or not I purchase a particular product. Some of the trivial things they ding a product for, I couldn't care less about.

Ann brings up a good point too. "No methodology is perfect" and any survey can be skewed to suit a desired outcome. The Federal Reserve and their inflation rate is a prime example!
Comment #20
Oh! I thought this article was only about cars.

I moved down on the page and it is about other stuff too!
Comment #21
For young people banks, as many know them, are obsolete relics of horse and buggy days. The iPhone might as well be called iBank.
Comment #22
And just what do you think they do with their iPhones? Carry on financial transactions with BANKS.
Comment #23
Maybe gonegishing means brick and mortar branches.
Comment #24
The basic problem with surveys is selection bias. Last I checked, nobody from J.D. Power ever called to ask whether I was satisfied with the service I received from a particular bank or credit union. Indeed, J.D. Power would have no way of knowing how to sample, inasmuch as financial institutions do not publish lists of account holders. Garbage in, garbage out.
Comment #25
And if they called me, I would first ask to see all the results so that I could make an informed decision. Like the infamous health plan surveys... I ask, can I see a redacted survey on my specific doctors? No. Then I say I can't make an informed decision, and all "you" are interested in is data for global marketing purposes... and we know they have the individual doctor results, because the doctor gets dinged if they're negative AND a non-response from us is, by them, deemed neutral. I said a non-response is adverse.
Comment #26
Declaring that "a non-response is adverse" would make all surveys of any kind meaningless. Everything being evaluated would have abysmal scores regardless of anything else, if you do that. Large percentages of people don't want to bother with surveys just because they have other things they want or need to be doing instead. And if you don't even get far enough into it to find out what companies/products that person has experience with that they don't feel like bothering to tell you about, this absurd scheme wouldn't even know which ones to assign that 'adverse' demerit to.
Comment #27
Ann, your comment nailed it! Exactly why a non-response should always be considered neutral.
Comment #28
Bogey, I give polls the grain of salt they deserve (a toss over the shoulder to ward off bad luck). Let's begin.

Land-line phone polls: OK, how many folks aged 18-34 have landlines to begin with? Inherent selection bias.

Land-line phone polls for folks with land-lines: OK, we're mainly talking older folks, who might (or might not) answer their phones. I know we routinely route any calls to our answering machine.

Land-line phone polls for folks who pick up their phones: Seriously, folks who respond are truly dumb, bored, or need company. Folks who respond are especially vulnerable to "push polls", i.e., "Don't you agree Democrats who advocate killing babies are evil"? Problem being, pro-choice folks are seldom evil.

I gave up on polls long ago.
Comment #30
I always hang up on telephone survey polls, and rarely answer polls on internet web pages. Even then, sometimes I check the opposite of what my opinion or selection would really be. How many other people do the same? So much for the accuracy of survey polls, eh?

An example of skewing polls to suit the outcome and make great headlines:
The Democratic party had Hillary all but crowned President right up into the eve of the general election last November. Well when Democrats poll Democrats what else does a person expect the polls to indicate?
Comment #29
Just my humble opinion, but I don't believe this survey is relevant to DA readers/bloggers.
I took a look at the 5-year CD rates of their best-rated institutions, and only Capital One had any reasonable CD rates. The other four were abysmal. Capital One's brick and mortar branches are quite convenient for me, and I have had accounts with them in the past. However, previously there was always a 6-month guarantee on their money market account, and now their rate of 1.0% for a minimum of 10K is too low and can change at any time. With regard to CDs, I have found more competitive rates either from the blog or by researching on my own.

Unfortunately, this "survey" appears to be more of an advertisement for the institutions that garner its top ratings.
