I'll apologize in advance for the rant in this blog post. I think that bankers and regulators should change the way they evaluate risk. I work with a lot of bankers and lenders to help our clients obtain and retain credit, both personal and business. We all know that in 2011 banks aren't lending (survey data bears this out), and they blame two factors:
- Lack of demand.
- Increased regulatory scrutiny – which feeds a doom spiral with the first factor, since the pool of creditworthy borrowers shrinks as regulatory scrutiny increases.
I have seen this firsthand in my own client base. People with great cash flow and credit history can't get a loan to finance growth (and create jobs) because they are in a bad industry (e.g. construction) or have other issues (e.g. too many real estate investments).
I would propose that the fundamental way of evaluating risk in lending decisions is all wrong, and until bankers get it right they are going to continue to post lackluster profits (and continue to cripple our economy) – see the McKinsey and Company survey of forecasted banking profits.
So – what's wrong? Currently, banks assume the worst-case scenario. They assume that you will default on everything, and they want to know how they will secure the outstanding balance via collateral (or via a government guarantee from the SBA). Not much consideration is given to credit history, cash flow, or industry – they are ancillary to the fundamental collateral position. So – if you have a lean balance sheet (either personal or business) but great cash flow, guess what? No loan. And there's the problem!
Let's revisit the fundamental risk equation: Risk = Impact × Probability. The way that banks currently use this model is as follows – on a $100,000 loan:
- Impact = $100,000 – the amount of the loan.
- Probability = 100% – because we're assuming the worst.
- Risk = $100,000 – therefore the bank requires a collateral position of at least this amount (regardless of the quality of the collateral, by the way – are accounts receivable really collateral? Inventory?)
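The contrast between the worst-case model and a data-driven one can be sketched in a few lines. This is a minimal illustration, and the 3% industry default rate used in the second call is a hypothetical number, not a figure from the post:

```python
def expected_loss(loan_amount: float, default_probability: float) -> float:
    """Risk = Impact x Probability: the collateral a lender would demand."""
    return loan_amount * default_probability

# The worst-case model: assume a 100% chance of default on a $100,000 loan.
worst_case = expected_loss(100_000, 1.0)    # $100,000 of collateral required

# A data-driven model using a hypothetical 3% default rate for the
# borrower's industry cohort:
data_driven = expected_loss(100_000, 0.03)  # $3,000 of expected loss
```

Same equation, wildly different lending decisions – the only thing that changed is whether the probability term reflects actual default history.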
Here's the flaw: 100% of loans don't default. All banks keep a history of default rates. And guess what: they have the ability to stratify default rates by industry, length of time in business, and all sorts of other factors that could give them better information about the probability of default on a given borrower. Why did we get into the mess we're currently in? We didn't use this data. Loans were made to folks who had little realistic probability of repayment – commercial real estate was viewed as a safe bet because it's collateralized and always appreciates. What's the flaw? Well, the households that got loans defaulted, and about every ten to fifteen years or so real estate doesn't appreciate – it depreciates, and there is a default.
So – how could a bank make money? Let's assume that in scenario one the bank makes one loan to a great borrower: $100,000 at 6% over five years. Using simple interest, that equates to interest income of $30,000. Great. Now assume the bank down the street – the one that knows how to evaluate risk and has worked with its regulators – makes ten loans of $100,000 at 8% interest (commanding an interest premium to offset risk – a common practice, and a price a business can and will pay) but suffers a 20% default rate. That bank makes $120,000: $40,000 of simple interest on each of the eight performing notes, less $200,000 of principal lost to the two defaults. The expected loss across the entire base – and therefore the collateral position required – would be $200,000 ($100,000 × 10 × 20%). Why not make the loan, especially if you're lending to industries whose data show a historically low default rate relative to the general business population? By the way, 20% is a ludicrously high default rate – even if you doubled the current commercial default rate at most banks you'd be at around 5%–7%, if that.
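The arithmetic in the two scenarios above can be checked directly. This sketch follows the post's simplifying assumptions: defaulted notes lose their full principal and earn no interest, and interest is simple rather than compounded:

```python
def simple_interest(principal: float, rate: float, years: int) -> float:
    """Simple (non-compounding) interest earned over the life of a loan."""
    return principal * rate * years

# Scenario one: a single loan to a "great" borrower.
bank_one_income = simple_interest(100_000, 0.06, 5)   # $30,000

# Scenario two: ten loans at a risk premium, with a 20% default rate.
loans, principal, rate, years, default_rate = 10, 100_000, 0.08, 5, 0.20
performing = round(loans * (1 - default_rate))         # 8 performing notes
interest_income = performing * simple_interest(principal, rate, years)
principal_lost = loans * default_rate * principal      # $200,000 written off
bank_two_income = interest_income - principal_lost     # $120,000 net
```

Even with a default rate several times worse than historical commercial norms, the diversified bank earns four times what the single cautious loan produces.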
So – what do you think it will take for banks to start lending again?