Sunday, May 7, 2023

ChatGPT - OpenAI: Allowing lower bank equity requirements against what’s perceived safe than against what’s perceived risky, could that distort the allocation of bank credit?

I asked ChatGPT:

"When lending banks take into account what they perceive as safe and as risky. If regulators allow lower bank equity requirements against what’s perceived safe than against what’s perceived risky, could that distort the allocation of bank credit?"

ChatGPT answered:

"Yes, if regulators allow lower bank equity requirements against what is perceived as safe than against what is perceived as risky, it could distort the allocation of bank credit. Banks have an incentive to allocate more credit to what is perceived as safe, even if it is not the most economically productive use of that credit, in order to take advantage of the lower equity requirements.

This could lead to a misallocation of credit, as potentially more productive or riskier investments may not receive the necessary funding. Moreover, it could lead to an overconcentration of credit in certain areas, which could create systemic risks in the banking system.

To mitigate these risks, bank regulators should ensure that equity requirements are appropriately set based on the risk of the underlying assets. This would help ensure that banks allocate credit based on economic merit rather than regulatory arbitrage, which is essential for a healthy and stable banking system."
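To see the arithmetic behind that distortion, here is a minimal back-of-the-envelope sketch in Python, not taken from ChatGPT or from my post. The helper function and all the numbers are my own illustration: an 8% base requirement with risk weights of 20% for what’s perceived safe and 100% for what’s perceived risky (roughly in the spirit of the Basel standardized approach), plus purely hypothetical yields and funding cost. The only point is that a lower-margin "safe" exposure can still deliver the higher return on equity.

# A back-of-the-envelope sketch of how risk-weighted equity requirements
# can tilt a bank's return on equity (ROE) toward "perceived safe" assets.
# The 8% base requirement and the 20% / 100% risk weights roughly mirror a
# Basel-style standardized approach; the yields and the funding cost below
# are purely hypothetical assumptions chosen for illustration.

def return_on_equity(asset_yield: float, funding_cost: float, equity_ratio: float) -> float:
    """ROE on a 1-unit exposure funded with `equity_ratio` equity and the rest debt."""
    profit = asset_yield - funding_cost * (1.0 - equity_ratio)
    return profit / equity_ratio

BASE_REQUIREMENT = 0.08   # assumed 8% of risk-weighted assets
FUNDING_COST = 0.03       # assumed cost of the bank's debt funding

portfolios = {
    # name: (assumed gross yield, assumed risk weight)
    "perceived safe (e.g. highly rated)": (0.04, 0.20),
    "perceived risky (e.g. unrated SME)": (0.07, 1.00),
}

for name, (gross_yield, risk_weight) in portfolios.items():
    equity_ratio = BASE_REQUIREMENT * risk_weight
    roe = return_on_equity(gross_yield, FUNDING_COST, equity_ratio)
    print(f"{name}: equity {equity_ratio:.1%}, yield {gross_yield:.1%}, ROE {roe:.1%}")

# Output with these assumed numbers:
#   perceived safe (e.g. highly rated): equity 1.6%, yield 4.0%, ROE 65.5%
#   perceived risky (e.g. unrated SME): equity 8.0%, yield 7.0%, ROE 53.0%

Under these assumptions the "risky" loan pays a margin three percentage points higher, yet the "safe" loan earns the better return on equity, purely because it can be financed with so much less equity. That is the regulatory distortion of credit allocation the question refers to.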

Note: The answer is copied exactly as given to me by ChatGPT.

PS. I’ve been arguing for decades against bank regulations based on the notion that what’s perceived as risky is more dangerous to our bank systems than what’s perceived as safe. For instance, in 2012, the Financial Times’ chief economics commentator Martin Wolf gave testimony to that effect.