Wednesday, November 1, 2023
In President Biden’s “Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence” of October 30, I read: “Provide clear guidance… to keep AI algorithms from being used to exacerbate discrimination.”
I asked ChatGPT - OpenAI
"Those perceived as risky have always found it harder and more expensive to access bank credit than those perceived as safe. That very natural risk aversion has always discriminated against the “risky” and in favor of the “safe”.
But current bank regulators have also imposed on banks the algorithm of risk-weighted capital/equity requirements, decreeing higher risk weights for those perceived as risky than for those perceived as safe.
With that, have regulators not added an artificial layer of risk aversion that exacerbates the discrimination against the access to bank credit of those perceived as risky?"
ChatGPT - OpenAI answered:
"The point you raise touches on the complex relationship between financial regulation and risk assessment in the banking sector. Let me break it down for a better understanding.
Historical Risk Perception: It is true that historically, banks have been risk-averse and tend to favor borrowers or assets perceived as safe. This is because banks are responsible for safeguarding depositors' money and must ensure the safety and soundness of their operations. Lending to riskier borrowers or investing in riskier assets can threaten the stability of the financial system.
Risk-Weighted Capital Requirements: The concept of risk-weighted capital requirements is a regulatory framework designed to ensure that banks maintain adequate capital to cover potential losses based on the riskiness of their assets. Under this framework, riskier assets are assigned higher risk weights, which means banks have to hold more capital against them.
Now, your question focuses on whether these risk weights exacerbate discrimination against riskier borrowers. Here are a few considerations:
Pros:
Safety and Soundness: Risk-weighted capital requirements help ensure that banks have enough capital to absorb potential losses on their riskier assets. This contributes to the overall stability of the financial system.
Market Discipline: By assigning higher capital requirements to riskier assets, regulators encourage banks to be more cautious in their lending and investment decisions, which can help prevent excessive risk-taking.
Cons:
Risk Discrimination: As you pointed out, risk weights can lead to discrimination against riskier borrowers because banks may be less inclined to lend to them or may charge higher interest rates to compensate for the higher capital costs.
Pro-Cyclicality: In times of economic stress, these capital requirements can become pro-cyclical, meaning that they may exacerbate economic downturns by causing banks to pull back on lending to riskier borrowers, further restricting credit availability.
Subjectivity: The determination of risk weights can be subjective and complex, which may lead to misclassifications or regulatory arbitrage.
In summary, the use of risk-weighted capital requirements is a trade-off between maintaining the stability of the financial system and potentially exacerbating discrimination against riskier borrowers. Regulators aim to strike a balance, but the effectiveness of these regulations and their impact can vary depending on the specific circumstances and the quality of their implementation. It's an ongoing challenge to refine and improve these regulatory frameworks to address these concerns while maintaining financial stability."
Thanks!
"You're welcome! If you have any more questions or need further clarification on any topic, please feel free to ask. I'm here to help!"
A comment: the AI states that “Lending to riskier borrowers or investing in riskier assets can threaten the stability of the financial system.”
But how does that square with the fact that the large bank exposures which detonated major bank crises were all built up with what’s perceived as safe, and never ever with what’s perceived as risky?
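To make the arithmetic behind the question concrete, here is a minimal sketch of how risk-weighted capital requirements translate into very different capital burdens and permitted leverage. The 8% base ratio and the 20% vs. 100% risk weights are the standardized Basel II figures for a highly rated versus an unrated exposure; the $100 loan size is purely illustrative.

```python
# Minimal sketch of Basel-style risk-weighted capital requirements.
# Assumed figures: 8% base capital ratio; 20% risk weight for a "safe"
# (highly rated) exposure, 100% for a "risky" (unrated) one — the
# standardized Basel II values. The exposure amount is illustrative.

BASE_CAPITAL_RATIO = 0.08  # minimum capital as a share of risk-weighted assets


def required_capital(exposure: float, risk_weight: float) -> float:
    """Capital a bank must hold against a given exposure."""
    return exposure * risk_weight * BASE_CAPITAL_RATIO


def allowed_leverage(risk_weight: float) -> float:
    """Dollars of exposure permitted per dollar of capital."""
    return 1.0 / (risk_weight * BASE_CAPITAL_RATIO)


safe = required_capital(100.0, 0.20)    # 100 * 0.20 * 0.08 = $1.60
risky = required_capital(100.0, 1.00)   # 100 * 1.00 * 0.08 = $8.00

print(f"Capital per $100 'safe' loan:  ${safe:.2f}")
print(f"Capital per $100 'risky' loan: ${risky:.2f}")
print(f"Leverage allowed on 'safe':  {allowed_leverage(0.20):.1f}x")
print(f"Leverage allowed on 'risky': {allowed_leverage(1.00):.1f}x")
```

Under these assumptions a bank can leverage its equity 62.5 times against the “safe” asset but only 12.5 times against the “risky” one, which is the artificial extra layer of risk aversion the question describes.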
Note: The answers are copied exactly from those given to me by OpenAI