The industry is insurance. If you offload your due diligence to an AI that's wrong, the company can go underwater. What if the LLM tells you it's found a great market with little competition, high-net-worth individuals, and so on, and you don't check, and end up making all your sales in the hills of California in wildfire country?
Not OP, but I could see this being short-term savings on the cost of sourcing/generating risk data, leading to bonuses before the deficiencies in that risk model show up in claims long term.
Or do you mean they succeed by promising lies via AI?