The old payday loan application was a few questions on a clipboard, a paystub, and a verbal back-and-forth with a clerk who reckoned they could spot a deadbeat across the counter. The new version is a tap on a phone and an answer in eight seconds, generated by models that had already pulled hundreds of signals about you before you finished the form.
Faster isn’t the same thing as fairer, and that distinction is where the debate over algorithmic underwriting gets serious.
What the models are actually looking at
Modern alternative-data lenders ingest far more than a credit score. When consumers grant access through Plaid-style APIs, they look at bank account inflows and outflows; they also pull mobile phone metadata, device type, typing cadence on the form itself, geolocation history, social graph proxies, utility payment records, and behavioral patterns from prior visits to the lender’s site. Some models score the time of day you applied, on the theory that two a.m. applicants default at higher rates. The pitch is that this richer data set extends credit to thin-file borrowers who would have been declined under traditional underwriting. The empirical record on that claim is genuinely mixed, but the technological direction is unmistakable.
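To make that concrete, here is a hedged sketch of the kind of feature record such a model might score. Every field name and value is invented for illustration; no particular lender’s schema is implied.

```python
# Hypothetical feature record for one applicant, assembled from the kinds of
# signals described above. Purely illustrative, not any real lender's schema.
applicant_features = {
    "avg_monthly_inflow": 2140.50,     # bank-linked cash flow (Plaid-style API access)
    "overdrafts_last_90d": 2,
    "device_type": "android",
    "form_typing_seconds": 118,        # typing cadence on the application form itself
    "application_hour_local": 2,       # the two a.m. signal mentioned above
    "utility_on_time_ratio": 0.92,     # utility payment history
    "prior_site_visits_30d": 4,        # behavioral pattern from earlier visits
}
```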
Where the speed comes from
The underwriting itself is mostly gradient-boosted decision trees and, increasingly, neural networks tuned on millions of prior loan outcomes. These models output an approval probability and a risk-adjusted price almost instantly because the heavy computation happened during training, not during your application. Decisioning that used to take a human loan officer twenty minutes now takes as long as a server needs to run inference, measured in milliseconds. The net result is a customer experience indistinguishable from one-click checkout, and that frictionlessness is itself a regulatory concern, because impulse borrowing is one of the consumer harms payday rules were designed to slow down.
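A minimal sketch of that split between slow training and fast scoring, using synthetic data and an off-the-shelf gradient-boosting estimator as stand-ins rather than any lender’s actual pipeline:

```python
# The expensive work (fitting a gradient-boosted model on historical loan
# outcomes) happens offline, long before you apply. Scoring one application
# is a single inference call, which is why the decision feels instant.
import time
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X_history = rng.normal(size=(20_000, 20))                                  # past applicants' features
y_history = (X_history[:, 0] + rng.normal(size=20_000) > 1).astype(int)    # 1 = defaulted (synthetic)

model = GradientBoostingClassifier().fit(X_history, y_history)             # slow step, done in advance

new_applicant = rng.normal(size=(1, 20))                                   # the form you just submitted
start = time.perf_counter()
default_prob = model.predict_proba(new_applicant)[0, 1]                    # fast step, at request time
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"default probability {default_prob:.2f}, scored in {elapsed_ms:.1f} ms")
```

The fit call is the part measured in minutes or hours; the single predict_proba call is what happens inside the eight seconds you experience.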
What gets harder to see
Algorithmic underwriting also moves discrimination into territory that’s much harder to audit. Models that don’t explicitly use race or gender can still produce racially disparate approval rates because zip code, device type, and language settings carry that information indirectly. Federal regulators, including the CFPB, have made clear that lenders relying on complex models still owe applicants adverse action notices specific enough to satisfy the Equal Credit Opportunity Act, but proving that a model treated similarly situated borrowers differently is genuinely hard when the model has thousands of features. Consumers almost never know which signal sank their application, and lenders themselves often can’t fully explain individual decisions. That opacity is the price of the speed.
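One basic check advocates and regulators reach for is a disparate-impact comparison: approval rates across demographic groups the model never saw explicitly, with the “four-fifths rule” as a rough flag. The sketch below uses invented decisions; assembling real decisions joined to demographic data is exactly the hard part.

```python
# A toy disparate-impact audit. The decisions and group labels here are
# fabricated for illustration only; a real audit requires actual lending
# outcomes linked to demographic information.
import pandas as pd

decisions = pd.DataFrame({
    "approved": [1, 0, 1, 1, 0, 0, 1, 0, 1, 0],
    "group":    ["A", "B", "A", "A", "B", "B", "A", "B", "A", "B"],
})

rates = decisions.groupby("group")["approved"].mean()   # approval rate per group
adverse_impact_ratio = rates.min() / rates.max()        # below ~0.8 is the conventional red flag
print(rates)
print(f"adverse impact ratio: {adverse_impact_ratio:.2f}")
```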
Bottom line
Algorithmic underwriting is faster, often cheaper for lenders to operate, and sometimes broader in who it approves. It is also a black box that has shifted the burden of proving fairness onto regulators and consumer advocates who are still catching up to the math. Borrowers should treat instant approval as a feature, not a virtue. The old clipboard had its problems. The new model has different ones, and they’re harder to see by design.