AI Is Reshaping the Ladder Into Financial Services — So What Should We Train Apprentices For?

For decades, the financial services profession followed a familiar developmental path.

Junior entrants learned through repetition.

They prepared reports.
Updated spreadsheets.
Sat in meetings.
Shadowed senior advisers.
Observed difficult conversations.
Watched mistakes unfold.
Gradually absorbed judgment through exposure, supervision, and experience.

It was not always efficient.
But it developed professional instincts.

Now AI is changing that ladder — rapidly.

Tasks that once occupied graduate trainees, paraplanners, junior analysts, and administrators can increasingly be completed in seconds:

  • report drafting,
  • research summaries,
  • suitability structures,
  • meeting notes,
  • compliance checking,
  • data extraction,
  • forecasting,
  • and document analysis.

The entry-level work that once acted as the training ground for future professionals is disappearing.

And that creates a serious structural question for the future of financial services:

How do we develop professional judgment if AI removes much of the repetitive work that historically produced it?

Tony Frost’s recent paper on legal professionals captures the challenge well:

“AI simultaneously increases the need for judgment and erodes the experiences that produce it.”

The legal profession is not alone.

Financial planning, wealth management, banking, compliance, and insurance are now facing the same tension.

AI can produce technically competent outputs.
But clients do not ultimately pay professionals for formatting documents.

They pay for:

  • judgment,
  • discernment,
  • perspective,
  • accountability,
  • ethical reasoning,
  • emotional steadiness,
  • and decision-making under uncertainty.

That changes what junior development should look like.

Historically, many firms trained juniors in low-value administrative tasks because those tasks were commercially necessary.

But in an AI-enabled environment, firms have an opportunity to redesign apprenticeship itself.

Instead of spending years buried in procedural work, future entrants could be exposed far earlier to:

  • strategic thinking,
  • systems thinking,
  • behavioural finance,
  • client psychology,
  • ethical tensions,
  • vulnerability recognition,
  • scenario analysis,
  • communication,
  • life transitions,
  • and human decision-making.

In other words, AI may allow firms to accelerate the development of genuinely human capability.

That is a major shift.

The future adviser may spend less time producing information and more time helping clients navigate complexity, ambiguity, stress, and change.

Because information is becoming abundant.
Judgment is becoming scarce.

This has important implications for the financial services industry.

For years, many firms recruited and trained around technical competence alone:

  • product knowledge,
  • regulation,
  • process adherence,
  • and operational efficiency.

All remain important.

But AI is rapidly commoditising procedural expertise.

The emerging value lies elsewhere:

  • trust,
  • interpretation,
  • coherence,
  • emotional intelligence,
  • strategic synthesis,
  • and helping clients think clearly.

The industry therefore faces a choice.

It can continue using AI primarily to reduce costs and compress junior headcount.

Or it can use AI to redesign professional development itself.

The firms that thrive may not be the ones with the fewest humans.

They may be the ones that develop the strongest humans.

That means rethinking apprenticeship models for the AI age:

  • exposing juniors to strategic-level discussions earlier,
  • involving them in client thinking rather than just administration,
  • encouraging reflective practice,
  • teaching ethical reasoning,
  • helping them understand systems and incentives,
  • and developing the uniquely human skills that AI cannot easily replicate.

The irony is that AI may ultimately make human capability more important, not less.

But only if firms consciously cultivate it.

Otherwise, the sector risks producing technically assisted professionals with shallow experiential depth — people who can operate systems, but struggle with judgment when complexity arrives.

And in financial services, complexity always arrives eventually.

The future jobs market will not reward humans for competing with machines at machine tasks.

It will reward humans for doing what humans uniquely do well:

  • contextual thinking,
  • empathy,
  • discernment,
  • creativity,
  • moral judgment,
  • relationship-building,
  • and helping others navigate uncertainty.

That is where future apprenticeship models need to move.

Not away from human development — but deeper into it.

The opportunity now is not simply to train better technicians.

It is to train wiser professionals.
