Why tomorrow’s finance professionals need Python, data storytelling and ethical judgement in equal measure
AI is quietly rewriting what it means to be a great analyst. In many banks and asset managers, models now scrape, clean and aggregate data at a speed no human can match. The real bottleneck is no longer calculation; it is interpretation. The value sits with people who can turn complex, AI-generated output into stories that leaders and clients can act on.
The data on skills is clear. The World Economic Forum’s Future of Jobs Report 2025 confirms that analytical thinking remains the single most sought-after core skill, with around seven in ten employers rating it essential, while roughly two fifths of current skill sets are expected to be transformed or obsolete by 2030. In parallel, LinkedIn’s most recent Workplace Learning and Skills on the Rise updates show sharp growth in creativity, strategic thinking and problem solving, alongside technical skills such as AI literacy and engineering basics. The signal is consistent: employers want people who can think, adapt and communicate, not just operate tools.
For financial services this shift is amplified by the spread of advanced AI. McKinsey’s 2025 State of AI survey reports that AI use is expanding rapidly into functions closest to the business, including finance, risk and product development. Yet only about one third of organisations have reached real scaling, and the firms that are pulling ahead are those that have defined when model outputs need human validation and how that oversight works in practice. AI increases the premium on human judgement rather than removing it.
A common misconception inside banks is that “everyone now needs to be a Python developer”. In reality, deep coding is likely to remain concentrated in a smaller slice of highly technical roles. Surveys of finance and accounting professionals by ACCA and the Institute of Management Accountants show that while over 70 percent recognise the importance of data and technology skills, fewer than 30 percent feel confident working directly with code or advanced analytics tools. For a large proportion of talent, intensive programming remains a barrier.
This does not make them obsolete; it reshapes the opportunity. Rather than forcing every relationship manager or product specialist through a full software engineering curriculum, banks can design complementary roles around three tiers of depth:
- Light literacy for the majority, enough to understand what Python, SQL and LLM-based tools can and cannot do, read simple queries or notebooks, and ask the right questions of model owners. Think of this as "code aware, not code fluent".
- Applied analytics for translators and product owners, where people can manipulate data using low-code tools, "vibe coding" assistants or prebuilt LLM notebooks, tweak parameters, interpret diagnostics and work closely with quants and engineers. Here, SQL basics and an ability to follow Python logic are important, but production-grade engineering is not.
- Deep engineering and modelling for specialists, who design architectures, build models, manage pipelines and own performance and robustness.
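To make the middle tier concrete, here is the kind of short Python sketch an "applied analytics" translator would be expected to read and tweak rather than write from scratch; the returns data and the 2.0 default threshold are hypothetical illustrations, not a production method:

```python
from statistics import mean, stdev

def flag_outliers(returns, z_threshold=2.0):
    """Flag returns more than z_threshold standard deviations from the mean.

    z_threshold is exactly the kind of parameter a translator might
    adjust and defend, without owning the surrounding pipeline.
    """
    mu, sigma = mean(returns), stdev(returns)
    return [r for r in returns if abs(r - mu) > z_threshold * sigma]

# Hypothetical daily returns (percent), with one obvious bad day
daily = [0.4, -0.2, 0.1, 0.3, -9.0, 0.2, -0.1, 0.5]
print(flag_outliers(daily))  # → [-9.0]
```

A code-aware reader does not need to rewrite this; they need to see that a single extreme day inflates the standard deviation, ask whether 2.0 is the right cut-off, and explain the flagged result to a client.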
This tiering opens up new opportunities. Colleagues who do not want to code deeply can still become highly valuable as data storytellers, AI product managers, risk and governance specialists or client advisers who sit at the intersection of human need and algorithmic capability. Demand is already rising for roles such as “AI risk officer”, “data product owner” and “analytics translator” in major banks.
Data storytelling is becoming non-negotiable across these tiers. Alpha’s accelerator programmes on business analytics now explicitly combine modelling with visualisation and storytelling, framing them as essential for data-informed decision making. A recent global survey on data storytelling reported that although nearly 90 percent of business professionals work with data weekly, two thirds feel anxious about using it and a sizeable minority avoid it because they cannot confidently extract insights. Organisations that invest in storytelling skills report performance uplifts and faster decision cycles.
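The storytelling point can be reduced to a minimal sketch: code does the arithmetic, the human chooses the framing and the headline. The metric name and figures below are hypothetical:

```python
def headline(metric, current, previous):
    """Turn two numbers into the one-line story a slide deck needs."""
    change = (current - previous) / previous * 100
    direction = "up" if change > 0 else "down"
    return f"{metric} is {direction} {abs(change):.1f}% versus last quarter"

print(headline("Net interest margin", 1.62, 1.50))
# → Net interest margin is up 8.0% versus last quarter
```

The hard part is not the percentage; it is deciding whether "up 8.0%" is the story, or whether the audience needs context on why, and what to do next.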
There is an ethical layer too. As models permeate credit scoring, fraud detection and portfolio construction, analysts and translators need a working grasp of explainable AI, bias risks and regulatory expectations. The World Economic Forum’s work on “human skills in an AI world” stresses that cognitive skills such as analytical thinking must be paired with systems thinking, collaboration and ethical awareness if organisations are to unlock a genuine human advantage in an AI-rich economy.
For financial institutions, the implication is straightforward. Job descriptions that talk only about spreadsheets, ratios and valuation techniques are already out of date. The core spine for the algorithmic workforce in finance is a blend of technical literacy at the right depth, analytical rigour, storytelling and ethics. A subset of specialists will go deep on Python, SQL and cloud; a much larger population will become “code aware” translators and advisers. The professionals who lean into that blend, at the level that fits their strengths, will not be the ones squeezed out by AI; they will be the ones designing how it is used and leading the conversations that matter.