Great Southern Bank prepares to reveal first AI agents

Great Southern Bank will begin deploying its first AI agents after a multi-year data modernisation programme, ITNews reports. The bank consolidated three legacy data warehouses onto a Databricks Lakehouse with Unity Catalog and Databricks Genie, a programme its head of customer technology, data and AI, Matt Cammack, credits with cutting some reporting tasks from days to hours and with enabling the agent rollout. Editorial analysis: This case illustrates how large data-platform investments often precede practical AI agent deployments in regulated industries.
What happened
Great Southern Bank will start rolling out its first AI agents after a multi-year data modernisation programme, ITNews reports. The bank's head of customer technology, data and AI, Matt Cammack, said the organisation began a "scorched earth" platform modernisation in 2021 to address data spread across legacy systems that had evolved over 75 years. Cammack told ITNews the bank's reporting obligations changed when assets hit $20 billion in 2024, and that by 2025 improved data quality enabled more confident financial analysis and better capital allocation. Cammack is quoted as saying, "We are now accelerating into new AI use cases," and "We're about to start deploying our first agents."
Technical details
According to ITNews, the bank moved off three main data warehouses to a single-view architecture built on the Databricks Lakehouse and Unity Catalog for governance. ITNews reports the bank is using Databricks Genie natural language BI modules on top of the Lakehouse and has ingested both structured and unstructured small-business customer data to automate business assurance tasks. The article states some reporting tasks that previously took days now complete in hours, and that the bank has rebuilt third-party AI models in Databricks to internalise modelling work.
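The core move described above, retiring several warehouses in favour of a single queryable layer, can be sketched in miniature. This is an illustrative stand-in, not the bank's implementation: the table names, columns, and use of SQLite are all invented for the example, with SQLite standing in for the Lakehouse.

```python
import sqlite3

# Hypothetical sketch: three legacy warehouse extracts consolidated behind
# one unified view, mirroring the "single-view architecture" described above.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-ins for the three legacy warehouses' customer extracts.
for table in ("warehouse_a_customers", "warehouse_b_customers", "warehouse_c_customers"):
    cur.execute(f"CREATE TABLE {table} (customer_id TEXT, balance REAL)")

cur.execute("INSERT INTO warehouse_a_customers VALUES ('C001', 1200.0)")
cur.execute("INSERT INTO warehouse_b_customers VALUES ('C002', 530.5)")
cur.execute("INSERT INTO warehouse_c_customers VALUES ('C001', 300.0)")

# A single view over all three sources: reporting queries one place, not three.
cur.execute("""
    CREATE VIEW unified_customers AS
    SELECT customer_id, balance, 'a' AS source FROM warehouse_a_customers
    UNION ALL
    SELECT customer_id, balance, 'b' AS source FROM warehouse_b_customers
    UNION ALL
    SELECT customer_id, balance, 'c' AS source FROM warehouse_c_customers
""")

rows = cur.execute(
    "SELECT customer_id, SUM(balance) FROM unified_customers "
    "GROUP BY customer_id ORDER BY customer_id"
).fetchall()
print(rows)  # [('C001', 1500.0), ('C002', 530.5)]
```

Once reporting runs against one layer, the speedups the article describes (days to hours) come largely from eliminating cross-warehouse reconciliation steps like the manual stitching this view replaces.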
Industry context
Editorial analysis: Companies in regulated financial services frequently invest several years in data consolidation and governance before deploying production AI, because reporting accuracy, auditability, and regulatory requirements create high barriers to model rollout. Industry practitioners will recognise the pattern where a Lakehouse governance layer like Unity Catalog is used to unify access controls, lineage, and metadata before adding natural-language or agentic layers.
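The governance-before-agents pattern can be sketched with a toy catalog. This is not Unity Catalog's actual API; the class names and methods below are invented to show the shape of the idea: a central registry of ownership, lineage, and access grants that every downstream consumer, including an agent, must pass through.

```python
from dataclasses import dataclass, field

# Toy governance catalog (invented API, illustration only): centralise access
# control, lineage, and metadata before layering agentic access on top.
@dataclass
class CatalogEntry:
    owner: str
    upstream: list = field(default_factory=list)   # lineage: source tables
    grants: set = field(default_factory=set)       # principals allowed to read

class Catalog:
    def __init__(self):
        self._entries = {}

    def register(self, table, owner, upstream=()):
        self._entries[table] = CatalogEntry(owner=owner, upstream=list(upstream))

    def grant(self, table, principal):
        self._entries[table].grants.add(principal)

    def can_read(self, table, principal):
        entry = self._entries.get(table)
        return entry is not None and principal in entry.grants

    def lineage(self, table):
        return self._entries[table].upstream

catalog = Catalog()
catalog.register("finance.reports", owner="data-team",
                 upstream=["raw.ledger", "raw.transactions"])
catalog.grant("finance.reports", "reporting-agent")

# An agent's query path checks the catalog first, so every access is
# auditable and traceable back to the source tables.
print(catalog.can_read("finance.reports", "reporting-agent"))  # True
print(catalog.can_read("finance.reports", "unknown-agent"))    # False
print(catalog.lineage("finance.reports"))  # ['raw.ledger', 'raw.transactions']
```

The design point is that natural-language or agentic layers inherit these controls rather than defining their own, which is why regulated organisations typically build the catalog first.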
Implications for practitioners
For practitioners evaluating enterprise AI rollouts, this case underscores the operational dependencies between data hygiene, governance, and downstream agent use cases. Editorial analysis: Organisations attempting similar deployments typically see initial ROI in reduced reconciliation and reporting effort, then apply the same platform to forecasting, stress testing, and scenario planning workloads.
What to watch
Observers should watch for published follow-up details about the scope of the bank's agents, the data sources exposed to those agents, audit and governance controls applied to agent outputs, and any third-party vendor disclosures. ITNews does not provide technical specs or an exhaustive list of models and safeguards, and the bank has not published a public technical report linked from the article.
Scoring Rationale
This is a notable, practitioner-relevant example of a regulated organisation moving from data modernisation to agent deployment. It is not a frontier-model announcement, but it provides actionable evidence about platform choices and operational outcomes.