KPMG launches dashboard tracking consultants' AI use
Business Insider reports that KPMG has introduced a dashboard for its US advisory division that lets consultants see how often they and their peers use AI, and how that usage compares to target goals. The dashboard came online late last year and covers about 10,000 advisory workers; the firm told the publication that more than 90% of its US employees use AI weekly. Spokesperson Russ Grote is quoted as saying, "Our data shows regular AI users produce higher-quality work, feel less stressed, and spend more time on strategic work." Some employees, however, told Business Insider there is pressure to use AI and that the dashboard is easy to manipulate. A separate Business Insider report notes KPMG cut roughly 4% of its US advisory staff, about 400 roles, in late April.
What happened
Business Insider reports that KPMG rolled out an internal dashboard for its US advisory division that displays each consultant's AI usage against personal targets and peer-group averages. The tool came online late last year and applies to roughly 10,000 advisory employees; KPMG told the publication that more than 90% of its US employees use AI weekly. Spokesperson Russ Grote is quoted as saying, "Our data shows regular AI users produce higher-quality work, feel less stressed, and spend more time on strategic work." Business Insider additionally reports that some employees described pressure to use AI and said the dashboard can be easy to manipulate. Separately, the publication reported that KPMG reduced its US advisory workforce by about 4%, roughly 400 people, in late April 2026.
Editorial analysis - technical context
Industry-pattern observations: Corporations increasingly instrument employee workflows to measure AI adoption and output. Such dashboards typically integrate usage logs from sanctioned tools, single-sign-on events, and API call counts to produce per-user metrics, though published reporting does not give technical specifics of KPMG's telemetry. In practice, telemetry-based adoption metrics face well-known signal-to-noise problems: teams define "use" differently, session lengths vary widely, and scripted or superficial interactions can game the numbers.
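The definitional problem above is easy to demonstrate. The sketch below, using an entirely hypothetical log schema of (user, timestamp, session duration), shows how the same raw events yield different "weekly active" counts depending on whether trivial interactions are filtered out; nothing here reflects KPMG's actual telemetry.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical usage-log records: (user_id, ISO timestamp, duration_seconds).
# Field names, thresholds, and data are illustrative assumptions only.
LOGS = [
    ("alice", "2025-11-03T09:15:00", 540),
    ("alice", "2025-11-03T09:16:00", 5),    # trivial one-off prompt
    ("alice", "2025-11-10T14:02:00", 900),
    ("bob",   "2025-11-03T11:30:00", 4),    # scripted pings, easy to game
    ("bob",   "2025-11-04T11:30:00", 3),
    ("bob",   "2025-11-05T11:30:00", 2),
]

def weekly_active(logs):
    """Count distinct ISO weeks in which a user logged *any* event."""
    weeks = defaultdict(set)
    for user, ts, _ in logs:
        dt = datetime.fromisoformat(ts)
        weeks[user].add(dt.isocalendar()[:2])  # (ISO year, ISO week)
    return {u: len(w) for u, w in weeks.items()}

def weekly_engaged(logs, min_seconds=60):
    """Same metric, but ignore sessions shorter than a threshold."""
    weeks = defaultdict(set)
    for user, ts, dur in logs:
        if dur >= min_seconds:
            dt = datetime.fromisoformat(ts)
            weeks[user].add(dt.isocalendar()[:2])
    return {u: len(w) for u, w in weeks.items()}

print(weekly_active(LOGS))   # bob looks active under the naive definition
print(weekly_engaged(LOGS))  # bob disappears once trivial events are filtered
```

Under the naive definition, three scripted two-second pings make "bob" a weekly user; a minimum-duration filter removes him entirely, which is why a documented definition of "use" matters before comparing people against targets.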
Industry context
Industry-pattern observations: Reporting frames the move within a broader trend of consultancies and large enterprises driving AI adoption through incentives, awards, and internal benchmarking. For practitioners, such programs tend to accelerate tool standardization, increase demand for policy and governance artifacts, and put more weight on reproducible, auditable pipelines for validating model outputs.
What to watch
Observers should track how firms reconcile adoption metrics with quality controls: whether dashboards are paired with outcome-based KPIs, documented definitions of what counts as "use," or sampling-based quality audits. Also watch for governance artifacts, such as approved-tool lists, logging standards, and guidance on acceptable automation levels, appearing in public filings or internal policy updates.
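A sampling-based quality audit of the kind mentioned above can be sketched in a few lines. The record schema, field names, and data below are invented for illustration; the point is simply that pairing an adoption metric with a reviewed outcome metric requires a reproducible sample, not raw usage counts alone.

```python
import random

def sample_for_audit(deliverables, rate=0.1, seed=42):
    """Draw a reproducible random sample of AI-assisted work for human review."""
    rng = random.Random(seed)  # fixed seed so the audit sample can be re-drawn
    ai_assisted = [d for d in deliverables if d["used_ai"]]
    k = max(1, int(len(ai_assisted) * rate))
    return rng.sample(ai_assisted, k)

def audit_pass_rate(sampled):
    """Share of sampled deliverables that passed human quality review."""
    if not sampled:
        return None
    return sum(d["passed_review"] for d in sampled) / len(sampled)

# Illustrative records only; this is not a real firm's data model.
deliverables = [
    {"author": f"user{i}", "used_ai": i % 2 == 0, "passed_review": True}
    for i in range(20)
]
picked = sample_for_audit(deliverables, rate=0.2)
print(len(picked), audit_pass_rate(picked))
```

Reporting the pass rate alongside the adoption number is what turns a usage dashboard into an outcome-based control: high usage with a falling audit pass rate is a different signal than high usage alone.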
Scoring Rationale
This is a notable example of enterprise-level AI adoption instrumentation that signals operationalization trends relevant to practitioners, but it is not a technical breakthrough. The story affects governance and telemetry practices rather than core model research.
