JPMorgan AI use reaches 50% of staff, driven by connectivity‑first architecture

JPMorgan reports that half of its global workforce now interacts with generative‑AI tools on a daily basis, a milestone the bank attributes to a "connectivity‑first" architecture that stitches large language models (LLMs) directly into existing workflows. The rollout began in early 2023 with a handful of pilot projects in risk management and client service, then expanded to trading desks, compliance units and back‑office operations. By the end of 2024, usage metrics showed a sharp uptick: prompt submissions doubled each quarter, and internal dashboards flagged a surge in user‑generated assistants that carried distinct personas and task‑specific instructions.

The rapid diffusion caught senior leaders off guard: employees were not merely asking questions but shaping bespoke AI companions and sharing them across teams. That organic, peer‑to‑peer spread led Derek Waldron, the bank's chief analytics officer, to reflect on the phenomenon in a recent VB Beyond the Pilot podcast.

"We were surprised by just how viral it was," Waldron explains…

"We were surprised by just how viral it was," Waldron, JPMorgan's chief analytics officer, explains in a new VB Beyond the Pilot podcast. Employees weren't just designing prompts, they were building and customizing assistants with specific personas, instructions, and roles and were sharing their learnings on internal platforms. The financial giant has pulled off what most enterprises still struggle to achieve: large-scale, voluntary employee adoption of AI.

The adoption wasn't the result of mandates; rather, early adopters shared tangible use cases, and workers began feeding off each other's enthusiasm. This bottom-up usage has ultimately resulted in an innovation flywheel. "It's this deep-rooted innovative population," Waldron says.

"If we can continue to equip them with really easy to use, powerful capabilities, they can turbocharge the next evolution of this journey." Ubiquitous connectivity plugged into highly sophisticated systems of record JPMorgan has taken a rare, forward-looking approach to its technical architecture. The company treats AI as a core infrastructure rather than a novelty, operating from the early contrarian stance that the models themselves would become a commodity.

Did the architecture deliver what it promised? JPMorgan's internal LLM platform now touches half of the workforce, a figure that dwarfs many early enterprise pilots. The connectivity‑first design, which links assistants directly to existing data pipelines, appears to have lowered the barrier for staff to experiment, according to Waldron.

On the podcast, Waldron noted that employees moved beyond simple prompts to craft assistants with distinct personas and role‑specific instructions. Yet the report stops short of quantifying productivity gains or cost savings, leaving those outcomes uncertain. Moreover, while usage has climbed to 250,000 users and reportedly exceeds 60% in sales, finance, technology and operations, the long‑term retention of those custom assistants remains unclear.

The rapid uptake suggests a cultural shift toward self‑service AI, but whether the connectivity‑first model can scale without new governance challenges is still an open question. For now, the data points to an unusually swift adoption curve within a traditionally cautious financial institution.

Common Questions Answered

What percentage of JPMorgan's global workforce now interacts with generative‑AI tools daily?

JPMorgan reports that 50% of its global employees use generative‑AI tools on a daily basis. This milestone reflects the rapid adoption driven by the bank’s connectivity‑first architecture.

Which business areas did JPMorgan initially pilot its AI rollout in early 2023?

The initial pilot projects in early 2023 focused on risk management and client service functions. After those successes, the rollout expanded to trading desks, compliance units, and back‑office operations.

How did the connectivity‑first architecture lower the barrier for staff to experiment with AI?

The architecture stitches large language models directly into existing data pipelines and workflows, allowing assistants to access real‑time data without custom integration. This seamless connection made it easier for employees to build, customize, and share AI assistants.

What behavior did chief analytics officer Derek Waldron observe among employees after the AI tools were introduced?

Waldron noted that employees went beyond simple prompt writing; they created assistants with specific personas, instructions, and roles, and then shared their learnings on internal platforms. He described the adoption as "viral" and largely voluntary.