LLMs Are the New Databases (and ChatGPT Is Just the First Application)

Nomad Data
August 21, 2025
At Nomad Data we help you automate document-heavy processes in your business and find the right data to address any business problem. Learn how you can unlock insights by querying thousands of documents and uncover the exact internal or external data you need in minutes.

For many people, ChatGPT is synonymous with AI. It’s the product that made artificial intelligence accessible to millions of users almost overnight. But just as databases transformed software decades ago, large language models (LLMs) represent the true foundation of this technological shift, not ChatGPT itself.

At Nomad Data, we believe the best way to understand what’s happening in AI today is to look at history. LLMs are the modern equivalent of relational databases. And ChatGPT? It’s closer to Oracle’s or Microsoft’s early attempts to build end-user applications directly on top of their databases.

The real winners in AI will be the specialized applications that sit on LLM infrastructure, just as Salesforce, SAP, and countless others flourished by building on databases.

The Database Revolution

When relational databases first emerged, they were a breakthrough. For the first time, data could be stored, organized, and retrieved in flexible, powerful ways. Developers quickly embraced databases as the foundation of modern applications.

But in those early days, database companies didn’t just sell infrastructure. Oracle and Microsoft, for example, built applications like Oracle Forms to show off what their databases could do. These products aimed to capture more value by moving up the stack toward end users.

The problem? While the databases were revolutionary, the applications built directly by database vendors rarely became the category leaders. They were general-purpose, and they lacked the deep customization needed for specific industries or workflows.

Best-of-Breed Providers Take Over

It wasn’t long before independent software providers saw the opportunity. They took databases as the raw infrastructure and built tailored, best-of-breed solutions.

Customer relationship management (CRM), enterprise resource planning (ERP), financial software: these applications became the true workhorses of business. Databases receded into the background, quietly enabling everything, while the application vendors captured the lion’s share of business value.

As Brad Schneider, CEO of Nomad Data, puts it: “Most people today don’t enter things directly into any kind of a database. They’re entering things into a highly customized application that at its core is a wrapper on a database.”
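To make the “wrapper on a database” point concrete, here is a minimal, hypothetical sketch; the table, function, and data are invented for illustration and don’t represent any vendor’s product. What a user experiences as a single business action inside a CRM is, underneath, a SQL statement they never see.

```python
# Hypothetical sketch: a tiny "CRM-style" action that wraps a relational database.
# The user clicks a button; the application turns it into SQL they never see.
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory database, for illustration only
conn.execute("CREATE TABLE customers (name TEXT, status TEXT)")

def add_lead(name: str) -> None:
    """The 'Add lead' button in a hypothetical CRM, reduced to its essence."""
    conn.execute("INSERT INTO customers (name, status) VALUES (?, ?)", (name, "lead"))
    conn.commit()

add_lead("Acme Corp")
print(conn.execute("SELECT name, status FROM customers").fetchall())
# [('Acme Corp', 'lead')]
```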

The AI Parallel: LLMs as the Database, ChatGPT as the Early App

Fast forward to today, and we’re seeing the same pattern unfold with AI.

LLMs like GPT-4 and Claude are the new relational databases: powerful, flexible, and general-purpose. They began as APIs, available to developers who wanted to experiment and build. But with the launch of ChatGPT, OpenAI packaged that API into a user-facing application.

This was enormously successful in raising awareness. But ChatGPT itself is not the endgame. Like Oracle’s early applications, it’s a general-purpose tool that shows what’s possible, but it isn’t optimized for the specific workflows most users need.

“At the end of the day, LLMs and databases are developer-facing tools,” Schneider explains. “There’s enormous value in the ecosystems built around them. But the assumption is that over time, the best-of-breed vendors will outperform the infrastructure vendors trying to move up the stack.”
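The AI-side parallel might look like the sketch below, assuming the OpenAI Python SDK; the workflow, prompt, and function name are invented for illustration. The model is the infrastructure, while the application owns the prompt, the model choice, and the shape of the output, so the end user never has to.

```python
# Hypothetical sketch of the same pattern on LLM infrastructure: a narrow,
# workflow-specific function wrapping a general-purpose model API.
# Assumes the OpenAI Python SDK; the prompt and workflow are invented for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_claim_notes(notes: str) -> str:
    """The user clicks 'Summarize'; the application owns the prompting."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Summarize these insurance claim notes in three short bullet points."},
            {"role": "user", "content": notes},
        ],
    )
    return response.choices[0].message.content
```

In both sketches the infrastructure is general-purpose; the value the user actually sees comes from the thin, specialized layer built on top of it.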

Why Specialized Applications Will Prevail

The future of AI applications lies in specialization.

Most people don’t want to learn how to “prompt” effectively, just as most people never learned SQL. They want tools that fit seamlessly into their daily work. They want applications that feel familiar, align with existing workflows, and deliver clear business value.

That’s why infrastructure companies like OpenAI and Anthropic aren’t likely to build the ultimate industry-specific tools. Their models must serve the broadest possible base. By contrast, specialized vendors can focus on deeply understanding the nuances of a single industry or problem set, and tailoring AI to fit.

This is the same dynamic that allowed companies like Salesforce, SAP, and Workday to thrive on top of databases.

An Infinite Range of Possibilities

Just as nearly every application today stores structured data in a database, we expect LLMs will underpin a vast range of future applications. Some will be entirely new categories we haven’t yet imagined. Others will look familiar—existing tools, but reimagined with AI at their core.

“There’s an infinite number of application types that sit on a database,” says Schneider. “Almost every application stores structured data in some way. I think the same is going to be true of LLMs. We’re going to see a huge proliferation of products—some incredibly new, and some that look pretty familiar.”

A Glimpse at What’s Emerging

At Nomad Data, we see document intelligence as a clear example of this shift.

Basic document processing is already available through LLM APIs and in ChatGPT. But documents are messy. Some are clean, digitized text; others are scanned images at odd angles, with tables, forms, or handwritten notes. Some are one page; others run to tens of thousands of pages. The questions users want to ask vary just as widely.
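To illustrate why that variety matters, here is a hypothetical routing sketch; the document attributes, thresholds, and branches are invented, and this is not a description of DocChat’s internals. Only the easiest branch is well served by a single chat prompt.

```python
# Hypothetical sketch of why "one prompt fits all" breaks down for real documents.
# The attributes, thresholds, and branches are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Document:
    pages: int
    is_scanned_image: bool
    has_tables: bool

def route(doc: Document) -> str:
    """Decide what a document needs before a general-purpose prompt can help."""
    if doc.is_scanned_image:
        return "needs OCR and layout analysis before any LLM call"
    if doc.pages > 50:
        return "needs chunking, indexing, and retrieval across pages"
    if doc.has_tables:
        return "needs table extraction so rows and columns survive"
    return "short, clean text: a single prompt may be enough"

print(route(Document(pages=3, is_scanned_image=False, has_tables=False)))
print(route(Document(pages=12000, is_scanned_image=True, has_tables=True)))
```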

ChatGPT offers a one-size-fits-all approach, which works well for simple cases. But as Schneider explains, “DocChat goes all the way down the rabbit hole of document processing. It can handle the full range of document types, lengths, and workflows. It’s an enterprise tool for managing businesses that have lots and lots of documents. It’s not a chatbot where you just say one thing and it can do anything. It’s custom-tailored to do specific things.”

DocChat is one example of how best-of-breed applications can leverage LLM infrastructure to solve high-value, domain-specific problems.

The Road Ahead

Over time, LLMs will fade into the background—just as databases did. Most people won’t think about them directly. Instead, they’ll use tools that seamlessly integrate AI into their workflows, often without realizing what infrastructure is powering them.

“Anytime people are putting in unstructured text, typing questions, and getting human-readable answers, they’ll think of that as AI,” says Schneider. “But they’ll be removed from the underlying technology. They’re not going to use ChatGPT for everything. Complex tasks will rely on industry-specific, workflow-specific best-of-breed tools.”

Conclusion

The database era taught us that infrastructure is only the beginning. The real value lies in the ecosystem of specialized applications built on top.

LLMs are today’s databases. ChatGPT is the early vendor-built app that showcases what’s possible but isn’t the final destination. The winners will be the best-of-breed providers who understand industry problems deeply and design tools that fit seamlessly into existing workflows.

Just as the world runs on databases today, the future will run on LLMs. But most people won’t interact with them directly. They’ll experience AI through the tailored applications that turn this raw power into indispensable business tools.
