Sep 2, 2025

Anant Bhardwaj

Founder and CEO | Instabase

Ethan Lee

Director of Product

Building AI Products That Enterprises Actually Want

Building Enterprise AI That Works: Document Intelligence, Workflow Automation, and the Value-First Approach

Transforming unstructured enterprise data into actionable insights requires more than just throwing documents at large language models. In this episode, we explore how AI product builders should approach complex enterprise workflows, why early technology bets matter, and how to create genuine value in heavily regulated industries where AI adoption seems impossible.

Guest Introduction

Anant Bhardwaj is the Founder and CEO of Instabase, an AI platform that helps enterprises unlock value from unstructured data. Having made the prescient decision to bet on transformer architecture in 2018—long before the current AI wave—Anant brings hard-earned insights into building AI products that solve real enterprise problems. With major banking customers including four of the top five US banks, he offers practical guidance on navigating regulated industries and scaling AI workflows beyond simple document extraction.

Why Enterprise AI is Different from Consumer AI

Real Problem Solving: Enterprises don't care about AI capabilities; they care about end-to-end business outcomes and workflow automation that moves the needle

Regulated Industry Adoption: If you solve genuine problems and create significant value, even the most heavily regulated industries will find ways to bring you in

Compound Accuracy Challenges: 90% field-level accuracy across 25 fields compounds to only about 7% document-level accuracy, requiring an entirely different approach to workflow design (the math is sketched below)

Infrastructure Complexity: Early AI companies had to build extensive processing pipelines; modern builders can leverage LLMs to eliminate 90% of traditional infrastructure code
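
The compounding math behind that field-versus-document accuracy gap is easy to verify. A minimal sketch, assuming field-level errors are independent:

```python
# Compound accuracy: if each of 25 fields is extracted correctly 90% of the time
# and errors are independent, the chance the whole document is correct is 0.9^25.
field_accuracy = 0.90
num_fields = 25

document_accuracy = field_accuracy ** num_fields
print(f"Document-level accuracy: {document_accuracy:.1%}")  # ~7.2%

# The same compounding works in reverse: pushing per-field accuracy to 99%
# lifts document-level accuracy to roughly 78%.
print(f"At 99% per field: {0.99 ** num_fields:.1%}")
```

This is why the episode frames straight-through processing as a workflow-design problem rather than a field-extraction problem.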

The Evolution from Field Extraction to Workflow Automation

Beyond Document Processing: The magic isn't extracting fields from documents—it's creating AI assistants that can process entire packets, validate information, and make decisions end-to-end

Accuracy Multiplication: Workflow-level accuracy has jumped from 3-4% to 50%+ with modern AI, a more than tenfold improvement in straight-through processing

Multi-Model Orchestration: Different models excel at different tasks; successful systems combine multiple AI models with validation layers rather than relying on any single solution (a minimal orchestration sketch follows this list)

Packet-Level Intelligence: Real enterprise value comes from understanding relationships between multiple document types within complex business processes
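
To make the orchestration idea concrete, here is a minimal sketch of running two extraction models and auto-accepting only the fields where they agree; the model functions and field names are hypothetical placeholders, not Instabase APIs.

```python
# A minimal multi-model orchestration sketch with a validation layer.
# Both extraction functions are hypothetical stand-ins for real model calls.

def extract_with_model_a(document: str) -> dict:
    # e.g. a layout-aware extraction model
    return {"invoice_total": "1,250.00", "currency": "USD"}

def extract_with_model_b(document: str) -> dict:
    # e.g. a general-purpose LLM prompted for structured output
    return {"invoice_total": "1250.00", "currency": "USD"}

def normalize(value: str) -> str:
    return value.replace(",", "").strip()

def extract_with_validation(document: str) -> dict:
    """Auto-accept a field only when both models agree; otherwise flag it."""
    a = extract_with_model_a(document)
    b = extract_with_model_b(document)
    result = {}
    for field in sorted(a.keys() | b.keys()):
        va, vb = normalize(a.get(field, "")), normalize(b.get(field, ""))
        if va and va == vb:
            result[field] = {"value": va, "status": "validated"}
        else:
            # Disagreement routes the field to human review instead of guessing.
            result[field] = {"candidates": [a.get(field), b.get(field)],
                             "status": "needs_review"}
    return result

print(extract_with_validation("...invoice text..."))
```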

Early Technology Betting and Timing

Transformer Adoption (2018): When nothing else worked for document layout understanding, transformers with X-Y coordinate encoding became the only viable solution (a layout-embedding sketch follows this list)

GPU Resistance: Enterprise customers initially rejected GPU requirements, but early technical bets proved essential for long-term competitive advantage

Pre-LLM Infrastructure: Building extensive OCR, table detection, and extraction pipelines provided deep domain understanding that remains valuable today

Technology Pragmatism: Success came from choosing what worked rather than what seemed intellectually sophisticated—a lesson for today's AI builders
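
One common way to give a transformer layout awareness, in the spirit of the X-Y coordinate encoding described above, is to add embeddings of each token's bucketed page coordinates to its text embedding. A minimal sketch; the dimensions and bucket counts are illustrative, not Instabase's actual architecture:

```python
import torch
import torch.nn as nn

class LayoutEmbedding(nn.Module):
    """Token embedding plus embeddings of bucketed X-Y page coordinates."""
    def __init__(self, vocab_size=30522, hidden=256, coord_buckets=1000):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, hidden)
        self.x_emb = nn.Embedding(coord_buckets, hidden)  # x position, bucketed to [0, 1000)
        self.y_emb = nn.Embedding(coord_buckets, hidden)  # y position, bucketed to [0, 1000)

    def forward(self, token_ids, x_coords, y_coords):
        # Summing the three embeddings lets self-attention reason over
        # both what a token says and where it sits on the page.
        return self.token_emb(token_ids) + self.x_emb(x_coords) + self.y_emb(y_coords)

emb = LayoutEmbedding()
tokens = torch.tensor([[101, 2054, 102]])   # token ids from an OCR'd page
xs = torch.tensor([[12, 480, 900]])         # bucketed x coordinates
ys = torch.tensor([[30, 30, 955]])          # bucketed y coordinates
print(emb(tokens, xs, ys).shape)            # torch.Size([1, 3, 256])
```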

Enterprise Sales and Value Creation Philosophy

Drivers vs. Drags: Focus first on core value propositions (drivers) that customers love, then handle compliance requirements (drags) with reasonable accommodation

Value Over Differentiation: Being useful matters more than being different; every SaaS company is fundamentally a "wrapper" around underlying technologies

Regulated Industry Navigation: US banks are generally reasonable to work with; European institutions can be more detail-focused but still manageable

Enterprise Readiness Balance: Strong value propositions earn leeway on compliance details, but both elements are eventually necessary for success

The Infrastructure Implications of LLM Ubiquity

Eliminated Complexity: Modern AI builders can skip 80-90% of traditional document processing infrastructure by leveraging capable foundation models

Workflow Runtime Gap: The missing piece isn't document intelligence—it's reliable systems for running complex AI workflows at enterprise scale

Open Source Opportunity: Like Kubernetes and Docker, AI workflow runtimes will likely emerge as open source solutions rather than proprietary SaaS platforms

Tool Integration Requirements: AI agents need access to organization-specific tools and infrastructure that external LLM providers cannot provide
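
A minimal sketch of that tool-integration point: registering organization-specific functions and dispatching a model's tool calls to them. The tool names and return values here are hypothetical; in practice they would wrap internal systems such as core banking APIs or document archives.

```python
# Hypothetical tool registry exposing internal systems to an AI workflow.
TOOL_REGISTRY = {}

def tool(name):
    def register(fn):
        TOOL_REGISTRY[name] = fn
        return fn
    return register

@tool("lookup_customer")
def lookup_customer(customer_id: str) -> dict:
    # Placeholder for a call into an internal CRM or core banking system.
    return {"customer_id": customer_id, "kyc_status": "verified"}

@tool("fetch_prior_filings")
def fetch_prior_filings(customer_id: str) -> list:
    # Placeholder for a query against an internal document archive.
    return ["2023_tax_return.pdf", "2024_tax_return.pdf"]

def run_tool_call(name: str, **kwargs):
    """Dispatch a model-requested tool call to the matching internal function."""
    if name not in TOOL_REGISTRY:
        raise ValueError(f"Unknown tool: {name}")
    return TOOL_REGISTRY[name](**kwargs)

print(run_tool_call("lookup_customer", customer_id="C-1042"))
```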

The Future of Retrieval and Context Management

Beyond RAG Limitations: Vector search and chunking strategies are tools in a toolkit, not complete solutions for enterprise data challenges

Context Window Reality: While expanding to millions of tokens helps, enterprises often need to process billions of tokens' worth of historical data

Horizontal Scaling Necessity: Like databases, AI systems will require distributed approaches rather than relying solely on larger single-model context windows

Dynamic Tool Selection: Future AI systems will choose retrieval strategies based on specific problems rather than defaulting to predetermined approaches
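
To illustrate dynamic tool selection, here is a minimal routing sketch that picks a retrieval strategy per request instead of defaulting to vector search; the strategies and heuristics are placeholders under stated assumptions, not a prescribed design.

```python
# Hypothetical retrieval strategies; real implementations would query actual stores.
def vector_search(query: str) -> list:
    return ["semantically similar chunks..."]

def keyword_search(query: str) -> list:
    return ["exact-match passages..."]

def structured_query(query: str) -> list:
    return ["rows from an internal database..."]

def choose_retrieval(query: str):
    """Crude heuristics; a production system might let a model make this choice."""
    if any(token.isdigit() for token in query.split()):
        return structured_query   # numeric lookups often live in structured stores
    if '"' in query:
        return keyword_search     # quoted phrases suggest exact matching
    return vector_search          # default to semantic retrieval

query = 'total exposure for account 88231 in Q2'
strategy = choose_retrieval(query)
print(strategy.__name__, strategy(query))
```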

User Experience as Competitive Moat

The Cursor Phenomenon: Even with access to the same underlying technologies as competitors, a better user experience can create a massive competitive advantage

Beyond Feature Parity: When all products have similar capabilities, success comes from intangible experience quality that's difficult to replicate

Enterprise Application Analogy: OpenAI's success stems more from ChatGPT as a beloved consumer application than from having the best underlying models

Product-Market Fit Evolution: As foundational AI capabilities commoditize, differentiation increasingly comes from workflow design and user experience excellence

Lessons for Enterprise AI Builders

Value-First Approach: Solve significant business problems first; technical sophistication without clear value creation leads nowhere

Early Technology Intuition: Make principled bets on emerging technologies when they're the only viable solution, even if adoption seems difficult

Infrastructure Patience: Build robust workflow execution capabilities rather than chasing the latest model improvements or AI features

Experience Investment: As AI capabilities democratize, sustainable competitive advantage comes from superior user workflows and enterprise integration

This episode provides essential frameworks for founders and product leaders building AI solutions for enterprise markets, emphasizing that long-term success requires combining technical depth with practical business value creation and exceptional user experience design.

Interested in being a guest on Future Proof? Reach out to forrest.herlick@useparagon.com