We spend our days talking to founders, looking at early-stage companies, and trying to pattern-match where the biggest opportunities will emerge. That means watching where user behavior is shifting, what builders are doing with new tools, and where new software categories are being born.
Over the past 12 months, the AI landscape has moved fast. But amid the noise, here are a few distinct trends that stand out – each one shaping how we think about company formation, defensibility, and scale in this new era.
1. Hyperscale: Growth Numbers We’ve Never Seen Before
We’ve never seen software scale like this. ChatGPT hit 100 million users in two months. GitHub Copilot is now used by over 50,000 organizations. Cursor, the AI-enhanced code editor, reportedly reached a $100M ARR run-rate within a year of launch.
These aren’t just anecdotes. They’re a reflection of how AI-first products break the old rules of distribution. Many of them ride bottoms-up user adoption instead of requiring a sales push. The products don’t just feel new; they unlock new behaviors, save real time, and fit into workflows without friction. We’re seeing developers, analysts, marketers, and operators all reach for AI-native tools on their own. And because most of these tools are inherently shareable, growth compounds quickly. It’s the fastest we’ve seen companies go from prototype to scaled usage, and it’s reshaping how we think about early-stage investing timelines.
These products are also unusually capital-efficient because they often rely on foundation models hosted elsewhere. This allows many of them to go to market – and even get to significant revenue – without building a massive technical team or infrastructure layer. In effect, they’re outsourcing the R&D and focusing their energy on product and distribution. That’s great for moving fast, but it also creates new questions around pricing, brand loyalty, and long-term differentiation.
2. Emerging AI Business Models
We’re seeing new categories of companies that didn’t exist a few years ago, and they’re creating massive businesses:
- LLM Infrastructure-as-a-Service: These are the “AWS for AI inference” players – companies offering scalable, optimized model hosting. DeepInfra is one of our portfolio companies in this space. Their advantage is in cost, performance, and customization. The best ones are vertically integrated (owning hardware, infra, and model ops) and priced like a usage-based utility. As every software product becomes an AI product, these platforms are becoming critical infrastructure. One thing worth noting here: open-source is what makes this market possible. The rise of high-quality open-weight models like Llama and DeepSeek has enabled a wave of inference providers to emerge and compete. Without open weights, there wouldn’t be a vibrant ecosystem of infra startups. It’s the foundation that makes innovation at this layer viable.
- Application-Layer AI Tools: These include everything from AI chatbots to meeting summarizers to tools that let you “talk to your data.” Some are thin wrappers around a foundation model, but others go deeper, solving specific problems with real workflow value and good UX. This space is crowded, but the winners are building stickiness through data, integrations, and trust.
- Copilots Everywhere: These products aren’t just wrappers; they’re deeply integrated assistants that know your data, your workflows, and your edge cases. They’re becoming daily-use tools. GitHub Copilot was the first big proof point, but we’re now seeing copilots across every vertical: sales, marketing, design, legal, and finance.
- Agents: Agents go beyond the “assistant” paradigm by acting with more autonomy. Instead of just suggesting or prompting, they can plan tasks, use additional tools or APIs, and even generate new instructions for themselves to accomplish higher-level goals (see the sketch after this list). While still maturing, agents open the door for truly delegated tasks and specialized processes that run with minimal human oversight.
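To make the agent pattern concrete, here is a minimal, schematic sketch of the plan-act-observe loop that most agent designs share. Everything here is illustrative: `call_model` is a stub standing in for an LLM call, and the tools are toy functions, not any particular product’s API.

```python
# Schematic agent loop: the model proposes the next action, the runtime
# executes it, and the observation is fed back until the goal is met.
# All names here are illustrative; `call_model` stubs out the LLM call.

def search_docs(query: str) -> str:
    return f"(pretend search results for '{query}')"

def send_email(to: str, body: str) -> str:
    return f"(pretend email sent to {to})"

TOOLS = {"search_docs": search_docs, "send_email": send_email}

def call_model(goal: str, history: list[dict]) -> dict:
    # Stub: a real agent would ask an LLM to choose the next action
    # given the goal and the history of prior actions and observations.
    if not history:
        return {"action": "search_docs", "args": {"query": goal}}
    return {"action": "finish", "summary": "Done: " + history[-1]["observation"]}

def run_agent(goal: str, max_steps: int = 5) -> str:
    history: list[dict] = []
    for _ in range(max_steps):
        decision = call_model(goal, history)
        if decision["action"] == "finish":
            return decision["summary"]
        observation = TOOLS[decision["action"]](**decision["args"])
        history.append({"action": decision["action"], "observation": observation})
    return "Stopped: step budget exhausted."

print(run_agent("Find our refund policy and summarize it"))
```

The step budget and the explicit tool registry are the guardrails that make “minimal human oversight” workable in practice.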
These are just a few examples, but one interesting development is how quickly some of these categories are converging. Copilot-style interfaces are increasingly being bundled into broader productivity or dev platforms. Take Notion AI as an example. Notion recently introduced deeply integrated AI features that let users hand off parts of their knowledge work, making the platform significantly more valuable. You can ask Notion’s AI to generate a draft blog post from bullet points, summarize a long document, or extract action items from meeting notes. Instead of painstakingly writing or distilling the information themselves, users delegate that task to the AI and then just review or tweak the output. In a similar vein, Microsoft’s Office 365 Copilot can analyze your emails, generate responses, create PowerPoint slides from a Word document outline, and more – it’s effectively task-execution-as-a-service inside your everyday apps. Over time, we expect these copilots and similar products to evolve into more “agent-like” solutions, orchestrating multiple steps and potentially taking more proactive initiative to achieve user goals.
Regarding pricing, we’re not yet seeing a totally new monetization model that’s native to AI. In most cases, these companies are borrowing from SaaS (per-seat), infrastructure (metered usage), or prosumer tools (freemium). That makes sense considering these strategies have worked in the past. But it will be fascinating to see if new approaches emerge as LLM access costs decline and the unit economics shift – especially as agents become more sophisticated and can autonomously execute multiple steps on behalf of a user or organization. There may be new models – performance-based pricing, agent-to-agent coordination fees, hybrid licensing – that haven’t been fully explored yet and could reflect the value these autonomous tools bring when entrusted with more complex tasks.
One interesting possibility is that all of this leads to the ultimate pricing model – AI billed like labor. Today, pricing mimics SaaS or infra, but that may not hold as agents become more autonomous and task-oriented. If AI begins to truly replace human effort, not just assist, it’s logical to price it like labor. You wouldn’t pay per API call or seat license; you’d pay for outcomes. For example, a legal AI could charge per contract reviewed. A sales agent could take a percentage of closed deals. The analogy starts to look less like software and more like hiring a consultant.
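As a toy illustration of why labor-style pricing changes the math, here’s a back-of-the-envelope comparison of per-seat versus per-contract pricing for a hypothetical legal review tool. Every number below is assumed purely for illustration.

```python
# Hypothetical numbers only: monthly revenue for a legal-review AI under
# per-seat pricing versus outcome-based (per-contract-reviewed) pricing.

seats = 20                 # paralegals using the tool (assumed)
seat_price = 200           # $/seat/month (assumed)
contracts_per_seat = 150   # contracts reviewed per seat per month (assumed)
price_per_contract = 3     # $/contract reviewed (assumed)

per_seat_revenue = seats * seat_price                                   # $4,000
per_outcome_revenue = seats * contracts_per_seat * price_per_contract   # $9,000

print(f"Per-seat:    ${per_seat_revenue:,}/month")
print(f"Per-outcome: ${per_outcome_revenue:,}/month")
```

Under per-seat pricing, revenue is capped by headcount; under outcome pricing, it scales with the volume of work the AI actually absorbs, which is the “billed like labor” dynamic in miniature.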
3. The Verticalization of AI
One of the most promising developments we’re seeing is the verticalization of AI: products that go deep in a single industry rather than trying to be general-purpose tools. Formation Bio is a good example. They’ve built an AI-native platform for clinical trial operations, a space where timelines and costs have historically ballooned due to manual, fragmented workflows. With the right software, those timelines shrink dramatically, and suddenly running more trials becomes economically viable. That unlocks not just efficiency, but growth.
Healthcare is seeing a wave of similar products, as are other industries: Harvey is rethinking legal workflows, and the financial industry now has access to tools like AI-native financial modeling, insurance copilots, and internal automation platforms for investment banks. These are industries (healthcare, legal, finance) that previously saw little automation because the work was too nuanced, too regulated, or too dependent on human judgment. What’s changing is that AI tools can now match domain-specific reasoning with enough accuracy to enhance productivity, not just create noise. The result is that AI products previously viewed as “nice-to-haves” are now seen as necessities.
We think this is a massive opportunity. The value in these verticals isn’t just about replacing labor; it’s about compressing time. A contract gets reviewed in minutes instead of hours. A claims analysis that took days becomes real-time. That kind of time compression changes the economics of a business. It allows a paralegal to process 10x more cases, or an investment bank to handle a broader client base without growing headcount.
Some tools go beyond enhancing productivity. In some instances, AI can enable discovery that simply wasn’t previously possible. AlphaFold showed that models can predict protein structures with breakthrough accuracy. That’s not about speed – it’s about expanding scientific capability. Similar exciting shifts are happening in materials, chemistry, and energy.
Ultimately, AI doesn’t just lower costs; it enables scale. And because the software is purpose-built for the domain, it’s often more defensible: it’s aligned to the specific data, workflows, and compliance constraints that made these industries traditionally hard to serve.
4. Democratizing Development
Companies like Replit and Cursor are giving more people the ability to build real software at incredible speed. The user base is shifting from just professional engineers to domain experts, entrepreneurs, and even children. Entire production apps are being written and deployed by non-developers. In some cases, they’re writing backend logic simply by describing what they want in plain English.
Software has been on a long path toward democratization – from punch cards to high-level languages to no-code tools. But AI takes this to the limit. It pushes the abstraction frontier from code to natural language: English prompt → working app. In that sense, the trend isn’t new – but it’s accelerating rapidly.
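As a rough sketch of what “English prompt → working app” looks like one layer down, here’s the basic pattern these tools build on, assuming an OpenAI-compatible chat completions API. The model name, prompt, and file handling are illustrative, not any specific product’s implementation.

```python
# Minimal sketch of natural-language-to-code generation, assuming an
# OpenAI-compatible chat completions API (model name is illustrative).
# Products like Replit or Cursor layer project context, editing, and
# execution on top of this basic request/response pattern.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

spec = "A single-page web app with a form that converts miles to kilometers."

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system",
         "content": "You are a code generator. Return a complete, runnable index.html and nothing else."},
        {"role": "user", "content": spec},
    ],
)

# Write the generated page to disk; it can be opened directly in a browser.
with open("index.html", "w") as f:
    f.write(response.choices[0].message.content)
```

The real products wrap this loop with codebase context, iterative edits, and one-click deployment, which is what turns a single prompt into a shipped app rather than a single file.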
This has second-order effects: users are creating internal tools faster, side projects are launching as real products, and startups are going to market before hiring a full engineering team. The boundary between “user” and “developer” is blurring, and we expect that to create a long tail of new businesses that wouldn’t have existed five years ago.
It also creates an opportunity for new platforms to emerge. If the developer population is expanding by an order of magnitude, the tools they use – and the way they learn, share, and collaborate – are going to look different. We expect new GitHub-style hubs, new IDEs, and new cloud infrastructure that doesn’t just serve professional engineers but can be used by everyone: AI-native, web-first, and designed for fast iteration.
5. A New Appreciation for Wrappers
A year ago, “wrapper” was a dirty word. It implied a thin layer over someone else’s tech. No moat, no IP, just a clever UI and API calls.
But our view on wrappers is evolving. The best ones are proving to be very sticky. They’re building proprietary data loops, integrating deeply into user workflows, and becoming the interface thousands of people rely on.
Part of what’s changed is the realization that the interface is the product. Users aren’t comparing foundation models, they’re comparing how fast they can get a high-quality output, how easy it is to iterate, and how well the tool fits into their existing stack. In that world, execution, design, and distribution matter more than model R&D. And defensibility doesn’t just come from owning the model, it comes from owning the customer relationship. That can look like proprietary usage data, team-level collaboration features, or custom fine-tuning that gets better over time.
There’s also precedent here. Most SaaS companies are technically “wrappers” around infrastructure – CRMs on top of databases (e.g. Salesforce) or call center software on top of Twilio. If you can embed deeply into a customer’s workflow, collect relevant data, and continuously improve the experience, the model itself can be interchangeable. In that sense, the question isn’t whether a company is a wrapper, it’s whether it’s a valuable one. We’ve seen early signs that the best “wrappers” are turning into big businesses quickly.
Final Thought
The pace of innovation in AI is staggering, but the opportunities are real. We’re still early, and there is noise, hype, and inevitable consolidation ahead. But we’re also seeing signs of lasting value – tools that users love, workflows that are being reinvented, and entirely new markets being created. These are the products that will define the next generation of software.