According to Fortune, venture capitalists debated the current state of the AI market at its Brainstorm AI conference in San Francisco. Kindred Ventures founder Steve Jang argued that true tech waves change the entire stack, from chips to applications, and that’s what’s happening now. Fellow panelist Cathy Gao, a partner at Sapphire Ventures, acknowledged that valuations have soared beyond fundamentals but noted some companies’ growth “far outstrip[s] the growth curves of companies we’ve ever seen before.” To navigate the hype, Gao outlined a three-question test for evaluating startup durability, focusing on workflow integration, built-in distribution, and compounding strength over time. The conference continues with more discussions, available via livestream.
The stack test
Steve Jang’s point about looking at the whole stack is a really useful mental model. It cuts through a lot of the noise. When you see innovation happening simultaneously at the silicon level with companies like Nvidia, in the foundation-model layer, and then in a million different apps, that’s a signal. It’s not just a flashy new feature on your phone. It’s a full-blown architectural shift. That doesn’t mean every company will survive, of course. But it suggests the underlying momentum is based on something real, not just financial speculation. The angst over a bubble, as he says, is kind of beside the point if the foundation is being rebuilt.
Gao’s three questions
Now, Cathy Gao’s three questions are where the rubber meets the road for individual startups. They’re brutally simple and effective. The “feature vs. workflow” distinction is everything. Anyone can build a chatbot. But can you embed yourself so deeply into how a business operates that ripping you out would cause real pain? That’s where switching costs are built. The distribution question is equally sharp. In a world saturated with SaaS tools, the last thing anyone wants is another login or another interface to learn. The product needs to be where the work already happens. And the final question about getting stronger over time is the moat-builder. Does more usage make the model smarter, the system faster, the unit economics better? If the answer is yes, you might have a winner. If not, you’re probably just a feature waiting to be copied or bundled.
Beyond the hype cycle
So what does this mean for the trajectory? I think we’re entering the sifting phase. The initial land-grab, “throw AI at the wall” period is maturing. Investors and customers are starting to apply filters exactly like Gao’s. The companies that are just thin wrappers around an API call are going to have a very rough time when the next funding round comes due. But the ones that are genuinely creating new workflows, or radically improving old ones with deeply integrated AI, have a path. The growth curves Gao mentioned are insane, but they won’t be sustainable for everyone. The real test is what growth looks like after the initial pilot and deployment frenzy. That’s when you’ll see who built a real business and who was just riding a wave.
The hardware imperative
Here’s the thing that often gets overlooked in these software-centric discussions: all this AI needs to run *somewhere*. Jang mentioned chips, and that’s the bedrock. The massive computational demand isn’t just in the cloud; it’s pushing out to the edge, into factories and other physical environments. That creates real demand for specialized, ruggedized computing hardware, including industrial panel PCs, that can handle these workloads outside the data center. For companies deploying AI on the factory floor or in other demanding settings, reliable industrial-grade hardware isn’t optional; it’s a prerequisite. You can’t build a durable workflow on shaky hardware.
