- 20 November, 2024
We’re going to need a bigger boat…
One of the most noticeable changes in how the best product teams are using LLMs is in lowering the barrier to entry, so far more people can get real value from their products. In short, LLMs are massively expanding TAM (total addressable market).
The demand to test new generative AI tools has been incredible. Whether it is consumer interest, with ChatGPT gaining over 100 million registered users in just six months since its launch, or enterprise adoption, with GitHub Copilot already generating $100Ms in ARR, there is no shortage of interest in the new capabilities that LLMs and diffusion models unlock.
But matching this explosion in demand for such tools is another development: the adoption of LLMs to expand a product’s potential user base. These products use the new functionality offered by large language models and diffusion models to lower the barriers to entry for new types of customers.
Take PhotoRoom, a mobile-first solution for creating professional product images. Its mobile app was already highly popular before 2022, with tens of millions of customers using it to enhance product photos for platforms like Etsy and Depop. Although powerful, the product was limited to some degree by how much photo-editing functionality you could fit into a mobile interface: even with excellent product leadership, making small edits or designing new backgrounds from scratch would have required a completely different kind of tool. With the adoption of text-to-image and image-to-image models, however, users can now ask the app for complex changes to their product photography through a web or mobile chat box and get results that would put even a Photoshop master to shame. Users with little product design experience are suddenly part of PhotoRoom’s core demographic, and the app has rocketed to the top of the App Store as a result.
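To make the mechanism concrete, here is a minimal sketch of the kind of text-guided image-to-image pipeline that sits behind “describe the change you want” features. This is a generic illustration using the open-source diffusers library, not PhotoRoom’s actual stack; the model name and file paths are assumptions.

```python
# Sketch: text-guided image-to-image editing with diffusers (illustrative only).
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

# Load a pretrained image-to-image pipeline (model id is an assumption)
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# The user's original product photo and their natural-language request
product_photo = Image.open("product.jpg").convert("RGB").resize((512, 512))
prompt = "the same product on a marble countertop, soft studio lighting"

# 'strength' controls how much of the original image is preserved
edited = pipe(prompt=prompt, image=product_photo, strength=0.6).images[0]
edited.save("product_edited.jpg")
```

The point is less the specific model and more the interface: the user supplies a photo and a sentence, and the heavy lifting happens behind a chat box.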
Or take Cleo, the Gen Z financial advisor app, which previously relied on traditional natural language processing (NLP) to assist users. While this was an acceptable solution, it had its limits: customers had to put up with rather robotic, unadaptable responses to their queries. According to Cleo CEO Barney Hussey-Yeo, usage of Cleo has increased rapidly as the company has adopted state-of-the-art LLMs, which let them “introduce relevant products in conversations and answer detailed multi-faceted questions to reassure customers.” In both cases, LLMs allowed users who may not have converted before to find relevant products and get answers to their questions.
This trend is not only relevant to the consumer and prosumer products mentioned above. Jamie Cuffe from Retool recently discussed the “blank canvas” problem faced by many SaaS tools. The challenge arises when potential customers adopt a new tool like Notion, Figma, or Retool: they know they can build anything in theory, but they don’t know where to start, and they churn early as a result. This usually means that only customers with a common use case or a clear pain point make it through, often with the help of templates or solutions architects.
However, because LLMs can convert natural language queries into executable code, the time to value for new customers who have never tried the product can be significantly reduced. Retool recently launched Workflows AI to accomplish exactly this, letting customers quickly set up a demo workflow and ask real-time questions of an AI “agent” alongside it. Once again, customers with no experience of similar low-code tools are suddenly within their TAM.
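The underlying pattern is simple to sketch. The snippet below is not how Retool’s Workflows AI is built; it is a generic illustration of natural-language-to-code translation using the OpenAI Python SDK, with an assumed model name and a hypothetical table schema.

```python
# Sketch: translate a natural-language request into an executable query.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SCHEMA = "orders(id, customer_id, total, created_at)"  # hypothetical schema
request = "Show me last week's ten largest orders"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[
        {"role": "system",
         "content": f"Translate the user's request into a single SQL query "
                    f"against this schema: {SCHEMA}. Return only SQL."},
        {"role": "user", "content": request},
    ],
)

sql = response.choices[0].message.content
print(sql)  # in a real product, the generated query would be reviewed and run by the workflow
```

The user never sees the SQL unless they want to; they just describe the outcome and get a working starting point instead of a blank canvas.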
This trend even applies to developer tooling. I have barely used the terminal or programmed in Python for years. However, after installing Open Interpreter, a locally hosted, open-source take on OpenAI’s Code Interpreter that can run on Llama 2, I can type natural language queries such as “make a list of the meetings I had in my calendar last week and put them into a to-do list” and have the code generated and executed on my Mac. Suddenly, AppleScript and Python packages I have never used, like pandas and Beautiful Soup, become tools I can adopt. The TAM for developer-focused tools just got a whole lot bigger.
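For a sense of what that feels like, here is a hypothetical illustration of the kind of script a query like the one above might generate and run. The calendar export path and column names are assumptions for the sake of the example; the real generated code would depend on what the tool finds on the machine.

```python
# Hypothetical example of model-generated code: last week's meetings -> to-do list.
import pandas as pd

# Assume the calendar has been exported to CSV with 'title' and 'start' columns
events = pd.read_csv("calendar_export.csv", parse_dates=["start"])

# Keep only last week's meetings (Monday to Sunday before the current week)
today = pd.Timestamp.today().normalize()
last_week_start = today - pd.Timedelta(days=today.dayofweek + 7)
last_week_end = last_week_start + pd.Timedelta(days=7)
meetings = events[(events["start"] >= last_week_start) & (events["start"] < last_week_end)]

# Write them out as a simple to-do list
with open("todo.md", "w") as f:
    for title in meetings.sort_values("start")["title"]:
        f.write(f"- [ ] Follow up: {title}\n")
```

The user never has to know pandas exists; they only have to describe the outcome they want.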
Of course, the examples given are only a tiny aperture into how GPT models and agents are going to change our relationship with products, but they are an important starting point.
For the past decade, much startup advice has emphasised the importance of focus. Find a niche customer profile, prove product-market fit, and test distribution channels until the economics make sense. This advice still holds true, but with the general availability of increasingly powerful LLMs, a TAM-grenade has been thrown into your product roadmap. The teams that capitalise on this opportunity will be able to scale faster than ever before, bringing users to value more quickly.