As a Product Manager
Once we identified a business problem to solve with LLMs, a few natural integration points emerged. We then had to make product decisions about which parts of the user journey to enhance.
LLM-generated output is a great way to solve the cold-start problem users face on your platform.
For us, that was “what SQL query can I write on my data”. By adding a tab where our AI suggests queries and data trends to explore during onboarding, we improved conversion and the stickiness of downstream features.
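One way to picture that onboarding tab is a prompt built from the user's table schemas. This is a minimal sketch under assumptions: the function name, schema shape, and prompt wording are illustrative, not our production code.

```python
# Hypothetical sketch: render a user's table schemas into a cold-start
# prompt asking an LLM for starter queries. Illustrative names only.

def build_suggestion_prompt(schemas: dict[str, list[str]], n: int = 3) -> str:
    """Build a prompt asking for n exploratory queries over the given tables."""
    schema_lines = [
        f"- {table}({', '.join(columns)})" for table, columns in schemas.items()
    ]
    return (
        "You are a SQL assistant. Given these tables:\n"
        + "\n".join(schema_lines)
        + f"\n\nSuggest {n} useful exploratory SQL queries, one per line."
    )

prompt = build_suggestion_prompt(
    {
        "orders": ["id", "user_id", "total", "created_at"],
        "users": ["id", "email", "signed_up_at"],
    }
)
print(prompt)
```

The payoff of grounding the prompt in real schema metadata is that suggestions reference tables the user actually has, which makes the first click far more likely to succeed.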
LLM-generated output is also great at helping your users work through your core product quickly and efficiently.
We added our AI SQL Copilot right on the page where users were creating, modifying and saving their SQL queries for maximum impact.
The more constrained the inputs to your LLM, the less you run into hallucinations and other typical failure modes. This makes LLMs good debugging assistants.
We've had success integrating LLMs into workflows where we expect the user to be debugging SQL queries as part of the AI SQL Copilot. Debugging is a perfect fit: hunting down a missing semicolon is frustrating for a human but easy for a machine.
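Constraining the input can be as simple as showing the model only the failing query and the database error, and restricting what it may return. A minimal sketch, assuming a hypothetical prompt-builder (the function name and wording are not from our actual product):

```python
# Hypothetical sketch of a constrained debugging prompt: the model sees
# only the failing query and the database error, and is told to return
# a corrected query rather than free-form text. Illustrative names only.

def build_debug_prompt(query: str, error: str) -> str:
    """Constrain the LLM to fixing one failing query, given its error."""
    return (
        "You are a SQL debugger. Fix ONLY the error in the query below.\n"
        "Return the corrected query and nothing else.\n\n"
        f"Query:\n{query}\n\n"
        f"Database error:\n{error}"
    )

print(
    build_debug_prompt(
        "SELECT id, email FROM users WHERE signed_up_at > '2024-01-01",
        "unterminated quoted string at or near \"'2024-01-01\"",
    )
)
```

Because the model's scope is narrowed to one query and one known error, there is little room to hallucinate tables or invent requirements, which is what makes the debugging case so reliable.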