Guide to building your first LLM-powered app

On Pricing LLM-powered features



Most companies, arguably OpenAI included, are struggling with how to price their LLM features.

There are many options: you can use AI features as an opportunity to go down-market, or reserve them for your Enterprise customers.

You could eat the cost, or pass it on to your customers. You could offer these features at cost, or at a premium if you add value of your own on top of the underlying model.
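To make that trade-off concrete, here is a minimal back-of-envelope sketch. All of the numbers (requests per user, token counts, per-token rates) are hypothetical assumptions for illustration, not real provider pricing; plug in your own usage data and model rates.

```python
def monthly_llm_cost_per_user(
    requests_per_user: int = 200,           # assumed monthly usage per active user
    avg_prompt_tokens: int = 1_500,          # assumed average input size
    avg_completion_tokens: int = 500,        # assumed average output size
    prompt_price_per_1k: float = 0.01,       # assumed $ per 1K input tokens
    completion_price_per_1k: float = 0.03,   # assumed $ per 1K output tokens
) -> float:
    """Estimate what one active user costs you in model fees per month."""
    per_request = (
        avg_prompt_tokens / 1000 * prompt_price_per_1k
        + avg_completion_tokens / 1000 * completion_price_per_1k
    )
    return requests_per_user * per_request


cost = monthly_llm_cost_per_user()
print(f"Estimated model cost per user/month: ${cost:.2f}")

# The options above, expressed as a price uplift and margin per user:
print(f"Eat the cost:  charge $0.00 extra, margin ${-cost:.2f}")
print(f"At cost:       charge ${cost:.2f} extra, margin $0.00")
print(f"Premium (3x):  charge ${cost * 3:.2f} extra, margin ${cost * 2:.2f}")
```

With these assumed numbers each active user costs about $6/month in model fees, which is the kind of figure you need before deciding which branch of the decision tree applies to you.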

Here's the decision tree we used; it might help you:

[Figure: LogicLoop Pricing AI Feature Decision Tree]