There's a pricing pattern spreading through AI-powered travel software that's worth understanding before you sign a contract.
Most platforms that use large language models from providers such as Anthropic or OpenAI to power their AI features have a straightforward commercial choice to make: pass those compute costs through to customers at cost, or bundle them into the platform fee at a markup. The second option is more common. The platform buys tokens wholesale, sells them retail, and takes margin on every AI interaction your team or your travelers generate. Clean model. Logical. Also worth scrutinizing.
The problem with token markup
When AI compute is embedded in your platform fee at a margin, a few things follow.
First, your costs scale with AI usage in a way that's opaque. You're not paying for infrastructure you can see and audit — you're paying a blended rate that includes someone else's margin on top of the underlying cost. As your volume grows, so does the embedded markup, invisibly.
Second, the vendor's commercial incentive is misaligned with yours. A platform making margin on tokens has a reason to maximize AI usage, not to optimize it. Every interaction, necessary or not, contributes to their revenue.
Third, it creates commodity risk. The underlying models are Anthropic's or OpenAI's, not the platform's. A competitor willing to take a thinner margin on the same infrastructure can undercut on price without changing the product. The platform that built its business model on token resale finds itself in a margin war over someone else's technology.
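To see how the embedded markup scales, here is a back-of-envelope sketch. Every number is an assumption for illustration (a wholesale rate and a 50% platform margin), not any vendor's actual pricing:

```python
# Hypothetical illustration of how an embedded token markup grows with usage.
# WHOLESALE_PER_M and MARKUP are assumed figures, not real vendor rates.

WHOLESALE_PER_M = 3.00   # assumed provider price per million tokens, in dollars
MARKUP = 0.50            # assumed 50% platform margin embedded on compute

def pass_through_cost(million_tokens: float) -> float:
    """Compute billed at cost, with no margin on top."""
    return million_tokens * WHOLESALE_PER_M

def marked_up_cost(million_tokens: float) -> float:
    """The same compute with the platform's margin blended in."""
    return pass_through_cost(million_tokens) * (1 + MARKUP)

# As monthly volume grows 10x, the invisible margin grows 10x with it.
for volume in (10, 100, 1000):  # millions of tokens per month
    margin_paid = marked_up_cost(volume) - pass_through_cost(volume)
    print(f"{volume:>5}M tokens/month: embedded margin = ${margin_paid:,.2f}")
```

The point of the sketch: under a blended rate, the margin line never appears on an invoice, but it rises in lockstep with every additional interaction.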
How Acai structures its pricing
Acai passes AI compute costs through to customers at cost, transparently, with no markup. The invoice shows what the infrastructure actually costs to run your workflows. Not a profit center but a utility line.
What Acai charges for is the platform we built: the orchestration layer that connects natively to GDS systems (Sabre, Amadeus, Travelport+) and NDC content aggregators, reads live PNR data, checks fare rules in real time, cross-references traveler profiles against corporate travel policies, and takes autonomous action at the moment it's needed. That's the capability that required years to build and can't be replicated by adjusting a token margin.
The test worth running on any AI vendor: if the major LLM providers cut token prices by 50% tomorrow, what happens to your pricing?
If their costs go down automatically, they're in the token resale business. If nothing changes because you're paying for the platform they built, that's a different conversation.
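The test can be made concrete with the same kind of back-of-envelope math. All figures here are illustrative assumptions (a flat platform fee, a fixed monthly volume, a wholesale rate that gets halved), not quotes from any vendor:

```python
# Hypothetical sketch of the "50% price cut" test. Every number is assumed.

PLATFORM_FEE = 2000.00   # assumed flat monthly platform fee, in dollars
VOLUME = 100             # assumed monthly usage, in millions of tokens

def pass_through_bill(wholesale_per_m: float) -> float:
    # Platform fee plus compute at cost: a provider price cut
    # flows straight through to the customer's invoice.
    return PLATFORM_FEE + VOLUME * wholesale_per_m

def token_resale_bill(wholesale_per_m: float, markup: float = 0.50) -> float:
    # Compute resold at a margin: the vendor captures part of the cut
    # unless it actively chooses to reprice.
    return VOLUME * wholesale_per_m * (1 + markup)

# Provider cuts token prices from $3.00 to $1.50 per million tokens.
savings = pass_through_bill(3.00) - pass_through_bill(1.50)
print(f"Pass-through customer saves ${savings:,.2f}/month automatically")
```

Under pass-through pricing, the full compute saving lands on your invoice the day the provider reprices. Under resale pricing, what you pay depends on a decision the vendor gets to make.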
A second question worth asking: could you bring your own AI model and still find the platform worth paying for?
At Acai, the answer is yes. Customers who find better LLM pricing elsewhere can bring their own model. The content connectivity, the reservation parsing, the fare rules engine, the policy logic, the action layer: none of that changes. What they're paying for was never the compute sitting underneath. It was always the system built around it.
Compute is infrastructure. Treat it like one.