OpenAI
LLM APIs for spikes — not silent production dependencies.
Overview
Prototyping and tooling with clear boundaries
ChatGPT and the OpenAI API can shorten exploration for internal tools and proofs of concept. Keys stay scoped, usage is logged where needed, and customer-facing behavior is not quietly tied to unreviewed prompts.
Where it shows up
Appropriate uses
Internal assistants
Support macros, draft generators, and ops helpers with access controls and audit trails.
Product spikes
Time-boxed prototypes to validate UX before committing to model choice and cost.
Developer productivity
Explaining errors and suggesting fixes — always verified against your codebase.
Practice
Controls we expect
- Separate keys and environments for experiment vs production.
- Prompt and output retention policies aligned to GDPR and your security review.
- Fallbacks when APIs rate-limit or change behavior.
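The first and third controls can be sketched in a few lines. This is a minimal illustration, not our implementation: the environment-variable names and the retry parameters are hypothetical, and the wrapper works with any callable rather than a specific SDK.

```python
import os
import random
import time


def api_key_for(env: str) -> str:
    """Return the key scoped to one environment.

    Experiment and production never share a key; the variable
    names below are examples, not a required convention.
    """
    var = {
        "experiment": "OPENAI_API_KEY_EXPERIMENT",
        "production": "OPENAI_API_KEY_PRODUCTION",
    }[env]
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"missing {var}; keys are scoped per environment")
    return key


def call_with_fallback(call, fallback, retries=3, base_delay=1.0):
    """Retry `call` with exponential backoff and jitter.

    If the API keeps rate-limiting or erroring, return `fallback()`
    (e.g. a cached or canned response) instead of failing the tool.
    """
    for attempt in range(retries):
        try:
            return call()
        except Exception:  # e.g. a rate-limit or transient API error
            time.sleep(base_delay * (2 ** attempt + random.random()))
    return fallback()
```

In use, the model call goes in `call` and a degraded-but-safe behaviour in `fallback`, so a rate-limited API degrades an internal tool rather than breaking it.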

Next step
Product AI is a different conversation
For customer-facing AI features and automation, see our AI Solutions — here we focus on delivery tooling.