AI-FIRST DATA PIPELINE

Warehouse your LLM logs.

Store your LLM requests, responses, and parameters. Analyze, optimize, and fine-tune your AI features.

Optimize your AI features

WAREHOUSE
Store logs
QUERY
Analyze data
OPTIMIZE
Test & train
Data pipeline from OpenAI to Postgres
Use Velvet

Implement a trusted evaluation loop

Warehouse LLM logs

Store your requests, responses, and parameters to Postgres.

Analyze data

Natively query costs, performance, features, endpoints, and models.

Forward logs

Send raw LLM logs to any platform your team wants to use.

Automate evaluations

Experiment with prompts, RAG, and models to optimize responses.

Scale infrastructure

Automatic scaling, rate limiting, caching, and error handling.

Fine-tune models

Use your data to fine-tune your own model when you're ready, as sketched below.
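When you reach that last step, the hand-off from warehoused logs to a fine-tuned model can look roughly like this in Python. The export file name and its contents are hypothetical placeholders; only the OpenAI file-upload and fine-tuning calls are standard SDK usage.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Assume you've already queried your warehouse and exported a filtered set of
# good request/response pairs as JSONL chat examples ({"messages": [...]} per
# line). The file name is a hypothetical placeholder.
training_file = client.files.create(
    file=open("exported_logs.jsonl", "rb"),
    purpose="fine-tune",
)

# Start a fine-tuning job on that dataset.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",
)
print(job.id, job.status)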

CAPTURE

Log every LLM request & response

Capture every raw LLM log at scale. Use data to evaluate performance, trace problems, optimize cost, and fine-tune your own models.
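In practice, capturing every log usually means pointing your existing OpenAI client at a logging gateway instead of calling the API directly. A minimal Python sketch of that pattern is below; the gateway URL and header name are illustrative assumptions, so check Velvet's docs for the exact values.

import os
from openai import OpenAI

# Route OpenAI calls through a logging gateway so every request, response,
# and parameter is captured. The base URL and header below are hypothetical
# placeholders, not Velvet's documented values.
client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://gateway.example-velvet.com/v1",
    default_headers={"x-velvet-api-key": os.environ["VELVET_API_KEY"]},
)

# The call behaves exactly like a normal OpenAI request; the gateway records
# the raw log (request, response, latency, parameters) before returning.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize this support ticket."}],
)
print(response.choices[0].message.content)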

STORE

Warehouse logs to your database

Store LLM requests, responses, and parameters to your own PostgreSQL instance. Get a queryable table to analyze and optimize over time.
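For a concrete sense of what the queryable table holds, here is a hypothetical shape for one warehoused row. The column names and JSONB layout are assumptions for illustration, not Velvet's actual schema.

# Hypothetical shape of one warehoused log row (column names are assumptions).
example_row = {
    "id": "log_0192",                          # unique log id
    "created_at": "2025-01-15T18:04:11Z",      # when the request was made
    "provider": "openai",
    "endpoint": "/v1/chat/completions",
    "model": "gpt-4o-mini",
    "request": {                               # raw request body, stored as JSONB
        "messages": [{"role": "user", "content": "Summarize this support ticket."}],
        "temperature": 0.2,
    },
    "response": {                              # raw response body, stored as JSONB
        "choices": [{"message": {"role": "assistant", "content": "The customer reports..."}}],
        "usage": {"prompt_tokens": 42, "completion_tokens": 180, "total_tokens": 222},
    },
    "latency_ms": 930,
}
print(example_row["model"], example_row["latency_ms"], "ms")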

QUERY

Use data to optimize your AI features

Unlock an evaluation loop to build consistent and trustworthy features. Resolve issues, evaluate models, and train AI specific to your system.
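As a hedged example of the kind of analysis this unlocks, the snippet below aggregates request volume, latency, and token usage per model straight from Postgres. The table and column names (llm_logs, latency_ms, the usage path) are assumptions; adapt them to the schema you actually get.

import psycopg2

# Connection string is a placeholder -- point it at the Postgres instance
# where your logs are warehoused.
conn = psycopg2.connect("postgresql://user:password@localhost:5432/llm_logs")

# Aggregate request volume, latency, and token usage per model over the last
# 7 days. Table and column names are assumptions; adjust to your schema.
query = """
    SELECT
        model,
        count(*) AS requests,
        avg(latency_ms) AS avg_latency_ms,
        sum((response -> 'usage' ->> 'total_tokens')::int) AS total_tokens
    FROM llm_logs
    WHERE created_at > now() - interval '7 days'
    GROUP BY model
    ORDER BY total_tokens DESC;
"""

with conn, conn.cursor() as cur:
    cur.execute(query)
    for model, requests, avg_latency, total_tokens in cur.fetchall():
        print(f"{model}: {requests} requests, {avg_latency:.0f} ms avg, {total_tokens} tokens")
conn.close()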

AI-FIRST DATA PIPELINE

Warehouse LLM requests, optimize AI features.

Try Velvet for free

Q & A

Who is Velvet made for?
How do I get started?
Which models and DBs do you support?
What are common use cases?
How much does it cost?

Articles to learn more

Engineering
Create a fine-tuning dataset for gpt-4o-mini

Use Velvet to identify and export a fine-tuning dataset.

Engineering
Why Find AI logs OpenAI requests with Velvet

The AI-powered B2B search engine logged 1,500 requests per second.

Product
How we use AI to automate our data copilot

Lessons learned using LLMs to automate data workflows.