Streaming Output

Building on the previous tutorial, instead of waiting for the whole reasoning to complete, you can stream intermediate tokens as they are generated.

import os
from openai import OpenAI

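# The Activeloop API is OpenAI-compatible, so the standard OpenAI client can be pointed at its base URL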
client = OpenAI(
    base_url="https://api.activeloop.ai/",
    api_key=os.getenv('ACTIVELOOP_TOKEN')
)

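# stream=True returns an iterator of incremental chunks instead of a single response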
stream = client.chat.completions.create(
    model="activeloop-l0",
    messages=[{"role": "user", "content": "what is the AIME score of DeepSeek R1?"}],
    stream=True,
)

for event in stream:
    # Each chunk carries an incremental piece of the answer in choices[0].delta
    delta = event.choices[0].delta.content if event.choices else None
    if delta:
        print(delta, end="", flush=True)
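
If you also want the complete answer once streaming finishes, you can collect the deltas while printing them. A minimal sketch, reusing the client from the snippet above:

chunks = []
stream = client.chat.completions.create(
    model="activeloop-l0",
    messages=[{"role": "user", "content": "what is the AIME score of DeepSeek R1?"}],
    stream=True,
)

for event in stream:
    delta = event.choices[0].delta.content if event.choices else None
    if delta:
        chunks.append(delta)
        print(delta, end="", flush=True)

# Full response text, assembled from the streamed pieces
full_text = "".join(chunks)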