Quickstart

Upload a document and run a query. All within minutes.

Get Token

Sign up at chat.activeloop.ai. Navigate to ⚙️ → API tokens and create a token. Set the token as the ACTIVELOOP_TOKEN environment variable.
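As a quick sanity check (not part of the Activeloop API), you can confirm the variable is visible to your Python session before making any calls:

import os

# Fail early with a clear message if the token was not exported
assert os.getenv("ACTIVELOOP_TOKEN"), "Set the ACTIVELOOP_TOKEN environment variable first"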

Upload Documents

We will fetch four rocket reference guides (from NASA and ULA), each more than 90 pages long, and ask a highly complex question about them.

import os, io, requests

pdf_urls = ["https://www.nasa.gov/wp-content/uploads/2022/03/sls-reference-guide-2022-v2-508-0.pdf",
            "https://www.nasa.gov/wp-content/uploads/2023/02/orion-reference-guide-111022.pdf", 
            "https://www.lpi.usra.edu/lunar/artemis/Artemis-I-Reference-Guide_NP-2022-03-3045-HQ.pdf",
            "https://www.ulalaunch.com/docs/default-source/rockets/2023_vulcan_user_guide.pdf"]

# Download each PDF into memory and build multipart form fields for the upload
files = [('file', (os.path.basename(url), io.BytesIO(requests.get(url).content))) for url in pdf_urls]

response = requests.post(
    'https://api.activeloop.ai/files',
    headers={"Authorization": f"Bearer {os.getenv('ACTIVELOOP_TOKEN')}"},
    files=files
)
# Once uploaded, indexing takes a few minutes

You can check the indexing state by listing files.
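As a rough sketch, assuming the same /files endpoint also accepts GET requests for listing (see the Files API reference for the exact response shape):

import os, requests

# List uploaded files to see whether indexing has finished;
# the response schema is not shown here, so we simply dump it
listing = requests.get(
    'https://api.activeloop.ai/files',
    headers={"Authorization": f"Bearer {os.getenv('ACTIVELOOP_TOKEN')}"},
)
listing.raise_for_status()
print(listing.json())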

Query Data

Once the data is indexed, you can run a query against it as if you were calling your LLM with the standard OpenAI client.

import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.activeloop.ai/",
    api_key=os.getenv('ACTIVELOOP_TOKEN')
)

response = client.chat.completions.create(
    model="activeloop-l0",
    messages=[
        {
            "role": "user",
            "content": "Using the side-view diagrams that annotate overall height, rank SLS Block 1, Orion (CM + SM), " +
                       "Falcon 9 (v1.2 FT), and Vulcan Centaur by height; which vehicle is the second tallest, " +
                       "and what is its annotated height (m, one decimal place)?"
        }
    ],
    stream=True
)

# Collect the streamed choices, then separate the reasoning stream from the final answer
chunks = [chunk.choices[0] for chunk in response]
thinking = "".join(c.delta.reasoning_content for c in chunks if c.delta.reasoning_content is not None)
answer = "".join(c.delta.content for c in chunks if c.delta.content is not None)
# The last chunk carries metadata with the documents the answer was grounded in
citations = chunks[-1].metadata['relevant_docs']
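To inspect the result, you can print the pieces you collected; the structure of each citation entry comes from the API response, so the loop below just prints the entries as-is:

print("Reasoning:\n", thinking)
print("\nAnswer:\n", answer)
print("\nCitations:")
for doc in citations:
    # Each entry's exact fields depend on the API; inspect one to see what is available
    print(doc)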