A Thread represents a group of interactions between a user and an AI app. It can be, for example, a conversation between a human and one or more assistants, or an agent generating a document piece by piece from user input.

A Thread is composed of Steps. A Step can be a message, or an intermediate AI step like an LLM call.
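
Steps can be nested: a run step can wrap an LLM step, and both attach to the active thread. The snippet below is a minimal sketch of that nesting, assuming the same step decorator also accepts type="llm"; the function names are illustrative.

import os
from literalai import LiteralClient

client = LiteralClient(api_key=os.getenv("LITERAL_API_KEY"))

@client.step(type="llm")
def call_llm(prompt: str):
    # Replace with a real LLM call; this step is logged as a child
    # of the enclosing run step.
    return "LLM completion"

@client.step(type="run")
def run_agent(user_input: str):
    return call_llm(user_input)

with client.thread(name="Nested Steps Example"):
    run_agent("Hello")

client.flush_and_stop()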

Log a Thread

import os
import time
from literalai import LiteralClient

client = LiteralClient(api_key=os.getenv("LITERAL_API_KEY"))

@client.step(type="run")
def my_assistant(input: str):
    # Implement your assistant logic here
    time.sleep(1)
    response = "My assistant response"
    client.message(content=response, type="assistant_message", name="Assistant")
    return response

def main():
    # You can also continue an existing thread by passing its id
    # (see the sketch after this example)
    with client.thread(name="Thread Example") as thread:
        print(thread.id)
        user_query = "Hello World"
        client.message(content=user_query, type="user_message", name="User")
        my_assistant(user_query)

        # Let's say the user has a follow-up question
        follow_up_query = "Follow up!"
        client.message(content=follow_up_query, type="user_message", name="User")
        my_assistant(follow_up_query)

main()
# The SDK sends its network requests asynchronously.
# Call flush_and_stop() to make sure all pending requests complete before the process exits.
# WARNING: do not call this method if you run a long-running server.
client.flush_and_stop()
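
As noted in the comment above, a thread can be resumed later. Here is a minimal sketch, assuming client.thread() accepts a thread_id keyword argument; the stored id below is illustrative.

# Resume a previously created thread by reusing its id.
saved_thread_id = "..."  # e.g. the thread.id printed in the first run

with client.thread(name="Thread Example", thread_id=saved_thread_id):
    client.message(content="Back again!", type="user_message", name="User")
    my_assistant("Back again!")

client.flush_and_stop()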

Visualize the Thread on Literal

Navigate to the Threads page on the platform to see the thread you just created.

Output on the platform