Weave by Weights & Biases


Weave is a toolkit for developing Generative AI applications, built by Weights & Biases.


You can use Weave to:

  • Log and debug language model inputs, outputs, and traces
  • Build rigorous, apples-to-apples evaluations for language model use cases (a short sketch follows this list)
  • Organize all the information generated across the LLM workflow, from experimentation to evaluations to production
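To make the second point concrete, here is a minimal sketch of an evaluation. It assumes the weave.Evaluation API (the evaluations code lives in weave/flow); the toy model and exact_match functions are illustrative, and the scorer argument name model_output may differ between Weave versions, so check the documentation for your installed release.

import asyncio
import weave
from weave import Evaluation

weave.init("intro-example")

# A tiny dataset; each row's keys are passed as keyword arguments
# to the model and to the scorers.
examples = [
    {"question": "What color are neoskizzles?", "expected": "purple"},
    {"question": "What do neoskizzles taste like?", "expected": "candy"},
]

@weave.op()
def model(question: str) -> str:
    # Stand-in for a real LLM call; swap in your own model here.
    return "purple" if "color" in question else "candy"

@weave.op()
def exact_match(expected: str, model_output: str) -> dict:
    # Compare the model's answer to the expected answer for this row.
    return {"correct": expected == model_output}

evaluation = Evaluation(dataset=examples, scorers=[exact_match])
asyncio.run(evaluation.evaluate(model))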

Our goal is to bring rigor, best practices, and composability to the inherently experimental process of developing Generative AI software, without introducing cognitive overhead.

Documentation

Our documentation site can be found here

Installation

pip install weave

Usage

Tracing

You can trace any function using weave.op(): API calls to OpenAI, Anthropic, or Google AI Studio; generation calls from Hugging Face and other open-source models; or any validation functions and data transformations in your code you'd like to keep track of.

Decorate all the functions you want to trace; this will generate a trace tree of the inputs and outputs of all your functions:

import weave
weave.init("weave-example")

@weave.op()
def sum_nine(value_one: int):
    return value_one + 9

@weave.op()
def multiply_two(value_two: int):
    return value_two * 2

@weave.op()
def main():
    output = sum_nine(3)
    final_output = multiply_two(output)
    return final_output

main()
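
Running this logs the calls to the weave-example project: main appears as the root call, with sum_nine and multiply_two nested beneath it, and Weave prints a link you can follow to inspect the trace in the web UI.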

Fuller Example

import weave
import json
from openai import OpenAI

@weave.op()
def extract_fruit(sentence: str) -> dict:
    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-3.5-turbo-1106",
        messages=[
            {
                "role": "system",
                "content": "You will be provided with unstructured data, and your task is to parse it into one JSON dictionary with fruit, color and flavor as keys."
            },
            {
                "role": "user",
                "content": sentence
            }
        ],
        temperature=0.7,
        response_format={"type": "json_object"}
    )
    extracted = response.choices[0].message.content
    return json.loads(extracted)

weave.init('intro-example')

sentence = "There are many fruits that were found on the recently discovered planet Goocrux. There are neoskizzles that grow there, which are purple and taste like candy."

extract_fruit(sentence)
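
Because Weave ships an integration for the OpenAI client, the underlying chat.completions.create call should also be captured automatically and appear nested under extract_fruit in the trace.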

Contributing

Interested in pulling back the hood or contributing? Awesome! Before you dive in, here's what you need to know.

We're in the process of 🧹 cleaning up 🧹. This codebase contains a large amount of code for the "Weave engine" and "Weave boards", which we've put on pause as we focus on Tracing and Evaluations.

The Weave Tracing code is mostly in: weave/trace and weave/trace_server.

The Weave Evaluations code is mostly in weave/flow.
