
Harrison/ver0089 #1144

Merged 4 commits on Feb 18, 2023
6 changes: 6 additions & 0 deletions docs/modules/llms/integrations.rst
@@ -3,6 +3,12 @@ Integrations

The examples here are all "how-to" guides for integrating with various LLM providers.

`OpenAI <./integrations/openai.html>`_: Covers how to connect to OpenAI models.

`Cohere <./integrations/cohere.html>`_: Covers how to connect to Cohere models.

`AI21 <./integrations/ai21.html>`_: Covers how to connect to AI21 models.

`Hugging Face Hub <./integrations/huggingface_hub.html>`_: Covers how to connect to LLMs hosted on the Hugging Face Hub.

`Azure OpenAI <./integrations/azure_openai_example.html>`_: Covers how to connect to Azure-hosted OpenAI models.
99 changes: 99 additions & 0 deletions docs/modules/llms/integrations/ai21.ipynb
@@ -0,0 +1,99 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "9597802c",
"metadata": {},
"source": [
"# Cohere\n",
"This example goes over how to use LangChain to interact with Cohere models"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "6fb585dd",
"metadata": {},
"outputs": [],
"source": [
"from langchain.llms import AI21\n",
"from langchain import PromptTemplate, LLMChain"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "035dea0f",
"metadata": {},
"outputs": [],
"source": [
"template = \"\"\"Question: {question}\n",
"\n",
"Answer: Let's think step by step.\"\"\"\n",
"\n",
"prompt = PromptTemplate(template=template, input_variables=[\"question\"])"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "3f3458d9",
"metadata": {},
"outputs": [],
"source": [
"llm = AI21()"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "a641dbd9",
"metadata": {},
"outputs": [],
"source": [
"llm_chain = LLMChain(prompt=prompt, llm=llm)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9f0b1960",
"metadata": {},
"outputs": [],
"source": [
"question = \"What NFL team won the Super Bowl in the year Justin Beiber was born?\"\n",
"\n",
"llm_chain.run(question)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "22bce013",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
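The notebook above instantiates `AI21()` with no arguments, so it presumes credentials are already configured. A minimal setup sketch, assuming the wrapper reads an `AI21_API_KEY` environment variable (an explicit `ai21_api_key` argument may also be accepted) and that `temperature` is a supported parameter:

```python
import os

from langchain.llms import AI21

# Assumption: the AI21 wrapper reads AI21_API_KEY from the environment
# (an explicit ai21_api_key argument may also be accepted).
os.environ["AI21_API_KEY"] = "<your-ai21-key>"  # hypothetical placeholder

llm = AI21(temperature=0.7)  # temperature assumed to be a supported parameter
print(llm("Explain what a prompt template is in one sentence."))
```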
110 changes: 110 additions & 0 deletions docs/modules/llms/integrations/cohere.ipynb
@@ -0,0 +1,110 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "9597802c",
"metadata": {},
"source": [
"# Cohere\n",
"This example goes over how to use LangChain to interact with Cohere models"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "6fb585dd",
"metadata": {},
"outputs": [],
"source": [
"from langchain.llms import Cohere\n",
"from langchain import PromptTemplate, LLMChain"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "035dea0f",
"metadata": {},
"outputs": [],
"source": [
"template = \"\"\"Question: {question}\n",
"\n",
"Answer: Let's think step by step.\"\"\"\n",
"\n",
"prompt = PromptTemplate(template=template, input_variables=[\"question\"])"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "3f3458d9",
"metadata": {},
"outputs": [],
"source": [
"llm = Cohere()"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "a641dbd9",
"metadata": {},
"outputs": [],
"source": [
"llm_chain = LLMChain(prompt=prompt, llm=llm)"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "9f844993",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"\" Let's start with the year that Justin Beiber was born. You know that he was born in 1994. We have to go back one year. 1993.\\n\\n1993 was the year that the Dallas Cowboys won the Super Bowl. They won over the Buffalo Bills in Super Bowl 26.\\n\\nNow, let's do it backwards. According to our information, the Green Bay Packers last won the Super Bowl in the 2010-2011 season. Now, we can't go back in time, so let's go from 2011 when the Packers won the Super Bowl, back to 1984. That is the year that the Packers won the Super Bowl over the Raiders.\\n\\nSo, we have the year that Justin Beiber was born, 1994, and the year that the Packers last won the Super Bowl, 2011, and now we have to go in the middle, 1986. That is the year that the New York Giants won the Super Bowl over the Denver Broncos. The Giants won Super Bowl 21.\\n\\nThe New York Giants won the Super Bowl in 1986. This means that the Green Bay Packers won the Super Bowl in 2011.\\n\\nDid you get it right? If you are still a bit confused, just try to go back to the question again and review the answer\""
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"question = \"What NFL team won the Super Bowl in the year Justin Beiber was born?\"\n",
"\n",
"llm_chain.run(question)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4797d719",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
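As in the AI21 notebook, `Cohere()` assumes an API key is already available. A sketch of explicit configuration, assuming `cohere_api_key`, `model`, and `temperature` are accepted constructor arguments; the model name is only a placeholder:

```python
from langchain.llms import Cohere

# Assumptions: cohere_api_key, model, and temperature are constructor arguments,
# and "command-xlarge-nightly" is a placeholder model name.
llm = Cohere(
    cohere_api_key="<your-cohere-key>",  # hypothetical placeholder
    model="command-xlarge-nightly",
    temperature=0,
)
print(llm("Name three uses for a paperclip."))
```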
110 changes: 110 additions & 0 deletions docs/modules/llms/integrations/openai.ipynb
@@ -0,0 +1,110 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "9597802c",
"metadata": {},
"source": [
"# OpenAI\n",
"This example goes over how to use LangChain to interact with OpenAI models"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "6fb585dd",
"metadata": {},
"outputs": [],
"source": [
"from langchain.llms import OpenAI\n",
"from langchain import PromptTemplate, LLMChain"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "035dea0f",
"metadata": {},
"outputs": [],
"source": [
"template = \"\"\"Question: {question}\n",
"\n",
"Answer: Let's think step by step.\"\"\"\n",
"\n",
"prompt = PromptTemplate(template=template, input_variables=[\"question\"])"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "3f3458d9",
"metadata": {},
"outputs": [],
"source": [
"llm = OpenAI()"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "a641dbd9",
"metadata": {},
"outputs": [],
"source": [
"llm_chain = LLMChain(prompt=prompt, llm=llm)"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "9f844993",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"' Justin Bieber was born in 1994, so the NFL team that won the Super Bowl in that year was the Dallas Cowboys.'"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"question = \"What NFL team won the Super Bowl in the year Justin Beiber was born?\"\n",
"\n",
"llm_chain.run(question)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4797d719",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
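The OpenAI notebook relies on `OPENAI_API_KEY` being set in the environment. A short sketch of batching several prompts through the same LLM, assuming `model_name` and `temperature` are supported constructor parameters:

```python
from langchain.llms import OpenAI

# Assumes OPENAI_API_KEY is set in the environment; model_name and
# temperature are assumed to be supported constructor parameters.
llm = OpenAI(model_name="text-davinci-003", temperature=0.9)

# generate() takes a list of prompts; generations[i][j] is the j-th
# completion for the i-th prompt.
result = llm.generate(["Tell me a joke.", "Tell me a limerick."])
print(result.generations[0][0].text)
```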
4 changes: 2 additions & 2 deletions langchain/agents/load_tools.py
@@ -133,7 +133,7 @@ def _get_google_search(**kwargs: Any) -> BaseTool:

 def _get_google_serper(**kwargs: Any) -> BaseTool:
     return Tool(
-        name="Search",
+        name="Serper Search",
         func=GoogleSerperAPIWrapper(**kwargs).run,
         description="A low-cost Google Search API. Useful for when you need to answer questions about current events. Input should be a search query.",
     )
@@ -154,7 +154,7 @@ def _get_serpapi(**kwargs: Any) -> BaseTool:

 def _get_searx_search(**kwargs: Any) -> BaseTool:
     return Tool(
-        name="Search",
+        name="SearX Search",
         description="A meta search engine. Useful for when you need to answer questions about current events. Input should be a search query.",
         func=SearxSearchWrapper(**kwargs).run,
     )
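A quick way to check the renamed tools, assuming `"google-serper"` and `"searx-search"` are the registry keys used by `load_tools` and that the underlying wrappers read their credentials (e.g. `SERPER_API_KEY`, `SEARX_HOST`) from the environment:

```python
from langchain.agents import load_tools

# Assumption: these registry keys map to _get_google_serper and _get_searx_search,
# and SERPER_API_KEY / SEARX_HOST are already set in the environment.
tools = load_tools(["google-serper", "searx-search"])

for tool in tools:
    # Expected after this change: "Serper Search" and "SearX Search"
    print(tool.name, "-", tool.description)
```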
8 changes: 6 additions & 2 deletions langchain/tools/bing_search/tool.py
@@ -7,8 +7,12 @@
 class BingSearchRun(BaseTool):
     """Tool that adds the capability to query the Bing search API."""
 
-    name = "bing_search"
-    description = "Execute the Bing search API."
+    name = "Bing Search"
+    description = (
+        "A wrapper around Bing Search. "
+        "Useful for when you need to answer questions about current events. "
+        "Input should be a search query."
+    )
     api_wrapper: BingSearchAPIWrapper
 
     def _run(self, query: str) -> str:
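A minimal sketch of exercising the renamed tool directly, assuming `BingSearchAPIWrapper` reads `BING_SUBSCRIPTION_KEY` and `BING_SEARCH_URL` from the environment:

```python
from langchain.tools.bing_search.tool import BingSearchRun
from langchain.utilities.bing_search import BingSearchAPIWrapper

# Assumption: the wrapper picks up BING_SUBSCRIPTION_KEY and BING_SEARCH_URL
# from the environment, so no arguments are needed here.
tool = BingSearchRun(api_wrapper=BingSearchAPIWrapper())
print(tool.name)  # "Bing Search"
print(tool.run("latest langchain release"))
```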
8 changes: 6 additions & 2 deletions langchain/tools/google_search/tool.py
@@ -7,8 +7,12 @@
 class GoogleSearchRun(BaseTool):
     """Tool that adds the capability to query the Google search API."""
 
-    name = "google_search"
-    description = "Execute the Google search API."
+    name = "Google Search"
+    description = (
+        "A wrapper around Google Search. "
+        "Useful for when you need to answer questions about current events. "
+        "Input should be a search query."
+    )
     api_wrapper: GoogleSearchAPIWrapper
 
     def _run(self, query: str) -> str:
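The Google tool follows the same pattern; a sketch assuming `GoogleSearchAPIWrapper` reads `GOOGLE_API_KEY` and `GOOGLE_CSE_ID` from the environment:

```python
from langchain.tools.google_search.tool import GoogleSearchRun
from langchain.utilities.google_search import GoogleSearchAPIWrapper

# Assumption: GOOGLE_API_KEY and GOOGLE_CSE_ID are set in the environment.
tool = GoogleSearchRun(api_wrapper=GoogleSearchAPIWrapper())
print(tool.description)  # now the longer, agent-friendly description
print(tool.run("current weather in Berlin"))
```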
9 changes: 7 additions & 2 deletions langchain/tools/wolfram_alpha/tool.py
@@ -7,8 +7,13 @@
 class WolframAlphaQueryRun(BaseTool):
     """Tool that adds the capability to query using the Wolfram Alpha SDK."""
 
-    name = "query_wolfram_alpha"
-    description = "Query Wolfram Alpha with the given query."
+    name = "Wolfram Alpha"
+    description = (
+        "A wrapper around Wolfram Alpha. "
+        "Useful for when you need to answer questions about Math, "
+        "Science, Technology, Culture, Society and Everyday Life. "
+        "Input should be a search query."
+    )
     api_wrapper: WolframAlphaAPIWrapper
 
     def _run(self, query: str) -> str:
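And likewise for Wolfram Alpha, a sketch assuming the `wolframalpha` client package is installed and `WOLFRAM_ALPHA_APPID` is set (or an app id is passed to the wrapper):

```python
from langchain.tools.wolfram_alpha.tool import WolframAlphaQueryRun
from langchain.utilities.wolfram_alpha import WolframAlphaAPIWrapper

# Assumption: the wrapper requires the wolframalpha package and reads
# WOLFRAM_ALPHA_APPID from the environment.
tool = WolframAlphaQueryRun(api_wrapper=WolframAlphaAPIWrapper())
print(tool.run("integrate x^2 * sin(x) dx"))
```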