Welcome to our third tutorial, where we will walk you through the process of building your own personal assistant using DeepPavlov Dream. In our previous tutorials, we:
1) developed a bot capable of engaging in conversations about movies and answering factoid questions utilizing existing Dream components without any modifications;
2) created a generative bot with an enthusiastic and adventurous persona by modifying the existing Dream persona distribution.
Now, in this tutorial, we will look at an experimental Dream Reasoning distribution that uses OpenAI ChatGPT to reason about the actions required to handle a user request and to select the appropriate API to complete it. We will first walk through the pipeline of this distribution, and then learn how to add new APIs to it.
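Before touching the code, it helps to picture the core idea. The sketch below is a deliberately simplified illustration of how an LLM can be prompted to pick an API from a registry of descriptions; all names in it are hypothetical, and the real skill drives these steps through ChatGPT prompts inside a DFF scenario rather than this plain function.

```python
# Simplified sketch of the Dream Reasoning idea: given a user request and a
# registry of API descriptions, an LLM is asked to pick the best API by name.
# All names below are illustrative, not the actual skill's internals.

API_REGISTRY = {
    "weather_api": "Answers questions about the current weather in a city.",
    "news_api": "Retrieves recent news headlines on a topic.",
    "arxiv_api": "Searches for the latest arXiv articles on a keyword.",
}

def build_selection_prompt(user_request: str) -> str:
    """Compose a prompt asking the LLM to choose one API by name."""
    options = "\n".join(f"- {name}: {desc}" for name, desc in API_REGISTRY.items())
    return (
        f"User request: {user_request}\n"
        f"Available APIs:\n{options}\n"
        "Reply with the single API name that best handles the request."
    )

def select_api(user_request: str, llm) -> str:
    """Ask the injected LLM backend which API to call; fall back to plain generation."""
    answer = llm(build_selection_prompt(user_request)).strip()
    return answer if answer in API_REGISTRY else "generative_lm"

# A trivial stand-in for the LLM, just to make the sketch runnable:
print(select_api("Any new papers on dialog systems?", lambda prompt: "arxiv_api"))
# -> arxiv_api
```

The key design point is that adding a new API only means adding a description the LLM can choose from — the selection logic itself stays untouched.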
```shell
pip install git+https://github.com/deeppavlov/deeppavlov_dreamtools.git
git clone https://github.com/deeppavlov/dream.git
cd dream
dreamtools clone dist dream_tools \
    --template dream_reasoning \
    --display-name "Dream Tools" \
    --author deepypavlova@email.org \
    --description "This is a copy of Dream Reasoning with arxiv API added." \
    --overwrite
dreamtools add component components/sdjkfhaliueytu34ktkrlg.yml --dist dream_tools
```

```python
import arxiv

from df_engine.core import Context, Actor
from scenario.utils import compose_input_for_API


def arxiv_api_response(ctx: Context, actor: Actor, *args, **kwargs) -> str:
    # Compose the search query from the dialog context
    api_input = compose_input_for_API(ctx, actor)
    search = arxiv.Search(
        query=api_input,
        max_results=3,
        sort_by=arxiv.SortCriterion.SubmittedDate,
    )
    response = ""
    for result in search.results():
        response += f"TITLE: {result.title}.\nSUMMARY: {result.summary}\nLINK: {result}\n\n"
    return response
```

Check that the new function can be imported:

```python
from scenario.api_responses.arxiv_api import arxiv_api_response

assert arxiv_api_response
```

```json
{
    "arxiv_api": {
        "display_name": "arxiv API",
        "description": "arxiv API can search for latest articles on requested keyword.",
        "keys": [],
        "needs_approval": "False",
        "timeout": 30,
        "input_template": "Return only the keyword that is needed to be searched for. E.g., 'ChatGPT', 'dialog systems'."
    }
}
```

Add the dependency to the skill's requirements:

```
arxiv==1.4.7
```

```yaml
dff-reasoning-skill:
  env_file:
    - .env
    - .env_secret
  build:
    args:
      SERVICE_PORT: 8169
      SERVICE_NAME: dff_reasoning_skill
      API_CONFIGS: generative_lm.json,google_api.json,news_api.json,weather_api.json,wolframalpha_api.json,arxiv_api.json
      GENERATIVE_SERVICE_URL: http://openai-api-chatgpt:8145/respond
      GENERATIVE_SERVICE_CONFIG: openai-chatgpt.json
      GENERATIVE_TIMEOUT: 120
      N_UTTERANCES_CONTEXT: 1
      ENVVARS_TO_SEND: OPENAI_API_KEY,GOOGLE_CSE_ID,GOOGLE_API_KEY,OPENWEATHERMAP_API_KEY,NEWS_API_KEY,WOLFRAMALPHA_APP_ID
    context: .
    dockerfile: skills/dff_reasoning_skill/Dockerfile
  command: gunicorn --workers=1 server:app -b 0.0.0.0:8169 --reload
  environment:
    SERVICE_PORT: 8169
    SERVICE_NAME: dff_reasoning_skill
    API_CONFIGS: generative_lm.json,google_api.json,news_api.json,weather_api.json,wolframalpha_api.json,arxiv_api.json
    GENERATIVE_SERVICE_URL: http://openai-api-chatgpt:8145/respond
    GENERATIVE_SERVICE_CONFIG: openai-chatgpt.json
    GENERATIVE_TIMEOUT: 120
    N_UTTERANCES_CONTEXT: 1
    ENVVARS_TO_SEND: OPENAI_API_KEY,GOOGLE_CSE_ID,GOOGLE_API_KEY,OPENWEATHERMAP_API_KEY,NEWS_API_KEY,WOLFRAMALPHA_APP_ID
  deploy:
    resources:
      limits:
        memory: 1.5G
      reservations:
        memory: 1.5G
```
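The `API_CONFIGS` variable above lists the per-API JSON config files the skill should load. As a rough illustration of what such loading could look like, here is a hedged sketch: the function name, directory argument, and skip-if-missing behavior are assumptions for demonstration, not the actual `dff_reasoning_skill` implementation.

```python
import json
import os
import tempfile

# Hypothetical sketch: turn the comma-separated API_CONFIGS variable into a
# single registry dict, skipping any config file that is not present.
# (The real dff_reasoning_skill may implement this differently.)

def load_api_configs(config_dir: str) -> dict:
    registry = {}
    for filename in os.environ.get("API_CONFIGS", "").split(","):
        filename = filename.strip()
        path = os.path.join(config_dir, filename)
        if filename and os.path.exists(path):  # silently skip missing configs
            with open(path) as f:
                registry.update(json.load(f))
    return registry

# Demonstration with a temporary config directory:
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "arxiv_api.json"), "w") as f:
    json.dump({"arxiv_api": {"display_name": "arxiv API"}}, f)

os.environ["API_CONFIGS"] = "arxiv_api.json,missing_api.json"
registry = load_api_configs(tmp)
print(sorted(registry))  # -> ['arxiv_api']  (the missing config is skipped)
```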
```
OPENAI_API_KEY=
GOOGLE_CSE_ID=
GOOGLE_API_KEY=
OPENWEATHERMAP_API_KEY=
NEWS_API_KEY=
WOLFRAMALPHA_APP_ID=
```

If you want to test your distribution with the existing APIs (not only the arxiv API), you also need to add keys for them. Here is the guide on how to get them. However, if you do not provide certain keys, the skill will still function, just without the APIs that require those missing keys.
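The graceful-degradation behavior (APIs with missing keys simply drop out) can be illustrated with a small sketch. The `keys` field mirrors the one in the `arxiv_api.json` example above; the function and registry names here are hypothetical, chosen only to show the filtering idea.

```python
# Hedged sketch: keep only the APIs whose required environment keys are set.
# The "keys" lists follow the arxiv_api.json config format shown earlier;
# the rest is illustrative, not the skill's actual code.

REQUIRED_KEYS = {
    "arxiv_api": [],                             # needs no key at all
    "weather_api": ["OPENWEATHERMAP_API_KEY"],
    "wolframalpha_api": ["WOLFRAMALPHA_APP_ID"],
}

def available_apis(env: dict) -> list:
    """Return the APIs usable with the given environment variables."""
    return [
        name for name, keys in REQUIRED_KEYS.items()
        if all(env.get(k) for k in keys)         # every required key must be set
    ]

print(available_apis({"OPENWEATHERMAP_API_KEY": "secret"}))
# -> ['arxiv_api', 'weather_api']  (wolframalpha_api is filtered out)
```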
```yaml
services:
  sentseg:
    command: ["nginx", "-g", "daemon off;"]
    build:
      context: dp/proxy/
      dockerfile: Dockerfile
    environment:
      - PROXY_PASS=proxy.deeppavlov.ai:8011
      - PORT=8011
  combined-classification:
    command: ["nginx", "-g", "daemon off;"]
    build:
      context: dp/proxy/
      dockerfile: Dockerfile
    environment:
      - PROXY_PASS=proxy.deeppavlov.ai:8087
      - PORT=8087
  sentence-ranker:
    command: ["nginx", "-g", "daemon off;"]
    build:
      context: dp/proxy/
      dockerfile: Dockerfile
    environment:
      - PROXY_PASS=proxy.deeppavlov.ai:8128
      - PORT=8128
version: '3.7'
```

```shell
docker-compose -f docker-compose.yml \
    -f assistant_dists/dream_tools/docker-compose.override.yml \
    -f assistant_dists/dream_tools/proxy.yml up --build
```

Please note that in the command, we also utilize the assistant_dists/dream_tools/proxy.yml configuration. This configuration enables you to conserve your local resources by employing proxied copies of certain services hosted by DeepPavlov.
```shell
docker-compose exec agent python -m deeppavlov_agent.run \
    agent.debug=false agent.channel=cmd \
    agent.pipeline_config=assistant_dists/dream_tools/pipeline_conf.json
```