Deploy an executive assistant using LangSmith and LangChain
ChatModel.bind_tools()
: Attaches tool definitions to model calls. Although each provider has its own tool-definition format, LangChain exposes one standard interface: it accepts tools as dictionaries, Pydantic classes, LangChain tools, or plain functions, and tells the LLM how to use each one.
AIMessage.tool_calls
: An attribute on AIMessage that provides easy access to model-initiated tool calls, specified in the same format used by bind_tools.
create_tool_calling_agent()
: Unifies the above concepts to work across different provider formats, enabling easy model switching.
main.py
: Entrypoint file
cerebrium.toml
: Build and environment configuration
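A minimal cerebrium.toml for this project might look like the following sketch (the app name and pinned versions are illustrative; adjust them to your deployment):

```toml
[cerebrium.deployment]
name = "executive-assistant"
python_version = "3.11"

[cerebrium.dependencies.pip]
pydantic = "latest"
langchain = "latest"
langchain-openai = "latest"
langchain-community = "latest"
pytz = "latest"
openai = "latest"
```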
CAL_API_KEY
: Your Cal.com API key
OPENAI_API_KEY
: Your OpenAI API key
Add the following to main.py for calendar management:
@tool
: Decorator to identify functions as LangChain tools
find_available_slots
: Helper function to format Cal.com API responses into readable time slots
To test your main.py file locally, first install the dependencies:
pip install pydantic langchain pytz openai langchain_openai langchain-community
and then run
python main.py
to execute your main Python file. You will need to replace the secrets with their actual values when running locally. You should then see output similar to the following:
Remember to add these packages to your cerebrium.toml dependencies.
To trace your application with LangSmith, add the @traceable decorator to your functions. LangSmith automatically tracks tool invocations and OpenAI responses as it traverses the call graph. Add the decorator to the predict function and any independently instantiated functions. Edit main.py to have the following: