🛠️ Build a Simple Tools-Enabled Agent with LangChain and OpenAI

March 23, 2025
Tags: AI Agent · AI Agents · Tools · Function Calling

What if your AI chatbot could not only answer questions but also call real tools like weather APIs or database checks?

In this tutorial, you'll build a mini AI agent that uses external tools to make better, grounded decisions, all with LangChain + OpenAI!


🛠 Prerequisites

Install the dependencies:

pip install langchain-openai python-dotenv

You'll also need:

  • An OpenAI API key stored in a .env file
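
A minimal .env file only needs the key (the value below is a placeholder):

OPENAI_API_KEY=sk-your-key-here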

📦 Full Python Code Overview

from dotenv import load_dotenv, find_dotenv
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
from langchain_core.messages import HumanMessage, ToolMessage

# Load the OpenAI API key from the .env file
load_dotenv(find_dotenv())

# Initialize the LLM
llm = ChatOpenAI(model="gpt-4o-mini")


# Define a fake weather API tool (the docstring is required and becomes the tool description)
@tool
def fake_weather_api(city: str) -> str:
    """Check the weather in a specified city."""
    return "Sunny, 22°C"


# Define an outdoor seating checker
@tool
def outdoor_seating_availability(city: str) -> str:
    """Check whether outdoor seating is available in a specified city."""
    return "Outdoor seating is available."


# Register the tools with the LLM
tools = [fake_weather_api, outdoor_seating_availability]
llm_with_tools = llm.bind_tools(tools)

# Tool lookup dictionary for dispatching tool calls by name
tool_mapping = {t.name.lower(): t for t in tools}

# Start the conversation
messages = [
    HumanMessage(
        "How will the weather be in Munich today? I would like to eat outside if possible"
    )
]

# The LLM responds, possibly requesting tool calls
llm_output = llm_with_tools.invoke(messages)
messages.append(llm_output)

# Execute each requested tool call and append the result as a ToolMessage
for tool_call in llm_output.tool_calls:
    selected_tool = tool_mapping[tool_call["name"].lower()]
    tool_output = selected_tool.invoke(tool_call["args"])
    messages.append(ToolMessage(tool_output, tool_call_id=tool_call["id"]))

# Final answer, grounded in the tool results
result = llm_with_tools.invoke(messages)
print(result.content)

🔍 How It Works

  • @tool decorators: define callable Python functions
  • bind_tools: tells the LLM which tools it can use
  • HumanMessage: sends the user's input to the model
  • LLM decides to call tools: the model picks the right tools for the question (example below)
  • Execute tool functions: the real Python functions are called
  • Final LLM reply: incorporates the tool results into the response
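
To make the "LLM decides to call tools" step concrete, here is roughly what llm_output.tool_calls contains for the Munich question (the call IDs are illustrative placeholders):

[
    {"name": "fake_weather_api", "args": {"city": "Munich"}, "id": "call_abc123", "type": "tool_call"},
    {"name": "outdoor_seating_availability", "args": {"city": "Munich"}, "id": "call_def456", "type": "tool_call"},
]

Each entry is looked up in tool_mapping by name, invoked with its args, and the result is appended back to the conversation as a ToolMessage linked via tool_call_id.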

✅ You built an agent that thinks, acts, and reasons with external tools!


📘 Why Use Tools in AI?

  • Fetch live or dynamic data such as real weather or booking systems (see the sketch after this list)
  • Expand AI capabilities beyond text generation
  • Make AI more factual and grounded
  • Build powerful AI assistants, copilots, and agents
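
As an illustration of the first point, the fake weather tool could be swapped for a real lookup. The sketch below uses the free Open-Meteo forecast endpoint; the URL, parameters, and response fields are assumptions based on its public documentation, and the hard-coded coordinates are just for the demo:

import requests
from langchain_core.tools import tool

# Demo coordinates; a real tool would geocode the city name first
CITY_COORDS = {"munich": (48.14, 11.58)}

@tool
def real_weather_api(city: str) -> str:
    """Fetch the current weather for a city from the Open-Meteo API."""
    lat, lon = CITY_COORDS.get(city.lower(), (48.14, 11.58))
    resp = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={"latitude": lat, "longitude": lon, "current_weather": "true"},
        timeout=10,
    )
    resp.raise_for_status()
    current = resp.json()["current_weather"]
    return f"{current['temperature']}°C, wind {current['windspeed']} km/h"

Registering this tool in place of fake_weather_api (tools = [real_weather_api, outdoor_seating_availability]) requires no other changes to the agent loop.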

🖥 Example Output

Question:

How will the weather be in Munich today? I would like to eat outside if possible.

Tool Calls:

  • Calls fake_weather_api(city="Munich")
  • Calls outdoor_seating_availability(city="Munich")

Final Response:

The weather in Munich today is sunny, 22°C. Outdoor seating is available, so you can enjoy your meal outside!

🚀 Much smarter and more grounded than guessing!


📘 Conclusion

You now know how to build a simple but powerful tools-enabled AI agent using:

  • LangChain's @tool decorators
  • OpenAI's function calling abilities
  • Modular message passing and tool execution

The real future of AI is not just chatting; it's thinking + acting.