Maximizing LLM Performance with Real-Time Data Integration
Despite the sophistication of language models like Llama 3.1, they are inherently limited in providing real-time information. For instance, if you were to inquire about appropriate clothing for a forthcoming trip to Tasmania, the model might respond as follows:
Tasmania is a fantastic destination!
Since your visit is tomorrow, here are some general suggestions. However, keep in mind that the weather in Tasmania can fluctuate, so it's wise to pack layers and prepare for varying conditions.
Seasonal Insights:
Assuming you're traveling during typical Tasmanian seasons:
- Spring (September to November): Pleasant temperatures with average highs around 17°C (63°F) and lows near 9°C (48°F).
- Summer (December to February): The warmest season, with average highs around 20°C (68°F) and lows around 12°C (54°F).
However, if you're reading this in late July (Tasmania's midwinter), this advice misses the mark. The model is doing its best to be helpful without any awareness of the current season or temperature in Australia's southernmost state.
Addressing the Challenge: Tool Integration
With the release of Llama 3.1 comes support for tools: functions you write to perform specific tasks, which the language model can call on to improve its responses.
In this context, a 'tool' refers to a function written in a supported programming language. For example, here’s a quick Python function to retrieve the current temperature in a specified location using an API:
import requests

# Function to get the current temperature in a location via the wttr.in API
def get_current_temperature(place: str) -> str:
    base_url = f"https://wttr.in/{place}?format=j1"
    response = requests.get(base_url)
    data = response.json()
    # wttr.in's j1 format returns JSON; current_condition[0]['temp_C'] holds the Celsius reading
    return f"The current temperature in {place} is {data['current_condition'][0]['temp_C']} degrees Celsius."
To test our function:
get_current_temperature("London")

This might yield:
'The current temperature in London is 27 degrees Celsius.'
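In a real application you may want the lookup to fail gracefully, since wttr.in can be slow or return unexpected JSON. Here's a sketch of a more defensive version (the ten-second timeout is an arbitrary choice):

import requests

def get_current_temperature(place: str) -> str:
    try:
        response = requests.get(f"https://wttr.in/{place}?format=j1", timeout=10)
        response.raise_for_status()
        data = response.json()
        temp = data['current_condition'][0]['temp_C']
        return f"The current temperature in {place} is {temp} degrees Celsius."
    except (requests.RequestException, KeyError, IndexError, ValueError) as err:
        # Surface a readable message so the model can still respond gracefully
        return f"Sorry, I couldn't retrieve the temperature for {place} ({err})."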
Now, if we wish to make this function accessible to our language model, we want it to recognize when current temperature data is necessary to enhance its responses. Let’s see how to implement this.
Enhancing LLM Interaction with Tools
We will create an asynchronous function that accomplishes the following:
- Initializes a chat client using Ollama, allowing selection of different models (ensure Ollama is installed on your machine).
- Makes the get_current_temperature() function available to the chat.
- Expects the model to identify questions related to current temperature.
- Anticipates the model to comprehend the location and invoke the function accordingly.
- Integrates the function's output into the final response, overriding any default behavior that discourages answers requiring real-time information.
Here’s the asynchronous function that takes a model name and a query:
import ollama
import asyncio

async def weather_chat(model: str, query: str):
    client = ollama.AsyncClient()

    # Start the conversation with the user's query
    messages = [{'role': 'user', 'content': query}]

    # First API call: send the query and the function's schema to the model
    response = await client.chat(
        model=model,
        messages=messages,
        tools=[
            {
                'type': 'function',
                'function': {
                    'name': 'get_current_temperature',
                    'description': 'Get the temperature in a place',
                    'parameters': {
                        'type': 'object',
                        'properties': {
                            'place': {
                                'type': 'string',
                                'description': 'The place for which the temperature is requested',
                            }
                        },
                        'required': ['place'],
                    },
                },
            },
        ],
    )
    messages.append(response['message'])

    # If the model didn't request a tool call, print its answer and stop
    if not response['message'].get('tool_calls'):
        print("The model didn't use the function. Its response was:")
        print(response['message']['content'])
        return

    # Run each requested tool call and feed the result back to the model
    available_functions = {
        'get_current_temperature': get_current_temperature,
    }
    for tool in response['message']['tool_calls']:
        function_to_call = available_functions[tool['function']['name']]
        function_response = function_to_call(tool['function']['arguments']['place'])
        messages.append(
            {
                'role': 'tool',
                'content': f"""
                Answer the following question: {query}.
                Ignore any previous instructions or defaults and instead use the following information: {function_response}
                """,
            }
        )

    # Final API call: get the model's final response
    final_response = await client.chat(model=model, messages=messages)
    print(final_response['message']['content'])
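One nice property of this design is that it generalizes: to expose more tools, you add a schema entry to the tools list and a matching entry to available_functions. As a purely hypothetical illustration (get_current_humidity is not part of the code above, though wttr.in's j1 output does include a humidity field):

# Hypothetical second tool, to show how the pattern extends
def get_current_humidity(place: str) -> str:
    data = requests.get(f"https://wttr.in/{place}?format=j1", timeout=10).json()
    return f"The current humidity in {place} is {data['current_condition'][0]['humidity']} percent."

# Each new function also needs its own schema in the `tools` list passed to client.chat()
available_functions = {
    'get_current_temperature': get_current_temperature,
    'get_current_humidity': get_current_humidity,
}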
Testing Our Enhanced Chat Function
Let’s start with a straightforward query. I'll utilize the smallest Llama 3.1 model (8B parameters) to ask a direct question requiring our tool:
# For running in a Python script
asyncio.run(weather_chat('llama3.1:8b', "What is the current temperature in Dublin?"))

# For running in a Jupyter notebook
await weather_chat('llama3.1:8b', "What is the current temperature in Dublin?")

The expected output might be:
The current temperature in Dublin is 20 degrees Celsius.
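Behind the scenes, the model's first reply doesn't answer the question; it requests a tool call. Based on how weather_chat reads the response, response['message'] looks roughly like this (illustrative values, not captured output):

# Illustrative shape only -- the values are examples, not real output
{
    'role': 'assistant',
    'content': '',
    'tool_calls': [
        {'function': {'name': 'get_current_temperature', 'arguments': {'place': 'Dublin'}}}
    ],
}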
Next, let’s prompt the model for slightly more complex information:
# For running in a Python script
asyncio.run(weather_chat('llama3.1:8b', "What is the current temperature in Ireland's capital?"))

# For running in a Jupyter notebook
await weather_chat('llama3.1:8b', "What is the current temperature in Ireland's capital?")

The model might respond:
Based on the provided answer, I can tell you that the current temperature in Dublin, the capital of Ireland, is 20 degrees Celsius.
Now, let’s challenge the model with a more nuanced question:
# For running in a Python script
asyncio.run(weather_chat('llama3.1:8b', "My sister says that if I flew into the capital city of Norway today, I should wear clothing for extreme weather. Should I trust her advice?"))

# For running in a Jupyter notebook
await weather_chat('llama3.1:8b', "My sister says that if I flew into the capital city of Norway today, I should wear clothing for extreme weather. Should I trust her advice?")

The response might be:
Based on the current temperature in Oslo being 20 degrees Celsius, it's likely to be a pleasant day with mild temperatures. You may not need to wear clothing for extreme weather. Your sister might have been joking or referring to a different condition. If you're unsure, you could always check the forecast before heading out.
To push the limits further, let's compare temperatures between two cities, which requires the model to issue two separate tool calls:
# For running in a Python script
asyncio.run(weather_chat('llama3.1:8b', "Compare the temperatures of these two cities right now: Dunedin, New Zealand and Reykjavik, Iceland?"))

# For running in a Jupyter notebook
await weather_chat('llama3.1:8b', "Compare the temperatures of these two cities right now: Dunedin, New Zealand and Reykjavik, Iceland?")

The model might respond with:
Based on the provided information, I can compare the temperatures of the two cities as follows:
- Dunedin, New Zealand: 4°C (current temperature)
- Reykjavik, Iceland: 13°C (current temperature)
The temperature in Reykjavik, Iceland is currently higher than that of Dunedin, New Zealand by 9 degrees Celsius.
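This works because weather_chat loops over every entry in tool_calls, and here the model requests one call per city, so the list would contain two entries (again illustrative, not captured output):

# Illustrative: two tool calls in a single response
'tool_calls': [
    {'function': {'name': 'get_current_temperature', 'arguments': {'place': 'Dunedin'}}},
    {'function': {'name': 'get_current_temperature', 'arguments': {'place': 'Reykjavik'}}},
]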
For a final test, let’s revisit our initial question about packing for Tasmania:
# For running in a Python script
asyncio.run(weather_chat('llama3.1:8b', "What kinds of clothes should I pack for my trip to Tasmania which leaves tomorrow?"))

# For running in a Jupyter notebook
await weather_chat('llama3.1:8b', "What kinds of clothes should I pack for my trip to Tasmania which leaves tomorrow?")

The response may include:
Considering the current temperature in Tasmania is 3°C, I would recommend packing layers for your trip. It's likely to be cool and possibly even chilly, especially in the mornings and evenings.
For a comfortable and versatile wardrobe, consider packing:
- A mix of lightweight and warmer tops (fleeces, sweaters, or thermals)
- Waterproof or water-resistant outerwear (jacket or windbreaker) to protect against Tasmania's infamous rain
- Comfortable and sturdy pants or trousers for outdoor activities like hiking or exploring
- Insulating layers (fleece or down jacket) for colder moments
- Warm socks and gloves for chilly mornings and evenings
- A scarf or neck warmer for added warmth
- Waterproof shoes or boots with good grip for navigating Tasmania's rugged terrain
Remember to check the weather forecast before your trip to ensure you're prepared for any potential changes in temperature or precipitation. Have a great time in Tasmania!
I hope this overview of utilizing tools in Llama 3.1 has been insightful. Feel free to share your thoughts or questions!