import ollama
from icecream import ic

def data_input():
    # Gets the data the LLM needs; the user will ask questions about this data.
    ...

def chat_with_data(data):
    messages = [
        {
            "role": "system",
            "content": "You are given some data and you have to analyze the data correctly if the user asks for any output then give the output as per the data and user's question otherwise don't give answer."
        },
        {
            "role": "user",
            "content": data
        }
    ]
    try:
        while True:
            user_input = input("Enter your message: ")
            if user_input.lower() == 'exit':
                print("Exiting the chat")
                break
            messages.append({"role": "user", "content": user_input})
            assistant_message = ''
            ollama_response = ollama.chat(model='llama3.2', messages=messages, stream=True)
            for chunk in ollama_response:
                assistant_message += chunk['message']['content']
                print(assistant_message, flush=True)
            messages.append({"role": "assistant", "content": assistant_message})
    except Exception as e:
        ic(e)

data = data_input()
chat_with_data(data)
The LLM follows the instructions and gives a long summary of the data, and then the program ends, but it should keep looping until the user types 'exit'. Should I change the while True condition?
asked Mar 2 at 13:08 by Praveen Kumar

Comments:
1. You're calling chat_with_data(data) after defining it, but you're not actually looping back to get new data after the chat session ends. – steve-ed, Mar 2 at 13:23
2. How did you run it? Did you click an icon, or did you open a console/terminal manually so you could see errors? – furas, Mar 2 at 13:47
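steve-ed's comment points at a missing outer loop: after one chat session ends, nothing fetches new data. A minimal, testable sketch of that outer loop (run_sessions is a hypothetical helper, with the input source injected rather than hard-wired to input() so the loop logic can be exercised on its own):

```python
def run_sessions(inputs, chat):
    # Drive repeated chat sessions from an iterator of data strings,
    # stopping when the user enters 'exit'. Returns how many sessions ran.
    sessions = 0
    for data in inputs:
        if data.lower() == "exit":
            break
        chat(data)  # e.g. chat_with_data; the inner chat loop runs here
        sessions += 1
    return sessions

# In the real program, `inputs` could be iter(lambda: input("Data: "), None)
# and `chat` would be chat_with_data.
```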
1 Answer
You have to swap your while True loop with your try statement. As written, the try block wraps the entire loop, so the first exception raised on any iteration breaks out of the whole loop and the program ends; with the try inside the loop, an error on one message is logged and the chat continues. Try the following:
# Inside chat_with_data, replacing the original try/while:
while True:
    user_input = input("Enter your message: ")
    if user_input.lower() == 'exit':
        print("Exiting the chat")
        break
    try:
        messages.append({"role": "user", "content": user_input})
        assistant_message = ''
        ollama_response = ollama.chat(model='llama3.2', messages=messages, stream=True)
        for chunk in ollama_response:
            assistant_message += chunk['message']['content']
            print(assistant_message, flush=True)
        messages.append({"role": "assistant", "content": assistant_message})
    except Exception as e:
        ic(e)

data = data_input()
chat_with_data(data)
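As a side note, separate from the try/while swap: printing assistant_message inside the for loop reprints the whole running total on every chunk. The usual streaming pattern prints only the new piece and accumulates the full reply for the history. It can be checked without a live Ollama server using a stub (fake_chat is a hypothetical stand-in shaped like ollama.chat's streaming output):

```python
def fake_chat(model, messages, stream=True):
    # Hypothetical stand-in for ollama.chat: yields chunks shaped like the real API.
    for piece in ["Hello", ", ", "world"]:
        yield {"message": {"content": piece}}

def stream_reply(messages):
    # Print each chunk as it arrives, not the whole running total,
    # and return the accumulated assistant message for the history.
    assistant_message = ""
    for chunk in fake_chat(model="llama3.2", messages=messages, stream=True):
        piece = chunk["message"]["content"]
        print(piece, end="", flush=True)
        assistant_message += piece
    print()
    return assistant_message
```

In the real program you would append the returned string as the assistant message, exactly as the answer does.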