import ollama
from IPython.display import display, Markdown

# Continue the conversation: pass the model's first response (response1,
# returned by the previous call) back as an assistant message, followed by
# the next question from the user.
response2 = ollama.chat(
    model='llama3.2-vision',
    messages=[
        {"role": "user",
         "content": "Who wrote the book Lord of the Rings?"},
        {"role": "assistant",
         "content": response1.message.content},
        {"role": "user",
         "content": "What other books has the author written?"},
    ],
)

display(Markdown(response2.message.content))
In the messages list, the assistant entry carries the first response (response1.message.content) and the final user entry is the next input from the user, so the model sees the full conversation history when it answers the follow-up question.
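For reference, response1 above would come from an earlier call along these lines (a minimal sketch; the exact preceding cell is not shown in this section):

# Assumed earlier cell that produced response1 (sketch, not shown in this section)
response1 = ollama.chat(
    model='llama3.2-vision',
    messages=[
        {"role": "user",
         "content": "Who wrote the book Lord of the Rings?"},
    ],
)
display(Markdown(response1.message.content))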