How can I change from OpenAI to ChatOpenAI in langchain and Flask?

2024-09-02 13:52


Question: In LangChain and Flask, how can I change OpenAI to ChatOpenAI?

Background:

This is an implementation based on LangChain and Flask. It streams responses from the OpenAI server through LangChain to a page where JavaScript displays the streamed response.


I tried every way I could think of to modify the code below to switch the LangChain class from OpenAI to ChatOpenAI, without success. I include both implementations below: the one with OpenAI, which works, and the ChatOpenAI one, which errors. Thank you to the whole community and to anyone who can help me understand the problem; it would be very helpful if you could also show me how to solve it, since I have been trying for days and the error it shows really tells me nothing.


Code version with the library that works, but is reported as deprecated:


from flask import Flask, Response
import threading
import queue

from langchain.llms import OpenAI
from langchain.callbacks.base import BaseCallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

app = Flask(__name__)

@app.route('/')
def index():
    return Response('''<!DOCTYPE html>
<html>
<head><title>Flask Streaming Langchain Example</title></head>
<body>
<div id="output"></div>
<script>
const outputEl = document.getElementById('output');
(async function() {
  try {
    const controller = new AbortController();
    const signal = controller.signal;
    const timeout = 120000;  // Set the timeout to 120 seconds
    setTimeout(() => controller.abort(), timeout);
    const response = await fetch('/chain', {method: 'POST', signal});
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';
    while (true) {
      const { done, value } = await reader.read();
      if (done) { break; }
      const text = decoder.decode(value, {stream: true});
      outputEl.innerHTML += text;
    }
  } catch (err) {
    console.error(err);
  }
})();
</script>
</body>
</html>''', mimetype='text/html')

class ThreadedGenerator:
    def __init__(self):
        self.queue = queue.Queue()

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()
        if item is StopIteration:
            raise item
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        self.queue.put(StopIteration)

class ChainStreamHandler(StreamingStdOutCallbackHandler):
    def __init__(self, gen):
        super().__init__()
        self.gen = gen

    def on_llm_new_token(self, token: str, **kwargs):
        self.gen.send(token)

def llm_thread(g, prompt):
    try:
        llm = OpenAI(
            model_name="gpt-4",
            verbose=True,
            streaming=True,
            callback_manager=BaseCallbackManager([ChainStreamHandler(g)]),
            temperature=0.7,
        )
        llm(prompt)
    finally:
        g.close()

def chain(prompt):
    g = ThreadedGenerator()
    threading.Thread(target=llm_thread, args=(g, prompt)).start()
    return g

@app.route('/chain', methods=['POST'])
def _chain():
    return Response(chain("Create a poem about the meaning of life \n\n"), mimetype='text/plain')

if __name__ == '__main__':
    app.run(threaded=True, debug=True)

Version with the error (OpenAI replaced with ChatOpenAI):

from flask import Flask, Response
import threading
import queue

from langchain.chat_models import ChatOpenAI
from langchain.callbacks.base import BaseCallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

app = Flask(__name__)

@app.route('/')
def index():
    return Response('''<!DOCTYPE html>
<html>
<head><title>Flask Streaming Langchain Example</title></head>
<body>
<div id="output"></div>
<script>
const outputEl = document.getElementById('output');
(async function() {
  try {
    const controller = new AbortController();
    const signal = controller.signal;
    const timeout = 120000;  // Set the timeout to 120 seconds
    setTimeout(() => controller.abort(), timeout);
    const response = await fetch('/chain', {method: 'POST', signal});
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';
    while (true) {
      const { done, value } = await reader.read();
      if (done) { break; }
      const text = decoder.decode(value, {stream: true});
      outputEl.innerHTML += text;
    }
  } catch (err) {
    console.error(err);
  }
})();
</script>
</body>
</html>''', mimetype='text/html')

class ThreadedGenerator:
    def __init__(self):
        self.queue = queue.Queue()

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()
        if item is StopIteration:
            raise item
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        self.queue.put(StopIteration)

class ChainStreamHandler(StreamingStdOutCallbackHandler):
    def __init__(self, gen):
        super().__init__()
        self.gen = gen

    def on_llm_new_token(self, token: str, **kwargs):
        self.gen.send(token)

    def on_chat_model_start(self, token: str):
        print("started")

def llm_thread(g, prompt):
    try:
        llm = ChatOpenAI(
            model_name="gpt-4",
            verbose=True,
            streaming=True,
            callback_manager=BaseCallbackManager([ChainStreamHandler(g)]),
            temperature=0.7,
        )
        llm(prompt)
    finally:
        g.close()

def chain(prompt):
    g = ThreadedGenerator()
    threading.Thread(target=llm_thread, args=(g, prompt)).start()
    return g

@app.route('/chain', methods=['POST'])
def _chain():
    return Response(chain("parlami dei 5 modi di dire in inglese che gli italiani conoscono meno \n\n"), mimetype='text/plain')

if __name__ == '__main__':
    app.run(threaded=True, debug=True)

Error shown in the console at startup and when I open the web page:


Error in ChainStreamHandler.on_chat_model_start callback: ChainStreamHandler.on_chat_model_start() got an unexpected keyword argument 'run_id'
Exception in thread Thread-4 (llm_thread):
127.0.0.1 - - [09/Sep/2023 18:09:29] "POST /chain HTTP/1.1" 200 -
Traceback (most recent call last):
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\manager.py", line 300, in _handle_event
    getattr(handler, event_name)(*args, **kwargs)
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\base.py", line 168, in on_chat_model_start
    raise NotImplementedError(
NotImplementedError: StdOutCallbackHandler does not implement `on_chat_model_start`

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\user22\AppData\Local\Programs\Python\Python311\Lib\threading.py", line 1038, in _bootstrap_inner
    self.run()
  File "C:\Users\user22\AppData\Local\Programs\Python\Python311\Lib\threading.py", line 975, in run
    self._target(*self._args, **self._kwargs)
  File "c:\Users\user22\Desktop\Work\TESTPROJ\streamresp.py", line 90, in llm_thread
    llm(prompt)
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\chat_models\base.py", line 552, in __call__
    generation = self.generate(
                 ^^^^^^^^^^^^^^
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\chat_models\base.py", line 293, in generate
    run_managers = callback_manager.on_chat_model_start(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\manager.py", line 1112, in on_chat_model_start
    _handle_event(
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\manager.py", line 304, in _handle_event
    message_strings = [get_buffer_string(m) for m in args[1]]
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\callbacks\manager.py", line 304, in <listcomp>
    message_strings = [get_buffer_string(m) for m in args[1]]
                       ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user22\Desktop\Work\TESTPROJ\env\Lib\site-packages\langchain\schema\messages.py", line 52, in get_buffer_string
    raise ValueError(f"Got unsupported message type: {m}")
ValueError: Got unsupported message type: p
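The traceback actually points at the root cause: get_buffer_string received the single character "p" because llm(prompt) handed ChatOpenAI a raw string, and the callback machinery iterates its input as if it were a list of chat messages, so it sees the string's characters one by one. A minimal illustration in plain Python (no LangChain required), using the prompt from the failing version:

```python
# ChatOpenAI expects a list of chat messages (e.g. HumanMessage objects).
# A plain string is also iterable, but iterating it yields characters,
# so the first "message" the callback machinery sees is 'p' --
# the first letter of the prompt, exactly as the ValueError reports.
prompt = "parlami dei 5 modi di dire in inglese che gli italiani conoscono meno \n\n"
first_item = list(prompt)[0]  # what iterating the string produces
print(first_item)  # prints: p
```

This is why the fix below wraps the prompt in a HumanMessage rather than passing the string directly.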

Thank you very much for the support!


Solution:

Thanks to GitHub user python273, I've resolved it. The key changes: the callback handler is passed via callbacks=[ChainStreamHandler(g)] instead of a BaseCallbackManager, and the prompt is wrapped in a HumanMessage instead of being passed to the model as a raw string:


import os
os.environ["OPENAI_API_KEY"] = ""

from flask import Flask, Response, request
import threading
import queue

from langchain.chat_models import ChatOpenAI
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.schema import AIMessage, HumanMessage, SystemMessage

app = Flask(__name__)

@app.route('/')
def index():
    # just for the example, html is included directly, move to .html file
    return Response('''
<!DOCTYPE html>
<html>
<head><title>Flask Streaming Langchain Example</title></head>
<body>
<form id="form">
  <input name="prompt" value="write a short koan story about seeing beyond"/>
  <input type="submit"/>
</form>
<div id="output"></div>
<script>
const formEl = document.getElementById('form');
const outputEl = document.getElementById('output');
let aborter = new AbortController();
async function run() {
  aborter.abort();  // cancel previous request
  outputEl.innerText = '';
  aborter = new AbortController();
  const prompt = new FormData(formEl).get('prompt');
  try {
    const response = await fetch('/chain', {
      signal: aborter.signal,
      method: 'POST',
      headers: {'Content-Type': 'application/json'},
      body: JSON.stringify({prompt}),
    });
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) { break; }
      const decoded = decoder.decode(value, {stream: true});
      outputEl.innerText += decoded;
    }
  } catch (err) {
    console.error(err);
  }
}
run();  // run on initial prompt
formEl.addEventListener('submit', function(event) {
  event.preventDefault();
  run();
});
</script>
</body>
</html>
''', mimetype='text/html')

class ThreadedGenerator:
    def __init__(self):
        self.queue = queue.Queue()

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()
        if item is StopIteration:
            raise item
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        self.queue.put(StopIteration)

class ChainStreamHandler(StreamingStdOutCallbackHandler):
    def __init__(self, gen):
        super().__init__()
        self.gen = gen

    def on_llm_new_token(self, token: str, **kwargs):
        self.gen.send(token)

def llm_thread(g, prompt):
    try:
        chat = ChatOpenAI(
            verbose=True,
            streaming=True,
            callbacks=[ChainStreamHandler(g)],
            temperature=0.7,
        )
        chat([HumanMessage(content=prompt)])
    finally:
        g.close()

def chain(prompt):
    g = ThreadedGenerator()
    threading.Thread(target=llm_thread, args=(g, prompt)).start()
    return g

@app.route('/chain', methods=['POST'])
def _chain():
    return Response(chain(request.json['prompt']), mimetype='text/plain')

if __name__ == '__main__':
    app.run(threaded=True, debug=True)
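The ThreadedGenerator at the heart of both versions can be exercised on its own, without Flask or an API key. A minimal standard-library sketch, with a plain function standing in for the LLM callback thread:

```python
import threading
import queue

class ThreadedGenerator:
    """Iterator fed from another thread via a queue (same class as above)."""
    def __init__(self):
        self.queue = queue.Queue()

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()  # blocks until the producer sends something
        if item is StopIteration:
            raise item
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        self.queue.put(StopIteration)

def produce(g):
    # Stand-in for on_llm_new_token: push a few tokens, then close.
    for token in ["Hello", " ", "world"]:
        g.send(token)
    g.close()

g = ThreadedGenerator()
threading.Thread(target=produce, args=(g,)).start()
print("".join(g))  # prints: Hello world
```

Flask consumes the generator returned by chain() the same way "".join() does here: it iterates it and writes each token to the HTTP response as it arrives, which is what makes the browser see a stream.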

Link to the original reply: https://gist.github.com/python273/563177b3ad5b9f74c0f8f3299ec13850


