ChatSparkLLM wraps the SparkLLM chat models API by iFlyTek. For more information, see the iFlyTek Open Platform.

Basic use

"""For basic init and call""" from langchain_community.chat_models import ChatSparkLLM from langchain.messages import HumanMessage  chat = ChatSparkLLM(  spark_app_id="<app_id>", spark_api_key="<api_key>", spark_api_secret="<api_secret>" ) message = HumanMessage(content="Hello") chat([message]) 
AIMessage(content='Hello! How can I help you today?') 
  • Get SparkLLM’s app_id, api_key and api_secret from the iFlyTek SparkLLM API Console (for more info, see the iFlyTek SparkLLM Intro), then either set the environment variables IFLYTEK_SPARK_APP_ID, IFLYTEK_SPARK_API_KEY and IFLYTEK_SPARK_API_SECRET, or pass the parameters when creating ChatSparkLLM as in the demo above.
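The environment-variable route can be sketched as follows. The placeholder values are assumptions; substitute the real credentials from your iFlyTek SparkLLM API Console:

```python
import os

# Placeholder values (assumptions) - replace with the credentials
# obtained from the iFlyTek SparkLLM API Console.
os.environ["IFLYTEK_SPARK_APP_ID"] = "<app_id>"
os.environ["IFLYTEK_SPARK_API_KEY"] = "<api_key>"
os.environ["IFLYTEK_SPARK_API_SECRET"] = "<api_secret>"

# With the variables set, ChatSparkLLM can be constructed without
# passing credential arguments:
# from langchain_community.chat_models import ChatSparkLLM
# chat = ChatSparkLLM()
```

Setting credentials via the environment keeps secrets out of source code, which is usually preferable in shared or deployed projects.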

For ChatSparkLLM with Streaming

```python
chat = ChatSparkLLM(
    spark_app_id="<app_id>",
    spark_api_key="<api_key>",
    spark_api_secret="<api_secret>",
    streaming=True,
)

for chunk in chat.stream("Hello!"):
    print(chunk.content, end="")
```

```
Hello! How can I help you today?
```

For v2

"""For basic init and call""" from langchain_community.chat_models import ChatSparkLLM from langchain.messages import HumanMessage  chat = ChatSparkLLM(  spark_app_id="<app_id>",  spark_api_key="<api_key>",  spark_api_secret="<api_secret>",  spark_api_url="wss://spark-api.xf-yun.com/v2.1/chat",  spark_llm_domain="generalv2", ) message = HumanMessage(content="Hello") chat([message]) 
