# Interface: LLMChat<AdditionalChatOptions, AdditionalMessageOptions>

## Extended by

## Type parameters

- `AdditionalChatOptions` extends `object` = `object`

- `AdditionalMessageOptions` extends `object` = `object`
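
The two type parameters let a provider attach its own option shapes: `AdditionalChatOptions` widens the per-request chat params, while `AdditionalMessageOptions` widens each returned message. Below is a minimal sketch of how they flow through an implementation, assuming `LLMChat` and the related types are exported from the `llamaindex` entry point; `MyChatOptions`, `MyMessageOptions`, and `EchoLLM` are hypothetical names used only for illustration, and the exact response shape may differ between versions.

```ts
import type {
  ChatResponse,
  ChatResponseChunk,
  LLMChat,
  LLMChatParamsNonStreaming,
  LLMChatParamsStreaming,
} from "llamaindex";

// Hypothetical provider-specific extras carried by the two type parameters.
type MyChatOptions = { temperature?: number };
type MyMessageOptions = { cached?: boolean };

class EchoLLM implements LLMChat<MyChatOptions, MyMessageOptions> {
  async chat(
    params:
      | LLMChatParamsStreaming<MyChatOptions>
      | LLMChatParamsNonStreaming<MyChatOptions>,
  ): Promise<
    | ChatResponse<MyMessageOptions>
    | AsyncIterable<ChatResponseChunk<MyMessageOptions>>
  > {
    // Echo the last input message back; a real connector would call a model
    // API here and honor `params.stream` by returning an AsyncIterable of
    // ChatResponseChunk objects instead of a single response.
    const last = params.messages[params.messages.length - 1];
    return {
      raw: null,
      message: {
        role: "assistant",
        content: last?.content ?? "",
        options: { cached: false },
      },
    };
  }
}
```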

## Methods

### chat()

`chat(params): Promise<ChatResponse<AdditionalMessageOptions> | AsyncIterable<ChatResponseChunk<AdditionalMessageOptions>>>`

#### Parameters

- **params**: `LLMChatParamsStreaming<AdditionalChatOptions, object> | LLMChatParamsNonStreaming<AdditionalChatOptions, object>`

#### Returns

`Promise<ChatResponse<AdditionalMessageOptions> | AsyncIterable<ChatResponseChunk<AdditionalMessageOptions>>>`
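
The union return type corresponds to the two calling modes: without `stream: true` the promise resolves to a single `ChatResponse`, and with `stream: true` it resolves to an `AsyncIterable` of `ChatResponseChunk` objects. A usage sketch follows; the `OpenAI` class and the model name are assumptions used only for illustration, and any class implementing `LLMChat` exposes the same two modes.

```ts
import { OpenAI } from "llamaindex";
import type { ChatMessage } from "llamaindex";

async function main() {
  const llm = new OpenAI({ model: "gpt-4o-mini" });
  const messages: ChatMessage[] = [
    { role: "user", content: "Say hello in one short sentence." },
  ];

  // Non-streaming: the promise resolves to a ChatResponse carrying the
  // complete assistant message.
  const response = await llm.chat({ messages });
  console.log(response.message.content);

  // Streaming: the promise resolves to an AsyncIterable of ChatResponseChunk
  // objects, each holding an incremental text delta.
  const stream = await llm.chat({ messages, stream: true });
  for await (const chunk of stream) {
    process.stdout.write(chunk.delta);
  }
}

main().catch(console.error);
```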

#### Source

packages/llamaindex/src/llm/types.ts:40