
Class: ContextChatEngine

ContextChatEngine uses the Index to get the appropriate context for each query. The context is stored in the system prompt, and the chat history is preserved, allowing the appropriate context to be surfaced for each query.

Extends

PromptMixin

Implements

ChatEngine

Constructors

new ContextChatEngine()

new ContextChatEngine(init): ContextChatEngine

Parameters

init

init.chatHistory?: ChatMessage[]

init.chatModel?: LLM<object, object>

init.contextRole?: MessageType

init.contextSystemPrompt?

init.nodePostprocessors?: BaseNodePostprocessor[]

init.retriever: BaseRetriever

init.systemPrompt?: string

Returns

ContextChatEngine

Overrides

PromptMixin.constructor

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:35
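A minimal construction sketch. It assumes a `VectorStoreIndex` has already been built over some documents, and that `asRetriever()` is used to obtain the required `BaseRetriever`; the sample text and system prompt are illustrative only.

```typescript
import { Document, VectorStoreIndex, ContextChatEngine } from "llamaindex";

// Build an index over a document, then hand its retriever to the engine.
const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Abraham Lincoln was the 16th U.S. president." }),
]);

const chatEngine = new ContextChatEngine({
  retriever: index.asRetriever(),
  // Optional: a system prompt applied in addition to the retrieved context.
  systemPrompt: "Answer concisely, using only the retrieved context.",
});
```

Only `retriever` is required; `chatModel`, `chatHistory`, `contextSystemPrompt`, `contextRole`, and `nodePostprocessors` all fall back to defaults when omitted.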

Properties

chatHistory

chatHistory: ChatHistory<object>

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:31


chatModel

chatModel: LLM<object, object>

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:30


contextGenerator

contextGenerator: ContextGenerator

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:32


systemPrompt?

optional systemPrompt: string

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:33

Methods

_getPromptModules()

protected _getPromptModules(): Record<string, ContextGenerator>

Returns

Record<string, ContextGenerator>

Overrides

PromptMixin._getPromptModules

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:56


_getPrompts()

protected _getPrompts(): PromptsDict

Returns

PromptsDict

Inherited from

PromptMixin._getPrompts

Source

packages/llamaindex/src/prompts/Mixin.ts:78


_updatePrompts()

protected _updatePrompts(promptsDict): void

Parameters

promptsDict: PromptsDict

Returns

void

Inherited from

PromptMixin._updatePrompts

Source

packages/llamaindex/src/prompts/Mixin.ts:86


chat()

chat(params)

chat(params): Promise<AsyncIterable<EngineResponse>>

Sends a message, along with the class's current chat history, to the LLM.

Parameters

params: ChatEngineParamsStreaming

Returns

Promise<AsyncIterable<EngineResponse>>

Implementation of

ChatEngine.chat

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:62

chat(params)

chat(params): Promise<EngineResponse>

Parameters

params: ChatEngineParamsNonStreaming

Returns

Promise<EngineResponse>

Implementation of

ChatEngine.chat

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:65
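A sketch of both overloads, assuming an engine constructed as in the constructor section above. The streaming overload is selected by `ChatEngineParamsStreaming`, i.e. by passing `stream: true`; `toString()` is assumed here to render the response (or chunk) text.

```typescript
import type { ContextChatEngine } from "llamaindex";

// Assumes an already-constructed engine (new ContextChatEngine({ retriever })).
declare const chatEngine: ContextChatEngine;

// Non-streaming: resolves to a single EngineResponse.
const response = await chatEngine.chat({ message: "Who was Lincoln?" });
console.log(response.toString());

// Streaming: resolves to an AsyncIterable of EngineResponse chunks.
const stream = await chatEngine.chat({ message: "Tell me more.", stream: true });
for await (const chunk of stream) {
  process.stdout.write(chunk.toString());
}
```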


getPrompts()

getPrompts(): PromptsDict

Returns all prompts from the mixin and its modules

Returns

PromptsDict

Inherited from

PromptMixin.getPrompts

Source

packages/llamaindex/src/prompts/Mixin.ts:27


prepareRequestMessages()

private prepareRequestMessages(message, chatHistory): Promise<object>

Parameters

message: MessageContent

chatHistory: ChatHistory<object>

Returns

Promise<object>

messages

messages: ChatMessage<object>[]

nodes

nodes: NodeWithScore<Metadata>[] = context.nodes

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:106


prependSystemPrompt()

private prependSystemPrompt(message): ChatMessage

Parameters

message: ChatMessage

Returns

ChatMessage

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:121


reset()

reset(): void

Resets the chat history so that it's empty.

Returns

void

Implementation of

ChatEngine.reset

Source

packages/llamaindex/src/engines/chat/ContextChatEngine.ts:102


updatePrompts()

updatePrompts(promptsDict): void

Updates the prompts in the mixin and its modules

Parameters

promptsDict: PromptsDict

Returns

void

Inherited from

PromptMixin.updatePrompts

Source

packages/llamaindex/src/prompts/Mixin.ts:48


validatePrompts()

validatePrompts(promptsDict, moduleDict): void

Validates the prompt keys and module keys

Parameters

promptsDict: PromptsDict

moduleDict: ModuleDict

Returns

void

Inherited from

PromptMixin.validatePrompts

Source

packages/llamaindex/src/prompts/Mixin.ts:10