Class RagEnabledService

Namespace
Mythosia.AI.Rag
Assembly
Mythosia.AI.Rag.dll

Wraps an AIService to intercept queries through the RAG pipeline before they are sent to the LLM. All calls go through IRagPipeline.ProcessAsync; the AIService itself is never modified.
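A minimal usage sketch. This reference does not show how a RagEnabledService instance is obtained (constructor vs. a setup/extension API), so the setup line below is a placeholder:

```csharp
// Placeholder: obtain a RagEnabledService via the library's setup APIs.
RagEnabledService ragService = /* configured elsewhere */;

// Each call is intercepted by the RAG pipeline before reaching the LLM.
string answer = await ragService.GetCompletionAsync("What changed in v2?");
Console.WriteLine(answer);
```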

public class RagEnabledService
Inheritance
object
RagEnabledService

Methods

GetCompletionAsync(Message)

Processes a Message through the RAG pipeline (its text content is extracted for retrieval).

public Task<string> GetCompletionAsync(Message message)

Parameters

message Message
The message to process; its text content is extracted for retrieval.

Returns

Task<string>

GetCompletionAsync(Message, RagQueryOptions?, CancellationToken)

Processes a Message through the RAG pipeline (its text content is extracted for retrieval), applying per-request query overrides.

public Task<string> GetCompletionAsync(Message message, RagQueryOptions? options, CancellationToken cancellationToken = default)

Parameters

message Message
The message to process; its text content is extracted for retrieval.
options RagQueryOptions
Optional per-request query overrides.
cancellationToken CancellationToken
A token that can cancel the operation.

Returns

Task<string>
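A hedged sketch of both Message overloads. How a Message is constructed depends on the Mythosia.AI API and is shown as a placeholder; RagQueryOptions is assumed to have a parameterless constructor, and its members are documented on that type:

```csharp
// Placeholder: how Message is created depends on the Mythosia.AI API.
Message message = /* a user message from your conversation */;

// Plain overload: text content is extracted for retrieval.
string answer = await ragService.GetCompletionAsync(message);

// Overload with per-request overrides and cancellation support.
var options = new RagQueryOptions(); // members to set depend on the library
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
string answer2 = await ragService.GetCompletionAsync(message, options, cts.Token);
```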

GetCompletionAsync(string)

Processes the query through the RAG pipeline, then sends the processed message content to the LLM.

public Task<string> GetCompletionAsync(string query)

Parameters

query string
The query to run through the RAG pipeline.

Returns

Task<string>

GetCompletionAsync(string, RagQueryOptions?, CancellationToken)

Processes the query through the RAG pipeline with per-request query overrides, then sends the processed message content to the LLM.

public Task<string> GetCompletionAsync(string query, RagQueryOptions? options, CancellationToken cancellationToken = default)

Parameters

query string
The query to run through the RAG pipeline.
options RagQueryOptions
Optional per-request query overrides.
cancellationToken CancellationToken
A token that can cancel the operation.

Returns

Task<string>
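A sketch of both string overloads. Passing null options is assumed to fall back to pipeline defaults, following standard nullable-parameter conventions:

```csharp
// Simple form: the query goes through the RAG pipeline, then to the LLM.
string answer = await ragService.GetCompletionAsync("How do I rotate API keys?");

// With explicit (here: absent) overrides and a cancellation token.
string answer2 = await ragService.GetCompletionAsync(
    "How do I rotate API keys?",
    options: null,
    cancellationToken: CancellationToken.None);
```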

RetrieveAsync(string, RagQueryOptions?, CancellationToken)

Performs RAG retrieval with per-request query overrides and returns the processed query (context + references) without calling the LLM.

public Task<RagProcessedQuery> RetrieveAsync(string query, RagQueryOptions? options, CancellationToken cancellationToken = default)

Parameters

query string
The query to retrieve context for.
options RagQueryOptions
Optional per-request query overrides.
cancellationToken CancellationToken
A token that can cancel the operation.

Returns

Task<RagProcessedQuery>

RetrieveAsync(string, CancellationToken)

Performs RAG retrieval and returns the processed query (context + references) without calling the LLM. Useful for inspecting what context would be sent.

public Task<RagProcessedQuery> RetrieveAsync(string query, CancellationToken cancellationToken = default)

Parameters

query string
The query to retrieve context for.
cancellationToken CancellationToken
A token that can cancel the operation.

Returns

Task<RagProcessedQuery>
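A sketch of inspecting retrieval output without calling the LLM. The members of RagProcessedQuery are not listed in this reference, so only the call itself is shown:

```csharp
// Inspect what context would be sent to the LLM, without calling it.
RagProcessedQuery processed = await ragService.RetrieveAsync("What is the SLA?");

// RagProcessedQuery carries the augmented context and references; see the
// RagProcessedQuery type's own documentation for its members.
```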

StreamAsync(string, RagQueryOptions?, CancellationToken)

Streams the LLM response after RAG augmentation with per-request query overrides.

public IAsyncEnumerable<string> StreamAsync(string prompt, RagQueryOptions? options, CancellationToken cancellationToken = default)

Parameters

prompt string
The prompt to augment and stream a response for.
options RagQueryOptions
Optional per-request query overrides.
cancellationToken CancellationToken
A token that can cancel the streaming operation.

Returns

IAsyncEnumerable<string>

StreamAsync(string, CancellationToken)

Streams the LLM response after RAG augmentation.

public IAsyncEnumerable<string> StreamAsync(string prompt, CancellationToken cancellationToken = default)

Parameters

prompt string
The prompt to augment and stream a response for.
cancellationToken CancellationToken
A token that can cancel the streaming operation.

Returns

IAsyncEnumerable<string>
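A streaming sketch for both overloads, consuming the IAsyncEnumerable<string> with await foreach:

```csharp
// Stream the RAG-augmented response chunk by chunk.
await foreach (string chunk in ragService.StreamAsync("Summarize the design doc"))
{
    Console.Write(chunk);
}

// With per-request overrides (none here) and cancellation support.
using var cts = new CancellationTokenSource();
await foreach (string chunk in ragService.StreamAsync(
    "Summarize the design doc", options: null, cts.Token))
{
    Console.Write(chunk);
}
```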

StreamOnceAsync(string, RagQueryOptions?, CancellationToken)

Streams the LLM response as a one-off query with per-request query overrides (no conversation history).

public IAsyncEnumerable<string> StreamOnceAsync(string prompt, RagQueryOptions? options, CancellationToken cancellationToken = default)

Parameters

prompt string
The prompt for the one-off query; conversation history is not used.
options RagQueryOptions
Optional per-request query overrides.
cancellationToken CancellationToken
A token that can cancel the streaming operation.

Returns

IAsyncEnumerable<string>

StreamOnceAsync(string, CancellationToken)

Streams the LLM response as a one-off query (no conversation history).

public IAsyncEnumerable<string> StreamOnceAsync(string prompt, CancellationToken cancellationToken = default)

Parameters

prompt string
The prompt for the one-off query; conversation history is not used.
cancellationToken CancellationToken
A token that can cancel the streaming operation.

Returns

IAsyncEnumerable<string>
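A one-off streaming sketch; per the summary above, the call does not touch conversation history:

```csharp
// One-off streaming: the prompt is RAG-augmented and answered without
// affecting the conversation history.
await foreach (string chunk in ragService.StreamOnceAsync("Define 'vector store'"))
{
    Console.Write(chunk);
}
```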

WithoutRag()

Returns the underlying IAIService without RAG processing.

public IAIService WithoutRag()

Returns

IAIService
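An escape-hatch sketch. The members available on IAIService are documented on that interface, so only the unwrap step is shown:

```csharp
// Talk to the underlying service directly, skipping RAG processing.
IAIService plain = ragService.WithoutRag();
// Calls on `plain` go straight to the LLM with no pipeline interception.
```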