• Get formatted LLM context for a natural language query

    This is a convenience function that performs a complete vector search and returns only the formatted context string, ready to be passed to an LLM. Use this when you only need the context and don't need the raw search results.

    Parameters

    • db: Surreal

      SurrealDB instance

    • query: string

      Natural language query text

    • limit: number = 5

      Maximum number of search results to include in the context (default: 5)

    Returns Promise<string>

    Formatted context string containing matching code elements with:

    • Similarity scores
    • Node types, names, and descriptions
    • File paths and locations
    • Function signatures and parameters
    • Module exports
    • Import/ancestor paths

    Example

        const context = await getContextForQuery(db, "user authentication", 5);
        // Pass to LLM: "Here's the relevant code:\n" + context
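A common next step is to fold the returned context string into a complete prompt before sending it to an LLM. The sketch below shows one way to do that; `buildPrompt` is a hypothetical helper, not part of this library, and the sample context string is illustrative only.

```typescript
// Hypothetical helper (assumption, not part of the library): wrap the
// formatted context from getContextForQuery into a complete LLM prompt.
function buildPrompt(context: string, question: string): string {
  return [
    "Here's the relevant code:",
    context,
    "",
    `Question: ${question}`,
  ].join("\n");
}

// Illustrative context, shaped like the formatted string described above
// (similarity score, node type, name, file path).
const context = "[0.92] function login — src/auth/login.ts";

// Assemble the prompt; in real usage, context would come from
// getContextForQuery(db, "user authentication", 5).
const prompt = buildPrompt(context, "How does user authentication work?");
```

Keeping prompt assembly in a separate helper makes it easy to adjust the framing text or add instructions without touching the search logic.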