
☔ [JS] LlamaIndex.TS instrumentation #173

Open
1 of 8 tasks
mikeldking opened this issue Feb 6, 2024 · 3 comments
Labels: documentation (Improvements or additions to documentation), enhancement (New feature or request), examples (Related to the examples in the repo), language: js (Related to JavaScript or Typescript integration), roadmap

Comments

mikeldking (Contributor) commented Feb 6, 2024

Add LlamaIndex.TS instrumentation

Spike

  • [JS][LlamaIndex][spike] use callbacks to create spans and see if they stack with OpenAI spans
  • [JS][LlamaIndex][spike] investigate LlamaIndex stream primitives and see if they have controllers and tries
  • [JS][LlamaIndex] Spans for top level span types
  • [JS][LlamaIndex] Support Streaming responses
  • [JS][LlamaIndex] Support tool calls
  • [JS][LlamaIndex] Update examples
  • [JS][LlamaIndex] Documentation on how to use

Setup

Open Questions

  • Should we use the callback system? Does it provide enough visibility to map to OpenInference?
  • How do we handle async generators?
mikeldking added the enhancement (New feature or request) and triage (Issues that require triage) labels on Feb 6, 2024

dosubot bot commented Feb 6, 2024

🤖

Hey there @mikeldking! Nice to see you back 😊. Ready to tackle some LlamaIndex.TS instrumentation together?

Sure, I can provide some guidance on how to add LlamaIndex.TS instrumentation to the OpenInference repository.

  1. Create spans for top-level span types: This can be done by using the TracerProvider and LlamaIndexInstrumentor classes from the OpenInference library. The TracerProvider is used to create a tracer, which is then used to create spans. The LlamaIndexInstrumentor is used to instrument the LlamaIndex framework, which will automatically create spans for each operation.

Here's a sample code snippet:

from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

def instrument():
    # Configure a tracer provider that exports spans to a Phoenix collector.
    resource = Resource(attributes={})
    tracer_provider = trace_sdk.TracerProvider(resource=resource)
    span_exporter = OTLPSpanExporter(endpoint="http://phoenix:6006/v1/traces")
    span_processor = SimpleSpanProcessor(span_exporter=span_exporter)
    tracer_provider.add_span_processor(span_processor=span_processor)
    trace_api.set_tracer_provider(tracer_provider=tracer_provider)
    # Patch LlamaIndex so spans are created automatically for each operation.
    LlamaIndexInstrumentor().instrument()
  2. Support streaming responses and tool calls: This is handled by the OpenInference library itself. When you instrument the LlamaIndex framework using the LlamaIndexInstrumentor, it will automatically create spans for each operation, including streaming responses and tool calls.

  3. Update examples: You can update the examples in the python/examples/llama-index directory to demonstrate how to use the new instrumentation. This could include updating the README.md file and adding new example scripts.

  4. Provide documentation on its usage: You can add documentation on how to use the new instrumentation in the README.md file. This could include a description of what the instrumentation does, how to use it, and any requirements or limitations.

Remember to call instrument() prior to starting the FastAPI server so that traces for LlamaIndex are sent to the Phoenix server, providing deep observability into the underlying behavior of the application. This includes all of the retrieved context for a given query, as well as reranking and synthesis steps that might occur before the final LLM response is returned to the user.

Please note that the above code snippets are just examples and might need to be adjusted to fit your specific use case. If you have any further questions or need more specific guidance, feel free to ask.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.


dosubot added the documentation (Improvements or additions to documentation), examples (Related to the examples in the repo), and language: js (Related to JavaScript or Typescript integration) labels on Feb 6, 2024
mikeldking added the roadmap label and removed the triage (Issues that require triage) label on Feb 6, 2024
mikeldking (Contributor, Author) commented:

@dosu-bot you provided python for js! That's not helpful


dosubot bot commented Feb 8, 2024

🤖

I apologize for the confusion earlier. Let's focus on the JavaScript instrumentation for LlamaIndex.TS.

From the context provided, the OpenInference codebase already supports JavaScript and sets up instrumentation with OpenTelemetry's NodeTracerProvider. That setup includes span processors that print tracing data to the console and send it to a specified URL, and it registers an OpenAIInstrumentation instance via registerInstrumentations.

To add LlamaIndex.TS instrumentation, you would need to follow a similar process. Here's a general outline:

  1. Spans for top-level span types: You would need to create spans for the top-level operations in LlamaIndex.TS. This can be done using the startSpan method from the tracer instance. Each span should have a unique name and optionally, a set of attributes.

  2. Support Streaming responses: To support streaming responses, you would need to ensure that the span remains active for the duration of the stream. In OpenTelemetry JS this is done with the tracer's startActiveSpan method (or context.with from @opentelemetry/api), which binds the span to the current context; the span should only be ended once the stream is fully consumed.

  3. Support tool calls: If LlamaIndex.TS makes calls to other tools or services, you would need to create child spans for these calls. In OpenTelemetry JS, a child span is created by starting a new span (via startSpan or startActiveSpan on the tracer) while the parent span is active in the current context; the parent/child relationship is established automatically.

  4. Update examples: The examples should be updated to demonstrate how to use the new LlamaIndex.TS instrumentation. This could include how to start and end spans, how to create child spans, and how to use the spans with streaming responses.

  5. Documentation on how to use: Finally, you would need to update the documentation to explain how to use the new LlamaIndex.TS instrumentation. This should include a description of the span types, how to use the spans with streaming responses, and how to create child spans for tool calls.

I hope this helps! If you have any further questions or need clarification on any of the steps, please let me know.

For more information on how to use OpenTelemetry for instrumentation in JavaScript, you can refer to the OpenTelemetry JavaScript GitHub repository.

-Dosu


mikeldking added the language: js (Related to JavaScript or Typescript integration) label and removed the language: js label on Mar 21, 2024