
Successful use of gradio SSE with private hf space, but inference causes error client side #8266

Closed
1 task done
Thomas2419 opened this issue May 12, 2024 · 5 comments
Labels: bug (Something isn't working), gradio_client (Related to one of the gradio client libraries), pending clarification, svelte (Frontend-related issue (JS))

Thomas2419 commented May 12, 2024

Describe the bug

I have a standard private Hugging Face Space built on the text-to-image Gradio template. I am able to successfully use the newly introduced SSE support and the open_stream() command. Yet when I run inference against my /infer endpoint, my script returns the attached error. The Space does appear to run the inference for my request, so the issue seems to be client-side; no errors are thrown in the Space's logs.

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction



import { Client } from "@gradio/client";

async function main() {
    try {
        // Connect to a specific Gradio app
        const app = await Client.connect("insert private hf", { hf_token: "insert hftoken" });

        // Define the parameters for the inference
        const params = {
            prompt: " ",
            negative_prompt: " ",
            seed: 0,
            randomize_seed: true,
            width: 256,
            height: 256,
            guidance_scale: 1,
            num_inference_steps: 2,
        };
 
        // Start the stream
        await app.open_stream();

        // Set up event handlers for the stream
        app.stream.onmessage = (event) => {
            const data = JSON.parse(event.data);
            console.log("Received data during stream:", data);
            // Check if the data includes the results
            if (data.results) {
                console.log("Inference result:", data.results);
                // Handle the results, such as extracting the image URL
            }
        };

        app.stream.onerror = (error) => {
            console.error("Stream encountered an error:", error);
        };

        // Send a prediction request
        await app.predict("/infer", params);
        console.log("Prediction request sent.");

    } catch (error) {
        console.error("Error during setup or prediction:", error);
    }
}

main();

Screenshot

No response

Logs

Unexpected error Connection errored out.
Error during setup or prediction: {
  type: 'status',
  stage: 'error',
  message: 'Connection errored out. ',
  queue: true,
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-12T00:03:39.862Z
}

Hugging Face Space logs after running the /infer endpoint:
0%| | 0/2 [00:00<?, ?it/s]
50%|█████ | 1/2 [00:07<00:07, 7.69s/it]
100%|██████████| 2/2 [00:14<00:00, 7.05s/it]
100%|██████████| 2/2 [00:14<00:00, 7.14s/it]

System Info

Using a Hugging Face Space with the most up-to-date version of gradio, as well as the most up-to-date version of the gradio client locally. Otherwise a normal Windows 11 install, with Node running the .js script.

Gradio HF Space environment requirements.txt file:

accelerate
diffusers
invisible_watermark
torch
transformers
xformers

README file contents:

---
title: art
emoji: 🖼
colorFrom: purple
colorTo: red
sdk: gradio
sdk_version: 4.31.0
app_file: app.py
pinned: false
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

Severity

Blocking usage of gradio

@Thomas2419 Thomas2419 added the bug Something isn't working label May 12, 2024
abidlabs (Member) commented May 12, 2024

Hi @Thomas2419, does the issue you are facing only happen if the Space is private?

@abidlabs abidlabs added svelte Frontend-related issue (JS) gradio_client Related to the one of the gradio client libraries labels May 12, 2024
@abidlabs abidlabs added this to the @gradio/client 1.0 milestone May 12, 2024
Thomas2419 (Author):

Hello @abidlabs! Thank you for the response. Yes, in my testing, when the Space was public, infer worked just fine. It was only once I transitioned it to private and ran infer that the error occurred. Otherwise it successfully returns the images and all.

@hannahblair hannahblair self-assigned this May 13, 2024
hannahblair (Collaborator):

@Thomas2419 thanks for letting us know! I'll take a look into this.

Thomas2419 (Author) commented May 13, 2024

@hannahblair In case this assists, I wanted to include a quick update: I was told my usage wasn't quite right, so I ran some updated tests that should hopefully reflect the intended usage.


import { Client } from "@gradio/client";

async function testStream() {
    try {
        const app = await Client.connect("space", { hf_token: "tokenW" });
        
        // Define parameters
        const prompt = " ";
        const negativePrompt = "";
        const seed = Math.floor(Math.random() * 100);
        const randomizeSeed = true;
        const width = 256;
        const height = 256;
        const guidanceScale = 1;
        const numInferenceSteps = 2;

        // Submit the request and handle responses
        const submission = app.submit("/infer", [
            prompt,
            negativePrompt,
            seed,
            randomizeSeed,
            width,
            height,
            guidanceScale,
            numInferenceSteps
        ]).on("data", (data) => {
            console.log("Received data:", data);
        }).on("status", (status) => {
            console.log("Current status:", status);
        });

    } catch (error) {
        console.error('Error during stream setup:', error);
    }
}

// Execute the function
testStream().catch(console.error);

And the output:

Current status: {
 type: 'status',
 stage: 'pending',
 queue: true,
 endpoint: '/infer',
 fn_index: 1,
 time: 2024-05-13T16:12:38.119Z
}
Unexpected error Connection errored out.
Current status: {
 type: 'status',
 stage: 'error',
 message: 'Connection errored out. ',
 queue: true,
 endpoint: '/infer',
 fn_index: 1,
 time: 2024-05-13T16:12:38.456Z
}

After swapping the Space to public, the output:

Current status: {
  type: 'status',
  stage: 'pending',
  queue: true,
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-13T16:16:34.528Z
}
Current status: {
  type: 'status',
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-13T16:16:34.799Z,
  queue: true,
  stage: 'pending',
  code: undefined,
  size: 1,
  position: 0,
  eta: 48.70225167274475,
  success: undefined
}
Current status: {
  type: 'status',
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-13T16:16:34.800Z,
  queue: true,
  stage: 'pending',
  code: undefined,
  size: undefined,
  position: 0,
  success: undefined,
  eta: 48.70225167274475
}
Received data: {
  type: 'data',
  time: 2024-05-13T16:16:53.801Z,
  data: [
    {
      path: '/tmp/gradio/dcd8a6847a30ba2167f57ac6a0a0a5b4a61a12ec/image.webp',
      url: '{Removed for privacy}/file=/tmp/gradio/dcd8a6847a30ba2167f57ac6a0a0a5b4a61a12ec/image.webp',
      size: null,
      orig_name: 'image.webp',
      mime_type: null,
      is_stream: false,
      meta: [Object]
    }
  ],
  endpoint: '/infer',
  fn_index: 1
}
Current status: {
  type: 'status',
  time: 2024-05-13T16:16:53.806Z,
  queue: true,
  message: undefined,
  stage: 'complete',
  code: undefined,
  progress_data: undefined,
  endpoint: '/infer',
  fn_index: 1
}
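One note on the difference between my two snippets: submit() and predict() take a positional array of inputs, rather than the named-parameter object from my first attempt, and the array order has to match the order of the endpoint's inputs. A small helper (hypothetical, just to make that ordering explicit) could look like:

```javascript
// Maps named inference parameters to the positional array that the
// /infer endpoint in this thread expects. The order must match the
// order of the endpoint's inputs exactly.
function buildInferPayload({
    prompt = " ",
    negative_prompt = "",
    seed = 0,
    randomize_seed = true,
    width = 256,
    height = 256,
    guidance_scale = 1,
    num_inference_steps = 2,
} = {}) {
    return [prompt, negative_prompt, seed, randomize_seed,
            width, height, guidance_scale, num_inference_steps];
}

// Usage (hypothetical Space and token):
// const app = await Client.connect("user/space", { hf_token: "hf_..." });
// app.submit("/infer", buildInferPayload({ prompt: "a cat" }));
```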

Thomas2419 (Author):

This has been fixed in the most recent gradio release, @gradio/client@0.19.3 for JavaScript and the overall gradio 4.31.2 release. Thanks to @pngwn for fixing it up!
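For anyone else who lands here, one way to pick up the fixed server-side release on the Space (given the README front-matter shown earlier) is to bump sdk_version; the client fix ships in @gradio/client 0.19.3:

```yaml
---
title: art
emoji: 🖼
colorFrom: purple
colorTo: red
sdk: gradio
sdk_version: 4.31.2   # bumped from 4.31.0 to pick up the fix
app_file: app.py
pinned: false
---
```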
