PyCharm 2024.1 (Professional Edition)
Build #PY-241.14494.241, built on March 28, 2024
Runtime version: 17.0.10+8-b1207.12 amd64
VM: OpenJDK 64-Bit Server VM by JetBrains s.r.o.
Windows 11.0
GC: G1 Young Generation, G1 Old Generation
Memory: 2048M
Cores: 8
Registry:
ide.experimental.ui=true
terminal.new.ui=true
Non-Bundled Plugins:
com.jetbrains.space (241.14494.150)
com.jetbrains.edu (2024.4-2024.1-742)
com.intellij.ml.llm (241.14494.320)
com.sourcegraph.jetbrains (5.5.9)
Describe the bug
When I ask any question, I receive the following error:
Cody encountered an error when processing your message:
⚠ org.eclipse.lsp4j.jsonrpc.ResponseErrorException: Request chat/new failed with message: No chat model found in server-provided config
Expected behavior
Cody should answer the question and return a response.
Additional context
Stacktrace:
java.util.concurrent.ExecutionException: org.eclipse.lsp4j.jsonrpc.ResponseErrorException: Request chat/new failed with message: No chat model found in server-provided config
at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396)
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073)
at com.sourcegraph.cody.chat.AgentChatSession.submitMessageToAgent$lambda$1(AgentChatSession.kt:109)
at com.sourcegraph.cody.agent.CodyAgentService$Companion$withAgent$2$1$task$1.invokeSuspend(CodyAgentService.kt:185)
at com.sourcegraph.cody.agent.CodyAgentService$Companion$withAgent$2$1$task$1.invoke(CodyAgentService.kt)
at com.sourcegraph.cody.agent.CodyAgentService$Companion$withAgent$2$1$task$1.invoke(CodyAgentService.kt)
at com.sourcegraph.cody.agent.CodyAgentService$Companion.coWithAgent(CodyAgentService.kt:241)
at com.sourcegraph.cody.agent.CodyAgentService$Companion$withAgent$2$1.invokeSuspend(CodyAgentService.kt:190)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108)
at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:280)
at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:85)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:59)
at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:38)
at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
at com.sourcegraph.cody.agent.CodyAgentService$Companion.withAgent$lambda$2(CodyAgentService.kt:182)
at com.intellij.openapi.application.impl.RwLockHolder$executeOnPooledThread$1.run(RwLockHolder.kt:154)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
at com.intellij.util.concurrency.ContextCallable.call(ContextCallable.java:32)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at com.intellij.util.concurrency.ContextRunnable.run(ContextRunnable.java:27)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1$1.run(Executors.java:702)
at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1$1.run(Executors.java:699)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1.run(Executors.java:699)
at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: org.eclipse.lsp4j.jsonrpc.ResponseErrorException: Request chat/new failed with message: No chat model found in server-provided config
at org.eclipse.lsp4j.jsonrpc.RemoteEndpoint.handleResponse(RemoteEndpoint.java:209)
at org.eclipse.lsp4j.jsonrpc.RemoteEndpoint.consume(RemoteEndpoint.java:193)
at org.eclipse.lsp4j.jsonrpc.json.StreamMessageProducer.handleMessage(StreamMessageProducer.java:194)
at org.eclipse.lsp4j.jsonrpc.json.StreamMessageProducer.listen(StreamMessageProducer.java:94)
at org.eclipse.lsp4j.jsonrpc.json.ConcurrentMessageProcessor.run(ConcurrentMessageProcessor.java:113)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
... 1 more
Additional info: org.eclipse.lsp4j.jsonrpc.ResponseErrorException: Request chat/new failed with message: No chat model found in server-provided config
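For context on the top of the stack trace: the plugin blocks on a `CompletableFuture` for the `chat/new` response, so the JSON-RPC failure surfaces wrapped in a `java.util.concurrent.ExecutionException`, with the `ResponseErrorException` as its cause. A minimal sketch of that wrapping behavior, using a plain `RuntimeException` as a stand-in for lsp4j's `ResponseErrorException` (which is not on the classpath here):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;

public class LspErrorDemo {
    // Simulates how a blocking .get() on the agent's response future surfaces
    // a failed JSON-RPC request: the original exception arrives as the cause
    // of an ExecutionException, matching the reported stack trace.
    static String failedRequestMessage(String message) {
        CompletableFuture<Void> response = new CompletableFuture<>();
        // Stand-in for lsp4j's ResponseErrorException.
        response.completeExceptionally(new RuntimeException(message));
        try {
            response.get();
            return "no error";
        } catch (ExecutionException e) {
            // The underlying error message is recovered via getCause().
            return e.getCause().getMessage();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return "interrupted";
        }
    }

    public static void main(String[] args) {
        System.out.println(failedRequestMessage(
            "Request chat/new failed with message: No chat model found in server-provided config"));
    }
}
```

This is only an illustration of the exception wrapping; the actual failure is that the server-provided config contains no chat model for the `chat/new` request to use.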