
LangChain vulnerable to code injection

Critical severity · GitHub Reviewed · Published Apr 5, 2023 to the GitHub Advisory Database · Updated Apr 17, 2023

Package

langchain (pip)

Affected versions

<= 0.0.131

Patched versions

None

Description

In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec() method.
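The vulnerable pattern can be illustrated with a minimal sketch. This is not LangChain's actual source; `run_math_chain` is a hypothetical stand-in showing the general shape of the flaw: text produced by the LLM is handed straight to `exec()`, so a prompt that steers the model into emitting Python code instead of arithmetic achieves arbitrary code execution.

```python
def run_math_chain(llm_output: str) -> str:
    """Hypothetical sketch of the flaw: evaluate model-produced 'math'
    by passing it directly to exec(), as LLMMathChain did through 0.0.131."""
    local_vars = {}
    # Unsafe: llm_output is attacker-influenced via prompt injection,
    # and exec() will run any Python expression it contains.
    exec(f"result = {llm_output}", {}, local_vars)
    return str(local_vars["result"])

# Benign use: the model returns an arithmetic expression.
print(run_math_chain("2 + 2"))  # 4

# Prompt-injection payload: the attacker convinces the model to emit
# code rather than arithmetic, e.g. importing os to inspect the host.
payload = "__import__('os').getcwd()"
print(run_math_chain(payload))  # executes os.getcwd() on the server
```

Later LangChain releases replaced `exec()`-based evaluation with a restricted expression evaluator, which is the standard remediation for this class of bug: never evaluate untrusted model output with `exec()` or `eval()`.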

References

Published by the National Vulnerability Database Apr 5, 2023
Published to the GitHub Advisory Database Apr 5, 2023
Reviewed Apr 5, 2023
Last updated Apr 17, 2023

Severity

Critical
9.8 / 10

CVSS base metrics

Attack vector: Network
Attack complexity: Low
Privileges required: None
User interaction: None
Scope: Unchanged
Confidentiality: High
Integrity: High
Availability: High
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H

CVE ID

CVE-2023-29374

GHSA ID

GHSA-fprp-p869-w6q2

Source code
