
Source Code Transparency #124

Open
twiss opened this issue Oct 12, 2023 · 2 comments
twiss commented Oct 12, 2023

[As presented at the Secure the Web Forward Workshop (transcript, slides and video), and before that at TPAC 2023 in the WebAppSec WG (minutes).]

Introduction

Today, whenever you open a web app, the browser fetches and runs its source code from the server. This enables the ease of deployment and iteration that the web is known for, but can also pose security risks. In particular, for web apps that don't want to trust the server, such as those using client-side encryption to protect the user's data before sending it to the server, or those processing the user's data entirely client-side without sending any sensitive data to the server, the current security model of the web is insufficient.

After all, if a web app claims not to send sensitive data to the server, this is very difficult for users to check, as they would need to read (and understand) the source every time it's loaded from the server. Even for security researchers, such a claim is impossible to verify, as the server could simply serve a different version of the web app to a user it wants to target than to the security researcher.

Therefore, we would like to propose a mechanism that enables security researchers to audit the source code of especially-sensitive web apps as it is (or was) served to any user, not just the code served to the researchers themselves.

Proposed Solution

Concretely, this could be done by publishing (the hash of) a Web Bundle of the web app to a transparency log (similar to Certificate Transparency logs), e.g. sigstore, or a yet-to-be-created log à la IETF's SCITT proposal, and requiring browsers to check that the source code was published in the transparency log before running it.
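The log check this describes is essentially the Merkle-tree inclusion proof used by Certificate Transparency (RFC 6962). A minimal sketch in Python, treating the serialized Web Bundle bytes as a log entry — the function names and the proof shape here are illustrative, not part of the proposal:

```python
import hashlib

def leaf_hash(entry: bytes) -> bytes:
    # RFC 6962 leaf hash: SHA-256(0x00 || entry)
    return hashlib.sha256(b"\x00" + entry).digest()

def node_hash(left: bytes, right: bytes) -> bytes:
    # RFC 6962 interior node hash: SHA-256(0x01 || left || right)
    return hashlib.sha256(b"\x01" + left + right).digest()

def verify_inclusion(bundle: bytes, index: int, tree_size: int,
                     proof: list[bytes], root: bytes) -> bool:
    """Verify an RFC 6962-style audit path showing that `bundle` is the
    entry at `index` in a log of `tree_size` entries with head `root`."""
    if index >= tree_size:
        return False
    fn, sn = index, tree_size - 1
    r = leaf_hash(bundle)
    for p in proof:
        if sn == 0:
            return False  # proof longer than the path to the root
        if fn % 2 == 1 or fn == sn:
            r = node_hash(p, r)
            if fn % 2 == 0:
                # rightmost node with no sibling: skip the missing levels
                while fn % 2 == 0 and fn != 0:
                    fn >>= 1
                    sn >>= 1
        else:
            r = node_hash(r, p)
        fn >>= 1
        sn >>= 1
    return sn == 0 and r == root
```

A browser (or auditor) holding only the log's signed tree head could run this check against a short proof, without downloading the whole log.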

To signal to the browser that Source Code Transparency should be used, we propose introducing an X.509 certificate extension indicating that the browser must check the source code of the web app against Source Code Transparency logs. The extension would automatically be recorded in Certificate Transparency logs, making it detectable to security researchers whether a web application uses Source Code Transparency. A security researcher could then check the web app's source code in the transparency logs trusted by browsers, and be sure that no other (malicious) source code was served.
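The resulting browser-side decision could be sketched as below. Everything here is a hypothetical stand-in: `should_run`, the boolean flag standing in for the proposed (not-yet-assigned) certificate extension, and the set of already-verified bundle hashes standing in for a real signed log query:

```python
import hashlib

def should_run(bundle: bytes, cert_requires_sct: bool,
               verified_log_hashes: set[str]) -> bool:
    """Decide whether to execute a fetched web app bundle.

    cert_requires_sct:   stands in for the proposed X.509 extension.
    verified_log_hashes: stands in for hashes whose log inclusion the
                         browser has already verified.
    """
    if not cert_requires_sct:
        # Ordinary web app: no transparency requirement, run as usual.
        return True
    digest = hashlib.sha256(bundle).hexdigest()
    # Gated app: refuse to run code that is not in a trusted log.
    return digest in verified_log_hashes
```

Because the extension travels in the certificate, a server cannot silently drop the requirement for a targeted user without obtaining a different certificate, which would itself be visible in Certificate Transparency logs.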

Read the complete Explainer.

Feedback

I welcome feedback in this thread, but encourage you to file bugs against the Explainer.

@marcoscaceres (Contributor)

@beurdouche might have some thoughts or feedback... anyone else we can rope into the discussion?

@beurdouche

Yes, coincidentally I just emailed Daniel about it...
What I can publicly share at this point is that there is movement on the Integrity and Transparency front. We are all trying to consolidate interest with different partners before we re-initiate the technical discussion within WebAppSec, hopefully soon : )
