In theory, a GET request sent to a server should not have any side effects and should only retrieve data. In practice, implementation is entirely up to the developer, and this rule is about as useful as putting up an exit sign to prevent people from entering your building.
> Allow only GET requests i.e. make the internet read-only
If only developers never made use of GET to modify resources...
https://www.reddit.com/r/webdev/comments/6999x7/comment/dh4v...
Ironically, your URL demonstrates this nicely: it has a bunch of superfluous parameters that only serve to update some tracking database. Here is the "cleaned" URL: https://www.reddit.com/r/webdev/comments/6999x7/comment/dh4v...
I thought it'd be this old but memorable article: https://thedailywtf.com/articles/The_Spider_of_Doom
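The Spider of Doom failure mode (a crawler whose "read-only" GETs end up deleting content, because the site exposed deletion through plain links) can be sketched in a few lines. The routes and data here are hypothetical, purely to illustrate the anti-pattern:

```python
# Hypothetical routes showing why side-effecting GETs are dangerous:
# a crawler that merely follows links ends up mutating state.

pages = {"1": "first post", "2": "second post"}

def handle_get(path, query):
    """Naive router. The /delete route violates GET semantics
    (RFC 9110: GET is a 'safe' method, free of server-side effects)."""
    if path == "/delete":  # anti-pattern: mutation on GET
        pages.pop(query.get("id"), None)
        return "deleted"
    if path == "/page":
        return pages.get(query.get("id"), "404")
    if path == "/index":
        # The index links to delete URLs -- a spider will follow them.
        return [("/delete", {"id": pid}) for pid in list(pages)]
    return "404"

# A "read-only" crawler that only ever issues GETs:
for link, q in handle_get("/index", {}):
    handle_get(link, q)

print(pages)  # -> {} : the spider has deleted every page
```

No authentication, no POST, no malice required; just a crawler doing exactly what GET is supposed to permit.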
Am I misunderstanding this one? GET still sends information to another server; what is the "read-only" aspect?
> In theory, a GET request sent to a server should not have any side effects and should only retrieve data. In practice, implementation is entirely up to the developer, and this rule is about as useful as putting up an exit sign to prevent people from entering your building.
I meant read-only there in the sense of mutability, not exfiltration.
Of course, some websites may permit mutations through GET so it’s probably only sensible to use alongside known hosts.
```
GET https://mysite.com/?query=all+the+secrets
```
This describes httpjail, a new Rust sandbox proxy tool: https://github.com/coder/httpjail
It works for any process, not just Claude Code. I got it working with Codex CLI after installing it via Cargo (it's also available through Homebrew).

I wrote more notes about it here: https://simonwillison.net/2025/Sep/19/httpjail/

Previously [0]
0. https://news.ycombinator.com/item?id=45307459
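The exact invocation wasn't quoted above. A minimal sketch, assuming the `--js` request-predicate flag described in the httpjail README (the allowed host and the rule itself are illustrative, not the comment author's actual command):

```shell
# Install (either route, per the comment above):
cargo install httpjail
# brew install httpjail   # assumption: Homebrew formula, as the comment implies

# Run Codex CLI with all HTTP(S) traffic denied except one allowed host.
# --js evaluates a JavaScript predicate per request; the host below is
# illustrative -- allow whatever endpoints your tool actually needs.
httpjail --js "r.host === 'api.openai.com'" -- codex
```

Anything the wrapped process tries to fetch from a non-matching host gets blocked at the proxy, which is the "read-only internet" idea enforced one layer down.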
We'll merge that comment hither. Thanks!
The timestamps remained accurate! That's awesome.