How to Prevent URL Stripping on Szurubooru
One of the platforms I didn’t anticipate needing, but ultimately found myself pondering over, is an image database. For personal or interpersonal photos, there are free and paid viewers, libraries, and management software available. For creating and curating new images, tools exist for professional illustrators. But neither particularly helps when the images in question are meant to be read and/or referenced. Szurubooru is one such image board platform that could be the solution for personal use. I am still in the process of doing a proper write-up on it.
Regardless, one of the odd bits of szurubooru’s behavior is its URL stripping, or normalization. It removes the query from the URL, so only the post ID is displayed. For example, if I were to look up da_vinci sketch, the image board displays images tagged both da_vinci and sketch. The query, if I understand it correctly, is only removed from the URL as the webpage loads; e.g. domain.org/post/postID/query=da_vinci sketch is normalized to domain.org/post/postID. For lack of a better word, I suppose it is meant to prettify the URL.
Now, the problem is, neither szurubooru nor modern browsers are particularly keen on keeping the tab (or session) alive. So the problem I have faced a multitude of times is: 1. search for an image and have it on display, 2. while working on something else, the browser loses the state (including query) info, and 3. now I can’t navigate normally; instead it takes me to the unrelated next post ID.
My fix here is simple: stop szurubooru from stripping the URL query. Modern browsers, at least Safari and Firefox, may lose JavaScript memory, but not the actual URL of the tab. The page can be reloaded, if szurubooru had kept the query in the URL. Create two files in the same directory as compose.yaml: inject.sh and custom.js. These are their contents, respectively.
#!/bin/sh
# inject.sh -- add the custom.js script tag to the client's index.htm,
# then hand control back to the container's original entrypoint.
cp /var/www/index.htm /var/www/index.htm.bak
if ! grep -q 'custom.js' /var/www/index.htm; then
  sed -i 's|</body>|<script src="/custom.js"></script></body>|' /var/www/index.htm
fi
exec "$@"
// custom.js
(function () {
  // Re-attach the saved search query whenever szurubooru strips it from a post URL.
  const _replaceState = history.replaceState.bind(history);
  history.replaceState = function (state, title, url) {
    if (url && /\/post\/\d+$/.test(url)) {
      const m = document.cookie.match(/(?:^|;\s*)szuru_q=([^;]*)/);
      const q = m ? decodeURIComponent(m[1]) : null;
      if (q) url = `${url}/query=${encodeURIComponent(q)}`;
    }
    _replaceState(state, title, url);
  };

  // Remember the most recent search query whenever a search link is clicked.
  document.addEventListener('click', function (e) {
    const link = e.target.closest('a[href*="/query="]');
    if (!link) return;
    const m = link.getAttribute('href').match(/\/query=([^;]+)/);
    if (m) document.cookie = `szuru_q=${encodeURIComponent(decodeURIComponent(m[1]))};path=/;max-age=86400`;
  }, true);

  // Clear cookie when browsing without search
  const _pushState = history.pushState.bind(history);
  history.pushState = function (state, title, url) {
    _pushState(state, title, url);
    if (!window.location.href.includes('/query=') && !window.location.href.includes('/post/')) {
      document.cookie = 'szuru_q=;path=/;max-age=0';
    }
  };
})();
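The cookie round-trip that custom.js performs can be sanity-checked outside the browser. The sketch below simulates it in plain Node, with a local cookie string standing in for document.cookie; the link href and the post ID 1234 are made-up examples, and the regexes mirror the ones in the script above:

```javascript
// Simulate custom.js's cookie round-trip. 'cookie' stands in for
// document.cookie; '/posts/query=...' and post 1234 are hypothetical examples.

// 1. A search link is clicked: save its query in the cookie.
const href = '/posts/query=da_vinci%20sketch';
const tag = href.match(/\/query=([^;]+)/)[1];
const cookie = `szuru_q=${encodeURIComponent(decodeURIComponent(tag))};path=/;max-age=86400`;

// 2. Later, replaceState fires with a stripped post URL: re-attach the query.
let url = '/post/1234';
if (/\/post\/\d+$/.test(url)) {
  const m = cookie.match(/(?:^|;\s*)szuru_q=([^;]*)/);
  const q = m ? decodeURIComponent(m[1]) : null;
  if (q) url = `${url}/query=${encodeURIComponent(q)}`;
}
console.log(url); // the stripped URL with the query re-attached
```

The decode-then-encode step keeps the cookie value in a single, consistent encoding regardless of whether the clicked href was already percent-encoded.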
With the two files saved, change compose.yaml to include the following. Most likely, volumes: already exists in the Docker compose file, so append the new entries there.
#compose.yaml
volumes:
  - ./custom.js:/var/www/custom.js:ro
  - ./inject.sh:/inject.sh:ro
entrypoint:
  - /bin/sh
  - /inject.sh
command:
  - /docker-start.sh
Technically, the script does not prevent szurubooru from stripping the URL; it adds the query back after the fact. Also, it is a product of vibe coding, about one to two Claude Pro continuous sessions’ worth. As a hobbyist, I find it delightful that I don’t have to tiptoe around JavaScript thanks to LLMs. But the more straightforward solution to this problem would have been for szurubooru to handle it either fully on the server or delegate it to the client properly. Hopefully in the near future there will be a permanent fix (or a simple on/off setting).