Setup
DB system (MySQL, Blazegraph, etc.) and version: MySQL 8.0.31
Issue
Creating a large number of redirects quickly on a wiki using SMW can cause the webserver to get overwhelmed with parses of the target page. After some investigation, I believe it's because any time a redirect gets created, it triggers a "ChangeTitleUpdate" deferred update, which does a full parse of the target page.
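For context, this is roughly the pattern being described, as a hedged sketch in core MediaWiki terms (illustrative only; the names and wiring below are not SMW's actual ChangeTitleUpdate code):

```php
use MediaWiki\MediaWikiServices;

// Illustrative sketch of the pattern described above: a deferred
// update, registered when the redirect is saved, that re-parses the
// redirect *target* on the same webserver that handled the edit.
DeferredUpdates::addCallableUpdate( static function () use ( $targetTitle ) {
	$page = MediaWikiServices::getInstance()
		->getWikiPageFactory()
		->newFromTitle( $targetTitle );
	$content = $page->getContent();
	if ( $content !== null ) {
		// Full parse of the target page; this is the expensive step.
		$content->getParserOutput( $targetTitle );
	}
} );
```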
Does anyone know what the purpose of this update is? I'm having a hard time understanding why creating a redirect would require a re-parse of the target page. I tested the 4 cases listed in #895 with the update commented out, and the only degradation was in test case 3: if I redirected a page with semantic data to [[Broken redirect]], the semantic data would end up on [[Broken redirect]]. Notably, test case 4 (redirecting to an existing page) worked just fine even with the update removed. From my perspective, this update could be replaced with a much simpler check for whether the target page exists, which would close off the attack vector.
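If that reading is correct, the replacement could be as small as the following (a hypothetical sketch, not tested against SMW's code; $redirectContent stands in for the newly saved redirect's content object):

```php
// Hypothetical sketch: only the broken-redirect case (test case 3)
// seems to need handling, so gate on the target's existence instead
// of re-parsing it.
$targetTitle = $redirectContent->getRedirectTarget();
if ( $targetTitle !== null && !$targetTitle->exists() ) {
	// Target is missing: clean up / reassign the semantic data here.
	// For an existing target (test case 4) nothing happens, and no
	// parse of the target is triggered either way.
}
```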
If I'm wrong and the update is indeed necessary, would it be better to run it through the job queue rather than as a deferred update? Since it's deferred (but still runs on the main webserver), it's quite easy for someone bot-creating a batch of redirects to subtly DoS the server with all the re-parses, since their api.php edit requests won't wait for the parses to finish.
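If the re-parse does have to stay, here is a rough sketch of the queue-based alternative using core's job machinery (the job type name is made up for illustration; a real patch would register a Job subclass for it):

```php
use MediaWiki\MediaWikiServices;

// Rough sketch: enqueue the re-parse so it runs via the job queue
// (runJobs.php or $wgJobRunRate) instead of in the deferred-update
// phase of the editing web request. 'smwChangeTitleUpdate' is a
// made-up job type for illustration.
$job = new JobSpecification(
	'smwChangeTitleUpdate',
	[ 'target' => $targetTitle->getPrefixedDBkey() ],
	[],
	$targetTitle
);
MediaWikiServices::getInstance()->getJobQueueGroup()->push( $job );
```

That would smooth the load out over job runners and keep a redirect-creation burst from tying up webserver CPUs.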
Steps to reproduce the observation:
Find a target page with a long parse time (e.g. 5+ seconds)
Create ~100 redirects in a minute to this target page
This will fully engage ~10 CPU cores with repeated parses of the target page
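For anyone wanting to script the reproduction, here is a hedged sketch of step 2 against the Action API; the endpoint, credentials, and target page name are placeholders to substitute for your own test wiki:

```php
<?php
// Repro sketch (placeholders throughout): create ~100 redirects to
// one slow-to-parse target page in quick succession via api.php.
$api = 'https://wiki.example.org/w/api.php';
$cookies = tempnam( sys_get_temp_dir(), 'smw-repro' );

function apiPost( string $api, array $params, string $cookies ): array {
	$ch = curl_init( $api );
	curl_setopt_array( $ch, [
		CURLOPT_POST => true,
		CURLOPT_POSTFIELDS => http_build_query( $params + [ 'format' => 'json' ] ),
		CURLOPT_RETURNTRANSFER => true,
		CURLOPT_COOKIEJAR => $cookies,
		CURLOPT_COOKIEFILE => $cookies,
	] );
	$raw = curl_exec( $ch );
	curl_close( $ch );
	return json_decode( (string)$raw, true ) ?? [];
}

// Log in with a bot password, then grab a CSRF token.
$loginToken = apiPost( $api, [ 'action' => 'query', 'meta' => 'tokens',
	'type' => 'login' ], $cookies )['query']['tokens']['logintoken'];
apiPost( $api, [ 'action' => 'login', 'lgname' => 'ReproBot@repro',
	'lgpassword' => 'bot-password-here', 'lgtoken' => $loginToken ], $cookies );
$csrf = apiPost( $api, [ 'action' => 'query', 'meta' => 'tokens' ],
	$cookies )['query']['tokens']['csrftoken'];

// Create the redirects; each edit returns as soon as the deferred
// update is queued, so the parses pile up server-side.
for ( $i = 0; $i < 100; $i++ ) {
	apiPost( $api, [
		'action' => 'edit',
		'title' => "Repro redirect $i",
		'text' => '#REDIRECT [[Slow target page]]',
		'token' => $csrf,
	], $cookies );
}
```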
leejt added the bug label on Apr 5, 2024