fnafgame.io - Error when trying to start the game
Categories
(Core :: Networking, defect, P3)
People
(Reporter: ctanase, Assigned: valentin)
References
(Blocks 1 open bug, Regression)
Details
(4 keywords, Whiteboard: [webcompat-source:product])
User Story
platform:windows,mac,linux impact:workflow-broken configuration:general affects:all branch:release
Attachments
(2 files)
Environment:
- Operating system: Windows 11/10
- Firefox version: 128.0 (release) / 130
Preconditions:
- Clean profile
Steps to reproduce:
- Navigate to: https://fnafgame.io/fnaf-3
- Click on "PLAY NOW" button.
- Observe the behavior.
Expected Behavior:
The game loads correctly.
Actual Behavior:
The game attempts to load, then displays an error.
Notes:
- Reproducible on the latest Firefox Release and Nightly
- Reproducible regardless of the ETP setting
- Works as expected using Chrome
Created from webcompat-user-report:80b55015-263c-4f75-be3d-ea442f2c8de8
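The failure comes down to length arithmetic. A sketch of it, assuming the 47 MB figure in comment 1 refers to the decoded payload (if it refers to the spec length instead, the conclusion is the same, since 47 MB also falls between the old and new caps):

```typescript
// Why the game broke: base64 inflates a payload by 4/3, so a ~47 MB
// asset bundle produces a data: URI spec of roughly 63 MB, which
// exceeded Firefox's former 32 MB cap but fits Chrome's 512 MB budget.
const MB = 1024 * 1024;

function dataURILength(payloadBytes: number): number {
  const prefix = "data:application/octet-stream;base64,".length;
  return prefix + 4 * Math.ceil(payloadBytes / 3);
}

const OLD_LIMIT = 32 * MB;  // former Firefox cap (rejected the game)
const NEW_LIMIT = 512 * MB; // Chrome's cap, later matched by Firefox

const uriLen = dataURILength(47 * MB);
console.log(uriLen > OLD_LIMIT, uriLen < NEW_LIMIT); // true true
```

Either way the spec lands between the two caps, which is why the page works in Chrome but failed in Firefox at the time of the report.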
Updated•1 year ago
Comment 1•1 year ago
A 47 MB data: URI, tripping over bug 1721448.
Comment 2•1 year ago
Chrome doesn't seem to have much of a limit here. In my testing I hit the ~512 MB JavaScript string length limit before hitting any URL size limit.
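The probe described in this comment can be sketched as a doubling loop. The `cap` parameter here is our own safety bound so the sketch doesn't actually allocate hundreds of megabytes; in a real browser you would pass `Infinity` and let the engine's ceiling (roughly 2^29 characters in V8, i.e. about 512 M) stop the loop:

```typescript
// Grow a string by doubling until either our cap or the engine's
// string-length ceiling is reached; returns the largest length built.
function probeMaxStringLength(cap: number): number {
  let s = "a";
  try {
    while (s.length * 2 <= cap) {
      s = s + s;
    }
  } catch {
    // RangeError once the engine refuses to allocate a longer string
  }
  return s.length;
}
```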
Comment 3•1 year ago
Set release status flags based on info from the regressing bug 1721448
:valentin, since you are the author of the regressor, bug 1721448, could you take a look?
For more information, please visit BugBot documentation.
Comment 4•1 year ago
According to Nika on the #ipc matrix channel:
We can probably relax the url length limit if we need to now, as we send giant string buffers in shared memory nowadays automatically
I believe a data: URI would be a SimpleURIParams with a gigantic path field (https://searchfox.org/mozilla-central/rev/4f5426084651871759f5346eb0ded2e9ac5326fd/ipc/glue/URIParams.ipdlh#16), which would end up serialized via shared memory.
Let's bump the limit up to 512 MB to match Chrome.
Comment 5•1 year ago
Previously we limited data: URL sizes to 32 MB to avoid hitting the
IPC message size limit of 256 MB. However, since then we've added the
ability to serialize large strings using shared memory (see bug
1783240). That means we can now relax this limit to 512 MB to match
Chrome. Allocating and copying very large strings will be slow, but
as long as there's enough memory it will work.
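The reasoning in the commit message can be checked with simple arithmetic (constant names here are ours, sketched in TypeScript rather than Gecko's actual C++): the new cap exceeds the plain IPC message limit, which is why shared-memory string serialization from bug 1783240 is a prerequisite for raising it.

```typescript
const MB = 1024 * 1024;
const OLD_DATA_CAP = 32 * MB;  // former data: URI cap
const NEW_DATA_CAP = 512 * MB; // new cap, matching Chrome
const IPC_MSG_CAP = 256 * MB;  // plain IPC message size limit

// A spec this long must fit in a plain IPC message unless large
// strings can travel via shared memory instead.
function fitsPlainIpcMessage(specLength: number): boolean {
  return specLength <= IPC_MSG_CAP;
}

console.log(fitsPlainIpcMessage(OLD_DATA_CAP)); // true:  old cap was safe
console.log(fitsPlainIpcMessage(NEW_DATA_CAP)); // false: needs shared memory
```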
Comment 6•1 year ago
I was trying to investigate improving error logging in DevTools.
In the console, we do show an error:
Window.fetch: data:application/octet-stream;base64,UEsDBAoAAAAIAMmDalFaxkUFW6ICAJLEEAAMAAAAcHJvamVjdC5qc29u…
Unfortunately, the error message is "${dataUrl} is not a valid URL", but because the URL is so long, the trailing "is not a valid URL" part is never visible.
We could use a regexp to strip the URL in the DevTools frontend, but I'm wondering whether we should instead cap the size of the URI in the platform code?
https://searchfox.org/mozilla-central/rev/5b061cdc4d40d44988dc61aa941cfbd98e31791f/dom/fetch/Response.cpp#99-103
I was also wondering if we could introduce a new error code, like NS_ERROR_OVERSIZED_URI, instead of reusing NS_ERROR_MALFORMED_URI there:
https://searchfox.org/mozilla-central/rev/5b061cdc4d40d44988dc61aa941cfbd98e31791f/netwerk/base/nsNetUtil.cpp#1830-1833
That way the code in Response.cpp could emit a more specific error message, and apply the URL size capping only in that case.
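One way the DevTools-side capping could look (the function name and output format below are ours, not anything in mozilla-central):

```typescript
// Hypothetical helper: cap a data: URL before interpolating it into an
// error string, so the "... is not a valid URL" suffix stays visible.
function truncateURLForMessage(url: string, max: number = 64): string {
  if (url.length <= max) return url;
  return url.slice(0, max) + "… (" + url.length + " chars total)";
}

const longDataURL =
  "data:application/octet-stream;base64," + "A".repeat(100000);
console.log(truncateURLForMessage(longDataURL) + " is not a valid URL");
```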
In the Network Monitor we don't show anything.
It would have been neat to see the request in the Network Monitor, but I imagine the request never actually starts, since we bail out early during URL construction. Any idea whether we could still notify DevTools/the Network Monitor about such early fetch failures?
Comment 9•1 year ago
Moving to Networking since the patch landed in this bug directly.
Comment 10•1 year ago
FYI: the MDN documentation for Firefox 136 covering this change was done in https://github.com/mdn/content/pull/37949.
Comment 11•1 year ago
Managed to reproduce the issue on Firefox 133.0a1 (2024-10-06), under Windows 11 x64.
The issue is no longer reproducible on Firefox 137.0a1 (2025-02-07) or on Firefox 136.0b2.
Tests were performed on Windows 11 x64 and on macOS 10.15.