Unable to Parse Large NBT Arrays #85

I am attempting to parse NBT files that contain very large arrays. Whenever I do this, an error is thrown because the underlying node-protodef library (https://github.com/ProtoDef-io/node-protodef/blob/912683467d4eec4df852a4ae66030338416b48b1/src/datatypes/compiler-structures.js#L13) caps arrays at a maximum size of 0xFFFFFF (16,777,215). According to the commit history (ProtoDef-io/node-protodef@6a625d6), this check was originally added to prevent OOM crashes.

Based on the NBT format specification (https://minecraft.fandom.com/wiki/NBT_format#Binary_format), a byte array should be able to have a size of up to 2^31. Support should be added to allow larger NBT arrays to be parsed.

Example Code

    const { parse } = require('prismarine-nbt');

    const arrBuffer = await file.arrayBuffer();
    const buffer = Buffer.from(arrBuffer); // Buffer.from replaces the deprecated new Buffer()
    // Error happens here
    const parsed = (await parse(buffer)).parsed;

Error

    ⨯ node_modules/protodef/src/compiler.js (70:11) @ readFn
    ⨯ Error: Read error for undefined : array size is abnormally large, not reading: 18571392
        at async uploadFile (./src/app/load/_actions/uploadFile.ts:22:21)

Example NBT File

This is an example NBT file that includes a very large byte array (a size of about 18 million). Parsing will throw an error if executed on this file; no error is thrown when an NBT file with smaller arrays is parsed.

LargeNBTFile.schem.gz (https://github.com/PrismarineJS/prismarine-nbt/files/13802101/LargeNBTFile.schem.gz)

Comments

We could add an option to that node-protodef type to allow large arrays, keep it off by default, and then propagate that option to pnbt (prismarine-nbt). How come your file has such big arrays?
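
A minimal sketch of how such an opt-in could look from the prismarine-nbt side; the allowLargeArrays option name and the options argument are hypothetical, not part of the current API:

    // Hypothetical sketch: `allowLargeArrays` is an invented option name;
    // prismarine-nbt does not currently accept an options object like this.
    const nbt = require('prismarine-nbt');

    async function parseSchematic (buffer) {
      // Off by default; a caller opts in explicitly when it expects huge arrays.
      const { parsed } = await nbt.parse(buffer, { allowLargeArrays: true });
      return parsed;
    }
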
This seems like a reasonable solution to me.

I am working with Minecraft schematic files. Every single block has its own index in the byte array, so a schematic with dimensions of 500 * 500 * 75 reaches a size of about 18 million, surpassing the limit. It is probably not an optimal format, but I am working with the standard that is already set.
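
For reference, the array length here is just the schematic volume, which already exceeds node-protodef's cap:

    // One entry per block, so the array length equals the schematic volume.
    const length = 500 * 500 * 75; // 18,750,000
    const limit = 0xFFFFFF;        // 16,777,215, node-protodef's current cap
    console.log(length > limit);   // true -> "array size is abnormally large"
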
Ok, makes sense. You may be interested in https://github.com/PrismarineJS/prismarine-schematic
Some time ago (2020) there were problems with prismarine-nbt crashing V8 with OOM, so that check was meant to fail safely. I was thinking that check could be replaced with a simple if (arrayLength > (offset + buffer.length)) throw() check, but actually we can't do that, since we don't know the size of the array items (which may be variable) before iterating. Since the compiler is available to the type-generation code, it should not be hard to skip that line based on a condition like if (compiler.skipChecks == true).
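
A minimal sketch of that idea, assuming the compiled read code can see a flag on the compiler; skipChecks is the proposed name, not an existing node-protodef option, and the guard below only mirrors the error text quoted above:

    // Sketch only: `skipChecks` is the proposed flag, not an existing
    // node-protodef option.
    function checkCount (count, compiler) {
      // Current behavior: refuse abnormally large counts to avoid OOM crashes.
      if (!compiler.skipChecks && count > 0xFFFFFF) {
        throw new Error('array size is abnormally large, not reading: ' + count);
      }
      return count;
    }
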
Why not add a param for the array type? That seems cleaner than a global param.
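
For illustration, a per-type opt-in might look like this in an nbt.json-style array definition; the largeArray option name is invented, not an existing ProtoDef option:

    "byteArray": ["array", {
      "countType": "i32",
      "type": "i8",
      "largeArray": true
    }]
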
Because then every instance in the protocol schema, the nbt.jsons, and all the protocol.json files that use an array type would need to be updated. Otherwise the check would not be consistent.
It can be made to default to false, so we don't need to update usages.
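
Concretely, with the invented largeArray option from above defaulting to false, existing definitions would keep today's checked behavior, and only new, explicit opt-ins would skip it:

    ["array", { "countType": "i32", "type": "i8" }]                      // checked (default)
    ["array", { "countType": "i32", "type": "i8", "largeArray": true }]  // explicit opt-in
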
Right, but then users would need to have a way to update the type in the relevant part of the schema to toggle the behavior.
@extremeheat and @rom1504, I have an open PR in node-protodef to address the proposed change.