Implemented basic support for V11 file format. #5

Merged · 6 commits · Sep 9, 2023

Conversation

miragedmuk

Hacky implementation. Somebody with better knowledge of file formats should be able to tidy up what I've done but it works (at least on the saves I have available to test with)

@alex4401

Looks quite alright. It could use a few more checks to ensure things line up with stuff we've seen personally, even if they'll likely never be hit.


A few more personal notes I've made while looking at V11 (most were shared with Cad, and I hope it didn't ruin his vacation); a bunch will overlap with what's already done in the MR, so plenty of these points are redundant:

  • V11 was added primarily for official servers to optimise save times. No idea why Nitrado is enabling it on some servers.
  • All new fields in the header are 64-bit signed integers.
    • Official servers often have files bigger than 4 GB. Some saves stalled the server for a while.
  • The new fields come in pairs: the first is a pointer, the second is the size of the segment. In all files I've checked only the first pair was ever used; the rest pointed to the end of file with a size of zero.
    • Either those other segments are currently unused, dedicated for specific struct types, or are an "overflow" zone if the previous buffer is exhausted. Really doubtful they can even store files this big though.
  • Name table is detached and property tags are not serialised in the new data segment.
    • Names are serialised as strings instead.
    • Because there are no property tags, there is no data layout information. This will need to be hardcoded for any struct that needs to be read from this segment.
    • I have no idea how many structs/properties are affected.
    • Possibly, if a mod can include a custom struct inside the segment, changing the struct's definition between mod versions will crash the game on save file load.
    • Lack of property tags is somewhat similar to later UE4's unversioned property serialisation, with some differences: notably, keep-skip bitmasks are missing.
  • I wouldn't be surprised if there are other fields that got misinterpreted as 32-bit with convenient 4 zero bytes after.
  • It seems the redirections can happen on arrays of structs. Don't have enough info to say whether it's just CustomItemDatas or not.
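The header layout described above (pairs of 64-bit signed integers: a pointer, then a segment size) could be probed with a short sketch like this. Python is used for brevity; the function name, the field names, and the pair count are guesses based on the observations in this thread, not confirmed against the format:

```python
import struct

def read_v11_header_pairs(buf: bytes, offset: int, count: int):
    """Read `count` (pointer, size) pairs of little-endian signed 64-bit
    integers from `buf` starting at `offset`. In the files observed so far,
    only the first pair was populated; the rest pointed at end-of-file
    with a size of zero."""
    pairs = []
    for i in range(count):
        # "<qq" = two little-endian int64 values, 16 bytes per pair
        ptr, size = struct.unpack_from("<qq", buf, offset + i * 16)
        pairs.append((ptr, size))
    return pairs
```

A check that unused trailing pairs really do point at EOF with zero size (as observed) would be a cheap way to detect format drift in future saves.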

Perhaps ZenRowe could shed some light on the format changes (though I personally doubt it), but that would be dodgy info to proceed with in an open project.

Some nitpicks (I'm not affiliated with the project, these are just my thoughts), probably just pedantic and not that important:

  • The redirections I mentioned can be treated within ArkArrayStruct, but the save file version and new block offset need to be somehow passed over throughout the pipeline.
  • Redirected struct elements are replaced, as you've noticed. Always 18 bytes per array element, seemingly "always" beginning with a 16-bit 0002, then a 64-bit jump target, then a 64-bit field (likely a size, but it presents as zero; not that uncommon I'd say). IMO it'd be good to assert on these wherever there are conditions we could test.
  • Triple check that the unknown properties within the object data don't line up by accident with the V9 decoded struct.
  • Ideally, struct layout decoding should be done by a separate unit. Less spaghetti. Easier to add other types in.
    • The segment seems to be a new, separate "archive" from UE's perspective, therefore name table should never be mounted. Turn it off before doing the jump, turn it back on when leaving the segment.
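The 18-byte redirection records described in the nitpicks above (16-bit 0002 marker, 64-bit jump target, 64-bit size-like field observed as zero) could be parsed with the observed invariants asserted, so any deviation surfaces immediately instead of silently corrupting the read. A Python sketch; the marker and size interpretations are only what has been observed in this thread, not a confirmed spec:

```python
import struct

def read_redirection_record(buf: bytes, offset: int) -> int:
    """Parse one 18-byte redirection record: a 16-bit marker (observed as
    0x0002), a 64-bit jump target, and a 64-bit field that is likely a
    size (observed as zero). Returns the jump target."""
    # "<Hqq" = uint16 + int64 + int64, 18 bytes, little-endian, no padding
    marker, target, size = struct.unpack_from("<Hqq", buf, offset)
    # Assert the invariants observed so far, so deviations fail loudly.
    assert marker == 0x0002, f"unexpected redirection marker {marker:#06x}"
    assert size == 0, f"unexpected size field {size}"
    return target
```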

@miragedmuk
Author

miragedmuk commented Aug 23, 2023

Thanks Alex. My work was definitely done on a much less structured approach than you took, and I'm still surprised it works for the purposes of what I needed for my own app ASV.

I simply stepped through the bytes, using screenshots and exports from the previous file format as a comparison, and read in whatever data looked to make sense. I used the lookup pointer provided in the new CustomItemDatas to determine where to start for each creature, along with the new header offset for stored creature data.

I have tidied it up a little in my own code repo, creating a new "ArkStore" class which reads in all of this new data. Unfortunately that repo was never forked from the original toolkit (the work was done outside of GitHub before I uploaded it there), so I can't open a pull request for it so easily.

@miragedmuk
Author

    public class ArkStore : GameObjectContainerMixin
    {
        public string ClassName { get; internal set; } = string.Empty;
        public string Summary { get; internal set; } = string.Empty;
        public string Gender { get; internal set; } = string.Empty;
        public int[] Colors { get; internal set; } = new int[5];
        public Dictionary<string,string> Stats { get; internal set; } = new Dictionary<string, string>();
        public GameObject? CreatureComponent { get; internal set; } = null;
        public GameObject? StatusComponent { get; internal set; } = null;
        public GameObject? InventoryComponent { get; internal set; } = null;

        private long propertiesOffset = 0;

        public ArkStore(ArkArchive archive) 
        {
            ReadBinary(archive);
        }

        public void ReadBinary(ArkArchive archive)
        {
            var objectType = archive.ReadString(); //type?
            if (objectType.Equals("dino", StringComparison.InvariantCultureIgnoreCase))
            {
                var unknown1 = archive.ReadInt(); //unknown
                var className = archive.ReadString(); //class name instance
                var nameAndLevel = archive.ReadString(); //name and level
                var colorCodeCsv = archive.ReadString(); //csv list of color #
                var unknown2 = archive.ReadInt();//?
                var gender = archive.ReadString(); //gender

                archive.SkipBytes(14);//?

                var hp = archive.ReadFloat();
                var stamina = archive.ReadFloat();
                var weight = archive.ReadFloat();
                var oxy = archive.ReadFloat();
                var food = archive.ReadFloat();
                var speed = archive.ReadFloat();
                var imprint = archive.ReadFloat();
                var torpor = archive.ReadFloat();


                archive.SkipBytes(16); //unknown, need to identify

                var hpMax = archive.ReadFloat();
                var staminaMax = archive.ReadFloat();
                var weightMax = archive.ReadFloat();
                var oxyMax = archive.ReadFloat();
                var foodMax = archive.ReadFloat();
                var speedMax = archive.ReadFloat();
                var imprintMax = archive.ReadFloat();
                var torporMax = archive.ReadFloat();

                archive.SkipBytes(20);//?

                var colorCount = archive.ReadInt(); //color name count
                List<string> colorNames = new List<string>();
                while (colorCount-- > 0)
                {
                    var colorName = archive.ReadString();
                    colorNames.Add(colorName);
                }


                //set properties
                ClassName = className.Substring(0, className.LastIndexOf("_"));
                Summary = nameAndLevel;
                Gender = gender;

                string[] colorSplit = colorCodeCsv.Split(',').Where(e => e.Length > 0).ToArray();
                Colors = new int[colorSplit.Length];
                for (int i = 0; i < colorSplit.Length; i++)
                {
                    int.TryParse(colorSplit[i], out int colorSplitInt);
                    Colors[i] = colorSplitInt;
                }

                Stats.Clear();
                Stats.Add("Health", string.Concat(hp, " / ", hpMax));
                Stats.Add("Stamina", string.Concat(stamina, " / ", staminaMax));
                Stats.Add("Weight", string.Concat(weight, " / ", weightMax));
                Stats.Add("Oxygen", string.Concat(oxy, " / ", oxyMax));
                Stats.Add("Food", string.Concat(food, " / ", foodMax));
                Stats.Add("Speed", string.Concat(speed, " / ", speedMax));
                Stats.Add("Imprint", string.Concat(imprint, " / ", imprintMax));
                Stats.Add("Torpor", string.Concat(torpor, " / ", torporMax));

            }


            archive.SkipBytes(8);//?

            propertiesOffset = archive.Position;



            //load GameObjects
            Objects.Clear();

            bool useNameTable = archive.UseNameTable;
            archive.UseNameTable = false;

            var objectCount = archive.ReadInt();
            while (objectCount-- > 0)
            {
                Objects.Add(new GameObject(archive));
            }

            if (Objects.Count > 0)
            {
                CreatureComponent = Objects[0];
            }

            if (Objects.Count > 1)
            {
                StatusComponent = Objects[1];
            }

            if (Objects.Count > 2)
            {
                InventoryComponent = Objects[2];
            }

            archive.UseNameTable = useNameTable;

        }

        public void LoadProperties(ArkArchive archive)
        {
            bool useNameTable = archive.UseNameTable;
            archive.UseNameTable = false;

            var pos = archive.Position;

            if (CreatureComponent != null)
            {
                CreatureComponent.LoadProperties(archive, new GameObject(), (int)propertiesOffset);
            }

            if (StatusComponent != null)
            {
                StatusComponent.LoadProperties(archive, new GameObject(), (int)propertiesOffset);
            }

            if (InventoryComponent != null)
            {
                InventoryComponent.LoadProperties(archive, new GameObject(), (int)propertiesOffset);
            }

            archive.Position = pos;
            
            archive.UseNameTable = useNameTable;
        }

    }

This is then called from ArkSavefile.readBinaryStoredObjects:

        var creatureDataOffset = cryoDataOffset + storedOffset;
        archive.Position = creatureDataOffset;

        ArkStore storeSummary = new ArkStore(archive);
        storeSummary.LoadProperties(archive);
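The seek arithmetic in that call site is simply the cryo data offset from the new header plus the per-item pointer read from CustomItemDatas. A minimal sketch with a sanity guard, in Python for illustration; the names mirror the snippet above but are otherwise hypothetical:

```python
def resolve_store_offset(cryo_data_offset: int, stored_offset: int) -> int:
    """Absolute file position of a stored creature's data block:
    the header-provided cryo data offset plus the per-item pointer
    from CustomItemDatas. Guards against a misread (negative) target."""
    pos = cryo_data_offset + stored_offset
    if pos < 0:
        raise ValueError(f"negative seek target {pos}; offsets misread?")
    return pos
```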

@alex4401

> Thanks Alex. My work was definitely done on a much less structured approach than you took, and I'm still surprised it works for the purposes of what I needed for my own app ASV.

There's no reason for your patch not to work, and it's in fact pretty clean for just cryopods.

@miragedmuk
Author

Another report of a file format change. This time it's not version-dependent, but based on the command line switch "-UseStore". It seems to add some more bytes to the hibernation header data. Will investigate after work this evening.

Identified from official saves which use command line switches:

-newsaveformat -usestore
@alex4401

IIRC -usestore stores character data within the world dump. Those fields may be related but I haven't done any research on it.

+ Reduced check on byte size for CustomItemDatas and only read in first 10 bytes to include offset to cryo store data.
@cadon cadon merged commit 612cf06 into cadon:master Sep 9, 2023
@cadon
Owner

cadon commented Sep 9, 2023

Thank you very much!

@miragedmuk
Author

Not quite right mate. Still investigating, but official Island PVE 892 is crashing on load.

@alex4401

alex4401 commented Sep 9, 2023

Do you have a stack trace of that crash?

Edit: Oh wait, I started thinking of the game and not the toolkit. That makes it easier for me to join in I guess.

@cadon
Owner

cadon commented Sep 9, 2023

The save file is https://arkfiles.com/pc/2023-08-30/nitradouswest151.newofficialtheisland892.zip .
Due to the file size of ~2.8 GB it took 10-15 min until the exception occurred when trying to import. It happened while reading a byte value at position 871732975. The length of that byte value is calculated as the difference between the offset of the next value and the current position, which gives 871667283 - 871732975 = -65692, obviously an invalid length for reading a byte value. At least that's the issue I ran into.
The stacktrace is (from my memory)

ArkArchive.ReadBytes(int)
GameObject.LoadProperties(ArkArchive, GameObject, int)
ArkSavegame.readBinaryObjectPropertiesImpl(int, ArkArchive)
ArkSavegame.readBinaryObjectProperties(ArkArchive, ReadingOptions)
ArkSavegame.ReadBinary(ArkArchive, ReadingOptions)

So the next GameObject probably has a misinterpreted propertyOffset.
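The length computation described above could carry an explicit guard so a misread offset fails loudly instead of attempting a negative-length read. This is a Python sketch mirroring the description in this comment, not the toolkit's actual code:

```python
def value_length(current_pos: int, next_value_offset: int) -> int:
    """Length of the next byte value, computed as the distance from the
    current position to the next value's offset. A negative result means
    an offset was misinterpreted, as in the crash described above."""
    length = next_value_offset - current_pos
    if length < 0:
        raise ValueError(
            f"invalid value length {length} "
            f"(next offset {next_value_offset} < position {current_pos})")
    return length
```

With the numbers from this crash, value_length(871732975, 871667283) raises immediately with -65692 rather than passing a negative count to the byte reader.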

@alex4401

alex4401 commented Sep 9, 2023

Perhaps there's an integer used for seeking that's overflowing?

There are 2,684,609 objects in the save file, and the last one has a relative property block offset of 1,023,535,541. The property block itself starts at 198,888,027. Added up, that's a seek to 1,222,423,568, which is still 925,060,079 short of int32.MaxValue.
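If the toolkit did store that absolute position in a 32-bit signed int, a check like this would flag any object whose property block falls out of range. A Python sketch; whether an `int` is actually used for that variable in the toolkit is a guess, not confirmed here:

```python
INT32_MAX = 2**31 - 1  # 2,147,483,647

def absolute_property_offset(block_start: int, relative_offset: int) -> int:
    """Absolute seek target for an object's property block: the start of
    the property block plus the object's relative offset. Raises if the
    result would overflow a 32-bit signed integer."""
    pos = block_start + relative_offset
    if pos > INT32_MAX:
        raise OverflowError(f"offset {pos} exceeds int32 range")
    return pos
```

For the last object in this save, absolute_property_offset(198888027, 1023535541) returns 1222423568 without raising, which fits the conclusion that a 32-bit overflow is not what broke at position 871732975.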

@alex4401

alex4401 commented Sep 9, 2023

Mm, never mind; it's quite doubtful that problems would start at the offset you mentioned (though GameObject does have an integer variable that will overflow with later objects).
