Still running into Java heap space error in 4.0 64-bit beta
So I finally decided to test out the 4.0 beta and I'm loving all of the new features, but the one error I was hoping to get past was the Java heap space error I get when I try to create a JSON file on my 2018 MacBook Pro. My project exports fine at 0.85 scale, but I need to export it at 1.0 for the game I'm working on. The error doesn't show up on my 2018 Wacom MobileStudio Pro at the same packing settings, so I'm a bit confused about why this is a problem for my Mac.
Exporting JSON doesn't have a scale, I guess you are texture packing?
I assume you are using the v4 launcher. What version of the editor are you using? If you are using 3.8.99 or earlier, it will behave exactly as it did before: all editor versions before 4.0 are still 32-bit, even when run using the new v4 launcher. You would need to run a 4.0+ editor version to get the 64-bit benefits.
I am running the v4 launcher. I re-downloaded it so I could use the latest 4.0.29 beta update, which is why I was able to export the JSON file with texture packing on my Windows Wacom tablet. But when I did the same thing on my MacBook Pro, it ran straight into the heap space error.
I'm checking with my team to see if I can share the project with you, because this does seem like an odd problem. I'm sure my tablet isn't more powerful than my Mac, so I don't see why there should be a problem.
Hmm, strange. It may also help to see your spine.log file from your Mac after the error occurs.
Here's what I get on my Mac.
<removed>
Thanks! Could you post the log as text though? We need the text to decode the error.
Spine Launcher 4.0.13
Esoteric Software LLC (C) 2013-2020 | http://esotericsoftware.com
Mac OS X x86_64 10.15.6
ATI Technologies Inc., AMD Radeon Pro 560 OpenGL Engine, 2.1 ATI-3.10.16
Starting: Spine 4.0.30-beta Professional
Spine 4.0.30-beta Professional
Licensed to: *****, *****
OpenAL 1.1, Default audio device
Started.
Packing.....................................................................
...............................
Writing 9910x6334: /Users/jeffrey/Desktop/KO_OP/JSON files/trish beta test/Trish.png
WARNING: Error packing images:
[OutOfMemoryError] Java heap space
at s.GaI.<init>(:121)
at s.wgM.(:39)
at s.WBq.(:324)
at s.WBq.(:173)
at s.wfy.(:115)
at s.MUS.(:259)
at s.HFt.(:222)
at s.MUS.(:67)
at s.MUS.(:166)
at s.mmh.(:88)
at java.lang.invoke.DirectMethodHandle$Holder.invokeVirtual(Unknown Source)
at 0x00000007c00fb040.invoke(LambdaForm$MH)
at 0x00000007c00fb440.linkToTargetMethod(LambdaForm$MH)
at s.IfL.(_:611)
at 0x00000007c0630840.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Thanks. Interestingly, the crash happens when the texture packer goes to write images and applies color bleeding. We'll look into making the color bleeding more efficient. You could try disabling Bleed for now.
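For context on why bleed can be memory hungry: a bleed pass copies opaque edge colors into neighboring transparent pixels so bilinear filtering doesn't sample the background at region edges. Roughly, a naive single pass looks like this (a simplified sketch of the general technique, not our actual code):

import java.awt.image.BufferedImage;

public class BleedSketch {
    // One naive bleed pass: give every fully transparent pixel the color of
    // an opaque neighbor, keeping alpha at 0. Note the full-size working
    // copy: on a 9910x6334 RGBA image that's an extra ~250MB per pass, which
    // is exactly where a naive implementation hurts.
    public static BufferedImage bleed (BufferedImage src) {
        int w = src.getWidth(), h = src.getHeight();
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int argb = src.getRGB(x, y);
                if ((argb >>> 24) == 0) { // fully transparent
                    int neighbor = opaqueNeighbor(src, x, y);
                    if (neighbor != 0) argb = neighbor & 0x00ffffff; // keep alpha 0
                }
                dst.setRGB(x, y, argb);
            }
        }
        return dst;
    }

    private static int opaqueNeighbor (BufferedImage src, int x, int y) {
        for (int dy = -1; dy <= 1; dy++) {
            for (int dx = -1; dx <= 1; dx++) {
                int nx = x + dx, ny = y + dy;
                if (nx < 0 || ny < 0 || nx >= src.getWidth() || ny >= src.getHeight()) continue;
                int argb = src.getRGB(nx, ny);
                if ((argb >>> 24) != 0) return argb;
            }
        }
        return 0; // no opaque neighbor found
    }
}

A real implementation repeats this to spread colors further outward; doing that without allocating fresh full-size buffers each pass is where the savings are.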
Your 9910x6334 image takes up 250MB just to hold it in memory. Some processing can require 2-3 times that size. Still, even 1GB per image should be OK and I'm surprised you get an out of memory error. I'd like to see it myself, so I'll experiment with a large image export.
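For the curious, the math (assuming 4 bytes per RGBA pixel): 9910 x 6334 pixels x 4 bytes = 251,079,760 bytes, roughly 250MB, so 2-3 working copies during processing puts a single page at 500-750MB.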
We've greatly reduced the memory usage of Bleed when texture packing in 4.0.32-beta. Please give that a whirl! It should be released tomorrow morning.
It works now. Thanks for the fix. We're looking to utilize Spine 4.0 to its fullest once it's out of beta. Until then I'll update you on anything else I find out.
Great, thanks for confirming it's fixed!
I'm going to shamelessly hijack this post... :smirk:
I've finally gotten myself to upgrade to v4, because that curve editor seems really nice, along with other features I've seen.
Anyway, I also assumed what Jeffrey mentioned, that 'out of memory' issues would be a problem of the past with 64-bit memory addressing, but I've just had a crash while packing.
My main objective is to accelerate packing as much as possible, so I guess loading all images in RAM would help. Therefore, I've unchecked the option to Limit Memory, while still using the Fast option:
I have 32 GB of RAM, so when I saw it wasn't using more than 3 GB, I thought everything was going fine, but I still got an out of memory error:
You may remember how big my project is, but I kind of expected RAM usage to get closer to my total amount of RAM before receiving an error. :think:
It's not a big deal, tbh, but I guess you'll want to find out why this is happening, considering one of the big advantages of v4 is running as an x64 app. And yes, it seems to be much faster than before.
Do you want my full project so you guys can test it?
@Abelius I know you were hesitant to use the beta, and understandably so. We'll do our best to solve any problems you encounter ASAP. Glad you found v4 faster so far!
How much memory Spine uses is a little tricky/complex. If we set the maximum memory usage very high, it would likely reach that number eventually. This is due to various technical reasons, but in a nutshell, reclaiming memory requires effort, so that effort may not be performed until required. To prevent Spine from using up a big chunk of memory, we have it target 2GB by default. That is enough for most people, and if not, we may adjust the defaults in the future.
You can change the max memory by passing a command line parameter when Spine is started. For example:
Spine.exe -Xmx2048m
That gives Spine a maximum of 2GB. You can replace 2048 with the number of megabytes you want to allow Spine to use. For example, for 16GB use -Xmx16384m (16 * 1024 = 16384).
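If you want to verify what maximum the JVM actually picked up, this is standard JVM behavior that any small Java program can report (just an illustration of what -Xmx controls, not a Spine feature):

public class HeapInfo {
    public static void main (String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() reflects -Xmx; totalMemory() is what the JVM has
        // currently reserved, which grows toward the max only as needed.
        System.out.println("max heap: " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("reserved: " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("free:     " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}

That reserved-grows-as-needed behavior is also why you saw only ~3GB in use: the process won't climb toward your 32GB unless the max allows it and the work demands it.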
That said, I'm not sure Limit memory will give you much of a speed improvement. You are probably packing many images, and most of the time likely goes to finding an optimal packing rather than reading the image files, but it's worth a try.
While Spine's texture packer does a decent job, Spine does a lot besides texture packing. Given the complexity of your project, you may be better served by a texture packing tool that is fully dedicated to that specific task. The best tool for that is Texture Packer Pro. It's not free, but you may find it worth the $20-40 the author asks. It has been widely used for many years, has an ungodly number of powerful features, and can pack to the Spine atlas file format (aka the libgdx atlas file format).
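For reference, the atlas format is plain text, so it's easy to sanity check any packer's output against what Spine expects. A hand-written example (the page and region names and all values here are made up for illustration):

page1.png
size: 1024,1024
format: RGBA8888
filter: Linear,Linear
repeat: none
head
  rotate: false
  xy: 2, 2
  size: 128, 160
  orig: 128, 160
  offset: 0, 0
  index: -1

Each page lists its image file and settings, then each region gives its position (xy) and size in the page, plus the original size and offset so trimmed whitespace can be restored at runtime.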
Hi Nate,
Nate wrote: That said, I'm not sure Limit memory will give you much of a speed improvement. You are probably packing many images, and most of the time likely goes to finding an optimal packing rather than reading the image files, but it's worth a try.
To be honest, there isn't actually a noticeable difference. So I won't dwell on this matter.
Nate wrote: Given the complexity of your project, you may be better served by a texture packing tool that is fully dedicated to that specific task.
I didn't even know you could use an external packer! :o
Well, I'll surely have a look. My only concern, though, is that I'm now using the encryption plugin I mentioned in other posts. But if the final result is the same as what Spine would produce, I guess there won't be a problem. What do you think?
Edit: I've downloaded Texture Packer, but I don't understand what I'm supposed to do.
I've selected LibGDX, but when I try to open a project, expecting it to be able to read Spine's skeleton JSON, that's not a valid option.
It behaves like it expects you to build spritesheets from the image folders, with complete disregard for where Spine will expect to find the regions.
Is there a tutorial in the forum or the documentation for this? I can't find anything.
Edit 2:
I've added the whole images folder, and it's been like this for more than 30 minutes already:
I'm not impressed. :rolleyes:
Unless I'm doing something very wrong, I don't think I'll be able to use it, and it adds another layer of complexity to my workflow.
Hmm, I'm surprised; it's typically quite fast. I've contacted the Texture Packer Pro author; hopefully he can help out. He's been very helpful in the past. Some time ago we discussed how Texture Packer Pro would read JSON data so it can do whitespace stripping without wrecking meshes, but I'm not sure if he has implemented it.
Nate wrote: Hmm, I'm surprised; it's typically quite fast.
I also expected stellar performance from a dedicated tool. But this may have something to do with the fact that the software has no clue that I actually use just a fraction of each huge (exported at canvas size) image.
Nate wrote: Some time ago we discussed how Texture Packer Pro would read JSON data so it can do whitespace stripping without wrecking meshes, but I'm not sure if he has implemented it.
Yeah, if it were able to read the JSON file, it would know it doesn't need to pack a freakishly gigantic set of images. Its only other option is to trim the images as it sees fit, and that would indeed destroy my meshes.
Unless I'm mistaken, I don't see any other way than the tool "asking" Spine how to pack each image, no?
To do whitespace stripping for images used by meshes, Texture Packer Pro needs to read the skeleton data (one or more JSON or binary files), union mesh hulls that use the same image (multiple meshes may use the same image), then whitespace stripping can remove all pixels outside the mesh hulls. It wouldn't be super hard to implement, though a bit of a pain to test.
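In code, the stripping step could look something like this (a hypothetical sketch; the HullStrip class and its strip method are invented for illustration, not TexturePacker Pro's API):

import java.awt.geom.Area;
import java.awt.geom.Path2D;
import java.awt.image.BufferedImage;
import java.util.List;

public class HullStrip {
    // Each hull is a closed polygon as [x0,y0, x1,y1, ...] in image pixel space.
    public static void strip (BufferedImage image, List<float[]> hulls) {
        // Union the hulls of every mesh that uses this image.
        Area union = new Area();
        for (float[] hull : hulls) {
            Path2D.Float poly = new Path2D.Float();
            poly.moveTo(hull[0], hull[1]);
            for (int i = 2; i < hull.length; i += 2)
                poly.lineTo(hull[i], hull[i + 1]);
            poly.closePath();
            union.add(new Area(poly));
        }
        // Clear every pixel whose center lies outside all hulls, so whitespace
        // stripping can remove it without moving any pixels the meshes sample.
        // (Per-pixel Area.contains is slow; a real implementation would
        // rasterize the union into a mask instead.)
        for (int y = 0; y < image.getHeight(); y++)
            for (int x = 0; x < image.getWidth(); x++)
                if (!union.contains(x + 0.5, y + 0.5)) image.setRGB(x, y, 0);
    }
}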
Nate wrote: To do whitespace stripping for images used by meshes, Texture Packer Pro needs to read the skeleton data (one or more JSON or binary files), union mesh hulls that use the same image (multiple meshes may use the same image), then whitespace stripping can remove all pixels outside the mesh hulls. It wouldn't be super hard to implement, though a bit of a pain to test.
So... if that's not implemented in TP Pro, I still need Spine to do the packing, no?
Why did you suggest this software to me, then?? :lol:
I don't know if it's implemented in Texture Packer Pro; I had a discussion with the author long ago and thought it was. I was hoping he'd jump in and help out, but I guess you can carry on using Spine for packing. Sorry for the disruption!