
Maya crashes on loading scene + Perforce and backup strategies

Alex_J grand marshal polycounter
I have a Maya binary scene that crashes shortly after loading, while the textures are still loading in. It crashes in both the 2022 and 2023 versions, whether I open the file directly or import it.

I suspect it is a corrupted MASH network node, because going back through incremental saves, none will open until the one before I used a MASH network to distribute some geometry along a curve.

There is half an hour of work lost - not a huge deal - but is there any way I might be able to recover it? If I can just get the file open, I can export the distributed geometry as an OBJ and then nothing is lost.

Maybe there is a plugin or something that would let me open this Maya file in Blender?


edit: Also, I am not seeing anything useful in the Temp directory. Not sure what the MayaCLM log is, but it doesn't seem to be related to the crash, so I am not sure where to find any sort of error message.


In the future, if I saved as Maya ASCII, would that help in situations like this at all?


edit: This is as much as the Script Editor shows before the crash, after I drag/drop the .mb to import into a new empty scene:


Replies

  • Alex_J
    Could not find any way to recover that work.

    Not 100% positive of the exact cause, but I have a good hunch that leaving the MASH network in the scene (not deleting it after I was finished with it) was the source of the problem.

    To mitigate lost time, I am decreasing the autosave interval. One problem, though, is that incremental saves can quickly take up a ton of space. You can limit the number of incremental saves, and upon reaching the limit Maya will start overwriting from the oldest. This is pretty good, but there is one catch: if you combine this with autosave and leave the program AFK for an hour, you've now wiped out your useful incremental saves.

    The fix for this issue is to use the "prompt for autosave" option, so that if you are AFK it does not overwrite anything.

    Saving as a .ma file would allow comparing the text of the problem scene against a working scene. Knowing how to fix the file could become a time rabbit hole, though, if you are not already a programmer familiar with Maya. Because .ma files are larger than .mb, if disk space is any issue I think it might not be worth the effort, and a more frequent incremental save system would be better.
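As an illustration of what that text comparison buys you, Python's difflib can show exactly which node a bad save added. The node blocks below are simplified, made-up examples of .ma statements, not real scene data:

```python
# Sketch: diff two Maya ASCII (.ma) saves to locate a suspect node.
import difflib

good_save = '''createNode transform -n "curve1";
createNode nurbsCurve -n "curveShape1" -p "curve1";
'''
bad_save = '''createNode transform -n "curve1";
createNode nurbsCurve -n "curveShape1" -p "curve1";
createNode MASH_Waiter -n "MASH1";
'''

# Unified diff: lines prefixed "+" exist only in the broken save.
diff = list(difflib.unified_diff(
    good_save.splitlines(), bad_save.splitlines(),
    fromfile="working.ma", tofile="broken.ma", lineterm=""))
print("\n".join(diff))
```

With real scenes the diff will be much noisier, but a node type that only appears in the broken file is a good starting suspect.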

    Just mentioning this because as a home-learner you never read about things like this anywhere, but it's an important part of computer work.
  • poopipe
    we always use MA at the office because of precisely this crap - you can often just cut the bit you don't like out of the file in a text editor and be on your merry way. 
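A rough sketch of that cut-it-out approach, scripted rather than done by hand: the filter below drops top-level `createNode MASH*` blocks from Maya ASCII text. It assumes the usual .ma layout (top-level statements start at column 0, a node's attribute lines are indented beneath it), the node-type prefix is a guess at how MASH nodes are named, and it does not touch `connectAttr` lines that referenced the deleted nodes, so those might need cleanup too:

```python
# Sketch: strip MASH-related createNode blocks out of Maya ASCII text.
def strip_mash_nodes(ma_text):
    out, skipping = [], False
    for line in ma_text.splitlines(keepends=True):
        # Top-level .ma statements start at column 0; attribute lines
        # belonging to the current node are indented.
        at_top_level = bool(line.strip()) and not line[0].isspace()
        if at_top_level:
            skipping = line.startswith("createNode MASH")
        if not skipping:
            out.append(line)
    return "".join(out)

# Simplified, made-up example of .ma content:
sample = (
    'createNode transform -n "pCube1";\n'
    '\tsetAttr ".t" -type "double3" 0 0 0;\n'
    'createNode MASH_Waiter -n "MASH1";\n'
    '\tsetAttr ".isHistoricallyInteresting" 0;\n'
    'createNode mesh -n "pCubeShape1";\n'
)
cleaned = strip_mash_nodes(sample)
print(cleaned)
```

Always run something like this on a copy of the scene, never the original.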

    The two lessons to learn here are.. 1:never trust maya and 2: use version control. 


  • Alex_J
    poopipe said:

    The two lessons to learn here are.. 1:never trust maya and 2: use version control. 


    Thanks @poopipe. Yeah, I like Maya a lot most of the time, but every once in a while there is a week where it just decides to shit the bed repeatedly.

    Is it correct to say that version control is only fully useful for .ma files, because you can compare the text? Like, if you versioned .mb files, the only thing that gives you is rollback, which you get from incremental saves anyway? Similar to Unreal Blueprints, which can't be diffed because they are not text.

    Are you using version control for all art asset files?

  • poopipe
    yes, you'd not be able to read binaries so you'd only be able to roll back (technically you could try to remove bits manually with a hex editor but good luck with that)

    I put anything that contributes to a project and that can't be recreated in a short time into version control.
    e.g. my Substance Designer files go into Perforce; the exported textures don't.
    I only discriminate because it's my money that's paying for the storage - at work, everything goes in. 

    The only negative to using version control is that you need somewhere to put the files. Because you're storing <n> copies it does eat disk space pretty quickly and you can't easily get it back because it's versioned. 
    On the other hand, disk space is cheap and losing weeks worth of work cos you accidentally saved over something is utterly miserable - if you're actually relying on your work to make a living it could easily cripple you as well. 

    Personally I'd advise most people grab perforce and learn how to use it - it's free for the home user, integrates very well with unreal and helps with managing your files in addition to versioning them. 
    I'd avoid git for the sort of work most people on here do - it's great for code but isn't really built for handling monolithic data like a maya file or unreal assets, people do manage though.
    I'd avoid svn because it's worse at everything than git and perforce. 


  • Alex_J
    poopipe said:
    yes, you'd not be able to read binaries so you'd only be able to roll back
    Personally I'd advise most people grab perforce and learn how to use it - it's free for the home user, integrates very well with unreal and helps with managing your files in addition to versioning them. 
    I'd avoid git for the sort of work most people on here do - it's great for code but isn't really built for handling monolithic data like a maya file or unreal assets, people do manage though.
    I'd avoid svn because it's worse at everything than git and perforce. 


    Thanks again!

    I have been using Git with LFS to back up my Unreal project, but I have never done versioning for art files like Maya scenes and so on. So far, no disasters, but I do feel like I ought to back up the Maya files, especially those containing animations and rigs, because these represent a lot of work and are the most unstable. The only thing I've done so far is occasionally copy-paste them to another external hard drive. The actual FBX files that make it into the Unreal project do get stored in the repository, which is better than nothing, but not the same as having the original source scene in terms of work to be redone if there is a disaster.

    I understand that Perforce can actually version binary files like Maya scenes and uassets? That sounds good, but I wonder if knowing how to handle merges and so on is realistic for a solo developer. What I mean is, if I have some issue with a Maya scene going corrupt again, it is probably more productive to just roll back an incremental save compared to going through the file, interpreting it (I don't know jack about Maya code), and fixing it manually. Same goes for an issue with an FBX or an Affinity Photo file or anything else.

    I will look into it more - I had done so briefly in the past, but it sounded like commercial-grade stuff and I already seemed to have everything I needed with Git, which was simple enough. But I do feel some insecurity in that I'm not backing up my art source files in the same way as the game engine project.

    The main problem for me is that data on a cloud service grows fast, and because I am on a shoestring budget it is a constant battle to clear out old stuff, which means potentially dangerous deletion of things. I'd kind of like to just grab a handful of cheap external hard drives and do a ton of backups instead, so that I never need to delete things. (Just talking about backup of art production stuff; I'll continue to use version control for engine stuff since I do use branches pretty regularly.)

    Would it make sense to have a repository on an external hard drive, and then weekly back that repository up to a few other hard drives?
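A weekly mirror like that could be as simple as a small script run by cron or Task Scheduler. A minimal sketch in Python, where the paths and the `mirror` helper are placeholders, not an existing tool:

```python
# Sketch: mirror a local repo folder onto one or more backup drives.
import shutil
from pathlib import Path

def mirror(src, destinations):
    """Copy the folder at src into each backup destination."""
    src = Path(src)
    for dest in destinations:
        target = Path(dest) / src.name
        # dirs_exist_ok=True (Python 3.8+) lets repeated runs refresh an
        # existing backup; note it never deletes files removed from src.
        shutil.copytree(src, target, dirs_exist_ok=True)

# Example call with placeholder drive letters, e.g. scheduled weekly:
# mirror("D:/my_depot", ["E:/backup", "F:/backup"])
```

A dedicated sync tool (rsync, robocopy, FreeFileSync) does the same job with less fuss, but the idea is identical: one live copy, several dumb mirrors.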




  • poopipe
    if you're happy with git then keep using it. 
    it would be worth trying perforce out just in case you like it but I wouldn't migrate a project I was halfway through because it's a huge ballache if you want to retain revision history. 

    You can't merge binaries with perforce by default - you need specific plugins for that (like the diff stuff in unreal editor)
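For reference, Perforce decides merge-versus-lock behavior by filetype, and the usual practice is to register art formats in `p4 typemap` as binary with exclusive locking so two people can't edit the same file at once. A hypothetical excerpt (the depot paths are examples, assuming the default `//depot` name):

```
TypeMap:
	binary+l //depot/....mb
	binary+l //depot/....uasset
	binary+l //depot/....umap
	text //depot/....ma
```

Keeping .ma as `text` preserves the ability to diff it, which ties back to the point above about ASCII scenes.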

    In terms of storage, I keep everything locally. My depot lives on a 2TB NVMe drive and I back up current revisions to a NAS with redundant RAID when I remember.

    I don't particularly worry about losing revision history in the event of the NVMe drive dying, because the lifespan of any project I embark on tends to be measured in weeks rather than years and I'm not relying on any of this to make money.
    If my livelihood relied on it I'd be a lot more careful





  • Alex_J
    poopipe said:
    if you're happy with git then keep using it. 
    it would be worth trying perforce out just in case you like it but I wouldn't migrate a project I was halfway through because it's a huge ballache if you want to retain revision history.

    Thanks again.

    It is feasible at this point that I could change to Perforce; I am six-plus months into a project but not really dependent on anything old in the version history. I could just leave the Git repo alive for a while after changing over. It would be good to have all of the project's working files backed up with just a commit + push, and not have to think about clearing out old stuff every month. So I think I should get some local storage devices rather than keep adding another $10 to the stack each month for Git storage.

    And if I do need to clear space, I'd rather that mean going into local files and deleting some incremental saves, where I can manually review the date per file, compared to sending console commands to a black box that doesn't give me immediate feedback on what I just did, or warn me if I am about to do something destructive. That's my main reservation about version control on the cloud: you have to know what you are doing, and it takes a ton of research to figure out any little thing. So I try not to do anything beyond the basics with it (commit and rollback, and sometimes feature branches, is all I do).

    I am looking to get some more local drives, so I will look into the RAID thing.
  • Alex_J
    Perhaps I should post this in the PC-building thread, but can anybody say anything about this RAID device?


    It was recommended by Puget Systems, which is where I got my desktop. It is two hard drives, which I understand would allow a RAID 0 or RAID 1 type of setup. RAID 1 is what I would prefer, I think, as it gives you actual redundancy (the same content copied to two separate drives), which should be enough for my needs.

    So I think the best workflow would then be to have the version control repo on one HDD, and the RAID software can handle automatically backing that up to the other hard drives on a schedule?
    I know that at the commercial level you also want some cloud storage, which really is just somebody else's hard drive somewhere, but the end result is that you have three points of redundancy, right? In my case I'd rather pay once for some extra external devices instead of a subscription that is also bottlenecked by internet download/upload speeds.


  • Klunk
    Is there an equivalent in Maya to Max's Merge, where you get to pick the specific nodes to load?
  • Alex_J
    @Klunk not that I am aware of. I did some searching but couldn't find anything. But I am not very technically savvy with maya so don't take my word for it.

    I spent a day testing out Perforce and it is a lot easier to use compared to Git, especially with the Unreal integration. I also bought a basic RAID setup and will move my versioning so that my remote/depot is on the machine's hard drive and is then scheduled to incrementally back up to two other hard drives nightly.
    should have got on perforce sooner

    for posterity, here is best video tutorial I found for getting started: https://www.youtube.com/watch?v=Hvmvv2MG-UE

  • ThirdDimensionBadger
    Hey, I've just had a very similar issue, but I was fortunately saving with .ma, not binary. I'd made a typo and changed a MASH network from a distribution of 600 to 600450 (instead of down to 450), so the PC tried to go into space whenever it loaded and fatal errored.

    I was able to recover the file by opening it in a text editor and (with very beginner coding knowledge) deleting all of the MASH node networks in the editor and saving it back out. So this sort of recovery is totally possible in that situation, for anyone else coming here with similar issues. For anyone using binary, I wish you luck on your quest!