View Issue Details
|ID||Project||Category||View Status||Date Submitted||Last Update|
|0001943||ardour||features||public||2007-10-29 19:00||2011-01-12 17:54|
|Summary||0001943: A rewrite of the saving functions so that they stream directly to disk and not first to an in-memory buffer.|
|Description||When working on very large projects, memory sometimes runs dry.|
When very close to the memory ceiling, the best thing a user can do is save and reopen the project (which currently frees a lot of resources).
Unfortunately, if the project is very large, a fairly large amount of memory is temporarily allocated while saving. In many cases this can cause the system to run completely out of memory during the save, making Ardour crash before the file is completely written (this does not corrupt the original file, though).
If the saving system were rewritten so that it streams directly to disk, without allocating extra memory in the process, saving a project would always be possible, even when very low on resources.
|Tags||No tags attached.|
I suspect that the problem you are running into is related to the saved history file, not the session file. The latest version of ardour has (a) reduced the default depth of both the in-memory history and the history file and (b) added user-controllable limits on how much history to store in memory *and* how much to write to disk. I believe these should address this problem effectively.
The design of XML makes it very hard to stream directly to disk, and XML offers us far, far too many advantages to move away from it to a format that would allow this.
Please let me know what you think.
The saved history file is not causing the problem, since I have disabled it entirely.
I do not agree that the design of XML makes it hard to stream; XML is very easy to stream. If you have an internal data structure containing the project, you simply iterate over all internal elements and write one node at a time, recursively of course.
Moving away from XML is not what I originally intended. XML is an open format, and we like open formats ;-) And what does a 20MB XML file mean in a 3GB project...
the problem is that ardour's data structure is hierarchical, not a flat list of things.
at the top level there is a session. so sure, ardour collects the XML state for the session and writes it to disk. to do so, it iterates over various collections of elements, just as you describe, collecting XML nodes as it goes. but each of the items in these collections itself has children, and so on and so forth. the problem is that you cannot write these to disk because they are children of the top-level Session XML node. writing to disk has to wait until the session node has collected all of its children, grandchildren, etc., which according to you is the source of the problem.
This is true if you use a DOM-based XML system. If you simply use a SAX parser for the reading and plain fprintf and friends for the writing, then the children can be written /while/ they are collected.
I have programmed several of these recursive data-structure systems, and they all use close to zero memory in the process.
I know that the DOM model is convenient, but its inability to handle large XML files memory-efficiently will (in my opinion) disqualify it for this project.
|2007-10-29 19:00||deva||New Issue|
|2007-11-05 01:56||paul||Note Added: 0004531|
|2007-11-05 01:56||paul||Status||new => feedback|
|2007-11-06 12:48||deva||Note Added: 0004549|
|2007-11-06 17:20||paul||Note Added: 0004553|
|2007-12-06 09:58||deva||Note Added: 0004586|
|2009-07-05 05:12||seablade||Status||feedback => acknowledged|
|2009-07-05 05:12||seablade||Description Updated|