Ok, so I’ve written a pair of scripts that let you render a fractal in strips. This is useful because it allows rendering images that would otherwise be too large to fit into memory. If you render in 3 strips, for example, you’ll need only 1/3 as much memory, but on the flip side the render will also take 3 times as long.*
It works pretty well: just run slice_flame.py on a flame, render all the flames in the resulting file, then run join_strips.py and point it at one of the rendered strips. The script will find the other strips and combine them into a single image.
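For the curious, the slicing step essentially boils down to dividing the image height into equal horizontal bands. Here’s a minimal sketch of that idea; the real slice_flame.py works by adjusting the flame’s camera parameters instead, and strip_bounds is just an illustrative helper, not actual code from the scripts:

```python
def strip_bounds(height, n_strips):
    """Yield (top, bottom) pixel rows for each horizontal strip.

    A hypothetical helper showing how an image of the given height
    could be divided into n_strips bands for separate rendering.
    """
    base = height // n_strips
    top = 0
    for i in range(n_strips):
        # fold any leftover rows into the last strip
        bottom = height if i == n_strips - 1 else top + base
        yield top, bottom
        top = bottom

# e.g. a 1080-pixel-tall render in 3 strips:
# list(strip_bounds(1080, 3)) → [(0, 360), (360, 720), (720, 1080)]
```

The join step then just pastes the rendered strips back together at those offsets, top to bottom.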
Download the scripts here: slice_rendering.zip
The only issue with this approach is that the resulting image shows visible seams if DE is turned on and the render quality is very low. This happens because each strip uses a different seed for the random number generator. The problem goes away when rendering at a high enough quality (say, 500 or higher).
One additional note: don’t use the jpg format with these scripts! Since the image is saved twice (once when rendering the strips, and a second time when loading and combining them), you’d also lose quality to jpg compression twice.
As usual, if you have any issues or find a bug, please let me know so I can fix it!
* Actually, the total render would take the same time, but the quality would only be 1/3. You need to render the strips at 3x the quality to get the same result as rendering in a single slice, because quality is calculated in relation to the render size.
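In other words, since quality is defined relative to the render size, a strip covering 1/n of the image ends up with only 1/n of the effective quality unless you scale it up. A tiny sketch of that compensation (strip_quality is just an illustrative name, not part of the scripts):

```python
def strip_quality(target_quality, n_strips):
    """Quality to render each strip at so the combined image matches
    a single full-size render at target_quality, per the footnote's
    observation that quality is calculated relative to render size."""
    return target_quality * n_strips

# To match a single render at quality 500 when slicing into 3 strips,
# render each strip at quality 1500.
```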