
I have a list of several hundred images, and I want to average the list to get a single image. Each image is a 12-bit TIFF with 2048×2048 pixels, and I am using the following to do the job:

result = Mean[Map[ImageData[Import[#]] &, imagelist]];

The problem is that the data become huge. I tried the task with about 400 images at 2048×2048 resolution; the computer ran out of RAM, created 25 GB of virtual memory, and still could not finish the task.
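A rough estimate already suggests why: if each pixel value is held as an 8-byte machine real after ImageData (which I assume is the case), the 400 arrays alone take about

(* 400 images * 2048*2048 pixels * 8 bytes per machine real, in GiB *)
400*2048*2048*8/2.^30
(* 12.5 *)

before counting any intermediate copies.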

Is there an alternative approach that I can use to do the task?

I would greatly appreciate your help.

hbar
2 Answers


Here is a way that holds only two images in memory at any time; the only memory growth as you add more images comes from storing the file names.

Using @stuartw's "sample image list generator" :)

(* generate a sample image list *)
img = Import["http://todayinsci.com/H/Hilbert_David/HilbertDavidThm.jpg"];
path = Export[ToFileName[$UserDocumentsDirectory, "hilbert.jpg"], img]; 
set = Table[path, {30}];


(* Now process it: fold the imported images into a running sum, then divide by the count *)
Image[Fold[#1 + ImageData@Import@#2 &, 0., set]/Length@set]
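The same fold should carry over directly to the original problem; a minimal sketch, assuming imagelist holds the paths to the 400 TIFF files:

(* accumulate one imported image at a time; only the running sum and the current image are in memory *)
result = Fold[#1 + ImageData[Import[#2]] &, 0., imagelist]/Length[imagelist];
Image[result]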
Dr. belisarius

By rearranging a few things, you can process one image at a time. I did not test this with large images, but it should help:

img = Import["http://todayinsci.com/H/Hilbert_David/HilbertDavidThm.jpg"];
path = Export[ToFileName[{NotebookDirectory[]}, "hilbert.jpg"], img];
ImageDimensions[img]

(*{100, 125}*)

imageList = Table[path, {30}];

(* 125*100*3 zero accumulator matching the sample RGB image (width 100, height 125) *)
sumImageData = Table[{0, 0, 0}, {125}, {100}];
nImages = 0;
(* add one imported image to the running sum and increment the image count *)
processData[{sumArray_, n_}, path_] := {sumArray + ImageData[Import[path]], n + 1};

({sumImageData, nImages} = 
  processData[{sumImageData, nImages}, #]) & /@ imageList;

Dimensions[sumImageData]
nImages

(*{125, 100, 3}
30 *)

Image[sumImageData/nImages]
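To adapt this to the original 2048×2048 TIFFs, only the accumulator's shape needs to change; a sketch, assuming the files import as single-channel (grayscale) images and imagelist is the list of paths from the question:

(* zero accumulator matching one 2048*2048 single-channel image *)
sumImageData = ConstantArray[0., {2048, 2048}];
nImages = 0;

({sumImageData, nImages} = processData[{sumImageData, nImages}, #]) & /@ imagelist;

Image[sumImageData/nImages]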