Not enough memory for Wigner transform

Today when I tried to use the Wigner transform to calculate a time-frequency spectrum, an error message saying "not enough memory" appeared instantly and the process stopped. My MacBook Pro has 8 GB of RAM. How can I avoid this, or how can I increase the virtual memory on my Mac?
On Macintosh, Igor gets a virtual address space of 4 GB. This is not adjustable.

The out-of-memory error may be misleading. If you can post some commands that show the error I can investigate. For example, this:
Make/N=10000 test = p
WignerTransform test

does not return an error.

Alternatively, you could post an experiment that shows how to reproduce the error, if the experiment is not too big. It would be best to post a copy of your experiment with any extraneous data and windows removed.
Hi,

Making a new wave of 10,000 points and running WignerTransform works fine on the Mac.

In my case, there are 280,000 rows in the 1D wave. Is that too many? What is the maximum number of rows that can be handled? If I downsample my data, how can I retain the fidelity of the signal?

Thanks.
Quote:
In my case, there are 280,000 rows in the 1D wave. Is that too many? What is the maximum number of rows that can be handled?


Inspecting the code, I see that the Wigner transform allocates a buffer of N^2 points. Thus, a Wigner transform on 280,000 points requires a 78.4 billion-point buffer.

The number of bytes required, assuming your wave is single-precision floating point, will be N^2 * 4. If you allow 2GB for the buffer, then N=22000 points, roughly.
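That arithmetic can be checked with a quick back-of-envelope calculation (a Python sketch, not Igor code; the 2 GB budget is taken as 2×10⁹ bytes here):

```python
import math

BYTES_PER_POINT = 4  # single-precision floating point

def buffer_bytes(n):
    """Bytes needed for the N^2 Wigner transform buffer."""
    return n * n * BYTES_PER_POINT

# The 280,000-point input from the question:
print(buffer_bytes(280_000))  # about 3.1e11 bytes - far beyond the 4 GB address space

# Largest N whose buffer fits in a 2e9-byte budget:
budget = 2e9
n_max = int(math.sqrt(budget / BYTES_PER_POINT))
print(n_max)  # 22360, i.e. roughly 22,000 points
```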

But memory is also needed for the input wave and the output wave.

Empirically, the largest input I am able to use on Macintosh is about 15,000 points, single-precision floating point. This creates an output that is 15000 x 7501 = 112E6 points = 450E6 bytes.

However, you can control the size of the buffer and the output wave using the /WIDE flag. Using /WIDE=w, I see that the output wave is N x (w/2+1) - that is, it has N rows where N is the size of your 1D input wave and w/2+1 columns where w is what you specified via /WIDE.

Using N=280000 and /WIDE=1000, I wind up with a 280000 x 501 output wave which, in single-precision floating point, requires 561 MB.
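The same arithmetic for the /WIDE=1000 case, again as a Python sketch (assuming a single-precision output wave):

```python
BYTES_PER_POINT = 4  # single-precision floating point

n = 280_000        # rows = size of the 1D input wave
w = 1000           # value passed via /WIDE
cols = w // 2 + 1  # output wave has w/2+1 columns

output_bytes = n * cols * BYTES_PER_POINT
print(cols, output_bytes)  # 501 columns, 561,120,000 bytes (about 561 MB)
```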

Quote:
If I down sample my data, how can I retain the fidelity of the signal?


I don't know the answer to this.