@raman_kumar you really need to put some effort into this. You are copying/pasting a lot of code without showing the exact line of code that triggered the error.
If you still want help, please share a minimal working example where you construct the two grids and attempt the interpolation.
I got an OutOfMemoryError() at pressure = rand(nelements(grid1)), since it is a very large array with 1013691010309291007 elements. A minimal working example follows.
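Something along these lines, with placeholder grid sizes standing in for the real resolution:

```julia
using GeoStats

# placeholder sizes -- the real grids are far larger
grid1 = CartesianGrid(100, 100, 100)   # source grid
grid2 = CartesianGrid(50, 50, 50)      # target grid

pressure = rand(nelements(grid1))      # with the real resolution this line throws OutOfMemoryError
data = georef((pressure = pressure,), grid1)
# ... interpolation from `data` onto `grid2` would follow here
```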
Yes, the error is pretty clear: you cannot store a vector that big on your machine because it doesn't have enough memory.
I created these variables as an example. You should provide the vector of values you already have in grid1 and that you're trying to interpolate over grid2.
I can't help more, sorry. Other people can jump in with more help. You probably need to ask questions in the beginners section first.
I am using HDF5.jl as my data is in HDF5 format. I have 32 GB of RAM, which is good for a laptop but not for an array this big. What other ways are there for me to overcome the error? Would it be better to try another interpolation package instead of GeoStats.jl?
You're not going to find a computer with enough memory to hold this grid. (Even considering virtual memory, the limits for Linux and Windows are in the terabyte range, and this grid requires exabytes. Finding exabytes of disk storage is non-trivial: An exabyte of disk storage at CERN | CERN.) You can't interpolate to such a large grid unless you can find code that works with out-of-core data structures. So you can choose a coarser resolution, or you can make a grid at the fine resolution for a subset of the total grid, or both.
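A rough sketch of those two options, assuming the GeoStats.jl/Meshes.jl CartesianGrid constructors and using placeholder sizes and bounds:

```julia
using GeoStats

# Option 1: a coarser grid covering the whole domain (placeholder size)
coarse = CartesianGrid(256, 256, 256)

# Option 2: a fine grid restricted to a region of interest only,
# given by the physical extent of that region (placeholder bounds and size)
subset = CartesianGrid((0.4, 0.4, 0.4), (0.6, 0.6, 0.6), dims = (512, 512, 512))
```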
This was also a problem when I tried to convert the data from spherical to Cartesian coordinates, as in the initial post in this topic. I overcame it by using a for loop to transform the coordinates in parts, one chunk at a time, and appending the results. My point is that I can't throw away my data and I have to plot it. Is there any way to plot it in parts? Actually, I have to plot even bigger data of the same kind. My data is from an Athena++ astrophysical simulation, and I shouldn't/couldn't reduce it.
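Roughly like this, sketched with placeholder names for the coordinate arrays:

```julia
# transform spherical coordinates to Cartesian one chunk at a time,
# appending each transformed chunk to the output vectors
function spherical_to_cartesian_chunked(r, theta, phi; chunk = 10^6)
    x = Float64[]; y = Float64[]; z = Float64[]
    for lo in 1:chunk:length(r)
        hi = min(lo + chunk - 1, length(r))
        rs, ts, ps = view(r, lo:hi), view(theta, lo:hi), view(phi, lo:hi)
        append!(x, rs .* sin.(ts) .* cos.(ps))
        append!(y, rs .* sin.(ts) .* sin.(ps))
        append!(z, rs .* cos.(ts))
    end
    return x, y, z
end
```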
You still need to rethink your problem. Brute force is just not going to work. Sure, plot it in parts: read part of your data, plot it, and repeat. You could be in for a long wait.
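For example, the read-plot-repeat loop could look roughly like this, assuming the coordinates and values are stored as flat 1-D datasets (Makie is just one example backend; the file and dataset names are placeholders):

```julia
using GLMakie, HDF5

fig = Figure()
ax = Axis3(fig[1, 1])

h5open("athena_output.h5", "r") do file                               # placeholder file name
    x, y, z, dens = file["x"], file["y"], file["z"], file["density"]  # placeholder dataset names
    for lo in 1:10^6:length(dens)
        hi = min(lo + 10^6 - 1, length(dens))
        # read only this slice into memory, add it to the same axis, repeat
        scatter!(ax, x[lo:hi], y[lo:hi], z[lo:hi]; color = dens[lo:hi], markersize = 1)
    end
end

fig
```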
As many people have already suggested, you need to load the data lazily with HDF5.jl and run functions that store the result directly to disk, as HDF5.jl probably does. Check DiskArrays.jl to learn more.
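A sketch of that pattern with HDF5.jl, assuming a 1-D dataset and placeholder file/dataset names: the dataset handle is lazy, and each processed slice is written straight back to disk.

```julia
using HDF5

h5open("athena_output.h5", "r+") do file                   # placeholder file name
    src = file["pressure"]                                  # lazy dataset handle, nothing read yet
    dst = create_dataset(file, "pressure_processed", Float64, size(src))
    for lo in 1:10^6:length(src)
        hi = min(lo + 10^6 - 1, length(src))
        dst[lo:hi] = 2 .* src[lo:hi]                        # read a slice, process it (placeholder), write a slice
    end
end
```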
From the GeoStats.jl side we try to preserve the array type as much as possible, so if you feed in a lazy array you should get a lazy array out.
For clarity: I need to plot all the data. What are other ways to deal with it? Can this data be plotted in parts, and all the plots then combined to get the full plot?
A plot is always rendered to a small pixel grid, so you always do downsampling anyway, even if you throw a lot of data at your plotting functions. It's just a question of where it occurs: in your preprocessing or during rendering.
In my case I have points in 3D space with x, y, and z coordinates. Each point has associated values such as Density, Pressure, or Velocity in the x, y, and z directions, stored in different arrays. If I remove a point in 3D space by downsampling, then I also have to drop that point's associated values from the arrays holding Density, Pressure, and Velocity in the x, y, and z directions.
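If I downsample, I suppose it would have to be something like this (a sketch with placeholder array names), applying the same index selection to the coordinates and to every value array:

```julia
idx = 1:100:length(x)                  # keep every 100th point

xs, ys, zs = x[idx], y[idx], z[idx]    # downsampled coordinates
density_s  = Density[idx]              # the same idx keeps values aligned with the points
pressure_s = Pressure[idx]
vx_s, vy_s, vz_s = Vx[idx], Vy[idx], Vz[idx]
```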