JuliaDB error loading tables

I frequently have problems reloading the JuliaDB tables that I save. Sometimes they're smaller tables on the order of a few MBs and sometimes a couple of GBs, but never anything close to how much RAM my machine has. I'm currently looking at a dataset that generates an EXCEPTION_ACCESS_VIOLATION on Windows and crashes Julia; other times loading a table just produces a stack trace. The failure mode varies.

Two questions:

Sometimes an older file that was loadable before isn't loadable later with an updated version of JuliaDB. Is the file format stable? I'll admit I haven't been tracking which version of JuliaDB each file was saved or loaded under, but I know I've seen this happen after a package update bumped JuliaDB. This problem might also be connected to my other question below.

I may be using JuliaDB incorrectly. I work with a lot of time-history data (time vs. pressure, for example), and have been trying to store the time and pressure arrays along with associated metadata in a single table. So a row might look like `JuliaDB.table((event_date = [timestamp], event_t = [time_array], event_p = [pressure_array]))`, where `time_array` and `pressure_array` could be empty `Float64[]`s or could be stuffed with actual times and pressures.
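To make the layout concrete, here's a minimal sketch of that kind of table with array-valued columns, round-tripped through `JuliaDB.save`/`JuliaDB.load`. The column names mirror the ones above; the sample values and the `events.jdb` filename are made up for illustration.

```julia
using JuliaDB, Dates

# One row per event; event_t and event_p hold whole arrays per row.
# Values here are invented example data.
t = table((
    event_date = [DateTime(2020, 1, 1), DateTime(2020, 1, 2)],
    event_t    = [[0.0, 0.1, 0.2], Float64[]],        # second event has no samples
    event_p    = [[101.3, 101.4, 101.2], Float64[]],
))

# Round-trip through JuliaDB's binary (serializer-based) format.
JuliaDB.save(t, "events.jdb")
t2 = JuliaDB.load("events.jdb")
```

Note that the saved file is only guaranteed to load back under the same package versions that wrote it.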

I know that from a traditional database perspective this is a no-no. Should I avoid doing it with JuliaDB as well?

-David

The format is based on Julia's serializer, so any time a JuliaDB struct changes, tables saved under the old layout won't be loadable in the new version. That said, I don't think anyone has been touching JuliaDB internals for a while.

A field with `Array` elements shouldn't be a problem for saving, although, yes, structures like that can make data analysis a mess.
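If the nested arrays do become awkward to analyze, one common alternative is a "long" layout: one row per sample, with the event timestamp repeated. This is a hedged sketch with invented sample values, not a prescription:

```julia
using JuliaDB, Dates

# Long layout: each (event, sample) pair is its own row, so ordinary
# filtering and grouping work on the raw values directly.
long = table((
    event_date = fill(DateTime(2020, 1, 1), 3),
    t          = [0.0, 0.1, 0.2],
    p          = [101.3, 101.4, 101.2],
))

# Per-event aggregates then fall out of groupby, e.g. peak pressure:
peaks = groupby(maximum, long, :event_date; select = :p)
```

The trade-off is the repeated `event_date` column versus simpler queries; for mostly-empty events, the array-per-row layout from the question may still be more compact.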