Hi, I've been looking into ObjectDB for our use case, and everything has been great so far apart from the following problem, which I'm stuck on.
I am trying to load a large file (> 1 GB), process certain information, and store it in the database so it can be accessed and modified later. I also need to be able to roll back changes, and the number of changes can be massive.
I have been storing the data using a single entity manager, following the batch store pattern shown at https://www.objectdb.com/java/jpa/persistence/store#Batch_Store_. The problem is that even after calling flush followed by clear, the uncommitted changes still seem to be held in RAM rather than written out to the database file, so I end up with out of memory exceptions.
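For comparison, the batch store pattern on that page is essentially the following (my own simplified sketch of what the documentation describes, not a copy of it; the Point entity, the objectdb:test.odb database and the batch size of 10,000 are just placeholders):

import java.io.Serializable;
import javax.persistence.*;

@Entity
class Point implements Serializable {
    @Id @GeneratedValue long id;
    int x, y;
    Point() {}
    Point(int x, int y) { this.x = x; this.y = y; }
}

public class BatchStoreSketch {
    public static void main(String[] args) {
        EntityManagerFactory emf =
            Persistence.createEntityManagerFactory("objectdb:test.odb");
        EntityManager em = emf.createEntityManager();
        em.getTransaction().begin();
        for (int i = 0; i < 1000000; i++) {
            em.persist(new Point(i, i));
            // every 10,000 persists: push the pending inserts to the database
            // and detach the managed objects so they can be garbage collected
            if (i % 10000 == 0) {
                em.flush();
                em.clear();
            }
        }
        em.getTransaction().commit();
        em.close();
        emf.close();
    }
}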
Here are snapshots of my code:
protected void onBegin() {
    this.em = this.emf.createEntityManager();
    this.em.getTransaction().begin();
}

protected void insertDatabaseData(DatabaseData data) {
    this.em.persist(data);
    if (this.commitCount.getAndIncrement() >= 5000) {
        this.commitCount.set(0);
        this.em.flush();
        this.em.clear();
        System.err.println("Flushing");
    }
}

protected void onCommit() {
    this.em.getTransaction().commit();
    this.em.clear();
    this.em.close();
    System.err.println("Commit Done");
}

protected void onRollback() {
    this.em.getTransaction().rollback();
    this.em.clear();
    this.em.close();
    System.err.println("Rollback Done");
}
These functions are generally called like this:
onBegin();
for (DatabaseData d : lotsOfData)
    insertDatabaseData(d);
if (commit)
    onCommit();
else
    onRollback();
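In case anyone wants to try this themselves, here is a stripped-down, self-contained version of what I am doing (the DatabaseData entity below is a trivial placeholder for our real one, and objectdb:test.odb is just a throwaway database, so treat this as a sketch of the real application rather than the actual code):

import java.io.Serializable;
import java.util.concurrent.atomic.AtomicInteger;
import javax.persistence.*;

@Entity
class DatabaseData implements Serializable {
    @Id @GeneratedValue long id;
    byte[] payload = new byte[1024];   // trivial stand-in for the real parsed data
}

public class LargeImportSketch {
    private final EntityManagerFactory emf =
        Persistence.createEntityManagerFactory("objectdb:test.odb");
    private EntityManager em;
    private final AtomicInteger commitCount = new AtomicInteger();

    void onBegin() {
        em = emf.createEntityManager();
        em.getTransaction().begin();
    }

    void insertDatabaseData(DatabaseData data) {
        em.persist(data);
        // flush and detach every ~5000 persists, as in the code above
        if (commitCount.getAndIncrement() >= 5000) {
            commitCount.set(0);
            em.flush();
            em.clear();
        }
    }

    void onCommit() {
        em.getTransaction().commit();
        em.clear();
        em.close();
        emf.close();
    }

    public static void main(String[] args) {
        LargeImportSketch test = new LargeImportSketch();
        test.onBegin();
        for (int i = 0; i < 2000000; i++)
            test.insertDatabaseData(new DatabaseData());
        test.onCommit();
    }
}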
If anyone can see what is going wrong, I would really like to know. I am running the current version, 2.2.7_04.
Cheers,
Dave