Hi,
I am running a Java application that records data from 100 feeds into 100 different databases - this is done in one thread.
It works fine for several hours - then suddenly I get an out of memory error (I checked that there is no leak in Java - I clear the entity manager after every 10 new entities) - I run it on a VPS with about 4GB of RAM.
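To show what I mean by "clear the entity manager every 10 new entities", here is a simplified sketch of the batching pattern. The `BatchingWriter` class is only an illustrative stand-in for the real JPA `EntityManager` (it is not my actual code); it models just enough of `persist()`/`clear()` to show that managed entities are detached after every batch so they can be garbage-collected:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-in for a JPA EntityManager: models only the
// "clear every N persisted entities" batching pattern.
class BatchingWriter {
    private final int batchSize;
    private int pending = 0;
    private int clearCount = 0;
    private final List<Object> managed = new ArrayList<>();

    BatchingWriter(int batchSize) {
        this.batchSize = batchSize;
    }

    // Persist one entity; once batchSize entities are pending,
    // detach them all (like EntityManager.clear()).
    void persist(Object entity) {
        managed.add(entity); // entity becomes "managed"
        pending++;
        if (pending >= batchSize) {
            clear();
        }
    }

    // Detach all managed entities so the GC can reclaim them.
    void clear() {
        managed.clear();
        pending = 0;
        clearCount++;
    }

    int managedCount() {
        return managed.size();
    }

    int clears() {
        return clearCount;
    }
}

public class Main {
    public static void main(String[] args) {
        BatchingWriter w = new BatchingWriter(10);
        for (int i = 0; i < 105; i++) {
            w.persist(new Object());
        }
        // 10 full batches were cleared; 5 entities remain managed
        System.out.println(w.clears() + " batches cleared, "
                + w.managedCount() + " still managed");
    }
}
```

So at any point at most 10 entities should be held by each entity manager, which is why I don't think the persistence context itself is leaking.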
1/ I use the default objectdb.conf (I mean I don't provide one, so the default must be used somewhere).
2/ Using NetBeans, I profiled the application and noticed that there is one thread (ODB FileWriter) running for each open DB.
I am not sure how to solve this problem - does it come from the odb$ temp files? Do I have too many open DBs? Is 100 too many? Is there something in objectdb.conf that I should tweak?
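For reference, these are the settings in the default objectdb.conf that I suspect affect per-database memory (copied from the default configuration as I understand it; values may differ by version):

```xml
<database>
  <!-- processing cache used when reading/writing database pages -->
  <processing cache="64mb" max-threads="10" />
  <!-- caches for query results and compiled query programs -->
  <query-cache results="32mb" programs="500" />
</database>
```

If these caches are allocated per open database (which is my reading), then in the worst case 100 open DBs could try to use on the order of 100 x (64 + 32) MB, which would not fit in 4GB of RAM - but I don't know whether that is actually how ObjectDB manages them.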
I think I need to understand how RAM is managed by ObjectDB for each open DB.
Any help/direction would be appreciated.
Thanks
EKK
com.objectdb.o.JPE.g(JPE.java:89)
com.objectdb.o.ERR.f(ERR.java:60)
com.objectdb.o.OBC.onObjectDBError(OBC.java:1484)
com.objectdb.jpa.EMImpl.commit(EMImpl.java:290)
DB.ActualDB.writeToDB(ActualDB.java:109)
DB.DB.writeToDB(DB.java:39)
DB.DB.write2DB(DB.java:59)
DB.DBRecorder.record(DBRecorder.java:107)
DB.DBRecorder$1.runinternal(DBRecorder.java:58)
Utils.Mail.MyRunnable.run(MyRunnable.java:26)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)

e.toString()=com.objectdb.o._RollbackException: Failed to commit transaction: GC overhead limit exceeded
e.getCause()=java.lang.OutOfMemoryError: GC overhead limit exceeded
e.getClass()=class com.objectdb.o._RollbackException