ObjectDB

Client server mode no longer works after JRE install



I had client-server mode working with no problems in my application. Then suddenly I started getting the following error on a client machine: "Error: missing 'server' at 'C:\Program Files\Java\jre7\bin\server\jvm.dll'". The application worked in embedded mode but not in client-server mode.

After investigating, I found that the new version of Java on the machine was a JRE with no bin\server folder. If I removed Java from the machine and bundled a JDK with my application, there were no problems; client-server mode worked again. However, when the JRE was reinstalled, it no longer worked. So apparently ObjectDB requires a JDK on each machine, as opposed to a JRE.

How can I get ObjectDB to use the JDK I install with my application rather than the version of Java on the PATH variable? I will not know in advance what is on the user's machine.

I hope this is not too dumb a question.



When you run server.exe, the JVM is selected automatically.

But you can also run the server as an ordinary Java program, so by specifying the full JVM path you can select any JDK or JRE that is installed on your computer.

You can also use a JRE with no server JVM installed by specifying the -client JVM.
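For example, a command along these lines selects the JVM explicitly. This is only a sketch: the bundled-JRE and objectdb.jar paths are hypothetical placeholders, and it assumes the server's main class is com.objectdb.Server when run as an ordinary Java program.

```shell
# Hypothetical install layout: a JRE bundled with the application and
# objectdb.jar in the installation directory. -client avoids needing
# the bin\server JVM that a plain JRE may lack.
C:/myapp/jre/bin/java.exe -client -cp C:/myapp/objectdb.jar com.objectdb.Server
```

Because the JVM path is given explicitly, whatever Java happens to be on the PATH is irrelevant.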

Additional details about the server mode can be found in the following Stack Overflow thread:


ObjectDB Support
ObjectDB - Fast Object Database for Java (JPA/JDO)

So this can only be done from the command line or a batch file? The conf file cannot be used to specify which JVM is used?


The conf file cannot be used because it is read only after the JVM has been found and started.

Please try:

> server -b -client


ObjectDB Support
ObjectDB - Fast Object Database for Java (JPA/JDO)

The server is being started from a Java application. So how do I do > server -b -client?
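One way to do this from Java, rather than a batch file, is to launch the server as a child process with an explicit JVM path and the -client flag. This is only a sketch: the JRE and objectdb.jar paths are hypothetical placeholders, and it assumes the server can be run as an ordinary Java program via a com.objectdb.Server main class.

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Sketch: starting the ObjectDB server from a Java application with an
// explicit JVM and the -client flag, instead of relying on whatever Java
// happens to be on the PATH.
public class ServerLauncher {

    // Build the full command line for the server process.
    static List<String> buildCommand(String jvmPath, String objectDbJar) {
        List<String> cmd = new ArrayList<>();
        cmd.add(jvmPath);               // e.g. C:\myapp\jre\bin\java.exe (hypothetical)
        cmd.add("-client");             // client JVM: no bin\server folder needed
        cmd.add("-cp");
        cmd.add(objectDbJar);           // path to objectdb.jar (hypothetical)
        cmd.add("com.objectdb.Server"); // assumed server main class
        return cmd;
    }

    // Launch the server as a child process, forwarding its console output.
    static Process launch(String jvmPath, String objectDbJar) throws IOException {
        return new ProcessBuilder(buildCommand(jvmPath, objectDbJar))
                .inheritIO()
                .start();
    }

    public static void main(String[] args) {
        // Print the command that would be run (hypothetical bundled-JRE layout).
        System.out.println(String.join(" ", buildCommand(
                "C:\\myapp\\jre\\bin\\java.exe", "C:\\myapp\\objectdb.jar")));
    }
}
```

Since ProcessBuilder is given the JVM path directly, this sidesteps the PATH variable entirely; whether additional arguments such as -b are needed depends on how the server is normally invoked.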


Hi. I fumbled around with batch files and now think I have it working ok. Thanks for the help.
