We are building a web app that will serve multiple different clients, each on their own EC2 instance(s). The clients do not need to share any data, so each could have its own ObjectDB database. Each client has a user base of about 2,000 to 60,000 users who could be using the application at any given time (100,000 absolute maximum). We will probably run multiple instances per client for availability, and auto-scale during peak usage, at least for the benefit of our web application if not for ObjectDB. It's a standard web app with an average amount of data fetched per page view.
I see two ways to integrate ObjectDB. First: since 99% of the time there will only be one active instance of the app, use embedded mode inside the web application itself. Then, if we ever have to boot up additional instances, we can point the ObjectDB of each new instance at the first instance as a slave (each slave keeps its own read-only copy, and writes are pushed to the master). Am I understanding this correctly?
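For reference, option one on our side would just be standard JPA bootstrapping with an embedded URL, roughly like the sketch below (the database file name clientA.odb is a placeholder we made up; how the slave instances would point at the master in the rare multi-instance case is exactly what we are asking about):

```java
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class EmbeddedModeSketch {
    public static void main(String[] args) {
        // Embedded mode: ObjectDB runs inside the web application's JVM,
        // so the connection URL is simply a path to the database file.
        EntityManagerFactory emf =
            Persistence.createEntityManagerFactory("objectdb:db/clientA.odb");
        EntityManager em = emf.createEntityManager();
        try {
            // ... ordinary JPA queries and transactions here ...
        } finally {
            em.close();
            emf.close();
        }
    }
}
```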
Our second option would be to run a separate ObjectDB server that handles connections from multiple instances. But from what I can tell, switching to server mode means roughly a 50% performance drop just to accommodate the <1% of scenarios where we need more than one instance. On the other hand (at least coming from a traditional database background), this split would keep ObjectDB and our web app from competing for system resources (CPU time, memory, etc.). Is that a valid concern? The graphs on your site seem to say otherwise, given the large performance advantage of embedded mode, but maybe that advantage has a limit. Is there a guideline for system requirements at given loads on ObjectDB?
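To keep that <1% case cheap, we are thinking of externalizing the connection URL so that switching between embedded and server mode is a configuration change only. A rough sketch of what we have in mind (the property name db.url, the host objectdb.internal, port, and credentials are all placeholders of ours, not anything from your documentation):

```java
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class ObjectDbConnection {

    // Defaults to the embedded URL so the common single-instance case
    // stays in-process; overridden via -Ddb.url=... when deploying
    // against a dedicated ObjectDB server.
    private static final String DB_URL =
            System.getProperty("db.url", "objectdb:db/clientA.odb");

    // In server mode the only application-side change would be the URL,
    // e.g. "objectdb://objectdb.internal:6136/db/clientA.odb;user=admin;password=admin"
    // (host, port and credentials here are illustrative placeholders).
    public static EntityManagerFactory createFactory() {
        return Persistence.createEntityManagerFactory(DB_URL);
    }
}
```

Please correct us if the application code would need to change in any other way between the two modes.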
Would you recommend embedded or server mode for our scenario? Are there concerns I'm missing, such as availability, single points of failure, or security, that would make you advise one way or the other?