Hello guys
Amazing project! I currently use it to manage our SaaS, which runs on MySQL and PHP, for monitoring and backups.
For context, we are hosted on Google Cloud with SSD storage.
As we scale, each of our customers gets their own database.
In a scenario with 5,000 databases and 50 tables each, that comes to 250,000 tables in total.
Each database is normally around 10 MB, but some customers have databases of up to 500 MB.
When we sign up a new client, we use the HestiaCP API to create their database, and every night in the early hours all of the databases are backed up…
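In case it helps, our provisioning call looks roughly like the minimal sketch below. The hostname, credentials, and client identifier are placeholders, and we use the plain user/password API auth with `v-add-database`:

```php
<?php
// Minimal sketch: create one client database via the HestiaCP API.
// Hostname, credentials, and the client identifier are placeholders.
$hst_host = 'panel.example.com'; // placeholder panel hostname
$hst_port = '8083';              // default HestiaCP port

$postvars = [
    'user'       => 'admin',        // placeholder panel login
    'password'   => 'p4ssw0rd',     // placeholder password
    'returncode' => 'yes',          // return the exit code only
    'cmd'        => 'v-add-database',
    'arg1'       => 'saasuser',     // panel account that owns the database
    'arg2'       => 'client_0001',  // Hestia prefixes it: saasuser_client_0001
    'arg3'       => 'client_0001',  // database user
    'arg4'       => bin2hex(random_bytes(12)), // generated db password
    'arg5'       => 'mysql',        // database type
];

$curl = curl_init("https://{$hst_host}:{$hst_port}/api/");
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_POSTFIELDS, http_build_query($postvars));
$answer = curl_exec($curl);
curl_close($curl);

// '0' means the database was created successfully.
echo $answer === '0' ? "database created\n" : "error code: {$answer}\n";
```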
Regarding performance and backups: does anyone have a similar scenario?
Do you think there might be a problem?
Or is there any limitation in this regard?
Anyway, my question is really about handling large numbers of databases…
Thanks