Hi,
with bigger template scripts (up to 10 MB) and many clones (> 100), the SQLClone website initially loads a lot of data (in our tests between 500 and 800 MB). One of the biggest web requests is to the following URL:
https://mysqlcloneurl/api/v1/clones?take=500
Calling this URL directly, I can see the huge amount of JSON data: it includes every template enrolled for each clone, with the template scripts repeated many times over. Not only is this very inefficient, it is also a big security issue, since this data is returned to all roles and the templates may contain passwords.
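To illustrate why the payload balloons, here is a minimal sketch with a made-up JSON shape (not SQLClone's actual schema): embedding the full script in every clone record multiplies the script size by the clone count, while returning each template once and referencing it by id does not.

```python
import json

# Hypothetical payload shapes, for illustration only.
script = "x" * 10_000  # stand-in for a ~10 KB template script

# Shape 1: every clone embeds the full template script (redundant).
clones_embedded = [{"id": i, "template": {"script": script}} for i in range(100)]

# Shape 2: clones reference the template by id; the script appears once.
clones_by_ref = [{"id": i, "templateId": 1} for i in range(100)]
templates = {"1": {"script": script}}

embedded_size = len(json.dumps(clones_embedded))
referenced_size = len(json.dumps(clones_by_ref)) + len(json.dumps(templates))

# With 100 clones, the embedded payload is roughly 100x the script size,
# while the referenced payload stays close to a single copy of the script.
print(embedded_size, referenced_size)
```

With real 10 MB scripts and > 100 clones, the same redundancy plausibly accounts for the 500-800 MB we observed.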
Since "clone only" users are denied access to the activity logs for security reasons, I would consider this a major bug.
Also, with such amounts of data, the newest Chrome version refuses to load the site at all; older versions still work.