Activity overview
Latest activity by ursusmaj
The approach we have taken to dealing with this performance issue has been to create a specific user and role in Oracle ("the data compare user") with access to only a subset of the tables within the schema we are trying to compare.
The somewhat simplistic approach is to use a stored procedure that grants the data compare user access only to non-empty tables. This reduces the number of tables visible to the DCO user to around 3,500, which has a dramatic effect on performance for us (a sketch of such a procedure follows the list below):
1. The project stored on disk is some 20 MB rather than 700 MB+.
2. The time taken to open/modify the project is reduced from an hour to a couple of minutes.
3. The memory usage of the DCO application is reduced significantly.
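For anyone wanting to try something similar, here is a minimal sketch of the kind of grant procedure I mean. The schema owner SYSADM, the role DATA_COMPARE_ROLE and the procedure name are all placeholder names of my own, and it relies on NUM_ROWS from a recent DBMS_STATS run, so tables populated since the last statistics gather could be missed:

CREATE OR REPLACE PROCEDURE grant_non_empty_tables AS
BEGIN
  -- Grant SELECT on every table that optimizer statistics say is non-empty.
  FOR t IN (SELECT owner, table_name
              FROM all_tables
             WHERE owner = 'SYSADM'            -- placeholder schema owner
               AND NVL(num_rows, 0) > 0)       -- non-empty per last stats gather
  LOOP
    EXECUTE IMMEDIATE 'GRANT SELECT ON "' || t.owner || '"."' || t.table_name
                      || '" TO DATA_COMPARE_ROLE';
  END LOOP;
END;
/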
Obviously, there are some issues with the approach we have taken:
(a) You need to choose the list of tables to which access is granted AND ensure the same list is granted in each database, since the set of non-empty tables may differ between them (one way to check is the dictionary query shown below).
(b) The list of accessible tables needs to be updated as the databases change over time (e.g. as tables are created or become populated).
However, compared to the performance issue itself, these are minor problems from my perspective.
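One way to confirm that the two databases end up exposing the same list (again only a sketch, and DATA_COMPARE_ROLE is my placeholder role name) is to pull the granted tables from the dictionary on each side and diff the two result sets:

SELECT table_name
  FROM dba_tab_privs
 WHERE grantee = 'DATA_COMPARE_ROLE'
   AND privilege = 'SELECT'
 ORDER BY table_name;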
I hope this helps others experiencing this issue.
To be clearer, my reference to "before" was on another site, so the number of tables involved will have been different, i.e. probably under 60,000 but not much below that level. The hardware and OS were also likely quite different.
The 66,300 is the number of tables in each schema (i.e. both the source and target DBs have that number of tables). This is a PeopleSoft Financials installation, so it is all in the one schema.
In reality, as I suspect is often the case with ERP installs, the number of tables I want to compare is a small subset of the total number. Perhaps 1% or even less.
The idea of up-front filtering would work well for me in theory, but it depends very much on the flexibility of the actual implementation. I will provide my comments against the suggestion.
Thanks for the quick response. I do have version 2.1.0.325 and the option "Check tables for data" is unchecked in my project.
Some further info:
Registering the first database goes to 5% immediately and then quite quickly to 30%; it takes 10 minutes to reach 50% and another 10 to hit 70%, where it sits for more than 30 minutes. When it eventually completes, the same is repeated on the second database, although the timings are much worse because the system is swapping heavily by then. In fact, the swapping starts quite soon after 30% on the first database.
There are approximately 66,300 tables in each schema.
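If anyone wants to check their own count, something along these lines gives the per-schema figure (SYSADM is the usual PeopleSoft schema owner; substitute your own):

SELECT COUNT(*) AS table_count
  FROM all_tables
 WHERE owner = 'SYSADM';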
I am also seeing very slow database registration when comparing large schemas (e.g. PeopleSoft).
The application memory usage in particular is the main issue I see - it grows to 3.8 GB on a 4 GB server and then causes swapping, which makes the performance spectacularly bad. This is on the 64-bit version of Data Compare on Windows Server 2008 R2.
I plan to add more memory, since the actual requirement looks to be around 5-6 GB in my specific case.
Given that in most cases (I suspect) people are looking to compare a subset of the schema, I think a mechanism to speed up the initial registration is essential.
As an aside, I have used previous versions of Data Compare on other sites and did not see this behaviour previously - granted, it was never "quick" for large schemas, but it does seem to have got noticeably worse in the newer versions.