Before working on one of our customers’ development systems I needed to take a backup of the DAC repository so that I could roll back the changes if necessary. As normal, I kicked the backup off from the DAC client on my laptop and left it to do its thing.
Normally this would take no longer than 30 minutes to complete, so after over an hour of waiting I had a feeling that something had gone wrong…
Log files and errors
The first port of call to check for any errors was the DAC client log files. The files can be found in the log directory under the DAC installation location (e.g. C:\oracle\product\DAC\bifoundation\dac\log). In the export log file I found the following Java error:
152  INFO    Tue Jul 17 09:12:45 BST 2012  Exporting entity W_ETL_RUN_SDTL
153  SEVERE  Tue Jul 17 09:29:35 BST 2012  ANOMALY INFO::: error
MESSAGE:::Java heap space
EXCEPTION CLASS::: java.lang.OutOfMemoryError
java.lang.reflect.Method.copy(Method.java:143)
java.lang.reflect.ReflectAccess.copyMethod(ReflectAccess.java:118)
sun.reflect.ReflectionFactory.copyMethod(ReflectionFactory.java:282)
java.lang.Class.copyMethods(Class.java:2748)
This seems to indicate that the client was running out of memory while performing the export.
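If there are a lot of log files to wade through, a quick way to surface errors like this is to search the whole log directory from a command prompt. This is just a convenience sketch, assuming the default install path shown above:

REM Search every DAC client log for out-of-memory errors
findstr /s /i /c:"OutOfMemoryError" C:\oracle\product\DAC\bifoundation\dac\log\*.log

findstr prefixes each match with the file it came from, so this points you straight at the failing export log.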
In DAC it is possible to change the maximum memory of the client's Java process. This is done in the startclient.bat file, which is in the directory that DAC is installed in (e.g. C:\oracle\product\DAC\bifoundation\dac).
Change the -Xmx parameter to increase the memory (don't forget to back up the file first):
start %JAVAW% -Xms256m -Xmx2048m -cp %DACCLASSPATH% com.siebel.analytics.etl.client.view.EtlViewsInitializer
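After restarting the client it is worth checking that the new value has actually been picked up. One way to do this (a sketch, assuming a JDK is installed so that the jps utility is on the PATH) is to list the running Java processes and filter for the DAC client class:

REM Show running Java processes with their JVM arguments; the new -Xmx value should appear
jps -lvm | findstr EtlViewsInitializer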
Unfortunately, despite increasing the maximum memory considerably, the problem wasn't solved.
Purging
As we couldn't accommodate the volume of data the client was processing in memory, we decided to reduce the volume of data being exported instead. As the export had failed on the W_ETL_RUN_SDTL table (as indicated in the log file), we needed to reduce the size of this table and its related tables. This table contains information linked to historic execution plan runs, so it was acceptable to remove historic records.
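Before purging, it is worth getting a feel for how much data is involved. A quick row count against the repository does the job; note that the credentials and connect string below are placeholders for your own DAC repository schema:

REM Count the rows in the run detail table flagged in the export log (placeholder credentials)
echo select count(*) from W_ETL_RUN_SDTL; | sqlplus -s dac_repo/password@orcl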
The DAC client contains a menu option called Purge Run Details.
When running this, the purge can be controlled by various parameters (date, specific execution plan or specific run name). In our case we purged everything except the runs for the current year.
This process took quite a while (after 2.5 hours I went home and left it running), but this will vary depending on the volume of data and the server specifications. It is recommended that the DAC repository tables are analysed before running this process to help reduce the runtime (Tools -> DAC Repository Management -> Analyze Repository Tables). After the purge process had finished we were able to perform a full DAC export (less the log files) in just over 5 minutes.
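As an aside, the statistics can also be refreshed from the database side rather than through the client menu. This sketch only covers the one table from the export log rather than the whole repository, and again uses placeholder credentials:

REM Gather fresh optimizer statistics on the run detail table (placeholder credentials)
echo exec dbms_stats.gather_table_stats(USER, 'W_ETL_RUN_SDTL'); | sqlplus -s dac_repo/password@orcl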