Oracle database data recovery & performance tuning Q&A – AskMaclean (a ParnassusData 诗檀软件 site)

Posted on 2014-11-4 11:18:43 | Views: 7601 | Replies: 2
Best Practice – Extracting from Oracle Archive log files

In systems where you don’t have direct access to the source system, you can now take advantage of remote data mining using Oracle archive log files only. You can use this method of data extraction against the source Oracle database, a standby database, a metadata database, or even a BCV copy of the source database. One advantage of extracting from archive log files is that the whole process can be offloaded to another system, with essentially zero impact on the source system. This method has been used to offload very high-volume systems in order to load reporting databases.
This document describes the best practice for setting up and configuring GoldenGate Extract to read from Oracle archive log files only. The object ID database can be a source database, a standby database, or even a “stub” database, where only a minimal database is configured that contains the object IDs from the source database.
This document is organized by the different configurations for setting up archive only processing:
GoldenGate Software install - Provides an overview of GoldenGate software install.
Object ID Database, Source – The object IDs are resolved in the source database. A SQL*Net connection to the source database can be used, but the archive files that need to be processed must be local to the extract process.
Object ID Database, Oracle Standby Database – The object IDs are resolved in a standby database that was created from the source database. The connection to the standby database can be through a SQL*Net connection, but the archive log files must be local to the extract process. If Extract resolves the object IDs at startup (resolve immediate), then the standby database can be put back into managed recovery mode while Extract is still running.
Object ID Database, Object Database – An object ID database can start as a standby database or a restored backup of the primary database. The only tablespaces needed in an object ID database are the system and rollback tablespaces. All other tablespaces can be offline dropped before the database is opened for read/write.
Extract Parameter settings – This section covers the different parameters that need to be set for archive log processing.
Performance tuning – This section covers recommended changes to the extract parameter file that may help Extract performance.
Limitations and restrictions – This section covers restrictions and limitations for running GoldenGate in archive log only mode.

GoldenGate Software install
GoldenGate software needs to be installed on the host per the install directions in the GoldenGate Administrator Guide. No special configuration is needed for archive-log-only extraction.

4.1 Object ID Database – Source Connection
This is the simplest configuration. All you need is a GoldenGate install with the necessary Oracle install files (DLLs/shared libraries), and a SQL*Net connection back to the source database.
You will need some method of moving the archive files to the extraction server. The files can be transferred with FTP, or a cron job on the source can copy the files to the target as they are created.
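As one hedged sketch of the copy job mentioned above (the paths, host name, and schedule are illustrative assumptions, not from the original):

```shell
# Hypothetical crontab entry on the source host: every 5 minutes, copy any
# newly created archive logs to the extract server. --ignore-existing avoids
# re-copying files that have already been shipped.
*/5 * * * * rsync -a --ignore-existing /u01/arch/ extract-host:/u01/ggs/arch/
```

Any mechanism works as long as complete archive log files end up local to the extract process.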
Any change to table structures will be picked up from the source database. The connection to the source database is used only for object ID lookup. This configuration cannot read the current redo log member.
This configuration requires a connection to the source database. Lag time, at a minimum, is the time it takes to fill a redo log file and ship it to the extract system.
4.2 Object ID Database – Standby Database
This method requires that a standby database be created; see the Oracle documentation on how to create one. This document assumes that the standby database has already been created. The standby database can be used in two different ways: online and offline. In both configurations the database must be open in read-only mode when you start the extract process, in order to resolve the object IDs for the tables to be extracted. If you use the resolve-immediate behavior in the extract parameter file, then once the tables have been resolved you can stop the database and continue extracting from the archive log files.
If you are resolving object IDs immediately, you can stop the database and continue recovery.
If the database has been shut down for managed recovery and you stop the extract process (or it fails), you will not be able to restart Extract until you have restarted the database in read-only mode.
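The read-only / managed-recovery switching described above can be sketched as follows (a hedged example; the exact commands depend on your standby configuration):

```sql
-- Open the standby read-only so Extract can resolve object IDs at startup:
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
ALTER DATABASE OPEN READ ONLY;

-- Once Extract has resolved the tables (resolve immediate), the standby can
-- be returned to managed recovery while Extract keeps reading archive logs:
SHUTDOWN IMMEDIATE
STARTUP NOMOUNT
ALTER DATABASE MOUNT STANDBY DATABASE;
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
```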
4.2.1 Standby Database as Object ID lookup only

the standby database will need to be refreshed.
4.3 Object ID Database – Object Database
The basic steps are the following:
1. Initial Database setup
a. Add the GGS user to the source DB
b. Add/check database-level supplemental logging.
c. Add supplemental logging to the tables needed in the extract
2. Make a backup of the primary DB – core tablespaces only
a. This can be done using RMAN, BCV splits, O.S. copy, hot backup, etc.
b. System tablespace
c. Rollback tablespace
d. Temp tablespace
3. Create standby controlfile
4. Copy any archive log files to target system
5. Copy Datafiles to like system
a. The directory structure should be the same, or you will need to add parameters to map directory names
b. Copy standby control file
6. Start up the standby DB in NOMOUNT status
a. Offline drop any datafiles other than the system and rollback datafiles
7. Start up and mount the DB
a. Recover the database (standby recovery)
8. Once recovery has reached the current time or the last available archive file
a. Open the database read only
b. If the database needs more recovery before it will open, you need to check whether Advanced Replication is on for the source DB – set in the standby parameter file: replication_dependency_tracking = FALSE
c. If Advanced Replication is on, you will need to disable it.
9. Once the DB is open, you can start GGS.
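Steps 6–8 above can be sketched as the following SQL*Plus session against the object ID database (the datafile name is an illustrative assumption):

```sql
-- Start with the standby controlfile and mount the standby database:
STARTUP NOMOUNT
ALTER DATABASE MOUNT STANDBY DATABASE;

-- Offline drop everything except the system and rollback datafiles:
ALTER DATABASE DATAFILE '/u01/oradata/obj/users01.dbf' OFFLINE DROP;

-- Recover from the copied archive logs (standby recovery):
RECOVER STANDBY DATABASE;

-- Once recovery has reached the last available archive file, open read-only:
ALTER DATABASE OPEN READ ONLY;
```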
If the archived logs reside in a location other than the Oracle default, specify the
location with the ALTARCHIVELOGDEST option of the TRANLOGOPTIONS parameter in
the Extract parameter file. This parameter is supported for RAC installations in release 10 and higher.
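For example, a hedged Extract parameter setting (the path is an assumption):

```
TRANLOGOPTIONS ARCHIVEDLOGONLY, ALTARCHIVELOGDEST /u01/ggs/arch
```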

5.1 Parameters related to Archive Log Only Processing
Most of the parameters that relate to Archive Log Only processing are options of the TRANLOGOPTIONS parameter, listed below.
5.1.1 ARCHIVEDLOGONLY
Causes Extract to read from the archived logs only, without querying or validating the logs from system views such as v$log and v$archived_log. If this parameter is specified, or the database is a standby database, you will be required to set the starting position of Extract to a physical address in the log instead of using a timestamp.
5.1.2 ALOWARNEOF
Specifies the number of seconds that Extract waits for a new log file to become available before generating a warning message. Extract generates only one warning message for a given sequence number. If ALOWARNEOF is not specified, Extract waits for one hour by default. A value of 0 omits the warning no matter how long Extract waits.
5.1.3 ALTARCHIVEDLOGFORMAT
Specifies a string that overrides the archived log format of the source database. <string> accepts the same specifiers as Oracle's LOG_ARCHIVE_FORMAT parameter. Extract uses the supplied format specifier to derive the log file name.
5.1.4 ALTARCHIVELOGDEST
Valid for Oracle (single instance and RAC). Points Extract to the archived logs when they reside somewhere other than the default location. Extract first checks the default location, then the location specified with this parameter. <path name> specifies the fully qualified path to the archived logs. This directory must be NFS-mounted to the node where GoldenGate is running; use that mount point for ALTARCHIVELOGDEST. INSTANCE <instance_name> applies the specified ALTARCHIVELOGDEST behavior to a specific Oracle instance. On RAC, if this option is used, you must specify the ALTARCHIVELOGDEST parameter for each node.
5.1.5 MISSINGROW
This parameter provides additional information when a row is missing data in the log file. Because you are running in ALO mode, any row that is missing data will be an incomplete row, and you need to know if this is happening. This parameter provides a response when GoldenGate cannot locate a row to be fetched, causing only part of the row (the changed values) to be available for processing. Typically a row cannot be located because it was deleted between the time the change record was created and the time the fetch was triggered, or because the row image required was older than the undo retention specification. <action> can be one of the following:
IGNORE – Ignore the condition and continue processing. This is the default.
REPORT – Report the condition and the contents of the row to the discard file, but continue processing the partial row. A discard file must be specified with the DISCARDFILE parameter.
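A hedged example of turning on partial-row reporting (the discard file name is illustrative):

```
FETCHOPTIONS MISSINGROW REPORT
DISCARDFILE ./dirrpt/e_all_b.dsc, APPEND
```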
5.1.6 USESNAPSHOT
Valid for Oracle 9i or later. The default, USESNAPSHOT, causes Extract to use the Flashback Query mechanism to fetch the data needed to reconstruct operations containing LOB data, user-defined data types, nested tables, and XMLType records from the undo tablespace. NOUSESNAPSHOT causes Extract to fetch the needed data from the source table. For more information about how GoldenGate fetches data from Oracle, see the GoldenGate for Windows and UNIX Administrator Guide.
5.1.7 USELATESTVERSION
Valid for Oracle 9i or later. Use with USESNAPSHOT. The default, USELATESTVERSION, directs Extract to fetch data from the source table if it cannot fetch it from the undo tablespace. NOUSELATESTVERSION directs Extract to ignore the condition if the snapshot fetch fails and continue processing. To provide an alternate action if a snapshot fetch does not succeed, use the MISSINGROW option.
5.1.8 REPORTFETCH
REPORTFETCH returns statistics on row fetching, such as fetches triggered by a FETCHCOLS clause (see the reference guide for details) or fetches that must be performed when not enough information is in the transaction record. NOREPORTFETCH turns off reporting of fetch statistics. The default is NOREPORTFETCH. You should not see any fetching in archive log only mode.
5.1.9 USELASTREADTIME
USELASTREADTIME forces Extract to always use the time that it last read the redo log to determine whether a transaction is long-running. By default, Extract uses the timestamp of the last record that it read from the redo log. This applies to an Extract that is running in archive log only mode, as configured with the ARCHIVEDLOGONLY option of TRANLOGOPTIONS.
5.1.10 Sample Extract Parameter file:
EXTRACT e_all_b

RMTTRAIL y:\ggs\dirdat\b1
DISCARDFILE ./dirout/e_all_b_discards.txt, APPEND
TRANSMEMORY RAM 500000000, TRANSRAM 10000000, &
DIRECTORY (/oracle/g01/dbtools/ggs/dirtmp, 10000000000, 1000000000)
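The sample above appears truncated; a fuller illustrative ALO extract parameter file might look like the following (the user ID, password, host, archive path, and table names are assumptions, not from the original document):

```
EXTRACT e_all_b
USERID ggs_user, PASSWORD ggs_pass
TRANLOGOPTIONS ARCHIVEDLOGONLY, ALTARCHIVELOGDEST /u01/ggs/arch
RMTHOST target-host, MGRPORT 7809
RMTTRAIL y:\ggs\dirdat\b1
DISCARDFILE ./dirout/e_all_b_discards.txt, APPEND
TRANSMEMORY RAM 500000000, TRANSRAM 10000000, &
DIRECTORY (/oracle/g01/dbtools/ggs/dirtmp, 10000000000, 1000000000)
TABLE scott.*;
```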
