The following tasks provide step-by-step instructions for migrating an Oracle Applications database to Linux from any other operating system using export/import.
Section I: Prepare the source system for Export:-
1) Apply the Applications consolidated export/import utility patch:
Apply patch 4872830 to the source administration server node.
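Patches here are applied with AutoPatch (adpatch). As a rough sketch, assuming the patch has been unzipped into a staging area such as /stage/4872830 and ships the usual unified driver named u4872830.drv (check the patch readme for the actual driver name and any prerequisites):
$ cd /stage/4872830
$ adpatch          # supply u4872830.drv when prompted for the driver file name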
2) Apply the latest Applications database preparation scripts patch:
Apply patch 7225862 to every application tier server node in the source system.
3) Create a working directory:
$ mkdir /dd02/backup/expimp
4) Generate the target database instance creation script aucrdb.sql:
$ sqlplus system/manager @$AU_TOP/patch/115/sql/auclondb.sql 10
5) Record Advanced Queue settings:
Copy the auque1.sql script from the $AU_TOP/patch/115/sql directory on the source administration server node to the working directory on the source database server node, then run it as SYSDBA. It generates auque2.sql.
$ sqlplus /nolog
SQL> connect / as sysdba;
SQL> @auque1.sql
6) Remove the rebuild index parameter from spatial indexes:
Use SQL*Plus to connect to the source database as SYSDBA and run the following query to identify the affected indexes:
SQL>select INDEX_NAME,PARAMETERS from dba_indexes where index_type='DOMAIN' and upper(parameters) like '%REBUILD%';
To remove the rebuild index parameter, rebuild each affected index with a parameters string that omits the rebuild clause:
SQL> alter index <owner>.<index_name> rebuild parameters ('<parameters without the rebuild clause>');
E.g.:
alter index MST.MST_MD_ADM_BNDS_N1 rebuild parameters ('sdo_indx_dims=2 sdo_rtr_pctfree=10 tablespace=APPS_TS_TX_IDX');
alter index MST.MST_MD_HYDROS_N1 rebuild parameters ('sdo_indx_dims=2 sdo_rtr_pctfree=10 tablespace=APPS_TS_TX_IDX');
alter index HR.HR_LOCATIONS_SPT rebuild parameters ('sdo_indx_dims=2 sdo_rtr_pctfree=10 tablespace=APPS_TS_TX_IDX');
alter index HR.PER_ADDRESSES_SPT rebuild parameters ('sdo_indx_dims=2 sdo_rtr_pctfree=10 tablespace=APPS_TS_TX_IDX');
7) Create a database directory named dmpdir pointing to the working directory:
$ sqlplus system/<system password>
SQL> create directory dmpdir as '/dd02/backup/expimp';
8) Create the export parameter file:
Create the file auexpdp.dat with the following values, changing the log file name and date as appropriate.
$cat auexpdp.dat
directory=dmpdir
dumpfile=fullexp%U.dmp,fullexp2_%U.dmp,fullexp3_%U.dmp,fullexp4_%U.dmp
filesize=2G
full=y
exclude=SCHEMA:"='MDDATA'"
exclude=SCHEMA:"='OLAPSYS'"
exclude=SCHEMA:"='ORDSYS'"
exclude=SCHEMA:"='DMSYS'"
exclude=SCHEMA:"='OUTLN'"
exclude=SCHEMA:"='ORDPLUGINS'"
#transform=oid:n
logfile=expdpapps_01Feb09.log
JOB_NAME=expfull
9) Shut down all the services:
Shut down the application tier services, the database, and the listener, then open the database in restricted mode.
SQL> shutdown immediate
SQL> host lsnrctl stop
SQL> startup restrict
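Before the database and listener are brought down as above, the application tier services are stopped with the standard stop-all script. A minimal sketch, assuming an R12 source where the script lives under $ADMIN_SCRIPTS_HOME (on 11i it is under $COMMON_TOP/admin/scripts/<CONTEXT_NAME>):
$ cd $ADMIN_SCRIPTS_HOME
$ ./adstpall.sh apps/<apps password>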
10) Back up AZ table data (conditional):
a) If you are using AZ.H, upgrade to AZ.H.DELTA.1. See document 403092.1.
b) 10.2 Data Pump does not allow tables with XMLType columns to be migrated. Perform step 2 of document 402785.1 to back up the AZ table data.
11) Grant privilege to the source SYSTEM schema:
As SYSDBA, run the following SQL:
SQL> grant EXEMPT ACCESS POLICY to system;
12) Export the Applications database instance (full database):
$ expdp system/<system password> parfile=auexpdp.dat
The export typically runs for several hours. Monitor the export log file for any errors.
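To monitor progress, tail the log file named in auexpdp.dat and, if needed, attach to the running Data Pump job (JOB_NAME=expfull) for a status report. A sketch, assuming the dump directory and log file name used above:
$ tail -f /dd02/backup/expimp/expdpapps_01Feb09.log
$ expdp system/<system password> attach=expfull
Export> status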
13) Revoke the privilege from the source SYSTEM schema:
SQL> revoke EXEMPT ACCESS POLICY from system;
Section III: Prepare the target system for Import:-
1) Create the target Oracle 10g (10.2.0.2) Oracle home (conditional):
If the target Oracle home does not exist (i.e., this is a new build), perform the following steps to install a new Oracle home. If the Oracle home already exists, skip this step and proceed.
$ cd /stage/StageR12/startCD/Disk1/rapidwiz
$ rapidwiz -techstack
Choose the "10gR2 RDBMS" option in the techstack components window and provide the details for the new Oracle home, such as SID, port, and Oracle home location.
2) Create a working directory and copy files from source:
$ mkdir /u01/backup/expimp
Copy all the files generated in the source working directory to the target, i.e., from /dd02/backup/expimp to /u01/backup/expimp.
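A minimal sketch of the copy, assuming the target node can reach the source node (here called srchost, a placeholder) over ssh:
$ scp srchost:/dd02/backup/expimp/* /u01/backup/expimp/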
3) Create the target database instance:
Update the aucrdb.sql script generated on the source database server node with any necessary changes to the directory structures for the log files, data files, and tablespaces, reflecting the layout of the target database server node.
Note: This is a very important step. Before running the script, ensure that all file locations are correct.
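A minimal sketch of a bulk path change, assuming (purely as an example) that the source datafiles lived under /dd02/oradata and the target layout is /u01/oradata; adjust both paths to your environment and review the result before running the script:
$ cp aucrdb.sql aucrdb.sql.orig                       # keep the original for reference
$ sed -i 's|/dd02/oradata|/u01/oradata|g' aucrdb.sql
$ grep -Ei 'datafile|logfile' aucrdb.sql | less       # spot-check the edited locations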
Run this from VNC:
$ sqlplus /nolog
SQL> connect / as sysdba;
SQL> spool aucrdb.log;
SQL> startup nomount;
SQL> @aucrdb.sql
SQL> exit;
4) Copy database preparation scripts to target Oracle home:
Copy the following files from the $APPL_TOP/admin directory of the source administration server node to the working directory in the target database server node: addb1020.sql, adsy1020.sql, adjv1020.sql, and admsc1020.sql
5) Set up the SYS schema:
Run this from VNC; it runs for a while:
$ sqlplus "/ as sysdba" @addb1020.sql
6) Set up the SYSTEM schema:
If the SYSTEM password is not known, reset it first:
SQL> alter user system identified by manager;
$ sqlplus system/<system password> @adsy1020.sql
7) Install the Java Virtual Machine:
$ sqlplus system/<system password> @adjv1020.sql
Attention: This script can be run only once in a given database instance, because the scripts that it calls are not rerunnable.
8) Install other required components:
$ sqlplus system/<system password> @admsc1020.sql FALSE SYSAUX TEMP
9) Run the adpostcrdb.sql script:
Copy the adpostcrdb.sql script, generated in Section I, from the source administration server node to the target database server node. On the target database server node, use SQL*Plus to connect to the database instance as SYSTEM and run the script:
$ sqlplus system/<system password> @adpostcrdb.sql
10) Disable automatic gathering of statistics:
Copy $APPL_TOP/admin/adstats.sql from the source administration server node to the target database server node.
$ sqlplus "/ as sysdba"
SQL> shutdown normal;
SQL> startup restrict;
SQL> @adstats.sql
SQL> exit;
11) Back up the blank target database instance:
$ nohup cp -R /u01/oradata /u01/backup &
Section IV: Import the Release 12 database instance:
1) Pre-import checks:
A) Ensure directory dmpdir has been created as /u01/backup/expimp
B) Ensure the database is running in NOARCHIVELOG mode:
SQL> archive log list
Database log mode No Archive Mode
Automatic archival Disabled
Archive destination /u01/arch
Oldest online log sequence 130
Current log sequence 133
C) Resize the redo log files to 400M:
SQL> SELECT a.group#, a.member, b.bytes FROM v$logfile a, v$log b WHERE a.group# = b.group#;
select group#, status from v$log;
alter system switch logfile;
alter database drop logfile group 2;
alter database add logfile group 2 ('/u01/oradata/data03/log02a.dbf','/u01/oradata/data03/log02b.dbf') size 400m reuse;
alter database drop logfile group 3;
alter database add logfile group 3 ('/u01/oradata/data03/log03a.dbf','/u01/oradata/data03/log03b.dbf') size 400m reuse;
alter database drop logfile group 4;
alter database add logfile group 4 ('/u01/oradata/data03/log04a.dbf','/u01/oradata/data03/log04b.dbf') size 400m reuse;
alter database drop logfile group 1;
alter database add logfile group 1 ('/u01/oradata/data03/log01a.dbf','/u01/oradata/data03/log01b.dbf') size 400m reuse;
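Note that a redo log group can be dropped only while its status is INACTIVE, so repeat the logfile switch (and force a checkpoint if needed) before dropping each group. A quick verification sketch after the resize:
SQL> alter system checkpoint;
SQL> select group#, bytes/1024/1024 mb, members, status from v$log;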
D) Create the import dump directory:
$ sqlplus system/<system password>
SQL> create directory dmpdir as '/u01/backup/expimp';
2) Import the Applications database instance:
Create the parameter file auimpdp.dat with the following parameters, modifying the log file name as appropriate.
$ cat auimpdp.dat
directory=dmpdir
dumpfile=fullexp%U.dmp,fullexp2_%U.dmp,fullexp3_%U.dmp,fullexp4_%U.dmp
full=y
transform=oid:n
logfile=FULL_import_Sep_15.log
Execute the full import from VNC:
Set the NLS_LANG environment variable to match the source instance.
E.g.: $ export NLS_LANG=American_America.UTF8
$ nohup impdp system/<system password> parfile=auimpdp.dat &
Items to take care of while the import is running:
a) Note down the import start time.
b) The import runs for several hours, depending on the data volume and server performance.
c) Items to monitor while the import is running:
- Continuously monitor the import log file for any errors.
- Monitor the alert log file; if a tablespace runs out of space, it will be reported in the alert log.
d) Import progress check query:
SQL> select object_name, owner,
            to_char(max(last_ddl_time),'DD-MON-YYYY HH24:MI:SS') last_ddl,
            to_char(sysdate,'DD-MON-YYYY HH24:MI:SS') sys_date
     from dba_objects
     where last_ddl_time = (select max(last_ddl_time) from dba_objects)
     group by object_name, owner;
e) Query to identify tablespaces that are more than 85% full:
SQL> select a.TABLESPACE_NAME, a.MAX, b.USED, round((b.USED/a.MAX)*100,2) PCT
     from (select TABLESPACE_NAME, sum(MAXBYTES/1024/1024) MAX from dba_data_files group by TABLESPACE_NAME) a,
          (select tablespace_name, sum(bytes/1024/1024) USED from dba_segments group by tablespace_name) b
     where a.TABLESPACE_NAME = b.TABLESPACE_NAME and b.USED > a.MAX * (85/100);
f) Check the log for errors:
$ egrep -n '^ERROR|ERROR:|ORA-|PLS-|SP2-|IMP-' FULL_import_Sep_15.log > imp_err.txt
3) Revoke the privilege from the target SYSTEM schema:
SQL> revoke EXEMPT ACCESS POLICY from system;
Section V: Post Import Steps:
1) Reset Advanced Queues:
Run the auque2.sql script generated on the source database to restore the Advanced Queue settings that are lost during the export/import process.
$ sqlplus /nolog
SQL> connect / as sysdba;
SQL> @auque2.sql
2) Run adgrants.sql:
Copy $APPL_TOP/admin/adgrants.sql from the administration server node to the working directory in the database server node and execute it.
$ sqlplus "/ as sysdba" @adgrants.sql APPLSYS
3) Grant create procedure privilege on CTXSYS:
Copy $AD_TOP/patch/115/sql/adctxprv.sql from the administration server node to the database server node and run it as APPS:
$ sqlplus apps/<apps password> @adctxprv.sql <SYSTEM password> CTXSYS
4) Gather statistics for SYS schema:
Copy $APPL_TOP/admin/adstats.sql from the administration server node to the working directory in the database server node and execute it.
$ sqlplus "/ as sysdba"
SQL> shutdown normal;
SQL> startup restrict;
SQL> @adstats.sql
SQL> shutdown normal;
SQL> startup;
SQL> exit;
5) Drop and recreate the database links:
Drop and recreate the DB links APPS_TO_APPS and EDW_APPS_TO_WH
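A minimal sketch, assuming both links are owned by the APPS schema; <apps password> and <target TNS alias> are placeholders, and the existing definitions should be captured from dba_db_links before dropping anything:
SQL> select owner, db_link, host from dba_db_links;
SQL> connect apps/<apps password>
SQL> drop database link APPS_TO_APPS;
SQL> create database link APPS_TO_APPS connect to apps identified by <apps password> using '<target TNS alias>';
Repeat the drop/create for EDW_APPS_TO_WH with its own connect-to user and connect string.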
6) Restore AZ table data (conditional):
If you performed the step to back up the AZ table data, perform steps 4 and 5 of document 402785.1 on OracleMetaLink to restore it.
7) Create OWA_MATCH package (conditional):
If you are using iAS 1.0.2, perform the steps in document 312165.1 to create SYS.OWA_MATCH on the target database.
8) Create ConText and Spatial objects:
Certain ConText and Spatial objects are not preserved by the import process. Run the following to generate a driver file that recreates them:
$ perl $AU_TOP/patch/115/driver/dpost_imp.pl dpost_imp.drv
Once the driver file has been generated, use AutoPatch to apply it on the target administration server node.
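As a sketch, assuming dpost_imp.drv was generated into the current directory and then placed where AutoPatch can read it:
$ adpatch          # supply dpost_imp.drv when prompted for the driver file name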
9) Compile invalid objects:
$ sqlplus "/ as sysdba" @$ORACLE_HOME/rdbms/admin/utlrp.sql
10) Maintain Applications database objects:
Run AD Administration on the target administration server node. From the Maintain Applications Database Objects menu, perform the following tasks:
a. Compile flexfield data in AOL tables
b. Recreate grants and synonyms for APPS schema
11) Start Applications server processes:
Start all the server processes on the target Applications system. You can allow users to access the system at this time.
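A minimal sketch for starting the R12 application tier, assuming the standard scripts under $ADMIN_SCRIPTS_HOME (on 11i, use $COMMON_TOP/admin/scripts/<CONTEXT_NAME>):
$ cd $ADMIN_SCRIPTS_HOME
$ ./adstrtal.sh apps/<apps password>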
12) Create DQM indexes:
Create DQM indexes by following these steps:
a) Log on to Oracle Applications with the "Trading Community Manager" responsibility
b) Click Control > Request > Run
c) Select "Single Request" option
d) In the Name field, enter "DQM Staging Program"
e) Enter the following parameters:
Number of Parallel Staging Workers: 4
Staging Command: CREATE_INDEXES
Continue Previous Execution: NO
Index Creation: SERIAL
f) Click "Submit"
Comment (Mamatha):
Hi Naresh,
We are doing a cross-platform migration from Solaris to Linux. We are performing the "Create ConText and Spatial objects" step as per doc 230627.1, but adpatch fails for 2 workers.
Below are the errors:
sqlplus -s APPS/***** @/ap01/sit/applsit/appl/per/11.5.0/patch/115/sql/irctxpref.sql &un_per &un_fnd &un_apps CTXSYS
declare
*
ERROR at line 1:
ORA-20000: Oracle Text error:
DRG-10703: invalid framework object basic_wordlist
ORA-06512: at "CTXSYS.DRUE", line 157
ORA-06512: at "CTXSYS.CTX_DDL", line 55
ORA-06512: at "CTXSYS.AD_CTX_DDL", line 152
ORA-06512: at line 44
sqlplus -s APPS/***** @/ap01/sit/applsit/appl/pa/11.5.0/patch/115/sql/paxincx1.sql &un_pa &un_apps
DECLARE
*
ERROR at line 1:
ORA-20001: 220:ORA-20000: Oracle Text error:
DRG-10703: invalid framework object NULL_FILTER
ORA-06512: at line 129
Any help will be highly appreciated.
Thanks,
Mamatha