Project-42
Different methods for Grid Infrastructure patching

This is an old post I wrote somewhere else, and I want to keep it here for future reference :)

As with anything remotely related to upgrades or patching, it is always good to visit Mike Dietrich's blog and make sure we have a few concepts clear before we start.

The first one I needed to clarify is this "new term" (which shows you the lack of upgrade work I face daily...) called "Release Update". Of course I had heard about it, and of course I saw the reference when I downloaded the patch for this test, but there is a big difference between reading something somewhere and really knowing what it is about. Anyway, stop reading this and have a look at this post about it from Mr. Dietrich :) Differences between PSU / BP and RU / RUR by Mike Dietrich

Ok, now that we have that clear, let's start getting our hands dirty; that is what we like to do here.

Always check the README of the patch and make sure all steps are clear before you start.

Also, if you want to know more about patching and OPatch, these two MOS notes are a good start (thanks for the hint, Luis! :)

Oracle Database - Overview of Patch Delivery Methods (Doc ID 1958998.1)
FAQ: OPatch/Patch Questions/Issues for Oracle Clusterware (Grid Infrastructure or CRS) and RAC Environments (Doc ID 1339140.1)

I'm using two clusters for these tests:

rac4-node1/rac4-node2 (where my standby databases, 11.2 and 12.2, are running)
rac5-node1/rac5-node2 (where a couple of primary databases, 11.2, 12.2 and 12.1, are running)

The intention here was to see the different ways we have to apply the RU. I'm not really an expert (master of none...), so I can't say which one is better, but the best way to know is to try the different options and make sure we understand all of them. You will notice there is no particular order of execution; some of the patching was even done one day and another node a week later... well, this is a lab I use exclusively for that, whenever I get time to have some fun, so please ignore that :)

Something I did notice is that the +ASM entries in the oratab file were deleted after patching. I'm not 100% sure of the reason, since we are not changing the Grid home location. I may need to look closer in the future and update this post...

[oracle@rac4-node2 ~]$ cat /etc/oratab  
#Backup file is  /u01/app/grid/srvm/admin/oratab.bak.rac4-node2 line added by Agent  
[.....]  
#  
# Multiple entries with the same $ORACLE_SID are not allowed.  
#  
#  
st1222:/u01/app/oracle/product/12.2.0/dbhome_1:N  
st1122:/u01/app/oracle/product/11.2.0/dbhome_1:N  
st112:/u01/app/oracle/product/11.2.0/dbhome_1:N        # line added by Agent  
[oracle@rac4-node2 ~]$
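Since those entries vanished on me, a small precaution is worth taking. Below is a minimal sketch that keeps a copy of the +ASM oratab lines and re-appends any that go missing after patching. The demo runs against scratch files in /tmp with a made-up +ASM1 entry, purely for illustration; for the real thing, back up /etc/oratab with cp -p and point the function at it:

```shell
# Keep the +ASM oratab lines from a pre-patch backup and re-append any that
# disappear after patching. The demo below uses scratch files in /tmp with a
# made-up +ASM1 entry; for real use, back up /etc/oratab with cp -p first.
restore_asm_entries() {   # usage: restore_asm_entries <backup> <oratab>
  grep '^+ASM' "$1" | while IFS= read -r line; do
    grep -qxF "$line" "$2" || printf '%s\n' "$line" >> "$2"
  done
}

# Demo on scratch files: the backup has an +ASM entry, the "patched" file lost it.
printf '+ASM1:/u01/app/grid:N\nst1222:/u01/app/oracle/product/12.2.0/dbhome_1:N\n' > /tmp/oratab.bak.demo
printf 'st1222:/u01/app/oracle/product/12.2.0/dbhome_1:N\n' > /tmp/oratab.demo
restore_asm_entries /tmp/oratab.bak.demo /tmp/oratab.demo
grep '^+ASM' /tmp/oratab.demo
```

The function only appends lines that are missing, so running it twice does not create duplicates.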

Spoiler alert: this guide is long... very long... sorry? But you can jump to the section you are most interested in using the index below.

Table Of Contents

Using Opatchauto

In this first scenario, we will apply the patch using opatchauto on the rac4 cluster, where our standby DBs are running. The opatchauto execution needs to be run as root; opatchauto will then execute the different steps with the appropriate user.

[root@rac4-node2 ~]# opatchauto apply /tmp/28833258/

OPatchauto session is initiated at Sat Mar  2 15:28:16 2019

System initialization log file is /u01/app/grid/cfgtoollogs/opatchautodb/systemconfig2019-03-02_03-28-22PM.log.

Session log file is /u01/app/grid/cfgtoollogs/opatchauto/opatchauto2019-03-02_03-29-35PM.log
The id for this session is 28LG

Executing OPatch prereq operations to verify patch applicability on home /u01/app/grid

Executing OPatch prereq operations to verify patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
Patch applicability verified successfully on home /u01/app/grid

Patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Verifying SQL patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
Skipping SQL patch step execution on standby database : st122
SQL patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Preparing to bring down database service on home /u01/app/oracle/product/12.2.0/dbhome_1
Successfully prepared home /u01/app/oracle/product/12.2.0/dbhome_1 to bring down database service

Bringing down CRS service on home /u01/app/grid
Prepatch operation log file location: /u01/app/oracle/crsdata/rac4-node2/crsconfig/crspatch_rac4-node2_2019-03-02_03-30-12PM.log
CRS service brought down successfully on home /u01/app/grid

Performing prepatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Prepatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start applying binary patch on home /u01/app/oracle/product/12.2.0/dbhome_1
Binary patch applied successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Performing postpatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Postpatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start applying binary patch on home /u01/app/grid
Binary patch applied successfully on home /u01/app/grid

Starting CRS service on home /u01/app/grid
Postpatch operation log file location: /u01/app/oracle/crsdata/rac4-node2/crsconfig/crspatch_rac4-node2_2019-03-02_03-41-43PM.log
CRS service started successfully on home /u01/app/grid

Preparing home /u01/app/oracle/product/12.2.0/dbhome_1 after database service restarted
No step execution required.........

Trying to apply SQL patch on home /u01/app/oracle/product/12.2.0/dbhome_1
Skipping SQL patch step execution on standby database : st122
SQL patch applied successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

OPatchAuto successful.

--------------------------------Summary--------------------------------

Patching is completed successfully. Please find the summary as follows:

Host:rac4-node2
RAC Home:/u01/app/oracle/product/12.2.0/dbhome_1
Version:12.2.0.1.0
Summary:

==Following patches were SKIPPED:

Patch: /tmp/28833258/28163235
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/26839277
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/28566910
Reason: This patch is not applicable to this specified target type - "rac_database"

==Following patches were SUCCESSFULLY applied:

Patch: /tmp/28833258/28698356
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_15-32-40PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_15-32-40PM_1.log

Host:rac4-node2
CRS Home:/u01/app/grid
Version:12.2.0.1.0
Summary:

==Following patches were SUCCESSFULLY applied:

Patch: /tmp/28833258/26839277
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_15-35-34PM_1.log

Patch: /tmp/28833258/28163235
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_15-35-34PM_1.log

Patch: /tmp/28833258/28566910
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_15-35-34PM_1.log

Patch: /tmp/28833258/28698356
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_15-35-34PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_15-35-34PM_1.log

Following homes are skipped during patching as patches are not applicable:

/u01/app/oracle/product/11.2.0/dbhome_1

OPatchauto session completed at Sat Mar  2 15:56:45 2019
Time taken to complete the session 28 minutes, 30 seconds
[root@rac4-node2 ~]#

Opatchauto (well, Clusterware) will start the DBs and services that are set to start automatically. In my case, most of the databases are not set to start automatically, since this is a lab and I don't need all of them running every time I want to check something :)
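If you do want everything back up after opatchauto, one low-risk approach is to generate the srvctl commands first and review them before running anything. This is only a sketch: the database names are the lab ones from the oratab shown earlier, and the output script path is arbitrary:

```shell
# Generate srvctl check/start commands for the lab databases instead of
# running them blindly; review /tmp/start_dbs.sh, then run it as oracle.
# DB names (st1222, st1122) come from the oratab shown earlier - adjust.
OUT=/tmp/start_dbs.sh
: > "$OUT"
for db in st1222 st1122; do
  echo "srvctl status database -d ${db}" >> "$OUT"
  echo "srvctl start database -d ${db}"  >> "$OUT"
done
cat "$OUT"
```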

Using Traditional Opatch

Let's try now the manual process (traditional OPatch) on node1 of rac4. This requires several manual interventions: we will apply the different patches to the Grid and DB homes one by one. The advantage is that we have more control over the process and, by that, can react faster to any issue. Before we start, we will first analyze the patch against the system. This gives us the actual patches we need to apply (notice we execute this step as root).

[root@rac4-node1 ~]# /u01/app/grid/OPatch/opatchauto apply /tmp/28833258/ -analyze

OPatchauto session is initiated at Sat Mar  2 13:51:52 2019

System initialization log file is /u01/app/grid/cfgtoollogs/opatchautodb/systemconfig2019-03-02_01-52-04PM.log.

Session log file is /u01/app/grid/cfgtoollogs/opatchauto/opatchauto2019-03-02_01-53-19PM.log
The id for this session is ZI2M

Executing OPatch prereq operations to verify patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1

Executing OPatch prereq operations to verify patch applicability on home /u01/app/grid
Patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Patch applicability verified successfully on home /u01/app/grid

Verifying SQL patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
Skipping SQL patch step execution on standby database : st122
SQL patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

OPatchAuto successful.

--------------------------------Summary--------------------------------

Analysis for applying patches has completed successfully:

Host:rac4-node1
RAC Home:/u01/app/oracle/product/12.2.0/dbhome_1
Version:12.2.0.1.0

==Following patches were SKIPPED:

Patch: /tmp/28833258/28163235
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/26839277
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/28566910
Reason: This patch is not applicable to this specified target type - "rac_database"

==Following patches were SUCCESSFULLY analyzed to be applied:

Patch: /tmp/28833258/28698356
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-54-00PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-54-00PM_1.log

Host:rac4-node1
CRS Home:/u01/app/grid
Version:12.2.0.1.0

==Following patches were SUCCESSFULLY analyzed to be applied:

Patch: /tmp/28833258/28698356
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-54-00PM_1.log

Patch: /tmp/28833258/28163235
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-54-00PM_1.log

Patch: /tmp/28833258/26839277
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-54-00PM_1.log

Patch: /tmp/28833258/28566910
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-54-00PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-54-00PM_1.log

Following homes are skipped during patching as patches are not applicable:

/u01/app/oracle/product/11.2.0/dbhome_1

OPatchauto session completed at Sat Mar  2 13:54:13 2019
Time taken to complete the session 2 minutes, 21 seconds
[root@rac4-node1 ~]#

If we go to the summary, we can see the two patches SUCCESSFULLY analyzed to be applied to our 12.2 DB home (/u01/app/oracle/product/12.2.0/dbhome_1):

Host:rac4-node1
RAC Home:/u01/app/oracle/product/12.2.0/dbhome_1
Version:12.2.0.1.0

==Following patches were SUCCESSFULLY analyzed to be applied:

Patch: /tmp/28833258/28698356
Patch: /tmp/28833258/28790640

And below, we can see the five patches that need to be applied to our Grid home:

Host:rac4-node1
CRS Home:/u01/app/grid
Version:12.2.0.1.0

==Following patches were SUCCESSFULLY analyzed to be applied:

Patch: /tmp/28833258/28698356
Patch: /tmp/28833258/28163235
Patch: /tmp/28833258/26839277
Patch: /tmp/28833258/28566910
Patch: /tmp/28833258/28790640

Let's now prepare CRS for the patching process using the command "rootcrs.sh -prepatch" (executed as root). This command will stop CRS and prepare it for the patches by unlocking the Grid home. Since we have already applied the patch on node2 (using opatchauto), you will see the CRS status reported as "The cluster upgrade state is [ROLLING PATCH]".

I didn't include this step in the tutorial, but something that can save you from a hard time is to stop both the DBs and Grid and make a copy of the homes, so you have something to fall back on after any misstep or mistake in the process:

cp -pR /u01/app/grid/ /u01/app/grid_prepatch  
cp -pR /u01/app/oracle/product/12.2.0/dbhome_1 /u01/app/oracle/product/12.2.0/dbhome_1_prepatch
[root@rac4-node1 ~]# /u01/app/grid/crs/install/rootcrs.sh -prepatch
Using configuration parameter file: /u01/app/grid/crs/install/crsconfig_params
The log of current session can be found at:
  /u01/app/oracle/crsdata/rac4-node1/crsconfig/crspatch_rac4-node1_2019-03-02_02-00-30PM.log

Oracle Clusterware active version on the cluster is [12.2.0.1.0]. The cluster upgrade state is [ROLLING PATCH]. The cluster active patch level is [2148646517].

CRS-2791: Starting shutdown of Oracle High Availability Services-managed resources on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.crsd' on 'rac4-node1'
CRS-2790: Starting shutdown of Cluster Ready Services-managed resources on server 'rac4-node1'
CRS-2673: Attempting to stop 'ora.LISTENER.lsnr' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.chad' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.chad' on 'rac4-node2'
CRS-2673: Attempting to stop 'ora.DATA_11.dg' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.DATA_DB.dg' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.DATA.dg' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.RECO.dg' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.LISTENER_SCAN1.lsnr' on 'rac4-node1'
CRS-2677: Stop of 'ora.RECO.dg' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.DATA_11.dg' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.DATA.dg' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.DATA_DB.dg' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.asm' on 'rac4-node1'
CRS-2677: Stop of 'ora.LISTENER_SCAN1.lsnr' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.scan1.vip' on 'rac4-node1'
CRS-2677: Stop of 'ora.LISTENER.lsnr' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.asm' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.ASMNET1LSNR_ASM.lsnr' on 'rac4-node1'
CRS-2677: Stop of 'ora.scan1.vip' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.chad' on 'rac4-node2' succeeded
CRS-2677: Stop of 'ora.chad' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.mgmtdb' on 'rac4-node1'
CRS-2677: Stop of 'ora.ASMNET1LSNR_ASM.lsnr' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.mgmtdb' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.MGMTLSNR' on 'rac4-node1'
CRS-2677: Stop of 'ora.MGMTLSNR' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.rac4-node1.vip' on 'rac4-node1'
CRS-2672: Attempting to start 'ora.MGMTLSNR' on 'rac4-node2'
CRS-2677: Stop of 'ora.rac4-node1.vip' on 'rac4-node1' succeeded
CRS-2676: Start of 'ora.MGMTLSNR' on 'rac4-node2' succeeded
CRS-2672: Attempting to start 'ora.scan1.vip' on 'rac4-node2'
CRS-2672: Attempting to start 'ora.rac4-node1.vip' on 'rac4-node2'
CRS-2676: Start of 'ora.rac4-node1.vip' on 'rac4-node2' succeeded
CRS-2676: Start of 'ora.scan1.vip' on 'rac4-node2' succeeded
CRS-2672: Attempting to start 'ora.LISTENER_SCAN1.lsnr' on 'rac4-node2'
CRS-2676: Start of 'ora.LISTENER_SCAN1.lsnr' on 'rac4-node2' succeeded
CRS-2673: Attempting to stop 'ora.ons' on 'rac4-node1'
CRS-2677: Stop of 'ora.ons' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.net1.network' on 'rac4-node1'
CRS-2677: Stop of 'ora.net1.network' on 'rac4-node1' succeeded
CRS-2792: Shutdown of Cluster Ready Services-managed resources on 'rac4-node1' has completed
CRS-2677: Stop of 'ora.crsd' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.asm' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.crf' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.drivers.acfs' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.gpnpd' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.mdnsd' on 'rac4-node1'
CRS-2677: Stop of 'ora.drivers.acfs' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.crf' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.gpnpd' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.mdnsd' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.asm' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.cluster_interconnect.haip' on 'rac4-node1'
CRS-2677: Stop of 'ora.cluster_interconnect.haip' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.ctssd' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.evmd' on 'rac4-node1'
CRS-2677: Stop of 'ora.ctssd' on 'rac4-node1' succeeded
CRS-2677: Stop of 'ora.evmd' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.cssd' on 'rac4-node1'
CRS-2677: Stop of 'ora.cssd' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.gipcd' on 'rac4-node1'
CRS-2677: Stop of 'ora.gipcd' on 'rac4-node1' succeeded
CRS-2793: Shutdown of Oracle High Availability Services-managed resources on 'rac4-node1' has completed
CRS-4133: Oracle High Availability Services has been stopped.
2019/03/02 14:01:41 CLSRSC-4012: Shutting down Oracle Trace File Analyzer (TFA) Collector.
2019/03/02 14:01:52 CLSRSC-4013: Successfully shut down Oracle Trace File Analyzer (TFA) Collector.
2019/03/02 14:01:52 CLSRSC-347: Successfully unlock /u01/app/grid
[root@rac4-node1 ~]#
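That "cluster upgrade state" line is worth checking programmatically before moving on to the next node. A small sketch, using the sample line rootcrs.sh printed above hard-coded for illustration (on a real cluster, replace it with the output of crsctl query crs activeversion -f from the Grid home):

```shell
# Sample line as printed by rootcrs.sh above; on a real cluster replace with:
#   state=$(/u01/app/grid/bin/crsctl query crs activeversion -f)
state="Oracle Clusterware active version on the cluster is [12.2.0.1.0]. The cluster upgrade state is [ROLLING PATCH]. The cluster active patch level is [2148646517]."

case "$state" in
  *"[ROLLING PATCH]"*) result="rolling patch in progress" ;;
  *"[NORMAL]"*)        result="cluster in NORMAL state" ;;
  *)                   result="unexpected state" ;;
esac
echo "$result"
```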

We can now start installing the Grid patches on the local node, executed as the Grid owner user (in this case oracle). Notice we use "opatch apply" with the "-local" flag.

If you are sure all services are down, you can execute the patch installation using the -silent option. This avoids any prompts and speeds up the process. It also allows you to run the patching with nohup in the background, making sure the process continues even if your session on the host is lost. I don't think it is a good idea to run your upgrades from a 2G mobile connection and leave them in the background... but I just wanted to make clear it is possible.
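As a sketch of that idea, the loop below just generates the five per-patch nohup one-liners into a throwaway script for review, rather than executing anything itself. The patch numbers are the ones from the analyze summary above; the script path is arbitrary:

```shell
# Build the five per-patch opatch one-liners wrapped in nohup so each apply
# survives a lost session. Patch numbers are the ones this RU unpacked to
# /tmp/28833258; the generated script is meant to be reviewed, then run
# sequentially as the grid owner.
GRID_HOME=/u01/app/grid
PATCH_BASE=/tmp/28833258
OUT=/tmp/apply_grid_patches.sh
: > "$OUT"
for p in 28698356 28163235 26839277 28566910 28790640; do
  echo "nohup ${GRID_HOME}/OPatch/opatch apply -oh ${GRID_HOME} -local ${PATCH_BASE}/${p} -silent > /tmp/opatch_${p}.log 2>&1" >> "$OUT"
done
cat "$OUT"
```

Note the commands are meant to run one after another; applying Grid patches in parallel against the same home is not a good idea.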

[oracle@rac4-node1 ~]$ /u01/app/grid/OPatch/opatch apply -oh /u01/app/grid -local /tmp/28833258/28698356 -silent
Oracle Interim Patch Installer version 12.2.0.1.16
Copyright (c) 2019, Oracle Corporation.  All rights reserved.

Oracle Home       : /u01/app/grid
Central Inventory : /u01/oraInventory
   from           : /u01/app/grid/oraInst.loc
OPatch version    : 12.2.0.1.16
OUI version       : 12.2.0.1.4
Log file location : /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-06-20PM_1.log

Verifying environment and performing prerequisite checks...

--------------------------------------------------------------------------------
Start OOP by Prereq process.
Launch OOP...

Oracle Interim Patch Installer version 12.2.0.1.16
Copyright (c) 2019, Oracle Corporation.  All rights reserved.

Oracle Home       : /u01/app/grid
Central Inventory : /u01/oraInventory
   from           : /u01/app/grid/oraInst.loc
OPatch version    : 12.2.0.1.16
OUI version       : 12.2.0.1.4
Log file location : /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-06-56PM_1.log

Verifying environment and performing prerequisite checks...
OPatch continues with these patches:   28698356

Do you want to proceed? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
All checks passed.

Please shutdown Oracle instances running out of this ORACLE_HOME on the local system.
(Oracle Home = '/u01/app/grid')

Is the local system ready for patching? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
Backing up files...
Applying interim patch '28698356' to OH '/u01/app/grid'

Patching component oracle.rdbms, 12.2.0.1.0...

Patching component oracle.has.rsf, 12.2.0.1.0...

Patching component oracle.has.common, 12.2.0.1.0...

Patching component oracle.has.common.cvu, 12.2.0.1.0...

Patching component oracle.has.cvu, 12.2.0.1.0...

Patching component oracle.has.db, 12.2.0.1.0...

Patching component oracle.has.deconfig, 12.2.0.1.0...

Patching component oracle.has.crs, 12.2.0.1.0...
Patch 28698356 successfully applied.
Log file location: /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-06-56PM_1.log

OPatch succeeded.
[oracle@rac4-node1 ~]$

[oracle@rac4-node1 ~]$  /u01/app/grid/OPatch/opatch apply -oh /u01/app/grid -local /tmp/28833258/28163235 -silent
Oracle Interim Patch Installer version 12.2.0.1.16
Copyright (c) 2019, Oracle Corporation.  All rights reserved.

Oracle Home       : /u01/app/grid
Central Inventory : /u01/oraInventory
   from           : /u01/app/grid/oraInst.loc
OPatch version    : 12.2.0.1.16
OUI version       : 12.2.0.1.4
Log file location : /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-12-52PM_1.log

Verifying environment and performing prerequisite checks...
OPatch continues with these patches:   28163235

Do you want to proceed? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
All checks passed.

Please shutdown Oracle instances running out of this ORACLE_HOME on the local system.
(Oracle Home = '/u01/app/grid')

Is the local system ready for patching? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
Backing up files...
Applying interim patch '28163235' to OH '/u01/app/grid'

Patching component oracle.usm, 12.2.0.1.0...
Patch 28163235 successfully applied.
Log file location: /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-12-52PM_1.log

OPatch succeeded.
[oracle@rac4-node1 ~]$ 

[oracle@rac4-node1 ~]$ /u01/app/grid/OPatch/opatch apply -oh /u01/app/grid -local /tmp/28833258/26839277 -silent
Oracle Interim Patch Installer version 12.2.0.1.16
Copyright (c) 2019, Oracle Corporation.  All rights reserved.

Oracle Home       : /u01/app/grid
Central Inventory : /u01/oraInventory
   from           : /u01/app/grid/oraInst.loc
OPatch version    : 12.2.0.1.16
OUI version       : 12.2.0.1.4
Log file location : /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-17-05PM_1.log

Verifying environment and performing prerequisite checks...
OPatch continues with these patches:   26839277

Do you want to proceed? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
All checks passed.

Please shutdown Oracle instances running out of this ORACLE_HOME on the local system.
(Oracle Home = '/u01/app/grid')

Is the local system ready for patching? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
Backing up files...
Applying interim patch '26839277' to OH '/u01/app/grid'

Patching component oracle.wlm.dbwlm, 12.2.0.1.0...
Patch 26839277 successfully applied.
Log file location: /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-17-05PM_1.log

OPatch succeeded.
[oracle@rac4-node1 ~]$

[oracle@rac4-node1 ~]$ /u01/app/grid/OPatch/opatch apply -oh /u01/app/grid -local /tmp/28833258/28566910 -silent
Oracle Interim Patch Installer version 12.2.0.1.16
Copyright (c) 2019, Oracle Corporation.  All rights reserved.

Oracle Home       : /u01/app/grid
Central Inventory : /u01/oraInventory
   from           : /u01/app/grid/oraInst.loc
OPatch version    : 12.2.0.1.16
OUI version       : 12.2.0.1.4
Log file location : /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-18-38PM_1.log

Verifying environment and performing prerequisite checks...
OPatch continues with these patches:   28566910

Do you want to proceed? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
All checks passed.

Please shutdown Oracle instances running out of this ORACLE_HOME on the local system.
(Oracle Home = '/u01/app/grid')

Is the local system ready for patching? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
Backing up files...
Applying interim patch '28566910' to OH '/u01/app/grid'

Patching component oracle.tomcat.crs, 12.2.0.1.0...
Patch 28566910 successfully applied.
Log file location: /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-18-38PM_1.log

OPatch succeeded.
[oracle@rac4-node1 ~]$

[oracle@rac4-node1 ~]$ /u01/app/grid/OPatch/opatch apply -oh /u01/app/grid -local /tmp/28833258/28790640 -silent
Oracle Interim Patch Installer version 12.2.0.1.16
Copyright (c) 2019, Oracle Corporation.  All rights reserved.

Oracle Home       : /u01/app/grid
Central Inventory : /u01/oraInventory
   from           : /u01/app/grid/oraInst.loc
OPatch version    : 12.2.0.1.16
OUI version       : 12.2.0.1.4
Log file location : /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-20-00PM_1.log

Verifying environment and performing prerequisite checks...
OPatch continues with these patches:   28790640

Do you want to proceed? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
All checks passed.

Please shutdown Oracle instances running out of this ORACLE_HOME on the local system.
(Oracle Home = '/u01/app/grid')

Is the local system ready for patching? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
Backing up files...
Applying interim patch '28790640' to OH '/u01/app/grid'
ApplySession: Optional component(s) [ oracle.oid.client, 12.2.0.1.0 ] , [ oracle.ctx, 12.2.0.1.0 ] , [ oracle.network.cman, 12.2.0.1.0 ] , [ oracle.precomp.lang, 12.2.0.1.0 ] , [ oracle.rdbms.oci, 12.2.0.1.0 ] , [ oracle.xdk, 12.2.0.1.0 ] , [ oracle.rdbms.dv, 12.2.0.1.0 ] , [ oracle.rdbms.lbac, 12.2.0.1.0 ] , [ oracle.ons.daemon, 12.2.0.1.0 ] , [ oracle.precomp.common, 12.2.0.1.0 ] , [ oracle.sdo, 12.2.0.1.0 ]  not present in the Oracle Home or a higher version is found.

Patching component oracle.rdbms.crs, 12.2.0.1.0...

Patching component oracle.assistants.server, 12.2.0.1.0...

Patching component oracle.xdk.rsf, 12.2.0.1.0...

Patching component oracle.rdbms, 12.2.0.1.0...

Patching component oracle.ons, 12.2.0.1.0...

Patching component oracle.tfa, 12.2.0.1.0...

Patching component oracle.xdk.parser.java, 12.2.0.1.0...

Patching component oracle.has.deconfig, 12.2.0.1.0...

Patching component oracle.rdbms.rsf.ic, 12.2.0.1.0...

Patching component oracle.ldap.client, 12.2.0.1.0...

Patching component oracle.ldap.rsf, 12.2.0.1.0...

Patching component oracle.rdbms.deconfig, 12.2.0.1.0...

Patching component oracle.rdbms.dbscripts, 12.2.0.1.0...

Patching component oracle.nlsrtl.rsf, 12.2.0.1.0...

Patching component oracle.has.crs, 12.2.0.1.0...

Patching component oracle.oracore.rsf, 12.2.0.1.0...

Patching component oracle.rdbms.util, 12.2.0.1.0...

Patching component oracle.rdbms.rsf, 12.2.0.1.0...

Patching component oracle.ctx.rsf, 12.2.0.1.0...

Patching component oracle.rdbms.rman, 12.2.0.1.0...

Patching component oracle.ldap.rsf.ic, 12.2.0.1.0...

Patching component oracle.network.rsf, 12.2.0.1.0...
Patch 28790640 successfully applied.
Log file location: /u01/app/grid/cfgtoollogs/opatch/opatch2019-03-02_14-20-00PM_1.log

OPatch succeeded.
[oracle@rac4-node1 ~]$

Let's now apply the two patches to our DB home /u01/app/oracle/product/12.2.0/dbhome_1:

[oracle@rac4-node1 ~]$ /u01/app/oracle/product/12.2.0/dbhome_1/OPatch/opatch apply -oh /u01/app/oracle/product/12.2.0/dbhome_1 -local /tmp/28833258/28698356 -silent
Oracle Interim Patch Installer version 12.2.0.1.16
Copyright (c) 2019, Oracle Corporation.  All rights reserved.

Oracle Home       : /u01/app/oracle/product/12.2.0/dbhome_1
Central Inventory : /u01/oraInventory
   from           : /u01/app/oracle/product/12.2.0/dbhome_1/oraInst.loc
OPatch version    : 12.2.0.1.16
OUI version       : 12.2.0.1.4
Log file location : /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatch/opatch2019-03-02_14-24-23PM_1.log

Verifying environment and performing prerequisite checks...

--------------------------------------------------------------------------------
Start OOP by Prereq process.
Launch OOP...

Oracle Interim Patch Installer version 12.2.0.1.16
Copyright (c) 2019, Oracle Corporation.  All rights reserved.

Oracle Home       : /u01/app/oracle/product/12.2.0/dbhome_1
Central Inventory : /u01/oraInventory
   from           : /u01/app/oracle/product/12.2.0/dbhome_1/oraInst.loc
OPatch version    : 12.2.0.1.16
OUI version       : 12.2.0.1.4
Log file location : /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatch/opatch2019-03-02_14-24-45PM_1.log

Verifying environment and performing prerequisite checks...
OPatch continues with these patches:   28698356

Do you want to proceed? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
All checks passed.

Please shutdown Oracle instances running out of this ORACLE_HOME on the local system.
(Oracle Home = '/u01/app/oracle/product/12.2.0/dbhome_1')

Is the local system ready for patching? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
Backing up files...
Applying interim patch '28698356' to OH '/u01/app/oracle/product/12.2.0/dbhome_1'
ApplySession: Optional component(s) [ oracle.has.cvu, 12.2.0.1.0 ] , [ oracle.has.crs, 12.2.0.1.0 ]  not present in the Oracle Home or a higher version is found.

Patching component oracle.rdbms, 12.2.0.1.0...

Patching component oracle.has.rsf, 12.2.0.1.0...

Patching component oracle.has.common, 12.2.0.1.0...

Patching component oracle.has.common.cvu, 12.2.0.1.0...

Patching component oracle.has.db, 12.2.0.1.0...

Patching component oracle.has.deconfig, 12.2.0.1.0...
Patch 28698356 successfully applied.
Log file location: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatch/opatch2019-03-02_14-24-45PM_1.log

OPatch succeeded.
[oracle@rac4-node1 ~]$

[oracle@rac4-node1 ~]$ /u01/app/oracle/product/12.2.0/dbhome_1/OPatch/opatch apply -oh /u01/app/oracle/product/12.2.0/dbhome_1 -local /tmp/28833258/28790640 -silent
Oracle Interim Patch Installer version 12.2.0.1.16
Copyright (c) 2019, Oracle Corporation.  All rights reserved.

Oracle Home       : /u01/app/oracle/product/12.2.0/dbhome_1
Central Inventory : /u01/oraInventory
   from           : /u01/app/oracle/product/12.2.0/dbhome_1/oraInst.loc
OPatch version    : 12.2.0.1.16
OUI version       : 12.2.0.1.4
Log file location : /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatch/opatch2019-03-02_14-26-20PM_1.log

Verifying environment and performing prerequisite checks...
OPatch continues with these patches:   28790640

Do you want to proceed? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
All checks passed.

Please shutdown Oracle instances running out of this ORACLE_HOME on the local system.
(Oracle Home = '/u01/app/oracle/product/12.2.0/dbhome_1')

Is the local system ready for patching? [y|n]
Y (auto-answered by -silent)
User Responded with: Y
Backing up files...
Applying interim patch '28790640' to OH '/u01/app/oracle/product/12.2.0/dbhome_1'
ApplySession: Optional component(s) [ oracle.oid.client, 12.2.0.1.0 ] , [ oracle.network.cman, 12.2.0.1.0 ] , [ oracle.has.crs, 12.2.0.1.0 ] , [ oracle.ons.daemon, 12.2.0.1.0 ]  not present in the Oracle Home or a higher version is found.

Patching component oracle.rdbms.crs, 12.2.0.1.0...

Patching component oracle.assistants.server, 12.2.0.1.0...

Patching component oracle.xdk.rsf, 12.2.0.1.0...

Patching component oracle.rdbms, 12.2.0.1.0...

Patching component oracle.ons, 12.2.0.1.0...

Patching component oracle.tfa, 12.2.0.1.0...

Patching component oracle.ctx, 12.2.0.1.0...

Patching component oracle.xdk.parser.java, 12.2.0.1.0...

Patching component oracle.has.deconfig, 12.2.0.1.0...

Patching component oracle.precomp.lang, 12.2.0.1.0...

Patching component oracle.rdbms.oci, 12.2.0.1.0...

Patching component oracle.rdbms.rsf.ic, 12.2.0.1.0...

Patching component oracle.xdk, 12.2.0.1.0...

Patching component oracle.rdbms.dv, 12.2.0.1.0...

Patching component oracle.ldap.client, 12.2.0.1.0...

Patching component oracle.ldap.rsf, 12.2.0.1.0...

Patching component oracle.rdbms.lbac, 12.2.0.1.0...

Patching component oracle.rdbms.deconfig, 12.2.0.1.0...

Patching component oracle.rdbms.dbscripts, 12.2.0.1.0...

Patching component oracle.nlsrtl.rsf, 12.2.0.1.0...

Patching component oracle.oracore.rsf, 12.2.0.1.0...

Patching component oracle.rdbms.util, 12.2.0.1.0...

Patching component oracle.rdbms.rsf, 12.2.0.1.0...

Patching component oracle.precomp.common, 12.2.0.1.0...

Patching component oracle.ctx.rsf, 12.2.0.1.0...

Patching component oracle.rdbms.rman, 12.2.0.1.0...

Patching component oracle.ldap.rsf.ic, 12.2.0.1.0...

Patching component oracle.network.rsf, 12.2.0.1.0...

Patching component oracle.sdo, 12.2.0.1.0...
Patch 28790640 successfully applied.
Log file location: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatch/opatch2019-03-02_14-26-20PM_1.log

OPatch succeeded.
[oracle@rac4-node1 ~]$

Let's complete the patching process by executing the following two steps as the root user:

[root@rac4-node1 ~]# /u01/app/grid/rdbms/install/rootadd_rdbms.sh
[root@rac4-node1 ~]# /u01/app/grid/crs/install/rootcrs.sh -postpatch
Using configuration parameter file: /u01/app/grid/crs/install/crsconfig_params
The log of current session can be found at:
  /u01/app/oracle/crsdata/rac4-node1/crsconfig/crspatch_rac4-node1_2019-03-02_02-29-30PM.log
2019/03/02 14:29:48 CLSRSC-4015: Performing install or upgrade action for Oracle Trace File Analyzer (TFA) Collector.
2019/03/02 14:30:06 CLSRSC-4003: Successfully patched Oracle Trace File Analyzer (TFA) Collector.
2019/03/02 14:30:11 CLSRSC-329: Replacing Clusterware entries in file 'oracle-ohasd.conf'
CRS-4123: Starting Oracle High Availability Services-managed resources
CRS-2672: Attempting to start 'ora.mdnsd' on 'rac4-node1'
CRS-2672: Attempting to start 'ora.evmd' on 'rac4-node1'
CRS-2676: Start of 'ora.mdnsd' on 'rac4-node1' succeeded
CRS-2676: Start of 'ora.evmd' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.gpnpd' on 'rac4-node1'
CRS-2676: Start of 'ora.gpnpd' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.gipcd' on 'rac4-node1'
CRS-2676: Start of 'ora.gipcd' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.cssdmonitor' on 'rac4-node1'
CRS-2676: Start of 'ora.cssdmonitor' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.cssd' on 'rac4-node1'
CRS-2672: Attempting to start 'ora.diskmon' on 'rac4-node1'
CRS-2676: Start of 'ora.diskmon' on 'rac4-node1' succeeded
CRS-2676: Start of 'ora.cssd' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.cluster_interconnect.haip' on 'rac4-node1'
CRS-2672: Attempting to start 'ora.ctssd' on 'rac4-node1'
CRS-2676: Start of 'ora.ctssd' on 'rac4-node1' succeeded
CRS-2676: Start of 'ora.cluster_interconnect.haip' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.asm' on 'rac4-node1'
CRS-2676: Start of 'ora.asm' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.storage' on 'rac4-node1'
CRS-2676: Start of 'ora.storage' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.crf' on 'rac4-node1'
CRS-2676: Start of 'ora.crf' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.crsd' on 'rac4-node1'
CRS-2676: Start of 'ora.crsd' on 'rac4-node1' succeeded
CRS-6017: Processing resource auto-start for servers: rac4-node1
CRS-2672: Attempting to start 'ora.ASMNET1LSNR_ASM.lsnr' on 'rac4-node1'
CRS-2672: Attempting to start 'ora.ons' on 'rac4-node1'
CRS-2673: Attempting to stop 'ora.rac4-node1.vip' on 'rac4-node2'
CRS-2673: Attempting to stop 'ora.LISTENER_SCAN1.lsnr' on 'rac4-node2'
CRS-2677: Stop of 'ora.LISTENER_SCAN1.lsnr' on 'rac4-node2' succeeded
CRS-2673: Attempting to stop 'ora.scan1.vip' on 'rac4-node2'
CRS-2677: Stop of 'ora.rac4-node1.vip' on 'rac4-node2' succeeded
CRS-2672: Attempting to start 'ora.rac4-node1.vip' on 'rac4-node1'
CRS-2677: Stop of 'ora.scan1.vip' on 'rac4-node2' succeeded
CRS-2672: Attempting to start 'ora.scan1.vip' on 'rac4-node1'
CRS-2676: Start of 'ora.rac4-node1.vip' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.LISTENER.lsnr' on 'rac4-node1'
CRS-2676: Start of 'ora.ASMNET1LSNR_ASM.lsnr' on 'rac4-node1' succeeded
CRS-2676: Start of 'ora.scan1.vip' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.LISTENER_SCAN1.lsnr' on 'rac4-node1'
CRS-2676: Start of 'ora.ons' on 'rac4-node1' succeeded
CRS-2676: Start of 'ora.LISTENER.lsnr' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.asm' on 'rac4-node1'
CRS-2676: Start of 'ora.LISTENER_SCAN1.lsnr' on 'rac4-node1' succeeded
CRS-2676: Start of 'ora.asm' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.DATA.dg' on 'rac4-node1'
CRS-2676: Start of 'ora.DATA.dg' on 'rac4-node1' succeeded
CRS-2673: Attempting to stop 'ora.MGMTLSNR' on 'rac4-node2'
CRS-2664: Resource 'ora.DATA.dg' is already running on 'rac4-node1'
CRS-2664: Resource 'ora.DATA_11.dg' is already running on 'rac4-node1'
CRS-2664: Resource 'ora.DATA_DB.dg' is already running on 'rac4-node1'
CRS-2664: Resource 'ora.RECO.dg' is already running on 'rac4-node1'
CRS-2677: Stop of 'ora.MGMTLSNR' on 'rac4-node2' succeeded
CRS-2672: Attempting to start 'ora.MGMTLSNR' on 'rac4-node1'
CRS-2676: Start of 'ora.MGMTLSNR' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.mgmtdb' on 'rac4-node1'
CRS-2676: Start of 'ora.mgmtdb' on 'rac4-node1' succeeded
CRS-2672: Attempting to start 'ora.chad' on 'rac4-node1'
CRS-2672: Attempting to start 'ora.chad' on 'rac4-node2'
CRS-2676: Start of 'ora.chad' on 'rac4-node2' succeeded
CRS-2676: Start of 'ora.chad' on 'rac4-node1' succeeded
CRS-6016: Resource auto-start has completed for server rac4-node1
CRS-6024: Completed start of Oracle Cluster Ready Services-managed resources
CRS-4123: Oracle High Availability Services has been started.
Oracle Clusterware active version on the cluster is [12.2.0.1.0]. The cluster upgrade state is [NORMAL]. The cluster active patch level is [2148646517].
SQL Patching tool version 12.2.0.1.0 Production on Sat Mar  2 14:35:03 2019
Copyright (c) 2012, 2018, Oracle.  All rights reserved.

Connecting to database...OK
Note:  Datapatch will only apply or rollback SQL fixes for PDBs
       that are in an open state, no patches will be applied to closed PDBs.
       Please refer to Note: Datapatch: Database 12c Post Patch SQL Automation
       (Doc ID 1585822.1)
Bootstrapping registry and package to current versions...done
Determining current state...done
Adding patches to installation queue and performing prereq checks...done
Installation queue:
  For the following PDBs: CDB$ROOT PDB$SEED GIMR_DSCREP_10
    Nothing to roll back
    The following patches will be applied:
      28790640 (DATABASE JUL 2018 RELEASE UPDATE REVISION 12.2.0.1.190115)

Installing patches...
Patch installation complete.  Total patches installed: 3

Validating logfiles...done
SQL Patching tool complete on Sat Mar  2 14:44:37 2019
[root@rac4-node1 ~]#
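After the postpatch and datapatch steps, it is worth confirming the binary patch level before moving on. A minimal sketch: the Oracle commands stay in comments since they need a live node, while the parsing below runs on sample `opatch lspatches`-style output (the second patch line is a made-up placeholder, not real patch metadata):

```shell
# Post-patch sanity check (sketch). On a real node, as the home owner, run:
#   $ORACLE_HOME/OPatch/opatch lspatches
#   crsctl query crs activeversion -f    # shows the cluster active patch level
# Sample lspatches-style output; each line is "<patch id>;<description>"
# (the second entry is a placeholder for illustration):
sample_lspatches='28790640;DATABASE JUL 2018 RELEASE UPDATE REVISION 12.2.0.1.190115
28566910;PLACEHOLDER PATCH DESCRIPTION'
# Keep just the patch IDs:
applied_ids=$(printf '%s\n' "$sample_lspatches" | cut -d';' -f1)
echo "$applied_ids"
```

On the lab node above, seeing 28790640 listed by `opatch lspatches` against the RDBMS home is the quick confirmation that the RU binary is in place.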

Opatchauto: A Failed Attempt

Let's now try using opatchauto on rac5-node1 (part of the cluster where the primary databases run). **Spoiler alert: it will fail!**

[root@rac5-node1 ~]# /u01/app/grid/OPatch/opatchauto apply /tmp/28833258/

OPatchauto session is initiated at Sat Feb 23 16:27:47 2019

System initialization log file is /u01/app/grid/cfgtoollogs/opatchautodb/systemconfig2019-02-23_04-27-57PM.log.

Session log file is /u01/app/grid/cfgtoollogs/opatchauto/opatchauto2019-02-23_04-29-22PM.log
The id for this session is AC9M

Executing OPatch prereq operations to verify patch applicability on home /u01/app/grid

Executing OPatch prereq operations to verify patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
Patch applicability verified successfully on home /u01/app/grid

Patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Verifying SQL patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
SQL patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Preparing to bring down database service on home /u01/app/oracle/product/12.2.0/dbhome_1
Successfully prepared home /u01/app/oracle/product/12.2.0/dbhome_1 to bring down database service

Bringing down CRS service on home /u01/app/grid
Prepatch operation log file location: /u01/app/oracle/crsdata/rac5-node1/crsconfig/crspatch_rac5-node1_2019-02-23_04-31-45PM.log
Failed to bring down CRS service on home /u01/app/grid

Execution of [GIShutDownAction] patch action failed, check log for more details. Failures:
Patch Target : rac5-node1->/u01/app/grid Type[crs]
Details: [
---------------------------Patching Failed---------------------------------
Command execution failed during patching in home: /u01/app/grid, host: rac5-node1.
Command failed:  /u01/app/grid/perl/bin/perl -I/u01/app/grid/perl/lib -I/u01/app/grid/OPatch/auto/dbtmp/bootstrap_rac5-node1/patchwork/crs/install /u01/app/grid/OPatch/auto/dbtmp/bootstrap_rac5-node1/patchwork/crs/install/rootcrs.pl -prepatch
Command failure output:
Using configuration parameter file: /u01/app/grid/OPatch/auto/dbtmp/bootstrap_rac5-node1/patchwork/crs/install/crsconfig_params
The log of current session can be found at:
  /u01/app/oracle/crsdata/rac5-node1/crsconfig/crspatch_rac5-node1_2019-02-23_04-31-45PM.log
Oracle Clusterware active version on the cluster is [12.2.0.1.0]. The cluster upgrade state is [NORMAL]. The cluster active patch level is [0].
ORA-15158: rolling upgrade prevented by One or more modules not ready for RM
CRS-4698: Error code 2 in retrieving the patch levelCRS-1154: There was an error setting Oracle ASM to rolling patch mode.
CRS-4000: Command Start failed, or completed with errors.
2019/02/23 16:33:06 CLSRSC-430: Failed to start rolling patch mode

After fixing the cause of failure Run opatchauto resume

]
OPATCHAUTO-68061: The orchestration engine failed.
OPATCHAUTO-68061: The orchestration engine failed with return code 1
OPATCHAUTO-68061: Check the log for more details.
OPatchAuto failed.

OPatchauto session completed at Sat Feb 23 16:33:10 2019
Time taken to complete the session 5 minutes, 24 seconds

 opatchauto failed with error code 42
[root@rac5-node1 ~]#

As we can see above, the opatchauto execution failed. The good news is that we actually got the message showing where it failed (can you believe it?! :P), so we will take that as the starting point for the next actions. Since it looks like the prepatch step failed, let's execute it manually:

[root@rac5-node1 ~]#  /u01/app/grid/crs/install/rootcrs.sh -prepatch
Using configuration parameter file: /u01/app/grid/crs/install/crsconfig_params
The log of current session can be found at:
  /u01/app/oracle/crsdata/rac5-node1/crsconfig/crspatch_rac5-node1_2019-02-23_04-34-54PM.log
Oracle Clusterware active version on the cluster is [12.2.0.1.0]. The cluster upgrade state is [NORMAL]. The cluster active patch level is [0].
CRS-2791: Starting shutdown of Oracle High Availability Services-managed resources on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.crsd' on 'rac5-node1'
CRS-2790: Starting shutdown of Cluster Ready Services-managed resources on server 'rac5-node1'
CRS-2673: Attempting to stop 'ora.data_db.acfsvol1.acfs' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.chad' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.db122.db' on 'rac5-node1'
CRS-2677: Stop of 'ora.data_db.acfsvol1.acfs' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.DATA_DB.ACFSVOL1.advm' on 'rac5-node1'
CRS-2677: Stop of 'ora.DATA_DB.ACFSVOL1.advm' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.proxy_advm' on 'rac5-node1'
CRS-2677: Stop of 'ora.chad' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.db122.db' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.DATA.dg' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.DATA_11.dg' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.RECO.dg' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.DATA_DB.dg' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.LISTENER.lsnr' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.LISTENER_SCAN1.lsnr' on 'rac5-node1'
CRS-2677: Stop of 'ora.LISTENER_SCAN1.lsnr' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.LISTENER.lsnr' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.rac5-node1.vip' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.scan1.vip' on 'rac5-node1'
CRS-2677: Stop of 'ora.DATA.dg' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.RECO.dg' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.DATA_11.dg' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.DATA_DB.dg' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.asm' on 'rac5-node1'
CRS-2677: Stop of 'ora.asm' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.ASMNET1LSNR_ASM.lsnr' on 'rac5-node1'
CRS-2677: Stop of 'ora.scan1.vip' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.rac5-node1.vip' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.proxy_advm' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.ASMNET1LSNR_ASM.lsnr' on 'rac5-node1' succeeded
CRS-2672: Attempting to start 'ora.scan1.vip' on 'rac5-node2'
CRS-2672: Attempting to start 'ora.rac5-node1.vip' on 'rac5-node2'
CRS-2676: Start of 'ora.scan1.vip' on 'rac5-node2' succeeded
CRS-2672: Attempting to start 'ora.LISTENER_SCAN1.lsnr' on 'rac5-node2'
CRS-2676: Start of 'ora.rac5-node1.vip' on 'rac5-node2' succeeded
CRS-2676: Start of 'ora.LISTENER_SCAN1.lsnr' on 'rac5-node2' succeeded
CRS-2673: Attempting to stop 'ora.ons' on 'rac5-node1'
CRS-2677: Stop of 'ora.ons' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.net1.network' on 'rac5-node1'
CRS-2677: Stop of 'ora.net1.network' on 'rac5-node1' succeeded
CRS-2792: Shutdown of Cluster Ready Services-managed resources on 'rac5-node1' has completed
CRS-2677: Stop of 'ora.crsd' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.asm' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.crf' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.drivers.acfs' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.gpnpd' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.mdnsd' on 'rac5-node1'
CRS-2677: Stop of 'ora.drivers.acfs' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.gpnpd' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.crf' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.mdnsd' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.asm' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.cluster_interconnect.haip' on 'rac5-node1'
CRS-2677: Stop of 'ora.cluster_interconnect.haip' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.ctssd' on 'rac5-node1'
CRS-2673: Attempting to stop 'ora.evmd' on 'rac5-node1'
CRS-2677: Stop of 'ora.ctssd' on 'rac5-node1' succeeded
CRS-2677: Stop of 'ora.evmd' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.cssd' on 'rac5-node1'
CRS-2677: Stop of 'ora.cssd' on 'rac5-node1' succeeded
CRS-2673: Attempting to stop 'ora.gipcd' on 'rac5-node1'
CRS-2677: Stop of 'ora.gipcd' on 'rac5-node1' succeeded
CRS-2793: Shutdown of Oracle High Availability Services-managed resources on 'rac5-node1' has completed
CRS-4133: Oracle High Availability Services has been stopped.
2019/02/23 16:36:00 CLSRSC-4012: Shutting down Oracle Trace File Analyzer (TFA) Collector.
2019/02/23 16:36:12 CLSRSC-4013: Successfully shut down Oracle Trace File Analyzer (TFA) Collector.
2019/02/23 16:36:12 CLSRSC-347: Successfully unlock /u01/app/grid
[root@rac5-node1 ~]#
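Before resuming, it doesn't hurt to confirm that the stack on this node is actually down. A small sketch: on the node you would run `crsctl check crs`; here the expected "stack down" message is hard-coded as sample input so the snippet runs anywhere:

```shell
# Sketch: decide whether it is safe to `opatchauto resume` based on
# `crsctl check crs` output. CRS-4639 means the HAS stack is not running.
sample_check='CRS-4639: Could not contact Oracle High Availability Services'
case "$sample_check" in
  CRS-4639*) state="down" ;;   # stack stopped: safe to resume
  *)         state="up" ;;     # stack still running: investigate first
esac
echo "$state"
```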

Let's now try to resume the previous opatchauto session:

[root@rac5-node1 ~]# /u01/app/grid/OPatch/opatchauto resume

OPatchauto session is initiated at Sat Feb 23 16:37:56 2019
Session log file is /u01/app/grid/cfgtoollogs/opatchauto/opatchauto2019-02-23_04-37-56PM.log
Resuming existing session with id AC9M

Bringing down CRS service on home /u01/app/grid
Prepatch operation log file location: /u01/app/oracle/crsdata/rac5-node1/crsconfig/crspatch_rac5-node1_2019-02-23_04-38-13PM.log
CRS service brought down successfully on home /u01/app/grid

Performing prepatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Perpatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start applying binary patch on home /u01/app/oracle/product/12.2.0/dbhome_1
Binary patch applied successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Performing postpatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Postpatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start applying binary patch on home /u01/app/grid
Binary patch applied successfully on home /u01/app/grid

Starting CRS service on home /u01/app/grid
Postpatch operation log file location: /u01/app/oracle/crsdata/rac5-node1/crsconfig/crspatch_rac5-node1_2019-02-23_04-48-59PM.log
CRS service started successfully on home /u01/app/grid

Preparing home /u01/app/oracle/product/12.2.0/dbhome_1 after database service restarted
No step execution required.........

Trying to apply SQL patch on home /u01/app/oracle/product/12.2.0/dbhome_1
SQL patch applied successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

OPatchAuto successful.

--------------------------------Summary--------------------------------

Patching is completed successfully. Please find the summary as follows:

Host:rac5-node1
RAC Home:/u01/app/oracle/product/12.2.0/dbhome_1
Version:12.2.0.1.0
Summary:

==Following patches were SKIPPED:

Patch: /tmp/28833258/28163235
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/26839277
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/28566910
Reason: This patch is not applicable to this specified target type - "rac_database"

==Following patches were SUCCESSFULLY applied:

Patch: /tmp/28833258/28698356
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_16-38-17PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_16-38-17PM_1.log

Host:rac5-node1
CRS Home:/u01/app/grid
Version:12.2.0.1.0
Summary:

==Following patches were SUCCESSFULLY applied:

Patch: /tmp/28833258/26839277
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_16-41-28PM_1.log

Patch: /tmp/28833258/28163235
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_16-41-28PM_1.log

Patch: /tmp/28833258/28566910
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_16-41-28PM_1.log

Patch: /tmp/28833258/28698356
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_16-41-28PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_16-41-28PM_1.log

Patching session reported following warning(s):
_________________________________________________

[WARNING] The database instance 'db1221' from '/u01/app/oracle/product/12.2.0/dbhome_1', in host'rac5-node1' is not running. SQL changes, if any,  will not be applied.
To apply. the SQL changes, bring up the database instance and run the command manually from any one node (run as oracle).
Refer to the readme to get the correct steps for applying the sql changes.

Following homes are skipped during patching as patches are not applicable:

/u01/app/oracle/product/11.2.0/dbhome_1

/u01/app/oracle/product/12.1.0/dbhome_1

OPatchauto session completed at Sat Feb 23 17:06:45 2019
Time taken to complete the session 28 minutes, 49 seconds
[root@rac5-node1 ~]#

It's done!! OK, let's try node2 of rac5 now... but this time, let's make sure we actually follow the README and add the OPatch location to PATH:

[root@rac5-node2 ~]# . oraenv
ORACLE_SID = [+ASM1] ? +ASM2
The Oracle base has been set to /u01/app/oracle
[root@rac5-node2 ~]# export PATH=$PATH:/u01/app/grid/OPatch    <<<<<<<<<<<<
[root@rac5-node2 ~]# opatchauto apply /tmp/28833258/

OPatchauto session is initiated at Sat Feb 23 17:12:01 2019

System initialization log file is /u01/app/grid/cfgtoollogs/opatchautodb/systemconfig2019-02-23_05-12-08PM.log.

Session log file is /u01/app/grid/cfgtoollogs/opatchauto/opatchauto2019-02-23_05-13-24PM.log
The id for this session is R57L

Executing OPatch prereq operations to verify patch applicability on home /u01/app/grid

Executing OPatch prereq operations to verify patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
Patch applicability verified successfully on home /u01/app/grid

Patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Verifying SQL patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
SQL patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Preparing to bring down database service on home /u01/app/oracle/product/12.2.0/dbhome_1
Successfully prepared home /u01/app/oracle/product/12.2.0/dbhome_1 to bring down database service

Bringing down CRS service on home /u01/app/grid
Prepatch operation log file location: /u01/app/oracle/crsdata/rac5-node2/crsconfig/crspatch_rac5-node2_2019-02-23_05-15-37PM.log
CRS service brought down successfully on home /u01/app/grid

Performing prepatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Perpatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start applying binary patch on home /u01/app/oracle/product/12.2.0/dbhome_1
Binary patch applied successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Performing postpatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Postpatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start applying binary patch on home /u01/app/grid
Binary patch applied successfully on home /u01/app/grid

Starting CRS service on home /u01/app/grid
Postpatch operation log file location: /u01/app/oracle/crsdata/rac5-node2/crsconfig/crspatch_rac5-node2_2019-02-23_05-28-29PM.log
CRS service started successfully on home /u01/app/grid

Preparing home /u01/app/oracle/product/12.2.0/dbhome_1 after database service restarted
No step execution required.........

Trying to apply SQL patch on home /u01/app/oracle/product/12.2.0/dbhome_1
SQL patch applied successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

OPatchAuto successful.

--------------------------------Summary--------------------------------

Patching is completed successfully. Please find the summary as follows:

Host:rac5-node2
RAC Home:/u01/app/oracle/product/12.2.0/dbhome_1
Version:12.2.0.1.0
Summary:

==Following patches were SKIPPED:

Patch: /tmp/28833258/28163235
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/26839277
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/28566910
Reason: This patch is not applicable to this specified target type - "rac_database"

==Following patches were SUCCESSFULLY applied:

Patch: /tmp/28833258/28698356
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_17-17-21PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_17-17-21PM_1.log

Host:rac5-node2
CRS Home:/u01/app/grid
Version:12.2.0.1.0
Summary:

==Following patches were SUCCESSFULLY applied:

Patch: /tmp/28833258/26839277
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_17-21-29PM_1.log

Patch: /tmp/28833258/28163235
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_17-21-29PM_1.log

Patch: /tmp/28833258/28566910
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_17-21-29PM_1.log

Patch: /tmp/28833258/28698356
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_17-21-29PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-02-23_17-21-29PM_1.log

Patching session reported following warning(s):
_________________________________________________

[WARNING] The database instance 'db1222' from '/u01/app/oracle/product/12.2.0/dbhome_1', in host'rac5-node2' is not running. SQL changes, if any,  will not be applied.
To apply. the SQL changes, bring up the database instance and run the command manually from any one node (run as oracle).
Refer to the readme to get the correct steps for applying the sql changes.

Following homes are skipped during patching as patches are not applicable:

/u01/app/oracle/product/11.2.0/dbhome_1

/u01/app/oracle/product/12.1.0/dbhome_1

OPatchauto session completed at Sat Feb 23 17:32:57 2019
Time taken to complete the session 20 minutes, 57 seconds
[root@rac5-node2 ~]#

No errors this time, so I suspect my earlier mistake was forgetting to add the OPatch location to PATH. :)
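The PATH requirement itself is easy to illustrate generically. The sketch below uses a stub script standing in for opatchauto (everything here is a local illustration, not the real tool) to show that `command -v` only resolves the binary once its directory is appended, the same pattern as the `export PATH=$PATH:/u01/app/grid/OPatch` step above:

```shell
# Generic PATH illustration with a stub standing in for opatchauto.
tmpdir=$(mktemp -d)
printf '#!/bin/sh\necho stub\n' > "$tmpdir/opatchauto_stub"
chmod +x "$tmpdir/opatchauto_stub"
command -v opatchauto_stub >/dev/null 2>&1 || echo "not found before PATH update"
export PATH=$PATH:$tmpdir   # same pattern as PATH=$PATH:/u01/app/grid/OPatch
command -v opatchauto_stub >/dev/null 2>&1 && echo "found after PATH update"
```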

Rollback Using Opatchauto

While running the rollback, I wanted to see in more detail how the system reacts, so I have added some notes to the opatchauto output showing how Clusterware relocates resources from one node to another.

## We are running the rollback on rac4-node1, where instance st1221 is running (this is our 12.2 standby)
## Notice that -MGMTDB was also running on node1

[oracle@rac4-node1 ~]$ ps -ef |grep pmon
oracle    4477  2061  0 12:57 pts/1    00:00:00 grep pmon
oracle    5182     1  0 Feb23 ?        00:00:40 asm_pmon_+ASM1
oracle    7538     1  0 Feb23 ?        00:00:42 mdb_pmon_-MGMTDB
oracle    9895     1  0 Feb23 ?        00:00:50 ora_pmon_st1221
[oracle@rac4-node1 ~]$ date
Sat Mar  2 12:57:13 GMT 2019

[root@rac4-node1 ~]# /u01/app/grid/OPatch/opatchauto rollback /tmp/28833258/

OPatchauto session is initiated at Sat Mar  2 12:54:09 2019

System initialization log file is /u01/app/grid/cfgtoollogs/opatchautodb/systemconfig2019-03-02_12-54-15PM.log.

Session log file is /u01/app/grid/cfgtoollogs/opatchauto/opatchauto2019-03-02_12-55-47PM.log
The id for this session is QJKM

Executing OPatch prereq operations to verify patch applicability on home /u01/app/grid

Executing OPatch prereq operations to verify patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
Patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Patch applicability verified successfully on home /u01/app/grid

Verifying SQL patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
Skipping SQL patch step execution on standby database : st122
SQL patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Preparing to bring down database service on home /u01/app/oracle/product/12.2.0/dbhome_1
Successfully prepared home /u01/app/oracle/product/12.2.0/dbhome_1 to bring down database service

## We can see that the process first stopped the st122 instance:
*more info in /u01/app/oracle/crsdata/rac4-node1/crsconfig/crspatch_rac4-node1_2019-03-02_12-57-02AM.log
>  CRS-2673: Attempting to stop 'ora.st122.db' on 'rac4-node1'
>  CRS-2677: Stop of 'ora.st122.db' on 'rac4-node1' succeeded

[oracle@rac4-node1 ~]$ ps -ef |grep pmon
oracle    5182     1  0 Feb23 ?        00:00:40 asm_pmon_+ASM1
oracle    5705  2061  0 12:57 pts/1    00:00:00 grep pmon
oracle    7538     1  0 Feb23 ?        00:00:42 mdb_pmon_-MGMTDB
[oracle@rac4-node1 ~]$ date
Sat Mar  2 12:57:50 GMT 2019

Bringing down CRS service on home /u01/app/grid
Prepatch operation log file location: /u01/app/oracle/crsdata/rac4-node1/crsconfig/crspatch_rac4-node1_2019-03-02_12-57-02AM.log
CRS service brought down successfully on home /u01/app/grid

## The process has now stopped CRS (+ASM) on node1, and the system has started -MGMTDB on node2:

[oracle@rac4-node1 ~]$ ps -ef |grep pmon
oracle   12604  2061  0 13:03 pts/1    00:00:00 grep pmon
[oracle@rac4-node1 ~]$

>  CRS-2677: Stop of 'ora.ctssd' on 'rac4-node1' succeeded
>  CRS-2673: Attempting to stop 'ora.cssd' on 'rac4-node1'
>  CRS-2677: Stop of 'ora.cssd' on 'rac4-node1' succeeded
>  CRS-2673: Attempting to stop 'ora.gipcd' on 'rac4-node1'
>  CRS-2677: Stop of 'ora.gipcd' on 'rac4-node1' succeeded
>  CRS-2793: Shutdown of Oracle High Availability Services-managed resources on 'rac4-node1' has completed
>  CRS-4133: Oracle High Availability Services has been stopped.

>  CRS-2676: Start of 'ora.mgmtdb' on 'rac4-node2' succeeded
>  CRS-2672: Attempting to start 'ora.chad' on 'rac4-node2'

[oracle@rac4-node2 ~]$ ps -ef |grep pmon
oracle    8462     1  0 Feb23 ?        00:00:41 asm_pmon_+ASM2
oracle   11644 11556  0 13:03 pts/0    00:00:00 grep pmon
oracle   22218     1  0 Feb23 ?        00:00:55 ora_pmon_st1222
oracle   32221     1  0 12:58 ?        00:00:00 mdb_pmon_-MGMTDB
[oracle@rac4-node2 ~]$

Performing prepatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Perpatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start rolling back binary patch on home /u01/app/oracle/product/12.2.0/dbhome_1
Binary patch rolled back successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Performing postpatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Postpatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start rolling back binary patch on home /u01/app/grid
Binary patch rolled back successfully on home /u01/app/grid

Starting CRS service on home /u01/app/grid
Postpatch operation log file location: /u01/app/oracle/crsdata/rac4-node1/crsconfig/crspatch_rac4-node1_2019-03-02_01-12-40PM.log
CRS service started successfully on home /u01/app/grid

## CRS is up and -MGMTDB is back on node1

[oracle@rac4-node1 ~]$ ps -ef |grep pmon
oracle   21594     1  0 13:15 ?        00:00:00 asm_pmon_+ASM1
oracle   22486     1  0 13:16 ?        00:00:00 mdb_pmon_-MGMTDB
oracle   25036  2061  0 13:23 pts/1    00:00:00 grep pmon
[oracle@rac4-node1 ~]$

Preparing home /u01/app/oracle/product/12.2.0/dbhome_1 after database service restarted
No step execution required.........

Trying to roll back SQL patch on home /u01/app/oracle/product/12.2.0/dbhome_1
Skipping SQL patch step execution on standby database : st122
SQL patch rolled back successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

OPatchAuto successful.

--------------------------------Summary--------------------------------

Patching is completed successfully. Please find the summary as follows:

Host:rac4-node1
RAC Home:/u01/app/oracle/product/12.2.0/dbhome_1
Version:12.2.0.1.0
Summary:

==Following patches were SKIPPED:

Patch: /tmp/28833258/28163235
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/26839277
Reason: This patch is not applicable to this specified target type - "rac_database"

Patch: /tmp/28833258/28566910
Reason: This patch is not applicable to this specified target type - "rac_database"

==Following patches were SUCCESSFULLY rolled back:

Patch: /tmp/28833258/28698356
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-02-53PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/oracle/product/12.2.0/dbhome_1/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-02-53PM_1.log

Host:rac4-node1
CRS Home:/u01/app/grid
Version:12.2.0.1.0
Summary:

==Following patches were SUCCESSFULLY rolled back:

Patch: /tmp/28833258/28698356
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-07-16PM_1.log

Patch: /tmp/28833258/28163235
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-07-16PM_1.log

Patch: /tmp/28833258/26839277
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-07-16PM_1.log

Patch: /tmp/28833258/28566910
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-07-16PM_1.log

Patch: /tmp/28833258/28790640
Log: /u01/app/grid/cfgtoollogs/opatchauto/core/opatch/opatch2019-03-02_13-07-16PM_1.log

Following homes are skipped during patching as patches are not applicable:

/u01/app/oracle/product/11.2.0/dbhome_1

OPatchauto session completed at Sat Mar  2 13:28:01 2019
Time taken to complete the session 33 minutes, 53 seconds
[root@rac4-node1 ~]#

Opatchauto (well, Clusterware more precisely) will start the databases and services that are set to start automatically. In my case, most of my databases are not set to start automatically, since this is a lab and I don't need all of them running every time I want to check something. :)
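A small aside on the `ps -ef | grep pmon` checks used throughout: the bracket trick below keeps the grep process itself out of the output. The sample `ps` text (taken from the node1 listing above) makes the sketch runnable outside the lab:

```shell
# The pattern '[p]mon_' matches 'pmon_' in instance process names, but the
# grep command line itself contains '[p]mon_', not 'pmon_', so it is excluded.
sample_ps='oracle    5182     1  0 Feb23 ?        00:00:40 asm_pmon_+ASM1
oracle    4477  2061  0 12:57 pts/1    00:00:00 grep pmon'
printf '%s\n' "$sample_ps" | grep '[p]mon_'   # prints only the asm_pmon_+ASM1 line
```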

Opatchauto Generate Steps

When I did the patching manually, I didn't know that Opatchauto has an option to show you the actual steps to run in case you want to do the execution manually. The steps it produces are clear and mention the specific user for each step, so they should be easy to follow. Let's try it out:

[root@rac4-node1 ~]# /u01/app/grid/OPatch/opatchauto apply /tmp/28833258/ -generatesteps

OPatchauto session is initiated at Tue Apr 23 19:01:54 2019

System initialization log file is /u01/app/grid/cfgtoollogs/opatchautodb/systemconfig2019-04-23_07-01-59PM.log.

Session log file is /u01/app/grid/cfgtoollogs/opatchauto/opatchauto2019-04-23_07-03-17PM.log
The id for this session is GNVF

Executing OPatch prereq operations to verify patch applicability on home /u01/app/grid

Executing OPatch prereq operations to verify patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
Patch applicability verified successfully on home /u01/app/grid

Patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Verifying SQL patch applicability on home /u01/app/oracle/product/12.2.0/dbhome_1
Skipping SQL patch step execution on standby database : st122
SQL patch applicability verified successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Preparing to bring down database service on home /u01/app/oracle/product/12.2.0/dbhome_1
Successfully prepared home /u01/app/oracle/product/12.2.0/dbhome_1 to bring down database service

Bringing down CRS service on home /u01/app/grid
Prepatch operation log file location: /u01/app/oracle/crsdata/rac4-node1/crsconfig/crspatch_rac4-node1_2019-04-23_06-45-57PM.log
CRS service brought down successfully on home /u01/app/grid

Performing prepatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Prepatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start applying binary patch on home /u01/app/oracle/product/12.2.0/dbhome_1
Binary patch applied successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Performing postpatch operation on home /u01/app/oracle/product/12.2.0/dbhome_1
Postpatch operation completed successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

Start applying binary patch on home /u01/app/grid
Binary patch applied successfully on home /u01/app/grid

Starting CRS service on home /u01/app/grid
Postpatch operation log file location: /u01/app/oracle/crsdata/rac4-node1/crsconfig/crspatch_rac4-node1_2019-04-23_06-45-57PM.log
CRS service started successfully on home /u01/app/grid

Preparing home /u01/app/oracle/product/12.2.0/dbhome_1 after database service restarted
No step execution required.........

Trying to apply SQL patch on home /u01/app/oracle/product/12.2.0/dbhome_1
Skipping SQL patch step execution on standby database : st122
SQL patch applied successfully on home /u01/app/oracle/product/12.2.0/dbhome_1

OPatchAuto successful.

--------------------------------Summary--------------------------------
Step generation is successful. Steps are generated for the following host:
 [rac4-node1]

## Here is the file we can check for the steps:
File location of the generated step: /u01/app/grid/cfgtoollogs/opatchautodb/2019-04-23-19-03-21/ApplyInstructions.txt

Following homes are skipped during patching as patches are not applicable:

/u01/app/oracle/product/11.2.0/dbhome_1

/u01/app/oracle/product/12.1.0/dbhome_1

OPatchauto session completed at Tue Apr 23 19:03:43 2019
Time taken to complete the session 1 minute, 49 seconds
[root@rac4-node1 ~]#

Now, as indicated above, we get a nice step-by-step guide to execute our patching (I really like the summary of the cluster it creates at the start! :)

[root@rac4-node1 ~]# cat /u01/app/grid/cfgtoollogs/opatchautodb/2019-04-23-19-03-21/ApplyInstructions.txt

Detailed Manual Patch Apply Steps
----------------------------------

Overview of the System
-----------------------

*Important Note:* The following diagrammatically represents the system configuration information collected by Oracle. Oracle recommends that  you carefully examine this data and verify that it is complete and correct. If you see any discrepancies between the graphic and your actual system configuration, {3}do not{4} follow the instructions outlined in this document. Instead, follow the patch installation instructions provided in the patch README .

There are '21' entities in {1} system.

+---Oracle Cluster: rac4
    |
    +---Host: rac4-node1
    |   |
    |   +---Oracle Home: /u01/app/grid (Grid Infrastructure Home)(Version: 12.2.0.1.0)
    |   |   |
    |   |   +---ASM Instance: asm
    |   |   |
    |   |   +---High Availability Service: has_rac4-node1
    |   |
    |   +---Oracle Home: /u01/app/oracle/product/12.1.0/dbhome_1 (Database Home)(Version: 12.1.0.2.0)
    |   |   |
    |   |   +---RAC DB Instance: cdbst1211
    |   |
    |   +---Oracle Home: /u01/app/oracle/product/11.2.0/dbhome_1 (Database Home)(Version: 11.2.0.4.0)
    |   |   |
    |   |   +---RAC DB Instance: st1121
    |   |
    |   +---Oracle Home: /u01/app/oracle/product/12.2.0/dbhome_1 (Database Home)(Version: 12.2.0.1.0)
    |       |
    |       +---RAC DB Instance: st1221
    |
    +---Host: rac4-node2
        |
        +---Oracle Home: /u01/app/grid (Grid Infrastructure Home)(Version: 12.2.0.1.0)
        |   |
        |   +---ASM Instance: asm
        |   |
        |   +---High Availability Service: has_rac4-node2
        |
        +---Oracle Home: /u01/app/oracle/product/12.1.0/dbhome_1 (Database Home)(Version: 12.1.0.2.0)
        |   |
        |   +---RAC DB Instance: cdbst1212
        |
        +---Oracle Home: /u01/app/oracle/product/11.2.0/dbhome_1 (Database Home)(Version: 11.2.0.4.0)
        |   |
        |   +---RAC DB Instance: st1122
        |
        +---Oracle Home: /u01/app/oracle/product/12.2.0/dbhome_1 (Database Home)(Version: 12.2.0.1.0)
            |
            +---RAC DB Instance: st1222

   Step .1:

         As the 'oracle' user on the host 'rac4-node1' run the following command:

         [oracle@rac4-node1]$
         /u01/app/oracle/product/12.2.0/dbhome_1/OPatch/opatchauto  apply TEMP_PATCH_LOC -oh /u01/app/oracle/product/12.2.0/dbhome_1 -target_type rac_database -binary -invPtrLoc /u01/app/grid/oraInst.loc -jre /u01/app/grid/OPatch/jre -persistresult /u01/app/oracle/product/12.2.0/dbhome_1/OPatch/auto/dbsessioninfo/sessionresult_analyze_rac4-node1_rac.ser -analyze -online -prepare_home

      Troubleshoot:

   Step .2: Run rootcrs.pl -prepatch

      As a 'root' user, run rootcrs.pl -prepatch

         As the 'root' user on the host 'rac4-node1' run the following command:

         [root@rac4-node1]#
         /u01/app/grid/perl/bin/perl -I/u01/app/grid/perl/lib -I/u01/app/grid/OPatch/auto/dbtmp/bootstrap_rac4-node1/patchwork/crs/install /u01/app/grid/OPatch/auto/dbtmp/bootstrap_rac4-node1/patchwork/crs/install/rootcrs.pl -prepatch

   Step .3: Apply Patch to Database Home

      On rac4-node1 Apply the patch to Database Home

         As the 'oracle' user on the host 'rac4-node1' run the following command:

         [oracle@rac4-node1]$
         /tmp/28833258//28698356/custom/scripts/prepatch.sh -dbhome /u01/app/oracle/product/12.2.0/dbhome_1

   Step .4:

         As the 'oracle' user on the host 'rac4-node1' run the following command:

         [oracle@rac4-node1]$
         /u01/app/oracle/product/12.2.0/dbhome_1/OPatch/opatchauto  apply TEMP_PATCH_LOC -oh /u01/app/oracle/product/12.2.0/dbhome_1 -target_type rac_database -binary -invPtrLoc /u01/app/grid/oraInst.loc -jre /u01/app/grid/OPatch/jre -persistresult /u01/app/oracle/product/12.2.0/dbhome_1/OPatch/auto/dbsessioninfo/sessionresult_rac4-node1_rac.ser -analyzedresult /u01/app/oracle/product/12.2.0/dbhome_1/OPatch/auto/dbsessioninfo/sessionresult_analyze_rac4-node1_rac.ser

      Troubleshoot:

   Step .5: Apply Patch to Database Home

      On rac4-node1 Apply the patch to Database Home

         As the 'oracle' user on the host 'rac4-node1' run the following command:

         [oracle@rac4-node1]$
         /tmp/28833258//28698356/custom/scripts/postpatch.sh -dbhome /u01/app/oracle/product/12.2.0/dbhome_1

   Step .6:

         As the 'oracle' user on the host 'rac4-node1' run the following command:

         [oracle@rac4-node1]$
         /u01/app/grid/OPatch/opatchauto  apply TEMP_PATCH_LOC -oh /u01/app/grid -target_type cluster -binary -invPtrLoc /u01/app/grid/oraInst.loc -jre /u01/app/grid/OPatch/jre -persistresult /u01/app/grid/OPatch/auto/dbsessioninfo/sessionresult_rac4-node1_crs.ser -analyzedresult /u01/app/grid/OPatch/auto/dbsessioninfo/sessionresult_analyze_rac4-node1_crs.ser

      Troubleshoot:

   Step .7: Run rootadd_rdbms.sh

      As a 'root' user, run rootadd_rdbms.sh

         As the 'root' user on the host 'rac4-node1' run the following command:

         [root@rac4-node1]#
         /u01/app/grid/rdbms/install/rootadd_rdbms.sh

   Step .8: Run rootcrs.pl -postpatch

      As a 'root' user, run rootcrs.pl -postpatch

Important Note: If the command fails with following error messages, reboot the host and re-try the same step.
CLSRSC-400: A system reboot is required to continue installing

         As the 'root' user on the host 'rac4-node1' run the following command:

         [root@rac4-node1]#
         /u01/app/grid/perl/bin/perl -I/u01/app/grid/perl/lib -I/u01/app/grid/OPatch/auto/dbtmp/bootstrap_rac4-node1/patchwork/crs/install /u01/app/grid/OPatch/auto/dbtmp/bootstrap_rac4-node1/patchwork/crs/install/rootcrs.pl -postpatch

[root@rac4-node1 ~]#

Datapatch

It is important to execute Datapatch (or "catbundle.sql" in 11g) once you complete the patching, so the internal packages and objects are updated. Opatchauto should do it for you (or that is my understanding), but always double-check by querying "registry$sqlpatch" in your DB.

Datapatch runs against any database that is up at that moment and running from the Oracle home you execute it from. So if a database sharing that DB_HOME is down at that point, remember to run Datapatch again once that database is brought back up.
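A quick way to do that check is through the public view on top of "registry$sqlpatch". A sketch, assuming 12c (in 12.1/12.2 the view is DBA_REGISTRY_SQLPATCH; run it in each database, and per PDB where relevant):

```sql
-- Verify the RU's SQL changes were actually applied to this database
SELECT patch_id, action, status, action_time
FROM   dba_registry_sqlpatch
ORDER  BY action_time;
```

If the patch shows with STATUS = SUCCESS for an APPLY action, the SQL portion is in place; if it is missing, run datapatch from the patched home.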

[oracle@rac5-node1 ~]$ /u01/app/oracle/product/12.2.0/dbhome_1/OPatch/datapatch -verbose
SQL Patching tool version 12.2.0.1.0 Production on Sat Feb 23 17:43:36 2019
Copyright (c) 2012, 2018, Oracle.  All rights reserved.

Log file for this invocation: /u01/app/oracle/cfgtoollogs/sqlpatch/sqlpatch_26702_2019_02_23_17_43_36/sqlpatch_invocation.log

Connecting to database...OK
Note:  Datapatch will only apply or rollback SQL fixes for PDBs
       that are in an open state, no patches will be applied to closed PDBs.
       Please refer to Note: Datapatch: Database 12c Post Patch SQL Automation
       (Doc ID 1585822.1)
Bootstrapping registry and package to current versions...done
Determining current state...done

Current state of SQL patches:
Bundle series 12.2.0.1.190115DBJUL2018RUR:
  ID 190115 in the binary registry and not installed in any PDB

Adding patches to installation queue and performing prereq checks...
Installation queue:
  For the following PDBs: CDB$ROOT PDB$SEED PDB1
    Nothing to roll back
    The following patches will be applied:
      28790640 (DATABASE JUL 2018 RELEASE UPDATE REVISION 12.2.0.1.190115)

Installing patches...
 Patch installation complete.  Total patches installed: 3

Validating logfiles...
Patch 28790640 apply (pdb CDB$ROOT): SUCCESS
  logfile: /u01/app/oracle/cfgtoollogs/sqlpatch/28790640/22633787/28790640_apply_DB122_CDBROOT_2019Feb23_17_45_34.log (no errors)
Patch 28790640 apply (pdb PDB$SEED): SUCCESS
  logfile: /u01/app/oracle/cfgtoollogs/sqlpatch/28790640/22633787/28790640_apply_DB122_PDBSEED_2019Feb23_17_50_41.log (no errors)
Patch 28790640 apply (pdb PDB1): SUCCESS
  logfile: /u01/app/oracle/cfgtoollogs/sqlpatch/28790640/22633787/28790640_apply_DB122_PDB1_2019Feb23_17_50_41.log (no errors)
SQL Patching tool complete on Sat Feb 23 17:55:11 2019
[oracle@rac5-node1 ~]$

I think that's enough patching for today...
