Oracle7 Administrator's Reference for UNIX


Performing Backups

This section provides sample scripts for backing up your database on a UNIX system. The main script, dbbackup, reads the schedule from the dbbackup_sched.dat file to determine the type of backup to be performed. Then dbbackup calls other procedures to take a cold backup, a hot backup, or an export of the database.

These examples are provided to give an idea of the scripts you will need to develop for your system. You will need to customize the scripts to suit your own environment and requirements. Note also that many variables are hard-coded in the scripts: file names and extensions, for example. Your files may be named differently.

The dbbackup procedure does the following:
- Sets the environment and builds dated log, error, and message file names for the database named on the command line.
- Reads dbbackup_sched.dat to find the backup type, export type, special task, and post-backup task scheduled for the current day of the week.
- Calls dbbackup_begin if a hot or cold backup is scheduled, and dbexport_begin if an export is scheduled.
- Scans the log and error files for errors and warnings, mails the results to the DBA mail list, runs a space report, and starts the post-backup task.

The dbbackup_begin procedure does the following:
- Verifies that the database is online and, for a cold backup, broadcasts shutdown warnings, shuts the database down, and restarts it in restricted mode.
- Builds a dynamic parameter file that lists each data file and its backup directory.
- Deletes the previous backup files, then copies the data files (placing each tablespace in backup mode during a hot backup), control files, online redo logs, and archived redo logs to the backup directories.
- For a cold backup, runs the analyze script and any special task while the database is in restricted mode, then restarts the database for normal use.

The dbexport_begin procedure does the following:
- Verifies that the database is online.
- Deletes the previous export file and moves the current export file to the old export directory.
- Runs Export (exp) using the export parameter file.
- Lists the new export files.

Special Tasks

The dbbackup_begin procedure runs the SPECIAL_TASK script while the database is in DBA (restricted) mode. The use of this feature will expand over time. To omit the argument, leave it blank or use a hyphen (-).
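
In dbbackup_begin the argument is interpreted as follows; a blank value or a hyphen means that no special task is run:

if [ "$3" -a "$3" != "-" ] 
then SPECIAL_TASK=$3 
else SPECIAL_TASK=" " 
fi 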

Post Backup Tasks

After the backup and export complete and the results have been mailed, dbbackup starts the script designated by POST_BACKUP_TASK in the background.
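
Because the post-backup task runs in the background and dbbackup does not wait for it, the task should be a self-contained executable script. The following sketch is loosely modeled on the push_export.sh entries in the sample schedule file; the host name, directories, and the use of rcp are assumptions to replace with your own transport method.

#! /bin/sh 
# Hypothetical post-backup task: copy the night's export file to a standby host. 
DBNAME=ITS 
EXPORTDIR=/export/ITS                # assumed export directory 
REMOTE_HOST=standby1                 # assumed standby host 
REMOTE_DIR=/export/ITS               # assumed directory on the standby host 
if [ -f $EXPORTDIR/${DBNAME}.exp ]; then 
  rcp $EXPORTDIR/${DBNAME}.exp ${REMOTE_HOST}:${REMOTE_DIR} 
else 
  echo "push_export: no export file found in $EXPORTDIR" 
fi 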

Required Files

The following files are required for this backup routine:

File Name Description
$TOOLS/db_mgmt/backup/dbbackup main routine
$TOOLS/db_mgmt/backup/dbbackup_begin called by dbbackup
$TOOLS/db_mgmt/backup/dbexport_begin called by dbbackup
$TOOLS/db_mgmt/backup/dbbackup_sched.dat schedule file
$TOOLS/system/crontab.dat crontab schedule
$DBNAME/tools/backup/dbname_backup_admin.sh environment variables
$DBNAME/tools/backup/dbname_backup_date.dyn dynamic sql written by dbbackup_begin
$DBNAME/tools/log/dbname_backup_date.log log written by dbbackup
$DBNAME/tools/log/dbname_backup_date.err error log written by dbbackup
$DBNAME/tools/log/dbname_backup_date.msg email message written by dbbackup
$DBNAME/tools/backup/dbname_export.par export parameter file
$DBNAME/log/dbbackup.log crontab log
Table 4 - 1. Files Required for dbbackup Routine
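
The ${DBNAME}_backup_admin.sh file is sourced by dbbackup, dbbackup_begin, and dbexport_begin and supplies the backup directories, archive log directories, export directories, and reporting accounts that the scripts reference. A hypothetical version is shown below; the variable names are the ones the sample scripts use, but every path and account is an assumption to replace with your own layout.

# name    /db_admin/db_ITS/tools/backup/ITS_backup_admin.sh  (hypothetical) 
BACKUPDIR1=/backup1/ITS;        export BACKUPDIR1 
BACKUPDIR2=/backup2/ITS;        export BACKUPDIR2 
# ... define BACKUPDIR3 through BACKUPDIR12 the same way 
CONBACK1=/backup1/ITS/control;  export CONBACK1      # hot backup control file copy 
ARC=/arch/ITS;                  export ARC           # current archive log destination 
ARCOLD=/backup1/ITS/arch;       export ARCOLD        # previous backup's archive logs 
EXPORTDIR=/export/ITS;          export EXPORTDIR 
EXPORTDIROLD=/export/ITS/old;   export EXPORTDIROLD 
ACCT1=scott;                    export ACCT1         # accounts passed to space_report 
ACCT2=blake;                    export ACCT2 
OMAIL=false;                    export OMAIL         # "true" enables the Oracle*Mail steps 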

Creating UNIX Scripts

When creating UNIX scripts, follow the general conventions used by the sample scripts in this section: begin each script with a standard header comment block, check the parameters explicitly, and write output to dated log and error files.
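
For example, a minimal skeleton that follows these conventions (the script name and paths are placeholders):

#! /bin/sh 
# name		$TOOLS/db_mgmt/example_script 
# 
# purpose	One-line description of what the script does. 
# 
# usage		$TOOLS/db_mgmt/example_script dbname 
# 
# parameters	$1=dbname 
# 
if [ "$1" ] 
then DBNAME=$1 
else echo "example_script: syntax error, parameter=dbname" 
     exit 1 
fi 
LOGFILE="/db_admin/db_$DBNAME/tools/log/${DBNAME}_example_`date '+%y%m%d'`.log" 
echo "example_script: began at `date`" >> $LOGFILE 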

dbbackup

Start the dbbackup script with your database name as follows:

dbbackup dbname
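
In the sample configuration the script is normally started by cron through the crontab schedule listed in Table 4 - 1. A hypothetical crontab entry (the run time, database name, and paths are placeholders based on the sample layout) might be:

# Take the ITS backup every night at 2:00 a.m. and append cron output to the crontab log. 
0 2 * * * /db_admin/tools/db_mgmt/backup/dbbackup ITS >> /db_admin/db_ITS/log/dbbackup.log 2>&1 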

The sample script contains:

#! /bin/sh 
# name		$TOOLS/db_mgmt/backup/dbbackup 
# 
# purpose	Perform a backup of a database. 
# 
# usage		$TOOLS/db_mgmt/backup/dbbackup dbname 
#				Calls $TOOLS/db_mgmt/backup/dbbackup_begin 
# 
# parameters	$1=dbname 
# 
# set environment 
# ..........................................................
. /db_admin/tools/system/crontab.env >> /dev/null 
# .............................................................. 
# set local variables 
# ...........................................................
BEGIN_JOB="`date`" 
ERRMSG='$TOOLS/db_mgmt/backup/dbbackup: syntax error, parameter=dbname' 
if [ "$1" ] 
then DBNAME=$1 
else echo $ERRMSG 
     exit 
fi  
LOGFILE="/db_admin/db_$DBNAME/tools/log/${DBNAME}_backup_`date '+%y%m%d'`.log" 
LOGFILE2="/db_admin/db_$DBNAME/tools/log/${DBNAME}_backup_`date '+%y%m%d'`_old.log" 
ERRFILE="/db_admin/db_$DBNAME/tools/log/${DBNAME}_backup_`date '+%y%m%d'`.err" 
ERRFILE2="/db_admin/db_$DBNAME/tools/log/${DBNAME}_backup_`date '+%y%m%d'`_old.err" 
MSGFILE="/db_admin/db_$DBNAME/tools/log/${DBNAME}_backup_`date '+%y%m%d'`.msg" 
ADMIN_FILE="/db_admin/db_$DBNAME/tools/backup/${DBNAME}_backup_admin.sh" 
SCHED_FILE="/db_admin/tools/db_mgmt/backup/dbbackup_sched.dat" 
JOBNAME="$TOOLS/db_mgmt/backup/dbbackup" 
DBBACKUP_BEGIN="$TOOLS/db_mgmt/backup/dbbackup_begin" 
DBEXPORT_BEGIN="$TOOLS/db_mgmt/backup/dbexport_begin" 
TODAY="`date`" 
THIS_DAY="`date '+%a'`" 
MSG="$DBNAME Backup succeeded at `date`"  
# .............................................................. 
# begin 
# .............................................................. 
if [ -f "$LOGFILE" ]; then 
  #  
  # Save old file 
  # 
  cat $LOGFILE >> $LOGFILE2 
fi 
if [ -f "$ERRFILE" ]; then 
  # 
  # Save old file 
  # 
  cat $ERRFILE >> $ERRFILE2 
fi 
$TOOLS/system/script_header $JOBNAME > $MSGFILE 
$TOOLS/system/script_header $JOBNAME > $LOGFILE 
# 
# Read backup schedule 
# 
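# Each schedule record contains six whitespace-separated fields: 
#   dbname  day_of_week  backup(hot|cold|nobackup)  export(export|noexport) 
#   special_task(or -)  post_backup_task 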
awk -v dbname=$DBNAME -v this_day=$THIS_DAY '{ 
    # 
    # get a record 
    # 
    cmd=$0 
    sizeofarray=split(cmd,rec," ") 
    dbname2=rec[1] 
    day_of_week=rec[2] 
    backup=rec[3] 
    export=rec[4] 
    special_task=rec[5] 
    post_backup_task=rec[6] 
 
    if (( dbname2 == dbname ) && ( this_day == day_of_week ))  
	print " " backup " " export " " special_task " " post_backup_task  
}' $SCHED_FILE | while read BACKUP EXPORT SPECIAL_TASK POST_BACKUP_TASK 
do 
  . /db_admin/db_$DBNAME/.orauser_$DBNAME 
  # 
  # Parameters 
  # 
  PARAMETER_MSG=" 
\n............................................................... 
\nBackup Job Parameters: 
\n 
\nDatabase Name = $DBNAME 
\nBackup Type   = $BACKUP 
\nExport Type   = $EXPORT 
\nSpecial Task  = $SPECIAL_TASK 
\nPost Backup   = $POST_BACKUP_TASK 
\n 
\nEnvironment Variables: 
\nORACLE_HOME   = $ORACLE_HOME 
\nORACLE_SID    = $ORACLE_SID 
\nORACLE_BASE   = $ORACLE_BASE 
\nPATH          = $PATH 
\n............................................................... 
\n 
" 
  echo $PARAMETER_MSG >> $LOGFILE 2> $ERRFILE 
  echo $PARAMETER_MSG >> $MSGFILE 
  bdf >> $LOGFILE 2> $ERRFILE 
  echo " " >> $LOGFILE 2> $ERRFILE 
 
  # 
  # Backup 
  #  
  if [ "$BACKUP" != "nobackup" ]; then 
    echo "..................................." >> $LOGFILE 2>> $ERRFILE 
    echo "Begin backup at `date`" >> $LOGFILE 2>> $ERRFILE 
    echo "..................................." >> $LOGFILE 2>> $ERRFILE 
    $DBBACKUP_BEGIN $DBNAME $BACKUP $SPECIAL_TASK >> $LOGFILE 2>> $ERRFILE 
    echo "..................................." >> $LOGFILE 2>> $ERRFILE 
    echo "End backup at `date`" >> $LOGFILE 2>> $ERRFILE 
    echo "..................................." >> $LOGFILE 2>> $ERRFILE 
  fi 
 
  # 
  # Export 
  # 
  if [ "$EXPORT" != "noexport" ]; then 
    echo "..................................." >> $LOGFILE 2>> $ERRFILE 
    echo "Begin export at `date`" >> $LOGFILE 2>> $ERRFILE 
    echo "..................................." >> $LOGFILE 2>> $ERRFILE 
    $DBEXPORT_BEGIN $DBNAME $EXPORT $SPECIAL_TASK >> $LOGFILE 2>> $ERRFILE 
    echo "..................................." >> $LOGFILE 2>> $ERRFILE 
    echo "End export at `date`" >> $LOGFILE 2>> $ERRFILE 
    echo "..................................." >> $LOGFILE 2>> $ERRFILE 
  fi 
 
  echo " " >> $MSGFILE 
  echo "Backup log file errors and warnings:" >> $MSGFILE 
  
  echo " " >> $LOGFILE 2>> $ERRFILE 
  bdf >> $LOGFILE 2>> $ERRFILE 
  echo " " >> $LOGFILE 2>> $ERRFILE 
 
  # 
  # Errors 
  # 
  grep -e error -e warning -e ORA- -e EXP- -e fatal $LOGFILE | grep -v "No errors." >> $MSGFILE  
  ERRCNT=`grep -e error -e ORA- -e EXP- -e fatal $LOGFILE | grep -c -v "No errors."`  
  grep -e error -e warning -e ORA- -e EXP- -e fatal $ERRFILE | grep -v "Export terminated successfully" >> $MSGFILE 
  ERRCNT2=`grep -e error -e ORA- -e EXP- -e fatal $ERRFILE | grep -c -v "Export terminated successfully"` 
  END_JOB="`date`" 
  if [ "$ERRCNT" -gt 0 -o "$ERRCNT2" -gt 0 ] 
  then MSG="**ALERT** $DBNAME backup failed at ${END_JOB}" 
  else MSG="$DBNAME backup succeeded at ${END_JOB}" 
  fi 
 
  echo " " >> $MSGFILE 
  echo "Log files: " >> $MSGFILE 
  echo "Log file=$LOGFILE" >> $MSGFILE 
  echo "Error file=$ERRFILE" >> $MSGFILE 
  echo "Message file=$MSGFILE" >> $MSGFILE  
 
  $TOOLS/templates/script.footer "$BEGIN_JOB" "$END_JOB" >> $MSGFILE 
  $TOOLS/templates/script.footer "$BEGIN_JOB" "$END_JOB" >> $LOGFILE 
 
  # 
  # Mail message 
  # 
  $TOOLS/mail/dba_mail_list "$MSG" $MSGFILE $DBNAME 0 
 
  # 
  # Space Report 
  # 
  . $ADMIN_FILE 
  $TOOLS/sql/space_report $DBNAME $ACCT1 $ACCT2 
 
  # 
  # Post Backup Task 
  # 
  if [ "$POST_BACKUP_TASK" -a "$POST_BACKUP_TASK" != "-" ]; then 
    $POST_BACKUP_TASK & 
  fi 
done 

dbbackup_sched.dat
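
Each record supplies, in order: the database name, the day of the week, the backup type (hot, cold, or nobackup), the export type (export or noexport), the special task script (or a hyphen for none), and the post-backup task script. The last two fields may be omitted when no task is scheduled, as in the ITSDEMO entries. For example, a hypothetical record for a database named PROD that takes a cold backup and an export each Friday and then runs a post-backup script would be:

PROD Fri cold export - /db_admin/db_PROD/tools/backup/post_backup.sh 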

ITS Sun hot export - /db_admin/db_ITS/tools/disaster_recovery/push_export.sh 
ITS Mon hot export - /db_admin/db_ITS/tools/disaster_recovery/push_export.sh 
ITS Tue hot export - /db_admin/db_ITS/tools/disaster_recovery/push_export.sh 
ITS Wed hot export - /db_admin/db_ITS/tools/disaster_recovery/push_export.sh 
ITS Thu hot export - /db_admin/db_ITS/tools/disaster_recovery/push_export.sh 
ITS Fri cold export /dbadev/ITS/db_management/ITS_restrict1.sh /db_admin/db_ITS/tools/disaster_recovery/push_export.sh 
ITS Sat nobackup export - /db_admin/db_ITS/tools/disaster_recovery/push_export.sh 
ITSDEMO Sun nobackup noexport 
ITSDEMO Mon nobackup noexport 
ITSDEMO Tue nobackup noexport 
ITSDEMO Wed nobackup noexport 
ITSDEMO Thu nobackup noexport 
ITSDEMO Fri nobackup noexport 
ITSDEMO Sat nobackup export 
itstest Sun hot export 
itstest Mon hot export 
itstest Tue hot export 
itstest Wed hot export 
itstest Thu hot export 
itstest Fri cold export 
itstest Sat cold export 
PREPO Sun hot export - /dbadev/PR_server/bin/clear_unused_slots.sh 
PREPO Mon hot export - /dbadev/PR_server/bin/clear_unused_slots.sh 
PREPO Tue hot export - /dbadev/PR_server/bin/clear_unused_slots.sh 
PREPO Wed hot export - /dbadev/PR_server/bin/clear_unused_slots.sh 
PREPO Thu hot export - /dbadev/PR_server/bin/clear_unused_slots.sh 
PREPO Fri cold export /dbadev/PR/db_management/PR_restrict1.sh /dbadev/PR_server/bin/clear_unused_slots.sh 
PREPO Sat nobackup export 
PRTEST Sun hot noexport 
PRTEST Mon hot export 
PRTEST Tue hot export 
PRTEST Wed hot export 
PRTEST Thu hot export 
PRTEST Fri cold export 
PRTEST Sat cold export 

dbbackup_begin

The dbbackup_begin procedure is called by dbbackup if a hot or cold backup is scheduled.
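
dbbackup passes the database name, the backup type from the schedule file, and the special task field (or a hyphen), so a manual test invocation looks like the following; the database name is a placeholder:

$TOOLS/db_mgmt/backup/dbbackup_begin PROD hot - 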

#! /bin/sh 
# name		$TOOLS/db_mgmt/backup/dbbackup_begin 
# 
# purpose	Perform a backup of a database. 
# 
# usage		$TOOLS/db_mgmt/backup/dbbackup_begin dbname <backup>  
#			<special task> 
# 
# parameters	$1=dbname 
#		$2=backup type 
#		$3=special task 
# 

# .............................................................. 
# set local variables 
# .............................................................. 
ERRMSG=' 
$TOOLS/db_mgmt/backup/dbbackup_begin: syntax error:  
    dbbackup_begin dbname <hot|cold|nobackup> <special task>. 
' 
DBF_REPORT="$TOOLS/db_mgmt/dbf_report " 
 
# 
# parameters 
# 
if [ "$1" ] 
then DBNAME=$1 
else echo $ERRMSG 
     exit 1  
fi 
if [ "$2" ] 
then BACKUP=$2 
else echo $ERRMSG 
     exit 1 
fi 
if [ "$3" -a "$3" != "-" ] 
then SPECIAL_TASK=$3 
else SPECIAL_TASK=" " 
fi 
 
# 
# booleans 
#  
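# Shell convention: 0 represents true (success) and 1 represents false. 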
TRUE=0 
FALSE=1 
SHUTDOWN_FAILED_B=1 
RESTART_FAILED_B=1 
OMAIL_B="false" 
 
# 
# local variables 
# 
JOBNAME="$TOOLS/db_mgmt/backup/dbbackup_begin" 
JOBNAME_SHORT="dbbackup_begin" 
ADMIN_FILE="/db_admin/db_$DBNAME/tools/backup/${DBNAME}_backup_admin.sh"  
DBBACKUP="/db_admin/db_$DBNAME/tools/backup/${DBNAME}_backup_`date '+%y%m%d'`.dyn" 
WALL="/etc/wall" 
CKSUM="/bin/cksum" 
CMP="/bin/cmp" 
ANALYZE="$TOOLS/sql/analyze" 
ANALYZE_CREATE="$TOOLS/sql/analyze_create" 
BANNER="/db_admin/db_${DBNAME}/banner/status" 
CKSUM_SIZE_ERR="${JOBNAME_SHORT}: fatal error in cksum size comparison." 
CKSUM_VALUE_ERR="${JOBNAME_SHORT}: fatal error in cksum value comparison." 
CKSUM_VALUE_WAR="${JOBNAME_SHORT}: warning in cksum value comparison." 
CMP_ERR="${JOBNAME_SHORT}: fatal error in cmp." 
DBSERR="${JOBNAME_SHORT}: fatal error in dbs file copy." 
DBSWAR="${JOBNAME_SHORT}: warning with database file copy." 
ARCERR="${JOBNAME_SHORT}: fatal error in archive log copy." 
if [ "$OMAIL" = "true" ] 
then OMAIL_B="true" 
else OMAIL_B="false" 
fi 
THISNODE=`uname -n` 
STATFS="$TOOLS/source/system/statfs.exe " 
 
# 
# node specific logic 
# 
if [ "$THISNODE" = "prodhp1" ] 
then TMP='/bugtmp' 
else TMP='/dbatmp' 
fi 
 
# .............................................................. 
# begin 
# .............................................................. 
 
# 
# run the orauser 
# 
. /db_admin/db_$DBNAME/.orauser_$DBNAME 
. $ADMIN_FILE 
 
# 
# check for database online 
# 
STATUS=`ps -fu oracle | awk '{ print $8" "$9" "$10" " }' | grep "$DBNAME " | grep ora_ | grep -v grep` 
if [ $? != 0 ]; then 
  if [ "$BACKUP" = "hot" ]; then 
    echo "${JOBNAME_SHORT}: Error - database is not online." 
    echo "${JOBNAME_SHORT}: process listing is to follow..." 
    echo "${JOBNAME_SHORT}: ps -fu oracle | awk '{ print $8" "$9" "$10" " }' | grep -v grep | grep "$DBNAME " | grep ora_" 
    ps -fu oracle | awk '{ print $8" "$9" "$10" " }' | grep -v grep | grep "$DBNAME " | grep ora_ 
    echo "${JOBNAME_SHORT}: exiting." 
    exit  
  else 
    echo "${JOBNAME_SHORT}: Database is already down.  Continuing." 
    echo "${JOBNAME_SHORT}: kill sqlnet v1 processes." 
    $TOOLS/unix/kill_processes.sh oracle${DBNAME} 
  fi 
else  
  if [ "$BACKUP" = "cold" ]; then 
    #  
    # broadcast shutdowns 
    # 
    $WALL /db_admin/db_${DBNAME}/banner/${DBNAME}_shutdown_15min.banner 
    sleep 600  
    $WALL /db_admin/db_${DBNAME}/banner/${DBNAME}_shutdown_5min.banner 
    sleep 240 
    $WALL /db_admin/db_${DBNAME}/banner/${DBNAME}_shutdown_1min.banner 
    sleep 60 
    #  
    # shutdown 
    # 
    echo "${JOBNAME_SHORT}: Shutting down abort." 
    /db_admin/db_${DBNAME}/sql/shutdown_abort_${DBNAME}.sh 
    # 
    # kill sqlnet processes 
    # 
    echo "${JOBNAME_SHORT}: kill sqlnet v1 processes." 
    $TOOLS/unix/kill_processes.sh oracle${DBNAME} 
  fi  
fi 
 
if [ "$BACKUP" = "cold" ]; then 
  echo "${JOBNAME_SHORT}: Starting up restrict." 
  /db_admin/db_${DBNAME}/sql/startup_restrict_${DBNAME}.sh 
fi 
 
# ................................................................. 
# begin backup 
# ................................................................. 
# 
# build database file list 
# 
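# Each line written to $DBBACKUP has the form: 
#   tablespace_name   file_name   $BACKUPDIRn 
# where n is taken from the digits in the file's /dbfN directory name. 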
echo "${JOBNAME_SHORT}: building dynamic parameter file." 
sqlplus -s / > $DBBACKUP <<EOF 
set pagesize 0 
set linesize 2048 
set heading off 
set feedback off 
column TNAME format a20 
column FNAME format a80 
select tablespace_name TNAME,  
       file_name FNAME, 
       ' \$BACKUPDIR'||  
       substr(file_name,instr(translate(file_name,'1234567890'
,'0000000000'),'0'), 
              instr(file_name,'/',1,2)- instr(translate(file_name,'1234567890', 
              '0000000000'),'0')) 
	from sys.dba_data_files 
        order by tablespace_name,file_name; 
exit 
EOF 
 
# 
# Check dynamic file for size 
# 
DYNSIZE=`$STATFS $DBBACKUP st_size` 
if [ $DYNSIZE = 0 ]; then  
  echo "${JOBNAME_SHORT}: fatal error during backup file creation.  Backup aborting." 
  echo "${JOBNAME_SHORT}: cat $DBBACKUP" 
  cat $DBBACKUP 
  exit 1 
fi 
 
# 
# Get file list 
# 
echo " " 
echo "${JOBNAME_SHORT}: Print before backup file listing." 
$DBF_REPORT $DBNAME 
 
# 
# shutdown 
# 
if [ $BACKUP = "cold" ]; then 
  # 
  # Build analyze script 
  # 
  echo " " 
  echo "${JOBNAME_SHORT}: Building analyze script." 
  $ANALYZE_CREATE $DBNAME 
 
  echo "${JOBNAME_SHORT}: Shutting down immediate."   
  . /db_admin/db_${DBNAME}/sql/shutdown_${DBNAME}.sh 
 
  STATUS=`ps -fu oracle | awk '{ print $8" "$9" "$10" " }' | grep -v grep | grep "$DBNAME " | grep -v ${DBNAME}1 | grep ora_` 
  if [ $? = 0 ]; then 
    echo "${JOBNAME_SHORT}: error in shutdown. Cold backup aborting." 
    SHUTDOWN_FAILED_B="$TRUE" 
  else 
    echo "${JOBNAME_SHORT}: Database is shutdown." 
    if [ "$OMAIL_B" = "true" ]; then 
      . $TOOLS/mail/move_mail_logs 
      echo "${JOBNAME_SHORT}: kill OMail processes." 
      $TOOLS/unix/kill_processes.sh msync 
      $TOOLS/unix/kill_processes.sh mpost 
      $TOOLS/unix/kill_processes.sh mcollect 
      $TOOLS/unix/kill_processes.sh mremote 
    fi 
    echo "${JOBNAME_SHORT}: move alert log." 
    mv /db_admin/db_${DBNAME}/bdump/alert_${DBNAME}.log \ 
    /db_admin/db_${DBNAME}/bdump/alert_${DBNAME}.log_`date '+%y%m%d'` 
  fi 
fi 
 
# 
# delete last backup files 
# 
if [ $SHUTDOWN_FAILED_B = $TRUE ]; then 
  echo "${JOBNAME_SHORT}: Skipping backup file deletion." 
else 
  echo " " 
  echo "${JOBNAME_SHORT}: Deleting previous backup..." 
  if [ -f $BACKUPDIR1/${DBNAME}_*.dbf ]; then rm $BACKUPDIR1/${DBNAME}_*.dbf; fi 
  if [ -f $BACKUPDIR2/${DBNAME}_*.dbf ]; then rm $BACKUPDIR2/${DBNAME}_*.dbf; fi 
  if [ -f $BACKUPDIR3/${DBNAME}_*.dbf ]; then rm $BACKUPDIR3/${DBNAME}_*.dbf; fi 
  if [ -f $BACKUPDIR4/${DBNAME}_*.dbf ]; then rm $BACKUPDIR4/${DBNAME}_*.dbf; fi 
  if [ -f $BACKUPDIR5/${DBNAME}_*.dbf ]; then rm $BACKUPDIR5/${DBNAME}_*.dbf; fi 
  if [ -f $BACKUPDIR6/${DBNAME}_*.dbf ]; then rm $BACKUPDIR6/${DBNAME}_*.dbf; fi 
  if [ -f $BACKUPDIR7/${DBNAME}_*.dbf ]; then rm $BACKUPDIR7/${DBNAME}_*.dbf; fi 
  if [ -f $BACKUPDIR8/${DBNAME}_*.dbf ]; then rm $BACKUPDIR8/${DBNAME}_*.dbf; fi 

  if [ -f $BACKUPDIR1/${DBNAME}_*.ctl ]; then rm $BACKUPDIR1/${DBNAME}_*.ctl; fi 
  if [ -f $BACKUPDIR2/${DBNAME}_*.ctl ]; then rm $BACKUPDIR2/${DBNAME}_*.ctl; fi 
  if [ -f $BACKUPDIR3/${DBNAME}_*.ctl ]; then rm $BACKUPDIR3/${DBNAME}_*.ctl; fi 
  if [ -f $BACKUPDIR4/${DBNAME}_*.ctl ]; then rm $BACKUPDIR4/${DBNAME}_*.ctl; fi 
  if [ -f $BACKUPDIR5/${DBNAME}_*.ctl ]; then rm $BACKUPDIR5/${DBNAME}_*.ctl; fi 
  if [ -f $BACKUPDIR6/${DBNAME}_*.ctl ]; then rm $BACKUPDIR6/${DBNAME}_*.ctl; fi 
  if [ -f $BACKUPDIR7/${DBNAME}_*.ctl ]; then rm $BACKUPDIR7/${DBNAME}_*.ctl; fi 
  if [ -f $BACKUPDIR8/${DBNAME}_*.ctl ]; then rm $BACKUPDIR8/${DBNAME}_*.ctl; fi 
 
  if [ $BACKUP = "cold" ]; then 
    if [ -f $BACKUPDIR1/${DBNAME}_*.log ]; then rm $BACKUPDIR1/${DBNAME}_*.log; fi 
    if [ -f $BACKUPDIR2/${DBNAME}_*.log ]; then rm $BACKUPDIR2/${DBNAME}_*.log; fi 
    if [ -f $BACKUPDIR3/${DBNAME}_*.log ]; then rm $BACKUPDIR3/${DBNAME}_*.log; fi 
    if [ -f $BACKUPDIR4/${DBNAME}_*.log ]; then rm $BACKUPDIR4/${DBNAME}_*.log; fi 
    if [ -f $BACKUPDIR5/${DBNAME}_*.log ]; then rm $BACKUPDIR5/${DBNAME}_*.log; fi 
    if [ -f $BACKUPDIR6/${DBNAME}_*.log ]; then rm $BACKUPDIR6/${DBNAME}_*.log; fi 
    if [ -f $BACKUPDIR7/${DBNAME}_*.log ]; then rm $BACKUPDIR7/${DBNAME}_*.log; fi 
    if [ -f $BACKUPDIR8/${DBNAME}_*.log ]; then rm $BACKUPDIR8/${DBNAME}_*.log; fi 
  else 
    if [ -f $CONBACK1/${DBNAME}_*.ctl ]; then rm $CONBACK1/${DBNAME}_*.ctl; fi 
  fi 
fi 
   
# 
# Begin backup 
# 
if [ $SHUTDOWN_FAILED_B = $FALSE ]; then 
  echo " " 
  echo "${JOBNAME_SHORT}:  Starting $BACKUP backup using $DBBACKUP..." 
  # 
  # check hot backup status 
  # 
  if  [ $BACKUP = "hot" ]; then 
    sqldba lmode=y <<EOF 
    connect internal 
    select * from v\$backup; 
    exit 
EOF 
  fi 
  # 
  # begin reading dynamic parameter file 
  # 
  cat $DBBACKUP | while read TABLESPACE FILE DIR 
  do 
    if [ $BACKUP = "hot" ]; then 
      sqldba lmode=y <<EOF    
      connect internal 
      alter tablespace $TABLESPACE begin backup; 
      exit 
EOF 
    fi 
    # 
    # copy a database file 
    # 
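    # DIR holds the literal string "$BACKUPDIRn" from the dynamic parameter file; 
    # eval expands it to the directory set in ${DBNAME}_backup_admin.sh. 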
    BACKUPDIR=`eval echo \$DIR` 
    DATAFILE=`basename $FILE` 
    echo "${JOBNAME_SHORT}: cp $FILE $BACKUPDIR" 
    cp $FILE $BACKUPDIR 
    STATUS=$? 
    if [ "$STATUS" != 0 ]; then 
      echo  "${JOBNAME_SHORT}: error during file copy $FILE." 
      echo "${JOBNAME_SHORT}: $CKSUM $FILE $BACKUPDIR/$DATAFILE" 
      $CKSUM $FILE $BACKUPDIR/$DATAFILE 
    fi 
    if [ $BACKUP = "hot" ]; then 
       echo "${JOBNAME_SHORT}: ls -l $FILE $BACKUPDIR/$DATAFILE" 
       ls -l $FILE $BACKUPDIR/$DATAFILE 
#      echo "${JOBNAME_SHORT}: $CKSUM $FILE $BACKUPDIR/$DATAFILE" 
#      $CKSUM $FILE $BACKUPDIR/$DATAFILE 
#      CKSUM_OUT=`$CKSUM $FILE $BACKUPDIR/$DATAFILE` 
#      echo $CKSUM_OUT | read VALUE1 SIZE1 NAME1 VALUE2 SIZE2 NAME2 
#      if [ "$VALUE1" != "$VALUE2" ]; then 
#        echo "$CKSUM_VALUE_WAR" 
#      fi 
#      if [ "$SIZE1" != "$SIZE2" ]; then 
#        echo "$CKSUM_SIZE_ERR" 
#      fi 
     else 
      echo "${JOBNAME_SHORT}: $CMP $FILE $BACKUPDIR/$DATAFILE" 
      $CMP $FILE $BACKUPDIR/$DATAFILE 
      STATUS="$?" 
      if [ "$STATUS" != 0 ]; then 
        echo "$CMP_ERR" 
      fi  
    fi 
    if [ $BACKUP = "hot" ]; then 
      sqldba lmode=y <<EOF 
      connect internal 
      alter tablespace $TABLESPACE end backup; 
      exit 
EOF 
    fi 
  done 
  # 
  # check hot backup status 
  # 
  if  [ $BACKUP = "hot" ]; then 
    sqldba lmode=y <<EOF 
    connect internal 
    select * from v\$backup; 
    exit 
EOF 
  fi 
  # 
  # Backup control files and online redo logs 
  # 
  if [ $BACKUP = "hot" ]; then 
    echo "${JOBNAME_SHORT}: backing up controlfile to ${CONBACK1}/${DBNAME}_control01.ctl" 
    sqldba lmode=y <<EOF 
    connect internal 
    alter database backup controlfile to '${CONBACK1}/${DBNAME}_control01.ctl'; 
    exit 
EOF 
    # 
  else 
    # 
    # control files 
    # 
    echo "${JOBNAME_SHORT}: backing up all control files..." 
    if [ -f /dbf1/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf1/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR1; fi 
    if [ -f /dbf2/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf2/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR2; fi 
    if [ -f /dbf3/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf3/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR3; fi 
    if [ -f /dbf4/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf4/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR4; fi 
    if [ -f /dbf5/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf5/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR5; fi 
    if [ -f /dbf6/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf6/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR6; fi 
    if [ -f /dbf7/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf7/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR7; fi 
    if [ -f /dbf8/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf8/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR8; fi 
    if [ -f /dbf9/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf9/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR9; fi 
    if [ -f /dbf10/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf10/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR10; fi 
    if [ -f /dbf11/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf11/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR11; fi 
    if [ -f /dbf12/$DBNAME/${DBNAME}_*.ctl ]; then cp /dbf12/$DBNAME/${DBNAME}_*.ctl $BACKUPDIR12; fi 
    # 
    # redo logs 
    # 
    echo "${JOBNAME_SHORT}: backing up all online redo logs..." 
    if [ -f /dbf1/$DBNAME/${DBNAME}_*.log ]; then cp /dbf1/$DBNAME/${DBNAME}_*.log $BACKUPDIR1; fi 
    if [ -f /dbf2/$DBNAME/${DBNAME}_*.log ]; then cp /dbf2/$DBNAME/${DBNAME}_*.log $BACKUPDIR2; fi 
    if [ -f /dbf3/$DBNAME/${DBNAME}_*.log ]; then cp /dbf3/$DBNAME/${DBNAME}_*.log $BACKUPDIR3; fi 
    if [ -f /dbf4/$DBNAME/${DBNAME}_*.log ]; then cp /dbf4/$DBNAME/${DBNAME}_*.log $BACKUPDIR4; fi 
    if [ -f /dbf5/$DBNAME/${DBNAME}_*.log ]; then cp /dbf5/$DBNAME/${DBNAME}_*.log $BACKUPDIR5; fi 
    if [ -f /dbf6/$DBNAME/${DBNAME}_*.log ]; then cp /dbf6/$DBNAME/${DBNAME}_*.log $BACKUPDIR6; fi 
    if [ -f /dbf7/$DBNAME/${DBNAME}_*.log ]; then cp /dbf7/$DBNAME/${DBNAME}_*.log $BACKUPDIR7; fi 
    if [ -f /dbf8/$DBNAME/${DBNAME}_*.log ]; then cp /dbf8/$DBNAME/${DBNAME}_*.log $BACKUPDIR8; fi 
    if [ -f /dbf9/$DBNAME/${DBNAME}_*.log ]; then cp /dbf9/$DBNAME/${DBNAME}_*.log $BACKUPDIR9; fi 
    if [ -f /dbf10/$DBNAME/${DBNAME}_*.log ]; then cp /dbf10/$DBNAME/${DBNAME}_*.log $BACKUPDIR10; fi 
    if [ -f /dbf11/$DBNAME/${DBNAME}_*.log ]; then cp /dbf11/$DBNAME/${DBNAME}_*.log $BACKUPDIR11; fi 
    if [ -f /dbf12/$DBNAME/${DBNAME}_*.log ]; then cp /dbf12/$DBNAME/${DBNAME}_*.log $BACKUPDIR12; fi 
  fi 
   
  # 
  # archive logs 
  # 
  # 
  # force a log switch  
  #  
  if [ $BACKUP = "hot" ]; then 
    sqldba lmode=y <<EOF 
    connect internal 
    alter system switch logfile; 
    exit 
EOF 
  # 
  # wait for archive log copy to complete 
  # 
  sleep 120 
  fi 
  # 
  # copy archive logs  
  # 
  if [ -f $ARCOLD/${DBNAME}_*.arc ]; then 
    echo " " 
    echo "${JOBNAME_SHORT}: Delete previous backup archive logs..." 
    ls -l $ARCOLD/${DBNAME}_*.arc 
    for I in $ARCOLD/${DBNAME}_*.arc 
      do 
        ls -l $I 
        ARCNAME=`basename $I` 
        rm $ARCOLD/$ARCNAME 
        STATUS="$?" 
        if [ "$STATUS" != 0 ]; then 
           echo "${JOBNAME_SHORT}: error deleting old archive log: $ARCOLD/$ARCNAME" 
        fi 
    done 
  else    echo " " 
          echo "${JOBNAME_SHORT}: No old archive logs to delete." 
  fi 
 
  if [ -f $ARC/${DBNAME}_*.arc ]; then 
    echo " " 
    echo "${JOBNAME_SHORT}: Copying archive logs..." 
    for I in $ARC/${DBNAME}_*.arc 
      do 
     	ls -l $I 
       	ARCNAME=`basename $I` 
	echo "${JOBNAME_SHORT}: cp $ARC/$ARCNAME $ARCOLD" 
     	cp $ARC/$ARCNAME $ARCOLD 
	STATUS="$?" 
      	if [ "$STATUS" != 0 ]; then 
            echo "$ARCERR" 
	fi 
	echo "${JOBNAME_SHORT}: $CMP $ARC/$ARCNAME $ARCOLD/$ARCNAME" 
	$CMP $ARC/$ARCNAME $ARCOLD/$ARCNAME 
        STATUS="$?" 
        if [ "$STATUS" != 0 ]; then 
            echo "$CMP_ERR" 
            echo "${JOBNAME_SHORT}: $CKSUM $ARC/$ARCNAME $ARCOLD/$ARCNAME" 
            $CKSUM $ARC/$ARCNAME $ARCOLD/$ARCNAME 
            echo "${JOBNAME_SHORT}: Archive log deletion skipped." 
       	else  
#   echo "${JOBNAME_SHORT}: $CKSUM $ARC/$ARCNAME $ARCOLD/$ARCNAME" 
#   $CKSUM $ARC/$ARCNAME $ARCOLD/$ARCNAME 
#   CKSUM_OUT=`$CKSUM $ARC/$ARCNAME $ARCOLD/$ARCNAME` 
#   echo $CKSUM_OUT | read VALUE1 SIZE1 NAME1 VALUE2 SIZE2 NAME2 
#   if [ "$VALUE1" != "$VALUE2" -o "$SIZE1" != "$SIZE2" ]; then 
#          echo "$DIFFERR" 
#          echo "${JOBNAME_SHORT}: Archive log deletion skipped." 
#   else rm $ARC/$ARCNAME 
#          if [ $? != 0 ]; then 
#            echo "${JOBNAME_SHORT}: Archive deletion failed." 
#          fi 
#   fi 
          rm $ARC/$ARCNAME 
          if [ $? != 0 ]; then 
            echo "${JOBNAME_SHORT}: Archive deletion failed." 
          fi 
        fi 
      done 
  else echo "${JOBNAME_SHORT}: Found no archives to copy." 
  fi 
 
  # 
  # startup 
  # 
  if [ $BACKUP = "cold" ]; then 
    #  
    # dba mode tasks 
    # 
    echo "${JOBNAME_SHORT}: Begin startup restrict..." 
    /db_admin/db_${DBNAME}/sql/startup_restrict_${DBNAME}.sh 
    STATUS=`ps -fu oracle | awk '{ print $8" "$9" "$10" " }' | grep -v grep | grep "$DBNAME " | grep ora_` 
    if [ $? != 0 ]; then 
      echo "${JOBNAME_SHORT}: error in restrict startup." 
    else  
      echo "${JOBNAME_SHORT}: Begin analyze at `date`..." 
      $ANALYZE $DBNAME 
      echo "${JOBNAME_SHORT}: End analyze at `date`..." 
      # 
      # special task 
      # 
      if [ "$SPECIAL_TASK" != " " ]; then 
        echo "${JOBNAME_SHORT}: Running DBA mode task..." 
        . ${SPECIAL_TASK} > $TMP/${DBNAME}_restrict.log 2> $TMP/${DBNAME}_restrict.err  
        # sqlplus / @${SPECIAL_TASK} ${SPECIAL_TASK}.log 2>> ${SPECIAL_TASK}.err  
      fi 
    fi 
    echo "${JOBNAME_SHORT}: Shutdown..." 
    /db_admin/db_${DBNAME}/sql/shutdown_${DBNAME}.sh 
    STATUS=`ps -fu oracle | awk '{ print $8" "$9" "$10" " }' | grep -v grep | grep "$DBNAME " | grep ora_` 
    if [ $? = 0 ]; then 
      echo "${JOBNAME_SHORT}: error in shutdown following analyze..." 
    else 
      echo "${JOBNAME_SHORT}: End shutdown..." 
    fi 
    # 
    # startup 
    # 
    /db_admin/db_${DBNAME}/sql/startup_exclusive_${DBNAME}.sh 
    STATUS=`ps -fu oracle | awk '{ print $8" "$9" "$10" " }' | grep -v grep | grep "$DBNAME " | grep ora_` 
    if [ $? != 0 ]; then 
      echo "${JOBNAME_SHORT}: error in database startup." 
      RESTART_FAILED_B=0 
    else 
      echo "${JOBNAME_SHORT}: Database restarted." 
      echo "1" > $BANNER 
      $WALL /db_admin/db_${DBNAME}/banner/${DBNAME}_db_online.banner       
    fi 
  fi 
  	 
  # 
  # Get file list 
  # 
  echo " " 
  echo "${JOBNAME_SHORT}: Print after file listing..." 
  $DBF_REPORT $DBNAME 
fi 
 
# 
# Oracle*Mail  
# 
if [ "$OMAIL_B" = "true" ]; then 
  echo "${JOBNAME_SHORT}: Running mverify." 
  mverify 
  echo "${JOBNAME_SHORT}: Completed mverify." 
fi 

dbexport_begin

The dbexport_begin procedure is called by dbbackup if a database export is scheduled.
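
The export itself is driven by the parameter file named in PARFILE (${DBNAME}_export.par). The following is a hypothetical parameter file for a full database export; the account, buffer size, and file locations are assumptions to adjust for your site. The dump file should be written as $EXPORTDIR/${DBNAME}.exp, since that is the name the script rotates and deletes.

userid=system/manager 
full=y 
buffer=1048576 
file=/export/ITS/ITS.exp 
log=/db_admin/db_ITS/tools/log/ITS_export.log 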

#! /bin/sh 
# name		$TOOLS/db_mgmt/backup/dbexport_begin 
# 
# purpose	Perform an export of a database. 
# 
# usage		$TOOLS/db_mgmt/backup/dbexport_begin dbname   
#			<export> <special task> 
# 
# parameters	$1=dbname 
#		$2=export 
#		$3=special task 
# 
# .............................................................. 
# set local variables 
# .............................................................. 
ERRMSG=' 
$TOOLS/db_mgmt/backup/dbexport_begin: syntax error:  
    dbexport_begin dbname <export|noexport> <special task>. 
' 
# 
# parameters 
# 
if [ "$1" ] 
then DBNAME=$1 
else echo $ERRMSG 
     exit 1  
fi 
if [ "$2" ] 
then EXPORT=$2 
else echo $ERRMSG 
     exit 1 
fi 
if [ "$3" ] 
then SPECIAL_TASK=$3 
else SPECIAL_TASK=" " 
fi 
 
# 
# booleans 
#  
TRUE=0 
FALSE=1 
SHUTDOWN_FAILED_B=1 
RESTART_FAILED_B=1 
OMAIL_B="false" 
 
# 
# local variables 
# 
JOBNAME="$TOOLS/db_mgmt/backup/dbexport_begin" 
JOBNAME_SHORT="dbexport_begin" 
ADMIN_FILE="/db_admin/db_$DBNAME/tools/backup/${DBNAME}_backup_admin.sh"  
CMP="/bin/cmp" 
PARFILE="/db_admin/db_$DBNAME/tools/backup/${DBNAME}_export.par" 
CMP_ERR="${JOBNAME_SHORT}: fatal error in cmp." 
EXPERR="${JOBNAME_SHORT}: fatal error in export file copy." 
 
# .............................................................. 
# begin 
# .............................................................. 
 
# 
# run the orauser 
# 
. /db_admin/db_$DBNAME/.orauser_$DBNAME 
. $ADMIN_FILE 
 
# 
# check for database online 
# 
STATUS=`ps -fu oracle | awk '{ print $8" "$9" "$10" " }' | grep -v grep | grep $DBNAME" " | grep ora_` 
if [ $? != 0 ]; then 
  echo "${JOBNAME_SHORT}: error - database not online." 
  echo "${JOBNAME_SHORT}: process listing is to follow..." 
  echo "${JOBNAME_SHORT}: ps -fu oracle | grep -v grep | awk '{ print $8" "$9" "$10" " }' | grep $DBNAME" " | grep ora_" 
  ps -fu oracle | grep -v grep | awk '{ print $8" "$9" "$10" " }' | grep $DBNAME" " | grep ora_ 
  echo "${JOBNAME_SHORT}: exiting." 
  exit 1 
fi 
 
# ................................................................. 
# Export 
# ................................................................. 
echo " " 
echo "${JOBNAME_SHORT}: List previous export files..." 
ls -l $EXPORTDIR/${DBNAME}.exp* 
ls -l $EXPORTDIROLD/${DBNAME}.exp* 
 
# 
# delete old export 
# 
if [ -f $EXPORTDIROLD/${DBNAME}.exp_old ]; then 
  rm $EXPORTDIROLD/${DBNAME}.exp_old 
  if [ $? != 0 ]; then 
    echo "${JOBNAME_SHORT}: error deleting previous export."  
  else 
    echo "${JOBNAME_SHORT}: Deleted previous export file."  
  fi 
else echo "${JOBNAME_SHORT}: Found no previous export file." 
fi	 
# 
# copy current export to old 
# 
if [ -f $EXPORTDIR/${DBNAME}.exp ]; then 
  chmod 642 $EXPORTDIR/${DBNAME}.exp 
  echo "${JOBNAME_SHORT}: cp $EXPORTDIR/${DBNAME}.exp $EXPORTDIROLD" 
  cp $EXPORTDIR/${DBNAME}.exp $EXPORTDIROLD 
  if [ $? != 0 ]; then  
    echo "$EXPERR" 
  else echo "${JOBNAME_SHORT}: $CMP $EXPORTDIR/${DBNAME}.exp $EXPORTDIROLD/${DBNAME}.exp"  
    $CMP $EXPORTDIR/${DBNAME}.exp $EXPORTDIROLD/${DBNAME}.exp 
    STATUS="$?" 
    if [ "$STATUS" != 0 ]; then 
      echo "$CMP_ERR" 
    fi 
    echo "${JOBNAME_SHORT}: mv ${EXPORTDIROLD}/${DBNAME}.exp ${EXPORTDIROLD}/${DBNAME}.exp_old" 
    mv ${EXPORTDIROLD}/${DBNAME}.exp ${EXPORTDIROLD}/${DBNAME}.exp_old 
    if [ $? != 0 ]; then 
      echo "$EXPERR" 
    fi 
    rm $EXPORTDIR/${DBNAME}.exp 
    if [ $? != 0 ]; then 
      echo "${JOBNAME_SHORT}: error deleting export file." 
      exit 
    fi 
  fi 
else echo "${JOBNAME_SHORT}: Found no current export file to copy." 
fi 
 
# 
# Begin export 
# 
exp parfile=$PARFILE  
echo " " 
echo "${JOBNAME_SHORT}: Export complete.  " 
ls -l $EXPORTDIR/${DBNAME}.exp* 
ls -l $EXPORTDIROLD/${DBNAME}.exp*      

hot_backup.sql

The following SQL script is run by the hot backup script. The output file lists all the data files.

set feedback off
set pagesize 0
set heading off
set echo off
set termout off
spool hot_backup.dat

select tablespace_name||'   '||file_name||'   '||'ORADISK'||
       substr(file_name,INSTR(TRANSLATE(FILE_NAME,'1234567890',
                                                  '0000000000'),'0'),
                        INSTR(FILE_NAME,':')-
                        INSTR(TRANSLATE(FILE_NAME,'1234567890',
                                                  '0000000000'),'0'))||':'
FROM SYS.DBA_DATA_FILES order by tablespace_name;

spool off;

EXIT

