
PLM Tech Tips: Automating Oracle Agile PLM Database Exports

Server administrators are always looking for ways to automate routine tasks. Automation helps to standardize an operation and make it easily repeatable. Automating the Agile export facility makes it easier to keep development and testing systems refreshed and can provide an automated backup system for the Agile database. This article is intended to provide a framework for using Windows PowerShell to automate Agile database exports.

I was an old-school Windows Batch script guy back in the day. By the time PowerShell came out I had moved on to other jobs and other scripting languages, but I recently became interested in Windows PowerShell. I have begun using PowerShell more for Windows automation and scripting tasks because of the language's sheer power, flexibility and support for automating tasks and administering Windows systems. There is so much functionality in PowerShell that there is little you cannot do. Check the Resources section at the end of this article for more information on PowerShell. There are also a multitude of example scripts available that you can use (as I have here) to build just what you need.

Agile includes several useful database scripts which are built when you use the Agile schema builder to create the database schema for the first time. You can recreate these scripts later by running the schema creation tool using the 'Scripts Only' option. For our purposes we are interested in the two scripts that perform exports: agile9exp.bat and agile9expdp.bat. I use PowerShell and these two scripts to automate Agile database exports.

Background on the AGILE9EXP.BAT

This script uses the legacy Oracle export (dump) facility to create an export of your Agile database. It is slower than the newer Data Pump facility and creates larger export files; however, it has the advantage of being compatible with almost all versions of Oracle Database, whether they are newer or older than your source.

Background on the AGILE9EXPDP.BAT

The agile9expdp.bat script uses the more modern Oracle Data Pump facility and is the preferred method for Agile versions 9.3.X and above. It is much faster than the legacy export and creates smaller export files; however, it is not backward compatible with versions of Oracle Database that are older than your source. In those cases you should use agile9exp.bat instead. I will be using agile9expdp.bat in my examples.

Common Traits of the Two Scripts

Both of the Agile provided scripts utilize a .par file to pass parameters such as output file location and export type. Each export script has its own .par file (agile9exp.par and agile9expdp.par) where you can customize the parameters; however, that topic is beyond the scope of this article. Each of the scripts creates a log file: agile9exp.log and agile9expdp.log. We will be using the stock Agile provided scripts and configurations along with Windows PowerShell to automate the database export.

I always recommend that clients use the appropriate Agile export script to export the Agile database. Doing so ensures the export contains everything needed to rebuild the system or refresh a system when it is used with the ancillary agile9imp.bat or agile9impdp.bat import script. I realize that there are some who would argue against this in favor of using their own scripts. On average, I deal with 15 to 20 different Agile databases every week. Some of the Agile database exports are internally generated for Agile refreshes while others are client exports used on our fleet of Agile PLM Virtual Machines. Usually if difficulties arise in trying to import a client database, it is due to the client not using the standard Agile scripts. What a client does internally is not my concern but, when exchanging data with Zero Wait-State, we require use of the Agile provided export and import scripts. The automation techniques discussed in this article are built around using those Agile provided scripts and focus on Windows automation.

Building a Basic Export Task

The standard script is invoked from an Administrator command prompt in Windows, for example:

agile9expdp.bat password

Since we are using the Agile provided batch file (agile9expdp.bat) to perform the export, we can easily automate the task in the Windows Task Scheduler. The most obvious issue in this process is that the Agile schema password is left in clear text within the automation script or task, providing a vulnerability that could potentially be exploited. I have spent months attempting to devise a means of using encryption for the password. Perhaps someone has solved this problem and could let us all know how it can be accomplished.

For now, I have resolved the issue to my satisfaction by applying the best security possible to the automation script: limiting access to only those accounts that absolutely need to run it (including the Task Scheduler event). I have not yet decided whether it is more secure to pass the password as a parameter in the Task Scheduler or to code it into the script; I will leave that to each of you.
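As a minimal sketch of that lockdown (the E:\Scripts\AgileExport.ps1 path is just the example location used later in this article; point it at whatever file you end up scheduling), the built-in icacls utility can strip inherited permissions and leave read/execute access only to SYSTEM and the local Administrators group:

# Restrict who can read and run the automation script (run from an elevated prompt).
# Grant first, then remove inheritance, so the file is never left without an ACL.
icacls "E:\Scripts\AgileExport.ps1" /grant:r "SYSTEM:(RX)" "BUILTIN\Administrators:(RX)"
icacls "E:\Scripts\AgileExport.ps1" /inheritance:r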

Creating a Basic Scheduler Task

Begin by accessing the Windows Task Scheduler from the Administrative Tools icon.

[Figure 1]

  1. Select the 'Task Scheduler Library'.
  2. Click on ‘Create Basic Task.’

This will launch the ‘Create Basic Task Wizard’ to step you through the process.

[Figure 2]

  1. Give the ‘Task’ a name of your choosing; click Next.

 

[Figure 3]

  1. Select when you want the task to run (‘Daily’ for example).
  2. Click Next.


[Figure 3a]

  1. Select the Start date, time of day and the 'Recur every' parameter. Click Next.

[Figure 4a]

[Figure 4]

  1. Use the ‘Browse’ button to locate the agile9expdp.bat file.
  2. In my default installation, it is located in the E:\app\oracle\admin\agile9\create\agile folder.
  3. Place the schema password in the 'Add Arguments' box and the folder location of the script in the 'Start in' box.
  4. Click Next, then click 'Finish'.

 

This completes the most basic task and we can now test it using the 'Run' selection. If everything is correct, you will see a command prompt window open and the export will run. So far we have simply created a Task in the Windows Task Scheduler that will run the Agile provided export script at the same time every day. There are two issues with stopping here. The first issue is that the Task is defined to use your account credentials, a less-than-ideal situation when creating automation because your credentials could change in compliance with organizational policy. The second issue is that Oracle Data Pump will not overwrite the AGILE9EXPDP.dmp file that was created by the script on its first run, so the script fails on any subsequent attempt. Now we turn to PowerShell to overcome these issues. Note that using PowerShell may require you to adjust the PowerShell Execution Policy in your environment, as shown below.
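For example, you can check and, if necessary, loosen the policy from an elevated PowerShell prompt. This is only a sketch; RemoteSigned is a common choice for running locally written scripts, but follow your own organization's policy:

# Show the effective execution policy at each scope
Get-ExecutionPolicy -List
# Allow locally created scripts to run (downloaded scripts must still be signed)
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine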

Credential Issue Solution

There are a couple of obvious choices to overcome the credentials issue.

  • Use the local Administrator account and its credentials to run the Task, or
  • Create a special local account just for running automation tasks like this.

I prefer the second choice but you may prefer using the local Administrator account. Either will work and both choices have some trade-offs in convenience and possibly security.
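If you go the dedicated-account route, a minimal sketch looks like the following; the account name and password are placeholders, and the account will also need the 'Log on as a batch job' right plus read/write access to the Agile script and archive folders (grant only what the export actually requires):

# Create a dedicated local account for the export task (run from an elevated prompt).
# Replace the name and password with your own values.
net user svc_agileexport "Use-Your-Own-Passw0rd!" /add /passwordchg:no /expires:never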

File Overwrite Solutions Using PowerShell

For the purposes of this article I will be demonstrating the use of PowerShell to automate this export task and overcome the file overwrite issue. I have begun to rely more heavily on PowerShell in the Windows environment due to its flexibility and power when combined with simple cmdlets (as you will see later). The power within PowerShell is realized through the use of 'cmdlets', a form of higher-level command. These cmdlets offer rich functionality, from simple file manipulation to complex Windows management tasks. Because PowerShell is a native Windows application, it has hooks into system-level functionality that the cmdlets abstract for you. PowerShell is a flexible language supporting common flow control structures such as 'if', 'elseif' and 'else', switch statements, and 'for', 'foreach' and 'while' looping structures. If you have not yet taken the plunge into PowerShell, I will explain each step as clearly as possible. The '#' character in PowerShell denotes a comment and is used here for clarity. In its simplest form, the basic PowerShell script to run AGILE9EXPDP.bat looks like this:

E:
cd E:\app\Oracle\admin\agile9\create\agile
# Run the export
& E:\app\Oracle\admin\agile9\create\agile\agile9expdp.bat tartan

The above PowerShell script runs the Agile batch file and duplicates what we already had by calling the Agile script directly from the Windows Task Scheduler. Note that the Agile schema password is included on the command line when calling the script. Now we will build on that to tackle the problem of writing over the existing AGILE9EXPDP.dmp file. To do this we need variables for the source location of the dump file and the target location in which to archive it, so we add the following to the script.

$Source="E:\app\Oracle\admin\agile9\create\agile\AGILE9EXPDP.dmp"
$ArchiveLoc="E:\BACKUPS\"

Next we add a line to move the AGILE9EXPDP.dmp file to the archive location using the Move-Item cmdlet, which handles the copy and delete of the original file for us:

Move-Item $Source $ArchiveLoc -force

Building Unique File Names

To solve the problem of the Agile export being unable to overwrite the existing dump file or to save multiple exports, we need to create unique file names for the archived dump files. The following code can be added to our PowerShell script to create a unique file name each time the script runs. I prefer names that include the original file name (AGILE9EXPDP) and the date it was created. In the example below I build a target filename using the previous day’s date since that is when the export was run.

# Source file name and location
$Source = "E:\app\Oracle\admin\agile9\create\agile\AGILE9EXPDP.dmp"
# Create archive file name
$currDate = Get-Date
# Minus one day because we are copying yesterday's export
$previousDate = $currDate.AddDays(-1)
$a = $previousDate.Day
$ArchiveLoc = "E:\Backups\"
$ArchiveLoc += $a
$ArchiveLoc += $previousDate.Month
$ArchiveLoc += $previousDate.Year
$ArchiveLoc += ".dmp"
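As an optional variation (not part of the stock approach above), Get-Date -Format logic can build the same kind of name in one step and produce a sortable, unambiguous date stamp; the folder and file prefix here simply mirror the example above:

# Build a sortable archive name such as E:\Backups\AGILE9EXPDP_20170824.dmp
$stamp = (Get-Date).AddDays(-1).ToString("yyyyMMdd")
$ArchiveLoc = "E:\Backups\AGILE9EXPDP_$stamp.dmp"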

We can then 'Move' the existing AGILE9EXPDP.dmp file to the E:\Backups location and rename it to the date-built name using the same PowerShell Move-Item cmdlet:

Move-Item $Source $ArchiveLoc -force

Our complete script will now perform the following tasks:

  • Build a unique name for the archive dump file using the date.
  • Move yesterday's export to the archive location using the unique name.
  • Remove the existing AGILE9EXPDP.dmp file so a new one can be created.
  • Run the Agile provided agile9expdp.bat.

The PowerShell script should now look like this (note that we have placed the Agile schema password in the PowerShell script):

# Set my variables
## The paths on the local computer etc.
$ArchiveLoc = "E:\Backups\AGILE9EXPDP.dmp"
$SchemaPassword = "tartan"
# Source file name and location
$Source = "E:\app\Oracle\admin\agile9\create\agile\AGILE9EXPDP.dmp"
# Create archive file name
# Minus one day because we are copying yesterday's export
$currDate = Get-Date
$previousDate = $currDate.AddDays(-1)
$a = $previousDate.Day
# Make sure the path is reset, it was used above
$ArchiveLoc = "E:\Backups\"
$ArchiveLoc += $a
$ArchiveLoc += $previousDate.Month
$ArchiveLoc += $previousDate.Year
$ArchiveLoc += ".dmp"
# The Agile stock scripts look only in the current folder
# for their supporting SQL scripts, so we have to run
# the export from the script's own folder
E:
cd E:\app\Oracle\admin\agile9\create\agile
# Move the existing export to the archive
Move-Item $Source $ArchiveLoc -force

# Run the export
& E:\app\Oracle\admin\agile9\create\agile\agile9expdp.bat $SchemaPassword
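Before scheduling anything, it is worth saving the script and running it once by hand from an elevated PowerShell prompt to confirm the archive and export steps behave as expected. The E:\Scripts\AgileExport.ps1 path is only the example location used in the Task Scheduler steps later in this article:

# One-off manual test run of the saved script
# (-ExecutionPolicy Bypass applies only to this invocation)
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "E:\Scripts\AgileExport.ps1"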

Cleaning Up Old Archives

PowerShell also provides us with an easy means of deleting ALL files older than a given age, as shown below (thanks to Stack Overflow for this tidbit).

$limit = (Get-Date).AddDays(-2)
$path = "E:\Backups"
# Delete files older than the $limit.
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force
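If E:\Backups holds anything other than these archives, a slightly safer variation (my own refinement, not part of the stock snippet) is to limit the deletion to the export file types:

# Only remove archived export files (*.dmp / *.zip); leave anything else alone
Get-ChildItem -Path $path -Recurse -Force -Include *.dmp, *.zip | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force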

If we want to keep only a few days of exports (two days in this example, controlled by the AddDays(-2) value in $limit), we combine the pieces created earlier: give each dump file in E:\Backups a unique name, then delete the archived files that are older than the limit. The following script performs the tasks identified earlier and also cleans up the old archives.

# Set my variables
## The paths on the local computer etc.
$Source = "E:\app\Oracle\admin\agile9\create\agile\AGILE9EXPDP.dmp"
$ArchiveLoc = "E:\Backups\"
$SchemaPassword = "tartan"

$limit = (Get-Date).AddDays(-2)
$path = "E:\Backups"
# Delete files older than the $limit.
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force

# Create archive file name
# Minus one day because we are copying yesterday's export
# If you are running exports more than once per day,
# you can include hours, minutes and seconds in the name if needed
#
$currDate = Get-Date
$previousDate = $currDate.AddDays(-1)
$a = $previousDate.Day
# Make sure the path is reset, it was used above
$ArchiveLoc = "E:\Backups\"
$ArchiveLoc += $a
$ArchiveLoc += $previousDate.Month
$ArchiveLoc += $previousDate.Year
$ArchiveLoc += ".dmp"
# The Agile stock scripts look only in the current folder
# for their supporting SQL scripts
E:
cd E:\app\Oracle\admin\agile9\create\agile
# Move the existing export to the archive
Move-Item $Source $ArchiveLoc -force

# Run the export
& E:\app\Oracle\admin\agile9\create\agile\agile9expdp.bat $SchemaPassword
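If you want a record of each run, the script can also write to the Windows event log. This is a sketch only: it assumes agile9expdp.bat returns a meaningful exit code, and 'AgileExport' is an arbitrary event source name that must be registered once from an elevated prompt before the first use:

# One-time setup (elevated): New-EventLog -LogName Application -Source "AgileExport"
# Then, after the export line above, record the outcome:
if ($LASTEXITCODE -eq 0) {
    Write-EventLog -LogName Application -Source "AgileExport" -EntryType Information -EventId 1000 -Message "Agile database export completed."
} else {
    Write-EventLog -LogName Application -Source "AgileExport" -EntryType Error -EventId 1001 -Message "Agile database export failed with exit code $LASTEXITCODE."
}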

 

Building the PowerShell Task in Task Scheduler

Now that we have all of the PowerShell pieces (and I will add the full script at the bottom of the post) we need to add a new Task in the Task Scheduler to accommodate running a PowerShell script.

[Figure 5]

  1. Start the Task Scheduler and select the 'Task Scheduler Library' as shown in the Building a Basic Export Task section earlier.
  2. Select ‘Create Task’ (not the ‘Basic Task’ this time).


[Figure 6]

  1. Type in a name for the Task.
  2. Select 'Run whether user is logged on or not'.
  3. Select 'Run with highest privileges'.
  4. In the 'Configure for' box, select 'Windows Server 2012 R2'.
  5. Click the Actions tab.


[Figure 7]

  1. Click 'New' to create an action.


[Figure 8]

  1. Select ‘Start a Program’ as the ‘Action’.
  2. In the Program/Script box, enter the full path to the PowerShell executable (i.e. C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe).
  3. In the 'Add Arguments' box, place the full path to the script and its name (i.e. E:\Scripts\AgileExport.ps1). You will notice that we are no longer placing the schema password in the arguments but have added it to the script file instead. This is my preferred method when using PowerShell for this type of automation.
  4. In the 'Start in' box, I use my E:\Scripts folder; you should use your own script location.
  5. Click ‘OK’ to continue.


[Figure: Triggers tab]
Now use the Triggers Tab to build a schedule for the Task to meet your needs.
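As an aside, on Server 2012 R2 and later the whole task can also be created from PowerShell itself using the ScheduledTasks cmdlets. This is only a sketch; the account name, password, start time and paths are placeholders, and the password appears in clear text here just as it does when typed into the Task Scheduler GUI:

# Create the scheduled task from PowerShell instead of the GUI
$action  = New-ScheduledTaskAction -Execute "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" -Argument "-NoProfile -File E:\Scripts\AgileExport.ps1" -WorkingDirectory "E:\Scripts"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "Agile PLM Export" -Action $action -Trigger $trigger -User "svc_agileexport" -Password "Use-Your-Own-Passw0rd!" -RunLevel Highest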

Compression

To minimize the effect of large export files taking up disk space, you can add compression to your archived files using a variety of command line compression tools. You can compress the dump files using the same unique file name process described previously. Any good command line compression utility will work for this purpose; my favorite is the PACL version of PowerArchiver as it is very flexible and easy to use. Below is an example of code that can be added to our PowerShell script to move the dump file to the archive folder, compress it, and then delete the uncompressed copy in preparation for running the next export.

# Move the existing export to the archive,
# compress the archived file,
# then delete the uncompressed copy
Move-Item $Source $ArchiveLoc -force
& E:\PACL\PACOMP.exe -a "$ArchiveLoc.zip" $ArchiveLoc
Remove-Item $ArchiveLoc -force
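If you are on PowerShell 5.0 or later, the built-in Compress-Archive cmdlet can do the same job without a third-party tool; a minimal sketch, reusing the $Source and $ArchiveLoc variables from the script above:

# Requires PowerShell 5.0+: compress the archived dump, then remove the uncompressed copy
Move-Item $Source $ArchiveLoc -Force
Compress-Archive -Path $ArchiveLoc -DestinationPath "$ArchiveLoc.zip" -Force
Remove-Item $ArchiveLoc -Force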

Security

I mentioned earlier in the article that the downside to automating exports is that the schema password is either hard coded in the script or passed as a parameter in the Task Scheduler. Either approach represents a security vulnerability. The dilemma is that you need the database password to get into the database, which means you cannot store a key or the password in the database itself. I have also considered several possibilities involving key-pair and one-time-key solutions, but they all leave the key exposed, making the encryption only a nuisance rather than a solution. I have settled on using PS2EXE to turn the completed PowerShell script into an executable file and then storing the source script in another secure location. This DOES NOT resolve the issue completely, as the script can be extracted from the .exe file; however, it does at least keep the password from casual observation. As I said earlier, perhaps you know a better way to do it and could let us know.
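One partial mitigation worth mentioning (a sketch only, and not a complete fix, since any process running as the task's account can still recover the password) is to store the schema password encrypted with Windows DPAPI rather than in clear text. Run the first step once, interactively, as the same account the scheduled task uses; the E:\Scripts\agile.pwd path is a placeholder:

# One-time step: capture the schema password and store it DPAPI-encrypted
Read-Host "Agile schema password" -AsSecureString | ConvertFrom-SecureString | Set-Content "E:\Scripts\agile.pwd"

# In the export script: read it back (only the same account on the same machine can decrypt it)
$secure = Get-Content "E:\Scripts\agile.pwd" | ConvertTo-SecureString
$SchemaPassword = (New-Object System.Net.NetworkCredential("", $secure)).Password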

Summary

Automating Agile schema exports is a useful tool for any administrator, whether it is done for backups or for refreshing other Agile PLM systems. The techniques I have demonstrated here can be applied to automate a variety of administrative tasks, saving time and improving their accuracy. PowerShell's robust language support makes it comparable to many other recognized scripting languages. Add PowerShell's capability to call other programs (such as EXE or BAT files) and its ability to utilize API calls from .NET, C and others, and you have a complete scripting package for your Windows administration tasks. Using PowerShell to automate tasks also allows the addition of Windows Event logging and some form of error monitoring based on the logs. You can also consider using something like PS2EXE to convert the completed script to an EXE file for some added security. If you have not yet tried PowerShell to script automation and repeated tasks in your Windows environment, now is the right time to give it a try and give up the old-school batch language for a more powerful and modern tool. Moving forward to Windows Server 2016, PowerShell is the interface for getting anything done. The full AgileExport_Template.ps1 script created in this blog is available here.

Resources

There are several excellent books on the market that cover PowerShell in depth, but there are also many resources available on the Internet, including many from Microsoft. These include the official PowerShell Blog and the "Hey, Scripting Guy!" blog. In addition, a Google or Bing search for "powershell tutorial" will return many useful links to sites offering PowerShell training and getting-started guides.

Microsoft has an extensive library of cmdlets that are built into PowerShell, and many more are available from the PowerShell community. You can learn more about the available cmdlets by visiting the Microsoft PowerShell area on MSDN (Microsoft PowerShell). Microsoft also makes available a Basic Cookbooks Reference that is very helpful, as well as a free PowerShell Integrated Scripting Environment (known as PowerShell ISE), which allows a user to directly run statements and see the results all in the same windowed environment.

Of course there are other scripting/development environments (including Visual Studio) available for PowerShell. If you're just looking for a more general purpose smart editor for PowerShell scripts, you can try Notepad++, ATOM, SublimeText, PowerShellPlus, PrimalScript or the aforementioned PowerShell ISE, which is free and very well supported by Microsoft. Of the others mentioned, only Notepad++, PowerShell ISE and ATOM are free, though the rest offer free trials, so try them all and see what you like. I tend to use Notepad++ on all of my servers for a variety of purposes, so it is always there. I have begun to use the PowerShell ISE when developing more complex scripts, and I am actively using ATOM as it also does an excellent job of supporting Python. ATOM is also useful because it has Windows, Linux and Mac OS versions, and I live (for the most part) in Mac OS, although my favorite Mac OS editor is BBEdit. Several of these choices also support Git or GitHub source code repositories if that is part of your workflow.


Author Bob McDuffee, Certified Ethical Hacker (CEH), has over 30 years of experience and is a System Engineer for Zero Wait-State. He is responsible for installing software for clients and overseeing hosted and virtual environments. He provides configuration information for customers and debugs hardware issues both for clients and for the company internally. His experience includes implementing, troubleshooting and upgrading PDM systems on Linux, Solaris and Windows servers utilizing both WebLogic and Oracle Application Server.

 
