Your DB admins and product owners will help you find and collect the data you need. If you don’t already have all this information collated in a spreadsheet, create one. The Oracle Server Worksheet (OSW) is a great place to start. A bonus: in the event of an Oracle audit you won’t have to waste time filling it in; you’ll already have one that you can trust.
If you don’t have the OSW template, don’t worry, you can create a similar spreadsheet capturing the same information. The essential columns you need to have in your spreadsheet are:
# Host Information
- Physical Cluster Name
- Physical Server Name
- Virtual Server Name
- Partitioning Technology
- DB Instance Name
- DR Server
- DR Type
- Operating System
# CPU Information
- Server Model
- Processor Model
- Processors (Sockets)
- Cores per Processor
- Physical Cores
- Threads per Core
# Software Information
- Product Version
- License Metric Allocated
- Restricted Use
- Application Name
- Application Vendor
- Application Type
- Architecture Type
- User Type
- Web/App Tier Server Name
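If you are building the worksheet from scratch, the columns above can be laid out as a simple CSV template. A minimal sketch in Python (the column names are taken from the list above; the file name is just an example):

```python
import csv

# Worksheet columns as described above (names are illustrative,
# adjust them to match your own OSW-style template)
COLUMNS = [
    # Host information
    "Physical Cluster Name", "Physical Server Name", "Virtual Server Name",
    "Partitioning Technology", "DB Instance Name", "DR Server", "DR Type",
    "Operating System",
    # CPU information
    "Server Model", "Processor Model", "Processors (Sockets)",
    "Cores per Processor", "Physical Cores", "Threads per Core",
    # Software information
    "Product Version", "License Metric Allocated", "Restricted Use",
    "Application Name", "Application Vendor", "Application Type",
    "Architecture Type", "User Type", "Web/App Tier Server Name",
]

def write_template(path: str) -> None:
    """Create an empty worksheet containing only the header row."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerow(COLUMNS)

write_template("oracle_server_worksheet.csv")
```

From here, each database host gets one row, which also makes the data easy to feed into whatever analysis tooling you end up using.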
We’ll go through how you can collect all this information later in this article.
Next, you’ll need to get the Oracle LMS (License Management Services) scripts. These are Oracle’s property, and the easiest way to get them is by requesting them from the Oracle LMS team. There is understandable fear that such a request might trigger an audit, but that’s a calculated risk you’ll have to weigh. Most companies we’ve been in touch with have already been audited by Oracle at least once, so procuring the scripts is typically not an issue. Alternatively, you could get in touch with your preferred SAM consultant; they might even help you down the line with making sense of all the data you’ve been collecting.
So you have your spreadsheet and your scripts ready. Before you start tracking down the people who have the permissions to run the scripts for you, let’s get one more thing out of the way: collecting infrastructure data. Oracle license metrics are all processor related, so the licensing implications of the underlying infrastructure are major. One surprise VMware cluster can seriously change your outlook on life. Especially if you find it during an audit.
Time to get in touch with your infrastructure teams. Most of the time they will have reports available that can provide a lot of the information you need. If you’re talking to virtualisation teams, make sure to capture the complete topology. For a VMware estate, for example, that typically means the full hierarchy:
- vCenter instance
- Datacenter
- Cluster
- Physical host
- Virtual machine
This topology may be different depending on the types of virtualisation used, and you may have to adjust your worksheet to hold all this information. Study the Oracle partitioning policy and make sure you can provide reliable information for all required metrics.
To complete the infrastructure mapping, insert every database host into the spreadsheet, along with the additional fields. The general principle when licensing Oracle is that the smallest hardware-limited resource that can run the software must be licensed. A hardware limitation used to mean that a physical change would be required to increase the number of processors assigned to a partition. Nowadays, a hardware limitation is defined in Oracle’s hard partitioning policy. Make sure you capture any evidence needed to demonstrate that your resources are correctly partitioned (CPU pools, VMware screenshots etc.).
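To make the stakes concrete, here is a rough sketch of the processor metric: licenses are calculated as physical cores multiplied by the core factor, rounded up. This is an illustration only; the 0.5 factor shown is the commonly published value for Intel x86, and your actual entitlements depend on your contract and on Oracle’s current core factor table:

```python
import math

def processor_licenses(physical_cores: int, core_factor: float = 0.5) -> int:
    """Processor-metric licenses: physical cores x core factor, rounded up.

    0.5 is the commonly published core factor for Intel x86; always check
    the current Oracle Processor Core Factor Table for your hardware.
    """
    return math.ceil(physical_cores * core_factor)

# With soft partitioning (e.g. a plain VMware cluster), every physical
# core in the cluster typically has to be counted, not just the vCPUs
# assigned to the Oracle VMs.
cluster_hosts = [16, 16, 24]  # physical cores per host (example figures)
print(processor_licenses(sum(cluster_hosts)))
```

This is why one surprise cluster matters: the whole cluster’s cores may count, not just the one VM you knew about.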
Disaster Recovery (DR) is another common pitfall, so make sure you include that in your reporting. You’ll need to know what the DR servers are, what resources they have allocated and the type of setup.
After you have completed this step, you can go ahead and start running the Oracle LMS scripts. Prioritise large servers and servers hosted on VMware clusters. Work your way through the spreadsheet and collect Review Lite outputs for all databases.
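If you have shell access to the hosts, you can queue the runs from your worksheet. A sketch of building (not executing) the per-host commands; the host names, the `oracle` user, and the `sqlplus` invocation of `ReviewLite.sql` are placeholders to adapt to however the LMS scripts were actually delivered to you:

```python
def build_run_commands(hosts, script="ReviewLite.sql"):
    """Return one ssh command per host that would run the LMS script.

    Nothing is executed here; the user, host names and sqlplus call are
    illustrative placeholders -- follow the instructions shipped with
    the scripts you received from Oracle.
    """
    return [
        f"ssh oracle@{host} 'sqlplus -S \"/ as sysdba\" @{script}'"
        for host in hosts
    ]

# Prioritise the big VMware-hosted servers from your worksheet first.
for cmd in build_run_commands(["vmclusterdb01", "vmclusterdb02"]):
    print(cmd)
```

Keeping the run list in code (or in the spreadsheet itself) also gives you a checklist of which databases still need an output collected.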
As you gather your script outputs, you’ll need to start analysing them. This is where a tool or a consultant may be instrumental. However, if you feel you’re up for the challenge, you can start analysing them manually. For a relatively small number of databases (fewer than 50) you can complete this step in a couple of weeks, also accounting for time spent running the scripts. Using a tool for the data crunching will at least halve the hours spent analysing.
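Even a manual analysis benefits from a little scripting. A sketch of a first pass over the collected outputs, tallying how many databases report each option in use; the file layout and column names (`OPTION_NAME`, `CURRENTLY_USED`) are assumptions, so adjust them to match the actual Review Lite outputs you receive:

```python
import collections
import csv
import glob

def summarise_options(pattern="lms_output/*_options.csv"):
    """Tally how many databases report each option as currently used.

    Assumes one CSV per database with OPTION_NAME and CURRENTLY_USED
    columns -- the real Review Lite layout may differ, so adapt the
    field names to your outputs before relying on the numbers.
    """
    usage = collections.Counter()
    for path in glob.glob(pattern):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if row.get("CURRENTLY_USED", "").upper() == "TRUE":
                    usage[row["OPTION_NAME"]] += 1
    return usage
```

A tally like this quickly surfaces the expensive surprises (Partitioning, Diagnostics Pack and friends) that deserve your attention first.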
In our next post we’ll talk about the data analysis process, so stay tuned.