Friday, January 27, 2012

Noetix: Reporting AP Credit Card Transactions

If you are, like the company I work at, a user of both Oracle Payables and Oracle Internet Expenses, you'll realise that the line between them is incredibly blurred. Especially if you have the situation where some departments/areas use i-Expenses but other areas have an Excel-based expenses system (which is then sent to Finance, who enter the data directly into Payables).

Oracle have made the situation a little worse by licensing Internet Expenses separately to Payables and it looks very much like Noetix have continued the trend.

What makes the Noetix decision even stranger is that the majority of the data for Internet Expenses is stored in the same tables as Payables in Oracle.

After looking at what was offered by the Noetix "Internet Expenses" module and our own reporting requirements the only "gap" that could be identified was for reporting on Credit Card Transactions (primarily around AP.AP_CREDIT_CARD_TRXNS_ALL).

Our reporting requirements are pretty simple, here is a simple list (with properties) of the columns we'd like to report on related to Credit Card Transactions;


Converting these requirements into a Noetix View took a fair bit of trial and error but the script is available here. As it's quite a large one I won't be copy/pasting it below (as I normally would).

Any questions/ suggestions feel free to post a comment.

Monday, January 23, 2012

Submitting an e-Petition to Cambridgeshire County Council

This blog post gives simple step-by-step instructions on how to create your first e-Petition on Cambridgeshire County Council's (CCC) e-Petition website.

Go to the website;

Cambridgeshire County Council's e-Petitions Website

Select the "Register" link (top-right, highlighted above);

CCC e-Petition Website Registration Pages
 As an alternative to creating an account you can always just login with your accounts from any of the following;
CCC e-Petition Website Supported Sign-In Providers
You will still have to complete your contact details, but you will be able to use your Twitter/ Yahoo/ OpenID/ Google/ Aol account to login to the website.

Once you have registered (or if you just login on the main screen) you will see the following page;

CCC e-Petitions Website "My Activities" Page
Click on "Add a petition" on the left;

CCC e-Petition Website "Add a petition" Page  
You now need to complete the petition details. The following notes may be useful;

  • What do you want to achieve? If you want to present your petition at a meeting of the Full Council then you should look at the County Council's website and pick an end-date at least two weeks before a meeting (while still allowing yourself enough time to collect signatures)
  • Is this just an on-line petition, or do you want the ability to submit paper signatures as well? Technically there is no reason not to tick *both* check-boxes, as you can always submit zero signatures of the type you don't use - and if you change your mind later it's a lot easier if you checked the box at this stage!
  • You will be contacted by an Officer from the County Council within a couple of days of creating your petition. It's vital your email address is up to date (and you might want to check your Spam Folder just in case!)
  • If you are submitting a petition on behalf of an organisation or group it's vital that you agree a wording with them in advance - this will solve a lot of problems in the long run
  • Stay away from politics (unless that's your aim). Saying how rubbish the Conservative administration is at something is unlikely to convince a) them to change their minds, or b) their supporters to sign your petition. Equally, declaring what a triumph Liberal Democrat, Labour, Green, or UKIP policy is in a specific area is unlikely to motivate people of a different political persuasion to sign it, still less the Conservative administration to adopt it! The broader the appeal of your petition, the more likely people are to sign it and circulate it to their friends/colleagues!
Once your petition has been approved all you need to do now is collect signatures!

Friday, January 20, 2012

SSRS: Removing Blank Pages In Your Reports

If you find yourself writing reports for a truly international audience (i.e. US and EU) then you'll know about the nightmare of paper sizes when your users want to hit "Print". A larger, and much more significant, issue is the blank pages that sometimes get printed between or at the end of your reports.

This blog post lists a few things you might want to check to make sure your report is printing on as few pages as possible.

NOTE: If you are having problems printing matrix reports then there is a specific solution for you at the bottom (and an explanation of why you're seeing the blank pages).

Formatting For Printing
If you follow the steps to create a sample report here and then right-click the background of your created report and select "Report Properties";

Report Builder 3: Right-click in the Red Hatched Area for "Report Properties"
This will show you the properties for your report;

Report Builder 3: Report Properties > Page Setup
This is the "default" size created by the wizard. You'll notice that it has defaulted to Centimetres and A4 (the UK standard, as I'm in the UK) and that we have 2cm margins all round. These defaults (at least the Centimetres and A4) seem to be there because I've set them previously.

The major change I make on this screen is changing all the margins to 1.27cm which is just large enough for the printer to handle and gives over a centimetre of extra real-estate both vertically and horizontally. Of course another significant change is to switch the paper layout between Portrait and Landscape if that makes sense for your report!

Now assuming you are using A4, Portrait, and you have changed your margins to 1.27cm, then the maximum width for a label you will be able to fit onto a single page is 184.6mm (210mm total width - 2 * 12.7mm margins). To show this, create a label on the main report, set its location to 0,0 and make it 184.6mm wide. Put some right-aligned text (I'm using &ReportName) into the label;
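The margin arithmetic above generalises to any paper size. As a quick sanity check (plain Python, nothing to do with Report Builder itself), the widest element that fits on one page is the paper width minus both margins:

```python
def usable_width_mm(paper_width_mm: float, margin_mm: float) -> float:
    """Widest report item that fits on one printed page,
    given symmetric left/right margins."""
    return round(paper_width_mm - 2 * margin_mm, 1)

# A4 portrait (210mm) with 1.27cm margins:
print(usable_width_mm(210.0, 12.7))    # 184.6
# US Letter portrait (215.9mm) with the same margins:
print(usable_width_mm(215.9, 12.7))    # 190.5
```

Anything even 0.1mm wider than this spills onto a second (mostly blank) page.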

Report Builder 3: 184.6mm Wide Label (A4, Portrait)
Now run the report;

Report Builder 3: &ReportName as GUID
It's displaying the &ReportName value as a GUID (I haven't yet saved the report) but the point is that it's some way over to the right. To find out what this looks like on paper click the "Print Layout" button in the ribbon above the GUID;

Report Builder 3: Print Layout
As you can see the GUID is now at the very right of the page and the page count is 1. Just as a quick test if you enlarge the label to 184.7mm (adding just .1mm) and then re-run the report;

Report Builder 3: Page Count Increased To TWO
You can see that the total page count for the whole report has increased to two (with the second page actually looking blank!).

Additional Formatting Problems With Matrix Reports
As soon as you start dynamically adding columns based on new data in the query the risk of getting blank pages, if you are using fixed width headers, dramatically increases.

Once again follow the steps given here to create a report only this time use the SQL;

SELECT 'Red' COLOUR, 'A4 Folder' ITEM, 9 QUANTITY FROM DUAL UNION
SELECT 'Orange', 'Pencil', 23 FROM DUAL UNION
SELECT 'Cyan', 'Ruler', 21 FROM DUAL UNION
SELECT 'Orange', 'Rubber', 14 FROM DUAL UNION
SELECT 'Cyan', 'A4 Folder', 17 FROM DUAL UNION
SELECT 'Purple', 'Rubber', 14 FROM DUAL

And on the "Arrange fields" dialog instead of adding all three columns into the "Values" box add Colour to "Row Groups" and Item to "Column groups" as below;

Report Builder 3: Arrange Fields using Column and Row Groups
At the end of this process you will have something like this;
Report Builder 3: A Table With Column and Row Groups
When you run this report and look at the Print Layout (assuming you have selected A4 and 1.27cm margins in the Report Properties dialog - see above) you will get something like this;

Report Builder 3:

The point that might surprise you is if you look at the page count in the ribbon you will notice that it is TWO pages long and if you forward on to the second page it's blank.

If you look at the image below you can see the huge amount of white space to the right and below the table;

Report Builder 3: White Space
Now when the table expands the white space is ADDED, so in order to not have blanks to the right of and below the table we need to re-size the report body and remove as much of it as possible. After re-sizing this becomes;

Report Builder 3: Resized Report Minimising White Space
Now when you run the report and look at the Print Layout you will see that we are back down to a single page.

Now a little "quirk" of how this tidying up takes place is that if you expand the width of the title to 184.6mm and then re-run the report you'll find you're back to two pages. It seems that because the amount of white space is now significant it's actually being added back in again. The easiest way round this is to add a new column to the right of "Total";

Report Builder 3: New Column
Now mark the column visibility as "Hide" and re-run the report;

Report Builder 3: All On One Page
And now you're back to having your report on a single page.

Thursday, January 19, 2012

SSRS: Creating a Simple Report With An Embedded Dataset

The purpose of this blog post is to put together a simple guide to producing a "test" report. I'm doing this as a separate post so that I can re-use it in Other Posts (rather than having to include basic setup information every time).

Open Report Builder 3 and select "New ..." to trigger the New Report or Dataset wizard;

Report Builder 3: New Report or Dataset Wizard

"New Report" will be automatically selected on the left; on the right select "Table or Matrix Wizard" (the top item);
Report Builder 3: New Table or Matrix Report Wizard
At the bottom left select the "Create a dataset" radio button and then click "Next >" at the bottom right;

Report Builder 3: Choose a Data Source

From the list you need to select a Data Source connection to use and then click "Next" at the bottom right;

Report Builder 3: Design A Query
Now you can enter the SQL. As this is just a simple test I'm going to use the following (Oracle) SQL;
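The original query hasn't survived here, but based on the description that follows (a single row with today's date plus the first and last days of the current month) it would have been something along these lines (column names are illustrative):

```sql
SELECT TRUNC(SYSDATE)            TODAYS_DATE,     -- today, time stripped
       TRUNC(SYSDATE, 'MM')      FIRST_OF_MONTH,  -- 1st of the current month
       LAST_DAY(TRUNC(SYSDATE))  LAST_OF_MONTH    -- last day of the current month
  FROM DUAL
```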


This simple piece of SQL just gives us a single row with today's date as well as the first and last days of the current month. Click the Run button (red exclamation mark above where you entered the SQL) to check the SQL works;
Design A Query: Testing the SQL
Once everything works click "Next";

Report Builder 3: Arrange Fields
The three fields in the SQL we've just added are in the "Available Fields" box on the left of the dialog. Drag/Drop them into the "Values" box at the bottom right;

Report Builder 3: Arrange Fields
Click "Next";

Report Builder 3: Choose the Layout
We don't want to make any changes here so just click "Next";

Report Builder 3: Choose a Style
Similarly here we don't want to make any changes so just click "Finish".

The report has now been completely generated and you will be presented with something similar to;

Report Builder 3: Fully Generated Report
Click on "Run" at the top left (to test the report);

Report Builder 3: Testing Generated Report

And you're done ....

Wednesday, January 18, 2012

SSRS: Adding Calculated Fields To Data Sets

This blog post covers an example of how to add a simple calculated field to a Dataset in SQL Server Reporting Services using Report Builder 3 (against SQL Server 2008R2). It isn't a massively complicated process except that Microsoft seem to be assuming absolutely no-one does this as all the "nice" features of expressions when working with Reports (i.e. being able to select fields by double-clicking a list) seem to be unavailable.

The easiest thing to do is to start Report Builder 3 and select "New Dataset" from the Getting Started wizard;
Report Builder 3: Getting Started Wizard
Select an active data source connection (for the purposes of this example I'll be connecting to an Oracle data source). Once you've selected a data source you'll be presented with the editor window;

Report Builder 3: SQL Editor Window
Now enter the following simple SQL;
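The query itself is missing here; a minimal (Oracle) query matching the description that follows - a single row with VALUE1 = 1 and VALUE2 = 2 - would be:

```sql
SELECT 1 VALUE1,
       2 VALUE2
  FROM DUAL
```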


This is a very simple piece of SQL that will just return a single row with two columns called VALUE1 and VALUE2 which contain the values 1 and 2;

Report Builder 3: Sample SQL with Result
Now we are going to add two Calculated Columns;

  • VALUE3 = VALUE1 + VALUE2, and
  • VALUE4 = VALUE1 / VALUE2
Click on "Set Options" (in the ribbon bar at the top of the window);

Report Builder 3: Shared Dataset Properties
Click "Add" and select "Calculated Field" from the drop down that appears;

Shared Dataset Properties: New Field
Enter the values;

  • VALUE3 (Field Name)
  • =Fields!VALUE1.Value + Fields!VALUE2.Value (Field Source)
  • VALUE4 (Field Name)
  • =Fields!VALUE1.Value / Fields!VALUE2.Value (Field Source)

And click "OK" to apply the change.

Re-run the query and you'll notice that your two new fields are NOT displayed - the query designer only shows the columns returned by the SQL itself. This is slightly unexpected. If the expression you have entered is incorrect, however, you will see an error message like;

Report Builder 3: Invalid Expression Error
The actual text of the error message is;

The Value expression for the field ‘=Fields!VALUE1.Value1 / Fields!VALUE2.Value’ contains an error: [BC30456] 'Value1' is not a member of 'Microsoft.ReportingServices.ReportProcessing.ReportObjectModel.Field'.

You'll notice that I had just added a "1" after the Value property name (to deliberately trigger the error).

Now you've saved the dataset, close and re-open Report Builder; this time from the wizard select "New Report" and then "Table or Matrix Wizard", select the dataset you've just saved, click "Next" and you will see;

Report Builder 3: New Table or Matrix Wizard Screen
Which shows the four available fields. Select them all (i.e. move them into the Values box) and then click "Next" all the way through to the generated report and then run it;

Report Builder 3: Successfully Showing Four Fields

Now, as a final check, if you go through into the Dataset Properties of the Dataset you've just added and look at the fields;

Report Builder 3: Dataset Properties
You can see that the new fields have been added from the Dataset and that the calculation is being done in the dataset (otherwise Field Source would be an expression).

Monday, January 16, 2012

Google Knol: Final Farewell ...

So that's it, I've just migrated the last of my Google Knols back to Blogger (where I started creating posts many many years ago). To be honest it's a great shame - I can't see how, given Google's goal of getting the whole of human knowledge online, they can justify closing the service. It was convenient, easy to use, and gave me the feeling that I was contributing to something bigger than, say, a blog.

The bit I find most surprising is the lack of loyalty Google seems to be expecting from the many people who used Google Knol. I, for one, have no interest in transferring documents I have spent literally years creating to a non-Google service - even one recommended by Google. Not providing a migration path to Blogger is incredible - I have literally no idea what they were thinking!

Anyway, here's to you Google Knol, it was fun while it lasted and you will, in my household at least, be sorely missed ...

Oracle EBS: Creating New Menu Items in Oracle e-Business Suite

NOTE: Don't do this on a production environment. Did that need saying? Apparently one person who submitted a comment seemed to think so ... You really can completely mess it up. Run it on a test environment and MAKE SURE IT WORKS before you run it anywhere else. The script below was written against 11i; I would be loath to run it against a different version. The point of this post is to allow you to script a change to 11i - you might be better off just doing this in the UI if you're not managing lots of instances. Anyway ... you have been warned!

This blog post takes you through a step-by-step guide to adding a new menu item (that will punch out to this Knol) to the root menu of an existing responsibility using the Oracle APIs (so the change can be scripted rather than done in the forms).

A completed example script (with error checking and reporting) is included.

Stage 1: Getting The Responsibility Details
In order to add a new menu item you need to know which set of menus your responsibility is currently using. To find this out you need to go into the e-Business Suite, choose the "System Administrator" Responsibility, and then under Security > Responsibility choose "Define".

Search for the Responsibility you wish to use. For the purposes of this example I'm going to use the "Alert Manager" responsibility as it should be one that is installed on every instance and will have a fairly limited user base.

When you view the Responsibility you will see something like this;

Figure 1: "Alert Manager" Responsibility
The important piece of information on this screen is the "Menu" (in the middle). You can see that the responsibility is using the "ALR_OAM_NAV_GUI" Menu as its root. We'll add the new menu item in here.

Stage 2: Adding a Simple HTML-redirect Script To Oracle
See Linking Directly to Microsoft Reporting Services from Oracle e-Business Suite (Stage 2)

Stage 3: Creating a Function using FND_FORM_FUNCTIONS_PKG.INSERT_ROW
This API provides a quick way of creating records in the FND_FORM_FUNCTIONS set of tables. In order to use the API you first need to get a new ID from the FND_FORM_FUNCTIONS_S sequence. As we're going to be doing nothing more than a simple punch-out to Google, the API call will look something like this;

FND_FORM_FUNCTIONS_PKG.INSERT_ROW(
  x_rowid                    => v_RowID,
  x_function_id              => v_Id,
  x_web_host_name            => null,
  x_web_agent_name           => null,
  x_web_html_call            => 'verysimpleredirect.html?redirect=',
  x_web_encrypt_parameters   => 'N',
  x_web_secured              => 'N',
  x_web_icon                 => null,
  x_object_id                => null,
  x_region_application_id    => null,
  x_region_code              => null,
  x_function_name            => 'GOOGLEKNOL',
  x_application_id           => null,
  x_form_id                  => null,
  x_parameters               => null,
  x_type                     => 'JSP',
  x_user_function_name       => 'Google Knol Viewer', --:x_user_function_name,
  x_description              => 'Google Knol Viewer', --:x_description,
  x_creation_date            => sysdate,
  x_created_by               => 0,
  x_last_update_date         => sysdate,
  x_last_updated_by          => 0,
  x_last_update_login        => -1,
  x_maintenance_mode_support => 'NONE',
  x_context_dependence       => 'RESP',
  x_jrad_ref_path            => null);

I've called the function "GOOGLEKNOL" and it's being created by the System Administrator (if you look in FND_USER it has ID 0). If you have disabled this user then it's best to create it as someone else. You can always use your own ID, but I prefer to distance myself from these created objects (it's one less thing to worry about if I ever leave my job and have to hand all this over to someone else!).

Unfortunately there seems to be a bug with this API in that the "Type" column (FND_FORM_FUNCTIONS.TYPE) does not appear to be written correctly to the database. In order to fix this you need to do a SQL update;

 update applsys.fnd_form_functions t
     set type = 'JSP'
   where function_id = v_ID;

Where v_ID is the ID you retrieved from the sequence earlier.

Stage 4: Associating Function with Existing Oracle Menu
This uses the FND_MENU_ENTRIES_PKG.INSERT_ROW API as published by Oracle to hook together the new menu item with the existing menu. In stage 1 we learnt that the menu we wish to alter is called "ALR_OAM_NAV_GUI" and by querying the FND_MENUS and FND_MENU_ENTRIES tables we can get the Menu ID and the next available menu sequence number as follows;

select fm.menu_id, max(entry_sequence) + 1
    from fnd_menus fm, fnd_menu_entries fme
   where fm.menu_name = 'ALR_OAM_NAV_GUI'
     and fm.menu_id = fme.menu_id
   group by fm.menu_id;

Using these values we can call the API;

FND_MENU_ENTRIES_PKG.INSERT_ROW(
  x_rowid             => v_RowID,
  x_menu_id           => v_MenuId,
  x_entry_sequence    => v_EntrySequence,
  x_sub_menu_id       => null,
  x_function_id       => v_Id,
  x_grant_flag        => 'Y',
  x_prompt            => 'Google Knol Viewer',
  x_description       => 'View Google Knol',
  x_creation_date     => sysdate,
  x_created_by        => 0,
  x_last_update_date  => sysdate,
  x_last_updated_by   => 0,
  x_last_update_login => -1);

Now we have created all the records we're almost there.

Stage 5: Running the "Compile Security" Concurrent Request
This is performed using the FND_REQUEST.SUBMIT_REQUEST concurrent request API;

FND_REQUEST.SUBMIT_REQUEST(
  application => 'FND',
  program     => 'FNDSCMPI',
  argument1   => 'No')

This is a function, so you'll need to do something with the returned value.
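A sketch of how that might look (the surrounding declarations are my own; SUBMIT_REQUEST returns zero when submission fails, and the request is only picked up by the concurrent manager after a commit):

```sql
declare
  v_RequestId number;
begin
  v_RequestId := FND_REQUEST.SUBMIT_REQUEST(application => 'FND',
                                            program     => 'FNDSCMPI',
                                            argument1   => 'No');
  if v_RequestId = 0 then
    -- submission failed; stop rather than carry on silently
    raise_application_error(-20001, 'Failed to submit Compile Security request');
  end if;
  commit; -- the request is not visible to the concurrent manager until committed
end;
/
```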

After completing these steps you'll find that when you log in and switch to the "Alert Manager" responsibility you will have a new menu item, and clicking on it will bring up this Knol;

Figure 2: Completed System 
A script to perform these changes automatically (with additional error checking and reporting) is available by clicking here.

Saturday, January 14, 2012

Enabling Auto-Login Under Windows 7

First of all, this is a MASSIVE security hole if you choose to enable it - anyone getting their hands on your machine will have full control over everything (logged into your bank? your Facebook account? your photos? the lot).

There are, however, very good reasons to actually do this; for example I have set up a Windows 7 machine which just runs iTunes because that's all I need - clearly it's pointless to password-protect this as all my data is on a (non-protected) NAS drive.

So here are the instructions;
1) Go to the start menu
2) type into the "search programs and files" text box the words; "control userpasswords2" and hit return
3) This dialog will appear;
User Accounts Dialog (Windows 7)
4) Make sure the checkbox at the very top (titled "Users must enter a user name and password to use this computer") is unchecked. You will then be prompted to enter your password.

And that's it, when you reboot your machine you will no longer be prompted for a password.

Friday, January 13, 2012

Oracle EBS: Cloning Script To Clear Email Addresses

We use the following cloning script to clear down email addresses on each of the cloned systems we create from our production EBS system. In exchange we have set up a mailbox which all the test systems feed into, and each time we create a new clone we add an email address to the mailbox so that it receives emails from the new clone (this way you can filter the messages in the mailbox to get the ones relevant to the clone system you are looking at).

The script itself sends an email summarising what it has done to the mailbox (so we know when the system was last cloned), and it also includes a check to make sure it's not being run against production (PROD) by accident!

The actual script (which was written against Oracle version 12.1.3) is below;

declare
  c_EMAIL constant varchar2(60) := '%DATABASE%_APPSTEST@<your domain>';

  v_Database v$Database.NAME%TYPE;
  v_Email    varchar2(100);
  v_Body     CLOB;

  procedure addToBody(p_Text in varchar2) as
  begin
    dbms_lob.writeappend(v_Body, length(p_Text) + 4, p_Text || '<br>');
  end;
begin
  dbms_lob.createtemporary(lob_loc => v_Body, cache => False);

  addToBody('0010 Checking database instance ... ');
  select name into v_Database from v$Database;
  addToBody('0020 Database instance = "' || v_Database || '"');
  if v_Database = 'PROD' then
    raise_application_error(-20001,
                            'This script must NOT be run against PROD.');
  end if;

  v_Email := replace(c_EMAIL, '%DATABASE%', v_Database);
  addToBody('0100 All emails will be set to "' || v_Email || '"');

  addToBody('0110 Updating AP supplier contacts (AP_SUPPLIER_CONTACTS) records ... ');
  update AP_SUPPLIER_CONTACTS
     set EMAIL_ADDRESS = v_Email
   where EMAIL_ADDRESS is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0120 Updating AP supplier site details (AP_SUPPLIER_SITES_ALL) records ... ');
  update AP_SUPPLIER_SITES_ALL
     set EMAIL_ADDRESS = v_Email
   where EMAIL_ADDRESS is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) ||
            ' records updated (EMAIL_ADDRESS)');
  update AP_SUPPLIER_SITES_ALL
     set REMITTANCE_EMAIL = v_Email
   where REMITTANCE_EMAIL is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) ||
            ' records updated (REMITTANCE_EMAIL)');

  addToBody('0130 Updating User (FND_USER) records ... ');
  update fnd_user
     set email_address = v_Email
   where email_address is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0140 Updating Vendor Contact Points (HZ_CONTACT_POINTS) records ... ');
  update HZ_CONTACT_POINTS
     set EMAIL_ADDRESS = v_Email
   where EMAIL_ADDRESS is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0150 Updating Parties (HZ_PARTIES) records ... ');
  update HZ_PARTIES
     set email_address = v_Email
   where email_address is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0160 Updating external payees (IBY_EXTERNAL_PAYEES_ALL) records ... ');
  update IBY_EXTERNAL_PAYEES_ALL
     set remit_advice_email = v_Email
   where remit_advice_email is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0170 Updating Employee (PER_PEOPLE_X) records ... ');
  update PER_PEOPLE_X
     set email_address = v_Email
   where email_address is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0180 Updating points (RA_CONTACT_PHONES_INT_ALL) records ... ');
  update RA_CONTACT_PHONES_INT_ALL
     set EMAIL_ADDRESS = v_Email
   where EMAIL_ADDRESS is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0190 Updating Supplier (PO_VENDOR_SITES_ALL) records ... ');
  update PO_VENDOR_SITES_ALL
     set email_address = v_Email
   where email_address is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');


  -- Send an e-mail to the mailbox so you know when it was last refreshed.
  napp_emailmanager.addEmail(p_From    => v_Email,
                             p_To      => v_Email,
                             p_Subject => v_Database || ' refreshed on ' ||
                                          to_char(SYSDATE, 'DD-MON-YYYY'),
                             p_Body    => v_Body);

  commit; -- make the updates (and the queued email) permanent
end;
/

Have I missed anything?

Tuesday, January 10, 2012

Noetix: XU2/XU5 Scripts run but generate no logfiles!

This problem has been vexing me for the last 30 minutes. After I found the solution I thought I'd post it here just in case other people are experiencing the same issue!

The first thing to do is to open the file for which you aren't seeing a logfile and check to make sure that the first line begins with;

@utlspon <filename>

Where the name of the file is the same as the file you're running without any extension.

Assuming everything in your file is correct then you need to open the *previous* file in XU2/XU5 and make sure that it ends with;

@utlspoff
If this is missing the log details for your script will just be appended to those of the previous script.

Now assuming that you have used @utlspon/off correctly and you're still not seeing a log file and you're certain your script is running (the situation I found myself in) then check to see if you have used;
SET SCAN OFF
In any of your files. If you turn this off (usually so you can insert some special characters into a field) then you must make sure that before you call @utlspoff you turn it back on by executing SET SCAN ON. If you don't then, for some reason, you no longer get logfiles.
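The pattern to follow in any affected script is therefore (a sketch; the insert statement and table name are purely illustrative):

```sql
SET SCAN OFF

-- with SCAN OFF the '&' is inserted literally rather than
-- prompting for a substitution variable
insert into SOME_TABLE (SOME_COLUMN)
values ('Marks & Spencer');

SET SCAN ON

@utlspoff
```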

This proved to be the cause of the problem we were seeing.

Monday, January 9, 2012

Noetix: Removing Item Information Caching @ Release 12

We use Oracle E-Business Suite to manage our (pharmaceutical) production facility. Because of the nature of the work we do, we create items, batches, and lots every single day of the week. Normally this isn't a problem, but in Noetix views item information is cached in the N_KFF_SYS_Item table and is only refreshable using a supplied concurrent process.

Clearly as we're creating multiple items every day it's just not practical for us to add a "Now run the concurrent process" step across our business to make sure that the reporting information is up to date for those running reports. It also creates an issue of "trust" - if users are looking at cached data is it up to date or not?

Looking at the source code for the Noetix views which access the cached information, rather than going directly to the cached table they go via a view, XXK_Sys_Item, which just does a direct select from the table (I'm making a point here of not copy/pasting blocks of Noetix source code, so you'll find a lot of this is instructions on how to find the code rather than the code itself). The benefit of this approach is that if we want to stop using caching we can simply replace the view with a new one that goes directly to the base tables. You may be thinking it would be easier to replace the references to the view in the Noetix configuration tables, but unfortunately this isn't possible: if you look in the table N_View_Table_Templates (or just N_View_Tables) for the view_label INV_Items you will notice that the only Noetix table in use is N_MFG_LOOKUPS_VL - no mention of the XXK_Sys_Item view. This must be added "elsewhere".

Doing a search of the source code (.sql) files in the Noetix install directory for the table name reveals that the regenerate process builds a package called N_KFF_SYS_Item_Pkg to handle the management of the cached table.

So in summary here is the structure (with INVG0_Items being the public view we're going to use for testing);

Looking in more detail at the N_Kff_SYS_Item_Pkg package we can see that the routine called Init_Upld (which is the basis for the concurrent request) does the initial upload of rows into the table - it's the SQL to do this that we need to build into a view to replace the XXK_Sys_Item view.

In the source code (in the N_KFF_Sys_Item_pkg.sql file) I'm looking at line 145, which does the insert (it reads "INSERT  /*+ APPEND */ INTO N_KFF_Sys_Item" if you want to search for it in your source code). The insert is a simple INSERT ... SELECT, which makes life a lot easier when you're looking at creating a view!

Looking at the columns inserted, not all of them make it through to the view (last_update_date, for instance), so those columns aren't needed in our replacement view. Clearly some clever logic has gone into building this view to pick up multiple segments and structures, but in our system much of this is wasted as the data seems to default to a single value (probably explaining why we feel caching is unnecessary). Looking at this code, the SQL we need to build the replacement view is;

       msib.SEGMENT1 CV$System_Item_KFF,
       msib.SEGMENT1 SV$ITEM,
       'Item' SEGMENT_NAME_LIST,
       101 STRUCTURE_ID,
       'System Items' STRUCTURE_NAME,
       null Z$XXK_SYS_ITEM

Clearly this is quite a substantial change, so we need a way to back it out quickly if substantial problems are found after go-live (low-level testing has been done, but this will never cover 350+ reports). The script therefore renames the existing view rather than just overwriting it, so we can do a quick rollback*. Here is the script;

@utlspon xu6_replace_xxk_sys_item_view

begin
  -- Remove an existing backup view if one is left over from a previous run
  for v_Data in (SELECT 'x'
                   FROM ALL_VIEWS av
                  WHERE av.owner = USER
                    AND av.view_name = 'XXK_SYS_ITEM_BK') loop
    execute immediate 'drop view XXK_SYS_ITEM_BK';
  end loop;

  -- Rename the existing cached view so we can roll back quickly if needed
  execute immediate 'rename XXK_SYS_ITEM to XXK_SYS_ITEM_BK';

  -- Create the replacement view directly against the Oracle base table
  execute immediate 'CREATE VIEW XXK_SYS_ITEM AS ' ||
                    'SELECT msib.SEGMENT1 CV$System_Item_KFF, ' ||
                    '       msib.SEGMENT1 SV$ITEM, ' ||
                    '       ''Item'' SEGMENT_NAME_LIST, ' ||
                    '       101 STRUCTURE_ID, ' ||
                    '       ''System Items'' STRUCTURE_NAME, ' ||
                    '       null Z$XXK_SYS_ITEM ' ||
                    '  FROM APPS.MTL_SYSTEM_ITEMS_B msib';

  -- Recompile any invalid views under the NOETIX_SYS schema (that are not set to be omitted)
  dbms_output.put_line('Recompiling invalid views under the NOETIX_SYS schema');
  for v_SQL in (select 'ALTER VIEW NOETIX_SYS.' || do.object_name || ' COMPILE' text
                  from dba_objects do
                 where do.status = 'INVALID'
                   and do.OWNER = 'NOETIX_SYS'
                   and do.OBJECT_TYPE = 'VIEW'
                   and do.OBJECT_NAME not in
                       (select view_name
                          from n_views
                         where nvl(omit_flag, 'N') = 'N')) loop
    dbms_output.put_line('  ' || v_SQL.Text);
    execute immediate v_SQL.Text;
  end loop;
end;
/

@utlspoff



This is available as a file here (via Google Docs).

You then need to add a call to the new file in XU6.
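For us that meant adding a single line to the XU6 customisation script (wnoetxu6.sql on a standard NoetixViews install - check the name on yours):

```sql
-- In the XU6 post-generation customisation script (wnoetxu6.sql on a
-- standard install), alongside any other custom calls:
@xu6_replace_xxk_sys_item_view
```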

*- Just drop the new view and reverse the "RENAME ..."
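The rollback itself is just two statements (using the backup name from the script above):

```sql
-- Back out the change: remove the replacement view and restore the original
DROP VIEW XXK_SYS_ITEM;
RENAME XXK_SYS_ITEM_BK TO XXK_SYS_ITEM;
```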

UPDATE (11-JAN-2012): I've added a section to the script to do a COMPILE of the views that are invalidated by this change. Normally it wouldn't be required, as the first query against an invalid view will automatically recompile it - however we have a check in XU6 for invalid views, so they need to be valid by that stage, otherwise our regenerate fails.
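A quick way to confirm the recompile step has left nothing behind is to re-run the invalid-view check on its own:

```sql
-- List any Noetix views still invalid after the recompile step
SELECT do.object_name
  FROM dba_objects do
 WHERE do.owner = 'NOETIX_SYS'
   AND do.object_type = 'VIEW'
   AND do.status = 'INVALID';
```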

Friday, January 6, 2012

Noetix: Checking Incremental Refresh

The following script will generate a piece of SQL that will check the current status (and any pending records) for the Noetix Key Flex Field (KFF) Incremental Refresh process;

begin
  for v_Kff in (SELECT N.*,
                       (select max(data_table_key) from n_f_kff_flex_sources) as max_data_table_key
                  FROM n_f_kff_flex_sources N
                 ORDER BY N.data_table_key) loop
    dbms_output.put_line('SELECT ''' || v_kff.kff_view_description ||
                         ''' kff_view_description, ');
    dbms_output.put_line('       ''' || v_kff.target_value_object_name ||
                         ''' kff_cache_table_name, ');
    dbms_output.put_line('       ''' || v_kff.data_application_table_name ||
                         ''' oracle_cache_source_table, ');
    dbms_output.put_line('       (SELECT ut.status ');
    dbms_output.put_line('          FROM user_triggers ut ');
    dbms_output.put_line('         WHERE ut.trigger_name = ''' ||
                         UPPER(v_kff.cd_trigger_name) ||
                         ''') trigger_status, ');
    dbms_output.put_line('       (SELECT COUNT(*) FROM ' ||
                         v_kff.target_value_object_name ||
                         ') current_kff_cached_rowcount, ');
    dbms_output.put_line('       (SELECT COUNT(*) FROM APPS.' ||
                         v_kff.data_application_table_name ||
                         ') current_oracle_source_rowcount, ');
    dbms_output.put_line('       (SELECT COUNT(*) FROM ' ||
                         v_kff.cd_table_name || ') pending_cd_rowcount, ');
    dbms_output.put_line('       (SELECT COUNT(*) FROM ' ||
                         v_kff.cde_table_name ||
                         ') pending_cde_rowcount, ');
    dbms_output.put_line('       (SELECT COUNT(*) FROM ' ||
                         v_kff.cdeh_table_name ||
                         ') pending_cdeh_rowcount ');
    dbms_output.put_line('FROM DUAL ');
    -- Every query except the last needs a UNION ALL to stitch the results together
    if (v_Kff.Max_Data_Table_Key != v_Kff.Data_Table_Key) then
      dbms_output.put_line('UNION ALL ');
    end if;
  end loop;
end;
/

On our system this generates the following SQL;

SELECT 'Accounting Key Flexfield' kff_view_description,
       'N_KFF_GL_Acct' kff_cache_table_name,
       'GL_CODE_COMBINATIONS' oracle_cache_source_table,
       (SELECT ut.status
          FROM user_triggers ut
         WHERE ut.trigger_name = 'N_CDT_GL_ACCT') trigger_status,
       (SELECT COUNT(*) FROM N_KFF_GL_Acct) current_kff_cached_rowcount,
       (SELECT COUNT(*) FROM APPS.GL_CODE_COMBINATIONS) current_oracle_source_rowcount,
       (SELECT COUNT(*) FROM N_CD_GL_Acct) pending_cd_rowcount,
       (SELECT COUNT(*) FROM N_CDE_GL_Acct) pending_cde_rowcount,
       (SELECT COUNT(*) FROM N_CDEH_GL_Acct) pending_cdeh_rowcount
FROM DUAL
UNION ALL
SELECT 'Item Categories Key Flexfield' kff_view_description,
       'N_KFF_Mtl_Cat' kff_cache_table_name,
       'MTL_CATEGORIES_B' oracle_cache_source_table,
       (SELECT ut.status
          FROM user_triggers ut
         WHERE ut.trigger_name = 'N_CDT_MTL_CAT') trigger_status,
       (SELECT COUNT(*) FROM N_KFF_Mtl_Cat) current_kff_cached_rowcount,
       (SELECT COUNT(*) FROM APPS.MTL_CATEGORIES_B) current_oracle_source_rowcount,
       (SELECT COUNT(*) FROM N_CD_Mtl_Cat) pending_cd_rowcount,
       (SELECT COUNT(*) FROM N_CDE_Mtl_Cat) pending_cde_rowcount,
       (SELECT COUNT(*) FROM N_CDEH_Mtl_Cat) pending_cdeh_rowcount
FROM DUAL
UNION ALL
SELECT 'Item Catalogs Key Flexfield' kff_view_description,
       'N_KFF_Ctlg_Grp' kff_cache_table_name,
       'MTL_ITEM_CATALOG_GROUPS_B' oracle_cache_source_table,
       (SELECT ut.status
          FROM user_triggers ut
         WHERE ut.trigger_name = 'N_CDT_CTLG_GRP') trigger_status,
       (SELECT COUNT(*) FROM N_KFF_Ctlg_Grp) current_kff_cached_rowcount,
       (SELECT COUNT(*) FROM APPS.MTL_ITEM_CATALOG_GROUPS_B) current_oracle_source_rowcount,
       (SELECT COUNT(*) FROM N_CD_Ctlg_Grp) pending_cd_rowcount,
       (SELECT COUNT(*) FROM N_CDE_Ctlg_Grp) pending_cde_rowcount,
       (SELECT COUNT(*) FROM N_CDEH_Ctlg_Grp) pending_cdeh_rowcount
FROM DUAL
UNION ALL
SELECT 'System Items Key Flexfield' kff_view_description,
       'N_KFF_Sys_Item' kff_cache_table_name,
       'MTL_SYSTEM_ITEMS_B' oracle_cache_source_table,
       (SELECT ut.status
          FROM user_triggers ut
         WHERE ut.trigger_name = 'N_CDT_SYS_ITEM') trigger_status,
       (SELECT COUNT(*) FROM N_KFF_Sys_Item) current_kff_cached_rowcount,
       (SELECT COUNT(*) FROM APPS.MTL_SYSTEM_ITEMS_B) current_oracle_source_rowcount,
       (SELECT COUNT(*) FROM N_CD_Sys_Item) pending_cd_rowcount,
       (SELECT COUNT(*) FROM N_CDE_Sys_Item) pending_cde_rowcount,
       (SELECT COUNT(*) FROM N_CDEH_Sys_Item) pending_cdeh_rowcount
FROM DUAL
UNION ALL
SELECT 'Stock Locators Key Flexfield' kff_view_description,
       'N_KFF_Item_Loc' kff_cache_table_name,
       'MTL_ITEM_LOCATIONS' oracle_cache_source_table,
       (SELECT ut.status
          FROM user_triggers ut
         WHERE ut.trigger_name = 'N_CDT_ITEM_LOC') trigger_status,
       (SELECT COUNT(*) FROM N_KFF_Item_Loc) current_kff_cached_rowcount,
       (SELECT COUNT(*) FROM APPS.MTL_ITEM_LOCATIONS) current_oracle_source_rowcount,
       (SELECT COUNT(*) FROM N_CD_Item_Loc) pending_cd_rowcount,
       (SELECT COUNT(*) FROM N_CDE_Item_Loc) pending_cde_rowcount,
       (SELECT COUNT(*) FROM N_CDEH_Item_Loc) pending_cdeh_rowcount
FROM DUAL

On our production system this returns a single row per Key Flexfield showing the trigger status, the current cached and source row counts, and any rows pending in the change-data (CD/CDE/CDEH) tables.

Wednesday, January 4, 2012

Noetix: INV_Period_Close_Details and Process Manufacturing (at Release 12)

If you are one of Oracle's (few) Process Manufacturing module customers, like us, you will already have noticed that there are substantial changes to Noetix when you go from Release 11 to Release 12. One of the most significant is the change from using the GMI_Month_End_Inventory view (which has been rendered obsolete by Oracle's changes) to using the INV_Period_Close_Details template.

The first thing you'll notice when running against R12 is that if you do;

SELECT *
  FROM INV_Period_Close_Details;

in your test system you will get ZERO rows (unless you are also an Oracle Discrete Manufacturing customer, in which case you will just get your Discrete Manufacturing rows).

The reason for this is the inclusion of the table BOM.CST_PERIOD_CLOSE_SUMMARY as the source for the costing data. This table is not used by Process Manufacturing (the correct table is GMF.GMF_PERIOD_BALANCES).
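You can see this for yourself by comparing the two candidate costing sources (assuming your account can read both schemas) - on a pure Process Manufacturing install the first count will be zero:

```sql
-- Discrete costing rows vs. Process Manufacturing (OPM) balances
SELECT (SELECT COUNT(*) FROM BOM.CST_PERIOD_CLOSE_SUMMARY) AS discrete_rows,
       (SELECT COUNT(*) FROM GMF.GMF_PERIOD_BALANCES) AS process_rows
  FROM DUAL;
```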

You have to wonder exactly what Process Manufacturing testing Noetix has done as this was *clearly* never going to work.

Luckily (for everyone else) we have been through the support loop with Noetix, and they have produced a modified version of the INV_Period_Close_Details view which not only includes Process Manufacturing costs but also includes Lot details - something that will be necessary if you are migrating from the Release 11 view to the Release 12 view.

The support reference number is B27089.

What the file does (for those of you trying to test/understand it) is;
  1. Insert a new query (n_view_query_templates) based on the existing query,
  2. Copy across all the tables from the existing query into the new query (including the CST_Period_Close_Summary table),
  3. Add in the MTL tables (MTL_Material_Statuses_TL, MTL_Grades_TL, MTL_Item_Locations, and MTL_Lot_Numbers),
  4. Update the CPCS table entry (CST_Period_Close_Summary), replacing it with GMF.GMF_PERIOD_BALANCES in the new query,
  5. Copy across all the columns from the existing query that are associated with any table included in the new query,
  6. Update the Accounted_On_Hand and Period_Close_Quantity columns to use the correct values,
  7. Insert new columns based on the new tables (i.e. Lot_Grade_Code, Lot_Number, etc.),
  8. Insert the required n_view_col_property_templates records for flexfields,
  9. Copy across the existing WHERE-clause records,
  10. Insert new WHERE-clause components to join up the new tables, and finally
  11. Update the version for the first query so it only applies pre-Release 12.
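To give a flavour of what these steps look like, here is a sketch of step 4 only. The table and column names below (n_view_table_templates, view_label, query_position) are my assumptions based on the usual Noetix template metadata naming - they are not lines from the actual B27089 script:

```sql
-- Hypothetical sketch of step 4: repoint the copied query's costing source
-- (column names assumed; the real Noetix script may differ)
UPDATE n_view_table_templates t
   SET t.table_name = 'GMF_PERIOD_BALANCES'   -- was CST_PERIOD_CLOSE_SUMMARY
 WHERE t.view_label = 'INV_Period_Close_Details'
   AND t.table_name = 'CST_PERIOD_CLOSE_SUMMARY'
   AND t.query_position = 2;                  -- the newly inserted query
```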