
Tuesday, April 16, 2013

Lot Genealogy, Part 3: Debugging Lots and Batches

As you might have noticed, I've just updated the LOT_GENEALOGY source code for this project to a new version following the discovery of a bug that occurs when a batch produces two different item numbers. This is a rare occurrence in our system, which is why it went undetected for so long, but it might not be rare in yours.

Rather than dwell on that, I thought I'd talk about exactly what the process was that led to us discovering and fixing the error.

The first report came from the users: a lot which they knew had two item numbers was only showing one. Other lots didn't seem to be affected. This was then traced back to a single batch, 73591. Running the following SQL gave us the transactions that affected the cache for that batch (to publish this data I'm obfuscating it - see here);

SELECT LGG.TRANSACTION_ID,
       HIDEDATA.Hide('LGG.LN', LGG.LOT_NUMBER) Lot_Number,
       LGG.TRANSACTION_QUANTITY,
       LGG.TRANSACTION_TYPE_NAME,
       LGG.Item_Number,
       HIDEDATA.Hide('LGG.ID', LGG.Item_Description) Item_Description,
       LGG.ITEM_TYPE
  FROM LOT_GENEALOGY_GETTRANSACTIONS LGG
 WHERE 1 = 1
   AND LGG.Batch_No = '73591';

The result of this is the following transactions;

LG Transactions For A Single Batch
As you can see, the first thing we have is three "WIP Completion" transactions. Each of these represents the completion of the batch and the generation of an output - in this case Item Numbers 073081 and 07440. However, without knowing what information was put *into* the batch in the first place this information can't usefully be added to the genealogy - it has to wait until the "WIP Issue" transactions are processed (each of which represents an ingredient).
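To make that ordering constraint concrete, here's a minimal Python sketch (my own illustration with made-up transaction data, not the package's actual code): the completions define a batch's products, and each issue (ingredient) can then be linked to every product of its batch;

```python
# Illustration only: links each ingredient lot to every product of its
# batch, which is why issues can't usefully be processed before the
# batch's completions are known.
def build_genealogy(transactions):
    products = {}  # batch -> [(product_lot, product_item), ...]
    for t in transactions:
        if t["type"] == "WIP Completion":
            products.setdefault(t["batch"], []).append((t["lot"], t["item"]))
    links = []
    for t in transactions:
        if t["type"] == "WIP Issue":  # one transaction per ingredient
            for prod_lot, prod_item in products.get(t["batch"], []):
                links.append((t["lot"], t["item"], t["batch"],
                              prod_lot, prod_item))
    return links

txns = [
    {"type": "WIP Completion", "batch": "73591", "lot": "L1",  "item": "073081"},
    {"type": "WIP Completion", "batch": "73591", "lot": "L1",  "item": "07440"},
    {"type": "WIP Issue",      "batch": "73591", "lot": "L37", "item": "038001"},
]
print(build_genealogy(txns))  # one ingredient linked to both products
```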

The next stage for debugging the cache was to (on a test system!) change the LOT_GENEALOGY_GETTRANSACTIONS view so that rather than looking at *everything* it only looks at the records for a single batch - this is simply done by adding the following where clause to the view SQL;

-- Restrict transactions to a single batch (for debugging)
AND GBH.Batch_No = '73591'

Now we've restricted the input records to just those affecting that batch we can simply re-build the whole cache - the delete will take longer than the insert. The script to do it is;

begin
  -- Call the procedure
  lot_genealogy.refreshwholecache;
end;


Once that has been completed the entire content of the cache is the single batch we're tracking. The SQL to show it is;

SELECT HIDEDATA.Hide('LGG.LN', LGW.Master_Lot_Number) Master_Lot_Number,
       HIDEDATA.Hide('LGG.LN', LGW.Ingred_Lot_Number) Ingred_Lot_Number,
       LGW.Ingred_Item_Number,
       HIDEDATA.Hide('LGG.ID', LGW.Ingred_Item_Description) Ingred_Item_Description,
       LGW.Ingred_Item_Type_Code,
       LGW.Batch_Number,
       HIDEDATA.Hide('LGG.LN', LGW.PRODUCT_LOT_NUMBER) PRODUCT_LOT_NUMBER,
       LGW.PRODUCT_ITEM_NUMBER,
       HIDEDATA.Hide('LGG.ID', LGW.Product_Item_Description) Product_Item_Description,
       LGW.Product_Item_Type_Code
  FROM LOT_GENEALOGY_WHEREUSED LGW
 WHERE 1 = 1


The result of this is;

Lot Genealogy Cache: Result for a Single Batch
As you can see, the only product for Ingredient Lot LGG.LN 37 is item number 073002. If we look at the transactions earlier we can see that it should be reporting Item Number 07440 as well - it isn't, which means something is going wrong in the "WIP Issue" part of the cache processing.

If you look at the source code (available here - via Google Drive) you'll see that the final part of the WIP Issue processing creates the records in LOT_GENEALOGY_BATCH_PRODUCT, so the next stage is to check whether these records are being created correctly. Here is the SQL;

SELECT LGBP.batch_number,
       HIDEDATA.Hide('LGG.LN', LGBP.PRODUCT_LOT_NUMBER) PRODUCT_LOT_NUMBER,
       LGBP.product_item_number,
       HIDEDATA.Hide('LGG.ID', LGBP.Product_Item_Description) Product_Item_Description,
       LGBP.product_item_type_code
  FROM LOT_GENEALOGY_BATCH_PRODUCT LGBP
 WHERE 1 = 1
   AND LGBP.Batch_Number = '73591'


This gives us the result;

Lot Genealogy Batch Products
This shows us that the correct batch products are being recorded. This is important, as we now have both ends of the transaction: the correct transactions are going in and the correct products are coming out. However, we also know that the cache isn't being updated correctly, therefore the culprits must be the two pieces of SQL that are actually doing the inserts.

If you look at the first one;

INSERT INTO NOETIX_SYS.Lot_Genealogy_WhereUsed
    SELECT v_Transaction.Lot_Number, -- Master_Lot_Number
           v_Transaction.Lot_Number, -- Ingred_Lot_Number
           v_Transaction.Item$item, -- INGRED_ITEM$ITEM
           v_Transaction.Item_Description, -- INGRED_ITEM_DESCRIPTION
           v_Transaction.Item_Type_Code, -- INGRED_ITEM_TYPE_CODE
           v_Transaction.Batch_Number, -- Batch_Number
           v_Product.Product_Lot_Number, -- Product_Lot_Number
           v_Product.PRODUCT_ITEM$ITEM, -- PRODUCT_ITEM$ITEM
           v_Product.PRODUCT_ITEM_DESCRIPTION, -- PRODUCT_ITEM_DESCRIPTION
           v_Product.PRODUCT_ITEM_TYPE_CODE -- PRODUCT_ITEM_TYPE_CODE
      FROM DUAL
     WHERE NOT EXISTS
     (SELECT 1
              FROM NOETIX_SYS.Lot_Genealogy_WhereUsed LGI
             WHERE LGI.MASTER_LOT_NUMBER = v_Transaction.Lot_Number
               AND LGI.Ingred_LOT_NUMBER = v_Transaction.Lot_Number
               AND LGI.Batch_Number = v_Transaction.Batch_Number
               AND (v_Product.Product_Lot_Number IS NULL AND
                   LGI.PRODUCT_LOT_NUMBER IS NULL OR
                   v_Product.Product_Lot_Number = LGI.PRODUCT_LOT_NUMBER));


You can see that this is actually just a select from DUAL, so the fact that it's only working on the first pass (i.e. for the first product) means that the offending part of the code must be the NOT EXISTS.

Looking at the WHERE clause in the sub-select reveals that it's not using the Product_Item_Number field. Not using this field means that after the first product has been inserted it mistakes the second one for a duplicate and skips over it.
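The effect is easy to reproduce outside the database. This Python sketch (an illustration with invented rows, not the package code) mimics the NOT EXISTS as a duplicate-key check; leaving the product item number out of the key collapses two products that share a lot number;

```python
# Illustration of the bug: rows are skipped as "duplicates" when their
# key matches one already inserted - exactly what the NOT EXISTS does.
def insert_rows(rows, key_fields):
    seen, inserted = set(), []
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            inserted.append(row)
    return inserted

# Same product lot produced as two different item numbers,
# as in batch 73591.
rows = [
    {"master": "37", "ingred": "37", "batch": "73591",
     "product_lot": "P1", "product_item": "073002"},
    {"master": "37", "ingred": "37", "batch": "73591",
     "product_lot": "P1", "product_item": "07440"},
]

buggy = insert_rows(rows, ["master", "ingred", "batch", "product_lot"])
fixed = insert_rows(rows, ["master", "ingred", "batch", "product_lot",
                           "product_item"])
print(len(buggy), len(fixed))  # 1 2 - the old key drops the second product
```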

Altering the final AND statement to;

AND (v_Product.Product_Lot_Number IS NULL AND LGI.PRODUCT_LOT_NUMBER IS NULL
     OR (v_Product.Product_Lot_Number = LGI.PRODUCT_LOT_NUMBER AND
         v_Product.Product_Item_Number = LGI.PRODUCT_ITEM_NUMBER))

in BOTH pieces of SQL (the one updating all the existing records as well as this one, which creates a new master record) addresses the issue. If you rebuild the cache from scratch (using the script above) and re-run the SQL to get the content of the cache you will now see;

Lot Genealogy Cache - Complete
The first thing you notice is there are a lot more records. The new records are for Item Number 07440 which is the one we were missing earlier.

If you do a source comparison between the new and old versions of the package you'll notice that this wasn't the only change I made - I've added a great deal more logging to the WIP Issue transaction so it's possible to see what is going on and what "SQL%ROWCOUNT = 0" actually means! I also added a warning to WIP Completion if no records are updated.

I hope this is of some interest ... Debugging an application, even one you've written yourself, is a skill that's very difficult to transfer. Hopefully there is enough information here for people to at least make a start for future issues (which will, by their very nature, be completely different I'm sure!).

Tuesday, April 9, 2013

Lot Genealogy (Part 2): Automating Testing of a Genealogy

If you are going to roll out any caching of your lot genealogy data, the one thing you have to get right is testing - especially if, like us, you are working in a highly regulated environment.

To this end I've created a new table called LOT_GENEALOGY_WHEREUSED_TESTS;

Lot_Genealogy_WhereUsed_Tests Table Description
The purpose of this new table is to hold copies of the lot genealogy data from Lot_Genealogy_WhereUsed. By holding these "snapshots" we can check, after each rebuild of the cache, that the data that was there - and that we had previously validated as correct - is still there.

To create tests the following SQL will insert an existing Lot Genealogy into the testing table;

INSERT INTO LOT_GENEALOGY_WHEREUSED_TESTS
  SELECT 'ATR002', -- test_Ref
         'Product lot (166449) consists of two items (038003 and 038001)', -- test_description
         -- data from Lot Genealogy
         MASTER_LOT_NUMBER,
         INGRED_LOT_NUMBER,
         INGRED_ITEM_NUMBER,
         INGRED_ITEM_DESCRIPTION,
         INGRED_ITEM_TYPE_CODE,
         BATCH_NUMBER,
         PRODUCT_LOT_NUMBER,
         PRODUCT_ITEM_NUMBER,
         PRODUCT_ITEM_DESCRIPTION,
         PRODUCT_ITEM_TYPE_CODE
    FROM LOT_GENEALOGY_WHEREUSED LGW
   WHERE 1 = 1
     AND LGW.MASTER_LOT_NUMBER = '0490/0002';


This SQL is taking the lot genealogy for Master Lot Number 0490/0002 and copying it into the testing table, adding a test reference (ATR002) and a test description so we know what the test is supposed to be checking for.

That, as they say, was the easy bit. Now we need to create some SQL that is capable of running a test and returning a result. Our reporting tool (in case you can't tell from my other blog posts!) is SQL Server Reporting Services (SSRS), hence I'm going to split the SQL into two pieces: one to give me a list of all the tests and the other to run an individual test - SSRS will allow me to embed the latter as a sub-report in a report driven by the former.

List of Tests SQL

This was by far the easier of the two;

SELECT DISTINCT LGWT.TEST_REF VALUE,
                LGWT.Test_Ref || ' (Lot ' || LGWT.MASTER_LOT_NUMBER || ')' LABEL
  FROM LOT_GENEALOGY_WHEREUSED_TESTS LGWT
 ORDER BY 1, 2


This returns something similar to;

List of Tests Generated Using SQL
I've added the bit of additional detail (for the Label) so that in addition to being able to use this in the main report to get a list of tests, I can also use it in the sub-report as the source for a picker on the "Test_Ref" parameter.

Run A Test SQL

This is slightly larger, but here's the code and then I'll try to explain it;

SELECT :Test_Ref "Test Ref",
       (SELECT TEST_DESCRIPTION
          FROM LOT_GENEALOGY_WHEREUSED_TESTS LGWT1
         WHERE LGWT1.TEST_REF = :Test_Ref
           AND ROWNUM = 1) "Test Description",
       TR.Test_Row_Count "Test Row Count",
       TR.Cache_Row_Count "Cache Row Count",
       TR.Union_Row_Count "Union Row Count",
       CASE
         WHEN TR.Test_Row_Count = TR.Cache_Row_Count AND
              TR.Cache_Row_Count = TR.Union_Row_Count THEN
          'PASS'
         ELSE
          'FAIL'
       END "Test Result"
  FROM (SELECT (SELECT COUNT(*)
                  FROM LOT_GENEALOGY_WHEREUSED_TESTS LGWT1
                 WHERE LGWT1.TEST_REF = :Test_Ref) Test_Row_Count,
               (SELECT COUNT(*)
                  FROM LOT_GENEALOGY_WHEREUSED LGW
                 WHERE LGW.MASTER_LOT_NUMBER =
                       (SELECT MASTER_LOT_NUMBER
                          FROM LOT_GENEALOGY_WHEREUSED_TESTS
                         WHERE TEST_REF = :Test_Ref
                           AND ROWNUM = 1)) Cache_Row_Count,
               (SELECT COUNT(*)
                  FROM (SELECT MASTER_LOT_NUMBER,
                               INGRED_LOT_NUMBER,
                               INGRED_ITEM_NUMBER,
                               INGRED_ITEM_DESCRIPTION,
                               INGRED_ITEM_TYPE_CODE,
                               BATCH_NUMBER,
                               PRODUCT_LOT_NUMBER,
                               PRODUCT_ITEM_NUMBER,
                               PRODUCT_ITEM_DESCRIPTION,
                               PRODUCT_ITEM_TYPE_CODE
                          FROM LOT_GENEALOGY_WHEREUSED_TESTS LGWT1
                         WHERE LGWT1.TEST_REF = :Test_Ref
                        UNION
                        SELECT MASTER_LOT_NUMBER,
                               INGRED_LOT_NUMBER,
                               INGRED_ITEM_NUMBER,
                               INGRED_ITEM_DESCRIPTION,
                               INGRED_ITEM_TYPE_CODE,
                               BATCH_NUMBER,
                               PRODUCT_LOT_NUMBER,
                               PRODUCT_ITEM_NUMBER,
                               PRODUCT_ITEM_DESCRIPTION,
                               PRODUCT_ITEM_TYPE_CODE
                          FROM LOT_GENEALOGY_WHEREUSED LGW
                         WHERE LGW.MASTER_LOT_NUMBER =
                               (SELECT MASTER_LOT_NUMBER
                                  FROM LOT_GENEALOGY_WHEREUSED_TESTS
                                 WHERE TEST_REF = :Test_Ref
                                   AND ROWNUM = 1))) Union_Row_Count
          FROM DUAL) TR


As you can see, it takes a single parameter: the Test Reference Number. It counts the number of records in the cache, counts the number of records in the test table, and then selects all the records in the test table and, using a straight UNION, all the records in the cache. Because a UNION strips out duplicates, the COUNT of the records returned by the UNION should be the same as the number of records in each of the other two queries. If all three counts are the same the Test Result is 'PASS', otherwise it's 'FAIL'.
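The pass/fail logic reduces to a set-cardinality argument: if the snapshot and the cache hold identical row sets, their duplicate-free union is the same size as each side. A small Python sketch of just that rule (with invented rows, not the SSRS dataset);

```python
# PASS when test rows, cache rows, and their duplicate-free union all
# have the same count - i.e. neither side has rows the other lacks.
def run_test(test_rows, cache_rows):
    t, c = len(test_rows), len(cache_rows)
    u = len(test_rows | cache_rows)  # set union, like a straight UNION
    return "PASS" if t == c == u else "FAIL"

snapshot = {("0490/0002", "L1", "038003"),
            ("0490/0002", "L2", "038001")}

print(run_test(snapshot, set(snapshot)))                    # PASS
print(run_test(snapshot, {("0490/0002", "L1", "038003")}))  # FAIL: a row lost
```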

NOTE: I'm sure this could be done a lot more efficiently, but to be honest, given the relative sizes of the tables, I don't think you'll be sitting around too long for a result. On our system it takes less than 0.02 seconds. Your mileage will vary, but probably not by much!

Now that I've got the SQL I've set up a simple SSRS report to display the result for a single test back to me;

SSRS Report Showing A Single Test Result
You'll also notice that the test result includes a listing of all the records included in the test. This is a simple SELECT * ... WHERE TEST_REF = ... so I'm not going to give you the SQL for it.

The master report looks like this (in Report Builder 3);

SSRS Master Reporting Showing All Tests
I've included text boxes (on each page, as part of the footer) to record who the Tester and Checker are and the Date. For our internal quality purposes we need this level of detail; you might not, but what's the harm?

When executed the report appears like this;

SSRS Testing Report - Final Result
Hopefully you will find this useful.

Monday, July 30, 2012

Noetix: Working With Item Cost Summaries

If you are, like us, users of Oracle Process Manufacturing then you'll have written a lot of reports which rely on Item Costings (both for products and ingredients). One of the things that makes the current structure of the Noetix view difficult to work with is that it provides a detailed breakdown of the costs - useful if you're reporting on the costs themselves, but not if you're reporting on, for example, the value of inventory, where you're just interested in the *total* cost.

Of course it's fairly easy to get there - you just total up all the costs - but doing so introduces unnecessary complexity into your SQL, often leading to sub-selects in JOINs like the following;

select 
  gic.ITEM$Item,
  sum(decode(gic.Component_Group, 'LAB', gic.Item_Cost)) as TOTAL_LABOUR_COST,
  sum(decode(gic.Component_Group, 'OVE', gic.Item_Cost)) as TOTAL_OVERHEAD_COST,
  sum(decode(gic.Component_Group, 'MAT', gic.Item_Cost)) as TOTAL_MATERIAL_COST
from gmfg0_item_costs gic
where gic.PERIOD_CODE = :PERIODCODE
group by gic.ITEM$Item

As you can see we have three component groups: LABour, OVErhead, and MATerial. While this is a fairly understandable piece of SQL, it gets repeated over and over again in various reports, and when, as happened recently, we change the way costing works (by moving costs from one Organisation to another) it becomes very complex to make the change to all the affected reports.
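For anyone less used to the sum(decode(...)) idiom, the pivot it performs is just a group-and-accumulate. A quick Python equivalent (with made-up cost rows, purely for illustration);

```python
from collections import defaultdict

# One output row per item, one total per component group - the same
# shape the sum(decode(...)) columns produce.
def summarise_costs(rows):
    totals = defaultdict(lambda: {"LAB": 0.0, "OVE": 0.0, "MAT": 0.0})
    for item, group, cost in rows:
        totals[item][group] += cost
    return dict(totals)

rows = [("7426-4", "LAB", 1.50), ("7426-4", "OVE", 0.25),
        ("7426-4", "MAT", 3.00), ("7426-4", "MAT", 0.40)]
print(summarise_costs(rows))
```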

Because of this we've introduced a summary view which sits on top of GMFG0_Item_Costs called GMFG0_Item_Cost_Summary which looks like this;

Name                Type         Nullable Default Comments
------------------- ------------ -------- ------- --------
ORGANIZATION_CODE   VARCHAR2(40) Y                        
PERIOD_CODE         VARCHAR2(10)                          
Z$INVG0_ITEMS       ROWID        Y                        
ITEM$ITEM           VARCHAR2(40) Y                        
TOTAL_LABOUR_COST   NUMBER       Y                        
TOTAL_OVERHEAD_COST NUMBER       Y                        
TOTAL_MATERIAL_COST NUMBER       Y                        
TOTAL_ITEM_COST     NUMBER       Y  


The SQL to generate it is;

create or replace view gmfg0_item_cost_summary as
select gic.Organization_Code,
       gic.period_code,
       gic.Z$INVG0_Items,
       gic.ITEM$Item,
       sum(decode(gic.Component_Group, 'LAB', gic.Item_Cost)) as TOTAL_LABOUR_COST,
       sum(decode(gic.Component_Group, 'OVE', gic.Item_Cost)) as TOTAL_OVERHEAD_COST,
       sum(decode(gic.Component_Group, 'MAT', gic.Item_Cost)) as TOTAL_MATERIAL_COST,
       sum(gic.Item_Cost) as TOTAL_ITEM_COST
  from gmfg0_item_costs gic
 group by gic.Organization_Code,
          gic.period_code,
          gic.Z$INVG0_Items,
          gic.ITEM$Item;

To have this view rebuilt every time we do a Noetix regenerate, we have created an XU6 script that automatically builds the view after every regenerate - it's available here (via Google Docs).


Friday, May 18, 2012

Oracle R12: Reporting Issues With Subledger Accounting

On 1st January this year we successfully migrated our existing R11 Oracle e-Business Suite installation to R12 and went live. The bulk of our reports had been upgraded to deal with the changes (largely now using Noetix Views) and, with only about 100 reports remaining, we were expecting to complete the migration of existing reports and the creation of any new reports by the end of June 2012 (I'm writing this at the end of May).

One of the process changes made as part of the R12 upgrade project was to enable some rules in Subledger Accounting. These are fairly simple rules, and it probably helps to give an example, so here goes;

A purchase order is raised for Item 7426-4 and then journalled into the GL (via the SLA). The original GL code defined with the purchase order line is;

2.001.9999.0000.0000.0000

It's not really important what the segments mean, but suffice it to say that when the final transaction is posted to the GL the account code has been changed (by an SLA rule) to;

2.000.11111.0000.FOIL.0000

Looking at the lines in the GL (in Oracle) we see;
GL Journal Lines Showing the New Account Segments

From a Finance perspective this is good news, as it dramatically reduces the chance of something being mis-coded, by ignoring the codes the user has entered and replacing them with pre-defined values.

However from a reporting standpoint this is a complete and total nightmare.

Why? Because the GL account code changes are not rolled back into the values defined for the Purchase Order. This means that if we are using the GL account code to break down spending - which we are - then the only "true" value is in the GL.

On its own this wouldn't be an issue, except when you combine it with two others;
  1. Records in XLA_Distribution_Links only allow a GL line to be linked to a *single* external entity. Thus if, for example, you want GL lines with their Purchase Orders on a report (say, a Department Spend Report) then you need to look at the rows linked directly to purchase orders, then those linked to invoices (which in turn are linked to purchase orders), then those linked to inventory transactions which are in turn linked to purchase orders, and so on. In short, a relatively simple query against PO.PO_DISTRIBUTIONS_ALL becomes a complete nightmare of multiple UNIONs.
  2. The relationship between GL_JE_Lines and PO_Distributions_All isn't one-to-one. The tax lines of the distributions are merged into a single GL line, which means that in XLA_Distribution_Links that single GL line is linked to every line on the purchase order.
Does anyone have a solution to this problem?!

Wednesday, April 25, 2012

Noetix: Release 11 to Release 12 Migration (Replacing Obsolete Views)

While migrating to R12 with Noetix is very easy and relatively painless in some areas, the significant changes in Inventory between the two versions make it practically impossible for Noetix views to handle the migration "invisibly" (as they tend to do in other areas like Accounts Payable).

This is especially true if your organisation, like mine, is taking the opportunity to do a bit of restructuring - for example, rather than using a stock item's presence in a particular warehouse to mean that it's a retained sample, achieving the same thing with statuses and getting rid of the warehouse.

Here is a list of R11 Noetix view templates with "suggested" R12 replacements. It's not a complete list, but it is the list we have been working off for all our reports;

View Label >>>> Recommended Replacement
GMP_Forecast_Details >>>> INV_Forecasts
GMI_Unallocated_Inventory >>>> INV_Unallocated_Inventory
GMI_Onhand_Inv_By_Lot >>>> INV_Item_Onhand_By_Lot*
GMI_Onhand_Inv_By_Location >>>> INV_Onhand_Quantities*
GMI_Month_End_Inventory >>>> INV_Period_Close_Details*
GMI_Item_Master >>>> INV_Items and INV_Item_Inventory_Attributes
GMI_Inv_Transactions >>>> INV_Transactions or INV_Transaction_Details
GMF_Update_Subledger >>>> GMF_SLA_Cost_Subledger
GMF_Cost_Warehouse_Assoc >>>> GMF_Cost_Organization_Assc
GMF_Order_Details_Base >>>> GMF_SLA_Cost_Subledger

*- The migration from R11 Process Manufacturing Inventory to R12 Common Inventory for us has been less than smooth and has not been helped by a few, dare I say it, "obvious" issues that Noetix should have addressed prior to us attempting the upgrade.

The most obvious issue was that in R12 there was no view that would tell us the lot number, the location, and the onhand quantity; the best we could do was two out of three. I just cannot imagine any organisation out there not wanting to see all three on the same report. This led to us having to make significant customisations in order for the migration path to actually be a migration path.

For example, looking at the view template INV_Item_Onhand_By_Lot, we have added the following customisation columns (with some additional information where appropriate);

Days_Since_First_Transaction
ITEM  (Table Alias= ITEM, Column expression=ITEM) << Flexfield
Item_Type  (Table Alias= ITEM, Column expression=ITEM_TYPE)
Item_Type_Code  (Table Alias= ITEM, Column expression=ITEM_TYPE)
LOCT  (Table Alias= LOCT, Column expression=LOCT) << Flexfield
Material_Status  (Table Alias= MMST, Column expression=DESCRIPTION)
Material_Status_Code  (Table Alias= MMST, Column expression=STATUS_CODE)
Organization_Code  (Table Alias= MPARM, Column expression=ORGANIZATION_CODE)


Here is our list of customisations to the GMI_Onhand_Inv_By_Lot view;
Batch_Id 
Company_Code  (Table Alias= XMAP, Column expression=COMPANY_CODE)
Company_Name  (Table Alias= XMAP, Column expression=COMPANY_NAME)
Days_Since_First_Transaction
Inventory_Class  (Table Alias= ITEM, Column expression=INV_CLASS)
Item_Cost_Class  (Table Alias= ITEM, Column expression=ITEMCOST_CLASS)
Item_Id  (Table Alias= ITEM, Column expression=Item_Id)
Lots_MFD_Flag  (Table Alias= LOTS, Column expression=DECODE(LOTS.DELETE_MARK,0,'N',1,'Y'))
Qc_Hold_Reason_Code  (Table Alias= LOINV, Column expression=QCHOLD_RES_CODE)

I've highlighted the location flexfield which, if you look at the Noetix columns for the original view, seems to map directly to INVENTORY_LOCATION_CODE. There seems to be no mapping for this column down the migration path we were recommended to take.

I'm currently going through the process of sorting out our customisations (something that I'm sure will greatly relieve Noetix Support!) and I'll be publishing as much as I possibly can of the work we've done (including the difference between Lot Status and Material_Status which I'm sure will intrigue and worry people).

Friday, February 17, 2012

Oracle PL/SQL: Querying Inventory Quantities in R12.

As you have probably gathered from my other posts, the company I work for uses NoetixViews for reporting against Oracle e-Business Suite; however it looks like, at Release 12, Oracle has started providing some reasonably useful APIs to get inventory quantities.

Take a look at the INV_Quantity_Tree_Pub package (under the APPS user) in your database. The "query_quantities" API has the following parameters;

Parameter    Type    In/Out    Optional
P_API_VERSION_NUMBER    NUMBER    IN
P_INIT_MSG_LST    VARCHAR2    IN    Y
X_RETURN_STATUS    VARCHAR2    OUT   
X_MSG_COUNT    NUMBER    OUT   
X_MSG_DATA    VARCHAR2    OUT   
P_ORGANIZATION_ID    NUMBER    IN   
P_INVENTORY_ITEM_ID    NUMBER    IN   
P_TREE_MODE    NUMBER    IN   
P_IS_REVISION_CONTROL    BOOLEAN    IN   
P_IS_LOT_CONTROL    BOOLEAN    IN   
P_IS_SERIAL_CONTROL    BOOLEAN    IN   
P_DEMAND_SOURCE_TYPE_ID    NUMBER    IN    Y
P_DEMAND_SOURCE_HEADER_ID    NUMBER    IN    Y
P_DEMAND_SOURCE_LINE_ID    NUMBER    IN    Y
P_DEMAND_SOURCE_NAME    VARCHAR2    IN    Y
P_LOT_EXPIRATION_DATE    DATE    IN    Y
P_REVISION    VARCHAR2    IN   
P_LOT_NUMBER    VARCHAR2    IN   
P_SUBINVENTORY_CODE    VARCHAR2    IN   
P_LOCATOR_ID    NUMBER    IN   
P_ONHAND_SOURCE    NUMBER    IN    Y
X_QOH    NUMBER    OUT   
X_RQOH    NUMBER    OUT   
X_QR    NUMBER    OUT   
X_QS    NUMBER    OUT   
X_ATT    NUMBER    OUT   
X_ATR    NUMBER    OUT   
P_TRANSFER_SUBINVENTORY_CODE    VARCHAR2    IN    Y
P_COST_GROUP_ID    NUMBER    IN    Y
P_LPN_ID    NUMBER    IN    Y
P_TRANSFER_LOCATOR_ID    NUMBER    IN    Y

I said they have provided a "reasonably useful API", not an "easy to use" one! Clearly, if you want to get anything useful out of this API you need to build a wrapper package (particularly if you want to be able to query quantities in SQL rather than PL/SQL - which I'm assuming you do).

As an example, I'm going to suggest you create a wrapper function called "get_onhand_quantity"; it should be pretty clear how to alter this to return some of the other quantities associated with the item. Here is the SQL:

CREATE OR REPLACE FUNCTION get_onhand_quantity(p_organization_id   IN NUMBER,
                                               p_inventory_item_id IN NUMBER,
                                               p_subinventory_code IN VARCHAR2,
                                               p_locator_id        IN NUMBER,
                                               p_lot_number        IN VARCHAR2)
  RETURN NUMBER IS
  -- The various quantity values returned
  v_QuantityOnhand              NUMBER;
  v_QuantityReservableOnhand    NUMBER;
  v_QuantityReserved            NUMBER;
  v_QuantitySuggested           NUMBER;
  v_QuantityAvailableToTransact NUMBER;
  v_QuantityAvailableToReserve  NUMBER;

  -- Monitor the return status of the API
  v_ReturnStatus       VARCHAR2(1);
  v_ReturnMessageCount NUMBER;
  v_ReturnMessageData  VARCHAR2(256);
begin
  INV_QUANTITY_TREE_PUB.query_quantities(p_api_version_number         => 1,
                                         p_init_msg_lst               => fnd_api.g_false,
                                         x_return_status              => v_ReturnStatus,
                                         x_msg_count                  => v_ReturnMessageCount,
                                         x_msg_data                   => v_ReturnMessageData,
                                         p_organization_id            => p_organization_id,
                                         p_inventory_item_id          => p_inventory_item_id,
                                         p_tree_mode                  => apps.INV_Quantity_Tree_PUB.g_transaction_mode,
                                         p_is_revision_control        => FALSE,
                                         p_is_lot_control             => TRUE,
                                         p_is_serial_control          => FALSE,
                                         p_demand_source_type_id      => -9999,
                                         p_demand_source_header_id    => -9999,
                                         p_demand_source_line_id      => -9999,
                                         p_demand_source_name         => NULL,
                                         p_lot_expiration_date        => NULL,
                                         p_revision                   => NULL,
                                         p_lot_number                 => p_lot_number,
                                         p_subinventory_code          => p_subinventory_code,
                                         p_locator_id                 => p_locator_id,
                                         p_onhand_source              => inv_quantity_tree_pvt.g_all_subs,
                                         x_qoh                        => v_QuantityOnhand,
                                         x_rqoh                       => v_QuantityReservableOnhand,
                                         x_qr                         => v_QuantityReserved,
                                         x_qs                         => v_QuantitySuggested,
                                         x_att                        => v_QuantityAvailableToTransact,
                                         x_atr                        => v_QuantityAvailableToReserve,
                                         p_transfer_subinventory_code => NULL,
                                         p_cost_group_id              => NULL,
                                         p_lpn_id                     => NULL,
                                         p_transfer_locator_id        => NULL);

  RETURN(v_QuantityOnhand);
END get_onhand_quantity;

The script is available as a Google Document here (which might be slightly easier to see).

Now a quick test script to demonstrate how this works;

select ii.Onhand_Quantity
  from noetix_sys.INVG0_item_onhand_by_lot II
 where ii.ORGANIZATION_CODE = '01'
   and ii.Onhand_Quantity > 0
   and ii.Inventory_Item_ID = 5097
   and ii.lot_number = '155498RET'
   and rownum = 1;

select get_onhand_quantity(302, 5097, 'RETAIN', 37976, '155498RET') Onhand_Quantity
  from dual;

The first step (at the top) is to use the Noetix views to find the details of an item with an Onhand Quantity greater than zero, so that they can be plugged into the Get_Onhand_Quantity function (it was also slightly comforting to see that both queries returned the same value).

What's it like for speed? That's an interesting question: the function is substantially faster (as you'd expect - it's not going through a view), but as the timings for the first and second queries are 0.093 and 0.031 seconds respectively the difference isn't really significant in the wider scheme of things.

It will be interesting to see if a future version of Noetix uses the API (I'd certainly recommend it!).




Wednesday, February 1, 2012

PL/SQL: When Were Your Tables Last Analysed?

As I'm sure any DBA will tell you, one sure-fire way to get terribly-performing queries is not to update the statistical information Oracle uses to work out how to execute your query efficiently. Luckily Oracle keeps track of when the information was last updated for each column in the DBA_Tab_Columns view.

Here is a simple query that will give you a list of tables that have not been analysed in the last week;

SELECT dtc.OWNER, dtc.TABLE_NAME, dtc.LAST_ANALYZED
  FROM dba_tab_columns dtc
 WHERE dtc.OWNER NOT IN ('SYS', 'SYSTEM')
   AND dtc.COLUMN_ID = 1
   AND dtc.TABLE_NAME NOT LIKE 'BIN$%'
   AND dtc.TABLE_NAME NOT LIKE '%$%'
   AND dtc.OWNER NOT LIKE '%$%'
   AND dtc.LAST_ANALYZED IS NOT NULL
   AND dtc.LAST_ANALYZED < TRUNC(SYSDATE) - 7
 ORDER BY dtc.LAST_ANALYZED DESC;

This gives you a list of tables that haven't been analysed but, if you're looking at a large complex system like Oracle e-Business Suite (which I am), this isn't terribly useful as it's quite possible for a table to not be analysed simply because it's not in use.

I've written this simple script that produces (via DBMS_OUTPUT) a CSV-style list of the tables which contain records and which haven't been analysed in the last week;

DECLARE
  v_RowCount  NUMBER;
  v_Startdate DATE;
  v_Enddate   DATE;
BEGIN
  dbms_output.put_line('"Table Owner","Table Name","Row Count","Last Analysed Date", "Time To Query"');
  FOR v_Table IN (SELECT dtc.OWNER, dtc.TABLE_NAME, dtc.LAST_ANALYZED
                    FROM dba_tab_columns dtc
                   WHERE dtc.OWNER NOT IN ('SYS', 'SYSTEM')
                     AND dtc.COLUMN_ID = 1
                     AND dtc.TABLE_NAME NOT LIKE 'BIN$%'
                     AND dtc.TABLE_NAME NOT LIKE '%$%'
                     AND dtc.OWNER NOT LIKE '%$%'
                     AND dtc.LAST_ANALYZED IS NOT NULL
                     AND dtc.LAST_ANALYZED < TRUNC(SYSDATE) - 7
                   ORDER BY dtc.LAST_ANALYZED DESC) LOOP
    v_Startdate := SYSDATE;
    BEGIN
      EXECUTE IMMEDIATE 'select count(*) from ' || v_Table.Owner || '.' ||
                        v_Table.Table_Name
        INTO v_RowCount;
    EXCEPTION
      WHEN OTHERS THEN
        v_RowCount := -1;
    END;
    v_Enddate := SYSDATE;
    IF v_RowCount > 0 THEN
      dbms_output.put_line(v_Table.Owner || ',' || v_Table.Table_Name || ',' ||
                           to_char(v_RowCount) || ',' ||
                           to_char(v_Table.Last_Analyzed, 'DD-MON-YYYY') || ',' ||
                           to_char((v_EndDate - v_StartDate),
                                   '0.9999999999'));
    END IF;
  END LOOP;
END;
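Because the script reports everything via DBMS_OUTPUT, you'll need output enabled in your session before running it; for example, in SQL*Plus:

```sql
-- Enable DBMS_OUTPUT in SQL*Plus (or SQL Developer's script window)
-- before running the script above; the buffer size is a suggestion.
SET SERVEROUTPUT ON SIZE 1000000
```

The output is comma-separated, so it can be pasted straight into a spreadsheet.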

Friday, January 27, 2012

Noetix: Reporting AP Credit Card Transactions

If you are, like the company I work at, a user of both Oracle Payables and Oracle Internet Expenses you'll realise that the line between them is incredibly blurred. Especially if you have the situation where some departments/areas use i-Expenses but other areas have an Excel-based expenses system (with claims then sent to Finance, who enter the data directly into Payables).

Oracle have made the situation a little worse by licensing Internet Expenses separately from Payables, and it looks very much like Noetix have continued the trend.

What makes the Noetix decision even stranger is that the majority of the data for Internet Expenses is stored in the same tables as Payables in Oracle.

After looking at what was offered by the Noetix "Internet Expenses" module and our own reporting requirements the only "gap" that could be identified was for reporting on Credit Card Transactions (primarily around AP.AP_CREDIT_CARD_TRXNS_ALL).

Our reporting requirements are pretty simple; here is a list (with data types) of the columns we'd like to report on relating to Credit Card Transactions;

CARD_NUMBER    VARCHAR2(30)
DISPUTE_DATE    DATE
EMAIL_ADDRESS    VARCHAR2(240)
EMPLOYEE_FULL_NAME    VARCHAR2(240)
EXPENSED_AMOUNT    NUMBER
LAST_UPDATE_DATE    DATE
MERCHANT_NAME    VARCHAR2(80)
REFERENCE_NUMBER    VARCHAR2(240)
TRANSACTION_AMOUNT    NUMBER
TRANSACTION_DATE    DATE
TRANSACTION_ID    NUMBER(15)
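Most of these columns come straight from the transactions table itself; as a rough, hedged sketch of where the data lives (this is not the Noetix view - column names are taken from the requirement list above and should be verified against your instance's definition of AP.AP_CREDIT_CARD_TRXNS_ALL):

```sql
-- Hedged sketch only: column names are taken from the requirement
-- list above; verify them against your instance's definition of
-- AP.AP_CREDIT_CARD_TRXNS_ALL. EMPLOYEE_FULL_NAME and EMAIL_ADDRESS
-- come from the HR tables and are omitted here.
SELECT cct.transaction_id,
       cct.card_number,
       cct.transaction_date,
       cct.transaction_amount,
       cct.reference_number,
       cct.dispute_date,
       cct.last_update_date
  FROM ap.ap_credit_card_trxns_all cct
 WHERE cct.transaction_date >= ADD_MONTHS(SYSDATE, -1);
```

The real view also needs the joins out to the employee and card-programme data, which is where most of the trial and error went.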

Converting these requirements into a Noetix View took a fair bit of trial and error but the script is available here. As it's quite a large one I won't be copy/pasting it below (as I normally would).

Any questions/ suggestions feel free to post a comment.



Monday, January 16, 2012

Oracle EBS: Creating New Menu Items in Oracle e-Business Suite

NOTE: Don't do this on a production environment. Did that need saying? Apparently one person who submitted a comment seemed to think so ... You really can completely mess it up. Run it on a test environment and MAKE SURE IT WORKS before you run it anywhere else. The script below was written against 11i; I would be loath to run it against a different version. The point of this post is to allow you to script a change to 11i - you might be better off just doing this in the UI if you're not managing lots of instances. Anyway ... you have been warned!


This blog post takes you through a step-by-step guide to adding a new menu item (that will punch out to this Knol) to the root menu of an existing responsibility using the Oracle APIs (so the change can be scripted rather than done in the forms).

A completed example script (with error checking and reporting) is included.


Stage 1: Getting The Responsibility Details
In order to add a new menu item you need to know which set of menus your responsibility is currently using. To find this out go into the e-Business Suite, choose the "System Administrator" responsibility, and then under Security > Responsibility choose "Define".

Search for the Responsibility you wish to use. For the purposes of this example I'm going to use the "Alert Manager" responsibility as it should be one that is installed on every instance and will have a fairly limited user base.

When you view the Responsibility you will see something like this;

Figure 1: "Alert Manager" Responsibility
The important piece of information on this screen is the "Menu" (in the middle). You can see that the responsibility is using the "ALR_OAM_NAV_GUI" menu as its root. We'll add the new menu item in here.

Stage 2: Adding a Simple HTML-redirect Script To Oracle
See Linking Directly to Microsoft Reporting Services from Oracle e-Business Suite (Stage 2)

Stage 3: Creating a Function using FND_FORM_FUNCTIONS_PKG.INSERT_ROW
This API provides a quick way of creating records in the FND_FORM_FUNCTIONS set of tables; in order to use the API you first need to get a new ID from the FND_FORM_FUNCTIONS_S sequence. As we're going to be doing nothing more than a simple punch-out to Google, the API call will look something like this;
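Getting the new ID from the sequence might look like this (a fragment from inside the same PL/SQL block, with v_Id declared as NUMBER):

```sql
-- Fragment from inside the anonymous PL/SQL block: fetch the next
-- function ID (v_Id NUMBER) to pass as x_function_id below.
SELECT fnd_form_functions_s.NEXTVAL
  INTO v_Id
  FROM dual;
```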

fnd_form_functions_pkg.insert_row(
  x_rowid                    => v_RowID,
  x_function_id              => v_Id,
  x_web_host_name            => null,
  x_web_agent_name           => null,
  x_web_html_call            => 'verysimpleredirect.html?redirect=http://knol.google.com/k/andy-pellew/creating-new-menu-items-in-oracle-e/',
  x_web_encrypt_parameters   => 'N',
  x_web_secured              => 'N',
  x_web_icon                 => null,
  x_object_id                => null,
  x_region_application_id    => null,
  x_region_code              => null,
  x_function_name            => 'GOOGLEKNOL',
  x_application_id           => null,
  x_form_id                  => null,
  x_parameters               => null,
  x_type                     => 'JSP',
  x_user_function_name       => 'Google Knol Viewer', --:x_user_function_name,
  x_description              => 'Google Knol Viewer', --:x_description,
  x_creation_date            => sysdate,
  x_created_by               => 0,
  x_last_update_date         => sysdate,
  x_last_updated_by          => 0,
  x_last_update_login        => -1,
  x_maintenance_mode_support => 'NONE',
  x_context_dependence       => 'RESP',
  x_jrad_ref_path            => null);

I've called the function "GOOGLEKNOL" and it's being created by the System Administrator user (which, if you look in FND_USER, is ID 0). If you have disabled this user then it's best if you create it as someone else. You can always use your own ID but I prefer to distance myself from these created objects (it's one less thing to worry about if I ever choose to leave my job and have to hand all this over to someone else!).

Unfortunately there seems to be a bug with this API in that the "Type" (FND_FORM_FUNCTIONS.TYPE) does not appear to be written correctly to the database. In order to fix this you need to do a SQL update;

 update applsys.fnd_form_functions t
     set type = 'JSP'
   where function_id = v_ID;

Where v_ID is the ID you retrieved from the sequence earlier.

Stage 4: Associating Function with Existing Oracle Menu
This uses the FND_MENU_ENTRIES_PKG.INSERT_ROW API as published by Oracle to hook together the new menu item with the existing menu. In stage 1 we learnt that the menu we wish to alter is called "ALR_OAM_NAV_GUI" and by querying the FND_MENUS and FND_MENU_ENTRIES tables we can get the Menu ID and the next available menu sequence number as follows;

select fm.menu_id, max(entry_sequence) + 1
    from fnd_menus fm, fnd_menu_entries fme
   where fm.menu_name = 'ALR_OAM_NAV_GUI'
     and fm.menu_id = fme.menu_id
   group by fm.menu_id;

Using these values we can call the API;

fnd_menu_entries_pkg.insert_row(
  x_rowid             => v_RowID,
  x_menu_id           => v_MenuId,
  x_entry_sequence    => v_EntrySequence,
  x_sub_menu_id       => null,
  x_function_id       => v_Id,
  x_grant_flag        => 'Y',
  x_prompt            => 'Google Knol Viewer',
  x_description       => 'View Google Knol',
  x_creation_date     => sysdate,
  x_created_by        => 0,
  x_last_update_date  => sysdate,
  x_last_updated_by   => 0,
  x_last_update_login => -1);

Now we have created all the records we're almost there.

Stage 5: Running the "Compile Security" Concurrent Request
This is performed using the FND_REQUEST.SUBMIT_REQUEST concurrent request API;

apps.FND_REQUEST.SUBMIT_REQUEST(
  application => 'FND',
  program     => 'FNDSCMPI',
  argument1   => 'No')

This is a function, so you'll need to do something with the returned value (the concurrent request ID, or zero if the submission failed).
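A minimal sketch of checking that result (the user/responsibility IDs passed to the apps-context initialisation are placeholders you'd need to replace with values from your own instance):

```sql
DECLARE
  v_RequestId NUMBER;
BEGIN
  -- FND_REQUEST needs an applications context; the three IDs here
  -- (user, responsibility, application) are PLACEHOLDERS only.
  fnd_global.apps_initialize(user_id      => 0,
                             resp_id      => 12345,
                             resp_appl_id => 1);

  v_RequestId := apps.fnd_request.submit_request(application => 'FND',
                                                 program     => 'FNDSCMPI',
                                                 argument1   => 'No');
  IF v_RequestId = 0 THEN
    raise_application_error(-20001, 'Failed to submit Compile Security');
  END IF;

  COMMIT;  -- the request isn't visible to the concurrent managers until committed
END;
/
```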

Summary
After completing these steps you'll find that when you log in and switch to the "Alert Manager" responsibility you will have a new menu item, and clicking on it will bring up this Knol;

Figure 2: Completed System 
A script to perform these changes automatically (with additional error checking and reporting) is available by clicking here.

Friday, January 13, 2012

Oracle EBS: Cloning Script To Clear Email Addresses

We use the following cloning script to clear down email addresses on each of the cloned systems we create from our production EBS system. In exchange we have set up a mailbox which all the test systems feed into, and each time we create a new clone we add an email address to the mailbox so that it receives emails from the new clone (this way you can filter the messages in the mailbox to get the ones you're interested in for the clone system you are looking at).

The script itself sends an email describing what it's done to the mailbox (so we know when the system was last cloned); it also includes a check to make sure it's not being run against production (PROD) by accident!

The actual script (which was written against Oracle version 12.1.3) is below;

declare
  c_EMAIL constant varchar2(60) := '%DATABASE%_APPSTEST@<your domain>';

  v_Email    PO_VENDOR_SITES_ALL.EMAIL_ADDRESS%TYPE;
  v_Database v$Database.NAME%TYPE;
  v_Body     CLOB;
  procedure addToBody(p_Text in varchar2) as
  begin
    dbms_lob.writeappend(v_Body, length(p_Text) + 4, p_Text || '<br>');
    dbms_output.put_line(p_Text);
  end;
begin
  dbms_lob.createtemporary(lob_loc => v_Body, cache => False);

  addToBody('0010 Checking database instance ... ');
  select name into v_Database from v$Database;
  addToBody('0020 Database instance = "' || v_Database || '"');
  if v_Database = 'PROD' then
    raise_application_error(-20005,
                            'This script must NOT be run against PROD.');
  end if;

  v_Email := replace(c_EMAIL, '%DATABASE%', v_Database);
  addToBody('0100 All emails will be set to "' || v_Email || '"');

  addToBody('0110 Updating AP supplier contacts (AP_SUPPLIER_CONTACTS) records ... ');
  update AP_SUPPLIER_CONTACTS
     set EMAIL_ADDRESS = v_Email
   where EMAIL_ADDRESS is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0120 Updating AP supplier site details (AP_SUPPLIER_SITES_ALL) records ... ');
  update AP_SUPPLIER_SITES_ALL
     set EMAIL_ADDRESS = v_Email
   where EMAIL_ADDRESS is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) ||
            ' records updated (EMAIL_ADDRESS)');
  update AP_SUPPLIER_SITES_ALL
     set REMITTANCE_EMAIL = v_Email
   where REMITTANCE_EMAIL is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) ||
            ' records updated (REMITTANCE_EMAIL)');

  addToBody('0130 Updating User (FND_USER) records ... ');
  update fnd_user
     set email_address = v_Email
   where email_address is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0140 Updating Vendor Contact Points (HZ_CONTACT_POINTS) records ... ');
  update AR.HZ_CONTACT_POINTS
     set EMAIL_ADDRESS = v_Email
   where EMAIL_ADDRESS is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0150 Updating Parties (HZ_PARTIES) records ... ');
  update HZ_PARTIES
     set email_address = v_Email
   where email_address is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0160 Updating external payees (IBY_EXTERNAL_PAYEES_ALL) records ... ');
  update IBY.IBY_EXTERNAL_PAYEES_ALL
     set remit_advice_email = v_Email
   where remit_advice_email is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0170 Updating Employee (PER_PEOPLE_X) records ... ');
  update PER_PEOPLE_X
     set email_address = v_Email
   where email_address is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0180 Updating points (RA_CONTACT_PHONES_INT_ALL) records ... ');
  update RA_CONTACT_PHONES_INT_ALL
     set EMAIL_ADDRESS = v_Email
   where EMAIL_ADDRESS is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  addToBody('0190 Updating Supplier (PO_VENDOR_SITES_ALL) records ... ');
  update PO_VENDOR_SITES_ALL
     set email_address = v_Email
   where email_address is not null;
  addToBody(' ... ' || to_char(SQL%ROWCOUNT) || ' records updated');

  commit;

  -- Send an e-mail to the mailbox so you know when it was last refreshed.
  napp_emailmanager.addEmail(p_From    => v_Email,
                             p_To      => v_Email,
                             p_Subject => v_Database || ' refreshed on ' ||
                                          to_char(SYSDATE, 'DD-MON-YYYY'),
                             p_Body    => v_Body);
  napp_emailmanager.processMails;
end;


Have I missed anything?



Friday, December 30, 2011

Oracle EBS: Java "Freezing" When Starting An Oracle Form

This is a very tricky problem and certainly not something you'd want to see when rolling out a Java update across your company. The error manifests itself as a "Warning - Security" dialog that appears and doesn't allow the user to click on anything;

"Frozen" Warning - Security Java Dialog

The only way to proceed beyond this point is to kill the Java process using Task Manager;
Windows Task Manager - Showing the java.exe process

Right-click the java.exe process and choose "End Process" (and then again on the dialog that appears).

Shutdown all Internet Explorer windows.

Start the "Java Control Panel" and select the Advanced tab and expand the "Java Plug-In" node in the tree view;
Java Control Panel - Advanced Tab - Java Plug-in Node Expanded
Check the checkbox and click "Apply" (you will need an Admin account in order to save the change), when it's saved you will see the following dialog;
Java Plug-In Settings Change

Now un-check the checkbox and click "Apply" (again you will need an admin account) and again you will see the settings change dialog.

Start Internet Explorer, log into Oracle, and select a form-based function;
Oracle Form Displayed within Oracle E-Business Suite
Which should now be working correctly.