Monday, October 6, 2025

How to invoke a Web Service from BI Publisher report in Oracle Fusion Cloud

In Oracle Fusion Cloud, BI Publisher reports are widely used for generating formatted outputs and documents. A common business requirement is to invoke a web service from a BI Publisher report, say to fetch real-time data from a particular web service provider. So how do we achieve this?

Let's see the step-by-step process to achieve this.


- For demonstration purposes, I'm going to use a public web service that provides details about various countries. Since this is a public web service, it doesn't need any authentication.

http://webservices.oorsprong.org/websamples.countryinfo/CountryInfoService.wso

For example, this web service provides an operation named ListOfCountryNamesGroupedByContinent, which takes no parameters and returns the full list of countries along with the continents they belong to.
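Outside BI Publisher, you can sanity-check this operation with a plain SOAP call. The sketch below builds the request envelope and parses an illustrative response fragment (the element names mirror the service's WSDL types, but the sample response here is hand-written, not live output), so it runs without any network access:

```python
import xml.etree.ElementTree as ET

NS = "http://www.oorsprong.org/websamples.countryinfo"
SOAP = "http://schemas.xmlsoap.org/soap/envelope/"

# Request envelope for the parameter-less operation.
request = (
    f'<soap:Envelope xmlns:soap="{SOAP}">'
    f'<soap:Body><ListOfCountryNamesGroupedByContinent xmlns="{NS}"/></soap:Body>'
    '</soap:Envelope>'
)

# Illustrative response fragment (hand-written for this sketch).
sample_response = f"""
<soap:Envelope xmlns:soap="{SOAP}">
  <soap:Body>
    <m:ListOfCountryNamesGroupedByContinentResponse xmlns:m="{NS}">
      <m:ListOfCountryNamesGroupedByContinentResult>
        <m:tCountryCodeAndNameGroupedByContinent>
          <m:Continent><m:sCode>EU</m:sCode><m:sName>Europe</m:sName></m:Continent>
          <m:CountryCodeAndNames>
            <m:tCountryCodeAndName>
              <m:sISOCode>FR</m:sISOCode><m:sName>France</m:sName>
            </m:tCountryCodeAndName>
          </m:CountryCodeAndNames>
        </m:tCountryCodeAndNameGroupedByContinent>
      </m:ListOfCountryNamesGroupedByContinentResult>
    </m:ListOfCountryNamesGroupedByContinentResponse>
  </soap:Body>
</soap:Envelope>
"""

# Pull out (ISO code, name) pairs the same way BI Publisher flattens the XML.
root = ET.fromstring(sample_response)
countries = [
    (c.findtext(f"{{{NS}}}sISOCode"), c.findtext(f"{{{NS}}}sName"))
    for c in root.iter(f"{{{NS}}}tCountryCodeAndName")
]
print(countries)  # [('FR', 'France')]
```

Posting `request` to the service endpoint with Content-Type text/xml would return the real, much longer list.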



Output:



Another operation is FullCountryInfo, which accepts a country code parameter and returns the full country information.
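The request for this operation carries the country code inside the operation element. A minimal sketch of building that envelope (the parameter element name sCountryISOCode comes from the service's WSDL; treat it as illustrative):

```python
import xml.etree.ElementTree as ET

NS = "http://www.oorsprong.org/websamples.countryinfo"
SOAP = "http://schemas.xmlsoap.org/soap/envelope/"

def full_country_info_request(country_code: str) -> str:
    """Build the SOAP request envelope for the FullCountryInfo operation."""
    ET.register_namespace("soap", SOAP)
    envelope = ET.Element(f"{{{SOAP}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP}}}Body")
    op = ET.SubElement(body, f"{{{NS}}}FullCountryInfo")
    ET.SubElement(op, f"{{{NS}}}sCountryISOCode").text = country_code
    return ET.tostring(envelope, encoding="unicode")

print(full_country_info_request("US"))
```

This is exactly the kind of payload BI Publisher assembles for us when we map a report parameter to the operation's parameter later on.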



Output:


Now, let's see the steps for configuring and using this web service in our BI Publisher reports.

- Login to Oracle Fusion environment

- Append /xmlpserver to the end of the URL and press Enter



- This will get us to the Analytics area

- Click on My Profile and navigate to Administration



- Navigate to Data Sources -> Web Service Connection



- Here, we are going to create a new Web Service Data Source

For this, we are going to refer to this WSDL provided by the web service.

http://webservices.oorsprong.org/websamples.countryinfo/CountryInfoService.wso?WSDL

WSDL typically provides details about all the operations offered by the given web service -




- Let's create a new Data Source as shown below:



Protocol: http in our case

Server: Use the base URL of the web service

Port: Default http port is 80 and default https port is 443

URL Suffix: the remaining part of the WSDL URL

Check the Complex Type box

Set WS-Security to 2002

Authentication Type: Set to HTTP

I'm leaving Username and Password empty since the web service I'm using is a public service and doesn't need any authentication. If you are using a web service that needs authentication, the credentials can be entered here.

- Click Apply to save the Data Source


Let's create two Reports (and hence two Data Models) showcasing various options.


Report 1: No Parameters

List of Countries by Continent


- Let's create a new Data Model of type Web Service




- I'm going to name it CountryList_DataModel

- Here, I'm going to select Data Source as the Web Service Data Source (CountryWebService) we created in previous steps.

- It will automatically populate WSDL URL and Web Service name (CountryInfoService) from our data source.

- Under Method option, we need to choose which Operation we want to use. I'm going to select ListOfCountryNamesGroupedByContinent operation from the drop-down.





- Since this operation doesn't have any parameters, the Parameter section will show no options for configuration



- Let's save and navigate to View Data


- Click View to see the sample output data


- We can see the data provided by the web service appears here




- Let's click Save As Sample Data and Create Report using this Data Model


- In the report layout builder, we will be able to see all the columns offered by our Web Service Data Model in the left pane.


- Drag and arrange the desired columns in the layout area.




- Click Done and Save the Report


- Run the Report and we will be able to see the output based on the data provided by the web service:






Report 2: Parametric Report

Country Info Report


- Let's create a new Data Model

- We are going to use FullCountryInfo operation from our Country Web Service.

Since this operation needs a Parameter for Country Code, let's first create a Parameter P_COUNTRY_CODE in our Data Model



- Now, let's create a Data Set of type Web Service




- I'm going to name it CountryInfo_DM


- Here, I'm going to select Data Source as the same Web Service Data Source (CountryWebService) we used in previous report.


- It will automatically populate WSDL URL and Web Service name (CountryInfoService) from our data source.


- This time, under Method option, I'm going to select FullCountryInfo operation from the drop-down.





- Since this operation needs a parameter, it will automatically show us the Name of the parameter and we can map it to our Data Model parameter P_COUNTRY_CODE as shown below:




- Let's Save it and navigate to View Data


- Here, we can enter a sample country code (e.g. US) in the parameter and click View


- We'll be able to see that the web service has returned the full country details for the given Country Code




- Let's click Save As Sample Data and Create Report using this Data Model


- In the report layout builder, we will be able to see all the columns offered by our Web Service Data Model in the left pane.


- Drag and arrange the desired columns in the layout area.





- Click Done and Save the Report


- Run the Report and we will be able to see the web service output based on the country code we entered in the report parameter:






Conclusion:

Oracle Fusion Cloud offers robust and secure methods to invoke web services from within BI Publisher reports and consume the data for reporting purposes. It supports reporting based on parameter-less or parameter-based web service calls, giving us the flexibility to make a variety of HTTP or HTTPS web service calls to fetch real-time data for our reporting needs.



Friday, September 19, 2025

How to build Data Integrations from Oracle Fusion to Autonomous Data Warehouse using OCI-DI and BI Cloud Connector

In this blog, I'll walk through how to extract data from Oracle Fusion using BICC (Business Intelligence Cloud Connector) and interface it into Oracle Autonomous Data Warehouse (ADW) using OCI Data Integration Service (OCI-DI).

We’ll use a supplier example to demonstrate the process.


Step 1: Create a Custom Offering in BICC

The first step is to set up a custom offering in BICC using a Public View Object (PVO).

You can typically access the Oracle Fusion BICC console by navigating to https://<your-fusion-host>/biacm




What is a PVO?

A PVO is a Public View Object that provides a structured way to access base tables in Oracle Fusion Cloud. Some PVOs are extract PVOs, which are optimized for extraction. Non-extract PVOs may run larger queries in the background.


Example:

We created a custom offering using the Supplier PVO.




Storage Configuration:

Storage Configuration in BICC lets us connect BICC with OCI Object Storage so that the data extracts can be staged in buckets in the object storage.

In OCI, create or identify the Object Storage bucket and note the following:

- Bucket name
- Namespace
- Tenancy OCID
- Region / host URL
- User OCID for the OCI user you'll authenticate as

The OCI user must be able to read/write the bucket (appropriate IAM policies must be in place).
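For instance, the IAM policy statements might look like the following (the group, compartment, and bucket names are placeholders; substitute your own):

```
Allow group BICC-Extract-Admins to read buckets in compartment BICC-Compartment
Allow group BICC-Extract-Admins to manage objects in compartment BICC-Compartment where target.bucket.name = 'bicc-extract-bucket'
```

Scoping the `manage objects` grant to the specific bucket keeps the extract user from touching other buckets in the compartment.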

In BICC, navigate to Configure External Storage -> OCI Object Storage Connection



Fill in the fields as required:

Name: Friendly name for this connection.

Host: Object Storage endpoint URL for the region.
Example: https://objectstorage.<region>.oraclecloud.com
(e.g. https://objectstorage.us-ashburn-1.oraclecloud.com).

Tenancy OCID: Paste the tenancy OCID you copied earlier. 

User OCID: Paste the OCI user OCID for the user that will be used to access the bucket. 

Namespace: The object storage namespace for your tenancy. 

Bucket: The bucket name you created/are using.


In the OCI API Signing Key area click Generate API Signing Key. BICC generates a key and shows the Fingerprint.

Click Export Public Key. This will download a .pem file (public key).



BICC keeps the private key for signing. You will paste the public key into the OCI Console for the user specified above.


In OCI Console, add the exported public key to the OCI user.

- Navigate to OCI Console -> Identity & Security -> Users -> Select the user (the same OCID you entered above).

- Under API Keys, click Add Public Key -> provide the .pem file exported from BICC. Click Add.

The fingerprint shown in OCI should match the fingerprint shown in the BICC UI.
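OCI key fingerprints are the MD5 digest of the DER-encoded public key, written as colon-separated hex pairs, which is why the two UIs can be compared directly. A small sketch of that formatting (the input bytes here are dummy data, not a real key):

```python
import hashlib

def oci_fingerprint(der_bytes: bytes) -> str:
    """Format a public key's bytes the way OCI displays API key
    fingerprints: MD5 digest as colon-separated hex pairs."""
    digest = hashlib.md5(der_bytes).hexdigest()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

# Dummy bytes standing in for a DER-encoded public key.
print(oci_fingerprint(b"example-public-key-der-bytes"))
```

A real fingerprint is computed over the actual DER bytes of the uploaded public key, so matching values in BICC and OCI confirm the same key is on both sides.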


Now, navigate back to BICC console -> OCI Object Storage Connection

Under Verification, click Test Connection.

If all goes well, the test will succeed.

Save the connection and close.

At this stage we have completed all the required configurations in Oracle Fusion BI Cloud Connector (BICC).



Step 2: OCI Data Integration

Oracle OCI provides a cloud-native service named Data Integration to extract, transform, and load data.


- Navigate to OCI Data Integration.



- Go to Workspaces -> Create a workspace (e.g. BICC-ADW)



- Define data assets (connectors) for the source and target:

1. BICC Data Asset (Source)


- Type: Oracle Fusion Applications

- Oracle Fusion Applications host: Provide the host of your Fusion pod

- Default connection type: Oracle BI Cloud Connector

- Provide the Fusion username and password




2. ADW Data Asset (Target)


- Type: Oracle Autonomous Data Warehouse

- Upload Wallet: Upload the database wallet pertaining to your ADW

- Wallet Password: Provide if applicable

- Provide database username and password

- Select the desired TNS alias from the available list





3. Object Storage Data Asset (for staging the intermediate files)


- Type: OCI Object Storage

- Namespace: Provide the namespace in which your bucket resides

- OCI Region: Select the applicable region as per your tenancy




Once the above steps are completed, we should have the below data assets configured:





Step 3: Build a Data Flow


Once data assets are ready, create a project in Data Integration and define data flows:

Project:



Create a new data flow for extracting supplier data from the BICC offering we created earlier.





1. Source Node:

    Use the BICC data asset.

    Select the custom offering as the schema.

    Choose the PVO you want to extract.

    Set extract strategy to incremental to pull only new or updated records.

    Select “Manage” to let OCI-DI track the last run date automatically.



2. Target Node:


    Use the ADW data asset.

    Specify the schema and target table.

    Set the merge strategy: new rows are inserted, updated rows are updated.

    Define merge keys (unique identifiers for records).



3. Filtering:

Apply a filter based on the system variable ${SYS.LAST_LOAD_DATE}, ensuring only delta data is extracted.

This approach allows data to be pulled incrementally from BICC and merged into ADW seamlessly.
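Conceptually, the data flow above can be sketched in a few lines: filter the extract on the last-load watermark, then upsert into the target on the merge key. The sketch below is a plain in-memory illustration of that logic, not OCI-DI code; the column names (SUPPLIER_ID, LAST_UPDATE_DATE) are hypothetical stand-ins:

```python
from datetime import datetime, timezone

def incremental_merge(source_rows, target, last_load_date, merge_key="SUPPLIER_ID"):
    """Mimic the data flow: keep only rows changed after the watermark
    (the ${SYS.LAST_LOAD_DATE} filter), then upsert on the merge key."""
    delta = [r for r in source_rows if r["LAST_UPDATE_DATE"] > last_load_date]
    for row in delta:
        target[row[merge_key]] = row  # insert new rows, overwrite existing ones
    return len(delta)

watermark = datetime(2025, 9, 1, tzinfo=timezone.utc)
extract = [
    {"SUPPLIER_ID": 101, "NAME": "Test Supplier",
     "LAST_UPDATE_DATE": datetime(2025, 9, 18, tzinfo=timezone.utc)},
    {"SUPPLIER_ID": 90, "NAME": "Old Supplier",
     "LAST_UPDATE_DATE": datetime(2025, 8, 1, tzinfo=timezone.utc)},
]
adw_table = {90: {"SUPPLIER_ID": 90, "NAME": "Old Supplier"}}

merged = incremental_merge(extract, adw_table, watermark)
print(merged, sorted(adw_table))  # 1 [90, 101]
```

Only the row updated after the watermark crosses over; the untouched row is skipped, which is the behavior that keeps daily refreshes cheap.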




Step 4: Encapsulate Data Flow in a Task


Each data flow is wrapped in a task for execution.




Step 5: Build a Pipeline


- We can organize multiple tasks using a pipeline.

- Create a Pipeline and use the Integration Task as one of the nodes between Start and End.

- We can integrate many such nodes in Serial or Parallel manner in a Pipeline, based on the requirements.

- We can also apply conditional logic, expressions, or SQL tasks.

- Pipelines allow complex orchestration similar to Oracle Integration Cloud (OIC) integrations.





Step 6: Encapsulate Pipeline in a Task and Publish the tasks to Application


- Similar to the data flow, the pipeline also needs to be wrapped in a task for execution.

- Once done, Publish the Tasks to the Application





Step 7: Run the Pipeline


- Let's first create a new supplier (Test Supplier) in Fusion so that we can validate whether this record gets interfaced to ADW once we run the pipeline.



- Navigate to the OCI-DI Application

- Go to Tasks -> Locate the Pipeline Task -> Click the context menu -> Run



- OCI-DI Pipeline run can be tracked under Runs section




- Behind the scenes, OCI-DI automatically submits an ESS job to extract data from BICC.



- Data is extracted and stored as a CSV in OCI Object Storage.



- The pipeline completes successfully, and under the Integration Task node we can see that it extracted 8 new records from Fusion into ADW




Step 8: Verify the data

- As we can see, the newly created supplier (Test Supplier) has been successfully interfaced into ADW




Conclusion:

OCI Data Integration (OCI-DI) provides a powerful, cloud-native way to move data from Oracle Fusion BICC into Autonomous Data Warehouse (ADW) seamlessly. By leveraging features like incremental extraction, managed load strategies, and merge operations, organizations can avoid heavy full-load extractions and instead synchronize only the data that has changed since the last run. This not only improves performance and reduces costs but also ensures that ADW always reflects the latest state of Fusion data with minimal lag.

With its intuitive workspace, reusable data assets, and orchestration pipelines, OCI Data Integration enables teams to design and automate robust data flows across multiple PVOs and Fusion modules. Combined with BICC’s ability to export data securely to OCI Object Storage, the solution establishes an efficient and scalable data integration pattern that supports daily refreshes, near-real-time reporting, and downstream analytics with confidence.


