Oracle Fusion BICC Interview Questions

Updated: Mar 17, 2022

In this post, I will cover some interview questions about the Business Intelligence Cloud Connector (BICC). In a way, I will cover the A to Z of BICC.



 
1. What is BICC?

BICC is an out-of-the-box feature that comes with Oracle SaaS offerings such as the ERP, SCM, HCM, and CX clouds. It was introduced to meet bulk-extract needs for external data warehouse requirements.

2. How does data extraction happen from BICC?

BICC exposes Data Stores, where ADF View Objects (VOs) are made available to extract the data. A number of VOs are available for each module; the full list can be found in Oracle's documentation.

3. Where do you store the extracted data from BICC?

You have three options available:

1. UCM
2. OCI Object Storage
3. Storage Service Connection (this can be OAC external storage or OCI File Storage / cloud SFTP)

4. Can we write the extracted files to non-Oracle storage services like Amazon S3 or Azure Blob Storage?

No.

5. Can we add multiple data stores to a schedule job?

Yes.

6. Do BICC extracts support incremental data loads?

Yes. When you run an extract, the BICC job stores the last run date. When the subsequent job fires, the extract picks up only the data from the last run date onward. It is also possible to reset this date to a date of your choosing.
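The incremental mechanism can be sketched as follows. This is a simplified illustration of the idea, not BICC's actual implementation; the row structure and field names are hypothetical.

```python
from datetime import datetime

def incremental_extract(rows, last_run_date):
    """Return only rows changed after the stored last run date,
    mimicking (in spirit) how an incremental extract limits its output."""
    return [r for r in rows if r["last_update_date"] > last_run_date]

rows = [
    {"id": 1, "last_update_date": datetime(2022, 3, 1)},
    {"id": 2, "last_update_date": datetime(2022, 3, 10)},
]
last_run = datetime(2022, 3, 5)
print([r["id"] for r in incremental_extract(rows, last_run)])  # → [2]
```

Resetting the last run date to an earlier value, as BICC allows, would simply widen this window and re-extract older rows.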

7. What is the Global Data Store List in BICC?

Under the Global Data Store List, you will see offerings such as Financials, Public Sector, Sales, and so on. You have to select the area of your choice to access the relevant business objects.

8. Is there a way to filter the data while extracting it?

Yes. When you edit a data store, you get an option to add a filter. It must follow the syntax __DATASTORE__.<BI VO column name>, for example __DATASTORE__.InvoiceId = 1234, where InvoiceId is a column of data type NUMBER in the BI VO. Similarly, to filter results based on a timestamp, you can follow the format __DATASTORE__.InvoiceDate >= TIMESTAMP '2019-01-15 07:45:00', where the results are filtered as per the given parameter. You can use the appropriate operators such as = or >= to get the expected results.

9. Can I create a custom offering?

No.

10. What are the Data Store Options?

1. Silent Error: any error that occurs during the extraction process is ignored when this option is selected.
2. Disable Effective Date Filter: if your data is not driven by effective dates, select this option to bring in all the historical data.
3. Extract Data Store Metadata: enabled by default; this creates a .mdcsv file along with the data extract.
4. Use UNION All: enable this if you schedule incremental extracts.

11. Can I create custom data stores?

Yes.

12. What if the columns I am looking for are not available via any of the BICC VOs?

You have to rely on BIP or the bulk-extract options from Oracle ERP Cloud.

13. What are the limitations of BICC?

1. You cannot create custom data stores.
2. You cannot write the data to non-Oracle storage options.
3. You cannot add new columns to the data stores.
4. The VOs represent a specific functionality and are not true representations of the underlying tables.
5. There is no REST support for submitting BICC extract jobs; you need to rely on the SOAP web service.

14. What are the advantages of BICC?

Bulk data extraction is very fast; for example, on the order of 100 million rows can be extracted in roughly 30 to 40 minutes.

15. What are Extract Preferences?

1. Language - select the language your extracts are to be made in.
2. Prune Time in Minutes - when jobs are scheduled for incremental loads, the prune time determines how long before the last extract date to extract data from, to ensure data dependencies across objects are honoured. The default is 1,440 minutes (24 hours).
   For example: an extract runs on 2/25, so the last extract date is updated to 2/25. Another extract runs 24 hours later, on 2/26. It would filter data from 2/24 forward, so the extract would cover 48 hours of data, i.e. from 2/24 to 2/26.
3. Job Settings - handles the timeout options; the default is 10 hours.
4. File Parameters - contains options such as file compression (zip, gzip), split file size in GB (default 1 GB, configurable from 1 to 5 GB), and upload file expiry days (default 90 days).
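The prune-time arithmetic in the example above can be sketched in a few lines. This is only an illustration of the window calculation; the function name is my own, not a BICC API.

```python
from datetime import datetime, timedelta

def extract_window_start(last_extract_date, prune_time_minutes=1440):
    """Start of the incremental window: the prune time is subtracted
    from the last extract date so late-arriving rows are not missed."""
    return last_extract_date - timedelta(minutes=prune_time_minutes)

# Last extract ran on 2/25; with the default 24-hour prune time,
# the next extract reads data from 2/24 forward.
start = extract_window_start(datetime(2022, 2, 25))
print(start)  # → 2022-02-24 00:00:00
```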

16. While creating or editing schedules, how many external storage options are supported?

Two.

17. What are the retry parameters?

Connections to the BI Server, or queries, may fail during the extraction, which causes retries. In the analytic server connection retry limit field, you can specify the number of connection attempts made to complete the extraction.
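The retry behaviour amounts to a bounded-retry loop. The sketch below is an illustration only; run_query stands in for the BI Server call and is not a real BICC function.

```python
def extract_with_retries(run_query, retry_limit=3):
    """Attempt the extraction query up to retry_limit times,
    re-raising the last error if every attempt fails."""
    last_error = None
    for _attempt in range(retry_limit):
        try:
            return run_query()
        except ConnectionError as exc:
            last_error = exc  # connection dropped; try again
    raise last_error

# Usage: a query that fails twice, then succeeds on the third attempt.
attempts = {"n": 0}
def flaky_query():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("BI Server connection lost")
    return "extract complete"

print(extract_with_retries(flaky_query))  # → extract complete
```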

18. Is it possible to send notifications for the extract jobs?

Yes: on start, success, or failure.

19. What is an MDCSV file?

It contains the metadata definition of the extract.


20. What chunking options are available?

In the advanced extract configurations, you can specify an initial extract date, and chunking by creation date or by primary key for full loads.
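Chunking a full load by primary key boils down to splitting the key range into independent slices. The sketch below illustrates the idea only; it is not how BICC computes its chunks internally.

```python
def chunk_key_range(min_id, max_id, chunk_size):
    """Split an inclusive primary-key range into (low, high) chunks,
    so each chunk can be extracted as an independent query."""
    chunks = []
    low = min_id
    while low <= max_id:
        high = min(low + chunk_size - 1, max_id)
        chunks.append((low, high))
        low = high + 1
    return chunks

print(chunk_key_range(1, 10, 4))  # → [(1, 4), (5, 8), (9, 10)]
```

Chunking by creation date works the same way, except the range being sliced is a date interval rather than a key range.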

21. What is BI Broker extract mode?

This can be enabled from Manage Extract Mode. When this option is enabled, extract jobs don't use the BI Server; instead, they interact directly with the data stores and the cloud application source database. Please note that not every data store supports broker mode.






