
Tips for Optimizing the Performance of Web Intelligence Documents

Last updated: 06-10-2021

DRAFT DISCLAIMER - This document is a work in progress and will be released 1 chapter at a time. Please follow, bookmark and subscribe to updates to ensure you are notified of the latest changes. It is also a living document and we would love to hear your feedback and tips & tricks! Comment or private message anything you would like to see added, changed or removed.
.
Change History
10-02-2014 - Created initial document structure and completed Chapter 1 - Client Side Performance
10-08-2014 - Made some minor updates to the formatting and some links
10-09-2014 - Started Chapter 2 - Process Best Practices
10-15-2014 - Finished Chapter 2; fixed some formatting issues
10-17-2014 - Updated Introduction to discuss overlap with SCN DOC http://scn.sap.com/docs/DOC-58532
10-22-2014 - Started Chapter 3; Tips 3.1 - 3.4 added
10-24-2014 - Added Tips 3.5 and 3.6
10-31-2014 - Added Tips 3.7 - 3.9 to complete Chapter 3
11-14-2014 - Completed Chapter 4 and published the latest version of the document
12-01-2014 (Jonathan Brown) - Modified the list of functions that can turn off caching, as per Matthew Shaw's suggestion in the comments
12-09-2014 - Added link to Ted Ueda's blog about sizing
02-19-2015 - Added Tip 4.9 on Semantic Layer security impacts on performance
05-21-2015 - Added Tip 3.10 (Mandatory vs. Optional prompts); started Chapter 7
12-04-2015 - Added Tips 4.10, 5.5 and 1.7; added link to the new Performance Testing Pattern Book in the Introduction section
12-21-2015 - Added Tip 3.11 and updated Tip 3.7
05-11-2017 - Added Tip 2.5 - Configure a separate database for BI Commentary
29 Nov 2017 (Pascal Gaulin) - Updated Chapter 1 with the latest developments (HTML feature parity with Java, new FIORI interactive client)
 
Introduction
.
This document will become a central repository for all things Web Intelligence & Performance related. It is a living document and will be growing over time as new tips, tricks and best practices are discovered. We encourage suggestions and contradictions on the content within and hope the community will collaborate on this content to ensure accuracy.
.
Please feel free to bookmark this document and receive email notifications on updates. I would also love to hear your feedback on the contents of this doc, so feel free to comment below, private message me, or just like and rate the document to give me feedback.
.
I am the writer of this document but information contained within is a collection of tips from many sources. The bulk of the material was gathered from within SAP Product Support and from the SAP Products & Innovation / Development teams. Some of the content also came from shared knowledge on the SAP Community Network and other like websites.
.
The purpose of this document is to bring awareness to known issues, solutions, and best practices in hopes of increasing the throughput of existing hardware, improving the end user/consumer experience, and saving time and money on report design and consumption.
.
The origin of this idea was an Americas' SAP Users Group (ASUG) session presented in September 2014. That presentation spawned this document as well as another high-level best practices document found here: Best Practices for Web Intelligence Report Design
.
While the purpose of this document is to focus on the performance of Web Intelligence documents, the Best Practices guide above covers high-level best practices across Web Intelligence in general. There is a lot of overlap between the two documents, as they both spawn from the same source presentation at the 2014 ASUG user conference.
.
** NEW ** (12/2015) - BI Platform 4.x Performance Testing Pattern Book V1.0 released! WIKI - BI Platform 4.x Performance Testing Pattern Book - Business Intelligence (BusinessObjects) - SCN Wiki
.
Chapter 1 - Client Side Performance
.
Client side performance tips and tricks cover anything that is specific to the client machine. This includes the new FIORI interface, the HTML, Applet and Rich Client interfaces, as well as, to a certain degree, the browser that the client uses.
.
TIP 1.1 - Use HTML Interface for Faster viewing/refreshing of Reports
.
The HTML Interface is a light-weight thin client viewer. It uses HTML to display and edit the WebI Documents. Since it is a thin client application that requires little more than displaying and consuming HTML, it is a great choice for those users that want fast document viewing and refreshing in their browser.
 
Since version 4.2 SP4, the HTML interface is almost identical to the Applet interface and can therefore be used for any document editing task.

Chapter 1.4 of the Webi User Guide covers the few remaining differences between the HTML, Applet and Rich Client interfaces. Review it to help you decide whether or not the HTML interface will do everything you need it to do.
..
Below is a link to our Web Intelligence documentation page on our Support Portal. Go to the End User Guides section to find the latest Webi User Guide documentation.
..
All documentation can be found at the SAP Help Portal  https://help.sap.com/viewer/product/SAP_BUSINESSOBJECTS_WEB_INTELLIGENCE/  .
.
TIP 1.2 - Upgrade to BI 4.1 SP03+ for single JAR file Applet Interface
.
BI 4.x introduced a new architecture for the Applet Interface, aka Java Report Panel/Java Viewer. Previous versions were a single JAR file called ThinCadenza.jar.
.
BI 4.0 and earlier versions of BI 4.1 split this architecture out into over 60 jar files. This was done for ease of maintenance and deployment originally but Java updates later made this architecture more cumbersome. Java security updates and restrictions that are now enforced by default have made the performance of this new architecture too slow in many cases.
.
BI 4.1 SP03 and above have reverted back to a single .jar file deployment. This will often improve performance on the client side due to a reduced number of security and validation checks that have to happen on each .jar file.
.
The What's New Guide below talks about this change briefly. It should mostly be invisible to end users, except for maybe the improved performance.
.
TIP 1.3 - Ensure Online Certificate Revocation Checks aren't slowing down your Applet Interface
.
Online certificate revocation checks are turned on by default in newer versions of the Java Runtime Environment (JRE). These basically tell the client side JRE to go out to online servers to validate the certificates that the applet jar files are signed with. On slower networks, this can add a lot of overhead.
.
Older versions of the JRE did not have this enabled by default so it wasn't an issue.
.
Since BI 4.x had 60+ jar files to load for the Applet, it could potentially take much longer to run these checks across all of those files. On slower internet connections, this could equate to several minutes of delay!
 
I talk about this in much more detail in the following Wiki and KBA:
.
WIKI - Tips for Fine Tuning Performance for the Webi Applet
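If you do decide to tune this behavior, it is typically done through the JRE's deployment.properties file (either per user or centrally via a system-level deployment.config). The line below is a hypothetical sketch only; the property name and allowed values should be verified against the Java version you actually deploy, and relaxing revocation checking is a security trade-off that your security team should sign off on:

deployment.security.revocation.check=PUBLISHER_ONLY

Setting the value back to ALL_CERTIFICATES restores the default, most thorough (and slowest) behavior.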
.
TIP 1.4 - Make sure JRE Client Side Caching is working
.
When troubleshooting client side JRE performance issues, one of the first things you want to check is that JRE Caching is enabled and WORKING. We have seen issues with performance when caching was either disabled, configured incorrectly, or was just not working because of some sort of system or deployment issue.
.
One example is on a Citrix deployment. Since each user can potentially have a unique and dynamic "Users" folder, the cache may not be persistent across sessions. Setting the cache to a common location that can be persistent across sessions may help in this type of scenario.
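For the Citrix case, one approach is to point the JRE cache at a persistent location through deployment.properties. The line below is an assumption-level sketch only; the path is a placeholder and the property name should be confirmed for your JRE version:

deployment.user.cachedir=D:/JavaCache

After changing it, verify in the Java Control Panel (General > Temporary Internet Files > Settings) that caching is enabled and that the new location is actually being used.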
.
We cover a lot more on how to enable and validate JRE cache in my Wiki below
.
TIP 1.5 - Ensure you are not running into these known JRE Security Change issues
.
A number of Java security updates and changes have caused issues with the Applet Interface. The known issues are well documented and can be found on this Wiki:
.
WIKI - Web Intelligence and Oracle Java Runtime Engine Known Issues
.
This is divided into individual sections for the known issues on different XI 3.1 and BI 4.x versions.
.
Here are direct links for the BI 4.0 and BI 4.1 known issues pages
.
While these are not technically performance issues, they will slow down your end users and will cause delays in viewing, modifying and refreshing documents and instances.
.
SAP only releases Patches/Support Packs every few months so when Oracle security changes come into play, there can sometimes be a bit of a delay before we can have a patch out to resolve/address the change. Keep this in mind when pushing the latest and greatest Oracle JRE updates out to your clients.
.
Known Issue - Applet Performance Issue with JRE 8
** NEW - SAP Note 2260485 - The Web Intelligence applet takes much longer to launch with Java Runtime Environment 8 installed
.
TIP 1.6 - Choose the right client - WebI Rich Client vs HTML vs FIORI vs Applet Interfaces
.
Each of the Interfaces has a list of pros and cons. Choosing the right client interface for Web Intelligence is about striking a balance between functionality, performance and convenience.
.
Chapter 1.4 of the Webi User Guide covers the differences between the interfaces. Reading and understanding this should help you decide which interface to use. It's often not as cut and dried as standardizing on only one interface, though. Some users may like the HTML interface for viewing documents but prefer the Rich Client interface for creating and editing documents. It is really up to the user which interface they use.
.
Use the Portal link below to find the latest Webi User Guide. Chapter 1.4 covers the interface differences.
.
As a general guideline, we recommend the following use cases for the interfaces:
.
WebI FIORI Interface
This is the latest WebI client. It is available since version 4.2 SP4.
With a modern look and feel, touch-enabled, based on HTML5 and SAP FIORI, it works equally well on a laptop and a tablet.
Does not support an editing mode, but in reading mode offers interactive functionality not available in any of the other clients.
 WebI HTML Interface
Best interface for report consumers who will mostly be running predesigned reports and doing only light modifications
The HTML interface utilizes the 64-bit backend servers and, since 4.2 SP4, has nearly the same design capabilities as the Applet interface
.
WebI Applet Interface
Best interface for report designers and power users who will be creating, modifying and doing advanced analysis of documents and data.
This interface takes advantage of 64-bit backend servers and can generally handle larger amounts of data/calculations as it utilizes backend servers to do the heavy lifting.
Since this is a Web Application, timeouts can occur when leaving the session idle or when carrying out long running actions.
However, the Java plugin is gradually being phased-out by the browser vendors. Therefore, the WebI Applet interface might soon become obsolete.
.
WebI Rich Client Interface
This standalone desktop client has almost all of the features and functionality that the Applet interface does, plus a few additional features of its own. It should be used by advanced designers and power users who want a stable design environment for larger documents
Can be used with local data sources and some desktop type data sources such as Excel and Access
Also can be used in 3-tier mode which takes advantage of backend servers for data retrieval
 
** TIP 1.7 - Hide the Left Panel to gain some time when opening documents
.
Documents will open faster if you hide the left panel for a document by default. This is because WebI does not need to spend the extra time to render the different components that make up that left panel until they are needed. This will give the end user a faster initial load time and will postpone the delay to the point they need to see that left panel. If they never need to open the left panel, then that time is saved for the entire workflow!
As an added benefit, the users will see more of the report on initial view and will have to do less scrolling to see the whole width of the page.
.
Notes:
This does require the users to know how to find that panel again. So some user training might be necessary. This option is set Per User / Per Client as well so once a user minimizes this in a particular viewer, it should persist until they expand it again.
The new Web Intelligence FIORI interface has no left panel.
.
Here is where the option is:
.
Be sure to save your document with it minimized to benefit from this tip
.
Recommendation: Save your documents with the left panel minimized to help reduce the load time of a document. Especially helpful for report consumers who will not be doing any modification of the documents.
.
Chapter 2 - Process Best Practices
.
When we talk about "Process" Best Practices, we are really talking about the workflows around how we utilize Web Intelligence reports in our Business Processes.
.
This chapter will cover a number of Best Practices that will allow you to build good business processes or workflows around your Web Intelligence documents
.
TIP 2.1 - Schedule reports to save time and resources
.
This may seem like a no-brainer but we see countless support incidents come through that could be avoided with a simple process around when to schedule a document vs view it on-demand.
.
The Best Practices threshold for Scheduling is 5 minutes. If a report takes more than 5 minutes to refresh and render, then that report should be scheduled.
.
Scheduling allows for a user or administrator to offload the processing of a document to a backend server so they are not forced to sit and wait for the report to finally come up on their screen.
.
Provides lower user wait times when implemented correctly
Allows us to offload processing to non-peak times
Can help reduce sizing requirements for concurrent users
Reduces impact on Database during Peak times
Can combine Instances with Report Linking to produce smaller, faster documents
.
Studies have shown that in today's world, end users are unlikely to wait for more than 5 seconds for a video to load. For example, if you are on YouTube and click the Play button, would you wait 5 minutes for that video to load up and start playing? I think most of us would give up or try to refresh the video again after about 10-20 seconds.
.
This holds true for web application users too. If a report doesn't display within a minute or two, the consumer is very likely to close the request and try again, or just give up altogether. The danger in submitting the request again is that they are using up even more resources on the backend servers when they do this. Here's a workflow as an example:
.
UserA logs in to BI Launchpad and navigates to the "Monster Finance Report" document
UserA Views this document and clicks the refresh button to get the latest data
After about 2 minutes, UserA is not sure what is going on. The report appears to be refreshing still, but given the fact that UserA is impatient, he suspects that the refresh has "hung" and closes the viewer.
UserA decides to test his luck and submit the request again. This essentially creates a new request for the same data and potentially works against BOTH requests as they compete for resources on the BI servers and the database side.
After a few more minutes, UserA gives up and moves on. Meanwhile he has no idea the amount of resources and time he's wasted in the background.
.
In the above scenario a few bad things happened:
UserA Never got his report and had a bad experience
Backend resources were wasted without any usable results
.
Both of these could have been avoided by building processes around proper use of scheduling.
.
Here are some tips on how to best utilize scheduling:
.
Educate your users to schedule anything that takes over 5 minutes to run
Encourage users to Schedule reports that they know they will need throughout the day to non-peak hours before their day begins
Schedule Documents to formats that you know your end users will want such as Excel, Text, or PDF. This can save time and resources during the day
Utilize Publications when multiple users have different requirements for the same documents
.
For more information on scheduling objects and Publications, see the BI Launchpad User Guide: https://help.sap.com/viewer/p/SAP_BUSINESSOBJECTS_BUSINESS_INTELLIGENCE_PLATFORM
TIP 2.2 - Use the Retry options when Scheduling to Automate Retries
.
Although this isn't really a true performance tip, I do find that it is a best practice that goes hand in hand with scheduling. It often amazes me how many people are not aware of the Retry functionality within the Schedule dialog (CMC only). This feature allows you to configure your scheduled instances to retry a set number of times, waiting a set number of seconds between attempts, if a failure occurs.
.
Here is a screenshot of this option in BI 4.1
.
Where this tip DOES save you time is in hunting down and manually rescheduling reports that may have failed due to database issues or resource issues on the BI Platform side. Intermittent failures are usually tied to resources somewhere in the process flow so simply setting up retries a few minutes apart can help in limiting the number of true failures we see in a busy environment.
.
This option can be set in the Default Settings/Recurrence section of the Schedule Dialog or under the Schedule/Recurrence section. The difference between the two is that the Default Settings option will set the default retry values for any future schedules. Setting it under the Schedule section only sets it for that particular schedule.
.
NOTE: It is important to note that this option is only available in the CMC and not through BI Launchpad currently
.
TIP 2.3 - Use Instance Limits to help reduce the # of Instances in your environment
.
This is another little known feature that you can use to help improve the performance of your system. The feature is called Instance Limits and you can set it on a Folder or Object Level.
.
The basic concept is that you can set limits on the number of instances a folder or object will keep. If the limit is exceeded, the CMS will clean up the oldest instances to help reduce the amount of metadata and resources stored in the CMS database and on the Filestore disk.
.
Here are the basic instructions on how to enable and set limits, as found in the CMC Help guide:
.
Setting limits enables you to automatically delete report instances in the BI platform. The limits you set on a folder affect all objects in the folder.
At the folder level, you can set limits for:
The number of instances for each object, user, or user group
The number of days that instances are retained for a user or a group
.
Steps to enable Instance Limits in the CMC:
1. Go to the Folders management area of the CMC.
2. Locate and select the folder for which to set limits, and select Actions/Limits.
3. In the Limits dialog box, select the "Delete excess instances when there are more than N instances of an object" check box, and enter the maximum number of instances per object the folder can contain before instances are deleted. The default value is 100.
4. Click Update.
5. To limit the number of instances per user or group, click Add beside the "Delete excess instances for the following users/groups" option.
6. Select a user or a group, click > to add the user or group to the Selected users/groups list, and click OK.
7. For each user or group you added in step 6, in the "Maximum instance count per object per user" box, type the maximum number of instances you want to appear in the BI platform. The default value is 100.
8. To limit the age of instances per user or group, click Add beside the "Delete instances after N days for the following users/groups" option.
9. Select a user or a group, click > to add the user or group to the Selected users/groups list, and click OK.
10. For each user or group you added in step 9, in the "Maximum instance age in days" box, type the maximum age for instances before they are removed from the BI platform. The default value is 100.
11. Click Update.
Below is a screenshot of the dialog for your reference
Below is a screenshot of the dialog for your reference
.
Once you have enabled Instance Limits, you will have better control over the size of your CMS database and Input/Output FRS. A bloated CMS database and Filestore can definitely contribute to a slower running BI system in general, so having a handle on this can help keep your system running at top speed.
.
TIP 2.4 - Platform Search Tweaking for Performance
.
Have you ever seen a bunch of resources (CPU/RAM) being used on your BI Platform server without any user activity? If you have, this is most likely the Continuous Crawl feature of Platform Search doing a whole lot of indexing.
.
What is Platform Search?
.
Platform Search enables you to search content within the BI platform repository. It refines the search results by grouping them into categories and ranking them in order of their relevance.
.
There is no doubt that Platform Search is a great feature! It is just a factor that needs to be taken into consideration when sizing an environment for Performance.
.
The BI Administrators guide talks about this feature and how to configure it
DOC - https://help.sap.com/viewer/p/SAP_BUSINESSOBJECTS_BUSINESS_INTELLIGENCE_PLATFORM - Chapter on Platform Search
..
When BI 4.0 first came out, support saw a lot of instances where customers were seeing performance degradation and resource issues on their system AFTER migrating the bulk of their content over to the new BI 4.0 system.
.
After an extensive investigation, we discovered that in most of these cases, the issue was the Indexing of this "new" content that was added to the server.
So how does this affect performance? How can adding new content to a BI 4.x system cause Processing Servers and other resources to spike up?
.
Behind the scenes, the Platform Search application detects that there is new content that needs to be indexed and cataloged. This means that every new object (Webi doc, universe, Crystal report, etc.) needs to be analyzed, cataloged and indexed by the Search Service. To do this, the Platform Search Service, found on an Adaptive Processing Server, will utilize Processing Servers (Webi, Crystal, etc.) to read the report contents and generate an index that it can use to map search terms to the content. Really cool functionality, but with large documents containing lots of data, objects, keywords and so on, this can add a lot of overhead to the system. Especially if a lot of new objects are added at once.
.
By default the indexer is configured to Continuously Crawl the system and index the Metadata of the objects. If you find this is taking up a lot of resources on your system then you may want to use the Schedule option to control when it runs. Running indexing outside of regular business hours or peak times would provide you with the best performance
.
Luckily we can configure the frequency and verbosity level used by the Indexer. These options are discussed in Chapter 22 of the Administrators guide above.
.
In short, be sure to keep Platform Search on your radar in case you have unexplained resource consumption on your server.
.
TIP 2.5 - Configure a separate database for BI Commentary
In BI 4.2 SP3 a new feature was introduced: BI Commentary. By default it uses the Auditing database, and in large deployments this can cause poor performance when opening any Web Intelligence document.
Please see KBA 2346055 for more information.
 
Chapter 3 - Report Design Best Practices
.
This chapter will discuss some Report Design Best Practices that can help you optimize your report for Performance. These tips should be considered whenever a new report is being designed. A lot of these can also be applied to existing reports with little effort.
.
A compilation of report design tips & tricks, not necessarily related to performance, can also be found in the below document by William Marcy. This is a great document and a must-see for anyone striving to design better reports.
.
DOC - Webi 4.x Tricks - By William Marcy & various other contributors on SCN.
.
TIP 3.1 - Steer Clear of Monster Webi Documents
.
A "Monster Document" is a document that contains many large reports within in. A Web Intelligence document can contain multiple Reports. When we are referring to Reports, we mean the tabs at the bottom of a Webi document. We often use the term Report to mean a Webi Document, but it is important to differentiate between the two. A document can contain multiple reports.
.
When creating a Document, we need to start with the actual Business Need for that document. We can do this by asking the stakeholder questions like:
.
What is the primary purpose of this document?
What question(s) does this document have to answer?
How many different consumers will be utilizing this document?
Can this document be split into multiple documents that service smaller, more specific needs?
.
By asking questions like the above, we are drilling in on what the actual needs are and can use the answers to these questions to help eliminate waste. If we build a Monster Document that accounts for every possible scenario that a consumer may want to look at, then we are potentially wasting a lot of time for both the document designer and the consumer. For example, if only 10-20% of a large document is actually utilized by the consumer on a regular basis, then that means 80-90% of the document is waste.
.
Once we know the Business Needs of the consumer, we can design a focused document that eliminates much of the waste.
.
Below are a few recommended best practices to keep in mind when building a document:
.
Avoid using a large number of Reports (tabs) within a Document
10 or less Reports is a reasonable number
Exceeding 20 reports in a single document should be avoided
Creating smaller documents for specific business needs allows for faster runtime and analysis
Utilize Report linking to join smaller documents together. This is discussed more in TIP 3.2
Aim to satisfy only 1-2 business needs per document.
Provide only the data required for the business need(s) of the Document
50,000 rows of data per document is a reasonable number
Do not exceed 500,000 rows of data per document
Do not add additional Data Providers if not needed or beyond document needs
5 data providers is a reasonable number
Do not Exceed 15 data providers per document
.
There of course will be exceptions to the above recommendations but I urge you to investigate other ways of designing your documents if you find your document is growing too large.
.
You will see the following benefits by creating smaller, reusable documents based only on the business needs of the consumers.
Reduce the time it takes to load the document initially in the viewer/interface
Smaller documents will load quicker in the viewers. This is because the resources needed to transfer the document and process it initially will be much less with smaller documents.
Reduce the refresh time of the document.
The larger the document, the more time it will take to process the document during a refresh. Once the report engine receives the data from the data providers, it has to render the report and perform complex calculations based on the document design. Larger documents with many variables and large amounts of data can take much longer to render during a refresh.
Reduce the system resources needed on both the client side and the server side.
The resources needed to work with a large document are going to be much greater than those needed for smaller documents. By reducing the size of your documents, you are potentially reducing the overall system resources, such as CPU, RAM, Disk space, that your system will consume on average. This can equate to better throughput on your existing hardware.
Improve performance while modifying the document
When modifying a large document, the client and server have to load the document structure and data into memory. As you add/change/move objects in the reports, client/server communication occurs. This can slow down the designer, as updates require reprocessing of any objects involved. The more objects in a document, the longer each operation during a modify action can take.
Improve performance for the consumer during adhoc query and analysis.
Slicing, dicing, filtering and drilling actions will perform quicker on smaller documents as well. This will equate to faster response times to the consumers as they navigate and do detailed analysis on the documents
.
TIP 3.2 - Utilize Report Linking Whenever Possible
.
Report Linking is a great way to relate two documents together. This can be an alternative to drilling down and allows the report designer better control over the size and performance of their documents. Report Linking can be used to help reduce the size of single documents by allowing the designer to break documents into smaller chunks while still allowing them to be related to each other. This complements the recommendation to steer clear of Monster Documents very nicely.
.
The concept of Report Linking is simple. You basically embed a hyperlink into a document that calls another document. This hyperlink can use data from the source report to provide prompt values to the destination report. Below is an example that explains the concept
.
Sales_Summary is a summary report that summarizes the sales for all 100 sites of Company XYZ Inc.
Sales_Summary has a hyperlink that allows a user to "drill into" a 2nd Report (Sales_Details) to get the sales details on any of the 100 sites.
Sales_Summary is scheduled to run each night and takes ~20 minutes to complete.
Users can view the latest instance of Sales_Summary which takes only a few seconds to load.
Users can drill down into Site Sales data for each of the 100 sites which launches Sales_Details report using Report Linking and a prompt value
The prompt value filters the Sales_Details report using a Query Filter so that it only displays the sales details for the 1 site that the user drilled into.
.
In the above scenario, we see many benefits
The Sales_Summary report only contains the Summary details. Therefore it runs faster than if it contained both summary and detailed data
The Sales_Summary report is smaller and will load/navigate much quicker on its own
The User can drill down and get a much faster response time because the details report only contains the specific data that they are interested in
.
The Web Intelligence User Guide covers this in more detail in the section on "Linking to another document in the CMS".
.
.
.
The easiest way to generate these Hyperlinks is using the Hyperlink Wizard. This Wizard is currently only available in the HTML Interface. For manual creation of the hyperlinks, you will want to follow the OpenDocument guidelines available in the below link:
.
DOC - Viewing Documents Using OpenDocument    (In the Development Section)
.
Here is a screenshot of the Wizard and where the button is on the toolbar. It can be a little tricky to find if you haven't used it before:
It is important to note that this can add a little more time to the planning and design phases of your Document creation process. Properly implemented though, this can save your consumer a lot of waiting and will reduce the backend resources needed to fulfill requests
When configuring a hyperlink using OpenDocument or the HTML Hyperlink Wizard, you can choose whether or not you want the report to refresh on open, or to open the latest instance. Our recommendation is to use Latest Instance whenever possible. This allows you to schedule the load on your database and backend processing server and will reduce the time it takes for the consumer to get their reports.
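To make the linking concrete, an OpenDocument hyperlink for the Sales_Details document might look roughly like the line below. This is a hypothetical sketch only: the server, port, CUID, prompt text and site value are placeholders, and the parameter names follow the OpenDocument guide referenced above.

http://<webserver>:8080/BOE/OpenDocument/opendoc/openDocument.jsp?sIDType=CUID&iDocID=<CUID_of_Sales_Details>&sInstance=Last&sRefresh=N&lsSEnter+Site=SITE001

Here sInstance=Last requests the latest scheduled instance, sRefresh=N avoids an on-demand refresh, and the lsS parameter passes the site the user drilled into as the prompt value for the detail document's query filter.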
TIP 3.3 - Avoid Autofit When not Required
The Autofit functionality allows you to set a cell, table, cross-tab or chart to be resized automatically based on the data. A cell for example, has the option to Autofit the Height and Width of the cell based on the data size. The below screenshot shows this feature in the Applet Interface for a cell.
.
.
This is a great feature for the presentation of the report but it can cause some performance delays when navigating through pages or generating a complete document.
.
NOTE: The default setting for a cell is to enable the Autofit height option. This could impact the performance of your reports, so it is important to know how this can affect performance.
.
How does this affect performance of the report?
.
When autofit is enabled for objects on a report, the Processing Server has to evaluate the data used in every instance of that object in order to determine the size of the object. This means that in order to skip to a particular page of the report, the processing server would need to calculate the size for every object that comes before that page. For example, if I have 100,000 rows of data in my report and I navigate to page 1000, then the processing server has to generate all of the pages leading up to page 1000 before it can display that page. This is because the size of the objects on each page is dynamically linked to the rows of data so it is impossible to determine what rows will be on page 1000 without first calculating the size of the objects for each page preceding it.
.
In short, this option adds a lot more work to the page generation piece of the report rendering process. A fixed size for height and width allows the processing server to determine how many objects fit on each page and allows it to skip the generation process for pages that are not requested.
.
For another example: if I have 100,000 rows and have set my objects to a fixed width/height, then the processing server knows that exactly 50 rows will fit on each and every page. If I request page 1000, it knows that the rows on that page will be rows 49,951 to 50,000. It can then display that page with just those rows in it. Way quicker than having to generate 999 pages first!
.
--------
.
As you can imagine, this mostly just affects reports that have many rows of data and have many pages. If you have a report with only a few pages, it probably isn't worth looking at this option. For larger, longer reports, it might be worth investigating.
.
TIP 3.4 - Utilize Query Filters instead of Report Filters whenever possible
.
A Query Filter is a filter that is added to the SQL Statement for a report. Query Filters limit the data that is returned by the Database server itself by adding to the WHERE clause of the SQL Statement.
.
A Report Filter is a filter that is applied at the Report Level and is only used to limit the data displayed on the report itself. All of the data fetched from the Database is still available behind the scenes, but the report itself is only showing what is not filtered out.
.
There is a time and a place for both Query Filters and Report Filters but understanding the differences between them is a good way to ensure that you are not causing unnecessary delays in your report rendering and refreshing. It is best to predesign Query Filters in your Semantic Layer design but you can also add them manually using the Query Panel within Web Intelligence itself.
.
Here is a screenshot of a Predefined Filter being added to a Query in Query Panel
.
And here is an example of a similar Query Filter being added manually
.
In both of the above cases, the WHERE clause of the SQL Statement will be updated to reduce the data returned to the report to filter based on the year.
.
Alternatively, here is a screenshot of a Report Filter that does something similar
.
In this Report Filter example, the display data is being filtered to the selected year, but the data contained in the cube itself still contains ALL years. This can affect performance, so be sure to use Query Filters to limit the data whenever possible. There are of course scenarios where Report Filters are the better choice for slicing and dicing, but it is just something to keep in mind when designing reports for performance.
.
TIP 3.7 - Limit the # of Data Providers Used
.
Best practice from the field is to limit the number of data providers to 15 or fewer for faster performing reports. If you need more than 15 data providers, then you may want to consider a different way of combining your data into a single source. Using a proper ETL tool and data warehouse is a better way to achieve this, and it pushes the consolidation of data to a data warehouse server instead of the BI server or client machine.
.
The current design of the Webi Processing Server is to run Data Providers in series. This means that each data provider is run one after another and not in parallel as you might expect. So, the combined total runtime of ALL of your data providers is how long the report will take to get the data.
.
Here is a representation of the combined time it might take for a report with multiple data providers in XI 3.1 or BI 4.1:
.
 
 
Another consideration for reports with a lot of data providers is that merging dimensions between multiple sources adds overhead into the processing time. Keeping it simple will certainly result in a better performing report.
.
SAP BusinessObjects Business Intelligence Platform 4.2 includes a new feature called parallel data fetching.
This allows data providers to run in parallel (simultaneously), so several data refresh actions can be performed at the same time in Web Intelligence reports based on multiple data providers, without a performance drop.
 
Here is a diagram that shows the same scenario as above but with the Parallelized Data Provider feature:
.
TIP 3.8 - Don't accidentally Disable the Report Caching
.
Web Intelligence utilizes disk and memory caching to improve the performance of loading and processing documents & universes. This can provide a faster initial load time for common reports and universes when implemented correctly.
.
The good news is that caching is enabled by default so in most cases this will be happening automatically for you and your users behind the scenes. There are a few situations where cache cannot be used though so we wanted to make sure report designers were aware of these:
.
The following functions will force a document to bypass the cache:
.
GetContentLocale()
.
If you use these within your document, then cache will not be utilized. These functions are quite common so it is important to be aware of the potential impact on caching they can have.
.
At the current time, caching is done at a document level and not an individual Report (tab) level. Therefore, if these functions are used anywhere in the document, the cached copies will not be used for subsequent requests.
..
TIP 3.9 - Test Using Query Drill for Drill Down Reports
.
What is Query Drill? As quoted from the Web Intelligence User Guide :
.
"When you activate query drill, you drill by modifying the underlying query (adding and removing dimensions and query filters) in addition to applying drill filters.
.
You use query drill when your report contains aggregate measures calculated at the database level. It is designed in particular to provide a drill mode adapted to databases such as Oracle 9i OLAP, which contain aggregate functions which are not supported in Web Intelligence, or which cannot be accurately calculated in the report during a drill session.
.
Query drill is also useful for reducing the amount of data stored locally during a drill session. Because query drill reduces the scope of analysis when you drill up, it purges unnecessary data."
.
Performance gains can be appreciated by reducing the amount of data that a Webi document stores locally and by pushing some of the aggregation to the database server side.
.
Performance gains may or may not be realized by using this option but it is simple enough to test it out to see if it will improve performance for you. To enable this option, go into the Document Properties and check the "Use Query Drill" option. Below is a screenshot of the option:
.
TIP 3.10 - Mandatory Prompts vs Optional Prompts
.
This tip came to me while investigating a support incident. The customer I was working with noticed that reports took significantly longer to refresh when his prompts were Optional vs. Mandatory. We were seeing a 30 second difference in even one of the simpler reports he had for testing. We investigated this through the traces and noticed that the SQL generation functions were executing twice when Optional prompts were involved, and this was adding to the overhead of running the report.
.
This was happening in XI 3.1 SP7 on the customers side so it was with a legacy UNV universe. I could replicate the issue internally with our simple eFashion universe but since it executes very quickly, the extra time was barely noticeable in my own testing. I collected my internal logs and BIAR file and asked a developer for a quick review.
.
The developer confirmed that the delay came from the SQL generation functions, as suspected. He then did a code review to see why this was happening. His explanation was that Optional prompts may or may not have values, and therefore the SQL generation could change after the prompt dialog appears. For example, if an optional prompt value is not selected, then the WHERE clause will omit that object. With Mandatory prompts, the SQL structure will always be the same before and after prompts are selected, so the SQL does not need to be regenerated after a value is selected.
.
So, in short, Optional vs Mandatory can give different performance results so it should be considered before choosing one vs the other. As with many of the other tips in this doc though, this does not mean that you should not use Optional prompts. They are useful and are often necessary, but they are a factor and as long as you know this, you can optimize your report design.
.
** TIP 3.11 - Function NumberOfPages() will force generation of all pages
.
Be careful when using the function NumberOfPages(), as it forces the report to generate all of the pages in order to determine the total page count. If this function is not used, the document will not generate all of the pages right away. It will only generate them when they are specifically requested, when the document is exported to another format, or when the Last Page navigation button is pressed.
.
This can be a useful function, but be aware that it could cause unnecessary delays. Be sure to evaluate whether or not your end users require this functionality before using it.
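For example, a page footer cell that shows only the current page number is cheap to evaluate, while one that also shows the grand total forces full pagination. The two variables below are a hypothetical sketch using standard Webi formula syntax (FormatNumber is used here only to concatenate numbers with text):

v_Footer_Fast = "Page " + FormatNumber(Page();"0")

v_Footer_Slow = "Page " + FormatNumber(Page();"0") + " of " + FormatNumber(NumberOfPages();"0")

The second variant cannot be evaluated until every page has been generated, so expect slower page navigation and slower exports on long documents.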
..
Chapter 4 - Semantic Layer Best Practices
.
Most of the below best practices involve the Semantic Layer, or SL as we sometimes refer to it. These best practices can help you design faster running queries, which can result in faster running Webi docs.
.
TIP 4.1 - Only Merge Dimensions that are needed
.
A Merged Dimension is a mechanism for synchronizing data from different Data Providers. For example, if your document had 2 Data Providers and each of them has a "Product Name" dimension, you could merge the two different dimensions into a single "Merged" dimension that would contain the complete list of Product Names from each data provider.
.
Web Intelligence will automatically merge dimensions in BI 4.x by default, so you may want to evaluate if there are performance gains you can achieve by reviewing the merged dimensions. If you do not want your dimensions to be automatically merged, you can uncheck the "Auto-merge dimensions" property in the Document Properties of your reports.
.
We have 2 Document Properties within a Webi document that can affect the merging of dimensions:
.
Auto-merge dimensions -- Automatically Merges dimensions with the same name from the same universe.
.
Extend merged dimension values -- This option will automatically include merged dimension values for a dimension even if the merged dimension object is not used in a table.
.
Merging dimensions has overhead associated to it that can impact the performance of your Webi documents. If you do not need to have certain dimensions merged within your document, you can simply choose to unmerge them. This removes the overhead performance hit that is associated with merging those dimensions. Besides, you can always merge them again later if needed.
.
.
In short, to squeeze a little extra performance out of your larger reports, it might be worth unmerging dimensions that are not being used as merged dimensions.
.
TIP 4.2 - Build Universes & Queries for the Business Needs of the Document
.
Like any successful project, the key to a successful Webi Document is good planning. This helps avoid scope/feature creep when you build out the document. During the planning stage, it is important to determine exactly what the business needs for your document are. Once you know the business needs, you can build a lean document that only contains the information needed to fulfill those needs.
.
Just like the previous tip that talks about "Monster" documents, we also need to avoid "Monster" queries/universes as well. The fact is, the larger a universe or query is, the worse the performance and overhead resources will be. By focusing only on the business needs, we can minimize the size of our queries and optimize the runtime of our documents.
.
As a real-life example, I have seen a report that was built off a query that contained over 300 objects. This report pulled back around 500,000 rows of data and took over 45 minutes to complete. On inspecting the document, only about 1/4 of the objects were used in the document. When asked why they were using a query that had over 300 objects in it, they didn't have an answer. If we do the math on this, 300 objects x 500,000 rows = 150 million cells. It was likely that this query was designed to account for ALL scenarios that the query designer could think of, and NOT based on the business needs of the report consumer.
.
In Summary, it is important to know who will be utilizing the universes and what their needs will be. You then want to build a lean universe, and supporting queries, that are optimized to suit those needs.
.
TIP 4.3 - Array Fetch Size Optimizations
.
The Array Fetch Size (AFS) is the maximum # of rows that will be fetched at a time when running a Web Intelligence document. For example, if you run a query that returns 100,000 rows of data and you have an Array Fetch Size of 100, it will take 1000 fetches of 100 rows per fetch (1000 x 100 = 100,000) to retrieve all of those rows.
.
In newer versions of Web Intelligence, we automatically determine what an optimal AFS should be based on the size of the objects within your query. For most scenarios, this results in an optimized value that will return the data with good performance. Sometimes though, manually setting this to a higher value can squeeze out a little better performance.
.
I did some quick testing on my side and here are the results that the Array Fetch Size had on my test server:
.
 
As you can see above, the time it took to run the same query varied based on the AFS value that was set. The optimized value (which I believe was around 700 behind the scenes) took around 30 seconds. By overriding this and setting my AFS to 1000, I was able to shave another 12 seconds off to take it down to 18 seconds. This is great for performance, but keep in mind that it means larger packets are sent over the network and extra memory will be needed to accommodate the larger fetches.
.
As I mentioned, by default the optimized value will be used for newly created connections/universes. To override this and test your own values, you have to disable the AFS optimization using a Universe Parameter called "DISABLE_ARRAY_FETCH_SIZE_OPTIMIZATION". Setting this to "Yes" will disable the optimization and take the Array Fetch Size value set on your connection.
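As a quick reference, the two settings involved might look like this (a sketch only; the exact labels vary slightly between the Information Design Tool and the Universe Design Tool, and the value 1000 is just the example used above):

DISABLE_ARRAY_FETCH_SIZE_OPTIMIZATION = Yes     (universe SQL parameter)
Array Fetch Size = 1000                         (relational connection setting)

Test a few values against a representative query; the optimum depends on row width, network latency and available memory.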
.
More information on this can be found in the Information Design Tool or Universe Designer Guides
https://help.sap.com/viewer/p/SAP_BUSINESSOBJECTS_BUSINESS_INTELLIGENCE_PLATFORM   (Application Help section)
TIP 4.4 - Ensure Query Stripping is Enabled
.
Query Stripping is a feature that will remove unused objects from a query automatically to improve performance and reduce the data contained in the cube. Query Stripping was originally only available for BICS based connectivity to BEx queries but was introduced for Relational database connections starting in BI 4.1 SP3.
.
Query Stripping needs to be enabled for Relational Databases through three different options:
.
1. Enable "Allow query stripping" option at the Business Layer level in the Universe (UNX)
2. In the Document Properties of the Webi Document
.
3. In the Query Properties
.
.
It is best to double-check those 3 places when implementing Query Stripping. If it is unchecked at any level, you may not be benefiting from the Query Stripping feature.
.
There is also a way to tell if it is working. With Query Stripping enabled, refresh your query and then go back into the Query Panel and click the View SQL button. You should see that only the objects used in a block within the report are included in the SQL. In this example, I am only using 3 of the 6 objects in my report, so the query only selects those objects.
.
 
 
You can see above that the SQL has been stripped of any unused objects and should run quicker as a result.
.
For BICS based documents, Query Stripping is enabled by default.
.
In summary, you will want to ensure your documents are utilizing Query Stripping to get better performance when refreshing queries.
.
TIP 4.5 - Follow these Best Practices for Performance Optimizing SAP BW (BICS) Reports
.
There is a lot of great information contained in the below document. It outlines many best practices for reporting off of SAP BW using the BICS connectivity. Please review the below guide for more details on optimizing the performance of BICS based Reports.
.
TIP 4.6 - Using Index-Awareness for Better Performance
.
Index-Awareness is described in the Information Design Tool User guide in section 12.7 as:
.
"Index awareness is the ability to take advantage of the indexes on key columns to improve query performance.
.
The objects in the business layer are based on database columns that are meaningful for querying data. For example, a Customer object retrieves the value in the customer name column of the customer table. In many databases, the customer table has a primary key (for example an integer) to uniquely identify each customer. The key value is not meaningful for reporting, but it is important for database performance.
.
When you set up index awareness, you define which database columns are primary and foreign keys for the dimensions and attributes in the business layer. The benefits of defining index awareness include the following:
Joining and filtering on key columns are faster than on non-key columns.
Fewer joins are needed in a query, therefore fewer tables are requested. For example, in a star schema database, if you build a query that involves filtering on a value in a dimension table, the query can apply the filter directly on the fact table by using the dimension table foreign key.
Uniqueness in filters and lists of values is taken into account. For example, if two customers have the same name, the application retrieves only one customer unless it is aware that each customer has a separate primary key."
.
Utilizing Index Awareness can help improve performance, as key columns will be utilized behind the scenes in the queries to do faster lookups and joins on the database side.
.
The Information Design Tool User Guide covers Index Awareness in more detail.
 
TIP 4.7 - Using Aggregate Awareness for Performance
.
Aggregate Awareness is described as the following in the IDT User Guide:
.
"Aggregate awareness is the ability of a relational universe to take advantage of database tables that contain pre-aggregated data (aggregate tables). Setting up aggregate awareness accelerates queries by processing fewer facts and aggregating fewer rows.
.
If an aggregate aware object is included in a query, at run time the query generator retrieves the data from the table with the highest aggregation level that matches the level of detail in the query.
For example, in a data foundation there is a fact table for sales with detail on the transaction level, and an aggregate table with sales summed by day. If a query asks for sales details, then the transaction table is used. If a query asks for sales per day, then the aggregate table is used. Which table is used is transparent to the user.
.
Setting up aggregate awareness in the universe has several steps. See the related topic for more information"
.
Utilizing the database to pre-aggregate data can help speed up the performance of your Webi documents. This is because the Webi Processing Server will not have to do the aggregations locally and will only have to work with the aggregated data that is returned from the database side.
.
Use Aggregate Awareness whenever it makes sense.
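As a rough illustration, the SELECT of an aggregate-aware measure in the universe is built with the @Aggregate_Aware function, listing the most aggregated table first and the most detailed table last. The table and column names below are hypothetical:

@Aggregate_Aware(sum(agg_sales_by_day.revenue), sum(sales_fact.revenue))

Remember that setting this up also involves declaring which objects are incompatible with each aggregate table (aggregate navigation), as described in the IDT User Guide, so the query generator knows when it may safely use the smaller table.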
.
TIP 4.8 - Utilizing JOIN_BY_SQL to avoid multiple queries
.
The JOIN_BY_SQL parameter determines how the SQL Generation handles multiple SQL statements. By default, SQL Statements are not combined and in some scenarios, performance gains can be realized by allowing the SQL Generation to combine multiple statements.
.
The JOIN_BY_SQL parameter is found in the Information Design Tool in the Business Layer and/or Data Foundation. Below is a screenshot of the parameter in its default state.
.
.
By changing this Value to "Yes", you are instructing the SQL Generation process to use combined statements whenever possible. This can result in faster query execution so it may be worth testing this option out on your universes/documents.
.
TIP 4.9 - Security Considerations for the Semantic Layer
.
There is no doubt that security is a necessity when dealing with sensitive data. The purpose of this tip is to prompt you to review your security model and implementation to ensure it is as lean as it can be. Performance can definitely be impacted, sometimes quite severely, by the complexity of the security model at both your Semantic Layer, and your BI Platform (Users and Groups) levels.
.
As an example, I have worked on an incident recently where we were seeing roughly a 10-40% performance difference when opening a Webi document with the built in Administrator account vs another User Account. On closer examination, the user was a member of over 70 groups and a good portion of the load time was spent on rights aggregation and look-ups.
.
We also found that there were some inefficiencies in our code that could be optimized in future Support Packages/Minor Releases. These should help improve performance for customers who may be unaware of the performance impacts their complex security model may be having.
.
So, some actions you may want to consider for this tip are:
.
Review your business requirements and reduce/remove any unnecessary Data/Business Layer Security profiles at the Universe level.
Consider using the Change State / "Hidden" option in the Business Layer for fields that you do not want any users to see.
Consider using Access Levels in the Business Layer to control which users have access to objects
Reduce the number of user groups/roles that your users are a part of
Test performance with an Administrator User and compare it to a Restricted User to gauge the impact on Performance
.
TIP 4.10 - Try Using The UTF8 Charset For NLS_LANG Environment Variable (Oracle Specific)
.
In our internal testing, it was found that using the UTF8 charset was faster than using other charsets along with the NLS_LANG environment variable. This affects only Oracle based connections and universes but there was about a 10-15% performance gain when the charset UTF8 was used vs others.
.
For example, you can try setting the Environment variable:
.
NLS_LANG = AMERICAN_AMERICA.UTF8
.
Set this on the Web Intelligence Processing Server and Connection Server machines. Ensure that you do this either as a system environment variable or for the user that is running the Server Intelligence Agent (SIA) on that node.
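As a sketch of how this might be applied (adjust the language and territory portion to your locale; the profile mechanism and paths are assumptions for a typical install):

On Linux/Unix, add the following to the profile of the OS user that runs the SIA, then restart the SIA:
export NLS_LANG=AMERICAN_AMERICA.UTF8

On Windows, define NLS_LANG=AMERICAN_AMERICA.UTF8 as a system environment variable (System Properties > Environment Variables) and restart the SIA so the Web Intelligence Processing Servers and Connection Servers pick it up.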
.
Chapter 5 - Formula & Calculation Engine Tips
.
These tips involve some insight from the product developers around how the backend calculation engine handles calculations in regards to performance.
.
TIP 5.1 - Use Nested Sections with Conditions with caution
.
A nested section, or subsection as they are sometimes called, is a section within a section. For example, you might have a Country section and a Region section within it. This would be considered a "nested" section. Nested sections can add overhead to the report rendering/processing time. This is especially true when you add conditions to the section such as "Hide section when...". This doesn't mean that you should not use nested sections; they are certainly useful for making a report look and feel the way you want it to, but you should consider the performance impact before you heavily utilize nested sections within your documents.
.
Here is an example of 4 levels of nested sections from an eFashion based report
.
Here are the Format Section options that can affect performance when overused:
.
 
 
When using the conditions within Nested sections, the calculation engine needs to figure out which sections are displayed. The more nested sections you have, the more overhead there is to figure out which levels of sections are actually visible. Again, very useful in most cases but for reports with thousands of dimensions in the sections and conditions associated to them, this can impact the performance.
.
TIP 5.2 - Use IN instead of ForEach and ForAll when possible
.
This tip came directly from our developers that work with the calculation engine. Behind the scenes, the code is much more efficient when processing the IN context vs the ForEach or ForAll contexts.
.
The following Document is available on our help portal. It covers using functions, formulas, calculations and contexts within a Webi document in more detail:
.
DOC - Using functions, formulas, and calculations in Web Intelligence   (In the Application Help section, click on View All)
.
Section 4.3.1.1 (4.1 SP4)  covers the "IN" context operator with examples of how it works. In short, the IN context operator specifies dimensions explicitly in a context.
.
Section 4.3.1.2 and 4.3.1.3  (4.1 SP4)  cover the ForEach and ForAll context operators. In short, these two functions allow you to modify the default context by including or excluding dimensions from the calculation context.
.
In a lot of cases, IN can be used to achieve similar results to the ForEach and ForAll operators so if you suspect these are contributing to performance issues, try changing your formulas to use IN instead.
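As a small, hypothetical sketch (eFashion-style object names), both of the variables below return the largest quarterly revenue within each year, but the first names the full context explicitly with IN:

v_Max_Quarter_IN = Max([Sales Revenue] In ([Year];[Quarter])) In ([Year])

v_Max_Quarter_ForEach = Max([Sales Revenue] ForEach ([Quarter])) In ([Year])

If a formula built with ForEach or ForAll is suspected of slowing a large document down, rewriting it with an explicit IN context like the first variant is an easy test.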
.
TIP 5.3 - Use IF...THEN...ELSE instead of Where operator when possible
.
In most cases, the IF/THEN/ELSE operators can be used instead of a Where operator. This is more efficient from a calculation engine perspective, according to our developers. If you ever suspect that the Where operator is causing performance degradation in your report, try swapping it for an IF statement if you can.
.
The following document discusses these operators in more details:
..
 
 
DOC - Using functions, formulas, and calculations in Web Intelligence   (In the Application Help section, click on View All)
.
Section 6.2.4.14 covers the usage of the Where Operator and provides examples.
.
Section 6.1.10.11 covers the IF...Then...Else functionality
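As a small, hypothetical sketch (the object and value names are made up), the two variables below return the same result, but the developer guidance above suggests the IF form is cheaper for the calculation engine:

v_US_Sales_Where = Sum([Sales Revenue]) Where ([Country] = "US")

v_US_Sales_If = Sum(If([Country] = "US") Then [Sales Revenue] Else 0)

As with the other formula tips, test both forms on a copy of the document before standardizing on one.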
..
TIP 5.4 - Factorize (Reuse) Variables
.
Factorizing variables essentially means to reuse them within other variables. By doing this, you are reducing the number of calculations that the engine needs to do to calculate the results.
.
Here is an example of what we mean when we say Factorizing variables:
.
v_H1_Sales = Sum([Sales Revenue]) Where ([Quarter] InList("Q1";"Q2"))
.
v_H2_Sales = Sum([Sales Revenue]) Where ([Quarter] InList("Q3";"Q4"))
.
Now we reuse these two to get the year's sales (H1 + H2 revenue combined)
.
v_Year_Sales = v_H1_Sales + v_H2_Sales
.
By reusing variables, you are saving the time needed to recalculate values that have already been calculated. The above is a simple example but applying the same logic to more complex calculations can save you some real time on the calculation side.
.
** TIP 5.5 - Use BIG numbers sparingly
.
SAP BusinessObjects BI Platform version 4.2 will include a new feature that allows the precision of numerical objects to be about 40 digits. In BI 4.0/4.1, the precision was 15. This is great if you are using very large numbers and will ensure more accurate rounding of these large numbers. It does however have an impact on performance and memory.
.
When using big numbers, it is important to remember that there will be a slight decrease in performance within the Web Intelligence calculator and a slight increase in the memory required for each measure that is defined as a big number.
.
To check the type of a measure, you can right click it in the Available Objects panel and choose the "Change Type" option. Number is the standard (15 digit) precision and Decimal is the new "Big Number" 40 digit precision.
.
 
 
Recommendation: Use this feature only when the precision of a number needs to be greater than 15 digits. This will ensure your performance is not penalized for measures that do not benefit from this feature.
.
Chapter 6 - Sizing for Performance
.
One of the keys to a faster performing report is proper sizing on the back-end components. More often than not, we see systems that were originally sized correctly for "Day One" usage but have since outgrown the original sizing and are now experiencing performance issues due to resource and concurrency limits. It is important to size your system for Today's usage as well as the usage for the near future. It is equally important to have checkpoints a few times a year to ensure you are not outgrowing your original sizing estimates.
.
UPDATE: Ted Ueda has written a great blog that goes over some of these recommendations in greater detail. Link below:
.
BLOG - Revisit the Sizing for your deployment of BI 4.x Web Intelligence Processing Servers!
.
The following tips will help you size your system for performance and may help you avoid some common mistakes we have seen in support.
..

