Analytic development best practices
Analytics support most of the commands that you can use in a regular Analytics script. However, you must ensure that analytics run without user interaction, and that they do not include commands not supported by the engine that processes the analytics in the deployment environment.
Analytics support all Analytics functions.
General best practices
Use one Analytics project per robot or analysis app
Create a new Analytics project in Analytics for each robot or analysis app. The project must contain all the analytics that make up the robot or the analysis app, and any required subscripts. For an analysis app, the project must also contain any data files required by any of the analytics.
Test locally
Test all analytics locally before deploying them to the target environment. Ensure that analytics run as expected, and that they do not require user interaction.
For more information, see Developing analytic scripts.
Use consistent data connections for testing
If an analytic uses an ODBC data source, to test it locally you must configure an ODBC connection on your local computer that is identical to the connection in the environment where the analytic will run.
For analytics distributed for use in the Analysis App window, end users must configure an identical ODBC connection on their computers.
Avoid absolute file paths
Avoid using absolute file paths in analytics (for example, C:\results) unless you are certain that identical file paths exist in the environment where the analytic will run.
Using relative file paths such as \results allows you to develop and test analytics locally and then deploy them in another environment without requiring that the other environment has an identical directory structure.
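For example, a minimal sketch of a relative output path (the table, field, and output names are hypothetical):

OPEN Invoices
EXTRACT FIELDS invoice_no amount TO "\results\invoice_exceptions"

Because the path is not tied to a specific drive or absolute location, the same script runs unchanged in the development and deployment environments.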
Use SET for preference settings
Use the SET command to specify any preference settings required by the analytic. If you do not specify preferences in the analytic, the default Analytics preferences are used. Position the SET command after the analytic header but before any of the analytic logic.
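For example, a sketch of the recommended ordering, with an illustrative header and SET options (the analytic, table, and field names are placeholders):

COMMENT
//ANALYTIC Sample_analytic
END

SET SAFETY OFF
SET EXACT ON

OPEN Invoices
CLASSIFY ON vendor_no ACCUMULATE amount TO r_classified

SET SAFETY ON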
Do not use computed fields in results or data tables
Do not use computed fields in any tables you intend to keep beyond the session in which the analytic script runs.
Results and data tables that are kept for use in interpretations or as input for subsequent scripts may display unexpected values if they contain computed fields. Computed values are dependent on settings defined in the preference file (.prf), or by the SET command, and therefore different environments may produce different values.
If you need to retain the values in a computed field, use the EXTRACT command with the FIELDS or ALL option to convert the field to a physical field in a result or data table. For more information, see EXTRACT command.
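For example, assuming a table Invoices with a computed field Total, a sketch of converting the computed values to physical values in a new table:

OPEN Invoices
EXTRACT FIELDS invoice_no Total TO r_invoice_totals

The Total column in r_invoice_totals stores its values physically, so subsequent scripts and interpretations see the same values regardless of the preference settings in effect.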
Encrypt data connection passwords
To avoid having a data source password in plain text in an analytic, use the PASSWORD analytic tag. This tag prompts the user for a password before running the analytic, and encrypts the entered value.
Use a password when importing from or exporting to HighBond
The PASSWORD parameter is required in any command that imports from or exports to HighBond:
- IMPORT GRCRESULTS
- IMPORT GRCPROJECT
- EXPORT... ACLGRC
Without the PASSWORD parameter, the commands fail in Robots, Analytics Exchange, or the Analysis App window.
When you use the PASSWORD parameter in an analytic script, you must also specify an associated password input parameter in the analytic header. For more information, see PASSWORD.
Note
The PASSWORD parameter is not required when running the import and export commands in Analytics because the current user's HighBond access token is automatically used.
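For example, a sketch of pairing the header tag with the command parameter (the analytic name, table name, and Results path shown are placeholders, not real values; see the IMPORT GRCRESULTS documentation for the full FROM syntax):

COMMENT
//ANALYTIC Import_results_example
//PASSWORD 1 Enter your HighBond password:
END

IMPORT GRCRESULTS TO T_Results PASSWORD 1 FROM "Results_path_placeholder"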
Avoiding user interaction
Analytics must be able to run without user interaction. If a command in an analytic tries to create a dialog box, the engine in the deployment environment stops processing the analytic, and an error is entered in the log.
Replace user interaction commands with analytic tags
Do not use Analytics commands that require user interaction. Replace them with equivalent analytic tags in the analytic header. Analytic tags allow users to provide input values before the analytic runs.
Do not use | Replace with |
---|---|
DIALOG | //TABLE, //FIELD, //PARAM |
ACCEPT | //TABLE, //FIELD, //PARAM |
PASSWORD | //PASSWORD |
PAUSE | no equivalent |
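For example, a sketch of a header that collects input through analytic tags instead of ACCEPT or DIALOG prompts (the variable names and labels are illustrative):

COMMENT
//ANALYTIC Sample_analytic
//TABLE v_table Select the table to analyze
//PARAM v_cutoff D Enter the cutoff date
END

The user supplies these values before the analytic runs, so no dialog box is displayed during processing.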
Guidelines
- to prevent analytic processing failures, remove all interactive commands
- to ensure files can be overwritten as necessary without displaying a confirmation dialog box, add the SET SAFETY OFF command at the beginning of an analytic and then add the SET SAFETY ON command at the end of the analytic to restore the default behavior
- to prevent confirmation dialog boxes from halting the analytic, add the OK parameter after any commands that normally display a confirmation dialog box:
- RENAME
- DELETE
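For example, a sketch combining both guidelines (the file path is hypothetical):

SET SAFETY OFF
DELETE "\results\old_exceptions.fil" OK
SET SAFETY ON

The OK parameter suppresses the DELETE confirmation prompt, and the SET SAFETY commands allow files to be overwritten without confirmation while the analytic runs.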
Checking script syntax
Analytics provides a tool for detecting script syntax that causes analytics to fail, or that requires alignment between your local environment and the environment where the analytics are deployed. The tool provides a warning only, and you are still free to commit or import analytic scripts that have warnings.
What the tool checks
The tool checks all scripts in a project for the following items:
- any command that requires user interaction
- any absolute file path
- any call of an external script
When the check is performed
Script syntax checking is performed automatically when you commit scripts to Robots.
Automatic syntax checking is enabled by default. If you want to turn it off, select Disable Script Syntax Check Before Commit Scripts in the Options dialog box (Tools > Options > Interface).
Perform checking manually
You can perform script syntax checking manually. You may need to first add the Check Scripts button to the Analytics toolbar.
- If necessary, add the Check Scripts button to the Analytics toolbar:
  - Double-click an empty spot on the toolbar to open the Customize Toolbar dialog box.
  - In the Available toolbar buttons list, select the Check Scripts button and click Add.
  - In the Current toolbar buttons list, select the Check Scripts button and click Move Up or Move Down to change the location of the button. The order of the buttons from top to bottom corresponds to their location from left to right on the toolbar.
  - Click Close to save your changes.
- On the toolbar, click Check Scripts.
  A message appears telling you that the script syntax in the project is valid, or specifying one or more warnings.
- Do one of the following:
  - Correct any script syntax that generates a warning, and click Check Scripts again to ensure that the warnings no longer appear.
  - Ensure that the deployment environment contains a directory structure, or external scripts, that align with the paths or external scripts specified in the analytic.
Best practices for analytics run on AX Server
Develop in Analytics
Develop analytics and their supporting scripts primarily in Analytics before importing them to AX Server.
As a convenience feature, the AX Client script editor does allow you to add new analytics or subscripts, or to edit existing ones. This feature is useful for fine-tuning the behavior of an analytic without having to export it to Analytics and then reimport it to AX Server. However, analytic development work beyond minor adjustments is easier to accomplish in Analytics.
Store related files with the Analytics project
Related files such as database profile files should be stored in the same folder as the Analytics project, but must be imported to AX Server separately.
Avoid commands not supported on AX Server
The following commands and features are not supported, or only partially supported, in analytics run on AX Server:
- direct database server tables linked to Analytics Server Edition for z/OS
- the NOTIFY command supports only SMTP messaging. The MAPI and VIM mail protocols are not supported
- to use the PRINT or TO PRINT command, a default printer must be configured on the server
- the SAVE GRAPH and PRINT GRAPH commands are not supported
- do not use the SET LEARN command in analytics
Minimize AX Server table transactions
Optimize the performance of analytics by minimizing the number of times tables on AX Server are accessed:
- Use the FILTER command to select the records you need.
- Use the EXTRACT command to extract only the required fields.
The AX Engine then processes the reduced data set locally on the server where the analytic runs.
Optimizing analytics in this way is important when the data files are not located on the same server as AX Server or the AX Engine Node processing the analytic, and the Copy analytic data option is not selected in the AX Server Configuration web application.
Inefficient analytic example
OPEN LargeTable
SET FILTER TO trans_date >= `20091201` AND trans_date < `20100101`
COUNT
TOTAL amount
CLASSIFY ON account ACCUMULATE amount TO TransClassAccount
Efficient analytic example
OPEN LargeTable
SET FILTER TO trans_date >= `20091201` AND trans_date < `20100101`
EXTRACT FIELDS trans_date desc account type amount TO AnalysisTable
OPEN AnalysisTable
COUNT
TOTAL amount
CLASSIFY ON account ACCUMULATE amount TO TransClassAccount
Access SAP data in background mode
Use Background mode to access data from SAP ERP systems using Direct Link.