Automating work with Robots

Major new capabilities coming to Robots

Starting in Spring 2021, Diligent is introducing major new capabilities to Robots in a series of phases. We plan to roll out the capabilities over several months, with completion expected in the final quarter of 2021.

The new capabilities are listed below in their expected order of release:

  • Cloud-based scripting (Released)

    Use Python as well as HighBond Command Language (HCL), our new custom Python library, to write scripts directly in Robots.

    With cloud-based scripting we're adding two new robot types: Workflow robots, followed by HighBond robots.

    The existing robot type, now called an ACL robot, continues to support data analysis and data automation using ACLScript and scripts uploaded from Analytics.

  • Workflow robots (Released, customizable event-based triggering not yet available)

    Event-based Workflow robots allow System Admins with a Professional subscription type to automate portions of their organization's HighBond workflow.

  • HighBond robots (Released)

    HighBond robots support domain-focused data analysis.

  • File and data management (Not released)

    Associated with the new robot types are new capabilities for securely and conveniently storing and accessing files and data.

For more information, see Scripting in Robots.


Robots is a HighBond app that also forms part of the ACL Robotics product suite.

The main organizing component in Robots is a container called a robot, which houses uploaded analytic scripts, any auxiliary scripts, Analytics data tables, and related files. A robot task is the object that you configure to automate repetitive work using scripts built in Analytics.

Note

For information about administering Robots, and installing and configuring an on-premise Robots Agent, see Robots Agent administration.

How do I automate with a robot?

To automate repetitive tasks with a robot, you follow a workflow that involves both Analytics and the Robots app.

  1. Create or edit analytic scripts

    You begin by creating one or more analytic scripts in an Analytics project.

    You can write analytic scripts from scratch, copy them from elsewhere, or import them from ScriptHub. Any subsequent updates or edits of analytic scripts are also performed in Analytics.

    An analytic script is a regular Analytics script that uses an analytic header to declare certain properties and instructions for running the script. For more information, see Analytic scripts. A minimal example appears after these steps.

  2. Create a robot

    In the Robots app, you create a robot, which is a container that houses uploaded analytic scripts.

    You can manually create an empty robot, or automatically create a robot when you commit an analytic script from Analytics to Robots for the first time. The result is the same regardless of which method you use.

  3. Commit scripts to a robot (upload scripts)

    Once the analytic scripts are working correctly in Analytics, you commit the scripts from Analytics to the robot.

    You repeat the commit process to upload analytic scripts that you have updated or edited.

  4. Create and schedule a task

    In the robot, you create one or more tasks to run the uploaded scripts according to specific configurations, which can include a schedule.

  5. Run a task

    The robot runs the scheduled tasks until you instruct it to stop. You can also run tasks ad hoc.

  6. View task results

    Depending on the type of results a task run outputs, you view the results in Analytics, or in the native application for the output format, such as Excel.
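
The sketch below illustrates the kind of analytic script described in step 1: an analytic header in a COMMENT block, followed by an ordinary ACLScript body. It is a minimal, hypothetical example only; the table, field, and result names (Transactions, amount, High_Value_Transactions) and the v_threshold parameter are invented for illustration, and the header shows just a few of the tags documented in Analytic scripts.

    COMMENT
    //ANALYTIC Identify high-value transactions
    Flags transactions at or above a user-supplied threshold.
    //PARAM v_threshold N Minimum transaction amount to flag
    //RESULT TABLE High_Value_Transactions
    //RESULT LOG
    END

    OPEN Transactions
    EXTRACT RECORD IF amount >= v_threshold TO "High_Value_Transactions"

When a script like this is committed to a robot, the //PARAM tag becomes an input prompt in the task configuration, and the //RESULT tags identify the outputs that Robots keeps.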

Where do tasks run and where is the data stored?

A robot runs a task on the Robots Agent, which is a separate piece of software associated with the Robots app. The Robots Agent also accesses the source data required by a task.

There are two options for the Robots Agent:

  • On-premise Robots Agent available with ACL Robotics Enterprise Edition
  • Cloud-based Robots Agent available with all editions of ACL Robotics

Note

For detailed information about the differences between ACL Robotics editions, see Robots specifications and limits.

On-premise Robots Agent

An on-premise Robots Agent is installed locally by the Robots Admin or your IT team on a server within your secured network.

The Robots Agent initiates all communication with the Robots app and pulls task information, which is encrypted in transit.

For more information, see Installing or upgrading a Robots Agent.

Task and data details:

  • Tasks: Tasks run on the Robots Agent locally.
  • Data tables: Data sources are accessed by the Robots Agent and all Analytics data tables created from the data sources are stored locally. Data tables are not uploaded to the Robots app. Only table metadata, such as field names, is stored in Robots.
  • Output results: A global configuration option in Robots allows you to specify where Analytics result tables, non-Analytics result files such as Excel, and result logs are stored. You can specify that all output results remain local, or you can upload some types, or all types, to the Robots app, where they are encrypted. The Robots app is located in a secured data center operated by Amazon Web Services (AWS).

Cloud-based Robots Agent

A cloud-based Robots Agent is co-located with the Robots app in a secured data center operated by Amazon Web Services (AWS). Customers are not responsible for installing or managing the Robots Agent.

Note

A cloud-based Robots Agent is not available in GovCloud.

Task and data details:

  • Tasks: Tasks run on the Robots Agent in the cloud.
  • Data tables: Data sources are accessed by the Robots Agent and all Analytics data tables created from the data sources are stored in the AWS data center. The data is encrypted in transit and at rest.
  • Output results: Analytics result tables, non-Analytics result files such as Excel, and result logs are stored in the AWS data center, where they are encrypted.

Data segregation when running tasks

Data segregation enforced

Robot tasks are prevented from accessing data in other robots, or from outputting data to other robots. Tasks can interact only with data in the robot in which they are located, or with data located outside the Robots Agent directory structure. In a script, any command that attempts to traverse a robot boundary causes the script to fail.

Reading Analytics data tables that are explicitly shared between robots is the one exception to this restriction.
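
As a purely hypothetical illustration of this boundary, the ACLScript fragment below assumes a robot that stores a Payments table; the table name, field name, and paths are invented, and a relative path is just one way a command might attempt to reach another robot's directory.

    OPEN Payments

    COMMENT Allowed: the output table stays inside the current robot.
    EXTRACT RECORD IF amount > 50000 TO "High_Value_Payments"

    COMMENT Fails: the relative path reaches into another robot's directory,
    COMMENT which traverses a robot boundary and stops the script.
    EXTRACT RECORD IF amount > 50000 TO "..\Other_Robot\High_Value_Payments"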

Note

Any file or table exported or output to a location outside the Robots Agent directory structure is no longer subject to the data segregation provided by Robots.

Data access policies supported

Strict data segregation when running robot tasks allows different departments in your organization to use the same instance of Robots while ensuring that employees see only data that they are entitled to see. Accidental or intentional violations of your organization's data access policy are prevented.

Data segregation means that organizations using an on-premise Robots Agent can use a single agent installed on a single Windows server.

For information about controlling user access to individual robots, see Robots app permissions.