ACCESSDATA command

Imports data from a variety of ODBC-compliant data sources.

The command takes the form ACCESSDATA64 or ACCESSDATA32 depending on whether you are using a 64-bit or 32-bit ODBC driver.

Syntax

{ACCESSDATA64 | ACCESSDATA32} {CONNECTOR | ODBC {"Driver"|"Dsn"|"File"}} NAME value <USER user_id> <PASSWORD num | PROMPT_PASSWORD> <PASSWORD num AS password_keyname <...n>> TO table_name CHARMAX max_field_length MEMOMAX max_field_length <ALLCHARACTER> SOURCE (connection_settings) <HASH(salt_value, fields)>
SQL_QUERY
(SQL_syntax)
END_QUERY

Parameters

CONNECTOR | ODBC {"Driver"|"Dsn"|"File"}

The type of ODBC connection you want to make:

  • CONNECTOR – connect using a native Analytics data connector
  • ODBC "Driver" – connect using a Windows ODBC driver installed on your computer
  • ODBC "Dsn" – connect using a saved DSN (data source name) on your computer
  • ODBC "File" – connect using a file DSN (a saved .dsn file)
NAME value

The name of the Analytics data connector, the ODBC driver, or the DSN.

For example:

  • NAME "Amazon Redshift"
  • NAME "Microsoft Access Driver (*.mdb, *.accdb)"
  • NAME "My Excel DSN"
  • NAME "excel.dsn"

USER user_id

optional

The user ID for data sources that require a user ID.

PASSWORD num | PROMPT_PASSWORD

optional

For data sources that require a single password:

  • PASSWORD num – the password definition to use
  • PROMPT_PASSWORD – display a password prompt

For more information, see Using password definitions with ACCESSDATA.

Password value suppressed

When you use the Data Access window in Analytics to run the ACCESSDATA command, and you provide a password, the password value is not written to the log. Instead, the PROMPT_PASSWORD parameter is substituted.

Using the PASSWORD command in conjunction with PASSWORD num is similar to using PROMPT_PASSWORD: both approaches prompt the user for a password. PROMPT_PASSWORD has the added benefit of allowing the user to update user_id.

PASSWORD num AS password_keyname <...n>

optional

For data sources that require multiple passwords, the password definitions to use.

password_keyname must replicate exactly the password key name as it appears in the connection settings specified by SOURCE.

For more information, see Using password definitions with ACCESSDATA.

TO table_name

The location to send the results of the command to:

  • table_name – saves the results to an Analytics table

    Specify table_name as a quoted string with a .FIL file extension. For example: TO "Output.FIL"

    By default, the table data file (.FIL) is saved to the folder containing the Analytics project.

    Use either an absolute or relative file path to save the data file to a different, existing folder:

    • TO "C:\Output.FIL"
    • TO "Results\Output.FIL"

    Note

    Table names are limited to 64 alphanumeric characters, not including the .FIL extension. The name can include the underscore character ( _ ), but no other special characters, or any spaces. The name cannot start with a number.
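As a sketch of these naming rules, the following Python check (illustrative only; Analytics performs its own validation) accepts or rejects candidate table names:

```python
import re

def is_valid_table_name(name: str) -> bool:
    """Illustrative check of the table-name rules described above:
    up to 64 characters, letters/digits/underscore only, no spaces,
    must not start with a number (the .FIL extension is excluded)."""
    base = name[:-4] if name.upper().endswith(".FIL") else name
    return re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]{0,63}", base) is not None

print(is_valid_table_name("Output.FIL"))        # True
print(is_valid_table_name("2019_Results.FIL"))  # False: starts with a number
print(is_valid_table_name("Bad Name.FIL"))      # False: contains a space
```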

CHARMAX max_field_length

The maximum length in characters for any field in the Analytics table that originates as character data in the source from which you are importing.

The default value is 50. Data that exceeds the maximum field length is truncated when imported to Analytics.

MEMOMAX max_field_length

The maximum length in characters for text, note, or memo fields you are importing.

The default value is 100. Data that exceeds the maximum field length is truncated when imported to Analytics.
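The truncation behavior for both CHARMAX and MEMOMAX can be sketched as follows (illustrative Python; Analytics applies the limit during import):

```python
def truncate_field(value: str, max_len: int = 50) -> str:
    """Illustrates the truncation described for CHARMAX and MEMOMAX:
    source values longer than the maximum field length are cut off."""
    return value[:max_len]

print(truncate_field("A short description"))   # unchanged, under the limit
print(len(truncate_field("x" * 80)))           # 80-character value truncated to 50
```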

ALLCHARACTER

optional

Automatically assign the Character data type to all imported fields.

Once the data is in Analytics, you can assign different data types, such as Numeric or Datetime, to the fields, and specify format details.

Tip

ALLCHARACTER is useful if you are importing a table that contains numeric ID values. You can use ALLCHARACTER to prevent Analytics automatically assigning the Numeric data type to values that should use the Character data type.
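The tip above can be illustrated outside Analytics: converting a zero-padded ID to a number discards its leading zeros, which breaks later joins on the ID. A short Python sketch:

```python
# Illustrative only: why numeric typing can corrupt ID values.
raw_ids = ["00042", "00317", "10005"]

as_numbers = [int(v) for v in raw_ids]   # leading zeros are lost
as_text = raw_ids                        # Character typing preserves them

print(as_numbers)  # [42, 317, 10005]
print(as_text)     # ['00042', '00317', '10005']
```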

SOURCE connection_settings

The connection settings (connection string) required to connect to the data source.
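As the examples below show, connection settings are a semicolon-separated list of key=value pairs. A Python sketch of how such a string breaks down (illustrative only; the available keys are driver-specific):

```python
def parse_connection_settings(source: str) -> dict:
    """Splits a semicolon-separated key=value connection string of the
    kind passed to SOURCE. Illustrative only: Analytics passes the string
    to the driver as-is, and each driver defines its own keys."""
    pairs = (item.split("=", 1) for item in source.split(";") if "=" in item)
    return {key.strip(): value for key, value in pairs}

settings = parse_connection_settings(
    "port=5439;servername=acl_test.highbond.com;sslmode=require"
)
print(settings["port"])  # 5439
```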

HASH(salt_value, fields)

optional

Imports the specified fields as cryptographic hash values. Hash values are one-way transformations and cannot be decoded after you import the fields:

  • salt_value – an alphanumeric string that is concatenated with the source data values to strengthen the hashing of the values in the fields. Enter the salt value as a quoted string.

    The salt value is limited to 128 characters. Do not use any of the following characters: (  )  "

  • fields – a list of one or more fields to hash. Enter the fields as a quoted string and separate each field with a comma.

    You must specify the field name you see in the Data Access window preview and staging area, not the physical field name in the data source.

    Note

    The field name that is shown in the Data Access window preview is the field alias value in the SQL query ("field_name" AS "alias"). You must use the alias value to reference fields.

For example:

HASH("QZ3x7", "SSN_NO, CC_NO, Last_Name")

For information about comparing values hashed during import to values hashed in ACLScript, see Comparing data hashed with ACCESSDATA to data hashed with the ACLScript HASH( ) function.
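Conceptually, salted hashing concatenates the salt with each source value before applying a one-way hash function, so identical inputs always produce identical digests while the raw value cannot be recovered. The sketch below assumes SHA-256 and salt-then-value concatenation; the exact algorithm and concatenation order ACCESSDATA uses are not specified here, so treat this as an illustration only:

```python
import hashlib

def salted_hash(value: str, salt: str) -> str:
    """Conceptual sketch of salted one-way hashing.
    Assumptions (not confirmed by this reference): SHA-256,
    salt concatenated in front of the value."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

# Two imports of the same SSN yield the same digest, so records can
# still be joined or matched on the hashed field.
print(salted_hash("543-47-8901", "QZ3x7") == salted_hash("543-47-8901", "QZ3x7"))  # True
```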

SQL_QUERY (SQL_syntax) END_QUERY

The SQL import statement.

Everything inside the parentheses is part of the SQL query and must be valid SQL.

Note

You cannot use ACLScript syntax (commands or functions) in the body of the SQL import statement. You must use valid SQL syntax only.

Examples

Importing data using a native Analytics data connector

You need to import data from the Amazon Redshift cloud data service. To do so, you use the Analytics Amazon Redshift data connector:

ACCESSDATA64 CONNECTOR NAME "Amazon Redshift" USER "ACL_user" PROMPT_PASSWORD TO "Entitlement_History.FIL" CHARMAX 50 MEMOMAX 100
  SOURCE( boolsaschar=0;cachesize=100;database=usage;declarefetchmode=0;maxbytea=255;maxlongvarchar=8190;maxvarchar=255;port=5439;servername=acl_test.highbond.com;singlerowmode=1;sslmode=require;textaslongvarchar=0;usemultiplestatments=0;useunicode=1)
  SQL_QUERY(
    SELECT
      "entitlement_history"."organization" AS "organization",
      "entitlement_history"."user_email" AS "user_email",
      "entitlement_history"."plan_id" AS "plan_id",
      "entitlement_history"."date_from" AS "date_from",
      "entitlement_history"."date_to" AS "date_to"
    FROM
      "prm"."entitlement_history" "entitlement_history"
) END_QUERY

Importing data using a Windows ODBC driver

You need to import data from a Microsoft Access database. To do so, you use a Windows ODBC driver to connect to MS Access and complete the import:

ACCESSDATA32 ODBC "Driver" NAME "Microsoft Access Driver (*.mdb)" TO "Invoices.FIL" CHARMAX 50 MEMOMAX 100
  SOURCE( dbq=C:\Users\lachlan_murray\Documents\ACL Data\Sample Data Files\Sample.mdb;defaultdir=C:\Users\lachlan_murray\Documents\ACL Data\Sample Data Files;driverid=281;fil=MS Access;maxbuffersize=2048;maxscanrows=8;pagetimeout=5;safetransactions=0;threads=3;usercommitsync=Yes)
  SQL_QUERY(
    SELECT
      `Customer`.`CustID` AS `CustID`,
      `Customer`.`Company` AS `Company`,
      `Customer`.`Address` AS `Address`,
      `Customer`.`City` AS `City`,
      `Customer`.`Region` AS `Region`,
      `Customer`.`PostalCode` AS `PostalCode`,
      `Customer`.`Country` AS `Country`,
      `Customer`.`Phone` AS `Phone`,
      `Orders`.`OrderID` AS `OrderID`,
      `Orders`.`CustID` AS `Orders_CustID`,
      `Orders`.`ProdID` AS `ProdID`,
      `Orders`.`OrderDate` AS `OrderDate`,
      `Orders`.`Quantity` AS `Quantity`,
      `Product`.`ProdID` AS `Product_ProdID`,
      `Product`.`ProdName` AS `ProdName`,
      `Product`.`UnitPrice` AS `UnitPrice`,
      `Product`.`Descript` AS `Descript`,
      `Product`.`ShipWt` AS `ShipWt`
    FROM
      (`Customer` `Customer`
    INNER JOIN
      `Orders` `Orders`
       ON `Customer`.`CustID` = `Orders`.`CustID`
      )
INNER JOIN
    `Product` `Product`
      ON `Orders`.`ProdID` = `Product`.`ProdID`
WHERE
    (
      `Customer`.`Region` = 'BC'
      OR `Customer`.`Region` = 'WA'
    )
  ) END_QUERY

Importing data using a Windows DSN (data source name)

You need to import data from a Microsoft Excel file. To do so, you use a Windows DSN to connect to Excel and complete the import:

ACCESSDATA32 ODBC "Dsn" NAME "Excel Files" TO "Trans_April_15_cutoff.FIL" CHARMAX 50 MEMOMAX 100
  SOURCE( dbq=C:\Users\lachlan_murray\Documents\ACL Data\Sample Data Files\Trans_April.xls;defaultdir=C:\Users\lachlan_murray\Documents\ACL Data\Sample Data Files;driverid=1046;maxbuffersize=2048;pagetimeout=5)
  SQL_QUERY(
    SELECT
      `Trans_Apr_`.`CARDNUM` AS `CARDNUM`,
      `Trans_Apr_`.`AMOUNT` AS `AMOUNT`,
      `Trans_Apr_`.`TRANS_DATE` AS `TRANS_DATE`,
      `Trans_Apr_`.`CODES` AS `CODES`,
      `Trans_Apr_`.`CUSTNO` AS `CUSTNO`,
      `Trans_Apr_`.`DESCRIPTION` AS `DESCRIPTION`
    FROM
      `Trans_Apr$` `Trans_Apr_`
    WHERE
      (
       `Trans_Apr_`.`TRANS_DATE` <= {ts '2003-04-15 00:00:00'}
      )
  ) END_QUERY

Remarks

For more information about how this command works, see Working with the Data Access window.

Using password definitions with ACCESSDATA

The data sources that you connect to with the ACCESSDATA command often require authentication using a password, a token, or some other secret authentication value. For some data sources, more than one authentication value is required.

As part of the ACCESSDATA command, you can provide this authentication by specifying one or more password definitions. A password definition is not the password or authentication value itself. Rather, it is like a password variable that securely stores a previously provided password or authentication value. Specifying a password definition with the ACCESSDATA command allows you to avoid displaying an actual password in clear text in the connection settings specified by SOURCE.

Creating a password definition

In an import script, you must create a password definition first, before the definition can be used by the ACCESSDATA command.

For information about creating a password definition for use in Analytics, see PASSWORD command.

For information about creating a password definition for use in Robots, Analytics Exchange, or the Analysis App window, see PASSWORD tag.

Two options for specifying password definitions

You have two options when specifying password definitions with ACCESSDATA:

  • PASSWORD num – specifies a single password definition for data sources that require a single password
  • PASSWORD num AS password_keyname – can be used repeatedly to specify multiple password definitions for data sources that require multiple authentication values

Note

You can use the two options separately, or you can use them together.

How PASSWORD num works

Use the PASSWORD num parameter if a data source requires only a single password.

In the example below:

  1. The PASSWORD 1 command prompts a user to enter a password and creates a password definition that securely stores the entered password.
  2. In the ACCESSDATA command, the PASSWORD 1 parameter references the password definition and securely passes the stored password value into the connection setting specified by SOURCE ( auth_accesstoken=[$pwd] ).
PASSWORD 1
ACCESSDATA64 CONNECTOR NAME "Concur" PASSWORD 1 TO "Concur_data_import.FIL" CHARMAX 50 MEMOMAX 100
  SOURCE( auth_accesstoken=[$pwd];auth_type=OAuth 2.0;enabledoublebuffer=1;host=www.concursolutions.com;useencryptedendpoints=1;userparam=all)
  SQL_QUERY(
    SELECT
      "List_Items"."Level_7_Code" AS "Level_7_Code",
      "List_Items"."Name" AS "Name",
      "List_Items"."Level_10_Code" AS "Level_10_Code",
      "List_Items"."Level_8_Code" AS "Level_8_Code",
      "List_Items"."URI" AS "URI",
      "List_Items"."Id" AS "Id",
      "List_Items"."Level_3_Code" AS "Level_3_Code",
      "List_Items"."List_Id" AS "List_Id",
      "List_Items"."Level_4_Code" AS "Level_4_Code",
      "List_Items"."Level_1_Code" AS "Level_1_Code",
      "List_Items"."Parent_Id" AS "Parent_Id",
      "List_Items"."Level_2_Code" AS "Level_2_Code",
      "List_Items"."Level_5_Code" AS "Level_5_Code",
      "List_Items"."Level_6_Code" AS "Level_6_Code",
      "List_Items"."Level_9_Code" AS "Level_9_Code"
    FROM
      "Concur"."List_Items" "List_Items"
) END_QUERY

How PASSWORD num AS password_keyname works

Use the PASSWORD num AS password_keyname parameter if a data source requires multiple passwords or authentication values.

In the example below:

  1. In Robots or Analytics Exchange, the //PASSWORD tags in an analytic header create four password parameters for which the user must enter authentication values. The four parameters create four password definitions that securely store the entered values.
  2. In the ACCESSDATA command, the four PASSWORD parameters reference the password definitions and securely pass the stored authentication values into the connection settings specified by SOURCE :
    • oauthclientid=
    • oauthclientsecret=
    • oauthaccesstoken=
    • oauthaccesstokensecret=

For more information, see Configuring ACCESSDATA to work with multiple password definitions.

COMMENT
//ANALYTIC TYPE IMPORT Import Twitter data
//PASSWORD 1 Enter OAuth Client ID:
//PASSWORD 2 Enter OAuth Client Secret:
//PASSWORD 3 Enter OAuth Access Token:
//PASSWORD 4 Enter OAuth Access Token Secret:
//RESULT TABLE Twitter_user_data
END

ACCESSDATA64 CONNECTOR NAME "Twitter" PASSWORD 1 AS oauthclientid PASSWORD 2 AS oauthclientsecret PASSWORD 3 AS oauthaccesstoken PASSWORD 4 AS oauthaccesstokensecret TO "Twitter_user_data.FIL" CHARMAX 50 MEMOMAX 100
  SOURCE( oauthclientid=;oauthclientsecret=;oauthaccesstoken=;oauthaccesstokensecret=;readonly=true;drivertype=ACL Connector for Twitter;connectonopen=true;convertdatetimetogmt=true;limitkeysize=255;maptolongvarchar=-1;maptowvarchar=true;uppercaseidentifiers=false;supportenhancedsql=true;proxyauthscheme=BASIC;proxyautodetect=true;_persist_=encrypted-dp{AQA ... kX3E8yyh05HoG1rH4bm1lhwudUQ==}})
  SQL_QUERY(
    SELECT
      "Users"."ID" AS "ID",
      "Users"."Name" AS "Name",
      "Users"."Screen_Name" AS "Screen_Name",
      "Users"."Location" AS "Location",
      "Users"."Profile_URL" AS "Profile_URL",
      "Users"."Lang" AS "Lang",
      "Users"."Created_At" AS "Created_At",
      "Users"."Friends_Count" AS "Friends_Count",
      "Users"."Followers_Count" AS "Followers_Count",
      "Users"."Favourites_Count" AS "Favourites_Count",
      "Users"."Statuses_Count" AS "Statuses_Count",
      "Users"."Time_Zone" AS "Time_Zone",
      "Users"."Following" AS "Following",
      "Users"."Contributors_Enabled" AS "Contributors_Enabled",
      "Users"."Follow_Request_Sent" AS "Follow_Request_Sent",
      "Users"."Listed_Count" AS "Listed_Count",
      "Users"."Description" AS "Description",
      "Users"."Default_Profile" AS "Default_Profile"
    FROM
      "Twitter"."Users" "Users"
) END_QUERY

Configuring ACCESSDATA to work with multiple password definitions

To configure the ACCESSDATA command to work with multiple password definitions, you insert PASSWORD parameters in the command and copy password key names from the SOURCE parameter to the PASSWORD parameters.

  1. In Analytics, use the Data Access window to import data from a data source that requires more than one authentication value.
  2. Copy the ACCESSDATA command from the log to an open script in the Script Editor.

    Typically, only one authentication value is masked ( [$pwd] ) in the SOURCE parameter, and additional values appear in clear text. For example:

    SOURCE( oauthclientid=cXQ ... dR4;oauthclientsecret=QUt ... beo;oauthaccesstoken=913 ... cPn;oauthaccesstokensecret=[$pwd]; ... )
  3. Delete the clear text authentication values from the SOURCE parameter and leave only the password key names and the equals sign.

    For example:

    SOURCE( oauthclientid=;oauthclientsecret=;oauthaccesstoken=;oauthaccesstokensecret=[$pwd]; ... )

    Note

    Authentication values must now be supplied by users via password definitions created with the PASSWORD command or the PASSWORD analytic tag. For more information, see Creating a password definition.

  4. Optional. Delete [$pwd] from the one password key name with a masked authentication value.

    With this password key name you can use either of the two methods for specifying a password definition in the ACCESSDATA command. For more information, see Two options for specifying password definitions.

  5. Delete the PROMPT_PASSWORD parameter from the ACCESSDATA command.
  6. Insert numbered PASSWORD parameters at the location where you deleted PROMPT_PASSWORD and copy and paste the password key names from the SOURCE parameter to the PASSWORD parameters.

    For example:

    ACCESSDATA64 CONNECTOR NAME "Twitter" PASSWORD 1 AS oauthclientid PASSWORD 2 AS oauthclientsecret PASSWORD 3 AS oauthaccesstoken PASSWORD 4 AS oauthaccesstokensecret ...
      SOURCE( oauthclientid=;oauthclientsecret=;oauthaccesstoken=;oauthaccesstokensecret=; ... )

    Important

    Password key names must be exactly identical between the SOURCE parameter and the PASSWORD parameters. If they are not, the ACCESSDATA command fails.

  7. If you did not remove [$pwd] from the password key name with a masked authentication value, use the method for specifying a single password definition.

    For example:

    ACCESSDATA64 CONNECTOR NAME "Twitter" PASSWORD 1 AS oauthclientid PASSWORD 2 AS oauthclientsecret PASSWORD 3 AS oauthaccesstoken PASSWORD 4 ...
      SOURCE( oauthclientid=;oauthclientsecret=;oauthaccesstoken=;oauthaccesstokensecret=[$pwd]; ... )

    Result: The ACCESSDATA command, in conjunction with separately created password definitions, can now be used in an import script without displaying clear text authentication values in the SOURCE connection settings.
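Step 3 of the procedure above, deleting clear-text authentication values while leaving only the key names and equals signs, can be sketched in Python (illustrative only; the secret key names vary by connector):

```python
def scrub_source(source: str, secret_keys: set) -> str:
    """Blanks out clear-text authentication values in a SOURCE string,
    leaving only 'keyname='. Non-secret settings and the [$pwd]
    placeholder are left untouched. Illustrative sketch only."""
    cleaned = []
    for item in source.split(";"):
        key, sep, value = item.partition("=")
        if key in secret_keys and value != "[$pwd]":
            value = ""
        cleaned.append(key + sep + value)
    return ";".join(cleaned)

before = "oauthclientid=cXQdR4;oauthaccesstoken=913cPn;oauthaccesstokensecret=[$pwd];port=5439"
after = scrub_source(before, {"oauthclientid", "oauthaccesstoken", "oauthaccesstokensecret"})
print(after)
# oauthclientid=;oauthaccesstoken=;oauthaccesstokensecret=[$pwd];port=5439
```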

Multi-factor authentication is not supported

You cannot use the ACCESSDATA command to access data sources that require multi-factor authentication (MFA) since scripts cannot authenticate this way. If you need to access data protected with MFA, see if your organization will permit the use of a generic worker account that does not require MFA.

Data connector updates

When you upgrade Analytics, the Robots Agent, or AX Server, you should test any of your scripts that import data using one of the Analytics data connectors (ACCESSDATA command).

Changes made by third-party data sources or ODBC driver vendors may have required updates to one or more of the data connectors, so scripted data connections may need to be updated in order to continue working correctly.

  • Re-run the import – The easiest way to update a connection is to manually perform an import using the Data Access window in the upgraded version of Analytics. Copy the ACCESSDATA command from the log and use it to update your script.

    Note

    Before connecting to a data source and re-running the import, clear the connector cache to flush the existing set of table names.

    In the Existing Connections tab in the Data Access window, open the options menu beside the connector name and select Clear cache.

  • Update field specifications – You may also need to update field specifications in the script body to align with table schema changes in the data source or ODBC driver. Possible changes include field names, field data types, and field and record lengths.
  • Check the results of any filtering – You should also check the results of any filtering that you apply as part of the data import. Confirm that the import filtering is including and excluding records correctly.

Creating ODBC connection settings and SQL import statements

ODBC connection settings, and SQL import statements, are often quite lengthy and involved, as shown in the examples.

The easiest way to create these portions of the ACCESSDATA command is to first use the Data Access window in Analytics to connect to the target data source, and import data. You can then copy the entire ACCESSDATA command from the log, including the connection settings and import statement, and customize the command in any way you require.

ACCESSDATA log files

Two log files record the transactions associated with the ACCESSDATA command, and can be used for troubleshooting if a data connection fails:

  • ServerDataAccess.log – records all activities and errors prior to importing the data

    Location: C:\Users\<user account>\AppData\Local\ACL\ACL for Windows\Data Access\ServerDataAccess.log

    Note

    The "Server" in ServerDataAccess.log refers to the data access component of Analytics running locally on the computer where Analytics is installed.

  • DataAccess.log – records information about the import operation and the Analytics project that you are importing data to

    Location: ..\<Analytics project folder>\DataAccess.log

Comparing data hashed with ACCESSDATA to data hashed with the ACLScript HASH( ) function

Even though you cannot read the raw values of hashed data, hashed values are still useful when combining or analyzing data, because identical source values produce identical hash values.

If you want to compare values that are hashed by ACCESSDATA during import with values that are hashed using ACLScript's HASH( ) function, you must convert any numeric or datetime Analytics fields to character values and trim all leading and trailing spaces before hashing the data.

Datetime fields must use the following formats when converted to character:

  • Datetime – "YYYY-MM-DD hh:mm:ss"
  • Date – "YYYY-MM-DD"
  • Time – "hh:mm:ss"
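For reference, these formats correspond to the following patterns, shown here in Python for illustration (the actual conversion happens in ACLScript before hashing):

```python
from datetime import datetime

# The three required character formats, expressed as strftime patterns.
dt = datetime(2003, 4, 15, 13, 30, 0)
print(dt.strftime("%Y-%m-%d %H:%M:%S"))  # 2003-04-15 13:30:00
print(dt.date().isoformat())             # 2003-04-15
print(dt.time().strftime("%H:%M:%S"))    # 13:30:00
```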

The following example uses the STRING( ) and ALLTRIM( ) functions to convert a numeric credit card number field to character data before hashing the value using ACLScript's HASH( ) function:

COMMENT ACL HASH function used after importing data
HASH(ALLTRIM(STRING(CC_No, 16)), "QZ3x7")

Once you hash the Analytics values, you can compare them with values hashed as part of the ACCESSDATA command import.
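The full comparison preparation, convert to character, trim, then hash with the import-time salt, can be sketched in Python. The sketch assumes SHA-256, value-then-salt concatenation, and that STRING( ) right-justifies the number in a 16-character field; verify these details against the HASH( ) and STRING( ) function references before relying on them:

```python
import hashlib

def hash_like_aclscript(cc_no: int, salt: str) -> str:
    """Sketch of the comparison prep above. Assumptions (hypothetical,
    not confirmed by this reference): SHA-256, value+salt order,
    and right-justified STRING( ) padding."""
    as_text = str(cc_no).rjust(16)  # analogue of STRING(CC_No, 16)
    trimmed = as_text.strip()       # analogue of ALLTRIM( )
    return hashlib.sha256((trimmed + salt).encode("utf-8")).hexdigest()

# Identical prep and salt on both sides makes the hashes comparable.
print(hash_like_aclscript(4556123456789012, "QZ3x7") ==
      hash_like_aclscript(4556123456789012, "QZ3x7"))  # True
```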