Imports data from a variety of ODBC-compliant data sources.
The command takes the form ACCESSDATA64 or ACCESSDATA32 depending on whether you are using a 64-bit or 32-bit ODBC driver.
Syntax
{ACCESSDATA64 | ACCESSDATA32} {CONNECTOR | ODBC {"Driver"|"Dsn"|"File"}} NAME value <USER user_id> <PASSWORD num | PROMPT_PASSWORD> TO table_name CHARMAX max_field_length MEMOMAX max_field_length <ALLCHARACTER> SOURCE (connection_settings) <HASH(salt_value, fields)> SQL_QUERY (SQL_syntax) END_QUERY
Parameters
Name | Description
---|---
CONNECTOR \| ODBC {"Driver"\|"Dsn"\|"File"} | The type of ODBC connection you want to make: use CONNECTOR to connect through one of the native Analytics data connectors, ODBC "Driver" to connect through a Windows ODBC driver installed on your computer, ODBC "Dsn" to connect through a saved Windows DSN (data source name), or ODBC "File" to connect through a file DSN. |
NAME value | The name of the Analytics data connector, the ODBC driver, or the DSN. For example: NAME "Amazon Redshift", NAME "Microsoft Access Driver (*.mdb)", or NAME "Excel Files". |
USER user_id (optional) | The user ID for data sources that require a user ID. |
PASSWORD num \| PROMPT_PASSWORD (optional) | For data sources that require a password: PASSWORD num specifies a previously created password definition, and PROMPT_PASSWORD prompts the user for a password when the command runs. For more information, see the PASSWORD command and the SET command. Tip: Using the PASSWORD command with PASSWORD num is similar to using PROMPT_PASSWORD. Both approaches prompt the user for a password. PROMPT_PASSWORD has the added benefit of allowing the user_id to be updated. |
TO table_name | The name of the Analytics table to output the imported data to. Specify table_name as a quoted string with a .FIL file extension, for example TO "Invoices.FIL". |
CHARMAX max_field_length | The maximum length in characters for any field in the Analytics table that originates as character data in the source from which you are importing. The default value is 50. Data that exceeds the maximum field length is truncated when imported to Analytics. |
MEMOMAX max_field_length | The maximum length in characters for text, note, or memo fields you are importing. The default value is 100. Data that exceeds the maximum field length is truncated when imported to Analytics. |
ALLCHARACTER (optional) | Automatically assign the Character data type to all imported fields. Once the data is in Analytics, you can assign different data types, such as Numeric or Datetime, to the fields, and specify format details. Tip: ALLCHARACTER is useful if you are importing a table that contains numeric ID values. You can use ALLCHARACTER to prevent Analytics from automatically assigning the Numeric data type to values that should use the Character data type. |
SOURCE (connection_settings) | The connection settings (connection string) required to connect to the data source. |
HASH(salt_value, fields) (optional) | Imports the specified fields as cryptographic hash values. Hash values are one-way transformations and cannot be decoded after you import the fields. For example: HASH("QZ3x7", "SSN_NO, CC_NO, Last_Name"). For information about comparing values hashed during import to values hashed in ACLScript, see Comparing data hashed with ACCESSDATA to data hashed with the ACLScript HASH( ) function. |
SQL_QUERY (SQL_syntax) END_QUERY | The SQL import statement. Everything inside the parentheses is part of the SQL query and must be valid SQL. |
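For instance, a hedged sketch of using a password definition with PASSWORD num. The connector name, user ID, and table name are borrowed from the Amazon Redshift example below, but the SOURCE settings and query here are abbreviated placeholders, not a complete working connection string:

PASSWORD 1 "Enter the database password:"
ACCESSDATA64 CONNECTOR NAME "Amazon Redshift" USER "ACL_user" PASSWORD 1 TO "Entitlement_History.FIL" CHARMAX 50 MEMOMAX 100 SOURCE( database=usage;port=5439;servername=acl_test.highbond.com) SQL_QUERY( SELECT * FROM "prm"."entitlement_history" ) END_QUERY

The PASSWORD command prompts the user and stores the entered value as password definition 1, which the ACCESSDATA command then references instead of containing a plain-text password.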
Examples
Importing data using a native Analytics data connector
You need to import data from the Amazon Redshift cloud data service. To do so, you use the Analytics Amazon Redshift data connector:
ACCESSDATA64 CONNECTOR NAME "Amazon Redshift" USER "ACL_user" PROMPT_PASSWORD TO "Entitlement_History.FIL" CHARMAX 50 MEMOMAX 100 SOURCE( boolsaschar=0;cachesize=100;database=usage;declarefetchmode=0;maxbytea=255;maxlongvarchar=8190;maxvarchar=255;port=5439;servername=acl_test.highbond.com;singlerowmode=1;sslmode=require;textaslongvarchar=0;usemultiplestatments=0;useunicode=1) SQL_QUERY( SELECT "entitlement_history"."organization" AS "organization", "entitlement_history"."user_email" AS "user_email", "entitlement_history"."plan_id" AS "plan_id", "entitlement_history"."date_from" AS "date_from", "entitlement_history"."date_to" AS "date_to" FROM "prm"."entitlement_history" "entitlement_history" ) END_QUERY
Importing data using a Windows ODBC driver
You need to import data from a Microsoft Access database. To do so, you use a Windows ODBC driver to connect to MS Access and complete the import:
ACCESSDATA32 ODBC "Driver" NAME "Microsoft Access Driver (*.mdb)" TO "Invoices.FIL" CHARMAX 50 MEMOMAX 100 SOURCE( dbq=C:\Users\lachlan_murray\Documents\ACL Data\Sample Data Files\Sample.mdb;defaultdir=C:\Users\lachlan_murray\Documents\ACL Data\Sample Data Files;driverid=281;fil=MS Access;maxbuffersize=2048;maxscanrows=8;pagetimeout=5;safetransactions=0;threads=3;usercommitsync=Yes) SQL_QUERY( SELECT `Customer`.`CustID` AS `CustID`, `Customer`.`Company` AS `Company`, `Customer`.`Address` AS `Address`, `Customer`.`City` AS `City`, `Customer`.`Region` AS `Region`, `Customer`.`PostalCode` AS `PostalCode`, `Customer`.`Country` AS `Country`, `Customer`.`Phone` AS `Phone`, `Orders`.`OrderID` AS `OrderID`, `Orders`.`CustID` AS `Orders_CustID`, `Orders`.`ProdID` AS `ProdID`, `Orders`.`OrderDate` AS `OrderDate`, `Orders`.`Quantity` AS `Quantity`, `Product`.`ProdID` AS `Product_ProdID`, `Product`.`ProdName` AS `ProdName`, `Product`.`UnitPrice` AS `UnitPrice`, `Product`.`Descript` AS `Descript`, `Product`.`ShipWt` AS `ShipWt` FROM (`Customer` `Customer` INNER JOIN `Orders` `Orders` ON `Customer`.`CustID` = `Orders`.`CustID` ) INNER JOIN `Product` `Product` ON `Orders`.`ProdID` = `Product`.`ProdID` WHERE ( `Customer`.`Region` = 'BC' OR `Customer`.`Region` = 'WA' ) ) END_QUERY
Importing data using a Windows DSN (data source name)
You need to import data from a Microsoft Excel file. To do so, you use a Windows DSN to connect to Excel and complete the import:
ACCESSDATA32 ODBC "Dsn" NAME "Excel Files" TO "Trans_April_15_cutoff.FIL" CHARMAX 50 MEMOMAX 100 SOURCE( dbq=C:\Users\lachlan_murray\Documents\ACL Data\Sample Data Files\Trans_April.xls;defaultdir=C:\Users\lachlan_murray\Documents\ACL Data\Sample Data Files;driverid=1046;maxbuffersize=2048;pagetimeout=5) SQL_QUERY( SELECT `Trans_Apr_`.`CARDNUM` AS `CARDNUM`, `Trans_Apr_`.`AMOUNT` AS `AMOUNT`, `Trans_Apr_`.`TRANS_DATE` AS `TRANS_DATE`, `Trans_Apr_`.`CODES` AS `CODES`, `Trans_Apr_`.`CUSTNO` AS `CUSTNO`, `Trans_Apr_`.`DESCRIPTION` AS `DESCRIPTION` FROM `Trans_Apr$` `Trans_Apr_` WHERE ( `Trans_Apr_`.`TRANS_DATE` <= {ts '2003-04-15 00:00:00'} ) ) END_QUERY
Remarks
Note
For more information about how this command works, see the Analytics Help.
Data connector updates
When you upgrade Analytics, the Robots Agent, or AX Server, you should test any of your scripts that import data using one of the Analytics data connectors (ACCESSDATA command).
Changes made by third-party data sources or ODBC driver vendors may have required updates to one or more of the data connectors, so scripted data connections may need to be updated to continue working correctly.
- Re-run the import: The easiest way to update a connection is to manually perform an import using the Data Access window in the upgraded version of Analytics. Copy the ACCESSDATA command from the log and use it to update your script.
Note
Before connecting to a data source and re-running the import, clear the connector cache to flush the existing set of table names.
In the Existing Connections tab in the Data Access window, open the menu beside the connector name and select Clear cache.
- Update field specifications: You may also need to update field specifications in the script body to align with table schema changes in the data source or ODBC driver. Possible changes include field names, field data types, and field and record lengths.
- Check the results of any filtering: You should also check the results of any filtering that you apply as part of the data import. Confirm that the import filtering is including and excluding records correctly.
Creating ODBC connection settings and SQL import statements
ODBC connection settings and SQL import statements are often quite lengthy and involved, as the examples above show.
The easiest way to create these portions of the ACCESSDATA command is to first use the Data Access window in Analytics to connect to the target data source, and import data. You can then copy the entire ACCESSDATA command from the log, including the connection settings and import statement, and customize the command in any way you require.
Password value suppressed
When you use the Data Access window in Analytics to run the ACCESSDATA command, and provide a password, the password value is not written to the log. Instead, the PROMPT_PASSWORD parameter is substituted.
ACCESSDATA log files
Two log files record the transactions associated with the ACCESSDATA command, and can be used for troubleshooting if a data connection fails:
- ServerDataAccess.log records all activities and errors prior to importing the data
Location: C:\Users\<user account>\AppData\Local\ACL\ACL for Windows\Data Access\ServerDataAccess.log
Note
The "Server" in ServerDataAccess.log refers to the data access component of Analytics running locally on the computer where Analytics is installed.
- DataAccess.log records information about the import operation and the Analytics project that you are importing data to
Location: ..\<Analytics project folder>\DataAccess.log
Comparing data hashed with ACCESSDATA to data hashed with the ACLScript HASH( ) function
Even though you cannot read the raw values of hashed data, the hashed values are still useful when combining or analyzing data.
If you want to compare values that are hashed by ACCESSDATA during import with values that are hashed using ACLScript's HASH( ) function, you must convert any numeric or datetime Analytics fields to character values and trim all leading and trailing spaces before hashing the data.
Datetime fields must use the following formats when converted to character:
- Datetime "YYYY-MM-DD hh:mm:ss"
- Date "YYYY-MM-DD"
- Time "hh:mm:ss"
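For example, a hedged sketch that converts a datetime field to the required character format before hashing. The field name Trans_Date and the salt value "QZ3x7" are illustrative, and the sketch assumes the DATETIME( ) function to perform the datetime-to-character conversion:

COMMENT Convert a datetime field to "YYYY-MM-DD hh:mm:ss" character format, trim spaces, then hash
HASH(ALLTRIM(DATETIME(Trans_Date, "YYYY-MM-DD hh:mm:ss")), "QZ3x7")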
The following example uses the STRING( ) and ALLTRIM( ) functions to convert a numeric credit card number field to character data before hashing the value using ACLScript's HASH( ) function:
COMMENT ACL HASH function used after importing data
HASH(ALLTRIM(STRING(CC_No, 16)), "QZ3x7")
Once you hash the Analytics values, you can compare them with values hashed as part of the ACCESSDATA command import.
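As a hedged sketch of that comparison, with illustrative field and table names: if CC_No was hashed during import with HASH("QZ3x7", "CC_No"), you could create a computed field on a second, unhashed table that applies the same salt, then compare or join on the hashed values:

COMMENT Hash the local values the same way the import hashed them
DEFINE FIELD CC_No_Hashed COMPUTED HASH(ALLTRIM(STRING(CC_No, 16)), "QZ3x7")

Because both sides use the same salt value and the same character formatting, matching source values produce matching hash values.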