HDFS Connection Details

Introduction

Connector Version

This documentation is based on version 23.0.8895 of the connector.

Get Started

HDFS Version Support

The connector leverages the HDFS API to enable bidirectional access to HDFS.

Establish a Connection

Connect to HDFS

To connect, set the following connection properties (a sample connection string follows the list):

  • Host: Set this value to the host of your HDFS installation.
  • Port: Set this value to the port of your HDFS installation. Default port: 50070
  • UseSSL: (Optional) Set this value to 'True' to negotiate TLS/SSL connections to the HDFS server. Default: 'False'.
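
For example, a minimal JDBC-style connection string using these properties might look like the following sketch (the host name is illustrative):

jdbc:cdata:hdfs:Host=namenode.example.com;Port=50070;UseSSL=True;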

Authenticate to HDFS

There are two authentication methods available for connecting to the HDFS data source, Anonymous Authentication and Negotiate (Kerberos) Authentication.

Anonymous Authentication

In some situations, HDFS may be connected to without any authentication connection properties. To do so, set the AuthScheme to None (default).

Kerberos

When authentication credentials are required, you can use Kerberos. See Using Kerberos for details on how to authenticate with Kerberos.

Fine-Tuning Data Access

Fine-Tuning Data Access

You can use the following properties to gain more control over the data returned from HDFS (a sample connection string follows the list):

  • DirectoryRetrievalDepth: The number of subfolder levels to scan recursively before stopping.
    -1 specifies that all subfolders are scanned. 0 specifies that only the current folder is scanned for items.
  • Path: The HDFS path to use as the working directory; only items under this path are scanned.
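
For example, a sketch of a connection string that limits scanning to two levels of subfolders beneath a specific working directory (host and path are illustrative):

jdbc:cdata:hdfs:Host=namenode.example.com;Port=50070;Path=/user/data;DirectoryRetrievalDepth=2;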

Use Kerberos

Kerberos

To authenticate to HDFS with Kerberos, set AuthScheme to NEGOTIATE.

Authenticating to HDFS via Kerberos requires you to define authentication properties and to choose how Kerberos should retrieve authentication tickets.

Retrieve Kerberos Tickets

Kerberos tickets are used to authenticate the requester's identity. The use of tickets instead of formal logins/passwords eliminates the need to store passwords locally or send them over a network. Users are reauthenticated (tickets are refreshed) whenever they log in at their local computer or enter kinit USER at the command prompt.

The connector provides three ways to retrieve the required Kerberos ticket, depending on whether or not the KRB5CCNAME and/or KerberosKeytabFile variables exist in your environment.

MIT Kerberos Credential Cache File

This option enables you to use the MIT Kerberos Ticket Manager or kinit command to get tickets. With this option there is no need to set the User or Password connection properties.

This option requires that KRB5CCNAME has been created in your system.

To enable ticket retrieval via MIT Kerberos Credential Cache Files:

  1. Ensure that the KRB5CCNAME variable is present in your environment.
  2. Set KRB5CCNAME to a path that points to your credential cache file. (For example, C:\krb_cache\krb5cc_0 or /tmp/krb5cc_0.) The credential cache file is created when you use the MIT Kerberos Ticket Manager to generate your ticket.
  3. To obtain a ticket:

    1. Open the MIT Kerberos Ticket Manager application.
    2. Click Get Ticket.
    3. Enter your principal name and password.
    4. Click OK.

    If the ticket is successfully obtained, the ticket information appears in Kerberos Ticket Manager and is stored in the credential cache file.

The connector uses the cache file to obtain the Kerberos ticket to connect to HDFS.

Note

If you would prefer not to edit KRB5CCNAME, you can use the KerberosTicketCache property to set the file path manually. After this is set, the connector uses the specified cache file to obtain the Kerberos ticket to connect to HDFS.
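
For example, a sketch of a connection string that points the connector at a specific credential cache file (host and cache path are illustrative):

jdbc:cdata:hdfs:AuthScheme=NEGOTIATE;Host=namenode.example.com;Port=50070;KerberosTicketCache=C:\krb_cache\krb5cc_0;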

Keytab File

If your environment lacks the KRB5CCNAME environment variable, you can retrieve a Kerberos ticket using a Keytab File.

To use this method, set the User property to the desired username, and set the KerberosKeytabFile property to a file path pointing to the keytab file associated with the user.
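
For example, a sketch of a keytab-based connection string (host, user, and keytab path are illustrative):

jdbc:cdata:hdfs:AuthScheme=NEGOTIATE;Host=namenode.example.com;Port=50070;User=hdfsuser;KerberosKeytabFile=/etc/security/keytabs/hdfsuser.keytab;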

User and Password

If your environment lacks the KRB5CCNAME environment variable and the KerberosKeytabFile property has not been set, you can retrieve a ticket using a user and password combination.

To use this method, set the User and Password properties to the user/password combination that you use to authenticate with HDFS.
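
For example, a sketch of a user/password connection string (all values shown are placeholders):

jdbc:cdata:hdfs:AuthScheme=NEGOTIATE;Host=namenode.example.com;Port=50070;User=hdfsuser;Password=mypassword;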

Enable Cross-Realm Authentication

More complex Kerberos environments can require cross-realm authentication where multiple realms and KDC servers are used. For example, they might use one realm/KDC for user authentication, and another realm/KDC for obtaining the service ticket.

To enable this kind of cross-realm authentication, set the KerberosRealm and KerberosKDC properties to the values required for user authentication. Also, set the KerberosServiceRealm and KerberosServiceKDC properties to the values required to obtain the service ticket.
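
For example, a sketch of a cross-realm connection string (realm and KDC names are illustrative):

jdbc:cdata:hdfs:AuthScheme=NEGOTIATE;Host=namenode.example.com;Port=50070;User=hdfsuser;Password=mypassword;KerberosRealm=USERS.EXAMPLE.COM;KerberosKDC=kdc.users.example.com;KerberosServiceRealm=SERVICES.EXAMPLE.COM;KerberosServiceKDC=kdc.services.example.com;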

Important Notes

Configuration Files and Their Paths

  • All references to adding configuration files and their paths refer to files and locations on the Jitterbit agent where the connector is installed. These paths are to be adjusted as appropriate depending on the agent and the operating system. If multiple agents are used in an agent group, identical files will be required on each agent.

Advanced Features

This section details a selection of advanced features of the HDFS connector.

User Defined Views

The connector allows you to define virtual tables, called user defined views, whose contents are decided by a pre-configured query. These views are useful when you cannot directly control queries being issued to the drivers. See User Defined Views for an overview of creating and configuring custom views.

SSL Configuration

Use SSL Configuration to adjust how the connector handles TLS/SSL certificate negotiations. You can choose from various certificate formats; see the SSLServerCert property under "Connection String Options" for more information.

Proxy

To configure the connector using private agent proxy settings, select the Use Proxy Settings checkbox on the connection configuration screen.

Query Processing

The connector offloads as much of the SELECT statement processing as possible to HDFS and then processes the rest of the query in memory (client-side).

See Query Processing for more information.

User Defined Views

The HDFS connector allows you to define a virtual table whose contents are decided by a pre-configured query. These are called User Defined Views, which are useful in situations where you cannot directly control the query being issued to the driver, e.g. when using the driver from Jitterbit. The User Defined Views can be used to define predicates that are always applied. If you specify additional predicates in the query to the view, they are combined with the query already defined as part of the view.

There are two ways to create user defined views:

  • Create a JSON-formatted configuration file defining the views you want.
  • Execute DDL statements.

Define Views Using a Configuration File

User Defined Views are defined in a JSON-formatted configuration file called UserDefinedViews.json. The connector automatically detects the views specified in this file.

You can also have multiple view definitions and control them using the UserDefinedViews connection property. When you use this property, only the specified views are seen by the connector.

This User Defined View configuration file is formatted as follows:

  • Each root element defines the name of a view.
  • Each root element contains a child element, called query, which contains the custom SQL query for the view.

For example:

{
    "MyView": {
        "query": "SELECT * FROM Files WHERE MyColumn = 'value'"
    },
    "MyView2": {
        "query": "SELECT * FROM MyTable WHERE Id IN (1,2,3)"
    }
}

Use the UserDefinedViews connection property to specify the location of your JSON configuration file. For example:

"UserDefinedViews", "C:\Users\yourusername\Desktop\tmp\UserDefinedViews.json"

Define Views Using DDL Statements

The connector is also capable of creating and altering the schema via DDL Statements such as CREATE LOCAL VIEW, ALTER LOCAL VIEW, and DROP LOCAL VIEW.

Create a View

To create a new view using DDL statements, provide the view name and query as follows:

CREATE LOCAL VIEW [MyViewName] AS SELECT * FROM Customers LIMIT 20;

If no JSON file exists, the above code creates one. The view is then created in the JSON configuration file and is now discoverable. The JSON file location is specified by the UserDefinedViews connection property.

Alter a View

To alter an existing view, provide the name of an existing view alongside the new query you would like to use instead:

ALTER LOCAL VIEW [MyViewName] AS SELECT * FROM Customers WHERE TimeModified > '3/1/2020';

The view is then updated in the JSON configuration file.

Drop a View

To drop an existing view, provide the name of the existing view:

DROP LOCAL VIEW [MyViewName]

This removes the view from the JSON configuration file. It can no longer be queried.

Schema for User Defined Views

User Defined Views are exposed in the UserViews schema by default. This is done to avoid the view's name clashing with an actual entity in the data model. You can change the name of the schema used for UserViews by setting the UserViewsSchemaName property.

Work with User Defined Views

For example, suppose a User Defined View called UserViews.RCustomers is defined by a query that only lists customers in Raleigh:

SELECT * FROM Customers WHERE City = 'Raleigh';

An example of a query to the driver:

SELECT * FROM UserViews.RCustomers WHERE Status = 'Active';

Resulting in the effective query to the source:

SELECT * FROM Customers WHERE City = 'Raleigh' AND Status = 'Active';

That is a very simple example of a query to a User Defined View that is effectively a combination of the view query and the view definition. It is possible to compose these queries in much more complex patterns. All SQL operations are allowed in both queries and are combined when appropriate.

SSL Configuration

Customize the SSL Configuration

By default, the connector attempts to negotiate SSL/TLS by checking the server's certificate against the system's trusted certificate store.

To specify another certificate, see the SSLServerCert property for the available formats to do so.

Data Model

The HDFS connector models HDFS objects as relational tables and views. HDFS objects have relationships to other objects; in the tables, these relationships are expressed through foreign keys. The following sections show the available API objects and provide more information on executing SQL to HDFS APIs.

Schemas for most database objects are defined in simple, text-based configuration files.

Key Features

  • The connector models HDFS entities such as files and permissions as relational views, allowing you to write SQL to query HDFS data.
  • Stored procedures allow you to execute operations against HDFS.
  • Live connectivity to these objects means any changes to your HDFS account are immediately reflected when using the connector.

Views

Views are similar to tables in the way that data is represented; however, views are read-only.

Queries can be executed against a view as if it were a normal table.

HDFS Connector Views

Name Description
Files Lists the contents of the supplied path.
Permissions Lists the permissions of the file or files specified in the path.

Files

Lists the contents of the supplied path.

Table Specific Information
Select

This returns a list of all the files and directories in your system. By default, all subfolders are recursively scanned to list their children. You can configure the depth of subfolders to be recursively scanned with the DirectoryRetrievalDepth property. All filters are executed client-side within the connector.
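
For example, assuming the standard HDFS type values such as FILE and DIRECTORY, the following query lists only directories (the filter is applied client-side):

SELECT FullPath, Length, ModificationTime FROM Files WHERE Type = 'DIRECTORY'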

Columns
Name Type Description
FileId [KEY] Long The unique ID associated with the file.
PathSuffix String The path suffix.
FullPath String The full path of the file.
Owner String The user who is the owner.
Group String The group owner.
Length Long The number of bytes in a file.
Permission String The permission, represented as an octal string.
Replication Integer The replication factor of the file.
StoragePolicy Integer The name of the storage policy.
ChildrenNum Integer The number of children the file has.
BlockSize Long The block size of a file.
ModificationTime Datetime The modification time.
AccessTime Datetime The access time.
Type String The type of the path object.

Permissions

Lists the permissions of the file or files specified in the path.

Table Specific Information
Select

This returns a list of permissions for all the files and directories in your system. All filters are executed client-side within the connector.
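
For example, the following query returns the paths that grant write access to everyone (the filter is applied client-side):

SELECT FullPath, OthersWrite FROM Permissions WHERE OthersWrite = true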

Columns
Name Type Description
FullPath [KEY] String The full path of the file.
OwnerRead Boolean Whether the owner of this file has read access.
OwnerWrite Boolean Whether the owner of this file has write access.
OwnerExecute Boolean Whether the owner of this file has execute access.
GroupRead Boolean Whether the group this file belongs to has read access.
GroupWrite Boolean Whether the group this file belongs to has write access.
GroupExecute Boolean Whether the group this file belongs to has execute access.
OthersRead Boolean Whether everyone else has read access.
OthersWrite Boolean Whether everyone else has write access.
OthersExecute Boolean Whether everyone else has execute access.

Stored Procedures

Stored procedures are function-like interfaces that extend the functionality of the connector beyond simple SELECT operations with HDFS.

Stored procedures accept a list of parameters, perform their intended function, and then return any relevant response data from HDFS, along with an indication of whether the procedure succeeded or failed.

HDFS Connector Stored Procedures

Name Description
AppendToFile Create and Write to a File.
Concat Concatenate a group of files to another file.
CreateSnapshot Create a Snapshot of a folder.
CreateSymLink Create a Symbolic Link.
DeleteFile Delete a file or a directory.
DeleteSnapshot Delete a Snapshot of a folder.
DownloadFile Open and read a file.
GetContentSummary Get the content summary of a file/folder.
GetFileChecksum Get the checksum of a file.
GetHomeDirectory Get the home directory for the current user.
GetTrashRoot Get the root directory of the Trash for the current user when the specified path is deleted.
ListStatus Lists the contents of the supplied path.
MakeDirectory Create a directory in the specified path.
RenameFile Rename a file or a directory.
RenameSnapshot Rename a Snapshot of a folder.
SetOwner Set owner and group of a path.
SetPermission Set permission of a path.
TruncateFile Truncate a file to a new length.
UploadFile Create and Write to a File.
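
As a sketch of how one of these procedures might be invoked with SQL (the exact invocation syntax can vary by tool, and the directory path is illustrative):

EXECUTE MakeDirectory Path = '/user/example/newdir', Permission = '755'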

AppendToFile

Create and Write to a File.

Input
Name Type Required Description
Path String True The absolute path of the file for which content will be appended.
FilePath String False The path of the file whose content will be appended to the specified file. Has higher priority than Content.
Content String False The content as a string which will be appended to the specified file. Has lower priority than FilePath.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

Concat

Concatenate a group of files to another file.

Input
Name Type Required Description
Path String True The path which will be concatenated with other paths/sources.
Sources String True A comma separated list of paths/sources. These will be joined to the Path input.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

CreateSnapshot

Create a Snapshot of a folder.

Input
Name Type Required Description
Path String False The path for which the snapshot will be created. Must be a folder.
SnapshotName String False The name of the snapshot which will be created.
Result Set Columns
Name Type Description
Path String The path of the created snapshot.

CreateSymLink

Create a Symbolic Link.

Input
Name Type Required Description
Path String True The path for which a symbolic link will be created.
Destination String True The destination of the symbolic link.
CreateParent Boolean False Whether the parent of the symbolic link should be created.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

DeleteFile

Delete a file or a directory.

Input
Name Type Required Description
Path String True The path (file or folder) which will be deleted.
Recursive Boolean False If the path to be deleted is a folder, whether all children should be deleted as well.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

DeleteSnapshot

Delete a Snapshot of a folder.

Input
Name Type Required Description
Path String True The path for which the snapshot will be deleted. Must be a folder.
SnapshotName String True The name of the snapshot which will be deleted.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

DownloadFile

Open and read a file.

Input
Name Type Required Description
Path String True The path of the file which will be opened.
Offset Integer False The offset from which the reading will start.
Length Integer False The amount of data to read from the file.
BufferSize Integer False The size of the internal buffer used when reading the file.
WriteToFile String False The local location of the file where the output will be written to.
Encoding String False The FileData input encoding type. The allowed values are NONE, BASE64. The default value is NONE.
Result Set Columns
Name Type Description
Output String The file's content. Output will be displayed only if WriteToFile and FileStream are not set.

GetContentSummary

Get the content summary of a file/folder.

Input
Name Type Required Description
Path String True The absolute path of the file/folder whose content summary will be returned.
Result Set Columns
Name Type Description
DirectoryCount String The number of directories in this folder.
FileCount String The number of files in this folder.
Length Integer The length of the folder/file.
Quota Integer The quota of the folder/file.
SpaceConsumed Integer The amount of space consumed by this folder/file.
SpaceQuota Integer The space quota of the folder/file.

GetFileChecksum

Get the checksum of a file.

Input
Name Type Required Description
Path String True The path for which the checksum will be returned. Must be a file.
Result Set Columns
Name Type Description
Algorithm String The algorithm used for the checksum.
Bytes String The checksum returned.
Length Integer The length of the file.

GetHomeDirectory

Get the home directory for the current user.

Result Set Columns
Name Type Description
Path String The path of the current user's home directory.

GetTrashRoot

Get the root directory of the Trash for the current user when the specified path is deleted.

Input
Name Type Required Description
Path String False The path whose trash root will be returned.
Result Set Columns
Name Type Description
Path String The path of the current user's trash root (for the specified path).

ListStatus

Lists the contents of the supplied path.

Input
Name Type Required Description
Path String False The path whose contents will be listed.
Result Set Columns
Name Type Description
FileId Long The unique ID associated with the file.
PathSuffix String The path suffix.
Owner String The user who is the owner.
Group String The group owner.
Length Long The number of bytes in a file.
Permission String The permission, represented as an octal string.
Replication Integer The replication factor of the file.
StoragePolicy Integer The name of the storage policy.
ChildrenNum Integer The number of children the file has.
BlockSize Long The block size of a file.
ModificationTime Datetime The modification time.
AccessTime Datetime The access time.
Type String The type of the path object.

MakeDirectory

Create a directory in the specified path.

Input
Name Type Required Description
Path String False The path of the new directory which will be created.
Permission String False The permission of the new directory. If no permissions are specified, the newly created directory has 755 permissions by default.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

RenameFile

Rename a file or a directory.

Input
Name Type Required Description
Path String True The path which will be renamed.
Destination String True The new path for the renamed file/folder.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

RenameSnapshot

Rename a Snapshot of a folder.

Input
Name Type Required Description
Path String False The path for which the snapshot will be renamed. Must be a folder.
SnapshotName String True The new name of the snapshot.
OldSnapshotName String True The old name of the snapshot.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

SetOwner

Set owner and group of a path.

Input
Name Type Required Description
Path String False The path whose owner/group will be changed.
Owner String False The new owner.
Group String False The new group.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

SetPermission

Set permission of a path.

Input
Name Type Required Description
Path String False The path whose permissions will be changed
Permission String True Unix permissions in an octal (base-8) notation.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

TruncateFile

Truncate a file to a new length.

Input
Name Type Required Description
Path String False The path of the file which will be truncated.
NewLength Integer True The new length for this file.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

UploadFile

Create and Write to a File.

Input
Name Type Required Description
Path String False The absolute path of the file which will be created.
Overwrite Boolean False If set to true, the file will be overwritten.
BlockSize Integer False The block size for this file.
Permission String False The permissions which will be set for the created file.
FilePath String False The path of the file whose content will be written to the newly created file. Has higher priority than Content.
Content String False The content as a string which will be written to the newly created file. Has lower priority than FilePath.
Result Set Columns
Name Type Description
Success Boolean Whether the operation completed successfully or not.

System Tables

You can query the system tables described in this section to access schema information, information on data source functionality, and batch operation statistics.

Schema Tables

The following tables return database metadata for HDFS:

Data Source Tables

The following tables return information about how to connect to and query the data source:

  • sys_connection_props: Returns information on the available connection properties.
  • sys_sqlinfo: Describes the SELECT queries that the connector can offload to the data source.

Query Information Tables

The following table returns query statistics for data modification queries:

  • sys_identity: Returns information about batch operations or single updates.

sys_catalogs

Lists the available databases.

The following query retrieves all databases determined by the connection string:

SELECT * FROM sys_catalogs
Columns
Name Type Description
CatalogName String The database name.

sys_schemas

Lists the available schemas.

The following query retrieves all available schemas:

SELECT * FROM sys_schemas
Columns
Name Type Description
CatalogName String The database name.
SchemaName String The schema name.

sys_tables

Lists the available tables.

The following query retrieves the available tables and views:

SELECT * FROM sys_tables
Columns
Name Type Description
CatalogName String The database containing the table or view.
SchemaName String The schema containing the table or view.
TableName String The name of the table or view.
TableType String The table type (table or view).
Description String A description of the table or view.
IsUpdateable Boolean Whether the table can be updated.

sys_tablecolumns

Describes the columns of the available tables and views.

The following query returns the columns and data types for the Files table:

SELECT ColumnName, DataTypeName FROM sys_tablecolumns WHERE TableName='Files'
Columns
Name Type Description
CatalogName String The name of the database containing the table or view.
SchemaName String The schema containing the table or view.
TableName String The name of the table or view containing the column.
ColumnName String The column name.
DataTypeName String The data type name.
DataType Int32 An integer indicating the data type. This value is determined at run time based on the environment.
Length Int32 The storage size of the column.
DisplaySize Int32 The designated column's normal maximum width in characters.
NumericPrecision Int32 The maximum number of digits in numeric data. The column length in characters for character and date-time data.
NumericScale Int32 The column scale or number of digits to the right of the decimal point.
IsNullable Boolean Whether the column can contain null.
Description String A brief description of the column.
Ordinal Int32 The sequence number of the column.
IsAutoIncrement String Whether the column value is assigned in fixed increments.
IsGeneratedColumn String Whether the column is generated.
IsHidden Boolean Whether the column is hidden.
IsArray Boolean Whether the column is an array.
IsReadOnly Boolean Whether the column is read-only.
IsKey Boolean Indicates whether a field returned from sys_tablecolumns is the primary key of the table.

sys_procedures

Lists the available stored procedures.

The following query retrieves the available stored procedures:

SELECT * FROM sys_procedures
Columns
Name Type Description
CatalogName String The database containing the stored procedure.
SchemaName String The schema containing the stored procedure.
ProcedureName String The name of the stored procedure.
Description String A description of the stored procedure.
ProcedureType String The type of the procedure, such as PROCEDURE or FUNCTION.

sys_procedureparameters

Describes stored procedure parameters.

The following query returns information about all of the input parameters for the DownloadFile stored procedure:

SELECT * FROM sys_procedureparameters WHERE ProcedureName='DownloadFile' AND (Direction=1 OR Direction=2)
Columns
Name Type Description
CatalogName String The name of the database containing the stored procedure.
SchemaName String The name of the schema containing the stored procedure.
ProcedureName String The name of the stored procedure containing the parameter.
ColumnName String The name of the stored procedure parameter.
Direction Int32 An integer corresponding to the type of the parameter: input (1), input/output (2), or output (4). Input/output type parameters can be both input and output parameters.
DataTypeName String The name of the data type.
DataType Int32 An integer indicating the data type. This value is determined at run time based on the environment.
Length Int32 The number of characters allowed for character data. The number of digits allowed for numeric data.
NumericPrecision Int32 The maximum precision for numeric data. The column length in characters for character and date-time data.
NumericScale Int32 The number of digits to the right of the decimal point in numeric data.
IsNullable Boolean Whether the parameter can contain null.
IsRequired Boolean Whether the parameter is required for execution of the procedure.
IsArray Boolean Whether the parameter is an array.
Description String The description of the parameter.
Ordinal Int32 The index of the parameter.

sys_keycolumns

Describes the primary and foreign keys.

The following query retrieves the primary key for the Files table:

SELECT * FROM sys_keycolumns WHERE IsKey='True' AND TableName='Files'
Columns
Name Type Description
CatalogName String The name of the database containing the key.
SchemaName String The name of the schema containing the key.
TableName String The name of the table containing the key.
ColumnName String The name of the key column.
IsKey Boolean Whether the column is a primary key in the table referenced in the TableName field.
IsForeignKey Boolean Whether the column is a foreign key referenced in the TableName field.
PrimaryKeyName String The name of the primary key.
ForeignKeyName String The name of the foreign key.
ReferencedCatalogName String The database containing the primary key.
ReferencedSchemaName String The schema containing the primary key.
ReferencedTableName String The table containing the primary key.
ReferencedColumnName String The column name of the primary key.

sys_foreignkeys

Describes the foreign keys.

The following query retrieves all foreign keys which refer to other tables:

SELECT * FROM sys_foreignkeys WHERE ForeignKeyType = 'FOREIGNKEY_TYPE_IMPORT'
Columns
Name Type Description
CatalogName String The name of the database containing the key.
SchemaName String The name of the schema containing the key.
TableName String The name of the table containing the key.
ColumnName String The name of the key column.
PrimaryKeyName String The name of the primary key.
ForeignKeyName String The name of the foreign key.
ReferencedCatalogName String The database containing the primary key.
ReferencedSchemaName String The schema containing the primary key.
ReferencedTableName String The table containing the primary key.
ReferencedColumnName String The column name of the primary key.
ForeignKeyType String Designates whether the foreign key is an import (points to other tables) or export (referenced from other tables) key.

sys_primarykeys

Describes the primary keys.

The following query retrieves the primary keys from all tables and views:

SELECT * FROM sys_primarykeys
Columns
Name Type Description
CatalogName String The name of the database containing the key.
SchemaName String The name of the schema containing the key.
TableName String The name of the table containing the key.
ColumnName String The name of the key column.
KeySeq String The sequence number of the primary key.
KeyName String The name of the primary key.

sys_indexes

Describes the available indexes. By filtering on indexes, you can write more selective queries with faster query response times.

The following query retrieves all indexes that are not primary keys:

SELECT * FROM sys_indexes WHERE IsPrimary='false'
Columns
Name Type Description
CatalogName String The name of the database containing the index.
SchemaName String The name of the schema containing the index.
TableName String The name of the table containing the index.
IndexName String The index name.
ColumnName String The name of the column associated with the index.
IsUnique Boolean True if the index is unique. False otherwise.
IsPrimary Boolean True if the index is a primary key. False otherwise.
Type Int16 An integer value corresponding to the index type: statistic (0), clustered (1), hashed (2), or other (3).
SortOrder String The sort order: A for ascending or D for descending.
OrdinalPosition Int16 The sequence number of the column in the index.

sys_connection_props

Returns information on the available connection properties and those set in the connection string.

When querying this table, the config connection string should be used:

jdbc:cdata:hdfs:config:

This connection string enables you to query this table without a valid connection.

The following query retrieves all connection properties that have been set in the connection string or set through a default value:

SELECT * FROM sys_connection_props WHERE Value <> ''
Columns
Name Type Description
Name String The name of the connection property.
ShortDescription String A brief description.
Type String The data type of the connection property.
Default String The default value if one is not explicitly set.
Values String A comma-separated list of possible values. A validation error is thrown if another value is specified.
Value String The value you set or a preconfigured default.
Required Boolean Whether the property is required to connect.
Category String The category of the connection property.
IsSessionProperty String Whether the property is a session property, used to save information about the current connection.
Sensitivity String The sensitivity level of the property. This informs whether the property is obfuscated in logging and authentication forms.
PropertyName String A camel-cased truncated form of the connection property name.
Ordinal Int32 The index of the parameter.
CatOrdinal Int32 The index of the parameter category.
Hierarchy String Shows the dependent properties that need to be set alongside this one.
Visible Boolean Informs whether the property is visible in the connection UI.
ETC String Various miscellaneous information about the property.

sys_sqlinfo

Describes the SELECT query processing that the connector can offload to the data source.

Discovering the Data Source's SELECT Capabilities

Below is an example data set of SQL capabilities. Some aspects of SELECT functionality are returned in a comma-separated list if supported; otherwise, the column contains NO.

Name Description Possible Values
AGGREGATE_FUNCTIONS Supported aggregation functions. AVG, COUNT, MAX, MIN, SUM, DISTINCT
COUNT Whether COUNT function is supported. YES, NO
IDENTIFIER_QUOTE_OPEN_CHAR The opening character used to escape an identifier. [
IDENTIFIER_QUOTE_CLOSE_CHAR The closing character used to escape an identifier. ]
SUPPORTED_OPERATORS A list of supported SQL operators. =, >, <, >=, <=, <>, !=, LIKE, NOT LIKE, IN, NOT IN, IS NULL, IS NOT NULL, AND, OR
GROUP_BY Whether GROUP BY is supported, and, if so, the degree of support. NO, NO_RELATION, EQUALS_SELECT, SQL_GB_COLLATE
STRING_FUNCTIONS Supported string functions. LENGTH, CHAR, LOCATE, REPLACE, SUBSTRING, RTRIM, LTRIM, RIGHT, LEFT, UCASE, SPACE, SOUNDEX, LCASE, CONCAT, ASCII, REPEAT, OCTET, BIT, POSITION, INSERT, TRIM, UPPER, REGEXP, LOWER, DIFFERENCE, CHARACTER, SUBSTR, STR, REVERSE, PLAN, UUIDTOSTR, TRANSLATE, TRAILING, TO, STUFF, STRTOUUID, STRING, SPLIT, SORTKEY, SIMILAR, REPLICATE, PATINDEX, LPAD, LEN, LEADING, KEY, INSTR, INSERTSTR, HTML, GRAPHICAL, CONVERT, COLLATION, CHARINDEX, BYTE
NUMERIC_FUNCTIONS Supported numeric functions. ABS, ACOS, ASIN, ATAN, ATAN2, CEILING, COS, COT, EXP, FLOOR, LOG, MOD, SIGN, SIN, SQRT, TAN, PI, RAND, DEGREES, LOG10, POWER, RADIANS, ROUND, TRUNCATE
TIMEDATE_FUNCTIONS Supported date/time functions. NOW, CURDATE, DAYOFMONTH, DAYOFWEEK, DAYOFYEAR, MONTH, QUARTER, WEEK, YEAR, CURTIME, HOUR, MINUTE, SECOND, TIMESTAMPADD, TIMESTAMPDIFF, DAYNAME, MONTHNAME, CURRENT_DATE, CURRENT_TIME, CURRENT_TIMESTAMP, EXTRACT
REPLICATION_SKIP_TABLES Indicates tables skipped during replication.
REPLICATION_TIMECHECK_COLUMNS A string array containing the columns that are checked (in the given order) for use as a modified column during replication.
IDENTIFIER_PATTERN String value indicating what string is valid for an identifier.
SUPPORT_TRANSACTION Indicates if the provider supports transactions such as commit and rollback. YES, NO
DIALECT Indicates the SQL dialect to use.
KEY_PROPERTIES Indicates the properties which identify the uniform database.
SUPPORTS_MULTIPLE_SCHEMAS Indicates if multiple schemas may exist for the provider. YES, NO
SUPPORTS_MULTIPLE_CATALOGS Indicates if multiple catalogs may exist for the provider. YES, NO
DATASYNCVERSION The Data Sync version needed to access this driver. Standard, Starter, Professional, Enterprise
DATASYNCCATEGORY The Data Sync category of this driver. Source, Destination, Cloud Destination
SUPPORTSENHANCEDSQL Whether enhanced SQL functionality beyond what is offered by the API is supported. TRUE, FALSE
SUPPORTS_BATCH_OPERATIONS Whether batch operations are supported. YES, NO
SQL_CAP All supported SQL capabilities for this driver. SELECT, INSERT, DELETE, UPDATE, TRANSACTIONS, ORDERBY, OAUTH, ASSIGNEDID, LIMIT, LIKE, BULKINSERT, COUNT, BULKDELETE, BULKUPDATE, GROUPBY, HAVING, AGGS, OFFSET, REPLICATE, COUNTDISTINCT, JOINS, DROP, CREATE, DISTINCT, INNERJOINS, SUBQUERIES, ALTER, MULTIPLESCHEMAS, GROUPBYNORELATION, OUTERJOINS, UNIONALL, UNION, UPSERT, GETDELETED, CROSSJOINS, GROUPBYCOLLATE, MULTIPLECATS, FULLOUTERJOIN, MERGE, JSONEXTRACT, BULKUPSERT, SUM, SUBQUERIESFULL, MIN, MAX, JOINSFULL, XMLEXTRACT, AVG, MULTISTATEMENTS, FOREIGNKEYS, CASE, LEFTJOINS, COMMAJOINS, WITH, LITERALS, RENAME, NESTEDTABLES, EXECUTE, BATCH, BASIC, INDEX
PREFERRED_CACHE_OPTIONS A string value specifies the preferred cacheOptions.
ENABLE_EF_ADVANCED_QUERY Indicates if the driver directly supports advanced queries coming from Entity Framework. If not, queries will be handled client side. YES, NO
PSEUDO_COLUMNS A string array indicating the available pseudo columns.
MERGE_ALWAYS If the value is true, the Merge Mode is forcibly executed in Data Sync. TRUE, FALSE
REPLICATION_MIN_DATE_QUERY A select query to return the replicate start datetime.
REPLICATION_MIN_FUNCTION Allows a provider to specify the formula name to use for executing a server side min.
REPLICATION_START_DATE Allows a provider to specify a replicate startdate.
REPLICATION_MAX_DATE_QUERY A select query to return the replicate end datetime.
REPLICATION_MAX_FUNCTION Allows a provider to specify the formula name to use for executing a server side max.
IGNORE_INTERVALS_ON_INITIAL_REPLICATE A list of tables which will skip dividing the replicate into chunks on the initial replicate.
CHECKCACHE_USE_PARENTID Indicates whether the CheckCache statement should be done against the parent key column. TRUE, FALSE
CREATE_SCHEMA_PROCEDURES Indicates stored procedures that can be used for generating schema files.

The following query retrieves the operators that can be used in the WHERE clause:

SELECT * FROM sys_sqlinfo WHERE Name = 'SUPPORTED_OPERATORS'

Note that individual tables may have different limitations or requirements on the WHERE clause; refer to the Data Model section for more information.

Columns
Name Type Description
NAME String A component of SQL syntax, or a capability that can be processed on the server.
VALUE String Detail on the supported SQL or SQL syntax.

sys_identity

Returns information about attempted modifications.

The following query retrieves the Ids of the modified rows in a batch operation:

SELECT * FROM sys_identity
Columns
Name Type Description
Id String The database-generated ID returned from a data modification operation.
Batch String An identifier for the batch. 1 for a single operation.
Operation String The result of the operation in the batch: INSERTED, UPDATED, or DELETED.
Message String SUCCESS or an error message if the update in the batch failed.

sys_information

Describes the available system information.

The following query retrieves all columns:

SELECT * FROM sys_information
Columns
Name Type Description
Product String The name of the product.
Version String The version number of the product.
Datasource String The name of the datasource the product connects to.
NodeId String The unique identifier of the machine where the product is installed.
HelpURL String The URL to the product's help documentation.
License String The license information for the product. (If this information is not available, the field may be left blank or marked as 'N/A'.)
Location String The file path location where the product's library is stored.
Environment String The version of the environment or runtime the product is currently running under.
DataSyncVersion String The tier of Sync required to use this connector.
DataSyncCategory String The category of Sync functionality (e.g., Source, Destination).

Advanced Configurations Properties

The advanced configurations properties are the various options that can be used to establish a connection. This section provides a complete list of the options you can configure. Click the links for further details.

Authentication

Property Description
AuthScheme The scheme used for authentication. Accepted entries are None, Negotiate (Kerberos), and Token. None is the default.
Host This property specifies the host of your HDFS installation.
Port This property specifies the port of your HDFS installation.
User The user name to login to the HDFS server.
Password The password used to authenticate to the HDFS server. Only used when Kerberos authentication is selected.
AccessToken The HDFS Access Token.
UseSSL This field sets whether SSL is enabled.

Connection

Property Description
Path This property specifies the HDFS path which will be used as the working directory.
DirectoryRetrievalDepth Limit the subfolders recursively scanned.

Kerberos

Property Description
KerberosKDC The Kerberos Key Distribution Center (KDC) service used to authenticate the user.
KerberosRealm The Kerberos Realm used to authenticate the user.
KerberosSPN The service principal name (SPN) for the Kerberos Domain Controller.
KerberosKeytabFile The Keytab file containing your pairs of Kerberos principals and encrypted keys.
KerberosServiceRealm The Kerberos realm of the service.
KerberosServiceKDC The Kerberos KDC of the service.
KerberosTicketCache The full file path to an MIT Kerberos credential cache file.

SSL

Property Description
SSLServerCert The certificate to be accepted from the server when connecting using TLS/SSL.

Schema

Property Description
Location A path to the directory that contains the schema files defining tables, views, and stored procedures.
BrowsableSchemas This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA, SchemaB, SchemaC.
Tables This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA, TableB, TableC.
Views Restricts the views reported to a subset of the available views. For example, Views=ViewA, ViewB, ViewC.

Miscellaneous

Property Description
MaxRows Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This takes precedence over LIMIT clauses.
Other These hidden properties are used only in specific use cases.
PseudoColumns This property indicates whether or not to include pseudo columns as columns to the table.
Timeout The value in seconds until the timeout error is thrown, canceling the operation.
UserDefinedViews A filepath pointing to the JSON configuration file containing your custom views.

Authentication

This section provides a complete list of authentication properties you can configure.

Property Description
AuthScheme The scheme used for authentication. Accepted entries are None, Negotiate (Kerberos), and Token. None is the default.
Host This property specifies the host of your HDFS installation.
Port This property specifies the port of your HDFS installation.
User The user name to login to the HDFS server.
Password The password used to authenticate to the HDFS server. Only used when Kerberos authentication is selected.
AccessToken The HDFS Access Token.
UseSSL This field sets whether SSL is enabled.

AuthScheme

The scheme used for authentication. Accepted entries are None, Negotiate (Kerberos), and Token. None is the default.

Possible Values

None, Negotiate, Token

Data Type

string

Default Value

None

Remarks

This field is used to authenticate against the server. Use the following options to select your authentication scheme:

  • None: Set this to use anonymous authentication and connect to the HDFS data source without specifying the user credentials.
  • Negotiate: If AuthScheme is set to Negotiate, the connector will negotiate an authentication mechanism with the server. Set AuthScheme to Negotiate if you want to use Kerberos authentication.
  • Token: Set this to authenticate using an AccessToken.
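
For example, a sketch of a token-based connection string (the token value is a placeholder):

jdbc:cdata:hdfs:AuthScheme=Token;AccessToken=myaccesstoken;Host=namenode.example.com;Port=50070;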

Host

This property specifies the host of your HDFS installation.

Data Type

string

Default Value

""

Remarks

This property specifies the host of your HDFS installation.

Port

This property specifies the port of your HDFS installation.

Data Type

string

Default Value

50070

Remarks

This property specifies the port of your HDFS installation.

User

The user name to login to the HDFS server.

Data Type

string

Default Value

""

Remarks

The user name to login to the HDFS server. If AuthScheme=None, it is used as the authenticated user. If AuthScheme=Negotiate, it is used in the Kerberos Authentication as a client principal.

Password

The password used to authenticate to the HDFS server. Only used when Kerberos authentication is selected.

Data Type

string

Default Value

""

Remarks

The password used to authenticate to the HDFS server. Only used when Kerberos authentication is selected.

AccessToken

The HDFS Access Token.

Data Type

string

Default Value

""

Remarks

The HDFS Access Token used to authenticate the requests.

UseSSL

This field sets whether SSL is enabled.

Data Type

bool

Default Value

false

Remarks

This field sets whether the connector will attempt to negotiate TLS/SSL connections to the server. By default, the connector checks the server's certificate against the system's trusted certificate store. To specify another certificate, set SSLServerCert.

Connection

This section provides a complete list of connection properties you can configure.

Property Description
Path This property specifies the HDFS path which will be used as the working directory.
DirectoryRetrievalDepth Limit the subfolders recursively scanned.

Path

This property specifies the HDFS path which will be used as the working directory.

Data Type

string

Default Value

""

Remarks

This property specifies the HDFS path which will be used as the working directory. It is used in the Files and Permissions views.

DirectoryRetrievalDepth

Limit the subfolders recursively scanned.

Data Type

int

Default Value

-1

Remarks

DirectoryRetrievalDepth specifies how many subfolders will be recursively scanned before stopping. -1 specifies that all subfolders are scanned. 0 specifies that only the current folder will be scanned for items.

Kerberos

This section provides a complete list of Kerberos properties you can configure.

Property Description
KerberosKDC The Kerberos Key Distribution Center (KDC) service used to authenticate the user.
KerberosRealm The Kerberos Realm used to authenticate the user.
KerberosSPN The service principal name (SPN) for the Kerberos Domain Controller.
KerberosKeytabFile The Keytab file containing your pairs of Kerberos principals and encrypted keys.
KerberosServiceRealm The Kerberos realm of the service.
KerberosServiceKDC The Kerberos KDC of the service.
KerberosTicketCache The full file path to an MIT Kerberos credential cache file.

KerberosKDC

The Kerberos Key Distribution Center (KDC) service used to authenticate the user.

Data Type

string

Default Value

""

Remarks

The Kerberos properties are used when using SPNEGO or Windows Authentication. The connector will request session tickets and temporary session keys from the Kerberos KDC service. The Kerberos KDC service is conventionally colocated with the domain controller.

If Kerberos KDC is not specified, the connector will attempt to detect these properties automatically from the following locations:

  • KRB5 Config File (krb5.ini/krb5.conf): If the KRB5_CONFIG environment variable is set and the file exists, the connector will obtain the KDC from the specified file. Otherwise, it will attempt to read from the default MIT location based on the OS: C:\ProgramData\MIT\Kerberos5\krb5.ini (Windows) or /etc/krb5.conf (Linux).
  • Java System Properties: Using the system properties java.security.krb5.realm and java.security.krb5.kdc.
  • Domain Name and Host: If the Kerberos Realm and Kerberos KDC could not be inferred from another location, the connector will infer them from the configured domain name and host.

Note

Windows authentication is supported in JRE 1.6 and above only.

KerberosRealm

The Kerberos Realm used to authenticate the user.

Data Type

string

Default Value

""

Remarks

The Kerberos properties are used when using SPNEGO or Windows Authentication. The Kerberos Realm is used to authenticate the user with the Kerberos Key Distribution Service (KDC). The Kerberos Realm can be configured by an administrator to be any string, but conventionally it is based on the domain name.

If Kerberos Realm is not specified, the connector will attempt to detect these properties automatically from the following locations:

  • KRB5 Config File (krb5.ini/krb5.conf): If the KRB5_CONFIG environment variable is set and the file exists, the connector will obtain the default realm from the specified file. Otherwise, it will attempt to read from the default MIT location based on the OS: C:\ProgramData\MIT\Kerberos5\krb5.ini (Windows) or /etc/krb5.conf (Linux).
  • Java System Properties: Using the system properties java.security.krb5.realm and java.security.krb5.kdc.
  • Domain Name and Host: If the Kerberos Realm and Kerberos KDC could not be inferred from another location, the connector will infer them from the user-configured domain name and host. This might work in some Windows environments.

Note

Kerberos-based authentication is supported in JRE 1.6 and above only.

KerberosSPN

The service principal name (SPN) for the Kerberos Domain Controller.

Data Type

string

Default Value

""

Remarks

If the SPN on the Kerberos Domain Controller is not the same as the URL that you are authenticating to, use this property to set the SPN.

KerberosKeytabFile

The Keytab file containing your pairs of Kerberos principals and encrypted keys.

Data Type

string

Default Value

""

Remarks

The Keytab file containing your pairs of Kerberos principals and encrypted keys.

KerberosServiceRealm

The Kerberos realm of the service.

Data Type

string

Default Value

""

Remarks

The KerberosServiceRealm is used to specify the service Kerberos realm when using cross-realm Kerberos authentication.

In most cases, a single realm and KDC machine are used to perform the Kerberos authentication and this property is not required.

This property is available for complex setups where a different realm and KDC machine are used to obtain an authentication ticket (AS request) and a service ticket (TGS request).

KerberosServiceKDC

The Kerberos KDC of the service.

Data Type

string

Default Value

""

Remarks

The KerberosServiceKDC is used to specify the service Kerberos KDC when using cross-realm Kerberos authentication.

In most cases, a single realm and KDC machine are used to perform the Kerberos authentication and this property is not required.

This property is available for complex setups where a different realm and KDC machine are used to obtain an authentication ticket (AS request) and a service ticket (TGS request).

KerberosTicketCache

The full file path to an MIT Kerberos credential cache file.

Data Type

string

Default Value

""

Remarks

This property can be set if you wish to use a credential cache file that was created using the MIT Kerberos Ticket Manager or kinit command.

SSL

This section provides a complete list of SSL properties you can configure.

Property Description
SSLServerCert The certificate to be accepted from the server when connecting using TLS/SSL.

SSLServerCert

The certificate to be accepted from the server when connecting using TLS/SSL.

Data Type

string

Default Value

""

Remarks

If using a TLS/SSL connection, this property can be used to specify the TLS/SSL certificate to be accepted from the server. Any other certificate that is not trusted by the machine is rejected.

This property can take the following forms:

Description Example
A full PEM Certificate (example shortened for brevity) -----BEGIN CERTIFICATE----- MIIChTCCAe4CAQAwDQYJKoZIhv......Qw== -----END CERTIFICATE-----
A path to a local file containing the certificate C:\\cert.cer
The public key (example shortened for brevity) -----BEGIN RSA PUBLIC KEY----- MIGfMA0GCSq......AQAB -----END RSA PUBLIC KEY-----
The MD5 Thumbprint (hex values can also be either space or colon separated) ecadbdda5a1529c58a1e9e09828d70e4
The SHA1 Thumbprint (hex values can also be either space or colon separated) 34a929226ae0819f2ec14b4a3d904f801cbb150d

If not specified, any certificate trusted by the machine is accepted.

Certificates are validated as trusted by the machine based on the System's trust store. The trust store used is the 'javax.net.ssl.trustStore' value specified for the system. If no value is specified for this property, Java's default trust store is used (for example, JAVA_HOME\lib\security\cacerts).

Use '*' to signify to accept all certificates. Note that this is not recommended due to security concerns.

Schema

This section provides a complete list of schema properties you can configure.

Property Description
Location A path to the directory that contains the schema files defining tables, views, and stored procedures.
BrowsableSchemas This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA, SchemaB, SchemaC.
Tables This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA, TableB, TableC.
Views Restricts the views reported to a subset of the available views. For example, Views=ViewA, ViewB, ViewC.

Location

A path to the directory that contains the schema files defining tables, views, and stored procedures.

Data Type

string

Default Value

%APPDATA%\HDFS Data Provider\Schema

Remarks

The path to a directory which contains the schema files for the connector (.rsd files for tables and views, .rsb files for stored procedures). The folder location can be a relative path from the location of the executable. The Location property is only needed if you want to customize definitions (for example, change a column name, ignore a column, and so on) or extend the data model with new tables, views, or stored procedures.

If left unspecified, the default location is "%APPDATA%\HDFS Data Provider\Schema" with %APPDATA% being set to the user's configuration directory:

Platform %APPDATA%
Windows The value of the APPDATA environment variable
Mac ~/Library/Application Support
Linux ~/.config

BrowsableSchemas

This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA,SchemaB,SchemaC.

Data Type

string

Default Value

""

Remarks

Listing the schemas from databases can be expensive. Providing a list of schemas in the connection string improves the performance.

Tables

This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.

Data Type

string

Default Value

""

Remarks

Listing the tables from some databases can be expensive. Providing a list of tables in the connection string improves the performance of the connector.

This property can also be used as an alternative to automatically listing the tables if you already know which ones you want to work with and there would otherwise be too many to work with.

Specify the tables you want in a comma-separated list. Each table should be a valid SQL identifier with any special characters escaped using square brackets, double-quotes or backticks. For example, Tables=TableA,[TableB/WithSlash],WithCatalog.WithSchema.`TableC With Space`.

Note that when connecting to a data source with multiple schemas or catalogs, you will need to provide the fully qualified name of the table in this property, as in the last example here, to avoid ambiguity between tables that exist in multiple catalogs or schemas.

Views

Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.

Data Type

string

Default Value

""

Remarks

Listing the views from some databases can be expensive. Providing a list of views in the connection string improves the performance of the connector.

This property can also be used as an alternative to automatically listing views if you already know which ones you want to work with and there would otherwise be too many to work with.

Specify the views you want in a comma-separated list. Each view should be a valid SQL identifier with any special characters escaped using square brackets, double-quotes or backticks. For example, Views=ViewA,[ViewB/WithSlash],WithCatalog.WithSchema.`ViewC With Space`.

Note that when connecting to a data source with multiple schemas or catalogs, you will need to provide the fully qualified name of the table in this property, as in the last example here, to avoid ambiguity between tables that exist in multiple catalogs or schemas.

Miscellaneous

This section provides a complete list of miscellaneous properties you can configure.

Property Description
MaxRows Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This takes precedence over LIMIT clauses.
Other These hidden properties are used only in specific use cases.
PseudoColumns This property indicates whether or not to include pseudo columns as columns to the table.
Timeout The value in seconds until the timeout error is thrown, canceling the operation.
UserDefinedViews A filepath pointing to the JSON configuration file containing your custom views.

MaxRows

Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This takes precedence over LIMIT clauses.

Data Type

int

Default Value

-1

Remarks

Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This takes precedence over LIMIT clauses.

Other

These hidden properties are used only in specific use cases.

Data Type

string

Default Value

""

Remarks

The properties listed below are available for specific use cases. Normal driver use cases and functionality should not require these properties.

Specify multiple properties in a semicolon-separated list.

Integration and Formatting
Property Description
DefaultColumnSize Sets the default length of string fields when the data source does not provide column length in the metadata. The default value is 2000.
ConvertDateTimeToGMT Determines whether to convert date-time values to GMT, instead of the local time of the machine.
RecordToFile=filename Records the underlying socket data transfer to the specified file.
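
For example, a sketch of how multiple entries might be combined into a single semicolon-separated Other value (exact quoting can depend on the tool used to set connection properties; the values shown are illustrative):

Other='DefaultColumnSize=4000;ConvertDateTimeToGMT=True'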

PseudoColumns

This property indicates whether or not to include pseudo columns as columns to the table.

Data Type

string

Default Value

""

Remarks

This setting is particularly helpful in Entity Framework, which does not allow you to set a value for a pseudo column unless it is a table column. The value of this connection setting is of the format "Table1=Column1, Table1=Column2, Table2=Column3". You can use the "*" character to include all tables and all columns; for example, "*=*".

Timeout

The value in seconds until the timeout error is thrown, canceling the operation.

Data Type

int

Default Value

60

Remarks

If Timeout = 0, operations do not time out. The operations run until they complete successfully or until they encounter an error condition.

If Timeout expires and the operation is not yet complete, the connector throws an exception.

UserDefinedViews

A filepath pointing to the JSON configuration file containing your custom views.

Data Type

string

Default Value

""

Remarks

User Defined Views are defined in a JSON-formatted configuration file called UserDefinedViews.json. The connector automatically detects the views specified in this file.

You can also have multiple view definitions and control them using the UserDefinedViews connection property. When you use this property, only the specified views are seen by the connector.

This User Defined View configuration file is formatted as follows:

  • Each root element defines the name of a view.
  • Each root element contains a child element, called query, which contains the custom SQL query for the view.

For example:

{
    "MyView": {
        "query": "SELECT * FROM Files WHERE MyColumn = 'value'"
    },
    "MyView2": {
        "query": "SELECT * FROM MyTable WHERE Id IN (1,2,3)"
    }
}

Use the UserDefinedViews connection property to specify the location of your JSON configuration file. For example:

"UserDefinedViews", C:\Users\yourusername\Desktop\tmp\UserDefinedViews.json

Note that the specified path is not embedded in quotation marks.