
Data Source Support

Oracle Connector

Oracle Database (commonly referred to as Oracle RDBMS or simply as Oracle) is a multi-model database management system produced and marketed by Oracle Corporation. The following table lists the versions that have been tested in the lab setup:

Platform: Linux

Tested versions:

  • Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production - AWS

  • Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production - GCP

Privilege requirements:

  • The user on the source database must have SELECT privileges.

  • The user on the target database must have ALL privileges and SELECT_CATALOG_ROLE.

Supported Data Types

The following are the different data types that are tested in our lab setup:

  • VARCHAR

  • VARCHAR2

  • NUMBER

  • FLOAT

  • DATE

  • TIMESTAMP (default)

  • CLOB

  • BLOB (with text)

Hyperscale Compliance restricts the support of the following special characters in a database column name: ~!@#$%^&*()\"?:;,/`+=[]{}|<>'-.
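A hypothetical pre-flight check (not part of the product) can flag column names containing unsupported characters before a job runs. The character set below is copied from the restriction above; the function name is illustrative.

```python
# Characters Hyperscale Compliance does not support in column names
# (copied from the documented restriction above).
RESTRICTED = set("~!@#$%^&*()\\\"?:;,/`+=[]{}|<>'-.")

def column_name_supported(name: str) -> bool:
    """Return True when the column name avoids every restricted character."""
    return not any(ch in RESTRICTED for ch in name)

print(column_name_supported("CUSTOMER_ID"))  # True: letters, digits, _ are fine
print(column_name_supported("AMOUNT($)"))    # False: ( ) $ are restricted
```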

Property values

Property                               Value
SKIP.LOAD.SPLIT.COUNT.VALIDATION       false
SKIP.UNLOAD.SPLIT.COUNT.VALIDATION     false

For default values, see Configuration settings.

MS SQL Connector

Supported versions

Microsoft SQL Server 2019

Supported data types

The following are the different data types that are tested in our lab setup:

  • VARCHAR

  • CHAR

  • DATETIME

  • INT

  • TEXT

  • XML (only unload/load)

  • VARBINARY (only unload/load)

  • SMALLINT

  • SMALLMONEY

  • MONEY

  • BIGINT

  • NVARCHAR

  • TINYINT

  • NUMERIC(X,Y)

  • DECIMAL(X,Y)

  • FLOAT

  • NCHAR

  • BIT

  • NTEXT

Property values

Property                               Value
SKIP.LOAD.SPLIT.COUNT.VALIDATION       false
SKIP.UNLOAD.SPLIT.COUNT.VALIDATION     false

For default values, see Configuration settings.

Known Limitations

  • If the masked data produced by the applied algorithm exceeds the maximum value range of the corresponding target table column's data type, the job execution will fail in the load service.

  • Schema, table, and column names containing special characters are not supported.

  • Masking of columns with VARBINARY datatype is not supported.

  • Hyperscale Compliance can mask a maximum of 1,000 tables in a single job.
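The first limitation can be illustrated with a small sketch: a masked value that overflows the target column's data type range causes the load step to fail. The ranges below are the documented SQL Server integer limits; the check itself is hypothetical and not part of the product.

```python
# Documented SQL Server integer type ranges.
SQL_RANGES = {
    "TINYINT": (0, 255),
    "SMALLINT": (-2**15, 2**15 - 1),
    "INT": (-2**31, 2**31 - 1),
    "BIGINT": (-2**63, 2**63 - 1),
}

def fits(value: int, sql_type: str) -> bool:
    """Return True when a masked value fits the target column's range."""
    lo, hi = SQL_RANGES[sql_type]
    return lo <= value <= hi

# A mask that maps 42 -> 300 overflows a TINYINT target column, so the
# load service would fail the job; a SMALLINT column would accept it.
print(fits(300, "TINYINT"))   # False
print(fits(300, "SMALLINT"))  # True
```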

Delimited Files Connector

The connector can be used to mask large delimited files. The delimited unload service splits the large files into smaller chunks and passes them on to the masking service. After masking is completed, the files are sent to the load service, which joins the split files back together (the end user can also choose to disable the join operation).
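The split-and-join flow can be sketched in plain Python (the real services use PyArrow and their own chunking rules; the function names and chunk size here are illustrative):

```python
import itertools

def split_rows(rows, chunk_size):
    """Split rows into fixed-size chunks, as the unload service does
    before handing them to the masking service."""
    it = iter(rows)
    while chunk := list(itertools.islice(it, chunk_size)):
        yield chunk

def join_chunks(chunks):
    """Re-join masked chunks in their original order, as the load
    service does when the join operation is enabled."""
    for chunk in chunks:
        yield from chunk

data = [[str(i), f"user{i}"] for i in range(1, 6)]
chunks = list(split_rows(data, chunk_size=2))
print(len(chunks))                        # 3 chunks: sizes 2, 2, 1
assert list(join_chunks(chunks)) == data  # join restores the original rows
```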

Pre-requisites

  • The source and target (NFS) locations must be mounted onto the Docker containers of the unload and load services. Note that the paths as they appear inside the containers are what must be used when creating the connector-infos using the controller.

    # As an example
    unload-service:
         image: delphix-delimited-unload-service-app:${VERSION}
         ...
         volumes:
              ...
              - /path/to/nfs/mounted/source1/files:/mnt/source1
              - /path/to/nfs/mounted/source2/files:/mnt/source2
    ...
    load-service:
         image: delphix-delimited-load-service-app:${VERSION}
         ...
         volumes:
              ...
              - /path/to/nfs/mounted/target1/files:/mnt/target1
              - /path/to/nfs/mounted/target2/files:/mnt/target2

Property values

Property                           Value
SOURCE_KEY_FIELD_NAMES             unique_source_files_identifier
LOAD_SERVICE_REQUIRE_POST_LOAD     false

SOURCE_KEY_FIELD_NAMES is a mandatory change for the Delimited Files Connector and is set in the docker-compose file; LOAD_SERVICE_REQUIRE_POST_LOAD is edited in the .env file.

For default values, see Configuration settings.
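A sketch of where the two settings above live. The values are taken from the table; the exact placement inside the docker-compose service definition is an assumption, so check your own compose file before editing.

```
# docker-compose file (delimited unload service) -- placement is illustrative
environment:
  - SOURCE_KEY_FIELD_NAMES=unique_source_files_identifier

# .env file
LOAD_SERVICE_REQUIRE_POST_LOAD=false
```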

Supported data types

The following are the different data types that are tested in our lab setup:

  • String/Text

  • Double

    • Columns with values such as 36377974237282886994505 get converted to type double with value 3.637e22 by PyArrow. To avoid any loss of data, the Delimited Files Connector converts all double interpretations to strings.

    • In this case, 36377974237282886994505 will be converted to "36377974237282886994505".

  • Int64

    • Columns with values such as 00009435304391722556805 get inferred as type Int64, but the conversion fails for this value.

    • To mitigate this, all Int64 types are converted to type String. In this case, 00009435304391722556805 will be converted to "00009435304391722556805".

  • Timestamp
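The precision issues behind these conversions can be reproduced in plain Python, since Python floats are the same IEEE-754 doubles PyArrow uses (both values are taken from the examples above):

```python
raw_double = "36377974237282886994505"

# Parsing as a double rounds to the nearest representable value, so the
# original digits cannot be recovered afterwards.
assert int(float(raw_double)) != int(raw_double)

# The Int64 example exceeds the signed 64-bit maximum, which is why the
# Int64 conversion fails for it (Python ints are unbounded, so we can
# still compare against the Int64 limit here).
raw_int = "00009435304391722556805"
assert int(raw_int) > 2**63 - 1

# Keeping the value as a string preserves every digit, including the
# leading zeros that an integer conversion would drop.
assert str(int(raw_int)) != raw_int
```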

Known Limitations

The backend technology used to perform the split and join operations is PyArrow, which comes with certain limitations:

  1. It supports only a single-character delimiter.

  2. The end-of-record character can only be `\n`, `\r`, or `\r\n`.

The output files will have all string types quoted with double quotes (") only.
