Table of Contents

  1. Preface
  2. Introduction to PowerExchange for Amazon S3
  3. PowerExchange for Amazon S3 Configuration Overview
  4. Amazon S3 Connections
  5. PowerExchange for Amazon S3 Data Objects
  6. PowerExchange for Amazon S3 Mappings
  7. PowerExchange for Amazon S3 Lookups
  8. Appendix A: Amazon S3 Data Type Reference
  9. Appendix B: Troubleshooting

PowerExchange for Amazon S3 User Guide

Creating an Amazon S3 Data Object

Create an Amazon S3 data object to add to a mapping.
PowerExchange for Amazon S3 supports only UTF-8 encoding to read or write data.
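Because only UTF-8 is supported, files in another encoding must be converted before they are staged in Amazon S3. A minimal sketch of such a conversion, outside the Developer tool (the source encoding `latin-1` is an assumption for illustration; substitute the encoding your files actually use):

```python
# Re-encode a file to UTF-8 before staging it in Amazon S3.
# The source encoding (latin-1 here) is an assumption for illustration.

def reencode_to_utf8(src_path: str, dst_path: str,
                     src_encoding: str = "latin-1") -> None:
    # Decode with the source encoding, then write back as UTF-8.
    with open(src_path, "r", encoding=src_encoding) as src:
        data = src.read()
    with open(dst_path, "w", encoding="utf-8") as dst:
        dst.write(data)
```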
  1. Select a project or folder in the Object Explorer view.
  2. Click File > New > Data Object.
  3. Select Amazon S3 Data Object and click Next.
    The Amazon S3 Data Object dialog box appears.
  4. Enter a name for the data object.
  5. In the Resource Format list, select one of the following formats:
    • Intelligent Structure Model: to read any format that an intelligent structure parses.
    • Binary: to read any resource format.
    • Flat: to read a flat resource.
    • Avro: to read an Avro resource.
    • ORC: to read an ORC resource.
    • JSON: to read a JSON resource.
    • Parquet: to read a Parquet resource.
  6. Click Browse next to the Location option and select the target project or folder.
  7. Click Browse next to the Connection option and select the Amazon S3 connection from which you want to import the Amazon S3 object.
  8. To add a resource, click Add next to the Selected Resources option.
    The Add Resource dialog box appears.
  9. Select the Amazon S3 object and click OK.
    To use an intelligent structure model, select the appropriate .amodel file.
  10. Click Next.
  11. Configure the format properties.
    • Delimiters: Character used to separate columns of data. If you enter a delimiter that is the same as the escape character or the text qualifier, you might receive unexpected results. You cannot specify a multibyte character as a delimiter. Supported by the Amazon S3 reader and writer.
    • Text Qualifier: Quote character that defines the boundaries of text strings. If you select a quote character, the Developer tool ignores delimiters within pairs of quotes. Supported by the Amazon S3 reader.
    • Import Column Names From First Line: If selected, the Developer tool uses data in the first row for column names. Select this option if column names appear in the first row. The Developer tool prefixes "FIELD_" to field names that are not valid. Supported by the Amazon S3 reader and writer.
    • Row Delimiter: Line break character. Select from the list or enter a character. Preface an octal code with a backslash (\). To use a single character, enter the character. The Data Integration Service uses only the first character when the entry is not preceded by a backslash. The character must be a single-byte character, and no other character in the code page can contain that byte. Default is line feed, \012 LF (\n).
    • Escape Character: Character immediately preceding a column delimiter character embedded in an unquoted string, or immediately preceding the quote character in a quoted string. When you specify an escape character, the Data Integration Service reads the delimiter character as a regular character.
    The Start import at line, Treat consecutive delimiters as one, and Retain escape character in data properties in the Column Projection dialog box are not applicable for PowerExchange for Amazon S3.
  12. Click Next to preview the flat file data object.
  13. Click Finish.
    The data object appears under the Physical Data Objects category in the project or folder in the Object Explorer view. When you create an Amazon S3 data object, the value of the folder path is displayed incorrectly in the Resources tab. Read and write operations are created for the data object. Depending on whether you want to use the Amazon S3 data object as a source or target, you can edit the read or write operation properties.
    Select a read transformation for a data object with an intelligent structure model. You cannot use a write transformation for a data object with an intelligent structure model in a mapping.
  14. For a read operation with an intelligent structure model, specify the path to the input file or folder: in the Data Object Operations panel, select the Advanced tab, and in the File path field, enter the path.
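The binary resource formats listed in step 5 can be recognized by their published file signatures. A small sketch, outside the Developer tool, that guesses a resource's format from its leading bytes (Parquet files begin with `PAR1`, ORC files with `ORC`, and Avro object container files with `Obj` followed by the byte 1):

```python
# Guess a resource's format from its leading bytes, using the
# published magic numbers for each format.

def sniff_format(header: bytes) -> str:
    if header.startswith(b"PAR1"):
        return "Parquet"
    if header.startswith(b"ORC"):
        return "ORC"
    if header.startswith(b"Obj\x01"):
        return "Avro"
    # Flat and JSON resources are plain text and have no magic number.
    return "unknown"
```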
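The interaction of the format properties in step 11 can be previewed with Python's standard csv module (an analogy for illustration, not the Data Integration Service's parser): the delimiter separates columns, the text qualifier corresponds to `quotechar`, the first line carries the column names, and a delimiter inside a qualified string is kept as data:

```python
import csv
import io

# One header line plus one data row; the second field contains the
# delimiter (a comma) inside the text qualifier (double quotes).
raw = 'id,name\n1,"Doe, Jane"\n'

reader = csv.reader(io.StringIO(raw), delimiter=",", quotechar='"')
header, row = list(reader)
# The comma inside the quoted field is treated as data, not as a
# column delimiter.
```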
