tokenize

The tokenize function in XQuery is used to split a string into a sequence of substrings based on a specified regular expression pattern that serves as a delimiter. This function is particularly useful for parsing and separating components of a string for further processing.
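For example, the following expression (a minimal sketch; the $csv variable is only illustrative) splits a comma-separated string into its individual values:
let $csv := 'red,green,blue'
return fn:tokenize($csv, ',')
(: returns ('red', 'green', 'blue') :)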

Syntax

fn:tokenize(input, pattern, flags)
The following arguments apply to this function:
input (Required)
The input string to be tokenized. If this argument is an empty sequence (()), it is treated as an empty string.

pattern (Required)
A regular expression pattern that specifies the delimiter by which the input string will be split. The pattern should be a valid regular expression according to the XML Schema definition.

flags (Optional)
A string of flags that modify the behavior of the regular expression as follows:
  • s: Single-line mode, where the dot (.) matches all characters, including newline.
  • m: Multi-line mode, where the start (^) and end ($) anchors match the start and end of any line.
  • i: Case-insensitive mode.
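For example, the following call (an illustrative sketch, not taken from the examples below) uses the i flag so that both lowercase and uppercase delimiters are matched:
fn:tokenize('1x2X3', 'x', 'i')
(: returns ('1', '2', '3') because both 'x' and 'X' match the case-insensitive pattern :)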

Return Value

Returns a sequence of strings obtained by partitioning the input string at the delimiters matched by the pattern. If the input string is a zero-length string or an empty sequence, the function returns an empty sequence. If the pattern does not match anywhere in the input string, the function returns the entire input string as a single-item sequence.
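Because the return value is a sequence, it can be consumed directly in a FLWOR expression. The following sketch (the $word variable and the token element are only illustrative) wraps each token in an element:
for $word in fn:tokenize('alpha beta gamma', '\s+')
return <token>{ $word }</token>
(: returns <token>alpha</token>, <token>beta</token>, <token>gamma</token> :)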

Examples

The following table lists some sample values and return values:
SAMPLE FUNCTION                                  OUTPUT
fn:tokenize('a b c', '\s')                       ('a', 'b', 'c')
fn:tokenize('a   b c', '\s')                     ('a', '', '', 'b', 'c')
fn:tokenize('a   b c', '\s+')                    ('a', 'b', 'c')
fn:tokenize(' b c', '\s')                        ('', 'b', 'c')
fn:tokenize('a,b,c', ',')                        ('a', 'b', 'c')
fn:tokenize('a,b,,c', ',')                       ('a', 'b', '', 'c')
fn:tokenize('a, b, c', '[,\s]+')                 ('a', 'b', 'c')
fn:tokenize('2006-12-25T12:15:00', '[\-T:]')     ('2006', '12', '25', '12', '15', '00')
fn:tokenize('Hello, there.', '\W+')              ('Hello', 'there', '')
fn:tokenize((), '\s+')                           ()
fn:tokenize('abc', '\s')                         ('abc')
fn:tokenize('abcd', 'b?')                        Error FORX0003 (the pattern matches a zero-length string)
fn:tokenize('a,xb,xc', ',|,x')                   ('a', 'xb', 'xc')
fn:tokenize('', ',')                             ()
