Sending data to targets

The basic pattern for a target is:

CQ > stream > target

For example:

CREATE CQ JoinDataCQ
INSERT INTO JoinedDataStream ...
 
CREATE TARGET JoinedDataTarget
USING SysOut(name:JoinedData)
INPUT FROM JoinedDataStream;

This writes the output from JoinedDataStream to SysOut.

When writing an application with a more complex target, it is often most efficient to start with a minimal end-to-end application that simply reads the data from the source and writes it to the target, then add the "intelligence" once you have verified that the data is being read and written correctly. For example, the following application reads PosApp's sample data, parses it (see Getting data from sources), and writes it to Hadoop:

CREATE SOURCE CSVSource USING FileReader (
  directory:'Samples/PosApp/AppData',
  wildcard:'posdata.csv',
  positionByEOF:false
)
PARSE USING DSVParser (
  header:'yes'
)
OUTPUT TO CsvStream;

CREATE TYPE CSVType (
  merchantId String,
  dateTime DateTime,
  hourValue Integer,
  amount Double,
  zip String
);
CREATE STREAM TypedCSVStream OF CSVType;

CREATE CQ CsvToPosData
INSERT INTO TypedCSVStream
SELECT data[1],
  TO_DATEF(data[4],'yyyyMMddHHmmss'),
  DHOURS(TO_DATEF(data[4],'yyyyMMddHHmmss')),
  TO_DOUBLE(data[7]),
  data[9]
FROM CsvStream;

CREATE TARGET hdfsOutput USING HDFSWriter(
  filename:'hdfstestOut',
  hadoopurl:'hdfs://192.168.1.13:8020/output/',
  flushinterval:'1'
)
FORMAT USING DSVFormatter ()
INPUT FROM TypedCSVStream;
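
In practice, you would typically wrap these statements in an application so they can be deployed and started as a unit. A minimal sketch (the application name CsvToHdfs is a placeholder, not part of the sample):

CREATE APPLICATION CsvToHdfs;

-- the CREATE SOURCE, CREATE TYPE, CREATE STREAM, CREATE CQ,
-- and CREATE TARGET statements above go here

END APPLICATION CsvToHdfs;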

After verifying that the data is being written to the target correctly, you could then add additional components between CsvToPosData and hdfsOutput to filter, aggregate, or enrich the data.
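
For instance, a filtering step might look like the following sketch. HighValueStream, FilterHighValue, and the 500-dollar threshold are illustrative, not part of the sample: a new CQ selects only large transactions from TypedCSVStream into a new stream, and the target's INPUT FROM clause is changed to read from that stream instead.

CREATE STREAM HighValueStream OF CSVType;

CREATE CQ FilterHighValue
INSERT INTO HighValueStream
SELECT * FROM TypedCSVStream
WHERE amount > 500;

-- then point the target at the filtered stream:
-- INPUT FROM HighValueStream;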

For additional end-to-end examples, see Database Writer, JMSWriter, and Kafka Writer.