With an introduction from Teradata, SQLstream and ECaTS will demonstrate:
– the creation and management of data pipelines through automatic discovery of data formats and transformation between any source and destination format, interfacing with a wide array of sources and destinations including Amazon Kinesis and Kinesis Firehose, Hadoop, data warehouses, message buses (including Kafka), files, and devices
– the delivery of accurate, complete, and consistent data flows through continuous, real-time operations such as LOAD, data wrangling, parsing, and filtering
– the transformation of both live data streams and historical data streams brought live through streaming ingestion
– the integration of multiple disparate data streams, concurrently and continuously, at rates of millions of records per second per CPU core.
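
The pipeline operations listed above (parsing raw records, merging disparate streams, and continuous filtering) can be sketched conceptually in Python. This generator-based sketch is illustrative only, not SQLstream's actual API; the CSV record layout, the `(timestamp, sensor, value)` schema, and the threshold filter are all assumptions:

```python
import csv
import heapq

def parse(lines):
    """Parse raw CSV lines into (timestamp, sensor, value) records."""
    for row in csv.reader(lines):
        ts, sensor, value = row
        yield (int(ts), sensor, float(value))

def merge_streams(*streams):
    """Merge several time-ordered record streams into one, ordered by timestamp."""
    return heapq.merge(*streams, key=lambda rec: rec[0])

def pipeline(*raw_streams, threshold=0.0):
    """Continuously parse, merge, and filter incoming record streams."""
    parsed = (parse(s) for s in raw_streams)
    for ts, sensor, value in merge_streams(*parsed):
        if value > threshold:  # filtering step: drop out-of-range readings
            yield (ts, sensor, value)

# Two hypothetical source streams (in a real deployment these would be
# unbounded feeds from e.g. Kinesis or Kafka, not in-memory lists).
stream_a = ["1,temp,21.5", "3,temp,-5.0", "5,temp,30.1"]
stream_b = ["2,humid,40.0", "4,humid,55.2"]

result = list(pipeline(stream_a, stream_b, threshold=0.0))
```

Because every stage is a lazy generator, records flow through the pipeline one at a time rather than being batched, which mirrors the continuous-processing model described above.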