In the previous post, I covered append-only tables, a common table type in analytics often used for ingesting data into a data lake or for modeling streams between stream processor jobs. I had promised to cover native support for changelog streams, aka change data capture (CDC). But before I do, I think we should first look at how the table formats support ingesting data via row-level operations (insert, update, delete) rather than the query-level operations commonly used in SQL batch commands.
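To make the distinction concrete, here is a minimal, purely illustrative sketch (not any table format's actual API; the type and function names are hypothetical) that contrasts the two shapes of write against a toy in-memory table keyed by primary key. Row-level operations arrive as individual insert/update/delete events, as a changelog stream would deliver them, while a query-level operation describes the change as a predicate over the table and leaves it to the engine to find the affected rows.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional


@dataclass
class RowOp:
    """One row-level operation, keyed by primary key (as a CDC feed would emit)."""
    op: str                      # "insert", "update" or "delete"
    key: int
    value: Optional[dict] = None


def apply_row_ops(table: Dict[int, dict], ops: List[RowOp]) -> None:
    """Row-level ingestion: each event names the exact row it touches."""
    for o in ops:
        if o.op in ("insert", "update"):
            table[o.key] = o.value
        elif o.op == "delete":
            table.pop(o.key, None)


def apply_query_update(table: Dict[int, dict],
                       predicate: Callable[[dict], bool],
                       set_fn: Callable[[dict], dict]) -> None:
    """Query-level change: the writer states a predicate (UPDATE ... WHERE ...)
    and the engine finds and rewrites the matching rows."""
    for key, row in table.items():
        if predicate(row):
            table[key] = set_fn(row)


if __name__ == "__main__":
    table: Dict[int, dict] = {}

    # Row-level operations, e.g. replayed from a changelog stream.
    apply_row_ops(table, [
        RowOp("insert", 1, {"name": "a", "status": "new"}),
        RowOp("insert", 2, {"name": "b", "status": "new"}),
        RowOp("update", 1, {"name": "a", "status": "active"}),
        RowOp("delete", 2),
    ])

    # Query-level operation, e.g. a batch UPDATE with a WHERE clause.
    apply_query_update(table,
                       predicate=lambda r: r["status"] == "active",
                       set_fn=lambda r: {**r, "status": "archived"})

    print(table)  # {1: {'name': 'a', 'status': 'archived'}}
```

The point of the toy is only the shape of the input: in the row-level case the writer already knows which rows change and how, whereas in the query-level case that work is pushed onto the engine at write time.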