jack-vanlightly.com
Over the past few months, I've seen a growing number of posts on social media promoting the idea of a "zero-copy" integration between Apache Kafka and Apache Iceberg. The idea is that Kafka topics could live directly as Iceberg tables. On the surface it sounds efficient: one copy of the data, unified access for both streaming and analytics. But from a systems point of view, I think this is the wrong direction for the Apache Kafka project. In this post, I'll explain why.
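To make the contrast concrete, here is a toy model of the two architectures in plain Python. No real Kafka or Iceberg APIs are involved, and all names are illustrative: in the conventional pipeline a sink connector copies topic records into separate table files, while under the "zero-copy" idea the table's data files are the topic's own log segments.

```python
# Toy model of the two architectures (illustrative only; no real
# Kafka or Iceberg code -- lists stand in for log segments and
# table data files).

def conventional(topic_segments):
    """Sink-connector pipeline: records are *copied* from the topic
    into table data files, so two physical copies exist."""
    return [list(segment) for segment in topic_segments]  # deep-ish copy

def zero_copy(topic_segments):
    """'Zero-copy' idea: the table's data files *are* the topic's
    segments -- one physical copy serves both streaming and analytics."""
    return topic_segments  # same underlying objects, no duplication

topic = [["r1", "r2"], ["r3"]]  # a topic with two log segments

copied = conventional(topic)
shared = zero_copy(topic)

# The conventional table holds equal but physically distinct data...
assert copied == topic and copied[0] is not topic[0]
# ...while the zero-copy table shares the very same segment objects.
assert shared[0] is topic[0]
```

The appeal, as the post notes, is obvious in this framing: one copy instead of two. The rest of the post argues why that framing is misleading from a systems perspective.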