Presentations

Speeding Up Enterprise Search With In-Memory Indexing

Speaker: Gil Tene, CTO and Co-founder, Azul Systems

Abstract: Recent advancements in robust, very large in-heap memory support for Java/Linux environments have made truly “new” scales and speeds possible in search. In this presentation, Gil Tene, CTO of Azul Systems, discusses the benefits of using in-memory, in-process, and in-heap index representations with heap sizes that can now make full use of current commodity server scale. He describes some of the commonly available choices for index storage and representation, focusing on memory-based options, and compares the throughput and latency characteristics of the various architectural alternatives, using some of the configurable choices available in Apache Lucene™ 4.0 as specific examples.
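
As a concrete illustration of the kind of configurable index-storage choices the abstract refers to, the sketch below contrasts Lucene 4.0's heap-resident RAMDirectory with its memory-mapped MMapDirectory. This is only a minimal example, not material from the talk itself; the index path and field names are hypothetical.

    import java.io.File;

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.document.TextField;
    import org.apache.lucene.index.DirectoryReader;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.IndexWriterConfig;
    import org.apache.lucene.index.Term;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.TopDocs;
    import org.apache.lucene.search.TermQuery;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.MMapDirectory;
    import org.apache.lucene.store.RAMDirectory;
    import org.apache.lucene.util.Version;

    public class InMemoryIndexSketch {
        public static void main(String[] args) throws Exception {
            // In-heap option: the entire index lives in Java heap memory.
            Directory inHeap = new RAMDirectory();

            // Off-heap alternative: index files memory-mapped from disk
            // ("/tmp/index" is just a placeholder path for this sketch).
            Directory mapped = new MMapDirectory(new File("/tmp/index"));

            Directory dir = inHeap; // swap to 'mapped' to compare behavior

            IndexWriterConfig cfg =
                    new IndexWriterConfig(Version.LUCENE_40,
                                          new StandardAnalyzer(Version.LUCENE_40));
            IndexWriter writer = new IndexWriter(dir, cfg);

            // Index one document with a hypothetical "body" field.
            Document doc = new Document();
            doc.add(new TextField("body", "in-memory enterprise search", Field.Store.YES));
            writer.addDocument(doc);
            writer.close();

            // Query the index and print the hit count.
            DirectoryReader reader = DirectoryReader.open(dir);
            IndexSearcher searcher = new IndexSearcher(reader);
            TopDocs hits = searcher.search(new TermQuery(new Term("body", "search")), 10);
            System.out.println("hits: " + hits.totalHits);
            reader.close();
        }
    }

How the heap-resident and memory-mapped representations compare on throughput and tail latency at commodity-server heap sizes is the trade-off the presentation examines.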

VIEW PRESENTATION

© Azul Systems, Inc. 2018. All rights reserved.