


Speeding Up Enterprise Search With In-Memory Indexing

Speaker: Gil Tene, CTO and Co-founder, Azul Systems

Abstract: Recent advancements in robust, very large in-heap memory support for Java/Linux environments have made truly new scales and speeds possible in search. In this presentation, Gil Tene, CTO of Azul Systems, discusses the benefits of using in-memory, in-process, and in-heap index representations with heap sizes that can now make full use of current commodity server scale. Gil further describes some of the commonly available choices for index storage and representation, focusing on memory-based index storage options. He compares the throughput and latency characteristics of the various architectural alternatives, using some of the configurable choices available in Apache Lucene™ 4.0 for specific examples.
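As context for the storage choices the abstract refers to, the minimal sketch below (not taken from the presentation) contrasts two of Lucene 4.0's standard Directory implementations: an on-heap, in-process RAMDirectory and a memory-mapped, file-backed MMapDirectory. The class and method names are standard Lucene 4.x APIs; the index path is illustrative only.

```java
import java.io.File;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.MMapDirectory;
import org.apache.lucene.store.RAMDirectory;
import org.apache.lucene.util.Version;

public class IndexStorageChoices {
    public static void main(String[] args) throws Exception {
        // In-heap, in-process index: the index bytes live entirely inside the JVM heap.
        Directory inHeap = new RAMDirectory();

        // Memory-mapped index: backed by files on disk, paged in by the OS outside the heap.
        Directory mapped = new MMapDirectory(new File("/tmp/lucene-index"));

        for (Directory dir : new Directory[] { inHeap, mapped }) {
            IndexWriterConfig cfg =
                new IndexWriterConfig(Version.LUCENE_40, new StandardAnalyzer(Version.LUCENE_40));
            try (IndexWriter writer = new IndexWriter(dir, cfg)) {
                Document doc = new Document();
                doc.add(new TextField("body", "in-memory search example", Field.Store.YES));
                writer.addDocument(doc);
            }
            // Query-side code is identical regardless of where the index bytes live;
            // only throughput and latency characteristics differ.
            IndexSearcher searcher = new IndexSearcher(DirectoryReader.open(dir));
            System.out.println(dir.getClass().getSimpleName()
                + " docs indexed: " + searcher.getIndexReader().numDocs());
        }
    }
}
```

The trade-off sketched here is the one the talk explores: keeping the index in-heap avoids file-system and page-cache indirection but requires a JVM that can handle very large heaps well, while memory-mapped storage leans on the OS page cache outside the heap.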

