LlamaIndex 0.11: Major Update Brings New Features and Improvements

LlamaIndex 0.11 introduces Workflows, Instrumentation, async streaming, and a 42% smaller package. Upgrade now for better performance and new AI features!

LlamaIndex has just launched version 0.11 of its Python library, bringing new features, optimizations, and breaking changes. Building on the foundation of version 0.10, this latest update aims to provide developers with more powerful tools for building generative AI applications while improving performance and usability. 

Key Features: Workflows and Instrumentation 

A significant highlight of LlamaIndex 0.11 is the introduction of Workflows, an event-driven architecture designed to simplify the creation of complex generative AI applications. This replaces the deprecated Query Pipelines feature and is set to become the new standard for developers working with LlamaIndex. The team has released a comprehensive tutorial to help users understand and implement Workflows in their projects.

Another notable addition is Instrumentation, which dramatically enhances the observability of LlamaIndex applications. This feature allows developers to monitor and debug their applications more efficiently, improving reliability and performance. 

Improved Property Graph Support and Package Size Reduction 

The Property Graph Index has also been significantly improved, with better support for property graphs. Additionally, LlamaIndex has streamlined its core package, reducing its size by an impressive 42%. This was achieved by removing OpenAI as a core dependency and making libraries like Pandas optional. These changes help optimize application performance while keeping the package lightweight.

Additional Enhancements 

LlamaIndex 0.11 introduces other valuable features, including async streaming support for query engines and throughout the framework. The new Structured Planning Agent enhances agentic capabilities, and the Function Calling LLM improves tool calling in language models. The new Chat Summary Memory Buffer also helps maintain context during long conversations by summarizing older messages, keeping interactions coherent.

While these are just some highlights, version 0.11 includes hundreds of new features and bug fixes.

Breaking Changes in Version 0.11 

As with any significant update, there are some breaking changes to be aware of. Pydantic V2 is now fully supported, meaning LlamaIndex's Pydantic types integrate seamlessly with FastAPI and other frameworks. This migration touched a large portion of the codebase, so developers are encouraged to report any bugs they encounter.

The ServiceContext object has been completely removed, making way for the Settings object. Developers will need to update their code accordingly. The LLMPredictor has also been deprecated in favor of the new LLM class, which is a direct replacement.

Overall, LlamaIndex 0.11 represents a substantial leap forward regarding functionality, performance, and ease of use.
