Local LLM Plugin

Run LLMs locally in your UE5 project

Product Details

This product is designed to integrate AI chatbots into games without using online services.

This plugin allows you to load large language models (LLMs) in GGUF format and run them in Unreal Engine.

Run locally and within BP/C++

・Runs offline on a local PC.
・Just add one component to your Blueprint and you are ready to go (see the C++ sketch after this list).
・No Python environment or dedicated server is required.
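
For illustration only, here is a minimal sketch of what the one-component setup can look like in C++. The component class ULocalLLMComponent, its header, and the LoadModel/SendPrompt calls are assumptions made for this example, not the plugin's confirmed API; the actual names are documented on the Manual page linked below.

```cpp
// Minimal sketch, assuming a hypothetical component class ULocalLLMComponent
// with LoadModel/SendPrompt functions. The real class and function names may
// differ; see the Manual page linked below.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "LocalLLMComponent.h"   // assumed header name provided by the plugin
#include "NpcChatActor.generated.h"

UCLASS()
class ANpcChatActor : public AActor
{
    GENERATED_BODY()

public:
    ANpcChatActor()
    {
        // The plugin component is used like any other ActorComponent:
        // create it once and it is available in both Blueprint and C++.
        LlmComponent = CreateDefaultSubobject<ULocalLLMComponent>(TEXT("LlmComponent"));
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Point the component at a local GGUF file; everything runs offline,
        // with no Python environment or server process involved.
        LlmComponent->LoadModel(TEXT("Models/llama-3-8b-instruct.Q4_K_M.gguf"));
        LlmComponent->SendPrompt(TEXT("Introduce yourself as the tavern keeper."));
    }

    UPROPERTY(VisibleAnywhere, Category = "LLM")
    TObjectPtr<ULocalLLMComponent> LlmComponent;
};
```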

Useful features

・Works asynchronously; additional questions can be asked at any time while an answer is being generated.
・You can save and load a “state” that preserves the context of a conversation, allowing you to resume it later (see the sketch after this list).
・Supports multibyte characters.
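
The sketch below shows how the asynchronous callback and the state-saving feature might be wired up in C++, continuing the hypothetical actor from the previous example. The delegate OnResponseGenerated and the SaveStateToFile call are assumptions for illustration; the Manual documents the actual interface.

```cpp
// Hedged sketch of the asynchronous callback and state save features.
// OnResponseGenerated and SaveStateToFile are illustrative assumptions;
// consult the Manual for the plugin's actual API.
void ANpcChatActor::StartConversation()
{
    // Generation runs asynchronously, so the game thread is never blocked;
    // the handler below fires when a response is produced. It must be declared
    // as a UFUNCTION() in the header for AddDynamic to bind it.
    LlmComponent->OnResponseGenerated.AddDynamic(this, &ANpcChatActor::HandleResponse);
    LlmComponent->SendPrompt(TEXT("What happened on the last quest?"));
}

void ANpcChatActor::HandleResponse(const FString& ResponseText)
{
    UE_LOG(LogTemp, Log, TEXT("NPC says: %s"), *ResponseText);

    // Persist the conversation context so the dialogue can be resumed in a
    // later play session.
    LlmComponent->SaveStateToFile(TEXT("Saved/TavernKeeper.state"));
}
```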

Free Demo

Sample scenes, including a chat with multiple characters, are available for free.
Check the quality and performance for yourself.

Download↗

Download

Fab↗

Manual

Manual Page↗