Local offline chat AI

Description

This plugin runs large language models locally on mid- and high-end personal computers, with no external services. It integrates the open-source llama.cpp library and can run model inference efficiently on the CPU.

It can be used in digital-human projects as well as for AI NPCs in games. It supports custom character background information and retrieval-augmented generation over a knowledge base. Imagine each character as an AI NPC with its own unique story and knowledge base, opening up a virtually unlimited range of player conversations.
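The retrieval-and-generation flow described above can be sketched as follows. This is a minimal, hypothetical illustration of the general pattern (search a per-character knowledge base for passages relevant to the player's input, then fold the hits into the model prompt), not the plugin's actual API; the function names and the keyword-overlap scoring are assumptions for illustration only.

```python
def score(query, passage):
    """Crude keyword-overlap relevance score (illustrative only)."""
    q = set(query.lower().split())
    p = set(passage.lower().split())
    return len(q & p)

def build_prompt(character, background, knowledge_base, player_input, top_k=2):
    """Assemble a prompt from character background plus retrieved lore."""
    # Retrieve the knowledge-base passages most relevant to the player's input.
    hits = sorted(knowledge_base, key=lambda p: score(player_input, p),
                  reverse=True)[:top_k]
    context = "\n".join(f"- {h}" for h in hits)
    return (
        f"You are {character}. {background}\n"
        f"Relevant lore:\n{context}\n"
        f"Player: {player_input}\n"
        f"{character}:"
    )

# Example per-character knowledge base.
kb = [
    "The blacksmith Orin forged the king's sword.",
    "A dragon was last seen near the northern pass.",
    "The harvest festival takes place every autumn.",
]
prompt = build_prompt("Orin", "You are a gruff village blacksmith.",
                      kb, "Who forged the king's sword?")
```

The assembled prompt would then be handed to the locally loaded model for completion; a production knowledge base would typically use embedding-based similarity rather than keyword overlap.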

The plugin ships with the open-source Qwen2.5-1.5B model; depending on your hardware, you can substitute models with more or fewer parameters.

Documentation on OneDrive:

Local Chat AI User Guide.pdf

Documentation on Google Drive:

https://drive.google.com/file/d/1EvoBAh-2wbxgKyIaZ18XXbicJjDy1gjQ/view?usp=drive_link

Included formats

  • Unreal Engine