fix: fix unload in LLM and ModelHostObject to properly free LLM memory #954
Conversation
Do you believe that similar problems might be in other modules? |
We should hold this PR until #892 is merged and check whether the problem also occurs in that version.
@msluszniak I don't think the problem with LLM not freeing the module memory occurs in other models as they don't override the |
Description
Fixes two related memory management bugs in the LLM delete flow:
- `LLM::unload()` was destroying the runner object but never calling `BaseModel::unload()`, and thus never releasing the module memory. `LLM::unload()` now calls `BaseModel::unload()` after resetting the runner.
- When `load()` was called on an already-loaded `LLMController`, the previous native module instance was simply overwritten without calling `unload()` first. This caused a memory leak. It now unloads the existing native module before loading a new one.
- The reported native memory size is set to `0` after `model->unload()`, so the JS GC is correctly informed that native memory has been freed.

Introduces a breaking change?
Type of change
Tested on
Testing instructions
Use the provided screen (you can simply replace the current LLM app screen) to reproduce the bug and verify the fix. It lets you load and unload an LLM and observe memory behavior with `vmmap`/`adb`.

- iOS: `xcrun simctl spawn booted launchctl list | grep llm` and `watch -n 0.1 "vmmap <pid> | tail -12"`
- Android: `watch -n 0.1 "adb shell dumpsys meminfo com.anonymous.llm"`

Related issues
#948
Checklist