Release v0.3.1
This release again fixes minor issues. In particular, the Windows executable is now properly code-signed. It also adds the ability to regenerate a model response and to delete individual conversation messages.
Note that there is still no auto-update mechanism, so make sure to check the GitHub releases page from time to time, or subscribe to release notifications.
Changelog
- fix: Prepare release script
- feat: Attempt to force llama.cpp to use the Metal GPU
- feat: Allow deletion of messages and response regeneration
- fix: Typo in build workflow
- fix: win32 code signing issue
- feat: Add win32 code-signing certificate
- feat: Improve sidebar action buttons
- feat: Switch to vue-feather
- fix: Improve export cancellation
- fix: Improve old chat message conversion code
- feat: Proper chat scrolling; remove abort button from chat
- refactor: Create Modal reusable component
- fix: Remove unnecessary console.log
- feat: Improve help menu
- feat: Add script to auto-bump the version string