
Releases: nathanlesage/local-chat

Release v0.11.0

13 Mar 10:41

In this release, we have made several improvements and bug fixes to enhance the user experience. We upgraded our dependencies for better performance, added a configuration option for light/dark mode, and introduced global styles for a consistent look and feel across the application. We also fixed issues with input/output sanitization and with the documentation.

Furthermore, users can now create new conversations directly from the start window, and no conversation is pre-selected on load. Lastly, we resolved an issue where the same conversation would be re-selected. Overall, this release aims to provide a more seamless and enjoyable experience for our users.

These release notes have been generated using openchat_openchat-3.5-0106.

Changelog

  • chore: Upgrade dependencies
  • fix: Better input/output sanitization
  • feat: Add Radio button group
  • feat: Add a configuration
  • fix: Search bar outline color
  • feat: Add global styles
  • feat: Add Funding information
  • fix: Card button dark mode styles
  • feat: Allow immediately to create a new conversation from the start window
  • feat: Do not pre-select a conversation on load
  • fix: Don't re-select the same conversation
  • fix: Jekyll interpolation in prompt placeholders
  • fix: Better code block generation and HTML sanitization
  • refactor: Remove dead code
  • fix: Remove linux vulkan binaries from win32 builds

Full Changelog: v0.10.0...v0.11.0

Release v0.10.0

06 Mar 13:22

This release updates node-llama-cpp to v3.0.0-beta.13. Furthermore, it introduces the ability of users to specify custom prompt templates. See the documentation to learn how to set up custom prompts.
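Custom prompt templates generally work by substituting the conversation turns into a template string before the text is handed to the model. As a rough illustration only — the `{system}`/`{user}` placeholder syntax and the `renderPrompt` helper below are hypothetical, not LocalChat's actual format (see the project documentation for that) — a ChatML-style template could be filled in like this:

```javascript
// Hypothetical sketch: filling a ChatML-style prompt template.
// The {system}/{user} placeholder syntax is an assumption for this
// illustration; LocalChat's real template format is documented separately.
function renderPrompt (template, values) {
  // Replace each {key} placeholder with its value, leaving unknown keys intact
  return template.replace(/\{(\w+)\}/g, (match, key) => values[key] ?? match)
}

const template = [
  '<|im_start|>system\n{system}<|im_end|>',
  '<|im_start|>user\n{user}<|im_end|>',
  '<|im_start|>assistant\n'
].join('\n')

const prompt = renderPrompt(template, {
  system: 'You are a helpful assistant.',
  user: 'What is LocalChat?'
})
```

The template ends with the opening of the assistant turn, so the model's completion continues directly from there.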

Changelog

  • feat: Add custom prompt documentation
  • feat: Add custom prompts
  • refactor: Clean up code
  • fix: Logical code error
  • refactor: Move code around
  • feat: bump node-llama-cpp to 3.0.0-beta.13
  • refactor: Simplify source code
  • fix: More prominent download link in README
  • fix: Use macOS icon for README
  • feat: Add proper macOS-specific application icon
  • fix: add quick download link to README

Full Changelog: v0.9.0...v0.10.0

Release v0.9.0

20 Feb 22:14

This release updates node-llama-cpp (and, alongside it, llama.cpp) to the most recent release. LocalChat now also shows how much VRAM is in use in the status bar. Lastly, we've implemented Electron fuses support to prevent unauthorized access to the binary.
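Electron fuses are flags baked into the packaged binary at build time that disable risky runtime features such as running the app as a plain Node process. As a sketch only — this assumes an Electron Forge setup with the `@electron-forge/plugin-fuses` plugin, and which fuses LocalChat actually flips is an assumption — the configuration fragment typically looks like this:

```javascript
// Hypothetical forge.config.js fragment. The plugin and option names come
// from @electron/fuses / @electron-forge/plugin-fuses; whether LocalChat's
// build uses exactly these fuses is an assumption.
const { FusesPlugin } = require('@electron-forge/plugin-fuses')
const { FuseV1Options, FuseVersion } = require('@electron/fuses')

module.exports = {
  plugins: [
    new FusesPlugin({
      version: FuseVersion.V1,
      // Disallow ELECTRON_RUN_AS_NODE, which would turn the app into a Node REPL
      [FuseV1Options.RunAsNode]: false,
      // Disallow attaching a debugger via --inspect
      [FuseV1Options.EnableNodeCliInspectArguments]: false,
      // Refuse to load app code from anywhere but the packaged ASAR archive
      [FuseV1Options.OnlyLoadAppFromAsar]: true
    })
  ]
}
```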

Changelog

  • feat: Show vram status in statusbar
  • feat: Upgrade node-llama-cpp to v3.0.0-beta.11
  • feat: Lock down LocalChat binary with Electron fuses

Full Changelog: v0.8.0...v0.9.0

Release v0.8.0

06 Feb 20:15

LocalChat 0.8.0 is now available for download! This release updates dependencies (node-llama-cpp ⇾ v3.0.0-beta.9) and brings several improvements: conversations can now be searched, only populated conversation groups are shown, only the most recent month's conversations are uncollapsed, and the model selector widget is disabled while the model is generating. Additionally, it fixes issues with button styling and sidebar header positioning. Enjoy the improved user experience!
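Conversation search of this kind usually amounts to a case-insensitive substring filter over the conversation metadata. A minimal sketch, assuming each conversation carries a `description` field (a hypothetical field name, not necessarily LocalChat's data model):

```javascript
// Hypothetical sketch of a conversation search filter.
function searchConversations (conversations, query) {
  const needle = query.trim().toLowerCase()
  if (needle === '') return conversations // empty query matches everything
  return conversations.filter(c =>
    (c.description ?? '').toLowerCase().includes(needle))
}

const results = searchConversations([
  { description: 'Trip planning for Oslo' },
  { description: 'Rust borrow checker help' }
], 'oslo')
```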

These release notes have been partially written with the help of OpenChat 3.5 0106 Q4_K_M in LocalChat.

Changelog

  • chore: Remove additional incompatible binaries
  • chore: Improve llama provider logging
  • chore: Update dependencies
  • feat: Enable searching of conversations
  • feat: Ensure only populated conversation groups are shown
  • feat: Only uncollapse most recent month's conversations
  • feat: Move llama status to store, model selector widget disabled while generating
  • fix: Move new conversation button to sidebar header
  • fix: Button styling
  • fix: sidebar header positioning

Release v0.7.1

01 Feb 21:21

This patch fixes an issue where the sidebar header was hidden, as well as a bug when renaming conversations.

Changelog

  • fix: Better solution to input field selection focus
  • fix: Bug during conversation rename

Release v0.7.0

01 Feb 10:25

We are excited to announce the latest changes in our application. This release fixes conversation styling and autofocuses the description input. Conversation descriptions are now assigned automatically. Conversations are sorted by time, grouped by month, and can be collapsed. Additionally, we have fixed an issue with highlighting code blocks on mount, and a bug in the update process has been resolved. Stay tuned for more updates!
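Grouping conversations by month boils down to sorting by timestamp and bucketing by a year-month key. A minimal sketch, assuming each conversation carries a millisecond `lastUpdated` timestamp (a hypothetical field name for this illustration):

```javascript
// Hypothetical sketch: sort conversations by time, then group by month.
function groupByMonth (conversations) {
  // Newest first
  const sorted = [...conversations].sort((a, b) => b.lastUpdated - a.lastUpdated)
  // Map preserves insertion order, so groups come out newest-month first
  const groups = new Map()
  for (const conv of sorted) {
    const d = new Date(conv.lastUpdated)
    const key = `${d.getUTCFullYear()}-${String(d.getUTCMonth() + 1).padStart(2, '0')}`
    if (!groups.has(key)) groups.set(key, [])
    groups.get(key).push(conv)
  }
  return groups
}

const groups = groupByMonth([
  { lastUpdated: Date.UTC(2024, 0, 5) },  // January 2024
  { lastUpdated: Date.UTC(2024, 1, 1) },  // February 2024
  { lastUpdated: Date.UTC(2024, 0, 20) }
])
```

Because the input is sorted before bucketing, both the group order and the order within each group are newest-first, which matches a sidebar that uncollapses the most recent month.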

These release notes have been generated with OpenChat 3.5 0106 Q4_K_M in LocalChat.

Changelog

  • fix: Conversation styling + autofocus description input
  • feat: Automatically assign conversation descriptions
  • feat(conversations): Sort by time; group by month; make collapsible
  • fix: Highlight code blocks on mount
  • fix: Bug in update process

Release v0.6.0

31 Jan 12:35

This update contains a few improvements over 0.5.0. Specifically, you can now customize the system prompts for your chats. Additionally, we have armed the auto-updater for Windows and macOS installations, which means that on these operating systems the app should be able to update itself. Note, however, that despite the auto-updater being turned on, there may still be a few issues that we can only debug in production.

Thus, if your installation does not notify you of the next release, please open an issue so that we can address it.

Changelog

  • chore(build): Update checkout action to v4
  • fix(build): Whitespace in filename
  • fix(build): Properly move files
  • chore(build): Make directory structure more resilient
  • chore(build): Update setup-node action to v4
  • chore(build): Update artifact actions to v4
  • fix: Sidebar header
  • fix: Ensure ZIP files are checksummed and uploaded
  • feat: Add ZIP files for Darwin updates
  • fix: Provide Windows installer icon
  • feat: Upload Windows updater files to releases page
  • refactor: Simplify conversation logic
  • feat: Allow custom system prompts
  • refactor: Generic model update

Release v0.5.0

29 Jan 11:35

LocalChat 0.5.0 brings a host of new features and improvements. One of the most notable additions is the ability to copy messages to the clipboard, making it easier than ever to share interesting or important responses with others.

The sidebar has been updated for a more modern look and feel, and LocalChat now remembers its window position, repositioning windows when the screen configuration changes, for a seamless experience.

In addition, LocalChat 0.5.0 introduces support for tables in messages, along with better previews and table styling.

The documentation has been expanded with proper guides and instructions, making it easier for new users to get started with LocalChat. The first-start guide has been replaced with a welcome message.

LocalChat 0.5.0 also includes several bug fixes, such as ensuring that models load correctly in new conversations and that a fallback model is actually provided when requested. The default context size is now 512 tokens instead of 2,048.

Overall, LocalChat 0.5.0 is a significant update that enhances the user experience with improved performance, better styling, and greater flexibility in managing conversations.

These release notes have been generated with openchat-3.5-0106.Q4_K_M using LocalChat based on the changelog.

Changelog

  • refactor: Remove default bounds requirement
  • feat: Allow copy message to clipboard
  • refactor: Move model name retrieval to store
  • refactor: Move generation time format to util
  • refactor: Chat more concise
  • fix: more modern sidebar header
  • fix: Reposition windows on screen config change
  • feat: LocalChat now remembers the window position
  • refactor: Move out context menu
  • fix: Better sidebar toggle style
  • fix: Conversation info
  • feat: Move model manager into main area
  • refactor: Move sidebar div to app
  • fix: Replace repeat symbol with refresh
  • fix: Only download actual models
  • feat: Add support for tables; better preview; add table styling
  • fix: Final fix for breadcrumb generation
  • fix: Breadcrumb generation
  • fix: Try to use absolute URLs in Docs
  • fix: 404 in the documentation
  • feat: Add proper documentation
  • feat: Remove first start guide, instead, display a welcome message
  • refactor: Move model selection widget to reusable folder
  • refactor: Unify list item margins
  • refactor: Unify button appearance
  • fix: Better Llama Status indication
  • fix: Model loads correctly on new conversations
  • fix: Actually provide a fallback model if asked to do so
  • fix: Use proper model name in sidebar
  • fix: Default to 512 tokens context size instead of 2,048
  • feat: Allow 512 tokens context size

Release v0.4.0

25 Jan 11:59

The following release notes have been generated from the changelog using openchat-3.5-0106.Q4_K_M in LocalChat.

We are excited to announce the latest updates, which include several bug fixes and new features. First, we fixed a typo in the release preparation script. We have also added a copy button to code blocks for easier sharing of code snippets.

Additionally, inline code is now easier to read thanks to a better code font. The sidebar resizer is now properly hidden when the sidebar is not shown, providing a cleaner interface, and the regenerate button is only displayed when it can actually be used.

New features include a context menu for more options and control, support for long dates and times, and showing the model name instead of the conversation ID for better clarity.

Lastly, we have bumped node-llama-cpp to version 3.0.0-beta.5, keeping the app up to date with the latest developments.

Changelog

  • fix: Typo in release preparation script
  • feat: Add copy code block button
  • fix: Make inline code easier to see; better code font
  • fix: Properly hide sidebar resizer if sidebar is not shown
  • fix: Only show regenerate button if possible
  • feat: Add context menu
  • feat: Allow long dates and times
  • feat: Show model name instead of conversation ID
  • feat: Bump node-llama-cpp to v3.0.0-beta.5

Release v0.3.1

23 Jan 11:55

This version again fixes minor issues. Specifically, the Windows executable is now properly code-signed. Additionally, you can now regenerate a model response and delete individual conversation messages.

Note that there is still no auto-update mechanism in place, so make sure to check the GitHub releases page from time to time, or subscribe to notifications.

Changelog

  • fix: Prepare release script
  • feat: Attempt to force llama.cpp to use the Metal GPU
  • feat: Allow deletion of messages and response regeneration
  • fix: Typo in build workflow
  • fix: win32 code signing issue
  • feat: Add win32 Code certificate
  • feat: Improve sidebar action buttons
  • feat: Switch to vue-feather
  • fix: Improve export abortion
  • fix: Improve old chat message conversion code
  • feat: Proper chat scrolling; remove abort button from chat
  • refactor: Create Modal reusable component
  • fix: Remove unnecessary console.log
  • feat: Improve help menu
  • feat: Add script to auto-bump the version string