
v1.4.1

@codybrom codybrom released this 06 Jan 21:07
· 19 commits to main since this release
101d807

What's New in 1.4.1

  • New Versioning: The extension now follows semantic versioning, jumping from 0.3 to 1.4 in a minor release, then to 1.4.1 with a final fix for a last-minute WASM build copy issue.
  • New Name: The name change to "LLM Context Generator" makes it clear that the extension can be used with various LLMs, not just GPT models.
  • Improved File Exclusions: Adds support for a new .ignore configuration file for defining additional exclusion pattern lists.
  • Improved File Marking System: Adds a multi-file/folder marking system in the Explorer menu for easily selecting specific files for context generation. Automatic file tracking keeps marks current when marked files are moved or deleted.
  • Enhanced File Extension Handling: Broadened support for more diverse file types and programming languages out of the box.
  • Improved Token Count Estimation: Switched from gpt-3-encoder to @dqbd/tiktoken for more accurate and efficient token counting of the generated context, helping manage usage within LLM token limits. A configurable warning appears if the context exceeds a predefined number of tokens (default: 8,000).
  • Codebase Refactor: Improved code structure and maintainability using ESBuild, TypeScript, and a more modular design. Removed unused test files and outdated dependencies.
  • Updated README: Revised documentation to reflect the new name, features, and usage instructions.
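As an illustration of the new .ignore configuration, an exclusion list might look like the sketch below (assuming gitignore-style pattern syntax; the specific patterns are examples, not defaults shipped with the extension):

```
# Example .ignore pattern list (illustrative)
node_modules/
dist/
*.log
*.min.js
coverage/
```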
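The token-limit warning described above can be sketched as a simple threshold check. This is a minimal illustration, not the extension's actual code: the function name and default constant are assumptions, and in the extension the count itself would come from @dqbd/tiktoken rather than being passed in directly.

```typescript
// Hypothetical sketch of the token-limit warning logic.
// In practice, `tokenCount` would be produced by encoding the generated
// context with @dqbd/tiktoken and taking the length of the result.
const DEFAULT_TOKEN_WARNING_LIMIT = 8000; // default from the release notes

function shouldWarnForTokenCount(
  tokenCount: number,
  limit: number = DEFAULT_TOKEN_WARNING_LIMIT
): boolean {
  // Warn only when the estimated count strictly exceeds the configured limit.
  return tokenCount > limit;
}

console.log(shouldWarnForTokenCount(8500)); // warns: over the 8,000 default
console.log(shouldWarnForTokenCount(4200)); // no warning
```

The limit is configurable, so the check takes it as an optional parameter rather than hard-coding the 8,000-token default.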

New Contributors

Full Changelog: v0.3...v1.4.1