> Legend: `.` minor | `-` Fix | `+` Addition | `^` improvement | `!` Change | `*` Refactor
> **IMPORTANT**: `0.1.x` will still have some breaking changes in patches.
> - Make sure to lock your version, e.g., `genai = "=0.1.6"`
> - Version `0.2.x` will follow semver more strictly.
> - API changes will be denoted as "`!` - API CHANGE ...."
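The version-lock advice above translates to an exact-version requirement in `Cargo.toml` (the version number is just the example from the note):

```toml
[dependencies]
# "=" pins the exact version, so 0.1.x patch releases with
# breaking changes are never picked up by `cargo update`.
genai = "=0.1.6"
```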
- `+` `ChatOptions` - Add json mode for openai type models
- `.` groq - Added the Llama 3.1 previews, and `groq-..-tool-use..` to the groq model list names
- `!` Now `chat::printer::print_chat_stream` (was `utils::print_chat_stream`)
- `!` Now `ChatOptions` (was `ChatRequestOptions`)
- `!` Remove `client_builder.with_default_chat_request_options` (available with `client_builder.with_chat_options`)
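For context on the json-mode addition above: openai-type chat completion APIs enable it via the `response_format` field of the request payload. A minimal provider-side wire-format fragment (this is the OpenAI-style HTTP API, not genai's own types):

```json
{
  "model": "gpt-4o-mini",
  "response_format": { "type": "json_object" },
  "messages": [
    { "role": "system", "content": "Reply in JSON." },
    { "role": "user", "content": "List three colors." }
  ]
}
```

Note that OpenAI requires the word "JSON" to appear somewhere in the messages when `json_object` mode is requested.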
- `.` readme - Add youtube videos doc
- `!` API CHANGE - Now `ClientBuilder::insert_adapter_config` (was `with_adapter_config`)
- `.` code clean
- `!` API CHANGE - Refactor Error
  - With new `ModelInfo`
  - Back to `genai::Error` (`adapter::Error` was wrongly exposing internal responsibility)
- `.` Update tests and examples from 'gpt-3.5-turbo' to 'gpt-4o-mini'
- `-` Fix naming `ClientConfig::with_adapter_kind_resolver` (was wrongly `...auth_resolver`)
- `*` Refactor code layout, internal Adapter calls to use `ModelInfo`
- `+` Add `ModelName` and `ModelInfo` types for more efficient request/error context
- `!` API CHANGE - Now `Client::resolve_model_info(model)` (was `Client::resolve_adapter_kind(model)`)
- `^` `ChatRequest` - Add `ChatRequest::from_system`
- `.` Updated provider supported list
- `^` openai - Added `gpt-4o-mini` and switched all openai examples/tests to it
- `!` API CHANGE - New `MessageContent` type for `ChatMessage.content`, `ChatResponse.content`, and `StreamEnd.captured_content` (only `::Text` variant for now). This is in preparation for multimodal support.
- `!` API CHANGE - (should be minor, as `Into` is implemented) - `ChatMessage` now takes `MessageContent`, with only the `::Text(String)` variant for now
- `!` API CHANGE - Error refactor - Added `genai::adapter::Error` and `genai::resolver::Error`, and updated `genai::Error` with the appropriate `From` implementations
- `+` Added token usage for ALL adapters/providers - `ChatResponse.usage`, plus `ChatRequestOptions` `.capture_usage` / `.capture_content` (for streaming) support for all adapters (see the Readme note on Ollama streaming)
- `!` API CHANGE: `ClientConfig::with_chat_request_options` (was `with_default_chat_request_options`)
- `!` API CHANGE: `PrintChatStreamOptions::from_print_events` (was `from_stream_events`)
- `^` `AdapterKind` - Added `as_str` and `as_lower_str`
- `^` `ChatRequest` - Added `.iter_systems()` and `.combine_systems()` (includes the eventual `chat_req.system` as part of the system messages)
- `!` API CHANGE: `Client::all_model_names(..)` (was `Client::list_models(..)`)
- `^` groq - Add gemma2-9b-it to the list of Groq models
- `!` API CHANGE: `genai::Client` (was `genai::client::Client`; same for `ClientBuilder` and `ClientConfig`)
- `-` groq - Remove the groq whisper model from `list_models`, as it is not a chat completion model
- `^` ollama - Implement live `list_models` for ollama
- `!` Make `AdapterDispatcher` crate-only (should be internal only)
- `+` `ChatRequestOptions` - Added `temperature`, `max_tokens`, `top_p` for all adapters (see readme for property mapping)
- `!` `SyncAdapterKindResolverFn` - Change signature to return `Result<Option>` (rather than `Result`)
- `.` Made `client.resolve_adapter_kind(model)` public
- `+` Implement groq completions
- `-` gemini - Proper stream message error handling
- `.` print_chat_stream - Minor refactor to ensure flush
- `-` ollama - Improve the Ollama adapter to support multiple system messages
- `-` gemini - Fix adapter to set "systemInstruction" (supported in v1beta)
- `+` Added `AdapterKindResolver`
- `-` `Adapter::list_models` API impl and change
- `^` chat_printer - Added `PrintChatStreamOptions` with `print_events`