
Comparing changes

base repository: matlab-deep-learning/llms-with-matlab
base: v3.3.0
head repository: matlab-deep-learning/llms-with-matlab
compare: v3.4.0
  • 13 commits
  • 43 files changed
  • 3 contributors

Commits on Aug 14, 2024

  1. Commit a7e6170
  2. Rename mustBeValidTopP to mustBeValidProbability

    We're now using this validator for more than just `TopP`, and a new name is in order.
    ccreutzi committed Aug 14, 2024
    Commit 08160b7
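
    This validator presumably checks that a value is a probability in [0, 1]. A minimal sketch of such a validator (an assumption, not necessarily the repository's implementation):

    ```
    function mustBeValidProbability(value)
    % Sketch of a probability validator; the actual code in +llms/+utils may differ.
    % A probability is a real numeric value in the closed interval [0, 1].
        mustBeNumeric(value)
        mustBeReal(value)
        mustBeInRange(value, 0, 1)
    end
    ```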

Commits on Aug 20, 2024

  1. tests still unreliable with Ollama version in GitHub CI

    These tests should work and do work locally. But they fail in GitHub CI – for an unknown reason that almost certainly is in Ollama, not in our code.
    ccreutzi committed Aug 20, 2024
    Commit b0023dc
  2. Merge pull request #77 from matlab-deep-learning/minp

    Add min-p sampling for `ollamaChat`
    ccreutzi authored Aug 20, 2024
    Commit d127953
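
    Min-p sampling is presumably exposed as a name-value argument on `ollamaChat`; the sketch below assumes the parameter is called `MinP` and uses an arbitrary model name.

    ```
    % Sketch: min-p sampling with ollamaChat. The MinP parameter name is assumed
    % from this pull request; "mistral" is just an example model.
    chat = ollamaChat("mistral", MinP=0.05);
    response = generate(chat, "Write a haiku about autumn.");
    ```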

Commits on Aug 23, 2024

  1. Commit 1c3de4b
  2. Update +llms/+utils/errorMessageCatalog.m

    Co-authored-by: MiriamScharnke <mscharnk@mathworks.com>
    ccreutzi and MiriamScharnke authored Aug 23, 2024
    Commit fd9bf75

Commits on Aug 26, 2024

  1. Merge pull request #79 from matlab-deep-learning/remove-from-empty-history
    
    Special error message for removing from empty history
    ccreutzi authored Aug 26, 2024
    Commit 9ce8f94
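
    Based on the pull request description, calling `removeMessage` on an empty `messageHistory` now raises a dedicated error. A sketch of what that might look like (the error text shown is an assumption):

    ```
    % Sketch: removing a message from an empty history. The exact error message
    % is assumed, not copied from the repository.
    history = messageHistory;
    try
        history = removeMessage(history, 1);
    catch err
        disp(err.message)   % e.g. "Cannot remove a message from an empty history."
    end
    ```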

Commits on Sep 3, 2024

  1. Commit 19effe4

Commits on Sep 10, 2024

  1. Use correct ModelName default

    `openAIChat.generate` needs to use the given `openAIChat`'s model as the default, not a static `"gpt-4o-mini"`.
    
    Fixes #80.
    ccreutzi committed Sep 10, 2024
    Commit cc43556
  2. Merge pull request #81 from matlab-deep-learning/fix-model-default

    Use correct `ModelName` default
    ccreutzi authored Sep 10, 2024
    Commit bb7a186
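
    In other words, `generate` should default to whatever model the chat object was constructed with. A sketch of the intended behavior (model names are illustrative):

    ```
    % Sketch: after this fix, generate uses the chat object's own model by default
    % instead of falling back to a hard-coded "gpt-4o-mini".
    chat = openAIChat(ModelName="gpt-4o");
    response = generate(chat, "Hello!");   % request is sent with model "gpt-4o"
    ```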

Commits on Sep 13, 2024

  1. Add support for o1-preview and o1-mini models (#9)

    Adding support for [the new o1-preview and o1-mini models](https://openai.com/index/introducing-openai-o1-preview/).
    
    Note that these two models no longer accept `max_tokens`, the API parameter we translate `MaxNumTokens` into; they require `max_completion_tokens` instead. Fortunately, the new name is also accepted by the older models.
    
    ```
    ================================================================================
    Error occurred in topenAIChat/canUseModel(ModelName=o1-preview) and it did not run to completion.
        ---------
        Error ID:
        ---------
        'llms:apiReturnedError'
        --------------
        Error Details:
        --------------
        Error using openAIChat/generate (line 259)
        Server returned error indicating: "Unsupported parameter: 'max_tokens' is not supported with this model. Use
        'max_completion_tokens' instead."
    
        Error in topenAIChat/canUseModel (line 100)
                    testCase.verifyClass(generate(openAIChat(ModelName=ModelName),"hi",MaxNumTokens=1),"string");
    ================================================================================
    ```
    ccreutzi committed Sep 13, 2024
    Commit 9bed1a8
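
    With that change in place, `MaxNumTokens` should work the same way for the new models. A sketch (model name and token limit are illustrative):

    ```
    % Sketch: MaxNumTokens with an o1 model. Under the hood the request now uses
    % max_completion_tokens, which the older models accept as well.
    chat = openAIChat(ModelName="o1-mini");
    response = generate(chat, "Summarize Hamlet in one sentence.", MaxNumTokens=200);
    ```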

Commits on Sep 16, 2024

  1. Commit 4101015

Commits on Sep 27, 2024

  1. Documentation: Reference Pages (#5)

    We have added a new directory `functions` to the `doc` directory. The `functions` folder contains documentation for these functions and objects:
    - openAIChat
    - azureChat
    - ollamaChat
    - generate
    - openAIFunction
    - addParameter
    - openAIImages
    - openAIImages.generate
    - edit
    - createVariation
    - messageHistory
    - addSystemMessage
    - addUserMessage
    - addUserMessageWithImages
    - addToolMessage
    - addResponseMessage
    - removeMessage
    MiriamScharnke authored and ccreutzi committed Sep 27, 2024
    Commit 57cf331