
Conversation

@ericcurtin
Contributor

@ericcurtin ericcurtin commented Dec 23, 2025

The interactive mode now maintains conversation context across chat
exchanges. The readline-based interactive mode was enhanced to track
conversation history and pass it to the model on each request. A new
ChatWithMessagesContext method was added to support conversation history
while maintaining backward compatibility through the existing
ChatWithContext method. The conversation history uses the processed user
messages after file inclusions and image processing to ensure accuracy.
Additionally, the code now properly handles context cancellation by
returning the assistant response when cancelled. The basic interactive
mode fallback was removed since readline initialization is now required.
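
For readers skimming the diff, here is a minimal sketch of the loop described above. The names (ChatMessage, Chatter, ChatWithMessagesContext) are illustrative assumptions, and a plain stdin scanner stands in for the readline library the CLI actually uses; this is not the merged code.

```go
package chat

import (
	"bufio"
	"context"
	"fmt"
	"os"
	"strings"
)

// ChatMessage is an assumed OpenAI-style message; the real field names may differ.
type ChatMessage struct {
	Role    string // "user" or "assistant"
	Content string
}

// Chatter stands in for the CLI's desktop client.
type Chatter interface {
	// ChatWithMessagesContext is assumed to send the full history and
	// return the assistant's reply so the caller can record it.
	ChatWithMessagesContext(ctx context.Context, model string, history []ChatMessage) (string, error)
}

// InteractiveLoop shows the general shape: append the processed user message,
// send the whole history, then append the assistant reply for the next turn.
func InteractiveLoop(ctx context.Context, client Chatter, model string) error {
	var history []ChatMessage
	scanner := bufio.NewScanner(os.Stdin)
	for {
		fmt.Print("> ")
		if !scanner.Scan() {
			return scanner.Err()
		}
		userMsg := strings.TrimSpace(scanner.Text())
		if userMsg == "" {
			continue
		}
		history = append(history, ChatMessage{Role: "user", Content: userMsg})

		reply, err := client.ChatWithMessagesContext(ctx, model, history)
		if reply != "" {
			// Keep whatever the assistant produced, even if the request was
			// cancelled mid-stream, as the description above notes.
			history = append(history, ChatMessage{Role: "assistant", Content: reply})
		}
		if err != nil {
			return err
		}
	}
}
```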

@gemini-code-assist
Contributor

Summary of Changes

Hello @ericcurtin, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the interactive chat experience by implementing conversation history. Users can now engage in more natural, multi-turn dialogues with the assistant, as previous messages are preserved and provided as context for subsequent responses. This change involves introducing a new API method to handle the message history and integrating it into the existing interactive chat interfaces.

Highlights

  • Conversation History: Interactive chat modes now maintain conversation context across exchanges, allowing the assistant to reference previous messages in the current session.
  • New API Method: A new API method, ChatWithMessagesContext, has been introduced to accept conversation history and return the assistant's response for tracking (see the signature sketch after this list).
  • Interactive Mode Updates: Both the readline-based and basic interactive chat modes have been updated to utilize the new ChatWithMessagesContext functionality for enhanced multi-turn conversations.
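
A plausible shape for the two entry points is sketched below. The parameter lists and the ChatMessage type are assumptions for illustration, not the signatures actually merged in cmd/cli/desktop/desktop.go.

```go
package chat

import "context"

// ChatMessage is an assumed message shape (role plus content).
type ChatMessage struct {
	Role    string
	Content string
}

// DesktopClient sketches the assumed relationship between the two methods.
type DesktopClient interface {
	// Original single-prompt call, kept for backward compatibility.
	ChatWithContext(ctx context.Context, model, prompt string) error

	// New call: takes the full conversation history and returns the
	// assistant's reply so the caller can append it for the next turn.
	ChatWithMessagesContext(ctx context.Context, model string, messages []ChatMessage) (string, error)
}
```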


Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

  • Code Review (/gemini review): Performs a code review for the current pull request in its current state.
  • Pull Request Summary (/gemini summary): Provides a summary of the current pull request in its current state.
  • Comment (@gemini-code-assist): Responds in comments when explicitly tagged, both in pull request comments and review comments.
  • Help (/gemini help): Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder at the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its suggestions are incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in sharing feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check its output and use code with caution.

@ericcurtin
Contributor Author

$ make -C cmd/cli/ && cmd/cli/model-cli run --openaiurl http://127.0.0.1:8080/v1 gemma3
Building model-cli...
go build -ldflags="-s -w -X github.com/docker/model-runner/cmd/cli/desktop.Version=v1.0.6-36-gec4164d4-dirty" -o model-cli .
> Hi my name is Eric
Hi Eric! It’s nice to meet you. How are you doing today? Is there anything you’d like to chat about or do?
> What is my name?
Your name is Eric! 😊

You told me that’s your name.

Is there anything else you’d like to talk about?
>

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request successfully adds conversation history support to the interactive chat modes. The changes are logical and correctly implement the desired functionality by introducing a new ChatWithMessagesContext API method and updating the interactive modes to use it. My review focuses on improving maintainability by addressing code duplication. I've identified significant duplication in cmd/cli/desktop/desktop.go where the new ChatWithMessagesContext function largely copies the existing ChatWithContext. I've also noted smaller-scale duplication in cmd/cli/commands/run.go. Addressing these points will make the codebase cleaner and easier to maintain.
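
One common way to fold that duplication away is to have the old method delegate to the new one. The sketch below uses the same assumed names and signatures as the earlier sketches; it is not the code in the repository.

```go
package chat

import "context"

// ChatMessage and Client are assumed stand-ins for the CLI's real types.
type ChatMessage struct {
	Role    string
	Content string
}

type Client struct{}

// ChatWithMessagesContext is the assumed history-aware method; the body is
// stubbed out here, whereas the real one performs the chat request.
func (c *Client) ChatWithMessagesContext(ctx context.Context, model string, messages []ChatMessage) (string, error) {
	// ... send the full message history to the model and stream the reply ...
	return "", nil
}

// ChatWithContext delegates to the history-aware method by wrapping the single
// prompt in a one-element history, so the request logic lives in one place.
func (c *Client) ChatWithContext(ctx context.Context, model, prompt string) error {
	_, err := c.ChatWithMessagesContext(ctx, model, []ChatMessage{
		{Role: "user", Content: prompt},
	})
	return err
}
```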

sourcery-ai[bot]

This comment was marked as resolved.

@ericcurtin ericcurtin force-pushed the fix-conversation-context branch 3 times, most recently from 515d22a to 742411d on December 23, 2025 at 12:05
@ericcurtin ericcurtin force-pushed the fix-conversation-context branch from 742411d to 5331e19 on December 23, 2025 at 12:41
@ericcurtin ericcurtin force-pushed the fix-conversation-context branch from 5331e19 to e58a48c on December 23, 2025 at 12:45
@ericcurtin ericcurtin merged commit 19fa5e6 into main Dec 23, 2025
13 checks passed
@ericcurtin ericcurtin deleted the fix-conversation-context branch December 23, 2025 12:56