Overview

An object that represents a conversation with an agent.

Members

Member values include the conversation contents as messages, the agent being talked to, the agent's temperature, and the step the agent is on in the development process.

Unknown members: branches: dict. From its usage, it appears to be a dictionary of conversation checkpoints used to stash valid states and return to them if an error state is hit.

Restoring a known-good state from branches also rolls back any files the agent modified in the interim (replace_files).
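The checkpoint behavior described above could look something like the following. This is a minimal sketch, not the actual implementation: the member names (messages, files, branches) and the branch/restore method names are assumptions based on the observed usage.

```python
import copy


class Conversation:
    """Hypothetical sketch of the conversation object described in these notes."""

    def __init__(self, agent, temperature=0.7, step=0):
        self.agent = agent
        self.temperature = temperature
        self.step = step
        self.messages = []   # conversation contents
        self.files = {}      # files the agent has written: path -> contents
        self.branches = {}   # checkpoint name -> saved (messages, files) state

    def branch(self, name):
        # Stash a deep copy of the current state so we can return to it later.
        self.branches[name] = (copy.deepcopy(self.messages),
                               copy.deepcopy(self.files))

    def restore(self, name):
        # Roll back both the message history and any files modified since
        # the checkpoint (the replace_files behavior noted above).
        messages, files = self.branches[name]
        self.messages = copy.deepcopy(messages)
        self.files = copy.deepcopy(files)
```

The deep copies matter: without them, later edits to messages or files would silently mutate the stashed checkpoint as well.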

send_message

Sends a message in the conversation. Takes a prompt path, data to be interpolated into the prompt template, and a series of function definitions to be attached to the request for the model to use.

If functions are passed, it parses the response as a JSON object; otherwise it treats the response as plain text.
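A rough reconstruction of that flow is sketched below. The signature, the use of string.Template for interpolation, and the agent.complete call are all guesses about the real API, shown only to make the described behavior concrete.

```python
import json
import string


def send_message(conversation, prompt_path, data, functions=None):
    """Hypothetical sketch of send_message; names and signature are assumptions."""
    # Load the prompt template and interpolate the supplied data into it.
    with open(prompt_path) as f:
        template = string.Template(f.read())
    prompt = template.substitute(data)
    conversation.messages.append({"role": "user", "content": prompt})

    # Attach any function definitions to the request, then call the model.
    response = conversation.agent.complete(
        messages=conversation.messages,
        temperature=conversation.temperature,
        functions=functions or [],
    )

    # With functions attached, the reply is structured: parse it as JSON.
    # Otherwise, return the raw text.
    if functions:
        return json.loads(response)
    return response
```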

continuous_conversation

This initializes a looped conversation with the agent. As long as the agent does not return a specific sentinel constant (EVERYTHING_CLEAR), it continues to reprompt the agent to generate more content.

So, given an initial prompt, it generates the first batch of content. If generation is not finished, the EVERYTHING_CLEAR constant will not appear in the response. It then prompts the user for anything they would like to add.

If the user has nothing to add, it treats the response from the LLM as valid and adds it to a list of accepted messages. The loop then continues.
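The loop described above can be sketched roughly as follows. The sentinel value, the helper parameters (get_user_input, send), and the discard-on-feedback behavior are all inferred from these notes rather than taken from the real code.

```python
EVERYTHING_CLEAR = "EVERYTHING_CLEAR"  # sentinel constant assumed from the notes


def continuous_conversation(conversation, initial_prompt, get_user_input, send):
    """Hypothetical sketch of the looped conversation described above."""
    accepted = []
    response = send(conversation, initial_prompt)
    while EVERYTHING_CLEAR not in response:
        feedback = get_user_input("Anything to add? (blank to continue) ")
        if feedback:
            # User feedback: the current response is discarded and the
            # agent is reprompted with the feedback.
            response = send(conversation, feedback)
        else:
            # No feedback: accept the response and ask the agent to continue.
            accepted.append(response)
            response = send(conversation, "continue")
    return accepted
```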

Question

It is interesting that if the user has feedback for a given response, the contents of that response are discarded entirely. My guess is that this prevents duplicate data from being generated, at the cost of losing any unique values produced on the first pass.