Package edu.tufts.hrilab.llm
Class LLMComponent
java.lang.Object
edu.tufts.hrilab.diarc.DiarcComponent
edu.tufts.hrilab.llm.LLMComponent
-
Field Summary
Fields inherited from class edu.tufts.hrilab.diarc.DiarcComponent:
executionLoopCycleTime, shouldRunExecutionLoop
-
Constructor Summary
Constructors:
LLMComponent()
Method Summary
Modifier and TypeMethodDescriptionprotected List<org.apache.commons.cli.Option>
Command line options available in sub-class.chatCompletion
(Symbol prompt) Returns a Completion object for chat-based completions, based on the specified prompt and set service.chatCompletion
(Symbol model, Symbol prompt) Returns a Completion object for chat-based completions, based on the specified prompt and model using the set service.chatCompletion
(Symbol model, Chat chat) Generates a Completion object by passing a Chat object to the appropriate chat completion method based on the currently set service.chatCompletion
(Chat chat) Generates a Completion object by passing a Chat object to the appropriate chat completion method based on the currently set service.chatCompletion
(String prompt) chatCompletion
(String model, String prompt) chatCompletion
(List<Message> messages) Generates a chat completion for a list of messages using the selected service.completion
(Symbol model, Prompt prompt) completion
(Prompt prompt) Returns a Completion object for the provided prompt based on the specified service.completion
(String prompt) completion
(String model, Prompt prompt) Returns a Completion object for the provided prompt based on the specified service and model.completion
(String model, String prompt) protected void
parseArgs
(org.apache.commons.cli.CommandLine cmdLine) Called directly after construction to pass runtime values that will override default values.void
setLLMEndpoint
(Symbol endpoint) Sets the LLM endpoint (URL).void
setLLMEndpoint
(String endpoint) void
setLLMModel
(Symbol model) Sets the language model to use for the language learning model service.void
setLLMModel
(String modelStr) void
setLLMService
(Symbol service) Sets the LLM service to the given value.void
setLLMService
(String serviceStr) Sets the LLM service: [openai, llama].void
setLLMStopWords
(String[] stopWords) Sets the stop words for the LLM to terminate token stream.void
setLLMTemperature
(float temperatureFloat) Sets the temperature for the LLM.Methods inherited from class edu.tufts.hrilab.diarc.DiarcComponent
createInstance, createInstance, createInstance, createInstance, executionLoop, getMyGroupConstraints, getMyGroups, getMyService, getMyServices, init, main, shutdown, shutdownComponent
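A typical configure-then-call sequence against this API can be sketched as below. `LLMComponentStub` is a hypothetical stand-in that only mirrors the documented setter and `chatCompletion(String prompt)` signatures so the sequence compiles; the real class lives in DIARC, its chat completion methods return a `Completion` object rather than a `String`, and the service, model, and endpoint values here are purely illustrative.

```java
// Hypothetical stand-in for LLMComponent, mirroring only the documented
// method signatures; the real DIARC class performs actual HTTP requests and
// returns a Completion object from its chatCompletion methods.
import java.util.Locale;

class LLMComponentStub {
    private String service = "openai";   // default service is an assumption
    private String model = "unset";
    private String endpoint = "unset";
    private float temperature = 1.0f;    // default temperature is an assumption

    void setLLMService(String serviceStr) {
        // Documented services are [openai, llama].
        service = serviceStr.toLowerCase(Locale.ROOT);
    }

    void setLLMModel(String modelStr) { model = modelStr; }

    void setLLMEndpoint(String endpoint) { this.endpoint = endpoint; }

    void setLLMTemperature(float temperatureFloat) { temperature = temperatureFloat; }

    // Stand-in for chatCompletion(String prompt); returns a String for brevity.
    String chatCompletion(String prompt) {
        return "[" + service + "/" + model + "] completion for: " + prompt;
    }
}

public class Main {
    public static void main(String[] args) {
        LLMComponentStub llm = new LLMComponentStub();
        llm.setLLMService("llama");                  // one of [openai, llama]
        llm.setLLMModel("llama-2-7b-chat");          // model name is illustrative
        llm.setLLMEndpoint("http://localhost:8080"); // endpoint URL is illustrative
        llm.setLLMTemperature(0.2f);
        System.out.println(llm.chatCompletion("Hello"));
    }
}
```

The same sequence applies with the `Symbol`-typed overloads, which DIARC components expose as callable services.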
-
Field Details
-
service
-
model
-
temperature
public float temperature
-
-
Constructor Details
-
LLMComponent
public LLMComponent()
-
-
Method Details
-
additionalUsageInfo
protected List<org.apache.commons.cli.Option> additionalUsageInfo()
Description copied from class: DiarcComponent
Command line options available in sub-class. This should be paired with a parseArgs implementation.
Overrides:
additionalUsageInfo in class DiarcComponent
Returns:
the command line options available in this sub-class
-
parseArgs
protected void parseArgs(org.apache.commons.cli.CommandLine cmdLine)
Description copied from class: DiarcComponent
Called directly after construction to pass runtime values that will override default values. This should parse all the options that additionalUsageInfo provides.
Overrides:
parseArgs in class DiarcComponent
-
setLLMService
public void setLLMService(Symbol service)
Sets the LLM service to the given value.
Parameters:
service - a string representing the LLM service
-
setLLMService
public void setLLMService(String serviceStr)
Sets the LLM service: [openai, llama].
Parameters:
serviceStr - the LLM service to set
-
setLLMEndpoint
public void setLLMEndpoint(Symbol endpoint)
Sets the LLM endpoint (URL).
Parameters:
endpoint - the endpoint to set
-
setLLMEndpoint
public void setLLMEndpoint(String endpoint)
-
setLLMModel
public void setLLMModel(Symbol model)
Sets the language model to use for the LLM service.
Parameters:
model - the name of the model to set
-
setLLMModel
public void setLLMModel(String modelStr)
-
setLLMStopWords
public void setLLMStopWords(String[] stopWords)
Sets the stop words the LLM uses to terminate the token stream.
Parameters:
stopWords - an array of stop words to set
-
setLLMTemperature
public void setLLMTemperature(float temperatureFloat)
Sets the temperature for the LLM.
Parameters:
temperatureFloat - the temperature value to set
-
completion
public Completion completion(Prompt prompt)
Returns a Completion object for the provided prompt based on the specified service.
Parameters:
prompt - the prompt to generate a completion for
Returns:
a Completion object for the provided prompt
-
completion
-
completion
public Completion completion(String model, Prompt prompt)
Returns a Completion object for the provided prompt based on the specified service and model.
Parameters:
model - the model to use when generating a completion
prompt - the prompt to generate a completion for
Returns:
a Completion object for the provided prompt
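The contrast between `completion(Prompt prompt)` and `completion(String model, Prompt prompt)` is that the one-argument overload falls back to the component's currently set model, while the two-argument overload selects a model per call. This can be sketched as follows; `Prompt`, `Completion`, and the default model name here are minimal hypothetical stand-ins assumed for illustration, not the DIARC types.

```java
// Minimal stand-ins for the DIARC Prompt and Completion types (assumed shapes).
record Prompt(String text) {}

record Completion(String model, String text) {}

class CompletionSketch {
    private String model = "default-model"; // stands in for the set model

    // One-argument overload: uses the currently set model.
    Completion completion(Prompt prompt) {
        return completion(model, prompt);
    }

    // Two-argument overload: overrides the set model for this one request.
    Completion completion(String model, Prompt prompt) {
        return new Completion(model, "completion for: " + prompt.text());
    }
}

public class Main {
    public static void main(String[] args) {
        CompletionSketch c = new CompletionSketch();
        Completion a = c.completion(new Prompt("plan a route"));
        Completion b = c.completion("other-model", new Prompt("plan a route"));
        System.out.println(a.model() + " vs " + b.model());
    }
}
```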
-
completion
-
completion
-
chatCompletion
public Completion chatCompletion(Symbol prompt)
Returns a Completion object for chat-based completions, based on the specified prompt and set service.
Parameters:
prompt - the prompt to be used for the chat completion
Returns:
a Completion object containing the completed text and any additional data returned by the API
-
chatCompletion
-
chatCompletion
public Completion chatCompletion(Symbol model, Symbol prompt)
Returns a Completion object for chat-based completions, based on the specified prompt and model using the set service.
Parameters:
model - the model to use for the chat completion request
prompt - the prompt to be used for the chat completion
Returns:
a Completion object containing the chat completion response
-
chatCompletion
-
chatCompletion
public Completion chatCompletion(List<Message> messages)
Generates a chat completion for a list of messages using the selected service.
Parameters:
messages - the list of previous messages in the conversation
Returns:
a Completion object containing the chat completion response
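Assembling the conversation history that `chatCompletion(List<Message>)` expects can be sketched as below. `Message` here is a minimal role/content stand-in assumed from the usual chat-completion request shape; the real DIARC `Message` type may differ, and the real method returns a `Completion` object rather than a `String`.

```java
// Minimal stand-in for the DIARC Message type (assumed role/content shape).
import java.util.List;
import java.util.stream.Collectors;

record Message(String role, String content) {}

class ChatSketch {
    // Stand-in: flattens the conversation the way a chat request payload would.
    String chatCompletion(List<Message> messages) {
        return messages.stream()
                .map(m -> m.role() + ": " + m.content())
                .collect(Collectors.joining("\n"));
    }
}

public class Main {
    public static void main(String[] args) {
        List<Message> conversation = List.of(
                new Message("system", "You are a helpful robot."),
                new Message("user", "Where is the charging dock?"));
        System.out.println(new ChatSketch().chatCompletion(conversation));
    }
}
```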
-
chatCompletion
public Completion chatCompletion(Chat chat)
Generates a Completion object by passing a Chat object to the appropriate chat completion method based on the currently set service.
Parameters:
chat - the Chat object to use for chat completion
Returns:
a Completion object containing the chat completion response
-
chatCompletion
public Completion chatCompletion(Symbol model, Chat chat)
Generates a Completion object by passing a Chat object to the appropriate chat completion method based on the currently set service.
Parameters:
model - the model to use for the chat completion request
chat - the Chat object to use for chat completion
Returns:
a Completion object containing the chat completion response
-