Parameters for LLM Options in SuiteScript
Explore parameters for LLM options in SuiteScript, including types and requirements to optimize AI responses.
TL;DR
This article details the parameters available for configuring options within the LLM methods in SuiteScript. These parameters allow you to modify how the language model processes inputs and provides outputs, enhancing the effectiveness of AI implementations.
What Are LLM Parameters?
In SuiteScript, particularly for the N/llm module, parameters are critical for fine-tuning the interaction with a Large Language Model (LLM). The parameters must be structured correctly to achieve the desired behavior from the AI, influencing both input and output during the invocation of methods.
Key Parameters Overview
Below is a summary of key parameters involved in LLM configurations:
| Parameter | Type | Required / Optional | Description |
|---|---|---|---|
| options.prompt | string | required | The main text input for the LLM. |
| options.chatHistory | llm.ChatMessage[] | optional | Previous interactions to consider. |
| options.documents | llm.Document[] | optional | Additional context for the model. Only for Cohere models. |
| options.modelFamily | string | optional | Specifies which LLM to use. Default is the Cohere Command A model. |
| options.modelParameters | Object | optional | Contains parameters for adjusting model behavior. |
| options.ociConfig | Object | optional | Configuration needed for OCI service access. |
| options.responseFormat | Object | optional | Use for structured JSON-formatted output. Only for Cohere models. |
Detailed Parameter Descriptions
options.prompt
The prompt is the core parameter: it is the input text the LLM responds to, and it must always be provided when calling the language model.
options.chatHistory
This optional parameter lets you include previous messages from a conversation, helping the model contextualize its responses. It is especially useful for conversational applications.
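As an illustrative sketch (not the official API surface), chat history entries can be modeled as objects with role and text properties, mirroring the llm.ChatMessage shape. Inside NetSuite you would typically reuse message objects from earlier N/llm responses; the literal objects below are an assumption for demonstration:

```javascript
// Prior turns of the conversation, oldest first.
// Role names (USER, CHATBOT) follow the common convention; verify against
// the llm.ChatRole enum in the N/llm documentation.
var chatHistory = [
    { role: 'USER', text: 'What is a scheduled script?' },
    { role: 'CHATBOT', text: 'A scheduled script runs server-side on a defined schedule.' }
];

// The history is passed alongside the new prompt so the model keeps context.
var options = {
    prompt: 'How often can it run?',
    chatHistory: chatHistory
};
```

Passing the history lets the model resolve references like "it" in the new prompt against earlier turns.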
options.documents
When working with Cohere models, this parameter allows for inclusion of external documents that provide additional context to improve the responses generated by the model.
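A minimal sketch of grounding documents, assuming the id-plus-data shape that llm.createDocument produces in NetSuite (the field names here are assumptions; verify against the N/llm documentation):

```javascript
// Grounding documents give the Cohere model source text to draw on.
// Outside NetSuite we model them as plain objects instead of calling
// llm.createDocument.
var documents = [
    { id: 'doc1', data: 'SuiteScript 2.1 supports arrow functions and const/let.' },
    { id: 'doc2', data: 'The N/llm module is available in server scripts only.' }
];

var options = {
    prompt: 'Which SuiteScript version supports arrow functions?',
    documents: documents
};
```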
options.modelFamily
Specifies which LLM family to use; if omitted, the default model is selected. Valid values are defined by the llm.ModelFamily enum described in the documentation.
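A hedged sketch of selecting a model family. The enum member and value below are assumptions for illustration only; check llm.ModelFamily in the N/llm documentation for the actual names. Outside NetSuite we stand in for the enum with a plain object:

```javascript
// Stand-in for llm.ModelFamily; the member name and value are hypothetical.
var ModelFamily = { COHERE_COMMAND_R: 'cohere.command-r' };

var options = {
    prompt: 'Summarize this sales order.',
    modelFamily: ModelFamily.COHERE_COMMAND_R
};
```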
options.modelParameters
This parameter details specific configurations:
- frequencyPenalty: Penalizes token frequency in outputs, discouraging repetition.
- maxTokens: Maximum length of generated content.
- presencePenalty: Encourages the inclusion of new tokens in the output.
- temperature: Controls randomness in responses; higher values allow for more creative outputs.
- topK and topP: Define limits on token selection to balance output quality and diversity.
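A minimal modelParameters sketch tying these fields together. The values shown are illustrative assumptions, not recommended defaults:

```javascript
// Example tuning for deterministic, factual output.
var modelParameters = {
    maxTokens: 500,         // upper bound on generated tokens
    temperature: 0.2,       // low randomness for consistent answers
    frequencyPenalty: 0.1,  // mildly discourage repeated tokens
    presencePenalty: 0,     // no extra push toward unseen tokens
    topK: 0,                // assumption: 0 disables top-k filtering
    topP: 0.75              // sample from the top 75% of probability mass
};

var options = {
    prompt: 'Draft a short customer follow-up email.',
    modelParameters: modelParameters
};
```

Raising temperature and topP tends to diversify wording, which suits creative drafting more than data extraction.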
options.ociConfig
These settings are required when supplying your own Oracle Cloud Infrastructure (OCI) credentials, covering items such as user identifiers and API signing keys.
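A hedged sketch of what an ociConfig object might contain. Every field name and value below is an assumption for illustration; verify the exact properties in the N/llm documentation, and never hard-code real private keys in scripts (store them as API secrets instead):

```javascript
// Hypothetical ociConfig shape; values are placeholders, not working OCIDs.
var ociConfig = {
    tenancyId: 'ocid1.tenancy.oc1..exampletenancy',
    userId: 'ocid1.user.oc1..exampleuser',
    compartmentId: 'ocid1.compartment.oc1..examplecompartment',
    fingerprint: 'custsecret_oci_key_fingerprint', // reference to a stored secret
    privateKey: 'custsecret_oci_private_key'       // reference to a stored secret
};

var options = {
    prompt: 'Hello',
    ociConfig: ociConfig
};
```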
Examples of Usage
Here’s an example of how to utilize parameters with the LLM API:
```javascript
var response = llm.generateText({
    prompt: "Hello, I am interested in learning more about SuiteScript.",
    responseFormat: {
        "type": "object",
        "required": ["message"],
        "properties": {
            "message": { "type": "string" }
        }
    }
});
```

This code snippet demonstrates how to structure the request, including the responseFormat parameter, which is especially useful when extracting specific data.
Who This Affects
This information is pertinent to:
- Developers implementing LLM features within SuiteScript.
- Administrators working with AI configurations in Oracle Cloud environments.
- Technical teams requiring detailed understanding of language model interactions.
Key Takeaways
- Understanding and utilizing LLM parameters is crucial for effective AI response generation.
- Key parameters influence the behavior of the language model and should be tailored to specific use cases.
- Proper configuration through OCI settings is mandatory for Oracle Cloud users.
Source: This article is based on Oracle's official NetSuite documentation.
Frequently Asked Questions
Do I need to enable a feature flag for using LLM text generation parameters in NetSuite 2024.1?
What permissions are required to access and utilize LLM parameters in SuiteScript 2.1?
How does using options.chatHistory affect the output of AI-generated responses?
Will using external documents with options.documents impact performance in NetSuite 2025.1?