Model Parameter Values for LLM in SuiteScript 2.1
Discover how to customize LLM responses with SuiteScript 2.1's N/llm module and its adjustable model parameters.
The N/llm module in SuiteScript 2.1 introduces powerful capabilities for integrating large language models (LLMs) from Oracle Cloud Infrastructure (OCI), including Cohere Command A and OpenAI GPT-OSS. This feature allows developers to customize response generation by adjusting model parameters such as creativity, response length, and diversity of output.
What are Model Parameters?
When invoking methods in the N/llm module, you can specify various parameters to control the behavior and outputs of the LLM models. For example, adjusting the temperature parameter influences the randomness and creativity of generated text; higher values yield more varied outputs, while lower values produce more deterministic responses.
Supported Methods
You can supply model parameters to the following N/llm methods:
- llm.generateText(options)
- llm.generateText.promise(options)
- llm.generateTextStreamed(options)
- llm.generateTextStreamed.promise(options)
Accepted Model Parameters
The table below outlines the various parameters you can customize when using LLMs:
| Parameter | Accepted Ranges | Default Value |
|---|---|---|
| maxTokens | Cohere: 1 - 4,000; OpenAI: 1 - 16,000 | Cohere: 2,000; OpenAI: 2,000 |
| frequencyPenalty | Cohere: 0 - 1; OpenAI: -2 to 2 | Cohere: 0; OpenAI: 0 |
| presencePenalty | Cohere: 0 - 1; OpenAI: -2 to 2 | Cohere: 0; OpenAI: 0 |
| prompt | Cohere: up to 256,000 input tokens; OpenAI: up to 128,000 input tokens | N/A |
| temperature | Cohere: 0 - 1; OpenAI: 0 - 2 | Cohere: 0.2; OpenAI: 0.2 |
| topK | Cohere: 0 - 500; OpenAI: 1 - 100,000 | Both: 500 |
| topP | 0 - 1 | 0.7 |
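The ranges above differ by model family, so it can be useful to check values before making a call. Below is a small plain-JavaScript sketch of such a check; the range values come from the table, while the validateModelParameters helper itself is illustrative and not part of the N/llm API.

```javascript
// Accepted ranges per model family, transcribed from the table above.
const PARAMETER_RANGES = {
  cohere: {
    maxTokens: [1, 4000],
    frequencyPenalty: [0, 1],
    presencePenalty: [0, 1],
    temperature: [0, 1],
    topK: [0, 500],
    topP: [0, 1],
  },
  openai: {
    maxTokens: [1, 16000],
    frequencyPenalty: [-2, 2],
    presencePenalty: [-2, 2],
    temperature: [0, 2],
    topK: [1, 100000],
    topP: [0, 1],
  },
};

// Returns an array of error messages; an empty array means all values are in range.
function validateModelParameters(family, params) {
  const ranges = PARAMETER_RANGES[family];
  if (!ranges) throw new Error('Unknown model family: ' + family);
  const errors = [];
  for (const [name, value] of Object.entries(params)) {
    const range = ranges[name];
    if (!range) {
      errors.push(name + ' is not a recognized model parameter');
    } else if (value < range[0] || value > range[1]) {
      errors.push(name + ' must be between ' + range[0] + ' and ' + range[1]);
    }
  }
  return errors;
}

console.log(validateModelParameters('cohere', { temperature: 0.2, topK: 500 })); // []
console.log(validateModelParameters('openai', { temperature: 2.5 }));
// ['temperature must be between 0 and 2']
```

Note that the same parameter can be valid for one family and out of range for the other, for example frequencyPenalty of -1 is valid for OpenAI but not for Cohere.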
Who This Affects
This enhancement impacts the following roles and modules:
- Developers utilizing SuiteScript to customize LLM interactions.
- Data Scientists interested in fine-tuning AI model responses for specific applications.
- Administrators managing integrations within Oracle Cloud Infrastructure.
Key Takeaways
- The N/llm module empowers developers to customize responses from LLMs in SuiteScript.
- Various model parameters, including temperature and maxTokens, can be tailored for optimal output.
- Understanding how to adjust these parameters can greatly enhance the application of LLMs in business processes.
Frequently Asked Questions

- How do model parameter values by LLM interact with existing SuiteScript features?
- Are there specific permissions required to customize model parameter values?
- Do I need to enable a feature flag to use model parameter values by LLM in NetSuite 2026.1?
- Is the ability to customize model parameter values available in all editions of NetSuite 2026.1?
Weekly Update History

- Updated Model Parameter Values by LLM to include values for the OpenAI GPT-OSS model.
- Updated Model Parameter Values by LLM to include the input token limit for Cohere Command A (256,000 tokens).
- Updated Model Parameter Values by LLM to remove interdependency statements for the frequencyPenalty and presencePenalty parameters, which were confusing. This topic has also been moved to appear under N/llm Module instead of under llm.generateText(options).
