Model Parameter Values for LLM Integration in SuiteScript 2.1

Customize LLM responses in SuiteScript 2.1 with model parameter values like temperature and maxTokens.


TL;DR: This article explains how to use model parameter values with the N/llm module in SuiteScript 2.1. These parameters let developers customize responses from large language models (LLMs) such as Cohere Command A and OpenAI GPT-OSS hosted in Oracle Cloud Infrastructure (OCI).

What is the N/llm Module?

The N/llm module is a critical component of SuiteScript 2.1 that allows developers to interact with large language models to generate text-based responses. With support for models hosted in Oracle Cloud Infrastructure, this module offers advanced capabilities for integrating generative AI into NetSuite applications.

Customizing Model Responses with Parameters

When calling certain methods within the N/llm module, you can provide model parameter values to tailor how the LLM generates its output. Here are some key parameters you can adjust:

Key Model Parameters

Parameter        | Accepted Range                                                               | Default
---------------- | ---------------------------------------------------------------------------- | -------
maxTokens        | Cohere Command A: 1 to 4000; OpenAI GPT-OSS: 1 to 16000                      | 2000
frequencyPenalty | Cohere Command A: 0 to 1; OpenAI GPT-OSS: -2 to 2                            | 0
presencePenalty  | Cohere Command A: 0 to 1; OpenAI GPT-OSS: -2 to 2                            | 0
prompt           | Cohere Command A: up to 256,000 tokens; OpenAI GPT-OSS: up to 128,000 tokens | N/A
temperature      | Cohere Command A: 0 to 1; OpenAI GPT-OSS: 0 to 2                             | 0.2
topK             | Cohere Command A: 0 to 500; OpenAI GPT-OSS: 1 to 100,000                     | 500
topP             | 0 to 1                                                                       | 0.7

You can use these parameters to shape the model's output. For example, temperature controls the randomness of the response: higher values (up to 1 for Cohere Command A and 2 for OpenAI GPT-OSS) produce more diverse, creative output, while lower values make responses more deterministic.
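As a sketch of how these parameters are passed, the snippet below uses the modelParameters option shape described in the Oracle docs. The stub llm object stands in for the real N/llm module, which is only available inside NetSuite's SuiteScript runtime, so this example can be run and inspected outside NetSuite:

```javascript
// Stub standing in for the N/llm module (available only inside NetSuite).
// The real llm.generateText sends the prompt and modelParameters to the
// hosted LLM; this stub just echoes what it received so the option shape
// can be inspected.
const llm = {
  generateText: function (options) {
    return {
      text: '[stubbed response]',
      receivedParameters: options.modelParameters
    };
  }
};

const response = llm.generateText({
  prompt: 'Summarize the latest sales order for review.',
  modelParameters: {
    maxTokens: 1000,    // Cohere Command A accepts 1 to 4000 (default 2000)
    temperature: 0.2,   // lower values give more deterministic output
    topK: 3,
    topP: 0.7,
    frequencyPenalty: 0.2,
    presencePenalty: 0
  }
});

console.log(response.receivedParameters.temperature); // 0.2
```

Inside NetSuite, the same options object would be passed to the real llm.generateText(options); the values shown here are illustrative, not recommendations.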

Supported Methods

The following methods within the N/llm module accept these model parameters:

  • llm.generateText(options)
  • llm.generateText.promise(options)
  • llm.generateTextStreamed(options)
  • llm.generateTextStreamed.promise(options)

When using these methods, provide options appropriate for your use case. For example, in llm.generateText(options), the options.prompt parameter is required unless options.toolResults is provided.
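Because the accepted ranges differ by model (see the table above), it can help to validate values before calling one of these methods. The helper below is hypothetical, not part of N/llm; it simply clamps each parameter to the range for the chosen model, using the table's values:

```javascript
// Hypothetical helper (not part of the N/llm module) that clamps model
// parameter values to the accepted ranges documented for each model.
const RANGES = {
  'command-a': {
    maxTokens: [1, 4000], temperature: [0, 1], topK: [0, 500],
    topP: [0, 1], frequencyPenalty: [0, 1], presencePenalty: [0, 1]
  },
  'gpt-oss': {
    maxTokens: [1, 16000], temperature: [0, 2], topK: [1, 100000],
    topP: [0, 1], frequencyPenalty: [-2, 2], presencePenalty: [-2, 2]
  }
};

function clampModelParameters(model, params) {
  const ranges = RANGES[model];
  const clamped = {};
  for (const [name, value] of Object.entries(params)) {
    const [min, max] = ranges[name];
    clamped[name] = Math.min(max, Math.max(min, value));
  }
  return clamped;
}

console.log(clampModelParameters('command-a', { temperature: 1.5, maxTokens: 9000 }));
// → { temperature: 1, maxTokens: 4000 }
```

Note that the same temperature of 1.5 would be out of range for Cohere Command A but valid for OpenAI GPT-OSS, which is why the clamping is model-specific.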

Conclusion

The N/llm module offers powerful capabilities for integrating generative AI into NetSuite applications. By customizing model parameter values, developers can ensure that the LLM produces responses suited to specific business needs.

Who This Affects

  • Developers: Those implementing AI features in NetSuite.
  • Administrators: Those configuring SuiteScript for generative AI applications.

Key Takeaways

  • Customize LLM responses with model parameters like temperature and maxTokens.
  • Methods like generateText support these configurations for tailored outputs.
  • Understanding default values and accepted ranges is crucial for effective integration.

Frequently Asked Questions (4)

How do model parameter values by LLM interact with existing SuiteScript features?
The article does not specify how the new model parameter values interact with existing SuiteScript features. You may need to refer to the detailed SuiteScript documentation for compatibility and integration details.
Are there specific permissions required to customize model parameter values?
The article does not mention the permissions required to customize model parameter values. Reviewing the release notes or documentation may provide additional insight into necessary permissions.
Do I need to enable a feature flag to use model parameter values by LLM in NetSuite 2026.1?
The article does not specify whether a feature flag needs to be enabled to use the model parameter values by LLM. Checking in the NetSuite system or the detailed release notes may confirm this.
Is the ability to customize model parameter values available in all editions of NetSuite 2026.1?
The article does not state whether the customizable model parameter values feature is available in all editions. Further investigation in NetSuite's edition-specific documentation would be required to confirm this.

Weekly Update History (3)

SuiteScript (updated)

Updated Model Parameter Values by LLM to include values for the OpenAI GPT-OSS model.

SuiteScript (updated)

Updated Model Parameter Values by LLM to include the input token limit for Cohere Command A (256,000 tokens).

SuiteScript (updated)

Updated Model Parameter Values by LLM to remove interdependency statements for the frequencyPenalty and presencePenalty parameters, which were confusing. This topic has also been moved to appear under N/llm Module instead of under llm.generateText(options).
Source: Model Parameter Values by LLM Oracle NetSuite Help Center. This article was generated from official Oracle documentation and enriched with additional context and best practices.
