Parameters for SuiteScript Generative AI Integration
Explore the required and optional parameters for SuiteScript's LLM integration, enhancing AI interactions in NetSuite.
TL;DR
This article details the parameters for configuring SuiteScript integrations with Large Language Models (LLMs). These parameters let you tailor AI responses, optimize usage, and ensure the generation of relevant content.
What Are the Parameters?
The following table outlines the parameters used to configure LLM interactions within SuiteScript. Each parameter is categorized by its type and whether it is required or optional.
| Parameter | Type | Required / Optional | Description |
|---|---|---|---|
| options.prompt | string | Required | The prompt for the LLM. |
| options.chatHistory | llm.ChatMessage[] | Optional | An array of chat messages that provides conversation context for the interaction. |
| options.documents | llm.Document[] | Optional | A list of documents that provides additional context for generating responses. Supported only for Cohere models. |
| options.modelFamily | enum | Optional | Specifies the LLM to use. Defaults to the Cohere Command A model if not specified. |
| options.modelParameters | Object | Optional | Contains the sub-parameters below, which customize model behavior. |
| options.modelParameters.frequencyPenalty | number | Optional | A penalty assigned to tokens that frequently appear in the prompt or previous outputs. Higher values apply stronger penalties. |
| options.modelParameters.maxTokens | number | Optional | The maximum number of tokens the LLM can generate; a word averages about 3 tokens. |
| options.modelParameters.presencePenalty | number | Optional | Similar to the frequency penalty, but applied equally to all previously used tokens regardless of frequency. |
| options.modelParameters.temperature | number | Optional | Controls randomness in responses. Lower values yield more factual results; higher values allow more creative responses. |
| options.modelParameters.topK | number | Optional | Limits how many of the most likely tokens are considered during generation. |
| options.modelParameters.topP | number | Optional | Sets a cumulative probability threshold for token consideration; operates in conjunction with topK. |
| options.ociConfig | Object | Optional | Configuration for unlimited usage through the OCI Generative AI service. Required only when accessing the LLM through your own Oracle Cloud account. |
| options.ociConfig.compartmentId | string | Optional | The OCID of the compartment. |
| options.ociConfig.endpointId | string | Optional | Required only when using a custom OCI dedicated AI cluster. |
| options.ociConfig.fingerprint | string | Optional | The fingerprint of the OCI user's public API key; must be supplied as a NetSuite API secret. |
| options.ociConfig.privateKey | string | Optional | The private key of the OCI user; must also be supplied as a NetSuite API secret. |
| options.ociConfig.tenancyId | string | Optional | The OCID of the tenancy. |
| options.ociConfig.userId | string | Optional | The OCID of the user. |
| options.preamble | string | Optional | An initial message that guides the LLM's overall behavior and tone. |
| options.safetyMode | string | Optional | The safety mode for content generation; applicable to Cohere models only. Defaults to strict mode if not specified. |
| options.timeout | number | Optional | The request timeout in milliseconds. Defaults to 30,000. |
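To make these parameters concrete, here is a minimal sketch of assembling an options object for an N/llm text generation call. The prompt text and all parameter values are illustrative only; the generateText call itself runs only inside a SuiteScript 2.1 script, so it is shown as a comment.

```javascript
// Sketch: assembling the options object for an N/llm text generation call.
// Only the object construction runs outside NetSuite; the generateText
// call (commented out) requires a SuiteScript 2.1 server script.
const options = {
  prompt: 'Summarize the key risks in this vendor contract.', // required
  modelParameters: {        // all sub-parameters are optional
    maxTokens: 500,         // cap on generated tokens (~3 tokens per word)
    temperature: 0.2,       // low value favors factual output
    topK: 3,                // consider only the 3 likeliest tokens
    topP: 0.7,              // cumulative probability cutoff, works with topK
    frequencyPenalty: 0.1,  // discourage frequently repeated tokens
    presencePenalty: 0      // no penalty for mere token presence
  },
  timeout: 30000            // request timeout in ms (the documented default)
};

// Inside a SuiteScript 2.1 script with N/llm loaded as `llm`:
// const response = llm.generateText(options);
// log.debug('LLM response', response.text);

console.log(Object.keys(options).join(','));
// → prompt,modelParameters,timeout
```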
Key Considerations
- Parameter Configuration: Properly setting options.prompt is crucial for successful LLM outputs.
- Model Parameter Implications: Adjusting frequencyPenalty, presencePenalty, temperature, topK, and topP significantly affects the style and relevance of the generated output.
- OCI Configuration: When accessing the LLM through an Oracle Cloud account, ensure the OCI configuration is set up correctly for optimal performance.
- Safety Modes: Choose appropriate safety modes depending on the application context, especially for user-facing interactions.
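As an example of supplying conversational context, the sketch below builds an options object with chatHistory entries. It assumes the role/text shape described for llm.ChatMessage; in a real script these messages would typically come from (or be created for) earlier N/llm interactions.

```javascript
// Sketch: providing conversation context via options.chatHistory.
// The role/text message shape is an assumption based on the documented
// llm.ChatMessage type; values here are illustrative only.
const chatHistory = [
  { role: 'USER', text: 'Which subsidiaries missed their Q2 targets?' },
  { role: 'CHATBOT', text: 'Two subsidiaries missed their Q2 targets: EU and APAC.' }
];

const options = {
  prompt: 'Suggest follow-up actions for those subsidiaries.', // required
  chatHistory: chatHistory // optional context from the prior exchange
};

// const response = llm.generateText(options); // NetSuite only
console.log(options.chatHistory.length);
// → 2
```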
Key Takeaways
- Required parameters include options.prompt; the others are optional but enhance LLM interaction.
- Customizations via options.modelParameters influence how the LLM generates responses.
- Proper OCI configuration is essential for utilizing Oracle Cloud's Generative AI service effectively.
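For the Oracle Cloud path, the sketch below shows the shape of the ociConfig block. Every OCID is a placeholder, and the fingerprint and privateKey values are hypothetical NetSuite API secret IDs rather than raw credentials, per the requirement that both be supplied as secrets.

```javascript
// Sketch: the ociConfig block for routing requests through your own
// Oracle Cloud account. All OCIDs are placeholders; fingerprint and
// privateKey reference assumed NetSuite API secret IDs, not raw values.
const ociConfig = {
  tenancyId: 'ocid1.tenancy.oc1..EXAMPLE',         // placeholder OCID
  compartmentId: 'ocid1.compartment.oc1..EXAMPLE', // placeholder OCID
  userId: 'ocid1.user.oc1..EXAMPLE',               // placeholder OCID
  fingerprint: 'custsecret_oci_fingerprint',       // assumed secret ID
  privateKey: 'custsecret_oci_private_key'         // assumed secret ID
  // endpointId is needed only for a custom OCI dedicated AI cluster
};

const options = {
  prompt: 'Draft a renewal reminder email.', // required
  ociConfig: ociConfig                       // enables unlimited OCI usage
};

console.log('fingerprint' in options.ociConfig);
// → true
```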
Source: This article is based on Oracle's official NetSuite documentation.
Frequently Asked Questions
Do I need to enable a feature flag for SuiteScript 2.1 LLM parameters?
What permissions are required to use the N/llm module for LLM functionality?
How does the options.documents parameter interact with the LLM's response generation?
What happens if I do not specify the options.modelFamily parameter?
