Parameters for SuiteScript Generative AI Integration

Explore the required and optional parameters for SuiteScript's LLM integration, enhancing AI interactions in NetSuite.


TL;DR

This article details the parameters for configuring SuiteScript integrations with Large Language Models (LLMs). These parameters let you tailor AI responses, control usage, and keep the generated content relevant.

What Are the Parameters?

The following table outlines the parameters used to configure LLM interactions within SuiteScript. Each parameter is categorized by its type and whether it is required or optional.

| Parameter | Type | Required / Optional | Description |
| --- | --- | --- | --- |
| options.prompt | string | Required | The prompt for the LLM. |
| options.chatHistory | llm.ChatMessage[] | Optional | An array of chat messages that provide conversational context during interactions. |
| options.documents | llm.Document[] | Optional | A list of documents that provide additional context for generating responses. Supported only for Cohere models. |
| options.modelFamily | enum | Optional | Specifies the LLM to use. Defaults to the Cohere Command A model if not specified. |
| options.modelParameters | Object | Optional | Contains the sub-parameters below to customize model behavior. |
| options.modelParameters.frequencyPenalty | number | Optional | A penalty assigned to tokens that appear frequently in the prompt or previous outputs. Higher values apply stronger penalties. |
| options.modelParameters.maxTokens | number | Optional | The maximum number of tokens the LLM can generate; a word averages about 3 tokens. |
| options.modelParameters.presencePenalty | number | Optional | Similar to the frequency penalty, but applied equally to all previously used tokens regardless of how often they appear. |
| options.modelParameters.temperature | number | Optional | Controls randomness in responses. Lower values yield more factual results; higher values allow more creative responses. |
| options.modelParameters.topK | number | Optional | Limits the number of highest-probability tokens considered during generation. |
| options.modelParameters.topP | number | Optional | Sets a probability threshold for token consideration; operates in conjunction with topK. |
| options.ociConfig | Object | Optional | Configuration details for unlimited usage through the OCI Generative AI service. Required only when accessing the service through your own Oracle Cloud account. |
| options.ociConfig.compartmentId | string | Optional | The OCID of the compartment. |
| options.ociConfig.endpointId | string | Optional | Required only when using a custom OCI dedicated AI cluster. |
| options.ociConfig.fingerprint | string | Optional | The fingerprint of the OCI user's public key; must reference a NetSuite API secret. |
| options.ociConfig.privateKey | string | Optional | The private key of the OCI user; must also reference a NetSuite API secret. |
| options.ociConfig.tenancyId | string | Optional | The OCID of the tenancy. |
| options.ociConfig.userId | string | Optional | The OCID of the user. |
| options.preamble | string | Optional | An initial message that guides the LLM's overall behavior. |
| options.safetyMode | string | Optional | Safety mode setting, applicable to Cohere models only. Defaults to strict mode if not specified. |
| options.timeout | number | Optional | The timeout in milliseconds for the request. Defaults to 30,000. |
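
Putting the table together, a minimal call can be sketched as below. The snippet builds an options payload for `llm.generateText` (the `N/llm` text-generation entry point); the prompt text and parameter values are illustrative, and the `N/llm` require only resolves inside NetSuite, so here the payload is simply constructed and inspected.

```javascript
// Sketch of an options payload for llm.generateText (N/llm, SuiteScript 2.1).
// Values are illustrative; only options.prompt is required.
const options = {
  prompt: 'Summarize this customer complaint in two sentences.',
  modelParameters: {
    maxTokens: 150,      // cap output length (~3 tokens per word on average)
    temperature: 0.2,    // low => more factual, less creative
    frequencyPenalty: 0, // penalize tokens that repeat often
    presencePenalty: 0,  // penalize any previously used token
    topK: 0,
    topP: 0.75,
  },
  timeout: 30000, // default is 30,000 ms
};

// Inside a NetSuite 2.1 script you would then call (not runnable outside NetSuite):
//   const llm = require('N/llm');
//   const response = llm.generateText(options);
//   log.debug('response', response.text);

console.log(Object.keys(options).sort().join(','));
// → modelParameters,prompt,timeout
```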

Key Considerations

  • Parameter Configuration: Properly setting the options.prompt is crucial for successful LLM outputs.
  • Model Parameter Implications: Adjusting frequencyPenalty, presencePenalty, temperature, topK, and topP significantly affects the generated output style and content relevance.
  • OCI Configuration: Ensure the OCI configuration (OCIDs and API secrets) is complete and correct when accessing the LLM through your own Oracle Cloud account.
  • Safety Modes: Choose appropriate safety modes depending on the application context, especially for user-facing interactions.
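
The conversational context mentioned above is supplied through `options.chatHistory`. As a sketch, each `llm.ChatMessage` carries a role and text; inside NetSuite you would create messages with the documented `llm.createChatMessage` helper, but the shape can be shown with plain objects (role values here are illustrative):

```javascript
// Sketch: multi-turn context via options.chatHistory (llm.ChatMessage[]).
// Message shapes are shown as plain objects; in a NetSuite script you would
// build them with llm.createChatMessage(...) instead.
const chatHistory = [
  { role: 'USER', text: 'What is our standard return window?' },
  { role: 'CHATBOT', text: 'Standard returns are accepted within 30 days.' },
];

const options = {
  prompt: 'Does that window apply to discounted items?',
  chatHistory: chatHistory, // prior turns give the model conversational context
  preamble: 'You are a concise support assistant.', // optional guiding message
};

console.log(options.chatHistory.length);
// → 2
```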

Key Takeaways

  • options.prompt is the only required parameter; the rest are optional but refine the LLM interaction.
  • Customizations via options.modelParameters can influence how the LLM generates responses.
  • Proper OCI configuration is essential for utilizing the Oracle Cloud's Generative AI service effectively.
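
For unlimited usage through your own Oracle Cloud account, the `ociConfig` sub-parameters reference OCIDs and NetSuite API secrets. The sketch below shows the shape only; every OCID and secret script ID is a placeholder, not a real value.

```javascript
// Sketch: ociConfig for routing requests through your own OCI tenancy.
// All OCIDs and secret script IDs below are placeholders.
// fingerprint and privateKey must reference NetSuite API secrets.
const ociConfig = {
  tenancyId: 'ocid1.tenancy.oc1..example',
  compartmentId: 'ocid1.compartment.oc1..example',
  userId: 'ocid1.user.oc1..example',
  fingerprint: 'custsecret_oci_fingerprint',   // placeholder secret script ID
  privateKey: 'custsecret_oci_private_key',    // placeholder secret script ID
  // endpointId is only needed for a custom OCI dedicated AI cluster:
  // endpointId: 'ocid1.generativeaiendpoint.oc1..example',
};

const options = {
  prompt: 'Draft a follow-up email for this invoice.',
  ociConfig: ociConfig,
};

console.log('privateKey' in options.ociConfig);
// → true
```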

Source: This article is based on Oracle's official NetSuite documentation.

Frequently Asked Questions

Do I need to enable a feature flag for SuiteScript 2.1 LLM parameters?
No specific feature flag is mentioned for enabling SuiteScript 2.1 LLM parameters in the article. The parameters are available for use in the `N/llm` module as part of the 2025.1 release.
What permissions are required to use the N/llm module for LLM functionality?
The article does not specify any particular permissions required to use the N/llm module. However, typical scripting permissions within NetSuite should be adequate for a developer to utilize these parameters.
How does the options.documents parameter interact with the LLM's response generation?
The options.documents parameter provides additional context that influences the LLM's responses. This is particularly effective in scenarios requiring domain-specific information, enhancing the output through retrieval-augmented generation (RAG).
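
As a sketch of that RAG-style usage (Cohere models only): each document pairs an id with its content. Inside NetSuite you would build `llm.Document` objects with the documented `llm.createDocument` helper, but the shape can be shown with plain objects whose field names are illustrative here:

```javascript
// Sketch: grounding a response with options.documents (Cohere models only).
// Document ids and contents are illustrative; in a NetSuite script you would
// create each entry with llm.createDocument(...) instead.
const documents = [
  { id: 'doc1', data: 'Refund policy: refunds are issued within 5 business days.' },
  { id: 'doc2', data: 'Exchanges require the original packing slip.' },
];

const options = {
  prompt: 'How long do refunds take?',
  documents: documents, // the model can draw on these when answering
};

console.log(options.documents.map(function (d) { return d.id; }).join(','));
// → doc1,doc2
```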
What happens if I do not specify the options.modelFamily parameter?
If the options.modelFamily parameter is not specified, the default model used is Cohere's Command A model. This may affect the output quality based on the specific requirements of a project.
Source: "Parameters" page, Oracle NetSuite Help Center. This article was generated from official Oracle documentation and enriched with additional context and best practices.
