Mastering NetSuite's LLM Text Generation Parameters for SuiteScript 2.1
A comprehensive guide on configuring LLM APIs in NetSuite for effective text generation and customization.
NetSuite's recent releases introduced the N/llm module in SuiteScript 2.1, whose text generation methods accept a set of parameters for working with large language models (LLMs). These parameters let developers tailor the behavior of the AI models so that responses suit specific business needs. Let's dive into these parameters and how they can optimize your LLM API interactions in NetSuite.
Key Parameters Overview
The parameters for LLM in NetSuite help define the nature and scope of AI-generated text responses. Here are the core parameters you should know:
Required Parameter
- options.prompt: A required string containing the prompt sent to the LLM; it forms the foundation of the text generation task. Introduced in 2024.1, it serves as the initial question or statement from which a response is generated.
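The required parameter on its own is enough for a first call. Below is a minimal sketch: the `buildPromptOptions` helper is plain JavaScript that just assembles the options object, while the commented section shows how it would be used with `llm.generateText` inside NetSuite (the N/llm module is only available in the NetSuite runtime).

```javascript
// Builds the options object expected by llm.generateText.
// Plain JavaScript, so the payload shape can be inspected anywhere.
function buildPromptOptions(promptText) {
  if (typeof promptText !== 'string' || promptText.length === 0) {
    throw new TypeError('options.prompt must be a non-empty string');
  }
  return { prompt: promptText };
}

/*
// Inside NetSuite (e.g. a Suitelet), a sketch of the call itself:
define(['N/llm'], function (llm) {
  return {
    onRequest: function (context) {
      var response = llm.generateText(buildPromptOptions(
        'Summarize the top overdue invoices for the CFO.'
      ));
      context.response.write(response.text);
    }
  };
});
*/
```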
Optional Parameters
- options.chatHistory: Allows past chat messages, represented as llm.ChatMessage[], to be used by the LLM for contextual understanding, potentially leading to more relevant responses.
- options.documents: Provides additional context for response generation using a list of llm.Document[]. As of 2025.1, this parameter is supported exclusively for Cohere models and can enhance the LLM's precision by grounding responses in external documents.
- options.modelFamily: Determines which LLM to use, specified through the values in the llm.ModelFamily enum. Defaults to the Cohere Command A model if not specified.
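Chat history and documents can be combined in one call. In the sketch below, only the plain builder runs outside NetSuite; the commented section shows intended usage with N/llm, where chat messages are typically reused from a previous response's chatHistory property and documents are created with llm.createDocument (treat those helper names and shapes as assumptions to verify against the N/llm reference).

```javascript
// Assembles an options object that adds conversational context and
// grounding documents to the required prompt.
function buildContextualOptions(promptText, priorTurns, docList) {
  return {
    prompt: promptText,
    chatHistory: priorTurns || [], // llm.ChatMessage[] in NetSuite
    documents: docList || []       // llm.Document[], Cohere models only (2025.1+)
  };
}

/*
define(['N/llm'], function (llm) {
  var first = llm.generateText({ prompt: 'What is our return policy?' });
  var doc = llm.createDocument({
    id: 'policy-1',
    data: 'Refunds are issued to the original payment method within 30 days.'
  });
  // Reuse the first response's chat history for a grounded follow-up.
  var followUp = llm.generateText(buildContextualOptions(
    'Does the customer get the refund on their card?',
    first.chatHistory,
    [doc]
  ));
});
*/
```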
Model Parameter Customizations
- options.modelParameters: An object where you can fine-tune the LLM's behavior with attributes like:
- frequencyPenalty: Penalizes tokens in proportion to how often they have already appeared, reducing repetition.
- maxTokens: Caps the number of tokens in a response.
- presencePenalty: Penalizes any token that has appeared at least once, encouraging the model to introduce new tokens.
- temperature: Controls randomness; lower values yield more deterministic output, higher values more creative output.
- topK and topP: Limit sampling to the K most likely tokens and to the smallest set of tokens whose cumulative probability reaches P, respectively.
These settings give developers fine-grained control over response generation, allowing output to be tuned anywhere from factual to creative depending on business requirements.
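A small helper makes it easy to start from sensible defaults and override per call. The default values below are illustrative assumptions, not NetSuite-documented defaults, and the valid ranges vary by model family; check the official docs for the model you target.

```javascript
// Assembles the modelParameters object with illustrative defaults.
function buildModelParameters(overrides) {
  var defaults = {
    maxTokens: 400,        // cap on response length
    temperature: 0.2,      // low = factual, high = creative
    topK: 0,               // 0 disables top-K filtering
    topP: 0.75,            // nucleus sampling probability mass
    frequencyPenalty: 0.0, // penalize frequent repeats
    presencePenalty: 0.0   // penalize any repeat
  };
  var params = Object.assign({}, defaults, overrides || {});
  if (params.temperature < 0) {
    throw new RangeError('temperature must be >= 0');
  }
  return params;
}

/*
// Creative draft vs. factual summary from the same helper:
var creative = llm.generateText({
  prompt: 'Draft a friendly dunning email.',
  modelParameters: buildModelParameters({ temperature: 0.8, maxTokens: 250 })
});
*/
```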
OCI Configuration
- options.ociConfig: Supplies Oracle Cloud Infrastructure credentials, which are necessary for unlimited AI service usage. When a configuration is set both on the SuiteScript subtab and through this parameter, the parameter takes precedence over the subtab settings, giving you precise per-call control over cloud interactions.
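The sketch below validates an OCI configuration object before it is passed along. The field names (tenancyId, compartmentId, userId, fingerprint, privateKey) follow standard OCI API-key authentication and are assumptions here; verify them against the N/llm documentation. In a real script the private key should reference a NetSuite API secret, never a hard-coded string.

```javascript
// Validates that all assumed OCI credential fields are present.
function buildOciConfig(cfg) {
  var required = ['tenancyId', 'compartmentId', 'userId', 'fingerprint', 'privateKey'];
  required.forEach(function (field) {
    if (!cfg || !cfg[field]) {
      throw new Error('Missing OCI config field: ' + field);
    }
  });
  return cfg;
}

/*
var response = llm.generateText({
  prompt: 'Classify this support ticket.',
  ociConfig: buildOciConfig({
    tenancyId: 'ocid1.tenancy.oc1..aaaa...',
    compartmentId: 'ocid1.compartment.oc1..bbbb...',
    userId: 'ocid1.user.oc1..cccc...',
    fingerprint: 'ab:cd:ef:...',
    privateKey: 'custsecret_oci_private_key' // reference to a NetSuite API secret
  })
});
*/
```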
Advanced Options
- options.preamble: Allows a preamble to be defined, which sets guidelines or context messages for the LLM, enhancing the initial prompt context.
- options.responseFormat: Directs the LLM to return its response as JSON conforming to a supplied schema, streamlining integration with other systems.
- options.safetyMode: Ensures content moderation by using values from the llm.SafetyMode enum, defaulting to STRICT mode.
- options.timeout: Provides a customizable timeout setting, defaulting to 30,000 milliseconds.
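These advanced options can be combined in a single call. In the sketch below, the shape of responseFormat (a JSON schema object) is an assumption for illustration; consult the N/llm reference for the exact contract before relying on it.

```javascript
// Assembles an options object combining preamble, structured output,
// and a custom timeout.
function buildAdvancedOptions(promptText) {
  return {
    prompt: promptText,
    // Preamble sets standing guidelines before the prompt itself.
    preamble: 'You are a concise assistant for NetSuite administrators. Answer in one paragraph.',
    // Assumed shape: a JSON schema describing the desired reply.
    responseFormat: {
      type: 'object',
      properties: {
        summary: { type: 'string' },
        riskLevel: { type: 'string' }
      },
      required: ['summary']
    },
    timeout: 45000 // override the 30,000 ms default
  };
}

/*
var response = llm.generateText(buildAdvancedOptions(
  'Summarize the risks in this vendor contract.'
));
var parsed = JSON.parse(response.text); // structured output per the schema
*/
```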
Best Practices
When utilizing these parameters, keep the following in mind:
- Start with the default settings and introduce custom parameters gradually as needed.
- Use document and chat history features to enrich responses where nuanced or company-specific contextual understanding is necessary.
- Regularly review Oracle Cloud settings to ensure they align with AI service consumption and business needs.
Key Takeaways
- Customize LLM responses through detailed parameters for tailored SuiteScript applications.
- Utilizing chat history and document inputs can greatly enhance response relevance.
- Oracle Cloud Infrastructure configurations provide flexibility, crucial for enterprise-scale AI applications.
By leveraging these parameters effectively, NetSuite developers can enhance the AI-driven capabilities within their applications, ensuring they meet nuanced business demands and improve operational efficiency.