LLM Supported Models and Capabilities in NetSuite 2026.1

Discover LLM model support and capabilities in NetSuite 2026.1 for enhanced AI-driven text generation.


NetSuite 2026.1 enhances the LLM (Large Language Model) capabilities of the N/llm module and documents the models available for text generation. Developers can now see which models to use with the llm.generateText(options) and llm.generateTextStreamed(options) methods, along with the features each model supports. Support for retrieval-augmented generation (RAG) lets responses draw on additional documents, producing more informative output from the AI.

Supported Models and Their Capabilities

The following table summarizes the LLM models available and their respective features:

| Model Family | Model Code | RAG Support | Preambles Support |
|---|---|---|---|
| ModelFamily.COHERE_COMMAND | cohere.command-a-03-2025 | Yes | Yes |
| ModelFamily.COHERE_COMMAND_LATEST | cohere.command-a-03-2025 | Yes | Yes |
| ModelFamily.GPT_OSS | openai.gpt-oss-120b | No | Yes |
| ModelFamily.GPT_OSS_LATEST | openai.gpt-oss-120b | No | Yes |
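To target a specific supported model, the modelFamily option can be passed to the generation call. A minimal sketch follows; the Suitelet boilerplate and the response's `text` property use the standard SuiteScript 2.1 pattern, and the exact enum values should be confirmed against the Oracle docs:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], (llm) => {
    function onRequest(context) {
        // Pin the request to a specific model family; omitting
        // modelFamily falls back to the account's default model.
        const response = llm.generateText({
            prompt: 'Summarize our return policy in two sentences.',
            modelFamily: llm.ModelFamily.COHERE_COMMAND
        });
        context.response.write(response.text);
    }
    return { onRequest };
});
```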

Key Features Explained

  • Retrieval-Augmented Generation (RAG): This is an advanced feature that allows users to incorporate external documents into the generation process. By passing an array of llm.Document objects through the options.documents parameter, the LLM references this content to provide enhanced and contextually relevant responses. The output includes llm.Citation objects, which indicate the source of the information used in the response.
  • Preambles: Developers can set an initial context for the LLM using the options.preamble parameter. This acts as a guiding message that shapes the generation process, allowing for more tailored responses.
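Putting both features together, a RAG call with a preamble might look like the following sketch. It assumes llm.createDocument for building llm.Document objects and a citations array on the response, as described above; the document contents and ids are illustrative:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], (llm) => {
    function onRequest(context) {
        // Build an llm.Document to ground the response (RAG).
        const policyDoc = llm.createDocument({
            id: 'doc1',
            data: 'Returns are accepted within 30 days with a receipt.'
        });

        const response = llm.generateText({
            // Preamble sets the overall behavior of the model.
            preamble: 'You are a concise customer-support assistant.',
            prompt: 'What is the return window?',
            // Only RAG-capable models (the Cohere Command family
            // in 2026.1) honor the documents option.
            documents: [policyDoc],
            modelFamily: llm.ModelFamily.COHERE_COMMAND
        });

        // Citations point back to the source documents used.
        (response.citations || []).forEach((c) => {
            log.debug('citation', JSON.stringify(c));
        });
        context.response.write(response.text);
    }
    return { onRequest };
});
```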

Using the LLM in SuiteScript

Method Overview

  • The methods llm.generateText(options) and llm.generateTextStreamed(options) are the primary entry points for generating responses from the LLM. While similar, generateText returns the complete response in a single call, whereas generateTextStreamed lets developers consume the response in real time, token by token.
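As a sketch of the difference, the streamed variant can be consumed incrementally. The shape of the streamed response shown here (an iterable whose chunks expose partial text) is an assumption for illustration; confirm the actual API shape in the Oracle docs:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], (llm) => {
    function onRequest(context) {
        // Complete response: blocks until the full text is ready.
        const full = llm.generateText({ prompt: 'Explain RAG briefly.' });
        log.debug('full response', full.text);

        // Streamed response: assumed iterable, so partial output can
        // be handled token by token as it arrives.
        const streamed = llm.generateTextStreamed({
            prompt: 'Explain RAG briefly.'
        });
        for (const chunk of streamed) {
            context.response.write(chunk.text);
        }
    }
    return { onRequest };
});
```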

Important Parameters for LLM Methods

  • options.prompt (required): The prompt text that guides the LLM.
  • options.chatHistory (optional): Previous messages to include so they can inform the current response.
  • options.documents (optional): Relevant documents for enriched context in response generation, applicable only to certain models.
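The chatHistory parameter carries earlier turns of a conversation. A minimal sketch follows; the role values shown via llm.ChatRole are an assumption, so verify the exact enum names in the Oracle docs:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], (llm) => {
    function onRequest(context) {
        const response = llm.generateText({
            prompt: 'And how long does a refund take?',
            // Earlier turns give the model conversational context.
            chatHistory: [
                { role: llm.ChatRole.USER, text: 'What is your return policy?' },
                { role: llm.ChatRole.CHATBOT, text: 'Returns are accepted within 30 days.' }
            ]
        });
        context.response.write(response.text);
    }
    return { onRequest };
});
```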

Practical Note

When utilizing these features, always remember to test with various models and parameters to find the optimal output for your specific use case. The inclusion of contextual documents can significantly enhance the relevance of generated responses.

Who This Affects

This update impacts several roles and modules within the NetSuite environment:

  • Developers: Those implementing AI functionalities within applications.
  • Administrators: Users overseeing SuiteScript configurations and AI Preferences.
  • Product Managers: Professionals looking to leverage AI for enhanced customer interactions and information retrieval.

Key Takeaways

  • NetSuite 2026.1 introduces support for various LLM models with distinct capabilities.
  • Retrieval-augmented generation enhances response quality by integrating external documents.
  • Preambles provide context, guiding the LLM towards generating more relevant outputs.
  • The LLM methods are viable for both complete and streamed responses, offering flexibility for developers.
  • Effective use of options can lead to significantly improved AI interactions based on specific business needs.

Frequently Asked Questions

What models support Retrieval-Augmented Generation (RAG) in NetSuite 2026.1?
In NetSuite 2026.1, RAG is supported by the Cohere Command models: ModelFamily.COHERE_COMMAND and ModelFamily.COHERE_COMMAND_LATEST, both of which map to cohere.command-a-03-2025.
How do I enrich LLM responses using external documents in NetSuite 2026.1?
To enrich LLM responses with external documents, use the `options.documents` parameter to pass an array of `llm.Document` objects. This is applicable to LLM models that support Retrieval-Augmented Generation.
Is it possible to preconfigure a context for LLM responses in NetSuite 2026.1?
Yes, in NetSuite 2026.1, you can configure context for LLM responses using the `options.preamble` parameter. This allows you to set a guiding message to influence the LLM's generation process.
What's the difference between generateText and generateTextStreamed methods in SuiteScript?
The `generateText` method returns a complete response from the LLM, whereas `generateTextStreamed` allows access to the response in real-time, token by token, providing more flexible handling of the output.

Weekly Update History

SuiteScript (updated)

Updated llm.ModelFamily to add Cohere Command A (cohere.command-a-03-2025) to the list of supported models for the N/llm module.

Source: Oracle NetSuite Help Center. This article was generated from official Oracle documentation and enriched with additional context and best practices.
