Exploring SuiteScript 2.1 LLM Parameters: A Developer's Guide

Learn how to effectively use parameters in SuiteScript 2.1 with NetSuite's N/llm module for optimal AI-driven results.

In the evolving landscape of NetSuite's SuiteScript 2.1, understanding the various parameters for large language models (LLMs) is crucial for developers who aim to leverage AI-driven functionalities effectively. This guide explores the parameters used in the N/llm module, introduced in version 2025.1, providing practical insights and considerations for each.

Understanding the Core Parameters

Required Parameters

  • options.prompt (string): The query or instruction sent to the model to generate a response. As the only required parameter, it deserves careful crafting: a clear, specific prompt is the single biggest lever for steering the model toward the desired output.
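A prompt-only call can be sketched as follows. This is a minimal illustration, assuming a server-side SuiteScript 2.1 entry point (here a hypothetical Suitelet) with access to the N/llm module; the prompt text is an invented example.

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], function (llm) {
    function onRequest(context) {
        // prompt is the only required parameter
        var response = llm.generateText({
            prompt: 'Summarize the benefits of automating invoice approvals in NetSuite.'
        });
        // The returned response object exposes the generated text
        context.response.write(response.text);
    }
    return { onRequest: onRequest };
});
```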

Optional Parameters

  • options.chatHistory (llm.ChatMessage[]): Supplies prior turns of the conversation so the LLM can maintain context across an interactive session. It is important for any application that relies on coherent, multi-turn dialogue.

  • options.documents (llm.Document[]): By providing additional context through documents, you can ground the LLM's responses, which is especially valuable when domain-specific information is crucial. This parameter is supported only for Cohere models and enables retrieval-augmented generation (RAG).

  • options.modelFamily (enum): Selecting the right model family can significantly affect the output quality. If unspecified, the default is Cohere's Command A model. Different projects might benefit from models optimized for creativity versus factual accuracy.

  • options.modelParameters: These parameters allow fine-tuning of the LLM's output:

    • frequencyPenalty (number): Reduces repetition by penalizing tokens in proportion to how often they have already appeared in the output.
    • presencePenalty (number): Penalizes any token that has appeared at least once, encouraging the model to introduce new words and topics.
    • temperature (number): Controls randomness; lower values yield more predictable results, while higher values produce more creative output.
    • maxTokens (number): Caps the output length; setting it too low can truncate responses before they are complete.
    • topK (number) and topP (number): Constrain token sampling: topK limits candidates to the K most likely tokens, while topP limits them to the smallest set whose cumulative probability reaches P.
  • options.ociConfig (Object): Relevant for accounts using their own Oracle Cloud Infrastructure tenancy. Configuration supplied here overrides the default SuiteScript AI preferences, enabling unlimited usage for specific scripts through your own OCI Generative AI account.

  • options.preamble (string): Allows customization of the initial preamble used by the LLM, altering its guiding context.

  • options.safetyMode (string): Applicable only to Cohere models, this parameter constrains outputs to safety standards, defaulting to a strict mode if unspecified.

  • options.timeout (number): The duration (in milliseconds) the system waits before terminating a request. The default is 30,000 ms and can be adjusted to suit application needs.
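The optional parameters above can be combined in a single call. The sketch below is illustrative only: the tuning values (temperature, penalties, timeout) are assumptions to adjust per use case, not recommended defaults, and the specific enum values (llm.ChatRole, llm.ModelFamily.COHERE_COMMAND_R) and the shape of chat-history entries should be verified against the Help Center for your account's version.

```javascript
/**
 * @NApiVersion 2.1
 */
define(['N/llm'], function (llm) {
    function followUp() {
        var response = llm.generateText({
            prompt: 'Based on our discussion, draft a short status update.',
            // Prior turns give the model conversational context
            chatHistory: [
                { role: llm.ChatRole.USER, text: 'What is the status of order #1042?' },
                { role: llm.ChatRole.CHATBOT, text: 'Order #1042 shipped yesterday.' }
            ],
            // Enum value shown is an assumption; check the ModelFamily enum in the docs
            modelFamily: llm.ModelFamily.COHERE_COMMAND_R,
            modelParameters: {
                maxTokens: 500,          // cap output length
                temperature: 0.2,        // low randomness for factual output
                frequencyPenalty: 0.1,   // discourage repeated tokens
                presencePenalty: 0,      // no penalty for reusing seen tokens
                topK: 3,                 // sample from the 3 most likely tokens
                topP: 0.75               // ...within 75% cumulative probability
            },
            preamble: 'You are a concise assistant for a wholesale distributor.',
            timeout: 45000               // extend beyond the 30,000 ms default
        });
        return response.text;
    }
    return { followUp: followUp };
});
```

In practice, you would append each new user message and model reply back onto chatHistory before the next call, so the dialogue stays coherent across turns.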

Key Takeaways

  • Prompting Strategy: Always tailor the prompt and consider accompanying documents to harness the full potential of LLMs for precise results.
  • Model Customization: Leverage modelParameters, such as frequency and presence penalties, to refine output relevance and creativity.
  • Infrastructure Flexibility: Use ociConfig for streamlined integration with Oracle Cloud, allowing consistency across NetSuite environments.
  • Safety and Control: safetyMode and timeout provide essential scaffolding to ensure reliable and secure AI interactions.

By fully understanding and utilizing these parameters, developers can unlock enhanced capabilities in their SuiteScript applications, driving both innovation and practical utility.

Source: Parameters — Oracle NetSuite Help Center. This article was generated from official Oracle documentation and enriched with additional context and best practices.