This guide provides detailed instructions for implementing and using the PromptIDE SDK, covering the functions and classes available in the prompt_ide module. The SDK version is 1.1.

Implementation Instructions for the PromptIDE SDK

Functions

as_string() -> str

  • Description: This function returns a string representation of the current context.

as_token_ids() -> list[int]

  • Description: This function returns a list of token IDs stored in the current context.
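
A minimal sketch of both accessors, assuming the code runs in a PromptIDE script where the SDK functions are in scope (as in the prompt_fn example below):

prompt("The quick brown fox")
print(as_string())     # text of the current context, e.g. "The quick brown fox"
print(as_token_ids())  # the corresponding token IDs; exact values depend on the model's tokenizer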

clone() -> Context

  • Description: The clone function creates a new context by duplicating the current prompt context.
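
For example, clone can be used to duplicate the prompt built so far and extend the copy independently (a sketch; the prompt text is illustrative):

prompt("Translate to French: cheese")
snapshot = clone()               # copy of the current context
snapshot.prompt(" => fromage")   # extends only the copy
print(as_string())               # still just the original prompt
print(snapshot.as_string())      # the original prompt plus the appended text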

create_context() -> Context

  • Description: The create_context function generates a new context and adds it as a child context to the current one.
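
A short sketch, assuming create_context returns the new child while leaving the current context active:

child = create_context()              # new child of the current context
child.prompt("Scratch work kept out of the main prompt")
print(len(get_context().children))    # the new child appears under the current context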

force_context(ctx: Context)

  • Description: The force_context function allows you to override the current context with the provided context.
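
A sketch of switching contexts and switching back (assumes an async scope for sample, as in the prompt_fn example below):

async def scratch_work():
    main = get_context()
    side = create_context()
    force_context(side)               # prompt()/sample() now target the side context
    prompt("2+2=")
    await sample(max_len=4, stop_strings=["\n"])
    force_context(main)               # restore the original context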

get_context() -> Context

  • Description: get_context returns the current context.

prompt(text: str, strip: bool = False) -> Sequence[Token]

  • Description: The prompt function tokenizes the input text and adds the tokens to the current context.
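
As a small sketch, the returned sequence can be inspected via the Token fields documented below:

tokens = prompt("Hello, world!")
print(len(tokens))                    # number of tokens added to the context
print([t.token_str for t in tokens])  # the per-token strings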

prompt_fn(fn)

  • Description: prompt_fn is a decorator that executes the wrapped function in a fresh prompt context. It is typically applied to asynchronous functions, as in the example below.

Example:

@prompt_fn
async def add(a, b):
    # Build the prompt in the fresh context, e.g. "2+3=".
    prompt(f"{a}+{b}=")
    # Sample until a space appears, then keep the first whitespace-separated chunk.
    result = await sample(max_len=10, stop_strings=[" "])
    return result.as_string().split(" ")[0]

  • Args:
  • fn: An asynchronous function to execute in a newly created context.
  • Returns: The wrapped function.
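
Assuming the wrapped function remains awaitable, it can be composed with other prompt functions (the function name main is illustrative):

@prompt_fn
async def main():
    total = await add(2, 3)           # runs add in its own fresh context
    set_title(f"2+3={total}")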

randomize_rng_seed() -> int

  • Description: The randomize_rng_seed function samples a new random number generator (RNG) seed and returns it.

read_file(file_name: str) -> bytes

  • Description: The read_file function reads a file that the user has uploaded to the file manager.
  • Args:
  • file_name: Name of the file to read.
  • Returns: The file’s content as raw bytes.
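
A sketch of feeding an uploaded text file into the prompt ("notes.txt" is a hypothetical file name; the file must already exist in the file manager):

raw = read_file("notes.txt")          # hypothetical file name
prompt(raw.decode("utf-8"))           # decode the raw bytes before prompting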

sample(max_len: int = 256, temperature: float = 1.0, nucleus_p: float = 0.7, stop_tokens: Optional[list[str]] = None, stop_strings: Optional[list[str]] = None, rng_seed: Optional[int] = None, add_to_context: bool = True, return_attention: bool = False, allowed_tokens: Optional[Sequence[Union[int, str]]] = None, disallowed_tokens: Optional[Sequence[Union[int, str]]] = None) -> SampleResult

  • Description: The sample function generates a model response based on the current prompt.
  • Args:
  • max_len: Maximum number of tokens to generate.
  • temperature: Temperature of the final softmax operation. Lower values reduce token distribution variance.
  • nucleus_p: Threshold for Top-P sampling.
  • stop_tokens: List of tokens; sampling stops as soon as any one of them is sampled.
  • stop_strings: List of strings; sampling stops as soon as any one of them appears in the generated text.
  • rng_seed: RNG seed for sampling.
  • add_to_context: If true, generated tokens are added to the context.
  • return_attention: If true, returns the attention mask.
  • allowed_tokens: If set, only these tokens can be sampled.
  • disallowed_tokens: If set, these tokens cannot be sampled.
  • Returns: A SampleResult holding the sampled tokens.
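
A usage sketch combining prompt and sample (the prompt text and parameter values are illustrative):

@prompt_fn
async def capital_of_france():
    prompt("The capital of France is")
    result = await sample(max_len=8, temperature=0.2, stop_strings=["."])
    return result.as_string().strip()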

select_model(model_name: str)

  • Description: The select_model function is used to select the model name for the current context. This should be set before adding any tokens to the context.
  • Args:
  • model_name: Name of the model to use.
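
A sketch; "my-model-name" is a hypothetical placeholder for a model available in your workspace:

select_model("my-model-name")         # must happen before any tokens are added
prompt("Hello")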

set_title(title: str)

  • Description: The set_title function sets the title of the context, which is displayed in the user interface.
  • Args:
  • title: The title to set.

user_input(text: str) -> str | None

  • Description: The user_input function asks the user to enter text into the text field displayed in the completion dialog.
  • Args:
  • text: Placeholder text displayed in the text field.
  • Returns: A string if the user enters text, or None if the user cancels.
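
A sketch based on the signature above (handle the None case in case the user cancels):

name = user_input("Enter your name")
if name is None:
    name = "anonymous"
prompt(f"Write a short greeting for {name}.")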

Classes

Context

  • Description: A Context represents a sequence of tokens used as a prompt for model sampling.

Class Variables:

  • body: A list of tokens and contexts.
  • context_id: Identifier for the context.
  • model_name: Name of the model associated with the context.
  • next_rng_seed: The next RNG seed to be used.
  • parent: The parent context.

Instance Variables:

  • children: Returns all child contexts.
  • tokens: Returns the tokens stored in this context.

Methods:

  • as_string() -> str: Returns a string representation of this context.
  • as_token_ids() -> list[int]: Returns a list of token IDs stored in this context.
  • clone() -> Context: Clones this context.
  • create_context() -> Context: Creates a new context and adds it as a child context.
  • prompt(text: str, strip: bool = False) -> Sequence[Token]: Tokenizes the argument and adds the tokens to the context.
  • randomize_rng_seed() -> int: Samples a new RNG seed and returns it.
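
A short sketch of inspecting the current Context through its documented variables and methods:

ctx = get_context()
print(ctx.context_id, ctx.model_name)      # identifier and associated model
print(len(ctx.tokens), len(ctx.children))  # stored tokens and child contexts
print(ctx.as_string())                     # the context rendered as text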

SampleResult

  • Description: SampleResult holds the results of a sampling call.

Class Variables:

  • end_time: End time of sampling.
  • first_token_time: Time when the first token was sampled.
  • request: Information about the sampling request.
  • start_time: Start time of sampling.
  • tokens: List of tokens in the response.

Methods:

  • append(token: Token): Adds a token to the result and reports progress in the terminal.
  • as_string() -> str: Returns the sampled tokens as a string.
  • print_progress(): Prints the sampling progress to stdout.
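
A sketch of inspecting a SampleResult after sampling (assumes an async scope, as in the prompt_fn example above):

@prompt_fn
async def haiku():
    prompt("A haiku about autumn:")
    result = await sample(max_len=32)
    print(result.start_time, result.end_time)    # sampling timings
    print([t.token_str for t in result.tokens])  # the sampled tokens
    return result.as_string()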

Token

  • Description: A Token represents an element of the vocabulary with a unique index and string representation.

Class Variables:

  • attn_weights: Attention weights for the token.
  • prob: Probability of the token.
  • token_id: Identifier for the token.
  • token_str: String representation of the token.
  • token_type: Type of the token (e.g., _MODEL).
  • top_k: List of alternative tokens.

Static Methods:

  • from_proto_dict(values: dict) -> Token: Converts a protocol buffer dictionary to a Token instance.
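
A sketch of reading token-level details from a sampling result (assumes an async scope):

@prompt_fn
async def inspect_tokens():
    prompt("The sky is")
    result = await sample(max_len=4)
    for tok in result.tokens:
        print(tok.token_id, repr(tok.token_str), tok.prob)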

Use the provided SDK functions and classes to interact with the PromptIDE for natural language processing and creative-writing tasks: you can create, manipulate, and sample from contexts, and interact with the user through the completion dialog.

For further implementation details and code examples, refer to the prompt_ide module documentation, and use the provided SDK functions as demonstrated in the code samples above to build your applications and projects.