Inventory

Updated: October 30, 2025

The FireTail platform helps you discover and monitor AI resources, such as AI services, models, prompts, and logs, within your codebase or cloud environment.

Integrations

Before you can discover AI resources, you must set up integrations with your environments. When the integrations are configured, FireTail will automatically scan your environments for AI resources.

AI Services

An AI service acts as the interface between users and the underlying AI models. It accepts inputs, processes them using the model, and returns the output. To view AI Services:

  1. In the side menu, go to AI and under Inventory select AI Services.
  2. Click an individual AI service to see its details, which include:
    • Resources Grouped by Providers: Lists the resources organized by provider (e.g., OpenAI, Cohere, Amazon, Anthropic). This section shows counts of models, prompts, and logs, as well as input, output, and total token counts per provider (a minimal aggregation sketch follows this list).
    • AI Logs Grouped by Provider: A chart that displays the number of logs generated by the service over the displayed time period, grouped by provider. Learn more about Cloud Logs.
    • Total Tokens Metric by Datetime: A graph showing how token usage varies over time.
    • Resources: A display of all AI models, AI prompts, and AI logs associated with the service. Click a model, prompt, or log to view more details.
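
The per-provider counts and token metrics on this page are aggregations over the AI logs that FireTail discovers for the service. The sketch below is a minimal illustration of that idea in Python, assuming a hypothetical list of log records with provider, input_tokens, and output_tokens fields; it is not FireTail's internal data model.

  # Minimal sketch: sum input, output, and total tokens per provider.
  # The log record shape here is an assumption for illustration only.
  from collections import defaultdict

  logs = [
      {"provider": "OpenAI", "input_tokens": 120, "output_tokens": 340},
      {"provider": "Anthropic", "input_tokens": 80, "output_tokens": 210},
      {"provider": "OpenAI", "input_tokens": 55, "output_tokens": 90},
  ]

  totals = defaultdict(lambda: {"input": 0, "output": 0, "total": 0})
  for record in logs:
      entry = totals[record["provider"]]
      entry["input"] += record["input_tokens"]
      entry["output"] += record["output_tokens"]
      entry["total"] = entry["input"] + entry["output"]

  for provider, counts in totals.items():
      print(provider, counts)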

AI Models

An AI model processes data, identifies patterns, and makes predictions or generates outputs.

  1. In the side menu, go to AI and under Inventory select AI Models.
  2. All discovered AI models are displayed.
  3. Click on a model to view further details, including:
    • Model Name: Identifies the AI model.
    • Provider: Identifies the provider (e.g., OpenAI, Cohere, AI21 Labs).
    • Scanned Via & Source: Specifies how the model was discovered (e.g., AWS Bedrock, CloudWatch). A discovery sketch follows this list.
    • Creation & Modification Timestamps: Tracks when the model was first detected and last updated.
    • Model Metadata: Includes details such as:
      • Account ID: Identifies the cloud account associated with the model.
      • Model ARN: The unique Amazon Resource Name (ARN) for AWS-hosted models.
      • Model ID: A version-specific identifier (e.g., ai21.jamba-instruct-v1:0).
      • Discovery Method: Specifies how the model was found (e.g., CloudWatch logs).
      • Region Name: The geographical cloud region where the model is deployed (e.g., us-east-1).
      • Response Streaming Supported: Indicates whether the model supports real-time streaming responses.
    • Prompts: Information about the prompts associated with the model.
    • Logs: Detailed records showing when the model was called, what inputs were used, and what outputs were generated.
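
For models discovered via AWS Bedrock, most of the metadata above maps directly onto fields returned by the AWS API. The sketch below is a minimal illustration in Python, assuming boto3 is installed and AWS credentials are configured for the target account and region; it is not how FireTail itself performs discovery.

  # Minimal sketch: list AWS Bedrock foundation models and print the fields
  # that correspond to the model details above. Assumes boto3 is installed
  # and AWS credentials are configured for the target account and region.
  import boto3

  bedrock = boto3.client("bedrock", region_name="us-east-1")

  for model in bedrock.list_foundation_models()["modelSummaries"]:
      print(
          model["providerName"],                           # e.g., AI21 Labs
          model["modelId"],                                # e.g., ai21.jamba-instruct-v1:0
          model["modelArn"],                               # unique Amazon Resource Name
          model.get("responseStreamingSupported", False),  # streaming support
      )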

AI Prompts

AI prompts are the specific inputs provided to an AI model. They guide the model's output and record the interaction details. To view prompts, you can:

  • In the side menu, go to AI and under Inventory select AI Prompts.
  • Click on an AI model to view its associated prompts. Select a prompt by clicking its ID.

Prompt Details can include:

  • Name and Model: Identification of the prompt and the model it belongs to.
  • Provider and Scanned Via: The AI provider associated with the prompt and how the prompt was discovered.
  • Creation Date and Last Modified: When the prompt was created and last updated.
  • Temperature and Response Format: Settings that control the randomness of the output and the format in which responses are generated.
  • Messages: The recorded messages that make up the prompt (an illustrative example follows this list).
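
To make these fields concrete, the snippet below builds an OpenAI-style chat request body in which the same details appear. The model name and message contents are placeholders, and the structure is an assumption used for illustration, not FireTail's stored prompt format.

  # Illustrative only: an OpenAI-style chat request body showing where the
  # prompt details above appear. Names and contents are placeholders, and
  # this is not FireTail's stored prompt schema.
  import json

  prompt_record = {
      "model": "gpt-4o-mini",                      # the model the prompt targets
      "temperature": 0.2,                          # controls output randomness
      "response_format": {"type": "json_object"},  # requested response format
      "messages": [                                # the recorded prompt messages
          {"role": "system", "content": "You are a support assistant."},
          {"role": "user", "content": "Summarize this ticket."},
      ],
  }

  print(json.dumps(prompt_record, indent=2))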