Version: 2026-03-31

Overview

AIE is an enterprise execution platform for AI workflows. It lets you configure, run, and manage AI tasks in production.


When to Use AIE

AIE is designed for production AI workloads where you need more than a raw model call: specifically when you require modular pipeline configuration or multi-model flexibility. Common use cases include document processing, extraction, classification, and multi-step workflows that need to run reliably at scale.


Key Capabilities

| Capability | What It Means for You |
| --- | --- |
| Modular Workflows | Components such as Large Language Models (LLMs), Optical Character Recognition (OCR), retrieval, context refinement, orchestration, post-processing, and evaluation are configurable on a per-task basis. |
| High-Fidelity Decisioning | Delivers accurate, expert-level recommendations and decisions on a per-use-case basis, with consistent, reliable results. |
| Zero Training, No Tuning | Works "out of the box"; optimal results are achieved through prompting alone. |
| Multimodal | Analyzes images and handwritten documents, and supports signature detection. |
| Explainability | Full verbal explanations of results provide transparency. |
| Deployment Flexibility | Runs in hosted, VPC, or on-premise environments. |
| Independent Upgrades | Components can be upgraded or rolled back independently without affecting the rest of your pipeline. |

How It Works

AIE is available through endpoints in the Lazarus API framework. Processing is asynchronous: you submit work in bulk and receive results at your output URL when they're ready.
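Because results are pushed to your output URL rather than returned in the submission response, you need an HTTP endpoint ready to accept them. Below is a minimal receiver sketch, assuming results arrive as a JSON body via POST; the field contents and port are illustrative assumptions, not part of the documented API.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_result(body: bytes) -> dict:
    """Decode a delivered result body; assumes a JSON payload (an assumption)."""
    return json.loads(body.decode("utf-8"))

class ResultHandler(BaseHTTPRequestHandler):
    """Accepts AIE result deliveries POSTed to the output URL this server backs."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        result = parse_result(self.rfile.read(length))
        # Hand the result off to your own storage or queue here.
        print("received result:", result)
        self.send_response(200)
        self.end_headers()

# To run (port 8080 is an assumption; match it to your output URL):
# HTTPServer(("", 8080), ResultHandler).serve_forever()
```

Returning a 200 promptly and processing the payload afterward keeps delivery retries (if any) from piling up.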

  1. Prepare your request by specifying the prompt and the file(s) to process in input.
  2. Configure a task or workflow by specifying the execution stack through the capability and tier parameters in modelSettings. AIE handles the rest of the configuration for you.
  3. Submit the request to the bulk processing endpoint. AIE runs each stage of the pipeline asynchronously.
    • To receive output, specify a url in output. Optionally, you can also configure the HTTP method and contentType.
    • To receive status updates for the request, specify a url for statusWebhook.
  4. Optionally, poll the status endpoint to check the status of your request.
  5. Retrieve structured results through your specified output URL.
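The steps above can be sketched in Python. Only the parameter names (prompt, input, modelSettings, capability, tier, output, statusWebhook) come from this documentation; the endpoint URL, the file-reference format, and the example capability/tier values are assumptions for illustration.

```python
import json
import urllib.request

def build_payload(prompt, files, capability, tier, output_url, status_url=None):
    """Assemble the request body described in steps 1-3 above."""
    payload = {
        "prompt": prompt,
        "input": files,  # file-reference format is an assumption
        "modelSettings": {"capability": capability, "tier": tier},
        "output": {
            "url": output_url,
            "method": "POST",                    # optional, per step 3
            "contentType": "application/json",   # optional, per step 3
        },
    }
    if status_url:
        # Optional: receive status updates for the request.
        payload["statusWebhook"] = {"url": status_url}
    return payload

def submit(payload, endpoint="https://api.example.com/aie/bulk"):
    """POST the payload to the bulk processing endpoint (URL is hypothetical)."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

A typical call would build the payload with your prompt, file references, and webhook URLs, then pass it to submit(); structured results then arrive at the output URL rather than in the submission response.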