- Supplier delays slow replenishment before stock runs out.
- Rising demand accelerates inventory depletion.
- Regional logistics issues block inbound shipments.
- Warehouse imbalances leave some locations critically understocked.
- Ingest supply chain events and convert them into embeddings.
- Store events with structured metadata in Actian VectorAI DB.
- Retrieve similar historical incidents using semantic search and payload filters.
- Generate risk alerts using a lightweight reasoning layer.
Architecture overview
The agent is built around four connected stages:
- Ingestion pipeline — Converts raw supply chain events into embeddings and stores them in Actian VectorAI DB.
- Query pipeline — Embeds incoming natural-language questions.
- Retrieval layer — Combines semantic search with payload filters to surface relevant historical incidents.
- Risk reasoning layer — Evaluates each result against five rule-based checks to generate actionable alerts.
Environment setup
This tutorial requires Python, a sentence embedding model, and the Actian VectorAI Python SDK. Run the following command to install both required packages: `pip install actian-vectorai sentence-transformers`.
- actian-vectorai — Official Python SDK for Actian VectorAI DB (async/sync clients, Filter DSL, gRPC transport).
- sentence-transformers — For generating text embeddings with all-MiniLM-L6-v2.
Implementation
This section walks through the implementation steps for building the inventory risk intelligence workflow using Actian VectorAI DB.
Step 1: Import dependencies and configure
The following block imports the Actian VectorAI client and the embedding model, then defines the connection endpoint and collection settings. Running it prints a confirmation that the configuration was loaded correctly.
- VectorAI server — The Actian VectorAI gRPC endpoint (default port 50051).
- Collection name — Identifies the vector collection for supply chain events.
- Embedding model — Converts event text into 384-dimensional dense vectors.
Step 2: Define embedding helpers
The following functions wrap the embedding model so the rest of the pipeline can convert event text to vectors with a single call. embed_text handles single strings; embed_texts processes a list of strings in one model pass, which is more efficient when ingesting batches.
For example, the model maps semantically similar event descriptions to nearby vectors:
- “Supplier delay caused battery shortage risk.”
- “Low battery stock after repeated replenishment delays.”
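The wrapper pattern can be sketched as follows. Since the SentenceTransformer model is the real dependency, a deterministic stand-in encoder is used here so the single-string vs. batch distinction is shown self-contained; `_stand_in_model_encode` is a placeholder for `model.encode`, not part of any SDK.

```python
# Sketch of the embedding helpers. A deterministic stand-in replaces the
# all-MiniLM-L6-v2 model call so the wrapper pattern runs stand-alone.
import hashlib

DIM = 384  # all-MiniLM-L6-v2 output dimensionality

def _stand_in_model_encode(texts):
    # Placeholder for model.encode(texts): one 384-dim vector per input text.
    vectors = []
    for text in texts:
        digest = hashlib.sha256(text.encode()).digest()
        vectors.append([digest[i % len(digest)] / 255.0 for i in range(DIM)])
    return vectors

def embed_text(text: str) -> list[float]:
    """Embed a single string (one model pass)."""
    return _stand_in_model_encode([text])[0]

def embed_texts(texts: list[str]) -> list[list[float]]:
    """Embed a batch in one model pass, cheaper than looping embed_text."""
    return _stand_in_model_encode(texts)
```

In the real pipeline, both helpers delegate to the same model so single and batch calls produce identical vectors for identical text.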
Step 3: Initialize the vector database collection
Collections in Actian VectorAI DB define the vector dimensionality, distance metric, and index parameters. The following code calls get_or_create, which is idempotent. It creates the collection if it does not exist and skips creation if it already does. Running this block prints a confirmation that the collection is ready.
- Vectors will have 384 dimensions.
- Similarity is computed with cosine distance.
- The HNSW index uses m=32 connections and ef_construct=256 for high recall.
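The settings above can be captured in one place as a plain configuration mapping. The key names here (size, distance, m, ef_construct) are assumptions mirroring the VectorParams / HnswConfigDiff objects this step describes, not the SDK's exact signatures.

```python
# Illustrative collection configuration mirroring the parameters above.
# Key names are assumptions, not the SDK's exact field names.
COLLECTION = "supply_chain_events"

collection_config = {
    "vectors": {"size": 384, "distance": "cosine"},  # 384-dim, cosine metric
    "hnsw": {"m": 32, "ef_construct": 256},          # high-recall HNSW index
}

def describe(config: dict) -> str:
    """Render the collection settings as a one-line summary."""
    v, h = config["vectors"], config["hnsw"]
    return (f"{v['size']}-dim vectors, {v['distance']} distance, "
            f"HNSW m={h['m']} ef_construct={h['ef_construct']}")
```

Keeping the configuration in one structure makes the idempotent get_or_create call easy to audit against what is actually stored.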
Step 4: Prepare sample supply chain events
The following block defines a dataset of realistic supply chain incidents. Each event includes an event_text field for semantic meaning and structured fields for payload filtering. Running it prints the number of events loaded.
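A dataset in that shape might look like the following sketch. The field names follow the tutorial's descriptions (event_text plus structured metadata such as category, supplier, stock_level, and region); the specific values are illustrative.

```python
# Sample supply chain events: event_text carries semantic meaning, the
# remaining fields are structured payload metadata used for filtering.
# Field names follow the tutorial; values are illustrative.
SAMPLE_EVENTS = [
    {
        "event_text": "Supplier delay caused battery shortage risk.",
        "event_type": "supplier_delay",
        "category": "electronics",
        "supplier": "Supplier Alpha",
        "stock_level": 12,
        "reorder_point": 40,
        "demand_change_pct": 25,
        "risk_level": "high",
        "region": "EU-Central",
    },
    {
        "event_text": "Low battery stock after repeated replenishment delays.",
        "event_type": "stockout_warning",
        "category": "electronics",
        "supplier": "Supplier Beta",
        "stock_level": 8,
        "reorder_point": 30,
        "demand_change_pct": 10,
        "risk_level": "medium",
        "region": "US-East",
    },
]
print(f"Loaded {len(SAMPLE_EVENTS)} events")
```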
Step 5: Embed and ingest events into VectorAI DB
The following code embeds each event description, packages it as a PointStruct with payload, and upserts all points into the collection in a single operation. Running it prints the number of events ingested and the updated total stored in the collection.
- id — Sequential integer identifier.
- vector — 384-dim dense embedding from all-MiniLM-L6-v2.
- payload — All structured metadata (category, supplier, stock level, region, etc.).
The vde.flush() call ensures vectors are persisted to disk immediately. After ingestion, the pipeline prints a confirmation with the number of events ingested and the total stored in the collection.
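The packaging half of this step can be sketched stand-alone. In the real pipeline the resulting points would go to client.points.upsert(...) followed by client.vde.flush(); here a dummy 384-dim embedding stands in for the model, and the dict shape only mirrors the id/vector/payload structure described above.

```python
# Sketch of the ingestion packaging step: each event becomes a point with a
# sequential id, a vector, and its full metadata as payload. A dummy
# embedding stands in for all-MiniLM-L6-v2 so this runs self-contained.
DIM = 384

def dummy_embed(text: str) -> list[float]:
    # Stand-in for the real embed_text helper.
    return [float(len(text) % 7)] * DIM

def make_points(events, embed=dummy_embed):
    """Package events as id/vector/payload points ready for upsert."""
    return [
        {"id": i, "vector": embed(e["event_text"]), "payload": dict(e)}
        for i, e in enumerate(events)
    ]

events = [
    {"event_text": "Supplier delay caused battery shortage risk.",
     "category": "electronics", "stock_level": 12},
    {"event_text": "Port congestion blocking inbound shipments.",
     "category": "logistics", "stock_level": 55},
]
points = make_points(events)
print(f"Packaged {len(points)} points")
```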
Step 6: Run basic semantic search
The following code embeds a natural-language inventory risk query and uses points.search to retrieve the most semantically similar events from the collection. Running it prints each result’s ID, similarity score, and a preview of the event text.
The points.search method accepts the following parameters:
- vector — The query embedding.
- limit — Number of results.
- with_payload — Whether to return metadata.
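Conceptually, points.search scores every stored vector against the query embedding by cosine similarity and returns the top `limit` hits with their payloads. This plain-Python stand-in (not the server-side HNSW implementation) makes that ranking explicit; the two-dimensional demo vectors are illustrative only.

```python
# Conceptual stand-in for points.search: rank stored vectors by cosine
# similarity to the query and return the top `limit` hits.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def search(points, query_vector, limit=3, with_payload=True):
    hits = sorted(
        ({"id": p["id"], "score": cosine(query_vector, p["vector"]),
          "payload": p["payload"] if with_payload else None}
         for p in points),
        key=lambda hit: hit["score"], reverse=True,
    )
    return hits[:limit]

demo_points = [
    {"id": 0, "vector": [1.0, 0.0], "payload": {"event_text": "battery shortage"}},
    {"id": 1, "vector": [0.0, 1.0], "payload": {"event_text": "port congestion"}},
]
hits = search(demo_points, [0.9, 0.1], limit=1)
```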
Step 7: Apply structured payload filters using the Filter DSL
Actian VectorAI provides a type-safe Field / FilterBuilder API for payload filtering. The following code adds a server-side filter that restricts results to electronics events with stock below 20 before ranking by similarity. Running it prints only the two events that pass both the category and stock-level filters.
- Field("category").eq("electronics") — Exact match filter.
- Field("stock_level").lt(20.0) — Numeric range filter.
- FilterBuilder().must(...) — AND logic.
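The semantics of those three constructs can be sketched as predicates over plain payload dicts. The class and method names mirror the DSL described above, but this is an illustrative evaluator, not the SDK's server-side implementation.

```python
# Minimal stand-in for the Field / FilterBuilder semantics: eq is an exact
# match, lt a numeric range check, and must(...) combines them with AND.
class Field:
    def __init__(self, name):
        self.name = name

    def eq(self, value):
        return lambda payload: payload.get(self.name) == value

    def lt(self, value):
        return lambda payload: (self.name in payload
                                and payload[self.name] < value)

def must(*conditions):
    """AND logic: a payload passes only if every condition holds."""
    return lambda payload: all(cond(payload) for cond in conditions)

low_stock_electronics = must(
    Field("category").eq("electronics"),
    Field("stock_level").lt(20.0),
)
```

Because the filter is applied server-side in the real system, only points whose payloads satisfy every must condition are even candidates for similarity ranking.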
Step 8: Add boolean logic with must, should, and must_not
The Filter DSL supports must (AND), should (OR/preference), and must_not (exclusion) for complex business queries. The following code demonstrates all three clause types in a single filter. Running it returns only the Supplier Alpha events that match the low-stock electronics criteria, with the deprecated-region events excluded.
- .must() — All conditions must match (AND logic). Used here to require category = electronics and stock_level < 20.
- .should() — Preference boost. Events from Supplier Alpha are ranked higher but not excluded if absent.
- .must_not() — Hard exclusion. Events from Deprecated Region are removed from results entirely.
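The interaction of the three clause types can be illustrated over plain payload dicts: must gates results, must_not hard-excludes, and should contributes a rank boost rather than filtering. This mirrors the clause semantics above but is not the SDK's actual evaluator.

```python
# Illustrative evaluation of must / should / must_not over payload dicts.
def evaluate(payload, must=(), should=(), must_not=()):
    """Return None when the payload is filtered out, else a should-boost."""
    if not all(cond(payload) for cond in must):
        return None                  # failed a required (AND) condition
    if any(cond(payload) for cond in must_not):
        return None                  # hit a hard exclusion
    return sum(1 for cond in should if cond(payload))  # preference boost

filters = dict(
    must=[lambda p: p["category"] == "electronics",
          lambda p: p["stock_level"] < 20],
    should=[lambda p: p["supplier"] == "Supplier Alpha"],
    must_not=[lambda p: p["region"] == "Deprecated Region"],
)

alpha_event = {"category": "electronics", "stock_level": 5,
               "supplier": "Supplier Alpha", "region": "EU-Central"}
```

A Supplier Beta event that passes the must clauses is still returned, just with a zero boost; a Deprecated Region event is dropped regardless of how well it matches everything else.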
Step 9: Build the hybrid inventory risk query
This step combines semantic search with multiple filter dimensions — category, stock level, risk, and supplier — into a single reusable function. Each filter parameter is optional, so the function adapts to different query scenarios without code changes. Running the example call returns the high-risk, low-stock electronics events that are semantically closest to the query.
Step 10: Build the risk reasoning layer
Retrieval alone is not enough. The following function adds a rule-based reasoning layer that evaluates each retrieved event’s payload and returns a list of structured risk alerts. Each alert includes a rule name, severity, recommended message, and action.
| Rule | Trigger | Severity |
|---|---|---|
| stock_below_reorder | Stock < reorder point | Critical if >= 50% below, else warning |
| demand_spike | Demand change >= 20% | Critical if >= 30%, else warning |
| supplier_delay | Event type is supplier_delay | Based on risk level |
| quality_issue | Event type is quality_issue | Always warning |
| logistics_disruption | Event type is logistics_disruption | Based on risk level |
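The table above translates directly into code. The following sketch implements the five rules over a payload dict; the field names (stock_level, reorder_point, demand_change_pct, event_type, risk_level) and the recommended actions are assumptions carried over from the sample event schema.

```python
# Rule-based risk reasoning layer implementing the table above. Field names
# and recommended actions are assumptions from the sample event schema.
def assess_risk(payload):
    alerts = []
    stock = payload.get("stock_level", 0)
    reorder = payload.get("reorder_point", 0)
    if reorder and stock < reorder:
        # Critical when stock is at least 50% below the reorder point.
        sev = "critical" if stock <= reorder * 0.5 else "warning"
        alerts.append({"rule": "stock_below_reorder", "severity": sev,
                       "message": f"Stock {stock} below reorder point {reorder}",
                       "action": "Expedite replenishment order"})
    demand = payload.get("demand_change_pct", 0)
    if demand >= 20:
        sev = "critical" if demand >= 30 else "warning"
        alerts.append({"rule": "demand_spike", "severity": sev,
                       "message": f"Demand up {demand}%",
                       "action": "Raise safety stock"})
    event_type = payload.get("event_type")
    risk_sev = "critical" if payload.get("risk_level") == "high" else "warning"
    if event_type == "supplier_delay":
        alerts.append({"rule": "supplier_delay", "severity": risk_sev,
                       "message": "Supplier delay reported",
                       "action": "Contact supplier / qualify backup"})
    elif event_type == "quality_issue":
        alerts.append({"rule": "quality_issue", "severity": "warning",
                       "message": "Quality issue reported",
                       "action": "Quarantine affected lots"})
    elif event_type == "logistics_disruption":
        alerts.append({"rule": "logistics_disruption", "severity": risk_sev,
                       "message": "Logistics disruption reported",
                       "action": "Reroute inbound shipments"})
    return alerts
```

A single event can trigger several rules at once, which is why the function returns a list rather than a single alert.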
Step 11: Run the end-to-end flow
The following code connects all the pieces into a single pipeline function and runs it with a sample query. Calling run_risk_intelligence performs a hybrid semantic search, then runs the risk reasoning layer on every result and prints all triggered alerts with their severity and recommended action.
Step 12: Retrieve a specific event by ID for risk assessment
Actian VectorAI DB supports retrieving points by ID using points.get, which is useful for inspecting individual events without running a vector search. The following code fetches event ID 0 and runs the risk reasoning layer against it, printing the event summary and any triggered alerts.
points.get allows the system to inspect specific events without a vector search, which is useful for dashboards and audit trails.
Step 13: Collection administration
Actian VectorAI provides VDE (Vector Data Engine) operations for managing collections. Use get_vector_count to check collection size and flush to persist data to disk (already shown in Step 5). To remove a collection entirely, call client.collections.delete(COLLECTION).
Actian VectorAI features used
The following table summarizes every Actian VectorAI DB API used in this tutorial and the role each one plays:
| Feature | API | Purpose |
|---|---|---|
| Collection creation | client.collections.get_or_create() | Create vector space with HNSW config |
| Point upsert | client.points.upsert() | Store vectors with payload metadata |
| Semantic search | client.points.search() | Nearest-neighbour retrieval |
| Filtered search | client.points.search(filter=...) | Combine similarity with payload constraints |
| Filter DSL | Field().eq(), .lt(), FilterBuilder().must() | Type-safe filter construction |
| Point retrieval | client.points.get() | Fetch specific events by ID |
| Vector count | client.vde.get_vector_count() | Collection statistics |
| Flush | client.vde.flush() | Persist vectors to disk |
| Delete collection | client.collections.delete() | Clean up |
Conclusion
This tutorial built an AI supply chain inventory risk intelligence agent using Actian VectorAI DB as the retrieval engine. The full pipeline covered the following steps:
- Create a collection with VectorParams and HnswConfigDiff.
- Embed supply chain events with all-MiniLM-L6-v2 (384-dim).
- Store vectors with rich payload metadata via PointStruct.
- Run semantic search with points.search.
- Refine results with the type-safe Field / FilterBuilder DSL.
- Retrieve specific events by ID with points.get.
- Apply a rule-based risk reasoning layer.
- Generate actionable inventory risk alerts.
Next steps
Explore these related tutorials to deepen your understanding of the Actian VectorAI DB features used in this workflow:
- Hybrid search patterns — Learn how to combine vector similarity with structured constraints.
- Similarity search basics — Learn the core similarity search workflow.
- Filtering with boolean logic — Learn how to build filters with must, should, and must_not.
- Geospatial search — Learn how to make retrieval location-aware with geographic filters.