OPAL Installation and Setup Guide
A comprehensive guide to installing, configuring, and utilizing the OPAL add-on for advanced AI-driven data integration.

Introduction
Prior to the emergence of the Model Context Protocol (MCP), functionality provided by Virtuoso’s OPAL add-on middleware layer was only accessible through a limited set of options. The introduction of MCP provides an open standards–based interface for loosely coupled interactions with the powerful capabilities offered by OPAL.
This functionality is packaged as a collection of tools that support a wide range of operations, including:
- Deductive interactions with relational tables and entity relationship graphs.
- Comprehensive query execution (read/write).
- Unified Data Space Administration (databases, knowledge graphs, filesystems).
- Virtual Database Management for remote data sources.
- Seamless interactions with a broad spectrum of LLMs.
- Access to specialized OPAL Assistants and OpenAPI-based web services.
Benefits
LLM-based natural language processing marks the beginning of a new computing era—one powered by loosely coupled AI Agents that operate collaboratively to enhance productivity. The OPAL Server for MCP brings this powerful vision to life by enabling any MCP client to execute a wide range of operations across data spaces for seamless data access, integration, and management.
Setup and Installation
Prerequisites
- Administrative and operational familiarity with MCP setup and use (comparable to configuring middleware such as ODBC or JDBC).
- An installed and configured OPAL add-on for your Virtuoso instance.
Installation Steps
- Download and run the latest on-premise Virtuoso installer for your OS (macOS, Windows, or Linux).
- Install the VAL and personal-assistant VADs via Conductor or iSQL.
- Navigate to https://{your-host}:8891/chat.
- Log in as the dba user and provide your LLM provider's API key when prompted.
Additional Components
Expand your OPAL instance's capabilities by installing optional VAD packages like the Assistant-Metal UI for managing AI agents or the Linked Data Cartridges for advanced data transformation and crawling.
Supported LLMs
OPAL supports a wide array of LLM providers, including OpenAI, Google, Anthropic, Microsoft, and many others. It also supports any local or hosted LLMs that are compatible with OpenAI's Tools API.
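Since compatibility hinges on the Tools API, the following sketch shows the OpenAI-style "tools" schema an LLM must accept to be usable here. The tool name and parameters are illustrative assumptions, not part of OPAL itself.

```python
# Minimal example of the OpenAI Tools API function-definition envelope.
# The "run_sparql" tool below is a hypothetical example, not an OPAL tool.

def make_tool(name, description, parameters):
    """Wrap a JSON-Schema parameter spec in the Tools API envelope."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": parameters,
        },
    }

sparql_tool = make_tool(
    "run_sparql",  # hypothetical tool name
    "Execute a SPARQL query against the instance.",
    {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
)

print(sparql_tool["function"]["name"])  # -> run_sparql
```

Any local or hosted model that accepts a `tools` array of this shape in its chat-completion requests fits the compatibility requirement described above.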
Attribute-based Access Controls (ABAC)
Secure your OPAL environment with fine-grained access controls. By executing SPARQL commands, you can define who can log in and what actions they are permitted to perform, using standardized ontologies for precise rule definition.
Login Authorization for /chat endpoint
-- Grant dba user access to the /chat endpoint
PREFIX acl: <http://www.w3.org/ns/auth/acl#>
WITH <urn:virtuoso:val:default:rules>
INSERT {
  <#rulePublicChat> a acl:Authorization ;
    acl:accessTo <urn:oai:chat> ;
    acl:agent <http://localhost/dataspace/person/dba#this> .
} ;
Login Authorization for /assist-metal endpoint
-- Grant dba user access to the /assist-metal endpoint
PREFIX acl: <http://www.w3.org/ns/auth/acl#>
WITH <urn:virtuoso:val:default:rules>
INSERT {
  <#assistantsAdmin> a acl:Authorization ;
    acl:accessTo <urn:oai:assistants> ;
    acl:agent <http://localhost/dataspace/person/dba#this> .
} ;
Large Language Models Registration & Use
Bind your OPAL instance to one or more LLMs using the commands below in the Conductor or iSQL interfaces.
Binding LLMs
OAI.DBA.FILL_CHAT_MODELS('{api-key}', '{llm-vendor-tag}');
System-Wide LLM API Key Registration
To avoid entering API keys on every login, register them system-wide.
OAI.DBA.SET_PROVIDER_KEY('{llm-vendor-tag}', '{api-key}');
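The same registration can be scripted. Below is a hedged sketch of issuing the call over ODBC with pyodbc; the DSN name "OPAL" and the vendor tag "openai" are assumptions for illustration.

```python
# Sketch: registering a provider key programmatically over ODBC.
# DSN, credentials, and the "openai" vendor tag are assumptions;
# substitute the values for your instance and provider.

def provider_key_sql(vendor_tag, api_key):
    """Build the parameterized call to OAI.DBA.SET_PROVIDER_KEY."""
    return ("OAI.DBA.SET_PROVIDER_KEY(?, ?)", (vendor_tag, api_key))

# import pyodbc
# sql, params = provider_key_sql("openai", "sk-...")
# with pyodbc.connect("DSN=OPAL;UID=dba;PWD=dba") as conn:
#     conn.cursor().execute(sql, params)

print(provider_key_sql("openai", "sk-example")[0])  # -> OAI.DBA.SET_PROVIDER_KEY(?, ?)
```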
API Access & Protocols
Your OPAL instance is also API-accessible. Obtain credentials (OAuth tokens, Bearer tokens) from the /oauth/applications.vsp endpoint to interact with your instance via protocols like MCP and A2A.
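Once a token is obtained, API calls carry it as a Bearer credential. A minimal sketch, assuming a hypothetical host and endpoint path:

```python
import urllib.request

# Sketch: calling an OPAL instance with a Bearer token obtained from
# /oauth/applications.vsp. The host and endpoint path are placeholders.

def authorized_request(url, token):
    """Build an HTTPS request carrying the OAuth Bearer token."""
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Accept", "application/json")
    return req

req = authorized_request("https://example.com:8891/chat/api", "YOUR-TOKEN")
print(req.get_header("Authorization"))  # -> Bearer YOUR-TOKEN
# urllib.request.urlopen(req) would then perform the call.
```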
Model Context Protocol (MCP) Usage
OPAL has built-in MCP support (client and server). Enable it by setting up CORS access for the /.well-known and /OAuth2 virtual directories in the Conductor UI.
MCP Server Endpoints:
- Streamable HTTP: https://{CNAME}:8891/chat/mcp/messages
- Server-Sent Events: https://{CNAME}:8891/chat/mcp/sse
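The first message an MCP client POSTs to the Streamable HTTP endpoint is a JSON-RPC 2.0 `initialize` request. The sketch below builds one; the client name and protocol version are illustrative assumptions.

```python
import json

# Sketch of the JSON-RPC 2.0 "initialize" request an MCP client sends
# first to https://{CNAME}:8891/chat/mcp/messages. Client name and
# protocol version are assumptions for illustration.

def initialize_message(client_name, protocol_version="2024-11-05"):
    """Build the initial MCP handshake request as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": protocol_version,
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": "0.1"},
        },
    })

msg = initialize_message("example-client")
print(json.loads(msg)["method"])  # -> initialize
```

The server's response advertises its capabilities (the tools listed in the introduction), after which the client can issue `tools/list` and `tools/call` requests over the same transport.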
Agent-2-Agent (A2A) Protocol Usage
A2A support enables sophisticated workflows between AI agents. Agents are discoverable via a JSON-based Agent Card at https://{CNAME}:8891/.well-known/agent.json.
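An Agent Card is plain JSON, so client-side discovery amounts to fetching and reading it. A sketch using an illustrative sample card (the fields below are typical of A2A cards, not OPAL's actual card):

```python
import json

# Sketch: reading the fields an A2A Agent Card exposes at
# /.well-known/agent.json. The sample card below is illustrative.

sample_card = json.loads("""
{
  "name": "OPAL Assistant",
  "url": "https://example.com:8891/",
  "capabilities": {"streaming": true},
  "skills": [{"id": "sparql", "name": "SPARQL Query"}]
}
""")

def skill_names(card):
    """List the human-readable names of the skills a card advertises."""
    return [skill["name"] for skill in card.get("skills", [])]

print(skill_names(sample_card))  # -> ['SPARQL Query']
```

A peer agent would fetch the card from the `.well-known` URL, inspect the advertised skills and capabilities, and then address tasks to the agent's `url`.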
Frequently Asked Questions
What is OPAL?
OPAL (OpenLink AI Layer) is an AI-powered platform that provides integration with multiple Large Language Models (LLMs) and supports advanced protocols like MCP and A2A for building AI assistants and agents.
How do I secure my OPAL instance?
Use Attribute-based Access Controls (ABAC) by executing SPARQL commands to set up fine-grained access controls that determine who can log in and under what restrictions.
Can I avoid entering API keys on every login?
Yes. Register LLM API keys system-wide using the OAI.DBA.SET_PROVIDER_KEY() command to avoid repetitive entry.
What is MCP?
Model Context Protocol (MCP) is a protocol that enables AI applications to securely connect to external data sources and tools, providing standardized access to resources.
Glossary of Terms
A2A (Agent-2-Agent Protocol)
A communication protocol that enables AI agents to interact and coordinate with each other in sophisticated workflows.
ABAC (Attribute-based Access Control)
A security model that uses attributes, policies, and environmental conditions to control access to resources.
LLM (Large Language Model)
An AI model trained on large amounts of text data to understand and generate human-like text.
MCP (Model Context Protocol)
A protocol that enables AI applications to securely connect to external data sources and tools.