OPAL MCP Server: Setup & Installation Guide

A comprehensive guide to installing, configuring, and utilizing the OPAL MCP Server for advanced AI-driven data integration.

OPAL MCP Architecture

Introduction

Prior to the emergence of the Model Context Protocol (MCP), the functionality provided by Virtuoso’s OPAL add-on middleware layer was accessible only through a narrow set of product-specific interfaces. MCP provides an open standards–based interface for loosely coupled interactions with the powerful capabilities that OPAL offers.

This functionality is packaged as a collection of tools that support a wide range of operations, including:

  • Deductive interactions with relational tables and entity relationship graphs.
  • Comprehensive query execution (read/write).
  • Unified Data Space Administration (databases, knowledge graphs, filesystems).
  • Virtual Database Management for remote data sources.
  • Seamless interactions with a broad spectrum of LLMs.
  • Access to specialized OPAL Assistants and OpenAPI-based web services.

Benefits

LLM-based natural language processing marks the beginning of a new computing era—one powered by loosely coupled AI Agents that operate collaboratively to enhance productivity. The OPAL Server for MCP brings this powerful vision to life by enabling any MCP client to execute a wide range of operations across data spaces for seamless data access, integration, and management.

Post-Installation Capabilities Overview

Setup and Installation

Prerequisites

Installation Steps

  1. Install the latest on-premise Virtuoso installer for your OS (macOS, Windows, Linux).
  2. Install the VAL and personal-assistant VADs via Conductor or iSQL.
  3. Navigate to https://{CNAME}:8891/chat, where {CNAME} is your instance's hostname.
  4. Log in with the dba user and provide your LLM provider’s API key when prompted.
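Step 2 can also be scripted from iSQL using Virtuoso's VAD_INSTALL function. A minimal sketch, assuming the VAD package files sit in the server's working directory — the filenames shown are illustrative and vary by release:

SQL
-- Illustrative filenames; substitute the actual VAD files for your release
DB.DBA.VAD_INSTALL ('val_dav.vad', 0);
DB.DBA.VAD_INSTALL ('personal_assistant_dav.vad', 0);

The second argument indicates the path type: 0 for a filesystem path, 1 for a WebDAV path.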

Additional Components

Expand your OPAL instance's capabilities by installing optional VAD packages like the Assistant-Metal UI for managing AI agents or the Linked Data Cartridges for advanced data transformation and crawling.

Supported LLMs

OPAL supports a wide array of LLM providers, including OpenAI, Google, Anthropic, Microsoft, and many others. It also supports any local or hosted LLMs that are compatible with OpenAI's Tools API.

Attribute-based Access Controls (ABAC)

Secure your OPAL environment with fine-grained access controls. By executing SPARQL commands, you can define who can log in and what actions they are permitted to perform, using standardized ontologies for precise rule definition.

Login Authorization for /chat endpoint

Chat Endpoint Authorization
SPARQL
-- Grant dba user access to the /chat endpoint
PREFIX acl: <http://www.w3.org/ns/auth/acl#>
WITH <urn:virtuoso:val:default:rules>
INSERT {
  <#rulePublicChat> a acl:Authorization ;
      acl:accessTo <urn:oai:chat> ;
      acl:agent <http://localhost/dataspace/person/dba#this> .
} ;

Login Authorization for /assist-metal endpoint

Assist-Metal Endpoint Authorization
SPARQL
-- Grant dba user access to the /assist-metal endpoint
PREFIX acl: <http://www.w3.org/ns/auth/acl#>
WITH <urn:virtuoso:val:default:rules>
INSERT {
  <#assistantsAdmin> a acl:Authorization ;
      acl:accessTo <urn:oai:assistants> ;
      acl:agent <http://localhost/dataspace/person/dba#this> .
} ;

System-wide LLM API Key Registration

Rather than entering LLM API keys each time you log in, you can register those keys system-wide. To do so, create a restriction for successfully authenticated users by executing the following:

SPARQL
-- Create a restriction to allow system-wide API keys for the dba user
PREFIX oplres: <http://www.openlinksw.com/ontology/restrictions#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
WITH <urn:virtuoso:val:default:restrictions>
INSERT {
    <#restrictionAuthChatKey> a oplres:Restriction ;
        oplres:hasRestrictedResource <urn:oai:chat> ;
        oplres:hasRestrictedParameter <urn:oai:chat:enable-api-keys> ;
        oplres:hasAgent <http://localhost/dataspace/person/dba#this> ;
        oplres:hasRestrictedValue "true"^^xsd:boolean .
} ;

Large Language Models Registration & Use

Bind your OPAL instance to one or more LLMs using the commands below in the Conductor or iSQL interfaces.

Binding LLMs

SQL
OAI.DBA.FILL_CHAT_MODELS('{api-key}', '{llm-vendor-tag}');

For providers, such as Google Gemini, that don't offer a model-listing API, register each model individually:

SQL
OAI.DBA.REGISTER_CHAT_MODEL('{llm-vendor-tag}','{llm-name}');

You can view the effects of this command at: https://{CNAME}:8891/chat/admin/models.vsp

Bound LLMs Screenshot

System-Wide LLM API Key Registration

With the system-wide key restriction in place, register each provider's key so users don't have to enter it at login:

SQL
OAI.DBA.SET_PROVIDER_KEY('{llm-vendor-tag}', '{api-key}');

API Access & Protocols

Your OPAL instance is also API-accessible. Obtain credentials (OAuth tokens, Bearer Tokens) from the /oauth/applications.vsp endpoint to interact with your instance via protocols like MCP and A2A.
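As a sketch of how a client might attach such a token to a request, the Python fragment below builds (without sending) an authorized HTTPS request. The hostname and token values are placeholders; the port and endpoint path follow the conventions used elsewhere in this guide:

```python
import urllib.request

# Placeholders -- substitute your instance's hostname and a Bearer token
# obtained from the /oauth/applications.vsp endpoint on that instance.
OPAL_HOST = "opal.example.com"
BEARER_TOKEN = "{bearer-token}"

def build_authorized_request(path):
    """Construct (without sending) an HTTPS request carrying the token."""
    return urllib.request.Request(
        f"https://{OPAL_HOST}:8891{path}",
        headers={
            "Authorization": f"Bearer {BEARER_TOKEN}",
            "Accept": "application/json",
        },
    )

req = build_authorized_request("/chat/mcp/messages")
```

Sending the request (e.g. via urllib.request.urlopen) then exercises whichever protocol the target endpoint speaks.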

Model Context Protocol (MCP) Usage

OPAL has built-in MCP support (client and server). Enable this by setting up CORS access for the /.well-known and /OAuth2 virtual directories in the Conductor UI.

MCP Server Endpoints:

  • Streamable HTTP: https://{CNAME}:8891/chat/mcp/messages
  • Server-Sent Events: https://{CNAME}:8891/chat/mcp/sse
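MCP messages are JSON-RPC 2.0 objects. The sketch below builds a minimal `initialize` request of the kind a client would POST to the Streamable HTTP endpoint above; the protocol version string and client name are illustrative, so consult the MCP specification for current values:

```python
import json

def build_initialize_request(request_id=1):
    """Build the JSON-RPC 2.0 `initialize` message that opens an MCP session."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Illustrative protocol version; use the one your client supports.
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    }

# Serialized body for the HTTP POST
body = json.dumps(build_initialize_request())
```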

Other MCP Interaction Options

Bridge-based access is also available via our generic MCP servers for various runtimes.

Your OPAL instance as an MCP Client

As an MCP client, OPAL can bind to tools published by any MCP Server. You can test this by obtaining an API key from a public endpoint (like https://demo.openlinksw.com/chat/mcp/messages) and registering it in your instance's admin area at https://{CNAME}:8891/chat/admin/.

Agent-2-Agent (A2A) Protocol Usage

A2A support enables sophisticated workflows between AI agents. Agents are discoverable via a JSON-based Agent Card at https://{CNAME}:8891/.well-known/agent.json.

Frequently Asked Questions

What is OPAL?

OPAL (OpenLink AI Layer) is an AI-powered platform that provides integration with multiple Large Language Models (LLMs) and supports advanced protocols like MCP and A2A for building AI assistants and agents.

How do I secure my OPAL instance?

Use Attribute-based Access Controls (ABAC) by executing SPARQL commands to set up fine-grained access controls that determine who can log in and under what restrictions.

Can I avoid entering API keys on every login?

Yes, you can register LLM API keys system-wide using the OAI.DBA.SET_PROVIDER_KEY() command to avoid repetitive entry.

What is MCP?

Model Context Protocol (MCP) is a protocol that enables AI applications to securely connect to external data sources and tools, providing standardized access to resources.

Glossary of Terms

A2A (Agent-2-Agent Protocol)

A communication protocol that enables AI agents to interact and coordinate with each other in sophisticated workflows.

ABAC (Attribute-based Access Control)

A security model that uses attributes, policies, and environmental conditions to control access to resources.

LLM (Large Language Model)

AI models trained on large amounts of text data to understand and generate human-like text.

MCP (Model Context Protocol)

A protocol that enables AI applications to securely connect to external data sources and tools.