Setup
The project has multiple components; the easiest way to start is with Docker Compose or the Helm charts. VibraniumDome supports both the OpenAI API and the Azure AI API.
OpenSearch requires the max virtual memory areas setting, vm.max_map_count, to be 262144. In some environments it is 65530, so check the value first and apply the workaround below to make sure it is correct.

Run first:

```shell
docker run -u root -it opensearchproject/opensearch:2.9.0 bash -c "cat /proc/sys/vm/max_map_count"
```

If that command returns 65530, run this command to increase it temporarily:

```shell
docker run --rm --privileged alpine sysctl -w vm.max_map_count=262144
```

Then run the first command again to verify the new value is 262144:

```shell
docker run -u root -it opensearchproject/opensearch:2.9.0 bash -c "cat /proc/sys/vm/max_map_count"
```

On a Linux machine, OpenSearch needs to own the filesystem volumes, so run:

```shell
mkdir vibraniumdome-opensearch/vibraniumdome-opensearch-data1
mkdir vibraniumdome-opensearch/vibraniumdome-opensearch-data2
sudo chown -R 1000:1000 vibraniumdome-opensearch/vibraniumdome-opensearch-data1
sudo chown -R 1000:1000 vibraniumdome-opensearch/vibraniumdome-opensearch-data2
```

Set the OPENAI_API_KEY environment variable in the environment file; you can generate one here.

Start the project
With docker-compose:

```shell
docker-compose up
```

With Helm:

```shell
helm install test-release helm/
```

Open The Application
Navigate to Vibranium-App
There are two authentication providers in the app: Google and basic credentials. To make the Google login work, you need to register an OAuth 2.0 application on Google Cloud Platform. The default credentials are username: admin@admin.com and password: admin. You can change them here.

Analyze your first LLM interaction
Analyze LLM Interactions with the Streamlit app
After you run the whole system as in the instructions above, navigate to VibraniumDome-Streamlit-App. In the left pane of the Streamlit app, in the Environment Variables section, configure the OpenAI API key.
By default, the Streamlit app uses http://localhost:5001 as the Vibranium Dome Base URL (optional); you can change this to point to a different URL.
Now you can chat in the app, and see the results in Vibranium-App.
Analyze LLM Interactions by code
Both the old OpenAI SDK (version 0.28.1) and the new OpenAI SDK (version 1.*) are supported.
Install the Vibranium Dome SDK
This step should be done in the agent code; it can be skipped for the server installation.
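Assuming the SDK is published on PyPI under the name vibraniumdome-sdk (check the project's own docs for the exact package name), installation is a single pip command:

```shell
pip install vibraniumdome-sdk
```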
Create demo application with the OLD OpenAI SDK in main.py
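A minimal sketch of such a demo, assuming the SDK exposes a VibraniumDome.init entry point (treat the import path, the init signature, and the app_name value as assumptions) and that OPENAI_API_KEY is set in the environment:

```python
# main.py - demo agent using the OLD OpenAI SDK (0.28.1)
import openai  # openai==0.28.1 reads OPENAI_API_KEY from the environment

from vibraniumdome_sdk import VibraniumDome  # assumed import path

# Assumed init signature; "demo_agent" is a placeholder app name.
VibraniumDome.init(app_name="demo_agent")

# Standard 0.28-style chat completion call; the SDK instruments it
# so the interaction is reported to Vibranium Dome.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response["choices"][0]["message"]["content"])
```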
Create demo application with the NEW OpenAI SDK in main.py
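The same demo can be sketched against the 1.* SDK; again, the VibraniumDome import path, init signature, and app name are assumptions, and OPENAI_API_KEY must be set in the environment:

```python
# main.py - demo agent using the NEW OpenAI SDK (1.*)
from openai import OpenAI

from vibraniumdome_sdk import VibraniumDome  # assumed import path

# Assumed init signature; "demo_agent" is a placeholder app name.
VibraniumDome.init(app_name="demo_agent")

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1.*-style chat completion call; the SDK instruments the client
# so the interaction is reported to Vibranium Dome.
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(completion.choices[0].message.content)
```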
Newer versions of VibraniumDome require setting VIBRANIUM_DOME_API_KEY; if it is not provided, the default key defined in the VibraniumDome system is used.
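For example, assuming a standard environment-variable setup, the key can be exported before starting the agent (the value below is a placeholder, not a real key):

```shell
export VIBRANIUM_DOME_API_KEY="<your-vibranium-dome-api-key>"
python main.py
```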