Ollama

Getting started with Ollama.

Ollama is deployed in ExCL as a module. To use Ollama, load the module, and then you have access to the ollama CLI interface.

Load the Ollama module with:

module load ollama
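
Once the module is loaded, a quick way to check that the CLI works is to list the models installed on the host and run a one-off prompt. The model name below is only an example; use one that appears in the output of ollama list:

ollama list
ollama run llama3 "Why is the sky blue?"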

Ollama API

Ollama has a server component which stores files in its home directory. This server component should be launched using a service account by an ExCL admin, since it provides Ollama for the entire system. Ollama is already running on some of the workers in ExCL. See the output from the module load for an up-to-date list. Contact excl-help@ornl.gov if you would like Ollama to be available on a specific system.

When interacting with the Ollama server via the REST API in ExCL, you need to unset the http_proxy and https_proxy environment variables, since you are trying to connect to an internal HTTP server instead of a remote one.
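
For example, a minimal request with curl, assuming the server is reachable on the local host at Ollama's default port (11434) and that the example model name is replaced with one actually installed:

unset http_proxy https_proxy
curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "Why is the sky blue?", "stream": false}'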

Examples of using the Ollama API can be found at ollama-python/examples/chat.py.
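
The chat endpoint can be exercised the same way from the shell; a minimal sketch under the same assumptions (default port, example model name):

curl http://localhost:11434/api/chat -d '{"model": "llama3", "messages": [{"role": "user", "content": "Why is the sky blue?"}], "stream": false}'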

Links

  • Ollama Website
  • Ollama GitHub
  • Ollama CLI Reference