BDEPEND=systemd? ( sys-apps/systemd ) virtual/pkgconfig
DEFINED_PHASES=install postinst postrm preinst prepare pretend setup unpack
DESCRIPTION=Get up and running with large language models locally
EAPI=8
HOMEPAGE=https://ollama.com/
IUSE=cuda rocm systemd
KEYWORDS=~amd64 ~arm64
LICENSE=MIT
RDEPEND=acct-group/ollama acct-user/ollama cuda? ( dev-util/nvidia-cuda-toolkit ) rocm? ( dev-libs/rocm-opencl-runtime sci-libs/clblast ) virtual/tmpfiles
REQUIRED_USE=rocm? ( amd64 ) cuda? ( amd64 )
RESTRICT=mirror strip
SLOT=0
SRC_URI=amd64? ( !rocm? ( https://github.com/ollama/ollama/releases/download/v0.18.0/ollama-linux-amd64.tar.zst -> ollama-bin-0.18.0-amd64.tar.zst ) rocm? ( https://github.com/ollama/ollama/releases/download/v0.18.0/ollama-linux-amd64-rocm.tar.zst -> ollama-bin-0.18.0-rocm.tar.zst ) ) arm64? ( https://github.com/ollama/ollama/releases/download/v0.18.0/ollama-linux-arm64.tar.zst -> ollama-bin-0.18.0-arm64.tar.zst )
_eclasses_=check-reqs 12ab9d3fc16bfe1f87c6fb652324b57c systemd a964c0cbe818b5729da1dbfcee5be861 tmpfiles e0b49bcd7a0daea941c0fbe4cb35ff4e toolchain-funcs 5195689ff6a73b0e789acfa09d4fbcb9
_md5_=d9a988a11b68ef4a98eb86168fe3605b