BDEPEND=systemd? ( sys-apps/systemd ) virtual/pkgconfig
DEFINED_PHASES=install postinst postrm preinst prepare pretend setup unpack
DESCRIPTION=Get up and running with large language models locally
EAPI=8
HOMEPAGE=https://ollama.com/
IUSE=cuda rocm systemd
KEYWORDS=~amd64 ~arm64
LICENSE=MIT
RDEPEND=acct-group/ollama acct-user/ollama cuda? ( dev-util/nvidia-cuda-toolkit ) rocm? ( dev-libs/rocm-opencl-runtime sci-libs/clblast ) virtual/tmpfiles
REQUIRED_USE=rocm? ( amd64 ) cuda? ( amd64 )
RESTRICT=mirror strip
SLOT=0
SRC_URI=amd64? ( !rocm? ( https://github.com/ollama/ollama/releases/download/v0.12.11/ollama-linux-amd64.tgz -> ollama-bin-0.12.11-amd64.tgz ) rocm? ( https://github.com/ollama/ollama/releases/download/v0.12.11/ollama-linux-amd64-rocm.tgz -> ollama-bin-0.12.11-rocm.tgz ) ) arm64? ( https://github.com/ollama/ollama/releases/download/v0.12.11/ollama-linux-arm64.tgz -> ollama-bin-0.12.11-arm64.tgz )
_eclasses_=check-reqs 2a9731073c152554078a9a8df8fc0f1b systemd a964c0cbe818b5729da1dbfcee5be861 tmpfiles 9a9814db5a3fbd4f1e921c05297e7735 toolchain-funcs 98d9f464d912ae6b7316fb8a3721f5db
_md5_=b9fcb115ed90aa63bb6a09db72e41005