BDEPEND=app-alternatives/ninja >=dev-build/cmake-3.20.5
DEFINED_PHASES=compile configure install prepare test
DEPEND=dev-python/numpy dev-libs/date:= >=dev-libs/boost-1.66:= dev-libs/protobuf:= dev-libs/re2:= dev-libs/flatbuffers:= dev-cpp/nlohmann_json:= dev-libs/nsync dev-cpp/eigen:3 benchmark? ( dev-cpp/benchmark ) test? ( dev-cpp/gtest )
DESCRIPTION=cross-platform, high performance ML inferencing and training accelerator
EAPI=7
HOMEPAGE=https://github.com/microsoft/onnxruntime
IUSE=benchmark test
KEYWORDS=~amd64
LICENSE=MIT
RDEPEND=dev-python/numpy dev-libs/date:= >=dev-libs/boost-1.66:= dev-libs/protobuf:= dev-libs/re2:= dev-libs/flatbuffers:= dev-cpp/nlohmann_json:= dev-libs/nsync dev-cpp/eigen:3 benchmark? ( dev-cpp/benchmark )
RESTRICT=test
SLOT=0
SRC_URI=https://github.com/microsoft/onnxruntime/archive/refs/tags/v1.9.1.tar.gz -> onnxruntime-1.9.1.tar.gz https://github.com/pytorch/cpuinfo/archive/5916273f79a21551890fd3d56fc5375a78d1598d.tar.gz -> pytorch-cpuinfo-5916273f79.tar.gz https://github.com/onnx/onnx/archive/1f63dcb7fcc3a8bf5c3c8e326867ecd6f5c43f35.tar.gz -> onnx-1f63dcb7fc.tar.gz https://github.com/boostorg/mp11/archive/21cace4e574180ba64d9307a5e4ea9e5e94d3e8d.tar.gz -> boost_mp11-21cace4e574.tar.gz https://github.com/google/flatbuffers/archive/v1.12.0.tar.gz -> flatbuffers-1.12.0.tar.gz https://github.com/martinmoene/optional-lite/archive/4acf4553baa886e10e6613fe1452b706b0250e78.tar.gz -> optional-lite-4acf4553ba.tar.gz https://github.com/dcleblanc/SafeInt/archive/a104e0cf23be4fe848f7ef1f3e8996fe429b06bb.tar.gz -> SafeInt-a104e0cf23.tar.gz
_eclasses_=cmake d3613e557da672de5255dab91c8f731f flag-o-matic f14aba975c94ccaa9f357a27e3b17ffe multilib b2a329026f2e404e9e371097dda47f96 multiprocessing 1e32df7deee68372153dca65f4a7c21f ninja-utils 2df4e452cea39a9ec8fb543ce059f8d6 toolchain-funcs 14648d8795f7779e11e1bc7cf08b7536 xdg-utils 42869b3c8d86a70ef3cf75165a395e09
_md5_=1dba250e2b9b2b933724d6c7aa028105