# -*- coding: utf-8; mode: tcl; tab-width: 4; indent-tabs-mode: nil; c-basic-offset: 4 -*- vim:fenc=utf-8:ft=tcl:et:sw=4:ts=4:sts=4

PortSystem          1.0
PortGroup           github 1.0
PortGroup           python 1.0

github.setup        containers ramalama 0.16.0 v
github.tarball_from archive
revision            0

checksums           rmd160  60c0bd52cb6a8ca976684eb6c0f703682708cefa \
                    sha256  c2d9cab713e38d902375cd733880830d907f6b35c971bd78b3137e1973db243c \
                    size    1674609

homepage            https://ramalama.ai/
license             MIT
description         A tool to simplify the use of local AI models
long_description    \
    Ramalama is an open-source developer tool that simplifies the local serving \
    of AI models from any source and facilitates their use for inference in \
    production, all through the familiar language of containers.
maintainers         {cal @neverpanic} openmaintainer
categories          llm science

supported_archs     noarch
python.default_version 313

depends_run-append \
    port:krunkit \
    port:podman \
    port:py${python.version}-jinja2 \
    port:py${python.version}-yaml

notes \
    "${name} defaults to running AI models in podman containers in a podman\
    machine (i.e., VM) started by libkrun. This is not the podman default, so\
    you will have to change it, either by exporting the\
    CONTAINERS_MACHINE_PROVIDER=libkrun environment variable, or by adding\
    'provider = \"libkrun\"' to the '\[machine]' section of\
    '\$HOME/.config/containers/containers.conf'. See man 7 ramalama-macos for\
    more information."