libertai-iac

Less IaC than it is a collection of scripts. It allows you to easily:

  • Download and manage GGUF models from Hugging Face
  • Push models to Aleph as immutable SquashFS images
  • Build and deploy llama.cpp on an Aleph VM, loading models as immutable volumes

Requirements

  • docker OR podman
  • huggingface-cli
  • git
  • jq
  • aleph CLI, available on pip (see the install sketch below)
  • squashfs-tools, including mksquashfs and unsquashfs
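
On a Debian/Ubuntu host the tooling can be installed roughly as follows. This is a sketch, not canonical instructions: the apt package names and the aleph-client pip package name are assumptions, so check your distro and the Aleph docs.

# System packages (assumes a Debian/Ubuntu host; swap docker.io for podman if preferred)
sudo apt-get install -y docker.io git jq squashfs-tools
# Python CLIs -- the aleph-client package name is an assumption, verify before installing
pip install aleph-client
pip install -U "huggingface_hub[cli]"   # provides huggingface-cli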

Setup

All you need to get started is to copy .env.example to .env.

cp .env.example .env

The repository comes with suitable defaults for deploying a simple llama.cpp server on Aleph, but you can modify these to suit your needs. See the .env.example file for more information.
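
The real variable names live in .env.example; as a purely hypothetical illustration, a configured .env might look something like this (the keys below are placeholders, not the actual configuration keys):

# Hypothetical example only -- use the keys defined in .env.example
MODEL_REPO=some-org/some-model-GGUF     # Hugging Face repository to pull from
MODEL_FILE=some-model.Q4_K_M.gguf       # GGUF file within that repository
ALEPH_CHANNEL=TEST                      # Aleph channel to publish to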

Usage

Pull the configured model so it's available locally

./bin/lib.sh pull-model
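
This presumably wraps huggingface-cli; a rough manual equivalent, with placeholder repository and file names, would be:

# Download a single GGUF file from Hugging Face into a local directory
huggingface-cli download some-org/some-model-GGUF some-model.Q4_K_M.gguf --local-dir ./models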

Push the configured model to Aleph as an immutable object

./bin/lib.sh push-model
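
If you want to inspect the artifact first, the SquashFS packaging can be reproduced by hand; a minimal sketch with placeholder paths, leaving the actual upload to the aleph CLI:

# Pack the downloaded model into an immutable SquashFS image
mksquashfs ./models ./model.squashfs -noappend
# Then upload the image with the aleph CLI (see aleph --help for the file upload command)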

Pull a suitable runtime locally to build and deploy llama.cpp

./bin/lib.sh pull-runtime

Finally, push the runtime to Aleph, loading the configured model as a volume

./bin/lib.sh push-engine
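
Chained together, a full deployment from a fresh clone is just the four steps in order:

./bin/lib.sh pull-model && \
./bin/lib.sh push-model && \
./bin/lib.sh pull-runtime && \
./bin/lib.sh push-engine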
