AIStor AIHub is a private repository for storing AI models and datasets directly in AIStor. It is API-compatible with Hugging Face, enabling enterprises to create their own data and model repositories on a private cloud or in air-gapped environments without changing a single line of code. This eliminates the risk of developers leaking sensitive datasets or models.
AIHub operates as a proxy server compatible with the Hugging Face APIs, intercepting all calls without requiring code modifications. This lets users keep using the Hugging Face libraries they already rely on while model and dataset storage is managed locally.
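To illustrate the compatibility, the sketch below queries the standard Hugging Face Hub model-metadata route against an AIHub instance. The endpoint URL is a placeholder for your deployment; because the proxy speaks the same API, the request is identical to one made against huggingface.co.

```python
import requests

# Placeholder URL for an AIHub instance; substitute your deployment's endpoint.
AIHUB_ENDPOINT = "https://aihub.example.internal"

# The Hugging Face Hub exposes model metadata at /api/models/{repo_id}.
# An API-compatible proxy should answer the same route.
resp = requests.get(f"{AIHUB_ENDPOINT}/api/models/bert-base-uncased", timeout=30)
resp.raise_for_status()

# List the files the repository advertises.
for sibling in resp.json().get("siblings", []):
    print(sibling["rfilename"])
```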
AIHub stores models and datasets in a local MinIO bucket, ensuring that data remains within the enterprise's control. This local storage solution eliminates the risk of accidentally uploading private data back to a public bucket.
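One way to see what has been cached is to list the bucket with the MinIO Python SDK. The connection details and bucket name below are illustrative assumptions; the actual bucket and object layout depend on how your AIHub instance is configured.

```python
from minio import Minio

# Placeholder connection details and bucket name for the AIStor deployment.
client = Minio(
    "minio.example.internal:9000",
    access_key="YOUR_ACCESS_KEY",
    secret_key="YOUR_SECRET_KEY",
    secure=True,
)

# Inspect the objects AIHub has stored so far.
for obj in client.list_objects("aihub", recursive=True):
    print(obj.object_name, obj.size)
```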
AIHub checks for the requested model or dataset in the AIStor bucket. If it exists, it is served locally; if not, it is downloaded from Hugging Face and saved to the bucket. This automatic caching makes subsequent requests faster and more efficient.
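Conceptually, the flow looks like the sketch below. This is an illustration of the cache-or-fetch behavior, not AIHub's actual implementation; the `bucket` object is a hypothetical stand-in for its internal storage interface.

```python
from huggingface_hub import snapshot_download


def resolve_repo(repo_id: str, bucket) -> str:
    """Illustrative cache-or-fetch flow. `bucket` is a hypothetical object
    with contains/download/upload methods standing in for the AIStor bucket."""
    if bucket.contains(repo_id):
        # Cache hit: serve the copy already stored in AIStor.
        return bucket.download(repo_id)

    # Cache miss: pull from Hugging Face once, then persist the files
    # so later requests are served locally.
    local_path = snapshot_download(repo_id)
    bucket.upload(repo_id, local_path)
    return local_path
```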
When models are fine-tuned or updated, new versions are stored alongside the originals within the AIStor bucket. This versioning approach keeps multiple iterations of a model available while storing fine-tuned versions securely.
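For example, a fine-tuned checkpoint could be pushed back through the standard Hub client, using a branch as the version label. The endpoint, repository name, token, and branch below are placeholders, and this sketch assumes the AIHub instance accepts uploads through the usual Hub upload routes.

```python
from huggingface_hub import HfApi

# Placeholder endpoint, token, and repository name for an AIHub deployment.
api = HfApi(endpoint="https://aihub.example.internal", token="YOUR_AIHUB_TOKEN")

repo_id = "acme/support-bot-llm"
api.create_repo(repo_id, repo_type="model", exist_ok=True)

# Keep the fine-tuned weights as a separate revision alongside the original.
api.create_branch(repo_id, branch="v2-finetuned", exist_ok=True)
api.upload_folder(
    folder_path="./finetuned-checkpoint",
    repo_id=repo_id,
    repo_type="model",
    revision="v2-finetuned",
    commit_message="Add fine-tuned weights",
)
```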
Multiple AIHub instances can be hosted within a single AIStor tenant. Administrators can configure these instances either through the AIStor Console or declaratively with YAML configurations for flexible and scalable management.
Users only need to set an environment variable such as HF_ENDPOINT to point their existing Hugging Face workflows at AIHub. This simplicity reduces setup time and makes integration seamless for ML engineers.
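A minimal sketch, assuming an AIHub instance at a placeholder URL: set HF_ENDPOINT before the Hugging Face libraries are imported, and existing download calls work unchanged.

```python
import os

# Point the Hugging Face tooling at AIHub before importing the library,
# since the endpoint is read at import time. The URL is a placeholder.
os.environ["HF_ENDPOINT"] = "https://aihub.example.internal"

from huggingface_hub import snapshot_download

# Unchanged from a normal Hugging Face workflow; AIHub serves the files
# from the AIStor bucket, fetching them upstream only on the first request.
local_dir = snapshot_download("bert-base-uncased")
print(local_dir)
```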