Azazelle

MN-Halide-12b-v1.0

Dedicated Endpoints

Run inference for this model on a single-tenant GPU with unmatched speed and reliability at scale.

Container

Run inference for this model with full control and performance in your own environment.


Get help setting up custom Dedicated Endpoints.

Talk with our engineers to get a quote for reserved GPU instances at discounted rates.

Model provider

Azazelle

Model tree

Base: Epiculous/Azure_Dusk-v0.2
Base: mpasila/Mistral-freeLiPPA-LoRA-12B
Base: Epiculous/Crimson_Dawn-v0.2
Base: nbeerbower/mistral-nemo-bophades-12B
Base: jeiku/Aura-NeMo-12B
Base: nbeerbower/mistral-nemo-wissenschaft-12B
Base: nbeerbower/mistral-nemo-cc-12B
Base: elinas/Chronos-Gold-12B-1.0
Base: TheDrummer/Rocinante-12B-v1.1
Base: jtatman/mistral_nemo_12b_reasoning_psychology_lora
Base: anthracite-org/magnum-v2.5-12b-kto
Base: anthracite-org/magnum-v2-12b
Base: nbeerbower/Lyra4-Gutenberg-12B
Base: SillyTilly/mistralai_Mistral-Nemo-Base-2407
Base: UsernameJustAnother/Nemo-12B-Marlin-v8
Base: nbeerbower/mistral-nemo-gutenberg-12B-v4
Base: TheDrummer/Rocinante-12B-v1
Merged: this model

Modalities

Input: Text

Output: Text

Pricing

Dedicated Endpoints


Supported Functionality

Model APIs

Dedicated Endpoints

Container
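Since the model is listed with text-in/text-out Model APIs, here is a minimal sketch of building a chat-completion request body for it. This assumes an OpenAI-compatible chat-completions interface; the base URL constant and the exact model ID string are illustrative assumptions, not documented values.

```python
import json

# Assumed values for illustration only -- check the provider's docs
# for the real endpoint host and model identifier.
BASE_URL = "https://api.friendli.ai"  # assumption, not a documented URL
MODEL_ID = "Azazelle/MN-Halide-12b-v1.0"

def build_chat_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completion request body (text in, text out)."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

body = build_chat_request("Write a haiku about GPUs.")
print(json.dumps(body, indent=2))
```

The body would then be POSTed with an authorization header to the endpoint's `/v1/chat/completions` path (path also an assumption based on the OpenAI-compatible convention).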

More information

Explore FriendliAI today