🧩 Model Card: LiquidAI/LFM2-1.2B

▶️ Run with FastFlowLM in PowerShell:

flm run lfm2:1.2b

🧩 Model Card: LiquidAI/LFM2-2.6B

▶️ Run with FastFlowLM in PowerShell:

flm run lfm2:2.6b

🧩 Model Card: LiquidAI/LFM2-2.6B-Transcript

▶️ Run with FastFlowLM in PowerShell:

flm run lfm2-trans:2.6b

⚠️ This model is intended for single-turn conversations with a specific prompt format, described in the model card.

🧩 Model Card: LiquidAI/LFM2.5-1.2B-Instruct

▶️ Run with FastFlowLM in PowerShell:

flm run lfm2.5-it:1.2b

🧩 Model Card: LiquidAI/LFM2.5-1.2B-Thinking

▶️ Run with FastFlowLM in PowerShell:

flm run lfm2.5-tk:1.2b
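
The five run commands above differ only in the model tag passed to `flm run`. As a minimal POSIX shell sketch (the helper name is hypothetical; the tags are copied from the cards above), a small lookup can map a friendly model name to its FastFlowLM tag and print the corresponding command:

```shell
# Hypothetical helper: map a short model name to its FastFlowLM tag and
# build the launch command. Assumes only the `flm run <tag>` syntax shown
# above; the tag table mirrors the model cards in this document.
build_flm_command() {
  case "$1" in
    lfm2-1.2b)       tag="lfm2:1.2b" ;;
    lfm2-2.6b)       tag="lfm2:2.6b" ;;
    lfm2-transcript) tag="lfm2-trans:2.6b" ;;
    lfm2.5-instruct) tag="lfm2.5-it:1.2b" ;;
    lfm2.5-thinking) tag="lfm2.5-tk:1.2b" ;;
    *) echo "unknown model: $1" >&2; return 1 ;;
  esac
  echo "flm run $tag"
}

build_flm_command lfm2.5-thinking
# → flm run lfm2.5-tk:1.2b
```

To actually launch a model, pipe the printed command to your shell or call `flm run` directly with the tag, as in the per-model instructions above.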