Local LLMs for Low-RAM Laptops
Three lightweight, open-source LLMs that actually run on 8–16 GB of RAM, no GPU needed