Zed
@zed.dev
about 2 months ago

See git diff stats in the git panel next to each entry.
Enable with `"git_panel": { "diff_stats": true }`
Thanks bobbymannino!
Use Vercel AI Gateway as an LLM provider. Thanks dancer!
Astro 6 is here, and upgrading to it on Netlify is one command: npx @astrojs/upgrade
Updates Astro, the adapter, and all official integrations at once.
Full changelog with migration tips at the link.
netlify.com/changelog/2026-03-10-astro-6/

Dropping 3 announcements in a single talk at Vue Amsterdam this week.
@vuejsamsterdam.bsky.social

Our newsletter is back with the first edition since last summer:
blog.val.town/newsletter-26
If it's been a while since you've visited Val Town, you can read this to catch up on what we're up to...and to RSVP to our party tomorrow :)

AI writes the code. You're still responsible for what it ships.
Three things that actually help: Deploy Previews for visual sign-off, secret scanning before every deploy, and Observability to catch what slips through.
The full breakdown: www.netlify.com/blog/how-to-...

I think I've become way too into colons. I Ctrl+F'd 15 of them in the ~1,500-word blog post I just published.
⌘ Built with Command Code.
star the repo
└ github.com/ahmadawais/...
└ npmjs.com/package/mmm...
└ built with @CommandCodeAI
└ $ npm i -g command-code

No dashboards.
No browser spelunking. Big win. Need I say more.
Just:
$ npx mmmodels
Or install it globally:
$ npm i -g mmmodels
If you work with models a lot and prefer terminals over tabs, try it.

Tables are width-aware. Each column has min/max widths and alignment. The renderer fits the table to the current terminal width, and if the requested columns still do not fit, it errors instead of wrapping into unreadable soup.
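That fitting step could look roughly like this; a minimal sketch assuming a fixed two-space gap between columns, and every name here (`Column`, `fitColumns`) is hypothetical rather than mmmodels' actual code:

```typescript
interface Column { name: string; min: number; max: number }

// Shrink columns toward their minimum widths until the table fits the
// terminal; if even the minimums overflow, error out instead of wrapping.
function fitColumns(cols: Column[], termWidth: number): number[] {
  const gap = 2; // assumed two-space gap between columns
  const overhead = gap * (cols.length - 1);
  const widths = cols.map((c) => c.max);
  const total = () => widths.reduce((a, b) => a + b, 0) + overhead;

  while (total() > termWidth) {
    // shrink the widest column that is still above its minimum
    let best = -1;
    for (let i = 0; i < cols.length; i++) {
      if (widths[i] > cols[i].min && (best === -1 || widths[i] > widths[best])) best = i;
    }
    if (best === -1) {
      // nothing left to shrink: fail cleanly rather than emit unreadable soup
      throw new Error(`requested columns do not fit in ${termWidth} columns`);
    }
    widths[best] -= 1;
  }
  return widths;
}
```

The deliberate choice is the `throw`: a table that silently wraps is worse than an error telling you to drop a column or widen the terminal.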
`--plain` actually means plain:
- no banner
- no color
- no spinner
- booleans rendered as `yes/no`
- ASCII connectors instead of box-drawing glyphs

Use `--sync` or `-s` with any command to fetch live.
Normal mode goes memory -> disk -> network.
If the fetch fails, it falls back to disk cache instead of hard-failing.
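That lookup chain could be sketched like this; a hypothetical shape, using a `Map` to stand in for the tmp-dir disk cache and a plain callback for the network fetch, not mmmodels' actual internals:

```typescript
// Sketch of memory -> disk -> network, with the live --sync path
// degrading to the disk cache instead of hard-failing when offline.
class ModelCache {
  private memory: string | null = null;
  constructor(
    private disk: Map<string, string>,   // stands in for a tmp-dir cache file
    private fetchLive: () => string,     // stands in for the network fetch
  ) {}

  // Normal mode: memory -> disk -> network.
  get(): string {
    if (this.memory !== null) return this.memory;
    const cached = this.disk.get("models");
    if (cached !== undefined) return (this.memory = cached);
    return this.sync();
  }

  // --sync / -s: always try the network first.
  sync(): string {
    try {
      const fresh = this.fetchLive();
      this.disk.set("models", fresh);
      return (this.memory = fresh);
    } catch (err) {
      const cached = this.disk.get("models");
      if (cached !== undefined) return cached; // offline-friendly fallback
      throw err; // nothing cached anywhere: surface the failure
    }
  }
}
```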
Queries are tokenized, normalized, version-aware, and AND-matched across candidates. Version tokens like `4.6` are handled carefully so they do not accidentally match `4.5`.
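The version-token care could work roughly like this; a minimal sketch, not the real tokenizer: version-shaped tokens must equal a whole candidate token, while name tokens may substring-match, and all query tokens are AND-ed:

```typescript
// Version-shaped tokens: digits joined by dots, e.g. "4", "4.6".
const VERSION = /^\d+(\.\d+)*$/;

// Lowercase and split on anything that is not a letter, digit, or dot,
// so "claude-4.6-sonnet" becomes ["claude", "4.6", "sonnet"].
function tokenize(s: string): string[] {
  return s.toLowerCase().split(/[^a-z0-9.]+/).filter(Boolean);
}

function matches(query: string, candidateId: string): boolean {
  const candTokens = tokenize(candidateId);
  return tokenize(query).every((tok) =>
    VERSION.test(tok)
      ? candTokens.includes(tok)                 // versions match exactly
      : candTokens.some((c) => c.includes(tok))  // names may substring-match
  );
}
```

Keeping the dot inside tokens is what stops `4.6` from fuzzy-matching `4.5`: the version is compared as one unit, never digit by digit.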
Caching is simple on purpose:
- in-process cache
- disk cache in tmp
- network fetch from source

A few examples:
$ mmmodels claude
$ mmmodels list --provider anthropic --table
$ mmmodels search gpt --provider openai --json
$ mmmodels search claude --fields id,provider_id,limit.context,cost.input

What it does well:
- no-arg interactive TUI for browsing
- fuzzy search across model IDs, model names, and provider names
- filtering by provider, capabilities, and status
- explicit sorting and limiting with `--sort` and `--limit`
- agent-friendly output with `--fields`, `--ids-only`, `--ndjson`, and `--json`
- width-aware terminal tables that fail cleanly instead of overflowing
- `--plain` mode for scripts, CI, and remote boxes
- local disk cache with offline-friendly fallback behavior

`mmmodels`: all you need.

It cuts the cognitive load of issues like: names drift, IDs collide across providers, and pricing and capability metadata change constantly. The browser/tab workflow is too slow if you do this often (as someone building a frontier coding agent).

One subtle feature I like: provider-aware ranking.
If the same model family appears from multiple providers, search prefers the default source for that family instead of returning an arbitrary duplicate first.
Under the hood the search is custom-scored. PRs welcome.
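A sketch of that provider-aware preference, assuming a hand-maintained map of default sources; the real custom scoring is not shown in the thread, and every name here is hypothetical:

```typescript
interface Model { id: string; family: string; provider: string }

// Assumed mapping of model families to their canonical provider.
const DEFAULT_SOURCE: Record<string, string> = {
  claude: "anthropic",
  gpt: "openai",
};

// Within a family, float the default-source entry above rebranded
// duplicates; leave the relative order of different families alone.
function rankDuplicates(results: Model[]): Model[] {
  return [...results].sort((a, b) => {
    if (a.family !== b.family) return 0;
    const def = DEFAULT_SOURCE[a.family];
    return Number(b.provider === def) - Number(a.provider === def);
  });
}
```

This leans on `Array.prototype.sort` being stable (guaranteed in modern JavaScript), so everything except the default/duplicate swap keeps its original score order.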
Again built with Command Code, with my CLI taste.

I wanted one terminal-native place to answer questions like:
- what models exist (fuzzy search)
- who ships them at what price
- how much context they have
- what they cost (esp caching)
- which ones support tools, reasoning, files, or structured output
The data is all there. The workflow as a CLI was missing.

Introducing mmmodels 𝌭
𝚖𝚖𝚖𝚘𝚍𝚎𝚕𝚜 is a CLI for browsing, filtering, and exploring AI models from hundreds of providers.
Built for both humans and agents.
$ 𝚗𝚙𝚡 𝚖𝚖𝚖𝚘𝚍𝚎𝚕𝚜

AI use case: Code search
AI can instantly find functions, files, or patterns across huge codebases. It helps you jump straight to the code that matters.
How do you search through your codebase?

Does anyone have any cool blogs they follow? Updating my feed reader with new stuff, what should I add?