Problem: Claude Code 2.1.0 crashes with "Invalid Version: 2.1.0 (2026-01-07)" because the CHANGELOG.md format changed to include dates in version headers (e.g., "## 2.1.0 (2026-01-07)"). The code parses these headers as object keys and tries to sort them with semver's .gt() function, which cannot parse version strings that carry a date suffix.
Affected functions: W37, gw0, and an unnamed function around line 3091 that fetches recent release notes.
Fix: Wrap version strings with semver.coerce() before comparison. Run these four sed commands on cli.js:
CLI_JS="$HOME/.nvm/versions/node/$(node -v)/lib/node_modules/@anthropic-ai/claude-code/cli.js"
# Backup first
cp "$CLI_JS" "$CLI_JS.backup"
# Patch 1: Fix ve2.gt sort (recent release notes)
sed -i 's/Object\.keys(B)\.sort((Y,J)=>ve2\.gt(Y,J,{loose:!0})?-1:1)/Object.keys(B).sort((Y,J)=>ve2.gt(ve2.coerce(Y),ve2.coerce(J),{loose:!0})?-1:1)/g' "$CLI_JS"
# Patch 2: Fix gw0 sort
sed -i 's/sort((G,Z)=>Wt\.gt(G,Z,{loose:!0})?1:-1)/sort((G,Z)=>Wt.gt(Wt.coerce(G),Wt.coerce(Z),{loose:!0})?1:-1)/g' "$CLI_JS"
# Patch 3: Fix W37 filter
sed -i 's/filter((\[J\])=>!Y||Wt\.gt(J,Y,{loose:!0}))/filter(([J])=>!Y||Wt.gt(Wt.coerce(J),Y,{loose:!0}))/g' "$CLI_JS"
# Patch 4: Fix W37 sort
sed -i 's/sort((\[J\],\[X\])=>Wt\.gt(J,X,{loose:!0})?-1:1)/sort(([J],[X])=>Wt.gt(Wt.coerce(J),Wt.coerce(X),{loose:!0})?-1:1)/g' "$CLI_JS"
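To see why coerce() fixes the comparison, here is a minimal sketch of the failure mode. The coerce() below is a hand-rolled stand-in written for illustration only; the actual patch calls the semver package's coerce(), which likewise extracts the leading x.y.z from a string the strict parser would reject.

```javascript
// Changelog keys now look like "2.1.0 (2026-01-07)"; strict semver parsing
// rejects the date suffix, which is what crashes the unpatched sort.
// Stand-in for semver.coerce(): pull out the leading x.y.z, if any.
const coerce = (s) => {
  const m = String(s).match(/\d+\.\d+\.\d+/);
  return m ? m[0] : null;
};

// Sort changelog keys newest-first, comparing only the coerced versions.
const keys = ["2.0.9", "2.1.0 (2026-01-07)", "2.0.14 (2025-12-18)"];
keys.sort((a, b) => {
  const [x, y] = [coerce(a), coerce(b)].map((v) => v.split(".").map(Number));
  for (let i = 0; i < 3; i++) if (x[i] !== y[i]) return y[i] - x[i];
  return 0;
});
console.log(keys[0]); // "2.1.0 (2026-01-07)"
```

Note that the original key (with its date suffix) is preserved; only the comparison operates on the coerced value, which is exactly what the four patches above do.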
Note: If Claude Code was installed via a different method, adjust the CLI_JS path accordingly (e.g., /usr/lib/node_modules/@anthropic-ai/claude-code/cli.js).
It's a good idea. Obviously you can't preemptively OCR all images, but having a "context menu -> follow link" action that works on QR codes and on images containing links seems totally doable to me.
I think a separate program might be better, since it could be used with any video source in any program, including a QR code made up of multiple pictures, one rendered in CSS, a video, etc. Furthermore, the payload is not necessarily a URL.
However, it might also be made as a browser extension in case you do not want to use a separate program, or if you want to be able to follow such links directly without going through another program.
The datacenter OS doesn't have to be the same as the developer OS. At my work (of similar scale) the datacenters all run Linux, but very nearly all developers are on macOS.
macOS is a popular choice for dev boxes (if I understand correctly, Apple provides some pretty good tooling for managing a fleet of machines; more expensive hardware than a Linux dev machine fleet, but less DIY for company-wide administration).
... but Google solves the "A Linux fleet requires investment to maintain" problem by investing. They maintain their own in-house distro.
Not really; it is just a well-known outside distro plus internal CI servers to make sure that newly updated packages don't break things. Also some internal tools, of course.
Relative to what the rest of the world does, that is maintaining your own in-house distro.
It's downstream of Ubuntu (unless that's changed) but it's tweaked in the ways you've noted (trying to remember if they also maintain their own package mirrors or if they trust apt to fetch from public repositories; that's a detail I no longer recall).
Not to be a jerk, but 'hundreds of devs and dozens of MRs per day' is not 'huge repos'. Certain functionality only becomes relevant at scale, and what is easy on a repo of hundreds of megabytes no longer works once you have terabytes of source code to deal with.
Google's monorepo is in fact terabytes with no binaries. It does stretch the definition of source code though - a lot of that is configuration files (at worst, text protos) which are automatically generated.
Dang, that's mind-boggling, especially if I keep in mind that a book series like The Lord of the Rings is a few megabytes if saved as plain text.
Having 86 TB of plain text/source code - I can't fathom the scale, honestly
Are you absolutely sure there aren't binaries in there? (Honestly asking; the scale is just insane from my perspective. Even the largest book compilation, like Anna's, doesn't approach that number if you strip out images, and that's pretty much all books in circulation, with multiple versions per title.)
I'm pretty sure Meta's team has written about that at length. It comes down to many things: infrastructure (power/transportation/internet/energy), the political situation, the available workforce, proximity to population centers, property prices, and a whole lot more.
That's disappointing. They've done a great job making plant-based meat ubiquitous and stripping away some of the hippie aura that has kept many people from trying the alternatives. I really hope they can turn it around, both selfishly, as a happy customer, and for the planet.
I disagree. I'm reading and typing this from an iPhone 13 mini. I use a big one for work, so it's not like I don't know what I'm missing. I very strongly prefer the small form factor.