- [327] DeepSeek uses banned Nvidia chips for AI model, report says (finance.yahoo.com)
- [376] Django: what’s new in 6.0 (adamj.eu)
- [274] Revisiting "Let's Build a Compiler" (eli.thegreenplace.net)
- [368] I got an Nvidia GH200 server for €7.5k on Reddit and converted it to a desktop (dnhkng.github.io)
- [27] Show HN: Jottings; Anti-social microblog for your thoughts (jottings.me)
- [189] Flow: Actor-based language for C++, used by FoundationDB (github.com)
- [75] Qualcomm acquires RISC-V focused Ventana Micro Systems (qualcomm.com)
- [3] New window insulation blocks heat, but not your view (techxplore.com)
- [6] African athlete lured to Russia for work, ends up on front lines of Ukraine war (abc.net.au)
- [22] How Geometry Is Fundamental for Chess (lichess.org)
- [328] Is it a bubble? (oaktreecapital.com)
- [257] Vibe coding is mad depressing (law.gmnz.xyz)
- [3] Mount Git repo to view commits and branches as files (github.com)
- [127] Agentic AI Foundation (block.xyz)
- [157] My favourite small hash table (corsix.org)
- [8] Folkscanomy: Tandy and Radio Shack Books (archive.org)
- [117] Nova Programming Language (nova-lang.net)
- [60] Intermittent hypoxia increases blood flow and benefits executive function (onlinelibrary.wiley.com)
- [886] Valve: HDMI Forum Continues to Block HDMI 2.1 for Linux (heise.de)
- [11] China's trade surplus tops record US$1T, defying trade war uncertainty (scmp.com)
- [175] No more O'Reilly subscriptions for me (zerokspot.com)
- [24] Show HN: GPULlama3.java Llama Compiled to PTX/OpenCL Now Integrated in Quarkus
- [98] Cloudflare error page generator (github.com)
- [15] BehindTheMedspeak: A Spinal Tap (bookofjoe2.blogspot.com)
- [4] Magit-insert-worktrees improves status buffers (huonw.github.io)
- [26] Huge undersea wall dating from 5000 BC found in France (bbc.com)
- [87] Qt, Linux and everything: Debugging Qt WebAssembly (qtandeverything.blogspot.com)
- [44] Show HN: Gotui – a modern Go terminal dashboard library (github.com)
- [286] Donating the Model Context Protocol and establishing the Agentic AI Foundation (anthropic.com)
- [12] An LLM trained only on data from certain time periods to reduce modern bias (github.com)