On Friday OpenAI dropped GPT-5.5-Cyber, a model fine-tuned for security operations and shipped behind a "Trusted Access" tier rather than the consumer product. The announcement leans hard on defender workflows: incident response, vulnerability research, and patch generation are the three workloads called out by name.
The interesting move is the access pattern. OpenAI is gating the model behind enterprise verification because it can meaningfully accelerate exploit development as well — the same dual-use line Anthropic walked when it shipped cyber-tuned variants of Claude. If you do appsec or run a red team, this is the one to evaluate this week.
On May 7 OpenAI shipped a refreshed line of voice models in the API alongside Trusted Contact in ChatGPT — a way to verify a human is on the other end of a call before the model keeps acting on what it hears. Both releases point at the same problem: agentic flows are now happening over phone lines, and the attack surface for audio-borne prompt injection is real.
For anyone wiring an agent into a call centre or voice interface, the new latency and steerability numbers are worth benchmarking against your current stack. The rollout came the same day Perplexity put its Personal Computer agent on Mac; the plumbing around agents, not the models themselves, is becoming the story.
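A like-for-like harness beats eyeballing vendor numbers. Below is a minimal sketch that assumes nothing about OpenAI's actual API: `current_tts` and `new_voice_model` are hypothetical stand-ins for whatever synthesis call your stack already makes.

```python
import statistics
import time

def bench(call, n: int = 20) -> tuple[float, float]:
    """Time a voice-synthesis callable n times; return (median, ~p95) latency in ms."""
    samples = []
    for _ in range(n):
        t0 = time.perf_counter()
        call()  # one full request/response round trip
        samples.append((time.perf_counter() - t0) * 1000)
    samples.sort()
    return statistics.median(samples), samples[max(0, int(n * 0.95) - 1)]

# Hypothetical usage: wrap your current stack and the new model behind the
# same callable, then compare the two pairs of numbers.
# p50_old, p95_old = bench(lambda: current_tts("May I confirm your address?"))
# p50_new, p95_new = bench(lambda: new_voice_model("May I confirm your address?"))

if __name__ == "__main__":
    # Demo against a dummy 10 ms "model" so the harness itself is testable.
    print(bench(lambda: time.sleep(0.01)))
```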
Friday May 8 saw Linus push 7.0.5 stable and Greg KH stamp 6.18.28, 6.12.87, and 6.6.138 on the longterm branches in a single coordinated drop — the first proper batch since the 7.0 transition in mid-April.
If you run prod boxes on 6.6 LTS (RHEL 9 derivatives, Debian stable point releases), 6.6.138 is the one to schedule. The 6.18 longterm remains the migration target for anyone jumping off the older 6.x stable series as they reach end of life.
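A fleet audit is the boring first step. A minimal sketch, assuming only that hosts report a `uname -r`-style release string; the 6.6.138 target is the point release named above.

```python
import platform
import re

TARGET = (6, 6, 138)  # the 6.6 LTS point release called out above

def running_kernel() -> tuple[int, ...]:
    """Parse a `uname -r`-style release, e.g. '6.6.120-amd64' -> (6, 6, 120)."""
    return tuple(int(n) for n in re.findall(r"\d+", platform.release())[:3])

kernel = running_kernel()
if kernel[:2] == (6, 6) and kernel < TARGET:
    print(f"6.6 LTS host on {platform.release()}: behind 6.6.138, schedule the update")
```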
Beijing-based Moonshot — the lab behind the open-weight Kimi K-series — closed roughly $2B in fresh capital at a $20B valuation this week. It's a striking number for a Chinese lab that has been competing on open weights rather than hosted product, and it lands while OpenAI's own custom-silicon push with Broadcom is reportedly stuck on an $18B financing hurdle.
Sources: Crunchbase News, TechCrunch
Meta, Amazon, Microsoft, and Alphabet have now collectively signalled roughly $725B of 2026 capex — up more than 75% year-on-year — with nearly the entire delta earmarked for data centres, custom silicon, GPUs, and AI training runs. The number lands the same week OpenAI's Broadcom partnership reportedly hit an $18B financing wall, forcing a rework of how the chip programme gets funded.
For anyone planning capacity, pricing, or a startup that depends on inference economics, the takeaway isn't the absolute number — it's the slope of the curve. The hyperscaler buildout is now visibly compressing supplier roadmaps.
The pitch is simple: if your qubit is a CMOS transistor at millikelvin, the world already knows how to manufacture you a million of them.
UCL spinout Quantum Motion closed a $160M Series C this week co-led by Mundi Ventures and DCVC, betting on silicon spin-qubit chips that piggy-back on existing semiconductor fabs. It's the unglamorous quantum thesis — no superconducting cryostats, no trapped ions, just transistors at sub-Kelvin temperatures — and the round suggests the patient money is quietly siding with it.
NASA confirmed this week that RAVEN — a transit-signal classifier trained on the Transiting Exoplanet Survey Satellite (TESS) archive — has surfaced more than one hundred exoplanets, including 31 brand-new worlds. The interesting catches sit in the so-called Neptunian desert, the close-in region where Neptune-sized planets almost never survive, and on ultra-short orbits under twenty-four hours.
RAVEN didn't find anything new in the universe; it found things humans had already collected and never looked at. The bottleneck wasn't telescope time. It was attention.
A re-analysis of samples from Curiosity's Cumberland drill site has turned up seven previously undetected organic molecules preserved in clay-rich rock that once held standing water. None of them are biosignatures on their own, but the diversity — long-chain hydrocarbons sitting alongside sulphur-bearing organics — is the most varied set the rover has produced.
The team used a one-shot pyrolysis technique they had been saving for high-confidence targets, which is why the result took a decade to surface from samples drilled in 2013. The chemistry is consistent with prebiotic processes; it is equally consistent with the absence of any extant biology.
Researchers in Ecuador documented a Markia-species katydid sliding through a colour change that the field had assumed impossible at this taxonomic level — pink in low light, green in bright. Reversible carotenoid-mediated colour change shows up in cephalopods and a few fish; a katydid pulling off the same trick on a longer timescale forces a small rewrite of what insect cuticle chromatophores can do.
The paper notes the animal isn't camouflaging in the moment — it's tracking the dominant wavelength of its microhabitat over hours. Slow camouflage. A new category, basically.
The Internet Archive has stood up an independent Swiss foundation in St. Gallen, jurisdictionally severed from the US-based parent and tasked with preserving endangered archives — including, notably, the weights of public generative AI models. The piece is partly about copyright risk-spreading and partly about the realisation that, of all digital artefacts, frontier model weights might be the most ephemeral. A partnership with the University of St. Gallen's Computer Science school anchors the technical side.
→ HN discussion

Six days into the experiment, Jarred Sumner reports a ported-to-Rust Bun runtime — roughly 960k lines — passing 99.8% of the existing Zig test suite on Linux glibc. He is explicit that this is exploratory ("a very high chance all this code gets thrown out") and that the goal is a side-by-side comparison on perf, memory, maintainability and DX. The thread is mostly an argument about what an AI-assisted port at this scale actually proves.
→ HN discussion

Chris Morgan documents a Caddyfile policy that returns an error for any unknown query parameter on his personal site — a blanket refusal aimed squarely at tracking parameters appended by social platforms and link shorteners. The comments split predictably between "elegant minimalism" and "this breaks things in ways you haven't thought about," which is exactly the conversation worth having.
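The policy is easy to prototype outside Caddy before committing. A minimal Python sketch of the same idea, not Morgan's actual Caddyfile: a WSGI app that answers 400 to any request carrying a query parameter outside a hypothetical allowlist.

```python
from urllib.parse import parse_qs

# Hypothetical allowlist; the real rules would mirror your site's actual URLs.
ALLOWED_PARAMS = {"page", "q"}

def reject_unknown_params(environ, start_response):
    """WSGI app: 400 on any query parameter outside the allowlist, 200 otherwise."""
    params = parse_qs(environ.get("QUERY_STRING", ""), keep_blank_values=True)
    unknown = set(params) - ALLOWED_PARAMS
    if unknown:
        body = f"unknown query parameter(s): {', '.join(sorted(unknown))}".encode()
        start_response("400 Bad Request", [("Content-Type", "text/plain")])
        return [body]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

if __name__ == "__main__":
    from wsgiref.simple_server import make_server
    # /?q=term passes; /?utm_source=... gets a 400.
    make_server("127.0.0.1", 8000, reject_unknown_params).serve_forever()
```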
→ HN discussion

ymawky is a static-file HTTP server written entirely in ARM64 assembly for macOS — fork-per-connection, no libc, syscall-only. It supports GET/PUT/DELETE/OPTIONS/HEAD, guards against path traversal, and includes slowloris mitigations. The author's framing is honest: this is recreational assembly programming pretending to be a useful project, and it's the better for it.
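The path-traversal guard is the part worth stealing whatever your language. A sketch of the standard technique in Python, not a transcription of ymawky's assembly: resolve the requested path under the document root and reject anything that escapes it (`DOC_ROOT` here is a made-up root).

```python
from pathlib import Path

DOC_ROOT = Path("/srv/www").resolve()  # hypothetical document root

def safe_resolve(url_path: str) -> Path | None:
    """Map a request path under DOC_ROOT; return None on a traversal attempt."""
    candidate = (DOC_ROOT / url_path.lstrip("/")).resolve()
    # resolve() collapses ../ segments and symlinks, so anything that lands
    # outside the root after resolution is a traversal attempt.
    if candidate == DOC_ROOT or DOC_ROOT in candidate.parents:
        return candidate
    return None

print(safe_resolve("css/site.css"))      # /srv/www/css/site.css
print(safe_resolve("../../etc/passwd"))  # None
```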
→ HN discussion

Google extended Gemini's managed RAG product so it now indexes images alongside text using Gemini Embedding 2, with custom metadata filters and page-level citations. The feature target is the same one Anthropic's Files API and OpenAI's File Search have been chasing — make "verifiable retrieval over your enterprise corpus" a one-API-call problem rather than a six-week pipeline build.
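The shape of the feature is easy to see in miniature. A toy Python sketch of the pattern, filtering on metadata before ranking and attaching page-level citations; the index, scores, and field names are all invented, and none of this guesses at Google's actual API surface.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    doc: str
    page: int
    text: str
    meta: dict
    score: float  # stand-in for the embedding-similarity score the service computes

# Toy corpus; a managed RAG service builds and scores this index for you.
INDEX = [
    Chunk("handbook.pdf", 12, "Travel must be pre-approved by a manager.", {"dept": "finance"}, 0.91),
    Chunk("handbook.pdf", 3, "Laptops are refreshed on a three-year cycle.", {"dept": "it"}, 0.88),
    Chunk("budget.xlsx", 1, "Q3 travel spend exceeded plan by 12%.", {"dept": "finance"}, 0.76),
]

def retrieve(meta_filter: dict, k: int = 5) -> list[str]:
    """Metadata filter first, then rank; each hit carries a page-level citation."""
    hits = [c for c in INDEX if all(c.meta.get(key) == val for key, val in meta_filter.items())]
    hits.sort(key=lambda c: c.score, reverse=True)
    return [f"{c.text} [{c.doc}, p.{c.page}]" for c in hits[:k]]

print(retrieve({"dept": "finance"}))
```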
→ HN discussion

"Frontier models still do the hard agentic work, but everything routed through the gateway is a control surface we own — not a contract we hope our vendor honours." — iMARS team, paraphrased from the post