Hacker News new | past | comments | ask | show | jobs | submit

Python 3.15's JIT is now back on track

https://fidget-spinner.github.io/posts/jit-on-track.html
Python really needs to take the TypeScript approach of "all valid Python 3 is valid Python 4". And then add value types so we can have int64 etc., and allow object refs to be frozen after instantiation to avoid the indirection tax.

Sensible type-annotated Python code could be so much faster if it didn't have to assume everything could change at any time. Most things don't change, and if they do, they change at startup (e.g. ORM bindings).
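To make the "everything could change" point concrete, here's a minimal sketch (the `Circle` class is made up for illustration) of the kind of runtime rebinding the interpreter has to guard against at every call site:

```python
# Why CPython can't assume attributes are stable: any module may rebind
# a method after startup, so every call site must re-check what `area`
# currently refers to instead of binding it once.

class Circle:
    def __init__(self, r):
        self.r = r

    def area(self):
        return 3.14159 * self.r * self.r

c = Circle(2.0)
before = c.area()

# ORM-style monkey-patching "at startup": rebinds the method on the class.
Circle.area = lambda self: 0.0
after = c.area()   # the very same call site now runs different code
```

If the language could promise that `Circle` is frozen after instantiation, that second lookup (and its indirection) could be compiled away.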

But that's just not what Python is for. Move your performance-critical logic into a native module.
SPy [1] is a new attempt at something like this.

TL;DR: SPy is a variant of Python specifically designed to be statically compilable while retaining a lot of the "useful" dynamic parts of Python.

The effort is led by Antonio Cuni, Principal Software Engineer at Anaconda. Still very early days but it seems promising to me.

[1] https://github.com/spylang/spy

I've been occasionally glancing at the PR/issue tracker to keep up to date with the JIT work, but I've never seen where the high-level discussions happen; the issues and PRs always jump right into the gritty details. Is there a high-level introduction/example anywhere of how trace projection and trace recording work and differ? Googling the terms often returns the CPython issue tracker as the first result, and the repo's jit.md is relatively barebones and rarely updated :(

Similarly, I don't entirely understand refcount elimination: I've seen the codegen difference, but since the codegen happens at build time, does this mean each opcode is possibly split into two (or more?) stencils, with and without the removed increfs/decrefs? With so many opcodes and their specialized variants, how many stencils are there now?
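For anyone unfamiliar with what those increfs/decrefs are, here's a small demonstration (CPython-specific, since `sys.getrefcount` reflects its refcounting scheme) of the bookkeeping that refcount elimination tries to skip:

```python
import sys

# Every extra binding to an object bumps its reference count, and the
# interpreter normally emits an incref/decref pair for each one.

x = object()
base = sys.getrefcount(x)      # note: the getrefcount call itself holds one temporary ref

y = x                          # one more reference -> incref
with_alias = sys.getrefcount(x)

del y                          # reference dropped -> decref
after_del = sys.getrefcount(x)

# A JIT that can prove `y` never escapes could elide that incref/decref
# pair entirely; the stencil question is about how many pre-generated
# code templates exist with and without those operations baked in.
```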

Oh man, Python 2 -> 3 was such a massive shift. It took almost half a decade, if not more, and yet it mainly changed superficial syntax stuff. They should have allowed ABIs to break and gotten these internal things done, and probably come up with a new, tighter API for integrating with other lower-level languages, so that going forward Python internals could be changed more freely without breaking everything.
I cannot believe people are still acting like Python 2 -> 3 was a huge fuck-up and an enormous missed opportunity, when in reality Python is by most measures the most popular language and became so AFTER that switch.

Since the switch we have seen enormous companies built from scratch. There is no reason for anyone to be complaining about it being too hard to upgrade in 2026.

Those are unrelated.
I'm curious whether the JIT developers could mention any Python features that prevent promising JIT optimizations. An earlier blog post by Ken Jin [1] mentions how __del__ complicates reference-counting optimization.

There is a story that Python is harder to optimize than, say, TypeScript, with Python's flexibility and the C API usually getting mentioned. If the list of troublesome Python features were out there, programmers could learn to avoid them, with the promise of the JIT activating when it can prove a feature is not in use. This could provide a way out of the current hard-to-JIT trap. It's just the gist of an idea, but an interesting first step would certainly be to hear from the JIT people which Python features they find troublesome.

[1] https://fidget-spinner.github.io/posts/faster-jit-plan.html
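The __del__ hazard mentioned above can be seen directly. A sketch (CPython-specific, since it relies on refcounting finalizing objects immediately) of why a JIT can't silently drop an incref/decref pair:

```python
# Dropping the last reference can run arbitrary user code via __del__,
# so eliminating a refcount operation could silently change *when* (or
# whether) that code runs; the JIT must prove no finalizer can fire.

log = []

class Noisy:
    def __del__(self):
        # Arbitrary side effect triggered by a refcount hitting zero.
        log.append("finalized")

def use(obj):
    return id(obj)

n = Noisy()
use(n)
assert log == []   # object still alive, finalizer hasn't run

del n              # last reference dropped -> __del__ runs immediately on CPython
```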

> However, I misunderstood and came up with an even more extreme version: instead of tracing versions of normal instructions, I had only one instruction responsible for tracing, and all instructions in the second table point to that. Yes I know this part is confusing, I’ll hopefully try to explain better one day. This turned out to be a really really good choice. I found that the initial dual table approach was so much slower due to a doubling of the size of the interpreter, causing huge compiled code bloat, and naturally a slowdown.

> By using only a single instruction and two tables, we only increase the interpreter by a size of 1 instruction, and also keep the base interpreter ultra fast. I affectionately call this mechanism dual dispatch.

I really do hope they'll write that better explanation one day because this sounds pretty intriguing all on its own.
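My reading of it, as a toy sketch (this is my own illustration, not CPython's actual design): a base dispatch table with one handler per opcode, plus a second table in which every slot points to the same single tracing handler, which records the instruction and then defers to the base table.

```python
# "Dual dispatch" sketch: two tables, but only ONE new instruction
# implementation, so the interpreter barely grows and the base
# (non-tracing) path stays untouched and fast.

trace_log = []

def op_load(vm, opcode):
    vm["stack"].append(vm["const"])

def op_add(vm, opcode):
    b, a = vm["stack"].pop(), vm["stack"].pop()
    vm["stack"].append(a + b)

BASE = {"LOAD": op_load, "ADD": op_add}

def op_trace(vm, opcode):
    trace_log.append(opcode)   # record the instruction for the tracer...
    BASE[opcode](vm, opcode)   # ...then run the normal handler

# The second table just repeats that single tracing instruction.
TRACING = {name: op_trace for name in BASE}

def run(program, table):
    vm = {"stack": [], "const": 21}
    for opcode in program:
        table[opcode](vm, opcode)
    return vm["stack"][-1]

fast = run(["LOAD", "LOAD", "ADD"], BASE)       # hot path: zero tracing overhead
traced = run(["LOAD", "LOAD", "ADD"], TRACING)  # same result, plus a recorded trace
```

Switching a running frame from BASE to TRACING is then just swapping which table it dispatches through.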

> We don’t have proper free-threading support yet, but we’re aiming for that in 3.15/3.16. The JIT is now back on track.

I recently read an interview about implementing free-threading and getting modifications through the ecosystem to really enable it: https://alexalejandre.com/programming/interview-with-ngoldba...

The guy said he hopes the free-threaded build will be the only one in "3.16 or 3.17"; I wonder if that should apply to the JIT too, or how the JIT and interpreter interact.

Doesn't PyPy already have a JIT compiler? Why aren't we using that?
Thanks for all the amazing work! I have a noob question: wouldn't this get the funding back? Or would that not be the preferable way to continue (as opposed to staying volunteer-driven)?

Like, it's a big deal to get a project to a state where volunteers are spun up, actively breaking down tasks, and getting work done, no? And it's a Python JIT, something I know next to nothing about (as do most application developers), which tells you how difficult this must have been.

I always wanted this for Python, but now that machines write code instead of humans, I feel like languages like Python won't be needed as much anymore. They're made for humans, not machines. If a machine is going to do the dirty work, I want it to produce something lean, fast, and strictly verified.
What is wrong with the Python code base that makes this so much harder to implement than seemingly every other one? Ruby, PHP, and JS all seemed to add JITs in significantly less time. A Python JIT has been asked for for like two decades at this point.
(what are blueberry, ripley, jones and prometheus?)
Sorry, but the graphs are completely unreadable. There are four code names for the lines. Which is the JIT and which is CPython?
The JIT work is exciting, but I wonder if the focus on trace-based optimization might hit a ceiling without more aggressive escape analysis. We've seen in V8 that inline caching combined with deoptimization can handle many dynamic cases that pure tracing struggles with. Any thoughts on whether CPython's JIT will eventually need a hybrid approach?
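For readers unfamiliar with the inline-caching technique being referenced, here's a toy sketch (my own illustration, loosely modeled on the V8-style idea, not any real engine's implementation) of a call site that caches the last-seen type and falls back to a slow path on a mismatch:

```python
# A monomorphic inline cache: the call site remembers the type it last
# saw and the resolved method. On a type match it skips the generic
# lookup; on a mismatch it "deoptimizes" to the slow path and re-caches.

class InlineCache:
    def __init__(self, name):
        self.name = name
        self.cached_type = None
        self.cached_fn = None
        self.misses = 0

    def call(self, obj):
        if type(obj) is self.cached_type:    # fast path: one type check
            return self.cached_fn(obj)
        self.misses += 1                     # slow path: generic lookup
        self.cached_type = type(obj)
        self.cached_fn = getattr(type(obj), self.name)
        return self.cached_fn(obj)

class A:
    def f(self): return "A"

class B:
    def f(self): return "B"

site = InlineCache("f")
results = [site.call(x) for x in [A(), A(), A(), B(), A()]]
# A monomorphic stretch hits the cache; each type switch costs a miss.
```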
Python is obviously going the same route as PHP.

Substitute WordPress for Django; it's the same slow user/permissions platform built in a different slow language.

The rest of Python LARPs, Go-fashion, as a real language like JavaScript.

All these dynamic languages that lack a major platform and use case beyond syntax preference should just go away.
