Agreed. This reminds me of Computational Type Theory (and I wonder if the same idea could be applied there), which relies on a similar notion of "reducing" types to their primitive forms, actually taking computation into account (something formal type theories normally do not!)
This lecture series goes into how it works: https://www.youtube.com/watch?v=LE0SSLizYUI
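A rough sketch of the flavor I mean, written in Lean (this is not computational type theory proper, and the name T is just something I made up, but it shows a type expression that has to be computed before membership can be checked):

    -- a type computed by recursion on a value: T true reduces to Nat, T false to Bool
    def T : Bool → Type
      | true  => Nat
      | false => Bool

    -- the checker evaluates T true down to Nat before accepting the inhabitant
    example : T true := (5 : Nat)
    example : T false := Bool.true

In the computational type theory Harper describes, that evaluation step is the primary notion, rather than something layered on top of a formal typing judgement.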