A 2.0 version could even merge the two slightly: track real-world people into the virtual space (with pose and position estimates?) along with the programs (I don't know the lingo offhand, but I mean the paper sheets everything revolves around), and in the opposite direction project the programs from VR back onto the real table.
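To make that concrete, here's a rough sketch of what one frame of that bidirectional sync might look like. Everything here (PoseTracker-style calls, the VR scene, the table projector, the field names) is hypothetical and only illustrates the data flow I have in mind, not any real API from the project:

```python
from dataclasses import dataclass

# Hypothetical "2.0" data flow: real people are mirrored into the VR scene,
# and virtual program sheets are projected back onto the physical table.
# All names below are made up for illustration.

@dataclass
class Pose:
    x: float        # position around the table, metres
    y: float
    heading: float  # which way the person is facing, radians

@dataclass
class ProgramSheet:
    sheet_id: str
    x: float        # where the sheet sits on the (virtual or real) table
    y: float
    rotation: float

def sync_tick(camera_tracker, vr_scene, table_projector):
    """One frame of the hypothetical physical<->virtual sync loop."""
    # Real -> virtual: estimate poses of people in the room and
    # mirror them as avatars in the VR scene.
    for person_id, pose in camera_tracker.estimate_poses().items():
        vr_scene.update_avatar(person_id, pose)

    # Virtual -> real: take program sheets that only exist in VR and
    # project their outlines onto the physical table so in-room users
    # can at least see (if not touch) them.
    for sheet in vr_scene.virtual_sheets():
        table_projector.draw_sheet_outline(sheet)
```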
I've been interested in it for years, so I'm very glad to see it's still alive and moving forward. There were years when I couldn't find any actual new information coming out of the project.
I dunno. Someone in VR can't manipulate the same physical objects that people in the real world can. Either you compromise physicality so that VR users aren't second-class, or you build two different experiences for VR and non-VR users, which goes against the idea of a shared experience.
I think what you're describing has value, but I also think you're removing the fundamental piece of what makes this unique and special.
That would be something entirely different.
Many of the ideas can be worked on without requiring the expensive physical space: the OS and the composed tools they describe for the future of the project can all be improved and experimented with in a virtual space too, separate from the physical presence.