
Show HN: I 3D scanned the tunnels inside the Maya Pyramid Temples at Copan

https://mused.com/guided/158/temple-26-and-excavation-tunnels-copan-ruinas/
Superb!

I expect whoever coated the remains with that red cinnabar stuff died rather early, probably with tooth and hair loss and severe mental issues. Perhaps this fate was expected but given that "mad hatters" were a thing until fairly recently, people can be a bit strange when it comes to dealing with poisons.

The guide notes point out that only the most sacred rituals involved this red mercurial stuff. I'm not surprised. It might be rare, but rarer still would be people willing to deploy it unless that fate was considered a good way to go.

That tour is a remarkable use of the technology.

It's something we have to be careful of while working on site! We're really careful around the rooms that have mercury in them--there are a few I left out of the guide for that reason.

I was wondering about this too: they've found high levels of mercury in the water supply at Maya cities and now believe it contributed to the eventual collapse: https://arstechnica.com/science/2020/06/mercury-and-algal-bl...

> “The drinking and cooking water for the Tikal rulers and their elite entourage almost certainly came from the Palace and Temple Reservoirs,” wrote Lentz and his colleagues. “As a result, the leading families of Tikal likely were fed foods laced with mercury at every meal.”

This makes me think: what if today's rulers are being poisoned by something making them act like idiots?

Claiming that today’s rulers are acting like idiots seems off topic. And subjective too; even if only because yesterday’s rulers weren’t different.
This is great use of the technology. There should be scans of all our national monuments, world wonders, etc. So much better a use for the tech than just Redfin.
This reminds me of a recent Lex Fridman podcast with an expert in ancient American civilizations: https://www.youtube.com/watch?v=AzzE7GOvYz8
This is very cool!

Can you share the technical background you've used for creating the 3D reconstruction? Like software packages, or algorithms used.

Are we looking at the result of packages like OpenSfM here, or COLMAP?

On site I used the Matterport Capture app with a Matterport Pro 2 and a BLK 360. For the web version linked here, I built on top of the Matterport SDK with Three.js https://matterport.github.io/showcase-sdk/sdkbundle_home.htm...

So in the virtual tour, you're seeing 360 imagery from the cameras plus a lower-resolution version of the 3D capture data, optimized for the web. The lower-res mesh from the scanner is transparent in first-person view mode, so users get cursor effects on top of the 360 image.

For film, PBS sent out a documentary crew, and they wanted me to render some footage of the full tunnel system, so I exported the e57 pointcloud data from Matterport and rendered the clips they needed in Unreal. It should be coming out soon with "In the Americas."

I don't know which is cooler. The 3D scan itself or the 3D map in the browser.

This is amazing. Thank you for sharing.

Thanks for visiting! It was so many tunnels... I feel bad that I don't build faster sometimes, but this took a while.
I'm glad to hear you're working on getting an Unreal environment for these scans. I find the movement in the web version incredibly clunky. This really needs a game-like environment to do it justice.

In general we clearly have the technology to capture 4K-8K environments and turn them into very realistic virtual worlds. Is anybody even doing such work? For example capturing a neighborhood in San Francisco (or any city) as it looks in 2024 for historical reference? Seems like that should be a thing.

I've seen high quality environmental scans, even way back in the Silicon Graphics days when they showed an amazing scan of the Sistine Chapel. But it seems to me all such scans wind up in some proprietary player format which was designed by somebody who never played a decent open-world game like Fallout 4, Cyberpunk, Battlefield, Red Dead Redemption. I have yet to see a museum environmental scan which gets anywhere near the immersive quality of those games. This is not so much a criticism of such work - it's awesome! - but maybe more of a call to arms for game people to help out the scholars.

I was working towards that aiming at the medieval city of Rothenburg o.d.T. in Germany.

Unfortunately it's a lot of code to support rolling-shutter cameras strapped to multicopters, where you capture video with short enough exposure to prevent blur. The 3D recovery has to respect the fact that the rows of the image are taken from different positions and angles, and that fact infiltrates basically the entire pipeline.

And global shutter cameras are barely accessible.

If there's some group with the man power and funding to actually pull this off, please get in touch, I would like to pick back up!
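To make the rolling-shutter issue concrete, here's a toy model of the per-row pose (the numbers and function are illustrative only; a real pipeline would also interpolate rotation and estimate the readout time):

```python
import numpy as np

def row_pose(pose_start, pose_end, row, height):
    """Linearly interpolate the camera position for one image row.

    Rolling-shutter rows are exposed one after another, so row r sees
    the world from a slightly different position than row 0. Rotation
    is ignored here for simplicity; a full pipeline would slerp it too.
    """
    alpha = row / (height - 1)
    return (1 - alpha) * pose_start + alpha * pose_end

# Drone translates 10 cm along x during one frame's readout
p0 = np.array([0.0, 0.0, 0.0])
p1 = np.array([0.1, 0.0, 0.0])
print(row_pose(p0, p1, 0, 1080))     # top row uses the start pose
print(row_pose(p0, p1, 1079, 1080))  # bottom row uses the end pose
```

Every projection and triangulation in the reconstruction then has to use the row-dependent pose instead of one pose per frame, which is why it touches the whole pipeline.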

Were the tombs and other structures originally sealed in with no path to the outside world? Were there other rooms accessible for rituals without archaeologists having to excavate tunnels in the modern day?
This is great. I think you shared a 3D scan of some other pyramid some time ago here on HN. You should try processing this data through Gaussian splatting software. I have no idea how many images Gaussian splats require to work well, or the CPU/GPU requirements, but I have seen very cool Gaussian splatting demos on Twitter where you can freely fly around the scene and view it from any angle.
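For intuition on what a splat renderer does at its core (a toy 1D sketch, not code from any real library): it sorts the Gaussians along each camera ray and alpha-composites their colors front to back:

```python
def composite(splats):
    """Front-to-back alpha compositing of splats along one ray.

    Each splat contributes color * alpha, attenuated by the
    transmittance left over from the splats in front of it -- the
    essence of rendering a trained splat scene from any viewpoint.
    """
    color, transmittance = 0.0, 1.0
    for c, a in splats:  # assumed sorted near-to-far
        color += transmittance * a * c
        transmittance *= 1.0 - a
    return color

# A half-transparent bright splat in front of an opaque dim one:
# 0.5 * 1.0  +  (1 - 0.5) * 1.0 * 0.2  =  0.6
print(composite([(1.0, 0.5), (0.2, 1.0)]))
```

Because this is just sorting plus blending, it rasterizes fast on a GPU, which is why splat demos feel so fluid to fly around.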
Wow! That sure brings back memories. I've been there twice, 2011 & 2012. Congratulations. I'm very impressed.
The fact that people carved these tunnels underground with simple tools and their bare hands is so freaking amazing I can't find better words for it.

Edit: also very nice tool :)!

Definitely one of the better implementations I've seen using Matterport's SDK, nice work.

Did you use the Pro3 as the capture device? Before the collapse anyway!

Thanks, and I was still on Pro 2 + BLK 360 unfortunately. Haha, thankfully all the cameras survived and made it home, just muddier.
That is so cool.

Is it hard to avoid accumulated drift (integration error) in long tunnels?

It is so hard! The long tunnel sections were the worst, but thankfully most of them had multiple join points, like in the Temple 16 / Rosalila temple excavations.
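A toy illustration of why the long runs are the worst (made-up numbers): give each scan-to-scan alignment a tiny constant heading bias and watch it compound, which is exactly the error that join points / loop closures correct:

```python
import math

def endpoint_error(steps, step_len, heading_bias_deg):
    """Dead-reckon down a straight tunnel with a small per-step heading bias.

    Without loop closures, each alignment's tiny rotational error
    compounds, so the reconstructed endpoint drifts off the true one.
    """
    x = y = heading = 0.0
    for _ in range(steps):
        heading += math.radians(heading_bias_deg)
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
    # distance from the true endpoint at (steps * step_len, 0)
    return math.hypot(x - steps * step_len, y)

# 100 scans 1 m apart with a 0.1 degree bias per alignment:
# meters of drift from a rotation error too small to notice per scan
print(endpoint_error(100, 1.0, 0.1))
```

The drift grows roughly quadratically with tunnel length, so a second path back to a known point helps enormously.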
I love all these Maya inscriptions. I hope more are discovered (and hopefully some manuscripts) - the little we do have of Maya text is amazing. What are your top 3 things to tell people at parties who know little about the Maya?
I love the inscriptions too--the stelae only get more meaningful the more you learn about them. But it's kind of like Egypt: the iconography is harder to understand because we inherited our visual language from a different culture.

For me, the Maya have always been important because they're our history and our stories in the Americas (I'm from the US) -- more so than the Greco-Roman mythology I grew up with. They grew corn and loved ball games. Their stories are more directly our stories, and their struggles are our struggles.

Not to be political, but they also kind of wiped themselves out through large-scale environmental collapse, and the jungle is filled with their undiscovered monuments. There's still so much to learn.

People really geek out on how much the Maya knew about astronomy too -- they shot archaeoastronomy docs twice while I was working on site. Richard Feynman even helped decipher Maya glyphs and writes about it in "Surely You're Joking, Mr. Feynman!". He also gave a lecture on it, if the audio file can be tracked down somehow: https://collections.archives.caltech.edu/repositories/2/acce...

Very cool. Any other Maya sites in the pipeline to do?
Are there any plans to support a WebXR interface in the future?
Very well done! I was pleasantly surprised how well this works on a phone.

Did you take any scans after sections collapsed? Would love to hear more about what happened.

Matterport's SDK is so good -- I'm so impressed with the details like mobile performance the more time I spend building with it.

I did take some scans after the collapse! After we'd dug ourselves out and crawled out on our bellies, I went back with Polycam. The collapsed section we dug through was comparatively small, maybe 4-5 meters: Section 1: https://poly.cam/capture/4BB863F2-1CC3-46E3-8BDB-232EE3057BD...? (you can see where we crawled out to the intersection here -- the whole intersection had ceiling collapse, but only the section we dug out through was fully covered). Section 2: https://poly.cam/capture/3C5BB7BD-5FC9-4C00-AE1C-84E0544C51C...

We're just lucky it wasn't a rocky ceiling that fell, that would've been much worse.

The team taking care of the tunnels is doing an amazing job with the resources they have, and they're continually backfilling tunnels now and maintaining the ones that are there. It took us about an hour to dig out.

You can compare to the intersection in the matterport version in the same vicinity: https://my.matterport.com/show/?m=r5BR6K6Qxix&ss=338&sr=-.21...

I don't want to editorialize too much, but at that moment we were totally brothers--I was still early with Spanish, and the language, country, age differences fell away, and we dug ourselves out.

The transitions are much smoother than Google Street View's.
If you like smooth transitions, check out my startup's:

https://benaco.com/go/k4-green-hn-2024

These look great! What strategy did you use to do the transitions?
It works as you described: texturing the mesh "live" as you move through it. It does use Three.js as a base, but needs custom shaders to hit 60 FPS.
Matterport works special magic on the transitions between 360 images in their SDK. As far as I understand it, they render the 360 imagery via a cube camera onto the material of the environment mesh as you move, so it looks like you're moving through the real mesh while only seeing the 360 image. I tried to approximate this myself in Three.js but didn't get anywhere near the quality or performance of their work. Homage.
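A sketch of the underlying idea: texture each mesh fragment by projecting its view direction (measured from the panorama's capture point) into the 360 image. Matterport's shaders use cube maps, but the mapping is easiest to see with an equirectangular panorama (the axis convention below is an assumption, not theirs):

```python
import math

def dir_to_equirect_uv(direction):
    """Map a unit view direction to (u, v) in an equirectangular 360 image.

    Looking up the panorama texel for each mesh fragment's direction
    is how the mesh gets textured "live" with the nearest 360 photo.
    """
    x, y, z = direction
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)          # yaw -> horizontal
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi  # pitch -> vertical
    return u, v

print(dir_to_equirect_uv((0.0, 0.0, -1.0)))  # straight ahead -> (0.5, 0.5)
```

During a transition you blend between the projections from the departing and arriving capture points, so geometry parallax looks right while the photo stays sharp.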
Luke I'm so happy to see you here on HN. What you and the Mused team are doing is incredible.
Wow, thanks so much! I don't recognize your username -- who is this? But thanks again!
Does anyone know if there's a simple solution for generating NeRFs from continuous footage from an omnidirectional camera (like a GoPro Max)? It would be fun to make an explorable universe like that.
You should be able to do this with nerfstudio: https://github.com/nerfstudio-project/nerfstudio/ I've done it a few times; you can also try 3D Gaussian splatting instead.
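In case it saves someone a search, the rough nerfstudio workflow for 360 footage looks something like this (a sketch from memory -- the flags, especially the equirectangular handling and the file paths, should be checked against the current docs):

```shell
pip install nerfstudio   # training needs a CUDA-capable GPU

# Extract frames and estimate camera poses with COLMAP;
# 360 video may need the equirectangular camera type
ns-process-data video --data walkthrough.mp4 \
    --camera-type equirectangular \
    --output-dir data/walkthrough

# Train a NeRF ...
ns-train nerfacto --data data/walkthrough

# ... or 3D Gaussian splatting from the same processed data
ns-train splatfacto --data data/walkthrough
```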
Amazing. So inspiring!