Bring in the Drones: 3D Modeling Using Aerial Imagery at Archaeological Excavations

This is the fourth in a series of posts exploring 3D modeling in Mediterranean and European archaeology. For more on this project click here. We hope these papers will start a discussion either in the comments of the blog or on Twitter using the #3DMedArch hashtag.

Ryan Baker, B.A. Student in Classical Archaeology, University of Texas. Co-Founder, Arch Aerial LLC.

For every joke I endured this summer about technology from the Starship Enterprise coming to the field of archaeology, a real conversation followed about the future of the discipline and the digital representation of excavations.

3D Thursday

I’m an undergraduate student finishing my B.A. in Classical Archaeology at the University of Texas at Austin, and last fall I started a small business called Arch Aerial LLC that set out to create easy-to-use aerial photography platforms with autopilot capability, in the form of multi-rotor helicopters and small fixed-wing UAVs. Initially, we looked to work only with universities and research institutions in archaeology, but our project has evolved into something with the potential to increase efficiency in any field or industry with a spatial component.

Atlas 1 Quad Rotor

This summer, we field-tested our new prototype at over 10 excavations within the Programme for Belize Archaeological Project, the Swedish Institute in Rome’s San Giovenale Tomb Survey in Blera, and the Poggio Civitate Archaeological Project in Murlo. We conducted a number of different surveys over the course of the summer, often tailored to the environment and the research interests of the project directors. Our teams carried out large-scale aerial survey, videography, and (most importantly for this article) a great deal of 3D modeling in Agisoft PhotoScan, using images taken from the air by our multi-rotor aircraft.

Using these small multi-rotors we produced 3D models of the trenches at closing and, in some cases, at the beginning of each day of excavation to show the progress of the trenches as the season went on. For example, at Poggio Civitate, using my MacBook and a minimal number of photos, I built a model of the trench each morning in near real time (within 20 minutes of capturing the photos) and gave the site director an up-to-date view of the state of excavation. This is a task that could be completed by any member of an excavation: at the beginning of each work day, or even as each locus of a trench closes, a staff member could image the site in a matter of minutes and create 3D representations of the excavation for internal research or later publication. Nor is the method limited to small features: in addition to areas of open excavation, we captured the imagery to build a 3D model of the entire hill at Poggio Civitate, and rendered models of ball courts and temples from multiple sites within the Programme for Belize Archaeological Project.
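For readers curious what that morning routine can look like in practice, the sketch below shows roughly how a trench model might be built unattended from PhotoScan’s built-in Python scripting console. This is a minimal, hypothetical example rather than the exact script we used in the field: the folder paths and quality settings are placeholders, and method and enum names vary between PhotoScan versions.

```python
# A minimal sketch, assuming PhotoScan Professional's Python console
# ("Tools > Run Script"); all paths and settings here are placeholders.
import glob
import PhotoScan

photos = glob.glob("/path/to/morning_photos/*.JPG")  # hypothetical folder

doc = PhotoScan.app.document
chunk = doc.addChunk()
chunk.addPhotos(photos)

# Align the cameras (sparse cloud), then build geometry and texture.
chunk.matchPhotos(accuracy=PhotoScan.HighAccuracy,
                  preselection=PhotoScan.GenericPreselection)
chunk.alignCameras()
chunk.buildDenseCloud(quality=PhotoScan.MediumQuality)
chunk.buildModel(surface=PhotoScan.Arbitrary,
                 source=PhotoScan.DenseCloudData,
                 face_count=PhotoScan.MediumFaceCount)
chunk.buildTexture(mapping=PhotoScan.GenericMapping,
                   blending=PhotoScan.MosaicBlending)

# Export an .OBJ for viewing or printing, then save the project.
chunk.exportModel("/path/to/trench_morning.obj")
doc.save("/path/to/trench_morning.psz")
```

Keeping the photo count low and dropping to medium quality is the kind of trade-off that keeps the turnaround inside a 20-minute window on a laptop.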

3D model of the unexcavated ball court

Adverse conditions were something we deliberately sought out this summer. Within the Rio Bravo Conservation and Management Area, we operated under the rainforest canopy for nearly the entire season while surveying excavations for the Programme for Belize Archaeological Project. As at Poggio Civitate, we were able to take photos and build our model on site in a very short amount of time. Below is a 3D model of Dr. David Hyde’s excavation at Medicinal Trail, which we built in Agisoft using imagery taken by one of our quad-copters.

Medicinal Trail 3D Model

Many of the parts for our new multi-rotor are prototyped and made in-house on our 3D printers. This is certainly useful for manufacturing parts, but it also gives us the opportunity to take the next step beyond rendering 3D models in Agisoft. Using the 3D .OBJ files we’ve built of artifacts, trenches, and the entire hill at Poggio Civitate, we can print small-scale versions of the models in ABS plastic. This has huge implications for giving researchers hands-on access to artifacts that might not be allowed out of a host country. If researchers can 3D model artifacts that can’t leave the excavation archives, they can then print to-scale models of those artifacts using inexpensive materials and an easily repeatable process. We’ve decided to do just that, and upon returning to Texas our team will be printing models of artifacts and trenches for the excavations mentioned above.
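To illustrate how simple the printing side can be, here is a small, hypothetical Python helper (not part of our toolchain, and the file names are made up) that uniformly rescales an exported .OBJ so its longest dimension matches a chosen print size before the file goes to the slicer:

```python
# Hypothetical helper: rescale an .OBJ so its longest dimension is
# target_mm, ready for slicing and printing in ABS.
def scale_obj(in_path, out_path, target_mm=150.0):
    lines = open(in_path).read().splitlines()
    verts = [list(map(float, l.split()[1:4])) for l in lines if l.startswith("v ")]
    mins = [min(v[i] for v in verts) for i in range(3)]
    maxs = [max(v[i] for v in verts) for i in range(3)]
    factor = target_mm / max(maxs[i] - mins[i] for i in range(3))

    with open(out_path, "w") as out:
        for l in lines:
            if l.startswith("v "):  # only vertex positions need scaling
                xyz = [float(c) * factor for c in l.split()[1:4]]
                out.write("v {:.6f} {:.6f} {:.6f}\n".format(*xyz))
            else:  # faces, normals, texture coordinates pass through unchanged
                out.write(l + "\n")

scale_obj("trench_morning.obj", "trench_morning_print.obj", target_mm=150.0)
```

Because only the vertex positions change, the mesh topology and texture mapping are left exactly as PhotoScan exported them.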

Over the course of the summer I heard plenty of hokey sci-fi references, but I think it is safe to assume that 3D modeling and printing will have their place in the field of archaeology. As the cost of adopting these methods decreases (as it has with Agisoft PhotoScan), it only makes sense to add another dimension to the description of archaeological excavation and its dissemination to the public through digital publication.

6 Comments

  1. Reblogged this on AIA Geospatial Interest Group and commented:
    3D Thursday!

  2. Russell Alleen-Willems October 4, 2013 at 6:30 pm

    Very interesting read, Ryan! I see your company is based in Austin – do you plan to have a booth or any presentation at the SAA meeting there in April?

    1. Thanks Russell! We’ll definitely be at the SAA meeting in April. Make sure you stop by! We’ll have a booth at the AIA Annual Meeting in January as well.

      1. Ryan, awesome! I will be sure to stop by for a look.

  3. Well done – fantastic ingenuity and a great example of what can be done by thinking outside the box when it comes to 3D capture and taking it to the next level in archaeology.

    May I ask what camera you are using, and how it is triggered? I’m guessing a remote of some kind or an intervalometer.

    It would be fun to do some point cloud comparisons each day to see what has changed on site. If you were bored enough you could even measure the volume (and make a guess at the weight) of soil moved by each team 🙂

    1. Thanks Tom!

      We use a number of different cameras, all triggered using apps made by their manufacturers that utilize the camera’s own wifi network. Although range is limited (as compared to CHDK radio transmitter triggering), it’s a great way to trigger the camera and view a live feed on the ground.

      Interesting idea on point cloud comparisons!
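      We haven’t tried that ourselves yet, but a very rough way to do it would be to export each day’s dense cloud as plain “x y z” text in the same metric coordinate system, grid both clouds, and sum the elevation drop per cell. Something like the sketch below — purely hypothetical file names and cell size:

      ```python
      # Rough sketch: estimate soil volume removed between two surveys by
      # gridding each day's ASCII "x y z" export into cells of `cell`
      # metres and summing the per-cell elevation drop.
      import math
      from collections import defaultdict

      def grid_z(path, cell=0.05):
          """Mean elevation per (x, y) cell."""
          sums, counts = defaultdict(float), defaultdict(int)
          for line in open(path):
              try:
                  x, y, z = map(float, line.split()[:3])
              except ValueError:
                  continue  # skip headers or malformed lines
              key = (math.floor(x / cell), math.floor(y / cell))
              sums[key] += z
              counts[key] += 1
          return {k: sums[k] / counts[k] for k in sums}

      def soil_volume_removed(before_path, after_path, cell=0.05):
          """Approximate volume (cubic metres) lowered between surveys."""
          before, after = grid_z(before_path, cell), grid_z(after_path, cell)
          removed = 0.0
          for key in before.keys() & after.keys():  # cells seen in both clouds
              drop = before[key] - after[key]
              if drop > 0:
                  removed += drop * cell * cell
          return removed

      print(soil_volume_removed("trench_day1.xyz", "trench_day2.xyz"))
      ```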
