Times Insider explains who we are and what we do, and delivers behind-the-scenes insights into how our journalism comes together.
The Apollo 11 mission returned to Earth in July 1969 with boxes of moon rocks and rolls of iconic photographs: a boot print on the moon, a wrinkled American flag, a portrait of Buzz Aldrin with Neil Armstrong reflected in his golden visor.
These photographs are intimately familiar to many of our readers. But could we find a new way to show and explain these 50-year-old images?
We spent weeks looking through every Hasselblad photograph taken during the mission and synchronizing them with NASA transcripts to figure out what the astronauts were talking about when each image was taken.
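The synchronization step can be sketched in a few lines: for each photo, find the transcript entry spoken most recently before the shutter clicked. This is a toy illustration, not The Times's actual tooling, and the timestamps and quotes below are hypothetical stand-ins for NASA's ground-elapsed-time records.

```python
from bisect import bisect_right

# Hypothetical mission-elapsed timestamps (in seconds) for transcript
# lines and photo exposures; real NASA data uses ground elapsed time.
transcript = [
    (390180, "Armstrong: That's one small step for (a) man ..."),
    (390630, "Aldrin: Magnificent desolation."),
    (393900, "Aldrin: I'll get the flag unstowed."),
]
photos = [("AS11-40-5850", 390700), ("AS11-40-5875", 393950)]

def nearest_transcript_line(photo_time, transcript):
    """Return the transcript entry spoken most recently before the photo."""
    times = [t for t, _ in transcript]
    i = bisect_right(times, photo_time) - 1
    return transcript[max(i, 0)]

for frame, t in photos:
    print(frame, "->", nearest_transcript_line(t, transcript)[1])
```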
The astronauts frequently shot sequences of photos with intentional or accidental overlaps. We found that by aligning and stacking the images in the order they were taken, we could often convey a sense of place or movement: a sunrise in Earth orbit, Buzz Aldrin at work inside the lunar module Eagle, Michael Collins watching the Eagle maneuvering to rejoin him in orbit.
The mission transcripts are packed with acronyms and technical jargon, but the astronauts’ personalities come through: Collins and Aldrin are both quick with a joke, while Armstrong is often more reserved.
We decided to merge the images and transcripts into a condensed, evocative story of the Apollo 11 mission using only the astronauts’ own words and photographs — without added descriptions, superlatives or other narration.
We also drew inspiration from a map, originally created by NASA in 1970, that pinpoints the location and direction of every photo taken during the moonwalk.
Using the map as a foundation, Karthik Patanjali, an editor on the Immersive Storytelling team, wrote a custom program to determine how the moonwalk photographs were oriented in space. For each photograph he calculated the height of the camera, its direction and tilt, and the field of view of the lens.
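A rough sketch of the geometry such a program needs: the field of view follows from the lens focal length and film size, and a pointing vector follows from a compass azimuth and a tilt above the horizon. The numbers here are assumptions (the surface Hasselblads carried a 60 mm Zeiss Biogon lens exposing a roughly 53 mm square frame on 70 mm film), and this is an illustration, not Patanjali's code.

```python
import math

def field_of_view(focal_mm, film_mm):
    """Angular field of view for a given focal length and film dimension."""
    return 2 * math.degrees(math.atan(film_mm / (2 * focal_mm)))

def view_direction(azimuth_deg, tilt_deg):
    """Unit vector for a camera pointing at a compass azimuth and a
    tilt above the horizon (x = east, y = north, z = up)."""
    az, el = math.radians(azimuth_deg), math.radians(tilt_deg)
    return (math.cos(el) * math.sin(az),
            math.cos(el) * math.cos(az),
            math.sin(el))

# Assumed lunar-surface Hasselblad: 60 mm lens, ~53 mm square frame.
fov = field_of_view(60, 53)  # roughly 47.7 degrees
```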
Then we built a virtual gallery of the lunar environment by projecting the photos in space and aligning them with 3-D models of the terrain, the flag and the lunar module.
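Projecting a photograph into a scene like this boils down to ray casting: each pixel defines a ray from the camera position, and the ray's intersection with the surface says where that pixel lands. A minimal flat-ground version, purely illustrative (the actual work aligns against 3-D terrain models, not a plane):

```python
def project_to_ground(cam_pos, ray_dir, ground_z=0.0):
    """Intersect a camera ray with a flat ground plane z = ground_z.

    Returns the 3-D point where the ray meets the ground, or None if
    the ray points level or upward and never reaches it.
    """
    x0, y0, z0 = cam_pos
    dx, dy, dz = ray_dir
    if dz >= 0:
        return None  # ray never descends to the ground plane
    t = (ground_z - z0) / dz
    return (x0 + t * dx, y0 + t * dy, ground_z)

# Camera held at chest height (~1.3 m), looking down 30 degrees:
hit = project_to_ground((0, 0, 1.3), (0, 0.866, -0.5))
```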
Essentially, we are able to place viewers in space at the exact positions where Neil Armstrong and Buzz Aldrin were standing when they took their historic photographs. Readers can experience this through a three-part interactive article enhanced with 3-D and augmented reality features.*
In addition, The Times collaborated with The Mill to develop an immersive, interactive virtual-reality experience. Viewers can walk in the astronauts’ footsteps and manipulate a virtual Hasselblad camera to capture many of the iconic moonwalk images.
The VR experience will be installed at The Town Hall in New York City on Sunday for “One Giant Leap: The Apollo 11 Moon Landing, 50 Years On,” a live Times event commemorating the anniversary. It will also be available for download on the Oculus Store.
*Note: Augmented reality features are available in the NYTimes app on supported devices.
Graham Roberts is the director of immersive platforms storytelling and leads a team that explores virtual- and augmented-reality projects, as well as innovation in video and motion graphics. @Grahaphics
Jonathan Corum is the Science graphics editor. He produces the “Out There” video series, about space exploration, and has developed virtual-reality films about Pluto and Antarctica. He joined The Times in 2005. @13pt