Mount Rushmore to add laser scans, digital mapping in preservation efforts

Source: http://www.rapidcityjournal.com/articles/2009/05/23/news/local/doc4a1702bb8841b911374516.txt

Officials at Mount Rushmore National Memorial have added new tools in the constant effort to preserve the national icon.

While crews traditionally use silicone sealant, fracture-mapping techniques and other technology to preserve the faces, they have added laser scanning and digital mapping to the tools used to keep the 68-year-old memorial in good condition.

Duane Bubac, director of cultural resources and facilities at Mount Rushmore National Memorial, said the mapping project will give the National Park Service detailed information about the features of Mount Rushmore. And the data could later be used to create virtual, up-close tours of Mount Rushmore.

In the past, maintenance was mostly a once-a-year event. Each fall, rope-access crews descended from the top of the memorial to inspect the granite and caulk the surface cracks. The fall maintenance work still takes place every September, Bubac said, but it's now part of a much larger, year-round effort to monitor and preserve Mount Rushmore National Memorial.

“The process involves a lot more than sealing cracks now,” Bubac said.

Under the National Park Service's long-range preservation plan, crews are fracture-mapping and crack-mapping the granite surface. They are also removing vegetation, cleaning and sealing cracks, removing hazard rocks and testing the next generation of sealant materials.

Mount Rushmore's biggest threat is water. It seeps into cracks, freezes and expands. That freeze-thaw cycle could eventually degrade the surface of the granite. Dirt and vegetation could also be a threat. If dirt accumulates in a small hollow, it gives opportunistic weeds, bushes or trees the foothold they need to take root.

NASA Uses Laser Scan Data and Photosynth to Evaluate Shuttle Damage

By Adam Sheppard
Source: http://blogs.nasa.gov/cm/blog/chrisckemp/posts/post_1242080940877.html

In October 2006 I was sitting in my office at Microsoft trawling through a backlog of email and voice messages. It had been a busy couple of months leading up to the Technology Preview of Photosynth at San Francisco's Web 2.0. We'd brought the house down with our on-stage demonstration of Photosynth's ability to take a large number of digital images and automatically assemble them into a high resolution, 3D environment that anyone could explore at home from within their web browser.

Around the same time, Chris Kemp (currently CIO at NASA Ames) had just joined the Agency and was leading its business development efforts to seek out new and emerging technologies coming out of the private sector that could help NASA's mission. He had seen the Photosynth demo and was eager to learn more about what he'd seen. He was full of ideas and saw the potential for Photosynth to bring the public closer to the space program than they had ever been before. They could follow in the tracks of the Mars Rovers, see every nut and bolt of a shuttle on the launch pad and experience firsthand life onboard the International Space Station (ISS). At Microsoft Live Labs we were bringing some of the top computer scientists in the world together with talented engineers to explore new ground on the web. We couldn't have asked for a more kindred spirit than the men and women of NASA who, through their own genius and engineering skills, were exploring our universe for the benefit of mankind.

Throughout 2007 we began planning and experimenting with images provided by NASA. Together with Microsoft researcher Drew Steedly (one of our lead scientists on Photosynth), we visited Kennedy Space Center as preparations were underway for Space Shuttle Endeavour's STS-118 mission to the ISS. Chris had arranged unprecedented access for us to photograph and document the Shuttle as tiles were being replaced, in the Vehicle Assembly Building, and on the launch pad itself. We were really eager to get some aerial shots of the shuttle awaiting launch, and Drew was lucky enough to sit with a SWAT team in one of those 'Men In Black' NASA helicopters as they flew a security fly-by around the shuttle on the pad, producing some amazing 360-degree shots we would later include in the synth. (video)

For the launch itself, we sat with the families of the astronauts, and one couldn't help but remember the footage of the Apollo missions with people watching from the same bleachers as their loved ones journeyed to the Moon. A few days later, back in Seattle, we were proudly sharing our unique stories with our colleagues when a call came in that had us jump into action. After the Columbia tragedy, many new safety procedures had been instituted to help ensure the safety of the crew. One such measure was the detailed inspection of the underside of the shuttle from the ISS, using both photography and a laser scan to assess any potential damage. Some damaged tiles had been found on Endeavour while in orbit, and NASA was scrambling to make a decision on whether to attempt a repair. During our previous discussions with NASA engineers, the idea of using Photosynth for safety procedures had come up, and this was the perfect test case for us. Each tile is unique, carefully cut and individually serial-numbered before being adhered by hand to the underside of the vehicle. Typically, a specialist would have to trawl through thousands of close-up images, cross-referencing against plans to visualize where on the shuttle a particular tile was. To help the specialists, we took all the images downlinked from the ISS while the shuttle was still in orbit and used Photosynth to reconstruct the bottom of the shuttle, automatically placing the images together. We were all amazed at the results as we zoomed in on the damaged tile. And while the decision had already been made to proceed without a repair, the value of this new technology had certainly been demonstrated.

Shortly afterwards I found myself at NASA headquarters in Washington, DC, demonstrating the synths to the Associate Administrator for Space Operations, William H. Gerstenmaier. He immediately saw the potential to share life onboard the ISS using Photosynth, and we soon found ourselves on our way to NASA's Johnson Space Center in Houston to help train the astronauts on creating a full-size synth of the International Space Station. (video)

As we worked with the astronaut trainers, we soon realized that there were some unique challenges to consider in zero gravity. For example, Photosynth makes some assumptions on which way is ‘up’, but when you’re floating in space there is no ‘up’. With barely enough room to stand up, and every available space full of equipment, we also had to consider what technique would best capture both the panoramic views and detail shots that would lead to a successful synth. Towards the end of the day we got to meet some of the astronauts being trained on emergency procedures on a life-sized mockup of the Shuttle and finally had the chance to sit in a real Russian Soyuz capsule also used for training.

Needless to say, it was a lifelong dream of mine to be crawling through a space station and meeting the men and women who work tirelessly behind the scenes at NASA.

I moved on to other projects in mid-2008, just as the flight plan for the new ISS Photosynths was coming together. All I can say is that the resulting synths have turned out better than I could have ever imagined. I'm delighted to think that somewhere there's a child exploring them right now who will one day walk on the surface of the Moon or even Mars.

3D tours added to the Google Earth Gallery

[Cross-posted from the Official Google SketchUp Blog]

If you're a 3D enthusiast, then you'll probably enjoy the latest addition to the Google Earth Gallery: 3D Buildings. This new category provides a number of self-running tours on various themes. The tours showcase some fascinating 3D buildings (along with bridges, statues and other structures) around the world, most of which were built by our passionate Google SketchUp users who model buildings for Google Earth. Whether your interest is castles, bridges, museums, baseball stadiums or skyscrapers, I think you'll find a self-guided tour that interests you.


The tours were developed by geo-modelers Adam and Jordan, both of whom are 3D experts who really know their way around Google Earth.


To play a tour, simply click on the “Open in Google Earth” link to download the KML file. Then click the “Start tour here” link in the “Places” panel in Google Earth (download the latest version of Google Earth). Make sure the “3D Buildings” layer is checked in the “Layers” panel. The tour will pause at each location to ensure the 3D building is fully loaded. Click the play button to continue the tour. Enjoy!
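For readers curious about what one of those KML tour files looks like under the hood, here is a minimal sketch that writes a single-stop tour from Python. The coordinates, names and output path are placeholders, and this illustrates the general KML tour format rather than reproducing any of the gallery tours themselves.

```python
# A minimal, single-stop KML tour; coordinates, names and the output path are
# placeholders, not values from the Google Earth Gallery tours described above.
KML_TOUR = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"
     xmlns:gx="http://www.google.com/kml/ext/2.2">
  <gx:Tour>
    <name>Example building tour</name>
    <gx:Playlist>
      <gx:FlyTo>
        <gx:duration>5.0</gx:duration>
        <LookAt>
          <longitude>-122.0840</longitude>
          <latitude>37.4220</latitude>
          <range>500</range>
          <tilt>60</tilt>
          <heading>0</heading>
        </LookAt>
      </gx:FlyTo>
      <gx:Wait>
        <gx:duration>3.0</gx:duration>
      </gx:Wait>
    </gx:Playlist>
  </gx:Tour>
</kml>
"""

with open("example_tour.kml", "w", encoding="utf-8") as out:
    out.write(KML_TOUR)
```

Opening the resulting file in Google Earth should fly the camera to the placeholder location and then wait briefly, much as the gallery tours pause so the 3D buildings can load.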

Crafting Quality Laser Scan Animations using Cyclone

5 Tips from the CyArk Team
By: Hannah Bowers
May 26th, 2009

It is often difficult to capture the beauty of a site with drawings and photographs alone, which is why animations are often helpful in giving viewers a sense of the space and of how objects relate to one another. Yet animations can be tricky to master, particularly in Leica's Cyclone software. This list is intended to help you add greater eloquence and expression to your Cyclone animations while speeding up the preparation time.

1) Simulate the Animation
Use the view mode to fly through the modelspace and get an idea of which areas would be best to animate. Manipulate the modelspace as though you were watching an animation. This will help you notice areas that you may not want to show, or areas that need emphasizing.

2) Pick a Point
Once you have figured out the focus of your animation, use Cyclone's Seek Mode tool to pick a point that will serve as the center.

Save a “view” (located in the drop-down menu “View”, under “Save View”) with this point as the center.

By doing this, you will always be able to reference this point, even when you pan through the modelspace. With simple, rotating animations, it is best to always check the box “Keep Current Focal Point” in the Animation Editor window.

This will ensure that your animation has a smooth gliding effect and will also keep the focus of the animation on the point that you selected. Note, however, that this may not be the desired result if your video traverses a site, as it can cause unintentional camera-angle changes.

3) Frames
Make the number of frames between key points a multiple of 15. Exporting the animation as a 15 fps movie will make it smooth and consistent. Keep in mind that the spacing of key points affects the number of frames needed between them; e.g., if you want two seconds between two key points, then 30 frames will be needed.
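As a quick arithmetic aid (plain Python, not part of Cyclone), the sketch below turns a desired duration between two key points into a frame count at 15 fps, rounded up to a multiple of 15 as the tip suggests.

```python
import math

FPS = 15  # the tip assumes the animation will be exported as a 15 fps movie

def frames_between_key_points(seconds: float, fps: int = FPS) -> int:
    """Frame count for the desired time between two key points, rounded up to a
    whole number of seconds so the count stays a multiple of the frame rate."""
    return math.ceil(seconds) * fps

# Example from the text: two seconds between key points needs 30 frames.
print(frames_between_key_points(2.0))  # 30
print(frames_between_key_points(3.4))  # 60 (rounded up to four whole seconds)
```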

4) Number of Cameras
The number of cameras used in your animations will depend on the size of the modelspace, but a good range is 4-15 cameras. If you use more or fewer than that, the animation may be overly complicated or too sparse; in either case it may become bumpy or uncontrolled. Here we see a Moai from Rapa Nui (Easter Island) with four cameras located around the monument to create a simple fly-around animation.
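The geometry behind a simple fly-around like that is easy to sketch outside Cyclone. The snippet below is a hypothetical illustration rather than anything tied to Cyclone's camera tools: it computes evenly spaced camera positions on a circle around a monument, with the radius, height and camera count as placeholder values.

```python
import math

def fly_around_cameras(center_x: float, center_y: float, radius: float,
                       height: float, n_cameras: int = 4):
    """Return (x, y, z) camera positions spaced evenly on a circle around the
    point (center_x, center_y), all at the same height, for a fly-around."""
    positions = []
    for i in range(n_cameras):
        angle = 2.0 * math.pi * i / n_cameras
        x = center_x + radius * math.cos(angle)
        y = center_y + radius * math.sin(angle)
        positions.append((x, y, height))
    return positions

# Placeholder values: four cameras on a 10 m circle, 3 m above the ground plane.
for camera in fly_around_cameras(0.0, 0.0, radius=10.0, height=3.0, n_cameras=4):
    print(camera)
```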

5) Animation Sequence Length
If you desire long and/or complex animations, it is best to create several short animations using these tips and then merge the individual clips in a separate video editing program. A second program better suited to video production offers additional editing options such as fades, transitions and title sequences, and the short videos are also easier and less time-consuming to generate within Cyclone.
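As one possible way to merge the short clips outside Cyclone, the sketch below drives ffmpeg's concat demuxer from Python. It assumes ffmpeg is installed and that the clips share the same codec and resolution; the file names are placeholders, and any dedicated video editor would work just as well for fades, transitions and title sequences.

```python
import subprocess

# Placeholder file names for short clips exported from Cyclone.
clips = ["flyaround_part1.avi", "flyaround_part2.avi", "flyaround_part3.avi"]

# Write the file list expected by ffmpeg's concat demuxer.
with open("clips.txt", "w", encoding="utf-8") as f:
    for clip in clips:
        f.write(f"file '{clip}'\n")

# Join the clips end to end without re-encoding (codecs and resolutions must match).
subprocess.run(
    ["ffmpeg", "-f", "concat", "-safe", "0", "-i", "clips.txt",
     "-c", "copy", "combined_animation.avi"],
    check=True,
)
```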

As social media becomes a hit with agencies, GSA plans more offerings

Source: http://www.nextgov.com/nextgov/ng_20090522_7519.php

Agencies have launched social networking applications at such a fast pace that government officials said on Friday they plan to add more, such as Apple's popular media store iTunes and the career-networking site LinkedIn.

The government also is negotiating agreements with social publishing site Scribd and commenting platform Intense Debate, said Martha Dorris, acting associate administrator for the General Services Administration’s Office of Citizen Services and Communications. The federal government has signed modified terms of service agreements to allow agencies to use social media sites Flickr, Facebook, YouTube, Vimeo, Slideshare and AddThis, among others.

The response to the agreements has been excellent, Dorris said. Agencies have set up Facebook profiles, Twitter accounts and YouTube channels to share information with the public. “We’re looking at taking information wherever citizens go to get information,” she said, adding GSA will consider adding more social networking sites, depending on requests from agencies.

Sheila Campbell, team leader of best practices for USA.gov and co-chairwoman of the Federal Web Managers Council, said tools such as Twitter offer the government the ability to communicate with the public very quickly in times of emergency. She cited as examples the Centers for Disease Control and Prevention's use of social media during the peanut recall and the swine flu outbreak.

Despite the common perception that social media tools are used mostly by younger users, baby boomers are the fastest-growing segment of users for most applications, according to Dorris. She said agencies were inspired by President Obama’s use of new media tools during his campaign and by WhiteHouse.gov, and have applied those lessons to outreach programs.

“Government needs to provide services and information the way the public wants it,” Dorris said. “One way is to engage the citizen, get an understanding of what they want. Give them a chance to be part of the decision-making in a way they never have before.”

When asked how agencies can move beyond simple broadcast of information to using social media to collect feedback, Dorris said Obama’s recent request for public input to support his open government initiative is an example. She said requests for public comments on health care and the recovery act were other examples.

“Let the public rate comments,” Dorris said. “Then take the issues, rated and ranked, and look at the top 10. I think there’s a value to that externally and internally to solicit comments and feedback within your organization.”

The Federal Web Managers Council is working with agencies to craft policies for using social media tools, but the rules aren’t likely to be much different from existing guidelines on releasing information publicly, according to Campbell. “I think the same rules and ethics apply,” she said. “If you’re on Twitter, you shouldn’t be posting confidential information, just as you wouldn’t if speaking at a conference.”

GSA general counsel Seth Greenfield said any new social media policies would only complement existing ethical standards and regular use policies being applied to the use of IT.

NIST’s LIDAR May Offer Peerless Precision in Remote Measurements

Source: http://www.nist.gov/public_affairs/techbeat/tbx20090526_lidar.htm

By combining the best of two different distance measurement approaches with a super-accurate technology called an optical frequency comb, researchers at the National Institute of Standards and Technology (NIST) have built a laser ranging system that can pinpoint multiple objects with nanometer precision over distances up to 100 kilometers. The novel LIDAR (“light detection and ranging”) system could have applications from precision manufacturing lines on Earth to maintaining networks of satellites in perfect formation, creating a giant space-based platform to search for new planets.

LIDAR transmits light through the air and analyzes the weak reflected signal to measure the distance, or range, to the target. NIST's new LIDAR, described in Nature Photonics,* has a unique combination of capabilities, including precision, rapid updates from multiple reference points at the same time, and minimal "measurement ambiguity." The system can update measurements to multiple targets simultaneously every 200 microseconds. Measurement ambiguity arises because, when the target is far from the instrument, the system cannot distinguish between two distances that differ by a multiple of its "ambiguity range." The new NIST LIDAR has a comfortably large ambiguity range of at least 1.5 meters, large enough for the coarse distance to be checked with widely available technologies such as GPS.
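To make those figures concrete, the back-of-the-envelope sketch below applies the standard pulse-train relationships for this kind of ranging: the ambiguity range is set by the comb's pulse repetition rate, and the update time by the difference between the two combs' repetition rates. The repetition rates used here are illustrative values back-calculated from the numbers quoted above, not figures reported by NIST.

```python
C = 299_792_458.0  # speed of light in m/s

# Illustrative repetition rates, back-calculated from the article's figures
# (ambiguity range of ~1.5 m, updates every 200 microseconds); they are not
# values reported by NIST.
f_rep = 100e6        # pulse repetition rate of the ranging comb (Hz)
delta_f_rep = 5e3    # repetition-rate difference between the two combs (Hz)

# Pulses repeat every 1/f_rep seconds, so round-trip delays that differ by that
# interval look identical; the corresponding one-way distance is the ambiguity range.
ambiguity_range_m = C / (2.0 * f_rep)
print(f"ambiguity range: about {ambiguity_range_m:.2f} m")

# One full sweep of the second comb across the first takes 1/delta_f_rep seconds,
# which sets how often the distance to each target can be updated.
update_time_us = 1e6 / delta_f_rep
print(f"update time: about {update_time_us:.0f} microseconds")
```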

No other ranging system offers this combination of features, according to the new paper. NIST’s LIDAR could enable multiple satellites to maintain tight spacing and pointing while flying in precision formations, acting as a single research instrument in space, the paper states. Formation flying has been proposed as a means to enhance searches for extraterrestrial planets, enable imaging of black holes with multiple X-ray telescopes on different satellites, and support tests of general relativity through measurements of satellite spacing in a gravitational field. The new LIDAR could enable continuous comparisons and feedback of distances to multiple reference points on multiple satellites. There also may be applications in automated manufacturing, where many parts need to fit together with tight tolerances, according to Nate Newbury, the principal investigator.

NIST's LIDAR design derives its power from combining the best of two different approaches to absolute distance measurement: the time-of-flight method, which offers a large ambiguity range, and interferometry, which is ultraprecise. The LIDAR relies on a pair of optical frequency combs, tools for precisely measuring different colors (or frequencies) of light. The frequency combs used in the LIDAR are based on ultrafast-pulsed fiber lasers, which are potentially smaller and more portable than typical combs that generate laser light from crystals. The two combs operate at slightly different numbers of pulses per second. Pulses from one comb are reflected from a moving target and from a stationary reference plane. The second comb serves as a precise timer to measure the delay between the reflections returning from the target and from the reference plane. A computer then converts that time delay, using the speed of light, into the distance between the target and the reference plane.
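As a minimal sketch of that last step, the snippet below converts a measured delay into a distance, assuming the usual dual-comb behavior in which the slowly slipping second comb stretches the true optical delay by roughly the ratio of the repetition rate to the repetition-rate difference. All of the numbers are made up for illustration and are not taken from the NIST experiment.

```python
C = 299_792_458.0  # speed of light in m/s

# Illustrative, assumed values (not numbers from the NIST paper).
f_rep = 100e6        # repetition rate of the ranging comb (Hz)
delta_f_rep = 5e3    # repetition-rate difference between the two combs (Hz)

# In a dual-comb measurement the second comb slowly walks through the first,
# so a true optical delay appears stretched in the recorded data by roughly
# f_rep / delta_f_rep (stated here as an assumption, not from the article).
time_scaling = f_rep / delta_f_rep

recorded_delay_s = 133.4e-6                    # made-up delay seen in the recorded data
true_delay_s = recorded_delay_s / time_scaling

# The pulse travels out to the target and back, hence the factor of two.
distance_m = C * true_delay_s / 2.0
print(f"target is roughly {distance_m:.3f} m from the reference plane")
```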

* I. Coddington, W.C. Swann, L. Nenadovic and N.R. Newbury. Rapid, precise absolute distance measurements at long range. Nature Photonics. Published online May 24, 2009.

Media Contact: Laura Ost, laura.ost@nist.gov, (303) 497-4880