Autodesk Leica BLK360 Affordable 3D Laser Scanner

Leica Geosystems and Autodesk Announce BLK360 $16k 3D Laser Scanner

Leica Geosystems Announces Complete Imaging Solution: Leica BLK360 Imaging Laser Scanner and Autodesk ReCap 360 Pro App

Las Vegas, November 16, 2016 – Leica Geosystems announced the BLK360, a revolutionary miniaturized black 3D imaging laser scanner. The product was revealed at Autodesk University 2016 and will be bundled with Autodesk’s ReCap 360 Pro and the new ReCap 360 Pro app for iPad. Both companies will demonstrate the product for the duration of the conference at the ReCap booth #2033 and the Leica Geosystems booth #1537.

The BLK360 captures the world around you with full-color panoramic images overlaid on a high accuracy point cloud. The one-button Leica BLK360 is not only the smallest and lightest of its kind, but also offers a simple user experience. Anyone who can operate an iPad can now capture the world around them with high resolution 3D panoramic images.

The Leica BLK360 defines a new category: the imaging laser scanner. It is so small and light that it fits in a typical messenger bag and can be carried almost anywhere. It features a 60 meter measurement range for full dome scans. A complete full-dome laser scan, 3D panoramic image capture and transfer to the iPad Pro takes only 3 minutes.

ReCap 360 and Leica BLK360 3D Imaging Solution

Using the ReCap 360 Pro mobile app, the BLK360 streams image and point cloud data to the iPad. The app filters and registers scan data in real time. After capture, ReCap 360 Pro enables point cloud data transfer to a number of CAD, BIM, VR and AR applications. The integration of the BLK360 and Autodesk software will dramatically streamline the reality capture process, thereby opening this technology to non-surveying individuals.
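Real-time registration of the kind described here generally reduces to estimating a rigid transform that aligns overlapping point sets. As a hedged illustration (this is the classic Kabsch/SVD step for matched points, not Autodesk's actual registration pipeline), the core computation looks like:

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t mapping src onto dst
    (matched Nx3 point sets), via the Kabsch/SVD method."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# toy check: recover a known rotation and translation
rng = np.random.default_rng(0)
pts = rng.random((100, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
moved = pts @ R_true.T + np.array([1.0, 2.0, 3.0])
R, t = rigid_align(pts, moved)
assert np.allclose(R, R_true) and np.allclose(t, [1, 2, 3])
```

In practice a scanner app would first find the point correspondences (the hard part) and then apply a step like this iteratively, as in ICP.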

“When Autodesk first introduced ReCap, it was for one purpose:  the democratization of reality capture,” said Aaron Morris, who oversees reality solutions at Autodesk. “We saw the tremendous power of this technology for the AEC industry, but realized that the cost and portability of scanners combined with difficult-to-use data was limiting the adoption of reality capture.  Autodesk’s collaboration with Leica Geosystems helps solve these issues by giving just about anyone access to the amazing advantages of reality data.”

“As the leader in the spatial measurement arena, we recognized the gap between Leica Geosystems’ scientific-grade 3D laser scanners and emerging camera and handheld technologies, and set out to bring reality capture to everyone,” said Dr. Burkhard Boeckem, CTO of Hexagon Geosystems. “By combining and miniaturizing technologies available within Hexagon, the BLK360 defines a new category: the Imaging Laser Scanner. It is significantly smaller and lighter (1 kg) than any comparable device on the market. As we developed the ultimate sensor, we worked with Autodesk to create new software and ultimately achieved the next milestone in 3D reality capture. Together with Autodesk’s ReCap 360 Pro, the Leica BLK360 empowers every AEC professional to realize the benefits gained by incorporating high resolution 360° imagery and 3D laser scan data in their daily work.”

BLK360 & Autodesk ReCap 360 Pro Bundle will be available to order in March 2017. The anticipated bundle suggested retail price is $15,990/€15,000, which includes: BLK360 Scanner, Case, Battery, Charger and an annual subscription to ReCap 360 Pro. For customers who want to secure their spot in line to receive the first batch of BLK360 laser scanners, Autodesk and Leica Geosystems are offering a special limited promotion for a discounted three-year ReCap 360 Pro subscription with a voucher giving priority access to buy the BLK360. Go to this link to learn more.

Leica Geosystems – when it has to be right
Revolutionizing the world of measurement and survey for nearly 200 years, Leica Geosystems creates complete solutions for professionals across the planet. Known for premium products and innovative solution development, professionals in a diverse mix of industries, such as aerospace and defense, safety and security, construction, and manufacturing, trust Leica Geosystems for all their geospatial needs. With precise and accurate instruments, sophisticated software, and trusted services, Leica Geosystems delivers value every day to those shaping the future of our world.

Leica Geosystems is part of Hexagon (Nasdaq Stockholm: HEXA B; hexagon.com), a leading global provider of information technologies that drive quality and productivity improvements across geospatial and industrial enterprise applications.

Autodesk, the Autodesk logo, Autodesk ReCap 360, and Autodesk ReCap 360 Pro are registered trademarks or trademarks of Autodesk, Inc., and/or its subsidiaries and/or affiliates in the USA and/or other countries. All other brand names, product names or trademarks belong to their respective holders. Autodesk reserves the right to alter product and services offerings, and specifications and pricing at any time without notice, and is not responsible for typographical or graphical errors that may appear in this document. 

Keeping up with the Joneses VFX 3D Scanning

Zoic Amps Up Neighborly Antics for ‘Keeping Up with The Joneses’

Led by VFX supervisor Everett Burrell, Zoic Studios delivers big Hollywood explosions and high impact antics to amplify the humor of the film’s chance encounters and secret missions.

VANCOUVER, B.C. — Zoic Studios blasts mellow neighborhood envy to espionage proportions in the new action-adventure comedy Keeping Up with the Joneses from 20th Century Fox. The film follows an ordinary suburban couple (Zach Galifianakis, Isla Fisher) who finds it’s not easy keeping up with the Joneses (Jon Hamm, Gal Gadot), their impossibly gorgeous and ultra-sophisticated neighbors — especially when they discover that Mr. and Mrs. “Jones” are covert operatives. The Zoic team, led by VFX Supervisor Everett Burrell, delivered big Hollywood explosions and high impact antics to amplify the humor of the film’s chance encounters and secret missions. The film was released in theaters nationwide on October 21, 2016.

Burrell worked closely with Special Effects Supervisor Michael Lantieri while on set in Atlanta, GA, to properly sync all practical effects for the augmentation executed in post-production. One heavy-hitting scene called for a fiery demolition of the Joneses’ house. The Zoic team collaborated with director Greg Mottola (Superbad, Paul) to bring his vision of an ‘80s movie-style detonation to reality. They intensified the larger-than-life eruption to leave audiences with no question of complete destruction, all while maintaining a photo-real look. Since an actual location was being utilized in lieu of a soundstage, none of the mansion blaze was captured practically, requiring the entire sequence to be created in CG.

Another scene called for Fisher and Galifianakis to spark a living room fire in a fantasy setting during a romantic moment gone wrong. The living room set was built twice, enabling the crew to practically ignite certain set elements and have Zoic Studios composite the flaming pieces into the scene in post-production.

While some practical stunt work called for VFX to bolster the action, certain moments required more comedy chops, including a scene where Galifianakis attempts a Hollywood-style jump through a glass window, only to bounce off the glass upon contact. To practically capture the moment, a stunt actor handled the brunt of the impact with Galifianakis jumping through an empty window placed in a hallway. The Zoic team seamlessly transitioned between the two performances to create a laughter-inducing moment for all.

Another sequence entailed creating the look of a 24-story vantage point from what was actually a 5-story perspective at a hotel. To achieve this, the team placed green screen across the intervening floors and added scale to the shots in CG, making the pool appear significantly farther below. Additional effects work included clean-up to accelerate the action scenes, from removing rigs used in practical stunts to adding muzzle flashes and bullet hits. The team also crafted a CG snake for a comical reptilian dining sequence between Hamm and Galifianakis.

“It was great working with director Greg Mottola to take the comedy to the next level. He has such extensive expertise in the genre and it was enjoyable to collaborate on the effects that would remain real and plausible in this comedy universe,” notes Burrell. “The comedy is so dialogue-driven that we really worked to keep the VFX subtle and let the writing and performances speak for themselves.”

Zoic Studios amps the action in an explosion sequence from ‘Keeping Up with The Joneses.’

3D Scanning and LiDAR by SCANable
Source: Zoic Studios

Velodyne LiDAR Announces Puck Hi-Res™ LiDAR Sensor

Velodyne LiDAR Announces Puck Hi-Res™ LiDAR Sensor, Offering Higher Resolution to Identify Objects at Greater Distances

Industry-leading, real-time LiDAR sensor impacts autonomous vehicle, 3D mapping and surveillance industries with significantly higher resolution of 3D images


MORGAN HILL, Calif.–(BUSINESS WIRE)–Velodyne LiDAR Inc., the recognized global leader in Light Detection and Ranging (LiDAR) technology, today unveiled its new Puck Hi-Res™ sensor, a version of the company’s groundbreaking LiDAR Puck that provides higher resolution in captured 3D images, which allows objects to be identified at greater distances. Puck Hi-Res is the third new LiDAR sensor released by the company this year, joining the standard VLP-16 Puck™ and the Puck LITE™.


“Introducing a high-resolution LiDAR solution is essential to advancing any industry that leverages the capture of 3D images, from autonomous navigation to mapping to surveillance,” said Mike Jellen, President and COO, Velodyne LiDAR. “The Puck Hi-Res sensor will provide the most detailed 3D views possible from LiDAR, enabling widespread adoption of this technology while increasing safety and reliability.”

Expanding on Velodyne LiDAR’s groundbreaking VLP-16 Puck, a 16-channel, real-time 3D LiDAR sensor that weighs just 830 grams, Puck Hi-Res is used in applications that require greater resolution in the captured 3D image. Puck Hi-Res retains the VLP-16 Puck’s 360° horizontal field-of-view (FoV) and 100-meter range, but delivers a 20° vertical FoV for a tighter channel distribution – 1.33° between channels instead of 2.00° – yielding greater detail in the 3D image at longer ranges. This enables the host system to not only detect, but also better discern, objects at these greater distances.
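The quoted channel spacing follows directly from the geometry: 16 channels spread evenly across a 20° vertical FoV leave 15 gaps of 20/15 ≈ 1.33° each, versus 30/15 = 2.00° for the standard Puck's 30° FoV. A quick sketch:

```python
def channel_spacing(vertical_fov_deg, channels):
    """Angular gap between adjacent, evenly spaced laser channels."""
    return vertical_fov_deg / (channels - 1)

print(round(channel_spacing(20.0, 16), 2))  # Puck Hi-Res -> 1.33
print(round(channel_spacing(30.0, 16), 2))  # standard VLP-16 -> 2.0
```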

“Building on the VLP-16 Puck and the Puck LITE, the Puck Hi-Res was an intuitive next step for us, as the evolution of the various industries that rely on LiDAR showed the need for higher resolution 3D imaging,” said Wayne Seto, product line manager, Velodyne LiDAR. “Not only does the Puck Hi-Res provide greater detail in longer ranges, but it retains all the functions of the original VLP-16 Puck that shook up these industries when it was introduced in September 2014.”

“The 3D imaging market is expected to grow from $5.71B in 2015 to $15.15B in 2020, led by the development of autonomous shuttles for large campuses, airports, and basically anywhere there’s a need to safely move people and cargo,” said Dr. Rajender Thusu, Industry Principal for Sensors & Instruments, Frost & Sullivan. “We expect Velodyne LiDAR’s line of sensors to play a key role in this surge in autonomous vehicle development, as the company leads the way in partnerships with key industry drivers, along with the fact that sensors like the new Puck Hi-Res are substantially more sophisticated than competitive offerings and increasingly accessible to all industry players.”

Velodyne LiDAR is now accepting orders for Puck Hi-Res, with a lead-time of approximately eight weeks.

About Velodyne LiDAR

Founded in 1983 by David S. Hall, Velodyne Acoustics Inc. first disrupted the premium audio market through Hall’s patented invention of virtually distortionless, servo-driven subwoofers. Hall subsequently leveraged his knowledge of robotics and 3D visualization systems to invent groundbreaking sensor technology for self-driving cars and 3D mapping, introducing the HDL-64 Solid-State Hybrid LiDAR sensor in 2005. Since then, Velodyne LiDAR has emerged as the leading supplier of solid-state hybrid LiDAR sensor technology used in a variety of commercial applications including advanced automotive safety systems, autonomous driving, 3D mobile mapping, 3D aerial mapping and security. The compact, lightweight HDL-32E sensor is available for applications including UAVs, while the VLP-16 LiDAR Puck is a 16-channel LiDAR sensor that is both substantially smaller and dramatically less expensive than previous generation sensors. To read more about the technology, including white papers, visit http://www.velodynelidar.com.

Contacts

Velodyne LiDAR
Laurel Nissen
lnissen@velodyne.com
or
Porter Novelli/Voce
Andrew Hussey
Andrew.hussey@porternovelli.com

Hennessy VSOP 3D Scan

Hennessy Launches “Harmony. Mastered from Chaos.” Interactive Campaign using LiDAR Scans

NEW YORK, June 30, 2016 /PRNewswire/ — Hennessy, the world’s #1 Cognac, today announced “Harmony. Mastered from Chaos.” – a dynamic new campaign that brings to life the multitude of complex variables that are artfully and expertly mastered by human touch to create the brand’s most harmonious blend, V.S.O.P Privilège. Set to launch June 30th, the campaign showcases the absolute mastery exuded at every stage of crafting this blend. This first campaign in over ten years also offers a glimpse into the inner workings of Hennessy’s mysterious Comité de Dégustation (Tasting Committee) — perhaps the ideal example of Hennessy’s mastery — that crafts the same rich, high quality liquid year over year. Narrated by Leslie Odom, Jr., the campaign features 60-, 30- and 15-second digital spots and an interactive digital experience, adding another vivid chapter to the brand’s “Never stop. Never settle.” platform.

“Sharing the intriguing story of the Hennessy Tasting Committee, its exacting practices and long standing rituals, illustrates the crucial role that over 250 years of tradition and excellence play in mastering this well-structured spirit,” said Giles Woodyer, Senior Vice President, Hennessy US. “With more and more people discovering Cognac and seeking out the heritage behind brands, we knew it was the right time to launch the first significant marketing campaign for V.S.O.P Privilège.”

Hennessy’s Comité de Dégustation is a group of seven masters, including seventh generation Master Blender, Yann Fillioux, unparalleled in the world of Cognac. These architects of time oversee the eaux-de-vie to ensure that every bottle of V.S.O.P Privilège is perfectly balanced despite the many intricate variables present during creation of the Cognac. From daily tastings at exactly 11am in the Grand Bureau (whose doors never open to the public) to annual tastings of the entire library of Hennessy eaux-de-vie (one of the largest and oldest in the world), this august body meticulously safeguards the future of Hennessy, its continuity and legacy.

Through a perfectly orchestrated phalanx marked by an abundance of tradition, caring and human touch, V.S.O.P Privilège is created as a complete and harmonious blend: the definitive expression of a perfectly balanced Cognac. Based on a selection of firmly structured eaux-de-vie, aged largely in partially used barrels in order to take on subtle levels of oak tannins, this highly characterful Cognac reveals balanced aromas of fresh vanilla, cinnamon and toasty notes, all coming together with a seamless perfection.

“Harmony. Mastered from Chaos.”
In partnership with Droga5, the film and interactive experience were directed by Ben Tricklebank of Tool of North America, and Active Theory, a Los Angeles-based interactive studio. From the vineyards in Cognac, France, to the distillery and Cognac cellars, viewers are taken on a powerful and modern cinematic journey to experience the scrupulous process of crafting Hennessy V.S.O.P Privilège. The multidimensional campaign uses a combination of live-action footage and technology, including 3D lidar scanning, depth capture provided by SCANable, and binaural recording to visualize the juxtaposition of complexity versus mastery that is critical to the Hennessy V.S.O.P Privilège Cognac-making process.

“Harmony. Mastered from Chaos.” will be supported by a fully integrated marketing campaign including consumer events, retail tastings, social and PR initiatives. Consumers will be able to further engage with the brand through  the first annual “Cognac Classics Week” hosted by Liquor.com, taking place July 11-18 to demonstrate the harmony that V.S.O.P Privilège adds to classic cocktails. Kicking off on Bastille Day in a nod to Hennessy’s French heritage, mixologists across New York City, Chicago, and Los Angeles will offer new twists on classics such as the French 75, Sidecar, and Sazerac, all crafted with the perfectly balanced V.S.O.P Privilège.

For more information on Cognac Classics Week, including a list of participating bars and upcoming events, visit www.Liquor.com/TBD and follow the hashtag #CognacClassicsWeek.

To learn more about “Harmony. Mastered from Chaos.” visit Hennessy.com or Facebook.com/Hennessy.

ABOUT HENNESSY
In 2015, the Maison Hennessy celebrated 250 years of an exceptional adventure that has lasted for seven generations and spanned five continents.

It began in the French region of Cognac, the seat from which the Maison has constantly passed down the best the land has to give, from one generation to the next. In particular, such longevity is thanks to those people, past and present, who have ensured Hennessy’s success both locally and around the world. Hennessy’s success and longevity are also the result of the values the Maison has upheld since its creation: unique savoir-faire, a constant quest for innovation, and an unwavering commitment to Creation, Excellence, Legacy, and Sustainable Development. Today, these qualities are the hallmark of a House – a crown jewel in the LVMH Group – that crafts the most iconic, prestigious Cognacs in the world.

Hennessy is imported and distributed in the U.S. by Moët Hennessy USA. Hennessy distills, ages and blends a full range: Hennessy V.S, Hennessy Black, V.S.O.P Privilège, X.O, Paradis, Paradis Impérial and Richard Hennessy. For more information and where to purchase/engrave, please visit Hennessy.com.


Video – https://youtu.be/vp5e8YV0pjc
Photo – http://photos.prnewswire.com/prnh/20160629/385105
Photo – http://photos.prnewswire.com/prnh/20160629/385106

SOURCE Hennessy


Leica Pegasus Backpack Wearable Reality Capture – Indoors, Outdoors, Anywhere

Ultra mobile reality capture sensor platform – authoritative professional documentation indoors or outdoors

The Leica Pegasus:Backpack is a unique wearable reality capture sensor platform combining cameras and Lidar profilers with the lightness of a carbon fiber chassis and a highly ergonomic design. The Pegasus:Backpack enables extensive and efficient indoor or outdoor documentation at a level of accuracy that is authoritative and professional. It is designed for rapid and regular reality capture: no scan registration is needed for progressive scanning. The Pegasus:Backpack is also completely portable, small enough to be checked in as luggage on a flight – simply fly in, wear, collect, then fly out. As part of the Pegasus platform, the Pegasus:Backpack is designed to act as a sensor platform with our standard external trigger and sync port outputs.

Leica Pegasus:Backpack

Map indoors, outdoors, underground, anywhere
The Leica Pegasus:Backpack makes progressive professional BIM documentation a reality by synchronising imagery and point cloud data, assuring complete documentation of a building for full life-cycle management. By using SLAM (Simultaneous Localization and Mapping) technology and a high-precision IMU, we ensure accurate positioning even during GNSS outages – the best known position independent of how the system is used.
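As a rough illustration of why an IMU can bridge short GNSS outages (a deliberately simplified model, not Leica's actual fusion filter): between fixes the system dead-reckons by integrating acceleration, so any residual sensor bias grows into a position error with the square of the outage duration, which is why such systems quote accuracy "after N seconds of outage". The bias value below is purely hypothetical:

```python
def drift_after_outage(accel_bias_mps2, seconds):
    """Position error from double-integrating a constant residual
    accelerometer bias over a GNSS outage: e = 0.5 * b * t**2."""
    return 0.5 * accel_bias_mps2 * seconds ** 2

# hypothetical 0.0004 m/s^2 residual bias after SLAM/IMU correction
for t in (5, 10, 20):
    print(t, "s outage ->", round(drift_after_outage(4e-4, t) * 1000, 1), "mm")
```

Note how the error quadruples when the outage doubles; keeping the quoted error small therefore depends on SLAM constraints continually re-zeroing the drift.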

With the Leica Pegasus:Backpack, professional data collection is no longer limited in outdoor areas or underground infrastructure with restricted access. Capturing a full 360° spherical view and Lidar together means you never miss an object or need to return to a project site – no matter where you are. A hardware light sensor assures the operator that all images are usable, while other functions can be verified and adjusted from the operator’s tablet device.


Main features

  • Indoor and outdoor mapping in one single solution – position agnostic
  • Marries imagery and point cloud data into a single calibrated, user-intuitive platform
  • Full calibrated spherical view
  • External trigger output and external time stamping for additional sensors
  • Light sensor for auto brightness and balance control for image capture
  • Software enables access to Esri® ArcGIS for Desktop
  • Capture and edit 3D spatial objects from images and / or within the point cloud
  • Economical with data – balances data quantity and quality, with project logistics and post-processing
  • Ultra-lightweight carbon fiber core frame with extensive ergonomic support for prolonged use
  • Real time view of the captured data through the tablet device
  • Up to 6 hours operational time with optional battery pack

Hardware features

  • Two profilers with 600,000 pts/sec, 50 m usable range and 16 channels
  • Largest pixel size on the market – 5.5 µm x 5.5 µm
  • Five 4 MP cameras positioned to capture a 360° x 200° view
  • User adjustable acquisition intervals based on the distance travelled
  • NovAtel ProPak6™ provides the latest and most sophisticated precise GNSS receiver with a robust field proven IMU for position accuracy of 20 mm RMS after 10 seconds of outage
  • Marrying a triple band GNSS system with the latest multiple beam enabled SLAM algorithms
  • INS determination of the location, speed, velocity and orientation at a rate of 200 Hz
  • Ultra portable system fitting into one carrying case (system weight 13 kg)
  • Battery based – using four batteries in a hot swappable configuration
  • Multi-core industrial PC, 1 TB SSD, USB3 interface, ethernet, and wireless connection from the system to the tablet device

Leica Pegasus:Backpack Indoor mapping solution

The Leica Pegasus:Backpack enables previously unimaginable applications for indoor and outdoor mapping, combining visual images with the accuracy of a point cloud for professional documentation – in a wearable, ergonomic, and ultra-light carbon fiber construction.

Software features

  • User capable of adding acquisition point objects in a Shapefile format during data acquisition
  • Advanced export capability for CAD-systems and others (DWG, DXF, SHP, GDB, DGN, E57, HPC, LAS, PTS, NMEA, KMZ)
  • Semi-automatic extraction tools
  • Sequenced images and videos for rapid navigation and object recognition
  • Software pointer “snaps” automatically and continuously onto the point cloud data from within an image
  • Immediate access to point clouds for accurate measurement
  • 3D stereoscopic view to decrease errors and increase throughput
  • Shadowed or missing 3D points can be acquired via photogrammetric processes
  • Data capture module displays the current location based on a GIS user interface
  • Data capture module displays all cameras and Lidar scans live, simultaneously
  • Data capture module enables laser scanner management and GNSS Operation
  • Live status monitoring of system during data acquisition

Software benefits

  • Lidar accuracy with image-based usability
  • Digitise spatial objects through mobile mapping
  • A more natural approach for non-professional users while offering technical interface for advanced users
  • Scalable to your applications including less accurate simple GIS needs
  • Short data acquisition time
  • High acquisition throughput
  • High post-processing throughput
  • Manageable license options – compatible with thin-client viewer
  • Esri® ArcGIS for Desktop compatible
  • Leverages Esri® relational platform for advanced features

SynthEyes 3D Tracking Software

Andersson Technologies releases SynthEyes 1502 3D Tracking Software

Andersson Technologies has released SynthEyes 1502, the latest version of its 3D tracking software, improving compatibility with Blackmagic Design’s Fusion compositing software.

Reflecting the renewed interest in Fusion
According to the official announcement: “Blackmagic Design’s recent decision to make Fusion 7 free of charge has led to increased interest in that package. While SynthEyes has exported to Fusion for many years now — for projects such as Battlestar Galactica — Andersson Technologies LLC upgraded SynthEyes’s Fusion export.”

Accordingly, the legacy Fusion exporter now supports 3D planar trackers; primitive, imported, or tracker-built meshes; imported or extracted textures; multiple cameras; and lens distortion via image maps.

The new lens distortion feature should make it possible to reproduce the distortion patterns of any real-world lens without its properties having been coded explicitly in the software or a custom plugin.
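Distortion-as-image-map works by baking the lens model into a per-pixel UV lookup (often called an ST map) that the compositor uses to warp frames, so no lens-specific code is needed downstream. A minimal sketch, assuming a simple one-term radial model (the k1 coefficient here is a made-up example value, and this is not SynthEyes's internal implementation):

```python
import numpy as np

def radial_stmap(width, height, k1):
    """Build a UV remap grid encoding r' = r * (1 + k1 * r^2)
    about the image centre, in normalised [0, 1] ST coordinates."""
    ys, xs = np.mgrid[0:height, 0:width]
    # normalise pixel centres to [-1, 1] with the optical centre at origin
    u = (xs + 0.5) / width * 2 - 1
    v = (ys + 0.5) / height * 2 - 1
    r2 = u * u + v * v
    scale = 1 + k1 * r2
    ud, vd = u * scale, v * scale
    # back to the [0, 1] ST-map convention
    return np.stack([(ud + 1) / 2, (vd + 1) / 2], axis=-1)

stmap = radial_stmap(64, 64, k1=-0.05)  # mild barrel distortion
print(stmap.shape)                       # (64, 64, 2)
```

The resulting array can be written out as a floating-point image; any package that supports UV remapping can then apply or invert the distortion without knowing the lens model.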

A new second exporter creates corner pin nodes in Fusion from 2D or 3D planar trackers in SynthEyes.
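A corner pin is equivalent to the planar homography that maps the four corners of the insert onto the four tracked corners, which is presumably what such an exporter writes per frame. A hedged sketch of recovering that homography with the standard direct linear transform:

```python
import numpy as np

def corner_pin_homography(src, dst):
    """3x3 homography mapping four src corners onto four dst corners
    (direct linear transform; src and dst are lists of four (x, y))."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)        # null-space vector of the 8x9 system
    return H / H[2, 2]

src = [(0, 0), (1, 0), (1, 1), (0, 1)]             # unit square
dst = [(10, 10), (110, 20), (100, 120), (5, 100)]  # tracked corners
H = corner_pin_homography(src, dst)
p = H @ np.array([1.0, 0.0, 1.0])                  # map corner (1, 0)
print(p[:2] / p[2])                                 # ~ [110, 20]
```

Four correspondences give eight equations for the eight degrees of freedom of a homography, which is why a corner pin needs exactly four tracked points.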

Other new features in SynthEyes 1502 include an error curve mini-view, a DNG/CinemaDNG file reader, and a refresh of the user interface, including the option to turn toolbar icons on or off.

Pricing and availability
SynthEyes 1502 is available now for Windows, Linux and Mac OS X. New licences cost from $249 to $999, depending on which edition you buy. The new version is free to registered users.

New features in SynthEyes 1502 include:

  • Toolbar icons are back! Some love ’em, some hate ’em. Have it your way: set the preference. Shows both text and icon by default to make it easiest, especially for new users with older tutorials. Some new and improved icons.
  • Refresh of user interface color preferences to a somewhat darker and trendier look. Other minor appearance tweaks.
  • New error curve mini-view.
  • Updated Fusion 3D exporter now exports all cameras, 3D planars, all meshes (including imported), lens distortion via image maps, etc.
  • New Fusion 2D corner pinning exporter.
  • Lens distortion export via color maps, currently for Fusion (Nuke for testing).
  • During offset tracking, a tracker can be (repeatedly) shift-dragged to different reference patterns on any frame, and SynthEyes will automatically adjust the offset channel keying.
  • Rotopanel’s Import tracker to CP (control point) now asks whether you want to import the relative motion or absolute position.
  • DNG/CinemaDNG reading. Marginal utility: DNG requires much proprietary postprocessing to get usable images, despite new luma and chroma blur settings in the image preprocessor.
  • New script to “Reparent meshes to active host” (without moving them)
  • New section in the user manual on “Realistic Compositing for 3-D”
  • New tutorials on offset tracking and Fusion.
  • Upgraded to RED 5.3 SDK (includes REDcolor4, DRAGONcolor2).
  • Faster camera and perspective drawing with large meshes and lidar scan data.
  • Windows: Installing license data no longer requires “right click/Start as Administrator”—the UAC dialog will appear instead.
  • Windows: Automatically keeps the last 3 crash dumps. Even one crash is one too many.
  • Windows: Installers, SynthEyes, and Synthia are now code-signed for “Andersson Technologies LLC” instead of showing “Unknown publisher”.
  • Mac OS X: Yosemite required that we change to the latest XCode 6—this eliminated support for OS X 10.7. Apple made 10.8 more difficult as well.

About SynthEyes

SynthEyes is a program for 3-D camera-tracking, also known as match-moving. SynthEyes can look at the image sequence from your live-action shoot and determine how the real camera moved during the shoot, what the camera’s field of view (~focal length) was, and where various locations were in 3-D, so that you can create computer-generated imagery that exactly fits into the shot. SynthEyes is widely used in film, television, commercial, and music video post-production.
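The "(~focal length)" aside reflects the pinhole relationship fov = 2·atan(sensor_width / (2·f)): field of view and focal length are interchangeable once the sensor size is known. A quick sketch (the 36 mm width is just an assumed full-frame example, not a SynthEyes default):

```python
import math

def fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view of a pinhole camera, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def focal_mm(fov_degrees, sensor_width_mm=36.0):
    """Inverse: focal length from a solved field of view."""
    return sensor_width_mm / (2 * math.tan(math.radians(fov_degrees) / 2))

print(round(fov_deg(50), 1))    # 50 mm full-frame lens -> 39.6 degrees
print(round(focal_mm(39.6), 1)) # round-trips to ~50.0 mm
```

This is also why a solver can only report an equivalent focal length once you tell it the back plate (sensor) width.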

What can SynthEyes do for me? You can use SynthEyes to help insert animated creatures or vehicles; fix shaky shots; extend or fix a set; add virtual sets to green-screen shoots; replace signs or insert monitor images; produce 3D stereoscopic films; create architectural previews; reconstruct accidents; do product placements after the shoot; add 3D cybernetic implants, cosmetic effects, or injuries to actors; produce panoramic backdrops or clean plates; build textured 3-D meshes from images; add 3-D particle effects; or capture body motion to drive computer-generated characters. And those are just the more common uses; we’re sure you can think of more.

What are its features? Take a deep breath! SynthEyes offers 3-D tracking, set reconstruction, stabilization, and motion capture. It handles camera tracking, 2- and 3-D planar tracking, object tracking, object tracking from reference meshes, camera+object tracking, survey shots, multiple-shot tracking, tripod (nodal, 2.5-D) tracking, mixed tripod and translating shots, stereoscopic shots, nodal stereoscopic shots, zooming shots, lens distortion, light solving. It can handle shots of any resolution (Intro version limited to 1920×1080)—HD, film, IMAX, with 8-bit, 16-bit, or 32-bit float data, and can be used on shots with thousands of frames. A keyer simplifies and speeds tracking for green-screen shots. The image preprocessor helps remove grain, compression artifacts, off-centering, or varying lighting and improve low-contrast shots. Textures can be extracted for a mesh from the image sequence, producing higher resolution and lower noise than any individual image. A revolutionary Instructible Assistant, Synthia™, helps you work faster and better, from spoken or typed natural language directions.

SynthEyes offers complete control over the tracking process for challenging shots, including an efficient workflow for supervised trackers, combined automated/supervised tracking, offset tracking, incremental solving, rolling-shutter compensation, a hard and soft path locking system, distance constraints for low-perspective shots, and cross-camera constraints for stereo. A solver phase system lets you set up complex solving strategies with a visual node-based approach (not in Intro version). You can set up a coordinate system with tracker constraints, camera constraints, an automated ground-plane-finding tool, by aligning to a mesh, a line-based single-frame alignment system, manually, or with some cool phase techniques.

Eyes starting to glaze over at all the features? Don’t worry, there’s a big green AUTO button too. Download the free demo and see for yourself.

What can SynthEyes talk to? SynthEyes is a tracking app; you’ll use the other apps you already know to generate the pretty pictures. SynthEyes exports to about 25 different 2-D and 3-D programs. The Sizzle scripting language lets you customize the standard exports, or add your own imports, exports, or tools. You can customize toolbars, color scheme, keyboard mapping, and viewport configurations too. Advanced customers can use the SyPy Python API/SDK.


Endeavour: The Last Space Shuttle as she’s never been seen before.

[source by Mark Gibbs]

Endeavour, NASA’s fifth and final space shuttle, is now on display at the California Science Center in Los Angeles and, if you’re at all a fan of space stuff, it’s one of the most iconic and remarkable flying machines ever built.

David Knight, a trustee and board member of the foundation, recently sent me a link to an amazing video of the shuttle as well as some excellent still shots.

David commented that these images were:

 “…captured by Chuck Null on the overhead crane while we were doing full-motion VR and HD/2D filming … the Payload Bay has been closed for [a] few years … one door will be opened once she’s mounted upright in simulated launch position in the new Air & Space Center.

Note that all of this is part of the Endeavour VR Project by which we are utilizing leading-edge imaging technology to film, photograph and LIDAR-scan the entire Orbiter, resulting in the most comprehensive captures of a Space Shuttle interior ever assembled – the goal is to render ultra-res VR experiences by which individuals will be able to don eyewear such as the Oculus Rift (the COO of Oculus himself came down during the capture sessions), and walk or ‘fly’ through the Orbiter, able to ‘look’ anywhere, even touch surfaces and turn switches, via eventual haptic feedback gloves etc.

The project is being Executive Produced by me, with the Producer being Ted Schilowitz (inventor of the RED camera and more), Director is Ben Grossman, who led the special effects for the most recent Star Trek movie. Truly Exciting!”

Here are the pictures …

Endeavour - the last Space Shuttle (Charles Null / David Knight on behalf of the California Science Center)

Rent or Buy Leica Geosystems Cyclone 9

Leica Geosystems HDS Introduces Patent-Pending Innovations for Laser Scanning Project Efficiency

With Leica Cyclone 9.0, the industry leading point cloud solution for processing laser scan data, Leica Geosystems HDS introduces major, patent-pending innovations for greater project efficiency. Innovations benefit both field and office via significantly faster, easier scan registration, plus quicker deliverable creation thanks to better 2D and 3D drafting tools and steel modelling. Cyclone 9.0 allows users to scale easily for larger, more complex projects while ensuring high quality deliverables consistently.

Greatest advancement in office scan registration since cloud-to-cloud registration
When Leica Geosystems pioneered cloud-to-cloud registration, it enabled users – for the first time – to accurately execute laser scanning projects without having to physically place special targets around the scene, scan them, and model them in the office. With cloud-to-cloud registration software, users take advantage of overlaps among scans to register them together.
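At its core, cloud-to-cloud registration iteratively finds the rigid transform that best aligns overlapping points between scans, as in the ICP family of algorithms. Below is a minimal sketch of the inner alignment step, a Kabsch/Procrustes best-fit that assumes point correspondences are already known; Leica's production algorithm is proprietary and far more involved, so treat this purely as an illustration of the principle.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping corresponding
    points src[i] -> dst[i] (Kabsch algorithm via SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t
```

In a full cloud-to-cloud pipeline this step alternates with a nearest-neighbour correspondence search over the overlapping regions until the alignment converges.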

“The cloud-to-cloud registration approach has delivered significant logistical benefits onsite and time savings for many projects. We’ve constantly improved it, but the new Automatic Scan Alignment and Visual Registration capabilities in Cyclone 9.0 represent the biggest advancement in cloud-to-cloud registration since we introduced it,” explained Dr. Chris Thewalt, VP Laser Scanning Software. “Cyclone 9.0 lets users benefit from targetless scanning more often by performing the critical scan registration step far more efficiently in the office for many projects. As users increase the size and scope of their scanning projects, Cyclone 9.0 pays even bigger dividends. Any user who registers laser scan data will find great value in these capabilities.”

With the push of a button, Cyclone 9.0 automatically processes scans, and digital images if available, to create groups of overlapping scans that are initially aligned to each other. Once scan alignment is completed, algorithmic registration is applied for final registration. This new workflow option can be used in conjunction with target registration methods as well. These combined capabilities not only make the most challenging registration scenarios feasible, but also exponentially faster. Even novice users will appreciate their ease-of-use and ready scalability beyond small projects.

Power user Marta Wren, technical specialist at Plowman Craven Associates (PCA – leading UK chartered surveying firm), found that Cyclone 9.0’s Visual Registration tools alone made registration processing of scans up to four times (4X) faster than previous methods. PCA uses laser scanning for civil infrastructure, commercial property, forensics, entertainment, and Building Information Modelling (BIM) applications.

New intuitive 2D and 3D drafting from laser scans
For civil applications, new roadway alignment drafting tools let users import LandXML-based roadway alignments or use simple polylines imported or created in Cyclone. These tools allow users to easily create cross section templates using feature codes, as well as copy them to the next station and visually adjust them to fit roadway conditions at the new location. A new vertical exaggeration tool in Cyclone 9.0 allows users to clearly see subtle changes in elevation; linework created between cross sections along the roadway can be used as breaklines for surface meshing or for 2D maps and drawings in other applications.

For 2D drafting of forensic scenes, building and BIM workflows, a new Quick Slice tool streamlines the process of creating a 2D sketch plane for drafting items, such as building footprints and sections, into just one step. A user only needs to pick one or two points on the face of a building to get started. This tool can also be used to quickly analyse the quality of registrations by visually checking where point clouds overlap.

Also included in Cyclone 9.0 are powerful, automatic point extraction features first introduced in Cyclone II TOPO and Leica CloudWorx. These include efficient SmartPicks for automatically finding bottom, top, and tie point locations and Points-on-a-Grid for automatically placing up to a thousand scan survey points on a grid for ground surfaces or building faces.

Simplified steel fitting of laser scan data
For plant, civil, building and BIM applications, Cyclone 9.0 also introduces a patent-pending innovation for modelling steel from point cloud data more quickly and easily. Unlike time-consuming methods that require either processing an entire available cloud to fit a steel shape or isolating a cloud section before fitting, this new tool lets users quickly and accurately model specific steel elements directly within congested point clouds. Users only need to make two picks along a steel member to model it. Shapes include wide flange, channel, angle, tee, and rectangular tube shapes.

Faster path to deliverables
Leica Cyclone 9.0 also provides users with valuable, new capabilities for faster creation of deliverables for civil, architectural, BIM, plant, and forensic scene documentation from laser scans and High-Definition Surveying™ (HDS™).

Availability
Leica Cyclone 9.0 is available today. Further information about the Leica Cyclone family of products can be found at http://hds.leica-geosystems.com, and users may download new product versions online from this website or purchase or rent licenses from SCANable, your trusted Leica Geosystems representative. Contact us today for pricing on software and training.

Capturing Real-World Environments for Virtual Cinematography


[source] written by Matt Workman

Virtual Reality Cinematography

As Virtual Reality HMDs (Oculus) come speeding towards consumers, there is an emerging need to capture 360 media and 360 environments. Capturing a location for virtual reality or virtual production is a task that is well suited for a DP and maybe a new niche of cinematography/photography. Not only are we capturing the physical dimensions of the environment using LIDAR, but we are also capturing the lighting using 360 degree HDR light probes captured with DSLRs/nodal tripod systems.

A LIDAR scanner is essentially a camera that shoots in all directions. It lives on a tripod and it can record the physical dimensions and color of an environment/space. It captures millions of points and saves their position and color to be later used to construct the space digitally.
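As a rough illustration of what such a scanner records, the data can be as simple as one "x y z r g b" record per point. Here is a minimal parser for that kind of plain ASCII point file, a simplified stand-in for real interchange formats such as PTS or E57:

```python
from dataclasses import dataclass

@dataclass
class ScanPoint:
    x: float          # position, typically metres
    y: float
    z: float
    r: int            # 8-bit colour sampled from the panoramic imagery
    g: int
    b: int

def parse_xyzrgb(lines):
    """Parse whitespace-separated 'x y z r g b' records, skipping
    anything that does not look like a point (headers, blank lines)."""
    points = []
    for line in lines:
        fields = line.split()
        if len(fields) != 6:
            continue
        x, y, z = (float(f) for f in fields[:3])
        r, g, b = (int(f) for f in fields[3:])
        points.append(ScanPoint(x, y, z, r, g, b))
    return points
```

A real scan contains millions of these records, which downstream tools resample into meshes or render directly as point clouds.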

An HDR Latlong Probe in Mari

Using a DSLR camera and a nodal tripod head, the DP would capture High Dynamic Range (32bit float HDR) 360 degree probes of the location to record the lighting. This process essentially captures the lighting in the space at a VERY high dynamic range, which is later reprojected onto the geometry constructed using the LIDAR data.
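Reprojecting a latlong (equirectangular) probe onto LIDAR-derived geometry comes down to mapping the direction from the probe position to each surface point into panorama UV coordinates. A sketch of that mapping, assuming a Y-up axis convention (axis conventions differ between packages, so treat the axes here as illustrative):

```python
import math

def dir_to_latlong_uv(x, y, z):
    """Map a unit direction vector to (u, v) in a latitude/longitude
    panorama: u wraps with longitude, v runs from 0 at +Y (up) to 1 at -Y."""
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi
    return u, v
```

Sampling the HDR image at (u, v) then gives the radiance arriving from that direction, which is what gets baked onto the reconstructed set.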

Realtime 3D Asset being lit by an HDR environment real time (baked)

The DP is essentially lighting the entire space in 360 degrees and then capturing it. Imagine an entire day of lighting a space in all directions. Lights outside windows, track lighting on walls, practicals, etc. Then capturing that space using the above outlined techniques as an asset to be used later. Once the set is constructed virtually, the director can add actors/props and start filmmaking, like he/she would do on a real set. And the virtual cinematographer would line up the shots, camera moves, and real time lighting.

I’ve already encountered a similar paradigm as a DP, when I shot a 360 VR commercial. A few years ago I shot a commercial for Bacardi with a 360 VR camera and we had to light and block talent in all directions within a loft space. The end user was then able to control which way the camera looked in the web player, but the director/DP controlled its travel path.

360 Virtual Reality Bacardi Commercial


http://www.mattworkman.com/2012/03/18/bacardi-360-virtual-reality/

Capturing a set for VR cinematography would allow the user to control their position in the space as well as which way they were facing. And the talent and interactive elements would be added later.

Final Product: VR Environment Capture


In this video you can see the final product of a location captured for VR. The geometry for the set was created using the LIDAR as a reference. The textures and lighting data are baked in from a combination of the LIDAR color data and the reprojected HDR probes.

After all is said and done, we have captured a location, its textures, and its lighting, which can be used as a digital location however we need: for previs, virtual production, background VFX plates, a real time asset for Oculus, etc.

SIGGRAPH 2014 and NVIDIA

SG4141: Building Photo-Real Virtual Reality from Real Reality, Byte by Byte
http://www.ustream.tv/recorded/51331701

In this presentation Scott Metzger speaks about his new virtual reality company Nurulize and his work with the Nvidia K5200 GPU and The Foundry’s Mari to create photo-real 360 degree environments. He shows a demo of the environment, captured in 32bit float with 8k textures, playing in real time on an Oculus Rift, and the results speak for themselves. (The real time asset was downsampled to 16bit EXR.)

UDIM Texture Illustration

Some key technologies mentioned were the development of virtual texture engines that allow objects to have MANY 8k textures at once using the UDIM model. The environment’s lighting was baked from V-Ray 3 to a custom UDIM Unity shader and supported by Amplify Creations beta Unity Plug-in.
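The UDIM model mentioned above is simply a tile-numbering convention over UV space: ten tiles across in U, counting upward in V, starting at 1001, with each tile able to carry its own texture. A sketch of the tile lookup:

```python
import math

def udim_tile(u, v):
    """Return the UDIM tile number containing UV coordinate (u, v).
    Tile = 1001 + floor(u) + 10 * floor(v), with floor(u) in 0..9."""
    ui, vi = math.floor(u), math.floor(v)
    if not (0 <= ui <= 9) or vi < 0:
        raise ValueError("UV coordinate outside the UDIM range")
    return 1001 + ui + 10 * vi
```

So UVs in the unit square land in tile 1001, (1.5, 0.5) in 1002, and (0.5, 1.5) in 1011; a virtual texture engine streams in only the 8k tiles a frame actually touches.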

The xxArray 3D photogrammetry scanner

The actors were scanned using the xxArray photogrammetry system, and Mari was used to project the high resolution textures. All of this technology was enabled by Nvidia’s Quadro GPU line, allowing fast 8k texture buffering. The actors were later imported into the real time environment that had been captured and were viewable from all angles through an Oculus Rift HMD.

Real time environment for Oculus

Virtual Reality Filmmaking

Scott brings up some incredibly relevant and important questions about virtual reality for filmmakers (directors/DPs) who plan to work in virtual reality.

  • How do you tell a story in Virtual Reality?
  • How do you direct the viewer to face a certain direction?
  • How do you create a passive experience on the Oculus?

He even gives a glimpse at the future distribution model of VR content. His demo for the film Rise will be released for Oculus/VR in the following formats:

  1. A free roam view where the action happens and the viewer is allowed to completely control the camera and point of view.
  2. A directed view where the viewer can look around but the positioning is dictated by the script/director. This model very much interests me and sounds like a video game.
  3. And a traditional 2D post-rendered version, like a traditional cinematic or film, best suited for Vimeo/Youtube/DVD/TV.

A year ago this technology seemed like science fiction, but every year we come closer to completely capturing humans (form/texture), their motions, environments with their textures, real world lighting, and viewing them in real time in virtual reality.

The industry is evolving at an incredibly rapid pace, and so must the creatives working in it, especially the person responsible for the camera and the lighting: the director of photography.

FARO SCENE Cloud to Cloud Registration

FARO SCENE 5.3 Laser Scanning Software Provides Scan Registration without Targets

[source]

FARO® Technologies, Inc. (NASDAQ: FARO), the world’s most trusted source for 3D measurement, imaging, and realization technology, announced the release of their newest version of laser scanning software, SCENE 5.3, and scan data hosting-service, SCENE WebShare Cloud 1.5.

FARO’s SCENE 5.3 software, for use with the Laser Scanner Focus3D X Series, delivers scan registration without the need for artificial targets, such as spheres and checkerboards. Users can choose from two available registration methods: Top View Based or Cloud to Cloud. Top View Based registration allows for targetless positioning of scans. In interiors and in built-up areas without reliable GPS positioning of the individual scans, targetless positioning represents a highly efficient and largely automated method of scanning. The second method, Cloud to Cloud registration, opens up new opportunities for the user to position scans quickly and accurately, even under difficult conditions. In exterior locations with good positioning of the scans by means of the integrated GPS receiver of the Laser Scanner Focus3D X Series, Cloud to Cloud is the method of choice for targetless registration.

In addition, the software also offers various new processes that enable the user to flexibly respond to a wide variety of project requirements. For instance, Correspondence Split View matches similar areas in neighbouring scans to resolve any missing positioning information, and Layout Image Overlay allows users to place scan data in a geographical context using image files, CAD drawings, or maps.

Oliver Bürkler, Senior Product Manager for 3D Documentation Software, remarked, “SCENE 5.3 is the ideal tool for processing laser scanning projects. FARO’s cloud-based hosting solution, SCENE WebShare Cloud, allows scan projects to be published and shared worldwide via the Internet. The collective upgrades to FARO’s laser scanning software solution, SCENE 5.3 and WebShare Cloud 1.5, make even complex 3D documentation projects faster, more efficient, and more effective.”

About FARO
FARO is the world’s most trusted source for 3D measurement, imaging and realization technology. The Company develops and markets computer-aided measurement and imaging devices and software. Technology from FARO permits high-precision 3D measurement, imaging and comparison of parts and complex structures within production and quality assurance processes. The devices are used for inspecting components and assemblies, production planning, documenting large volume spaces or structures in 3D, surveying and construction, as well as for investigation and reconstruction of accident sites or crime scenes.

Worldwide, approximately 15,000 customers are operating more than 30,000 installations of FARO’s systems. The Company’s global headquarters is located in Lake Mary, FL., its European head office in Stuttgart, Germany and its Asia/Pacific head office in Singapore. FARO has branches in Brazil, Mexico, Germany, United Kingdom, France, Spain, Italy, Poland, Netherlands, Turkey, India, China, Singapore, Malaysia, Vietnam, Thailand, South Korea and Japan.

Click here for more information or to download a 30-day evaluation version.