Leap Motion Controller Update to Offer Affordable Individual Joint MoCap

Leap Motion has announced that the software for its self-titled PC gesture-control device will soon track the movement of individual finger joints, as well as the overall motion of a user’s hands.

Since its launch in 2012, the $80 Leap Motion controller has attracted a lot of interest in the CG community, with Autodesk releasing Maya and MotionBuilder plugins last year.

Individual joint tracking, more parameters captured
In a post on the company’s blog, Leap Motion CEO Michael Buckwald revealed that version 2 of its software will track the individual joints of a user’s fingers, compensating automatically where individual fingers are occluded.

The software will also expose “much more granular data” via its SDK, including 27 dimensions per hand.
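For developers curious what that looks like in practice, here is a minimal sketch based on the v2 Python bindings documented in the Leap SDK; treat it as illustrative rather than production code:

```python
# Minimal sketch of reading per-joint data with the Leap Motion v2 skeletal
# tracking API (Python bindings from the Leap SDK). Illustrative only: a real
# application would register a Listener and wait for the controller to connect.
import Leap

controller = Leap.Controller()
frame = controller.frame()   # most recent tracking frame (may be empty at startup)

for hand in frame.hands:
    print("hand confidence: %.2f" % hand.confidence)
    for finger in hand.fingers:
        # v2 exposes four bones per finger: metacarpal, proximal,
        # intermediate and distal (bone types 0-3).
        for bone_type in range(4):
            bone = finger.bone(bone_type)
            print("finger %s bone %d: %s -> %s"
                  % (finger.type(), bone_type, bone.prev_joint, bone.next_joint))
```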

Affordable Individual MoCap tools coming soon
The update, which will be free and does not require new hardware, is now in public beta for developers, although there’s no news of a consumer release date yet.

Jasper Brekelmans, creator of upcoming hand-tracking tool Brekel Pro Hands, has already announced that he is using the SDK.

Read more about the Leap Motion V2 update on the developer’s blog

Autodesk Announces ReCap Connect Partnership Program

With its new ReCap Connect Partnership Program, Autodesk will open up Autodesk ReCap – its reality capture platform – to third party developers and partners, allowing them to extend ReCap’s functionality.

“Autodesk has a long history of opening our platforms to support innovation and extension,” said Robert Shear, senior director, Reality Solutions, Autodesk. “With the ReCap Connect Partnership Program, we’ll be allowing a talented pool of partners to expand what our reality capture software can do. As a result, customers will have even more ways to start their designs with accurate dimensions and full photo-quality context rather than a blank screen.”

There are many ways for partners to connect to the ReCap pipeline, which encompasses both laser-based and photo-based workflows.  Partners can write their own import plug-in to bring structured point cloud data into ReCap and ReCap Pro using the Capture Codec Kit that is available as part of the new ReCap desktop version. DotProduct – a maker of handheld, self-contained 3D scanners – is the first partner to take advantage of this capability.

“Autodesk’s ReCap Connect program will enable a 50x data transfer performance boost for DotProduct customers — real time 3D workflows on tablets just got a whole lot faster. Our lean color point clouds will feed reality capture pipelines without eating precious schedule and bandwidth.” – Tom Greaves, Vice President, Sales and Marketing, DotProduct LLC.

Alternatively, partners can take advantage of the new Embedded ReCap OEM program to send Reality Capture Scan (RCS) data exports from their point cloud processing software directly to Autodesk design products, which all support this new point cloud engine, or to ReCap and ReCap Pro. The first signed partners in the Embedded ReCap OEM program are: FARO, for their FARO Scene software; Z+F, for their LaserControl software; CSA, for their PanoMap software; LFM, for their LFM software products; and kubit, for their VirtuSurv software. All these partners’ software will feature this RCS export in their coming releases.

“Partnering with Autodesk and participating in the ReCap Connect program helps FARO to ensure a fluent workflow for customers who work with Autodesk products. Making 3D documentation and the use of the captured reality as easy as possible is one of FARO’s foremost goals when developing our products. Therefore, integrating with Autodesk products suits very well to our overall product strategy.” – Oliver Bürkler, Senior Product Manager, 3D Documentation Software & Innovation, FARO

As a third option, partners can build their own application on top of the Autodesk photo-to-3D cloud service by using the ReCap Photo Web API. More than 10 companies – serving markets ranging from medical and civil engineering, to video games and Unmanned Aerial Vehicles (UAVs) – have started developing specific applications that leverage this capability, or have started integrating this capability right into their existing apps. Some of the first partners to use the ReCap Photo Web API include Soundfit, SkyCatch and Twnkls.

“Autodesk’s cloud-based ReCap is an important part of SoundFit’s 3D SugarCube Scanning Service. Autodesk’s ReCap service has enabled SoundFit to keep the per-scan cost of its service very low, opening new markets, such as scans for hearing aids, custom-fit communications headsets, musicians’ monitors and industrial hearing protection. ReCap allows SoundFit to export 3D models in a wide variety of popular 3D formats, so SoundFit customers and manufacturers can import them into Autodesk CAD packages from AutoCAD to 123D Design, or send them directly to any 3D printer or 3D printing service bureau.” – Ben Simon-Thomas, CEO & Co-Founder

For more information about the ReCap Connect Partnership Program, contact Dominique Pouliquen at Email Contact.

Additional Partner Supporting Quotes

“ReCap Connect gives our PointSense and PhoToPlan users smart and fully integrated access to powerful ReCap utilities directly within their familiar AutoCAD design environments. The result is a more simple and efficient overall workflow. ReCap Photo 360 image calibration eliminates the slowest part of a kubit user’s design process resulting in significant time savings per project.” – Matthias Koksch, CEO, kubit

“ReCap, integrated with CSA’s PanoMap Server, provides a powerful functionality to transfer laser scan point cloud data from large-scale 3D laser scan databases to Autodesk products.  Using the interface, the user can select any plant area by a variety of selection criteria and transfer the laser scan points to the design environment in which they are working. The laser scan 3D database of the plant can have thousands of laser scans.” – Amadeus Burger, President, CSA Laser Scanning

“Autodesk’s industry-leading ReCap photogrammetry technology will be instrumental in introducing BuildIT’s 3D Metrology solution to a broader audience by significantly reducing data capture complexity and cost.” – Vito Marone, Director Sales & Marketing, BuildIT Software & Solutions

“I am very pleased with the ReCap Photo API performance and its usefulness in fulfilling our 3D personalization needs. I believe the ReCap Photo API is the only product that is available in the market today that meets our needs.” – Dr. Masuma, PhD., Founder of iCrea8

 

Angela Costa Simoes

Senior PR Manager

DIRECT  +1 415 547 2388

MOBILE  +1 415 302 2934

@ASimoes76

Autodesk, Inc.

The Landmark @ One Market, 5th Floor

San Francisco, CA 94105

www.autodesk.com

Massive Software announces Massive 6.0 crowd simulation software

Massive 6.0

New look, new GUI

Massive has a completely new graphical user interface. With graphic design by Lost in Space, the new interface not only looks stylish and modern but provides a much smoother interactive user experience. Dialog windows and editors now turn up in the new side panel, keeping the workspace clear and tidy. The main window now hosts multiple panels that can be configured to suit the user’s needs, and the configurations can be recalled for later use. Since any panel can be a viewport, it’s now possible to have up to five viewports open at once, each using a different camera.

Massive

 

3D placement

The existing placement tools in Massive have been extended to work in three dimensions, independently of the terrain. The point generator can be placed anywhere in space, the circle generator becomes a sphere, the polygon generator gains depth and the spline generator becomes tubular. There’s also a new generator, the geometry generator, which takes a Wavefront .obj file and fills the polygonal volume with agents.
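To give a rough idea of what a volume-filling generator does, the sketch below rejection-samples points inside a closed .obj using the open-source trimesh library; this is a generic illustration, not Massive’s implementation, and the file name is made up.

```python
# Illustrative sketch only (not Massive's implementation): fill a closed
# Wavefront .obj volume with candidate agent positions by rejection sampling.
# Assumes the open-source `trimesh` and `numpy` packages.
import numpy as np
import trimesh

def fill_volume(mesh, count, batch=10000):
    lo, hi = mesh.bounds                      # axis-aligned bounding box
    points = []
    while len(points) < count:
        candidates = np.random.uniform(lo, hi, size=(batch, 3))
        inside = mesh.contains(candidates)    # point-in-mesh test (needs a watertight mesh)
        points.extend(candidates[inside])
    return np.array(points[:count])

mesh = trimesh.load("crowd_volume.obj")       # hypothetical input file
agent_positions = fill_volume(mesh, 500)      # 500 (x, y, z) agent positions
```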

 

Auto action import

Building complex agents with hundreds of actions can be a time-consuming process, but it doesn’t have to be anymore. In Massive 6.0 the action importing process can be completely automated, reducing what could be months of work to a few minutes. Also, all of the import settings for all of the actions can be saved to a file so that revisions of source motion can be imported in seconds using the same settings as earlier revisions.

Massive

Bullet dynamics

To effortlessly build a mountain of zombies it would be useful to have extremely stable rigid body dynamics. Massive 6.0 supports Bullet dynamics, significantly increasing dynamics stability. Just for fun, we had 1,000 mayhem agents throw themselves off a cliff into a pile on the ground below. Without tweaking any parameters we easily created an impressive zombie shot, demonstrating the stability and ease of use of Bullet dynamics.

No typing required

While it is possible to create almost any kind of behaviour using the brain nodes in Massive, it has always required a little typing to specify inputs and outputs of the brain. This is no longer necessary with the new channel menu which allows the user to very quickly construct any possible input or output channel string with a few mouse clicks.

These are just some of the new features of Massive 6.0, which is scheduled for release in September.

 

Massive for Maya

 

Massive has always been a standalone system, and now there’s the choice to use Massive standalone as Massive Prime and Massive Jet, or in Maya as Massive for Maya.

 

Set up and run simulations in Maya

Massive for Maya facilitates the creation of Massive simulations directly in Maya. All of the Massive scene setup tools, such as the flow field, lanes, paint and placement editors, have been seamlessly reconstructed inside Maya. The simulation workflow has been integrated into Maya to allow for intuitive running, recording and playback of simulations. To achieve this, a record button has been added next to the transport controls and a special status indicator has been included in the Massive shelf. Scrubbing simulations of thousands of agents in Maya is now as simple and efficient as scrubbing the animation of a single character.

Massive Software 3D imaging for crowd placement and simulation

Set up lighting in Maya

The Massive agents automatically appear in preview renders as well as batch renders alongside any other objects in the scene. Rendering in Maya works for Pixar’s RenderMan, Air, 3Delight, Mental Ray and V-Ray. This allows for lighting scenes using the familiar Maya lighting tools, without requiring any special effort to integrate Massive elements into the scene. Furthermore, all of this has been achieved without losing any of the efficiency and scalability of Massive.

 

Edit simulations in Maya graph editor

Any of the agents in a simulation can be made editable in the Maya graph editor. This allows for immediate editing of simulations without leaving the Maya environment. Any changes made to the animation in the graph editor automatically feed back to the Massive agents, so the tweaked agents will appear in the render even though the user sees a Maya character for editing purposes in the viewport. The editing process can even be used with complex animation control rigs, allowing animators and motion editors complete freedom to work however they want to.

 

 

Massive Software 3D imaging for crowd placement and simulation

Directable characters

A major advantage of Massive for Maya is the ability to bring Massive’s famous brains to character animation, providing another vital tool for creating the illusion of life. While animation studios have integrated Massive into their pipelines to do exactly this for years, the ability to create directable characters has not been within easy reach for those using off-the-shelf solutions. With Massive for Maya it’s now possible to create characters using a handful of base cycles, takes and expressions that can handle such tasks as keeping alive, responding to the focus of the shot, responding to simple direction, or simply walking along a path, thus reducing the amount of work required to fill out a scene with characters which are not currently the focus of the shot. For example, in a scene in which two characters are talking with each other and a third character, say a mouse, is reacting, the mouse could be driven by its Massive counterpart. The talking characters would drive their Massive counterparts, thereby being visible to the mouse. Using attributes in the talking characters, their Massive counterparts could change colour to convey their emotional states to the mouse agent. The mouse agent then performs appropriately, using its animation cycles, blend shape animations and so on, in response to the performance of the talking characters, and looking at whichever character is talking. Once the agents for a project have been created, setting up a shot for this technique requires only a few mouse clicks and the results happen in real time. Any edits to the timing of the shot will simply flow through to the mouse performance.

SCANable offers on-site 3D imaging of real-world people/characters to populate your 3D crowd asset library in Massive’s crowd placement and simulation software. Contact us today for a free quote.

R3dS Wrap Topology Transfer Software

Introducing R3DS Wrap – Topology Transfer Tool

Wrap is a topology transfer tool. It lets you reuse the topology you already have and transfer your new 3D-scanned data onto it. The resulting models will not only share the same topology and UV coordinates but will also naturally become blendshapes of each other. Here’s a short video showing how it works:

And here are a couple of examples based on 3D scans kindly provided by Lee Perry-Smith:

R3dS Wrap Topology Transfer Software

You can download a demo version from the R3DS website: http://www.russian3dscanner.com

As with all new technology in its final beta stages, Wrap is not perfect yet. R3DS would be grateful to everyone who gives them the support and feedback to finalize things in the best possible way. This software has the potential to be a great tool. Check it out!

Introducing the World’s First App for LiDAR data visualization on the iPad: RiALITY

RIEGL proudly announces its new iPad point cloud viewer: RiALITY, now available for free in the iTunes App Store.

This new, innovative app, the first of its kind, allows users to experience LiDAR data in a completely new environment and makes LiDAR data demonstrations easier through the use of an iPad.

RIEGL’s RiALITY App enables users to visualize and navigate through point clouds acquired with RIEGL laser scanners. As an example, users are able to explore a dataset of the beautiful Rosenburg Castle in Austria. RIEGL scans can also be imported into the app from RIEGL’s RiSCAN PRO software.

“We’re pleased to present a new way of visualizing point clouds. RiALITY delivers this new technology by providing Augmented Reality technology in an easy-to-use app. Now you can easily send your client a 3D point cloud that they can visualize on their iPad, for free,” said Ananda Fowler, RIEGL’s manager of terrestrial laser scanning software.

RiALITY features true color point clouds and 3D navigation. In a breakthrough technological development, the app features an Augmented Reality Mode. The Augmented Reality Mode allows point clouds to be virtually projected into the real world.

Dive into the point cloud!

Find out more at www.riegl.com/app.

3D-Scanned Olympians Wear Uniforms Suited for Superheroes

Olympic athletes will wear state-of-the-art, 3D-scanned, custom-fitted uniforms

[Source: Mashable]

As if we needed another reason to worship athletes, select Olympic hockey players will wear state-of-the-art, 3D-scanned uniforms custom-fitted to their body parts. That’s right, like superheroes.

Hockey equipment manufacturer Bauer officially unveiled the new line of high-tech hockey equipment, called “OD1N,” in December. CEO Kevin Davis has touted the gear as the “concept car” of hockey equipment. Pouring a cool million dollars into outfitting six elite hockey players, Bauer used a tech-friendly combination of composite materials, compression-molded foam and 3D optical scanning to personalize the equipment, while lightening the skates, protective gear and goalie pads by one-third.

Hockey enthusiasts will see the line in action on the ice when it debuts at the Sochi Winter Games in February. The equipment will be worn by the NHL‘s Jonathan Toews (Chicago Blackhawks/Team Canada), Patrick Kane (Chicago Blackhawks/Team USA), Nicklas Backstrom (Washington Capitals/Team Sweden) and goaltender Henrik Lundqvist (New York Rangers/Team Sweden). Claude Giroux (Philadelphia Flyers/Team Canada) and Alex Ovechkin (Washington Capitals/Team Russia) round out the group of six players who worked with Bauer to test the equipment.

Lundqvist has been practicing and playing with the Od1n goal pads since November, while Toews, Kane and Backstrom are sporting elements of the protective body suit.

Bauer OD1N Equipment

 

Bauer’s new line of hockey equipment comprises skates, goal pads and protective base-layer suits molded to each player’s form.

IMAGE: BAUER

The equipment’s weight reduction should provide a significant on-ice advantage. The skates alone, with their lighter, carbon-composite blade holders, amount to roughly 1,000 fewer pounds of lifted weight during a regulation game, according to Bauer. Lundqvist will lift 180 fewer pounds with the Od1n goalie pads, which replace traditional layers of synthetic leather with compression-molded foam that can be modified depending on the goaltender’s style of play.
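Bauer’s figure is a cumulative number over the course of a game; a quick back-of-envelope calculation, using assumed stride counts and per-skate savings rather than numbers published by Bauer, shows how a small saving per stride adds up:

```python
# Rough back-of-envelope only; the per-skate saving and stride count are
# assumptions for illustration, not figures published by Bauer.
saving_per_skate_lb = 0.25          # assumed weight shaved off each skate
strides_per_game = 2000             # assumed strides per player per game
lifted_weight_saved_lb = saving_per_skate_lb * 2 * strides_per_game
print("%d lb less lifted weight per game" % lifted_weight_saved_lb)  # 1000
```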

“The benefit is not only in the quickness to the puck but in their ultimate endurance and stamina going into the third period,” says Craig Desjardins, Bauer’s general manager of player equipment and project leader for Od1n. “For [Lundqvist], that was the difference between getting a block or getting scored on.”

Like many a concept car, Bauer’s designs also drew on new technologies. Using 3D optical scanning borrowed from the automotive industry, it manufactured protective base-layer suits molded to each player’s physique. The scans generated computerized models, from which Bauer designed custom equipment.

“Being able to customize, for example, a shin guard or elbow pad based on the individual geometry of a player, we’ve taken the guesswork out completely,” says Desjardins. “It’s going to better protect you if it stays in place.”

 

 

It’s all very spiffy, but in automotive terms, a concept car showcases radical new developments in technology and design that make it prohibitively expensive for consumers. The cars don’t often make it to mass production. The cost of Bauer’s own “concept car” design, with its attendant technological advancements, places the equipment well out of reach of all but the most elite hockey players.

Much like BMW’s shape-shifting sedan, the idea is to ogle Od1n, not to own it, although Bauer will likely outfit a few more NHL bodies in the future.

Its creators are optimistic that certain elements will make their way to mass production, however.

“We’re trying to invent the future of hockey equipment, to show the industry and consumers where it could go, where it will go,” says Desjardins. “In the next few years, we’ll be able to take that technology down into multiple price points.”

So if, a few years from Sochi, your neighborhood is teeming with hockey prodigies, you’ll know why.

LiDAR for Visual Effects - Rebirth

Krakatoa Creates CG Visual Effects from LIDAR Scans for Short Film “Rebirth”

Film director and cinematographer Patryk Kizny – along with his talented team at LookyCreative – put together the 2010 short film “The Chapel” using motion controlled HDR time-lapse to achieve an interesting, hyper-real aesthetic. Enthusiastically received when released online, the three-minute piece pays tribute to a beautifully decaying church in a small Polish village built in the late 1700s. Though widely lauded, “The Chapel” felt incomplete to Kizny, so in fall of 2011, he began production on “Rebirth” to refine and add dimension to his initial story.

LiDAR for Visual Effects - Rebirth

Exploring the same church, “Rebirth” comprises three separate scenes created using different visual techniques. Contemplative, philosophical narration and a custom orchestral soundtrack composed by Kizny’s collaborator, Mateusz Zdziebko, help guide the flow and overall aspirational tone of the film, which runs approximately 12 minutes. The first scene features a point cloud representation of the chapel with various pieces and cross-sections of the building appearing, changing and shifting to the music. Based on LIDAR scans taken of the chapel for this project, Kizny generated the point clouds with Thinkbox Software’s volumetric particle renderer, Krakatoa, in Autodesk 3ds Max.

LiDAR for VFX - Rebirth

“About a year after I shot ‘The Chapel,’ I returned to the location and happened to get involved in heritage preservation efforts,” Kizny explained. “At the time, laser scanning was used for things like archiving, set modeling and support for integrating VFX in post production, but I hadn’t seen any films visualizing point clouds themselves, so that’s what I decided to do.”

EKG Baukultur, an Austrian/German company that specializes in digital heritage documentation and laser scanning, scanned the entire building in about a day from 25 different scanning positions. The collected data was then registered and processed, creating a dataset of about 500 million points. Roughly half of the collected data was used to create the visualizations.

3D Laser Scanning for Visual Effects - Rebirth

Data processing was done in multiple stages using various software packages. Initially, the EKG Baukultur team registered the separate scans together in a common coordinates space using FARO Scene software. Using .PTS format, the data was then re-imported into Alice Labs Studio Clouds (acquired by Autodesk in 2011) for clean up. Kizny manually removed any tripods with cameras, people, checkerboards and balls that had been used to reference scans. Then, the data was processed in Geomagic Studio to reduce noise, fill holes and uniformly downsample selected areas of the dataset. Later, the data was exported back to the .PTS ASCII format with the help of MeshLab and processed using custom Python scripting so that it could be ingested using the Krakatoa importer. Lacking a visual effects background, Kizny initially tested a number of tools to find the best way to visualize point cloud data in a cinematic way with varying and largely disappointing results. Six months of extensive R&D led Kizny to Krakatoa, a tool that was astonishingly fast and a fraction of the price of similar software specifically designed for CAD/CAM applications.
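The custom scripting step isn’t documented in detail, but a minimal sketch of that kind of .PTS massaging might look like the following; the column order, decimation factor and file names are assumptions for illustration, not Kizny’s actual script:

```python
# Illustrative sketch: decimate an ASCII .PTS point cloud and keep only the
# X Y Z R G B columns before handing it to a point cloud importer. The column
# layout (x y z intensity r g b) is an assumption for this example.
def convert_pts(src, dst, keep_every=4):
    with open(src) as fin, open(dst, "w") as fout:
        for i, line in enumerate(fin):
            fields = line.split()
            if len(fields) < 7:        # skip header / malformed rows
                continue
            if i % keep_every:         # simple decimation
                continue
            x, y, z, _intensity, r, g, b = fields[:7]
            fout.write(" ".join((x, y, z, r, g, b)) + "\n")

convert_pts("chapel_registered.pts", "chapel_for_krakatoa.pts")
```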

“I had a very basic understanding of 3ds Max, and the Krakatoa environment was new to me. Once I began to figure out Krakatoa, it all clicked and the software proved amazing throughout each step of the process,” he said.

Even when mixing the depth of field and motion blur functions in Krakatoa, Kizny was able to keep his render times to roughly five to ten minutes per frame while rendering 200 million points in 2K, by using smaller apertures and camera passes from a greater distance.

“Krakatoa is an amazing manipulation toolkit for processing point cloud data, not only for what I’m doing here but also for recoloring, increasing density, projecting textures and relighting point clouds. I have tried virtually all major point cloud processing software, but Krakatoa saved my life on this project,” Kizny noted.

In addition to using Krakatoa to visualize all the CG components of “Rebirth” as well as render point clouds, Kizny also employed the software for advanced color manipulation. With two subsets of data – a master with good color representation and a target that lacked color information – Kizny used a Magma flow modifier and a comprehensive set of nodes to cast and spatially interpolate the color data from the master subset onto the target subset so that they blended seamlessly in the final dataset. Magma modifiers were also used for the color correction of the entire dataset prior to rendering, which allowed Kizny greater flexibility compared to trying to color correct the rendering itself. Using Krakatoa with Magma modifiers also provided Kizny with a comprehensive set of built-in nodes and scripting access.
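Conceptually, that color casting is a nearest-neighbour transfer between two point sets. A minimal NumPy/SciPy sketch of the same idea outside of Magma follows; the file names and arrays are placeholders, and Kizny’s production setup used Krakatoa’s node graph rather than Python:

```python
# Conceptual sketch: transfer color from a "master" cloud with good color to
# a "target" cloud lacking it, via nearest-neighbour lookup.
import numpy as np
from scipy.spatial import cKDTree

master_xyz = np.load("master_xyz.npy")    # (N, 3) positions, hypothetical files
master_rgb = np.load("master_rgb.npy")    # (N, 3) colors
target_xyz = np.load("target_xyz.npy")    # (M, 3) positions, no color

tree = cKDTree(master_xyz)
_, nearest = tree.query(target_xyz, k=1)  # index of the closest master point
target_rgb = master_rgb[nearest]          # borrow its color
np.save("target_rgb.npy", target_rgb)
```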

3D Laser Scanning for Visual Effects - Rebirth

The second scene of “Rebirth” is a time-lapse reminiscent of “The Chapel,” while the final scene shows live action footage of a dancer. Footage for each scene was captured using Canon DSLR cameras, a RED ONE camera and DitoGear motion control equipment. Between the second and third scene, a short transition visualizes the church collapsing, which was created using 3ds Max Particle Flow with help of Thinkbox Ember, a field manipulation toolkit, and Thinkbox Stoke, a particle reflow tool.

“In the transition, I’m trying to collapse a 200 million-point data cloud into smoke, then create the silhouette of a dancer as a light point from the ashes,” shared Kizny. “Even though it’s a short scene, I’m making use of a lot of technology. It’s not only rendering this point cloud data set again; it’s also collapsing it. I’m using the software in an atypical way, and Thinkbox has been incredibly helpful in troubleshooting the workflow so I could establish a solid pipeline.”

Collapsing the church proved to be a challenge for Kizny. Traditionally, when creating digital explosions, VFX artists are blowing up a solid, rigid object. Not only did Kizny need to collapse a point cloud – a daunting task in and of itself – but he also had to do so in the hyper-realistic aesthetic he’d established, and in a way that would be both ethereal and physically believable. Using 3ds Max Particle Flow as a simulation environment, Kizny was able to generate a comprehensive, high-resolution vector field that was more efficient and precisely controlled with Ember. Ember was also used to animate two angels appearing from the dust and smoke along with the dancer silhouette. The initial dataset of each of the angels was pushed through a specific vector noise field that produced a smoke-like dissolve and was then reversed thanks to retiming features in Krakatoa, Ember and Stoke, which was also used to add density.
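At its simplest, that kind of collapse is just advecting every point through a velocity field step by step. The sketch below shows a generic Euler advection through a procedural noise field; it stands in for the idea only and is nothing like the resolution or control of the actual Ember/Stoke setup:

```python
# Generic sketch of advecting a point cloud through a vector field with simple
# Euler steps; a stand-in for the Ember/Stoke workflow, not a reproduction of it.
import numpy as np

def noise_field(p, t):
    # Cheap procedural "swirl" field, for illustration only.
    x, y, z = p[:, 0], p[:, 1], p[:, 2]
    return np.stack([np.sin(y + t), np.cos(z + t), np.sin(x + t)], axis=1)

def advect(points, steps=100, dt=0.05):
    p = points.copy()
    frames = []
    for s in range(steps):
        p = p + dt * noise_field(p, s * dt)   # Euler integration step
        frames.append(p.copy())
    return frames

cloud = np.random.rand(10000, 3)              # placeholder for the chapel points
collapse_frames = advect(cloud)               # one point array per frame
```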

3D Laser Scanning for Visual Effects - Rebirth

“To create the smoke on the floor, I decided to go all the way with Thinkbox tools,” Kizny said. “All the smoke you see was created using Ember vector fields and simulated with Stoke. It was good and damn fast.”

Another obstacle was figuring out how to animate the dancer in the point clouds. Six cameras recorded a live performer, with markerless motion capture tracking done using the iPi Motion Capture Studio package. The data obtained from the dancer was then ported onto a virtual, rigged model in 3ds Max and used to emit particles for a Particle Flow simulation. Ember vector fields were used for all the smoke-like circulations, and then everything was integrated and rendered using Thinkbox’s Deadline, a render management system, and Krakatoa – almost 900 frames and 3 TB of data caches for particles alone. Deadline was also used to distribute high-volume renders and allocate resources across Kizny’s render farm.

Though an innovative display of digital artistry, “Rebirth” is also a preservation tool. Interest generated by “The Chapel” and continued with “Rebirth” has enticed a Polish foundation to begin restoration efforts on the run-down building. Additionally, the LIDAR scans of the chapel will be donated to CyArk, a non-profit dedicated to the digital preservation of cultural heritage sites, and made widely available online.

The film is currently securing funding to complete postproduction. Support the campaign and learn more about the project at the IndieGoGo campaign homepage at http://bit.ly/support-rebirth. For updates on the film’s progress, visit http://rebirth-film.com/.

About Thinkbox Software
Thinkbox Software provides creative solutions for visual artists in entertainment, engineering and design. Developer of high-volume particle renderer Krakatoa and render farm management software Deadline, the team of Thinkbox Software solves difficult production problems with intuitive, well-designed solutions and remarkable support. We create tools that help artists manage their jobs and empower them to create worlds and imagine new realities. Thinkbox was founded in 2010 by Chris Bond, founder of Frantic Films. http://www.thinkboxsoftware.com

2014 Sochi Winter Olympics to Feature Giant 3D Pinscreen of Your Face

Visitors to this year’s Sochi Winter Olympics will have the opportunity to see their face rendered on the side of a building in giant 3D mechanical polygons. The work of British architect Asif Khan, Megaface is like a cross between Mount Rushmore’s sculpted facade and the pinscreens that adorned executive offices of the ’90s.

Khan designed the 2,000 sq. m pavilion and landscape for MegaFon, one of the largest Russian telecoms companies and a general partner of the Sochi Winter Olympics.

3D photo booths within the pavilion, and in MegaFon retail stores across Russia, will scan visitors’ portraits to be recreated by the pavilion. Its facade is designed to function like a huge pin screen: it is made up of over 10,000 actuators which transform the building’s skin into a three-dimensional portrait of each visitor’s face.

The concept is to give everyone the opportunity to be the face of the Olympics.

The structure is sited at the entrance to the Olympic Park, and incorporates an exhibition hall, hospitality areas, a rooftop viewing deck and 2 broadcasting suites.

The installation consists of 10,000 actuators fitted with LEDs and arranged into triangles that can extend up to six feet out from the side of the building to form 3D shapes. Visitors will be invited to have their face scanned at on-site “3D photo booths” before Khan’s actuators move to form giant 500-square-foot representations of the scans. Three faces will be shown at any given time, each for 20 seconds, and it’s estimated 170,000 faces will be rendered during the games. Visitors will also be given a link where they can watch a 20-second video showing the exact moment when their face was on the side of the building.
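In effect, the facade is a very low-resolution depth display. As a toy illustration of the mapping involved, the sketch below quantizes a scanned face depth map onto a coarse actuator grid; the actuator count and six-foot travel come from the article, while the depth map, grid shape and scaling are assumptions:

```python
# Toy sketch: quantize a face depth map onto a coarse actuator grid. The
# ~10,000-actuator count and ~6 ft maximum travel come from the article; the
# depth map and grid shape are assumptions for illustration.
import numpy as np

depth = np.load("face_depth.npy")            # hypothetical (H, W) depth scan, 0..1
grid_h, grid_w = 100, 100                    # ~10,000 actuators
max_travel_ft = 6.0

# Average the depth map into grid cells, then scale to actuator extension.
h_step, w_step = depth.shape[0] // grid_h, depth.shape[1] // grid_w
cells = depth[:grid_h * h_step, :grid_w * w_step]
cells = cells.reshape(grid_h, h_step, grid_w, w_step).mean(axis=(1, 3))
extensions_ft = cells * max_travel_ft        # 0 ft (flat) .. 6 ft (fully extended)
print(extensions_ft.shape)                   # (100, 100)
```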

Megaface will comprise one side of Russian carrier Megafon’s pavilion — the installation’s name itself part of the massive branding exercise that is the Olympics. It’s some way from completion, but Khan and Swiss firm iart, which is realizing Khan’s vision, have successfully demonstrated a prototype (shown below) that uses just 1,000 actuators to render a small-scale image.

[Asif Khan via Verge]

The Kinetic Facade of the MegaFaces Pavilion: Initial Batch Test from iart on Vimeo.

3D Systems buys company behind Star Wars, Hobbit and Harry Potter models

3D Systems Acquires Gentle Giant Studios

  • Accesses decades of licensed content from industry’s greatest brands
  • Expands leadership capabilities and know-how in retail merchandising
Release Date:
Friday, January 3, 2014 – 08:36

ROCK HILL, South Carolina – January 3, 2014 – 3D Systems  (NYSE:DDD) announced today the acquisition of Gentle Giant Studios, the leading provider of 3D modeling for the entertainment and toy industry. For over two decades, Gentle Giant Studios has led the development of state-of-the-art content using 3D scanning and modeling to develop and manufacture licensed 3D printed characters, toys and collectibles from a variety of franchise properties with global brand recognition, including Marvel, Disney, AMC’s The Walking Dead, Avatar, Harry Potter and Star Wars.

3DS plans to immediately leverage Gentle Giant Studios’ technology and vast library of digital content in its consumer platform and extend its existing brand relationships to further the reach of 3D scanning, modeling and printing for entertainment, toys, collectibles and action figures, in conjunction with numerous blockbuster films and evergreen licensed properties.

“Gentle Giant Studios catapults 3DS’s consumer platform forward with highly curated, licensed characters, content publishing know-how and first-mover experience for the benefit of leading toy companies, movie studios and their merchandising divisions,” said Avi Reichental, President and CEO, 3D Systems.

Learn more about how 3DS is manufacturing the future today at www.3dsystems.com.

About 3D Systems Corporation

3D Systems is a leading provider of 3D printing centric design-to-manufacturing solutions including 3D printers, print materials and cloud sourced on-demand custom parts for professionals and consumers alike in materials including plastics, metals, ceramics and edibles. The company also provides integrated 3D scan-based design, freeform modeling and inspection tools. Its products and services replace and complement traditional methods and reduce the time and cost of designing new products by printing real parts directly from digital input. These solutions are used to rapidly design, create, communicate, prototype or produce real parts, empowering customers to manufacture the future.

 

Leadership Through Innovation and Technology

  • 3DS invented 3D printing with its Stereolithography (SLA) printer and was the first to commercialize it in 1989.
  • 3DS invented Selective Laser Sintering (SLS) printing and was the first to commercialize it in 1992.
  • 3DS invented the Color-Jet-Printing (CJP) class of 3D printers and was the first to commercialize 3D powder-based systems in 1994.
  • 3DS invented Multi-Jet-Printing (MJP) printers and was the first to commercialize it in 1996.

Today its comprehensive range of 3D printers is the industry’s benchmark for production-grade manufacturing in aerospace, automotive and patient-specific medical devices, as well as a variety of consumer, electronic and fashion accessories.

More information on the company is available at www.3DSystems.com.

About Gentle Giant Studios

Gentle Giant Studios is the leading provider of 3D digital data and the first company to utilize digital data and 3D printing technology for the consumer products and entertainment industries, creating beloved 3D characters from a variety of franchise properties with worldwide name recognition, including Star Wars, Marvel, Avatar, Harry Potter, AMC’s The Walking Dead, and The Hobbit. Gentle Giant produces a wide range of products manufactured to the highest quality using the most advanced 3D scan-to-print techniques and a team of incredibly talented artisans who digitally capture the likenesses of actors, props and scenery to accurately model and recreate these images for fans and collectors everywhere. Gentle Giant Studios also provides prototyping and product development services for consumer products, fine art and theme parks, and provides on-set digitizing services for major motion pictures.

More information on the company is available on www.gentlegiantltd.com, and www.gentlegiantstudios.com.

Leica Geosystems announces updates for its point cloud software applications

Leica Geosystems announces a major set of updates for the point cloud software applications within its flagship Leica Cyclone and Leica CloudWorx families. These updates save significant time in the office each day and make it more convenient to work with rich, as-built point cloud data. This is the company’s largest set of point cloud software releases to date.

“What we’re seeing in the market is that our customers are using laser scanning in an increasing variety of scenarios and under more demanding circumstances, so they need more options for working with point cloud data and they need to do their work even faster,” states Chris Thewalt, VP of Scanning Software.  “Overall, we continue to see strong growth of 3D laser scanning/High-Definition Surveying (HDS) with a corresponding expansion and diversification of our user community’s needs. In response, we’ve been investing heavily in a number of our standalone Cyclone and our plug-in CloudWorx point cloud software applications. This large set of releases reflects that ongoing investment.”

Leica Cyclone and Leica CloudWorx families

• More flexible licensing lets users easily move licenses between the field and office and on-or-off a network.
• Users on customer support can implement license upgrades on their own at any time
• Rentals are now available for as short as one week for most products; discounts are available for extended rental periods

Leica CloudWorx for AutoCAD 5.0

• Plug-in for AutoCAD saves hours in the office for working with 3D point clouds in AutoCAD for both experienced users and users new to working in 3D
• Easier X,Y,Z coordinate system setup and faster navigation to desired views; faster creation of 2D drawings; faster ground surface and TIN creation; and, faster selection of high, low and ground points

Leica CloudWorx for 3ds Max 2.0

• New Leica CloudWorx plug-in family member (replaces Leica CloudWorx-VR)
• Eliminates prior need to export from Cyclone and import to Leica CloudWorx-VR; users now enjoy direct data access to Cyclone files
• Adds rich set of standard CloudWorx plug-in tools for working more efficiently with point clouds in 3ds Max

Leica CloudWorx for PDMS 1.3

• Plug-in for PDMS adds valuable option of importing plant models from PDMS directly into Leica Cyclone and exporting models created from point clouds in Cyclone directly into PDMS
• Avoids prior need to import/export models into/from PDMS and Cyclone via AutoCAD or MicroStation
• Supports direct import of PDMS models into popular Leica TruView software
