
Google Brings Its 360-Degree Movies App, Spotlight Stories, To iOS

September 3, 2015 / in 3D Laser Scanning, In the News, Industry, New Technology, News, Software / by Travis Reinke

SCANable helped bring the fourth Google Spotlight Stories animated film to life by capturing all of the sets, locations, actors, and vehicles in 3D so that each could be recreated as a photorealistic digital asset. The VFX teams used these assets to create the final version of the film.

Source: TechCrunch.com

Google Spotlight Stories, a mobile app featuring immersive, 360-degree animated films originally developed by Motorola, has now made its way to iOS devices.

When viewers watch the movie, entitled “HELP,” they can look anywhere, set the pace, and frame the shot by moving their mobile device. Previously, Spotlight Stories was available only on Android; it now supports devices running iOS 8.0 or higher.

The app itself is intended for entertainment purposes, as it offers stories built using 3D and 2D animations, 360-degree spherical “cinema-quality” video, sound sphere audio and “sensor fusion techniques,” explains Google. In short, what that means is that viewers can look around inside the animated content by moving their body and the phone to see different parts or angles of the story taking place.

Basically, the app takes advantage of the device’s sensors, such as its gyroscope and accelerometer, to offer an immersive viewing experience. However, it doesn’t let end users create these sorts of movies themselves.
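
As a rough illustration of that idea, the sketch below (Python, with hypothetical function names – this is not Google’s code) shows how a fused yaw/pitch reading from those sensors could be turned into the texture coordinate of an equirectangular 360-degree frame:

    import math

    def view_direction(yaw_deg, pitch_deg):
        """Turn fused gyroscope/accelerometer yaw and pitch into a unit view vector."""
        yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
        return (math.cos(pitch) * math.sin(yaw),   # x: left/right
                math.sin(pitch),                   # y: up/down
                math.cos(pitch) * math.cos(yaw))   # z: forward

    def equirect_uv(direction):
        """Map a view direction to (u, v) coordinates on an equirectangular 360 frame."""
        x, y, z = direction
        u = 0.5 + math.atan2(x, z) / (2.0 * math.pi)
        v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi
        return u, v

    # Pointing the phone 90 degrees to the right samples the frame a quarter of the way around.
    print(equirect_uv(view_direction(90.0, 0.0)))   # approximately (0.75, 0.5)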

One of the original animations featured in Spotlight Stories when it debuted was a film called “Windy Days” by ex-Pixar moviemakers, which appeared on Moto X phones when the Android app rolled out. This, as well as the other content previously available on Android, is also available in the new iOS app.

The app includes films like “Duet” from Glen Keane, “Buggy Night” from ATAP, and “Help” by “The Fast and the Furious” director Justin Lin. What’s interesting is that this latter movie, unlike the others, is noted as being “free for a limited time,” which indicates that Google may be planning to sell movies through this service in the future.

The technology for making these artistic mini-movies was first developed by Motorola Mobility’s Advanced Technology and Projects (ATAP) moonshot division, but Google continued to fund its development in the years that followed. However, because the app was originally intended for Motorola devices (like the Moto X), it didn’t immediately support a wide range of Android devices when it launched. Some limitations on Android continue today, as the Google Play version still indicates that Spotlight Stories is “not yet compatible with all smartphones.”

However, the new iOS release will work on any device running iOS 8.0 or higher, notes Google.

The app is a free download on iTunes.


Emmy Nomination for Nissan “With Dad” Commercial

September 3, 2015 / in In the News, News / by Travis Reinke

Do you recall the emotional Nissan television ad that was introduced during Super Bowl XLIX earlier this year? It made the list of 2015 Emmy nominations for Outstanding Commercial, and we are not surprised. Refresh your memory and watch it again.

This sentimental commercial, titled “With Dad,” certainly caught the public’s attention. After it premiered during the Super Bowl, the ad was voted the Favorite Super Bowl Commercial of 2015 and received more than 22 million views on YouTube. Nissan definitely made a splash, having not advertised during a Super Bowl in nearly twenty years!

As you can see, the commercial takes viewers through the lives of a loving family working its way through tough times and struggling to stay close despite the circumstances. Nissan took the opportunity to introduce two new models: the GT-R LM NISMO sports car and the 2016 Nissan Maxima sports sedan.

Nissan #withdad campaign

The popular commercial ran as part of Nissan’s #withdad campaign, aimed at reminding the public how much more fun and exciting life can be with dad. In addition to the Super Bowl commercial and other YouTube videos, Nissan also donated $1,000,000 to two organizations that help individuals build better lives for themselves and their families: Habitat for Humanity and the Wounded Warrior Project.

Source: Sorg Nissan


Shade VFX Nominated for an Emmy Award for Daredevil

September 3, 2015 / in 3D Laser Scanning, Events, In the News, News, Visual Effects (VFX) / by Travis Reinke

Daredevil, The Netflix Original Series, has been nominated for three Emmy awards in 2015, including one for Shade VFX in the category of Best Supporting Visual Effects. SCANable partnered with Shade VFX on this project by providing 3D scans of sets and actors used for camera tracking and matchmoving.

Source: Shade VFX

At the 67th Primetime Emmy Awards, the Best Supporting Visual Effects category will recognize seamless, invisible effects in a television series, and we’re utterly thrilled to be in the running alongside American Horror Story: Freak Show, Boardwalk Empire, Gotham, and The Walking Dead.

Daredevil, as we’ve discussed previously, was a unique opportunity for Shade VFX to expand both into New York and into the world of exceptional television working alongside Netflix and Marvel.

We created invisible effects that were a showpiece of the series, propelling the gritty and dark crime story forward and making your palms sweaty with action along the way.

Congratulations to the whole team that worked on Daredevil: Visual Effects Producer David Van Dyke; Visual Effects Supervisors Bryan Godwin and Karl Coyner; Senior Compositing Lead Steve J. Sanchez; Visual Effects Coordinator Julie Long; Visual Effects Editor Pedro Tarrago; Associate Compositing Lead Neiko Nagy; CG Artist Moshe Swed; and FX Technical Director Kjell Strode.

Best of luck also to the Shade VFX team and all the other vendors on their nomination for Best Visual Effects for Black Sails! Particular congratulations, of course, to Chip Baden, who will be representing Team Shade in this category.

Sincere thanks to the Academy of Television Arts and Sciences, Marvel, and Netflix.


Leica Pegasus Backpack Wearable Reality Capture – Indoors, Outdoors, Anywhere

June 3, 2015 / in 3D Laser Scanning, LiDAR, Mobile Scanning, News, Point Cloud / by Travis Reinke

Ultra-mobile reality-capture sensor platform – authoritative, professional documentation indoors or outdoors

The Leica Pegasus:Backpack is a unique wearable reality-capture sensor platform that combines cameras and Lidar profilers with the lightness of a carbon fiber chassis and a highly ergonomic design. It enables extensive, efficient indoor or outdoor documentation at an authoritative, professional level of accuracy. Designed for rapid, regular reality capture, the Pegasus:Backpack eliminates the need for scan registration during progressive scanning. It is also completely portable – it can be checked in as luggage on a flight: simply fly in, wear, collect, then fly out. As part of the Pegasus platform, the Pegasus:Backpack also acts as a sensor platform, with our standard external trigger and sync port outputs.

Leica Pegasus:Backpack

Map indoors, outdoors, underground, anywhere
The Leica Pegasus:Backpack makes progressive, professional BIM documentation a reality by synchronising imagery and point cloud data, assuring complete documentation of a building for full life-cycle management. By combining SLAM (Simultaneous Localization and Mapping) technology with a high-precision IMU, it maintains accurate positioning even during GNSS outages – ensuring the best known position regardless of how it is used.

With the Leica Pegasus:Backpack, professional data collection is no longer limited in outdoor areas or underground infrastructure with restricted access. Capturing a full 360° spherical view and Lidar together means you never miss an object or have to return to a project site – no matter where you are. A hardware light sensor assures the operator that all images are usable, while other functions can be verified and adjusted from the operator’s tablet device.
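
The positioning approach can be illustrated with a deliberately simplified sketch: dead-reckon from the IMU at its update rate and pull the estimate toward GNSS whenever a fix exists, so that outages degrade gracefully. This is a toy one-dimensional blend with made-up names (fuse, ALPHA); the actual Pegasus:Backpack estimator also folds in SLAM and is far more sophisticated.

    # Toy 1-D illustration of IMU dead reckoning blended with intermittent GNSS fixes.
    DT = 1.0 / 200.0   # the hardware specs below quote a 200 Hz INS rate
    ALPHA = 0.05       # how strongly an available GNSS fix pulls the estimate back

    def fuse(imu_accel, gnss_fix):
        """imu_accel: accelerations in m/s^2; gnss_fix: positions in m, or None during outages."""
        pos, vel, track = 0.0, 0.0, []
        for accel, fix in zip(imu_accel, gnss_fix):
            vel += accel * DT          # integrate acceleration -> velocity
            pos += vel * DT            # integrate velocity -> position (dead reckoning)
            if fix is not None:        # GNSS available: correct the drifting estimate
                pos += ALPHA * (fix - pos)
            track.append(pos)          # during an outage the inertial solution carries on
        return track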


Main features

  • Indoor and outdoor mapping in one single solution – position agnostic
  • Marries imagery and point cloud data into a single calibrated, user-intuitive platform
  • Full calibrated spherical view
  • External trigger output and external time stamping for additional sensors
  • Light sensor for auto brightness and balance control for image capture
  • Software enables access to Esri® ArcGIS for Desktop
  • Capture and edit 3D spatial objects from images and / or within the point cloud
  • Economical with data – balances data quantity and quality, with project logistics and post-processing
  • Ultra light weight carbon fiber core frame with an extensive ergonomic support for prolonged use
  • Real time view of the captured data through the tablet device
  • Up to 6 hours operational time with optional battery pack

Hardware features

  • Two profilers with 600,000 pts/sec, 50 m usable range and 16 channels
  • Largest sensor pixel size on the market – 5.5 µm x 5.5 µm
  • Five 4 MP cameras positioned to capture a 360° x 200° view
  • User adjustable acquisition intervals based on the distance travelled
  • NovAtel ProPak6™ provides the latest and most sophisticated precise GNSS receiver with a robust field proven IMU for position accuracy of 20 mm RMS after 10 seconds of outage
  • Marrying a triple band GNSS system with the latest multiple beam enabled SLAM algorithms
  • INS determination of the location, speed, velocity and orientation at a rate of 200 Hz
  • Ultra portable system fitting into one carrying case (system weight 13 kg)
  • Battery based – using four batteries in a hot swappable configuration
  • Multi-core industrial PC, 1 TB SSD, USB3 interface, ethernet, and wireless connection from the system to the tablet device

Leica Pegasus:Backpack Indoor mapping solution

The Leica Pegasus:Backpack enables unimaginable applications for indoor and outdoor mapping, combining visual images with the accuracy of a point cloud for professional documentation – in a wearable, ergonomic, and ultra-light carbon fiber construction.

Software features

  • User capable of adding acquisition point objects in a Shapefile format during data acquisition
  • Advanced export capability for CAD systems and others (DWG, DXF, SHP, GDB, DGN, E57, HPC, LAS, PTS, NMEA, KMZ) – see the PTS sketch after this list
  • Semi-automatic extraction tools
  • Sequenced images and videos for rapid navigation and object recognition
  • Software pointer “snaps” automatically and continuously onto the point cloud data from within an image
  • Immediate access to point clouds for accurate measurement
  • 3D stereoscopic view to decrease errors and increase throughput
  • Shadowed or missing 3D points can be acquired via photogrammetric processes
  • Data capture module displays the current location based on a GIS user interface
  • Data capture module displays all cameras and Lidar scans live, simultaneously
  • Data capture module enables laser scanner management and GNSS Operation
  • Live status monitoring of system during data acquisition
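
To make the export formats above concrete, the snippet below writes a point cloud to the simple ASCII PTS layout (point count on the first line, then x y z intensity r g b per point). This is a generic Python sketch of the file format, not Leica’s exporter, and the column conventions (metres, 8-bit RGB) are assumptions.

    import numpy as np

    def write_pts(path, points, intensities, colors):
        """points: (N, 3) float metres; intensities: (N,) int; colors: (N, 3) 8-bit RGB."""
        with open(path, "w") as f:
            f.write(f"{len(points)}\n")                     # header line: number of points
            for (x, y, z), i, (r, g, b) in zip(points, intensities, colors):
                f.write(f"{x:.3f} {y:.3f} {z:.3f} {i} {r} {g} {b}\n")

    # Example: three dummy points
    pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.5], [1.0, 2.0, 0.5]])
    write_pts("scan.pts", pts, [100, 120, 90], [(255, 0, 0), (0, 255, 0), (0, 0, 255)])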

Software benefits

  • Lidar accuracy with image-based usability
  • Digitise spatial objects through mobile mapping
  • A more natural approach for non-professional users, while offering a technical interface for advanced users
  • Scalable to your applications including less accurate simple GIS needs
  • Short data acquisition time
  • High acquisition throughput
  • High post-processing throughput
  • Manageable license options – compatible with thin-client viewer
  • Esri® ArcGIS for Desktop compatible
  • Leverages Esri® relational platform for advanced features

Andersson Technologies releases SynthEyes 1502 3D Tracking Software

May 22, 2015 / in 3D Laser Scanning, Animation, LiDAR, Point Cloud, Software, Visual Effects (VFX) / by Travis Reinke

Andersson Technologies has released SynthEyes 1502, the latest version of its 3D tracking software, improving compatibility with Blackmagic Design’s Fusion compositing software.

Reflecting the renewed interest in Fusion
According to the official announcement: “Blackmagic Design’s recent decision to make Fusion 7 free of charge has led to increased interest in that package. While SynthEyes has exported to Fusion for many years now — for projects such as Battlestar Galactica — Andersson Technologies LLC upgraded SynthEyes’s Fusion export.”

Accordingly, the legacy Fusion exporter now supports 3D planar trackers; primitive, imported, or tracker-built meshes; imported or extracted textures; multiple cameras; and lens distortion via image maps.

The new lens distortion feature should make it possible to reproduce the distortion patterns of any real-world lens without its properties having been coded explicitly in the software or a custom plugin.
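
A lens-distortion image map of this kind is essentially a per-pixel lookup table: each output pixel stores the normalized source coordinate to sample from. The sketch below (Python/OpenCV, assuming a red-channel-is-U, green-channel-is-V convention; it is not SynthEyes’ exporter code) shows how such a map can be applied to a plate:

    import cv2
    import numpy as np

    def apply_distortion_map(plate, st_map):
        """plate: H x W x 3 image; st_map: H x W x 2 float map of normalized source coords."""
        h, w = plate.shape[:2]
        map_x = (st_map[..., 0] * (w - 1)).astype(np.float32)   # channel 0 -> source x
        map_y = (st_map[..., 1] * (h - 1)).astype(np.float32)   # channel 1 -> source y
        return cv2.remap(plate, map_x, map_y, cv2.INTER_LINEAR)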

A new second exporter creates corner pin nodes in Fusion from 2D or 3D planar trackers in SynthEyes.
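
A 2D corner pin is simply the perspective warp that maps a source image’s four corners onto the four tracked corners of a planar region, which is why planar trackers translate so directly into corner pin nodes. A minimal sketch, again in Python/OpenCV rather than the Fusion node itself:

    import cv2
    import numpy as np

    def corner_pin(src_img, tracked_corners, out_width, out_height):
        """tracked_corners: four (x, y) points in TL, TR, BR, BL order from a planar track."""
        h, w = src_img.shape[:2]
        src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
        dst = np.float32(tracked_corners)
        H = cv2.getPerspectiveTransform(src, dst)        # the 3x3 homography is the corner pin
        return cv2.warpPerspective(src_img, H, (out_width, out_height))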

Other new features in SynthEyes 1502 include an error curve mini-view, a DNG/CinemaDNG file reader, and a refresh of the user interface, including the option to turn toolbar icons on or off.

Pricing and availability
SynthEyes 1502 is available now for Windows, Linux and Mac OS X. New licences cost from $249 to $999, depending on which edition you buy. The new version is free to registered users.

New features in SynthEyes 1502 include:

  • Toolbar icons are back! Some love ’em, some hate ’em. Have it your way: set the preference. Shows both text and icon by default to make it easiest, especially for new users with older tutorials. Some new and improved icons.
  • Refresh of user interface color preferences to a somewhat darker and trendier look. Other minor appearance tweaks.
  • New error curve mini-view.
  • Updated Fusion 3D exporter now exports all cameras, 3D planars, all meshes (including imported), lens distortion via image maps, etc.
  • New Fusion 2D corner pinning exporter.
  • Lens distortion export via color maps, currently for Fusion (Nuke for testing).
  • During offset tracking, a tracker can be (repeatedly) shift-dragged to different reference patterns on any frame, and SynthEyes will automatically adjust the offset channel keying.
  • Rotopanel’s Import tracker to CP (control point) now asks whether you want to import the relative motion or absolute position.
  • DNG/CinemaDNG reading. Marginal utility: DNG requires much proprietary postprocessing to get usable images, despite new luma and chroma blur settings in the image preprocessor.
  • New script to “Reparent meshes to active host” (without moving them)
  • New section in the user manual on “Realistic Compositing for 3-D”
  • New tutorials on offset tracking and Fusion.
  • Upgraded to RED 5.3 SDK (includes REDcolor4, DRAGONcolor2).
  • Faster camera and perspective drawing with large meshes and lidar scan data.
  • Windows: Installing license data no longer requires “right click/Start as Administrator”—the UAC dialog will appear instead.
  • Windows: Automatically keeps the last 3 crash dumps. Even one crash is one too many.
  • Windows: Installers, SynthEyes, and Synthia are now code-signed for “Andersson Technologies LLC” instead of showing “Unknown publisher”.
  • Mac OS X: Yosemite required that we change to the latest XCode 6—this eliminated support for OS X 10.7. Apple made 10.8 more difficult as well.

About SynthEyes

SynthEyes is a program for 3-D camera-tracking, also known as match-moving. SynthEyes can look at the image sequence from your live-action shoot and determine how the real camera moved during the shoot, what the camera’s field of view (~focal length) was, and where various locations were in 3-D, so that you can create computer-generated imagery that exactly fits into the shot. SynthEyes is widely used in film, television, commercial, and music video post-production.
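
In other words, a solve recovers a per-frame camera pose plus intrinsics such that the solved 3-D points reproject onto their tracked 2-D positions; rendering CG through that same camera is what makes it sit in the plate. A bare-bones pinhole reprojection in Python, with illustrative numbers for a 1920×1080 plate (this is the general formula, not SynthEyes’ internal solver):

    import numpy as np

    def project(point_3d, K, R, t):
        """Project a world-space point through a solved camera (intrinsics K, pose R, t)."""
        p_cam = R @ np.asarray(point_3d) + t   # world -> camera space
        p_img = K @ p_cam                      # camera space -> homogeneous pixel coords
        return p_img[:2] / p_img[2]            # perspective divide -> pixel position

    # Illustrative intrinsics: ~1800 px focal length, principal point at frame centre.
    K = np.array([[1800.0, 0.0, 960.0],
                  [0.0, 1800.0, 540.0],
                  [0.0,    0.0,   1.0]])
    R, t = np.eye(3), np.zeros(3)
    print(project([0.1, 0.0, 5.0], K, R, t))   # a point 5 units ahead lands just right of centre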

What can SynthEyes do for me? You can use SynthEyes to help insert animated creatures or vehicles; fix shaky shots; extend or fix a set; add virtual sets to green-screen shoots; replace signs or insert monitor images; produce 3D stereoscopic films; create architectural previews; reconstruct accidents; do product placements after the shoot; add 3D cybernetic implants, cosmetic effects, or injuries to actors; produce panoramic backdrops or clean plates; build textured 3-D meshes from images; add 3-D particle effects; or capture body motion to drive computer-generated characters. And those are just the more common uses; we’re sure you can think of more.

What are its features? Take a deep breath! SynthEyes offers 3-D tracking, set reconstruction, stabilization, and motion capture. It handles camera tracking, 2- and 3-D planar tracking, object tracking, object tracking from reference meshes, camera+object tracking, survey shots, multiple-shot tracking, tripod (nodal, 2.5-D) tracking, mixed tripod and translating shots, stereoscopic shots, nodal stereoscopic shots, zooming shots, lens distortion, light solving. It can handle shots of any resolution (Intro version limited to 1920×1080)—HD, film, IMAX, with 8-bit, 16-bit, or 32-bit float data, and can be used on shots with thousands of frames. A keyer simplifies and speeds tracking for green-screen shots. The image preprocessor helps remove grain, compression artifacts, off-centering, or varying lighting and improve low-contrast shots. Textures can be extracted for a mesh from the image sequence, producing higher resolution and lower noise than any individual image. A revolutionary Instructible Assistant, Synthia™, helps you work faster and better, from spoken or typed natural language directions.

SynthEyes offers complete control over the tracking process for challenging shots, including an efficient workflow for supervised trackers, combined automated/supervised tracking, offset tracking, incremental solving, rolling-shutter compensation, a hard and soft path locking system, distance constraints for low-perspective shots, and cross-camera constraints for stereo. A solver phase system lets you set up complex solving strategies with a visual node-based approach (not in Intro version). You can set up a coordinate system with tracker constraints, camera constraints, an automated ground-plane-finding tool, by aligning to a mesh, a line-based single-frame alignment system, manually, or with some cool phase techniques.

Eyes starting to glaze over at all the features? Don’t worry, there’s a big green AUTO button too. Download the free demo and see for yourself.

What can SynthEyes talk to? SynthEyes is a tracking app; you’ll use the other apps you already know to generate the pretty pictures. SynthEyes exports to about 25 different 2-D and 3-D programs. The Sizzle scripting language lets you customize the standard exports, or add your own imports, exports, or tools. You can customize toolbars, color scheme, keyboard mapping, and viewport configurations too. Advanced customers can use the SyPy Python API/SDK.


[TC]2 Announces Availability of Its Most Advanced 3D/4D Body Scanner

May 22, 2015 / in 3D Laser Scanning, In the News, News, Point Cloud, Visual Effects (VFX) / by Travis Reinke

New TC2-19 Offers Fastest and Most Accurate Measurements on the Market [source]

Cary, NC – April 30, 2015 – [TC]², the innovation leader for the fashion industry and 3D body scanning technology, announces general availability of the TC2-19, the most advanced 3D/4D body scanning and measurement technology available on the market.

The TC2-19 provides the option of using a touch screen inside the scanner booth which allows users to “self-scan” by following on-screen instructions. The “self-scan mode” is ideal for use with the iStyling™ Full Retail Solution that provides for greater fit accuracy, styling advice, and garment customization. The body scanner interfaces with multiple CAD technologies and avatar engines.

While capturing the accurate body measurements that [TC]² 3D body scanners are well-known for today, the TC2-19 offers a “quick scan” option (2 seconds), 360° 3D body scanning and rapid processing speeds (17 seconds). The newly developed 4D mode enables 3D movement visualization inside the scanner.

“The combination of speed, accuracy, and stability enabled by software and hardware advancements make this the most advanced 3D/4D body scanner ever built”, said Dr. Mike Fralix, CEO of [TC]². “We are excited that our existing retail customers and major brands now have a scalable solution that can easily and cost effectively roll out to multiple locations.”

The [TC]² body scanner captures thousands of body measurements used by the fashion, medical, and fitness industries to make custom garments, predict customer sizing, benchmark fitness goals, and augment surgical processes. The scanner fits in a space the size of the average retail dressing room and features a booth for privacy. The TC2-19 comes with a lifetime scanner software license and PC.

The TC2-19 can be viewed at the [TC]² National Apparel Technology Center in Cary, N.C.

About [TC]²:

[TC]², Textile / Clothing and Technology Corporation, is a leader in innovation and dedicated to the advancement of the fashion and sewn products industry. Its research, consulting services, and products help brands, retailers, and manufacturers provide increased value for their customers while improving their bottom line. [TC]²’s mission focuses on the development, promotion and implementation of new technologies and ideas that significantly impact the industry.


Behind the Work: Garage Magazine Augmented Reality App

March 10, 2015 / in Featured, Visual Effects (VFX) / by Travis Reinke

With New York Fashion Week FW15 in high gear, SCANable got in on the action with a dream assignment collaborating with The Mill, Garage Magazine, renowned makeup artist Pat McGrath, photographer Phil Poynter and Beats by Dre to bring February’s cover models to life through Garage’s smart phone app.

Source: Mill Blog
Garage Magazine Nº8 features cover models: Cara Delevingne, Kendall Jenner, Lara Stone, Binx Walton and Joan Smalls. Each model was rendered in a way that blends the magic touch of Pat McGrath and the technical skills of The Mill team. When the covers are viewed using the GARAGE Mag app, each of the cover stars literally jump out of the page as a 3D rendition and animation, using augmented reality to explore the intersection of print and digital.

The Covers

Led by The Mill creative director Andreas Berner, the brief was to create five different covers, each featuring a supermodel wearing a colorful set of Beats by Dre headphones. Each model was treated with a pure CG interpretation of various elements inspired by Pat McGrath’s original make-up designs: android mask, graphite scribbles, shrink wrap, crystals, and smoke elements.

GARAGE Nº8

Cara Delevingne: Android Mask

Cara’s look was inspired by Pat McGrath’s make-up for the Spring/Summer Alexander McQueen show. Once the app is activated, segments of blue armor animate from behind her head and create an android effect.

Kendall Jenner: Graphite Scribbles

Kendall is taken over by a mesh of 3D graphite that swoops over her whole body and face until it engulfs her in delicate body armor. The animation follows the shape of her silhouette, creating the effect that both she and the headphones are immersed in a cage of lines.

Lara Stone: Shrink Wrap

Lara appears in airtight shrink-wrap plastic, creating a smooth and almost liquid-looking cover. As Lara’s head emerges from the page, the rich color tone of the headphones begins to take over and envelop her completely back into the cover.

Binx Walton: Crystals

The animation appears in midair, organically building from crystals and facets to create a 3D crystal bust. The tiny, delicate, sharp crystals appear in a mask shape, gradually taking over so she is fully covered. The look is inspired by McGrath’s makeup from the Givenchy Spring/Summer 2014 show.

Joan Smalls: Smoke

Joan’s look is inspired by the movement of a smoky electric storm and the northern lights. Joan emerges from the page to the sound of her taking a deep breath. Purple smoke FX continue to build and swirl around her three-dimensional head.

The Process

The idea was that each model would be a breathing, living organism consumed by the nature of the VFX. After Phil shot the models with bare makeup, SCANable captured a 3D scan on-set to generate full body CG models for post production. This allowed The Mill’s 3D team, led by Raymond Leung, to retouch and create geometry for the app.

The Mill design team then outlined the designs on top of the retouched photographs with style frames. After a final cropping & editing session with Phil, the high res prints were sent to press.

Binx Walton: Crystals

The next step was to convert the ideas from the print component into the AR app. Garage Magazine had previously teamed up with artist Jeff Koons, creating a cover where, when using the Garage app, viewers were able to walk around a virtual sculpture. The Mill team pushed the technology further for Issue Nº8 with animation, sound and additional elements.

For the app execution, in-house 2D and 3D tools were used to create FX simulations, animations and final composites. Many of the initial ideas were limited by the technology used in real-time applications and a lack of processing power, which meant the team needed to get creative.

Lara Stone: Shrink Wrap

Kendall’s effect was particularly challenging, as her execution utilized disciplines from across the CG department. We were up against resolution restrictions within the technology. Each strand grown was a culmination of particle effects, animation, texturing, modeling and lighting. But with some clever ingenuity and an iterative process of constant testing, updating, and retesting, her incredible look was achieved. This was the approach taken for the look of each of the models.

Kendall Jenner: Graphite Scribbles

Unique music scores by Alex Da Kid and sound FX by Finger Music were incorporated to accompany each execution, building an even stronger interactive experience. The last step was integrating all finished animations into the actual app, done by Meiré & Meiré.


This issue of Garage is currently on stands and available for purchase online. You can download the app here and watch these beauties come to life.


FARO® Launches Innovative, User-Friendly Handheld 3D Scanner to Meet Growing Demand for Portable Scanning

January 20, 2015 / in 3D Laser Scanning, In the News, New Technology, Point Cloud / by Travis Reinke

LAKE MARY, Fla., Jan. 7, 2015 /PRNewswire/ — FARO Technologies, Inc. (NASDAQ: FARO), the world’s most trusted source for 3D measurement, imaging, and realization technology, announces the release of the new FARO Freestyle3D Handheld Laser Scanner, an easy, intuitive device for use in Architecture, Engineering and Construction (AEC), Law Enforcement, and other industries.

The FARO Freestyle3D is equipped with a Microsoft Surface™ tablet and offers unprecedented real-time visualization by allowing the user to view point cloud data as it is captured. The Freestyle3D scans to a distance of up to three (3) meters and captures up to 88K points per second with accuracy better than 1.5mm. The patent-pending, self-compensating optical system also allows users to start scanning immediately, with no warm-up time required.

“The Freestyle3D is the latest addition to the FARO 3D laser scanning portfolio and represents another step on our journey to democratize 3D scanning,” stated Jay Freeland, FARO’s President and CEO. “Following the successful adoption of our Focus scanners for long-range scanning, we’ve developed a scanner that provides customers with the same intuitive feel and ease-of-use in a handheld device.”

The portability of the Freestyle3D enables users to maneuver and scan in tight, hard-to-reach areas such as car interiors, under tables, and behind objects, making it ideal for crime scene data collection or architectural preservation and restoration activities. Memory-scan technology enables Freestyle3D users to pause scanning at any time and then resume data collection where they left off, without the use of artificial targets.

Mr. Freeland added, “FARO’s customers continue to stress the importance of workflow simplicity, portability, and affordability as key drivers to their continued use and adoption of 3D laser scanning. We have responded by developing an easy-to-use, industrial-grade, handheld laser scanning device that weighs less than 2 lbs.”

The Freestyle3D can be employed as a standalone device to scan areas of interest, or used in concert with FARO’s Focus X 130 / X 330 scanners.  Point cloud data from all of these devices can be seamlessly integrated and shared with all of FARO’s software visualization tools including FARO SCENE, WebShare Cloud, and FARO CAD Zone packages.

For more information about FARO’s 3D scanning solutions visit: www.faro.com

This press release contains forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995 that are subject to risks and uncertainties, such as statements about demand for and customer acceptance of FARO’s products, and FARO’s product development and product launches. Statements that are not historical facts or that describe the Company’s plans, objectives, projections, expectations, assumptions, strategies, or goals are forward-looking statements. In addition, words such as “is,” “will,” and similar expressions or discussions of FARO’s plans or other intentions identify forward-looking statements. Forward-looking statements are not guarantees of future performance and are subject to various known and unknown risks, uncertainties, and other factors that may cause actual results, performances, or achievements to differ materially from future results, performances, or achievements expressed or implied by such forward-looking statements. Consequently, undue reliance should not be placed on these forward-looking statements.

Factors that could cause actual results to differ materially from what is expressed or forecasted in such forward-looking statements include, but are not limited to:

  • development by others of new or improved products, processes or technologies that make the Company’s products less competitive or obsolete;
  • the Company’s inability to maintain its technological advantage by developing new products and enhancing its existing products;
  • declines or other adverse changes, or lack of improvement, in industries that the Company serves or the domestic and international economies in the regions of the world where the Company operates and other general economic, business, and financial conditions; and
  • other risks detailed in Part I, Item 1A. Risk Factors in the Company’s Annual Report on Form 10-K for the year ended December 31, 2013 and Part II, Item 1A. Risk Factors in the Company’s Quarterly Report on Form 10-Q for the quarter ended June 28, 2014.

Forward-looking statements in this release represent the Company’s judgment as of the date of this release. The Company undertakes no obligation to update publicly any forward-looking statements, whether as a result of new information, future events, or otherwise, unless otherwise required by law.

About FARO

FARO is the world’s most trusted source for 3D measurement technology. The Company develops and markets computer-aided measurement and imaging devices and software. Technology from FARO permits high-precision 3D measurement, imaging and comparison of parts and complex structures within production and quality assurance processes. The devices are used for inspecting components and assemblies, rapid prototyping, documenting large volume spaces or structures in 3D, surveying and construction, as well as for investigation and reconstruction of accident sites or crime scenes.

Approximately 15,000 customers are operating more than 30,000 installations of FARO’s systems worldwide. The Company’s global headquarters is located in Lake Mary, FL; its European regional headquarters in Stuttgart, Germany; and its Asia/Pacific regional headquarters in Singapore. FARO has other offices in the United States, Canada, Mexico, Brazil, Germany, the United Kingdom, France, Spain, Italy, Poland, Turkey, the Netherlands, Switzerland, Portugal, India, China, Malaysia, Vietnam, Thailand, South Korea, and Japan.

More information is available at http://www.faro.com

SOURCE FARO Technologies, Inc.


Photogrammetry and camera projection mapping in Maya made easy

January 20, 2015 / in 3D Laser Scanning, Software, Visual Effects (VFX) / by Travis Reinke

The Mattepainting Toolkit

Photogrammetry and camera projection mapping in Maya made easy.

What’s included?

The Mattepainting Toolkit (gs_mptk) is a plugin suite for Autodesk Maya that helps artists build photorealistic 3D environments with minimal rendering overhead. It offers an extensive toolset for working with digital paintings as well as datasets sourced from photographs.

Version 3.0 is now released!

For Maya versions 2014 and 2015, version 3.0 of the toolkit adds support for Viewport 2.0, and a number of new features. Version 2.0 is still available for Maya versions 2012-2014. A lite version of the toolkit, The Camera Projection Toolkit (gs_cptk) is available for purchase from the Autodesk Exchange. To see a complete feature comparison list between these versions, click here.

How does it work?

The Mattepainting Toolkit uses an OpenGL implementation for shader feedback within Maya’s viewport. This allows users to work directly with paintings, photos, and image sequences that are mapped onto geometry in an immediate and intuitive way.
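
For readers new to the underlying Maya mechanics, the snippet below wires up a plain camera projection by hand with maya.cmds (a file texture feeding a perspective projection node linked to a camera, assigned through a Lambert shader). It is a generic sketch of the setup the toolkit streamlines, not gs_mptk’s own API, and it assumes a standard Maya install.

    import maya.cmds as cmds

    def make_camera_projection(camera, image_path, geometry):
        """Project image_path from `camera` onto `geometry` via a standard projection node."""
        file_node = cmds.shadingNode('file', asTexture=True)
        cmds.setAttr(file_node + '.fileTextureName', image_path, type='string')

        proj = cmds.shadingNode('projection', asTexture=True)
        cmds.setAttr(proj + '.projType', 8)                            # 8 = perspective projection
        cmds.connectAttr(file_node + '.outColor', proj + '.image')
        cmds.connectAttr(camera + '.message', proj + '.linkedCamera')  # drive it from the camera

        shader = cmds.shadingNode('lambert', asShader=True)
        cmds.connectAttr(proj + '.outColor', shader + '.color')
        cmds.select(geometry)
        cmds.hyperShade(assign=shader)
        return shader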

Overview

The User Interface

Textures are organized in a UI that manages the shaders used for viewport display and rendering.

...

  • Clicking on an image thumbnail will load the texture in your preferred image editor.
  • Texture layer order is determined by a drag-and-drop list.
  • Geometry shading assignments can be quickly added and removed.

Point Cloud Data

Import Bundler and PLY point cloud data from Agisoft Photoscan, Photosynth, or other Structure From Motion (SFM) software.

...

  • Point clouds can be used as a modeling guide to quickly reconstruct a physical space (a minimal PLY loader sketch follows this list).
  • Cameras are automatically positioned in the scene for projection mapping.
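
For reference, an ASCII PLY vertex block is simple enough to read with a few lines of Python; the loader below pulls out x/y/z only and assumes a plain header (binary PLY needs a real parser such as the plyfile package). It is a generic sketch, not part of gs_mptk.

    import numpy as np

    def load_ply_xyz(path):
        """Read vertex x, y, z from a simple ASCII PLY file into an (N, 3) array."""
        with open(path) as f:
            count = 0
            for line in f:                                  # scan the header
                if line.startswith("element vertex"):
                    count = int(line.split()[-1])
                if line.strip() == "end_header":
                    break
            pts = [list(map(float, next(f).split()[:3])) for _ in range(count)]
        return np.asarray(pts)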

The Viewport

A custom OpenGL shader allows textures to be displayed in high quality and manipulated interactively within the viewport.

...

  • Up to 16 texture layers can be displayed per shader.
  • HDR equirectangular images can be projected spherically.
  • Texture mattes can be painted directly onto geometry within the viewport.
  • Image sequences are supported so that film plates can be mapped to geometry.

Rendering

The layered textures can be rendered with any renderer available to Maya. Custom Mental Ray and V-Ray shaders included with the toolkit extend the texture blending capabilities for those renderers.

...

  • The texture layers can be baked down to object UVs.
  • A coverage map can be rendered to isolate which areas of the geometry are most visible to the camera.
  • For Mental Ray and V-Ray, textures can be blended based on object occlusion, distance from the projection camera, and object facing ratio.
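
As a rough illustration of the facing-ratio blend in the last bullet, the toy function below weights two projections by how directly each camera sees the surface (the dot product of the surface normal with the direction toward that camera). The array names are hypothetical; the toolkit’s shaders do the equivalent per pixel at render time.

    import numpy as np

    def facing_ratio_weights(normals, to_cam_a, to_cam_b):
        """All inputs are (N, 3) unit vectors per vertex; returns normalized blend weights."""
        wa = np.clip(np.einsum('ij,ij->i', normals, to_cam_a), 0.0, 1.0)   # facing ratio, camera A
        wb = np.clip(np.einsum('ij,ij->i', normals, to_cam_b), 0.0, 1.0)   # facing ratio, camera B
        total = np.maximum(wa + wb, 1e-6)                                  # avoid divide-by-zero
        return wa / total, wb / total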