June 1st, 2015
Google’s Advanced Technology and Projects group (ATAP) introduced several exciting projects during the ATAP session of Google’s 2015 I/O developers’ conference. ATAP is Google’s mobile-focused project development laboratory, tasked with making big ideas a reality. Led by former DARPA director Regina Dugan, the ‘small band of pirates’ unveiled several projects that have the potential to change the way we tell, consume and engage with stories in the near future.

The Future of Interaction

Google introduced two new projects that will revolutionize how we interact with technology: Project Jacquard and Project Soli.

Project Jacquard makes it possible to weave electrically conductive yarn into textiles, transforming everyday objects like clothing and furniture into interactive surfaces. Since it can be implemented in virtually any material and communicate with any type of device, it opens up a world of possibilities for connecting to online services, apps and even storytelling experiences. Google has tapped Levi's for the initiative, with the release of its first ‘smart’ jeans expected in early 2016.


Forget touch screens and styluses: Project Soli transforms the world into a screen, using radar to detect precise finger movements, or “micromotions”. There’s already a need for better ways to control devices, especially smaller wearables like the Apple Watch. With sensors that can track sub-millimeter motions at high speed and accuracy, Project Soli offers a way to control these smaller interfaces while also opening up the ability to interact with larger virtual worlds.


When considered alongside Google's Project Tango, which combines 3D motion tracking with depth sensing to give a mobile device the ability to know where it is and how it moves through space, it's clear that ATAP is developing the technologies we need to evolve from passively watching stories on a screen to interacting with the world around us, both real and virtual.

The Future of Storytelling

Spotlight Stories

ATAP chief Regina Dugan opened the segment on Spotlight Stories, ATAP’s platform to explore and redefine storytelling through mobile, with the statement, “We’re making a mobile movie theater and it opens today.” It was a fitting introduction to HELP, the fourth installment in the series and the first live-action Spotlight Story, created by Google ATAP, Fast & Furious director Justin Lin, Bullitt Production and The Mill.

Spotlight Story 'HELP'

Tawfeeq Martin, Technical Innovations Manager at The Mill, interacting with ‘HELP’ at Google I/O

‘HELP’ combines live action and CG in a 360 environment, creating an action-driven story with cinematic-quality visual effects. The team worked to solve the technical and creative challenges involved in immersive and VR filmmaking, developing a custom 360 camera rig solution with the ability to follow the action.

360 Camera Rig

Shooting 360 live-action footage has generally been a challenge for directors, since most methods capture images from multiple cameras with overlapping wide-angle fields of view, leaving no single coherent view of the scene while shooting. The Mill took on the challenge to build a comprehensive on-set/dailies-type solution that was intuitive and inspired confidence on set.

After numerous tests, experimentation and R&D, The Mill developed ‘Mill Stitch’ (TM pending), a proprietary software solution that takes images from multiple cameras and stitches the output into a continuous 360-degree view. Mill Stitch proved invaluable to the process, bringing real-time stitching, interactive controls, and record/review to a cinematic VR production pipeline. Mill Stitch was even featured in the Google I/O sandbox, where attendees could experience first-hand what it's like to direct a VR film.
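
The real-time stitching Mill Stitch performs is far more involved than anything that fits in a few lines, but the core idea, mapping synchronized frames from several cameras into one continuous panorama, can be sketched. The snippet below is purely illustrative and is not Mill Stitch or any Google/Mill code; the camera count, resolutions and naive slice-based mapping are assumptions for demonstration only.

```python
# Illustrative sketch only: map frames from a hypothetical 4-camera rig into
# a shared equirectangular canvas. A production stitcher would warp each frame
# through a calibrated lens model and blend the overlapping regions.

import numpy as np

NUM_CAMERAS = 4               # hypothetical rig: 4 cameras, ~90 degrees each
FRAME_W, FRAME_H = 640, 480   # per-camera frame size (assumed)
PANO_W, PANO_H = 2048, 1024   # equirectangular output (2:1 aspect)

def stitch_equirectangular(frames):
    """Naively place each camera's frame into its yaw slice of the panorama.

    frames: list of NUM_CAMERAS arrays of shape (FRAME_H, FRAME_W, 3).
    """
    pano = np.zeros((PANO_H, PANO_W, 3), dtype=np.uint8)
    slice_w = PANO_W // NUM_CAMERAS
    for cam_index, frame in enumerate(frames):
        # Resample the frame to fill its horizontal slice of the canvas.
        ys = np.linspace(0, FRAME_H - 1, PANO_H).astype(int)
        xs = np.linspace(0, FRAME_W - 1, slice_w).astype(int)
        resized = frame[ys][:, xs]
        x0 = cam_index * slice_w
        pano[:, x0:x0 + slice_w] = resized
    return pano

if __name__ == "__main__":
    # Stand-in frames; on set these would be live feeds from the rig.
    frames = [np.full((FRAME_H, FRAME_W, 3), 60 * i, dtype=np.uint8)
              for i in range(NUM_CAMERAS)]
    panorama = stitch_equirectangular(frames)
    print(panorama.shape)  # (1024, 2048, 3)
```

Running a loop like this per frame, fast enough to keep up with the cameras, is what makes an on-set "view the stitched world live" workflow possible, and it is where most of the real engineering effort lies.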

Find out more about the technologies developed in our Behind the Scenes film.

Gawain Liddiard, VFX Supervisor at The Mill, interacting with Mill Stitch at Google I/O

Storytelling Tools 

With “More great stories. More happy viewers.” as the goal, the ATAP team came up with a solution to enable immersive content creation: its new Story Development Kit (SDK). Acting as a bridge for filmmakers into the new landscape of interactive and immersive storytelling, the SDK was designed to fit into existing film production pipelines, while also accounting for the need for stories to run in real time on mobile at 60 fps.

To help filmmakers work with an interactive story structure, the SDK’s story editor helps translate traditional filmmaking methods like cuts and shots into new techniques for directing the viewer’s attention and controlling the flow of the story. The SDK also includes a 360 Storyboard tool that allows you to work out the story in space and time by visualizing the “boards” around you.
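
As a thought experiment only (this is not the ATAP SDK’s actual API, and every name below is hypothetical), a 360 storyboard can be imagined as a list of story beats pinned to directions around the viewer and moments in time, which a story editor could query to decide when the viewer’s attention needs to be redirected:

```python
# Hypothetical sketch of a 360 storyboard data structure; not the ATAP SDK.
from dataclasses import dataclass

@dataclass
class Board:
    label: str        # story beat, e.g. "hero enters frame"
    start_s: float    # when the beat begins, in seconds
    yaw_deg: float    # horizontal direction of the action (0 = straight ahead)
    pitch_deg: float  # vertical direction of the action

storyboard = [
    Board("character lands behind viewer", 0.0, yaw_deg=180.0, pitch_deg=0.0),
    Board("chase passes overhead",         12.5, yaw_deg=90.0, pitch_deg=45.0),
    Board("final confrontation",           40.0, yaw_deg=0.0,  pitch_deg=0.0),
]

def boards_outside_view(boards, viewer_yaw_deg, fov_deg=90.0):
    """Return beats the viewer would miss at their current heading; the kind
    of check a story editor could use to cue sound or motion that redirects
    attention, instead of relying on a traditional cut."""
    half = fov_deg / 2.0
    missed = []
    for b in boards:
        delta = (b.yaw_deg - viewer_yaw_deg + 180.0) % 360.0 - 180.0
        if abs(delta) > half:
            missed.append(b)
    return missed

print([b.label for b in boards_outside_view(storyboard, viewer_yaw_deg=0.0)])
# ['character lands behind viewer', 'chase passes overhead']
```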


Making Stories Accessible

Spotlight Stories will also be available beyond Motorola phones and even beyond the Android ecosystem. Google announced that an iOS version is coming soon, with availability through Google’s YouTube app to follow this summer. Android users can watch it now through the Google Spotlight Stories app on Google Play.

Beyond the ATAP session, Google announced an improved design for Google Cardboard, an updated version of the headset that works with larger phones. On the other end of the experience, Google also seeks to make VR content creation more accessible with Google Jump, an open-source VR solution that provides the components needed to create 360 films, including the blueprints for a 360-degree camera rig made with 16 cameras.


What’s Next

As in the early days of radio, film and television, we’re still experimenting with and redefining the rules of how we tell and consume immersive and interactive stories. For modern storytellers, the difference is that industry leaders like Google are helping to drive the evolution forward through new and exciting tools, platforms and content.

Watch the full session below and find more videos from the conference here.