Friday, March 24, 2017


  • Monday, Nov. 21, 2016
Primestream introduces VR/360 Dynamic Media Management
Primestream's Xchange Extension Panel

Primestream has introduced Virtual Reality/360 (VR/360) asset management support in Xchange Suite, covering workflows from capture through delivery. The new features enable storytellers and content creators to produce and store VR/360 content while leveraging collaborative workflows in a metadata-rich environment.
Archiving VR/360 assets is an often-requested feature in the market. Currently, many content creators use YouTube to store, search, and view VR/360 media. Xchange enables VR/360 media management in a secure and controlled location that supports the capture, production, management and delivery of all your assets.
“Working in a 360-degree space poses new challenges for storytellers. We are all trying to figure out how to move a viewer through a 360-degree space,” said David Schleifer, COO at Primestream. “Often, a specific part of the 360 content might be the only area of interest, and with Xchange spatial markers, professionals can now create focus points within the VR/360 media with metadata descriptions or keywords to easily find the area of interest. This not only helps push the project through the creative process, but also helps with finding the value in assets after they have been archived.”
Unique features of VR/360 dynamic media management for Xchange include:

  • Proxy video generation at preselected qualities
  • Preview content in 360°
  • Search for pre-programmed reference markers by keyword, and target specific camera views within markers using spatial markers
  • Navigate in X/Y mode and zoom within Xchange 360°
  • Search within a 360° video and locate metadata-specific key points with Xchange

Xchange integrates with Adobe Premiere Pro Creative Cloud, enabling media asset management functionality inside Premiere Pro with the Xchange Extension Panel. Editors can easily locate the assets they need and begin editing with two-way communication of metadata, markers, subclips and projects.

  • Monday, Nov. 21, 2016
Vicon's motion capture embedded in Royal Shakespeare Company's "The Tempest"
"The Tempest" (photo by Topher McGrillis, courtesy of Royal Shakespeare Company)

Vicon, the motion capture technology specialist for the entertainment, engineering and life science industries, announced that one of its motion capture systems, together with its object tracking software, Tracker, will be used for the upcoming theatre production of The Tempest.

The Royal Shakespeare Company (RSC) in collaboration with Intel and in association with The Imaginarium Studios, is creating a new production of The Tempest starring Simon Russell Beale as Prospero. The innovative production will mark the 400th anniversary of Shakespeare’s death and go down in history as the first live motion capture performance featured in a major stage production.

Motion capture technology is used heavily throughout the performance; the production team will use Vicon’s optical camera system to track the whereabouts of moving objects on stage, some of which will be held by the actors. Vicon cameras will also track various screens during the performance, informing the production team of their movements on and around the stage. Due to its precision tracking capabilities, the Vicon system will provide the animation team with real-time feedback, giving them complete visibility of the objects on stage as well as of interactions between the cast.

The addition of augmented reality to the performance presents an added layer of complexity. It was important for the RSC to have a mocap system in place which would accurately inform the other technical elements of the production. To create this consistency, a Vicon system will track screens as well as moving objects during the performance. The data processed by the Tracker software using Intel® Xeon® and Intel® Core™ i7 processors will not only inform the lighting software and drive the spotlights for the performance, but also enable the augmented reality aspect of the production, allowing objects to be projected onto screens in real time. To enable this innovative use of technology, Vicon and d3 technologies have developed a bespoke software solution allowing the data generated by Tracker to drive d3’s software using the PosiStageNet protocol, an industry standard used to drive interactive effects and lighting on stage.

“We are using Vicon cameras and Tracker software to track moving objects and actors for projection mapping and light tracking. In order to realise this, we’ve been working with Vicon and d3 Technologies to develop a plug-in application of PosiStage.Net. This will mean that both video and light tracking will be sharing the same tracking data protocol from the Vicon camera system,” said Pete Griffin, RSC production manager on The Tempest.

Ben Lumsden, head of studio at The Imaginarium Studios, added: “This theatrical production of The Tempest will see never-before-used technology on stage. This live spectacle, which starts today, has been further enhanced by bringing in the expertise of Vicon and the use of their cameras, which adds greatly to the audience’s experience.”

“Combining one of Shakespeare’s most renowned plays with innovative augmented reality driven by Vicon technology signals a change in theatre production and traditional media,” said Imogen Moorhouse, CEO, Vicon. “It’s important to continue to innovate in motion capture, and The Royal Shakespeare Company have demonstrated just how versatile the technology is when used to ignite the imagination of theatregoers old and new.”

  • Thursday, Nov. 17, 2016
Facebook buys Pittsburgh-based facial analysis software firm 
In this June 11, 2014 file photo, a man walks past a Facebook sign in an office on the Facebook campus in Menlo Park, Calif. (AP file photo)

Facebook has bought a facial analysis software firm linked to Pittsburgh's Carnegie Mellon University, a move that will help the social media giant boost its artificial intelligence-powered facial recognition technology.

Details of the deal weren't revealed by Menlo Park, California-based Facebook or FacioMetrics, the Carnegie Mellon spinoff.

Facebook says FacioMetrics's software can be used to monitor the emotions of medical patients, assess audience reaction to a public speaker, or even detect drowsy drivers.

It also will enable Facebook users to express themselves through special effects that manipulate photos and videos of faces. Rival Snapchat already does something similar with special filters users can add to selfie "snaps." The filters change depending on your facial expressions; for example, a dog filter will not just add dog ears but show a giant panting tongue instead of your own when you stick out your tongue.

FacioMetrics has developed software called IntraFace, which can be downloaded onto mobile phones and enables users to do real-time facial image analysis. The company's CEO is Fernando De la Torre, an associate research professor at the school's Robotics Institute.

  • Thursday, Nov. 17, 2016
NEP Europe goes with Grass Valley LDX 86N 4K cameras for new mobile prodn. units

NEP Europe announced its commitment to purchase Grass Valley native 4K cameras exclusively for its new trucks currently under construction. NEP Europe has relied on cameras from Grass Valley, a Belden Brand, for years, with more than 600 cameras purchased and used for a wide variety of production projects during that time. When designing the newest trucks, which will join the fleet over the next six months, leaders at NEP Europe concluded that the new LDX 86N cameras from Grass Valley will provide the best performance and value for the increasingly popular UHD production its clients are demanding.

“We won’t be switching to UHD production overnight, but it’s clear that the market is moving that way,” observed Paul Henriksen, president of NEP Broadcast Services Europe. “By making this decision now and equipping our new trucks with the best 4K cameras on the market, we are ready to offer the capability as our customers demand it. Also, the LDX 86N offers outstanding HD image performance, so it fits our production needs today very well.”

The LDX 86N camera is ideally suited for outside broadcasting as well as live and studio productions, giving broadcasters the advantage of selecting between the highest image resolution with native 4K or switching to native 3G/HD for all applications where uncompromised 3G/HD performance is required. It also offers the ability to upgrade to LDX 86N Universe functionality, with 6X HD and 3X HD/3G high frame rate capture, on a daily, weekly or perpetual basis with the GV-eLicense.

NEP Europe’s long-term plans include utilizing Grass Valley’s 4K- and IP-compatible K-Frame switcher video processing engines and Belden cable in its mobile fleet, with some of its projects beginning to rely on an IP infrastructure for improved speed and efficiency. The company already has commitments to provide UHD production for customers in Germany and Switzerland and expects more of its work to transition from HD to UHD with the five new OB vans, all with Karrera 3-stripe control surfaces.

“NEP Europe is a real pioneer in the market, putting everything in place to take advantage of the new opportunities for UHD production,” noted Jan-Pieter van Welsem, VP of sales and marketing, EMEA, Grass Valley. “We’ve worked together for years, and by staying on the leading edge of technology, we are able to provide them with solutions that fuel their growth while paving the way for change. Before we know it, UHD will be the default format for a majority of programming.”

The five new trucks, which include one triple-expanding unit that can accommodate up to 30 of the LDX 86N cameras, should be on the road by early spring of 2017. All future trucks built for NEP Europe will also feature LDX 86N cameras exclusively.

  • Wednesday, Nov. 16, 2016
DJI premieres short film shot entirely on drone by Oscar winner Claudio Miranda
Claudio Miranda
SHENZHEN, China -- 

DJI, a maker of unmanned aerial vehicles, has premiered a short film by Academy Award-winning cinematographer Claudio Miranda shot entirely on its new Inspire 2 professional drone and its professional-grade X5R camera.
The Circle stars Ryan Phillippe (Shooter) and Noah Schnapp (Stranger Things), and illustrates how the DJI Inspire 2 can be an integral part of the creative process for high-end filmed entertainment. Miranda--who won the Best Cinematography Oscar for Life of Pi in 2013 and was nominated in 2009 for The Curious Case of Benjamin Button--and EP Dana Brunetti (The Social Network, House of Cards) used the Inspire 2 and X5R for every shot in the film, from emotional close-ups to sweeping aerial views.
“Filmmaking has been tethered to the ground for so long. DJI’s technology is allowing filmmaking to be free,” Phillippe said. “The advantage is obvious on smaller projects when you can’t afford cranes and all of the technicians that come along with it, and you can still achieve the same beautiful shots as with the larger equipment.”
The Circle tells the story of an estranged father (Phillippe) who reunites with his young son (Schnapp) in Depression-era America after the sudden death of the boy’s mother. The two travel from town to town as dad makes a meager living for the both of them by sketching portraits for locals. It’s on this formative journey that the boy discovers not only the transformative power of art, but how to open his heart as well.
The DJI Inspire 2, unveiled Tuesday at an event on the Warner Bros. studio lot, is a ready-to-fly platform for high-end film and video creators. While aspiring filmmakers have long had their creative ambitions restricted by shoestring budgets and limited gear options, affordable and powerful aerial equipment is now opening new doors and providing them with more options to pursue their visions.
“When you think of shooting a movie set in the Great Depression, you think of big Hollywood studio films, big cameras and massive crews,” said director Sheldon Schwartz. “What’s unique about this shoot is that we’re using new technology to push the limits of camera language and storytelling.”                   

For cinematographers, Inspire 2 with the X5R camera allows them to control every aspect of an image, gives them new freedom to move the camera in three dimensions, and uses stabilized gimbal technology to eliminate unwanted camera movement.
“What’s fantastic about this drone is that we’re able to shoot in the RAW format,” Miranda said. “It’s nice to have that dynamic range, so I’m able to push shadows up or highlights down and create a mood.”
“When you work with incredible visual storytellers like Claudio who are hyper-focused on every detail and element of the image, it’s important to have the most dynamic and versatile product to help tell that story but also one that also doesn’t break the bank,” Brunetti said. “We get just that with the DJI Inspire 2.”

  • Tuesday, Nov. 15, 2016
Facilis TerraBlock at work on Canadian family drama series "Backstage"
HUDSON, Mass. -- 

Facilis, an international supplier of cost-effective, high performance shared storage solutions for collaborative media production networks, announced that the postproduction team for Canadian family drama series Backstage is again relying on Facilis TerraBlock for its collaborative editorial workflow for season 2.

Ellen Fine, CCE (Canada Cinema Editor) is the supervising editor and consulting producer on Backstage. The series is produced by Fresh TV and distributed by DHX. It currently airs on the Family Channel in Canada and on the Disney Channel in the US and worldwide.

The offline edit is cut on Avid Media Composers by a team of three editors who share two assistants at Technicolor’s Toronto facility. For the first season, the team started out using only local storage. With some arm-twisting, the producers were convinced to allocate funds so that Fine’s team could better collaborate and share media.

The group called Don Kinzinger of Dynamix, a systems reseller and integrator in Ontario, to help them find the right system for their needs. The choice quickly narrowed to Avid ISIS or Facilis TerraBlock. In the end, a 16TB TerraBlock 8D system best fit the budget and gave the team the flexibility they wanted while minimizing Apple OS and Media Composer version compatibility restrictions with the storage. The TerraBlock 8D was connected to a 10Gb Ethernet switch, and from there 1Gb Ethernet lines ran to five Mac Pro workstations. Having central media access provided the editorial team a great leap forward in efficiency and collaboration.
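A back-of-the-envelope check shows why one 1Gb Ethernet line per workstation is comfortable for an offline edit like this. The sketch below assumes roughly 36 Mb/s per Avid DNxHD 36 stream and treats about 80% of a gigabit link as usable for media; both figures are illustrative assumptions, not vendor specs:

```python
# Rough capacity check for the shared-storage setup described above.
# The bitrate and link-utilization figures are assumptions for illustration.

DNXHD36_MBPS = 36          # approximate DNxHD 36 stream bitrate, Mb/s
LINK_MBPS = 1000           # nominal 1Gb Ethernet link speed, Mb/s
USABLE_FRACTION = 0.8      # conservative share of the link usable for media

usable_mbps = LINK_MBPS * USABLE_FRACTION
streams_per_workstation = int(usable_mbps // DNXHD36_MBPS)

print(f"Usable link bandwidth: {usable_mbps:.0f} Mb/s")
print(f"Concurrent DNxHD 36 streams per 1Gb link: ~{streams_per_workstation}")
```

Even with generous headroom, a single gigabit link carries roughly twenty simultaneous proxy streams, far more than one editor ever plays at once.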

“We wanted a system that gave us the freedom to be on whatever Mac OS we were comfortable with, running any version of Media Composer that we felt was stable without any worries about compatibility with the storage,” stated Fine.

When the show was picked up for a second season, Fine knew that the producers would occasionally want to incorporate footage from season one, which meant it had to be online and accessible. Since this required more storage, the team called on Dynamix again, this time purchasing a 32TB TerraBlock 8D. The new storage was added to the same Ethernet switch so that everyone could access both servers.

Following the same hectic schedule as season one, the crew shot 30 episodes for season two over the summer. Since the location is an actual school, they had a very limited amount of time to film, with two complete units shooting simultaneously. “It’s quite unique; we’re block-shooting, so our actors, who are aged 14 to 17, are filming 4 episodes over 4 days. The actors have to memorize 4 entire episodes and jump back and forth between them,” said Fine. “They’re phenomenal dancers and musicians as well. It was very impressive.”

The show is filmed with the ARRI Amira camera, with dailies processed and synced at Technicolor. Fine and team receive Avid DNxHD 36 files from the Technicolor team, which are transferred to the Facilis system, ready to edit. After the dailies are delivered, Fine and her team have 5 or 6 days to assemble an episode. Then they move on to the director’s cut, the producer’s cut, and finally the broadcaster’s cut before they lock. This year, they started in July, and typically finish in eight or nine months.

“The great thing about our Facilis system is that we don’t need to think about it.  It’s just rock solid and always there,” said Fine. “We organize everything very meticulously by episode in folders and bins. We store our music in a separate volume. It’s very efficient and just what we need to meet our deadlines.”

  • Monday, Nov. 14, 2016
HiScene, Inuitive, Heptagon team on AR glasses
HiAR Glasses
SANTA CLARA, Calif. -- 

HiScene, Inuitive and Heptagon have teamed to roll out HiAR Glasses, billed as HiScene’s next generation of augmented reality (AR) glasses. The companies worked together to develop a complete solution for advanced 3D depth sensing and AR/VR applications that delivers excellent performance even in changing light conditions and outdoors. HiAR Glasses incorporate Inuitive’s NU3000 computer vision processor and Heptagon’s advanced illumination.
The glasses’ AR operating system provides stereoscopic interactivity, 3D gesture perception, intelligent speech recognition, natural image recognition and inertial measurement unit (IMU) data, presented through an improved 3D graphical user interface.
“We are committed to providing the best possible user experience to our customers, and for this reason we have partnered with Inuitive and Heptagon to create the most intelligent AR glasses available on the market,” said Chris Liao, CEO of HiScene. “The technologies implemented provide a seamless experience in a robust and compact format, without compromising on battery life.”
Inuitive’s NU3000 serves the AR glasses by providing 3D depth sensing and computer vision capabilities. The solution also acts as a smart sensor hub to accurately time-stamp and synchronize multiple sensors in a manner that off-loads the application processor and shortens development time. “Inuitive’s solution allows HiScene to provide the reliability, latency and performance its customers expect,” said Shlomo Gadot, CEO of Inuitive. “With Inuitive technology, AR products and applications can now be used outdoors without sunlight interfering with or damaging their efficacy, thanks to cameras featuring depth perception.”
Heptagon provides unique IR Pattern Illuminators, which were chosen to handle changing light conditions and plain surfaces. In addition, the range and Field of Illumination features of Heptagon’s LIMA stereo pattern projector ensure superior lighting and added texture for higher-quality images.
“Our Wide Field of Illumination provides better gesture recognition, and our miniaturization technologies enable ultra-small, high performance, low power components for 3D AR/VR applications,” said Dr. Erik H. Volkerink, Heptagon’s chief business officer and executive VP.

  • Monday, Nov. 14, 2016
Moonlight Cinema deploys DaVinci Resolve Studio to complete animated film "Ozzy"
A scene from "Ozzy."
FREMONT, Calif. -- 

Blackmagic Design announced that Barcelona-based postproduction house Moonlight Cinema has completed both the digital intermediate and finishing of the new children’s animated film “Ozzy” on DaVinci Resolve Studio.

Postproduction director Alejandro Matus and colorist Ignasi González produced more than 40 different deliveries of the animation over a two-week period, using Moonlight Cinema’s three DaVinci Resolve color grading suites to cater for everything from different aspect ratios and frame rates through to multiple language versions.
“We had never taken on an animated film before, so there was an added layer of pressure to deliver a high quality finish,” said Matus. “Resolve was an enormous help throughout all aspects of the project, particularly with its new editing tools. Not only did we use it to conform and master the project, but we also were able to make changes to the edit during the grade without having to leave Resolve.”
Starting with 2K DPX files transferred to them by Tangent Animation in Canada, the Moonlight Cinema team applied a LUT to act as a base for the film’s overall look. “For many in the industry, it may seem natural to think that animation projects don’t require any real color correction, but in fact the complete opposite is true,” said González. “Unlike a live action feature, a CG animation’s whole universe will have been created by many different people on many different machines. Homogenizing this content and giving greater depth to the animation through tools like light and blur is a daunting task.”
To ensure that every shot in a sequence matched uniformly, Matus and González decided to separate key elements from each scene through their respective alpha channels using mattes. This allowed for more control to modify individual assets, such as character faces, throughout the final grade. The next step was to give “Ozzy” its own unique aesthetic.
“Ozzy is a children’s film, so we wanted to maintain a bright, friendly feel even in scenes that were tense or sad,” González shared. “The great thing about being a colorist on an animation like this was that I had way more creative freedom when grading. For some night scenes, for instance, I could push the blues of the sky to a point where a live action sky would never go. The same happened with sunsets.”
“The best thing about using DaVinci Resolve for this,” he continued, “was the Color Management system. What impressed me on this feature was Resolve’s ability to handle all of the different color space conversions that we needed to output. It was quick and pain free. We also had a lot of shot replacements coming in throughout the grading process. With Resolve, I could simply save a still from a grade that I was happy with to the stills gallery, and it would include all the keyframe and tracking information, which made it stress free to update to a new shot.”

  • Thursday, Nov. 10, 2016
Sony expands FS Series with new FS7 II camcorder
Sony's FS7 II camcorder

Sony is expanding its FS Series Super 35mm professional family with the addition of the new FS7 II camcorder. The new model builds on the original FS7’s strengths by adding advanced features including Electronic Variable ND technology, a lever lock type E-mount, and a new mechanical design for faster and easier set-up. The new FS7 II camcorder also supports Sony’s α Mount System, which includes more than 70 lenses.
Since its introduction in 2014, the FS7 has become one of the most widely used cameras in a range of production applications, and the original FS7 model remains in the Sony line-up. The new FS7 II now gives creative professionals a broader range of creative tools, with new features all based on end user feedback.
The new FS7 II camcorder is designed for long-form shooting and production applications, especially documentaries and independent filmmaking. Sony is also introducing an FS7 II kit model which includes a new E-mount, Super 35mm lens, model SELP18110G, covering Super 35mm and APS-C sensors.
“The FS7 II features state-of-the-art Sony variable ND technology and a robust locking E‑Mount,” said Juan Martinez, senior product manager, professional digital imaging, Sony Electronics. “Extensive enhancements to the VF support system enable super-fast and secure viewfinder repositioning, while retaining the Zen-like simplicity, flexibility and comfort of the FS7’s multi-award-winning industrial design.”
Electronic Variable ND Technology
The camcorder’s Electronic Variable ND Filter system, combined with its large sensor, delivers greater exposure control, with the option of preset or variable operation modes. Variable ND mode (seamless ND attenuation within the camera’s 2-to-7-stop range) allows the user to vary the density of the ND filter during shooting and to transition seamlessly between steps.
The camera’s expanded ND operations also enable fine exposure adjustment, reserving the iris for setting depth of field; they prevent the soft focus caused by diffraction and the color shift caused by stacking multiple external ND filters.
Preset mode lets users assign three ND settings to the filter turret, useful in selecting the most appropriate filtration range for changing light conditions. Auto ND mode is also available allowing exposure to stay at a fixed level while adjusting the depth of field with iris control.
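As a quick illustration of what a 2-to-7-stop ND range means in practice, each stop of neutral density halves the light reaching the sensor, so transmission falls off as a power of two. This is the general photographic relationship, not a Sony-specific figure:

```python
# Each stop of neutral density halves transmitted light: T = 1 / 2**stops.

def nd_transmission(stops: float) -> float:
    """Fraction of light transmitted through an ND filter of the given stops."""
    return 1.0 / (2 ** stops)

for stops in (2, 4, 7):
    print(f"{stops} stops -> 1/{2 ** stops} of the light ({nd_transmission(stops):.4f})")
```

So the variable ND sweeps smoothly from passing a quarter of the light (2 stops) down to 1/128th (7 stops), a range that would otherwise require swapping or stacking external filters.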
E-mount (Lever Lock type) for professional shooting
The FS7 II’s new E-mount (lever lock type) gives users the ability to change lenses by rotating the locking collar rather than the lens itself, which means that in most cases lens support rigs don’t need to be removed, saving time during a production.
Mechanical Design Enhancements
Like its counterparts in the FS Series family – the FS7 and FS5 models – the new FS7 II features several design and ergonomic updates for comfortable and functional use in the field.
The FS7 II’s “tool-less” mechanical design lets users make on-the-fly changes to the camera’s set-up and operation. For example, no tools are required to adjust the Smart Grip or viewfinder positions.
The viewfinder eyepiece provides a third stabilizing contact point when shooting handheld. Durable square section rods and lever-clamps on the LCD and camera body provide simple and precise front-to-back VF adjustment while retaining level positioning. 
New Sony 18-110mm G lens
The SELP18110G lens included in the kit is compact and lightweight -- 2.4 lbs (1.1 kg) -- with an 18 to 110mm focal range (6x zoom) optimized for Super 35mm and APS-C sensors. It uses a new switchable fully mechanical/servo zoom system, capable of snap zooms and entirely devoid of lag.
The lens is compatible with Sony α Mount System cameras, including the α7 series interchangeable-lens cameras and professional Super 35mm 4K camcorders like Sony’s FS7 and FS5. While well suited to still photography, the lens offers extended creative capabilities that filmmakers will fully appreciate when shooting motion images.
The lens benefits from Sony’s Smooth Motion Optics (SMO) design, developed to optimize performance during motion image capture. The design eliminates undesirable characteristics and artifacts that do not affect still image capture but can severely limit a lens’ usefulness for motion shooting, such as:

  • Ramping: exposure falls off as the aperture effectively closes when zooming toward the telephoto end
  • Focus shift: failure to hold focus while zooming
  • Breathing: angle-of-view variation while focusing
  • Optical axis shift: the image moving in the frame while zooming
XQD Cards
The FS7 II supports the XQD memory card format, designed for capturing and transferring high-bandwidth, high-resolution files. Sony is also introducing a new XQD card, the QD-G256E -- with an industry-first 256 GB capacity -- which enables a recording time of approximately 45 minutes at 4K 60p or 3.5 hours at 2K 30p. Combined with a read speed of up to 440MB/s and a write speed of up to 400MB/s, users can shoot for longer without needing to change media cards.
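Working backwards from the quoted capacity and record times gives a feel for the sustained data rates involved. This sketch assumes decimal gigabytes (10^9 bytes) and treats the quoted times as exact, so the results are approximations rather than codec specifications:

```python
# Implied average record bitrates from the quoted 256 GB capacity and times.
# Assumes decimal GB (10**9 bytes); actual codec bitrates may differ.

CARD_BYTES = 256 * 10**9

def implied_mbps(record_seconds: float) -> float:
    """Average recording bitrate in megabits per second."""
    return CARD_BYTES * 8 / record_seconds / 1e6

print(f"4K 60p, ~45 min: {implied_mbps(45 * 60):.0f} Mb/s")
print(f"2K 30p, ~3.5 h:  {implied_mbps(3.5 * 3600):.0f} Mb/s")
```

Both implied rates sit comfortably inside the card’s quoted 400 MB/s (3,200 Mb/s) write speed.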
The FS7 II is planned to be available in January 2017 for an estimated street price of $10,000 (body only) and $13,000 for the camcorder with 18-110mm power zoom lens kit.

  • Thursday, Nov. 10, 2016
The highlight of Google's Daydream VR is ... its controller
Clay Bavor, Google vice president of virtual reality, talks about Daydream and virtual reality during the keynote address of the Google I/O conference, Wednesday, May 18, 2016, in Mountain View, Calif. (AP Photo/Eric Risberg)

The best thing about Google's new virtual-reality headset isn't the headset at all.

In fact, Daydream View would pale compared with Samsung's Gear VR headset were it not for Daydream's controller, a handheld device that responds to gestures and other motion.

With Gear VR, I have to move my head to point a cursor at something, then reach for a button on the headset. With Daydream, I can just aim and click the controller in my hand. Sensors in the device tell the headset what I'm trying to do, whether it's swinging a tennis racket or casting a fishing rod. The headset's display responds accordingly.

The headset and controller are sold together for $79, starting Thursday. There's no rush to get one, though, as the virtual experiences built for Daydream are still limited. And for now, it works only with Google's Pixel phone.

While sophisticated systems like Facebook's Oculus Rift and HTC's Vive let you walk around in the virtual world, Daydream View is a sit-down experience in which you use the controller to move yourself around. (You could walk around with the Daydream on if you wanted to, but you won't go anywhere in virtual space - and you might run into the wall.)

But the Rift and the Vive each cost more than $1,500 once you include the powerful personal computers they require. Suddenly, $79 sounds like a bargain. Daydream stays cheap by using the display and processing power of your phone, which you insert into the headset at eye level.

Gear VR, at $100, takes a similar approach, but it works only with Samsung phones. While Daydream works only with Pixel for now, several other Android makers plan to make compatible phones. Sorry, iPhone users.

Those without compatible phones still have Google Cardboard, a $15 contraption you hold up to your face. Using Daydream View, by contrast, is more like wearing goggles. While Gear VR has a better fit, with focusing and a second strap over your head to keep the headset from sliding too low, Daydream is much more comfortable to wear and use than Cardboard.

Those who've played Nintendo's Wii system will find the Daydream controller familiar. It's about the size and shape of a chocolate bar, and it has motion sensors to track movement.

Although I'm not a big gamer, I enjoyed shooting water out of a hose to put out fires. You simply hold a button to spray and move the controller around to douse flames. You can even tilt the controller to control the angle of the hose. Another app lets you explore the universe by using the controller as a laser pointer to bring up more information.

The controller makes it easier to navigate menus without making yourself dizzy; just move it around to point at things. And while getting the full 360-degree experience of VR often requires spinning around (a swivel chair helps), some apps in Daydream let you grab the scene with your controller and drag it around you, just as you would with a PC mouse.

It's also handy to have volume controls and a home button in your hand rather than on your head.

VR can be nauseating, and Daydream is no different. I found that it's less about the headset, and more about the VR video.

The best videos use stationary cameras and let you move your head (or controller) around to explore. The nauseating ones tend to treat VR cameras like regular movie cameras, with a lot of panning in response to a subject's movements. The viewer, not the subject, should be the one doing the moving.

And while I enjoyed watching a woman's skydive in VR on YouTube, scenes of her preparing to jump felt jarring because the camera was on her shaky arm. I had to remove my headset.

You can view 360-degree YouTube videos and any 360-degree photos you store on Google Photos. You can visit other destinations such as the Galapagos Islands in a 360-degree version of Google's Street View. A few games, museum artworks and The Wall Street Journal's app were also available to try out prior to Thursday's launch.

A handful more are coming Thursday. Even more are promised by the end of the year, including apps for Netflix and Hulu - though all that does is offer video on a giant screen in a virtual living room.

There's much more available for Cardboard. Unfortunately, app developers will need to make some tweaks first to make them compatible with Daydream. They'll need to do even more to take advantage of the motion control.

Daydream has promise, but until more apps arrive, its potential is still a dream.