Thursday, June 29, 2017

Toolbox

  • Tuesday, Mar. 28, 2017
Wanda to buy Dolby Atmos and Dolby Digital Cinema processors for cinema screens in China
SAN FRANCISCO and BEIJING -- 

Dolby Laboratories, Inc. (NYSE: DLB) and Wanda Cinema Line Corporation Ltd. announced that Wanda Cinema Line plans to purchase 800 units of Dolby digital cinema processors, including 790 units of the Dolby® Digital Cinema Processor CP750 and 10 units of the Dolby Atmos® Cinema Processor CP850, to deploy in its new and existing cinema screens in China through the end of 2017.

“Dolby has long been a trusted partner that delivers quality products with high reliability and great performance,” said Jack Wang, general manager, Information and Technology Center, Wanda Cinema Line Corporation Ltd. “Our ongoing collaboration with Dolby shows we are committed to delivering high-quality audio to our patrons. We look forward to expanding our Dolby Atmos footprint to more Wanda Cinema locations across China so audiences can truly understand what great sound brings to the moviegoing experience.”

“Dolby is committed to supporting the cinema industry through quality and innovation,” said Michael Archer, VP, Dolby Laboratories. “We look forward to offering amazing cinema experiences to more moviegoers across China as the Dolby digital cinema processors are deployed in Wanda’s new and existing screens.”

Dolby Atmos Cinema Processor CP850
The Dolby Atmos Cinema Processor CP850 offers a complete audio solution for today’s digital cinema theatres, bringing a natural, realistic, and immersive Dolby Atmos sound experience to audiences. The Dolby CP850 supports Dolby Atmos playback of up to 64 speaker feeds, configurable between 16 analog outputs and a Dolby Atmos Connect output. The Dolby CP850 also supports Dolby Surround 7.1 and 5.1 digital playback, as well as common Dolby formats used with alternative content in cinemas: Dolby TrueHD, Dolby Digital Plus, and Dolby E.

Dolby Digital Cinema Processor CP750
The Dolby Digital Cinema Processor CP750 provides easy-to-operate audio control in digital cinema environments while integrating seamlessly with existing technologies. The Dolby CP750 supports 5.1 as well as Dolby Surround 7.1 premium surround sound, and it can receive and process audio from multiple digital audio sources, including digital cinema servers, preshow servers, and alternative content sources.

The Dolby CP750 is also ready for use by a network operations center (NOC): its status and functions can be monitored and controlled from anywhere on the network.

  • Tuesday, Mar. 28, 2017
Globecast appoints Ken Fuller as CTO in the U.S.
Ken Fuller
LOS ANGELES -- 

Globecast, a global solutions provider for media, has appointed Ken Fuller to the post of chief technology officer (CTO) of Globecast Americas. Fuller will lead all aspects of the company’s technical development and will work closely with the executive management team to establish a clear and strategic technical vision.
 
In his new role, he will oversee key vendor relationships and investigate, purchase, and implement new technologies. On top of this, Fuller will manage a team of 30 in the U.S. He reports to Globecast COO Philippe Fort, who is based in Paris.
 
Eddie Ferraro, managing director, Globecast Americas, said, “Ken’s impressive experience in broadcast and satellite transmission as well as OTT, VOD and media management makes him an incredible asset to Globecast. He will successfully implement the roadmap we need to continue to deliver high caliber solutions to our customers around the world.”
 
Prior to joining Globecast, Fuller held the post of sr. VP of operations at Deluxe Entertainment Services Group in Burbank, Calif., where he was responsible for several integration groups focused on ingest, QC, metadata management, packaging, and delivery of SVOD, TVOD, and streaming content. Before then, he spent several years as sr. VP and general manager at Encompass Digital Media, Inc., where he was responsible for the company’s metro Los Angeles operations, production, engineering and facilities services. Fuller is also a past president of the Society of Motion Picture and Television Engineers as well as an SMPTE Fellow. In addition, he was director of broadcast and network operations at NBC New York. While there, he received five Technical Emmy Awards for his work on NBC’s Olympic broadcasts.
 
Fuller said, “In my new position, I’m committed to helping ensure that our customers have full visibility on the value we offer, and I’m looking forward to engaging with the industry to deliver a technical strategy that’s successful for everyone.”

  • Tuesday, Mar. 28, 2017
Got camera? Facebook adds more Snapchat-like features 
In this May 16, 2012 file photo, the Facebook logo is displayed on an iPad in Philadelphia. (AP Photo/Matt Rourke, File)
NEW YORK (AP) -- 

Facebook is adding more Snapchat-like features to its app. The company says it wants to let your camera "do the talking" as more people are posting photos and videos instead of blocks of text.

Facebook is rolling out an app update starting Tuesday. With it, you can tap a new camera icon in the top left corner. That opens the phone's camera for a photo or video post. You could post photos from the app before, but it took an extra tap.

Once you open the camera, you'll find Facebook's other new Snapchat-like features, including filters that can be added to images.

Other effects, such as animations and interactive filters, add a new twist to dressed-up photos.

Also new is a "stories" tool that lets you post photos and videos that stay live for 24 hours. This feature is already available on Messenger and Instagram, which is owned by Facebook.

Snapchat pioneered camera-first sharing and is wildly popular with younger users. Years ago, Facebook tried to buy the company but was rebuffed. Since then, it has been trying, with varying degrees of success, to clone Snapchat's most popular features.

It might be working: Snapchat's growth rate has slowed down since Instagram introduced its own "stories" feature.

  • Tuesday, Mar. 28, 2017
Robert Legato to present keynote at NAB's “Future of Cinema Conference” produced with SMPTE
Robert Legato
WHITE PLAINS, NY -- 

The Society of Motion Picture and Television Engineers® (SMPTE®) announced that Emmy® and Academy® Award-winning visual effects supervisor Robert Legato will present the keynote at the 2017 NAB Show’s “The Future of Cinema Conference: The Intersection of Technology, Art, & Commerce in Cinema,” produced in partnership with SMPTE. The conference will take place April 22-23 at the Las Vegas Convention Center.

Legato will present “Jungle Book, Photorealism, and the Bright Future of Filmmaking” on the second day of the two-day conference, which also will feature sessions delving into how technical innovation, artistic intent, and evolving consumption and business models will interact to shape the future of cinema. Computer simulations have become so accurate that even the most seasoned pros often find it difficult to distinguish an effect from the real thing. Drawing on his experiences working on acclaimed titles such as “The Jungle Book,” “Apollo 13,” “Titanic,” and “Hugo,” Legato will explore the distinction between visual effects (VFX) and “traditional” cinematic disciplines, such as direction, cinematography, production, and design.

“Rob’s surprising and creative visual illusions help movies to resonate powerfully with audiences, and his work has enriched and enhanced some of the industry’s most beloved films,” said Cynthia Slavens of Pixar Animation Studios, who serves as program chair for the conference. “In addition to being a master of his craft, Rob is an engaging speaker who draws on his wealth of experience to provide insights into the creative process, offering a valuable perspective as we look to the future of cinema.”

Legato has been nominated for four Academy Awards and has won three. His first win was for his VFX work on “Titanic,” his second was for his work on “Hugo,” and his third for “The Jungle Book.” In 1996, he won the BAFTA Award for Best Achievement in Special Effects for “Apollo 13,” for which he also garnered his first Academy Award nomination. Prior to his work in film, he was a visual effects supervisor for “Star Trek: The Next Generation” from 1987 to 1992, and “Star Trek: Deep Space Nine” in 1993. His work on the “Star Trek” franchise earned him three Emmy Award nominations and two wins.

“The Future of Cinema Conference” will gather the brightest industry minds and talents to discuss the changing nature of storytelling today and into the future, as well as the industry’s role in ensuring that creative work is preserved in its highest form for generations to come. The conference will feature sessions on forward-looking techniques and challenges related to making content for theatrical release and beyond.

Complete conference details, including registration information, are available here.

  • Monday, Mar. 27, 2017
Snell Advanced Media sets product lineup for NAB
SAM's Morpheus UX
NEWBURY, UK -- 

At NAB 2017 (Booth SL1805), Snell Advanced Media (SAM) will introduce a wide range of new technology across its product portfolio. Highlights include the new 12G-SDI product range, demonstrating fast-turnaround solutions for live 4K multi-format sports and news production; a comprehensive range of monitoring and control solutions; and the next generation of software-defined and IP technology.
 
SAM’s CEO, Tim Thorsteinson, commented, “Eighteen months in from becoming a single entity, we are now delivering integrated technology solutions that draw on our joint heritage to develop a world-leading offering for our customers that pushes the boundaries with 4K, IP, 12G and software defined solutions. NAB will see SAM shake up perceptions in the live sports and news production space and continue to demonstrate our ability to deliver riskless and future-ready technology that is agile enough to cope with customers’ needs today, whilst being prepared for the challenges they will face tomorrow.”

Among the SAM innovations that will be showcased at NAB are:
 
• Offering easy and familiar operations and the ability to scale to meet HD and UHD productions of any size, SAM’s live sports solutions combine immediate replay of live events with no-copy editing and postproduction. Users can publish highlights and replays straight to social media at the touch of a button.
 
• SAM will introduce the next generation of multiviewers spanning every niche in the market, including its new 12G Multiviewer and IP Multiviewer. Part of the monitoring offering on show will be its media content monitoring and control solutions based on SAM’s pioneering Media Biometrics technology. Distributed intelligent logic engines across the workflow enable exception-based and schedule-aware monitoring. SAM delivers unmatched production monitoring density and flexibility, full integration with master control solutions, and third-party integration.
 
• SAM will be showing its new IP products, which use 25, 40, 50 and 100 GbE interfaces. New 12G-SDI solutions give customers the most flexibility when it comes to making investment decisions for UHD/4K projects. SAM will demonstrate its modular infrastructure and conversion solutions and highlight UHD-to-SD support for IP and SDI, along with integrated SDI and IP routing control and flexible IP/SDI I/O for routing, switching, conversion, multiviewers and monitoring.
 
• SAM continues its innovation in 4K and HDR with the launch of HDR file-based conversion products, FormatFusion4 HDR conversion in its Kahuna production switcher lineup, and support for 4K and HDR conversion in its mid-range KudosPro and UHD1000 products.
 
• Recently introduced and on show at NAB is SAM’s radical new web-based user interface for multi-channel playout control, Morpheus UX. Morpheus UX gives customers an unparalleled level of adaptability so they can precisely tailor their channel views and focus on the specific functionality they need. ICE SDC, SAM’s pure software playout solution for virtualized IP playout, will also be demonstrated, allowing customers to realize full channel functionality in a software-defined ecosystem. SAM will also introduce its brand-new 12G-SDI Master Control system with HD/1080p and 4K 12Gbps single-link support, providing a robust, flexible and powerful solution for the most demanding broadcast environments.
 
• SAM will show its ultra-fast, flexible multi-format news solutions at NAB, featuring integrated social media publishing workflows for delivering quality content to every platform. Multi-format, multi-aspect-ratio media management allows users to seamlessly mix aspects and file sizes from acquisition through to delivery. SAM’s news solutions create an open, flexible and collaborative environment, allowing easy integration with newsroom partners.

• For media organizations where cost, combined with core technical capabilities, is of paramount concern, or for first-time entrants to the market with limited budgets, SAM has created a highly cost-effective production package. This space-saving system includes SAM’s 1-3M/E switcher, routers, processing solutions, multiviewers and infrastructure portfolio. SAM’s low-cost suite is ideal for a variety of production environments, including live sports, houses of worship, outside broadcasting and education.

  • Thursday, Mar. 23, 2017
DP Catalan deploys Cooke Anamorphic/i lenses for ITV's "Broadchurch"
A scene from "Broadchurch"
LEICESTER, UK -- 

The third and final series of ITV’s acclaimed TV crime drama Broadchurch, produced by Kudos, Imaginary Friends & Sister Pictures, was shot using Cooke Anamorphic/i lenses by cinematographer Carlos Catalan. The unusual decision to shoot a TV drama anamorphically was taken to make the most of the wide vistas of the coastal setting, and to bring a cinematic look to the story.

“I was a big fan of the show, so the question was how could I contribute to something that had made its mark in terms of cinematography?” said Catalan. “One of the elements that came to mind was anamorphic, to bring a cinematic experience and a wider aspect ratio to the show.  Luckily when we presented the idea of going 2:1 to ITV, we were surprised and delighted that they agreed!”

Catalan had previously worked with the director Paul Andrew Williams on the BAFTA award-winning drama Murdered By My Boyfriend. “We used older anamorphic lenses on that production, so we already had an idea in our minds of what we could bring to Broadchurch – improving, but still keeping the elements that people like and expect,” Catalan continued. “I tested the Cooke Anamorphic/i lenses and really liked them. They produce a clear picture with contrast while at the same time keeping the warm, delicate feel which is so nice for skin tones. I shot a lot wide open, and the Anamorphic/i’s create that fall-off at the top that helps to give a more cinematic look. As well as the exterior shots, they also handled lower light situations very well.”

Broadchurch 3 producer Dan Winch said, “From the outset Broadchurch was talked about as being visually stunning, so during prep for Series 3 we scrutinised every creative decision taken on the show previously in order to creatively and technically push the boundaries further without losing what people loved about the show. The bold decision to use anamorphic lenses has subtly enhanced each and every frame. With close-up angles, the fall-off and richness of the images has the unique ability to embed characters within the landscape in a beautiful way. We’re delighted with the superb end result Carlos has achieved for us.”

Cooke Anamorphic/i lenses will be on display at NAB 2017, including the new front Anamorphic/i zoom lens, which will make its debut at the show. In addition, visitors to the Cooke booth will see the Anamorphic/i SF lens range. Complementary to the acclaimed Anamorphic/i series, the SF lenses feature a coating that gives cinematographers even more options for anamorphic character, with enhanced flares and other aberrations, while still retaining the oval bokeh.

  • Wednesday, Mar. 22, 2017
Wal-Mart launches incubator lab to house tech startups 
In this May 9, 2013 file photo, a worker pushes shopping carts in front of a Wal-Mart store in La Habra, Calif. (AP Photo/Jae C. Hong, File)
NEW YORK (AP) -- 

Wal-Mart is launching an incubator lab focused on projects in robotics, virtual and augmented reality, and artificial intelligence as it aims to compete more aggressively with Amazon.

The so-called Store No. 8 will be located in Silicon Valley and marks the latest attempt by the company's new head of e-commerce operations, Marc Lore, to speed up innovation at the company. Wal-Mart Stores Inc., based in Bentonville, Arkansas, brought over Lore, the founder of Jet.com, when the discounter bought the online retailer last year for more than $3 billion. Wal-Mart has been on an acquisition binge since then, snapping up ShoeBuy, Moosejaw and ModCloth.

The incubator lab's mission will be to house new startups that will run independently from the company. It will hatch, invest in, and team up with entrepreneurs and venture capitalists to create proprietary technology. Wal-Mart says these startups "will be ring-fenced from the broader organization, so they have the room to grow and develop."

Store No. 8 is named after an early store where Wal-Mart founder Sam Walton used to test different ideas that could be rolled out.

  • Tuesday, Mar. 21, 2017
ARRI Rental expands ALEXA 65 network to West Coast and Canada
Tony Linares
LOS ANGELES -- 

ARRI Rental, a provider of camera, lighting and grip equipment, has enlarged its ALEXA 65 network with the opening of boutique offices in Los Angeles and Vancouver. 

The expansion, prompted by the ever-increasing popularity of the ALEXA 65 with leading filmmakers, will better position ARRI Rental to support productions on the West Coast and in Canada that want to take advantage of the system’s exceptional image quality.

Offering a complete large-format solution, the ALEXA 65 system comprises a 65 mm digital cinema camera, a growing range of custom-designed lenses, and fast, efficient workflow tools. Recent developments in ARRI Rental’s optics program have seen brand new Prime 65 S and Prime DNA lens options added to the existing lineup of Prime 65 and Vintage 765 lenses for the ALEXA 65. The upcoming months will see further ARRI Rental lens initiatives that will broaden the creative appeal of the system even more.

“The ALEXA 65 system is ever-evolving, with our new custom lens options continuing to extend its capabilities,” said ARRI Rental CEO Martin Cayzer. “As demand continues to grow, it is critical that we have the correct infrastructure in place to support the system and its use. The opening of boutique rental spaces in Los Angeles and Vancouver ensures that the ALEXA 65 is available to two of North America’s most vibrant production communities.”

Joining the existing team of international marketing executive Dana Ross and technical marketing executive Matt Kolze in Burbank are Tony Linares and Rafael Adame. Both bring over 20 years of experience to ARRI Rental. As technical marketing executive, Adame will be a key member of the ALEXA 65 support team for North America. In the role of operations manager, Linares will manage key operational departments including prep, warehouse, logistics and inventory control.

Sarah Mather, who has over 16 years of on-set experience, will oversee the Vancouver facility as operations manager. As a camera assistant, she worked on movies such as The Revenant, Star Trek, Tomorrowland, Godzilla, Rise of the Planet of the Apes and The Bourne Legacy.

  • Friday, Mar. 17, 2017
Turning James Joyce's "Ulysses" into a virtual reality game 
In this Jan. 26, 2017, photo, Joseph Nugent, a Boston College English professor, wears virtual reality goggles at the school's virtual reality lab in Boston. (AP Photo/Charles Krupa)
BOSTON (AP) -- 

Students are developing a virtual reality game based on James Joyce's "Ulysses" as part of a class at Boston College.

The goal of "Joycestick" is to expose new audiences to the works of one of Ireland's most celebrated authors, as well as to give a glimpse of how virtual reality can be used to enhance literature, said Joseph Nugent, the Boston College English professor who is coordinating the project.

"This is a new way to experience the power of a novel," he said. "We're really at the edge of VR. There's no guidance for this. What we have produced has been purely out of our imagination."

Nugent and his students hope to release a version of the game on June 16 in Dublin during Bloomsday, the city's annual celebration of the author and novel. They've already showcased their progress at an academic conference in Rome last month.

"Joycestick," in many ways, fills in the blanks of the novel, as many of the places key to the story have been lost to time as Dublin has evolved, said Enda Duffy, chairman of the English Department at the University of California, Santa Barbara, who has tried a prototype of the game.

"The VR version in this way completes the book," she said. "It makes it real. 'Ulysses' is an ideal book to be turned into a VR experience, since Dublin is, you might say, the book's major character."

There have been a number of efforts to bring works of literature into the gaming world over the years, including a computer game of F. Scott Fitzgerald's "The Great Gatsby" that became a viral hit in 2011 as it mimicked the look and feel of a classic, 1980s-era Nintendo game.

But the Boston College project is unique for trying to incorporate virtual reality technology, says D. Fox Harrell, a digital media professor at the Massachusetts Institute of Technology.

He is impressed that the students are taking on such a complex text.

"It requires multiple entry points and modes of interpretation, so it will be fascinating to see how their VR system addresses these aspects of the work," said Harrell, who hasn't tried the game out yet.

Considered the epitome of 1920s-era modernist literature, "Ulysses" traces a day in the life of an ordinary Dubliner named Leopold Bloom. The title reflects how the novel draws parallels between Bloom's day and "The Odyssey," the ancient Greek epic.

"Joycestick" isn't meant to be a straight re-telling of "Ulysses," which in some versions runs nearly 650 pages long, acknowledged Evan Otero, a Boston College junior majoring in computer science who is helping to develop the game.

Instead, the game lets users explore a handful of key environments described in the book, from a military tower where the novel opens to a cafe in Paris that is significant to the protagonist's past.

It's also not a typical video game in the sense of having tasks to complete, enemies to defeat or points to rack up, said Jan van Merkensteijn, a junior studying philosophy and medical humanities who is also involved in the project. For now, users can simply explore the virtual environments at their leisure. Touching certain objects triggers readings from the novel.

The project represents an extension of what academics call the "digital humanities," a field that merges traditional liberal arts classes with emerging technology. Nugent has had previous classes develop a smartphone application that provides walking tours of Dublin, highlighting important landmarks in "Ulysses" and Joyce's life.

But the native of Mullingar, Ireland, is quick to shift credit for the current project's ambition to his group of 22 students, who are studying a range of disciplines, from English to computer science, philosophy, business and biology, and have also been recruited from nearby Northeastern University and the Berklee College of Music.

"These are ambitious kids," Nugent said. "They want to prove they've done something on the cutting edge. They have the skills. They're doing the work. All I'm trying to do is direct these things."

  • Friday, Mar. 17, 2017
Program takes shape for SMPTE's Entertainment Technology in the Connected Age Conference
Pat Griffis, SMPTE EVP and ECTA program chair.
WHITE PLAINS, NY -- 

The Society of Motion Picture and Television Engineers® (SMPTE®), the organization whose standards work has supported a century of technological advances in entertainment technology, today announced program details for the Entertainment Technology in the Connected Age (ETCA) conference, May 8-9 at the Microsoft Silicon Valley Campus in Mountain View, Calif. Entitled “Redefining the Entertainment Experience,” this year’s conference will explore emerging technologies’ impact on current and future delivery of compelling connected entertainment experiences.

“Now in its fifth year, SMPTE is pleased to be hosting this year’s ETCA on the Microsoft campus in the heart of Silicon Valley, where engineers, executives, creatives, and researchers will gain a unique perspective on the technologies that are arguably redefining entertainment as we know it, while engaging with the leaders who are making it happen,” said Patrick Griffis, SMPTE executive vice president, and ETCA program chair. “This year’s program is extremely strong, and we look forward to a thought-provoking two days.”

Bob DeHaven, general manager of Worldwide Communications & Media at Microsoft Azure, will present the first conference keynote, titled “At the Edge: The Future of Entertainment Carriage.” The growth of on-demand programming and mobile applications, the proliferation of the cloud, and the advent of the internet of things demand that video content be available closer to the end user to improve both availability and quality of experience. DeHaven will discuss the multifarious relationships taking shape to meet these new requirements and will explore the roles that network providers, content delivery networks (CDNs), network optimization technologies, and cloud platforms will play in achieving the industry’s evolving needs.

Hanno Basse, chief technical officer at Twentieth Century Fox Film Corporation, will present “Next-Generation Entertainment: A View From the Fox.” 20th Century Fox distributes content via outlets ranging from cinema to Blu-ray Disc, over-the-top (OTT), and even virtual reality (VR). Basse will share his views on the technical challenges of enabling next-generation entertainment in a connected age and how Fox plans to address them.

The first conference session, “Rethinking Content Creation and Monetization in a Connected Age,” will leap right into a discussion of multiplatform production and monetization using the latest creation, analytics, and search technologies. The session “Is There a JND in It for Me?” will take a second angle, exploring what new content creation, delivery, and display technology innovations will mean for the viewer. Panelists will discuss the parameters required to achieve original artistic intent while maintaining a just noticeable difference (JND) quality level for the consumer viewing experience.

“Video Compression: What’s Beyond HEVC?” likewise will explore emerging techniques and innovations, outlining evolving video coding techniques and their ability to handle new types of source material including high-dynamic-range (HDR) and wide color gamut (WCG) content, as well as video for virtual and augmented reality (VR/AR).

Moving from content creation and compression into delivery, “Linear Playout: From Cable to the Cloud” will discuss the current distribution landscape, looking at the consumer apps, smart TV apps, and content aggregators/curators that are enabling cord-cutters to watch linear television, as well as the new business models and opportunities shaping services and the consumer experience. The session will explore tools for digital ad insertion, audience measurement, and monetization while considering the future of cloud workflows.

“Would the Internet Crash If Everyone Watched the Super Bowl Online?” will shift the discussion to live streaming, examining the technologies that enable today’s services as well as how technologies such as transparent caching, multicast streaming, peer-assisted delivery, and User Datagram Protocol (UDP) streaming might enable live streaming at a traditional broadcast scale and beyond. “Adaptive Streaming Technology: Entertainment Plumbing for the Web” will focus specifically on innovative technologies and standards that will enable the industry to overcome inconsistencies of the bitrate quality of the internet.

“IP and Thee: What’s New in 2017?” will delve into the upgrade to internet protocol (IP) infrastructure and the impact of next-generation systems such as the ATSC 3.0 digital television broadcast system, the Digital Video Broadcast (DVB) suite of internationally accepted open standards for digital television, and fifth-generation mobile networks (5G wireless) on internet-delivered entertainment services. Moving into the cloud, “Weather Forecast: Clouds and Partly Scattered Fog in Your Future” will examine how local networking topologies, dubbed “the fog,” are complementing the cloud by enabling content delivery and streaming via less traditional — and often wireless — communication channels such as 5G.

The rise of interactivity, both as a control mechanism and as a means of enhancing the viewing experience, will also be a theme of ETCA. “Giving Voice to Video Discovery” will highlight the ways in which voice control is being added to pay-television and OTT platforms to simplify searches. Panelists will discuss the benefits and challenges of implementing voice effectively, and the impact this trend will have on viewing behavior. In a session exploring new consumption models, “VR From Fiction to Fact” will examine current experimentation with VR technology, emerging use cases across mobile devices and high-end headsets, and strategies for addressing the technical demands of this immersive format.

Complete conference details, including registration information, are available here.