Wednesday, January 17, 2018

Toolbox

  • Wednesday, Dec. 20, 2017
"Pokemon Go" unleashes its critters in Apple's AR playground
In this Monday, Dec. 18, 2017, photo, Pokemon Go is played at a park in San Francisco. Pokemon Go is unleashing its digital critters in Apple’s playground for augmented reality, turning iPhones made during the past two years into the best place to play the mobile game, according to the CEO of the company that makes Pokemon Go. (AP Photo/Michael Liedtke)
SAN FRANCISCO (AP) -- 

"Pokemon Go" is moving into a different dimension on the iPhone, thanks to software that allows the game to play new tricks with its menagerie of digital critters.

An upcoming game update relies on built-in Apple software called ARKit, which gives the iPhone new ways to serve as a portal into augmented reality.

AR is a technology that projects lifelike images onto real-world settings such as parks and streetscapes.

The new approach, announced Wednesday, adds depth to the playing field and lets Pokemon monsters grow or shrink to fit their environment. The game's creatures will now flee when they detect sudden movement or if players approach too quickly.

John Hanke, CEO of "Pokemon Go" creator Niantic, believes iPhones equipped with Apple's AR software now offer the best way to play the game.

That's an ironic twist because Niantic spun out of Google, whose Android software powers most of the smartphones in the world. Hanke played a key role in building Google Maps, one of the most frequently used apps on Android phones.

Apple's AR technology works on iPhones dating back to the 2015 iPhone 6S, a line-up that encompasses an estimated 200 million to 300 million devices, including iPads.

"Pokemon Go" has offered an AR option since its release 17 months ago, but Apple's technology is more advanced than what the game has been using.

Apple is hoping app makers will find compelling ways to deploy its AR tools, helping to hook the masses on a technology that so far has been embraced by a sliver of smartphone users. If AR takes off, many analysts believe Apple will branch out in a few years and release a new line of devices designed specifically for AR.

  • Tuesday, Dec. 19, 2017
Dan Hammond named VP/general manager of Panavision Hollywood
Dan Hammond
WOODLAND HILLS, Calif. -- 

Panavision has named Dan Hammond, a longtime creative solutions technologist in the industry, as vice president and general manager of Panavision Hollywood. Hammond will be responsible for overseeing daily operations, and working with the Hollywood team to leverage the exceptional camera systems, optics, service and support that Panavision customers have come to expect.

Hammond is a Panavision veteran, who worked at the company between 1989 and 2008 in various departments, including training, technical marketing, and sales. Most recently he was at Production Resource Group (PRG), expanding his technical services skills. Hammond is active with industry organizations, and is an associate member of the American Society of Cinematographers (ASC), as well as a member of the Academy of Television Arts and Sciences (ATAS) and Association of Independent Commercial Producers (AICP).

“Dan has a broad range of strengths in developing technology solutions, and extensive experience providing production studios, cinematographers and directors with the products and services they need on a global level,” said Michael George, Panavision’s chief operating officer, to whom Hammond directly reports. “We are excited to have him back with the team at Panavision.”

Hammond added, “Service, innovation, and growth are always the top priority at Panavision. I look forward to continuing to work with filmmakers to provide creative solutions and further expanding Panavision Hollywood’s offerings to the production community.”

  • Wednesday, Dec. 13, 2017
Collaborative Baselight color workflow deployed on Woody Allen's "Wonder Wheel"
Kate Winslet in Woody Allen’s “Wonder Wheel” (photo by Jessica Miglio/courtesy of Gravier Productions, Inc.)
LONDON -- 

For his 50th movie as a director, Woody Allen chose a story set in and around the famous Coney Island amusement park in 1950s New York. To create a convincing period look and feel, the creative team established a collaborative color pipeline built on FilmLight’s Baselight Linked Grade (BLG) render-free workflow.

Wonder Wheel, from Amazon Studios, reunites Allen with veteran cinematographer Vittorio Storaro, who was behind the camera for Allen’s Café Society, where his cinematography earned critical acclaim. The movie also reunites colorist Anthony Raffaele of Technicolor PostWorks New York and DIT Simone d’Arcangelo. 

Working closely together, Raffaele created core looks in Baselight, and d’Arcangelo then used FilmLight’s Prelight application on set, which not only imposed these looks on the raw footage but also gave him the opportunity to adjust the grade to meet the needs of Storaro and Allen.

“Vittorio [Storaro] is a very traditional cinematographer, who thinks in terms of photo-chemical effects,” Raffaele explained. “He grew up with the idea of flashing the negative – which gives you more details in the shadows – and flashing the positive for more in the highlights. So I created layers for each of those elements in Baselight and gave them to Simone, who used layer blending in Prelight to achieve the exact balance required.”
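The layered flashing approach Raffaele describes can be pictured as simple blends toward a constant exposure level. The sketch below is purely illustrative, not FilmLight’s implementation; the blend amounts and flash levels are invented for the example:

```python
import numpy as np

def flash(img, amount, level):
    """Blend a linear-light image toward a constant 'flash' level,
    mimicking the uniform extra exposure of photochemical flashing."""
    return (1.0 - amount) * img + amount * level

def apply_layers(img, neg_amount=0.08, pos_amount=0.05):
    """Stack two flash layers, as they might be blended in a grade:
    a 'negative flash' that lifts the shadows, then a 'positive flash'
    that compresses the highlights. Amounts are illustrative."""
    shadows_opened = flash(img, neg_amount, 0.18)             # lift blacks toward 18% gray
    highlights_held = flash(shadows_opened, pos_amount, 0.5)  # pull highlights toward mid
    return np.clip(highlights_held, 0.0, 1.0)

# One very dark pixel and one very bright pixel:
img = np.array([[[0.02, 0.02, 0.02], [0.98, 0.98, 0.98]]])
out = apply_layers(img)
# Shadows come up and highlights come down, leaving detail headroom in both.
```

Blending the two layers in sequence mirrors the layer blending d’Arcangelo used in Prelight to balance shadow and highlight detail.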

On set throughout the production, d’Arcangelo interpreted the requests from Storaro to present the best possible grades. “The first picture we see is like the first love,” he said. “We should make sure it is as close as possible to the one the cinematographer and director wants. This means for me I have to have a solid relationship with the colorist from the beginning. I love to feel in perfect harmony with the cinematographer and the colorist, to be in the chain in all the creative steps.”

The 1950s setting of the movie meant that there was a need for a lot of VFX. The apartment at the centre of the story, for example, is in the middle of the park, with the lights and bustle – and the famous Coney Island Ferris wheel, the wonder wheel of the title – visible through the window. Matching the studio interiors to the VFX exteriors called for careful work.

“This is one of the reasons why we really wanted to use Prelight,” said Raffaele. “We could create looks and know on set how it was going to feel. Vittorio could say ‘it should be warmer because of the sunset’ on set, and then we made it happen in the grade.”

D’Arcangelo added, “One of the tools we used most was the Hue Angle. That gave us the chance to have a strong contrast, but still select the highlight and the shadow to get rich detail in each. Prelight had everything I needed. It gave us more tools, and more consistency in our workflow. The only choice for us was Prelight.”

Prelight is a Mac application that reads Baselight’s standard BLG grading metadata and imposes it on the raw footage as shot. It includes a full Baselight toolkit for tweaking or creating grades, and supports multiple layers so that windows and tracking can be established, as well as effects like negative flashing.

Wonder Wheel used an ACES production workflow and was shot on Sony cameras, in 4K and high dynamic range. It went into general release on December 1.

  • Tuesday, Dec. 12, 2017
Shotgun 7.6 released with new analytics feature set for VFX and animation
Shotgun 7.6 Production Insights software
LOS ANGELES -- 

Shotgun Software has released Shotgun 7.6, the latest version of its cloud-based review and production tracking software. This release delivers a new set of analytics and reporting tools which give studio leaders the ability to visualize key production metrics, keep a close eye on the progress of their projects, and make business-critical decisions fast.

Faced with shorter timelines, tighter budgets, and growing creative demands, studios need to be efficient, identify business issues quickly, and adjust where and how resources are being used during production--rather than after the fact. Now, instead of relying on manual reporting and gut instinct, studio leaders get a high-level overview of the health of their projects from Production Insights in Shotgun, as well as the ability to dive into the details to see where time and resources are used, so operations can be streamlined and better decisions made faster.

“Shotgun’s Production Insights help us work real-time software development and scrum-style methods for task organization into our VFX pipeline. So far this workflow has had an immediate positive effect on the communication we’ve been able to achieve between our departments,” said Kent Rausch, associate VR producer, Framestore. “The more data we can share across disciplines, the more predictably and efficiently we can work, and these new tools are a great first step in helping us get there.”

James Pycock, head of product for Shotgun, Autodesk, said, “Our new Production Insights features help Shotgun customers answer critical, urgent, and costly production questions such as: Are we going to hit our deadline? How much work is there left to do? Where are we struggling? Having access to these tools out of the box gives everyone instant at-a-glance visualizations of how and where they are spending time and resources. We believe that sophisticated data analytics will help facilities of all sizes turn their production data into real insights which can help them remove guesswork and optimize the production process.”

Shotgun Production Insights include:

  • Analytics: Apply production data in Shotgun to optimize how resources are used, plan ahead for tight deadlines and budgets, and accurately compile bids for upcoming projects
  • Data Visualization: Explore new graph types including pie charts, vertical bar charts and line charts in addition to the existing horizontal bar chart in Shotgun
  • Data Grouping: Display data as stacked or un-stacked bar charts to visualize in even greater at-a-glance detail
  • Presets: Drag and drop from a number of pre-configured presets to build reports instantly, with flexible customization options
  • Monday, Dec. 11, 2017
RED WEAPON camera with MONSTRO sensor introduced to marketplace
RED WEAPON camera with the MONSTRO 8K VV sensor
IRVINE, Calif. -- 

RED Digital Cinema® has announced that its cinematic full frame WEAPON® camera with the MONSTRO™ 8K VV sensor is available for purchase. MONSTRO is an evolutionary step in large-format sensor technology, with improvements in image quality including dynamic range and shadow detail.

RED’s newest camera and sensor combination, WEAPON 8K VV, offers full frame lens coverage, captures 8K full format motion at up to 60 fps, produces ultra-detailed 35.4 megapixel stills, and delivers incredibly fast data speeds--up to 300 MB/s. And like all of RED’s DSMC2 cameras, WEAPON supports simultaneous REDCODE® RAW and Apple ProRes or Avid DNxHD/HR recording and adheres to the company’s dedication to OBSOLESCENCE OBSOLETE® — a core operating principle that allows current RED owners to upgrade their technology as innovations are unveiled and move between camera systems without having to purchase all new gear.

The WEAPON 8K VV is priced starting at $79,500 (for the camera BRAIN), with upgrades available for carbon fiber WEAPON customers. RED is also offering RED® ARMOR-W, an upgraded coverage program for RED WEAPON® cameras that includes increased warranty protection and a sensor swap service.

  • Thursday, Dec. 7, 2017
URSA Mini Pros, DaVinci Resolve Studio deployed on "The S.P.A.A.C.E. Program"
Lensing "The S.P.A.A.C.E. Program"
FREMONT, Calif. -- 

Blackmagic Design announced that director, editor, colorist and post supervisor Alex Ferrari used Blackmagic URSA Mini Pro digital film cameras and DaVinci Resolve Studio to shoot, edit, grade and finish the streaming series “The S.P.A.A.C.E. Program.”
 
“The Scientific Pop and Also Cultural Explorations Program” aka “The S.P.A.A.C.E. Program” is a new streaming series from Nerdist and Legendary Digital Networks available on the Alpha streaming platform. The eight-episode series blends together science and pop culture by visiting different fictional planets and realms, such as Tatooine, Krypton, Arrakis and Westeros, and examining them through a scientific lens. Host Kyle Hill and his robot assistant AI visit a different place from pop culture each episode and break down the big scientific questions, such as what it is really like to live on a planet with two suns, or what makes a White Walker a White Walker.
 
Led by Ferrari, the series was shot using two URSA Mini Pros. “We only had four days to shoot all eight episodes, so it was a very fast-paced shoot,” Ferrari explained. “We decided to shoot with the URSA Mini Pros because we knew they’d be reliable and fast, and they’d get us the cinematic look we were going for. You can take them straight out of the box and they’re ready to go with no fuss. The menu and operating system are intuitive and easy to use, so you don’t waste any time while shooting, and having the timecode on the side was helpful. Reliability can be a rarity, so the fact that we could count on them when we were in the heat of battle really made a difference.”
 
“We shot everything in a practical spaceship set that showed the cockpit, hallway and war room,” Ferrari continued. “All the windows were green screen, and we created outer space and the surrounding worlds in post. Being able to cleanly pull keys was crucial, and the camera’s sensors made it easy. We shot the whole series in 4.6K ProRes, which gave us a lot of latitude in post.”
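Ferrari’s point about cleanly pulling keys comes down to how strongly a pixel’s green channel dominates the other two. Here is a toy sketch of that idea; real keyers in Resolve and elsewhere are far more sophisticated, and the threshold below is an invented illustration:

```python
import numpy as np

def green_key_matte(img, threshold=0.2):
    """Return an alpha matte: 1.0 for foreground, 0.0 for green screen.
    A pixel counts as 'screen' when its green channel exceeds the max of
    red and blue by more than `threshold` -- a toy heuristic."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    dominance = g - np.maximum(r, b)
    return np.clip(1.0 - dominance / threshold, 0.0, 1.0)

# A saturated green-screen pixel next to a skin-tone pixel:
frame = np.array([[[0.1, 0.9, 0.1], [0.8, 0.6, 0.5]]])
matte = green_key_matte(frame)
# The green pixel keys out (alpha 0.0); the skin tone stays (alpha 1.0).
```

The cleaner the camera resolves the green channel, the sharper that dominance measure is, which is why a low-noise sensor makes keys easier to pull.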
 
DaVinci Resolve Studio was used on set by the DIT and then in post by Ferrari for the series’ full editing, grading and finishing.
 
“Using DaVinci Resolve Studio on set allowed us to organize and sync everything in real time. At the end of the shoot, we easily exported everything and went right into editing,” said Ferrari. “By keeping everything in the ecosystem, I was able to directly edit the entire series in native 4.6K ProRes without having to transcode to a smaller proxy file. Doing everything soup to nuts in DaVinci Resolve Studio saved us a lot of time that would have been spent roundtripping.
 
“Moreover, it’s allowed me to evolve my editing process so color is intertwined rather than a separate function. As I edit and select shots, I can easily jump from the Edit Page to the Color Page to see if I can save a shot that might be too blown out or too dark. I can work on the lighting in real-time to see if I can make the shot work, which is invaluable in the creative process. Using DaVinci Resolve Studio, I can make editorial decisions based on what I know I can make work in color, rather than just hoping something will work down the line or scrapping what might be the best take because it initially seems unusable.”
 
When grading the series, Ferrari was inspired by the planets and realms Hill and AI visited. “We wanted the series to look cohesive from episode to episode, but we also wanted each to have its own look that mirrors the land we’re visiting. For example, the episode on LV-426 is colder and more desaturated. I used a greenish overtone for the episode with the Borg, whereas I used a very warm palette for King Kai’s planet in the Dragon Ball Z episode. Since each place the series visited has such a strong look already associated with it, we wanted to pay homage to that,” Ferrari concluded.

  • Wednesday, Dec. 6, 2017
Dalet placed at core of BBC Wales' new broadcast center
BBC Wales
PARIS -- 

Dalet, a provider of solutions and services for broadcasters and content professionals, announced that BBC Wales has selected the enterprise Dalet Galaxy Media Asset Management (MAM) and Orchestration platform to facilitate all workflows and asset management requirements at its new state-of-the-art media facility located in Cardiff, Wales. Once deployed, Dalet Galaxy will offer a centralized content repository and provide tools to orchestrate workflows and media processes across production, news, studios and delivery departments. The massive installation design and multi-year deployment will be managed by Dalet Professional Services, which will ensure customer success in the transformation journey towards agility and maximize return on investment (ROI).

“BBC Wales is pleased to be working with Dalet to provide an asset management system for our new home in Central Square, Cardiff.  Dalet was chosen after a very competitive process, and will provide an important part of the technology solution at Central Square within a state of the art broadcast center.  We are looking forward to the successful delivery of the project,” said Gareth Powell, chief operating officer, BBC Wales.  

As the core media hub, Dalet Galaxy will be deployed as the cornerstone of the new digital facility. All systems and sub-systems deployed in future phases will connect to this hub. The state-of-the-art, BPMN-compliant Dalet Workflow Engine will enable the BBC to orchestrate a combination of user tasks and media services ranging from ingest, transcoding and QC, to logging, editing, media packaging and distribution. A simple-to-use workflow designer interface allows users to model business processes, picking operations from a palette of stencils, such as user tasks and notifications, media and metadata services, gateways, timeout and error management, and much more.

The comprehensive and open Dalet Galaxy API will allow the BBC to tightly connect storage and infrastructure technologies, media services and post-production applications, and traffic and business platforms, orchestrating a fluid workflow that tracks assets and associated metadata across the media enterprise.

“We have been working with the BBC on a multitude of projects for more than fifteen years,” commented Adrian Smith, regional manager, Dalet UK. “Dalet Galaxy’s flexible architecture provides a future-proof framework on which the BBC can evolve to meet new requirements and production needs that arise over coming months and even years. The Dalet Professional Services team’s experience in managing such enterprise rollouts will help them navigate the juggernaut of this multi-year, large-scale deployment.”

In addition to Dalet Galaxy, Dalet will be supplying a new Dalet HTML application for simplified management of camera card ingests and its Dalet Brio video server. Supporting both SDI and IP, the versatile, high-density Dalet Brio ingest and playout platform adheres to the SMPTE 2110 standards, allowing broadcasters to step into the future of IP while retaining the security of SDI.

  • Monday, Dec. 4, 2017
Foundry launches Cara VR 2.0 
Cara VR 2.0's new GlobalWarp feature
LONDON -- 

Creative software developer Foundry has announced the launch of Cara VR 2.0, the next chapter for its cutting-edge virtual reality plug-in toolset for Nuke.

Building on the first-of-its-kind plug-in debuted in 2016, Cara VR 2.0 boasts improved stitching and stabilization, allowing for more efficient creation of seamless, high-quality VR and 360 video content. The new version features major updates in stitching, the introduction of 360 match-move tracking and automatic stabilization, and new tools for stereoscopic corrections using cutting-edge algorithms from Ocula.
 
Craig Rodgerson, CEO of Foundry, commented: “To fully realize the potential of VR, we need to enable content creators to build experiences that are more immersive than anything before. The first iteration of our Cara VR toolkit was hugely well-received, and this latest version will help usher in the next level of VR experiences. Artists can now better meet the demand for VR content thanks to our industry-leading creative toolset.”
 
Cara VR 2.0’s new GlobalWarp node speeds up delivery of stitches while producing a high quality 360 stitch with minimal ghosting. GlobalWarp adds additional controls for lining up key features in overlapping areas and allows you to add constraints to reduce warping on known straight lines, even for objects overlapping multiple camera views, helping users achieve the highest quality stitch faster.

Cara VR 2.0 includes a redesigned Tracker, which accelerates stabilization and match-moving for a more comfortable VR experience and easier alignment of 3D elements. Users can automatically track a 360 stitch for stabilization and create a 360 match-move camera to assist in 3D corrections and CG insertion. The Tracker node now simplifies stabilization, adding 3D stabilization to remove parallax changes, and brings match-moving to Cara VR.
 
Cara VR 2.0 also includes a suite of tools adapted from the powerful Ocula toolset which have now been optimized to work with 360 video, making these powerful algorithms accessible to VR content creators and more efficient to use in a 360 video workflow.

These tools take the headache out of stereo cleanup, allowing for efficient correction of alignment, focus and color across stereo views, resulting in sharp and accurate stereoscopic content. This release includes updated versions of the Disparity Generator node, which generates high quality disparity vectors for depth estimation; Disparity To Depth, for calculating depth from the disparity vectors; a new Stereo Colour Matcher, for unifying color between left and right views; and New View, which allows you to rebuild one stereo view from another – all optimized for use with 360 footage.
 
Cara VR 2.0 is available for purchase on Foundry’s website and via accredited resellers.

  • Friday, Oct. 13, 2017
Cinematographers play key role in Tech Emmy win for Fujinon Cine Zooms
During Fujinon Day Atlanta, Bill Wages, ASC (r), gives feedback on FUJINON Cabrios with Radames Gonzalez from Arri Rental in the lens projection room.
WAYNE, NJ -- 

The Optical Devices Division of FUJIFILM has been awarded an Engineering Emmy® for its FUJINON “4K Cine Zoom Lenses providing imagery in television” by the Television Academy, and will receive the honor at the Academy’s October 25 Engineering Awards ceremony at Loews Hollywood Hotel. The introduction of FUJINON’s Cabrio and Premier series of cinema zoom lenses brought about the ability to cover Super 35mm imagers and efficiently shoot the full gamut of television production without sacrificing image quality.

“The willingness of some of the top cinematographers and their rental houses to test, explore and provide feedback about our lenses is an integral part of this Emmy win,” states Thomas Fletcher, director of sales, FUJIFILM Optical Devices Division. “They’re a very loyal group, devoted to their lens choice. To test a new cinema lens is not something that’s considered lightly. Winning an honor as prestigious as an Emmy is an affirmation of Fujifilm’s dedication to the art and craft of cinematography. We thank the Academy for their recognition of our work and for the support we’ve received from the cinematography community.”

In fact, two cinematographers won Creative Arts Emmys this year using FUJINON cine zooms: David Miller, ASC, for Veep won Outstanding Cinematography for a Single-Camera Series (Half Hour) honors; and Donald A. Morgan, ASC, was awarded an Emmy for The Ranch in the Outstanding Cinematography for a Multi-Camera Series category.

Others who’ve embraced the new FUJINON zoom lenses include three-time ASC Award winner William Wages, ASC (Burn Notice, Containment, Revolution, Sun Records). For Wages, the FUJINON Cabrio 19-90 and 85-300mm zooms have “changed the way I shoot.” Wages added: “With their speed alone, they’re virtually the only lenses I’m using. The optical quality, small size and speed are unequaled. The combination of these two lenses is ideal for television production.” Wages is an ASC recipient of the Career Achievement in Television honor.

This marks the sixth Engineering Emmy award granted to Fujifilm and Fujinon. Past awards include:

  • “Development of the new high-speed A250 Color Negative Film” in 1982
  • “Developments in Metal Tape Technology” in 1990
  • “Implementation in Lens Technology to Achieve Compatibility with CCD sensors” in 1996
  • “Lens technology developments for solid-state imager cameras in high-definition formats” in 2005
  • The world's first autofocus system, "Precision Focus," in 2009
  • Wednesday, Oct. 11, 2017
Facebook gets real about broadening virtual reality's appeal
In this Jan. 6, 2016, file photo, Peijun Guo wears the Oculus Rift VR headset at the Oculus booth at CES International in Las Vegas. (AP Photo/John Locher, File)
SAN FRANCISCO (AP) -- 

Facebook CEO Mark Zuckerberg seems to be realizing a sobering reality about virtual reality: His company's Oculus headsets that send people into artificial worlds are too expensive and confining to appeal to the masses.

Zuckerberg on Wednesday revealed how Facebook intends to address that problem, unveiling a stand-alone headset that won't require plugging in a smartphone or a cord tethering it to a personal computer like the Oculus Rift headset does.

"I am more committed than ever to the future of virtual reality," Zuckerberg reassured a crowd of computer programmers gathered in San Jose, California, for Oculus' annual conference.

Facebook's new headset, called Oculus Go, will cost $199 when it hits the market next year. That's a big drop from the Rift, which originally sold for $599 and required a PC costing at least $500 to become immersed in virtual reality, or VR.

Recent discounts lowered the Rift's price to $399 at various times during the summer, a markdown Oculus now says will be permanent.

"The strategy for Facebook is to make the onboarding to VR as easy and inexpensive as possible," said Gartner analyst Brian Blau. "And $199 is an inexpensive entry for a lot of people who are just starting out in VR. The problem is you will be spending that money on a device that only does VR and nothing else."

Facebook didn't provide any details on how the Oculus Go will work, but said it will include built-in headphones for audio and have an LCD display.

The Oculus Go will straddle the market between the Rift and the Samsung Gear, a $129 headset that runs on some of Samsung's higher-priced phones. It will be able to run the same VR apps as the Samsung Gear, leading Blau to conclude the Go will rely on the same Android operating system as the Gear and likely include processors similar to those in Samsung phones.

The Gear competes against other headsets, such as Google's $99 Daydream View, that require a smartphone. Google is also working on a stand-alone headset that won't require a phone, but hasn't specified when that device will be released or how much it will cost.

Zuckerberg promised the Oculus Go will be "the most accessible VR experience ever," and help realize his new goal of having 1 billion people dwelling in virtual reality at some point in the future.

Facebook and other major technology companies such as Google and Microsoft that are betting on VR have a long way to go.

About 16 million head-mounted display devices were shipped in 2016, a number expected to rise to 22 million this year, according to the research firm Gartner Inc. Those figures include headsets for what is known as augmented reality.

Zuckerberg, though, remains convinced that VR will evolve into a technology that reshapes the way people interact and experience life, much like Facebook's social networks and smartphones already have. His visions carry weight, largely because Facebook now has more than 2 billion users and plays an influential role in how people communicate.

But VR so far has been embraced mostly by video game lovers, despite Facebook's efforts to bring the technology into the mainstream since buying Oculus for $2 billion three years ago.

Facebook has shaken up Oculus' management team since then in a series of moves that included the departure of founder Palmer Luckey earlier this year.

Former Google executive Hugo Barra now oversees Facebook's VR operations.