Thursday, May 25, 2017

Toolbox

  • Wednesday, May 24, 2017
FUJIFILM/FUJINON to showcase wares at Cine Gear Expo
FUJINON MK50-135mm T2.9 zoom
WAYNE, NJ -- 

The newly launched “MK” Series of cinema lenses from the Optical Devices Division of FUJIFILM, along with the UA Series of HDR zooms, will make their Cine Gear debut next month. Also showcased during this event will be the entire range of FUJINON PL mount Cabrio and Premier zooms. The Cine Gear Expo 2017 exhibition runs June 2-3 in Hollywood’s Paramount Studios.

In addition to the new lens highlights, a projection room within the company’s booth will return following a strong customer reception at Cine Gear 2016. Lens technicians from around the country, including Matthew Duclos of Duclos Lenses, are scheduled to discuss and compare the optical traits of FUJINON cinema-style lenses throughout both days of the show.

“Cine Gear draws its sizeable crowd from some of our most active and involved customers,” said Tom Fletcher, director of sales, Optical Devices Division of FUJIFILM. “While we’ve participated in Cine Gear Expo since its inception, hosting a projection room with independent optical technicians from various rental houses and service facilities is relatively new. We’re looking forward to connecting with our existing customers more dynamically in this way as well as with an exciting, new group of potential FUJINON customers.”

With the first in the series introduced in February of this year, “MK” lenses are currently designed for E-mount cameras and combine advanced optical performance with an ultra-compact, lightweight design at a competitive price. The FUJINON MK18-55mm T2.9 standard zoom is currently available for $3,799; the FUJINON MK50-135mm T2.9 will be available this summer. With a combined range of 18mm-135mm in the Super 35mm format, the first two “MK” lenses together cover the focal lengths most frequently used by emerging cinematographers. Both are fast lenses, maintaining T2.9 speed across the entire zoom range for a shallow depth of field. The entire “MK” series is designed with the emerging cinematographer in mind, whether shooting a live event, online programming, a documentary, or an independent or short film production.

  • Monday, May 22, 2017
Baseball coming June 1 to virtual-reality headsets 
This photo provided by MLB Advanced Media shows a Stream Live MLB Games demonstration in the company's new At Bat VR app. Baseball games will soon arrive on virtual-reality headsets. (MLB Advanced Media via AP)
NEW YORK (AP) -- 

Baseball games will soon arrive on virtual-reality headsets.

Video in the new At Bat VR app won't be in VR. Rather, the app places you behind home plate and shows you graphical depictions of each pitch, including a colored streak (red for strikes and green for balls) tracing the ball's trajectory. The data come from sensors Major League Baseball already has installed in all of its stadiums.
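The mapping the article describes, red streaks for strikes and green for balls, is easy to picture in code. Here is a minimal sketch; the pitch record layout and field names are invented for illustration and are not MLB's actual tracking schema:

```python
# Hypothetical sketch: color each pitch's trajectory streak by the call.
# The dict layout below is invented for illustration; MLB's real sensor
# data uses a different, richer schema.

def streak_color(pitch):
    """Return the overlay color for one pitch: red for strikes, green for balls."""
    return "red" if pitch["call"] == "strike" else "green"

pitches = [
    {"speed_mph": 94.2, "type": "four-seam fastball", "call": "strike"},
    {"speed_mph": 81.7, "type": "curveball", "call": "ball"},
]

colors = [streak_color(p) for p in pitches]  # ["red", "green"]
```

The per-pitch speed and type fields are what the app surfaces when you hover over a pitch icon.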

The app also lets you hover over icons to see the speed and type of each pitch, as well as which parts of the strike zone are strong or weak for a particular batter. Traditional TV coverage of the games will appear on a virtual screen in front of you, alongside play-by-play information and individual player stats.

It's more information than casual baseball fans will want, but hard-core fans might get a kick out of having this perspective supplement what they see from regular TV cameras. Baseball's regular At Bat app does have some of this information, but not in 3-D and not while watching video.

At Bat VR will also have a section for 360-degree video packages, but not of actual games.

At Bat VR is included with Major League Baseball's existing streaming packages. For live video, that starts at about $87 for the season. At Bat VR is also subject to the usual blackouts for local teams; in such cases, the graphical depictions will still be available, but not the live video within the headset. (Audio is available with the cheaper At Bat Premium subscription for $20; non-paying users get just the graphics and stats.)

The VR app comes out June 1 and works with Android phones and headsets compatible with Google's Daydream VR system. There's no version for iPhones.

  • Thursday, May 18, 2017
Ymagis Group appoints Anne Feret as head of Eclair post division
Anne Feret
PARIS & CANNES -- 

Ymagis Group, the European specialist in digital technologies for the film industry, announces the appointment of Anne Feret as VP, Europe for its Post Production division. For the past eight years, Feret served as international sales administrator and VP of the cinema arm of Zodiak Rights/Banijay Rights.

Feret is a graduate of the ESSEC business school. She started her career at Pandora in 1996, where she was in charge of television and film sales and served as sales administration manager. Feret acquired extensive experience in business affairs, financing and international sales of feature films, TV series and documentaries, working in senior executive positions for various top-tier production companies, including Cipango Films (now EuropaCorp Télévision), Korava Productions, AF Consulting and Banijay Rights.

Feret said, “This is the start of a new adventure in a rapidly changing sector driven by innovation with new technologies such as our EclairColor HDR solution and UHD 4K. All my energy and expertise are already focused on creating growth opportunities and increasing synergies through our different sites to achieve our expansion goals in Europe.”

Feret is based in Paris-Vanves and reports directly to Christophe Lacroix, Eclair sr. VP.

Eclair is organized around six main divisions, including:
• Post Production, managed by Anne Feret
• Theatrical Delivery, managed by Daniel Danciu
• Digital Distribution, managed by Serge Sépulcre
• Versioning and Accessibility, managed by Bouchra Alami
• Restoration and Preservation, managed by Yves Gringuillard

Eclair has offices in Berlin, Karlsruhe, Madrid, Barcelona, London, New York, Liège, Vicenza, Rabat and in France, in Vanves, Issy-les-Moulineaux, Auxerre and Strasbourg.

  • Wednesday, May 17, 2017
Google unveils latest tech tricks as computers get smarter 
Google CEO Sundar Pichai speaks at the end of his keynote address of the Google I/O conference Wednesday, May 17, 2017, in Mountain View, Calif. Google provided the latest peek at the digital services and gadgets that it has assembled in the high-tech tussle to become an even more influential force in people's lives. (AP Photo/Eric Risberg)
MOUNTAIN VIEW, Calif. -- 

Google's computer programs are gaining a better understanding of the world, and now it wants them to handle more of the decision-making for the billions of people who use its services.

CEO Sundar Pichai and other top executives brought Google's audacious ambition into sharper focus Wednesday at an annual conference attended by more than 7,000 developers who design apps to work with its wide array of digital services.

Among other things, Google unveiled new ways for its massive network of computers to identify images, as well as recommend, share, and organize photos. It also is launching an attempt to make its voice-controlled digital assistant more proactive and visual while expanding its audience to Apple's iPhone, where it will try to outwit an older peer, Siri.

The push marks another step toward infusing nearly all of Google's products with some semblance of artificial intelligence — the concept of writing software that enables computers to gradually learn to think more like humans.

Google punctuated the theme near the end of the conference's keynote address by projecting the phrase, "Computing that works like we do."

Pichai has made AI the foundation of his strategy since becoming Google's CEO in late 2015, emphasizing that technology is rapidly evolving from a "mobile-first" world, where smartphones steer the services that companies are building, to an "AI-first" world, where the computers supplement the users' brains.

AI unnerves many people because it conjures images of computers becoming smarter than humans and eventually running the world. That may sound like science fiction, but the threat is real enough to prompt warnings from respected technology leaders and scientists, including Tesla Motors CEO Elon Musk and Stephen Hawking.

But Pichai and Google co-founder Larry Page, now CEO of Google corporate parent Alphabet Inc., see it differently. They believe computers can take over more of the tedious, grunt work so humans have more time to think about deeper things and enjoy their lives with friends and family.

Other big tech companies, including Amazon.com, Microsoft, Apple and Facebook, also are making AI a top priority as they work on similar services to help users stay informed and manage their lives.

Google believes it can lead the way in AI largely because it has built a gigantic network of data centers with billions of computers scattered around the world. Meanwhile, people using its dominant internet search engine and leading email service have been feeding those machines valuable pieces of personal information for nearly 20 years.

Now, Google is drawing upon that treasure trove to teach new tricks to its digital assistant, which debuted last year on its Pixel phone and on Home, an internet-connected speaker that challenges Amazon's Echo. Google Assistant is on more than 100 million devices after slightly more than six months on the market, and it is moving into new territory with a free app, released Wednesday, that runs on Apple's iPhone. Previously, the assistant worked only on Google's Android software.

Google's assistant will be at a disadvantage on the iPhone, though, because Siri — a concierge that Apple introduced in 2011 — is built into that device.

A new service called Google Lens will give Assistant a new power. Lens uses AI to identify images viewed through a phone. For instance, point the phone at a flower and Assistant will call upon Lens to identify the type of flower. Or point the camera at the exterior of a restaurant and it will pull up reviews of the place.

Pinterest has a similar tool. Also called Lens, it lets people point their cameras at real-world items and find out where to buy them, or find similar things online.

Google Photos is adding a new tool that will prompt you to share photos you take of people you know. For instance, Photos will notice when you take a shot of a friend and nudge you to send it to them, so you don't forget. Google will also let you share whole photo libraries with others. Facebook has its own version of this feature in its Moments app.

One potentially unsettling new feature in Photos will let you automatically share some or all of your photos with other people. Google maintains the feature will be smart enough so that you would auto-share only specific photos — say, of your kids — to your partner or a friend.

Google is also adding a feature to Photos to create soft-cover and hard-cover albums of pictures at prices beginning at $9.99. The app will draw upon its AI powers to automatically pick out the best pictures to put in the album.

AP technology reporter Tali Arbel contributed from New York.

  • Tuesday, May 16, 2017
Octopus Newsroom promotes Lukas Kotek to CTO
Lukas Kotek
PRAGUE, Czech Republic -- 

Octopus Newsroom, a globally active producer of television broadcast newsroom automation systems, has promoted Lukas Kotek from project director to chief technology officer. Reporting to founder and CEO Petr Stokuc, he will be responsible for development, project delivery, training and support across Africa, America, Asia and Europe.

Kotek said, “My role will be to ensure that we continue to deliver the most effective possible solutions matching the latest news production workflows, new delivery channels and the widest possible range of viewing platforms.”

Kotek studied cybernetics, information and control systems at the University of West Bohemia before joining Prague-based MAM and broadcast automation specialist Aveco in 2004, where he advanced to the role of projects and support director. He joined Octopus Newsroom in 2015 to make greater use of his broadened experience and take on new challenges.

Established in 1999, Octopus Newsroom is a producer of standalone newsroom computer systems. Octopus Newsroom advocates an open ecosystem using the MOS protocol, which enables customers to choose freely among high-quality providers of graphics, playout, MAM, prompter, traffic-handling and advertising solutions. Octopus Newsroom has successfully installed systems in more than 200 channels around the world. Based on Unicode, Octopus Newsroom products support all major character sets including Chinese, Japanese, Korean, Thai, and Vietnamese.

  • Monday, May 15, 2017
Technicolor invests in Baselight X to meet increasing demands for HDR finishing
Maxine Gervais, sr. supervising colorist, Technicolor
LONDON -- 

Technicolor has extended its color grading capabilities in Hollywood with FilmLight’s Baselight X, the latest and most powerful implementation of the Baselight color grading and finishing system.
 
HDR color grading services are available at Technicolor’s postproduction facilities globally, but its Hollywood center represents a major film and television production market. The expansion of Baselight grading at this location reflects the local industry's increasing demand for an uncompromised finish that still pushes creative boundaries. Baselight X provides exceptional power and performance for HDR projects, along with color space management.
 
HDR is not the only technology where Technicolor is very active. Over the past few years, the group has made a significant commitment to the growth of several next-generation entertainment formats, such as 120fps stereo 4K, 8K UHD and other custom display formats. The architecture of Baselight X ensures Technicolor can rise to the challenge of working at maximum resolution through every stage of the process, from the original source material to the final deliverables.
 
Technicolor’s sr. supervising colorist, Maxine Gervais, has worked with Baselight for many years. “Of the many things that are important to better serve my clients, one is to work directly from raw camera files,” she explained. “The ability to debayer these files live not only saves time, but it preserves image detail that can be accessed and manipulated during the DI.”
 
The Ultra HD video output – available on all Baselight systems – provides a full 4K 4:4:4 display output at frame rates up to 60p, allowing the user to view 4K work at its native resolution. Baselight X also incorporates a large, ultra high-speed storage system that connects directly to the internal image processing components, addressing Dolby’s requirement to play 4K 4096x3112 16-bit film scans while simultaneously caching the results to disk.
 
“As camera technologies evolve, it’s become common for shows to capture and deliver 6K raw files,” added Gervais. “And some shows are moving towards a 4K VFX workflow too. The additional processing power and storage capacity of Baselight X is essential in allowing me to work with today’s larger files, without sacrificing the real-time playback with complex color grading and compositing that my clients have come to expect.”
 
Technicolor and FilmLight have a long history of collaboration, with Technicolor facilities in Montreal, New York, Los Angeles and London all offering Baselight as part of their DI pipeline. This gives Technicolor’s colorists and color scientists ultimate real-time control over high-resolution, high-bit depth HDR grading and finishing.

  • Sunday, May 14, 2017
Conversica CEO discusses future of artificial intelligence 
This photo provided by Conversica shows Alex Terry, CEO of Conversica. (Conversica via AP)
NEW YORK (AP) -- 

Artificial intelligence is all around us, whether it's to recommend movies you might like or weed out unsavory videos. Smaller companies such as Conversica are joining the likes of Google and Facebook in pursuing AI.

Conversica sells digital assistants to businesses ranging from car dealerships to real estate companies. They work just as an entry-level sales or marketing person would; customers usually don't know they're interacting with software, or a bot, when responding to a sales pitch or seeking help.

Conversica CEO Alex Terry spoke recently with The Associated Press about the future of AI and its impact on jobs. Questions and responses have been edited for clarity and length.

Q: How long did it take to get to a point where people couldn't tell they are conversing with a bot?

A: We have been doing this for seven years, and it's a function not only of time but also the amount of training data. At this point we've had over 215 million messages go through our AI platform, and those messages help us train the system to respond like a person.

Q: How does natural language processing work?

A: Software reads messages that are coming in and understands what the person is saying. Then we figure out what we should do for that particular customer. If you think about Siri or Alexa, those are examples where the computer is listening to a spoken sentence or paragraph and figures out what someone means. Like if I say "what's the weather out today?" you have to understand what's the weather and then some kind of location.
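Terry's weather example maps onto what NLP practitioners call intent and slot extraction. A production system learns this from millions of messages, but a toy keyword-based sketch shows the shape of the output; the keyword lists and function here are invented for illustration:

```python
# Toy illustration of intent + slot extraction -- nothing like a trained
# NLP model. Match a keyword to pick the intent, then scan for a known
# location to fill the slot. All vocabularies below are made up.

INTENT_KEYWORDS = {"weather": "get_weather", "price": "get_price"}
KNOWN_LOCATIONS = {"boston", "chicago", "denver"}

def parse(utterance):
    """Return a dict with the detected intent and location slot (or None)."""
    words = utterance.lower().strip("?!. ").split()
    intent = next((INTENT_KEYWORDS[w] for w in words if w in INTENT_KEYWORDS), None)
    location = next((w for w in words if w in KNOWN_LOCATIONS), None)
    return {"intent": intent, "location": location}

parse("What's the weather out today in Boston?")
# -> {"intent": "get_weather", "location": "boston"}
```

A real system replaces both lookups with statistical models, which is where the training data Terry mentions comes in.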

Q: What happens to the people who would have handled these responses?

A: Typically we see our customers hire more people, not fewer. It's about a 6 to 1 ratio of customers that hire more staff vs. those who use the efficiencies from AI to reduce the size of their team.

Q: What are the biggest challenges right now?

A: On the technical side, it's making sure these experiences are really seamless and genuinely helpful. Our systems are getting smarter all the time. But that isn't our biggest challenge. Our biggest challenge is getting the word out there. People tend to be hesitant to try something that sounds almost like science fiction.

Q: Do you think stuff like Siri and Alexa are helping with this?

A: I think it really is helping. For example, Facebook trying to find fake news, that's a great example of using really powerful technology, pattern matching. I think people using Alexa or Siri or even recognizing that Netflix is using pattern matching to recommend a new movie you might like, it actually makes your life easier and better. People are becoming less fearful of the technology as they see actual benefits in their day-to-day lives.

  • Friday, May 12, 2017
Fusion Studio deployed for Dawn of War III game cinematic and cutscenes
Fusion Studio in action
FREMONT, Calif. -- 

Blackmagic Design announced that Axis Animation used Fusion Studio for its work on the game cinematic and cutscenes for Warhammer® 40,000®: Dawn of War III®, the latest real-time strategy game from Relic Entertainment and SEGA Europe.
 
Axis Animation has previously worked on AAA SEGA titles such as “Aliens: Colonial Marines” and “Alien: Isolation.” For “Dawn of War III” however, the team adopted a brand new approach to its VFX pipeline, allowing the studio to streamline its 3D characters and environments into a 2.5D compositing setup in Fusion Studio.
 
“It was the work of Polish surrealist painter Zdzisław Beksiński that was the essential ingredient in our pitch to Relic. We worked with the team at Relic to strike a careful balance between our intended aesthetic, with Beksiński’s ethereal imagery, and the established WH40K world created by IP holder Games Workshop®,” explained John Barclay, lead LRC artist on the project.
 
Axis produced the award-winning trailer for “Dawn of War III,” which received great responses from fans and critics alike. With the success of the trailer, developer Relic Entertainment wanted to carry the same art direction into the cutscenes.
 
To do this, the Axis team assembled their scenes within the standard studio pipeline but automated the process of converting the 3D scene into the multitude of image planes required for their 2.5D comps. “This helped us enormously, especially for the cutscenes, which featured limited camera movement,” Barclay shared. “It was incredible that we had the flexibility to use effects like volumetric fog and insert them into the same scene as characters and environments built from 2.5D cards. It also meant our lighting artists could work on the scene as though it were any other 3D set.”
 
Brought into Fusion using a custom Python and Lua script to ensure all the data was positioned correctly, the cards meant that every artist at Axis could automatically render their characters from Houdini and import them as projections into Fusion. An Alembic export was then used to bring in the camera positioning from Maya.
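Axis's actual scripts were not published, but the general idea, flattening a 3D scene into positioned image planes a compositor can layer, can be sketched in plain Python. Everything here is hypothetical: the JSON card format, the function name, and the file paths are invented for illustration, not Fusion's or Axis's real interchange:

```python
import json

# Hypothetical sketch of a 3D-to-2.5D handoff: each renderable element
# becomes a "card" (a rendered image plane plus its world-space position)
# that a compositor such as Fusion could load as a projection. The data
# layout is invented; Axis's pipeline used custom Python/Lua scripts and
# an Alembic export for the camera, with details not made public.

def scene_to_cards(elements):
    """Flatten scene elements into depth-sorted 2.5D card records."""
    cards = [
        {"image": e["render_path"], "position": e["position"]}
        for e in elements
    ]
    # Sort by z ascending so the compositor stacks cards back to front
    # (camera assumed looking down -z: more negative z = farther away).
    cards.sort(key=lambda c: c["position"][2])
    return cards

elements = [
    {"render_path": "char_A.exr", "position": [0.0, 1.0, -4.0]},
    {"render_path": "ruins_bg.exr", "position": [2.0, 0.0, -40.0]},
]
print(json.dumps(scene_to_cards(elements), indent=2))
```

The payoff Barclay describes follows from this structure: each card is a cheap 2D render, yet because every card keeps a 3D position, lighting and volumetric effects can still be placed in the same scene.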
 
“Using image planes in Fusion gave us significantly reduced render times and also greater flexibility, with more creative iteration over the whole show. The workflow allowed artists familiar with our standard pipeline to create something a little more unusual,” Barclay concluded. “Fusion is key to helping us solve both creative and logistical challenges. It was an essential part of our toolbox on Dawn of War III.”

  • Monday, May 8, 2017
Augustus Color taps WCPMedia Services to help "Bent" through post
A scene from "Bent"
ROME -- 

“Bent” is the latest motion picture production to move its media management to the WCPMedia cloud. Augustus Color, a Rome-based postproduction lab, is using WCPMedia’s platform to deliver dailies and other assets from production locations in Italy to production partners around the globe.

Full dailies sets are delivered to the film’s editorial team following each production day. Select raw assets are sent to visual effects providers. Producers and other stakeholders, meanwhile, are able to review dailies media via password-secure WCPMedia Virtual Screening Rooms. The result is significant time and cost savings, and improved security over the delivery of physical assets.

“WCPMedia streamlines the production workflow by reducing transfer times compared with standard courier delivery—and does so with top level security,” explained Alessandro Pelliccia, general manager at Augustus Color and technical supervisor for this project. “With the same unique upload, we can feed the editorial team and the Virtual Screening Room simultaneously. It’s a beautiful collaboration tool that allows editors and producers to review content and work together as a team from any location.”

Augustus Color plans to use WCPMedia throughout the postproduction process to store and share assets, including rough cuts and high-resolution media from visual effects and color grading sessions. “The ability to immediately access media assets from anywhere in the world is a huge benefit for geographically-distributed productions like ‘Bent,’” observed Pelliccia. “With WCPMedia all production content is in one, secure place, immediately accessible to authorized parties to view, work with and comment on. It’s the perfect tool for modern film production.”

WCPMedia is a cloud-based, end-to-end platform for storing, managing, sharing, transcoding, viewing and distributing media files quickly and securely. It provides dedicated workflows and functionalities to support the full media life cycle from production and postproduction to marketing, promotion and distribution.

WCPMedia allows producers to closely monitor on-set activities, and provides editors with faster access to critical media. The platform employs market-preferred file transfer tools, including Aspera and Signiant, and sophisticated transcoding technology to automatically transcode and deliver media optimized for individual end users. Both high- and low-resolution media can be managed within the platform through a single, easy-to-use interface.

Additionally, WCPMedia provides unprecedented security and control over assets. Production media is stored in a Swiss data center with Tier IV certification, the same level of security employed by international financial institutions. Access to virtual screening rooms is controlled and tracked directly by the production team with enterprise-level tools for user authorization and authentication.

WCP takes cloud-based digital asset management to the next level. “It provides levels of quality and service that go far beyond other solutions,” noted WCPMedia Services CEO Cristina Molinari. “WCPMedia is flexible and extensible. It can be tailored to the unique needs of the individual client and user. As needs change, it grows and adapts.”

  • Monday, May 8, 2017
Fullscreen Media deploys Facilis TerraBlock servers for remote collaboration
Fullscreen Media studio
HUDSON, Mass. -- 

Facilis, an international supplier of cost-effective, high performance shared storage solutions for collaborative media production networks, announced that youth-oriented digital media company, Fullscreen Media, is using TerraBlock shared storage systems to improve collaboration and workflows between its remotely located teams.

Fullscreen Media is a global network of content creators and brands focused on creating engaging entertainment across social media channels and its own subscription service. The company creates and supports content across a wide range of platforms and clients from offices in Los Angeles, New York City, Atlanta, and Chicago.

With a significant amount of content being created in-house for its B2B branded teams and its own online consumer platforms, the company also operates as a full-service facility offering shooting, production, editing, and management. The Fullscreen team also manages third-party productions that create content out-of-house.

As Fullscreen’s postproduction department grew from two editors in 2014 to more than 30, Fullscreen invested in the Facilis TerraBlock storage system to provide its first shared, collaborative editing workflow. “After we installed the Facilis system, we weren’t shuttling hard drives back and forth anymore, and we eliminated duplication for the most part,” said Adam Ford, head of postproduction at Fullscreen Media. “Editors were able to work together and share volumes in a way they hadn’t before.”

Fullscreen purchased the original 96TB TerraBlock 24D from Cutting Edge Audio and Video Group, a media systems integrator with offices in Los Angeles and San Francisco. Two years later, an additional 128TB TerraBlock 24EX server was added at its studio location over a mile away, connected to the main office by 10gig point-to-point fiber administered by Sohonet. The Cutting Edge team has since added 384TB of expansion storage to the environment, and the high bandwidth of this connection enables the editorial teams at either location to access and edit directly to and from either system.

“Cutting Edge has made it very easy for us to build out our infrastructure,” said Ford. “Typically, I‘ll propose an idea as something ‘nice to have,’ and Zeke Margolis, our sales contact at Cutting Edge, comes back with a well-thought-out solution.”

One benefit of Fullscreen’s setup is that it gives non-editors from many different teams ‘self-serve access’ to the TerraBlock. “Our marketing or social media team can access media to screen content and cut clips up for Instagram and other social media videos,” said Ford. “The content ID team will make sure finished assets are logged with Facebook and YouTube. We’ll have 25 seats connected typically and flex up to 40 when it gets really busy.”

For administration, Ford and the lead system editor manage the Facilis servers themselves.  “We’re a small team, but administration is easy with the Facilis tools, and for the most part we’re able to take care of ourselves without bothering our IT team.”

Purchasing a second Facilis system was an easy decision, since the company needed to keep things running smoothly for a team that can’t afford downtime. “We’ve had essentially zero technical issues with our original TerraBlock,” said Ford. “So, purchasing another Facilis system from Cutting Edge made sense to us.”

The new Facilis system was installed at the Playa Vista studio, which houses eight sets, 40 people, and a smaller postproduction facility with around 10 seats connected to the storage network. The studio could already access the first Facilis system as if it were local, thanks to the 10gig connection between sites. The new system enables the studio team to work collaboratively on material it shoots and edits while eliminating the need to move large files between facilities, though the 10gig backbone could handle such transfers.

“Cutting Edge is focused on delivering technology solutions that not only meet client requirements today, but prepare them to address their future needs as well,” said Zeke Margolis, Cutting Edge Los Angeles sales manager. “We’re excited to be supporting Fullscreen’s continued growth by expanding their Facilis TerraBlock system in multiple locations.” 

Fullscreen creates and post-produces a large variety of content shot in various locations for both short and longer form programs. Flexibility with camera formats is important, so the team accepts footage from Alexa, Red, Canon, Sony, GoPro and other cameras. Much of the shorter form content stays at its native resolution in Premiere Pro, so it’s very important that the storage can deliver the bandwidth when required. “The flexibility and the bandwidth of the TerraBlock is extremely important to us,” said Ford. “We don’t have time to stop and think if the storage can handle a certain format or not.”

Another goal at Fullscreen, and a compelling reason for installing the second Facilis system at a different location, is for disaster recovery (DR) in the event of an earthquake or similar event. Cutting Edge is currently helping the Fullscreen team set up a mirrored data configuration between the facilities that automatically backs up mission critical media and metadata. 

“We don’t want to be forced to complete a project in a certain way, so it is important to have total flexibility,” said Ford. “Our Facilis systems let us present a blank canvas to post supervisors, allowing them to organize projects any way they want while giving us peace of mind.”