
Toolbox

  • Tuesday, Aug. 22, 2017
SMPTE unveils awards recipients for 2017
SMPTE 2017 Progress Medal recipient Paul E. Debevec
LOS ANGELES & WHITE PLAINS, NY -- 

SMPTE® has revealed the outstanding individuals who will be recognized with 2017 awards as part of the SMPTE 2017 Annual Technical Conference & Exhibition (SMPTE 2017) in Hollywood, California.

This year, the Annual SMPTE Awards Gala will take place on Thursday, Oct. 26, and will feature a red carpet, reception, and dinner in the Hollywood Ballroom of the Loews Hollywood Hotel. In addition, Fellows elevations will be conferred at the SMPTE 2017 Fellows Luncheon on Tuesday, Oct. 24.

Honorary Membership is the Society's highest accolade. It recognizes individuals who have performed distinguished service in the advancement of engineering in motion pictures, television, or in the allied arts and sciences. Honorary Members who have passed away are named to the SMPTE Honor Roll, which also posthumously recognizes individuals who were not awarded Honorary Membership during their lifetimes but whose contributions would have been sufficient to warrant such an honor.

Renville "Ren" H. McMann Jr. (1927 – 2015) will be inducted into the Honor Roll in recognition of his award-winning leadership in the development of television and imaging technology. McMann held more than 36 patents for inventions that include the electronic video recorder, the electronic image enhancer, the color camera system, and the magnetic scan conversion techniques used by NASA to bring color television images from the moon to viewers around the world. He was the principal inventor for and a major participant in projects such as the development of the CBS Minicam Mark VI, the first handheld color TV camera. A tireless and curious engineer, McMann made many contributions to the advancement of color television signal processing and image gathering technology. Along with those contributions, his pioneering work in the field of high-definition television systems has garnered worldwide recognition.

The Progress Medal is the most prestigious SMPTE award, and it recognizes outstanding technical contributions to the progress of engineering phases of the motion picture, television, or motion-imaging industries.

SMPTE is presenting the 2017 Progress Medal to Paul E. Debevec in recognition of his achievements and ongoing work in pioneering techniques for illuminating computer-generated objects based on measurement of real-world illumination and their effective commercial application in numerous Hollywood films. Techniques from his research have been used to dramatic effect in films such as the "The Matrix" sequels, "The Curious Case of Benjamin Button," "District 9," "Avatar," "Rogue One: A Star Wars Story," and "Life of Pi." Debevec is also a pioneer in high-dynamic-range (HDR) imaging and co-author of the 2005 book "High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting," now in its second edition.

The Camera Origination and Imaging Medal recognizes significant technical achievements related to inventions or advances in imaging technology, including sensors, imaging processing electronics, and the overall embodiment and application of image capture devices. David S. Corley will receive the award for his five decades of continuous innovation in measurement and calibration tools for image acquisition, display, and color correction.

The David Sarnoff Medal recognizes outstanding contributions to the development of new techniques or equipment that have improved the engineering phases of television technology, including large-venue presentations. The award will be presented to Phillip Bennett in recognition of his significant contributions to the broadcast industry with his work in video effects, still stores, and digital standards conversion during the dawning of the digital video era. Over the years, Bennett has developed many groundbreaking products for broadcasters, including the very successful Ampex Digital Optics (ADO) digital video effects system and one of the very early digital disk recorders.

The Digital Processing Medal recognizes significant technical achievements related to the development of digital processing of content for cinema, television, games, or other related media. Michael A. Isnardi will receive the award for his contributions to the art of digital video delivery systems, including video encoding, re-encoding, and quality evaluation. Isnardi's body of work includes one of the first advanced television system proposals, the first real-time Moving Picture Experts Group (MPEG) encoder, compressed-domain watermarking, Emmy® Award-winning MPEG compliance bitstreams, compressed-domain bit rate reduction, salience-based compression, and JND evaluation of JPEG 2000 for digital cinema applications. His current work includes sub-Nyquist compressed sensing and skin-tone analysis algorithms.

The James A. Lindner Archival Technology Medal, sponsored by James A. Lindner, recognizes significant technical advancements or contributions related to the invention or development of technology, techniques, workflows, or infrastructure for the long-term storage, archive, or preservation of media content essence. The 2017 award will be presented to James M. Reilly for his more than three decades of contributions to image preservation and sustainable preservation practices. In 1985, Reilly founded the Image Permanence Institute, a non-profit, university-based laboratory devoted to preservation research — the world's largest independent laboratory with this specific scope. As its founder and director, Reilly studied the mechanisms of film deterioration and developed technology, techniques, and preservation strategies to lengthen the life of film in storage.

The Samuel L. Warner Memorial Medal, sponsored by Warner Bros., recognizes outstanding contributions in the design and development of new and improved methods and/or apparatus for motion picture sound, at any step in the process. The award will be presented to Mark Robert Gander in recognition of his contributions to the design and development of cinema loudspeaker systems. Gander has brought a comprehensive perspective to these efforts and has been responsible for every aspect of bringing a new loudspeaker design to market, from transducer engineering through logistics of manufacture and distribution to the signature marketing of the JBL Professional cinema product line. In his four decades devoted to the highest fidelity cinema sound reproduction, Gander has influenced cinema loudspeaker design industrywide.

The Technicolor — Herbert T. Kalmus Medal, sponsored by Technicolor, Inc., recognizes outstanding contributions that reflect a commitment to the highest standards of quality and innovation in motion picture postproduction and distribution services. The award will be presented to Joseph Goldstone for his innovations in the design and implementation of hardware and software to perform the accurate analysis and characterization of photochemical film processes, including film printing, which have been used in color management systems by the motion picture industry. Goldstone's early work involved the creation and refinement of film scanning and recording processes used for visual effects (VFX) creation at Digital Domain and Industrial Light & Magic (ILM). He was a pioneer in incorporating color science theory into digital production and postproduction workflows, and he is currently working on digital image processing for the ALEXA camera systems at ARRI. Goldstone is a key contributor to the Academy of Motion Picture Arts and Sciences (AMPAS) Academy Color Encoding System (ACES) and serves on several SMPTE Technology Committees (TCs): TC-10E DG Dynamic Metadata for Color Transforms of HDR and WCG Images, TC-32NF-40 DG HDR and WCG Signaling on Streaming Interfaces, and TC-31FS DG Constrained DPX for HDR.

The Workflow Systems Medal, sponsored by Leon Silverman, recognizes outstanding contributions related to the development and integration of workflows, such as integrated processes, end-to-end systems or industry ecosystem innovations that enhance creativity, collaboration, and efficiency, or novel approaches to the production, postproduction, or distribution process. The award will be presented to Randy Ubillos in recognition of his role in establishing the foundation of accessible and affordable digital nonlinear editing software that fundamentally shaped the industry landscape and changed the way visual stories are created and told. Ubillos' revolutionary work creating and designing lower-cost editing software such as Final Cut Pro® and Adobe® Premiere® shifted the film and television industry toward a more inclusive future, giving storytellers of diverse backgrounds and experience levels the ability to tell their stories and rise as filmmakers, technicians, engineers, and key players in every facet of media and entertainment. His work significantly enhanced and transformed the world of postproduction, popularizing and commoditizing file-based workflows while removing significant barriers to the creative editing process for millions of users worldwide.

Each year, one SMPTE Journal Award is presented to the author of the most outstanding paper originally published in the SMPTE Motion Imaging Journal during the preceding calendar year. The SMPTE Journal Award will be presented to Sean T. McCarthy for the article "How Independent Are HDR, WCG, and HFR in Human Visual Perception and the Creative Process?" published in the May/June 2016 issue of the SMPTE Motion Imaging Journal.

Two Journal Certificates of Merit will be presented to:

  • Katy C. Noland for the article "High Frame Rate Television: Sampling Theory, the Human Visual System, and Why the Nyquist–Shannon Theorem Does Not Apply," published in the April 2016 issue of the SMPTE Motion Imaging Journal.
  • David Long and Mark D. Fairchild for the article "Observer Metamerism Models and Multiprimary Display Systems," published in the April 2016 issue of the SMPTE Motion Imaging Journal.

The Student Paper Award recognizes the outstanding paper prepared and submitted by a Student Member. The paper receiving the Student Paper Award will be published in the SMPTE Motion Imaging Journal.

The 2017 award will be presented to Elizabeth DoVale, a recent graduate of the Rochester Institute of Technology in Rochester, New York, for her paper "High Frame Rate Psychophysics: Experimentation to Determine a JND for Frame Rate."

Jonathan Bouchard, a student at McGill University in Montréal, Québec, Canada, will receive an honorable mention for his paper "Quality Control of Stereoscopic 3-D Compositing Using Half-Occlusion Geometry."

The Presidential Proclamation recognizes individuals of established and outstanding status and reputation in the motion picture, television, and motion-imaging industries worldwide. Mark Schubin will receive the award in recognition of his five decades of contributions to the television technology industry. An internationally recognized expert with an insatiable intellectual curiosity, Schubin has worked in every aspect of television production, including design, manufacturing, lighting, sound, camera, editing, and distribution, as well as talent, and his projects have spanned every continent. Today, he supports the broadcasting of Metropolitan Opera (The Met) productions to cinemas and televisions around the world. Schubin is an active SMPTE Life Fellow and a sought-after resource in educating the industry on the history and current state of motion-imaging technology.

The Excellence in Standards Award recognizes individuals or companies that have been actively involved in advancing the Society's standards activities and processes. Johann Safar will receive this award in recognition of his continuous participation in SMPTE's standards work for more than 30 years. Safar has contributed to the development of countless standards related to the compression and formatting of multimedia content for storage on analog and digital media, as well as the development of time code and ancillary data formatting and mapping. He is a careful reviewer of SMPTE standards, with a focus on ensuring harmonization and compatibility of interrelated standards across multiple technology committees. Safar's dedicated performance in the SMPTE Standards Community has resulted in a high quality of professional standards documents.

The Society Citation recognizes individuals or companies that have actively been involved in specific Society engineering or editorial functions. Elizabeth "Betty" Migliore will receive this award in recognition of her 45 years of service with SMPTE and her contributions to the standards program. A dedicated and loyal staff member, Migliore joined SMPTE in November of 1972 as secretary to the staff director of engineering. Over the years, she has been steadfast in supporting SMPTE's standards work as engineering assistant and standards publisher, roles in which she actively supported document preparation for the SMPTE Technology Committees. In these roles, Migliore participated in advancing SMPTE's standards publishing process from typewriter to CD-ROM to PDF and the digital library.

The Citation for Outstanding Service to the Society, which recognizes individuals for dedicated service for the betterment of the Society over a sustained period, will be conferred upon four SMPTE Members:

Merrick Ackermans, for his contributions to and leadership of the Atlanta Section and the Southern Region. A long-time contributor to the Atlanta Section and four-term governor of the Southern Region, Ackermans has devoted much time and effort to producing quality Section events. His extensive involvement has included participation in everything from proposing engaging topics, securing speakers, and organizing facility and audio-visual logistics to contributing his knowledge as a speaker and facilitator.

Herbert Jay Dunmore, for his contributions as Member and manager of the Washington DC SMPTE Section and as the Student Chapter advisor of the Loyola University Maryland Chapter since its founding in 2012. Dunmore has continually promoted learning between the Student Chapter and the Section through hosting an annual meeting and seminars, exposing future professionals to the creative, business, and technical realms of television production.

John Walsh, for his dedication and service to the SMPTE Australia Section over the past decade. Walsh has served on the board of the SMPTE Australia Section since 2005, taking on the additional roles of membership chair and Section meeting lead. He has also been an active member of the organizing committee for the biennial SMPTE Australia Section conferences. Walsh is a dedicated Section member, always working behind the scenes to ensure the success of the Australia Section meetings.

David Wheeler, for his many contributions to the Australia Section, especially the SMPTE Australia Conferences. Wheeler was a member of the Conference Papers Committee for the SMPTE 2013 Australia Conference and served as chair of that committee for the 2015 event with 16 sessions and 52 presentations spanning four days.

The Louis F. Wolf Jr. Memorial Scholarship is designed to assist students in furthering their undergraduate or graduate studies in motion pictures and television, with an emphasis on technology. The 2017 scholarship will be awarded to three SMPTE Student Members:

  • Trevor Canham, Rochester Institute of Technology
  • Emily Faw, Rochester Institute of Technology
  • Catherine Marie Meininger, Rochester Institute of Technology

Twelve new SMPTE Fellows also will be recognized during the Annual Awards Gala. The 2017 SMPTE Fellows announcement is forthcoming.

  • Monday, Aug. 21, 2017
Sony FS5 deployed for an inspiring peak performance in "True North"
"True North"

The Workshop’s True North is an inspiring documentary about Sean Swarner, who in 2002 became the first cancer survivor to reach the top of Mount Everest. He has kept climbing ever since, spreading his encouraging message of hope. In a span of five years, the two-time cancer survivor with only one functioning lung scaled the Seven Summits, the highest mountain on each of the seven continents.
 
Swarner’s journey started at the age of 13, when he was diagnosed with Stage 4 Hodgkin’s lymphoma and given two months to live. He beat the odds and went into remission 10 months later, only to be diagnosed with Askin’s sarcoma two years later and told he had two weeks to live. Once again, Swarner overcame adversity, defied expectations, and continued sharing his infectious spirit.
 
But defying the odds and topping the Seven Summits wasn’t enough to satisfy Swarner. Two years ago, he set out to accomplish another goal by conquering the North and South Poles to complete the Explorer’s Grand Slam. In 2015, Swarner checked the South Pole off his list, and in April 2017, after reaching the North Pole bearing a flag with the names of over 2,000 people affected by cancer, he became an official member of this exclusive club.
 
True North’s Tom Caamano, director/producer, and Igor Kropotov, DP, chose Sony’s FS5 as their primary camera to document this unimaginable achievement.  During the nine-day journey, which saw temperatures of -40 degrees Celsius, the two asserted “the Sony FS5 was definitely the right camera for this shoot.”  In addition, the team summiting the North Pole, which included Kropotov and second camera operator Corbin Johnson, used two Sony α7S IIs for stills and support. They paired all the cameras with Sony’s complementary 18-105mm zoom lens for longer telephoto shots and brought an 11-18mm lens for an extremely wide angle.
 
Kropotov, who had previously used Sony’s cameras including the FS5, is no stranger to shooting in remote and cold climates including Nepal and Siberia.  It is no wonder he “jumped at the once in a lifetime opportunity to go to the North Pole and be a part of an amazing documentary centered on Sean, an inspirational and strong-willed person.” 
 
The crew kept their equipment list limited, bringing only Sony’s FS5, two α7S IIs, 10 additional batteries, two lenses and more than a dozen 128GB Sony SD cards for this journey. “We had to be very conscious of our camera’s size and weight, since it had to easily fit in our sled and we had weight restrictions when flying in the helicopter,” Kropotov said. “Alternatively, I’d strap it around my neck and it wouldn’t weigh me down. We appreciated that the camera doesn’t require a lot of accessories to make it operational. The FS5’s small and compact size, coupled with the quality of the image, made it the best tool for the job.”
 
The team opted to shoot HD to ensure they had enough media and batteries to stay up and running for nine days, and to stay “self-sufficient without the need to carry hard drives or laptops, which allowed us to rely solely on SD cards.”
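A quick back-of-the-envelope calculation shows why HD on SD cards was enough for a self-sufficient shoot; the bit rate below is an assumed figure for illustration, not one quoted by the production:

```python
# Rough media budget: how long a dozen 128 GB SD cards last at an
# assumed HD recording bit rate (the 50 Mb/s figure is a guess,
# not a number from the "True North" shoot).

CARDS = 12
CARD_BYTES = 128e9        # 128 GB per card
BITRATE_BPS = 50e6        # assumed HD codec rate, ~50 Mb/s

total_bits = CARDS * CARD_BYTES * 8
hours = total_bits / BITRATE_BPS / 3600
print(f"~{hours:.0f} hours of recording capacity")  # about 68 hours
```

Even at a generous HD bit rate, the card stock comfortably covers nine days of selective shooting, which is the trade-off the team describes.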
 
Kropotov praised the FS5’s slow motion capabilities, saying it added a “meditative element” to the documentary.  He described how he used the camera to capture a classic shot of a mug of hot water being tossed into the air and immediately turning into ice droplets.  He felt slow motion helped to illustrate the “emotional weight of traveling to the North Pole.”
 
Kropotov also appreciated the camera’s distinctive neutral density filters. “Because there’s 24-hour sunlight at the North Pole and the reflective white snow is so bright,” he said, “the ISO was generally pretty low, and we used the ND filters to give us that depth of field, and it worked extremely well.”
 
As with any shoot in a punishing environment, the crew was up against a lot of unknowns. One challenge the FS5 met was producing a natural and realistic image despite the flat terrain, the persistent landscape of blinding white snow, and the 24-hour sunlight.
 
Kropotov did experience some issues due to the intense temperatures and likened carrying a camera in the extreme cold to storing it in a freezer and shooting with it immediately upon removal – only way colder. Other concerns included a lag in the image and the possibility that the LCD monitor’s liquid crystals would freeze in the bitter cold.
 
He also quickly learned how to keep the extra batteries warm and fully functional by storing them in his sleeping bag overnight and chest pocket during the day, using his body heat to keep them operational; “otherwise they’d wind up being ice cubes by the end of the day,” he said. The crew also had to contend with condensation, which froze immediately to their faces, eyelashes and even the camera, causing the buttons and lenses to fog if not handled properly. 
 
Kropotov also learned the importance of preparing the camera to his desired settings in advance of leaving the tent, since “there’s no tinkering or changing settings possible, and if something falls off the camera, you have to wait until the end of the day when you’re inside the frigid tent to fix it.  Everything has to be on point in terms of where it’s located and stationed and how the settings and switches work with ISO and light balance.  Because of the constant sunlight and temperature, we didn’t want or need to significantly manipulate these settings over the course of the day.”
 
Despite some minor cracking and freezing, Kropotov asserted, “The FS5 held up really well and if I were to do it all over again, I would most certainly take the FS5 with me again – it was great.”
 
Between Sony’s 18-105mm lens and the 11-18mm lens the team brought, they were able to capture a variety of scenarios and vantage points using just two lenses.  Kropotov explained, “Sometimes it’s not ideal to switch lenses in the field, so our choices had to be varied and flexible. There’s a lot of talk about the condensation settling on the glass and the sensor, but that never seemed to be an issue, which contributed to our decision not to switch out lenses very frequently in the field.”
 
He continued, “The idea was to use the 11-18 inside the tent to get wider shots and also to fully capture the landscape without veering too far off our path.  We were also able to get wide shots, to show a sense of scale of the Arctic Ocean, and shooting in the helicopter gave us a wide field of view.  One of the film’s concepts is to bring these survivors and people affected by cancer who are named on the flag on this journey with us.  The wide angle shots conceptually capture the constant feeling of movement and allow viewers to follow Sean and have a visual sense of what carries him through this journey.” 
 
Caamano described the team’s intentional form of storytelling. “The distance between Sean and the sled he pulls is about four to six feet,” he said. “We wanted the viewers to experience that distance throughout the entire documentary and feel like they’re in the sled with the team’s supplies and Sean’s symbolic flag. When Sean’s going to get his check-up or visiting people at hospitals before his trip, we opted to have the camera the same distance behind him and use those same prime lenses to keep the vantage point consistent. Then, when we made the transition to the North Pole it was a seamless extension of his day-to-day journey. We like to say that six months beforehand, when Sean got the first signature on his flag, that’s when the trip to the North Pole really began, so we wanted to keep that uniformity in our point of view.”
 
Kropotov spoke about the benefits of using all Sony cameras saying, “We wanted to keep everything in the Sony family because it allowed us to use the same system.  Lenses and other accessories were interchangeable and our videos and stills looked great. If I was shooting with the FS5, I would have an α7S II underneath my layers and ready to swiftly capture a lead.  The Alpha cameras were a key element of our shoot and the ability to quickly draw it out and produce a quality image was great.  Another reason we chose it to be our secondary camera is because it is an integrated system with no cables and less buttons, which means less opportunity for failure.”
 
Caamano described other elements of the documentary, many of which were shot using Sony’s F5 camera and matched perfectly with the FS5 footage.  “Prior to his excursions, Sean met with cancer patients and people who are at the darkest points in their lives,” he said.  “When he meets with them he tells his story of overcoming hardship, and when he leaves, these people are at a different point.  They’re feeling more positive.  They’re feeling more inspired.  And we’re sitting back as a fly on the wall and documenting this amazing story to encourage and motivate others. It’s a privilege for us as filmmakers to witness something so genuine and inspiring and see how many people Sean is affecting.  And we get to put faces to those names on the flag and follow up and see how they’re doing.”
 
In the end, the journey was an overwhelming success, and Swarner is now one of the few on the elite list of people who have accomplished the Explorer’s Grand Slam, and he has the video diaries to prove it. Caamano joked, “I think Sean has run out of places to go, so the next journey may have to be into space!”
 
True North is now in postproduction and will air nationally on American Public Television in the fall. 

  • Monday, Aug. 21, 2017
ASC Technology Committee upgraded to Motion Imaging Technology Council
Curtis Clark, ASC accepts the Sci-Tech Award for the ASC CDL in February 2013, along with (l-r) Joshua Pines, David Reisner, David Register, and Lou Levinson (not pictured). (Credit: Aaron Poole/©A.M.P.A.S.)
LOS ANGELES -- 

The ASC Technology Committee has been renamed the ASC Motion Imaging Technology Council. Established in 2003, the Committee has helped organize efforts to study and assess subjects ranging from digital cameras and lens optics to motion imaging workflows, advanced color management, virtual production techniques, digital archiving, and, more recently, virtual reality.

“During our past 14 years of proactive motion picture and TV industry engagement, the ASC Technology Committee has played a significant leadership role in guiding the evolution and development of key motion imaging technologies to better support our filmmaking art form,” noted Chairman Curtis Clark, ASC.

“Many of our industry partners and supporters, along with users of our technologies, have suggested that the Committee’s name does not sufficiently convey the scope and influence that our activities have had on important motion imaging technology developments,” he continued. “In response to that input and after careful consideration, we have decided to change the Committee’s name to the ASC Motion Imaging Technology Council (MITC) — or ‘My Tech.’ We believe this better represents the expanded scope of the work we are doing and our widely recognized role as industry leaders — influencing the advancement of motion imaging technologies in ways that best serve the creative interests of filmmakers while emphasizing the cinematographer’s contribution to the art form.”

Clark added, “Our Subcommittees will now be designated Committees of the ASC Motion Imaging Technology Council. We will continue to encourage our Committees to work in a coordinated manner, combining their expertise on topics of wide interest and concern, including ACES, HDR, digital motion picture camera developments, look management, virtual production techniques, lens developments, DI, motion imaging workflows, projection and display technologies, archiving, as well as advanced imaging.”

MITC’s latest reports on a variety of technological issues will be published in the September issue of the SMPTE Motion Imaging Journal as part of the 2017 SMPTE Progress Report.

  • Thursday, Aug. 17, 2017
Playbox Technology demos at IBC to feature its CloudAir and Neo platforms
Playbox Technology's CloudAir platform
LONDON -- 

PlayBox Technology will demonstrate complete broadcast playout solutions leveraging its cloud-based CloudAir and server-based Neo platforms at the upcoming IBC 2017 exhibition in Amsterdam from Sept. 15-19. Hybrid configurations combining the strengths of both platforms will also be shown.

“Broadcasters today are demanding speed and flexibility in the way they set up and manage their services,” said Don Ash, president of PlayBox Technology. “Partnership agreements between PlayBox Technology and an increasing number of communication service providers have made CloudAir more accessible than ever to existing and would-be broadcasters throughout the world. CloudAir eliminates the need for channel managers to wait for new technical hardware to be delivered, installed and commissioned. Available on a fast-startup software-as-a-service basis, CloudAir forms a basis for highly efficient broadcasting via terrestrial, satellite and dedicated cable wherever and whenever these are the channel management’s preferred delivery media. It makes the process of starting a new channel as simple as making a phone call, either directly to their preferred service provider or via the global network of PlayBox Technology support offices.”

“CloudAir also gives content owners the ability to start purely IPTV-based channels at very short notice, accessible to online viewers in any country.  IPTV channels can be operated to a published schedule or as viewer-specific time-buffered video-on-demand,” added CEO Pavlin Rahnev. “Channel managers can control the whole process of branding and playout via a secure link from a desktop or even a laptop computer. They can upload content via the same link ahead of transmission while retaining the freedom all broadcasters appreciate to add late-breaking stories such as news as additions to the playout schedule. Entire channels can be operated this way without managers needing to own, accommodate and maintain dedicated hardware. We will also be demonstrating the ease with which CloudAir can be integrated with our established Neo server-based product series to form a hybrid of onsite and offsite channel management and playout resources. An increasing number of Neo customers are already seeing the advantages CloudAir offers as a remote disaster-recovery solution and as a medium for single-event OTT or full 24/7 fast-startup television channels.” 

Among new CloudAir features making their IBC debut will be a transcoder capable of handling multiple file wrappers and formats including MPEG PS/TS, MXF, QT, AVI, MP4, GXF, MPG2, H.264, ProRes, DNX HD and MJPEG. Also being introduced to European broadcasters are an enhanced graphics editor template preparation interface, improved playlist editing, advanced playlist export to EPGs and automated linking of stored assets.

A new addition to the Neo platform, Neo TS IP Stream Delay, will make its maiden exhibition appearance. Occupying a standalone 1U chassis, Neo TS IP Stream Delay provides fully transparent delay of IP transport streams such as DVB/ATSC MPEG broadcast-quality compressed video and audio for single or multichannel time zone shift and disaster-recovery applications. Designed for fully automated operation, it can be configured with multiple input channels and multiple delayed outputs. Each input also has one zero-delay output. All operating parameters are easily adjusted via an integral web-based user interface, including channel-specific time delay in 15-second increments. Maximum delay duration depends on input bit rate and storage capacity. Additional features include programme information display of MPEG-compliant transport streams plus automatic error logging.
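The relationship described above, maximum delay bounded by storage capacity and input bit rate, with delay configured in 15-second steps, can be sketched as follows. This is an illustrative model, not PlayBox code; the storage size and ATSC bit rate used in the example are assumptions:

```python
# Illustrative model of a transport-stream delay buffer (not PlayBox code).

def max_delay_seconds(storage_bytes: float, bitrate_bps: float) -> float:
    """Longest delay the buffer can sustain before wrapping around."""
    return storage_bytes * 8 / bitrate_bps

def quantize_delay(requested_s: float, step_s: int = 15) -> int:
    """Snap a requested delay down to the nearest 15-second increment."""
    return int(requested_s // step_s) * step_s

# Hypothetical figures: 2 TB of storage and a 19.39 Mb/s ATSC stream.
limit = max_delay_seconds(2e12, 19.39e6)
print(f"maximum delay ≈ {limit / 3600:.0f} hours")  # roughly 229 hours
print(quantize_delay(100))  # a 100 s request becomes 90 s
```

The point of the sketch is simply that halving the input bit rate doubles the achievable delay for the same storage, which is why the product quotes no fixed maximum.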

Over 40 new features for other modules in the Neo series will be introduced at IBC2017. These include the ability to integrate ProductionAirBox Neo closely with the Associated Press ENPS news production system via MOS gateway. Among other additions to the capabilities of PlayBox Neo are extended control features, expanded file handling capabilities, greater input and output connectivity and Microsoft Windows 10 compatibility. These have all been implemented within an informative and intuitive graphic interface which is familiar to operators around the world.

PlayBox Technology Limited is an international communications and information-technology company serving the broadcast and corporate sectors in more than 120 countries. Over 17,000 TV and branding channels are powered by PlayBox Technology Limited broadcast solutions. Users include national and international broadcasters, start-up TV channels, webcasters, DVB (IP/ASI) TV channels, interactive TV and music channels, film channels, remote TV channels and disaster recovery channels.

  • Wednesday, Aug. 16, 2017
Timeline Television to showcase its IP 4K HDR OB truck at IBC
Timeline Television's newest OB truck
NEWBURY, UK -- 

Snell Advanced Media (SAM) announced that Timeline Television’s newest OB truck--the first IP 4K HDR truck in Europe--will be featured on its stand (#9A01) at IBC 2017. The truck, UHD2, seamlessly handles fully uncompressed 4K/UHD, IP and HDR.
 
A state-of-the-art, triple-expanding OB truck, UHD2 is home to a range of SAM technology, including two Kahuna IP production switchers and IP multiviewers, with SAM’s IP infrastructure technology providing the backbone. Also in the truck for IBC, SAM’s LiveTouch 4K/UHD replay and highlights system will be used for demonstrations.
 
Timeline’s UHD2 is designed to support 32 Sony 4K cameras. Its two Kahunas enable SDR and HDR to be run simultaneously, along with down-converted HD outputs. The setup allows production teams to work in VSF TR-03 (the SMPTE ST 2110 draft)--the first time this has been done in an OB truck--enabling Timeline to work with video and audio as separate essence flows within an IP workflow.
 
Daniel McDonnell, managing director at Timeline Television, said, “We worked closely with SAM to design a workflow based on the latest IP infrastructure and HDR technology available, providing customers with a highly scalable solution that can meet complex production requirements without the need to add additional OB support. Given the increased number of 4K cameras and replay positions that we wanted to support, IP made perfect sense and SAM’s technology even more so as it afforded us the maximum flexibility and scalability.”
 
Robert Szabó-Rowe, EVP and general manager, Live Production and Infrastructure, SAM, commented, “We’re really excited to have Timeline’s award-winning UHD2 truck on our stand at IBC, as it’s a tremendous showcase for our technology and testament to our close partnership with Timeline in delivering true market innovation. The truck offers a great opportunity for visitors to IBC to experience how IP is being used today in a real-life scenario.”
 
Timeline Television’s McDonnell will be presenting a detailed case study on UHD2 within the IBC IP Showcase theatre (E106/107).

  • Tuesday, Aug. 15, 2017
Lineup of events, program details unveiled for SMPTE 2017 Annual Technical Conference & Exhibition
SMPTE Education VP Richard Welsh (l) and Pat Griffis, SMPTE EVP, attend the 2016 Annual SMPTE Awards.
WHITE PLAINS, NY -- 

Program details for the SMPTE 2017 Annual Technical Conference & Exhibition (SMPTE 2017), Oct. 24-26 in Hollywood, Calif., have been announced. SMPTE 2017 will fill two exhibit halls and multiple session rooms at the Hollywood & Highland Center, and the event will also feature an Oktoberfest reception, Broadcast Beat’s SMPTE 2017 Live! Studio, and special events culminating with the SMPTE Annual Awards Gala at the Loews Hollywood Hotel’s Hollywood Ballroom on Thursday, Oct. 26.

“We’ve got an incredible lineup of technical sessions scheduled for this year, and we’re rounding out the conference and exhibition with some popular events that were added last year,” said SMPTE Education VP Richard Welsh, CEO of Sundog Media Toolkit. “The timely topics and technologies discussed at SMPTE 2017 are sure to make a splash as the Society dives into its next century of standards development and education.”

SMPTE’s Annual Technical Conference & Exhibition explores media and entertainment technology. The conference and exhibition will follow the daylong SMPTE 2017 Symposium — “Artificial Intelligence (AI) and Machine Learning in Digital Media Creation: The Promise, The Reality, and The (Scary?) Future” — on Oct. 23. The Symposium is co-chaired by SMPTE Fellow Michelle Munson and Yvonne Thomas of Arvato Systems. Further details about the Symposium will soon be available. Events on Oct. 23 also will include the annual Women in Technology Luncheon, presented by SMPTE and Hollywood Professional Association (HPA) Women in Post, and the SMPTE-HPA Student Film Festival, which will highlight the creative use of technology to support the art and craft of storytelling. Tickets for the luncheon and festival are available separately or as add-ons to a SMPTE 2017 conference registration.

The SMPTE 2017 Technical Conference program committee is co-chaired by three SMPTE Fellows: Paul Chapman, senior vice president of technology at FotoKem; Thomas Edwards, vice president of engineering and development at Fox; and SMPTE Education director Sara J. Kudrle, product marketing manager for playout at Imagine Communications. SMPTE 2017 itself will include the usual wealth of technical sessions, along with an array of special events that offer numerous opportunities for face-to-face interaction between attendees, exhibitors, and speakers.

The first day of the technical conference will feature special events, including the Fellows Luncheon, open exclusively to SMPTE Fellows and Life Fellows who have registered for the event, as well as the SMPTE Annual General Membership Meeting and Oktoberfest Reception, both open to all attendees with conference registration. On the second day, the Evening Reception will take place in the Ray Dolby Exhibit Hall. The SMPTE 2017 Annual Awards Gala on the third and final day of the conference will welcome registered guests on the red carpet and treat them to a reception and dinner honoring industry leaders. SMPTE 2017 will conclude with the Awards After-Party featuring the SMPTE Jam, which once again will feature a pickup band comprising a diverse group of SMPTE members playing popular hits — and possibly a few original pieces created for the occasion.

Technical conference sessions throughout all three days of SMPTE 2017 will delve into the industry’s most innovative, intriguing, and important technological advances. The papers presented will address topics including advances in display technologies; cinema processing and projection technology; wider color and dynamic range; compression; content management and storage, restoration, and preservation; content security; virtual, augmented, and mixed reality (VR, AR, and MR); media infrastructure (SMPTE ST 2110) and distribution; image acquisition and processing; new techniques in audio; quality assurance and monitoring; workflow systems management; cloud technologies; and encouraging diversity in technology.

The emerging SMPTE ST 2110 suite of standards for professional media over IP (internet protocol) networks will be a hot topic during SMPTE 2017, and Leigh Whitcomb of Imagine Communications will present a paper titled “Is SMPTE ST 2110 the New Standards Superpower?” as part of the Media Infrastructure session. This and other session presentations will delve into the standard, implementation of IP for media production and distribution, and techniques used to optimize performance.

Among the presentations in the Advances in Display Technology session, “Engineering a Live UHD Program from the International Space Station” will feature Rodney P. Grubbs of NASA’s Marshall Space Flight Center and Sandy George of Science Applications International Corporation (SAIC), who will describe how they overcame engineering challenges involved with broadcasting live content in UHD from the International Space Station, as well as the ways commercial technologies are leveraged for in-orbit use.

Callum Hughes of Amazon Studios will present during the Content Security session, describing an approach to security within a digital asset management (DAM) system. During the Stream Privacy session, Raj Nair of Ericsson will discuss mechanisms for guaranteeing stream privacy for both OTT and live/linear adaptive-bit-rate (ABR) workflows.

The Advances in Immersive Storytelling session will feature “360-Degree Video Streaming and Its Subjective Quality,” a paper presentation by Igor Curcio and Henri Toukomaa of Nokia, and a case study by Éric Minoli of Canada’s Groupe Média TFO and Kuban Altan of Zero Density about bridging the gaming and broadcast industries for high-productivity production. The session focusing on new technologies and techniques will include “How Artificial Intelligence and Machine Learning Will Change Content Creation Methodologies,” by Tom Ohanian of TAO Associates.

Sessions on workflow systems will include “IMF End-to-End Workflows in Media Asset Management Systems,” presented by Julian Fernandez of Tedial, as well as “Applying an Agile Approach to Next-Generation Media Management,” presented by Arvato’s Ben Davenport and Christian Siegert. Moving into cloud-oriented workflows, Avid’s Shailendra Mathur will present “Media Cloud Migration Patterns: Connecting Services Between Bare Metal, Virtual Machines, and Containers.” Richard Cartwright of Streampunk Media will present his paper on “An Internet of Things Architecture for Cloud-Fit Professional Media Workflow.”

Speaking within the session on compression, RealNetworks’ Reza Rassool will present a paper titled “VMAF Reproducibility: Validating a Perceptual Practical Quality Metric for 4K Video.” Subhabrata Bhattacharya and Adithya Prakash of Netflix will look at quality from another perspective, presenting “Towards Scalable Automated Analysis of Digital Video Assets for Content Quality Control Applications” within the Quality and Monitoring of Images and Sound session.

The SMPTE 2017 session on UHD acquisition and processing will feature a presentation by YunHyoung Kim of the Korean Broadcasting System (KBS), whose paper describes the world’s first implementation of the Internet Media Subtitles and Captions 1.0 (IMSC1) closed-captioning system — on which ATSC 3.0 captioning is based — on terrestrial UHD TV. The BBC’s Simon Thompson will present “Access Services for UHDTV: An Initial Investigation of W3C TTML2 Subtitles (Closed Captions).” Also in the UHD session, Pierre Hugues Routhier of Canada’s Creat3 inc. will present “Beyond 4K: Can We Actually Tell Stories in Motion Pictures and TV in 8K? A Cinematography Perspective.”

The Cinema Processing and Projection Technology session will include a presentation by Tim Ryan of Texas Instruments, who will explore techniques for using and optimizing variable-frame-rate display for cinematic presentations. A presentation by Kyunghan Lee of KAI Inc. will describe a new VR-based multiscreen movie theater simulator that enables researchers and multiscreen producers to provide a testing platform for multiscreen content and the viewing environment.

The Emerging Research in Visual Perception session will feature Elizabeth Pieri and Jaclyn Pytlarz of Dolby Laboratories, presenting “Hitting the Mark — A New Color Difference Metric for HDR and WCG Imagery,” and Elizabeth DoVale, also of Dolby Laboratories and a recipient of the 2016 SMPTE Louis F. Wolf Jr. Memorial Scholarship, presenting “Assessing Psychophysics Functions for Framerate Perception.” Martyn Gates of Ravensbourne and Pure & Applied Image Recognition Limited will present “Is Seeing Still Believing: A Critical Review of the Factors That Allow Humans and Machines to Discriminate Between Real and Generated Images,” a paper exploring the implications as photo-realistic CGI moving images increasingly become indistinguishable from actual pictures.

During the Content Management, Value Proposition, and Archiving session, Oracle Digital Media Solutions’ Brian Campanotti will present “SMPTE and ISO: Standards to Protect the World’s Most Valuable Assets,” a paper that delves into the inception, development, advancement, and deployment of the Archive eXchange Format (AXF). In the Next Generation TV session, a paper presentation by Alex Giladi of Comcast will discuss adaptive streaming of content that is produced using capped variable-bit-rate encoding.

The session titled “Innovating People: Managing, Mentoring, and Change” will be chaired by Loren Nielsen of Entertainment Technology Consultants and Kari Grubin of Walt Disney Studios, and will feature a discussion of mentoring and reverse-mentoring between baby boomer and millennial tech professionals. Kylee Peña of Bling Digital and Blue Collar Post Collective and Meaghan Wilbur of IncitefulMedia will discuss why diversity programs fail and how to fix them. John McCoskey of Eagle Hill Consulting — and former EVP and CTO at the Motion Picture Association of America (MPAA) — will present his paper, “A Formal Approach to Change Management for Dynamic Technology-Driven Media Organizations.”

  • Thursday, Aug. 10, 2017
Facebook envisions Watch feature as TV for social media
This image provided by Facebook shows a screenshot demonstrating Facebook's new Watch feature, which is dedicated to live and recorded video. The idea is to have fans commenting and interacting with the videos. The new Watch section is a potential threat to Twitter, YouTube, Netflix and other services for watching video. (Courtesy of Facebook via AP)
NEW YORK (AP) -- 

Facebook envisions its new Watch feature as TV designed for social media, a place where users comment, like and interact with show creators, stars and each other — and never leave.

It's a potential threat to Twitter, YouTube, Netflix and other services for watching video, including old-fashioned TV. Yet its success is far from guaranteed.

While people watch a lot of videos on Facebook, these are mostly shared by their friends, seen as users scroll down their main news feed.

Getting people to see Facebook as a video service is like Walmart trying to sell high fashion, or McDonald's peddling high-end food, said Joel Espelien, senior analyst with The Diffusion Group, a video research firm.

Sure, it's possible, but something is off.

"It's very difficult to change people's core perception of what your brand is," he said.

Facebook already has a special video section, but it mainly shows a random concoction of "suggested" videos. The new Watch section replaces this. Some U.S. users got Watch on Thursday; others will get it over time.

The idea behind Watch is to let people find videos and series they like, keep up with them as new episodes appear, and interact with the show's stars, creators and other fans. People's own tastes, as well as those of their friends, will be used to recommend videos.

Daniel Danker, a product director for video at Facebook, said the most successful shows will be the ones that get people interacting with each other. "Live does that better than almost anything," he said.

Facebook wants to feature a broad range of shows on Watch, including some exclusive to Facebook. Users who already follow certain outlets, say, BuzzFeed, will get recommended shows from those pages.

But Espelien wonders whether Facebook users will tap (or click) the Watch tab when with another tap of the finger they can "click over to Hulu or Netflix or whatever."

Though Facebook might want you to think otherwise, Espelien said there's no boundary keeping you from straying.

Advertising details are still being hashed out, but typically the shows will have five- to 15-second ad breaks. Facebook said show creators will decide where the ads go, so they can be inserted during natural breaks.

But it might be a tough sell for advertisers used to the predictable, reliable audience that television has delivered, Forrester Research analyst Jim Nail said in an email. Facebook's big challenge, he said, will be to train users "to establish a Watch habit."

  • Tuesday, Aug. 8, 2017
Meredith Corp. to standardize stations on Avid’s MediaCentral Platform
BURLINGTON, Mass. -- 

U.S. media group Meredith Corporation has chosen to standardize its workflow on Avid’s MediaCentral® Platform. Over a six-year period, Avid will upgrade 10 stations, install new Avid workflows at two additional stations, and enable Meredith to migrate to a virtualized environment, reducing costs and boosting efficiency while also gaining the advantages of a common platform across the enterprise.
 
Meredith’s Local Media Group includes 17 owned or operated television stations reaching 11 percent of U.S. households. Meredith’s portfolio is concentrated in large, fast-growing markets, with seven stations in the nation’s Top 25--including Atlanta, Phoenix, St. Louis and Portland--and 13 in the Top 50. Its stations produce 700 hours of local news and entertainment every week, delivering 24/7 news coverage on digital, mobile and broadcast platforms. Faced with the pressures of operating in a digital environment, Meredith needed to upgrade its aging infrastructure and reduce expenditures. A mix of disparate news production equipment at different stations made technology upgrades, support, training and planning complicated and expensive.
 
Meredith’s enterprise-wide adoption of Avid’s MediaCentral Platform will help the media company overcome these challenges. With a single platform across the enterprise and planned upgrades every two years, Meredith’s stations will benefit from advanced tools and workflows for enterprise-wide search and content sharing, and for embracing social media. 
 
“Avid is a leader in the broadcast news industry and has been a trusted partner for many years,” said Larry Oaks, VP of Technology at Meredith. “By standardizing on Avid’s platform, we have a one-stop shop for all our technology, support and training needs across our newsrooms, which will enable us to reduce costs, save a great deal of time and effort, and give us the tools we need to succeed in today’s digital environment.”
 
Meredith’s new workflow comprises Avid’s comprehensive tools and workflow solutions to create, deliver and optimize media, including Avid NEXIS®, the media industry’s first and only software-defined storage platform, MediaCentral | UX, the cloud-based, web front end for the MediaCentral Platform, Avid Interplay® | Production for asset management, and Avid iNEWS® and iNEWS | Command for newsroom management. Meredith will use Media | Distribute to deliver content to social media channels, as well as Media Composer® | Cloud Remote and Media Composer | NewsCutter® Option for nonlinear editing, and Avid AirSpeed® video servers. Avid Professional Services will provide installation, support and customized enterprise-wide training.
 
“Meredith is the latest member of Avid’s growing community of preeminent customers to adopt an enterprise-wide single platform approach,” said Jeff Rosica, president at Avid. “With Avid’s flexible commercial options and deployment models, Meredith can keep its stations and staff at the forefront of technology, virtualize its infrastructure, and respond quickly to new challenges and opportunities--all while reducing costs.” 

  • Saturday, Aug. 5, 2017
Academy investigates 11 scientific & technical areas for 2017 Oscars
LOS ANGELES -- 

The Academy of Motion Picture Arts and Sciences has announced that 11 distinct scientific and technical investigations have been launched for the 2017 Oscars®.

These investigations are made public so individuals and companies with devices or claims of innovation within these areas will have the opportunity to submit achievements for review.

The deadline to submit additional entries is Tuesday, August 15, at 5 p.m. PT.  The Academy’s Scientific and Technical Awards Committee has started investigations into the following areas:

  • Systems using multiple, stabilized, synced cameras to capture background footage, with integrated playback for simulating movement in static vehicles
  • Submersible, telescoping camera cranes
  • Automated systems for cinema auditorium quality control
  • Systems for onset digital dailies with color managed workflows
  • Systems for onboard RAW recording for digital cinema cameras
  • Gyroscopically stabilized camera platforms for aerial cinematography
  • Systems for modular character rigging enabling large scale, complex, high quality 3D digital character animation
  • Systems for digital storyboarding and story reel development
  • Efficient systems for interactive animation of large numbers of high-resolution 3D characters with full surface detail
  • Single-surface audio platforms for automated dialogue replacement (ADR)
  • Software applications to synthesize complex sound scenes from a limited set of source elements

Claims of prior art or similar technology must be submitted online.

After thorough investigations are conducted in each of the technology categories, the committee will meet in November to vote on recommendations to the Academy’s Board of Governors, which will make the final awards decisions.

The 2017 Scientific and Technical Awards Presentation will be held on Saturday, February 10, 2018.

The 90th Oscars will be held on Sunday, March 4, 2018, at the Dolby Theatre® at Hollywood & Highland Center® in Hollywood, and will be televised live on the ABC Television Network at 7 p.m. ET/4 p.m. PT.  The Oscars also will be televised live in more than 225 countries and territories worldwide.

 

  • Wednesday, Aug. 2, 2017
RED RAVEN Camera Kit available via Apple.com
The RED RAVEN Camera Kit
IRVINE, Calif. -- 

RED Digital Cinema has announced that its RED RAVEN Camera Kit is now available exclusively through Apple.com and available to demo at select Apple Retail Stores. This complete handheld camera package features a diverse assortment of components from some of the industry’s top brands, including:

  • RED RAVEN 4.5K camera BRAIN
  • RED DSMC2 Touch LCD 4.7” Monitor
  • RED DSMC2 Outrigger Handle
  • RED V-Lock I/O Expander
  • RED 120 GB RED MINI-MAG
  • Two IDX DUO-C98 batteries with VL-2X charger
  • G-Technology ev Series RED MINI-MAG Reader
  • Sigma 18-35mm F1.8 DC HSM | Art
  • Nanuk heavy-duty camera case
  • Final Cut Pro X
  • foolcontrol iOS app for RAVEN Camera Kit

The RED RAVEN Camera Kit is available for $14,999.95. Customers can buy this package or learn more at Apple.com and select Apple Retail Stores.

“We are very excited to work with Apple on the launch of the RED RAVEN Camera Kit, available exclusively through Apple.com,” said Jarred Land, president of RED Digital Cinema. “The RED RAVEN Camera Kit is a ready-to-shoot professional package that gives content creators everything they need to capture their vision with RED’s superior image capture technology.”

The RAVEN 4.5K is RED’s most compact camera BRAIN, weighing in at just 3.5 lbs. This makes it a great choice for a range of applications including documentaries, online content creation, indie filmmaking, and use with drones or gimbals. The RAVEN is equipped with a 4.5K RED DRAGON sensor and is capable of recording REDCODE RAW (R3D) in 4.5K at up to 120 fps and in 2K at up to 240 fps. RED RAVEN additionally offers incredible dynamic range and RED’s renowned color science, and it can record REDCODE RAW and Apple ProRes simultaneously--ensuring shooters get the best image quality possible in any format.

The RED RAVEN Camera Kit also includes Final Cut Pro X, which features native support for REDCODE RAW video, built-in REDCODE RAW image controls, and the most complete ProRes support of any video editing software. Together with the free RED Apple Workflow software, Final Cut Pro allows professional video editors to work quickly and easily with RED RAVEN footage on MacBook Pro, iMac, and Mac Pro systems.