The Society of Motion Picture and Television Engineers® (SMPTE®), the worldwide leader in motion-imaging standards and education for the communications, media, entertainment, and technology industries, today announced that the SMPTE 2014 Annual Technical Conference and Exhibition (SMPTE 2014) will bring renowned experts to Hollywood, California, from Oct. 20-23 to offer insight into critical industry topics including the latest developments in ultra-high definition television (UHDTV) standards and systems, as well as in image processing, theatrical display, and audio technology.
"The rapid development of UHDTV, image processing, cinema projector, and immersive audio technologies over the past year will make for an exciting series of sessions at this year's conference," said Jim DeFilippis, SMPTE 2014 Conference Co-Chair and SMPTE Fellow. "Continued innovation in these areas promises not only to enrich storytelling itself, but also to alter and enhance the experience of moving images, both in the cinema and in the home. Experts from preeminent media technology companies around the world will present their latest work and findings in these dynamic areas of our industries."
The session track "UHDTV: Building the Plane in Flight" will begin with a presentation by NHK's Kenichiro Ichikawa, Seiji Mitsuhashi, Mayumi Abe, Akira Hanada, and Kohji Mitani, along with Mitsutoshi Kanetsuka of Sony, on a system capable of producing simultaneous 8K, 4K, and 2K video in real time from a single 4K camera. It will continue with a presentation by Belden's Stephen H. Lampen on the challenges of transporting 4K (12 Gb/s) video over single-link coax and how they may be overcome. The track will wrap up with a presentation by Archimedia's Josef Marc on the implications of viewing 4K and UHD content in a largely 2K world, and 2K content on UHD screens, as the infrastructure for 4K evolves.
In the session entitled "Dammit, Gamut, I Love You!" NHK's Kenichiro Masaoka, Takayuki Yamashita, Yukiko Iwasaki, Yukihiro Nishida, and Masayuki Sugawara will examine color management for wide-color-gamut UHDTV production; François Helt and Valerie La Torre of Highlands Technologies Solutions will look at a quality assessment framework for color conversions and perception; Lars Borg of Adobe will discuss improved methods for color matching between HD and UHD content; and Gary Demos of Image Essence will examine approaches to defining a high-dynamic-range (HDR) intermediate that can help preserve the creative intent of the master. Presenters will discuss how these tools support the development of wide-gamut displays, enable high-quality gamut mapping, and facilitate gamut conversions that preserve the perception of artistic intent from the initial working display through to the viewer.
A session track entitled "Higher Frame Rates" asks "Is faster better?" and examines the challenges, benefits, and solutions of working at frame rates beyond 60 Hz, in both video and high-frame-rate (HFR) cinema formats. The series of presentations will begin with David Richards of Moving Image Technologies, who will discuss 120-frames-per-second (fps) capture as a universal open production standard. Paola Hobson of InSync Technology will continue with "High Frame Rate Video Conversion," followed by a presentation by Keith Slavin and Chad Fogg of ISOVIDEO on quality advancements and automation challenges in file-based conversion, with a focus on noise reduction, deinterlacing, HFR, and compression efficiency.
A session dedicated to display technologies will start with a presentation by consultant George Joblove, who will compare and contrast today's numerous display performance measurements and explain what these photometric dimensions and units represent. A subsequent presentation by Peter H. Putman of Kramer Electronics on next-generation display interfaces will cover the latest versions of the high-definition multimedia interface (HDMI), DisplayPort, and the many variations on each standard. The session will wrap up with a presentation by 3M's Jimmy Thielen, James Hillis, John Van Derlofske, Dave Lamb, and Art Lathrop on quantum dots, a new backlighting technology for achieving the wider color gamuts required for UHDTV, particularly as defined by International Telecommunication Union (ITU) Recommendation BT.2020.
The SMPTE 2014 session called "Advancements in Theatrical Display" will feature a presentation by Dolby Laboratories' Suzanne Farrell, Scott Daly, and Timo Kunkel on study results summarizing viewer preferences for cinema screen luminance dynamic range, followed by a presentation by Rick Posch of CR Media Technologies and Peter Ludé of RealD on the development of an accurate and repeatable measurement method for speckle in laser-illuminated projectors. The session will conclude with a presentation by Jim Houston of Starwatcher Digital and Bill Beck of Barco, who will discuss design considerations for cinema exhibition using laser illumination.
A three-part session "Developments in Audio Technology" will begin with presentations dedicated to tools for immersive audio. A presentation by Dolby Laboratories' Charles Robinson and Nicolas Tsingos on cinematic sound scene description and rendering control will be followed by an examination of immersive audio systems and the management of consequent sounds, presented by Technicolor's William Redmann. Part one of the session will wrap up with a presentation that brings Robert Bleidt of Fraunhofer USA, Arne Borsum and Harald Fuchs of Fraunhofer IIS, and S. Merrill Weiss together to discuss the opportunities that object-based audio provides for improving the listening experience and increasing listener involvement.
The second part of the audio technology developments session will look at the elements required to offer new audio services. A presentation by Jeffrey Riedmiller, Sripal Mehta, Prinyar Boon, and Nicolas Tsingos of Dolby Laboratories will first examine a practical system for enabling interchange, distribution, and delivery of next-generation audio experiences in the cinema. Shifting to include broadcast, a subsequent presentation by Thomas Lund of TC Electronic A/S will explore the technical aspects of loudness normalization versus speech normalization, followed by a presentation by Jon D. Paul of Scientific Conversion, who will provide an overview of the test data and recommendations for improved standards and reference designs for digital audio transmission.
The three-part session on audio technology developments will conclude with three presentations. The first, by Dolby Laboratories' Michael Babbitt, will examine leading-edge work on audio data management and analysis. J. Patrick Waddell of Harmonic will then look at issues related to the CALM Act in his presentation "Have Things Calmed Down?" The session will close with a look back at the origins of audio and video compression by Jon D. Paul.
In the conference's multiple-part session on image processing, Seiichi Goshi of Kogakuin University will begin with a presentation introducing super resolution technology that uses nonlinear signal processing to create naturally appearing thin edges that do not exist in the original image. Technicolor's Pierre Routhier will follow, presenting a model for motion control that ensures true 4K detail at capture, and Klaus Weber of Grass Valley will subsequently present on potential solutions for 4K or UHD image acquisition, with a focus on live broadcast production.
The second part of the image processing session will feature Scott Daly, Ning Xu, and James Crenshaw of Dolby Laboratories, along with Vickrant J. Zunjarrao of Microsoft, who will provide an overview of a psychophysical study isolating judder using fundamental signals, as well as what the results say about the appearance and magnitude of motion distortions from the viewer's perspective. To conclude the session, Sony Electronics' Gary Mandle will describe the systems used to acquire, develop, transmit, and record the massive, detailed images captured during five U.S. Lunar Orbiter missions from 1966 to 1967, as well as the story of how the tapes and video equipment were saved and refurbished so that these images could be archived and publicly distributed.
SMPTE 2014 is the premier annual event for motion-imaging and media technology, production, operations, and the allied arts and sciences. A detailed event schedule for SMPTE 2014 is available online at www.smpte2014.org. The SMPTE Events app is available in iTunes and Google Play. It is expected to be updated with SMPTE 2014 event and program information this week. Further information about SMPTE and its work is available at www.smpte.org.
About the Society of Motion Picture and Television Engineers® (SMPTE®)
The Oscar® and Emmy® Award-winning Society of Motion Picture and Television Engineers® (SMPTE®), a professional membership association, is the preeminent leader in the advancement of the art, science, and craft of the image, sound, and metadata ecosystem, worldwide. An internationally recognized and accredited organization, SMPTE advances moving-imagery education and engineering across the communications, technology, media, and entertainment industries. Since its founding in 1916, SMPTE has published the SMPTE Motion Imaging Journal and developed more than 800 standards, recommended practices, and engineering guidelines.
The Society is sustained by more than 6,000 members — motion-imaging executives, engineers, creative and technology professionals, researchers, scientists, educators, and students — who meet in Sections throughout the world. Through the Society's partnership with the Hollywood Post Alliance® (HPA®), this membership is complemented by the professional community of businesses and individuals who provide expertise, support, tools and the infrastructure for the creation and finishing of motion pictures, television, commercials, digital media, and other dynamic media content. Information on joining SMPTE is available at www.smpte.org/join.
All trademarks appearing herein are the property of their respective owners.