Wednesday, March 29, 2017

Toolbox

  • Wednesday, Nov. 16, 2016
DJI premieres short film shot entirely on drone by Oscar winner Claudio Miranda
Claudio Miranda
SHENZHEN, China -- 

DJI, a maker of unmanned aerial vehicles, has premiered a short film by Academy Award-winning cinematographer Claudio Miranda shot entirely on its new Inspire 2 professional drone and its professional-grade X5R camera.
 
The Circle stars Ryan Phillippe (Shooter) and Noah Schnapp (Stranger Things), and illustrates how the DJI Inspire 2 can be an integral part of the creative process for high-end filmed entertainment. Miranda--who won the Best Cinematography Oscar for Life of Pi in 2013 and was nominated in 2009 for The Curious Case of Benjamin Button--and EP Dana Brunetti (The Social Network, House of Cards) used the Inspire 2 and X5R for every shot in the film, from emotional close-ups to sweeping aerial views.
                      
“Filmmaking has been tethered to the ground for so long. DJI’s technology is allowing filmmaking to be free,” Phillippe said. “The advantage is obvious on smaller projects when you can’t afford cranes and all of the technicians that come along with it, and you can still achieve the same beautiful shots as with the larger equipment.”
 
The Circle tells the story of an estranged father (Phillippe) who reunites with his young son (Schnapp) in Depression-era America after the sudden death of the boy’s mother. The two travel from town to town as dad makes a meager living for the both of them by sketching portraits for locals. It’s on this formative journey that the boy discovers not only the transformative power of art, but how to open his heart as well.
 
The DJI Inspire 2, unveiled Tuesday at an event on the Warner Bros. studio lot, is a ready-to-fly platform for high-end film and video creators. While aspiring filmmakers have long had their creative ambitions restricted by shoestring budgets and limited gear options, affordable and powerful aerial equipment is now opening new doors and providing them with more options to pursue their visions.
 
“When you think of shooting a movie set in the Great Depression, you think of big Hollywood studio films, big cameras and massive crews,” said director Sheldon Schwartz. “What’s unique about this shoot is that we’re using new technology to push the limits of camera language and storytelling.”                   

For cinematographers, the Inspire 2 with the X5R camera offers control over every aspect of an image, new freedom to move the camera in three dimensions, and stabilized gimbal technology that eliminates unwanted camera movement.
 
“What’s fantastic about this drone is that we’re able to shoot in the RAW format,” Miranda said. “It’s nice to have that dynamic range, so I’m able to push shadows up or highlights down and create a mood.”
 
“When you work with incredible visual storytellers like Claudio who are hyper-focused on every detail and element of the image, it’s important to have the most dynamic and versatile product to help tell that story, but also one that doesn’t break the bank,” Brunetti said. “We get just that with the DJI Inspire 2.”

  • Tuesday, Nov. 15, 2016
Facilis TerraBlock at work on Canadian family drama series "Backstage"
"Backstage"
HUDSON, Mass. -- 

Facilis, an international supplier of cost-effective, high performance shared storage solutions for collaborative media production networks, announced that the postproduction team for Canadian family drama series Backstage is again relying on Facilis TerraBlock for its collaborative editorial workflow for season 2.

Ellen Fine, CCE (Canadian Cinema Editors), is the supervising editor and consulting producer on Backstage. The series is produced by Fresh TV and distributed by DHX. It currently airs on the Family Channel in Canada and on the Disney Channel in the US and worldwide.

The offline edit is cut on Avid Media Composers by a team of three editors who share two assistants at Technicolor’s Toronto facility. For the first season, the team started out using only local storage. With some arm-twisting, the producers were convinced to allocate funds so that Fine’s team could better collaborate and share media.

The group called Don Kinzinger from Dynamix, a systems reseller and integrator in Ontario, to help them find the right system for their needs. The choices quickly narrowed to Avid ISIS or Facilis TerraBlock. In the end, a 16TB TerraBlock 8D system best fit the budget and gave the team the flexibility they wanted, with minimal restrictions on Apple OS and Media Composer version compatibility with the storage. The TerraBlock 8D was connected to a 10Gb Ethernet switch, and from that point 1Gb Ethernet lines ran to five Mac Pro workstations. Central media access gave the editorial team a great leap forward in efficiency and collaboration.

“We wanted a system that gave us the freedom to be on whatever Mac OS we were comfortable with, running any version of Media Composer that we felt was stable without any worries about compatibility with the storage,” stated Fine.

When the show was picked up for a second season, Fine knew that the producers would occasionally want to incorporate footage from season one, which meant it had to be online and accessible. Since this required more storage, the team called on Dynamix again, this time purchasing a 32TB TerraBlock 8D. The new storage was added to the same Ethernet switch so that everyone could access both servers.

Following the same hectic schedule as season one, for season two the crew shot 30 episodes over the summer. Since the location is an actual school, they had a very limited amount of time to film. There were two complete units shooting simultaneously.  “It’s quite unique, we’re block-shooting, so our actors, who are aged 14 to 17, are filming 4 episodes over 4 days. The actors have to memorize 4 entire episodes and jump back and forth between them,” said Fine. “They’re phenomenal dancers and musicians as well. It was very impressive.”

The show is filmed with the ARRI Amira camera, with dailies processed and synced at Technicolor. Fine and team receive Avid DNxHD 36 files from the Technicolor team, which are transferred to the Facilis system, ready to edit. After the dailies are delivered, Fine and her team have 5 or 6 days to assemble an episode. Then they move on to the director’s cut, the producer’s cut, and finally the broadcaster’s cut before they lock. This year, they started in July, and typically finish in eight or nine months.
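For context (the figures below are a back-of-the-envelope estimate, not from the article): DNxHD 36 runs at a nominal 36 Mbit/s per 1080p stream, so the 1Gb Ethernet line feeding each Mac Pro leaves generous headroom for multi-stream editing:

```python
# Rough sketch: how many nominal DNxHD 36 streams fit on a 1Gb Ethernet link.
# The ~70% usable-throughput figure for GbE after protocol overhead is an
# assumption, not a measured value.

def max_streams(link_mbps: float, codec_mbps: float, efficiency: float = 0.7) -> int:
    """Concurrent streams a link can sustain at the given usable fraction."""
    return int(link_mbps * efficiency // codec_mbps)

print(max_streams(1000, 36))  # -> 19 streams per workstation link
```

Even at a conservative 70% usable throughput, a single GbE line sustains well over a dozen offline-resolution streams, which is why the shared-storage move paid off for a five-seat team.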

“The great thing about our Facilis system is that we don’t need to think about it.  It’s just rock solid and always there,” said Fine. “We organize everything very meticulously by episode in folders and bins. We store our music in a separate volume. It’s very efficient and just what we need to meet our deadlines.”

  • Monday, Nov. 14, 2016
HiScene, Inuitive, Heptagon team on AR glasses
HiAR Glasses
SANTA CLARA, Calif. -- 

HiScene, Inuitive and Heptagon have teamed to roll out HiAR Glasses, billed as HiScene’s next generation of augmented reality (AR) glasses. The companies worked together to develop a complete solution for advanced 3D depth sensing and AR/VR applications that delivers excellent performance even in changing light conditions and outdoors. HiAR Glasses incorporate Inuitive’s NU3000 Computer Vision Processor and Heptagon’s advanced illumination.
 
The glasses’ AR operating system provides stereoscopic interactivity, 3D gesture perception, intelligent speech recognition, natural image recognition, and inertial measurement unit (IMU) tracking, all presented through an improved 3D graphical user interface.
 
“We are committed to providing the best possible user experience to our customers, and for this reason we have partnered with Inuitive and Heptagon to create the most intelligent AR glasses available on the market,” said Chris Liao, CEO of HiScene. “The technologies implemented provide a seamless experience in a robust and compact format, without compromising on battery life.”
 
Inuitive’s NU3000 serves the HiAR Glasses by providing 3D depth sensing and computer vision capabilities. The solution also acts as a smart sensor hub, accurately time-stamping and synchronizing multiple sensors in a manner that off-loads the application processor and shortens development time. “Inuitive’s solution allows HiScene to provide the reliability, latency and performance its customers expect,” said Shlomo Gadot, CEO of Inuitive. “With Inuitive technology, AR products and applications can now be used outdoors without sunlight interfering with or damaging their efficacy, thanks to cameras featuring depth perception.”
 
Heptagon provides unique IR pattern illuminators, which were chosen to handle changing light conditions and plain surfaces. In addition, the range and field-of-illumination features of Heptagon’s LIMA stereo pattern projector ensure superior lighting and added texture for higher-quality images.
 
“Our Wide Field of Illumination provides better gesture recognition, and our miniaturization technologies enable ultra-small, high performance, low power components for 3D AR/VR applications,” said Dr. Erik H. Volkerink, Heptagon’s chief business officer and executive VP.

  • Monday, Nov. 14, 2016
Moonlight Cinema deploys DaVinci Resolve Studio to complete animated film "Ozzy"
A scene from "Ozzy."
FREMONT, Calif. -- 

Blackmagic Design announced that Barcelona-based postproduction house Moonlight Cinema has completed both the digital intermediate and finishing of the new children’s animated film “Ozzy” on DaVinci Resolve Studio.

Postproduction director Alejandro Matus and colorist Ignasi González produced more than 40 different deliveries of the animation over a two-week period, using Moonlight Cinema’s three DaVinci Resolve color grading suites to cater for everything from different aspect ratios and frame rates through to multiple language versions.
 
“We had never taken on an animated film before, so there was an added layer of pressure to deliver a high quality finish,” said Matus. “Resolve was an enormous help throughout all aspects of the project, particularly with its new editing tools. Not only did we use it to conform and master the project, but we also were able to make changes to the edit during the grade without having to leave Resolve.”
 
Starting with 2K DPX files transferred to them by Tangent Animation in Canada, the Moonlight Cinema team applied a LUT to act as a base for the film’s overall look. “For many in the industry, it may seem natural to think that animation projects don’t require any real color correction, but in fact, the complete opposite is true,” said González. “Unlike a live action feature, a CG animation’s whole universe will have been created by many different people on many different machines. Homogenizing this content and giving greater depth to the animation through tools like light and blur is a daunting task.”
 
To ensure that every shot in a sequence matched uniformly, Matus and González decided to separate key elements from each scene through their respective alpha channels using mattes. This allowed for more control to modify individual assets, such as character faces, throughout the final grade. The next step was to give “Ozzy” its own unique aesthetic.
 
“Ozzy is a children’s film, so we wanted to maintain a bright, friendly feel even in scenes that were tense or sad,” González shared. “The great thing about being a colorist on an animation like this was that I had way more creative freedom when grading. For some night scenes, for instance, I could push the blues of the sky to a point where a live action sky would never go. The same happened with sunsets.”
 
“The best thing about using DaVinci Resolve for this,” he continued, “was the Color Management system. What impressed me on this feature was Resolve’s ability to handle all of the different color space conversions that we needed to output. It was quick and pain free. We also had a lot of shot replacements coming in throughout the grading process. With Resolve, I could simply save a still from a grade that I was happy with to the stills gallery, and it would include all the keyframe and tracking information, which made it stress free to update to a new shot.”

  • Thursday, Nov. 10, 2016
Sony expands FS Series with new FS7 II camcorder
Sony's FS7 II camcorder

Sony is expanding its FS Series Super 35mm professional family with the addition of the new FS7 II camcorder. The new model builds on the original FS7’s strengths by adding advanced features including Electronic Variable ND technology, a lever lock type E-mount, and a new mechanical design for faster and easier set-up. The new FS7 II camcorder also supports Sony’s α Mount System, which includes more than 70 lenses.
 
Since its introduction in 2014, the FS7 has become one of the most widely used cameras in a range of production applications, and the original FS7 model remains in the Sony line-up. The new FS7 II now gives creative professionals a broader range of creative tools, with new features all based on end user feedback.
 
The new FS7 II camcorder is designed for long-form shooting and production applications, especially documentaries and independent filmmaking. Sony is also introducing an FS7 II kit model, which includes a new E-mount Super 35mm lens, model SELP18110G, covering Super 35mm and APS-C sensors.
 
“The FS7 II features state of the art, Sony variable ND technology and a robust locking E‑Mount,” said Juan Martinez, senior product manager, professional digital imaging, Sony Electronics. “Extensive enhancements to the VF support system enables super-fast and secure viewfinder repositioning, while retaining the Zen-like simplicity, flexibility and comfort of the FS7’s ‘multi-award winning’ industrial design.”
 
Electronic Variable ND Technology
The camcorder’s Electronic Variable ND Filter system, combined with its large sensor, delivers greater exposure control, with the option of preset or variable operation modes. Variable ND mode (seamless ND attenuation within the camera’s 2 to 7 stop range) allows the user to vary the density of the ND filter during shooting and to transition seamlessly between steps.
 
The camera’s expanded ND operation also enables fine exposure adjustment while leaving the iris free to set depth of field, prevents the soft focus caused by diffraction, and avoids the color shift caused by stacking multiple external ND filters.
 
Preset mode lets users assign three ND settings to the filter turret, useful in selecting the most appropriate filtration range for changing light conditions. Auto ND mode is also available allowing exposure to stay at a fixed level while adjusting the depth of field with iris control.
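As a quick reference (the arithmetic here is ours, not Sony's), each stop of ND corresponds to an optical density of log10(2) ≈ 0.3, so the camera's 2 to 7 stop range maps to roughly ND 0.6 through ND 2.1:

```python
import math

# Sketch: convert ND strength in stops to optical density and transmittance.
# One stop halves the light, so density = stops * log10(2).

def nd_density(stops: float) -> float:
    """Optical density of an ND filter of the given strength in stops."""
    return stops * math.log10(2)

def transmittance(stops: float) -> float:
    """Fraction of light passed by an ND filter of the given strength."""
    return 0.5 ** stops

for stops in (2, 7):  # endpoints of the FS7 II's variable ND range
    print(f"{stops} stops -> ND {nd_density(stops):.2f}, "
          f"{transmittance(stops):.2%} transmitted")
```

At the 7-stop end, under 1% of the light reaches the sensor, which is what lets shooters hold a wide aperture in bright exteriors.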
 
E-mount (Lever Lock type) for professional shooting
 
The FS7 II’s new E-mount (lever lock type) lets users change lenses by rotating the locking collar rather than the lens itself, which means that in most cases lens support rigs don’t need to be removed, saving time during a production.
 
Mechanical Design Enhancements
Like its counterparts in the FS Series family – the FS7 and FS5 models – the new FS7 II features several design and ergonomic updates for comfortable and functional use in the field.
 
The FS7 II’s “tool-less” mechanical design lets users make on-the-fly changes to the camera’s set-up and operation. For example, no tools are required to adjust the Smart Grip or viewfinder positions.
 
The viewfinder eyepiece provides a third stabilizing contact point when shooting handheld. Durable square section rods and lever-clamps on the LCD and camera body provide simple and precise front-to-back VF adjustment while retaining level positioning. 
 
New Sony 18-110mm Sony G lens
 
Sony is also introducing an FS7 II kit model including a new E-mount, Super 35mm lens. The new lens, model SELP18110G, covers Super 35mm and APS-C sensors. Compact and lightweight -- 2.4 lbs (1.1kg) -- with an 18 to 110mm focal range (6x zoom), it uses a new switchable mechanical/servo zoom system, capable of snap zooms and entirely devoid of lag. The focal range is optimized for Super 35 and APS-C sensors.
 
The lens is compatible with Sony α Mount System cameras, including the α7 series interchangeable-lens cameras and professional Super 35mm 4K camcorders like Sony’s FS7 or FS5. Although the lens is perfectly suited to still photography, filmmakers will fully appreciate its extended creative capabilities for shooting motion images.
 
The lens benefits from Sony’s Smooth Motion Optics (SMO) design, which is developed to optimize performance during motion image capture. This lens design eliminates undesirable characteristics and artifacts that do not affect still image capture, but can severely limit a lens’ usefulness for motion shooting, such as:
·         Ramping: F-stop gradually diminishes when zooming tight.
·         Focus drift: focus does not hold while zooming.
·         Breathing: angle of view varies while focusing.
·         Optical axis shift: image moves in the frame while zooming.
 
XQD Cards
The FS7 II supports the XQD memory card format, designed for capturing and transferring high-bandwidth, high-resolution files. Sony is also introducing a new XQD card, the QD-G256E -- with an industry-first 256 GB capacity -- which enables a recording time of approximately 45 minutes at 4K 60p and 3.5 hours at 2K 30p. Combined with a read speed of up to 440MB/s and a write speed of up to 400MB/s, users can shoot for longer without needing to change media cards.
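Working backwards from the quoted figures (the arithmetic below is ours, and it treats 256 GB as decimal gigabytes), the ~45-minute 4K 60p runtime implies an effective data rate of roughly 760 Mbit/s, comfortably within the card's 400MB/s write speed:

```python
# Sketch: effective data rate implied by a card capacity and record time.
# Assumes decimal gigabytes (1 GB = 1e9 bytes); real codec rates will differ.

def implied_mbps(capacity_gb: float, minutes: float) -> float:
    """Data rate in Mbit/s that would fill the card in `minutes`."""
    bits = capacity_gb * 1e9 * 8
    return bits / (minutes * 60) / 1e6

print(f"4K 60p: ~{implied_mbps(256, 45):.0f} Mbit/s")        # ~45 min quoted
print(f"2K 30p: ~{implied_mbps(256, 3.5 * 60):.0f} Mbit/s")  # ~3.5 h quoted
```

The same function can be inverted mentally: halve the data rate and the record time doubles, which is why the 2K 30p figure stretches to several hours.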
 
The FS7 II is planned to be available in January 2017 for an estimated street price of $10,000 (body only) and $13,000 for the camcorder with 18-110mm power zoom lens kit.

  • Thursday, Nov. 10, 2016
The highlight of Google's Daydream VR is ... its controller
Clay Bavor, Google vice president of virtual reality, talks about Daydream and virtual reality during the keynote address of the Google I/O conference, Wednesday, May 18, 2016, in Mountain View, Calif. (AP Photo/Eric Risberg)
NEW YORK (AP) -- 

The best thing about Google's new virtual-reality headset isn't the headset at all.

In fact, Daydream View would pale compared with Samsung's Gear VR headset were it not for Daydream's controller, a handheld device that responds to gestures and other motion.

With Gear VR, I have to move my head to point a cursor at something, then reach for a button on the headset. With Daydream, I can just aim and click the controller in my hand. Sensors in the device tell the headset what I'm trying to do, whether it's swinging a tennis racket or casting a fishing rod. The headset's display responds accordingly.

The headset and controller are sold together for $79, starting Thursday. No rush in getting one, though, as the virtual experiences built for Daydream are still limited. And for now, it works only with Google's Pixel phone.

THE MANY FLAVORS OF VIRTUAL REALITY
While sophisticated systems like Facebook's Oculus Rift and HTC's Vive let you walk around in the virtual world, Daydream View is a sit-down experience in which you use the controller to move yourself around. (You could walk around with the Daydream on if you wanted to, but you won't go anywhere in virtual space - and you might run into the wall.)

But the Rift and the Vive each cost more than $1,500 once you include the powerful personal computers they require. Suddenly, $79 sounds like a bargain. Daydream stays cheap by using the display and processing power of your phone, which you insert into the headset at eye level.

Gear VR, at $100, takes a similar approach, but it works only with Samsung phones. While Daydream works only with Pixel for now, several other Android makers plan to make compatible phones. Sorry, iPhone users.

Those without compatible phones still have Google Cardboard, a $15 contraption you hold up to your face. Using Daydream View, by contrast, is more like wearing goggles. While Gear VR has a better fit, with focusing and a second strap over your head to keep the headset from sliding too low, Daydream is much more comfortable to wear and use than Cardboard.

ALL ABOUT THAT CONTROLLER
Those who've played Nintendo's Wii system will find the Daydream controller familiar. It's about the size and shape of a chocolate bar, and it has motion sensors to track movement.

Although I'm not a big gamer, I enjoyed shooting water out of a hose to put out fires. You simply hold a button to spray and move the controller around to douse flames. You can even tilt the controller to control the angle of the hose. Another app lets you explore the universe by using the controller as a laser pointer to bring up more information.

The controller makes it easier to navigate menus without making yourself dizzy; just move it around to point at things. And while getting the full 360-degree experience of VR often requires spinning around (a swivel chair helps), some apps in Daydream let you grab the scene with your controller and drag it around you, just as you would with a PC mouse.

It's also handy to have volume controls and a home button in your hand rather than on your head.

NAUSEA FREE?
VR can be nauseating, and Daydream is no different. I found that it's less about the headset, and more about the VR video.

The best videos use stationary cameras and let you move your head (or controller) around to explore. The nauseating ones tend to treat VR cameras like regular movie cameras, with a lot of panning in response to a subject's movements. The viewer, not the subject, should be the one doing the moving.

And while I enjoyed watching a woman's skydive in VR on YouTube, scenes of her preparing to jump felt jarring because the camera was on her shaky arm. I had to remove my headset.

WHAT'S THERE TO DO?
You can view 360-degree YouTube videos and any 360-degree photos you store on Google Photos. You can visit other destinations such as the Galapagos Islands in a 360-degree version of Google's Street View. A few games, museum artworks and The Wall Street Journal's app were also available to try out prior to Thursday's launch.

A handful more are coming Thursday. Even more are promised by the end of the year, including apps for Netflix and Hulu - though all that does is offer video on a giant screen in a virtual living room.

There's much more available for Cardboard. Unfortunately, app developers will first need to make some tweaks to make those apps compatible with Daydream. They'll need to do even more to take advantage of the motion control.

Daydream has promise, but until more apps arrive, its potential is still a dream.

  • Thursday, Nov. 10, 2016
Sky Deutschland selects Avid MediaCentral Platform for unified content delivery
BURLINGTON, Mass. -- 

Avid® (Nasdaq: AVID) announced that Sky Deutschland, the pay-TV market leader in Germany and Austria, has significantly expanded its investment in the MediaCentral® Platform. With solutions from the Avid Media, Storage and Artist Suites running on the open and integrated platform, Sky's news and sports production teams now have comprehensive tools at every stage of the workflow to create, collaborate, distribute, optimize and monetize content.

Recognizing the need to update its infrastructure to meet the demands of today’s fast-turnaround, high-quality content-driven environment, Sky needed a future-proofed workflow flexible enough to take advantage of and monetize new and emerging technologies such as Ultra High Definition (UHD).

“Our goal is providing customers the very best TV experience, delivering highly engaging content,” said Kevin Hughes, director of broadcast engineering at Sky Deutschland. “Investment in the MediaCentral Platform provides us with a more scalable, secure and flexible production and delivery environment, future-proofing us to meet the next wave of broadcast industry demands and significantly increasing our efficiency.”

By investing in Avid NEXIS™ | E4 software-defined storage platform connected to Avid Media Composer®, Sky is able to quickly and easily acquire, edit, and deliver content in any resolution—including 2K and Ultra HD.

Facilitating collaboration and enabling producers and editors in remote locations and the newsroom to connect more efficiently was another key driver for Sky. Using Avid MediaCentral | UX, the cloud-based web front end to MediaCentral, Sky editors have the freedom to write scripts, view and edit video, record voiceovers, add and preview graphics, search across multiple systems simultaneously, send stories straight to air, and publish to social media platforms or the website, from anywhere, significantly boosting productivity and efficiency.

Prior to adopting MediaCentral, Sky’s audio team operated in a silo. Investment in seven Pro Tools® | S6 modular control surfaces networked and running on the same platform as the news and editing rooms enables the broadcaster to place creativity at the center of its audio workflow, allowing mixers to work more efficiently and fluidly.

“With a catalog of premium sporting content to deliver, including the German Bundesliga, the English Premier League and the UEFA Champions League, Sky Deutschland’s investment in the MediaCentral Platform places the best individual tools at every step of the workflow, enabling it to create, distribute and monetize content,” said Jeff Rosica, senior vice president, chief sales & marketing officer at Avid. “Now, with a unified approach to optimize production, Sky’s delivery of this compelling content empowers the broadcaster to engage with and motivate increasingly sophisticated audiences.”

Sky’s investment includes a five-year support contract with Avid Global Services and software upgrades for Avid’s Interplay® Production Asset Management with Avid Interplay Capture and Interplay Archive modules, Avid AirSpeed®, Avid iNEWS®, Avid Pro Tools® and Avid Media Composer® with Avid NewsCutter.

  • Thursday, Nov. 10, 2016
Is high-frame rate the next failed Hollywood gimmick? 
This image released by Sony Pictures shows Joe Alwyn, portraying Billy Lynn, on a screen in a scene from the film, “Billy Lynn’s Long Halftime Walk,” in theaters on November 11. The film will only be screened at 120 fps at two specially equipped theaters in North America. (Mary Cybulski/Sony-TriStar Pictures via AP)
NEW YORK (AP) -- 

It’s starting to look a lot like the Fifties at the movies.

That was when theaters, alarmed by the rise of television and newly freed from the ownership of Hollywood studios, trotted out a wave of gimmicks to freshen up the moviegoing experience. Ballyhooed advancements like “Smell-O-Vision” and 3-D raged briefly before — at least for a time — receding into camp.

But many of those gimmicks have been reborn for a more high-tech age with new media anxieties. Now it’s cable dramas and streaming networks that are stoking fears that a mere movie isn’t enough to draw audiences out of their homes.

For this new era, there aren’t brilliant showmen like William Castle who put electric buzzers in the seats for 1959’s “The Tingler” and guaranteed $1,000 for any moviegoer who died of fright while watching 1958’s “Macabre.”

Instead, it’s many of the industry’s top filmmakers who are pushing new theatrical experiences. The latest purported cinematic savior is high-frame rate, an innovation without quite as catchy a name as 1959’s scented “AromaRama.” Instead of the traditional 24 frames a second, HFR is composed of many more images per second, lending greater clarity. But so far, the reviews are dismal.

First, Peter Jackson made his “Hobbit” trilogy at 48 frames per second, though poor reviews led the format to be largely phased out by the final installment. Now, Ang Lee has doubled down on the format, and then some. His “Billy Lynn’s Long Halftime Walk,” which opens in limited release Friday, was shot at 120 frames per second. Critics have, in kind, amped up their doubts about the technology’s promise, claiming its hyper-real effect appears artificial or, worse, like a telenovela.

Whether high frame rate will go the way of “Smell-O-Vision” remains to be seen. “Billy Lynn” will only be screened at 120 fps at two specially equipped theaters in North America, and maybe half-a-dozen worldwide. Lee has urged patience. James Cameron, who led the 3-D resurrection, has pledged to make his “Avatar” sequels in a HFR format.

But high-frame rate is just one of the big-screen innovations making this decade look like a digitized sequel of the ‘50s. Here are some of the gimmicks that have returned, in mutated forms, like creatures from a black lagoon:

3-D
The golden era of 3-D, ushered in by 1952’s “Bwana Devil,” lasted less than two years. But the phase propelled by Cameron’s “Avatar” and embraced by the likes of Lee, Steven Spielberg and Martin Scorsese, has already lasted a decade. It’s now a regular, if divisive component of moviegoing: a cherished part of the theatrical spectacle to some, a loathsome surcharge on already higher priced movie tickets to others. Though audience interest for 3-D has at times waned, its grip on theaters seems assured. Cameron hopes to release “Avatar 2” in glasses-free 3-D.

CINERAMA
The panoramic widescreen format, projected onto a curved and arced screen, first debuted with 1952’s “This Is Cinerama.” It and other screen-stretching formats such as Ultra Panavision 70, brought widescreen majesty to films like “How the West Was Won” and “2001: A Space Odyssey.” Cinerama was closely followed by CinemaScope, the anamorphic lens advertised as “the modern miracle you see without glasses,” and the less successful Circle-Vision 360 — something like a forerunner to today’s IMAX screens. CinemaScope, big and beautiful, remains a cherished choice for many filmmakers. Damien Chazelle’s upcoming, glowingly nostalgic “La La Land” -- an early Oscar favorite and an implicit argument for the glory of movies -- is the latest to bring back CinemaScope.

SENSURROUND
The 1974 film “Earthquake” launched Sensurround which used low bass sounds to create a rumbling, vibrating effect. (Moviegoers in next-door theaters sometimes complained of the tremors from “Earthquake” while watching other releases that year, like “The Godfather Part II.”) Other efforts to transfer sensations on the screen to people in the seats have followed. So-called “4-D,” long a theme park attraction, adds an amusement park ride effect to theaters with moving seats, smells and weather effects like fog and rain. A South Korean company has opened “4DX” rooms around the world, playing Hollywood blockbusters. Some of them — and William Castle would appreciate this — even tingle.

  • Wednesday, Nov. 9, 2016
Rental program expands for Jaunt ONE VR camera 
Jaunt ONE VR camera
PALO ALTO, Calif. -- 

Cinematic virtual reality (VR) company Jaunt Inc. has announced that the award-winning Jaunt ONE camera is being made available to even more creators through an expanding rental program.  AbelCine, a provider of products and services to the production, broadcast and new media industries, is the latest company to offer the Jaunt ONE for rent.
 
The Jaunt ONE 24G model camera--which features 24 global shutter sensors, ideal for low light and fast-moving objects, and the ability to couple with 360° ambisonic audio recording--will be available to rent from AbelCine. Creators will also have access to AbelCine’s training, workshops and educational tools for shooting in VR.
 
The nationwide availability of the Jaunt ONE camera, paired with access to the company’s end-to-end VR pipeline, provides filmmakers, creators and artists with the hardware and software solutions for shooting, producing and distributing immersive cinematic VR experiences.
 
●        Hardware – Rent the award-winning Jaunt ONE camera through AbelCine or Radiant Images
●        Software – Jaunt Cloud Services (JCS) provides the tools necessary to edit, stitch and render stereoscopic 360° footage 
●        Distribution – Submit high quality VR content for distribution directly to the Jaunt VR app through the Jaunt Publishing program
 
“As we continue to open the Jaunt pipeline to the expanding community of VR creators, AbelCine is a perfect partner to not only get the Jaunt ONE camera in the hands of filmmakers, but also to educate them on the opportunities in VR,” said Koji Gardiner, VP of hardware engineering at Jaunt. “Whether they’re a frequent experimenter of new mediums or a proven filmmaker dabbling in VR for the first time, we want to equip creators of all backgrounds with everything needed to bring their stories to life.”
 
“At AbelCine, we are always on the lookout for cutting-edge storytelling tools, and this describes the Jaunt ONE perfectly,” said Mike Nichols, business development manager. “Our clients rely on us for assistance in adopting new technologies and providing outstanding technical support on these projects. We are excited to do just this, and help our clients discover what’s possible with the Jaunt ONE.”
 
Creators interested in shooting with Jaunt ONE should stop by AbelCine’s booth #1149 at NAB Show NY, November 9-10, at the Javits Convention Center, where the camera will be on display.  Jaunt is also expanding its existing rental program with LA-based Radiant Images to increase the number of cameras available to customers.

  • Tuesday, Nov. 8, 2016
AbelCine opens Development Center in Brooklyn’s Industry City
NEW YORK -- 

AbelCine announced the opening of a 12,500-square-foot Development Center located in Brooklyn’s Industry City. The new facility is home to AbelCine’s expanding engineering, product development and integration units and will serve as a hub of innovation within the company.

“Evaluating new technologies and adapting them to the needs of the production and broadcast communities has always been at the heart of what we do as a company,” said Pete Abel, CEO of AbelCine. “With the opening of our new Development Center, we are greatly expanding our capabilities in this regard, which ensures creatives can always rely on us to help them navigate the next big thing.”
 
AbelCine’s Development Center features an engineering lab where products are designed to address specific technical needs within the industry. Thanks to AbelCine’s unique position as an equipment provider and technology leader, the company is able to identify limitations of traditional gear and develop effective solutions. AbelCine’s successful line of Cameo camera accessories and resolution analysis charts is an example of this approach. With expanded design, prototyping, machining, and manufacturing capabilities in house, AbelCine will continue to work with industry partners to bring innovative products to market.
 
AbelCine’s Solutions Group, another key component of the Development Center, has an expanded base of operations for integration design, assembly, and staging, as well as ample space for project management, technical collaboration, and administration. This will enable the company to take on more complex assignments, and collaborate on integration projects utilizing emerging imaging and media technologies.
 
“We are excited to be a part of the creative community at Industry City,” said Jonathan Epner, director of market development at AbelCine. “Since our range of knowledge and experience encompasses traditional broadcast and production, as well as exciting new mediums, such as VR and 360 imaging, we see ourselves as a resource for any media company looking to maximize the impact of new technology on their creative projects.”
 
AbelCine’s Development Center is currently open to customers and business partners by appointment only, Monday through Friday from 9am to 6pm, while all other locations continue to operate under normal business hours.