Friday, February 24, 2017


  • Monday, Nov. 28, 2016
2 Taiwanese teenagers win World Robot Olympiad in India 
In this Sunday, Nov. 27, 2016, photo, a boy plays with a robot during the World Robot Olympiad in New Delhi, India. (AP Photo/Tsering Topgyal)

Whizzing around a green felt table chasing a soccer ball beaming infrared light, the boxy robot shoots - and scores - and wins its Taiwanese teenage creators first prize at this year's student robot games.

The two breadbox-sized scooters, playing goalie and kicker, from the team called "Wings of Storm" were up against another Taiwanese team's robots in the "Football" category of the World Robot Olympiad held over the weekend in the Indian capital of New Delhi.

"We have been practicing since primary school," said Liaw Jia-wun, 15, thrilled to have won with his teammate. "We never in our lives could think that we would win the world championship."

Other categories at the robotics championships - attended by more than 450 teams from 50 countries - asked participants to create robotics solutions to reduce or recycle waste, leading teams to build robots that emptied trash bins or scooped up building debris for future use.

Some participants were as young as 6 years old, while others were approaching university graduation.

In the more advanced robotics category, robots had to be preprogrammed for the automated challenge of picking up mini bowling balls and knocking down pins. That meant the robots had to sense where the target was and hit it without any intervention from their creators.
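The automated sequence described here is essentially a sense-then-act loop. As a rough illustration only (the sensor and drive functions below are hypothetical, not any team's actual code), it might look like this in Python:

```python
# Illustrative sense-then-act loop for an autonomous "knock down the pins" run.
# The hardware callables (read_target_bearing, drive, release_ball) are
# hypothetical stand-ins for whatever a team's robot kit provides.

def run_autonomous_round(read_target_bearing, drive, release_ball, tolerance_deg=2.0):
    """Align with the sensed target, then strike it without operator input."""
    while True:
        bearing = read_target_bearing()   # e.g. angle to the pins from a sensor scan
        if abs(bearing) <= tolerance_deg:
            break                         # aligned closely enough to strike
        # turn toward the target; the sign of the bearing picks the direction
        drive(turn=bearing * 0.5, forward=0.0)
    drive(turn=0.0, forward=1.0)          # approach the target
    release_ball()                        # strike with no human intervention
```

The key point the competition tests is the loop itself: the robot must close the gap between sensing and acting entirely on its own.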

The idea is to teach students computer programming as robotics moves beyond factory applications to everyday functions, said engineer Dominic Bruneau, the head coach for the Canadian teams.

"More and more, we will be interacting with robots" in our daily lives, Bruneau said. The student engineers are not just working on theory but are "doing practical work of building real stuff and trying to solve problems."

South African teacher and coach Nicky Du Plessis said the games helped kids develop key skills.

"We start with the fundamentals. We believe that if kids can start from a very young age ... it teaches them how to build," she said. "Then it teaches them logical thinking. How to change something quickly."

  • Wednesday, Nov. 23, 2016
White House Student Film Festival finalists from Claremont High School tap into Avid

Avid® (Nasdaq: AVID) recognizes the students from Claremont High School in Claremont, Calif., who were named finalists in the third annual White House Student Film Festival in Washington, D.C. Students Ira Clark, Johnathan Abrolat and Christian Settles used the Avid Media Composer® editing system to create A Walk, one of 13 finalists selected from more than 700 entries from across the country to be screened at the festival.

In partnership with the American Film Institute (AFI), the White House Student Film Festival was part of South By South Lawn, an outdoor event celebrating the arts and ideas in America. Special guests at the film’s screening included Modern Family star Ty Burrell, Emmy® Award-nominated actress Alfre Woodard, Stranger Things actress Millie Bobby Brown and Stranger Things creators Matt Duffer and Ross Duffer.

“Avid Media Composer is a sophisticated piece of software, able to handle projects of varying complexity with ease, all the while being very accessible even to high school film students,” said Clark. “It was a great surprise to be selected as a finalist and that the judges saw our true vision in our work and the talent that went into creating it.”

The students created A Walk, which sheds light on environmental issues that affect society, under the supervision of educator Sara Hills, who has taught video production at the school for the past three years. The course gives students hands-on experience with Avid’s comprehensive tools and workflow solutions, powered by its open, tightly integrated and efficient platform designed for media. Some of the same tools that powered the top 10 films of 2016 are used by Hills’ students. As an Avid Learning Partner, Claremont High School benefits from Avid’s cost-effective licensing options for educational institutions and has immersed its students in the Avid curriculum, inspiring them to pursue future careers in the creative arts and enabling them to benefit from being part of Avid’s preeminent user community.

“Media Composer is the industry standard,” said Hills. “After attending AFI and having experience in the television and film industry, that’s what everybody edits on. My goal in rebuilding the program here at Claremont was to give students the tools they can take into the workforce or college. The continuous support we’ve received from Avid has been phenomenal, and played a tremendous role in helping the program reach the next level. Avid’s partnership with next generation creatives is a true inspiration.”

“We’re honored that Claremont High School has chosen to teach its students Avid – preparing the next generation of storytellers,” said Jeff Rosica, sr. VP, chief sales and marketing officer at Avid. “Our flexible deployment and payment options are designed to allow anyone to tell their story, from industry professionals to high school students. We want to congratulate both the students and educators of Claremont for their achievement and for being a shining example of what’s possible.”

  • Monday, Nov. 21, 2016
Primestream introduces VR/360 Dynamic Media Management
Primestream's Xchange Extension Panel

Primestream has introduced Virtual Reality/360 (VR/360) asset management support in Xchange Suite, covering workflows from capture through delivery. These new features enable storytellers and content creators to produce and store VR/360 content while leveraging collaborative workflows in a metadata-rich environment.
Archiving VR/360 assets is an often-requested feature in the market. Currently, many content creators use YouTube to store, search, and view VR/360 media. Xchange instead enables VR/360 media management in a secure, controlled location that supports the capture, production, management and delivery of all these assets.
“Working in a 360-degree space poses new challenges for storytellers. We are all trying to figure out how to move a viewer through a 360-degree space,” said David Schleifer, COO at Primestream. “Often, a specific part of the 360 content might be the only area of interest, and with Xchange spatial markers, professionals can now create focus points within the VR/360 media with metadata descriptions or keywords to easily find the area of interest. This not only helps push the project through the creative process, but also helps with finding the value in assets after they have been archived.”
Unique features of VR/360 dynamic media management for Xchange include:

  • Proxy video generation at preselected qualities
  • Content preview in 360°
  • Keyword search for reference markers; views can target specific camera positions within markers
  • Search for pre-programmed reference markers, targeting specific camera views using keywords and spatial markers
  • X/Y navigation and zoom within Xchange’s 360° view
  • Search within a 360° video to locate metadata-tagged key points
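As a rough illustration of what such spatial markers could look like as data (the field names below are assumptions for the sake of the sketch, not Primestream's actual schema), a marker pairs a timecode and a camera orientation with searchable keywords:

```python
from dataclasses import dataclass, field

@dataclass
class SpatialMarker:
    """A point of interest inside a 360° clip: when, where to look, and why."""
    timecode_s: float          # position in the clip, in seconds
    yaw_deg: float             # horizontal camera angle toward the area of interest
    pitch_deg: float           # vertical camera angle
    keywords: list = field(default_factory=list)
    description: str = ""

def search_markers(markers, keyword):
    """Return the markers tagged with the given keyword (case-insensitive)."""
    kw = keyword.lower()
    return [m for m in markers if kw in (k.lower() for k in m.keywords)]
```

A keyword search over markers like these is what lets an editor jump straight to the relevant viewing angle instead of scrubbing through the whole sphere.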

Xchange integrates with Adobe Premiere Pro Creative Cloud, enabling media asset management functionality inside Premiere Pro with the Xchange Extension Panel. Editors can easily locate the assets they need and begin editing with two-way communication of metadata, markers, subclips and projects.

  • Monday, Nov. 21, 2016
Vicon's motion capture embedded in Royal Shakespeare Company's "The Tempest"
"The Tempest" (photo by Topher McGrillis, courtesy of Royal Shakespeare Company)

Vicon, the motion capture technology specialist for the entertainment, engineering and life science industries, announced that one of its motion capture systems, together with its object tracking software, Tracker, will be used for the upcoming theatre production of The Tempest.

The Royal Shakespeare Company (RSC), in collaboration with Intel and in association with The Imaginarium Studios, is creating a new production of The Tempest starring Simon Russell Beale as Prospero. The innovative production will mark the 400th anniversary of Shakespeare’s death and go down in history as the first major stage production to feature a live motion capture performance.

Motion capture technology is used heavily throughout the performance: the production team will use Vicon’s optical camera system to track the whereabouts of moving objects on stage, some of which will be held by the actors. Vicon cameras will also track various screens during the performance, informing the production team of their movements on and around the stage. Thanks to its precision tracking capabilities, the Vicon system will give the animation team real-time feedback, allowing them complete visibility of the objects on stage and the ability to manage interactions with the cast.

With the addition of augmented reality to the performance, an added layer of complexity is introduced. It was important for the RSC to have a mocap system in place that would accurately inform the other technical elements of the production. To create this consistency, a Vicon system will track screens as well as moving objects during the performance. The data processed by the Tracker software, using Intel® Xeon® and Intel® Core™ i7 processors, will not only inform the lighting software and drive the spotlights for the performance, but also enable the augmented reality aspect of the production, allowing objects to be projected onto screens in real time. To enable this innovative use of technology, Vicon and d3 technologies have developed a bespoke software solution that lets the data generated by Tracker drive d3’s software using the PosiStageNet protocol, an industry-standard protocol used to drive interactive effects and lighting on stage.
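The general shape of that hand-off, a tracking system multicasting object positions over UDP to downstream consumers such as lighting and projection servers, can be sketched as follows. To be clear, this is an illustrative datagram layout only; it is not the actual PosiStageNet wire format, and the multicast address is made up for the example.

```python
# Illustrative fan-out of tracked object positions to downstream consumers
# (lighting desk, projection server). Packs a simple id/x/y/z datagram for
# UDP multicast; NOT the real PosiStageNet packet format, just the pattern.
import socket
import struct

MCAST_GROUP = ("239.0.0.1", 50505)   # example multicast endpoint, not PSN's

def pack_position(tracker_id: int, x: float, y: float, z: float) -> bytes:
    """Serialize one tracked object's position as a fixed-size datagram."""
    return struct.pack("<Hfff", tracker_id, x, y, z)

def unpack_position(data: bytes):
    """Inverse of pack_position, as a consumer would decode it."""
    tracker_id, x, y, z = struct.unpack("<Hfff", data)
    return tracker_id, (x, y, z)

def broadcast(sock: socket.socket, tracker_id: int, x: float, y: float, z: float):
    """Send one position update to every listening consumer at once."""
    sock.sendto(pack_position(tracker_id, x, y, z), MCAST_GROUP)
```

The multicast design is what lets the lighting software and the projection system consume the same stream of tracking data simultaneously, which is exactly the "shared tracking data protocol" benefit described by the production team.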

“We are using Vicon cameras and Tracker software to track moving objects and actors for projection mapping and light tracking. In order to realise this, we’ve been working with Vicon and d3 Technologies to develop a plug-in application of PosiStage.Net. This will mean that both video and light tracking will be sharing the same tracking data protocol from the Vicon camera system,” said Pete Griffin, RSC production manager on The Tempest.

Ben Lumsden, head of studio at The Imaginarium Studios, added: “This theatrical production of The Tempest will see never-before-used technology on stage. This live spectacle, which starts today, has been further enhanced by bringing in the expertise of Vicon and the use of their cameras, which adds greatly to the audience’s experience.”

“Combining one of Shakespeare’s most renowned plays with innovative augmented reality driven by Vicon technology signals a change in theatre production and traditional media,” said Imogen Moorhouse, CEO, Vicon. “It’s important to continue to innovate in motion capture, and The Royal Shakespeare Company has demonstrated just how versatile the technology is when used to ignite the imagination of theatregoers old and new.”

  • Thursday, Nov. 17, 2016
Facebook buys Pittsburgh-based facial analysis software firm 
In this June 11, 2014 file photo, a man walks past a Facebook sign in an office on the Facebook campus in Menlo Park, Calif. (AP file photo)

Facebook has bought a facial analysis software firm linked to Pittsburgh's Carnegie Mellon University, a move that will help the social media giant boost its artificial intelligence-powered facial recognition technology.

Details of the deal weren't revealed by Menlo Park, California-based Facebook or FacioMetrics, the Carnegie Mellon spinoff.

Facebook says FacioMetrics’ software can be used to monitor the emotions of medical patients, assess audience reaction to a public speaker, or even detect drowsy drivers.

It also will enable Facebook users to express themselves through special effects that manipulate photos and videos of faces. Rival Snapchat already does something similar: it offers special filters users can add to the selfie "snaps" they take, and the filters change with your facial expressions. A dog filter, for example, will not just add dog ears but show a giant panting tongue in place of your own when you stick out your tongue.
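At its core, that expression-triggered behavior is a rule applied to the output of facial analysis. The sketch below illustrates the idea with a hypothetical detection result; the field names are invented for the example and are not FacioMetrics' or Snapchat's actual API.

```python
# Toy sketch of an expression-triggered filter rule: the overlay set depends on
# the detected facial state. The "mouth_open" field is a hypothetical stand-in
# for whatever a real facial analysis library reports.

def choose_overlays(face: dict) -> list:
    """Map a detected facial state to the filter elements to draw."""
    overlays = ["dog_ears"]               # the base filter is always drawn
    if face.get("mouth_open", False):     # an expression triggers an extra element
        overlays.append("panting_tongue")
    return overlays
```

A renderer would then composite each named overlay onto the frame at the tracked face position, re-evaluating the rule on every frame as the expression changes.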

FacioMetrics has developed software called IntraFace, which can be downloaded onto mobile phones and enables users to do real-time facial image analysis. The company's CEO is Fernando De la Torre, an associate research professor at the school's Robotics Institute.

  • Thursday, Nov. 17, 2016
NEP Europe goes with Grass Valley LDX 86N 4K cameras for new mobile prodn. units

NEP Europe announced its commitment to equip its new trucks, currently under construction, exclusively with Grass Valley native 4K cameras. NEP Europe has been relying on cameras from Grass Valley, a Belden Brand, for years, purchasing more than 600 cameras for a wide variety of production projects during that time. When designing the newest trucks that will join the fleet over the next six months, leaders at NEP Europe concluded that the new LDX 86N cameras from Grass Valley will provide the best performance and value for the increasingly popular UHD production its clients are demanding.

“We won’t be switching to UHD production overnight, but it’s clear that the market is moving that way,” observed Paul Henriksen, president of NEP Broadcast Services Europe. “By making this decision now and equipping our new trucks with the best 4K cameras on the market, we are ready to offer the capability as our customers demand it. Also, the LDX 86N offers outstanding HD image performance, so it fits our production needs today very well.”

The LDX 86N camera is ideally suited for outside broadcasting as well as live and studio productions, giving broadcasters the advantage of selecting between the highest image resolution with native 4K or switching to native 3G/HD for all applications where uncompromised 3G/HD performance is required. It also offers the ability to upgrade to LDX 86N Universe functionality, with 6X HD and 3X HD/3G high frame rate capture, on a daily, weekly or perpetual basis with the GV-eLicense.

NEP Europe’s long-term plans include utilizing Grass Valley’s 4K and IP compatible K-Frame switcher video processing engines and Belden cable in its mobile fleet, with some of its projects beginning to rely on an IP infrastructure for improved speed and efficiency. The company already has commitments to provide UHD production for customers in Germany and Switzerland and expects more of its work to transition from HD to UHD with the five new OB vans, all with Karrera 3-stripe control surfaces.

“NEP Europe is a real pioneer in the market, putting everything in place to take advantage of the new opportunities for UHD production,” noted Jan-Pieter van Welsem, VP of sales and marketing, EMEA, Grass Valley. “We’ve worked together for years, and by staying on the leading edge of technology, we are able to provide them with solutions that fuel their growth while paving the way for change. Before we know it, UHD will be the default format for a majority of programming.”

The five new trucks, which include one triple-expanding unit that can accommodate up to 30 of the LDX 86N cameras, should be on the road by early spring of 2017. All future trucks built for NEP Europe will also feature LDX 86N cameras exclusively.

  • Wednesday, Nov. 16, 2016
DJI premieres short film shot entirely on drone by Oscar winner Claudio Miranda
Claudio Miranda
SHENZHEN, China -- 

DJI, a maker of unmanned aerial vehicles, has premiered a short film by Academy Award-winning cinematographer Claudio Miranda shot entirely on its new Inspire 2 professional drone and its professional-grade X5R camera.
The Circle stars Ryan Phillippe (Shooter) and Noah Schnapp (Stranger Things), and illustrates how the DJI Inspire 2 can be an integral part of the creative process for high-end filmed entertainment. Miranda, who won the Best Cinematography Oscar for Life of Pi in 2013 and was nominated in 2009 for The Curious Case of Benjamin Button, and executive producer Dana Brunetti (The Social Network, House of Cards) used the Inspire 2 and X5R for every shot in the film, from emotional close-ups to sweeping aerial views.
“Filmmaking has been tethered to the ground for so long. DJI’s technology is allowing filmmaking to be free,” Phillippe said. “The advantage is obvious on smaller projects when you can’t afford cranes and all of the technicians that come along with it, and you can still achieve the same beautiful shots as with the larger equipment.”
The Circle tells the story of an estranged father (Phillippe) who reunites with his young son (Schnapp) in Depression-era America after the sudden death of the boy’s mother. The two travel from town to town as dad makes a meager living for the both of them by sketching portraits for locals. It’s on this formative journey that the boy discovers not only the transformative power of art, but how to open his heart as well.
The DJI Inspire 2, unveiled Tuesday at an event on the Warner Bros. studio lot, is a ready-to-fly platform for high-end film and video creators. While aspiring filmmakers have long had their creative ambitions restricted by shoestring budgets and limited gear options, affordable and powerful aerial equipment is now opening new doors and providing them with more options to pursue their visions.
“When you think of shooting a movie set in the Great Depression, you think of big Hollywood studio films, big cameras and massive crews,” said director Sheldon Schwartz. “What’s unique about this shoot is that we’re using new technology to push the limits of camera language and storytelling.”                   

For cinematographers, Inspire 2 with the X5R camera allows them to control every aspect of an image, gives them new freedom to move the camera in three dimensions, and uses stabilized gimbal technology to eliminate unwanted camera movement.
“What’s fantastic about this drone is that we’re able to shoot in the RAW format,” Miranda said. “It’s nice to have that dynamic range, so I’m able to push shadows up or highlights down and create a mood.”
“When you work with incredible visual storytellers like Claudio who are hyper-focused on every detail and element of the image, it’s important to have the most dynamic and versatile product to help tell that story, but also one that doesn’t break the bank,” Brunetti said. “We get just that with the DJI Inspire 2.”

  • Tuesday, Nov. 15, 2016
Facilis TerraBlock at work on Canadian family drama series "Backstage"
HUDSON, Mass. -- 

Facilis, an international supplier of cost-effective, high performance shared storage solutions for collaborative media production networks, announced that the postproduction team for Canadian family drama series Backstage is again relying on Facilis TerraBlock for its collaborative editorial workflow for season 2.

Ellen Fine, CCE (Canadian Cinema Editors), is the supervising editor and consulting producer on Backstage. The series is produced by Fresh TV and distributed by DHX. It currently airs on the Family Channel in Canada and on the Disney Channel in the US and worldwide.

The offline edit is cut on Avid Media Composers by a team of three editors who share two assistants at Technicolor’s Toronto facility. For the first season, the team started out using only local storage. With some arm-twisting, the producers were convinced to allocate funds so that Fine’s team could better collaborate and share media.

The group called Don Kinzinger from Dynamix, a systems reseller and integrator in Ontario, to help them find the right system for their needs. Choices quickly narrowed down to Avid ISIS or Facilis TerraBlock. In the end, a 16TB TerraBlock 8D system best fit the budget and gave the team the flexibility they wanted, minimizing Apple OS and Media Composer version-compatibility restrictions imposed by the storage. The TerraBlock 8D was connected to a 10Gb Ethernet switch, and from there 1Gb Ethernet lines ran to five Mac Pro workstations. Central media access for the editorial team provided a great leap forward in efficiency and collaboration.

“We wanted a system that gave us the freedom to be on whatever Mac OS we were comfortable with, running any version of Media Composer that we felt was stable without any worries about compatibility with the storage,” stated Fine.

When the show was picked up for a second season, Fine knew that the producers would occasionally want to incorporate footage from season one, which meant it had to be online and accessible. Since this required more storage, the team called on Dynamix again, this time purchasing a 32TB TerraBlock 8D. The new storage was added to the same Ethernet switch so that everyone could access both servers.

Following the same hectic schedule as season one, the crew shot 30 episodes for season two over the summer. Since the location is an actual school, they had a very limited amount of time to film, with two complete units shooting simultaneously. “It’s quite unique: we’re block-shooting, so our actors, who are aged 14 to 17, are filming 4 episodes over 4 days. The actors have to memorize 4 entire episodes and jump back and forth between them,” said Fine. “They’re phenomenal dancers and musicians as well. It was very impressive.”

The show is filmed with the ARRI Amira camera with dailies processed and synced at Technicolor. Fine and team receive Avid DNxHD 36 files from the Technicolor team, which are transferred to the Facilis system, ready to edit. After the dailies are delivered, Fine and her team have 5 or 6 days to assemble an episode. Then they move onto the director’s cut, the producer’s cut, and finally the broadcaster’s cut before they lock. This year, they started in July, and typically finish in eight or nine months.
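Some back-of-the-envelope arithmetic shows why 1Gb Ethernet per workstation is comfortable for this kind of offline media: DNxHD 36 runs at roughly 36 Mbit/s per stream (the exact rate varies with frame rate, and the usable-throughput headroom figure below is an assumption):

```python
# Rough stream-count estimate for offline editing over 1Gb Ethernet.
# DNxHD 36 is nominally ~36 Mbit/s per stream; real rates vary with frame rate.

LINK_MBPS = 1000          # 1Gb Ethernet, nominal line rate
DNXHD36_MBPS = 36         # approximate per-stream bitrate
HEADROOM = 0.6            # assume ~60% of the link is usable after overhead

streams_per_link = int(LINK_MBPS * HEADROOM / DNXHD36_MBPS)
print(streams_per_link)   # comfortably more streams than one editor needs
```

Even with conservative assumptions, each editor's 1Gb drop can carry well over a dozen simultaneous DNxHD 36 streams, which is why the shared-storage layout described above holds up under a multi-editor workload.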

“The great thing about our Facilis system is that we don’t need to think about it.  It’s just rock solid and always there,” said Fine. “We organize everything very meticulously by episode in folders and bins. We store our music in a separate volume. It’s very efficient and just what we need to meet our deadlines.”

  • Monday, Nov. 14, 2016
HiScene, Inuitive, Heptagon team on AR glasses
HiAR Glasses
SANTA CLARA, Calif. -- 

HiScene, Inuitive and Heptagon have teamed to roll out HiAR Glasses, billed as HiScene’s next generation of Augmented Reality (AR) glasses. The companies worked together to develop a complete solution for advanced 3D depth sensing and AR/VR applications that delivers excellent performance even in changing light conditions and outdoors. HiAR Glasses incorporate Inuitive’s NU3000 Computer Vision Processor and Heptagon’s advanced illumination.
The glasses’ AR operating system provides stereoscopic interactivity, 3D gesture perception, intelligent speech recognition, natural image recognition and inertial measurement unit (IMU) tracking, presented through an improved 3D graphical user interface.
“We are committed to providing the best possible user experience to our customers, and for this reason we have partnered with Inuitive and Heptagon to create the most intelligent AR glasses available on the market,” said Chris Liao, CEO of HiScene. “The technologies implemented provide a seamless experience in a robust and compact format, without compromising on battery life.”
Inuitive’s NU3000 serves HiAR Glasses by providing 3D depth sensing and computer vision capabilities. The solution also acts as a smart sensor hub, accurately time-stamping and synchronizing multiple sensors in a manner that off-loads the application processor and shortens development time. “Inuitive’s solution allows HiScene to provide the reliability, latency and performance its customers expect,” said Shlomo Gadot, CEO of Inuitive. “With Inuitive technology, AR products and applications can now be used outdoors without sunlight interfering with or damaging their efficacy, thanks to cameras featuring depth perception.”
Heptagon provides unique IR pattern illuminators, which were chosen to handle changing light conditions and plain surfaces. In addition, the range and field-of-illumination features of Heptagon’s LIMA stereo pattern projector ensure superior lighting and added texture for higher-quality images.
“Our Wide Field of Illumination provides better gesture recognition, and our miniaturization technologies enable ultra-small, high performance, low power components for 3D AR/VR applications,” said Dr. Erik H. Volkerink, Heptagon’s chief business officer and executive VP.

  • Monday, Nov. 14, 2016
Moonlight Cinema deploys DaVinci Resolve Studio to complete animated film "Ozzy"
A scene from "Ozzy."
FREMONT, Calif. -- 

Blackmagic Design announced that Barcelona-based postproduction house Moonlight Cinema has completed both the digital intermediate and finishing of the new children’s animated film “Ozzy” on DaVinci Resolve Studio.

Postproduction director Alejandro Matus and colorist Ignasi González produced more than 40 different deliveries of the animation over a two-week period, using Moonlight Cinema’s three DaVinci Resolve color grading suites to cater for everything from different aspect ratios and frame rates through to multiple language versions.
“We had never taken on an animated film before, so there was an added layer of pressure to deliver a high quality finish,” said Matus. “Resolve was an enormous help throughout all aspects of the project, particularly with its new editing tools. Not only did we use it to conform and master the project, but we also were able to make changes to the edit during the grade without having to leave Resolve.”
Starting with 2K DPX files transferred to them by Tangent Animation in Canada, the Moonlight Cinema team applied a LUT to act as a base for the film’s overall look. “For many in the industry, it may seem natural to think that animation projects don’t require any real color correction, but in fact, the complete opposite is true,” said González. “Unlike a live action feature, a CG animation’s whole universe will have been created by many different people on many different machines. Homogenizing this content and giving greater depth to the animation through tools like light and blur is a daunting task.”
To ensure that every shot in a sequence matched uniformly, Matus and González decided to separate key elements from each scene through their respective alpha channels using mattes. This allowed for more control to modify individual assets, such as character faces, throughout the final grade. The next step was to give “Ozzy” its own unique aesthetic.
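The matte-based separation described above boils down to standard alpha compositing: a grade is applied only where the matte is set, and the untouched image shows through everywhere else. A minimal sketch (pure Python for clarity, and not Resolve's internals):

```python
# Minimal sketch of matte-limited grading via standard alpha compositing.
# An adjustment is applied where the alpha matte is 1.0, the original pixel
# is kept where it is 0.0, and the two are blended for in-between values.

def grade_through_matte(pixels, matte, grade):
    """pixels: pixel values; matte: alphas in [0, 1]; grade: function v -> v."""
    return [grade(p) * a + p * (1.0 - a) for p, a in zip(pixels, matte)]
```

With a separate matte per asset (a character's face, a prop), each element can be pushed independently during the final grade without disturbing the rest of the frame.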
“Ozzy is a children’s film, so we wanted to maintain a bright, friendly feel even in scenes that were tense or sad,” González shared. “The great thing about being a colorist on an animation like this was that I had way more creative freedom when grading. For some night scenes, for instance, I could push the blues of the sky to a point where a live action sky would never go. The same happened with sunsets.”
“The best thing about using DaVinci Resolve for this,” he continued, “was the Color Management system. What impressed me on this feature was Resolve’s ability to handle all of the different color space conversions that we needed to output. It was quick and pain free. We also had a lot of shot replacements coming in throughout the grading process. With Resolve, I could simply save a still from a grade that I was happy with to the stills gallery, and it would include all the keyframe and tracking information, which made it stress free to update to a new shot.”