Q&A with (from left to right): Evan Jacobs (Stereographic 3D Supervisor, Marvel), Richard Baker (Senior Stereo Supervisor, Prime Focus World) and Brian Taber (Senior Stereographer, Stereo-D).


Announcer (Nick Urbom, Advanced Imaging Society):

Our moderator tonight is a highly sought-after expert in 3D, high frame-rate production and post-production. He personally trained over 3,000 professionals while running Sony’s 3D tech center, has worked with nearly every major studio, and has collaborated with directors Baz Luhrmann, Ang Lee, Barry Sonnenfeld, Sam Raimi, Robert Zemeckis and Ridley Scott. He also served as the society’s first chairman. Please welcome our moderator, Buzz Hays.

Our first panelist tonight, from Marvel, is a visual effects supervisor and stereoscopic supervisor who has contributed to films such as Olympus Has Fallen, Alice in Wonderland and Titanic. He has served as a stereoscopic supervisor for Marvel Studios on Avengers: Age of Ultron, Captain America: The Winter Soldier and the upcoming release Ant-Man. His work on Marvel’s Guardians of the Galaxy was recognized by our society with a Lumiere™ for feature live action. Please welcome Evan Jacobs.

Our next panelist is the senior stereo supervisor for Prime Focus World in London and has supervised Prime Focus World’s stereo work on films including Avengers: Age of Ultron, Guardians of the Galaxy, Edge of Tomorrow, Maleficent, The Amazing Spider-Man 2 and Gravity with Alfonso Cuarón, for which he received our society’s Lumiere™ award for best 2D to 3D conversion. Prior to his work building Prime Focus World’s stereo division, he was employed at many of London’s top visual effects facilities, including MPC and Framestore, working as a compositor and compositing supervisor on films such as Charlie and the Chocolate Factory, the Harry Potter series, Poseidon, Angels and Demons and Sunshine. Please welcome Richard Baker.

And our final panelist this evening, I think Brian Taber is here. He serves as Senior Stereographer at Stereo-D, where he’s been since they opened their doors in 2009, a time when the current era of 3D filmmaking was still in its infancy. He has helped them grow from 15 employees to over 1,000 and has worked on feature films including Jackass 3D, The Avengers, Star Trek Into Darkness, Captain America: The Winter Soldier, Iron Man 3 and most recently Avengers: Age of Ultron. Please welcome Brian Taber.

 

Moderator (Buzz Hays, Advanced Imaging Society):

Thank you, Nick. Thanks everybody for being here. I really appreciate it. So I have a feeling you guys enjoyed the movie, right? But how many of you are not familiar with the conversion process? Anybody? Okay. So, quickly, if you guys wouldn’t mind, maybe we could start with Brian at the tail end and just talk in very broad strokes about what it means to convert a movie from 2D to 3D, just so we can catch everybody up here.

 

Brian Taber, Stereo-D: 

In conversion we start with the 2D plate. We break it down into individual layers, separate them out into foreground and background and all of the little details that you see in stereo, and there is a lot of cleanup involved with the occlusions that are revealed when you create the separate eye. And that is it in a nutshell.

 

Moderator:

That doesn’t sound that hard (laughter). Well, the way I usually describe conversion to people who don’t fully understand it: there are basically a few different ways to make a 3D movie - three simple ones, actually. The first 3D movies that we saw in this modern era of 3D, back in 2005, were animated films, so those are handled slightly differently: we render two different eyes, one to represent what the left eye sees and one to represent what the right eye sees. Then following on from that we had some native production - native meaning we shoot in 3D. When conversion started out in 2006-2007, when I was with Sony, we had a film called G-Force; it was the first live-action converted film. So conversion was very much in its infancy then, and the process has changed tremendously. But basically, to put it in a very concise way, converting a film is turning every single frame of your movie into a visual effect, which is quite challenging, because visual effects have become pretty spectacular, as you can see in a film like this - it’s a great representation of it. You have to go in and touch every single frame, take every object in that frame, cut it out, put it where it belongs in space, give it shape and volume, and then put it all back together to make it look like you didn’t do anything. It’s a pretty masterful thing, so a quick round of applause for these guys for doing such amazing work.

(Applause)
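To make the moderator’s description a bit more concrete: the core of that “put it where it belongs in space” step is assigning each separated object a depth and deriving a horizontal pixel offset (disparity) for the second eye. The sketch below is purely illustrative - the mapping function, object names and numbers are invented, not any studio’s actual pipeline.

```python
# Toy sketch of depth-to-disparity assignment in 2D-to-3D conversion.
# All names and values here are hypothetical, chosen only to illustrate the idea.

def disparity_from_depth(depth, screen_depth=10.0, scale=2.0):
    """Map an object's assigned depth (metres) to a horizontal pixel
    offset: objects at `screen_depth` land on the screen plane (0 px),
    nearer objects get negative disparity (in front of the screen),
    farther objects positive (behind it)."""
    return scale * (depth - screen_depth) / depth

# Three "layers" cut out of a 2D plate, each given a depth by the artist:
objects = {"hero": 5.0, "wall": 10.0, "sky": 100.0}
for name, depth in objects.items():
    print(f"{name}: {disparity_from_depth(depth):+.2f} px")
# The hero shifts in front of the screen, the wall sits on it, and the sky
# settles a little behind; shift each layer by its disparity and you have
# the second eye (minus the newly exposed holes that still need filling).
```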

 

 

Moderator:

The first question I wanted to direct is to Evan, just to talk a little bit about the creative process. When you do a film like this, it’s a bit different than an animated film or a natively shot 3D film, because 3D tends to come later in the process. But you still have directors like Joss Whedon who have a very specific style that they go for. So I’m curious if you could talk a little bit about the creative process and where this idea of converting from 2D to 3D starts with the creatives and then how that carries through the process?

 

Evan Jacobs, Marvel:

If you think about it, 3D is an extension of the visual language in the storytelling of the film, just like the colour correction or the lens choices, the costumes, any other department you can think of. And, as you say, we sort of get the material last. The film in this case was produced in 2D first, and somewhere early in post-production we began selecting scenes, picking the big moments and trying to identify places where we want to go deeper versus shallower, to create what we call a depth script for the film, to maximize the audience’s enjoyment and find those big 3D moments. With Joss it was about supporting the vision that he already had in 2D, and then finding places where we could actually take it further. An example of that might be the difference between seeing the movie in 2D and seeing those sequences where we are inside this virtual holographic environment, and then seeing that in 3D. If you were to compare them, what you see is a whole lot more detail in the 3D, and it is a much more immersive experience. That was just one of those opportunities that 3D gave us to take the movie a little further. So we would support what’s already there, and then, wherever we can, we take it as far as we can possibly take it, maximizing and taking advantage of this extra dimension we have. Another example is Hulkbuster versus Hulk. The challenge of that sequence, even in 2D, was to keep the scale of those guys making sense, because they are two huge guys but they can start to feel normal-sized when you don’t have anything to compare them to. In 3D we were able to keep them feeling massive, so that we could subliminally support what the 2D was trying to do. We try to find those places to support the film and make it as cool as we can possibly make it.

 

Moderator:

Well it’s working (laughter). One of the things I’d like Richard to touch on a little bit, because you’ve done a lot of this work as well, is how the process itself has evolved? And maybe not so technical, but just in terms of the time it takes to do these things. How has that evolved since you started doing this work?

 

Richard Baker, Prime Focus World:

Overall I think the whole process has risen in quality over recent years. The process is, as you said, very much a visual effects process - the reviewing, establishing the look with Evan, and the review process of us sending shots over here. Internally, for us, and I’m sure it’s the same for Stereo-D, there’s a level of skill among artists now that didn’t exist when we first started doing conversions. There weren’t really that many people around who could actually do it, or had skills in it. And there weren’t really that many technical tools to help us with what we do. Now, with the way we work with elements from the film as well as all of the visual effects shots, we are able to get all of the layers broken out and provided to us, which makes it a lot easier. So the whole process has evolved hugely. In terms of the timeframe, that’s another one of the challenges that we always have, and one of the great things about working with Marvel and the team is that everybody’s very integrated into the production. We all know what’s going on; there are no fake deadlines and mysteries. Really it’s like, “Right, guys, this is what’s going on. This is when the delivery is. This is what’s just happened to that scene, it’s just changed.” We are all pushing forward to try and get that film delivered at the highest quality, and I think that kind of collaboration is what makes it a lot easier.

 

Evan Jacobs, Marvel:

Yeah, if I could just add: from my perspective, I feel like there’s also this visual language that has evolved over time in terms of how you tell a 3D story, and also the audience’s tolerance for depth and how far you can push things. There’s a lot of subtle technical stuff going on that evolves over time. That’s part of the fun of 3D - it’s still really a pretty new visual language. So we are always experimenting: “What if we did this?” - and then we see how the audience reacts. It’s very interesting.

 

 

Moderator:

That’s a very good point. When we were doing some very early conversion work, we used to joke that you can’t cheat the third dimension. Because we know what the world looks like - we know the scale of things, how big people are, how big buildings are, that sort of thing. When you convert something you have to honor that, because we are all now seeing it through the eyes of someone who has taken a 2D image and made it 3D. There are some significant challenges in that, especially when it relates to certain objects. When you have characters like the Hulk, who are clearly much bigger than other people, you run the risk of making him look small or normal, which really isn’t the intent. That becomes one of the challenges that I think Evan is referring to. It is evolving - not only the language of storytelling, but also the tools that are used to do this. There is a lot of automation that goes into this process, and maybe Brian you can talk a little bit about how those tools have become more automated. But it’s still very much a human process. There are still a lot of hands in the mix when it comes to making a converted film.

 

Brian Taber, Stereo-D: 

Yeah, I mean, with the technology - since I started back in 2009, it was very hands-on, very manual. It still is, and our company has grown to over a thousand people, like you said. But with the artists getting better, the development getting better, and the artists understanding what they need from the tools, we are able to create different aspects of the process that help us skip steps, eliminate steps, or make steps quicker. There’s a lot of painting involved, a lot of tracking. That stuff can be automated within the process to help with the occlusion fill. That’s one of the major challenges in conversion.

 

Evan Jacobs, Marvel:

I think also, what that automation gives you is the opportunity to see the shot more than once. So you have more creative opportunity to play because you don’t just get the opportunity to see the shot one time and then go, “Well, we’re out of time.” 

 

Richard Baker, Prime Focus World:

Yeah. The number of times Evan sees a shot is maybe 4 or 5 times… hopefully… (laughter) but internally it’s going through 20 or 30+ iterations of development. You lay it out; the first time you see a shot will be once all the roto is done. We work with very similar pipelines, which is good for us because our work looks similar when we’ve got shots cut with each other but coming from different companies. So we see it as a layout, and then we’ll take the shots to a depth approval stage, where we’ll first show it to Evan. From that point on, once Evan approves it, we can take it forward, clean all the edges, do all the filling in, and take the shot to final. That’s quite a few weeks of work, and a lot of people from a number of different disciplines have worked on that shot.

 

Moderator:

Yeah, there is so much hand involvement in this, and you want to leave no trace when you are done. You don’t want the hands of the artist showing. You just want a great three-dimensional image. As Brian mentioned, there’s this phenomenon called occlusion. Basically, if you have an object in front of another object - a simple example would be a person standing in front of a wall - then you need to separate the person from the wall. But if you start to move the pieces around to make a three-dimensional image, you’re creating holes in that background plate that have to be filled in somehow. That’s where some of these procedural processes are coming about to help with that, but it still requires the hands of an artist. And it’s amazing to see the level of detail that is in the conversion of this film. Specifically when you have Tony Stark - there’s so much detail in front of his face, or we are looking through spaceship windows and all of these other layers of transparency. It’s an enormous challenge to separate out what’s reflected in the glass from what’s behind it and what’s behind the subject. I’m wondering if maybe you guys can talk a little bit about those challenges, and how working with the visual effects team has actually made that easier?
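The occlusion problem described here - moving a cut-out layer and exposing a hole in the plate behind it - can be shown with a one-line “image.” This is an illustrative sketch only; the function and the pixel values are invented:

```python
# Illustrative sketch of the occlusion problem in 2D-to-3D conversion.
# A 1-D "row" of pixels: background letters, with a foreground object 'F'
# occupying some columns. All values are hypothetical.

def render_right_eye(background, fg_start, fg_end, disparity):
    """Shift the foreground left by `disparity` pixels to synthesize the
    right-eye view; columns the foreground vacates become holes ('?')
    that an artist (or an inpainting tool) must fill."""
    row = list(background)
    # remove the foreground from its original position, exposing unknown pixels
    for i in range(fg_start, fg_end):
        row[i] = '?'
    # re-draw the foreground at its shifted position
    for i in range(fg_start, fg_end):
        j = i - disparity
        if 0 <= j < len(row):
            row[j] = 'F'
    return ''.join(row)

background = 'abcdFFFhij'   # the 2D plate: 'F' is a person in front of a wall
right = render_right_eye(background, 4, 7, 2)
print(right)  # 'abFFF??hij' -- '?' marks occluded background to be painted in
```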

 

Richard Baker, Prime Focus World:

That’s a good example - the close-ups on Tony, looking into his mask, where you can see layers of holograms - the HUD display. There will be a 2D plate of him; it will come to us, we’ll convert it, and we’ll then pass it over to the visual effects company that is doing those graphics. Those graphics will be rendered in stereo so they are really nice and clean, because sometimes with transparencies it’s quite difficult to get a really nice look. So they will be rendered in stereo and composited into the stereo shots. Then with that shot where he pulls back and flies off, that’s combining the HUD shot with the visual effects shot created by ILM that we’ve converted. So it brings together three different companies - two visual effects companies and stereo conversion - working together on that one shot.

 

Evan Jacobs, Marvel:

Yeah, you know, one of the things that we talk about is that the debate about conversion versus native photography has died down to some extent, because the level of conversion has got to the place where, unless you’ve got a really educated eye, most people can’t tell the difference. But what we are doing is really a hybrid of conversion and fully CG rendered stereo shots. Take the shot of Thanos at the end: Luma rendered that shot with both eyes. Stereo-D set the cameras for them, they went and rendered it, and they delivered it to us as a stereo shot. We didn’t convert it. We have other shots, like the holograms, where we set the camera based on what we are converting for the background, and then they delivered stereo elements to those, and we integrate them into the stereo project. So we are trying to use every tool in the toolbox where it’s appropriate - honouring the filmmaker’s intent, maximizing the impact of the shots, getting it done quickly, all that stuff. I don’t think of it as a straight conversion type of thing. It’s not like the filmmakers handed us a 2D movie and then walked away. We are tightly integrated into the creative process, we are working in lockstep with the visual effects vendors, and we are getting elements from them. We are just part of the team. About forty percent of the audience during the first three weeks saw the movie in 3D, so it’s important that it looks good.

 

 

Moderator:

That’s pretty great, and an important consideration, certainly. I remember when the first conversions were done, not that long ago, most of the techniques and tools that were used were designed by software developers. No offence to software developers, but the work didn’t have the hands of the artist in it. It was only once we got visual effects supervisors, and people with a really good creative eye, involved that we were able to take things to the level that we are seeing now. Plus, this integration of having visual effects companies who are ready to deliver separate elements - where you might see Iron Man, and the smoke layers, and some other things coming in as separate pieces - means they don’t have to basically un-mix paint to try and separate those elements, which makes things a lot easier. That integration didn’t exist a few years ago, and without it this work would have been quite impossible. Some of the earliest demos we saw were interesting, but they didn’t include the hard stuff like glass, and smoke, and reflections - all those things without which a film like this would not be nearly as successful. So we’ve come a long way. And I agree, I think the hybrid approach is a perfect approach to making a great 3D story. Now, I’m just curious, especially Evan, from your perspective, how have the schedules changed between when you started doing 3D at Marvel and today? You never seem to get more time.

 

Evan Jacobs, Marvel:

Yeah. No, it’s disappointing. And I think Brian could speak to it even more because he’s worked, I think, on every single Marvel production, right? When I started on Captain America: The Winter Soldier, we had, I think, 19 weeks on that film from the day we started breaking it down. Then Guardians of the Galaxy was down to 12 weeks. And on Ant-Man, I think we are down to 8 weeks. We just keep setting the bar a little higher every time… or lower (jokingly), I don’t know. But this film is huge, and very long - 3,200-plus shots or something. It was an absurd amount of work. And the challenge with this particular film was actually not just the schedule. I mean, the schedule was an issue in terms of the overall amount of time that we had, but the bigger challenge was actually visual effects, because there were so many shots, and so many shots with a really high complexity level, that some of them came in very, very late. Each of these companies had to deal with shots that came in literally hot off the presses, right at the end of our 3D deadline. For example, the opening shot - the third shot in the show, which is about a minute long; we call it the tie-in shot, where all of the characters are flying through the forest. That was a shot that Prime had to convert in seven days or something like that, from “go.” They literally had almost nothing to work on before that, and it’s a huge amount of work. And Stereo-D didn’t get away easy either. They had a shot that was even longer, with some visual effects elements but a lot of live action glass-over-glass-over-glass, and that shot was also one of the last visual effects to final. So, you know, we rise to the challenge, and we planned those shots out, but you get to a place where you go, “Hey, the 2D has got to look great so that we have our thing to do.” So we want to give them the time.
So you don’t want to say, “Hey, you have a deadline. You have to deliver it to us, even if you’re not done with it.” We all need to finish it, so it’s definitely challenging. But with teams like these guys… I mean, Brian says he’s got 1,000 guys, so you give everyone a shot and you’re done in a day… (laughter)… It’s like Victoria Alonso, my boss, always says, “Can you have a baby in one month because you have nine women?” (laughter)

 

Moderator:

So, speaking of challenges. Brian, would you talk about what you felt were some of the most challenging aspects of this particular film, aside from time?

 

Brian Taber, Stereo-D: 

Creatively, the lab was very interesting. Tony’s lab, Banner’s lab - it was a lot of glass-over-glass, as Evan said. That is very difficult for the conversion process. There are a lot of moving cameras in there, which makes it very difficult to get the z rates correct, to get the right feeling that you are moving through the space. Also, just filling out that space and making sure that nothing felt too large or too small - you can get a lot of miniaturizing effects when you start putting too much stereo into things. So that lab space was definitely one of the more creative challenges for us. It was also one of the newer environments for us. A lot of these characters I’ve dealt with over multiple movies now, so their look has been established. But that new lab was something we could explore and work with to make it look great, and make it feel like you are in that environment.

 

Moderator:

That’s great. And Richard?

 

Richard Baker, Prime Focus World:

Similar kinds of things. One of the things, talking about scale, was with Hulk. In the wide shots of Hulk when he’s with Black Widow, the wide lens does make him look a little on the smaller side, so there are some subtle things like cheating the depth a little bit with Hulk to maintain his scale.

 

Evan Jacobs, Marvel:

One of the interesting things about Hulk, and Brian, we’ve talked about this as well, is that he’s a CG character. So we got a CG depth render of Hulk, but using that right out of the box didn’t give us a result that we liked.

 

Richard Baker, Prime Focus World:

Yeah, it always needed a little bit of manipulation, really. And sometimes you find that even with the sculpting. I mean, receiving z-depths is great for CG characters, for speeding that process along. But you often find, once you plug it in and look at the depth of a shot, that you have to alter certain characteristics just to make it more pleasing to the eye.

 

 

Moderator:

That brings up an interesting question - I’m not sure there’s even an answer for it yet. I’ve noticed in a lot of 2D filmmaking, especially early on, they are focused on trying to get the movie ‘in the can’, and then figure out what they are working with in post production. So there is not a whole lot of regard for 3D, although there tends to be a little bit more now. But I’ve noticed that, in particular, the lens choices, as you’ve mentioned, can really affect the quality of the conversion later. A lot of filmmakers will use a long lens to change the perceived perspective in a shot, but when you get into conversion, in some ways that limits your options, and I’m wondering if that’s been the case with the work that you’ve done.

 

Evan Jacobs, Marvel:

One of the things that I think I can credit Brian’s team with developing a bit is how to deal with these long lenses. One of the things that I often say is really cool about conversion is that you are not limited by physics. If you actually take out a stereo rig with two cameras and two lenses, and you shoot with a long lens - a telephoto lens - you tend to get kind of a “card-y” look, a flattened-out detail, and you don’t get nice round features on people’s faces and things like that. One of the things I found when I came to Marvel and started to work with Stereo-D’s team was the way that they treated that, by basically treating the space nonlinearly. So saying, “Let the faces be round, even if they shouldn’t be, and let the backgrounds go a little flatter, and put the focus on the characters,” which is often the most important thing. Make sure your actors look good. That’s a big focus for us.
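The nonlinear depth treatment described here - round faces, flattened backgrounds - amounts to remapping depth with a curve rather than using it linearly. The curve below is a hypothetical example of such a remap, not Stereo-D’s actual method:

```python
# Hypothetical sketch of nonlinear depth remapping for long-lens shots:
# keep full depth variation on the near subject (so faces stay round)
# while compressing everything behind it (so the background flattens).
import math

def remap_depth(z, subject_z=1.0, falloff=0.5):
    """Identity up to the subject's depth; logarithmic compression beyond
    it, so distant depths bunch together smoothly."""
    if z <= subject_z:
        return z                      # full roundness on the subject
    return subject_z + falloff * math.log(z / subject_z)

# Depths in front of the subject pass through unchanged; depths far
# behind it are squeezed into a narrow band:
for z in (0.5, 1.0, 2.0, 10.0, 100.0):
    print(f"{z:6.1f} -> {remap_depth(z):.2f}")
```

The exact curve is a creative choice per shot; the point is only that the mapping from scene depth to stereo depth need not be linear, which is something a physical two-camera rig cannot do.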

 

Brian Taber, Stereo-D: 

Yeah. Definitely with Marvel, they are all about their superheroes and their characters. Working with them over the years, we’ve developed that visual language - Iron Man looking the way he needs to look, his suit looking the way it needs to look, and using the stereo to draw the audience’s eyes to where the story is going.

 

Moderator:

Yeah, actually, another question I have: in a case like this, where you are using multiple vendors to do the conversion work, there’s a certain hand-off process, so you have to be more aware of how the stereo plays between shots that came from different vendors. How do you typically handle that?

 

Evan Jacobs, Marvel:

Well, we share a lot of imagery, to be honest. I mean, there’s a lot of transparency on our team. I don’t really think of these guys as vendors; they are just part of our team. They are coming in, they are seeing each other’s shots, they are definitely seeing hand-off points and things like that, and we are sharing a lot of information so that it does integrate well. And honestly, it’s funny - it’s never been a problem.

 

Richard Baker, Prime Focus World:

Often, if it’s possible, we try to divide it by scenes as well. If Brian’s team is working on a shot of Iron Man, and we’re on one, and there’s a minute in between them or something, you’re not going to notice the differences necessarily, because we are all pretty much in the same place with it. But when you do intercut shots for whatever reason, Evan will go, “Look, I’m going to send you over a version that Stereo-D have done so that you can match to it,” and so forth, to bring it in line.

 

Moderator:

And in terms of depth balancing across cuts, are you typically doing that on the Marvel end in post, or are the vendors doing that within their own sequences to make sure you don’t have jumps of attention across cuts?

 

Evan Jacobs, Marvel:

Well, we do a pretty aggressive balancing pass here at the studio. I’m reviewing everything in a theatre not quite as big as this one - although we do actually review in this theatre sometimes - but we’ve got our own theatre, so we are watching everything big all of the time. All of the shots come here to the same space. The guys will come over, see everything here, and take their notes. Then once we start assembling the reels together, we’ll balance cut-to-cut, dealing with the horizontal image translations - moving the screen plane around - and also the floating windows, which we use a lot because we are sensitive to edge violations… we don’t like those. So we do a pretty heavy-handed balancing on the Marvel side. But that doesn’t mean there aren’t notes that go back to them to say, “Hey, listen guys, I can’t make this work. Can you adjust this?” That happens too.
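The horizontal image translation mentioned here is conceptually simple: sliding the two eyes horizontally against each other adds a constant to every disparity in the shot, pushing the whole scene toward or away from the screen plane. A minimal sketch, with invented pixel values:

```python
# Sketch of horizontal image translation (HIT), the cut-to-cut balancing
# tool described above: a constant offset applied to every disparity in a
# shot moves the entire scene relative to the screen plane. The sign
# convention and all numbers here are hypothetical.

def apply_hit(disparities, shift):
    """Add a constant horizontal offset to every disparity.
    Convention used here: positive = behind the screen, zero = on it,
    negative = in front of it."""
    return [d + shift for d in disparities]

# Incoming shot: the nearest object pops 6 px in front of the screen (-6),
# clashing with the previous cut. Push the whole scene back by 4 px:
shot = [-6, 0, 10, 25]           # per-object disparities in pixels
balanced = apply_hit(shot, 4)
print(balanced)                  # [-2, 4, 14, 29]
```

Note that HIT shifts everything uniformly; relative depth within the shot is unchanged, which is why it works as a balancing pass rather than a re-conversion.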

 

Richard Baker, Prime Focus World:

Internally, we review stuff in sequence as well. If we are working on a sequence, my lead supervisors will be working on it shot by shot and, like Evan does with us, I’ll give it a sort of overview to look at the bigger picture and make sure everything is running smoothly.

 


Moderator:

Anybody have any questions out there? I see one right here in the front. Yes sir.


From the audience:

First of all, I want to say thank you. It’s a really good movie. The level of detail that you’ve put in, even in the HUD displays - you know, when you go out and buy the DVD, if you read the text, there are actually real words in there and not just gibberish.


Evan Jacobs, Marvel:

Absolutely. It’s interesting that you mention that, because that’s an area where we actually deviated from the 2D version of the movie. In the 2D version those things are defocused, because you want to look past them. But in 3D they didn’t look good that way, so we actually sharpened them up.


From the audience: 

My question is: in Hollywood, the big movies, we are getting them out fast, like you said - 3 weeks, 6 weeks, you know, 2 days (jokingly). And 3D is hard to sell to the audience a lot of the time. I’ve seen a lot of my clients who are in 3D moving into other venues like VR and augmented reality. I was wondering if you guys, in your divisions or your companies, are looking into those other technologies - augmented reality and virtual reality - and bringing 3D into them?


Evan Jacobs, Marvel:

Well, you know, it’s interesting. For Avengers, Samsung actually did a promotion with their Oculus set-up. We took the Avengers mansion, I guess it’s called - the Avengers space where they have the party - and we went out and captured that environment, converted it into 3D, and you could walk around it and look around; this was a promo for Samsung. So we’ve started playing with this stuff a little bit. I think the challenge for VR is still: how do you integrate storytelling into it? You’re giving the audience a lot of opportunity to look around, so you’ve got to figure out how the storytelling works, and that’s still the nut to crack. But from a technical standpoint, we are really interested in figuring that out.


Moderator:

And that actually brings up one of the questions that I have - I’d be remiss in not asking this now that we’ve renamed ourselves the Advanced Imaging and 3D Society. Are you working with higher resolutions and higher dynamic range these days, and what challenges might that bring?


Evan Jacobs, Marvel:

Yeah, well, this was the first film where we actually did a version for IMAX with their new laser projector, which is sort of a higher dynamic range projector that they are rolling out. They’ve only got it in four or five theatres, but the movie was released that way. It was the first 3D movie that they’ve done this way - they did Fast and Furious as well, but that was a 2D film. We are going to do that again for Ant-Man and then Captain America. I think they’ve already announced that they are going to be shooting sequences of that film with the 6K IMAX Arri 65 camera. So we are pushing further and further down the line. It’s coming fast, and we are definitely embracing it. This film was pretty far along before that technology rolled out - I think we had just finished shooting when we saw the Dolby projectors get rolled out, but we did jump in and try to do a pass for that. So, little by little. It’s amazing; it’s going to change things. If you see the movie at the Chinese Theatre, you see it with the laser projector, and - not to get into the weeds technically - they projected here tonight probably at somewhere in the neighbourhood of 6 foot-lamberts, which is how you describe the brightness of the projector, and that’s pretty bright for 3D. But at the Chinese Theatre you’d see it at 14 foot-lamberts, so it’s much brighter. It gets to the point where you don’t even notice the glasses at all. It’s going to change the way people perceive 3D, it really is.
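For reference, the foot-lambert figures quoted above convert to the nits (cd/m²) used in most display specifications. The conversion factor is the standard photometric one, not a number from the panel:

```python
# Convert the quoted screen brightnesses from foot-lamberts to nits.
# 1 fL = 3.4262591 cd/m^2 (standard photometric conversion).
FL_TO_NITS = 3.4262591

for fl in (6, 14):               # typical 3D vs. laser-projected 3D
    print(f"{fl} fL ~= {fl * FL_TO_NITS:.1f} nits")
# 6 fL ~= 20.6 nits; 14 fL ~= 48.0 nits
```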


Moderator:

Are there any challenges from the vendor side in terms of added resolution or dynamic range that you have to accommodate?


Brian Taber, Stereo-D: 

I think with the dynamic range, it definitely helps us identify problems that you might not see on a 6 foot-lambert screen. It’s definitely a bigger challenge for us. From a resolution standpoint, we are working with more and more material that is 3K and 4K. That just means bigger and beefier machines and servers for us to push it through, but the work is the same.


Richard Baker, Prime Focus World:

Yeah, and probably from a colour grading point of view as well, high dynamic range is where you are really going to benefit in 3D. What’s nice in 3D, when it’s quite dark, is to keep a bit of fill light in those low-light areas so you can always feel the layering of an image. So I guess that’s probably where we are going to gain quite a lot.


Moderator:

Another question?


From the audience: 

Hi. So what scene or sequence that you guys went through unexpectedly had the most notes, and how did you crack it?


Evan Jacobs, Marvel:

I’m going to say the farmhouse. It seems like a fairly straightforward scene, but there are a lot of people, and there is a lot of space, and the shots are quite long.


Brian Taber, Stereo-D: 

Yeah, with the farmhouse, that was one of the first scenes that we got because there weren’t a lot of visual effects in there. You’d be surprised: almost every type of shot ends up being a visual effects shot with some kind of vanity fix. But we can usually get started on those pretty early. So those went through some rounds of notes, getting the characters looking right again. Any time you jump back into a project, it takes a while to get re-acclimated to it. There is a lot of space in there. They do step away from the house quite a bit, so the miniaturization of the house again was a thing.


Moderator:

Your question, here in the middle.


From the audience: 

Hello. I’m a student and I have a propensity towards immersive technologies. Obviously you are at the pinnacle of the industry... how far away are we from having a fully immersed movie experience, taking what we saw on the screen and having it 100% immersive?


Evan Jacobs, Marvel:

Well I think what’s interesting is… what I’m excited by is this movement towards premium large format theatres, and making theatre-going more than just another little multiplex. You know, there was a period of time when it was about cramming small screens into lots of spaces. And now you go to theatres and there are leather seats and big huge screens, some of them 90 feet wide. This is a 40-foot-wide screen here, just to give you a sense of scale. And I feel that’s where the distribution side of things, or the exhibition side of things, is trying to differentiate from the TV you have in your house. Are we going to be watching these movies on Oculus and stuff? I don’t know. Maybe. I mean, I think it would work.


Richard Baker, Prime Focus World:

Yeah, it’s possible. I think 3D brings a lot of immersion to a film anyway. It doesn’t work for every film. I think tonight was a lot more immersive for me, for sure. And I know, for example, Gravity, you watch that in 2D and it’s a very different film than it is in 3D. I think they are really good examples. It’s not just about changing a movie into 3D, you know. It’s also, as I was saying, you’ve got the ups and the downs, and you’ve got your 3D moments of course. But I think appreciating a character, and appreciating the depth and the sculpting and all of the fine detail is more of a subliminal thing which does make the overall experience more immersive.


Moderator:

And on that note, I think technologically that can happen today. There is no reason why that can’t happen today. But I personally think there’s a philosophical adjustment that needs to be made in the viewing audience, because essentially you’re asking the viewer to play camera operator and director for 2 hours, and I don’t think a lot of people are prepared for that. I think a lot of people are waiting to be entertained and not have to work too hard for it. So there’s going to be a cultural shift in the way we appreciate entertainment before that really becomes fully seamless….


From the audience: 

Well, how can persons like myself start developing these methodologies? Because I have ideas for telling these stories. How can a person like myself, that is still in school, start integrating these ideas…


Evan Jacobs, Marvel:

It’s an interesting thing, because people sometimes talk about 3D as kind of an emerging new technology, and it’s not. I see it a lot like live theatre, and if you think about it as a live theatre director, you’ve got to use the tools in your toolbox to get the audience to look where you want them to look. That means shifting the light, that means moving someone upstage or downstage, that means maybe having someone deliver lines while other people go silent. There are a lot of different ways you can direct people’s attention. But if you are a filmmaker and you are only thinking about the film as a 2D thing, and you are not contemplating that third dimension, you’re not going to have a successful result, that’s for sure.

When we worked on Guardians of the Galaxy, which was a particularly successful 3D film for all of us, we had a filmmaker who was in the seat with us on a weekly basis, making notes, saying, “Change this. No, no, no. This is what I was trying to do here…” and letting us help him support his own vision, you know. And we got to play with variable aspect ratios in the IMAX version, so we had the film opening up into big wide images and then back to small, and we played all kinds of fun tricks with that movie because we had a filmmaker who was excited about it and we could play.

I think, you know, as a storyteller, you’re in school and you want to play with it. The best thing is to start playing with the tools. There are 3D cameras out there. Conversion is still an expensive thing to play around with at that level, but you can definitely get your hands on 3D rigs and play with them. And take any opportunity to get to know 3D, because the more 3D you watch, the more you learn about the tools and how to tell the story. It takes time to learn how to view it and to know what impact it’s going to have on the audience. It’s so subjective too, that’s the other thing.
It’s like, I’m constantly bringing people into the screening room, and saying, “What do you see? What do you look at?”


Richard Baker, Prime Focus World:

That’s why we get so many notes (joking).

(laughter)


Evan Jacobs, Marvel:

Yeah, yeah, exactly, exactly. We give everybody a note (joking).


Moderator:

Well your timing is great regarding VR because, to be perfectly honest, nobody knows anything. It’s so “wild west.” There are no toolsets, there is no standardization, there’s nothing. So you have only to succeed in that arena. We’ve got a question up here in the back.


From the audience: 

It’s Corey (Corey Turner, Paramount VP 3D Post Production).


Brian Taber, Stereo-D: 

Oh great. On the spot (joking).


Corey Turner, from the audience: 

I have a question. Knowing that VFX is always considerably challenging to deliver, and this is primarily to Taber and to ‘Lord’ Baker…

(laughter)


Evan Jacobs, Marvel:

Isn’t it “Sir?” Is it “Sir” yet?

(laughter)


Corey Turner, from the audience: 

I’m curious how you, Evan, handled this. Exactly how late were some of the shots delivered before final, and if there were any compromises made when that happened, and specifically how many shots could be turned around in that time if one was in the middle of a movie, trying to finish? Just curious.

(laughter)


Evan Jacobs, Marvel:

It’s so relative, right? I mean it came on time, in that we knew it was going to come in when it came in.

(laughter)


Richard Baker, Prime Focus World:

Some of the shots, we did get a final version maybe five days before, but because of the way we are set up with the visual effects companies, like ILM for example, we can harvest earlier versions while they are in progress. And this isn’t for you, Corey, this is more for the benefit of the audience, what I’m saying now. You know the answers anyway.

(laughter)


Brian Taber, Stereo-D: 

Yeah. The answer for Corey is zero. Zero came late. They were all on time. Months early.

(laughter)


Evan Jacobs, Marvel:

Look, let me just give you a sense of it. I mean, we had 3,200 shots in the film, and I think we delivered about half of those shots in the last two weeks of the movie.


Corey Turner, from the audience: 

That’s excellent news.

(laughter)


Richard Baker, Prime Focus World:

We’re not going to do that for you, Corey.

(laughter)


Moderator:

Uh, we’ve got one here, right in the front, and then we’ve got a gentleman in the back.


From the audience: 

Hi. Um, is it frustrating, after all the work you guys do, and bringing people into theatres like this, when the technology in other regular movie theatres isn’t really doing it the same service?


Evan Jacobs, Marvel:

Yeah. Absolutely. I mean, seeing the film… I’ve seen it a bunch of times now, in various different venues, and you know, look, it’s the same with sound and everything else. You want people to see it the best way they can see it, but I think 3D is really one that can suffer if the projector bulb is old or whatever. But these new laser projectors that are slowly but surely going to make their way into your theatres are going to change that tremendously. I mean, it’s just about to happen. We are a year or two away from a wide deployment of these things. We are going to see better 3D, better systems, and better glasses. It’s all changing really fast. And so, you know, we produce the best movie we can make and we hope that people find the best theatres to see it in. And that’s all we can do.


Moderator:

Yes, we have a question in the back.


From the audience: 

Yeah, back to the question where you were talking about the hybrid aspect of things: you’ve got your CGI stereoscopic elements and the conversion. Are you also working with films where it makes sense to shoot some stuff native as part of the hybrid? Like, shoot some stuff native, some stuff converted.


Richard Baker, Prime Focus World:

You can do that. I mean, ultimately it comes down to budget as well, because having a stereo crew versus shooting mono, there are two different types of crews there. But, I mean, Corey on Transformers did exactly that. Had multiple formats, really. There was definitely native material there, then some of that was combined, sort of native visual effects combined into mono plates, and then full native shots. So that does happen already, but there are lots of other considerations with that. And the thing about native is that it still has to go through a big post process at the end of the day. The lenses are a lot more accurate now, but you often get vertical line-up issues, and polarization issues if you are using a mirror rig. And in my experience as well, going back to the lens choices: if you are on a long lens, for example, you could be flattening the foreground to keep some mid-ground and background depth. So we’ve had situations where we’ve gone back into native footage with our conversion methods and increased the volume in certain areas. So I think, if anything, it’s about finding the right tools for the job. For certain shots, yeah, it might be worth doing that. But internally, the way we do it, the approach is based on what the shots require or what the film wants.


Evan Jacobs, Marvel:

I think, you know, for Marvel… I’ve always kind of been an advocate for this idea that you are describing, of taking a stereo rig out when it’s appropriate and shooting a rain scene or whatever, something that would be notoriously difficult to convert. But the thing about Marvel is, the films continue to evolve in post, and they are finding that they just don’t stop making the film. So it’s not a group of people who creatively want to limit any options, anywhere, ever. And so, any opportunity to make the movie better, they are going to take it. And so conversion really is the right tool for the way that we make movies, because we do not have to make depth choices on set; we can make them in post once we see how the movie cuts together.


Richard Baker, Prime Focus World:

I think that’s why visual effects films traditionally, in our business, are converted. I think, you know, native, if you are doing a heavy drama film or nature, those kinds of things, it’s perfect for it, you know. Because they tend to operate at a bit of a slower pace, I think. They are much longer projects, even in the making and filming of them. Whereas, on set, the feature films are really up against it from day one, aren’t they?


Moderator:

We have time for one last question, right over here.


From the audience: 

Can you speak generally about the software development? In general terms where is the development going?


Brian Taber, Stereo-D: 

With our development, it’s moving more and more into real time. Having better tools to adjust depth on the fly with Evan or with the filmmaker, and being able to make those adjustments and not have it break, so that we can proceed with the next step of the process.


Evan Jacobs, Marvel:

This is one of the most exciting things that’s happened, for me certainly in the last year, which is that we actually get an opportunity to sit with an artist and pull the levers right there in the film with the filmmakers. And so we can bring the director in and say, “What do you want to do?” And he can say, “Move this guy forward, push this guy back, make this rounder, make this flatter.” And we can just do it right there and he can see it, as opposed to having to make a bunch of written notes and wait a week and then get it back. It cuts down revision cycles and allows us to move faster, but it also allows us to play. Which is one of the things we are having a lot of fun with on the next film, Ant-Man, right now, because we have a lot of interesting scale choices to be made and opportunities to play with 3D, but it’s the sort of thing you almost have to do interactively. So we are getting an opportunity to do a bit of that, which has been an amazing new set of tools that we get to play with.


Moderator:

And with that, I think… I want to thank our panelists, Brian Taber, Richard Baker, Evan Jacobs, and of course, the Marvel organization. Without them, we wouldn’t be here watching this tonight. And I also want to thank Mr. Mike DeValue for being so kind and generous with the Disney theatre for this screening. So thank you all for coming, and make sure you tell all of your friends to go see this movie at least twice.

(applause)