When directors Peter Middleton and James Spinney were working on their documentary Notes on Blindness, their attention was on finding a filming style that could convey the deep emotion in the voice of John Hull, a professor of religious education at the University of Birmingham who went blind in 1983 and spent much of a decade recording his journal on tape as his sight deteriorated. But when Arte came on board as a co-producer, the idea emerged of adding a digital experience to the documentary, and Arnaud Colinart, from Ex Nihilo, was called in to imagine what would become Notes on Blindness: Into Darkness.
The following is an interview in which Arnaud Colinart guides us through a two-and-a-half-year creative journey that started as a podcast and ended up as one of the most poetic and acclaimed VR experiences produced to date.
Each of the six chapters of the VR experience addresses a memory, a moment and a specific location from John’s audio diary, using binaural audio and real-time 3D animations to create a fully immersive experience in a ‘world beyond sight’.
Produced by Ex Nihilo and Archer’s Mark in co-production with Arte France and the French studio AudioGaming, the project won the Storyscapes Award at the Tribeca Film Festival, the Special Jury Prize at the San Francisco Film Festival and the Alternate Realities VR Award at Sheffield Doc/Fest. It is now freely available on Samsung Gear VR, mobile and Cardboard (iOS | Android).
SG: Notes on Blindness started as a feature film documentary. Then, Arte had the idea of producing an interactive version of it, and it became a VR piece. It was quite a bold decision to go VR – considering that the theme was blindness. What was your initial brief?
AC: We had a very iterative creative process. We did not start to work on the digital experience with a VR piece in mind. With my partner David Coujard, we met Peter and James – the directors of Notes on Blindness – and their producers Mike Brett and Jo-Jo Ellison from Archer’s Mark, when they were looking for a French co-producer to produce the feature film with Arte, and Arte wanted to create a digital experience to go with the film.
When we started listening to John Hull on tape, we realised how deeply emotional the content was and our first discussion was about how to use such an amazing story. We really felt that we could make an interactive experience that could be as moving as the linear piece.
If you look at the whole field of interactive work (with the exception of video games), I think there is little work that has the emotional power of a more classic format, such as a feature film or a TV documentary. For me, there is one really moving piece in web documentaries, which is Welcome to Pine Point, from The Goggles and the NFB.
This is a piece that really moved me, far beyond the friction that you can have with an interface. Often, people see interactive documentary, or interactivity in general, as a wall between their experience and the story and the emotion. Maybe for this reason, when Peter and James thought of doing something in the digital world, they were more focused on an impact campaign. For us, social impact and art had to fuse together, and these two goals were part of the discussion when we first started.
SG: So, at what point of the creative process did you start considering VR?
AC: Very late, in fact, because we started by working on the material and focusing on the story itself. At first, we decided to do an interactive podcast. I was a big fan of the ‘Serial’ podcast, and there is very little interactive work focused on sound. So, when we started, the idea was to use John Hull’s tapes and make an interactive podcast using the input mechanics you have on a mobile – such as tracking, the gyroscope[1] and the microphone – to recreate a 360 environment around John’s voice, and to put you in the shoes of John Hull. All this based only on sound. When we started, we weren’t at all in a VR environment, although we were in an immersive environment. The immersion came from sound only, and this for several reasons: first, because the story was about blindness; second, because the non-fiction material was the audio on the tapes; and third, because, in the field of online documentary, very little is based on sound, and we thought we could be innovative there.
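The interview does not describe how the prototype actually positioned John’s voice around the listener, but the core idea – steering a sound around the user as the phone’s gyroscope reports which way they are facing – can be sketched with a simple constant-power pan law. Everything here (names, values) is illustrative, not taken from the project’s code:

```python
import math

def stereo_gains(source_azimuth_deg, listener_yaw_deg):
    """Constant-power stereo gains for a sound source at a given azimuth,
    relative to the listener's current yaw (e.g. from the phone gyroscope).
    0 degrees = straight ahead, positive angles = to the listener's right."""
    # Angle of the source relative to where the listener is facing,
    # wrapped into the range [-180, 180) degrees.
    rel = math.radians((source_azimuth_deg - listener_yaw_deg + 180) % 360 - 180)
    # Map the relative angle onto the pan position [-1, 1];
    # sources behind the listener are folded to the sides.
    pan = max(-1.0, min(1.0, math.sin(rel)))
    # Constant-power pan law: left^2 + right^2 == 1 at every position,
    # so perceived loudness stays steady as the listener turns.
    angle = (pan + 1.0) * math.pi / 4.0  # 0 .. pi/2
    return math.cos(angle), math.sin(angle)

# A source straight ahead is heard equally in both ears...
print(stereo_gains(0, 0))
# ...and moves fully into the right channel as the listener turns 90 deg left.
print(stereo_gains(0, -90))
```

A real binaural renderer goes much further (per-ear delays and head-related filtering), but this level-difference cue is the simplest way to make a voice feel anchored in space as the phone rotates.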
Then, we started prototyping with AudioGaming, a French studio specialised in audio interactivity, and now in interactive experiences in general and video games. When we did the first prototype with Amaury La Burthe, CEO of AudioGaming, we saw that we were completely missing an important point of John Hull’s work, which was the transmission to a sighted audience. When we did the first user test, we noticed that when users have a mobile, they are stuck looking at the screen. So, we had to bring visuals in, although at first we never wanted to have visuals. Initially we had static visuals or no visuals at all, like black screens, but for users, black screens were the signal of a bug. So, we decided to add visuals, and we started a very long process to find the right art direction. Then, we took the first material we had already mapped into a soundscape and started drawing the scenes, as you would draw a theatre set. We designed the scene, and we started to make a list of all the graphical assets we had to create.
To find inspiration for the art direction, we used the film itself as our graphic bible, because Peter and James already had a strong visual grammar. So we explored how to give a sense of blindness.
Through a lot of glimpses and blurs we created an initial visual grammar. We also wanted to add something that was very much at the centre of John Hull’s testimony: the discovery of ‘the outside world’. But how do you represent a blind person venturing outside when you are in a first-person position?
This is how we arrived at the idea that all is black and every sound is a point of activity, and when there is no sound, then the world dies. This is something that John said in his tapes.
AC: At this point we were only working on the art direction – how to represent blindness with still images, pixels, dots of light, raindrops. But we were still focusing on mobile interfaces because we were working for Arte. I think when you work for a public service, you have the responsibility to bring the project to as many people as possible, and today ‘mobile is king’. So, our first iteration of this project was a podcast.
The second one was a 360 “magic window” experience for mobile, which was not really immersive.
Then, we presented the project to Tribeca New Media Fund and during the industry meetings a lot of people from the industry asked us to present a VR prototype.
Two years ago, the VR hype had already started. But we were quite reluctant about it because, for us, VR wasn’t accessible to all. We had a good experience on the Oculus DK2, it was fun, but how could we justify doing this within a public service mission, when very few people have access to VR? So we made a prototype on Cardboard, because we were focusing on the mobile platform. And the result was far beyond our expectations.
SG: Why do you think it worked so well in VR?
AC: It worked for a very simple reason: when you use the app in the magic window of your phone, the world around you is still there. The VR headset, on the other hand, is a mask cutting you off from reality. It focuses your attention on John Hull’s stories and on the art direction, which is very demanding, because it is totally black, with just filters of light.
SG: So, the VR was mainly a way to concentrate, and therefore get the subtleties of the project? And at this point of the project, how many people did you have working on it?
AC: The core team was Peter and James, Amaury and I. We worked together on the narrative structure, the scriptwriting and the interactive script. We defined the concept of the art direction, and then we asked three art directors to work with us on it: Arnaud Desjardins, Fabien Togman and Béatrice Lartigue. We had to ask several art directors because it’s still difficult to find people who combine skills in art direction, mobile development with Unity and real-time 3D.
SG: So, how did you work together?
AC: Peter and James were in London, and AudioGaming, with its developers, is in Toulouse. In total, we had 11 to 13 people working on it.
SG: So, until you arrived at the Cardboard prototype, we’re talking of a period of months? Give me an idea of time.
AC: I think it took eight months between starting the project and switching to VR. Eight months of development, because we started working on the concept and applied to several funding bodies, such as the New Media fund of the National Centre of Cinematography in France and the Tribeca New Media Fund. We prototyped a first audio version, then we had a 360 version that we presented at the Tribeca New Media Fund, where we had the industry meetings, and two months later we pivoted to a VR experience.
When we switched, the question was the performance of the mobile. It was a huge challenge to run real-time 3D and real-time binaural audio on mobile. After the prototype phase, we focused our work on the Samsung Gear VR. So today you can still watch it on your phone or tablet, in a 360 mode on iOS or Android, and on the Samsung Gear. But we marketed the whole project as a ‘VR experience’.
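To give a sense of why real-time binaural audio is demanding on mobile: for every source, every frame, a binaural renderer has to reproduce cues such as the interaural time difference – the sound reaching one ear slightly before the other. A minimal sketch of that one cue, using Woodworth’s classic spherical-head approximation (the constants and function name are illustrative, not taken from the project’s actual audio engine):

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed average adult head radius, metres
SPEED_OF_SOUND = 343.0   # metres per second in air at ~20 C

def itd_seconds(azimuth_deg):
    """Woodworth's spherical-head model of the interaural time difference:
    how much earlier a sound at the given azimuth reaches the nearer ear.
    0 degrees = straight ahead (no difference), 90 = directly to one side."""
    theta = math.radians(min(abs(azimuth_deg), 90))
    # Path difference around a rigid sphere: r * (sin(theta) + theta)
    return HEAD_RADIUS_M / SPEED_OF_SOUND * (math.sin(theta) + theta)

print(f"{itd_seconds(90) * 1e6:.0f} microseconds")  # prints "656 microseconds"
```

This is only one of several cues (level differences and head-related filtering being the others), each of which has to be recomputed as the listener’s head turns – which is what made the mobile performance budget so tight.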
VR brought much more power to the experience, and we also used VR as the ‘sexy’ element to sell the project. But we always tried to keep the accessibility question in mind. That’s why it was very, very important for us to have a 360 magic window mode: even if it’s not the most relevant experience for what we are trying to do, it’s probably the most accessible.
SG: As a result, do you have numbers to know how people are actually accessing it?
AC: We have numbers, but we are not allowed to communicate them. What I can say is that these numbers match the figures you would see for a successful indie game on a mobile platform. And we are very happy about it.
SG: So, you were constantly prototyping, from what I understand, and then when do you test, what do you test, and with whom?
AC: We were the first test users, and the model was to prototype and then enrich those prototypes. First, we validated the narrative script, then the interactive script. When the interactive script and the interaction were okay, we had the art direction, and when everything was in place, we tested and adjusted the timing, the colours, the 3D modelling. When that was okay, we deployed and we validated.
SG: But who did you test it with? Who is your target audience?
AC: In general, when we tested, we tested with the team… it was very precious for us to have people who knew the context of the project, and who would experience what we had in mind without being part of the whole creative process.
SG: Did you ever actually test it with outsiders? Not friends, nor Arte?
AC: In fact, we did that, but it takes a lot of time. It’s very complicated to bring people in, so what we decided was: “Okay, it’s amazing, we are selected at Sundance” – then Sundance would be our user-test platform.
SG: Because, by then, you had a working version of the VR prototype. Where were you at that point?
AC: Sundance was in January 2016. We presented a version with four chapters, out of the six we eventually released. So, we tested these four chapters and, for example, we adjusted the timing, the interactivity, and some of the user information on how to select extra scenes.
We did the same with Tribeca, where we released what is now the full experience, the six chapters. We did another test session, taking notes when people said, “Oh, it’s great, but here, I’m stuck etc…”
We also tried to balance the user information, because when you are in a narrative piece, it’s complicated if there is too much user information on screen. When we presented at Tribeca, we had several chapters, and at the end of every chapter, you were coming back to the menu. After Tribeca, a lot of people we discussed with said, “This is great, but I just want to continue the story.” So, we changed it.
SG: Can you tell me the budget of the project?
AC: I can’t be precise. But we had around 450,000 euros to produce the VR part. I think it could have been a less expensive project if it hadn’t had so many platforms. It was extremely complicated, stressful and costly to have to make so many versions.
SG: Software-wise, what do you need in order to produce at this level?
AC: Everything is made with Unity. The sound design is more complicated. We used a very good plug-in named PopcornFX for all the particle effects. It’s from a start-up in France.
SG: The whole project took two and a half years from start to finish. This is quite a long time. What have you learned in terms of process?
AC: I think that what is essential is to have an iterative model of production and to be open to a lot of feedback. So, prototyping is really important. This can be difficult when you have funders, because they also need to be open to such an iterative process. In this regard, it was amazing to work with Arte, because my first contract with them was for producing a 90-minute podcast! And in the end, we produced six VR chapters of 20 minutes each.
I think one of the huge questions we have is: is it possible for indie companies to innovate without the money that comes from the public sector? Maybe this is a question broadcasters and digital groups should start addressing.
SG: Are you saying that funds and producers should always include quite a large period of R&D in digital work? And that if public institutions were to set this as a standard procedure it would influence the commercial sector?
AC: Yes, exactly. The tech companies do have a prototyping approach, but it still covers quite a short time. Notes on Blindness was a long journey because of the way we decided to work on the story. We were trying to create a narrative arc in the experience. For all of that, you need time, as you are adding technology R&D to narrative R&D. In a way we need to learn from the tech world, but the tech world needs to learn from the content industry too. The process of scriptwriting is all about iterations. While you work on a script, you need to take some time off in order to think about what it means at a very deep level. You are creating different levels of reading, and avoiding a flat, first-level storytelling structure.
SG: Thank you, Arnaud, for your time. I have to say that Notes on Blindness: Into Darkness is one of my favourite VR pieces… it is a very intense and powerful, yet delicate, piece of work. Thank you!
Sandra Gaudenzi
All images courtesy of Ex Nihilo, Archer’s Mark and Arte France.
(this interview has previously been published in docubase.mit.edu)
[1] The gyroscope, or gyro for short, adds an extra dimension to the information supplied by the accelerometer of a mobile phone by tracking rotation, or twist. An accelerometer measures linear acceleration, while a gyro measures angular (rotational) velocity.
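Because the gyro reports angular velocity rather than orientation, an app that wants to know which way the phone is pointing has to integrate those readings over time. A minimal sketch of that step (the function name and sample rate are illustrative):

```python
def integrate_yaw(yaw_deg, gyro_z_deg_per_s, dt_s):
    """One integration step: the gyroscope reports how fast the phone is
    rotating, so heading is recovered by accumulating rate * time."""
    return (yaw_deg + gyro_z_deg_per_s * dt_s) % 360.0

# Rotating at a steady 90 deg/s for two seconds (200 samples, 10 ms apart)
yaw = 0.0
for _ in range(200):
    yaw = integrate_yaw(yaw, 90.0, 0.01)
print(round(yaw, 3))  # 180.0
```

In practice, pure integration drifts as small sensor errors accumulate, which is why real sensor-fusion code blends the gyro with the accelerometer (and magnetometer); mobile OS APIs such as Android’s rotation-vector sensor do this fusion for you.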