Composer Daniel Pemberton discusses his inventive new original score for “Project Hail Mary.” He breaks down how he built a custom musical language for the film using everything from wooden blocks, body percussion, treated vocals, bowls of water, and even a squeaky water tap recorded on his iPhone. He also reflects on balancing the film’s vast sci-fi scale with its intimate emotional core, and how experimentation, failure, and discovery shaped one of his most ambitious scores yet.
One quote stuck out: “Developing all your own sounds… I call it mixing your own paints. You've basically spent a long time mixing paint colors rather than buying it off the shelf. And that's how you get stuff that feels very original, like this. And I've got millions of these and most of them don't work. You spend ages when you experiment. When you have time to fail, you have time to create ... Hopefully there'll be something in there that'll work.”
Musique concrète - something I recall from my heady uni daze. Running around recording real-world sounds, usually from construction sites, bringing them back to the studio and sequencing them into something that got a passing grade from the weird TA with a Synthi-100 in his caravan. Anyway, from the scant samples of Pemberton's score, I wouldn't call this music... sorry, but it's sound design. Good sound design, admittedly, but sound design; not music.
There are tonnes of companies with libraries and software for pulling this exact kind of sound design - and even music, should one feel so inclined - out of a laptop.
Just to name-drop a few: SoundMorph, Native Instruments, Spitfire... et al.
But it's a subject dear to my heart, and I'd love to see some opinions on this: where's the line between sound design and music?