Billed as the future of concerts, Dan Tepfer's live show, a solo performance in which a self-playing Yamaha Disklavier is the only instrument, entrances every attendee. From the intimate-by-choice setting and the Google-branded VR headsets to Tepfer's winding style, the show mesmerizes with each tune and its visual accompaniment, all of it improvised and diverging from any recorded version. There are rules in music and in coding, and while appearing to break all of them, Tepfer simultaneously complies with centuries-old and 21st-century rules alike.
His Natural Machines program is an amalgam of pianism and programming: Tepfer bridges the gap between technology and music on stage. Together, Tepfer and his self-made technology form a seamless stream of creative consciousness, broken only by the moments when he instructs the audience to shift their attention from him to the VR headsets they've been given. We were fortunate to catch him after his New York show at National Sawdust, where he explained the performance's development from its earliest stages to now.
How and when did you get into coding?
I started getting into coding as a kid. I was born in 1982. My dad, a biologist, brought home a Macintosh Plus around 1988 or so. I started making things in HyperCard and slowly got into the scripting environment there. It was always really fascinating to me. As a kid, it's easy to feel relatively helpless. But with coding, you have an incredibly powerful machine that will do whatever you ask it to; it's very empowering. As a teenager, I got into BASIC and C. It was all self-taught and, in those days, there was nothing like the online resources there are now, so it was a lot of poking around and experimenting.
And how did this idea come about? When did this project start?
The Natural Machines project started the day I walked into the Yamaha space in Manhattan, sat down at a Disklavier player piano, and realized that instead of just replaying pre-recorded music, it could take real-time input from my computer. I'd been experimenting with generating music from algorithms for a couple of years, but I was always frustrated with how inorganic it felt. Suddenly, two things were possible: I could use what I improvised at the keyboard, with its naturally organic quality, as real-time input for my algorithms, and the computer could express its response in real time by playing the piano itself. These two things combined made me very excited, because it really brought the computer into the world of natural, acoustic sound that I love.
It's not about replacing humans with computers, it's about exploring the fertile ground where intuition meets structure
How does a show like this align with our underlying societal suspicion of (some) technology? Does it open a new world of technological wonder for audiences?
Natural Machines, as it is now, is all about exploring the intersection of organic and mechanical processes, hence the name. It's not about replacing people with computers; it's about exploring the fertile ground where intuition meets structure, similar to how a composer like Bach set down rules for himself while leaving sufficient degrees of freedom for self-expression. And the whole thing centers on free improvisation; everything depends on my playing the piano well and being inspired in the moment. That, combined with the fact that I've written every line of code myself, so the programming is just as home-grown as the playing, means this project is pretty different from the things that usually scare people about technology, I think. We tend to distrust technology when we feel we don't fully understand it, but here there's no data-harvesting, no phishing, nothing shady lurking beneath the surface, and I think people can feel that.
People have told me that the visuals, rather than distracting them from the music, actually help them understand its structure, which was my goal all along. And the music that's come out of the project wouldn't be possible without the tech involved. So I don't know about "technological wonder," but hopefully people come away inspired by the possibilities that arise when computers are used to organically enhance human creativity.
How did you teach yourself to code this program, with the added VR element?
I'm entirely self-taught as a coder. It's something I've dived into almost obsessively for days or weeks at a time at various points in my life. It just keeps drawing me back in, and each time I get deeper into it. In my teens I learned how to make 3D graphics by asking my math teacher for rotation equations. In my early 20s, I wrote programs to help with musical exercises I needed to do, mostly ear training and sight-reading. That turned into my getting into SuperCollider, a rather arcane programming environment focused on music. It has a steep learning curve, but it's really powerful and reliable once you've gotten the hang of it. All the musical elements in Natural Machines are coded in SuperCollider.
I love how, with programming, once you've learned a few languages it becomes easier and easier to learn others
There are a lot of technical elements. Can you explain, for anyone who hasn't seen the show or a YouTube clip of your performance, how it all works together?
It all starts with me playing something on the piano. When I press a key, the Disklavier (which is a fully acoustic instrument, with the added ability to play itself and record what a pianist plays) sends data to my computer. There, programs I've written react to that input by sending commands back out for the piano to play. Since I'm improvising, I respond to that, and a positive feedback loop is created, with me building on what the computer has done, it building on what I've done, and so on.
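Tepfer's actual algorithms are his own (written in SuperCollider); as a toy illustration of the kind of rule-based response he describes, here is a hypothetical Python sketch of one simple idea, mirror inversion: the computer answers each MIDI pitch with its reflection around a fixed axis. The axis choice (F#4 = note 66) and function names are assumptions for illustration, not Tepfer's code.

```python
def mirror(pitch: int, axis: int = 66) -> int:
    """Reflect a MIDI pitch around a fixed axis (here F#4 = 66).
    A pitch n semitones above the axis maps to n semitones below it."""
    return 2 * axis - pitch

def respond(played: list[int]) -> list[int]:
    """The computer's reply to a played line: the mirrored line,
    which the pianist then hears and reacts to, closing the loop."""
    return [mirror(p) for p in played]
```

For example, middle C (60) is six semitones below the axis, so the computer answers with 72, six semitones above it; a rising line comes back as a falling one.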
Every time the computer comes up with a musical element, it also sends the data via OSC to the visual programs I've written in Processing, which produce a kind of live visual score, in real time, of what's going on. I've worked hard to give each musical algorithm its own visually distinct space. And now, since I've added a VR element to the show, I'm also sending data to node.js, which relays it to all the phones connected to my wifi network, each of which then internally builds a VR environment from the data in real time.
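OSC (Open Sound Control) is the standard wire format Tepfer mentions for shipping events from the music code to Processing. As a minimal sketch of what such a message looks like on the wire (the `/note` address and integer arguments here are assumptions for illustration, not Tepfer's actual message schema), this encodes an OSC message per the spec: a null-padded address string, a type-tag string, then big-endian arguments.

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *ints: int) -> bytes:
    """Encode an OSC message whose arguments are all int32s."""
    msg = _pad(address.encode("ascii"))
    msg += _pad(("," + "i" * len(ints)).encode("ascii"))  # type tags, e.g. ",ii"
    for v in ints:
        msg += struct.pack(">i", v)  # each int32 is big-endian
    return msg
```

A datagram like `osc_message("/note", 60, 100)` could then be sent over UDP to Processing or node.js; in practice, libraries such as oscP5 handle the decoding on the receiving end.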
What's the most important aspect of the show, to you?
At the end of the day, all the technology has to serve the music. If I haven't moved people through the music, if the music doesn't stand on its own, I've failed. The last thing I want is for it to be a technological gimmick. What's important is to use the tech only insofar as it opens new musical spaces, takes me to musical possibilities I couldn't have gotten to without it.
Beyond that, I hope people walk away from it with a renewed respect for the power of combining reason and intuition, something I find is often lacking from our discourse these days. It's amazing to me how little understanding there is in popular culture of how science works. Science, despite its imperfections, is the most honest discipline in relation to truth. And there are aspects of music, just as architects have to work with structural engineers, that really benefit from reason. Natural Machines is all about that combination: I'm the architect when I play, and the computer is the structural engineer.
So you've thought about how your show combines something centuries-old (the piano) with something so 21st-century (coding) then?
Yes, a lot. The core of the project, what really makes it work in my opinion, is that the computer plays exactly the same instrument I do: we are both playing this fully acoustic piano. That's what makes it possible for the computer to play a truly equal part in the music, and also for the music to resonate in the way that I, as a longtime performing musician, want it to resonate, with the complexity of acoustic sound resonating in a space. At a more abstract level, I love digging deep into the past, into the musical mathematics of Pythagoras, into the contrapuntal rules of Palestrina, into Bach, and making contemporary art with it. So combining something old like the piano (which was considered very high-tech in its day) with something contemporary like the computer feels very natural.
How do the rules of music mingle with the modern rules of technology?
Computers are made to run algorithms, and they're extremely good at it, certainly better than we are. Throughout music history, people all over the world developed homegrown systems of rules to help them teach music to new generations. They figured out ways of codifying what sounded good to them, or at least the non-mysterious aspects of it. So in many ways, computers are perfectly suited to execute these rules. The rules were worked out by humans, but computers can take care of them in incredibly virtuosic ways. What computers still aren't very good at, even with the rise of AI, is the emotional, intuitive, mysterious side of art, which is equally crucial.
One could argue that the piano was created at the intersection of music and tech. At a more general level, there's something about music, especially instrumental music (which is, by definition, abstract), that lends itself extremely well to advances in tech. The whole history of music shows this. Composers have had a symbiotic relationship with tech throughout music history: their music would demand further advances in instrument-making, and when those advances happened, they opened up new avenues for composition. It's the same today. At the end of the day, we want good music, whatever that means, art being, always, in the eye of the beholder. And if a computer or a player piano can help create good music that we haven't heard before, that's something I find really exciting.
Are there moments in particular songs and shows that still surprise you?
Yes, and it's very important to me that that continues to be true. When I go on stage, I commit to truly improvising, to producing music that is uniquely of the moment, specific to this instrument, this space, this audience, this frame of mind I'm in. So hopefully, what I'm playing will already have moments of real surprise for me. But add what the computer does in response to this, with varying layers of complexity, and it becomes twice as surprising. And what's best is that the surprise feeds on itself: it opens the door to further discoveries and surprises.
That's maybe my favorite aspect of this project. It turns the piano, this instrument that I've been playing most every day for the last 30 years, into a brand-new instrument, one that constantly pulls me out of my comfort zone and leads me to discover aspects of my musicality I didn't know were there. I'm pretty sure that Bach used creative constraints in much the same way: to make himself a little uncomfortable, just to see how he would get out of the corner he had painted himself into.
Angie Ronson is Editor-in-Chief at THRS. She covers the transformative impact of new technology on all sectors.