
Aphasia (2020)

For real-time sound generated by GENDY, a frequency modulator, an impulse generator, and a dynamic oscillator bank in SuperCollider, with improvised Guzheng, Jaw Harp, and Voice.

The Aphasia project is divided into two components:

OVERALL:

Real-time SuperCollider-generated sound, driven by position data from any two-dimensional control board:

a. GENDY

b. Impulse Generator

c. Dynamic Oscillator Bank

Three integrated improvisations accompanying the sound generated in OVERALL:

a. Don’t Speak – with Guzheng improvisation

b. Can’t Speak – with Jaw Harp improvisation

c. Won’t Speak – with Voice-Speech (Khoomei, scat singing, language-based phonetic sound) improvisation

Khoomei Sample Used: performed by Yudan Zou

Examples of language-based / singing-voice-based phonetic sound.

The code for OVERALL:

<!-- wp:code -->
<pre class="wp-block-code"><code>//Player 1:
//Two Gendy instruments based on GENDY by Iannis Xenakis. The player is free to combine, separate, or interchange the two instruments during the performance.

  (
        {
            Pan2.ar(Gendy1.ar(
                maxfreq:Gendy1.kr(2, 4, 0.6, 0.9, 0.3, MouseY.kr(0.1, 9), 1.0, 1.0, 5, 5, 100, 600),
                knum: MouseX.kr(1, 13), mul:0.2), MouseY.kr(0.07,4,\exponential))
        }.play
    )


 (
        {
            var n = 13;


            Resonz.ar(
                Mix.fill(n,{
                    var freq, numcps;

                    freq = rrand(80, 860.3);
                    numcps = rrand(2, 40);
                    Pan2.ar(
                        Gendy1.ar(
                            6.rand, 7.rand, 2.0.rand, 1.0.rand, freq,
                            MouseX.kr(freq, 3*freq), 1.0.rand, 1.0.rand, numcps,
                            SinOsc.kr(exprand(0.01, 0.2), 0, numcps/3, numcps/3), 0.5/(n.sqrt)
                        ),
                        1.0.rand2
                    )
                }),
                MouseX.kr(90, 2200), MouseY.kr(0.01, 1.0)
            )

        }.play;
    )


//Player 2:
//A sawtooth frequency modulator and an impulse generator. The player is free to combine, separate, or interchange the two instruments during the performance.

(
{
    var x = SinOscFB.ar(MouseX.kr(1, 100));
    Saw.ar(60*x + 800, 0.1)
    +
    PinkNoise.ar(0.1*x + 0.08)
}.play;
)


{Impulse.ar(MouseX.kr(2,208,9))*0.1!2}.play;


//Player 3: a dynamic sine oscillator bank

(
fork {
    loop {
        play {
            var mod = SinOsc.kr(MouseX.kr(40, 10000, 1), 0, Rand(1, 18));
            Pan2.ar(DynKlang.ar(`[ Array.rand(10, 300.0, 1000.0), 1, mod ]), 1.0.rand)
                * EnvGen.kr(Env.sine(5), 0.66, 0.04, doneAction: Done.freeSelf)
        };
        2.wait;
    }
}
)


s.record;
s.stopRecording;
</code></pre>
<!-- /wp:code -->

An improvisation-based computer music piece, with sonic instances generated by SuperCollider. The piece is inspired by Alfred Hitchcock’s classic Psycho. After another viewing of the movie, I wanted to find a sound that best describes the fear and the sense of ‘Aphasia’ in confronting inner isolation and external danger – a stroke of ‘electrified crying’. I found the sound while improvising with the sum of several sources built from SuperCollider UGens, with their parameters adjusted:

Gendy1: Dynamic Stochastic Synthesis Generator, invented and described by Iannis Xenakis in Formalized Music. The sound created here is baby-crying-like, twisted, and desperate, with tonal change. 3:45 – 7:08

Impulse Generator + Frequency Modulator: the sound created here is dense and sizzling, like a mouse rushing to a piece of cheese. 7:15 – 10:50

Oscillator Bank: implemented with DynKlang. The sound created here is flowing and airy, with heavily oscillating instances. 11:00 – 13:35

~scale1 = [15, 14, 9, 13, 16, 11, 12, 11]; ~scale2 = [4, 3, 4, 3, 6, 5, 6, 5];
~scale3 = [6s, 7b, 8, 6b, 5b, 5, 7, 4b]; ~scale4 = [9, 8, 3, 6, 9, 4, 5, 2];

The whole composition: 0:00 – 3:40

Performance Instruction: The piece explores how pure physical movements of an indeterminate manner evoke sonic events and changes; perceived pitch is fully disenchanted. What triggers sonic events is no longer the default concept of “progress” in a planned composition, but “the MOVE” on the fly. Every sonic instance is triggered by the player’s pure physical movement with the controller “at the moment”. The controllers should be either virtual square surfaces onto which values are mapped or physical control boards manipulated by hand.

In this version of the code, the “control board” is assumed to be a MacBook trackpad, so all the player can “move” are fingers, and the values available for mapping are restricted to two dimensions. Further extensions of the “control” and “mapping” could be sensor-based new musical interfaces built with Wekinator/Kinect, or could be triggered by human physical interaction in a virtual sonic space; the parameters can then be extended to n dimensions.
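As a rough sketch of that extension (not part of the original piece), the MouseX/MouseY mappings could be routed through control buses, so that any two-dimensional interface can drive the same parameters that the trackpad drives here. The bus names, scaling ranges, and OSC path below are assumptions for illustration only:

<!-- wp:code -->
<pre class="wp-block-code"><code>(
// Two control buses stand in for the two dimensions of the control board.
~ctlX = Bus.control(s, 1);
~ctlY = Bus.control(s, 1);

// Default source: the trackpad, as in the original code, normalized to 0..1.
{ Out.kr(~ctlX, MouseX.kr(0, 1)); Out.kr(~ctlY, MouseY.kr(0, 1)) }.play;

// Player 2's impulse generator, reading from the bus instead of MouseX directly.
{ Impulse.ar(In.kr(~ctlX).linlin(0, 1, 2, 208)) * 0.1 ! 2 }.play;

// An external interface can then replace the mouse by writing to the same
// buses, e.g. via OSC from Wekinator (the path is a placeholder):
OSCdef(\ctl2d, { |msg| ~ctlX.set(msg[1]); ~ctlY.set(msg[2]) }, '/wek/outputs');
)
</code></pre>
<!-- /wp:code -->

More dimensions would simply mean more buses, each mapped onto another synthesis parameter.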