This thesis proposes a systematic methodology for the guidance, control, and navigation of a quadrotor performing a choreographed dance in real time as a function of the music played by a musician. The four main components of human choreography (namely the notions of space, shape, time, and structure) are analyzed and mathematically formulated for a robotic performance. This enables real-time interaction with a musician, without prior knowledge of the music, based on the pitch of the acoustic signal. A novel approach is proposed for mapping music features to trajectory parameters, together with the design of a trajectory-shaping filter governed by two coefficients that the artist sets in real time through a MIDI foot-pedal board. The two coefficients are inspired by a mathematical description of acoustic signals. The proposed approach maps the motion parameters and the music to trajectory motifs that are then switched in harmony with the chord structure of the music. The mathematical formulation of the quadrotor choreography is simulated using the linearized dynamics and physical properties of a quadrotor, producing a graphical representation of the choreography. To validate the control system, the position of the quadrotor is compared with the desired position. To measure the effectiveness of the link between the music and the position of the quadrotor, the trajectory-generation system is inverted to produce a sequence of musical pitches. The melodic phrase generated from the quadrotor's position is played back to the musician, establishing a real-time musical interaction between the musician and the quadrotor. Simulation results show that the proposed methodology yields an effective real-time quadrotor choreography.
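The sketch below illustrates, in simplified form, the kind of pipeline the abstract describes: a detected pitch is mapped to a trajectory-motif parameter, and the resulting command is smoothed by a two-coefficient shaping filter before reaching the position controller. All function names, scaling constants, and the interpretation of the two coefficients as a damping ratio and natural frequency are illustrative assumptions, not the mapping or filter actually defined in the thesis.

```python
import numpy as np

# Illustrative mapping from a detected pitch (MIDI note number) to one
# trajectory-motif parameter (here, the radius of a circular motif).
# The scaling constants are placeholders, not values from the thesis.
def pitch_to_radius(midi_note: int) -> float:
    return 0.5 + 1.5 * (midi_note - 48) / 36.0   # metres, spanning roughly C3..C6

# Second-order trajectory-shaping filter driven by two coefficients.
# zeta (damping ratio) and wn (natural frequency) stand in for the two
# artist-set coefficients; the thesis may define them differently.
def shape(reference: np.ndarray, zeta: float, wn: float, dt: float) -> np.ndarray:
    x, v = reference[0], 0.0
    shaped = np.empty_like(reference, dtype=float)
    for k, r in enumerate(reference):
        a = wn**2 * (r - x) - 2.0 * zeta * wn * v   # spring-damper pull toward the reference
        v += a * dt
        x += v * dt
        shaped[k] = x
    return shaped

# Example: the melody jumps from C4 to G4; the radius command is smoothed
# before being sent to the quadrotor's position controller.
dt = 0.01
notes = np.array([60] * 300 + [67] * 300)               # 6 s of pitch estimates at 100 Hz
raw_radius = np.array([pitch_to_radius(n) for n in notes])
smooth_radius = shape(raw_radius, zeta=0.7, wn=4.0, dt=dt)
```

In this sketch, larger wn makes the quadrotor track pitch changes more aggressively, while lower zeta lets the motion overshoot, giving the artist two intuitive handles over the character of the choreography.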