The ArchMusic Visualizer is a visual algorithm created in Unreal Engine 4 (UE4) which analyzes the frequency spectrum of an audio source and divides the sound information among however many generated elements you want to simulate. The information is thus ‘archived’ in each element and subsequently represented graphically. As the audio or music plays, the generation is controlled in real time through three geometric parameters: translation, rotation and scale. Each of these parameters is further split into the three spatial components x, y and z, so the ‘archived’ information can be expressed in one or more components at the same time, generating different possible compositions.
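As a rough illustration of this mapping, the sketch below shows the core idea in plain C++: a frequency spectrum is split among a chosen number of elements, each element ‘archives’ one value, and that value is then expressed on a selectable subset of the nine transform components. It deliberately avoids the UE4 API (std containers instead of TArray/FVector), and the names ArchiveSpectrum, ComponentMask and ApplyArchivedValue are hypothetical, not part of the original project.

```cpp
// A minimal, self-contained sketch of the ArchMusic mapping idea:
// split a frequency spectrum among N elements, "archive" one value per
// element, and express that value on selected transform components.
// Illustrative names only; not the UE4 implementation.

#include <algorithm>
#include <array>
#include <cstdio>
#include <vector>

struct Transform3 {
    std::array<float, 3> Translation{0.f, 0.f, 0.f}; // x, y, z
    std::array<float, 3> Rotation{0.f, 0.f, 0.f};    // around x, y, z
    std::array<float, 3> Scale{1.f, 1.f, 1.f};       // along x, y, z
};

// Which of the nine components the archived value should drive.
struct ComponentMask {
    std::array<bool, 3> Translation{false, false, false};
    std::array<bool, 3> Rotation{false, false, false};
    std::array<bool, 3> Scale{false, false, false};
};

// "Archive" the spectrum: average the bins that fall to each element.
std::vector<float> ArchiveSpectrum(const std::vector<float>& Spectrum,
                                   std::size_t NumElements) {
    std::vector<float> Archived(NumElements, 0.f);
    if (Spectrum.empty() || NumElements == 0) return Archived;
    const std::size_t BinsPerElement =
        std::max<std::size_t>(1, Spectrum.size() / NumElements);
    for (std::size_t e = 0; e < NumElements; ++e) {
        float Sum = 0.f;
        std::size_t Count = 0;
        for (std::size_t b = e * BinsPerElement;
             b < (e + 1) * BinsPerElement && b < Spectrum.size(); ++b) {
            Sum += Spectrum[b];
            ++Count;
        }
        Archived[e] = Count ? Sum / Count : 0.f;
    }
    return Archived;
}

// Express one archived value on the masked transform components.
void ApplyArchivedValue(float Value, const ComponentMask& Mask, Transform3& Out) {
    for (int Axis = 0; Axis < 3; ++Axis) {
        if (Mask.Translation[Axis]) Out.Translation[Axis] = Value * 100.f; // offset
        if (Mask.Rotation[Axis])    Out.Rotation[Axis]    = Value * 360.f; // degrees
        if (Mask.Scale[Axis])       Out.Scale[Axis]       = 1.f + Value;   // growth
    }
}

int main() {
    // Fake spectrum for one audio frame (in UE4 this would come from an
    // audio analysis step evaluated each frame).
    std::vector<float> Spectrum{0.1f, 0.4f, 0.9f, 0.3f, 0.2f, 0.8f, 0.5f, 0.05f};
    const std::size_t NumElements = 4;

    std::vector<float> Archived = ArchiveSpectrum(Spectrum, NumElements);

    // Drive rotation around z and scale along y, as one possible composition.
    ComponentMask Mask;
    Mask.Rotation[2] = true;
    Mask.Scale[1] = true;

    for (std::size_t e = 0; e < NumElements; ++e) {
        Transform3 T;
        ApplyArchivedValue(Archived[e], Mask, T);
        std::printf("Element %zu: rotZ=%.1f scaleY=%.2f\n",
                    e, T.Rotation[2], T.Scale[1]);
    }
    return 0;
}
```

Changing which entries of the mask are enabled is what produces the different compositions described above: the same ‘archived’ values can appear as movement, spin or growth along any combination of the three axes.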
“Cyberspace is liquid. Liquid cyberspace, liquid architecture, liquid cities. Liquid architecture is more than kinetic architecture, robotic architecture, an architecture of fixed parts and variable links. Liquid architecture is an architecture that breathes, pulses, leaps as one form and lands as another. Liquid architecture is an architecture whose form is contingent on the interests of the beholder; it is an architecture that opens to welcome me and closes to defend me; it is an architecture without doors and hallways, where the next room is always where I need it to be and what I need it to be. Liquid architecture makes liquid cities, cities that change at the shift of a value, where visitors with different backgrounds see different landmarks, where neighborhoods vary with ideas held in common, and evolve as the ideas mature or dissolve. […] …cyberspace encodes architectural knowledge in a way that indicates that our conception of architecture is becoming increasingly musical, that architecture is spatialized music. […] In principle and with the proper architectural knowledge, any pattern can be made into a work of architecture, just as any pattern can be made into music.”
Marcos Novak, Liquid Architectures in Cyberspace
“Music is liquid architecture and architecture is frozen music.”