
xCoAx 2019

Proceedings of the Seventh Conference on Computation, Communication, Aesthetics & X

Milan, Italy

Edited by

Mario Verdicchio, Miguel Carvalhais, Luísa Ribas, André Rangel

Published by

Universidade do Porto
Praça Gomes Teixeira
4099-002 Porto, Portugal

ISBN 978-989-746-220-7

Organising Committee

André Rangel, Jason Reizner, Luís Nunes, Luísa Ribas, Mario Verdicchio, Miguel Carvalhais

Local Organising Committee

Marco Santambrogio, Sara Notargiacomo

Design

Catarina Neves, Miguel Carvalhais

Photography

Camila Mangueira Soares, Fabrício Fava

Panel Moderators

Alessandro Ludovico, André Rangel, Boris Debackere, Jason Reizner, Luís Pinto Nunes, Luísa Ribas, Mario Verdicchio,

Martin Kaltenbrunner, Philip Galanter, Sophie-Carolin Wagner

Doctoral Symposium Chairs

Simona Chiodo, Philip Galanter

Special Thanks

Sam Baron, Renata Boria, Aurora Morelli, Lia Manzella, Marta Cereda, Marco Rabozzi, Stefano Paraboschi, Enrico Bacis, Marco Rosa, Dario Facchinetti, Matthew Rossi, Nicola Cortesi


aSail – aSynchronous as in life

An audiovisual experiment that implements spatio-temporal synchronisation between visible and audible events was carried out. The experiment was materialised with light, emitted by an array of LEDs, and digital sounds diffused by an array of loudspeakers. A computer program, based on a video white-noise generator, calculates in real time unpredictable rhythmic sequences that are perceived as precisely located audiovisual events. For xCoAx, the experiment will be redesigned to combine synchronous and asynchronous audiovisual events in order to play with the perception of the space where aSail is installed. Audiovisual asynchronism will convey the perception of distance by simulating the difference between the speeds of light and sound.

Keywords: audiovisual synchronism, perceptual objects, auditory-visual integration, multimodality, sound spatialization, algorithms, aesthetics

xCoAx 2019: Conference on Computation, Communication, Aesthetics & X, Milan, Italy

André Rangel
a@3kta.net
Universidade Católica Portuguesa and Universidade do Porto, Portugal

Simon Rangel
field@strb.net
3kta, Portugal


Fig. 1.

1 DESCRIPTION

Single perceptual object

In our everyday experience almost any audible phenomenon has a visual event associated with it. For example, under normal conditions, without physical obstacles, when we see a glass crashing on the floor we also hear the sound of that glass hitting the floor. Bizley et al. (2016) emphasize that auditory and visual objects share particular properties: both have linked features that change over time and perceptually group together the acoustic or visual features that come from a common source. According to Teramoto et al. (2012), most objects and events in the external world generate concurrent inputs to several different sensory modalities. For a long time researchers assumed that each input (audition and vision) was processed in the brain independently; however, recent studies on multisensory perception have revealed that different sensory modalities are closely related and mutually interplaying. The phenomenon of hearing and seeing at the same time can be named auditory-visual integration. Bizley et al. state that what we hear and see take strikingly different physical forms, and are necessarily encoded by different sensory receptor organs, yet auditory and visual features are effortlessly bound together to create a coherent percept. Binding stimulus features from a common source is not only a problem across sensory systems; within sensory systems, parallel and independent perceptual feature extraction means that stimulus features, such as pitch and space, must also be appropriately combined into a single perceptual object.

Sail

The goal of Sail – Synchronous as in life, the audiovisual experiment that originates the current submission, is precisely to create a single perceptual object that affects simultaneously the senses of hearing and seeing in time and space. This object consists of a horizontal line of three hundred RGB LEDs and four equally spaced loudspeakers, also horizontally aligned, controlled by a computer, a four-channel audio card and a microcontroller development system, the Teensy 3.6. An original algorithm, implemented as a custom computer program, transmutes a real-time white-noise digital video signal into a sequencer in which pixel brightness values above a variable threshold trigger synchronous audiovisual events simultaneously on the loudspeakers and on the LEDs. The algorithm also determines the spatialization of the audiovisual events that occur at different locations of the line constituted by loudspeakers and LEDs. For example, if a sound is heard on the leftmost loudspeaker, a luminous event is simultaneously visible on the left side of the LED line. If a sound moves from left to right through the four loudspeakers, a synchronized luminous event also moves from the left to the right side of the LED line. For a better understanding of this experiment, please see the audiovisual documentation available at: https://vimeo.com/317813704.
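The threshold-triggering scheme described above can be sketched roughly as follows. This is a minimal illustration, not Sail's actual code: the names, the threshold value, and the simple proportional pixel-to-speaker mapping are all assumptions made for the sketch.

```python
import random

NUM_LEDS = 300      # horizontal line of RGB LEDs, as in Sail
NUM_SPEAKERS = 4    # equally spaced loudspeakers along the same line
THRESHOLD = 0.9     # variable brightness threshold (illustrative value)

def noise_frame(n=NUM_LEDS):
    """One frame of the white-noise 'video': a brightness value per pixel."""
    return [random.random() for _ in range(n)]

def trigger_events(frame, threshold=THRESHOLD):
    """Pixels whose brightness exceeds the threshold fire an audiovisual event.

    Each event carries the LED index and the loudspeaker nearest to that
    horizontal position, so that light and sound appear at the same place.
    """
    events = []
    for i, brightness in enumerate(frame):
        if brightness > threshold:
            speaker = i * NUM_SPEAKERS // NUM_LEDS  # map pixel position to speaker
            events.append({"led": i, "speaker": speaker})
    return events

# Each new noise frame yields an unpredictable rhythmic pattern of events.
events = trigger_events(noise_frame())
```

Because every frame of noise is independent, the resulting event sequence is rhythmically unpredictable while remaining spatially coherent: an event on LED 0 always sounds from the leftmost speaker, an event on LED 299 from the rightmost.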

Asail

For the xCoAx 2019 submission, the previous experiment will be redesigned in order to experiment with audiovisual asynchronism and synchronism as a possibility to expand an illusory sense of audiovisual spatialization. This new experiment will be titled aSail – aSynchronous as in life. aSail is constituted by exactly the same hardware, with exactly the same placement, as Sail; however, the algorithm will be redesigned in order to introduce variable time latency between the visible and the audible. The intention is that this latency induces a depth illusion. Instead of an audiovisual spatialization along a horizontal line as in Sail, it is expected that in aSail an illusion of audiovisual depth spatialization on a horizontal plane is achieved.

According to Bizley et al., crossmodal integration is a term applicable to many phenomena in which one sensory modality influences task performance or perception in another sensory modality. In our everyday life experience, almost all audiovisual events are in fact asynchronous due to the different speeds of light and sound. When one is three hundred meters away from a fireworks show, the sound of the fireworks has a latency of around one second. This latency reinforces the perception of distance.

Let us imagine aSail inside a three-axis Cartesian space: the visual events can only vary on the x (horizontal) axis, because the LEDs are aligned horizontally and all of them have the same height and depth. From a crossmodal integration approach it should be possible to induce the illusion of depth by playing with the latency of the audible stimulus relative to the visual stimulus. For example, if there is a white flash on the LED line synchronized with a bass sound, this perceptual object could be interpreted as being closer than a perceptual object in which the sound is heard half a second later than the flash of light. xCoAx 2019 seems a great opportunity to experiment with and premiere aSail.
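The distance-to-latency relation underlying this illusion can be sketched as follows. The function and constant names are hypothetical, not taken from aSail's code; the only physical assumption is the standard speed of sound in air (light's travel time is negligible at these scales).

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def audio_delay(simulated_distance_m):
    """Delay, in seconds, to apply to a sound relative to its flash so the
    light/sound gap matches what a listener at that distance would hear."""
    return simulated_distance_m / SPEED_OF_SOUND

# The fireworks example from the text: at 300 m the sound
# arrives roughly 0.87 s after the flash, i.e. around one second.
delay = audio_delay(300.0)
```

Inverting the same relation, a half-second audio latency corresponds to a simulated distance of about 170 m, which gives a concrete sense of the depth range a sub-second variable latency could suggest.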


References

Bizley, Jennifer K.; Maddox, Ross K.; Lee, Adrian K. C. "Defining Auditory-Visual Objects: Behavioral Tests and Physiological Mechanisms." Trends in Neurosciences, Volume 39, Issue 2, February 2016, Pages 74-85. [online: https://www.sciencedirect.com/science/article/pii/S0166223615002817]

Teramoto, Wataru; Hidaka, Souta; Sugita, Yoichi; Sakamoto, Shuichi; Gyoba, Jiro; Iwaya, Yukio; Suzuki, Yôiti. "Sounds can alter the perceived direction of a moving visual object." Journal of Vision, March 2012, Vol. 12, 11. doi:10.1167/12.3.11 [online: https://jov.arvojournals.org/article.aspx?articleid=2121046]
