Towards Experimental Nanosound Using Almost Disjoint Set Theory

Computer Science – Sound

Scientific paper

Details

20 pages, 4 figures, 1 table

Music composition using digital audio sequence editors is increasingly performed in a visual workspace where sound complexes are built from discrete sound objects, called gestures, that are arranged in time and space to generate a continuous composition. The visual workspace, common to most industry-standard audio loop sequencing software, is premised on the arrangement of gestures defined by geometric shape properties. Here, one aspect of fractal set theory was validated using audio-frequency sets to evaluate self-affine scaling behavior when new sound complexes are built through union and intersection operations on discrete musical gestures. Results showed that the intersection of two sets exhibited lower complexity than their union, indicating that the intersection of two sound gestures forms an almost disjoint set, in accord with formal logic. These results are also discussed with reference to fuzzy sets, cellular automata, nanotechnology and self-organization to further explore the link between sequenced notation and complexity.
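
The abstract does not describe the implementation behind these set operations, but the general idea can be illustrated with a minimal sketch. The Python example below is an assumption-laden illustration, not the paper's procedure: it models a gesture as the set of prominent FFT bin frequencies above a chosen threshold, forms the union and intersection of two such sets, and flags the pair as almost disjoint when the overlap is small relative to either set.

```python
import numpy as np

def gesture_frequencies(samples, sample_rate, threshold_db=-40.0):
    """Model a gesture as a set of prominent frequency components.

    Illustrative assumption: a gesture's frequency set is taken to be the
    FFT bin frequencies whose magnitude lies within `threshold_db` of the
    spectral peak. The threshold and rounding are arbitrary choices here.
    """
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    mags_db = 20 * np.log10(np.abs(spectrum) + 1e-12)
    mags_db -= mags_db.max()  # normalise so the peak sits at 0 dB
    return set(np.round(freqs[mags_db > threshold_db], 1))

def compare_operations(set_a, set_b):
    """Compare union vs. intersection size as a crude complexity proxy."""
    union = set_a | set_b
    intersection = set_a & set_b
    return {
        "union_size": len(union),
        "intersection_size": len(intersection),
        # 'Almost disjoint' is read loosely here: the overlap is small
        # relative to the smaller of the two gesture sets.
        "almost_disjoint": len(intersection) < 0.05 * min(len(set_a), len(set_b)),
    }

if __name__ == "__main__":
    sr = 44_100
    t = np.arange(sr) / sr
    # Two synthetic gestures with mostly non-overlapping partials.
    gesture_a = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 660 * t)
    gesture_b = np.sin(2 * np.pi * 330 * t) + 0.5 * np.sin(2 * np.pi * 990 * t)
    print(compare_operations(gesture_frequencies(gesture_a, sr),
                             gesture_frequencies(gesture_b, sr)))
```

For two gestures built from non-overlapping partials, the intersection set is expected to be near-empty relative to the union, mirroring the almost-disjoint relationship described in the abstract.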
