Research, General Feature Requests And/Or Inspirations For Developers

Discussion in 'Software' started by Sackbut, Dec 2, 2025 at 12:04 PM.

  1. Sackbut

    Sackbut Producer

    Joined:
    Sep 1, 2025
    Messages:
    236
    Likes Received:
    128
    Location:
    galaxy cluster SDSS J1038+4849
    Put your ideas/inspirations for hypothetical or existing synths here!
    Maybe some devs will be inspired, make comments, etc.

    'Build it and they will come'? How about (also) 'ask/give the people what they want'?

    Mine, for example, from over at Vital:

    One
    Two
    ____
    Relevant videos (also included within spoiler link):

    [embedded video] 'Fundamental frequency partials' of metallic plate

    [embedded video] Murmuration simulation

    ____

    Dawesome's Kontrast?

    Well, why have a static wavetable or static image underneath -- even if the differently-shaped 'playhead' is moving around -- when the wavetable/image itself could move/mutate/evolve?

    How about a 3D morphing-terrain synth, sort of like moving waves on an ocean (waves that can be controlled/modded/'splashed')? Say, one leveraging a video synthesizer, realtime 3D displacement mapping, realtime AI gaming and/or biological mutation-simulation algos to create/extract/run/mod the realtime waveforms, waveform contours and/or their evolutions/morphs.

    WRT that last mention, some of you might think of Synplant 2, and while that's a good synth, I'm also thinking of something more realtime: as the waveform plays, it evolves by mutating as it runs and/or by mating with other (also-mutating) waveforms thrown in. 'Audiosex'. ;)
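
    A very rough, purely hypothetical sketch of the 'splashable ocean' half of that idea (not any existing synth's code; every name here is made up): a 2D height field is animated with a simple discrete wave equation, a 'splash' can be dropped in at any time, and one row of the surface is scanned out as the oscillator's single-cycle wavetable on every audio block. Requires only numpy.

    Code (Python):
    import numpy as np

    class TerrainWavetable:
        """Hypothetical 'morphing-terrain' oscillator: the wavetable is one row of a rippling surface."""

        def __init__(self, size=256, damping=0.996, c=0.25):
            self.size = size
            self.damping = damping                 # energy loss per step, keeps the ocean from blowing up
            self.c = c                             # wave speed; keep small for numerical stability
            self.h = np.zeros((size, size))        # current height field
            self.h_prev = np.zeros((size, size))   # previous step, needed by the leapfrog update
            self.phase = 0.0

        def splash(self, x, y, amount=1.0, radius=6.0):
            # Disturb the surface: this is the 'controlled/modded splash' modulation input.
            yy, xx = np.ogrid[:self.size, :self.size]
            self.h += amount * np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2.0 * radius ** 2))

        def _step_terrain(self):
            # One step of the discrete 2D wave equation (leapfrog scheme) with damping.
            lap = (np.roll(self.h, 1, 0) + np.roll(self.h, -1, 0) +
                   np.roll(self.h, 1, 1) + np.roll(self.h, -1, 1) - 4.0 * self.h)
            h_next = (2.0 * self.h - self.h_prev + self.c * lap) * self.damping
            self.h_prev, self.h = self.h, h_next

        def render_block(self, freq, sr=48000, block=512, row=128):
            # Advance the terrain once per block, then scan one row of it as the wavetable.
            self._step_terrain()
            table = self.h[row]
            table = table / (np.abs(table).max() + 1e-9)      # keep it roughly in [-1, 1]
            step = freq * self.size / sr                      # table increment per sample
            idx = (self.phase + np.arange(block) * step) % self.size
            self.phase = (self.phase + block * step) % self.size
            return np.interp(idx, np.arange(self.size), table, period=self.size)

    # Usage: splash the ocean, then pull audio while the surface keeps rippling and the timbre drifts.
    osc = TerrainWavetable()
    osc.splash(128, 128, amount=2.0)
    audio = np.concatenate([osc.render_block(110.0) for _ in range(200)])

    The same scan could just as easily follow a moving path across the terrain, or the height field could be driven by erosion/fluid math instead of a toy wave equation.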
    ____
    Me: I have an audio/synthesizer question or more for you if you're game! :)

    AI: Absolutely—I’m always game for audio and synth talk! What’s on your mind: are we diving into gear recommendations, sound design techniques, music theory applications, or maybe the history of synthesizers?

    Me: New research!

    AI: Got it —you’re calling for some fresh synth/audio research! Let’s zero in: do you want me to dig into recent developments in synthesizer technology (like new hardware releases, software instruments, or sound design techniques), or are you more curious about academic/musicological research (like studies on timbre perception, psychoacoustics, or electronic music history)?

    For example, I could pull up...
    [...]

    Me: Well, I'm curious whether, for example, research is being done on the idea of evolving/mutating waveforms as if they were biological entities, perhaps using artificial-life or even real-life algorithms; and also on leveraging 3D-animation, fluid-dynamics, fractal-landscape-generation and realtime displacement-mapping algorithms as applied to, say, wavetable and/or additive synthesis, where the scanned 'tables' are evolving landscapes (using 'erosion math' and so forth) and/or moving water surfaces, including using different materials for ray-tracing, only it's the waveforms being raytraced. Anyway, stuff like those kinds of things. What do you think? Any (especially cutting-edge) ideas/leads/inspirations?

    AI: Yes—there is active research and experimentation at the intersection of audio synthesis, artificial life, and fluid/fractal dynamics, though much of it is still exploratory and highly interdisciplinary. Researchers are borrowing from biology, physics, and computer graphics to create evolving sound structures that behave almost like living entities.
    Key Research Directions

    Bio-inspired waveform evolution: Artificial life models (cellular automata, genetic algorithms) evolve timbres over time, treating sound as a population of entities.

    Fluid dynamics applied to synthesis: Equations like Navier–Stokes can be mapped to audio domains, producing continuously shifting wavetables.

    Fractal and landscape-based synthesis: Terrain-generation algorithms (erosion, Perlin noise, fractals) scanned as wavetables yield evolving timbres.

    Ray-tracing waveforms: Harmonics treated as “rays” interacting with waveform surfaces, producing reflections/refractions of timbre.

    Inspirations

    Generative landscapes as wavetables

    Fluid simulations as modulation

    Artificial life ecosystems of oscillators

    GPU ray-tracing applied to spectral data

    ⚠️ Challenges

    Computational intensity

    Psychoacoustic relevance

    Interdisciplinary expertise

    In short, your vision of evolving/mutating waveforms as biological or fluid-like entities is exactly where some of the most experimental synthesis research is heading.

    Direct interdisciplinary leads do exist, though they’re scattered across computer music, biomimetics, and computational fluid dynamics.
    Specific Interdisciplinary Leads

    Artificial Life + Audio: Cellular automata and genetic algorithms used to evolve timbres in real time.

    Fluid Dynamics → Audio DSP: Particle systems and turbulence equations mapped to additive synthesis.

    Fractal Landscapes → Wavetable/Granular Synthesis: Terrain-generation algorithms scanned as wavetables.

    Ray-Tracing Concepts → Spectral Modeling: GPU ray-tracing adapted for audio spectral transformations.

    ⚡ Why Feasible Now

    GPU acceleration

    AI-assisted synthesis

    Realtime generative worlds in gaming engines
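
    To make the 'artificial life / cellular automata' direction above slightly more concrete, here is a minimal, purely speculative sketch (the names and the mapping to audio are mine, not from any paper or product): an elementary cellular automaton rewrites a 256-cell row on every call, and that row, DC-removed and lightly smoothed, becomes the next single-cycle wavetable frame, so the timbre drifts on its own with no LFO or envelope driving it.

    Code (Python):
    import numpy as np

    RULE = 90  # any of the 256 elementary CA rules; 30, 90 and 110 give usefully different motion

    def ca_step(cells, rule=RULE):
        # Apply one elementary-CA update to a 1-D array of 0/1 cells (ends wrap around).
        left = np.roll(cells, 1)
        right = np.roll(cells, -1)
        neighborhood = (left << 2) | (cells << 1) | right   # 3-bit neighborhood index, 0..7
        rule_bits = (rule >> np.arange(8)) & 1              # lookup table built from the rule number
        return rule_bits[neighborhood]

    def cells_to_table(cells, smooth=8):
        # Turn the binary row into a bipolar, DC-free, lightly low-passed single-cycle table.
        wave = cells.astype(float) * 2.0 - 1.0              # 0/1 -> -1/+1
        wave -= wave.mean()                                 # remove DC offset
        kernel = np.hanning(smooth)
        kernel /= kernel.sum()                              # cheap low-pass to tame the hard steps
        padded = np.convolve(np.tile(wave, 3), kernel, mode="same")
        return padded[len(wave):2 * len(wave)]              # middle copy = circularly smoothed cycle

    # Usage: seed one live cell, then watch/hear the table mutate generation by generation.
    cells = np.zeros(256, dtype=np.int64)
    cells[128] = 1
    frames = []
    for _ in range(64):                                     # 64 generations = 64 wavetable frames
        cells = ca_step(cells)
        frames.append(cells_to_table(cells))

    Swapping the rule number (or seeding the row from an existing waveform) gives very different kinds of motion, which is exactly the sort of control such a synth could expose.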
     
    Last edited: Dec 2, 2025 at 3:13 PM
  3. Sackbut

    Sackbut Producer

    Joined:
    Sep 1, 2025
    Messages:
    236
    Likes Received:
    128
    Location:
    galaxy cluster SDSS J1038+4849
    As a very loosely related aside to my previous comment: some years ago I thought this was kind of cool-- sort of a clean, wireless modular-- but it became abandonware:

    [embedded video]

    (Elektronick Musick doesn't always seem to do the synths they demo justice, but anyway...)

    But as inspiration, I liked the idea of different waveforms being easily intermorphable in realtime, like in some synths, but (also) using genetic (controllable/moddable) algos where mutation rates and sexual selection, for example, could be controlled/filtered/edited/etc. on the fly. Of course, Synplant has its DNA-strand edit section, but that doesn't appear to be realtime or based on true mutation/sexual selection.
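
    To pin down what I mean by controllable mutation/mating, here is a hedged sketch in hypothetical code (not Synplant's or anyone else's actual method): two parent single-cycle waveforms are crossed over harmonic by harmonic and then randomly mutated, with the crossover bias and mutation rate left exposed as exactly the kind of on-the-fly controls described above.

    Code (Python):
    import numpy as np

    def breed(parent_a, parent_b, crossover_bias=0.5, mutation_rate=0.02,
              mutation_depth=0.1, rng=np.random.default_rng()):
        # Return a child waveform: per-harmonic crossover of the parents, then random mutation.
        A = np.fft.rfft(parent_a)
        B = np.fft.rfft(parent_b)
        # Crossover ('mating'): each harmonic is inherited from parent A with probability crossover_bias.
        from_a = rng.random(A.shape) < crossover_bias
        child = np.where(from_a, A, B)
        # Mutation: a small fraction of harmonics get their magnitude/phase nudged by complex noise.
        mutate = rng.random(child.shape) < mutation_rate
        noise = rng.standard_normal(child.shape) + 1j * rng.standard_normal(child.shape)
        child = child + mutate * mutation_depth * np.abs(child).mean() * noise
        out = np.fft.irfft(child, n=len(parent_a))
        return out / (np.abs(out).max() + 1e-9)             # normalise back to roughly [-1, 1]

    # Usage: start from saw and square, then keep breeding each new child against a partner.
    n = 2048
    t = np.arange(n) / n
    saw = 2.0 * t - 1.0
    square = np.sign(np.sin(2.0 * np.pi * t))
    child = breed(saw, square, crossover_bias=0.7, mutation_rate=0.05)
    next_gen = breed(child, saw, mutation_rate=0.2)         # higher mutation rate = faster drift

    Run that every few hundred milliseconds against a drifting partner, put mutation_rate and crossover_bias on macro knobs, crossfade between generations, and that is roughly the realtime 'audiosex' loop.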
     
    Last edited: Dec 2, 2025 at 1:05 PM
  4. Thomba

    Thomba Member

    Joined:
    Dec 24, 2022
    Messages:
    73
    Likes Received:
    10
    I am quite intrigued by devices like the Oxi One and others that deliver a more hands-on approach to generative sequencing. Given its touch-based approach, I am absolutely sure this could easily be replicated by a screen-oriented VST device. But from my understanding, that is not the case.

    Such a "sound toy" would be nice.
     
    • Interesting x 1