[CentralOH] Async buffer generation

Andrew Kubera andrewkubera at gmail.com
Sun Jun 19 13:55:56 EDT 2016


Asyncio is really only helpful if you have a lot of I/O-bound operations to worry about (hence all the networking tutorials), and you have to hand complete control to the event loop (once in a coroutine, always in a coroutine). Your best bet would probably be to create a coroutine that uses an Executor <https://docs.python.org/3/library/asyncio-eventloop.html?highlight=asyncio%20executor#executor>, which manages all the behind-the-scenes thread stuff. Look there if you really want asyncio.
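For example, something like this (my own untested sketch; make_noise and noise_async are placeholder names, and MINVOL/VOLRANGE are the constants from your callback) pushes the sample generation into the loop's default thread pool with run_in_executor:

import asyncio
import random
from struct import pack

def make_noise(count):
    # plain synchronous generation, same idea as your callback
    return b"".join(pack('h', random.randint(MINVOL, MINVOL + VOLRANGE))
                    for _ in range(count))

async def noise_async(loop, count):
    # run_in_executor(None, ...) runs make_noise in the loop's default ThreadPoolExecutor
    return await loop.run_in_executor(None, make_noise, count)

loop = asyncio.get_event_loop()
data = loop.run_until_complete(noise_async(loop, 500))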

With respect to your noise function, which I'm sure is just an illustration but I thought I'd comment on anyway: it might be effective to use a generator to which you can "send" your size request.

import random

def get_noise(buff_size=1000):
    # assumes MINVOL and VOLRANGE are defined, as in your callback
    intrange = (MINVOL, MINVOL + VOLRANGE)
    buffer = [random.randint(*intrange) for _ in range(buff_size)]
    req_len = yield
    while True:
        if req_len > len(buffer):
            # top the buffer back up: the shortfall plus another buff_size samples
            gen_count = req_len - len(buffer) + buff_size
            buffer += [random.randint(*intrange) for _ in range(gen_count)]
        result, buffer = buffer[:req_len], buffer[req_len:]
        req_len = yield bytes(result)  # bytes() needs each sample to fit in 0..255


>>> noise_gen = get_noise()
>>> next(noise_gen)  # prime the generator before the first send()
>>> data_500bytes = noise_gen.send(500)
>>> sound_20bytes = noise_gen.send(20)

(Or with numpy)

import numpy as np

def get_noise_np(buff_size=1000):
    # assumes MINVOL and VOLRANGE are defined, as in your callback
    intrange = (MINVOL, MINVOL + VOLRANGE)
    buffer = np.random.random_integers(*intrange, size=buff_size)
    req_len = yield
    while True:
        if req_len > len(buffer):
            gen_count = req_len - len(buffer) + buff_size
            # += on an ndarray adds element-wise, so concatenate to extend instead
            buffer = np.concatenate([buffer,
                                     np.random.random_integers(*intrange, size=gen_count)])
        result, buffer = buffer[:req_len], buffer[req_len:]
        req_len = yield bytes(result)  # raw dtype bytes, same as result.tobytes()
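One caveat I should add (my own note, untested, same MINVOL/VOLRANGE assumption): bytes() on an ndarray returns its raw memory, so the sample width depends on the array's dtype rather than being one byte per sample. If you want 16-bit samples to match the pack('h', ...) in your callback, convert explicitly, e.g.:

import numpy as np

samples = np.random.random_integers(MINVOL, MINVOL + VOLRANGE, size=20)
data = samples.astype(np.int16).tobytes()  # 2 bytes per sample, like pack('h', ...)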

(all code untested)

You could choose a buff_size large enough that nearly every noise_gen.send(...) just chips away at the already-created list. From there you could add a 'low water mark': whenever the buffer falls below it, spin up a thread to generate more samples in the background. A rough, untested sketch of that idea is below.
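Something along these lines, using plain threading rather than asyncio (the NoiseBuffer name, the default sizes, and the MINVOL/VOLRANGE constants are my own placeholders):

import random
import threading

class NoiseBuffer:
    """Pool of random samples that a background thread tops up."""

    def __init__(self, buff_size=100000, low_water=10000):
        self.refill_size = buff_size
        self.low_water = low_water
        self.lock = threading.Lock()
        self._refilling = False
        self.buffer = [random.randint(MINVOL, MINVOL + VOLRANGE)
                       for _ in range(buff_size)]

    def _refill(self, count):
        # generate outside the lock, then append in one quick step
        new = [random.randint(MINVOL, MINVOL + VOLRANGE) for _ in range(count)]
        with self.lock:
            self.buffer += new
            self._refilling = False

    def read(self, req_len):
        with self.lock:
            result, self.buffer = self.buffer[:req_len], self.buffer[req_len:]
            start_refill = len(self.buffer) < self.low_water and not self._refilling
            if start_refill:
                self._refilling = True
        if start_refill:
            threading.Thread(target=self._refill, args=(self.refill_size,),
                             daemon=True).start()
        return bytes(result)

With a NoiseBuffer instance called, say, noise, your pyaudio callback would just do "return noise.read(frame_count), pyaudio.paContinue" and never block on generation. Note the sketch doesn't handle a request that outruns the buffer entirely; the large buff_size and low_water are meant to make that unlikely.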

Good Luck! And please keep us updated on how it works.



> On Jun 19, 2016, at 11:53 AM, Eric Floehr <eric at intellovations.com> wrote:
> 
> I am using pyaudio to generate real-time audio. Pyaudio allows you to pass in a callback function so that it can request some amount of audio bytes to play.
> 
> Here is an example of a callback function that generates random noise:
> 
> def generate_samples(in_data, frame_count, time_info, status_flags):
>     out_data = b""
>     for _ in range(frame_count):
>         out_data += pack('h', int((round(random.random() * VOLRANGE) + MINVOL)))
>     return out_data, pyaudio.paContinue
> 
> The only thing I care about is "frame_count" which is the requested number of audio bytes to return.
> 
> The generation/creation of those audio bytes is occurring synchronously in the callback however. What I'd really like to do is generate the audio bytes asynchronously and then just have the callback just drain however much of the buffer is requested.
> 
> I've been going through Python asyncio tutorials but haven't grokked this use case (most of the tutorials are network/HTTP/etc. based, and I know it's the same concept, but it's different enough that I haven't been able to latch on to how to do it in my case).
> 
> Any thoughts on how I could do this with Python 3.5's asyncio?
> 
> Thanks!
> Eric
> 
