logsnarf.uploader module

BigQuery uploader

A class that implements twisted.internet.interfaces.IConsumer for uploading logs to BigQuery.
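A minimal usage sketch follows. The placeholder values for schema_obj, svc and table_name_schema are illustrative assumptions, not values defined by this module; the real objects come from elsewhere in logsnarf:

    from twisted.internet import reactor

    from logsnarf.uploader import BigQueryUploader

    schema_obj = ...                   # placeholder: schema object describing the log table
    svc = ...                          # placeholder: an authorised BigQuery service object
    table_name_schema = "logs_%Y%m%d"  # assumed: a pattern used to name target tables

    uploader = BigQueryUploader(schema_obj, svc, table_name_schema, reactor=reactor)
    uploader.start()                   # begin periodic flushes and accept writes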

class logsnarf.uploader.BigQueryUploader(schema_obj, svc, table_name_schema, reactor=None)[source]

Bases: _ConsumerMixin

Parameters:
addData(data)[source]

Add data to the upload buffer. This expects valid dicts, ready to upload.
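For illustration only, an entry might be added like this (the field names are hypothetical and must match the target table's schema):

    uploader.addData({"time": "2024-01-01T00:00:00Z", "severity": "INFO", "message": "started"})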

flush()[source]

Flush our buffer.

pauseConsuming()[source]

Pause consuming by pausing our producer.

registerProducer(producer, streaming)[source]

twisted.internet.interfaces.IConsumer method

Registers a producer with the uploader.

Parameters:

producer (IProducer) – the producer that will be writing data to this consumer

streaming (bool) – True if producer provides IPushProducer, False if it provides IPullProducer

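A sketch of registering a push producer; the TailProducer class is illustrative, not part of logsnarf, and uploader is the instance constructed earlier:

    from zope.interface import implementer
    from twisted.internet.interfaces import IPushProducer

    @implementer(IPushProducer)
    class TailProducer:
        """Illustrative push producer that feeds log text into a consumer."""

        def __init__(self, consumer):
            self.consumer = consumer

        def pauseProducing(self):
            pass  # stop reading the log source

        def resumeProducing(self):
            pass  # resume reading; hand new text to self.consumer.write()

        def stopProducing(self):
            pass  # release the log source

    producer = TailProducer(uploader)
    uploader.registerProducer(producer, streaming=True)  # True because it pushes
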
resumeConsuming()[source]

Resume consuming by resuming our producer.

setBatchSize(n)[source]

Set the number of log entries to batch in an upload.

This is both the maximum and the minimum batch size, barring uploads that occur at the interval set by setFlushInterval().

Parameters:

n (int) – number of log entries per batch

setDefaultTZ(tz)[source]

Set the default timezone.

Parameters:

tz (str or datetime.tzinfo) – timezone to set as default
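Both accepted forms, sketched with the standard library's zoneinfo module (older deployments might use pytz instead; the assumption here is that a string names a timezone):

    from zoneinfo import ZoneInfo

    uploader.setDefaultTZ("UTC")                     # a timezone name as a string
    uploader.setDefaultTZ(ZoneInfo("Europe/Paris"))  # or a datetime.tzinfo instance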

setFlushInterval(n)[source]

Set the flush interval for the uploader.

Uploads will occur at least once every n seconds, as long as there is anything to upload.

Parameters:

n (int) – flush interval, in seconds

setMaxBuffer(n)[source]

Set the max buffer size for uploads.

Once this is reached, the uploader will ask the producer to pause until the buffer is below this limit.

Parameters:

n (int) – max buffer size
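setBatchSize(), setFlushInterval() and setMaxBuffer() work together; an illustrative tuning, with arbitrary example numbers rather than recommendations:

    uploader.setBatchSize(500)      # aim for 500 entries per insert
    uploader.setFlushInterval(30)   # but flush at least every 30 seconds
    uploader.setMaxBuffer(10000)    # pause the producer if 10,000 entries pile up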

start()[source]

Start the uploader.

Start periodic tasks, add a trigger to flush our buffer on system shutdown, and make sure the producer is set to produce.

startWriting()[source]

Required by _ConsumerMixin.

upload(flush=False)[source]

Potentially insert buffered table data into BigQuery.

Parameters:

flush (bool) – if this is part of flushing our buffer

write(data)[source]

twisted.internet.interfaces.IConsumer method

Process data and add it to our buffer. It is preferable, but not required, that full log lines are passed here.

Parameters:

data (str) – text containing line-separated JSON
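An illustrative call with a hypothetical two-line, newline-delimited JSON payload (the fields must match the target table's schema):

    uploader.write(
        '{"severity": "INFO", "message": "first"}\n'
        '{"severity": "INFO", "message": "second"}\n'
    )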