| | |
|---|---|
| Crates.io | cramjam |
| lib.rs | cramjam |
| version | 2.4.0 |
| source | src |
| created_at | 2020-05-03 04:42:37.579812 |
| updated_at | 2021-09-10 07:41:32.635569 |
| description | Thin Python bindings to de/compression algorithms in Rust |
| homepage | |
| repository | |
| max_upload_size | |
| id | 236896 |
| size | 3,107,738 |
```shell
pip install --upgrade cramjam  # Requires no Python or system dependencies!
```
Extremely thin Python bindings to de/compression algorithms in Rust. Allows using algorithms such as Snappy without any system dependencies. This is handy in environments like AWS Lambda, where installing packages like python-snappy becomes difficult because of system-level dependencies.
Some basic benchmarks are available in the benchmarks directory.
Available algorithms:
All are available for use as:
```python
>>> import cramjam
>>> import numpy as np
>>> compressed = cramjam.snappy.compress(b"bytes here")
>>> decompressed = cramjam.snappy.decompress(compressed)
>>> decompressed
cramjam.Buffer(len=10)  # an object which implements the buffer protocol
>>> bytes(decompressed)
b"bytes here"
>>> np.frombuffer(decompressed, dtype=np.uint8)
array([ 98, 121, 116, 101, 115,  32, 104, 101, 114, 101], dtype=uint8)
```
The API is `cramjam.<compression-variant>.compress/decompress` and accepts `bytes`/`bytearray`/`numpy.array`/`cramjam.File`/`cramjam.Buffer` objects.
de/compress_into

Additionally, all variants support `decompress_into` and `compress_into`.
Ex.

```python
>>> import numpy as np
>>> from cramjam import snappy, Buffer
>>>
>>> data = np.frombuffer(b'some bytes here', dtype=np.uint8)
>>> data
array([115, 111, 109, 101,  32,  98, 121, 116, 101, 115,  32, 104, 101,
       114, 101], dtype=uint8)
>>>
>>> compressed = Buffer()
>>> snappy.compress_into(data, compressed)
33  # 33 bytes written to the compressed buffer
>>>
>>> compressed.tell()  # Where is the buffer position?
33  # goodie!
>>>
>>> compressed.seek(0)  # Go back to the start of the buffer to prepare for decompression
>>> decompressed = bytearray(b'0' * len(data))  # let's write to a mutable buffer as output
>>> decompressed
bytearray(b'000000000000000')
>>>
>>> snappy.decompress_into(compressed, decompressed)
15  # 15 bytes written to decompressed
>>> decompressed
bytearray(b'some bytes here')
```
Special note!
If you know the length of the de/compress output, you can provide `output_len=<<some int>>` to any `de/compress` call to get a ~1.5-3x performance increase, as this allows a single buffer allocation; it doesn't really apply if you're using `cramjam.File` or `cramjam.Buffer` objects.