Input data for the algorithm. The slice is reduced from its beginning as processing moves on. It must be refilled when empty before calling the algorithm's process method.
How many bytes have been read since the start of the stream processing.
Output buffer for the algorithm to write to. This is NOT the data that is ready after process, but where the algorithm must write next. After a call to process, the slice is reduced from its beginning, so the data just written lies immediately before the remaining slice.
How many bytes have been written since the start of the stream processing.
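To make the contract of these fields concrete, here is a minimal, self-contained sketch of a driver loop. The struct StreamState and the function processChunk are hypothetical stand-ins for the library's actual algorithm type and process method; only the slice-advancing behaviour described above is reproduced.

import std.algorithm : min;
import std.stdio : writeln;

// Hypothetical state mirroring the fields documented above; the real
// library defines its own algorithm/stream types.
struct StreamState
{
    const(ubyte)[] input;   // consumed from its beginning by each process call
    ubyte[] output;         // advanced past the bytes written by each process call
    ulong totalInput;       // bytes read since the start of the stream
    ulong totalOutput;      // bytes written since the start of the stream
}

// A copy-like process step: moves as many bytes as fit from input to output.
void processChunk(ref StreamState s)
{
    const n = min(s.input.length, s.output.length);
    s.output[0 .. n] = s.input[0 .. n];
    s.input = s.input[n .. $];    // input slice reduced by its beginning
    s.output = s.output[n .. $];  // written data now lies before this slice
    s.totalInput += n;
    s.totalOutput += n;
}

void main()
{
    const data = cast(const(ubyte)[]) "hello, streaming world";
    ubyte[64] buf;

    StreamState s;
    s.output = buf[];

    // refill the input slice whenever it is empty, as required above
    while (s.totalInput < data.length)
    {
        if (s.input.length == 0)
            s.input = data[cast(size_t) s.totalInput .. $];
        processChunk(s);
    }

    writeln(cast(const(char)[]) buf[0 .. cast(size_t) s.totalOutput]);
}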
import test.util : generateRepetitiveData;
import std.array : join;

const len = 10_000;
const phrase = cast(const(ubyte)[]) "Some very repetitive phrase.\n";
const input = generateRepetitiveData(len, phrase).join();

// copying with arbitrary chunk sizes on input and output
const cop1 = generateRepetitiveData(len, phrase, 1231).copy(234).join();
const cop2 = generateRepetitiveData(len, phrase, 296).copy(6712).join();

assert(input == cop1);
assert(input == cop2);
The Copy algorithm does not transform data at all. This is useful when reading or writing data that may or may not be compressed: using Copy, the same code can handle both kinds of streams.
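As an illustration of that idea, here is a self-contained sketch where the reading code is written once against a transform callback. copyTransform plays the role Copy plays in the library, and a decompressing transform could be substituted without touching the reader. The names readAll and copyTransform are illustrative only, not the library's API.

import std.stdio : writeln;

// Identity transform: the role Copy plays, leaving data untouched.
const(ubyte)[] copyTransform(const(ubyte)[] chunk)
{
    return chunk;
}

// The reader is written once; only the transform differs between plain
// and compressed streams.
ubyte[] readAll(const(ubyte)[][] chunks,
                const(ubyte)[] function(const(ubyte)[]) transform)
{
    ubyte[] result;
    foreach (chunk; chunks)
        result ~= transform(chunk);
    return result;
}

void main()
{
    const(ubyte)[][] chunks = [
        cast(const(ubyte)[]) "plain ",
        cast(const(ubyte)[]) "data",
    ];

    // With the identity (Copy-like) transform, data passes through unchanged.
    const plain = readAll(chunks, &copyTransform);
    writeln(cast(const(char)[]) plain); // prints: plain data

    // For a compressed stream, a decompressing transform would be passed
    // here instead; the calling code stays identical.
}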