# Deluge is (not) a Stream
`Deluge` builds on top of `Stream` to provide stream operations that are parallel or concurrent by default. It allows its users to have an ordered stream of futures that are evaluated concurrently, with all the complexity hidden inside `Deluge` itself.

We achieve this by working one level higher than a stream. Instead of returning values that might materialize at some point in the future, we immediately return an iterator of unevaluated futures, which can then be evaluated by the collector.

The animation below shows an example of mapping over a highly concurrent six-element collection. 📘 indicates the time it takes for an underlying element to become available, while 📗 indicates the time it takes to apply the mapped operation.

![Example of processing using Deluge and Streams](./images/process.gif)

**This library is still experimental, use at your own risk.**

### Design decisions

This is an opinionated library that puts ease of use and external simplicity at the forefront. Operations that apply to individual elements, like maps and filters, **do not** allocate. They simply wrap each element in another future, but they do not control how these processed elements are evaluated. It is the collector that controls the evaluation strategy. At the moment two basic collectors are supplied: a concurrent one and a parallel one. Where there is a trade-off to be made between performance and ease of use, we are likely to fall on the side of ease of use.

The concurrent collector accepts an optional concurrency limit. If it is specified, at most that many futures will be evaluated at once.

```rust
let result = [1, 2, 3, 4]
    .into_deluge()
    .map(|x| async move { x * 2 })
    .collect::<Vec<usize>>()
    .await;

assert_eq!(vec![2, 4, 6, 8], result);
```
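To make the collector's role concrete, the sketch below shows roughly what a concurrent collector with a limit of 2 corresponds to, written against the `futures` crate rather than Deluge's own API. It is an illustration of the evaluation strategy only, not Deluge's actual implementation: the futures are created up front, nothing runs until the collecting stage drives them, at most two are in flight at any moment, and the output order still matches the input order.

```rust
use futures::executor::block_on;
use futures::stream::{self, StreamExt};

fn main() {
    block_on(async {
        // The "unevaluated futures" stage: only descriptions of work so far,
        // nothing is polled yet.
        let futs = [1usize, 2, 3, 4].map(|x| async move { x * 2 });

        // The collecting stage decides the evaluation strategy. With a
        // concurrency limit of 2, at most two futures are polled at once,
        // while results are yielded in the original order.
        let result: Vec<usize> = stream::iter(futs).buffered(2).collect().await;

        assert_eq!(vec![2, 4, 6, 8], result);
    });
}
```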