
These are stream fusion versions of some of the functions in Data.Conduit.Combinators. Many functions don't have stream versions here because they instead have rewrite RULES that inline a definition which fuses.
Monadic streams
Megaparsec's input stream facilities. You probably do not want to import this module directly because Text.Megaparsec re-exports it anyway.
Transmitting HTTP requests and responses holding String in their payload bodies. This is one of the implementation modules for the Network.HTTP interface, representing request and response content as Strings and transmitting them in non-packed form (cf. Network.HTTP.HandleStream and its use of ByteStrings) over Stream handles. It is mostly here for backwards compatibility, representing how requests and responses were transmitted up until the 4.x releases of the HTTP package. For more detailed information about what the individual exports do, please consult the documentation for Network.HTTP. Notice, however, that the functions here do not perform any kind of normalization prior to transmission (or receipt); you are responsible for doing any such normalization yourself, or, if you prefer, just switch to using the Network.HTTP functions instead.
A library for creating abstract streams. Originally part of Gray's/Bringert's HTTP module.
  • Changes by Robin Bate Boerop robin@bateboerop.name:
  • Removed unnecessary import statements.
  • Moved Debug code to StreamDebugger.hs
  • Moved Socket-related code to StreamSocket.hs.
  • Changes by Simon Foster:
  • Split the Network.HTTP module up into separate Network.[Stream,TCP,HTTP] modules
Lightweight abstraction over an input/output stream.
Arrow transformer lifting an arrow to streams.
The Stream type represents a producer of a sequence of values. Its dual, Fold, represents a consumer. While both types support similar transformations, the key difference is that only Stream can compose multiple producers, and only Fold can compose multiple consumers.

Console Echo Example

To get you started, here is an example of a program which reads lines from console and writes them back to the console.
>>> import Data.Function ((&))
>>> import qualified Streamly.Data.Fold as Fold
>>> import qualified Streamly.Data.Stream as Stream

>>> :{
echo =
     Stream.repeatM getLine       -- Stream IO String
         & Stream.mapM putStrLn   -- Stream IO ()
         & Stream.fold Fold.drain -- IO ()
:}
This is a simple example of a declarative representation of an imperative loop using streaming combinators. In this example, repeatM generates an infinite stream of Strings by repeatedly performing the getLine IO action. mapM then applies putStrLn to each element in the stream, converting it to a stream of (). Finally, drain folds the stream to IO, discarding the () values and thus producing only effects. This gives you an idea of how we can program declaratively by representing loops as streams. Compare this declarative, loopless approach with an imperative approach that writes the same program using a while loop. In this module, you can find all Data.List-like functions and many more powerful combinators for common programming tasks.

Static Stream Fusion

The Stream type represents streams as state machines. When composed statically, these state machines fuse together at compile time, eliminating intermediate data structures and function calls. This results in the generation of tight, efficient loops comparable to those written in low-level languages like C. For instance, in the earlier example, operations like repeatM and mapM are written as separate fragments but fuse into a single, optimized loop.

The primary goal of the Stream type is to build highly efficient streams via compile-time fusion of modular loop fragments. However, this technique comes with trade-offs and should be used with care. Stream construction operations such as cons, append, interleave, mergeBy, and zipWith work extremely well at a small scale, but at a large scale their performance degrades due to O(n^2) complexity, where n is the number of compositions. Therefore, it is best to generate a fused stream in one go, if possible. Using a small number of composition operations is absolutely fine, but avoid using a large number of them. For example, do not try to construct a fused Stream by using cons recursively.

However, you can use cons and any other construction operation on the CPS StreamK type without any problem. The CPS construction operations have linear (O(n)) performance characteristics and scale much better, though they are not as efficient as fused streams due to the function call overhead at each step. When used correctly, the fused Stream type can be 10x to 100x faster than CPS-based streams, depending on the use case.

Rule of thumb: use the fused Stream type when the number of compositions is small and they are static or known at compile time. Use the CPS-based StreamK type when the number of compositions is large or potentially infinite, and they are dynamic or composed at runtime. Both types are fully interconvertible, allowing you to choose the best tool for each part of your pipeline.
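The recommended pattern can be sketched as follows, assuming the module layout of recent streamly-core releases (Streamly.Data.Stream, Streamly.Data.StreamK, Streamly.Data.Fold): build incrementally on the CPS StreamK type, then convert once to the fused Stream type for consumption.

```haskell
import qualified Streamly.Data.Fold as Fold
import qualified Streamly.Data.Stream as Stream
import qualified Streamly.Data.StreamK as StreamK

main :: IO ()
main = do
    -- Repeated consing scales linearly on the CPS StreamK type ...
    let k = foldr StreamK.cons StreamK.nil [1 .. 1000 :: Int]
    -- ... and a single conversion yields a fused Stream for processing.
    total <- Stream.fold Fold.sum (StreamK.toStream k)
    print total
```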

Better and Effectful Lists

This module offers operations analogous to those on standard Haskell lists from the base package. Streams can be viewed as a generalization of lists, providing all the functionality of standard lists plus additional capabilities such as effectful operations and improved performance through stream fusion. They can easily replace lists in most contexts, and go beyond where lists fall short. For instance, a common limitation of lists is the inability to perform IO actions (e.g., printing) at arbitrary points during processing; streams naturally support such effectful operations. As discussed in the fusion section above, while the fused Stream type does not scale for repeated cons and append operations, the StreamK type does.
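As a small illustration (module names as in recent streamly releases; treat them as assumptions), the following pipeline performs an IO action at each step of a list-like computation, something plain lists cannot express:

```haskell
import Data.Function ((&))
import qualified Streamly.Data.Fold as Fold
import qualified Streamly.Data.Stream as Stream

main :: IO ()
main =
      Stream.fromList [1 .. 3 :: Int]            -- Stream IO Int
    & Stream.mapM (\x -> print x >> return x)    -- print each element as it streams
    & Stream.fold Fold.sum                       -- IO Int
    >>= print                                    -- prints 1, 2, 3, then 6
```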

Non-determinism and List Transformers

Streamly does not provide a ListT-like Monad instance, but it provides all the equivalent functionality and more. We do not provide a Monad instance for streams because there are many possible ways to define the bind operation. Instead, we offer bind-style operations such as concatFor, concatForM, and their variants (e.g., fair interleaving and breadth-first nesting). These can be used for convenient ListT-style stream composition. Additionally, we provide applicative-style cross-product operations like cross and its variants, which are many times faster than the monad-style operations.
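For example, ListT-style nesting, where the inner stream depends on the outer element, can be written with concatMap, the longer-standing equivalent of concatFor with arguments flipped (module name assumed from recent streamly releases):

```haskell
import qualified Streamly.Data.Stream as Stream

-- For each x drawn from the outer stream, generate a dependent inner
-- stream; results are concatenated depth-first, as ListT's bind would.
pairs :: Stream.Stream IO (Int, Int)
pairs =
    Stream.concatMap
        (\x -> fmap (\y -> (x, y)) (Stream.fromList [1 .. x]))
        (Stream.fromList [1 .. 3 :: Int])

main :: IO ()
main = Stream.toList pairs >>= print
-- [(1,1),(2,1),(2,2),(3,1),(3,2),(3,3)]
```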

Logic Programming

Streamly does not provide a LogicT-style Monad instance, but it offers all the equivalent functionality—and more. Operations like fairCross and fairConcatFor nest outer and inner streams fairly, ensuring that no stream is starved when exploring cross products. This enables balanced exploration across all dimensions in backtracking problems, while also supporting infinite streams. It effectively replaces the core functionality of LogicT from the logict package, with significantly better performance. In particular, it avoids the quadratic slowdown seen with observeMany, and the applicative fairCross runs many times faster, achieving loop nesting performance comparable to C.
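A sketch of fair nesting over two infinite streams follows; fairCross is named in the text above, but its exact module and signature (assumed here to be Stream m a -> Stream m b -> Stream m (a, b)) should be checked against the released API:

```haskell
import qualified Streamly.Data.Stream as Stream

-- Fair nesting: neither infinite dimension is starved, so every pair
-- (i, j) is eventually produced. A depth-first concatMap would stay
-- stuck on i == 1 forever.
pairs :: Stream.Stream IO (Int, Int)
pairs = Stream.fairCross (Stream.fromList [1 ..]) (Stream.fromList [1 ..])

main :: IO ()
main = Stream.toList (Stream.take 6 pairs) >>= print
```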
Deprecated: Please use Streamly.Internal.Data.Array instead.
Encode Haskell data types to byte streams. The primary purpose of this module is to serialize primitive Haskell types to streams for convenient byte by byte processing when such a need arises. It would be inefficient to use this to build byte streams from algebraic data types. For general serialization of ADTs please use the Serialize type class instances. The fastest way to convert general Haskell types to byte streams is to serialize them to an array and then stream the array.
Deprecated: Please use Streamly.Internal.Data.MutArray instead.
Direct style re-implementation of the CPS stream in Streamly.Internal.Data.StreamK. GHC is able to INLINE and fuse direct style better, providing better performance than the CPS implementation.
import qualified Streamly.Internal.Data.Stream as Stream

Processing Unicode Strings

A Char stream is the canonical representation to process Unicode strings. It can be processed efficiently using regular stream processing operations. A byte stream of Unicode text read from an IO device or from an Array in memory can be decoded into a Char stream using the decoding routines in this module. A String ([Char]) can be converted into a Char stream using fromList. An Array Char can be unfolded into a stream using the array read unfold.

Storing Unicode Strings

A stream of Char can be encoded into a byte stream using the encoding routines in this module and then written to IO devices or to arrays in memory. If you have to store a Char stream in memory, you can fold it to an Array Char using the array write fold. The Array type provides a more compact representation, reducing GC overhead. If space efficiency is a concern, you can use encodeUtf8' on the Char stream before writing it to an Array, giving an even more compact representation.
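A round-trip sketch using the encoding and decoding routines (encodeUtf8 and decodeUtf8 from Streamly.Unicode.Stream in recent releases; treat the module names as assumptions):

```haskell
import qualified Streamly.Data.Stream as Stream
import qualified Streamly.Unicode.Stream as Unicode

main :: IO ()
main = do
    let chars = Stream.fromList "naïve"     -- Stream IO Char
        bytes = Unicode.encodeUtf8 chars    -- Stream IO Word8, UTF-8 bytes
    -- Decoding the byte stream recovers the original Char stream.
    s <- Stream.toList (Unicode.decodeUtf8 bytes)
    putStrLn s
```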

String Literals

Stream Identity Char and Array Char are instances of IsString and IsList; therefore, the OverloadedStrings and OverloadedLists extensions can be used for convenience when specifying Unicode string literals with these types.
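For instance, given the IsString instance mentioned above, a string literal can denote an Array Char directly (a sketch; the module exporting Array is assumed to be Streamly.Data.Array):

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Streamly.Data.Array (Array)

-- With OverloadedStrings, the literal is desugared via fromString
-- into an Array Char.
greeting :: Array Char
greeting = "hello, world"
```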

Idioms

Some simple text processing operations can be represented simply as operations on Char streams. Follow the links for the following idioms:

Pitfalls

  • Case conversion: Some unicode characters translate to more than one code point on case conversion. The toUpper and toLower functions in base package do not handle such characters. Therefore, operations like map toUpper on a character stream or character array may not always perform correct conversion.
  • String comparison: In some cases, visually identical strings may have different unicode representations, therefore, a character stream or character array cannot be directly compared. A normalized comparison may be needed to check string equivalence correctly.
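The case-conversion pitfall is easy to demonstrate with plain Data.Char: 'ß' (U+00DF) uppercases to the two-character sequence "SS", but toUpper can only return a single Char, so it leaves the character unchanged.

```haskell
import Data.Char (toUpper)

-- toUpper is a Char -> Char mapping, so characters whose uppercase
-- form is longer than one code point are left as-is.
main :: IO ()
main = putStrLn (map toUpper "straße")
-- prints "STRAßE" rather than the correct "STRASSE"
```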

Experimental APIs

Some experimental APIs to conveniently process text using the Array Char representation directly can be found in Streamly.Internal.Unicode.Array.
Streams are infinite lists. Most operations on streams are completely analogous to their definitions in Data.List. The functions provided in this package are fairly careful about totality, termination, and productivity; none of them should diverge, provided you adhere to the preconditions mentioned in the documentation. Note: I get quite a lot of requests regarding the missing Traversable instance for Streams. It has been left out by design.
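A small example with the package's Data.Stream module (function names per that package; take on an infinite Stream returns a plain list):

```haskell
import qualified Data.Stream as Stream

-- The natural numbers as an infinite stream.
nats :: Stream.Stream Integer
nats = Stream.iterate (+ 1) 0

-- take demands only a finite prefix, so this terminates.
main :: IO ()
main = print (Stream.take 5 nats)  -- [0,1,2,3,4]
```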
Secret-key encryption: Crypto.Saltine.Core.Stream

The stream function produces a sized stream ByteString as a function of a secret key and a nonce. The xor function encrypts a message ByteString using a secret key and a nonce. The xor function guarantees that the ciphertext has the same length as the plaintext, and is the plaintext xor stream k n. Consequently xor can also be used to decrypt.

The stream function, viewed as a function of the nonce for a uniform random key, is designed to meet the standard notion of unpredictability ("PRF"). For a formal definition see, e.g., Section 2.3 of Bellare, Kilian, and Rogaway, "The security of the cipher block chaining message authentication code," Journal of Computer and System Sciences 61 (2000), 362–399; http://www-cse.ucsd.edu/~mihir/papers/cbc.html. This means that an attacker cannot distinguish this function from a uniform random function. Consequently, if a series of messages is encrypted by xor with a different nonce for each message, the ciphertexts are indistinguishable from uniform random strings of the same length. Note that the length is not hidden.

Note also that it is the caller's responsibility to ensure the uniqueness of nonces, for example by using nonce 1 for the first message, nonce 2 for the second message, etc. Nonces are long enough that randomly generated nonces have negligible risk of collision. Saltine does not make any promises regarding the resistance of crypto_stream to "related-key attacks." It is the caller's responsibility to use proper key-derivation functions.

Crypto.Saltine.Core.Stream is crypto_stream_xsalsa20, a particular cipher specified in "Cryptography in NaCl" (http://nacl.cr.yp.to/valid.html), Section 7. This cipher is conjectured to meet the standard notion of unpredictability. This is version 2010.08.30 of the stream.html web page.
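A usage sketch of the API described above (function names from the saltine package; the exact argument order of stream and the availability of newNonce in this module are assumptions):

```haskell
import Crypto.Saltine (sodiumInit)
import Crypto.Saltine.Core.Stream (newKey, newNonce, stream, xor)
import qualified Data.ByteString.Char8 as BS

main :: IO ()
main = do
    sodiumInit                   -- initialize libsodium once per process
    k <- newKey
    n <- newNonce                -- never reuse a nonce with the same key
    let msg = BS.pack "attack at dawn"
        ct  = xor k n msg        -- ciphertext: msg XOR keystream
    -- xor is an involution, so applying it again decrypts.
    print (xor k n ct == msg)    -- True
    -- stream produces a raw keystream of the requested length.
    print (BS.length (stream k n 16))
```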
Parse .xlsx sheets in constant memory. All actions on an xlsx file run inside the XlsxM monad and must be run with runXlsxM. XlsxM is not a monad transformer, a design inherited from the "zip" package's ZipArchive monad. Inside the XlsxM monad, you can stream SheetItems (rows) from a particular sheet using readSheetByIndex, which is callback-based and tied to IO.
Internal stream-related functions. These are exported only because they are tested this way; users are not expected to need them.
Writes Excel files from a stream, which allows creation of large Excel files while remaining in constant memory.