Stream is:module

These are stream fusion versions of some of the functions in Data.Conduit.Combinators. Many functions don't have stream versions here because they instead have RULES that inline a definition which fuses.
Monadic streams
Megaparsec's input stream facilities. You probably do not want to import this module directly because Text.Megaparsec re-exports it anyway.
Transmitting HTTP requests and responses holding String in their payload bodies. This is one of the implementation modules for the Network.HTTP interface, representing request and response content as Strings and transmitting them in non-packed form over Stream handles (cf. Network.HTTP.HandleStream and its use of ByteStrings). It is mostly here for backwards compatibility, representing how requests and responses were transmitted up until the 4.x releases of the HTTP package. For more detailed information about what the individual exports do, please consult the documentation for Network.HTTP. Notice however that the functions here do not perform any kind of normalization prior to transmission (or receipt); you are responsible for doing any such normalization yourself, or, if you prefer, just switch to using the Network.HTTP functions instead.
A library for creating abstract streams. Originally part of Gray's/Bringert's HTTP module.
  • Changes by Robin Bate Boerop robin@bateboerop.name:
  • Removed unnecessary import statements.
  • Moved Debug code to StreamDebugger.hs
  • Moved Socket-related code to StreamSocket.hs.
  • Changes by Simon Foster:
  • Split the Network.HTTP module up into separate Network.[Stream,TCP,HTTP] modules
Lightweight abstraction over an input/output stream.
Arrow transformer lifting an arrow to streams.
Streams are infinite lists. Most operations on streams are completely analogous to their definitions in Data.List. The functions provided in this package are fairly careful about totality, termination, and productivity. None of the functions should diverge, provided you adhere to the preconditions mentioned in the documentation. Note: I get quite a lot of requests regarding a missing Traversable instance for Streams. This has been left out by design.
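A minimal sketch of the Data.List-like API (assuming the package's Data.Stream module, whose iterate, map, and take are assumed here to mirror their Data.List counterparts):

import qualified Data.Stream as S

-- The infinite stream 0, 1, 2, ...
nats :: S.Stream Integer
nats = S.iterate (+ 1) 0

-- take returns a plain list, so the result is finite: [0,2,4,6,8]
firstEvens :: [Integer]
firstEvens = S.take 5 (S.map (* 2) nats)
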
Streams represented as state machines that fuse together when composed statically, eliminating function calls and intermediate constructor allocations and generating tight, efficient loops. Suitable for high-performance looping operations. If you need to call these operations recursively in a loop (i.e. composed dynamically), it is recommended to use the continuation passing style (CPS) stream operations from the Streamly.Data.StreamK module. The Stream and StreamK types are interconvertible. See the documentation below for more details on Stream vs StreamK. Please refer to Streamly.Internal.Data.Stream for more functions that have not yet been released. Check out the https://github.com/composewell/streamly-examples repository for many more real-world examples of stream programming.
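For illustration, a minimal sketch assuming streamly-core's released Streamly.Data.Stream and Streamly.Data.Fold modules (fromList, filter, fold, and Fold.sum as documented there):

import qualified Streamly.Data.Fold as Fold
import qualified Streamly.Data.Stream as Stream

-- Sum the even numbers in 1..100; the statically composed pipeline fuses into a tight loop.
sumEvens :: IO Int
sumEvens = Stream.fold Fold.sum
         $ Stream.filter even
         $ Stream.fromList [1 .. 100]
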
Deprecated: Please use Streamly.Internal.Data.Array instead.
Encode Haskell data types to byte streams. The primary purpose of this module is to serialize primitive Haskell types to streams for convenient byte by byte processing when such a need arises. It would be inefficient to use this to build byte streams from algebraic data types. For general serialization of ADTs please use the Serialize type class instances. The fastest way to convert general Haskell types to byte streams is to serialize them to an array and then stream the array.
Deprecated: Please use Streamly.Internal.Data.MutArray instead.
Direct style re-implementation of the CPS stream in Streamly.Internal.Data.StreamK. The symbol or suffix D in this module denotes the Direct style. GHC is able to INLINE and fuse direct style better, providing better performance than the CPS implementation.
import qualified Streamly.Internal.Data.Stream as D

Processing Unicode Strings

A Char stream is the canonical representation to process Unicode strings. It can be processed efficiently using regular stream processing operations. A byte stream of Unicode text read from an IO device or from an Array in memory can be decoded into a Char stream using the decoding routines in this module. A String ([Char]) can be converted into a Char stream using fromList. An Array Char can be unfolded into a stream using the array read unfold.
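For example, a minimal sketch assuming streamly-core's Streamly.Unicode.Stream.decodeUtf8 together with Stream.fromList and Stream.toList:

import Data.Word (Word8)
import qualified Streamly.Data.Stream as Stream
import qualified Streamly.Unicode.Stream as Unicode

-- Decode a UTF-8 byte stream into a Char stream and collect it as a String.
decodeBytes :: [Word8] -> IO String
decodeBytes bytes = Stream.toList (Unicode.decodeUtf8 (Stream.fromList bytes))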

Storing Unicode Strings

A stream of Char can be encoded into a byte stream using the encoding routines in this module and then written to IO devices or to arrays in memory. If you have to store a Char stream in memory you can fold the Char stream as an Array Char using the array write fold. The Array type provides a more compact representation, reducing GC overhead. If space efficiency is a concern you can use encodeUtf8' on the Char stream before writing it to an Array, providing an even more compact representation.
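For example, a minimal sketch assuming Streamly.Unicode.Stream.encodeUtf8' and the Array write fold from Streamly.Data.Array (renamed create in newer streamly-core releases):

import Data.Word (Word8)
import qualified Streamly.Data.Array as Array
import qualified Streamly.Data.Stream as Stream
import qualified Streamly.Unicode.Stream as Unicode

-- Encode a Char stream as UTF-8 and store the bytes compactly in an Array.
storeUtf8 :: String -> IO (Array.Array Word8)
storeUtf8 s = Stream.fold Array.write (Unicode.encodeUtf8' (Stream.fromList s))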

String Literals

Stream Identity Char and Array Char are instances of IsString and IsList; therefore, the OverloadedStrings and OverloadedLists extensions can be used for convenience when specifying Unicode string literals using these types.
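For instance, a small sketch relying on the IsString instance mentioned above (Identity comes from Data.Functor.Identity):

{-# LANGUAGE OverloadedStrings #-}

import Data.Functor.Identity (Identity)
import qualified Streamly.Data.Stream as Stream

-- The string literal desugars through fromString into a pure Char stream.
greeting :: Stream.Stream Identity Char
greeting = "hello"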

Idioms

Some simple text processing operations can be represented directly as operations on Char streams.

Pitfalls

  • Case conversion: Some Unicode characters translate to more than one code point on case conversion. The toUpper and toLower functions in the base package do not handle such characters. Therefore, operations like map toUpper on a character stream or character array may not always perform the correct conversion (see the sketch after this list).
  • String comparison: In some cases, visually identical strings may have different Unicode representations; therefore, a character stream or character array cannot be compared directly. A normalized comparison may be needed to check string equivalence correctly.
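A sketch of the case-conversion pitfall, using only Data.Char from base:

import Data.Char (toUpper)

-- Data.Char.toUpper maps one code point to one code point, so characters whose
-- uppercase form needs more than one code point are left unchanged:
upcased :: String
upcased = map toUpper "straße"   -- "STRAßE", whereas full Unicode case mapping gives "STRASSE"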

Experimental APIs

Some experimental APIs to conveniently process text using the Array Char representation directly can be found in Streamly.Internal.Unicode.Array.
Stream management for low-level driver interface
Stream management routines
Secret-key encryption: Crypto.Saltine.Core.Stream. The stream function produces a sized stream ByteString as a function of a secret key and a nonce. The xor function encrypts a message ByteString using a secret key and a nonce; it guarantees that the ciphertext has the same length as the plaintext, and is the plaintext XORed with the output of stream k n. Consequently xor can also be used to decrypt.

The stream function, viewed as a function of the nonce for a uniform random key, is designed to meet the standard notion of unpredictability ("PRF"). For a formal definition see, e.g., Section 2.3 of Bellare, Kilian, and Rogaway, "The security of the cipher block chaining message authentication code," Journal of Computer and System Sciences 61 (2000), 362–399; http://www-cse.ucsd.edu/~mihir/papers/cbc.html. This means that an attacker cannot distinguish this function from a uniform random function. Consequently, if a series of messages is encrypted by xor with a different nonce for each message, the ciphertexts are indistinguishable from uniform random strings of the same length.

Note that the length is not hidden. Note also that it is the caller's responsibility to ensure the uniqueness of nonces; for example, by using nonce 1 for the first message, nonce 2 for the second message, etc. Nonces are long enough that randomly generated nonces have negligible risk of collision.

Saltine does not make any promises regarding the resistance of crypto_stream to "related-key attacks." It is the caller's responsibility to use proper key-derivation functions.

Crypto.Saltine.Core.Stream is crypto_stream_xsalsa20, a particular cipher specified in "Cryptography in NaCl" (http://nacl.cr.yp.to/valid.html), Section 7. This cipher is conjectured to meet the standard notion of unpredictability. This is version 2010.08.30 of the stream.html web page.
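A minimal usage sketch, assuming Crypto.Saltine.Core.Stream exports newKey, newNonce, and xor :: Key -> Nonce -> ByteString -> ByteString, and that sodiumInit from Crypto.Saltine must be called first; check the installed version for the exact signatures:

import Crypto.Saltine (sodiumInit)
import Crypto.Saltine.Core.Stream (newKey, newNonce, xor)
import qualified Data.ByteString.Char8 as B

main :: IO ()
main = do
  sodiumInit                                    -- initialize libsodium (assumed entry point)
  key   <- newKey
  nonce <- newNonce                             -- must be unique per message under this key
  let ciphertext = xor key nonce (B.pack "attack at dawn")
      recovered  = xor key nonce ciphertext     -- xor with the same key and nonce decrypts
  print (recovered == B.pack "attack at dawn")  -- True
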
Parse .xlsx sheets in constant memory. All actions on an xlsx file run inside the XlsxM monad, and must be run with runXlsxM. XlsxM is not a monad transformer, a design inherited from the "zip" package's ZipArchive monad. Inside the XlsxM monad, you can stream SheetItems (rows) from a particular sheet using readSheetByIndex, which is callback-based and tied to IO.
Internal stream-related functions. These are exported only so that they can be tested; users are not expected to need them.