iterate package:streamly-core
Generate an infinite stream with x as the first element and each successive element derived by applying the function f to the previous element.
>>> Stream.toList $ Stream.take 5 $ Stream.iterate (+1) 1
[1,2,3,4,5]
Keep running the same consumer over and over again on the input,
feeding the output of the previous run to the next.
Internal
>>> iterate f x = x `StreamK.cons` iterate f x
Generate an infinite stream with x as the first element and each successive element derived by applying the function f to the previous element.
>>> StreamK.toList $ StreamK.take 5 $ StreamK.iterate (+1) 1
[1,2,3,4,5]
Generate an infinite stream with the first element generated by the action m and each successive element derived by applying the monadic function f to the previous element.
>>> :{
Stream.iterateM (\x -> print x >> return (x + 1)) (return 0)
& Stream.take 3
& Stream.toList
:}
0
1
[0,1,2]
Generates an infinite stream starting with the given seed and applying
the given function repeatedly.
>>> iterateM f m = m >>= \a -> return a `StreamK.consM` iterateM f (f a)
Generate an infinite stream with the first element generated by the action m and each successive element derived by applying the monadic function f to the previous element.
>>> :{
StreamK.iterateM (\x -> print x >> return (x + 1)) (return 0)
& StreamK.take 3
& StreamK.toList
:}
0
1
[0,1,2]
Same as concatIterateBfs except that the traversal of the last element on a level is emitted first, proceeding backwards to the first element (reversed ordering). This may be slightly faster than concatIterateBfs.
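A minimal usage sketch, assuming this reversed variant is exposed as Stream.altBfsConcatIterate and takes the same arguments as bfsConcatIterate; the generator mirrors the directory listing example used for bfsConcatIterate:
>>> f = either (Just . Dir.readEitherPaths id) (const Nothing)
>>> input = Stream.fromEffect (Left <$> Path.fromString ".")
>>> ls = Stream.altBfsConcatIterate f input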
Like bfsUnfoldIterate but processes the children in reverse order and may therefore be slightly faster.
Pre-release
Similar to
concatIterate except that it traverses the stream in
breadth first style (BFS). First, all the elements in the input stream
are emitted, and then their traversals are emitted.
Example: list a directory tree using BFS:
>>> f = either (Just . Dir.readEitherPaths id) (const Nothing)
>>> input = Stream.fromEffect (Left <$> Path.fromString ".")
>>> ls = Stream.bfsConcatIterate f input
Pre-release
N-ary BFS style iterative fold: if the input stream finishes before the fold then it returns Left, otherwise Right. If the fold returns Left we terminate.
Unimplemented
Binary BFS style reduce: folds a level entirely using the supplied fold function, collecting the outputs as the next level of the tree, then repeats the same process on the next level. The last elements of a previously folded level are folded first.
Like
unfoldIterate but uses breadth first style traversal.
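A minimal sketch, assuming this is available as Stream.bfsUnfoldIterate and takes an Unfold just like unfoldIterate; the hypothetical generator expands a number into two children while it is below 4, giving a level-order result:
>>> u = Unfold.lmap (\x -> if x < 4 then [2*x, 2*x+1] else []) Unfold.fromList
>>> Stream.toList $ Stream.bfsUnfoldIterate u (Stream.fromPure 1)
[1,2,3,4,5,6,7]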
Pre-release
Traverse the stream in depth first style (DFS). Map each element in the input stream to a stream and flatten it, recursively mapping the resulting elements to streams and flattening as well, until no more streams are generated.
Example: list a directory tree using DFS:
>>> f = either (Just . Dir.readEitherPaths id) (const Nothing)
>>> input = Stream.fromEffect (Left <$> Path.fromString ".")
>>> ls = Stream.concatIterate f input
This is equivalent to using
concatIterateWith StreamK.append.
Pre-release
Deprecated: Please use bfsConcatIterate instead.
Deprecated: Please use altBfsConcatIterate instead.
Deprecated: Please use concatIterate instead.
Generate a stream from an initial state, scan and concat the stream, then generate a stream again from the final state of the previous scan, and repeat the process.
Iterate a fold generator on a stream. The initial value b is used to generate the first fold; the fold is applied to the stream and its result is used to generate the next fold, and so on.
Usage:
>>> import Data.Monoid (Sum(..))
>>> f x = return (Fold.take 2 (Fold.sconcat x))
>>> s = fmap Sum $ Stream.fromList [1..10]
>>> Stream.fold Fold.toList $ fmap getSum $ Stream.foldIterateM f (pure 0) s
[3,10,21,36,55,55]
This is the streaming equivalent of monad-like sequenced application of folds, where the next fold depends on the result of the previous fold.
Pre-release
Iterate a parser generating function on a stream. The initial value b is used to generate the first parser; the parser is applied to the stream and the result is used to generate the next parser, and so on.
Example:
>>> import Data.Monoid (Sum(..))
>>> s = Stream.fromList [1..10]
>>> Stream.toList $ fmap getSum $ Stream.catRights $ Stream.parseIterate (\b -> Parser.takeBetween 0 2 (Fold.sconcat b)) (Sum 0) $ fmap Sum s
[3,10,21,36,55,55]
This is the streaming equivalent of monad-like sequenced application of parsers, where the next parser depends on the result of the previous parser.
Pre-release
Deprecated: Please use parseIterate instead.
Like
parseIterate but includes stream position information in
the error messages.
Deprecated: Please use bfsReduceIterate instead.
Like
foldIterateM but using the
Refold type instead.
This could be much more efficient due to stream fusion.
Internal
Same as
concatIterate but more efficient due to stream fusion.
Example: list a directory tree using DFS:
>>> f = Unfold.either (Dir.eitherReaderPaths id) Unfold.nil
>>> input = Stream.fromEffect (Left <$> Path.fromString ".")
>>> ls = Stream.unfoldIterate f input
Pre-release