Streamly.System.Process

This module provides functions to run operating system processes as stream producers, consumers, or stream transformation functions. OS processes can thus be used like ordinary Haskell functions and combined with all the streaming combinators in streamly, letting you seamlessly integrate external executables into your Haskell programs. However, whenever possible we recommend native Haskell functions with Streamly threads over system processes: they offer a simpler programming model and avoid the larger performance overhead of processes. Prefer the Streamly.System.Command module as a higher level wrapper over this module.
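
The examples below assume a GHCi setup along the following lines. The qualified module names are assumptions based on the streamly, streamly-core and streamly-process package layouts and may vary across versions; in particular, Dir.readFiles is assumed to come from the internal Streamly.Internal.FileSystem.Dir module.
>>> :set -XScopedTypeVariables
>>> import Data.Char (toUpper)
>>> import Data.Function ((&))
>>> import qualified Streamly.Console.Stdio as Stdio
>>> import qualified Streamly.Data.Array as Array
>>> import qualified Streamly.Data.Fold as Fold
>>> import qualified Streamly.Data.Stream.Prelude as Stream
>>> import qualified Streamly.Internal.FileSystem.Dir as Dir
>>> import qualified Streamly.System.Process as Process
>>> import qualified Streamly.Unicode.Stream as Unicode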

Executables as functions

Processes can be composed in a streaming pipeline just like a Posix shell command pipeline. Moreover, we can mix processes and Haskell functions seamlessly in a processing pipeline. For example:
>>> :{
Process.toBytes "echo" ["hello world"]
& Process.pipeBytes "tr" ["[a-z]", "[A-Z]"]
& Stream.fold Stdio.write
:}
HELLO WORLD
Of course, you can use a Haskell function instead of "tr":
>>> :{
Process.toBytes "echo" ["hello world"]
& Unicode.decodeLatin1 & fmap toUpper & Unicode.encodeLatin1
& Stream.fold Stdio.write
:}
HELLO WORLD
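
The same pipeline can be written more concisely with the higher level Streamly.System.Command module recommended above. A minimal sketch, assuming Streamly.System.Command is imported qualified as Command and that its toBytes and pipeBytes variants take the whole command, executable and arguments, as a single string:
>>> :{
   Command.toBytes "echo hello"
 & Command.pipeBytes "tr [a-z] [A-Z]"
 & Stream.fold Stdio.write
:}
HELLO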

Shell commands as functions

Using a shell as the command interpreter, we can run shell commands in a data processing pipeline:
>>> :{
Process.toBytes "sh" ["-c", "echo hello | tr [a-z] [A-Z]"]
& Stream.fold Stdio.write
:}
HELLO

Running Commands Concurrently

We can run executables or commands concurrently just as we would run any other functions in Streamly. For example, the following program greps for a pattern in all the files in the current directory, running the grep processes concurrently:
>>> :{
grep file =
Process.toBytes "grep" ["-H", "pattern", file]
& Stream.handle (\(_ :: Process.ProcessFailure) -> Stream.nil)
& Stream.foldMany (Fold.takeEndBy (== 10) Array.write)
:}
>>> :{
pgrep =
Dir.readFiles "."
& Stream.parConcatMap id grep
& Stream.fold Stdio.writeChunks
:}
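
The first argument of parConcatMap is a concurrency configuration modifier; id keeps the defaults. As a sketch of tuning it, assuming maxThreads is exported by the same module as parConcatMap (pgrepN is a hypothetical name), the following limits the pipeline to four concurrent grep processes:
>>> :{
pgrepN =
   Dir.readFiles "."
 & Stream.parConcatMap (Stream.maxThreads 4) grep
 & Stream.fold Stdio.writeChunks
:}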

Experimental APIs

See Streamly.Internal.System.Process for unreleased functions.