
Operations for creating and interacting with sub-processes.
A collection of FFI declarations for interfacing with Win32.
A module adapting the functions from System.Process to work with io-streams.
This module intends to make the operations of System.Posix.Process available on all platforms.
This module provides functions to run operating system processes as stream producers, consumers, or stream transformation functions. OS processes can thus be used in the same way as Haskell functions, and all the streaming combinators in streamly can be used to combine them. This lets you seamlessly integrate external binary executables into your Haskell programs. However, whenever possible we recommend native Haskell functions running on Streamly threads over system processes: threads offer a simpler programming model, and processes carry a larger performance overhead. Prefer the Streamly.System.Command module, a higher-level wrapper over this module.
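
The examples below assume a GHCi setup along the following lines. The module locations reflect recent streamly, streamly-core, and streamly-process releases and are an assumption here; adjust the imports to the versions you have installed (in particular, Dir.readFiles, used in the concurrency example, lives in an internal module and may move).

>>> :set -XScopedTypeVariables
>>> import Data.Char (toUpper)
>>> import Data.Function ((&))
>>> import qualified Streamly.Console.Stdio as Stdio
>>> import qualified Streamly.Data.Array as Array
>>> import qualified Streamly.Data.Fold as Fold
>>> import qualified Streamly.Data.Stream.Prelude as Stream
>>> import qualified Streamly.Internal.FileSystem.Dir as Dir
>>> import qualified Streamly.System.Process as Process
>>> import qualified Streamly.Unicode.Stream as Unicode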

Executables as functions

Processes can be composed into a streaming pipeline just like a POSIX shell command pipeline. Moreover, processes and Haskell functions can be mixed seamlessly in the same pipeline. For example:
>>> :{
Process.toBytes "echo" ["hello world"]
& Process.pipeBytes "tr" ["[a-z]", "[A-Z]"]
& Stream.fold Stdio.write
:}
HELLO WORLD
Of course, you can use a Haskell function instead of "tr":
>>> :{
Process.toBytes "echo" ["hello world"]
& Unicode.decodeLatin1 & fmap toUpper & Unicode.encodeLatin1
& Stream.fold Stdio.write
:}
HELLO WORLD

Shell commands as functions

Using a shell as the command interpreter, we can use shell commands in a data processing pipeline:
>>> :{
Process.toBytes "sh" ["-c", "echo hello | tr [a-z] [A-Z]"]
& Stream.fold Stdio.write
:}
HELLO
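
The Streamly.System.Command module recommended above works with whole command strings rather than a program name plus an argument list. If your streamly-process version provides it, the same pipeline might be written along the following lines; the string-accepting toBytes and pipeBytes used here are an assumption, so check that module's documentation for the exact API:

>>> import qualified Streamly.System.Command as Command
>>> :{
   Command.toBytes "echo hello"
 & Command.pipeBytes "tr [a-z] [A-Z]"
 & Stream.fold Stdio.write
:}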

Running Commands Concurrently

We can run executables or commands concurrently just as we would run any other function in Streamly. For example, the following program runs grep concurrently on all the files in the current directory, searching for a fixed pattern:
>>> :{
   grep file =
      Process.toBytes "grep" ["-H", "pattern", file]
    & Stream.handle (\(_ :: Process.ProcessFailure) -> Stream.nil)
    & Stream.foldMany (Fold.takeEndBy (== 10) Array.write)
:}
>>> :{
   pgrep =
      Dir.readFiles "."
    & Stream.parConcatMap id grep
    & Stream.fold Stdio.writeChunks
:}
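
For reference, the two GHCi fragments above can be combined into a standalone program. This is a sketch under the same module-location assumptions as the setup block above, not code taken from the package:

{-# LANGUAGE ScopedTypeVariables #-}
module Main (main) where

import Data.Function ((&))
import qualified Streamly.Console.Stdio as Stdio
import qualified Streamly.Data.Array as Array
import qualified Streamly.Data.Fold as Fold
import qualified Streamly.Data.Stream.Prelude as Stream
import qualified Streamly.Internal.FileSystem.Dir as Dir
import qualified Streamly.System.Process as Process

-- Run one grep process per file in the current directory, concurrently,
-- and write the matching lines to stdout.
main :: IO ()
main =
      Dir.readFiles "."
    & Stream.parConcatMap id grep
    & Stream.fold Stdio.writeChunks
  where
    -- grep a fixed pattern in one file; a failing exit status (e.g. no
    -- match) is mapped to an empty stream, and the output bytes are
    -- grouped into newline-terminated chunks (byte 10 is '\n').
    grep file =
          Process.toBytes "grep" ["-H", "pattern", file]
        & Stream.handle (\(_ :: Process.ProcessFailure) -> Stream.nil)
        & Stream.foldMany (Fold.takeEndBy (== 10) Array.write)

Build it in a project that depends on streamly-core, streamly, and streamly-process.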

Experimental APIs

See Streamly.Internal.System.Process for unreleased functions.

A handle to a process, which can be used to wait for its termination using waitForProcess. None of the process-creation functions in this library wait for termination: they all return a ProcessHandle, which may be used to wait for the process later. On Windows, a second wait method can be used to block until event completion; this requires two handles: a process job handle and an events handle to monitor.
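
As a minimal illustration of that API from the process package (assuming a POSIX sleep executable is on the PATH), spawnProcess returns a ProcessHandle without waiting, and waitForProcess then blocks until the process exits:

>>> import System.Process (spawnProcess, waitForProcess)
>>> ph <- spawnProcess "sleep" ["1"]
>>> waitForProcess ph
ExitSuccess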