Concurrency in .NET
Modern patterns of concurrent and parallel programming
Riccardo Terrell
  • June 2018
  • ISBN 9781617292996
  • 568 pages
  • printed in black & white

A complementary source of knowledge about modern concurrent functional programming on the .NET platform—an absolute must-read.

Pawel Klimczyk, Microsoft MVP

Concurrency in .NET teaches you how to build concurrent and scalable programs in .NET using the functional paradigm. This intermediate-level guide is aimed at developers, architects, and passionate computer programmers who are interested in writing code with improved speed and effectiveness by adopting a declarative and pain-free programming style.

Table of Contents

Part 1: Functional Concurrent Programming Concepts

1. Functional Concurrency Foundations

1.1. What you will learn from this book

1.2. Let’s start with terminology

1.2.1. Sequential programming performs one task at a time

1.2.2. Concurrent programming runs multiple tasks at the same time

1.2.3. Parallel programming executes multiple tasks simultaneously

1.2.4. Multitasking performs multiple tasks concurrently over time

1.2.5. Multithreading for performance improvement

1.3. Why the need for concurrency?

1.3.1. Present and future of concurrent programming

1.4. The pitfalls of concurrent programming

1.4.1. Concurrency hazards

1.4.2. The sharing of state evolution

1.4.3. A simple real-world example: parallel quicksort

1.4.4. Benchmarking in F#

1.5. Why choose functional programming for concurrency

1.5.1. Benefits of functional programming

1.6. Embracing the functional paradigm

1.7. Why use F# and C# for functional concurrent programming

2. Functional Programming Techniques for Concurrency

2.1. Using function composition to solve complex problems with simple solutions

2.1.1. Function composition in C#

2.1.2. Function composition in F#

2.2. Closure to simplify functional thinking

2.2.1. Captured variables in closures with lambda expressions

2.2.2. Closure in a multithreading environment

2.2.3. Memoization: a caching technique for program speedup

2.2.4. Memoize in action for a fast Web-Crawler

2.2.5. Lazy memoization for better performance

2.2.6. Gotchas for function memoization

2.3. Effective concurrent speculation to amortize the cost of expensive computations

2.3.1. Precomputation with natural functional support

2.3.2. Let the best computation win

2.4. Being lazy is a good thing

2.4.1. Strict programming languages for a better understanding of concurrent behaviors

2.4.2. Lazy caching technique and thread-safe singleton pattern

2.4.3. Lazy support in F#

2.4.4. Lazy and Task, a powerful combination

3. Functional Data Structures and Immutability

3.1. Real-world example: Hunt the thread-unsafe object

3.1.1. .NET immutable collections: a safe solution

3.1.2. The .NET concurrent collections: a faster solution

3.1.3. The Agent message passing pattern: a faster and better solution

3.2. Functional data structures can be shared safely among threads

3.3. Immutability for a change

3.3.1. Functional data structure for data parallelism

3.3.2. Performance implication behind using immutability

3.3.3. Immutability in C#

3.3.4. Immutability in F#

3.3.5. Functional lists: linking cells in a chain

3.3.6. Building a persistent data structure: an immutable binary tree

3.4. Recursive function: a natural way to iterate

3.4.1. The tail of a correct recursive function: Tail-Call optimized

3.4.2. Continuation passing style (CPS) to optimize recursive functions

Part 2: How to approach different parts of a concurrent program

4. The Basics of Processing Big Data: Data Parallelism Part 1

4.1. What is data parallelism?

4.1.1. Data and task parallelism

4.1.2. The "embarrassingly parallel" concept

4.1.3. Data parallelism support in .NET

4.2. The Fork/Join pattern: Parallel Mandelbrot

4.2.1. When the garbage collector is the bottleneck: struct vs. class objects

4.2.2. The downside of parallel loops

4.3. How to measure performance speed

4.3.1. Amdahl’s Law defines the limit of performance improvement

4.3.2. Gustafson’s Law: a step further to measure performance improvement

4.3.3. The limitations of parallel loops: the sum of prime numbers

4.3.4. What can possibly go wrong with a simple loop?

4.3.5. The declarative parallel programming model

5. PLINQ and Map-Reduce: Data Parallelism Part 2

5.1. A short introduction to PLINQ

5.1.1. How is PLINQ more functional?

5.1.2. PLINQ and pure functions: the parallel words counter

5.1.3. Avoiding side effects with pure functions

5.1.4. Isolate and control side effects: refactoring the parallel words counter

5.2. Aggregating and reducing data in parallel

5.2.1. Deforesting: one of many advantages to folding

5.2.2. Fold in PLINQ: the Aggregate functions

5.2.3. Implementing a parallel Reduce function for PLINQ

5.2.4. Parallel list comprehension in F#: PSeq

5.2.5. Parallel array in F#

5.3. Parallel MapReduce pattern

5.3.1. The Map and Reduce functions

6. Real-Time Event Streams: Functional Reactive Programming

6.1. What is Reactive programming: Big Event processing

6.2. .NET tools for Reactive programming

6.2.1. Event combinators: a better solution

6.2.2. .NET interoperability with F# combinators

6.3. Reactive programming in .NET: Reactive Extensions (Rx)

6.3.1. From LINQ/PLINQ to Reactive Extensions

6.3.2. IObservable: the dual of IEnumerable

6.3.3. Reactive Extensions in action

6.3.4. Real-time streaming with Reactive Extensions

6.3.5. From events to F# observables

6.4. Taming the event stream: Twitter emotion analysis using Reactive Extensions programming

6.4.1. SelectMany: the monadic bind operator

6.5. An Rx publisher-subscriber

6.5.1. The Subject type for a powerful Publisher-Subscriber hub

6.5.2. Reactive Extensions in relation to concurrency

6.5.3. Implementing a reusable Rx Publisher-Subscriber

6.5.4. Analyzing tweet emotions using an Rx Pub-Sub class

6.5.5. Observer in action

6.5.6. The convenient F# object expression

7. Task-Based Functional Parallelism

7.1. A short introduction to task parallelism

7.1.1. Why task parallelism and functional programming?

7.1.2. Task parallelism support in .NET

7.2. The .NET Task Parallel Library (TPL)

7.2.1. Running operations in parallel with TPL Parallel.Invoke

7.3. The problem of void in C#

7.3.1. The solution for void in C#: the Unit type

7.4. Continuation-passing style: a functional control flow

7.4.1. Why exploit CPS?

7.4.2. Waiting for a task to complete: the continuation model

7.5. Strategies for composing task operations

7.5.1. Mathematical pattern for better composition

7.5.2. Guidelines for using tasks

7.6. The parallel functional pipeline pattern

8. Task Asynchronicity for the win

8.1. The Asynchronous Programming Model (APM)

8.1.1. The value of asynchronous programming

8.1.2. Scalability and Asynchronous Programming

8.1.3. CPU-bound and I/O-bound operations

8.2. Unbounded parallelism with asynchronous programming

8.3. Asynchronous support in .NET

8.3.1. Asynchronous programming breaks the code structure

8.3.2. Event-based asynchronous programming

8.4. The C# Task-based Asynchronous Programming model (Async/Await)

8.4.1. Anonymous asynchronous lambdas

8.4.2. Task<T> is a monadic container

8.5. TAP: a case study

8.5.1. Asynchronous cancellation

8.5.2. Task-based asynchronous composition with the monadic Bind operator

8.5.3. Deferring asynchronous computation enables composition

8.5.4. Retry if something goes wrong

8.5.5. Handling errors in asynchronous operations

8.5.6. Asynchronous parallel processing of historical stock market data

8.5.7. Asynchronous stock market parallel processing as tasks complete

9. Asynchronous functional programming in F#

9.1. Asynchronous functional aspect

9.2. What is the F# asynchronous workflow?

9.2.1. The continuation passing style in computation expressions

9.2.2. The asynchronous workflow in action: Azure Blob storage parallel operations

9.3. Asynchronous Computation Expressions

9.3.1. Difference between computation expressions and monads

9.3.2. AsyncRetry: building your own computation expression

9.3.3. Extending the asynchronous workflow

9.3.4. Mapping asynchronous operations: the Async.map functor

9.3.5. Parallelize asynchronous workflows: Async.Parallel

9.3.6. Asynchronous workflow cancellation support

9.3.7. Taming parallel asynchronous operations

10. Functional combinators for fluent concurrent programming

10.1. The execution flow is not always on the "happy path": Error handling

10.1.1. The problem of error handling in imperative programming

10.2. Error combinators: Retry, Fallback, Otherwise, and Task.Catch in C#

10.2.1. Error handling in functional programming: exceptions for flow control

10.2.2. Handling errors with the Task<Option<T>> type in C#

10.2.3. The AsyncOption type in F#: Combining the Async and Option types

10.2.4. Idiomatic F# functional asynchronous error handling: Async.Catch

10.2.5. Preserving exception semantics with the Result type

10.3. Taming exceptions in asynchronous operations

10.3.1. Modelling error handling in F# with the Async and Result types

10.3.2. Extending the F# AsyncResult type with monadic bind operators

10.4. Abstracting operations with functional combinators

10.5. Functional combinators in a nutshell

10.5.1. The Task Parallel Library’s built-in asynchronous combinators

10.5.2. Exploiting the Task.WhenAny combinator for redundancy and interleaving

10.5.3. Exploiting the Task.WhenAll combinator for asynchronous-ForEach

10.5.4. Mathematical pattern review: Brief analysis of what you have seen so far

10.6. The ultimate parallel composition: Applicative Functor

10.6.1. Extending the F# async workflow with applicative functor operators

10.6.2. Applicative functor semantics in F# with infix operators

10.6.3. Exploiting heterogeneous parallel computation with Applicative

10.6.4. Composing and executing heterogeneous parallel computations

10.6.5. Controlling the flow with conditional asynchronous combinators

10.6.6. Asynchronous combinators in action

11. Applying reactive programming everywhere with Agents

11.1. What is reactive programming, and how is it useful?

11.2. The asynchronous message-passing programming model

11.2.1. The relationship between message passing and immutability

11.2.2. Natural isolation

11.3. What is an agent?

11.3.1. The components of an agent

11.3.2. What an agent can do

11.3.3. The share-nothing approach for lock-free concurrent programming

11.3.4. How is agent-based programming functional?

11.3.5. Agent is object-oriented

11.4. The F# agent: MailboxProcessor

11.4.1. The mailbox asynchronous recursive loop

11.5. Avoiding database bottlenecks with F# MailboxProcessor

11.5.1. The MailboxProcessor message type: discriminated unions

11.5.2. The MailboxProcessor two-way communication

11.5.3. Consuming the AgentSQL from C#

11.5.4. Parallelize the workflow with group coordination of agents

11.5.5. How to handle errors with F# MailboxProcessor

11.5.6. Stop, cancel, and dispose MailboxProcessor: CancellationToken

11.5.7. Distributing the work with MailboxProcessor

11.5.8. Caching operations with an agent

11.5.9. Reporting results from a MailboxProcessor

11.5.10. Using the ThreadPool to report events from MailboxProcessor

11.6. The F# MailboxProcessor: 10,000 agents for a Game of Life

12. Parallel workflow and agents programming with TPL DataFlow

12.1. The power of TPL DataFlow

12.2. Designed to compose - TPL DataFlow blocks

12.2.1. Using the BufferBlock<TInput> as a FIFO buffer

12.2.2. Transforming your data with the TransformBlock<TInput, TOutput>

12.2.3. Completing the work with ActionBlock<TInput>

12.2.4. Linking DataFlow blocks

12.3. Implementing a sophisticated producer/consumer with TDF

12.3.1. A multiple-producer/single-consumer pattern: TPL DataFlow

12.3.2. A single-producer/multiple-consumers pattern: TPL DataFlow

12.4. Enabling agent model in C# using TPL DataFlow

12.4.1. Agent fold-over state and messages: Aggregate

12.4.2. Agents interaction: A parallel word counter

12.5. A parallel workflow to compress and encrypt a large stream

12.5.1. Context: the problem of processing a large stream of data

12.5.2. Ensuring the order integrity of a stream of messages

12.5.3. Linking, propagating, and completion

12.5.4. Rules for building a DataFlow workflow

12.5.5. Reactive Extensions (Rx) and TPL DataFlow meshes

Part 3: Modern patterns of concurrent programming applied

13. Recipes and design patterns for successful concurrent programming

13.1. Asynchronous Object-Pool for recycling objects to reduce memory consumption

13.1.1. Solution: asynchronously recycling a pool of objects

13.2. Custom parallel Fork/Join operator

13.2.1. Solution: composing a pipeline of steps forming the Fork/Join pattern

13.3. Parallelizing Tasks with dependencies - Design your code to optimize performance

13.3.1. Solution: Implementing a dependency graph of tasks using the F# MailboxProcessor, with the methods exposed as standard Tasks so they can also be consumed from C#

13.4. Gate for coordinating concurrent I/O operations sharing resources: One write, multiple reads

13.4.1. Solution: Leveraging agent programming model to access and apply multiple read/write operations to shared thread-safe resources

13.5. Thread-Safe Random number generator

13.5.1. Solution: Using the Thread-Local object to guarantee a unique and isolated object instance for each thread accessor

13.6. Polymorphic Event Aggregator

13.6.1. Solution: Implementing a polymorphic publisher-subscriber pattern using the Reactive Extensions Subject type

13.7. Custom Rx Scheduler to control the degree of parallelism

13.7.1. Solution: Implementing a Reactive Extensions scheduler with multiple concurrent agents to control the degree of parallelism

13.8. Concurrent Reactive Scalable Server/Client

13.8.1. Solution: Combining Rx and asynchronous programming

13.9. Reusable custom high-performance parallel Filter-Map operator

13.9.1. Solution: Combining filter and map parallel operations

13.10. Non-blocking synchronous message passing model

13.10.1. Solution: Coordinating and balancing the payload between read and write operations using the agent programming model

13.11. Coordinating concurrent jobs using agent programming model

13.11.1. Solution: Implementing an agent that coordinates and concurrently runs a set of jobs with a configured degree of parallelism.

13.12. Composing monadic functions with the Kleisli operator

13.12.1. Solution: Effortlessly combining asynchronous operations using the Kleisli composition operator

14. Building a scalable mobile app with concurrent functional programming

14.1. Functional programming on the server in the real world

14.2. How to design a successful performant application

14.2.1. The secret sauce: ACD

14.2.2. A different asynchronous pattern: queuing work for later execution

14.3. Choosing the right concurrent programming model

14.3.1. Real-time communication with SignalR

14.4. Real-time trading: stock market high-level architecture

14.5. Essential elements for the stock market application

14.6. Let’s code the stock market trading application

14.6.1. Benchmark to measure the scalability of the Stock-Ticker application

Appendixes

Appendix A: Functional programming

A.1. What is Functional Programming?

A.1.1. The benefits of functional programming

A.1.2. The tenets of functional programming

A.1.3. The clash of programming paradigms: from imperative to object-oriented to functional programming

A.1.4. Higher-order functions for raising the abstraction

A.1.5. Higher-order functions and lambda expressions for code reusability

A.1.6. Lambda expressions and anonymous functions

A.1.7. Currying

A.1.8. Partially applied functions

A.1.9. Partial application benefits

A.1.10. The power of partial function application and currying in C#

Appendix B: F# overview

B.1. The "let" Binding

B.2. Understanding function signatures in F#

B.3. Create mutable types: mutable and ref

B.4. Functions as first-class types

B.5. Composition: pipe and composition operators

B.6. Delegates

B.7. Comments

B.8. Open statements

B.9. Basic data types

B.10. Special String definition

B.11. Tuple

B.12. Record-Types

B.13. Discriminated Unions

B.14. Pattern matching

B.15. Active patterns

B.16. Collections

B.17. Arrays

B.18. Sequences (seq)

B.19. Lists

B.20. Sets

B.21. Maps

B.22. Loops

B.23. Class and inheritance

B.24. Abstract classes and inheritance

B.25. Interfaces

B.26. Object expressions

B.27. Casting

B.28. Units of Measure

B.29. Event Module API reference

B.30. Resources

Appendix C: Interoperability between F# asynchronous workflow and .NET Task

C.1. Interoperability between F# asynchronous workflow and .NET Task

About the Technology

Unlock the incredible performance built into your multi-processor machines. Concurrent applications run faster because they spread work across processor cores, performing several tasks at the same time. Modern tools and techniques on the .NET platform, including parallel LINQ, functional programming, asynchronous programming, and the Task Parallel Library, offer powerful alternatives to traditional thread-based concurrency.
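As a minimal sketch of the declarative style these tools enable (the workload and class name here are illustrative assumptions, not examples from the book), PLINQ’s `AsParallel()` turns an ordinary LINQ query into one that partitions its source across processor cores, with no explicit threads or locks:

```csharp
using System;
using System.Linq;

public static class PlinqSketch
{
    // Hypothetical workload: sum the squares of the first million integers.
    // AsParallel() partitions the range across cores; Sum() merges the
    // per-partition subtotals back into a single result.
    public static long SumOfSquares() =>
        Enumerable.Range(1, 1_000_000)
                  .AsParallel()
                  .Select(n => (long)n * n)
                  .Sum();

    public static void Main() => Console.WriteLine(SumOfSquares());
}
```

Because the `Select` projection is a pure function, the query yields the same total no matter how the runtime schedules the partitions — the property that makes the functional approach a natural fit for parallelism.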

About the book

Concurrency in .NET teaches you to write code that delivers the speed you need for performance-sensitive applications. Featuring examples in both C# and F#, this book guides you through concurrent and parallel designs that emphasize functional programming in theory and practice. You’ll start with the foundations of concurrency and master essential techniques and design practices to optimize code running on modern multiprocessor systems.

What's inside

  • The most important concurrency abstractions
  • Employing the agent programming model
  • Implementing real-time event-stream processing
  • Executing unbounded asynchronous operations
  • Best concurrent practices and patterns that apply to all platforms
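The asynchronous bullet above can be illustrated with the standard `Task.WhenAll` combinator (the delays and class name are illustrative assumptions, not taken from the book): independent operations are started together, and a single await gathers all their results in source order:

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

public static class WhenAllSketch
{
    // Three hypothetical independent asynchronous jobs started concurrently.
    public static async Task<int[]> RunAsync()
    {
        Task<int>[] jobs = Enumerable.Range(1, 3)
            .Select(async n =>
            {
                await Task.Delay(10 * n); // stand-in for real I/O work
                return n * n;
            })
            .ToArray();

        // WhenAll returns the results in source order,
        // even though the tasks may finish in any order.
        return await Task.WhenAll(jobs);
    }

    public static async Task Main() =>
        Console.WriteLine(string.Join(",", await RunAsync())); // prints 1,4,9
}
```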

About the reader

For readers skilled with C# or F#.

About the author

Riccardo Terrell is a seasoned software engineer and Microsoft MVP who is passionate about functional programming. He has over 20 years’ experience delivering cost-effective technology solutions in a competitive business environment.

Riccardo is well known and actively involved in the functional programming community, including .NET meetups and international conferences. He believes in multi-paradigm programming as a mechanism to maximize the power of code. You can keep up with Riccardo and his coding adventures on his blog www.rickyterrell.com.



Not just for those cutting code on Windows. You can use the gold dust in this book on any platform!

Kevin Orr, Sumus Solutions

Presents real-world problems and offers different kinds of concurrency to solve them.

Andy Kirsch, Rally Health

Easiest entry into concurrency I’ve come across so far!

Anton Herzog, AFMG Technologies