In computer science, concurrency is a property of systems in which multiple computations can be performed in overlapping time periods. The computations may be executing on multiple cores in the same chip, preemptively time-shared threads on the same processor, or executed on physically separated processors.

- Wikipedia

Modern Internet-based applications usually have multiple users viewing and updating data simultaneously. This requires application developers to think carefully about how to provide a predictable experience to their end users, particularly for scenarios where multiple users can update the same data.
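One common way to keep concurrent updates predictable is optimistic concurrency control: each reader remembers a version number, and a write is rejected if the record changed since it was read. The sketch below is a minimal, hypothetical illustration (the `Account` type, its fields, and `tryDeposit` are invented for this example, not taken from any particular framework):

```go
package main

import (
	"fmt"
	"sync"
)

// Account is a hypothetical record. The version field supports optimistic
// concurrency: an update only succeeds if the version the caller read is
// still current when the write is attempted.
type Account struct {
	mu      sync.Mutex
	version int
	balance int
}

// tryDeposit re-checks the version under the lock and rejects the write
// if another update happened since readVersion was observed.
func (a *Account) tryDeposit(readVersion, amount int) bool {
	a.mu.Lock()
	defer a.mu.Unlock()
	if a.version != readVersion {
		return false // stale read: caller must re-read and retry
	}
	a.balance += amount
	a.version++
	return true
}

func main() {
	acct := &Account{}
	v := acct.version            // user A reads version 0
	acct.tryDeposit(v, 50)       // user A's update succeeds, version -> 1
	ok := acct.tryDeposit(v, 25) // user B still holds version 0: rejected
	fmt.Println(acct.balance, ok) // 50 false
}
```

The rejected caller would typically re-read the record and retry, which is exactly the user-visible "someone else edited this, please reload" experience many web applications choose over silently overwriting data.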

Computing systems model the world, and things happen at the same time in the world. These parallel executions have to be composed and coordinated, and that's where the study of concurrency comes in.

Seasoned programmers are familiar with concurrency building blocks like mutexes, semaphores, and condition variables. But what makes them work? How do we write concurrent code when we can’t use them, like when we’re working below the operating system in an embedded environment, or when we can’t block due to hard time constraints?
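When blocking is off the table, the usual answer is lock-free code built on atomic primitives, chiefly compare-and-swap (CAS). As a rough sketch of the idea (the function name and worker count here are invented for illustration), each writer reads the current value and only commits its update if nobody else wrote in between, retrying otherwise:

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// atomicIncrementAll launches `workers` goroutines that each add 1 to a
// shared counter using a compare-and-swap loop: no mutex, no blocking.
func atomicIncrementAll(workers int) int64 {
	var counter int64
	var wg sync.WaitGroup
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for {
				old := atomic.LoadInt64(&counter)
				// CAS succeeds only if counter still equals old;
				// otherwise another goroutine won the race, so re-read
				// and retry.
				if atomic.CompareAndSwapInt64(&counter, old, old+1) {
					return
				}
			}
		}()
	}
	wg.Wait()
	return counter
}

func main() {
	fmt.Println(atomicIncrementAll(1000)) // always 1000, despite the contention
}
```

No goroutine ever sleeps holding a resource: a failed CAS just means one retry iteration, which is what makes this style usable where mutexes are unavailable or where unbounded blocking would violate timing constraints.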