os211

Top 10 List of Week 07

This list is in no particular order, neither of importance nor chronology. It is written in the order in which I remembered the entries as I was writing this.

  1. On the complexity of deadlock detection in families of planar nets
    This week we’re starting off way too strong with a paper that describes, in mathematical language going way over my head, the complexity of deadlock detection. This paper, along with several others of its kind, seems to suggest that the core problem of deadlock detection is itself NP-hard or NP-complete; at the very least, it is for the model of parallel computing used here, represented by finite state machines. Interestingly, that didn’t stop researchers from devising heuristic mitigations as far back as the ancient age of the 1950s. It does, however, put into context just how justifiable “ignore deadlocks!” really is as a solution to the problem.

  2. Polynomial-Complexity Deadlock Avoidance Policies for Sequential Resource Allocation Systems
    This paper describes a set of deadlock avoidance policies for sequential resource allocation systems, or, as far as I can tell, what in layman’s terms would be an operating system. Once again it’s flying far above my head, but it is interesting to see that the problem remains manageable mathematically, even algebraically, considering the prior results. One funny note is that it’s catalogued under social policies instead of computer science.

  3. Efficient techniques for deadlock resolution in distributed systems
    This paper describes two polynomial-time heuristics for resolving deadlocks. It stands in contrast to the previous two because it tackles the problem head-on instead of going the route of avoidance. Moreover, it works from the fundamental structure of the wait-for graph rather than the incredibly complex models of the other papers I’ve seen; a tiny sketch of cycle detection on a wait-for graph is included below the list. That, however, still did not stop this paper from going miles over my head as well.

  4. POSIX Semaphores
    Sometime prior to this week’s class I had trouble understanding semaphores and ran headfirst into that confusion during the quiz. This link helped clear up some of the misconceptions I had. One strange yet interesting thing I realized after reading it was just how wildly stupid my misconception was: I had, for some reason, concluded that every procedure call inside a thread would itself execute asynchronously, rather than the single thread as a whole running asynchronously alongside every other process, which is what threads are supposed to do in the first place. A minimal POSIX semaphore sketch is included below the list.

  5. Banker’s Algorithm – RosettaCode
    This RosettaCode entry includes an excellent explanation of the Banker’s Algorithm alongside the usual implementations. One entry of particular interest, aside from C’s (the main player in systems programming languages), is Go’s, Google’s multithreading poster child. A sketch of the algorithm’s safety check is included below the list.

  6. The Dining Philosophers Problem
    This is another link that I found while reading up on deadlocks during last week’s classes. It illustrates the problem quite well, with several code snippets throughout to really visualize the condition. A small pthreads sketch of the problem is also included below the list.

  7. On the probability of deadlock in computer systems
    This is an interesting article discussing the probability of deadlocks from a rigorous perspective. As usually presented, the argument for the Ostrich algorithm is that deadlocks are so uncommon as to be completely ignorable. This article takes that argument at its word and attempts to actually calculate the probability.

  8. Deadlocks – Oracle Docs
    Once again we take a short detour from the domain of operating systems, this time to databases. During my readings I kept seeing the idea of “database deadlocks” floating around, and I finally found a link that explains it. It’s interesting how intuitively the concept carries over, considering databases also deal mostly in resources.

  9. Livelocks
    This is yet another topic that I came across while surfing through Google but didn’t see anywhere in the actual course material. It’s pretty interesting because, although the name suggests the opposite of a deadlock, it’s really much the same thing: an issue of resource allocation. Interestingly, the issue arises from the limits imposed on how the resources may be used rather than from physical resource exhaustion. A small sketch of a livelock is included below the list.

  10. The Sleeping Barber Problem
    Now we end this week’s list with another well-known and interesting problem in interprocess synchronization, the sleeping barber problem. The link covers the problem as well as its solution using semaphores, going a step further by clearing up misconceptions about the difference between mutexes and semaphores before guiding the reader step by step through the solution. An excellent and interesting read, and a very accessible one at that; a compressed sketch of the semaphore solution closes out the list below.
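
To make a few of these topics more concrete, here are some small, self-contained sketches in C. First, for entry 3: detecting a deadlock on a wait-for graph ultimately boils down to finding a cycle. This is only a minimal sketch, a depth-first search over a hard-coded adjacency matrix; the processes and the graph itself are made up for illustration, not taken from the paper.

```c
#include <stdio.h>
#include <stdbool.h>

#define N 4  /* number of processes */

/* wait_for[i][j] == true means process i is waiting for a resource held by j */
static bool wait_for[N][N] = {
    /* P0 waits for P1, P1 waits for P2, P2 waits for P0: a cycle, hence a deadlock */
    {false, true,  false, false},
    {false, false, true,  false},
    {true,  false, false, false},
    {false, false, false, false},
};

static bool visited[N], on_stack[N];

/* depth-first search; returns true if a cycle is reachable from process u */
static bool has_cycle_from(int u)
{
    visited[u] = on_stack[u] = true;
    for (int v = 0; v < N; v++) {
        if (!wait_for[u][v])
            continue;
        if (on_stack[v] || (!visited[v] && has_cycle_from(v)))
            return true;
    }
    on_stack[u] = false;
    return false;
}

int main(void)
{
    for (int i = 0; i < N; i++) {
        if (!visited[i] && has_cycle_from(i)) {
            printf("deadlock: a cycle in the wait-for graph involves P%d\n", i);
            return 0;
        }
    }
    printf("no cycle found, no deadlock\n");
    return 0;
}
```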
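
For entry 4, a minimal sketch of an unnamed POSIX semaphore used as a simple lock around a shared counter, just to show the sem_init/sem_wait/sem_post pairing; the thread and iteration counts are arbitrary.

```c
#include <pthread.h>
#include <semaphore.h>
#include <stdio.h>

#define NTHREADS 4
#define NITERS   100000

static sem_t sem;          /* binary semaphore guarding the counter */
static long counter = 0;   /* shared state the threads would otherwise race on */

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < NITERS; i++) {
        sem_wait(&sem);    /* down: enter the critical section */
        counter++;
        sem_post(&sem);    /* up: leave the critical section */
    }
    return NULL;
}

int main(void)
{
    pthread_t tid[NTHREADS];

    sem_init(&sem, 0, 1);  /* 0 = shared between threads only, initial value 1 */
    for (int i = 0; i < NTHREADS; i++)
        pthread_create(&tid[i], NULL, worker, NULL);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(tid[i], NULL);
    sem_destroy(&sem);

    /* with the semaphore in place this always prints 400000 */
    printf("counter = %ld\n", counter);
    return 0;
}
```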
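
For entry 5, the heart of the Banker’s Algorithm is its safety check: repeatedly look for a process whose remaining need fits within the currently available resources, pretend it runs to completion, and reclaim its allocation. The matrices below are a toy example made up for this sketch, not the data from the RosettaCode entry.

```c
#include <stdio.h>
#include <stdbool.h>

#define P 5  /* processes */
#define R 3  /* resource types */

/* a toy state: available resources, current allocation, and remaining need (max - allocation) */
static int available[R]     = {3, 3, 2};
static int allocation[P][R] = {{0,1,0}, {2,0,0}, {3,0,2}, {2,1,1}, {0,0,2}};
static int need[P][R]       = {{7,4,3}, {1,2,2}, {6,0,0}, {0,1,1}, {4,3,1}};

/* Returns true if the state is safe and prints one safe sequence. */
static bool is_safe(void)
{
    int work[R];
    bool finished[P] = {false};
    int done = 0;

    for (int r = 0; r < R; r++)
        work[r] = available[r];

    while (done < P) {
        bool progressed = false;
        for (int p = 0; p < P; p++) {
            if (finished[p])
                continue;
            bool fits = true;
            for (int r = 0; r < R; r++)
                if (need[p][r] > work[r]) { fits = false; break; }
            if (!fits)
                continue;
            /* pretend p runs to completion and releases everything it holds */
            for (int r = 0; r < R; r++)
                work[r] += allocation[p][r];
            finished[p] = true;
            progressed = true;
            done++;
            printf("P%d ", p);
        }
        if (!progressed) {
            printf("\n");
            return false;  /* nobody could finish with what is available: unsafe */
        }
    }
    printf("\n");
    return true;
}

int main(void)
{
    printf("safe sequence: ");
    printf(is_safe() ? "state is safe\n" : "state is unsafe\n");
    return 0;
}
```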
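
For entry 6, a small pthreads sketch of the dining philosophers. The naive version, in which every philosopher grabs the left fork and then the right, can deadlock when all of them grab their left fork at once; the common fix sketched here breaks the symmetry by making the last philosopher reach for the forks in the opposite order. The counts and sleeps are arbitrary.

```c
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

#define N 5  /* philosophers (and forks) */

static pthread_mutex_t fork_mtx[N];

static void *philosopher(void *arg)
{
    int id = (int)(long)arg;
    int first = id;                /* left fork  */
    int second = (id + 1) % N;     /* right fork */

    /* Break the symmetry: the last philosopher reaches for the right fork
     * first. Without this, all N could hold their left fork and wait
     * forever for the right one: the circular-wait deadlock. */
    if (id == N - 1) { first = (id + 1) % N; second = id; }

    for (int round = 0; round < 3; round++) {
        printf("philosopher %d is thinking\n", id);
        usleep(1000);

        pthread_mutex_lock(&fork_mtx[first]);
        pthread_mutex_lock(&fork_mtx[second]);
        printf("philosopher %d is eating\n", id);
        usleep(1000);
        pthread_mutex_unlock(&fork_mtx[second]);
        pthread_mutex_unlock(&fork_mtx[first]);
    }
    return NULL;
}

int main(void)
{
    pthread_t tid[N];

    for (int i = 0; i < N; i++)
        pthread_mutex_init(&fork_mtx[i], NULL);
    for (int i = 0; i < N; i++)
        pthread_create(&tid[i], NULL, philosopher, (void *)(long)i);
    for (int i = 0; i < N; i++)
        pthread_join(tid[i], NULL);
    return 0;
}
```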
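
For entry 9, a sketch of a livelock: two threads that are never blocked yet make no progress, because each politely puts its spoon back down whenever it notices the other is also holding one, and both then retry in lockstep. Whether the livelock actually shows up depends on timing, so the loop is capped to keep the demo finite; the scenario and names are made up for illustration.

```c
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>
#include <unistd.h>

/* Each diner picks up their own spoon, then backs off if the other diner is
 * also holding one. If both keep doing this in lockstep, neither ever eats,
 * even though neither is ever blocked: a livelock rather than a deadlock. */
static atomic_bool holding[2];

static void *diner(void *arg)
{
    int me = (int)(long)arg;
    int other = 1 - me;

    for (int attempt = 1; attempt <= 5; attempt++) {  /* capped so the demo ends */
        holding[me] = true;          /* pick up my spoon */
        usleep(1000);                /* both diners tend to act in lockstep */
        if (holding[other]) {
            holding[me] = false;     /* be polite: put it back down and retry */
            printf("diner %d backs off (attempt %d)\n", me, attempt);
            usleep(1000);
            continue;
        }
        printf("diner %d eats\n", me);
        holding[me] = false;
        return NULL;
    }
    printf("diner %d gave up hungry: livelock\n", me);
    return NULL;
}

int main(void)
{
    pthread_t a, b;
    pthread_create(&a, NULL, diner, (void *)0L);
    pthread_create(&b, NULL, diner, (void *)1L);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    return 0;
}
```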
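
Finally, for entry 10, a compressed sketch of the usual semaphore solution to the sleeping barber: one semaphore counts waiting customers (the barber sleeps on it), another is posted when the barber is ready for the next haircut, and a mutex protects the chair count. The chair and customer counts are arbitrary, and the structure follows the general textbook solution rather than the linked article line for line.

```c
#include <pthread.h>
#include <semaphore.h>
#include <stdio.h>
#include <unistd.h>

#define CHAIRS    3   /* chairs in the waiting room */
#define CUSTOMERS 6   /* customers that show up during the day */

static sem_t waiting_customers; /* customers sitting in chairs; the barber sleeps on this */
static sem_t barber_ready;      /* posted by the barber when he starts the next haircut */
static pthread_mutex_t mtx = PTHREAD_MUTEX_INITIALIZER;
static int free_chairs = CHAIRS;

static void *barber(void *arg)
{
    (void)arg;
    for (;;) {
        sem_wait(&waiting_customers);   /* nap until someone is waiting */
        pthread_mutex_lock(&mtx);
        free_chairs++;                  /* the customer leaves their chair */
        pthread_mutex_unlock(&mtx);
        sem_post(&barber_ready);        /* call the customer over */
        printf("barber: cutting hair\n");
        usleep(2000);                   /* the haircut itself */
    }
    return NULL;
}

static void *customer(void *arg)
{
    int id = (int)(long)arg;

    pthread_mutex_lock(&mtx);
    if (free_chairs == 0) {             /* waiting room full: give up and leave */
        pthread_mutex_unlock(&mtx);
        printf("customer %d: no chairs, leaving\n", id);
        return NULL;
    }
    free_chairs--;                      /* take a chair */
    pthread_mutex_unlock(&mtx);

    sem_post(&waiting_customers);       /* wake the barber if he is asleep */
    sem_wait(&barber_ready);            /* wait to be called for the haircut */
    printf("customer %d: getting a haircut\n", id);
    return NULL;
}

int main(void)
{
    pthread_t btid, ctid[CUSTOMERS];

    sem_init(&waiting_customers, 0, 0);
    sem_init(&barber_ready, 0, 0);

    pthread_create(&btid, NULL, barber, NULL);
    for (int i = 0; i < CUSTOMERS; i++) {
        pthread_create(&ctid[i], NULL, customer, (void *)(long)i);
        usleep(1000);                   /* customers trickle in */
    }
    for (int i = 0; i < CUSTOMERS; i++)
        pthread_join(ctid[i], NULL);

    /* all customers are done; exiting main closes up shop, barber thread and all */
    return 0;
}
```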

That’s it for now. See you!