A COMPARATIVE STUDY OF FIVE PARALLEL PROGRAMMING LANGUAGES
Henri E. Bal
Dept. of Mathematics and Computer Science
Many different paradigms for parallel programming exist, nearly every one of which is
employed in dozens of languages. Several researchers have tried to compare these languages and paradigms by examining the expressivity and flexibility of their constructs. Few attempts have been made, however, at practical studies based on actual programming experience with multiple languages. Such a study is the topic of this paper.
We will look at five parallel languages, all based on different paradigms. The languages are: SR (based on message passing), Emerald (concurrent objects), Parlog (parallel Horn clause logic), Linda (Tuple Space), and Orca (logically shared data). We have implemented the same parallel programs in each language, using real parallel machines. The paper reports on our experiences in implementing three frequently occurring communication patterns: message passing through a mailbox, one-to-many communication, and access to replicated shared data.
During the past decade, a staggering number of languages for programming parallel and distributed systems have emerged [And83, Bal89a]. These languages are based on widely different programming paradigms, such as message passing, concurrent objects, logic, and functional programming. Both within each paradigm and between paradigms, heated discussions are held about which approach is best [Car89b, Kah89, Sha89].
The intent of this paper is to cast new light on these discussions, using a practical approach. We have implemented a number of parallel applications in each of several parallel languages. Based on this experience, we will draw some conclusions about the relative advantages and disadvantages of each language. So, unlike most of the discussions in the literature, this paper is based on actual programming experience in several parallel languages on real parallel systems.
The languages studied in this paper obviously do not cover the whole spectrum of design choices. Still, they represent a significant subset of what we feel are the most important paradigms for parallel programming. We discuss only a single language for each paradigm, although other languages may exist within each paradigm that are significantly different.
The languages that have been selected for this study are: SR, Emerald, Parlog, Linda, and Orca (see Table 1). SR represents message passing languages. It provides a range of message sending and receiving constructs, rather than a single model. Emerald is an object-based language. Parlog is a concurrent logic language. Linda is a set of language primitives based on the Tuple Space model. Orca is representative of the Distributed Shared Memory model.
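To make the Tuple Space model concrete, the following is a minimal sketch of its semantics, written here in Python purely for illustration (Linda itself adds its primitives to a host language such as C; the class and method names below are our own, and a real implementation would distribute the space across machines). Processes communicate only by depositing tuples with out and withdrawing matching tuples with in, never by naming each other.

```python
import threading

class TupleSpace:
    """Illustrative shared associative memory in the style of Linda's
    Tuple Space: communication is anonymous and associative."""
    def __init__(self):
        self._tuples = []
        self._cond = threading.Condition()

    def out(self, *tup):
        # 'out' deposits a tuple into the space (non-blocking).
        with self._cond:
            self._tuples.append(tup)
            self._cond.notify_all()

    def _match(self, pattern):
        # A field of None acts as a wildcard ("formal") matching any value.
        for tup in self._tuples:
            if len(tup) == len(pattern) and all(
                    p is None or p == t for p, t in zip(pattern, tup)):
                return tup
        return None

    def in_(self, *pattern):
        # 'in' withdraws a matching tuple, blocking until one appears.
        with self._cond:
            while (tup := self._match(pattern)) is None:
                self._cond.wait()
            self._tuples.remove(tup)
            return tup

ts = TupleSpace()
ts.out("task", 42)
print(ts.in_("task", None))   # -> ('task', 42)
```

The blocking in_ already suggests how patterns such as a mailbox can be built on this model: a sender deposits tagged tuples, and any receiver withdraws them by tag.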
We focus on languages for parallel applications, where the aim is to achieve a speedup on a single application.
These applications can be run on either multiprocessors with shared memory or distributed systems without shared memory.
This research was supported in part by the Netherlands Organization for Scientific Research (N.W.O.).
This paper was first published in the Proceedings of the EurOpen Spring 1991 Conference on Open Distributed Systems in Perspective, Tromsø, 20-24 May 1991.
A preliminary version of the paper appeared in the Proceedings of the PRISMA Workshop on Parallel Database Systems, Noordwijk, The Netherlands, September 1990.