The Open Directory Project.

Directory of Dataflow Resources

Home > Computers > Programming > Languages > Dataflow

Dataflow programming research began in the 1970s, as the limits of von Neumann (conventional) computers became apparent: such systems rest on an inherently control-driven programming model, in which a program counter dictates what executes next. Dataflow models offer an alternative, in which an operation may execute as soon as its input data are available, and they have been studied across many areas of basic computer science: models of computation, programming languages, machine architecture, compilers, and parallelism. Many early dataflow researchers expected dataflow to become a new, general model of computing, able to exploit all the parallelism present in general-purpose programs and to support high-level languages in which programmers need not manage the details of mapping programs and data efficiently onto parallel machines. Most dataflow research and papers address the architectural and language aspects of dataflow models in fine-grained parallelism and program execution.

Dataflow concepts also appear in conventional computing, such as the pipelining and multiple-instruction-issue techniques used in many RISC processors. Dataflow programming was once a promising approach to a new generation of high-performance computing, but it may still be too immature to be a mainstream technology in general parallel computing, and it has seen little acceptance and use in the high-performance community and industry, despite benefits and unique traits that could serve today's parallel software environment. Von Neumann-based processors, on which current parallel technology is built, will likely dominate high-performance computing and its applications for years to come.
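To make the contrast with control-driven execution concrete, here is a minimal Python sketch of data-driven execution: each node fires as soon as all of its input tokens have arrived, rather than when a program counter reaches it. This is an illustrative toy, not the API of any particular dataflow system; all names (Node, receive, connect) are hypothetical.

    # Minimal data-driven execution sketch: a node fires when all of its
    # input tokens are present, not when control flow reaches it.
    # All names are illustrative, not taken from any real dataflow system.

    class Node:
        def __init__(self, name, op, num_inputs):
            self.name = name
            self.op = op                      # function applied when the node fires
            self.inputs = [None] * num_inputs # token slots; None = not yet arrived
            self.consumers = []               # (node, port) pairs fed by our output

        def connect(self, node, port):
            self.consumers.append((node, port))

        def receive(self, port, value):
            self.inputs[port] = value
            # Fire as soon as every input token is available (data-driven).
            if all(v is not None for v in self.inputs):
                result = self.op(*self.inputs)
                self.inputs = [None] * len(self.inputs)  # consume the tokens
                for node, p in self.consumers:
                    node.receive(p, result)

    # Build the graph for (a + b) * (a - b).
    add = Node("add", lambda x, y: x + y, 2)
    sub = Node("sub", lambda x, y: x - y, 2)
    mul = Node("mul", lambda x, y: x * y, 2)
    out = Node("out", print, 1)
    add.connect(mul, 0)
    sub.connect(mul, 1)
    mul.connect(out, 0)

    # Inject input tokens in any order; nodes fire when their data are ready.
    add.receive(0, 5); sub.receive(0, 5)
    add.receive(1, 3); sub.receive(1, 3)   # prints 16

Note that the add and sub nodes could fire in parallel, since neither depends on the other; this independence is exactly the parallelism a dataflow model exposes without programmer effort.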

Subcategories

Resources in This Category

Related Categories

Thanks to DMOZ, which built a great web directory for nearly two decades and freely shared it with the web.