http://thegeez.net/2012/06/12/euroclojure_reducers.html
Fold will try to do the computation in parallel using fork/join, when both the input collection (which is asked to apply the reducer to itself) and the reducer support this. The check for support is done through protocols: on the data structure side, PersistentVector extends CollFold; on the reducer side, r/map is defined as a folder, while r/take-while is defined as a reducer (and does not support fold, because partitioning does not make sense for that computation). When parallel fold is not supported, fold simply falls back to a sequential reduce. See the implementation for details: reducers.clj
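A minimal REPL sketch of that fallback behaviour, assuming a current Clojure with clojure.core.reducers available (the collection sizes and the inc/predicate functions are just illustrative):

(require '[clojure.core.reducers :as r])

;; A vector is foldable (PersistentVector extends CollFold) and r/map
;; produces a folder, so this sum may be computed in parallel via fork/join:
(r/fold + (r/map inc (vec (range 1000000))))
;;=> 500000500000

;; r/take-while only produces a reducer, so here fold falls back to a
;; sequential reduce even though the source is a vector:
(r/fold + (r/take-while #(< % 1000) (vec (range 1000000))))
;;=> 499500

;; A plain list does not extend CollFold, so this also runs as a single reduce:
(r/fold + (r/map inc (apply list (range 1000))))
;;=> 500500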
A common approach to parallel processing is map+reduce: the work is divided into partitions, map is applied to each partition, and the intermediate results are combined with reduce. In fold the approach is reduce+combine: the work on the smallest partition is done with a reduce rather than a map, and the partial results are then merged with a combine function. Compare the two approaches by trying to express filter in map+reduce versus reduce+combine. It appeals to functional programming sensibilities that reduce is a better fit for most operations than map.
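One way to see the difference (a sketch with helper names of my own; the use of r/monoid to supply the combine identity is an assumption, not taken from the post): with map+reduce, filter has to wrap every element so that reduce can flatten the kept ones back out, whereas with reduce+combine the per-partition reduce simply keeps or drops each element and combine only concatenates partition results.

(require '[clojure.core.reducers :as r])

;; map+reduce: map wraps each element, reduce flattens the kept ones
(defn filter-map+reduce [pred coll]
  (reduce into [] (map #(if (pred %) [%] []) coll)))

;; reduce+combine: the partition-level reduce keeps or drops elements
;; directly; combine merely concatenates the partition results
(defn filter-reduce+combine [pred coll]
  (r/fold (r/monoid into vector)                       ; combinef with identity []
          (fn [acc x] (if (pred x) (conj acc x) acc))  ; reducef
          coll))

(filter-map+reduce odd? (vec (range 10)))     ;=> [1 3 5 7 9]
(filter-reduce+combine odd? (vec (range 10))) ;=> [1 3 5 7 9]

For what it's worth, the library's own r/filter is defined as a folder, so it composes with fold in the same reduce+combine fashion.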