The rising number of threads in large-scale applications running on multi-node architectures makes operating system activity increasingly relevant. Therefore, evaluation methodologies need to account for this activity. We built our evaluation environment on top of the COTSon simulator. Moreover, our environment permits flexible Design Space Exploration (DSE) by simplifying the management of many experiments and the characterization of Operating System (OS) activity. In this paper, we present the result-analysis tool flow and the OS impact of different Linux distributions running on a distributed environment consisting of several nodes, each running a full OS. To quantify our results, we use a matrix multiplication benchmark executed through a dataflow model named DataFlow Threads (DF-Threads). We analyze key metrics such as L2 cache miss rate, execution cycles, data access latency, and kernel cycles, showing performance variations of up to 60% among the different Linux distributions.