Binocular stereo matching frequently fails to construct accurate depth maps when the visible texture in the scene is weak or ambiguous, or when the scene exhibits difficult self-occlusions. Time-of-flight (TOF) sensors provide depth information regardless of texture, however only at limited resolution and corrupted by various kinds of noise. Joint calibration of a TOF range sensor and a pair of conventional cameras allows the two sources of information to be fused, overcoming the weaknesses of each individual sensor. We propose an efficient seed growing algorithm which takes TOF 3D points projected into the stereo images as initial correspondence seeds. During propagation, the algorithm uses a similarity score based on a Bayesian model which combines image similarity with a rough surface prior computed from the low-resolution TOF data. The result is a dense and accurate high-resolution depth map with significantly more detail. We show that the proposed algorithm outperforms both the separate TOF and image-based stereo reconstructions. The algorithm can potentially run in real time on a single CPU.
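The propagation described above can be sketched as a best-first growing loop. The following is an illustrative sketch only, not the paper's exact model: the function name, window-based similarity, the Gaussian form of the surface prior, and all parameter values (`sigma_img`, `sigma_tof`, `tau`) are assumptions chosen for clarity.

```python
import heapq
import numpy as np

def grow_disparity(left, right, seeds, tof_prior,
                   sigma_img=0.2, sigma_tof=2.0, tau=0.3, win=1):
    """Hypothetical sketch of best-first seed growing.

    left, right : rectified grayscale images, H x W float arrays in [0, 1]
    seeds       : iterable of (x, y, d) correspondences from projected TOF points
    tof_prior   : H x W disparities upsampled from the low-resolution TOF map
    Returns an H x W int array of disparities (-1 where nothing was matched).
    """
    H, W = left.shape
    disp = np.full((H, W), -1, dtype=int)

    def score(x, y, d):
        # Combined score: window image similarity times a Gaussian surface
        # prior centred on the TOF disparity (a stand-in for the Bayesian model).
        if not (win <= y < H - win and win <= x < W - win
                and win <= x - d < W - win):
            return -1.0
        lw = left[y - win:y + win + 1, x - win:x + win + 1]
        rw = right[y - win:y + win + 1, x - d - win:x - d + win + 1]
        img = np.exp(-np.abs(lw - rw).mean() / sigma_img)
        prior = np.exp(-0.5 * ((d - tof_prior[y, x]) / sigma_tof) ** 2)
        return img * prior

    heap = [(-score(x, y, d), x, y, d) for (x, y, d) in seeds]
    heapq.heapify(heap)
    while heap:
        neg_s, x, y, d = heapq.heappop(heap)
        if -neg_s < tau or disp[y, x] != -1:
            continue                      # below threshold or already decided
        disp[y, x] = d
        # Propagate to the 4-neighbours, searching disparities d-1, d, d+1.
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < W and 0 <= ny < H and disp[ny, nx] == -1:
                best = max(range(d - 1, d + 2),
                           key=lambda dd: score(nx, ny, dd))
                heapq.heappush(heap, (-score(nx, ny, best), nx, ny, best))
    return disp
```

Because each pixel is decided at most once and only a small disparity neighbourhood is searched around every seed, the cost stays roughly linear in the number of matched pixels, which is what makes single-CPU real-time performance plausible.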