PRIORITY-BASED DATA DISTRIBUTION MECHANISM

Hadoop is an open-source platform that provides distributed computing over big data. Hadoop consists of two components: a storage model, the Hadoop Distributed File System (HDFS), and a computing model, MapReduce. MapReduce is a programming model that handles large, complex tasks in two steps, map and reduce. We use MapReduce to implement high-complexity, high-density 2D-to-3D image processing. The Map function is responsible for processing 2D effects, such as grayscale, Sobel, Gaussian blur, and corner detection, and converting the images to intermediate data, while the Reduce function is responsible for integrating the intermediate data generated by the Map function, constructing a model of the object with a 3D model construction algorithm, and presenting 3D images according to user requirements. To speed up these operations we use the Dynamic Switch of Reduce Function (DSRF) algorithm, a scheduling scheme that allows the reduce function to switch to a different task as soon as the intermediate data it needs becomes available from the map function.
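The Map/Reduce split described above can be sketched in-memory with plain Python. This is a hedged illustration, not the paper's implementation: the function names (`map_2d_effects`, `reduce_build_model`), the toy grayscale effect, and the toy depth heuristic are all assumptions introduced for clarity, and real deployments would run over HDFS rather than Python lists.

```python
# Minimal in-memory sketch of the Map/Reduce split described above.
# Names and the toy effects are illustrative, not from the paper.

def map_2d_effects(image):
    """Map phase: apply a 2D effect (here, a toy grayscale) and emit
    an intermediate (key, value) pair keyed by object id."""
    image_id, pixels = image
    # Toy grayscale: average the (r, g, b) channels of each pixel.
    gray = [sum(p) // 3 for p in pixels]
    return (image_id, gray)

def reduce_build_model(image_id, intermediates):
    """Reduce phase: integrate the intermediate data for one object into
    a toy 'model' -- here a per-pixel depth estimate, averaged over views."""
    # Toy depth cue: darker pixels are treated as farther away.
    depths = [[255 - v for v in gray] for gray in intermediates]
    n = len(depths)
    model = [sum(col) // n for col in zip(*depths)]
    return (image_id, model)

# Driver: group map outputs by key, then reduce each group,
# mimicking the shuffle step between the two phases.
images = [("obj1", [(10, 20, 30), (200, 200, 200)]),
          ("obj1", [(30, 30, 30), (90, 120, 150)])]
grouped = {}
for key, value in map(map_2d_effects, images):
    grouped.setdefault(key, []).append(value)
models = dict(reduce_build_model(k, v) for k, v in grouped.items())
print(models)  # one integrated model per object id
```

The driver plays the role of the MapReduce framework's shuffle: it collects every intermediate value emitted under the same key before the reduce step runs, which is the property the 3D integration step relies on.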
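The DSRF scheduling idea can likewise be sketched: instead of blocking on one task, a reduce worker polls its tasks and processes whichever one has intermediate data ready. This is a minimal single-threaded sketch under assumed data structures (a queue per task, `None` as an end-of-stream marker); the actual DSRF algorithm runs inside Hadoop's scheduler.

```python
# Hedged sketch of the DSRF idea: a reduce worker switches among tasks
# as their intermediate map output becomes available, instead of idling
# on a task whose mappers have not finished. Structures are illustrative.
from collections import deque

def dsrf_reduce_loop(tasks, reduce_fn):
    """tasks: dict task_id -> deque of intermediate chunks, where None
    marks end-of-stream. Processes whichever task has data ready."""
    results = {t: [] for t in tasks}
    pending = set(tasks)
    while pending:
        progressed = False
        for t in list(pending):
            queue = tasks[t]
            if queue:  # intermediate data available: switch to this task
                chunk = queue.popleft()
                if chunk is None:  # all map output for this task consumed
                    pending.discard(t)
                else:
                    results[t].append(reduce_fn(chunk))
                progressed = True
        if not progressed:
            break  # a real system would wait here for more map output
    return results

# Toy run: both tasks have some data ready; the worker interleaves them.
tasks = {"a": deque([1, 2, None]), "b": deque([10, None])}
print(dsrf_reduce_loop(tasks, lambda x: x * x))
```

The payoff of this switching is utilization: the reduce worker never sits idle waiting on one slow map task while another task already has intermediate data queued.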