I am wondering when image processing software (Lightroom, DxO, Capture One, etc.) will start using the GPU for image processing.
This could drastically increase performance and processing capability, and could make it feasible to implement more complicated, more resource-demanding algorithms. This is especially true with NVIDIA CUDA: the latest NVIDIA cards offer more than 1,500 processing cores, compared with just 4 or 8 cores on the main CPU.
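To illustrate the kind of data-parallelism involved, here is a minimal, hypothetical CUDA sketch (not taken from any shipping product) of a per-pixel exposure adjustment. Each pixel gets its own GPU thread, so thousands of pixels are processed concurrently instead of being looped over on a few CPU cores:

```
// Hypothetical kernel: each GPU thread adjusts one pixel.
__global__ void adjustExposure(float *pixels, int n, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)
        pixels[i] = fminf(pixels[i] * gain, 1.0f);  // scale and clamp to [0,1]
}

// Host-side launch: one thread per pixel, 256 threads per block.
void applyExposure(float *d_pixels, int n, float gain)
{
    int blocks = (n + 255) / 256;  // round up so every pixel is covered
    adjustExposure<<<blocks, 256>>>(d_pixels, n, gain);
}
```

Real RAW pipelines (demosaicing, noise reduction, sharpening) are far more involved, but they are built from exactly this kind of per-pixel or per-tile parallel work.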
Whoever implements this first could gain a great advantage over the competition.
This question comes to mind every time a major software release is announced, e.g. now with the news that Lightroom 6 is coming soon.
Adobe used to cite the difficulty of implementing parallel processing, but that no longer reflects current realities.
A simple web search shows that patents already exist for GPU image processing, along with a number of implementations and API libraries that use CUDA for image and video processing, including RAW file processing; some of these achieve impressive processing speeds.
Here are some references:
1. GPU RAW image processing patent US 8098964 B2
http://www.google.ca/patents/US8098964
2. http://www.ximea.com/de/technology-news/gpu
3. http://on-demand.gputechconf.com/siggraph/2013/presentation/SG3108-GPU-Programming-Video-Image-Processing.pdf