Different words, but close to what I was saying.... (RAW data is NOT highly interdependent)
RAW to JPG - parallel process. The more cores the better.
In theory, using a GPU with many cores (there are NVIDIA chips with 512 CUDA cores) will speed up rendering of images.
The raw data is interdependent, but demosaicing the images isn't the holdup in Lightroom for much of anything. Raw to JPEG is largely irrelevant, since turning a raster image into a JPEG is only done for previews and exports and takes very little time. Turning demosaiced data into raster data *with all your corrections applied* is what takes time, and that step is not highly parallelizable.
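To illustrate the distinction (a toy sketch, not Lightroom's actual code: the stage and function names here are made up), a batch export is embarrassingly parallel *across* images, while the pipeline *within* one image is a sequential chain where each correction stage consumes the previous stage's output:

```python
# Toy model of a raw-processing pipeline. Assumption: images are independent
# of each other, but stages within one image must run in order.
from concurrent.futures import ProcessPoolExecutor

def demosaic(raw):
    # Stand-in for demosaicing: turn sensor values into "pixels".
    return [v * 2 for v in raw]

def apply_corrections(pixels):
    # Sequential stages: each needs the prior stage's full result,
    # so they cannot simply be fanned out across cores.
    exposed = [p + 1 for p in pixels]        # exposure adjustment
    toned = [min(p, 255) for p in exposed]   # tone curve / clipping
    return toned

def render_jpeg(pixels):
    # Stand-in for JPEG encoding: collapse pixels to a single value.
    return sum(pixels)

def process_image(raw):
    # One image's pipeline runs start to finish as a chain.
    return render_jpeg(apply_corrections(demosaic(raw)))

def export_batch(raws):
    # Independent images can be farmed out to separate cores:
    # this is where "the more cores the better" actually pays off.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(process_image, raws))

if __name__ == "__main__":
    batch = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
    print(export_batch(batch))  # one result per image
```

The design point is that adding cores speeds up `export_batch` (more images in flight at once), but does little for a single `process_image`, because its stages form a dependency chain.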
Of all the things that are inefficient in LR, the raw processing pipeline is the least of them; it's actually pretty efficient. The real weak spots are elsewhere: handling huge numbers of previews and laying them out in a grid, resizing and scrolling them, adding metadata and other badges, interfacing with the database, saving metadata to files and to the database, updating previews and preview thumbnails, handling the user interface, and so on. Those are things LR could do a lot better. The CR pipeline is already pretty good and is largely not a holdup for most things.