"Manipulating the data" with the intent to mislead sounds like fudging to me, though it wasn't your word. As far as I can tell they are simply summarizing / aggregating the data and it just happens that with the current sensors, the way they do this doesn't work out well for Canon.
It has nothing to do with the Nikon/Canon debate as far as I am concerned.
You can read at DxO which numbers and factors they use. They also tell you which factors they leave out of the evaluation. But the weights applied to the three factors they do use to arrive at the number score are chosen by DxO themselves. Why should we believe that the way they arrive at this average score is the correct way to evaluate and score a sensor? Is it because they "claim" to be the leader in sensor testing?
When I first went to DxO years ago to use their information to compare Canon cameras, Nikon had only a few in the top ten. Even looking just at the scores for Canon cameras, the numbers didn't line up with reality in Canon vs. Canon comparisons. If the final results don't line up with the real-world situation, perhaps they should look at how they combine all this data to arrive at a final number score.
Then there are the labels "landscape" and "sports" they use, rather than calling each test what it actually measures. Why not describe the test for what it is, and then note that it matters for that activity? For landscape work, DR is not the only thing that matters, so why attach the name "landscape" to a high-DR score?
I find DxO's presentation of the data to be misleading. Any time a large company chooses to put out data that on the surface appears to mislead the general public, I think you have to question their motives.
I do not believe the "simply summarizing" view. But then maybe in their minds they believe they are.