Linearity of the Lorentz transformation
My book explains the linearity of the Lorentz transformation simply by saying that, had it not been linear, it would lead to something meaningless: one inertial frame of reference would appear accelerated relative to the other even though they are moving at constant velocity relative to each other. They then work through an example of some of the disastrous consequences that would have. But I would like to see a more stringent, mathematical proof of the fact that ONLY a linear transformation will map one inertial frame into another without them being accelerated relative to each other.
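To make the question concrete, here is my rough understanding of how such an argument might go (this is my own sketch, not something from the book, so please correct it if it's wrong):

```latex
% A free particle in an inertial frame has a straight worldline:
%   \frac{d^2 x^\mu}{d\lambda^2} = 0.
% If the primed frame is also inertial, the same particle must satisfy
%   \frac{d^2 x'^\mu}{d\lambda'^2} = 0,
% so the coordinate transformation must map straight lines in
% \mathbb{R}^4 to straight lines. A bijection of \mathbb{R}^4 that maps
% lines to lines is affine (fundamental theorem of affine geometry):
%   x'^\mu = \Lambda^\mu{}_\nu \, x^\nu + a^\mu.
% Demanding that the origins of the two frames coincide forces
% a^\mu = 0, leaving the purely linear transformation
%   x'^\mu = \Lambda^\mu{}_\nu \, x^\nu.
```

Is this the right line of reasoning, and if so, where is the "lines to lines implies affine" step proved rigorously?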
Can anyone point me to a webpage or a reference that discusses that particular problem?
