Have any of you ever heard of studies comparing the informational content of western non-tonal languages vs. eastern tonal languages?
I’m thinking here of information theory as Claude Shannon first defined it in 1948. If we think of natural language as a ‘pipe’ through which information is passed, then the characteristics of the pipe will certainly influence the amount of information that can be passed per unit time. Shannon would have told us that the efficiency of a natural language could be improved in a number of ways: by speaking faster and faster until the receiver can no longer keep up, by dropping redundant or unnecessary words, and by choosing the shorter word among synonyms.
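Just to make the rate idea concrete for myself, here is a back-of-envelope sketch in Python; the syllable inventory and speaking rate below are numbers I made up for illustration, not measurements of any real language:

    import math

    # Shannon's information rate applied to speech:
    # bits per symbol times symbols per second.
    syllable_inventory = 400      # assumed number of distinct syllables
    syllables_per_second = 5.0    # assumed average speaking rate

    # Upper bound: bits per syllable if every syllable were equally likely.
    bits_per_syllable = math.log2(syllable_inventory)

    # Information rate through the 'pipe', in bits per second.
    bits_per_second = bits_per_syllable * syllables_per_second
    print(f"{bits_per_syllable:.2f} bits/syllable, {bits_per_second:.1f} bits/s")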
The fact that eastern languages encode alternate word meanings as tones, while western languages do not (though western languages do encode emotional information as tonal variation), may give tonal languages a higher efficiency in terms of information transmitted per unit time.
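To put a rough number on that intuition (again, the syllable and tone counts are purely hypothetical), the extra capacity per syllable works out to log2 of the number of contrastive tones:

    import math

    base_syllables = 400   # assumed number of toneless syllable shapes
    tones = 4              # assumed number of contrastive tones

    bits_without_tones = math.log2(base_syllables)        # ~8.6 bits/syllable
    bits_with_tones = math.log2(base_syllables * tones)   # ~10.6 bits/syllable
    print(f"gain per syllable: {bits_with_tones - bits_without_tones:.2f} bits")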
Certainly, one could reason similarly about Kanji and other logographic systems, whose symbols represent concepts rather than phonemes and thus generally convey more information per symbol.
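The same back-of-envelope comparison applies to writing systems, with assumed inventory sizes and the simplifying assumption that every symbol is used equally often:

    import math

    latin_letters = 26    # assumed alphabet size
    common_kanji = 2000   # rough size of an everyday Kanji set, assumed

    print(f"letter: {math.log2(latin_letters):.1f} bits")  # ~4.7 bits per symbol
    print(f"kanji:  {math.log2(common_kanji):.1f} bits")   # ~11.0 bits per symbol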
Of course, thinking about natural language in terms of information theory is a reach at best, since natural language is so sloppy and inefficient, but these ideas seemed interesting to me and I was wondering if anyone has read anything along these lines.