Is science sometimes in danger of getting tunnel vision? Recently published ebook author Ian Miller looks at other possible theories arising from data that we think we understand. Can looking at problems in a different light give scientists a different perspective?


The recent edition of Chemistry World had an article reporting that the American Chemical Society is planning to start up ChemRxiv, a preprint server. In this context, I note that chemists are strangely resistant to change, and the article was hardly enthusiastic about the concept. You probably know that the physics community posts preprints on the arXiv server, and by and large physicists submit papers there for community peer review before submitting to a final journal. I started writing an article for this blog on this topic a while ago and never finished it, but this is part of what I wrote. The idea makes a lot of sense to me, and a number of years ago some prominent chemists started up the Chemistry Preprint Server. The response from chemists was strangely disappointing. Very few submitted anything, and of the papers that were submitted, some were distinctly low in quality. Nevertheless, it seemed a good idea at the time. However, most of the chemistry community simply ignored it. I feel it should have been a great place for two things: getting an important paper into a form that the community would accept, and getting enough support for it that a journal would publish it. However, that preprint server did not work because very few were interested.
According to the Chemistry World article, the previous preprint server died because the quality of the publications was too low, and because the American Chemical Society would not publish anything that had been published before. That may be unfair on the ACS; I doubt it was unique in that. My belief is it died because most of the chemists of stature refused to take part, although in fairness at the time I also refused to put my experimental work on it because I was afraid that it would not be accepted later by the journal where most of my experimental work was then being published.
What are the purposes of a preprint server? One might be to get information out there earlier, but my feeling is that this is not really that important in most cases. What I used the earlier server for was to archive material that was otherwise difficult to publish, and possibly to get some idea of what I could do to improve its acceptability. Now, you might say, reading that, that I was a contributor to the "low quality" chemistry. Before jumping to such conclusions, consider the following.
I once submitted a logic analysis to a chemical journal; its purpose was to show that the published data did not support the concept that the cyclopropane ring delocalizes electrons into adjacent unsaturated substituents. The standard position was that it did, largely because it stabilized adjacent positive charge and gave bathochromic shifts to many UV absorptions, and extending the wave function would do that, as in the allyl cation. However, there is another way, namely a polarization field, i.e. the problem is one of electromagnetic theory rather than quantum mechanics. My analysis showed that not only did this alternative give correct predictions, but there were over 60 different types of experiment that falsified, or at least cast serious doubt on, the then accepted "delocalization" explanation. Note that demonstrating 250 times that cyclopropane stabilizes adjacent positive charge still only adds one datum to a logic analysis. As a further aside, one previous review argued that quantum computations verified the presence of extra stability through delocalization. You may be amused to note that such computations used the same MO programs that "proved" the stability of polywater.
The first journal rejected it because they argued "it was not what the average chemist wanted to see" (or something like that). There were not enough diagrams (I assumed that a triangle headed towards a substituent X should have been general enough for many cases). One journal rejected the proposal "because this issue is settled". Who cares about the data and the logic? Others simply said they did not publish logic analyses. Fair enough in one way, but how could it get published? I will concede immediately that it could have been made easier to follow, but what I hoped was that chemists would tell me what it should have looked like to convince them. No such luck.
Similarly with a paper where I showed a pencil-and-paper route to a reasonably accurate estimate of the covalent bond energies of the Group 1 elements. That was rejected by editors because "nobody would be interested". They may or may not have been right, but the interesting point is that it argued for a hitherto unrecognized quantum effect.
Now some will say that was rubbish. Either it was right or it was wrong. If it were right, should it not be published somewhere? If it were wrong, should not someone be able to show where? And that is where a preprint server should be useful. It gives the scientific community the opportunity to comment and act as public referees, and it takes away the ability of editors, who do not have to give reasons, to block something.
Posted by Ian Miller on Nov 3, 2016 9:44 PM GMT
The RSC should be complimented for its "Future of the Chemical Sciences" initiative, but this raises the question: is anything seriously missing from the output so far? I think the answer is, unfortunately, yes.
First, consider the four plausible scenarios. There is no doubt that chemistry will be needed to address many of the problems facing the world. That suggests there will always be a job for chemists, although not necessarily every type of chemist, so that scenario is well identified, if somewhat obvious. Of course, some of the other scenarios contradict it.
The second option is the "do it yourself" chemist: chemical processes are stored away in computers, and the future chemist goes to a "black box", keys in what he wants, is told what feedstocks are needed, obtains them, fills up some hoppers or whatever, presses "go", and waits for the product to come out, neatly bottled. That might be plausible eventually, but it does not require much in the way of chemistry, and it suggests there is not much future for synthetic chemists. To me, it is also an invitation to disaster, because, like it or not, you should understand what you are doing. How many tragedies would there have to be before people were stopped from making something highly dangerous?
The third option, "no chemists", speaks for itself, and it is interesting that it focuses on "no new fundamental research". The free-market chemistry option simply focuses on funding, with all funding coming from the private sector. Neither of these options offers much hope for chemistry making our future better.
What do we note about these scenarios? What I see is a fixation on solving immediate problems, with the top problem being jobs for emerging students, and the implication that current education, etc., is unfit for purpose. How did we come to this? As far as I know, the major problems we face now have arisen largely because people with little knowledge of science have been at the helm, and the free market has let them at it. Exactly how will the free market solve the problems of climate change or pollution when it is in the financial interests of certain players to keep doing what is counterproductive? The only thing stopping them now is regulation, and the voice of those who know. If nobody knows, where then?
What I notice is missing is the concept of trying to understand our discipline. There is no doubt that experienced chemists in a small section of chemistry know that small section very well, although whether that is understanding, or more the ability to recall what happened last time something like this was tried, is another matter. I have this feeling that chemists have become very keen on "how to do something", but seem to have lost the spirit of discovery, wherein we might ask, why does x, y, z happen? Am I right? Who stands up and confesses they want to understand the basis for what they are doing? I do, but how many others are there? I recall that earlier in the year I put up a blog post raising the possibility that the bond energies of the A – B molecules of the Group 1 elements were additive in contributions from their component atoms. That verges on heresy, and despite the evidence it may well be wrong, but nobody said a thing. So, what I would like to see is at least a little encouragement for understanding. After all, chemistry might be the first science to have all its fundamentals settled. We know all of chemistry is determined by quantum mechanics and electromagnetic theory. Why don't we want to use these a little more deeply?
Finally, here is a little example problem. In my novel, "A Face on Cydonia" the story needed small amounts of powerful explosives, so I "invented" a molecule: tetranitrotetrahedrane. My argument was that apart from its propensity to turn itself into carbon dioxide and nitrogen gas with serious vigour, it would be one of the more stable tetrahedranes. Can you guess why that might be? Or why I think it might be? Then, do you want little black box synthesizers in the hands of some of the terrorists who seem to want to go around blowing stuff up?
Posted by Ian Miller on Oct 2, 2016 10:46 PM BST
Far away, on the other side of this planet (in New Zealand, to be precise), we have watched the Brexit issue with a sort of stunned puzzlement. Why on Earth did this happen? And more to the point, now it has happened, why is it unclear what to do next? Even when trying an experiment, my view is that you think out what you expect to happen and what you have to do to make it happen, then you try. It does not always work out correctly, because sometimes the molecules decide to be perverse, but I can't recall ever doing an experiment where I did not know what I wanted from it.
With Brexit, it seems the opposite has happened. Britain did it, and the question now is, now what? All of which raises an interesting question for the scientific community: you are trained to ask questions relating to procedure, so why did nobody, and no organization, stand up and demand to know from those advocating exit what the consequences of exiting would be? Why was a vote organized when nobody knew the pros and cons of each position? Why is an informed vote something to be avoided?
All of which raises the question, where now for UK science? Presumably it will be out of major EU projects, although in principle there is no reason why that aspect of cooperation with the EU cannot continue. Perhaps the biggest single problem is immigration. Restricting immigration from the East seems to have been a major reason why the exit vote won, but when Switzerland, as an associate, began to impose immigration restrictions, it had that status suspended from Horizon 2020. The EU has to remain consistent, and since immigration was a major cause of the vote, the UK government would effectively be ignoring the voters if it went back on that. Chemists may feel that such ejection is no bad thing because chemistry mainly avoids big projects; it is physics that has the LHC and ESA. I would hope that that sort of attitude is avoided, and that chemists, and the RSC, start lobbying for science as a whole, and not for an insulated part of it.
Posted by Ian Miller on Jul 11, 2016 4:13 AM BST
In the recent Chemistry World, we read the heading "Technetium carbide refuted; proof that the compound cannot exist after all". The article then goes on to report that a team of computational chemists carried out calculations showing that the carbide cannot exist, and that what experiment had actually found was a new phase of the metal.
 
Sorry, but that is just plain wrong. Not the calculations, necessarily, which may or may not be correct. The point is, you cannot prove anything by theory. One of the most successful theories of all time, in my opinion, was Newtonian mechanics, yet when it was used to calculate the orbit of Uranus, observation failed to match the calculations. Either the theory was wrong or it was not, but nobody argued that the theory was right and the observations wrong. The only way out was to postulate a new planet, and Neptune was discovered. That was a triumph for theory: it put observable facts into the theory to predict something new, and there it was. However, when Newtonian mechanics was used to calculate the orbit of Mercury, observation again failed to match the calculations. Once more, nobody argued that the theory was right and the observations wrong, and worse, no new planet could fix this problem. In the end, it was found that Newtonian mechanics is merely a good approximation to Einsteinian relativity.
 
As far as I am concerned, I have no idea whether technetium carbide exists. I know manganese carbide exists, and I know it is not especially resistant to certain reagents, so it is easy to make it and then lose it. Because of the nuclear instability, I doubt anyone worries much about technetium carbide, since it is extremely unlikely to be of significant use, but that does not matter. Further, the failure to make something in a synthesis does not prove the compound does not exist, merely that that is not the way to make it. There are an awful lot of unstable compounds that can be made, if you know how to go about it. As an aside, from my experience with manganese carbide, you may be better off not starting with the metal, as then that new metallic phase is far less likely to form.
 
As for the calculations, the best theory can do is make predictions. Only Nature can tell us whether they are correct. For me, calculations help us check that we understand nature, but you cannot use calculations to prove something. All you can say is: if my theory is correct, then this is what you should expect.
Posted by Ian Miller on May 23, 2016 5:08 AM BST
By now it is apparent that either chemists are not reading these posts, or they are not interested in evidence suggesting bond strengths are additive, or in why, sometimes, they are not.
I must change my approach. In the most recent Chemistry World there is a comment on climate change. We see that two proposals have been made to reduce carbon dioxide levels. One is to introduce crushed silicates into soils. Of course, they have to be the right silicates. One that has been proposed is peridotite. Most certainly, the Earth is not short of this; it makes up most of the mantle, but of course the mantle is somewhat difficult to access, and we would have to deal with outcrops that have reached the surface.
The problem with this proposal is the rate of reaction. Some rocks weather tolerably quickly, but overall the process is slow. It can be accelerated by at least a million times by injecting carbon dioxide into a suitably fractured rock layer, but that requires a lot of energy. This sort of proposal depends on sufficient quantities of the right silicates being available, and on the processing not generating, either directly or indirectly, more carbon dioxide than is removed. One problem is the source of the rock; if you can find it on the surface, obviously it is not reacting very quickly.
A more straightforward method suggested was to greatly increase afforestation. One point noted briefly in the article was that such forests might generate unintended consequences. Does not the logic of this comment grate a little?
First, huge amounts of forest have already been cut down. Allowing them to regrow would merely return the system to where it was before. A particularly good area to let re-develop would be the tropical rain forests. Huge areas of Brazil have had their forests removed, and the land is not that useful for anything else, so it tends to lie barren or be eroded. Replanting the forest, or simply stopping cutting it down and letting it regrow and spread would be a start.
One scheme that I think is worth further consideration is ocean fertilization, to let algae grow. There are two forms of algae: microalgae and macroalgae. Microalgae grow readily with modest fertilization, usually with iron-containing materials, because the ocean waters away from the coast are remarkably deficient in certain cations. This proposal has been examined and rejected because it was argued that only a minor part of the algae sank to the depths and thus would be taken out of circulation. That, to my mind, is ridiculous. What happened to the rest? Some, at least, would be eaten by fish, and if we helped the fish population regenerate, would that be such a bad idea? Similarly, in the 1970s the US Navy showed that macroalgae could be grown in deep water on rafts, fertilized by sucking up water from the depths using wave power. The experiment ran into trouble during a major storm, and the subsequent drop in oil prices killed it, but it might still be worthwhile. Some algae are the fastest-growing plants on the planet, and as I have argued in my ebook "Biofuels", it is reasonably straightforward to make biofuel from them, which would replace fossil oil.
But for me, the biggest problem with the logic of "unintended consequences" is we are going to see some really major unintended consequences. There is a possibility of a sea level rise of up to 60 meters, as a consequence of our fossil fuel consumption. London sits between 5 and 30 meters above sea level. Is not drowning London an adverse consequence? Check with Google Earth, and if you live somewhere near the coast, your descendants may not be living where you live.
My view is the Society should be making as many efforts as it can to persuade various governments to invest more money into geoengineering research, and to coordinate it, because geoengineering alone can reduce the carbon dioxide levels in the atmosphere. What do you think?
Posted by Ian Miller on Apr 18, 2016 12:50 AM BST
In a previous post (http://my.rsc.org/blogs/84/1702) I made the case that the covalent bond energies of the Group 1 metals were characteristic of the element, i.e. the energy of any A – B bond was the arithmetic mean of the A – A and B – B bond energies. I also asked, "What do you think? Are you interested?" So far, no comments. Does this mean that nobody can see a glaring problem, or does it really mean that chemists as a whole have little interest in the nature of the chemical bond?
First, the glaring problem. How can the energies be the arithmetic mean? From de Broglie, we know
pλ = h
We have also established that the covalent radius is characteristic of the atom, which means that λ on the bond axis is constant. We also know that, on average, there is no net force on the nuclei, otherwise they would accelerate in the direction of the force. (Zero point motion is superimposed on such an equilibrium distance, but the forces average to zero.) With no net forces, the average wavelength as determined on the other axes should also be constant. You may protest (correctly) that the wave may have only one wavelength, but that is only true if the wave is not separable. For example, one might argue the medium changes on the bond axis due to the change in particle density arising from wave interference.

Thus the constant covalent radius implies a constant wavelength for the valence electron in different molecules. But since the total energy will involve a term in (p1 + p2)^2 minus the original energies, and since the square of a sum does not equal the sum of the squares, and since the path length must change between, say, Li2 and LiCs, the bond energies should not be the linear sum of the components if the waves are delocalized over the whole molecule. For a simple two-electron wave function that arises from pairing, no new nodes are placed in the wave function (other than the antibond or excited states), so the path length must change significantly. To me, this strongly suggests that molecular orbital theory is not soundly based. Yes, one can get the right answers by adjusting the parameters and constants within the calculations, but that does not prove the theory is correct. Instead, there should be some algebraic reason why such additivity arises naturally.

Is there any? The answer to that, in my opinion, rests on the reason why the energy levels are stable anyway. Under Maxwell's electromagnetic theory, an accelerating electron should emit electromagnetic radiation, and this occurs always, except for the stationary states of atoms and molecules. From the Schrödinger equation, such stable states occur only when the action is exactly quantized. If the action about each atom must be quantized for σ-bonded molecules for the molecule to be stable, then we get the additivity of the energy of such simple molecules if the covalent radius is constant. Thus we have a physical reason, independent of calculations, for the observation. The importance of this is that it gives a new relationship to aid calculations, which also shows why the functional group actually occurs. Is such a potentially new physical relationship of sufficient interest to be worth further investigation?
Any comments? Please!
Posted by Ian Miller on Mar 20, 2016 10:54 PM GMT
In the last post, I presented data for the covalent bonds of the A – B compounds of the Group 1 elements which showed, to a reasonable degree, that each atom has a characteristic covalent bond energy, in the same way that there is a covalent radius, and that the energy of the A – B bond is the sum of the A and B contributions. This goes against all the standard textbook writings. In an earlier post I stated that I had previously submitted a paper that would lead to a method for readily calculating these bond energies, but the paper was rejected by the editors of some journals on the grounds that either these are not very important molecules, or alternatively (or both) nobody would be interested. This annoyed me at the time, but it seems to me they had a point. These blog posts have received absolutely no comments. Either nobody cares, or nobody is reading the posts. Either way, it is hardly encouraging.
Now, the next point that could be made is that when we get to more common problems, the bond energies are not additive in that way. Or are they? One problem I see is that the actual data are not really suitable for reaching a conclusion.
Let's consider the P – P bond energy, which is needed for considering the bond additivity of any phosphorus compound. I made a quick calculation of the P – P bond energy in diphosphine, on the assumption that the P – H bond energy was the same as in phosphine, and I got 242 kJ/mol. If you look up some bond energy tables, you find the energy quoted as 201 kJ/mol. How did they get that? If you work from the heat of atomization of phosphorus, the bond energy comes out at 221 kJ/mol, but if we assume that refers to the P4 form, it would be in the tetrahedrane structure, which will be strained (although the strain will also stabilize the lone pairs), and of course the standard state is a solid, so in principle energy should be added to get it into the gas phase before atomizing to make the comparison. It is therefore reasonable to assume that the real bond energy is stronger than that calculation indicates.
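For anyone who wants to repeat that sort of bookkeeping, here is a minimal Python sketch of the calculation described above. The enthalpy values are approximate figures of the sort found in common thermochemical tables, included purely for illustration; with a different choice of inputs the answer moves around considerably, which rather reinforces the point below about needing better atomization data, so do not read the printed number as definitive.

# A minimal sketch of the bond-additivity bookkeeping for the P - P bond in diphosphine (P2H4).
# Assumption (as above): the P - H bond energy in P2H4 equals that in PH3.
# The gas-phase enthalpies of formation below (kJ/mol) are approximate, illustrative values only.
dHf_H_atom = 218.0    # H(g)
dHf_P_atom = 316.5    # P(g), from white phosphorus
dHf_PH3 = 5.4         # PH3(g)
dHf_P2H4 = 20.9       # P2H4(g)

# Atomization enthalpies = sum of all bond energies in each molecule.
atomization_PH3 = dHf_P_atom + 3 * dHf_H_atom - dHf_PH3        # = 3 x E(P-H)
atomization_P2H4 = 2 * dHf_P_atom + 4 * dHf_H_atom - dHf_P2H4  # = 4 x E(P-H) + E(P-P)

E_PH = atomization_PH3 / 3.0
E_PP = atomization_P2H4 - 4 * E_PH
print(f"E(P-H) ≈ {E_PH:.0f} kJ/mol, E(P-P) ≈ {E_PP:.0f} kJ/mol")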
The problem is obvious: to make any sense of this, we need more accurate data. We also need the data to involve energies of atomization, and not to rely on the more easily obtained bond dissociation energies. But as far as I can see, the chemical community has given up trying to establish these data. Does it matter? I think it does. For me, a problem with modern chemical theory, which essentially consists of extremely complicated computations, is that it offers little assistance on the issues that matter to the chemist, because no principles are enunciated, merely results and comments on various computational programs. The principles are needed, even if the calculations are not completely accurate, so that chemists can draw conclusions and use them to formulate new plans of action. How many really think they understand why many synthetic reactions work the way they do? Do we care about the very fundamental component of our discipline? And, for that matter, does anyone care whether I write this blog?
Posted by Ian Miller on Feb 29, 2016 2:14 AM GMT
In my last post, I presented evidence that the covalent radius of a Group 1 metal was constant in the dimeric compounds. I also asked whether anyone was interested. So far, no responses, and I suspect the post received something of a yawn, if that, from some because, after all, everyone "knows" there is a constant covalent radius. There is, of course, a problem. Had I included the hydrides, the relation would not have worked. Ha, you say, but the hydrides are ionic. Well, the constant covalent radius of hydrogen simply does not work for a lot of other compounds either. Try methane, ammonia and water. There are various alternative explanations/reasons, but let us for the moment accept that hydrogen does not comply with this covalent radius proposition.
 
If the covalent radius of an atom is constant, then there should be a characteristic wavelength for each given atom when chemically bound, which in turn suggests from the de Broglie relation that the bonding electrons will provide a constant momentum value to the bond. While that is a little questionable, if true it would mean the bond energy of an A – B molecule is the arithmetic mean of the corresponding A – A and B – B molecules. Now, one can argue over the reasoning behind that, but much better is to examine the data and see what nature wants to tell us.
 
Pauling, in The Nature of the Chemical Bond, stated clearly that that is not correct. However, if we pause for thought, we find the arithmetic mean proposition depends on there being no additional interactions beyond those arising from the bonding electrons forming the covalent bond. Thus atoms with a lone pair would be excluded because the A – A bonds are too weak, such weakness usually being attributed to lone pair interactions. Think of peroxides. Then, bonds involving hydrogen would be excluded because the covalent radius relationship does not hold. Bonds involving hybridization may produce other problems. This is where the Group 1 metals come into their own: they do not have any additional complicating features. Far from "not being very interesting", as one editor complained to me, I believe they are essential to starting an analysis of covalent bond theory. So, what have we got?
 
The energies of the A – A bonds are somewhat difficult to nail down. Values are published, but often there is more than one value, and the values lie outside their mutual error bars. With that reservation, a selection of energies (in kJ/mol) is as follows:  Li2 102.3; Na2 72.04, 73.6; K2 57.3; Rb2 47.8; Cs2 44.8
 
The observed bond energies for the A – B molecules are taken from a review (Fedorov, D. A., Derevianko, A., Varganov, S. A., J. Chem. Phys. 140, 184315 (2014)). Below, the calculated values, based on the averages of the A – A molecules, are given first, then in brackets the observed energy, then the difference δ, expressed as what has to be added to the calculated value to get the observed value.
                  Mean     Obs          δ
Li – Na        88.0   (85.0)     -3.0
Li – K         79.8    (73.6)     -6.2
Li – Rb       75.1    (70.9)     -4.2
Li – Cs       73.6     (70.3)    -3.3
Na – K       65.5     (63.1)    -2.4
Na – Rb      60.7    (60.2)    -0.5
Na – Cs      59.2     (59.3)     0.1
K – Rb        52.6    (50.5)    -2.1
K – Cs        51.1    (48.7)    -2.4
Rb – Cs      46.3    (45.9)    -0.4
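As a quick check on the arithmetic, here is a minimal Python sketch that recomputes the mean and δ columns from the A – A energies and the observed A – B energies quoted above; it uses no numbers beyond those already listed in this post.

# Recompute the "mean" and delta columns of the table above.
# A - A bond energies (kJ/mol) as quoted earlier in this post (using 73.6 for Na2).
AA = {"Li": 102.3, "Na": 73.6, "K": 57.3, "Rb": 47.8, "Cs": 44.8}

# Observed A - B bond energies (kJ/mol) from Fedorov et al. (2014), as quoted above.
observed = {
    ("Li", "Na"): 85.0, ("Li", "K"): 73.6, ("Li", "Rb"): 70.9, ("Li", "Cs"): 70.3,
    ("Na", "K"): 63.1, ("Na", "Rb"): 60.2, ("Na", "Cs"): 59.3,
    ("K", "Rb"): 50.5, ("K", "Cs"): 48.7, ("Rb", "Cs"): 45.9,
}

for (a, b), e_obs in observed.items():
    mean = (AA[a] + AA[b]) / 2.0   # arithmetic-mean prediction
    delta = e_obs - mean           # what must be added to the mean to get the observation
    print(f"{a} - {b}: mean {mean:5.1f}, observed {e_obs:5.1f}, delta {delta:+5.1f}")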
 
The question now is, does this show that the bond energies are the arithmetic means of the A – A and B – B molecules? Similarly to my last post, there are three options:
(1) The bond energies are the sum of the atomic contributions, and the discrepancies are observational error, including in the A – A molecules.
(2) The bond energies are the sum of the atomic contributions, and the discrepancies are partly observational error, including in the A – A molecules, and partly some very small additional effect.
(3) The bond energies are not the sum of the atomic contributions, and any agreement is accidental.
What do you think? Are you interested?

 
Posted by Ian Miller on Feb 8, 2016 2:03 AM GMT
Time to start the New Year, and wish the RSC a great 175th anniversary. If we think of the last 175 years, chemistry has made serious changes, both to itself and to our lives. Nevertheless, at heart, chemistry is about molecules, and molecules are groups of atoms held together by what we call chemical bonds. The recent Chemistry World has an article that shows how our understanding of bonding has evolved, but I am not convinced it shows how hard this was. Don't forget, as it stood then, the concept of an electron moving around a nucleus violated one of the greatest triumphs of 19th century physics, namely Maxwell's electromagnetic theory. Would you have been prepared to propose something as radical? Or has the quest to discover died out? It is a lot easier to grasp something when everyone tells you it is correct, but what about when nobody knows? I may annoy a number of people, but I believe we still do not properly understand even the simplest of chemical bonds, and I thought one contribution I could make to the year would be to illustrate a problem over a small number of posts. Amongst other things, I hope to show you how hard it is to form new theories.
 
What I am going to do is to focus on the dimeric molecules of the Group 1 metals. These are of interest to me because they are the simplest molecules, with the fewest complicating issues. Or so I thought. Some of what I am going to put in the following posts was submitted to two journals and rejected by both on what I consider spurious grounds. The first said these molecules were not very interesting; the second said nobody would be interested in what I was proposing (which involved a hitherto unrecognized quantum effect). There were no adverse comments about the physics! I wonder, were these editors right? Perhaps nobody is interested? Perhaps the urge to overturn the wrong has gone? If it looks OK, then do not disturb! Welcome back, Claudius Ptolemy! Of course, just because nobody is arguing, that may be because it is correct. Maxwell's electromagnetic theory is correct, right? Apart from that pesky issue about electrons around atoms, that is. But you know why that is, don't you? Do you really? Maybe in detail things are not quite what you think. Care to try out some problems?
 
So here goes thought number one. Atoms have a characteristic covalent radius, so the bond distance of an A – B molecule is the arithmetic mean of the A – A and B – B bond distances. Do you agree with that statement?
 
Let's test it. The literature contains the necessary A – A bond distances, although of varying degrees of accuracy. A review of the bond properties of the A – B molecules of the Group 1 metals has also recently been published (Fedorov, D. A., Derevianko, A., Varganov, S. A., J. Chem. Phys. 140, 184315 (2014)). So, let's check. First, the covalent radii, i.e. half the A – A bond distances. The following values, in pm, are from various literature sources:
Li2  133.6;  Na2 153.9;  K2 193.5;  Rb2 210.5; Cs2 230
Now, let us look at the A – B molecules. In the following, the column labelled "mean" is the sum of the relevant covalent radii, which is the arithmetic mean of the corresponding A – A and B – B bond distances; the column labelled "Obs" is the measured value from Fedorov et al.; and δ is what must be added to the calculated value to get the observed value.
 
                   Mean       Obs          δ   
Li – Na       287.5    (288.9)       1.4
Li – K         328.1    (332.3)       4.2
Li – Rb       344.1    (346.6)       2.5
Li – Cs       363.6     (366.8)      3.2
Na – K       347.4     (349.9)      2.5
Na – Rb     364.4     (364.3)     -0.1
Na – Cs      383.9     (385)         1.1
K – Rb       404       (406.9)       2.9
K – Cs       423.5     (428.4)       4.9
Rb – Cs     440.5     (437.1)      -3.4
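For anyone who wants to check the arithmetic, here is a minimal Python sketch that rebuilds the table from the radii and the observed distances quoted above; note that the Li – K row comes out very slightly differently from the table, which presumably used a marginally different literature value for potassium.

# Recompute the "mean" and delta columns of the bond-distance table above.
# Covalent radii in pm, i.e. half the A - A bond distances quoted earlier.
radius = {"Li": 133.6, "Na": 153.9, "K": 193.5, "Rb": 210.5, "Cs": 230.0}

# Observed A - B bond distances (pm) from Fedorov et al. (2014), as quoted above.
observed = {
    ("Li", "Na"): 288.9, ("Li", "K"): 332.3, ("Li", "Rb"): 346.6, ("Li", "Cs"): 366.8,
    ("Na", "K"): 349.9, ("Na", "Rb"): 364.3, ("Na", "Cs"): 385.0,
    ("K", "Rb"): 406.9, ("K", "Cs"): 428.4, ("Rb", "Cs"): 437.1,
}

for (a, b), d_obs in observed.items():
    predicted = radius[a] + radius[b]   # sum of radii = mean of the A - A and B - B distances
    delta = d_obs - predicted           # what must be added to the prediction to get the observation
    print(f"{a} - {b}: mean {predicted:6.1f}, observed {d_obs:6.1f}, delta {delta:+5.1f}")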
 
What do you make of that? There are three options:
(1) The bond distance is the sum of the covalent radii, and the discrepancies are observational error, including in the A – A molecules.
(2) The bond distance is the sum of the covalent radii, and the discrepancies are partly observational error, including in the A – A molecules, and partly some very small additional effect.
(3) The bond distance is not the sum of the covalent radii, as shown by the lack of agreement.
What do you opt for? Can you discern any trends? This probably seems fairly obvious to you, but soon it will be less so. The question is, is anyone interested? Were the journal editors right in that nobody cares about the nature of the chemical bond? Will anyone respond?

 
Posted by Ian Miller on Jan 24, 2016 9:13 PM GMT