Is science sometimes in danger of getting tunnel vision? Recently published ebook author Ian Miller looks at other possible theories arising from data that we think we understand. Can looking at problems in a different light give scientists a different perspective?


In previous blogs, I tried to outline some of the pros and cons of ethanol as a biofuel, with the fermentable sugars largely being provided either from crops or from lignocellulose by enzymatic hydrolysis. The difficulty with using lignocellulose is that lignin has evolved to protect cellulose from such enzymatic attack, with the result that the processing plant is very large and there are massive volumes of water and wet spent biomass to process. Further, there is almost as much hemicellulose as cellulose, and because of the variety of linkages in hemicellulose, a set of enzymes is required, at significant additional cost. The alternative is acid hydrolysis, and for some reason this appears to have been discarded as an option. Is that premature?
 
The basic problem with acid hydrolysis is that employing dilute acid results in an unacceptably slow reaction unless heated, but glucose (the desired product) reacts with hot acid to produce hydroxymethyl furfural (HMF), which further reacts to produce levulinic acid plus dark polymeric material. The net result is that in the Madison process, which involves heating wood chips with dilute sulphuric acid, the recovered yield was about 35% of theoretical, which is clearly not good enough. Interestingly, however, there are two options that are seemingly not currently considered.
 
The first is flash hydrolysis. As shown by Chen and Grethlein (Biomass 23: 319-326, 1990), if you heat the biomass for seconds at over 200 degrees C, conditions they argue are reasonably reached in a specially designed cyclonic reactor, yields of up to 87% are claimed. The concept is, of course, that you make the glucose and quench it before the conversion to HMF can get under way. That in turn may not matter much, because if the HMF can be recovered, it too can be converted into useful chemicals and fuel. These conditions will also hydrolyse hemicellulose, and the pentoses, which do not ferment so easily to ethanol, should be recoverable as furfural, which is valuable in its own right. One problem might involve size reduction: the acid has to reach the cellulose to hydrolyse it, a few seconds do not permit much diffusion, so the interior of chips may not be reached. Size reduction is possible, but the work done doing it may be too costly.
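To see why quenching within seconds matters, here is a minimal sketch of the consecutive reactions cellulose -> glucose -> HMF, treating both steps as first-order with purely illustrative rate constants (these numbers are mine, not Chen and Grethlein's):

```python
import numpy as np

# Consecutive first-order reactions: cellulose -k1-> glucose -k2-> HMF.
# Rate constants are purely illustrative, not measured values.
k1, k2 = 1.0, 0.3   # 1/s, at some hypothetical temperature above 200 C

def glucose_fraction(t):
    """Analytical solution for the intermediate in A -> B -> C."""
    return k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))

# The intermediate peaks at t* = ln(k2/k1)/(k2 - k1); quench near t*.
t_star = np.log(k2 / k1) / (k2 - k1)
print(f"optimal quench time ~ {t_star:.2f} s, "
      f"glucose yield ~ {glucose_fraction(t_star):.0%}")
```

The point is simply that the window for a high glucose yield is narrow and set by the ratio of the two rate constants, which is what a seconds-scale cyclonic reactor is trying to hit.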
 
The second option is hydrolysis with chilled 40% hydrochloric acid. Provided the concentration of acid is high enough, the cellulose simply dissolves and converts smoothly to glucose, and further reaction is apparently trivial. The hydrochloric acid is removed by vacuum distillation and recycled. You may be skeptical; however, this is one of the very few processes ever deployed at scale: the Germans made ethanol this way during World War II. As far as can be determined, it worked well even on reasonably sized chips, and there was only one problem: corrosion. However, while biomass processing may not have advanced much since then, materials of construction certainly have.
 
Would either of these processes solve any problem? I do not know, but what concerns me is that there are no readily available data that would permit these processes to be either eliminated from consideration or provisionally considered, so that data can be obtained to address the questions that currently have no answers. Sending public funds after random guesses seems wrong to me. Why not analyse the problem and publish the findings so that funding can be deployed on a more rational basis? A very large amount of money must be invested to compensate for declining oil supplies. Given the current financial constraints, surely it is better not to waste it, and any progress that can be made from theory will save a lot of wastage. Theory is cheap – why not use it?
Posted by Ian Miller on Apr 19, 2012 3:17 AM BST
Why is our solar system different from most of the others we see? How common are planets like Earth that have life on them? Is there life under the ice of Europa? Why will alien life have similar systems to ours? How did we get homochirality, and more to the point, why? I am uploading an ebook with the above title to Amazon that provides answers, and once they put it up, for five days it is a free download, which is why I am mentioning it here. (The answers to these questions are: accretion disks last between 1-10 My after stellar accretion, and our system must be one of the roughly 50% where the disk lasted only ca 1 My; planets like Earth should occur around a significant fraction of that 50% of stars of similar size to Sol, because life is a consequence of planetary formation; life under the ice of Europa is impossible for several reasons, an absence of nitrogen and no mechanism to make phosphate esters being two of them; RNA is the only feasible polymer that can self-reproduce and form abiogenically in dilute aqueous solution; homochirality is a requirement for life and it selects itself, but why is too complicated for such a quick comment.)
 
Why am I putting up an alternative theory as an ebook, instead of through peer-reviewed scientific papers? There are several reasons, including:
(a) The theory is largely chemical, explaining why the various planets and minor bodies have different compositions, but only a few physics journals will publish theories on planetary formation.
(b) Such journals require computer modelling. I can't do that, and in any case the growth of a planet with respect to time requires four terms, none of which have clearly defined values.
(c) A scientific paper really should assert one major point. No single point here is convincing; my argument is that the complete set is.
(d) I want to get it read. So far I have published what I believe are four very significant advances (if correct) in peer-reviewed papers, and I doubt anybody reading this could name one of them.
 
But surely standard theory is adequate, you say? Then consider this. In standard theory you need a relatively massive amount of solids to form planetesimals (with no known mechanism to form them), which then, in our system, take about 15 My to get to the point where Jupiter can start massive gas accretion. Because solids are diluted with distance, the problem gets much worse further out. (Originally Safronov required 10^11 yr to form Neptune. Subsequent models have greatly reduced this, but it is not entirely clear to me how, or, putting it another way, why was Safronov so wrong?) However, there is a planet, LkCa 15b, about three times Jupiter's distance from a star slightly smaller than Sol; it is about five times Jupiter's mass, and LkCa 15 is a ca 2 My old star. (Which is why I believe our system removed its accretion disk early.) If you are interested, the download at http://www.amazon.com/dp/B007T0QE6I is free from April 12-16, which is why I am putting it on this post. I know there should be no commercial point to these posts, but free seems to me to be different.
Posted by Ian Miller on Apr 12, 2012 12:58 AM BST
In my previous two postings, I praised David MacKay's approach to energy analysis in his book, "Sustainable Energy — without the hot air", because he used numbers to put everything into perspective. The problem with numbers is that while they can put a problem into perspective when used correctly, the opposite happens when they are not. As the old computer saying goes, "garbage in, garbage out". In general, numbers showing the dependence of one variable (that being measured) on changes in another effectively represent solutions to partial differential equations. In the lab, this is not a problem; most chemists will have done this, assiduously keeping everything except what we wish to relate constant (or at least trying to). Unfortunately, for living systems, ecological systems, economic systems, and a number of observational systems, such a separation of the variables can't be done, so elements of significant unreliability creep in.
 
There then arises a consequential problem: you have the numbers, but what do they mean? You believe you have shown something, so there is a temptation to take this to an extreme. Worse, in most analyses of complicated topics there is little option but to accept somebody else's numbers for some aspect. Do they really mean what you think they do, or, equally importantly but more difficult to unravel, what the supplier of the numbers thought they meant? In my opinion we need a forum where misinterpretations can be discussed and, if the objections are valid, the conclusions corrected. I do not believe that any single person will have a broad enough and deep enough knowledge of some multi-disciplined problems to get everything right. We need the expertise out there to correct the flaws.
 
An example. In an earlier post I quoted from MacKay as follows: the calculated power available per unit area for pond algae is 4 W/m2 (if fed with CO2), an order of magnitude higher than most land-based biomass; but then he adds that productivity drops 100-fold without added CO2, and he notes that to use the sea, country-sized areas would be required. Does anything about that strike you as odd?
 
My first reaction, perhaps afflicted by living in New Zealand, is that even if country-sized areas of sea are required, so what? Once you fly over the Pacific, it becomes obvious that whatever else we are deficient in, sea surface area is not one of them. Certainly there are other problems in using it, and possibly these will be overwhelming, but let us do some research and find out, and not simply write the possibility off by assumption.
 
My second reaction was to look in disbelief at the two orders of magnitude loss of productivity from not pumping carbon dioxide into the water. The limiting effect of carbon dioxide on photosynthesis will be its solubility in water, but there are many more factors involved in algal growth. For example, most people have heard of algal blooms. Given the second law of thermodynamics, do we really believe that massive increases of carbon dioxide suddenly assembled themselves to cause these blooms? Or do we accept that the masses of algae in some lakes are there because agricultural run-off has delivered the additional nutrients to make them possible? Algal growth is usually limited by nutrients, not by carbon dioxide availability. As an example, one of my first projects in the commercial area was to look at resources for making agar. In the Manukau Harbour, Gracilaria chilensis grows at a rate of a few t/ha/y, but we found one pond where, due to the construction of a road and the dumping of gravel, productivity went up to about 110 t/ha/y. What was different? Gravel for the plants to attach a holdfast to, and the pond being close to the effluent from a sewage treatment plant. Such plants usually discharge high levels of nitrogen compounds and phosphate, which make ideal fertilizers. There was no increase in carbon dioxide levels at this site. Farmers, after millennia of learning, get far better yields than are found in the wild; why should this not happen in the sea? Why do we think we don't have to put any work into making improvements?
 
Yes, I think numbers are important, but only if the appropriate ones are used correctly and their limitations accepted.
Posted by Ian Miller on Mar 30, 2012 3:32 AM BST
In my previous blog, I mentioned David MacKay's excellent book, in which he emphasized that we should focus on numbers. The problem with numbers, of course, is that they are easy to generate, but different sets often do not compare easily. Since then, I have seen an Editorial in Angewandte Chemie International Edition by Hartmut Michel, who more strongly suggested abandoning biofuels and going for electrified transport. So what do we make of this? Michel's case was mainly based on efficiency: plants can only convert about 4% of the light energy, in practice as little as 1%, and about half of that is then used on fertilizer, pesticides, ploughing, transport and the chemical conversion. On the other hand, photovoltaics convert about 15% of the light energy to electricity, so the use of biofuels involves extremely inefficient land use. Note that the argument against biofuels includes externalities (which properly have to be considered), but the argument for electrification omits them. For biofuels, MacKay has calculated the power available per unit area, in W/m2: biodiesel from rape, 0.13; sugar beet to ethanol, 0.4; ethanol from cane sugar, 1.2; ethanol from corn, 0.02; ethanol from switchgrass, 0.2; jatropha on good land, 0.18; jatropha on waste land, 0.065; pond algae, 4 (if fed with CO2). He then disposes of algae in the sea by noting that productivity drops 100-fold without added CO2, and that to use the sea, country-sized areas would be required. Biofuels Digest has listed the following yields, in t/acre/annum: managed prairie, 2.5; switchgrass, 5-9; Eucalypts, 10; Miscanthus, 12; Macrocystis, 50; microalgae, 60. So, what do you make of this raft of numbers?
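To give those W/m2 figures some scale, here is a back-of-envelope sketch. The driving figures are my own illustrative assumptions (15,000 km per year at 8 L/100 km, with fuel at roughly 6.1 kWh/L), not MacKay's or Michel's:

```python
# Rough land area needed to fuel one car from biomass, per feedstock.
# Assumptions (mine, for illustration): 15,000 km/yr at 8 L/100 km,
# fuel energy ~6.1 kWh/L, and MacKay's W/m2 figures taken at face value.
HOURS_PER_YEAR = 8766

fuel_l_per_year = 15_000 * 8 / 100                    # ~1200 L/yr
demand_w = fuel_l_per_year * 6.1e3 / HOURS_PER_YEAR   # average demand in watts

for crop, w_per_m2 in [("corn ethanol", 0.02), ("rape biodiesel", 0.13),
                       ("cane ethanol", 1.2), ("pond algae + CO2", 4.0)]:
    area_ha = demand_w / w_per_m2 / 10_000            # m2 -> ha
    print(f"{crop:18s} ~ {area_ha:5.2f} ha per car")
```

Even with generous assumptions, the spread across feedstocks is more than two orders of magnitude, which is really the point of MacKay's table.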
 
The point of my ebook, Elements of Theory, was to put forward the Aristotelian method of analysing an argument and putting forward a theory. What most people do not appreciate is that Aristotle effectively invented discrete mathematics; unfortunately algebra had not yet been invented, so he had to use awkward sentences instead of simple symbolic statements. However, if we think in his way, we note that transport has one function: to get something from A to B. There is a set of requirements for getting from A to B, and the whole set has to be considered, including convenience. Energy efficiency is not a goal in itself. The provision of fuel is a subset of that set, but not the only one, and these subsets are rather complicated.
 
As an example, MacKay shows how the range problem for electrified cars can be solved with 500 kg of batteries, giving a plausible range of 300 km. That seems fine until, in the fine print, we see we are driving at 50 km/h (30 mph). That is not a typical speed on, say, the M1. From Michel, the necessary electricity is supplied by photovoltaics spanning various deserts, with the current carried by superconducting cables. Leaving aside the issue of night-time, the demand for an enormous number of batteries and for elements like tellurium, one suspects that he has never seen a desert dust storm.
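The range arithmetic itself is easy to check. A minimal sketch, using figures I have assumed for illustration (roughly 120 Wh/kg at the pack level and 200 Wh/km at a steady 50 km/h; neither number is from MacKay's book):

```python
# Does 500 kg of batteries give ~300 km? Assumed, illustrative figures.
pack_wh_per_kg = 120   # assumed pack-level specific energy
use_wh_per_km = 200    # assumed consumption at a steady 50 km/h

range_km = 500 * pack_wh_per_kg / use_wh_per_km
print(f"range ~ {range_km:.0f} km")   # ~300 km

# At motorway speeds aerodynamic drag grows as v^2, so the energy per km
# roughly doubles between 50 and 100+ km/h, and the range shrinks accordingly.
```

The numbers add up, but only at the quoted speed; that is exactly the kind of fine print the raw figure hides.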
 
Of course it is also easy to pick holes in such arguments, and by itself that achieves little except to identify areas where more research might be carried out. I suspect that one of the more promising options has been essentially rejected in the above discussion because everybody believes it is not worth considering while a few seemingly insurmountable problems remain. My argument is that we need the research carried out now to determine whether they can be overcome, but even more importantly, we need to think more carefully about where we are going to invest. More in due course.
Posted by Ian Miller on Mar 22, 2012 10:51 PM GMT
Can society live sustainably? That is the question addressed by David MacKay, a physicist, in "Sustainable Energy — without the hot air" (www.withouthotair.com for a free pdf download), and it is a book I would strongly recommend to anyone interested in this topic. No, it does not, in my opinion anyway, give all the answers, but it does something more important: the author sets out the way a physicist goes about finding them. There are places where I disagree, e.g. I feel that too many of his "possible solutions" are somewhat optimistic because, by his own admission, he has left out the economic issues. However, the book cuts through a lot of the "hot air" by insisting on one very important discipline: put numbers on your proposal. My second reservation is that the numbers have to be good. In my opinion qualifiers are needed, e.g. for tidal and wave power it is all very well to count up the miles of coastline; it is another thing to actually harvest the energy therein. What is great about the book, however, is not the answers it provides, but the framework and the methodology for getting them. Read it and see for yourself.
 
Transport is a particularly difficult area. Perhaps because he is a physicist, he focuses on electricity to power it, although he does accept that some liquid fuels will be required; these tend to be ethanol and biodiesel, and I shall try to show in later posts that this approach is not necessarily optimal. However, there is no doubt that electrification is optimal for major routes employing public transport. The undergrounds in London and Paris are, from experience, "must use" modes. One problem is that the population explosion of the 20th century has unfortunately led to cities that are too spread out, e.g. Los Angeles, as well as smaller ones such as, closer to home, Auckland. Why do commuters in these places not use public transport? Because workplaces and residences are scattered widely in a "salt and pepper" fashion. Without major city restructuring, individual transport is going to be required for some time yet, particularly as long as those in residences do not want to be "polluted" by nearby commerce.
 
This book promptly disposes of the hydrogen economy, at least for private transport (I think you have to have worked with hydrogen to appreciate how difficult leak prevention is), and concentrates mainly on batteries, though it does mention the zinc-air fuel cell. My own view is that more research effort should be put into metal-based fuel cells, but even then I suspect there will be a good place for biofuels. MacKay is correct that biofuels are extremely unlikely ever to solve the transport problem outright; it is simply not possible to make enough of them. Nevertheless, there will be some applications where they can contribute, and we may not have unlimited electricity either. As you may gather from my previous posts, I disagree with MacKay when he says municipal solid waste should be incinerated to provide electricity: resources with carbon-carbon bonds have too much potential to provide liquid fuel. It is possible that the economics do not add up, but again, let us determine some accurate numbers for the options before we close them off.
 
After reading MacKay's book there is one almost inescapable fact: if society wants a lifestyle with any resemblance to the current one in fifty years, the numbers simply do not add up unless nuclear power (including the possibility of thorium reactors) is employed.
 
You don't agree? Then read the book and provide the numbers for an alternative scenario. For the most important point MacKay makes is that arm-waving and adjectives simply do not help. Your theory of what we should do has to add up, and that requires numbers and some simple mathematics.
Posted by Ian Miller on Mar 14, 2012 10:38 PM GMT
Recently I have been involved in a web-discussion hosted by the Royal Society of New Zealand on energy development. Since this is closed to members, I thought I should raise some of my points here. One of my concerns, which I raised in Elements of Theory, is that modern science is not very good at reaching appropriate conclusions, i.e. we do not know all that we know. I am not suggesting that the conclusions of papers are inappropriate; internally, most scientific papers are self-consistent and do not overlook much within their scope. (That does not mean the conclusions must be correct, but it does mean they are sensibly reached.) No, what concerns me is that they are not put in proper context. We may describe a leaf in minute detail and correctly assign its function within the tree, yet the role of the tree in the wood may elude us.
 
I am choosing Range Fuels Inc as an example. According to Jim Lane (Biofuels Digest, 5th December, 2011), Range Fuels received about $160 million in investor funding and $162 million in government commitments, although not necessarily all drawn down. The process apparently involved gasification of biomass to syngas, then conversion of the syngas to ethanol (plus some other alcohols) over its proprietary catalyst. The major funding was for a 40 million gallon per year plant, but once funding was achieved, this appears to have been reduced to a 4 million gallon per year plant; then it turned out that it would produce only methanol for a while, and then, well, the influential people lost patience.
 
This raises some interesting questions. First, methanol technology from synthesis gas has been known since the early 1920s, so that should not have been a problem. There are many gasifiers available, and the water gas shift reaction is very well known. Making a different alcohol mix apparently involved their proprietary catalyst, which should not change the rest of the plant, so the entire process from gasifier to alcohol-synthesis plant should have been essentially "off the shelf". So, what could go wrong?
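For reference, the two pieces of "off the shelf" chemistry involved are the water gas shift reaction, used to adjust the H2:CO ratio of the syngas,

CO + H2O ⇌ CO2 + H2

and methanol synthesis itself,

CO + 2H2 → CH3OH

both operated industrially for the better part of a century.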
 
My guess is project analysis. The first problem is scale: syngas-to-alcohol or syngas-to-hydrocarbon plants are usually massive because they need economies of scale to be competitive. Biomass feedstock is not well suited to processing in large-scale plants. It does not transport easily, and while the yield from an area appears to scale as the square of the radius around the plant, in practice it is closer to linear with distance travelled, because the trucks have to follow roads. Gasification of biomass is not straightforward either. Without appropriate care, too much water and carbon dioxide are made, or a carbon monoxide-rich feed can eventuate, which requires considerable scrubbing of CO2 and gas recycling through the water gas shift reaction. Gasification can waste a considerable amount of the available carbon and hydrogen, while to be economical, very large volumes of syngas are required. This simple fact, based on considerable operational experience of methanol and Fischer-Tropsch processing, should have suggested this was a very bad idea.
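A minimal sketch of the feedstock logistics, with made-up but plausible numbers (a uniform yield of 10 t of biomass per hectare per year; nothing here is Range Fuels data):

```python
import math

# How far must trucks travel to feed a plant of a given size?
# Illustrative assumption: uniform yield of 10 t/ha/yr over the catchment.
YIELD_T_PER_KM2 = 10 * 100   # 10 t/ha/yr = 1000 t/km2/yr

for feed_t_per_yr in (50_000, 500_000, 5_000_000):
    radius_km = math.sqrt(feed_t_per_yr / (math.pi * YIELD_T_PER_KM2))
    # Following roads roughly doubles the straight-line distance.
    print(f"{feed_t_per_yr:>9,} t/yr -> catchment radius ~{radius_km:4.1f} km, "
          f"road haul ~{2 * radius_km:4.1f} km")
```

A hundred-fold scale-up buys only a ten-fold increase in catchment radius, so the haul distance, and with it the feedstock cost, climbs steadily with plant size; the economies of scale at the plant fight the diseconomies of collection.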
 
This brings in what may be a really fundamental problem: those who have the funds were probably not sufficiently skilled to pick this up, and there was no publicly available analysis to guide them. The issue is, $320 million still buys quite a bit. The object of a development program is to locate the mistakes and the things that will go wrong while everything is still cheap (or relatively cheap). It is one thing to take a risk, but success involves minimizing risk, and who knew what the risks were? I am far from convinced that society has the wealth to keep throwing away sums of this size.
 
Interestingly, in early 2010, energy writer Robert Rapier wrote a stinging critique of Range Fuels funding, and noted that the decision to continue funding this very expensive project was at the expense of projects that were, perhaps, more deserving on technical grounds, but less vocal. This raises another hazard, as noted by Rapier: funding should not depend on the boldness of the claims or the loudness of the claimants. The problem is, how do we get things right?
Posted by Ian Miller on Feb 29, 2012 9:14 PM GMT

What is science about? Having a comfortable research area, churning out papers in a very narrow niche, and thus securing further funding? Or taking a risk, getting out of the comfort zone and trying to understand? I am going to propose an experiment that chemists could do that could verify (or not, as the case may be) a major issue in physics. My question is, has anyone got the nerve? First, some background.

 

When James Clerk Maxwell applied his fundamental equations of electrodynamics to a pulse of electric field in transverse wave form, his equations required a similar wave of magnetic field to be generated at right angles, with a phase delay, and that wave then proceeded to generate a new electric field wave. Further, with a little mathematics he showed that the speed of this composite wave was the reciprocal of the square root of the product of the permeability and permittivity of space, which was close to the speed of light as then measured. He therefore concluded that light was an electromagnetic wave. This was one of the great unification moments in our understanding of reality. Unfortunately, within about two decades, evidence indicated particle properties, and the photon was "born". Now everybody thinks they know what a photon is.
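Maxwell's relation is easy to verify numerically; a quick check using modern (CODATA) values for the permeability and permittivity of free space:

```python
import math

mu_0 = 4 * math.pi * 1e-7    # vacuum permeability, H/m (pre-2019 definition)
eps_0 = 8.8541878128e-12     # vacuum permittivity, F/m

c = 1 / math.sqrt(mu_0 * eps_0)
print(f"c = {c:.4e} m/s")    # ~2.9979e8 m/s, the measured speed of light
```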
 

In Maxwell's electrodynamics, the electromagnetic fields act locally; that is, they behave as if a charged particle is continually sending out messages at the speed of light, thus making other charged particles aware that it is there. As far as I can make out, current thinking is that the charged particle sends out pulses of "field", which then behave as above, i.e. the messenger for the electric field is the photon, but since we do not see it, it is a virtual photon. Quantum electrodynamics describes how the virtual particle is supposed to behave; that is outside the scope of this blog, but it accounts for a very restricted set of observations with incredible accuracy.
 

But does that mean the "virtual photon" is a genuine intermediate? What would provide a means of deciding? After all, the value of the scientific method lies in determining an answer, not reciting accounts of that with which we have become comfortable. What appears to be required is an experiment that gives a result that requires the virtual photon and cannot readily be explained by some other means. To devise such an experiment we need to define the properties of a virtual photon. Either the virtual photon is equivalent in most ways to a real photon, or it is not. If it is not, we know nothing about its real nature and the name gives misleading comfort. Theory is useless if any term can take any value "on demand", for then it explains everything and predicts nothing.
 

What do we require? It carries momentum (which leads to the force), it carries energy (because the field itself carries energy), and it interacts in the same way a real photon does. It is postulated to be transmitted because it has oscillating electromagnetic fields. One possible difference involves where the oscillations occur: if real photons oscillate in the x, y dimensions, it has been proposed that virtual ones oscillate in the z, t dimensions.
 

An oscillation in the z dimension involves motion exceeding the velocity of light, while oscillation in time permits negative momentum as the photon travels backwards in time, thus permitting attractive forces. To me, leaving aside the minor problem that length and time are dimensionally different, this has an element of desperation about it; nevertheless, quantum electrodynamics has had remarkable success. However, there are further options that do not seem to have been investigated, such as the use of additional dimensions. The question now is, what helps us devise an experiment to demonstrate their existence?
 

We know that electric fields are attenuated by intervening materials; we describe the degree of attenuation by the dielectric constant. This should indicate that the virtual photons are absorbed, for if they were not, they would progress and deliver the expected force regardless of the intervening medium. That would appear to give a possible method for detecting them. Set up two parallel plates with equal electric charge (to avoid a potential gradient), and insert between them a solution of a molecule that fluoresces. The concept is that the virtual photon should have a certain probability of exciting the molecule, and when the absorber's excited state relaxes via fluorescence, it will emit a real photon, which is detectable.
 

A positive result would be a triumph; a negative one is more problematic because, after all, there may be many reasons why a set is empty. Thus if the virtual photons have the wrong frequency, there is no excited state, and the theory does not specify their frequency. If the frequency represents the number of photons in a ray, then adjusting the voltage should alter the frequency, so there are, in principle, possible variations to try.
 

Understanding implies that when presented with a new situation, you can predict the outcome. Can you predict the outcome of this? If the experiment gives a positive result, a number of experiments follow that would allow a good understanding of virtual photons, and probably fame and fortune would result. The question is, what do you do with a negative result? My guess is that that prospect will prevent anybody from trying, but if you could do it, and you cannot explain before doing the experiment why it would not work under any circumstances, do you really and truly believe in virtual photons? Or don't you care one way or the other?

Posted by Ian Miller on Feb 21, 2012 9:13 PM GMT
Previous comments have raised an important issue: land can only be used for one purpose at a time. If you grow biofuel crops, you are not growing food. So, is the concept of biofuels doomed by the need for food? I do not think so, and I hope to convince some of you, although I must emphasize that biofuels could only ever provide some of our current transport requirements; it is most unlikely that we can get away without changing some of our habits. I am convinced there is no single answer to the transport problem, and we shall need multiple contributions.
 
One promising source, in my opinion, is wastes. One simple calculation I did showed that municipal solid waste (MSW) could be converted to hydrocarbon fuel at a rate of about 2 litres per person per week, based on 1980s MSW production. That may not seem a lot, but it includes all those who do not use vehicles, and it is effectively net production, because fuel is needed to collect the waste anyway: you cannot simply leave refuse on the streets. Forestry wastes are another source. Currently these are usually left lying, or buried, but properly used they could produce very large amounts of fuel. Agricultural wastes are slightly more difficult to assess, because many of them, while they could produce large amounts of fuel, are also valuable for returning nutrients to the soil and conditioning it. However, food processing wastes are highly suitable, because there is no other use for them; think of the huge volumes of waste after pressing olive oil. Sewage sludge is unlikely to have serious conflicting uses. The advantage of using wastes is that in most cases doing so solves another problem, namely what to do with them.
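It is worth seeing what 2 litres per person per week amounts to. A quick sketch; the per-capita rate is the estimate above, while the population is merely an example:

```python
# What does ~2 L/person/week of fuel from MSW add up to?
L_PER_PERSON_PER_WEEK = 2    # the 1980s-based estimate above
population = 4_400_000       # e.g. New Zealand, ca. 2012 (illustrative)

annual_litres = L_PER_PERSON_PER_WEEK * 52 * population
print(f"~{annual_litres / 1e6:.0f} million L/yr "
      f"(~{annual_litres / 159 / 1e6:.1f} million bbl/yr)")
```

Not a replacement for oil, but far from trivial for a resource we currently pay to bury.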
 
Objecting to growing crops seems wrong to me unless those crops have better uses. While there are food shortages, these are in specific localities, and elsewhere farmers have been paid to grow nothing; they might as well grow something useful. Think of sugarcane in Brazil: if you did not use it for fuel, what would you do with it? And why not use the mountains of bagasse that are created? I have driven through selected parts of Brazil, and there are vast tracts of land that are essentially doing nothing, and have little environmental benefit either. Maybe they are not much use for growing food, but if so, is it wrong to grow fuel crops on them? Then there is marginal land. Members of the Euphorbiaceae frequently have relatively high lipid or hydrocarbon contents and are alleged to grow on marginal land; the problem is that marginal land may support special plants, but the yields are usually marginal too, and whether it is worth the effort is another matter.
 
The aqueous environment also offers scope. One project I have been involved in is to use microalgae grown in sewage ponds. This option has two advantages: besides producing fuels, and some interesting chemical options, the microalgae also scrub excess nutrients from the water, thus reducing pollution. Similarly, the marine environment offers further opportunities. Macrocystis is one of the fastest growing plants on the planet, and it is really fascinating to have a small plant growing under a microscope: you can watch the cells divide right before your eyes. More to the point, the US Navy once demonstrated that you can grow it on rafts in deep water. Whatever else we are short of, ocean area is not one of them.
 
What is interesting is that most of these opportunities are currently being ignored. All require serious technical work, but is that not what scientists should be advocating? For example, the US Navy project had no problem growing the plants, but because the rafts were anchored to the seafloor, the project was destroyed in a storm. Such a failure, though, is no reason to give up; there are other ways of going about such a problem. (The project also probably wound up because the price of crude began falling rapidly at the time.) All of which raises the question: how does society recognize its options and organize itself to carry out the necessary actions in a timely fashion? Current evidence is that society is half asleep at the wheel. Yes, in principle it sees a problem, but that will be in the distant future, or so it believes. What it does not realize is that developing the technology to deal with such problems, and implementing it on the necessary scale, cannot be done in a few years. We need to be doing more now.
Posted by Ian Miller on Feb 7, 2012 9:06 PM GMT
There appears to be a theory running around regarding the world's economy that goes something like, "We've had troubles, but in a few years it will be business as usual." Is that correct? I think some have considered me just plain pessimistic because I think the answer is, "Not really," and I also think things will get very much worse in the course of time. Since I am now semi-retired, and research work for a private researcher in NZ is not exactly promising in these tight economic times, I wrote a thriller and self-published it as an ebook; it is set in about 2030, when economies are on the verge of total collapse due to a lack of oil. Part of the reason for writing it was to get people to think about our future prospects if we continue to do nothing about them, but the question is, can things really get that bad?
 
In this context it might be of interest to read an article by Murray and King (Nature, 481, 433). Their basic point is that oil production has become inelastic (i.e. an increase in price does not lead to an increase in production), and peak production is about 75 million bbl/day. Yes, small new fields are being discovered, but the fields we know about are declining at between 4.5 and 6.7% per annum. They quote the US Energy Information Administration as projecting demand for oil in 2030 (the accidental date of my novel!) to be 30% higher, and to meet that, we need new fields producing about 64 million bbl/day, almost as much again as we are producing now. That is not likely to happen. Does it matter right now? The increase in the price of petrol of 20 cents/litre between 2010 and 2011 cost the US $280 million/day. Italy currently spends $55 billion/a on oil, up from $12 billion in 1999, and that difference is very close to its annual trade deficit.
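The arithmetic behind that 64 million bbl/day figure is easy to reproduce. A sketch assuming the lower 4.5% decline rate compounded from 2012 to 2030 (my reading of their figures, not a calculation quoted from the paper):

```python
# Reproducing the ~64 Mbbl/day "new capacity" figure from Murray & King.
current = 75.0    # Mbbl/day, roughly today's production
decline = 0.045   # 4.5%/yr, the low end of the quoted decline range
years = 18        # 2012 -> 2030

remaining = current * (1 - decline) ** years   # existing fields by 2030
demand = current * 1.30                        # EIA's projected 30% increase

print(f"existing fields by 2030: ~{remaining:.0f} Mbbl/day")
print(f"new capacity required:   ~{demand - remaining:.0f} Mbbl/day")
```

Existing fields dwindle to roughly 33 Mbbl/day, so roughly 65 Mbbl/day of new capacity is needed, in line with the figure quoted.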
 
Murray and King's point is that economies cannot continue like that; if they cannot make changes, change will be imposed. Every additional dollar spent on petrol is a dollar that cannot be spent on something else, which means somebody else's job disappears when the products they make cannot be sold; that in turn means there is less tax to pay for government services (let alone pay off debt); and of course the unemployed person, besides drawing a benefit and incurring additional government costs, cannot spend on services, which leads to further unemployment, and so on. Then again, if society does not get started soon and lets the debt get out of hand, it will not have the capital it needs to make the necessary changes.
 
An immediate conclusion might be that biofuels cannot possibly make up that slack, and that is probably correct, at least in the near future. In my opinion, the only area available for production on such a massive scale is the oceans, and as yet we have no idea how to make that practical. Tar sands and coal similarly cannot make up the slack, because there is simply not enough there. So, do we give up on biofuels?
 
In my opinion, no. Biofuels cannot replace oil, but do they have to? Oil is put to a number of uses, many of them simply because the oil is there. Oil is not necessary to make electricity, but it is difficult to see an alternative to liquid fuels for powering aircraft. Similarly, I live not far from a major road, and it is interesting that at rush hour there are large numbers of cars going both ways, although there is obviously a preference for commuters heading to the major city centre. The question then is, why don't we arrange our lives so that we live closer to work? If everybody lived within 5 km of work, they could all cycle there. Logic suggests there are answers that could make the future quite attractive, but they need both cooperation and coordination, and there are few if any indications that politicians see such possibilities. The problem, of course, is that the real difficulties will occur in what the politician sees as the distant future (i.e. well after the next election), which raises the question: what should people who believe such problems are coming do about it? Suggestions welcome, because if people at least begin discussing it, there will be wider appreciation of the problem, assuming there is one.
Posted by Ian Miller on Jan 26, 2012 3:06 AM GMT
Polly-Anna Ashford posted a comment about PhD supervisors, and instead of commenting, I thought I would post my experience here, mainly because it was my PhD that got me into the alternative interpretations game.

My PhD started like this. I was given a nice-looking project, but just as I was about to start, there it was, published in the latest JACS. (Better then than two years later!) My supervisor gave me two possible new projects and went on holiday. Project 1 involved measuring the rate of some reactions; however, a quick search of the literature gave the answer: zero! Project 2 verged on the suicidal: heating about 3 kg of material with nearly a kg of a diazo compound to get a few grams of starting material from several kg of carcinogenic tar. So the head of department suggested I find my own project, which I did: I would enter the emerging controversy on whether cyclopropane was an electron-conjugating entity. It seemed a good project but for some synthetic difficulties, not helped by the fact that one key Tet. Lett. kindly omitted the fact that a key reaction had to be done at minus 80 degrees! (Silly me for not picking that.) So I had to make everything the long way, and by the time I was getting to the meat of the project, my supervisor disappeared off on a 1 yr sabbatical. In the end it dragged out to nearer 18 months, by which time I had virtually had to work it all out myself.

Four things then happened in fairly quick succession. My supervisor reappeared and came up with a genuinely constructive idea, then disappeared again, searching for a better paying job in North America. The scientific community became firmly convinced that cyclopropane did conjugate because (a) it stabilized adjacent positive charge, and (b) use of the Hammett equation showed that the constant rho, which measures the degree of transmission, was 30% higher than for (CH2-CH2). However, in my work the Hammett sigma "constants", which actually vary depending on whether there is conjugation (!), showed no conjugation. I then realized that the value of rho was exactly as expected with no conjugation but with the additional path, and, more to the point, I saw why cyclopropane should stabilize adjacent positive charge without conjugating. The reason lies in the nature of the potential energy in electromagnetic interactions. (In my ebook I give the example: throw a stone in the air. At the top, its kinetic energy has been converted to potential energy, but where is that energy?) In short, I decided that if you considered cyclopropane to be a strained system, before you start postulating strange effects, your reference point should be in accord with Maxwell's electromagnetic theory. The problem then, of course, is that Maxwell's electromagnetic theory is not included in most chemistry courses, which is odd since chemistry depends on electromagnetic interactions. The fourth event was, of course, that I had to write up my thesis, more or less on my own. What should have happened was that my supervisor sat me down in front of a physicist and got my analysis sorted out, but recall my supervisor was on another continent. So, Polly-Anna, are things for you really that bad?
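For readers who have not met it, the Hammett equation is

log(k/k0) = ρσ

where k0 is the rate constant for the unsubstituted compound, σ characterizes the substituent, and ρ measures how strongly the reaction centre feels it. It is that transmission coefficient ρ, and the constancy or otherwise of σ, that the argument above turns on.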

Post-script. My supervisor, who had no intention of putting his neck out, refused to publish the data resulting from his "good idea" because the results contradicted the emerging consensus and the emerging "proof" of conjugation through MO theory, including some work published by the emerging John Pople. These versions of MO theory were exactly the same as those that predicted the exceptional stability of polywater, so forgive me if I am unimpressed by that "proof". I published a small series of papers, but I botched the first one by assuming the reader would follow the essence of Maxwell's theory, and I got carried away by the fact that I had an approximation that gave an analytical solution to an otherwise insoluble set of partial differential equations. The highlight for me came when I realized that the n -> pi star transitions of a carbonyl group would generate a charge transfer towards the cyclopropane ring, and while conjugation requires a bathochromic shift, my theory permitted me to calculate the hypsochromic shift to within 1 nm of observation. Triumph? Well, no. One review dismissed these hypsochromic shifts as "unimportant". The authoritative review came out and decided that cyclopropane conjugates, and this is what is found in most textbooks. It completely ignored my work, and it completely ignored all the data I had found to support my theory. I later wrote a logic-analysis type of review in which I listed over 60 different types of observations that are not in accord with the conjugative theory. The journal that prompted this review rejected it because there was too much mathematics! Other journals refused to accept logic analyses.

So, Polly-Anna, and others who are a little less happy at the bench, be grateful you do not make too much of a discovery. It is not the good thing I imagined it to be when I made mine. If you go against consensus, too many people have too much to lose, so if you do not win quickly, you lose badly, and you would be surprised how many people dismiss you out of hand. On the other hand, join the flow and anything is forgivable. You may recall John Pople won a Nobel Prize; his assertions on the stability of polywater were put aside for that!
Posted by Ian Miller on Jan 14, 2012 11:06 PM GMT