Sunday, December 31, 2006
A planet is a celestial body that (a) has sufficient mass for its self-gravity to overcome rigid-body forces so that it assumes a hydrostatic equilibrium (nearly round) shape, and (b) is in orbit around a star, and is neither a star nor a satellite of a planet.
This definition was supplemented with the following definition of a satellite:
For two or more objects comprising a multiple-object system, the primary object is designated a planet if it independently satisfies the conditions above. A secondary object satisfying these conditions is also designated a planet if the system barycenter [center of mass] resides outside the primary. Secondary objects not satisfying these criteria are 'satellites'.
If accepted, this joint definition would have raised the number of planets in our solar system from 9 to 12, by adding: the largest asteroid Ceres; Charon, previously treated as merely Pluto's largest moon, but considered to form a double-planet system with Pluto under the supplementary definition above; and 2003 UB313, an icy body more than twice as far from the Sun as Pluto.
However, the proposed definition was rejected by the majority of the participants at the assembly, and, after much belligerent argument, in its place came a definition based upon the orbital dynamics of a planet. The assembly agreed to define a planet to be an object which is neither a star nor a satellite, and which has 'cleared out the neighbourhood of its orbit'. Under this definition, there are only eight planets in our solar system, and Pluto is cast out from the club. One of the advocates of the orbital dynamics approach, Steven Soter, attempts to justify this definition in the January 2007 issue of Scientific American:
Soter rejects the definition proposed by the IAU committee, arguing that "asteroids and KBOs [Kuiper Belt Objects] span an almost continuous spectrum of sizes and shapes. How are we to quantify the degree of roundness that distinguishes a planet? Does gravity dominate such a body if its shape deviates from a spheroid by 10 percent or by 1 percent? Nature provides no unoccupied gap between round and nonround shapes, so any boundary would be an arbitrary choice."
The fact that there is a continuum of object types, and one has to draw a dividing line at some, perhaps semi-arbitrary point, is a poor reason to reject a proposed definition of what philosophers call a 'natural kind'. Such continua, and such arbitrary points of division are endemic to the definition of natural kinds other than elementary particles, and Soter faces exactly the same problem with his own preferred definition: "The IAU may need to amend the definition to specify what degree of clearing qualifies a body as a planet. I have suggested setting the cutoff at a µ value of 100. That is, a body in our solar system is a planet if it accounts for more than 99 percent of the mass in its orbital zone. But the exact value of this cutoff is not critical. Any value between about 10 and 1,000 would have the same effect." Perhaps, however, the choice of objects included or excluded as planets is more sensitive to the choice of a cut-off in the case of the sphericity condition, and Soter might therefore be able to justify his definition on this basis.
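Soter's discriminant is at least easy to state as a rule. The following is a minimal sketch of one reading of it, where µ is the ratio of a body's mass to the total mass of the other material sharing its orbital zone (so µ > 100 means the body accounts for more than 99 percent of the zone's mass); the mass figures in the example are merely illustrative, not precise values.

```python
def soter_mu(body_mass, other_mass_in_zone):
    """Soter's planetary discriminant: the ratio of a body's mass to
    the remaining mass in its orbital zone."""
    return body_mass / other_mass_in_zone

def is_planet(body_mass, other_mass_in_zone, cutoff=100.0):
    """Planet if the body dominates its zone, i.e. mu exceeds the
    (admittedly semi-arbitrary) cutoff of 100."""
    return soter_mu(body_mass, other_mass_in_zone) > cutoff

# Illustrative figures in units of 10^21 kg: Earth utterly dominates
# its zone, whereas Pluto shares the Kuiper belt with comparable mass.
print(is_planet(5972.0, 0.03))  # Earth-like case -> True
print(is_planet(13.0, 30.0))    # Pluto-like case -> False
```

Note that, as Soter says, the exact cutoff hardly matters: the Earth-like case above has µ in the hundreds of thousands, the Pluto-like case has µ below 1, so any cutoff between 10 and 1,000 sorts them identically.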
Soter ends his article by making the excellent point that, "to be useful, a scientific definition should be derived from, and draw attention to, the structure of the natural world. We can revise our definitions when necessary to reflect the better understanding that arises from new discoveries." Definitions of natural kinds in science are malleable concepts, which change as our understanding changes, unlike the stipulative definitions found in pure mathematics. The debate over the definition of a planet in astronomy is an excellent case study of this.
Saturday, December 30, 2006
Of course, to ensure that nobody engages in binge-drinking anymore, we now have very late opening hours, and this means that I generally can't get to sleep for the noise between 12am and 3:30am on a Saturday night/Sunday morning. In response I have developed a set routine: Watch 'Match of the Day', have a bath between midnight and 1am, log-on to the Sunday Times website and see what's in tomorrow's paper, and finally, listen to Dave Aldridge's film review on Radio 5 between 2:30am and 3:30am. And strangely, I have begun to look forward to this part of a Saturday. Particularly the bath.
I've never understood why people take showers. A shower is just such a stressful experience: the water hammering against the wrong parts of your body, flooding down your face into your eyes and ears, the water always at the wrong temperature, the soap or shampoo going missing, the water leaking out the door of the cabinet. Sure, a shower cleans you, but it's not an enjoyable experience. Contrast that with the luxurious experience of a long, hot, relaxing soak in the bath. A bath is to a cup of tea what a shower is to a cup of coffee. You can read a good book as the heat diffuses through your tired limbs, or just shut your eyes, and let the alpha waves ripple gently across your brain as the visual cortex idles. And the irking just melts away.
Friday, December 29, 2006
Thursday, December 28, 2006
Given the benefits of self-regulation, what do our highly-paid policy-makers propose? John Birt: ‘No comment.’ Transport Minister Douglas Alexander: ‘Road charging!’ Head of Highways Agency, Derek Turner, in charge of de-congesting our roads: ‘Speed delimiters!’ In other words, more expensive technology to hamper human nature and expand the control industry.
Cassini certainly has a point: there is an excessive amount of top-down, government planning and control of the road network, and this has reduced the capacity of the road network at exactly the time when the demand placed upon it is at its greatest. Traffic lights, bus-lanes, and one-way systems have all contributed to congestion, and this ideological trend needs to be reversed. However, it is worth pointing out that when cars do collide at an unregulated junction where there is a low volume of traffic, they often do so at moderate or high speed, and because kinetic energy scales with the square of velocity, the damage incurred is considerable. Those who work in risk management are often most concerned with low-frequency, but high-impact events, and I suspect that complete de-regulation of all junctions would increase the number of high-impact accidents. Almost counter-intuitively, then, it is perhaps only the busiest junctions from which we should remove the traffic lights, and substitute a filter-in-turn system.
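The point about impact energy can be made concrete. Kinetic energy grows with the square of speed, so doubling the speed of a collision quadruples the energy which must be dissipated; a quiet rural junction taken at speed is therefore far more dangerous, per collision, than a congested urban one taken at a crawl. The car mass below is just a representative figure:

```python
def kinetic_energy(mass_kg, speed_ms):
    # KE = (1/2) * m * v^2
    return 0.5 * mass_kg * speed_ms ** 2

car = 1200.0  # kg, a typical small car (illustrative figure)
low = kinetic_energy(car, 10.0)   # 10 m/s, roughly 22 mph
high = kinetic_energy(car, 20.0)  # 20 m/s, roughly 45 mph
print(high / low)  # -> 4.0: twice the speed, four times the energy
```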
Wednesday, December 27, 2006
- 2006 was the warmest calendar year in the UK since records began, approximately 300 years ago. 273 out of the 365 days were warmer than average.
- It was the coolest winter in the South of the UK since 1996-1997.
- It was the driest winter in England and Wales since 1964.
- There was very little snow in the winter.
- It was the latest spring for 25 years, with snowfalls in both March and April.
- The first day of the year with a temperature of at least 70F was May 3rd. The temperature on May 4th was 82F. The transition from winter to summer therefore took place in one day! There were then 10 days or so of warm weather before rain returned in the second half of May.
- It was the warmest June since 1976. July was, in some places, the hottest ever recorded, but averaged over the UK, it was merely equal hottest with July 1983. Neither calendar month was as hot as a 30-day period which occurred from mid-July to mid-August of 1995. It was, however, the hottest June-July pair of months ever recorded in the UK. There is normally a type of switch in the UK's weather, which occurs around St Swithin's day, which ensures that the weather in late July and August is the opposite of that in June. If one period is dominated by a continental air mass, the other will be dominated by Atlantic weather. This transition failed to occur this year.
- August was cooler than average, with plenty of rain. It was the largest ever recorded July-August drop in temperature.
- It was the warmest autumn ever recorded in the UK: September was the warmest September on record, and October was the 4th warmest. All three autumn months had plenty of rainfall.
- It was the warmest first half of December since 1988, but with fog and frost setting in around the 16th/17th December. The fog in this period was the most persistent and widespread since the late 80s/early 90s.
Plenty to think about there!
Saturday, December 23, 2006
Friday, December 22, 2006
These are difficult times for rational people, particularly in the United States. Those of us who believe that scientific evidence should be the bedrock of policy formation, that logic should be the basis for argument and that uncertainty should beget tolerance are not honored in the political world. Rather, scientific evidence is ignored when it leads to politically unacceptable conclusions, logic is tossed aside when faith is involved, and tolerance for minority opinions is simply out of political fashion. Why should this be? For one thing, we seem to be becoming an increasingly religious country, and because religion supplants evidence and logic with faith—and faith can mean anything you want it to—politicians can get away with appealing to faith without having to justify themselves...
Whether it be jihad, opposition to stem-cell research, or teaching of intelligent design, religion is the genesis of more of our news than at any time I can remember. Because of the central role of religious belief in U.S. political life, this is a good time for a hard look at its nature...
I am glad Dawkins took the time to write The God Delusion at this moment in history. In the United States, there is an increasingly pervasive assumption that Christianity is our state religion. In fact, the tolerance of other religions that was so much a part of American politics, at least in the post-World War II era, is giving way to an increasing focus on Christianity as the only true belief. Atheism has never had a strong position in the United States, and it is hard to imagine a politician today publicly admitting to such views.
Thursday, December 21, 2006
It has been claimed that the reduced visibility itself is causing the problems, (http://news.bbc.co.uk/1/hi/uk/6200527.stm). To some extent, this makes obvious sense. Unless my understanding is out-of-date, there are basically four main controllers in tower control at Heathrow: the Ground Movement Planner (GMP), the Ground Movement Controller (GMC), the Air Controller (Departures) (Air D), and the Air Controller (Arrivals) (Air A). The GMP is responsible for the sequence and timing with which departing aircraft are cleared to start their engines; the GMC is responsible for clearing departing aircraft to receive a 'push-back' tow from a tug onto a taxiway, and for directing incoming and outgoing aircraft along taxiways, to and from the runways; Air D is responsible for the timing and sequence with which aircraft are cleared to take-off from the departures runway; and Air A is responsible for clearing aircraft to land on the arrivals runway.
Now, the aircraft themselves can land in foggy conditions without undue difficulty, but foggy conditions do pose a problem for the GMC, who normally observes the movement of the aircraft out of the window of the control tower. Note, however, that the GMC also utilises the Surface Movement Radar (SMR), which should be independent of foggy conditions, and the GMC operates at night-time, albeit with the help of a 'lighting operator', who controls the pattern of lights along the taxiways.
I have heard it claimed that the main problem at Heathrow lies not only with the surveillance capability of the GMC in the fog, but with the quality of the radio telephony in these conditions. If radio communication between the pilots and controllers is being distorted by the foggy conditions, then obviously this is a good reason for increasing the separation between aircraft.
Sadly, BAA have not considered using a FIDO system to clear the fog away from Heathrow, http://en.wikipedia.org/wiki/FIDO_(device). This was a British World War II system which ran pipelines down either side of a runway, with burners located at regulated intervals. Fuel would be pumped down these pipelines, and ignited at the burners to clear the fog!
Tuesday, December 19, 2006
In a series of "engineering runs", both facilities identified and tried to minimize all sources of noise. For example: microseismic noise, caused mainly by ocean waves hitting distant shores. Thermal noise of various sorts, minimized by cooling things to 2 kelvin, hanging mirrors attached to fused quartz test masses on steel wires... and many other clever tricks! Shot noise, meaning the uncertainty in the laser beam phase due to quantum mechanics. Radiation pressure noise, from the lasers pushing on the mirrors! Noise from residual gas in the evacuated tubes. And so on.
The battle against noise and other sources of error led in some strange directions. The Livingston facility had to remove a cattle guard at the entrance because of the microseismic noise produced whenever a car rolled over it. More annoyingly, it turned out that commercial logging near this facility caused real trouble every time a tree fell. And at the Hanford facility, wind-blown tumbleweeds piling up along the pipe would sometimes throw the beam out of alignment, thanks to their gravitational pull.
In this week's post, Week142, Baez berates the US government for its plans to set up a base on the Moon, and for the peril in which this apparently places a number of fascinating projects, such as the Laser Interferometer Space Antenna (LISA):
Monday, December 18, 2006
The New Scientist article also raises the intriguing possibility that we might want to deliberately inject sulphate aerosols into the upper atmosphere today as an anthropogenic means of controlling global warming. (Some similar suggestions are mentioned in Bryan Appleyard's excellent Sunday Times survey of the various possible responses to global warming: http://www.timesonline.co.uk/article/0,,2099-2208385_1,00.html).
Volcanism, in fact, may be far more beneficial for the environment than often thought. The 'Naked Scientist', Chris Smith, has spotted an excellent idea from Pete McGrail of the Pacific North-West National Laboratory: volcanic basalt may provide an excellent means of sequestrating carbon dioxide, (http://seattletimes.nwsource.com/html/localnews/2003445393_carbonstorage24m.html). When carbon dioxide forms a very strong solution in water (forming carbonic acid), and is then pumped into the sponge-like structure of basalt, the result is solid calcium carbonate, a comparatively safe form in which to store carbon.
Did Rowling copy the lot off you? “If my lawyer was here he’d say, ‘Do not open your mouth’,” laughs Pratchett, before making a visible effort to be conciliatory. “Look, if Tolkien hadn’t written The Lord of the Rings I couldn’t have written the Discworld series. It’s how a genre works. Everyone makes their cake from the same ingredients.” Is Rowling’s cake too similar to yours? “I’m not answering that,” he squeaks.
Sunday, December 17, 2006
Now comes evidence of BMW's social conscience: the turbosteamer. The thermal efficiency of the internal combustion engine in an automobile is appalling, at around 25%. (The thermal efficiency is the fraction of the heat energy generated in combustion which actually goes into mechanical work; the rest of the heat energy is carried away in the exhaust gases and the fluid cooling the engine). Transforming the efficiency of the internal combustion engine should be a priority for a society which seeks to mitigate global warming and to extend the lifetime of its oil reserves. In this respect, the turbosteamer is a significant step forward, increasing fuel consumption efficiency by 15%. (http://www.gizmag.co.uk/go/4936/). The turbosteamer uses heat energy from the exhaust gases and the cooling system to power a belt-drive attached to the crankshaft. BMW estimate it will take a decade to put this into production, but perhaps government should get involved in accelerating this timescale, passing legislation which requires new cars to have certain minimum levels of thermal efficiency, rising with each passing year.
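A rough sanity check on those figures, assuming (as seems the natural reading) that the quoted 15% is a relative improvement in fuel efficiency rather than 15 percentage points of thermal efficiency:

```python
base_efficiency = 0.25  # typical thermal efficiency of a petrol engine
improvement = 0.15      # turbosteamer's claimed relative gain

# Effective efficiency with the turbosteamer fitted
new_efficiency = base_efficiency * (1 + improvement)
print(new_efficiency)   # -> 0.2875, i.e. about 29%

# Equivalently, fuel burned for the same mechanical work falls by ~13%
fuel_ratio = base_efficiency / new_efficiency
print(round(1 - fuel_ratio, 3))  # -> 0.13
```

So even this modest-sounding step recovers a useful slice of the three-quarters of the fuel's energy currently thrown away as heat.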
Saturday, December 16, 2006
A couple of interesting facts about methane have emerged in the past year, which have received little attention from the media. Firstly, it was discovered that trees and plants actually release large amounts of methane during their normal lifetime. "This effect is completely missing from climate change and biogeochemical models," according to Peter Cox of the Centre for Ecology and Hydrology, at Winfrith in Dorset, UK. (http://news.bbc.co.uk/1/hi/sci/tech/4604332.stm, http://www.newscientist.com/channel/life/mg18925343.900-the-lungs-of-the-planet-are-belching-methane.html). It had been understood for some time that dead trees and plants release methane into the atmosphere as bacteria consume the dead plant matter. At any one time, a certain proportion of a forest will consist of decaying plant matter, so it was known that a forest is a source of atmospheric methane. The latest research suggests that the methane released by trees and plants during their normal lifetime could be responsible for 10-30% of atmospheric methane production. David Lowe of New Zealand's National Institute of Water and Atmospheric Research points out that "We now have the spectre that new forests might increase greenhouse warming through methane emissions rather than decrease it by sequestering carbon dioxide."
One can well understand why environmentalists do not want to publicise such a possibility. Moreover, it has recently been discovered that the growth-rate of atmospheric methane was at its greatest in the 1980s, and there has been no increase at all in the past seven years. (http://environment.newscientist.com/channel/earth/dn10643-emissions-of-key-greenhouse-gas-stabilise.html, http://news.bbc.co.uk/1/hi/sci/tech/6170736.stm). As the BBC article states "some scientists and policymakers...suggest that cutting emissions of methane could be a more effective way of curbing climate change than focusing on carbon dioxide."
The lack of attention devoted by environmentalists to methane, and their obsession with carbon dioxide, supports the following hypothesis: most 'environmentalists' are fundamentally motivated by political and economic beliefs, rather than environmental principles alone; a large proportion of 'environmentalists' are anti-capitalists, who see environmentalism as a tool for attacking the industrial, capitalist world economy. The global capitalist economy is largely built upon the burning of fossil fuels, so the anti-capitalists are seeking to bring down the system by attacking its foundation. Hence, many environmentalists argue that the only way to avoid catastrophic climate change is to reduce economic growth, and perhaps to even accept economic contraction. The rational alternative, to transform the efficiency with which we expend our energy resources, seems to receive scant consideration as the overall solution.
Of course, the truth-value of a proposition, and the validity of an argument, is independent of the person expressing that proposition or argument, and independent of the motives they may or may not possess. However, understanding the motives of environmentalists helps one to assess the credibility of their science, and the credibility of their economic proposals.
Friday, December 15, 2006
Thursday, December 14, 2006
We have a similar problem in the UK with our ground transport system, namely the road and rail network. The control surfaces of automobiles and trains are assumed to be rotating wheels, and this method of control entails the need for an artificially constructed network of roads and rails. Such a network channels all the cars and trains into small areas of the land surface, and this severely limits the capacity of the ground transport system. The demand placed upon our ground transport system continues to grow, and due to the economic cost and environmental implications of building new roads and railways, our ground transport system becomes increasingly inefficient with each passing year.
In aviation, the concept of 'free flight' has been suggested as a partial remedy to the airspace capacity problem. The idea is to make each individual aircraft responsible for its flight-path, rather than an external air traffic controller. Each aircraft is to be equipped with the necessary radar and avionics to predict potential conflicts with other aircraft, and to maintain safe separation. By so doing, aircraft would utilise a larger region of airspace.
I would like to propose that a similar concept of 'free motoring' should be applied to our ground transport network, at least the automobile component of it. I would like to resurrect a suggestion made by Arthur C. Clarke some decades ago, that our favoured form of personalised ground transport should be a hovercraft. (At least, I propose that our ground vehicles should have hovercraft capabilities; the best solution may be a wheeled-hovercraft hybrid). This would instantly free us from our dependency upon a road network, or at least our dependency upon the motorway and trunk-road network between towns and cities. Hovercraft can travel across open land without any need for an aggregated road surface, and without causing any damage to the land. There would be no need for new road-building, and because such vehicles would not be channelled into small areas of the land surface, the capacity of the ground transport system would be boosted. The money currently taken by the Exchequer in the form of road tax and petrol tax, could be used to pay land owners an annual fee for the use of their land.
At the time Clarke proposed his idea, the problem of collisions between such freely driven vehicles would have been insurmountable. Today, the hardware and software being developed for free flight could equally be used to predict and prevent collisions between ground vehicles.
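The conflict-prediction step at the heart of free flight (and of any 'free motoring' scheme) can be sketched as a closest-point-of-approach calculation: extrapolate each vehicle's motion in a straight line, and check whether the minimum future separation falls below a safety threshold. This is, of course, a drastically simplified 2-D sketch under a constant-velocity assumption; real systems must cope with manoeuvring vehicles, sensor uncertainty, and coordinated avoidance.

```python
def closest_approach(p1, v1, p2, v2):
    """Time and distance of closest approach for two vehicles moving
    with constant velocity in 2-D (straight-line extrapolation)."""
    # Relative position and relative velocity
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    dv2 = dvx * dvx + dvy * dvy
    if dv2 == 0.0:
        t = 0.0  # identical velocities: the separation never changes
    else:
        # Minimise |dp + t*dv|^2  =>  t = -(dp . dv) / |dv|^2,
        # clamped to the future (t >= 0)
        t = max(0.0, -(dx * dvx + dy * dvy) / dv2)
    cx, cy = dx + t * dvx, dy + t * dvy
    return t, (cx * cx + cy * cy) ** 0.5

def conflict(p1, v1, p2, v2, min_sep=50.0):
    """True if the two vehicles will pass within min_sep metres."""
    _, dist = closest_approach(p1, v1, p2, v2)
    return dist < min_sep

# Two vehicles converging on the same point:
print(conflict((0, 0), (10, 0), (500, 500), (0, -10)))  # -> True
```

With a prediction like this in hand, the avoidance logic (who gives way, and how) is the genuinely hard part, which is presumably why the hardware arrived decades before the concept became practical.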
Wednesday, December 13, 2006
Personally, I've always felt that our infantry should have jet-packs.
Tuesday, December 12, 2006
"That day on his way to work he stopped at the newsagents, as usual, to buy a newspaper. He paid for it but, on the way out, when the shopkeeper wasn't looking, Robert took a chocolate bar from the shelf and slipped it into his pocket. This little act of theft was curiously energizing. His senses felt stripped and raw and he ran back to his car in a whorl of elation. He drove faster than he should, but, instead of going to work, he travelled 320 miles from Yorkshire to Cornwall. By early evening, he found himself sitting on a beach, in the face of a warm sea breeze. Robert was profoundly happy.
"The sun set, it grew dark and chilly, but he stayed there all night, conceding to sleep only as the sun rose in another part of the sky...He returned home late in the day with no explanation except the truth and spent another sleepless night placating his distressed wife.
"Life reverted to routine for a couple of weeks...The next day, out of nowhere, he announces to his wife that their marriage is over and he leaves her, the house, the children, and his new guitar, never to return...Two years later, living alone in a threadbare bed-sit in the suburbs of a northern city, Robert can scarcely recall the Cornish interlude...Robert's fourth seizure happens in the middle of a supermarket and, afterwards, he's taken to hospital. The doctors...investigate with head scans and find a large mass in the orbitofrontal region of the brain. It turns out to be a meningioma. This is a tumour, intrinsically benign, which has invaded the outer coverings of the brain. It has been growing for several years. By distorting the frontal lobes of Robert's brain, it was reshaping the very person he felt himself to be. They operate. Tumour excised, Robert enquires of his nurses most days 'When are my children coming' and 'Can I go home now?' "
This is an extract from Paul Broks's brilliant and haunting book 'Into the Silent Land', and concerns a real case-study. Broks is a neuropsychologist. He points out that "some people grow old and die never knowing that for half their life or more they were harbouring a benign brain tumour. Perhaps they never know who they might have been." This is, indeed, a disturbing thought, although leaving one's job to drive down to the beach might well be considered a sign of mental health, rather than a symptom of brain cancer!
In addition, is there not a sense in which the stream of experiences to which we are subject, determines our personalities? Whilst our personalities can be changed by changing the hard-wiring of the brain, as a tumour such as Robert's does, our personalities can also be changed by our experiences. It is our experiences which partially determine the software of our brains, and the data stored within. Hence, we can ask who we might have been, if only our experiences had been different. Sadly, there is, as yet, no operation to excise tumourous parts of our memories.
Monday, December 11, 2006
Two thoughts occur to me here: Firstly, the irony that these figures were probably compiled by clever graduates, working for the ONS on a salary of circa £20K, who probably think that they are the ones currently doing nothing with their lives. Secondly, if we weaken the criteria here, how many people in society are actually doing nothing with their lives? Here's a litmus test for deciding where you stand: ask yourself, could the role you play in life, either at work or at home, be done equally well by another random individual? If you bring something unique to the party, something that no-one else could supply, then the answer is no, but if the answer is yes, then perhaps you should consider that you're doing nothing with your life. Do not, however, despair! There may be some tax rebates in the pipeline for you.
Sunday, December 10, 2006
Non-Australians can download his weekly BBC Radio 5 question and answer session, hosted on Thursday mornings between 3am and 4am by Rhod Sharp. http://www.bbc.co.uk/fivelive/programmes/upallnight.shtml
"I think it's unlikely that a research programme of that kind can work. Even if you found the right mathematical object, you probably wouldn't even recognise it because you wouldn't know how it corresponds with the world...I would warn against expecting the answer to come from a new mathematical model. It should be the other way round: first find what you think might be the solution to a problem, then express it as a mathematical model, then test it."
Deutsch is spot-on. In very simple terms, a theory of mathematical physics can be broken into (i) an empirical domain, (ii) a collection of mathematical structures, and (iii) a set of correspondence rules which link parts of the mathematical structures with parts of the empirical domain. String theory is an attempt to invent a new fundamental theory by exploring only one of these three dimensions: namely, the collection of mathematical structures. Hence, despite string theory's notorious proposal that space-time has many more dimensions than we can currently detect, string theory itself is rather one-dimensional.
Saturday, December 09, 2006
In the November 30th issue of Nature, new research on the Antikythera mechanism was published by an international team led by Mike Edmunds and Tony Freeth (Cardiff University, Wales). This research employed X-ray tomography to reconstruct in greater detail the structure of the device, and the inscriptions upon it.
This latest research, suggests philosopher Nicholas Rescher today (http://philsci-archive.pitt.edu/archive/00003078/01/Anaxmander_and_Antikythera_Mech.doc), indicates that the Antikythera mechanism was based upon Anaximander's pre-Aristotelian model of the universe. Anaximander postulated that the stars, the moon, and the sun, were circles of fire, hollow like chariot wheels, and visible down tube-like passages which resemble the spokes of a chariot wheel. This contrasts, of course, with the Aristotelian model of the universe as a set of rotating crystal spheres. Rescher points out that "the one major piece of the Antikythera-mechanism that has survived intact is just exactly the chariot wheel that lies at the core of Anaximander’s cosmology." (See the photo at http://skytonight.com/news/4776976.html).
The Antikythera mechanism was capable of predicting the positions of the sun and the moon, and the occurrence of lunar and solar eclipses, with high levels of accuracy. Yet despite the mechanism's success as an analog computer, the assumptions it represented were no closer to the truth than the assumptions represented in an astronomical computing device based upon the Ptolemaic model of the solar system. It is a trite, but salutary reminder to those physicists and scientists who claim that the precision of the predictions made by modern physical theories is evidence of their truth.
There is, however, an alternative type of micro-organism which humanity might decide to mimic: the symbiotic parasite. These are micro-organisms which invade other host cells, but rather than destroying those cells, they evolve with the host DNA to establish a mutual dependency. Perhaps the best-known example of this is the mitochondria in our own eucaryotic cells, which were formerly independent aerobic bacteria, but which have invaded our cells, and, by virtue of performing the function of converting oxygen and sugar molecules into useful work, have become entwined with the structure and function of our own eucaryotic cells. They are parasites, but symbiotic parasites. If humans develop into a state of mutually dependent, sustainable equilibrium, (sustainable at least on the time-scales of stellar evolution), with planet Earth, and begin to play a positive role in the various feedback mechanisms which maintain the atmospheric and biotic equilibrium of the Earth, then humans would function as symbiotic parasites.
The question, then, is this: does humanity want to be a super-virus or a super-symbiotic-parasite?