COP15 Copenhagen climate summit: Day 2
1930 CET: So, if you've read my news story on the "Danish text" issue, you'll have seen that not everyone is using the "something rotten in the state of Denmark" line that The Guardian followed.
Some agencies such as CAFOD (quoted in the news story) were very damning, but other observers saw less to disagree about.
One long-time observer of this process - from another charity working on poverty and development issues - suggested positions were close enough for a deal to be likely.
Both the Danish text and that from the BASIC bloc (again, see the news story for details) leave brackets where numbers should be for pledges on cutting emissions.
The big issue is the legal form of a new agreement - whether to extend the Kyoto Protocol to cover further pledges from developed nations, or whether to stick everything under a new agreement - and that's been going on since discussions started shortly after the 2007 climate summit in Bali.
So what can we learn from the issue that might help make sense of the rest of the conference?
Firstly, bits of text will abound - some will turn out to be important, others will fade into obscurity. Some contain the seeds of a compromise, others are intended to mark a position with steel.
Secondly, some commentators will bend whatever texts emerge (or are selectively leaked) to the service of long-held positions. Nothing wrong with that - one expects bodies engaged in this process to spin fast and furiously - but the spin has to be taken off before we can see the form underneath.
Thirdly, governments may not always speak with one voice. One rumour doing the rounds - from a credible source, to me - is that the Danish text is very much the agent of the Prime Minister's office and regarded with distaste in the environment ministry.
True? I have no idea. It sounds possible, but it might be wrong - or perhaps the idea has been dropped into the mix to make someone look good. Who knows?
At some point over the next couple of days, I'll blog further about how the summit works, and how people in my position attempt to make sense of it for you - and why we sometimes fail.
1619 CET: Something of a buzz going around the centre now, as the Guardian's estimable environment editor John Vidal reveals the so-called "Danish text".
It's a document drawn up by the Danish hosts and distributed around a select group of 40 countries last week, apparently as their preferred basis for a political outcome here.
The whole exercise has the potential to disrupt things hugely as - according to John Vidal's analysis - it removes lots of power from the UN climate convention and legitimises a long-term inequality between developed and developing countries.
Developing countries have already found plenty in it to hate.
I'll file a news story at some point once various radio commitments are out of the way.
In the meantime, the Guardian has posted the text here.
1516 CET: As the seriously obscure parts of the UN climate conference got down to business this morning, attention turned to some of the science being presented - notably, the analyses from the World Meteorological Organization and the UK Met Office that pegged the "Noughties" - the decade since 2000 - as the warmest since instrumental observations began.
The essential conclusions are in our news story; but obviously there's some interest these days in how the records are put together.
Some news reports on the "ClimateGate" affair might have given the impression that the UK's Climatic Research Unit (CRU) - and perhaps the Met Office team that works closely with it - has its own network of weather stations across the world, and that data from that network is somehow collected, stored and processed separately from every other source of temperature data.
That impression is wrong, and for pretty obvious reasons. If you have three or four major centres in the world doing climate analysis, why would they each set up and safeguard data from their own stations?
Would a UK or US centre, for example, be allowed to set up its own weather station in Tibet? Would three agencies go through the financial and logistical pain of setting up their own instruments in Antarctica, say, when one could do it and share the data?
The other factor is that temperature measurements aren't only used for climate research. They're also vital for weather monitoring and forecasting - indeed, most were set up for that purpose.
What does exist is a network of weather stations around the world - thousands of them - mostly maintained by national weather services. These services now generally share information with each other as a matter of policy, in theory in real time or as close to it as possible:
"As a fundamental principle of the World Meteorological Organization (WMO), and in consonance with the expanding requirements for its scientific and technical expertise, WMO commits itself to broadening and enhancing the free and unrestricted international exchange of meteorological and related data and products."
When there isn't a national weather service, things become a little messy. Some stations are not set up for immediate data transfer; some are privately maintained and are under no obligation to provide data free of constraint.
Added to this, as the Met Office explains, is a fleet of floating buoys now numbering well over a thousand, plus a number of ships that take the sea temperature as they move.
Much of this data has to be scrutinised and processed. Sometimes instruments take a knock and produce bizarre readings, which may be taken out of the record until the problem can be investigated.
More importantly, the spread of instruments across the world is very uneven; lots in North America and western Europe, very few in parts of Africa.
The US GISTEMP dataset gets round this, for example, by "interpolating", or filling in, the important but data-scarce Arctic region, whereas the Met Office record does not - which, in the words of Met Office head of climate research Vicky Pope, means that the Met Office record "tends to underestimate the warming".
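The effect of filling in a data-sparse region versus leaving it out can be seen in a toy average. This is a hypothetical sketch with invented numbers - not either dataset's actual method, which works on thousands of grid cells with distance-weighted interpolation:

```python
# Toy illustration (invented figures) of why coverage choices matter.
# Four equal-area grid cells of temperature anomalies in degrees C;
# the last cell stands for a data-sparse but fast-warming Arctic.
anomalies = [0.2, 0.3, 0.8, None]   # None = no station data in that cell

# Option 1 - leave the empty cell out of the average (broadly what the
# post says the Met Office record does for the Arctic):
observed = [a for a in anomalies if a is not None]
avg_drop = sum(observed) / len(observed)

# Option 2 - interpolate: fill the empty cell from its nearest observed
# neighbour (GISTEMP uses a far more sophisticated scheme; this is the idea):
filled = anomalies[:-1] + [anomalies[-2]]   # borrow the 0.8 next door
avg_fill = sum(filled) / len(filled)

# If the missing region really is warming faster than average, dropping it
# biases the global figure low - the "underestimate" Vicky Pope describes.
print(avg_drop, avg_fill)
```

With these made-up numbers the drop-it average comes out lower than the interpolated one, which is the whole point of the comparison.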
Then there are satellites. And these too are showing increasing temperatures, as demonstrated in the latest bulletin from the University of Alabama at Huntsville, where John Christy, Roy Spencer and their team collect and collate satellite readings.
Their value for the global trend since 1978 is 0.13C per decade, with 2009 featuring the warmest November in the 31-year record.

That's slightly less than the Met Office figure of about 0.18C per decade, but a clear trend nevertheless.
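For the curious, a figure like "0.13C per decade" is essentially the slope of a straight-line fit to the anomaly series, scaled to ten years. Here is a minimal sketch with invented data - the function and numbers are mine, not UAH's or the Met Office's:

```python
# Ordinary least-squares slope of annual anomalies, scaled to per-decade.
def decadal_trend(years, anomalies):
    n = len(years)
    mx = sum(years) / n
    my = sum(anomalies) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, anomalies))
             / sum((x - mx) ** 2 for x in years))
    return slope * 10   # per-year slope -> degrees C per decade

# Invented series: anomalies rising 0.01C a year through the Noughties.
years = list(range(2000, 2010))
anoms = [0.30 + 0.01 * i for i in range(10)]
print(decadal_trend(years, anoms))   # 0.10C per decade for this toy series
```

Real series are much noisier than this toy one, which is why the quoted trends come with a record length attached - 31 years, in the UAH case.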
The data should come as no surprise to anyone who's been following the issue for a long time - and that includes delegates to the UN climate talks here, where the UN system means IPCC and WMO advice carries official weight.