Tag Archives: software

The instrumental temperature record

The instrumental temperature record shows fluctuations in the temperature of Earth’s climate system. Initially the instrumental temperature record only documented land and sea surface temperature, but in recent decades instruments have also begun recording sub-surface ocean temperature. This data is collected from several thousand meteorological stations, Antarctic research stations, satellite observations of sea-surface temperature, and subsurface ocean sensors. The longest-running temperature record is the Central England temperature data series, which starts in 1659. The longest-running quasi-global record starts in 1850.

The Hadley Centre maintains HadCRUT4, a monthly-mean global surface temperature analysis,[2] and NASA maintains GISTEMP, another monthly-mean global surface temperature analysis, for the period since 1880.[3] The two analyses differ in the details of how they obtain temperature values on a regular grid from the network of irregularly spaced observation sites; thus, their results for global and regional temperature differ slightly. The United States National Oceanic and Atmospheric Administration (NOAA) maintains the Global Historical Climatology Network (GHCN-Monthly) database, which contains historical temperature, precipitation, and pressure data for thousands of land stations worldwide.[4] In addition, NOAA’s National Climatic Data Center (NCDC), which has “the world’s largest active archive”[5] of surface temperature measurements, maintains a global temperature record since 1880.

Deriving a reliable global temperature from the instrument data is not easy because the instruments are not evenly distributed across the planet, the hardware and observing locations have changed over the years, and there has been extensive land use change (such as urbanization) around some of the sites.

The calculation needs to filter out the changes that have occurred over time that are not climate related (e.g. urban heat islands), then interpolate across regions where instrument data has historically been sparse (e.g. in the southern hemisphere and at sea), before an average can be taken.

There are three main datasets showing analyses of global temperatures, all developed since the late 1970s. The HadCRUT analysis is compiled in a collaboration between the University of East Anglia’s Climatic Research Unit and the Hadley Centre for Climate Prediction and Research,[6][7] while independent analyses, largely based on the same raw data, are produced with different levels of interpolation by the Goddard Institute for Space Studies and by the National Climatic Data Center.[7] These datasets are updated on a monthly basis and are generally in close agreement.

In the late 1990s, the Goddard team used the same data to produce a global map of temperature anomalies to illustrate the difference between the current temperature and average temperatures prior to 1950 across every part of the globe.

Records of global average surface temperature are usually presented as anomalies rather than as absolute temperatures. A temperature anomaly is measured against a reference value or long-term average.[9] For example, if the reference value is 15 °C, and the measured temperature is 17 °C, then the temperature anomaly is +2 °C (i.e., 17 °C − 15 °C).
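As a minimal sketch of that arithmetic in code (the reference-period and observed values below are invented for illustration, not taken from any real dataset), an anomaly is simply the observed value minus the mean over the chosen reference period:

```python
# Minimal sketch of computing temperature anomalies against a reference period.
# All numbers below are invented for illustration, not real station data.

reference_period = [14.8, 15.1, 15.0, 15.2, 14.9]    # annual means over a base period (°C)
baseline = sum(reference_period) / len(reference_period)

observations = {2005: 15.9, 2006: 15.7, 2007: 16.1}  # later annual means (°C)
anomalies = {year: round(t - baseline, 2) for year, t in observations.items()}

print(f"baseline = {baseline:.2f} °C")   # 15.00 °C
print(anomalies)                         # {2005: 0.9, 2006: 0.7, 2007: 1.1}
```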

Temperature anomalies are useful for deriving average surface temperatures because they tend to be highly correlated over large distances (of the order of 1000 km).[10] In other words, anomalies are representative of temperatures over large areas and distances. By comparison, absolute temperatures vary markedly over even short distances.
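Because anomalies are coherent over such large areas, global analyses typically average gridded anomalies with weights proportional to the area of each grid cell, which scales with the cosine of latitude. The sketch below illustrates only that weighting step, with invented grid values; real analyses use thousands of cells plus homogenization and interpolation over sparse regions.

```python
import math

# Minimal sketch: latitude-weighted average of gridded temperature anomalies.
# The five grid cells below are invented; real analyses use thousands of cells.
grid = [
    (60.0, 1.2),    # (latitude in degrees, anomaly in °C)
    (30.0, 0.8),
    (0.0, 0.4),
    (-30.0, 0.5),
    (-60.0, 0.3),
]

weights = [math.cos(math.radians(lat)) for lat, _ in grid]
weighted_sum = sum(w * anomaly for w, (_, anomaly) in zip(weights, grid))
global_mean_anomaly = weighted_sum / sum(weights)
print(f"area-weighted global mean anomaly ≈ {global_mean_anomaly:.2f} °C")
```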

Absolute temperatures for the Earth’s average surface temperature have been derived, with a best estimate of roughly 14 °C (57.2 °F).[11] However, the correct temperature could easily be anywhere between 13.3 and 14.4 °C (56 and 58 °F) and uncertainty increases at smaller (non-global) scales.

In September 2007, the GISTEMP software used to process the GISS version of the historical instrument data was made public. The released software was developed over more than 20 years by numerous staff and is mostly written in FORTRAN; large parts of it were developed in the 1980s, before large amounts of computer memory, as well as modern programming languages and techniques, were available.

Two recent open-source projects, each started by individuals, have re-written the processing software in modern, openly available code. One, http://www.opentemp.org/, was created by John van Vliet. More recently, the Clear Climate Code project, begun in April 2008 by staff of Ravenbrook Ltd to port the code to Python, has so far detected two minor bugs in the original software, neither of which significantly changed any results.

The period for which reasonably reliable instrumental records of near-surface temperature exist with quasi-global coverage is generally considered to begin around 1850. Earlier records exist, but with sparser coverage and less standardized instrumentation.

The temperature data for the record come from measurements from land stations and ships. On land, temperature sensors are kept in a Stevenson screen or a maximum minimum temperature system (MMTS). The sea record consists of surface ships taking sea temperature measurements from engine inlets or buckets. The land and marine records can be compared.[13] Land and sea measurement and instrument calibration is the responsibility of national meteorological services. Standardization of methods is organized through the World Meteorological Organization and its predecessor, the International Meteorological Organization.[14]

Most meteorological observations are taken for use in weather forecasts. Centers such as ECMWF show instantaneous maps of their coverage, while the Hadley Centre shows coverage averaged over the year 2000. Coverage earlier in the 20th and 19th centuries would be significantly less. While temperature changes vary both in size and direction from one location to another, the numbers from different locations are combined to produce an estimate of a global average change.

Most of the observed warming occurred during two periods: 1910 to 1945 and 1976 to 2000; the cooling/plateau from 1945 to 1976 has been mostly attributed to sulphate aerosol.[15][16] Some of the temperature variations over this time period may also be due to ocean circulation patterns.[17]

Attribution of the temperature change to natural or anthropogenic (i.e., human-induced) factors is an important question: see global warming and attribution of recent climate change.

Land and sea measurements independently show much the same warming since 1860.[18] The data from these stations show an average surface temperature increase of about 0.74 °C during the last 100 years. The Intergovernmental Panel on Climate Change (IPCC) stated in its Fourth Assessment Report (AR4) that the temperature rise over the 100-year period from 1906–2005 was 0.74 °C [0.56 to 0.92 °C] with a 90% confidence interval.[19]

For the last 50 years, the linear warming trend has been 0.13 °C [0.10 to 0.16 °C] per decade according to AR4.[19]
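A decadal trend like this comes from a linear least-squares fit to the annual anomaly series. The sketch below shows the general approach on a synthetic series (a prescribed 0.013 °C per year trend plus noise), not the actual instrumental record.

```python
import numpy as np

# Minimal sketch: fit a linear trend to annual anomalies and express it per decade.
# The series is synthetic (prescribed trend plus noise), not real observations.
rng = np.random.default_rng(0)
years = np.arange(1956, 2006)                      # a 50-year window
anomalies = 0.013 * (years - years[0]) + rng.normal(0.0, 0.1, size=years.size)

slope_per_year, intercept = np.polyfit(years, anomalies, deg=1)
print(f"linear trend ≈ {slope_per_year * 10:.2f} °C per decade")
```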

The IPCC Fourth Assessment Report found that the instrumental temperature record for the past century included urban heat island effects but that these were primarily local, having a negligible influence on global temperature trends (less than 0.006 °C per decade over land and zero over the oceans).

There is a scientific consensus that climate change is occurring and that greenhouse gases emitted by human activities are the primary driver.[21] The scientific consensus is reflected in, for example, reports by the Intergovernmental Panel on Climate Change (IPCC) and U.S. Global Change Research Program.[21]

Although the IPCC AR4 concluded that “warming of the climate system is unequivocal,” public debate over the evidence for global warming continues.[22] However, it is often confined to a small set of reiterated disputes about Land Surface Air Temperature (LSAT) records, diverting attention from the broader evidence basis.[22]

The methods used to derive the principal estimates of global surface temperature trends — HadCRUT3, NOAA and NASA/GISS — are largely independent.[22] So, the spread of the three estimates indicates the likely degree of uncertainty in the evolution of the global mean surface temperature.[22] Independently derived estimates of tropospheric temperature trends for the whole troposphere channel from satellites differ by an order of magnitude more than do estimated surface temperature trends.[22]

Numerous studies attest to the robustness of the global LSAT records and their non-reliance on individual stations.[22] Evidence from recent re-analyses lends further support.[22]

The IPCC conclusion that “warming of the climate system is unequivocal” does not rest solely upon LSAT records.[22] These constitute only one line of evidence among many, for example: uptake of heat by the oceans, melting of land ice such as glaciers, the associated rise in sea level and increased atmospheric surface humidity (see the figure opposite and effects of global warming).[22] If the land surface records were systematically flawed and the globe had not really warmed, then it would be almost impossible to explain the concurrent changes in this wide range of indicators produced by many independent groups.[22] The observed changes in a broad range of indicators provide a self-consistent story of a warming world.

The U.S. National Academy of Sciences, both in its 2002 report to President George W. Bush, and in later publications, has strongly endorsed evidence of an average global temperature increase in the 20th century.[24]

The preliminary results of an assessment carried out by the Berkeley Earth Surface Temperature group, made public in October 2011, found that over the past 50 years the land surface warmed by 0.911 °C; these results mirror those obtained from earlier studies carried out by NOAA, the Hadley Centre and NASA’s GISS. The study addressed concerns raised by “skeptics”,[25][26] including the urban heat island effect, “poor”[25] station quality, and the “issue of data selection bias”,[25] and found that these effects did not bias the results obtained in those earlier studies.

One of the issues that has been raised in the media is the view that global warming “stopped in 1998”.[30][31] This view ignores the presence of internal climate variability.[31][32] Internal climate variability is a result of complex interactions between components of the climate system, such as the coupling between the atmosphere and ocean.[33] An example of internal climate variability is the El Niño Southern Oscillation (ENSO).[31][32] The El Niño in 1998 was particularly strong, possibly one of the strongest of the 20th century.[31]

Cooling between 2006 and 2008, for instance, has likely been driven by La Niña, the opposite of El Niño conditions.[34] The area of cooler-than-average sea surface temperatures that defines La Niña conditions can push global temperatures downward, if the phenomenon is strong enough.[34] Even accounting for the presence of internal climate variability, recent years rank among the warmest on record.[35] For example, every year of the 2000s was warmer than the 1990s average.

In their study of media coverage of the 2013 Intergovernmental Panel on Climate Change (IPCC) report, Media Matters for America found that nearly half of print media stories mentioned that the warming of global surface temperatures has slowed over the past 15 years. While this factoid is true, the question is: what does it mean?

Many popular climate myths share the trait of vagueness. For example, consider the argument that climate has changed naturally in the past. Well of course it has, but what does that tell us? It’s akin to telling a fire investigator that fires have always happened naturally in the past. That would doubtless earn you a puzzled look from the investigator. Is the implication that because they have occurred naturally in the past, humans can’t cause fires or climate change?

The same problem applies to the ‘pause’ (or ‘hiatus’ or better yet, ‘speed bump‘) assertion. It’s true that the warming of average global surface temperatures has slowed over the past 15 years, but what does that mean? One key piece of information that’s usually omitted when discussing this subject is that the overall warming of the entire climate system has continued rapidly over the past 15 years, even faster than the 15 years before that.

The global temperature changes are not uniform over the globe, nor would they be expected to be, whether the changes were naturally or humanly forced.

Temperature trends from 1901 are positive over most of the world’s surface, except for the Atlantic Ocean south of Greenland, the southeastern United States, and parts of Bolivia. Warming is strongest over the interior land areas of Asia and North America, as well as over south-eastern Brazil and some areas of the South Atlantic and Indian oceans.

Since 1979, the temperature increase has been considerably stronger over land, while cooling has been observed over some oceanic regions in the Pacific Ocean and the Southern Hemisphere; the spatial pattern of the ocean temperature trend in those regions is possibly related to the Pacific Decadal Oscillation and the Southern Annular Mode.[37]

Seasonal temperature trends are positive over most of the globe, but weak cooling is observed over the mid-latitudes of the Southern Ocean and over eastern Canada in spring, due to a strengthening of the North Atlantic Oscillation. Warming is stronger over northern Europe, China and North America in winter; over the interiors of Europe and Asia in spring; over Europe and North Africa in summer; and over northern North America, Greenland and eastern Asia in autumn. Enhanced warming over northern Eurasia is partly linked to the Northern Annular Mode,[38][39] while in the Southern Hemisphere the trend toward stronger westerlies over the Southern Ocean has favoured cooling over much of Antarctica, with the exception of the Antarctic Peninsula, where strong westerlies reduce cold-air outbreaks from the south.[40] The Antarctic Peninsula has warmed by 2.5 °C (4.5 °F) over the past five decades at Bellingshausen Station.

Systematic local biases in surface temperature trends may exist due to changes in station exposure and instrumentation over land, or changes in measurement techniques by ships and buoys in the ocean.[42][43] It is likely that these biases are largely random and therefore cancel out over large regions such as the globe or tropics.[42]

Some have expressed concern that land temperature data might be biased due to urbanization effects (see urban heat island effect for more information).[42] Studies specifically designed to identify systematic problems using a range of approaches have found no detectable urban influence in large-area averages in the data sets that have been adjusted to remove non-climatic influences (i.e., “homogenized“).[42][44]

The uncertainty in annual measurements of the global average temperature (95% range) is estimated to be ≈0.05 °C since 1950 and as much as ≈0.15 °C in the earliest portions of the instrumental record. The error in recent years is dominated by the incomplete coverage of existing temperature records. Early records also have a substantial uncertainty driven by systematic concerns over the accuracy of sea surface temperature measurements.[45][46] A temperature drop of about 0.3 °C in 1945 could be the result of uncorrected instrumental biases in the sea surface temperature record.[43]

Station densities are highest in the northern hemisphere, providing more confidence in climate trends in this region. Station densities are far lower in other regions such as the tropics, northern Asia and the former Soviet Union. This results in less confidence in the robustness of climate trends in these areas. If a region with few stations includes a poor quality station, the impact on global temperature would be greater than in a grid with many weather stations.[47]

As stated, uncertainties in the instrumental record do not undermine the robust finding[48] of an observed long-term increase in global mean temperature, which is supported by a wide range of evidence.

The list of the hottest years on record is dominated by years from this millennium: every complete year of the 21st century is among the 15 warmest on record (14 of the 15). This was also the first time since 1990 that the high-temperature record was broken in the absence of El Niño conditions at any point in the year, as indicated by NOAA’s CPC Oceanic Niño Index. El Niño generally tends to increase global temperatures, yet conditions remained neutral throughout the entire year and the globe reached record warmth regardless. The previous record holder (2010) occurred during an El Niño year. La Niña, on the other hand, usually produces years that are cooler than the short-term average; even so, the most recent La Niña year (2012) was still the 10th warmest since records began. Slightly less recently, 2006 and 2009 are approximately tied for the warmest “La Niña year” between 1971 and 2014.[65]

Although the NCDC temperature record begins in 1880, less certain reconstructions of earlier temperatures suggest these years may be the warmest for several centuries to millennia.

10 warmest years on record (°C anomaly from 1901–2000 mean)
Year Global[66] Land[66] Ocean[66]
2014 0.69 1.00 0.57
2010 0.66 1.06 0.50
2005 0.65 1.05 0.50
1998 0.64 0.94 0.52
2013 0.62 0.99 0.48
2003 0.62 0.88 0.52
2002 0.61 0.93 0.49
2006 0.60 0.90 0.48
2009 0.59 0.85 0.50
2007 0.59 1.09 0.41

Each of the last 38 years has been above the 20th-century average.[65]

The values in the table above are anomalies from the 1901–2000 global mean of 13.9 °C.[67] For instance, the +0.59 °C anomaly in 2007 added to the 1901–2000 mean of 13.9 °C gives a global average temperature of 14.49 °C for 2007.[68]

The coolest year in the record was 1911. The warmest year was 2014.[66]

Captivate VS Camtasia: What’s the difference?

Bill's Blog

Quick FYI: I will post a new blog entry every Thursday going forward.

As an eLearning developer, I’ve come to know there are a number of development tools out there to serve our learning purposes. Like any job, a decision needs to be made as to which tool is best used to accomplish the task at hand. You wouldn’t use a sledgehammer to fix a toaster (unless it was broken beyond repair, then I’d say have at it.) In the case of eLearning, you wouldn’t want to use a tool that can’t accomplish the task.

Two of the most often used and requested eLearning development programs out there today are Adobe Captivate and Techsmith Camtasia. Both programs offer a wide variety of tools that assist in making web-based learning more interactive, intuitive, and overall fun. At the same time, there are some differences to take into consideration when deciding which one to…

Version control

Managed document repositories are important for teamwork, when several members must work simultaneously or in a coordinated way on the same documents, but they are also useful for lone wolves. Version control is the art of managing change. It is a critical tool in software development.

Some version control systems are software configuration management (SCM) systems. These are specifically designed to manage source-code trees and support the application life cycle. Other systems are general-purpose document repositories.

A version control repository keeps a record of the changes made both to the data and to the file structure itself. A client can see not only the latest version of the stored documents but also earlier states of the file system. For example, a client can ask questions such as: what changes were made to a given document during the last week?

The fundamental problem is, on the one hand, how to share information and coordinate concurrent modifications to a group of documents, and, complementarily, how to recover earlier states of the documents when a series of changes turns out to be inappropriate or when variations on a common base are required.

One approach to avoiding conflicts is lock-modify-unlock. This approach does not always guarantee the integrity or coherence of a system when working with multiple documents, and it needlessly serializes work that could otherwise proceed as independent changes. Another approach is copy-modify-merge. The repository can assist in managing the documents and their changes, but a person still has to judge whether a set of changes is valid, and team members must maintain good communication.
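To make the copy-modify-merge idea concrete, below is a deliberately naive sketch of a line-based three-way merge: a change is accepted when only one side touched a line, and a conflict is flagged when both sides changed the same line differently. The function and data are invented for illustration; real tools such as CVS or Subversion use far more elaborate diff and merge machinery.

```python
# Naive three-way merge of equal-length line lists (illustration only).
def merge(base, mine, theirs):
    merged, conflicts = [], []
    for i, (b, m, t) in enumerate(zip(base, mine, theirs)):
        if m == t:            # both sides agree (changed identically or untouched)
            merged.append(m)
        elif m == b:          # only "theirs" changed this line
            merged.append(t)
        elif t == b:          # only "mine" changed this line
            merged.append(m)
        else:                 # both changed the same line differently: conflict
            merged.append(m)
            conflicts.append(i)
    return merged, conflicts

base   = ["title", "intro", "body"]
mine   = ["title", "better intro", "body"]
theirs = ["new title", "intro", "body"]
print(merge(base, mine, theirs))  # (['new title', 'better intro', 'body'], [])
```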

In the particular case of software, some of the areas an SCM system supports are:

• Management of multiple versions, allowing users and developers to report defects and changes relative to historical versions.
• Management of development teams, allowing several programmers to work on the same file and integrating their changes.
• Change audits.

Version control systems work with two basic elements: working areas and repositories. The working area is where changes are made; the repository is where the reference documents are stored, synchronizing everyone’s work and defining the state of the information. The repository keeps metadata that makes it possible to track changes and versions.
The central paradigm of version control is check out/commit. All documents are stored in the repository. The programmer checks out a copy into his or her working area and proceeds to make changes to that copy. When the changes are stable, they are committed to the repository in accordance with the policies for change management and conflict resolution.
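The toy sketch below illustrates that check out/commit cycle; it is not a real version control system, and every name in it is hypothetical. A repository stores numbered snapshots plus commit metadata; a working copy is checked out, modified and committed back, and earlier revisions remain recoverable.

```python
# Toy illustration of the check out / commit paradigm (not a real VCS).
class ToyRepository:
    def __init__(self):
        self.revisions = []   # list of {filename: content} snapshots
        self.log = []         # commit metadata: author and message per revision

    def commit(self, working_copy, author, message):
        """Store a new snapshot of the working copy plus its metadata."""
        self.revisions.append(dict(working_copy))
        self.log.append({"rev": len(self.revisions), "author": author, "msg": message})
        return len(self.revisions)          # new revision number

    def checkout(self, rev=None):
        """Return a copy of the latest (or a given historical) revision."""
        if not self.revisions:
            return {}
        rev = len(self.revisions) if rev is None else rev
        return dict(self.revisions[rev - 1])


repo = ToyRepository()
work = repo.checkout()                      # empty working copy
work["README.txt"] = "first draft"
repo.commit(work, author="ana", message="add README")

work = repo.checkout()                      # check out the latest revision
work["README.txt"] = "second draft"
repo.commit(work, author="ana", message="revise README")

print(repo.checkout(rev=1))                 # the previous state is still recoverable
```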

Two important concepts in change management are branches and tags. Branching the code makes it possible to continue developing the system while releasing versions by platform, feature set and testing stage, or to try out experimental code. Tags are similar to branches, but they are reference points along the same line of development rather than a variant of it.

The granddaddy and reference point of version control systems is CVS, which traces back to scripts written by Dick Grune and published on comp.sources.unix in December 1986.

Version control systems:
CVS
Subversion
Perforce (p4)
BitKeeper
VOODOO Server
ClearCase
RCS (Revision Control System)

Free tools for UML

There are good-quality free tools for UML. Both Netbeans and Eclipse support this functionality across the full development cycle, from code generation to reengineering. That is, of course, if you want to work in Java. For .Net I have not found this degree of functionality in open-source tools. A low-cost option, relative to RUP and the like, is Visual UML. Visual Paradigm has a free limited edition, Smart Development Environment Community Edition for Visual Studio.

UML, ejemplo sencillo sobre Modelado de un Proyecto
Introducción a UML

Zachman and Kipling’s six honest serving-men

I keep six honest serving-men

(They taught me all I knew)

Their names are What and Why and When
And How and Where and Who

One of the sayings of my good friend Ángel concerns the knack of the gringo, that mythical gringo with comic-book powers, for taking some common-sense concept and turning it into a marketable product. An interesting example of this is the Zachman Framework for enterprise architecture, an icon in the data-architecture community. It is based on the pattern of analyzing problems with a matrix of points to review: in the Zachman Framework, the columns correspond to the six interrogatives in English and the rows to different roles in the development of an enterprise application. From this simple concept, Zachman develops an entire detailed theory of how to document and manage the development of an enterprise system based on an entity-relationship model.

WHAT’S WRONG WITH THE ZACHMAN FRAMEWORK?
Extending the RUP with the Zachman Framework