The World3 model is a system dynamics model for computer simulation of interactions between population, industrial growth, food production and limits in the ecosystems of the Earth. It was originally produced and used by a Club of Rome study, which also produced the book The Limits to Growth. The principal creators of the model were Donella Meadows, Dennis Meadows, and Jørgen Randers.
The model was documented in the book Dynamics of Growth in a Finite World. It added new features to Jay W. Forrester’s World2 model. Since World3 was originally created it has had minor tweaks: first to the World3/91 model used in the book Beyond the Limits, later improved into the World3/2000 model distributed by the Institute for Policy and Social Science Research, and finally the World3/2004 model used in the book Limits to Growth: The 30-Year Update.
World3 is one of several global models that have been generated throughout the world (the Mesarovic/Pestel Model, Bariloche Model, MOIRA Model, SARU Model, FUGI Model) and is probably the model that sparked all the later ones.
System dynamics is a computer-aided approach to policy analysis and design. It applies to dynamic problems arising in complex social, managerial, economic, or ecological systems — literally any dynamic systems characterized by interdependence, mutual interaction, information feedback, and circular causality.
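To make the feedback idea concrete, here is a deliberately tiny sketch of a system dynamics simulation: one stock (a population) driven by a reinforcing loop (births) and a balancing loop (deaths). The model structure and every parameter value are invented for illustration; they are not taken from World3 or any published model.

```python
# Minimal system dynamics sketch: one stock with two feedback loops.
# All names and parameter values are illustrative, not from World3.

def simulate(years=100, dt=0.25, pop=1.0, birth_rate=0.03, death_rate=0.01):
    """Euler-integrate dPop/dt = births - deaths over the given horizon."""
    trajectory = []
    for _ in range(int(years / dt)):
        births = birth_rate * pop   # reinforcing (positive) feedback loop
        deaths = death_rate * pop   # balancing (negative) feedback loop
        pop += (births - deaths) * dt
        trajectory.append(pop)
    return trajectory

traj = simulate()
print(f"population after 100 simulated years: {traj[-1]:.2f}")
```

With births outweighing deaths the stock grows exponentially; real system dynamics tools add limits, delays and nonlinear table functions on top of exactly this kind of numerical integration.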
iThink® and STELLA® are two names for one model development platform published by isee systems. The software is available in different configurations under a commercial license for Windows and Macintosh computers. Educational licenses and a free runtime version of the software are available.
Powersim Studio is available in a number of different configurations from Powersim Software. The software is available under commercial license and runs under Windows. Educational licenses and options for publishing standalone model packages are available. A new free version, Studio Express is now available.
Vensim® is available in a number of different configurations from Ventana Systems, Inc. The software is available under a commercial license and runs on Windows and the Macintosh. Educational licenses, including a configuration of the software that is free for educational use, and a free runtime version of the software are available.
See Also: There are a number of other products that can be used to construct models. These include: Anylogic, Goldsim, Berkeley Madonna, Sysdea and SimGua under related methodologies and MyStrategy under pedagogical tools.
OpenModelica is an open-source Modelica-based modeling and simulation environment intended for industrial and academic use. Its long-term development is supported by a non-profit organization, the Open Source Modelica Consortium (OSMC).
The goal of the OpenModelica effort is to create a comprehensive open-source Modelica modeling, compilation and simulation environment, based on free software distributed in binary and source-code form, for research, teaching, and industrial use. Researchers, students, and any interested developers are invited to participate in the project and cooperate around OpenModelica, its tools, and applications.
Simantics System Dynamics is a ready-to-use system dynamics modelling and simulation software application for understanding different organizations, markets and other complex systems and their dynamic behavior.
ASCEND is a free open-source software program for solving small to very large mathematical models. ASCEND can solve systems of non-linear equations, linear and nonlinear optimisation problems, and dynamic systems expressed in the form of differential/algebraic equations.
Insight Maker supports System Dynamics modeling: a powerful method for exploring systems on an aggregate level. By “aggregate” we mean that System Dynamics models look at collections of objects, not the objects themselves. For instance, if you created a model of water leaking from a bucket, a System Dynamics model would concern itself with the quantity of water as a whole, not with individual droplets or even molecules. Similarly, if you were modeling a population of rabbits, the System Dynamics model would look at the population as a whole, not at the individual rabbits.
Sysdea modeling is based upon Stocks (something that accumulates, such as money in a bank account, trees in a forest) and Flows (the forces that cause such Stocks to accumulate and deplete). With just these two concepts and supportive Variables to allow intermediate calculations, you get great expressive power.
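The stock-and-flow idea above can be sketched in a few lines of code, using the leaking-bucket example: the stock is the water level, and the flow is a leak proportional to that level. The function name and all numbers below are illustrative, not taken from any of the tools described here.

```python
# Stock-and-flow sketch: water draining from a bucket.
# Stock: water (litres). Flow: a leak proportional to the current stock.
# Names and parameter values are illustrative.

def drain_bucket(water=10.0, leak_fraction=0.1, dt=1.0, steps=50):
    """Euler-integrate dWater/dt = -leak_fraction * water."""
    history = [water]
    for _ in range(steps):
        leak = leak_fraction * water   # flow depleting the stock
        water += -leak * dt            # update the stock
        history.append(water)
    return history

levels = drain_bucket()
print(f"water left after 50 steps: {levels[-1]:.3f} litres")
```

Because the outflow depends on the stock itself, the level decays exponentially toward empty, which is the characteristic behavior of a first-order balancing loop.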
NetLogo is a multi-agent programmable modeling environment. It is used by tens of thousands of students, teachers and researchers worldwide. It also powers HubNet participatory simulations. It is authored by Uri Wilensky and developed at the CCL. You can download it free of charge.
We’ve all heard about the “Limits to Growth”. Well, the results of the computer program that started it all were published in “World Dynamics” by Jay W. Forrester (The MIT Press, Cambridge, MA, 1971; second edition, 1973). Back then you had to be at an institution to run the program that simulated the world dynamics so modeled. Nowadays you can run it on your own personal computer and play around with the model all you want (after spending a day or two going through the tutorials). All you need to do is download Vensim PLE (take care to download all files first to a known location like your “Desktop” so you can direct the installer program to them when it asks for their location), and then open the WORLD.MDL file, most likely located at: C:\Program Files\Vensim\models\sample\EXTRA\WORLD.MDL
Read on for some screen shots of the output.
PS: Meadows et al’s 2003 update to the original model is in the file WRLD3-03.VMF, most likely located at C:\Program Files\Vensim\models\sample\WRLD3-03\WRLD3-03.VMF
The instrumental temperature record shows fluctuations of the temperature of earth’s climate system. Initially the instrumental temperature record only documented land and sea surface temperature, but in recent decades instruments have also begun recording sub-surface ocean temperature. This data is collected from several thousand meteorological stations, Antarctic research stations, satellite observations of sea-surface temperature, and subsurface ocean sensors. The longest-running temperature record is the Central England temperature data series, which starts in 1659. The longest-running quasi-global record starts in 1850.
The Hadley Centre maintains HadCRUT4, a monthly-mean global surface temperature analysis, and NASA maintains GISTEMP, another monthly-mean global surface temperature analysis, for the period since 1880. The two analyses differ in the details of how they obtain temperature values on a regular grid from the network of irregularly spaced observation sites; thus, their results for global and regional temperature differ slightly. The United States National Oceanic and Atmospheric Administration (NOAA) maintains the Global Historical Climatology Network (GHCN-Monthly) database, which contains historical temperature, precipitation, and pressure data for thousands of land stations worldwide. Also, NOAA’s National Climatic Data Center (NCDC), which has “the world’s largest active archive” of surface temperature measurements, maintains a global temperature record from 1880 onward.
Deriving a reliable global temperature from the instrument data is not easy because the instruments are not evenly distributed across the planet, the hardware and observing locations have changed over the years, and there has been extensive land use change (such as urbanization) around some of the sites.
The calculation needs to filter out the changes that have occurred over time that are not climate related (e.g. urban heat islands), then interpolate across regions where instrument data has historically been sparse (e.g. in the southern hemisphere and at sea), before an average can be taken.
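A toy version of the final averaging step can be written once anomalies have been gridded: grid cells on a latitude/longitude grid cover less area near the poles, so each cell is weighted by the cosine of its latitude. The grid cells and anomaly values below are invented for illustration; real analyses such as HadCRUT use far more elaborate filtering and interpolation before this step.

```python
import math

# Toy area-weighted global mean of gridded temperature anomalies.
# Cells near the poles cover less of the sphere, so each cell is
# weighted by cos(latitude). All values are hypothetical.

def global_mean(anomalies_by_lat):
    """anomalies_by_lat: list of (latitude_deg, anomaly_C) pairs, one per cell."""
    num = sum(math.cos(math.radians(lat)) * a for lat, a in anomalies_by_lat)
    den = sum(math.cos(math.radians(lat)) for lat, _ in anomalies_by_lat)
    return num / den

cells = [(0.0, 0.4), (30.0, 0.5), (60.0, 0.9), (85.0, 1.5)]  # hypothetical
print(f"area-weighted mean anomaly: {global_mean(cells):.3f} C")
```

Note that the weighted mean is pulled toward the tropical cells: the large high-latitude anomaly contributes little because the cell it sits in covers little area.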
There are three main datasets showing analyses of global temperatures, all developed since the late 1970s: the HadCRUT analysis, compiled in a collaboration between the University of East Anglia‘s Climatic Research Unit and the Hadley Centre for Climate Prediction and Research, and independent analyses, largely based on the same raw data but using different levels of interpolation, produced by the Goddard Institute for Space Studies and by the National Climatic Data Center. These datasets are updated on a monthly basis and are generally in close agreement.
In the late 1990s, the Goddard team used the same data to produce a global map of temperature anomalies to illustrate the difference between the current temperature and average temperatures prior to 1950 across every part of the globe.
Records of global average surface temperature are usually presented as anomalies rather than as absolute temperatures. A temperature anomaly is measured against a reference value or long-term average. For example, if the reference value is 15 °C and the measured temperature is 17 °C, then the temperature anomaly is +2 °C (i.e., 17 °C − 15 °C).
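The same calculation, extended to derive the baseline from a reference period, can be sketched as follows. The station readings and years below are invented for illustration; the 17 °C / 15 °C pair follows the example in the text.

```python
# Temperature anomalies: each year's value minus the mean over a
# reference (baseline) period. The readings below are hypothetical.

def anomalies(readings, baseline_years):
    """readings: dict year -> mean temperature (C).
    Returns dict year -> anomaly relative to the mean over baseline_years."""
    baseline = sum(readings[y] for y in baseline_years) / len(baseline_years)
    return {y: round(t - baseline, 2) for y, t in readings.items()}

readings = {1995: 14.8, 1996: 15.0, 1997: 15.2, 2007: 17.0}  # hypothetical
print(anomalies(readings, baseline_years=[1995, 1996, 1997]))
```

Here the baseline works out to 15 °C, so the hypothetical 2007 reading of 17 °C yields the +2 °C anomaly of the worked example.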
Temperature anomalies are useful for deriving average surface temperatures because they tend to be highly correlated over large distances (of the order of 1000 km). In other words, anomalies are representative of temperatures over large areas and distances. By comparison, absolute temperatures vary markedly over even short distances.
Absolute temperatures for the Earth’s average surface temperature have been derived, with a best estimate of roughly 14 °C (57.2 °F). However, the correct temperature could easily be anywhere between 13.3 and 14.4 °C (56 and 58 °F) and uncertainty increases at smaller (non-global) scales.
In September 2007, the GISTEMP software used to process the GISS version of the historical instrument data was made public. The released software had been developed over more than 20 years by numerous staff and is mostly in FORTRAN; large parts of it were developed in the 1980s, before large amounts of computer memory and modern programming languages and techniques were available.
Two recent open-source projects by individuals have set out to rewrite the processing software in modern open code. One, http://www.opentemp.org/, was by John van Vliet. More recently, a project begun in April 2008 (Clear Climate Code) by staff of Ravenbrook Ltd to port the code to Python has so far detected two minor bugs in the original software, neither of which significantly changed any results.
The period for which reasonably reliable instrumental records of near-surface temperature exist with quasi-global coverage is generally considered to begin around 1850. Earlier records exist, but with sparser coverage and less standardized instrumentation.
The temperature data for the record come from measurements from land stations and ships. On land, temperature sensors are kept in a Stevenson screen or a maximum minimum temperature system (MMTS). The sea record consists of surface ships taking sea temperature measurements from engine inlets or buckets. The land and marine records can be compared. Land and sea measurement and instrument calibration is the responsibility of national meteorological services. Standardization of methods is organized through the World Meteorological Organization and its predecessor, the International Meteorological Organization.
Most meteorological observations are taken for use in weather forecasts. Centers such as ECMWF show instantaneous maps of their coverage; the Hadley Centre shows the coverage for the average of the year 2000. Coverage earlier in the 20th and 19th centuries would have been significantly less. While temperature changes vary in both size and direction from one location to another, the numbers from different locations are combined to produce an estimate of a global average change.
Most of the observed warming occurred during two periods: 1910 to 1945 and 1976 to 2000; the cooling/plateau from 1945 to 1976 has been mostly attributed to sulphate aerosol. Some of the temperature variations over this time period may also be due to ocean circulation patterns.
Land and sea measurements independently show much the same warming since 1860. The data from these stations show an average surface temperature increase of about 0.74 °C during the last 100 years. The Intergovernmental Panel on Climate Change (IPCC) stated in its Fourth Assessment Report (AR4) that the temperature rise over the 100-year period from 1906–2005 was 0.74 °C [0.56 to 0.92 °C] with a 90% confidence interval.
For the last 50 years, the linear warming trend has been 0.13 °C [0.10 to 0.16 °C] per decade according to AR4.
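Decadal trend figures like this come from a linear least-squares fit to the annual anomaly series. A minimal sketch of that fit follows; the anomaly series below is synthetic (it warms by exactly 0.013 °C per year), chosen only so the fitted trend lands on the 0.13 °C-per-decade figure quoted above. It is not the actual observational record.

```python
# Ordinary least-squares slope of anomaly vs. year, reported per decade.
# The input series here is synthetic, not real temperature data.

def trend_per_decade(years, anomalies):
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    var = sum((x - mean_x) ** 2 for x in years)
    return (cov / var) * 10.0   # slope in C/year -> C/decade

years = list(range(1956, 2006))
anoms = [0.013 * (y - 1956) for y in years]   # synthetic linear warming
print(f"trend: {trend_per_decade(years, anoms):.2f} C/decade")
```

On real data the residual scatter around the fitted line is what produces the quoted uncertainty range (0.10 to 0.16 °C per decade).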
The IPCC Fourth Assessment Report found that the instrumental temperature record for the past century included urban heat island effects but that these were primarily local, having a negligible influence on global temperature trends (less than 0.006 °C per decade over land and zero over the oceans).
There is a scientific consensus that climate change is occurring and that greenhouse gases emitted by human activities are the primary driver. The scientific consensus is reflected in, for example, reports by the Intergovernmental Panel on Climate Change (IPCC) and U.S. Global Change Research Program.
Although the IPCC AR4 concluded that “warming of the climate system is unequivocal,” public debate over the evidence for global warming continues. However, it is often confined to a small set of reiterated disputes about Land Surface Air Temperature (LSAT) records, diverting attention from the broader evidence base.
The methods used to derive the principal estimates of global surface temperature trends — HadCRUT3, NOAA and NASA/GISS — are largely independent. So, the spread of the three estimates indicates the likely degree of uncertainty in the evolution of the global mean surface temperature. Independently derived estimates of tropospheric temperature trends for the whole troposphere channel from satellites differ by an order of magnitude more than do estimated surface temperature trends.
The IPCC conclusion that “warming of the climate system is unequivocal” does not rest solely upon LSAT records. These constitute only one line of evidence among many, for example: uptake of heat by the oceans, melting of land ice such as glaciers, the associated rise in sea level and increased atmospheric surface humidity (see the figure opposite and effects of global warming). If the land surface records were systematically flawed and the globe had not really warmed, then it would be almost impossible to explain the concurrent changes in this wide range of indicators produced by many independent groups. The observed changes in a broad range of indicators provide a self-consistent story of a warming world.
The U.S. National Academy of Sciences, both in its 2002 report to President George W. Bush, and in later publications, has strongly endorsed evidence of an average global temperature increase in the 20th century.
The preliminary results of an assessment carried out by the Berkeley Earth Surface Temperature group, made public in October 2011, found that over the past 50 years the land surface warmed by 0.911 °C; their results mirror those obtained from earlier studies carried out by NOAA, the Hadley Centre and NASA‘s GISS. The study addressed concerns raised by “skeptics”, including the urban heat island effect, “poor” station quality, and the “issue of data selection bias”, and found that these effects did not bias the results obtained in those earlier studies.
One of the issues that has been raised in the media is the view that global warming “stopped in 1998”. This view ignores the presence of internal climate variability. Internal climate variability is a result of complex interactions between components of the climate system, such as the coupling between the atmosphere and ocean. An example of internal climate variability is the El Niño Southern Oscillation (ENSO). The El Niño in 1998 was particularly strong, possibly one of the strongest of the 20th century.
Cooling between 2006 and 2008, for instance, was likely driven by La Niña, the opposite of El Niño conditions. The area of cooler-than-average sea surface temperatures that defines La Niña conditions can push global temperatures downward, if the phenomenon is strong enough. Even accounting for the presence of internal climate variability, recent years rank among the warmest on record. For example, every year of the 2000s was warmer than the 1990s average.
In their study of media coverage of the 2013 Intergovernmental Panel on Climate Change (IPCC) report, Media Matters for America found that nearly half of print media stories discussed that the warming of global surface temperatures has slowed over the past 15 years. While this factoid is true, the question is, what does it mean?
Many popular climate myths share the trait of vagueness. For example, consider the argument that climate has changed naturally in the past. Well of course it has, but what does that tell us? It’s akin to telling a fire investigator that fires have always happened naturally in the past. That would doubtless earn you a puzzled look from the investigator. Is the implication that because they have occurred naturally in the past, humans can’t cause fires or climate change?
The same problem applies to the ‘pause’ (or ‘hiatus’ or better yet, ‘speed bump‘) assertion. It’s true that the warming of average global surface temperatures has slowed over the past 15 years, but what does that mean? One key piece of information that’s usually omitted when discussing this subject is that the overall warming of the entire climate system has continued rapidly over the past 15 years, even faster than the 15 years before that.
The global temperature changes are not uniform over the globe, nor would they be expected to be, whether the changes were naturally or humanly forced.
Temperature trends from 1901 are positive over most of the world’s surface, the exceptions being the Atlantic Ocean south of Greenland, the southeastern United States, and parts of Bolivia. Warming is strongest over interior land areas in Asia and North America, as well as south-eastern Brazil and some areas in the South Atlantic and Indian oceans.
Since 1979 the temperature increase has been considerably stronger over land, while cooling has been observed over some oceanic regions in the Pacific Ocean and the Southern Hemisphere; the spatial pattern of the ocean temperature trend in those regions is possibly related to the Pacific Decadal Oscillation and the Southern Annular Mode.
Seasonal temperature trends are positive over most of the globe, but weak cooling is observed over the mid-latitudes of the Southern Ocean, and also over eastern Canada in spring due to a strengthening of the North Atlantic Oscillation. Warming is stronger over northern Europe, China and North America in winter; over the interiors of Europe and Asia in spring; over Europe and north Africa in summer; and over northern North America, Greenland and eastern Asia in autumn. Enhanced warming over northern Eurasia is partly linked to the Northern Annular Mode, while in the Southern Hemisphere the trend toward stronger westerlies over the Southern Ocean has favoured cooling over much of Antarctica, with the exception of the Antarctic Peninsula, where strong westerlies reduce cold-air outbreaks from the south. The Antarctic Peninsula has warmed by 2.5 °C (4.5 °F) in the past five decades at Bellingshausen Station.
Systematic local biases in surface temperature trends may exist due to changes in station exposure and instrumentation over land, or changes in measurement techniques by ships and buoys in the ocean. It is likely that these biases are largely random and therefore cancel out over large regions such as the globe or tropics.
Some have expressed concern that land temperature data might be biased due to urbanization effects (see urban heat island effect for more information). Studies specifically designed to identify systematic problems using a range of approaches have found no detectable urban influence in large-area averages in the data sets that have been adjusted to remove non-climatic influences (i.e., “homogenized”).
The uncertainty in annual measurements of the global average temperature (95% range) is estimated to be ≈0.05 °C since 1950 and as much as ≈0.15 °C in the earliest portions of the instrumental record. The error in recent years is dominated by the incomplete coverage of existing temperature records. Early records also have a substantial uncertainty driven by systematic concerns over the accuracy of sea surface temperature measurements. A temperature drop of about 0.3 °C in 1945 could be the result of uncorrected instrumental biases in the sea surface temperature record.
Station densities are highest in the northern hemisphere, providing more confidence in climate trends in this region. Station densities are far lower in other regions such as the tropics, northern Asia and the former Soviet Union. This results in less confidence in the robustness of climate trends in these areas. If a region with few stations includes a poor quality station, the impact on global temperature would be greater than in a grid with many weather stations.
As stated, uncertainties in the instrumental record do not undermine the robust finding of an observed long-term increase in global mean temperature, which is supported by a wide range of evidence.
The list of hottest years on record is dominated by years from this millennium; 14 of the 15 warmest years on record fall in the 21st century. This is the first time since 1990 that the high-temperature record was broken in the absence of El Niño conditions at any point in the year, as indicated by NOAA’s CPC Oceanic Niño Index. El Niño generally tends to increase global temperatures, yet conditions remained neutral during the entire year and the globe reached record warmth despite this. The previous record holder (2010) occurred during an El Niño year. La Niña, on the other hand, usually produces years that are cooler than the short-term average; while the last La Niña year (2012) was relatively cool by recent standards, it was still the 10th-warmest year since records began. Slightly less recently, 2006 and 2009 are approximately tied for the warmest “La Niña year” between 1971 and 2014.
Each of the last 38 years was above the 20th century average.
The values in the table above are anomalies from the 1901–2000 global mean of 13.9 °C. For instance, the +0.59 °C anomaly in 2007 added to the 1901–2000 mean of 13.9 °C gives a global average temperature of 14.49 °C for 2007.
The coolest year in the record was 1911. The warmest year was 2014.
Quick FYI: I will post a new blog every Thursday going forward.
As an eLearning developer, I’ve come to know there are a number of development tools out there to serve our learning purposes. Like any job, a decision needs to be made as to which tool is best used to accomplish the task at hand. You wouldn’t use a sledgehammer to fix a toaster (unless it was broken beyond repair, then I’d say have at it.) In the case of eLearning, you wouldn’t want to use a tool that can’t accomplish the task.
Two of the most often used and requested eLearning development programs out there today are Adobe Captivate and Techsmith Camtasia. Both programs offer a wide variety of tools that assist in making web-based learning more interactive, intuitive, and overall fun. At the same time, there are some differences to take into consideration when deciding which one to…
Managed document repositories are important for teamwork, when several members must work simultaneously or in coordination on the same documents, but they are also useful for lone wolves. Version control is the art of managing change. It is a critical tool in software development.
Some version control systems are software configuration management (SCM) tools. These systems are specifically designed to manage source code trees and support the application life cycle. Other systems are general-purpose document repositories.
A version control repository keeps a record of the changes made both to the data and to the file structure itself. A client can view not only the latest version of the stored documents but also previous states of the file system. For example, a client can ask questions such as: what changes were made to a document in the last week?
The fundamental problem is, on the one hand, how to share information and coordinate concurrent modifications to a group of documents, and, complementarily, how to recover previous states of the documents when a series of changes turns out to be inappropriate or variations on a common base are required.
One approach to avoiding conflicts is lock-modify-unlock. This approach does not always guarantee the integrity or coherence of a system when working with multiple documents, and it needlessly serializes work that could proceed as independent changes. Another approach is copy-modify-merge. The repository can assist in managing documents and their changes, but a person still needs to judge whether a set of changes is valid, and the members of a team must maintain good communication.
In the particular case of software, some of the areas an SCM system supports are:
- Managing multiple versions, allowing users and developers to report defects and changes against historical releases.
- Managing development teams, allowing several programmers to work on the same file and integrating their changes.
- Auditing changes.
Version control systems work with two basic elements: working areas and repositories. The working area is where changes are made; the repository is where the reference documents are stored, synchronizing everyone’s work and defining the state of the information. The repository stores metadata that makes it possible to trace changes and versions.
The central paradigm of version control is check out/commit. All the documents are stored in the repository. The programmer checks out a copy into a working area and proceeds to make changes to that copy. When the changes are stable, they are committed to the repository according to the policies for change management and conflict resolution.
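To make the check out/commit paradigm concrete, here is a toy in-memory "repository" that stores numbered snapshots of a document tree. This is a deliberately simplified sketch of the idea only; real systems such as CVS or Subversion also handle deltas, locking, merging and much more, and the class and method names below are ours.

```python
import copy

# Toy check-out/commit: the repository stores numbered snapshots of a
# set of documents; a working copy is edited freely and committed back.
# A deliberately simplified illustration of the paradigm.

class Repository:
    def __init__(self):
        self.revisions = [{}]                # revision 0: empty tree

    def checkout(self, revision=-1):
        """Return an independent working copy of the given revision."""
        return copy.deepcopy(self.revisions[revision])

    def commit(self, working_copy):
        """Store the working copy as a new revision; return its number."""
        self.revisions.append(copy.deepcopy(working_copy))
        return len(self.revisions) - 1

repo = Repository()
wc = repo.checkout()                         # check out the latest revision
wc["notes.txt"] = "first draft"              # modify the working copy
r1 = repo.commit(wc)                         # commit -> revision 1
wc["notes.txt"] = "second draft"
r2 = repo.commit(wc)
print(repo.checkout(r1)["notes.txt"])        # earlier states stay recoverable
```

Because every commit appends a snapshot rather than overwriting, checking out an older revision number recovers a previous state, which is exactly the recovery capability described above.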
Two important concepts in change management are branches and tags. Branching the code makes it possible to keep developing the system while releasing versions for particular platforms, feature sets or testing, or to try out experimental code. Tags are similar to branches, but they mark reference points on the same line of development, not a variant of it.
The grandfather and reference point of version control systems is CVS, which traces back to scripts written by Dick Grune and published on comp.sources.unix in December 1986.
There are good-quality free tools for UML. Both NetBeans and Eclipse support this functionality across the full development cycle, from code generation to reverse engineering. That is, of course, if you want to work in Java. For .NET I have not found this degree of functionality in open source tools. A low-cost option, relative to RUP and the like, is Visual UML. Visual Paradigm has a free limited edition, Smart Development Environment Community Edition for Visual Studio.
I keep six honest serving-men
(They taught me all I knew)
Their names are What and Why and When
And How and Where and Who
One of my good friend Ángel’s sayings is about the knack of the gringo, that mythical gringo with comic-book powers, for taking a common-sense concept and turning it into a marketable product. An interesting example of this is the Zachman Framework for enterprise architectures, an icon in the data architecture community. It is based on the pattern of analyzing problems with a matrix of points to review. In the Zachman Framework, the columns correspond to the six questions in English and the rows to different roles in the development of an enterprise application. From this simple concept Zachman develops an entire detailed theory of how to document and manage an enterprise system development project based on an entity-relationship model.