Friday, March 25, 2016

Learning how to learn

Just finished a great book by Barbara Oakley on learning how to learn. As I have experienced as a professor teaching courses that students consider difficult, many students do not know how to study effectively. Just reading books will not get you far in learning mathematics. Dr. Oakley failed her mathematics and science courses when she was in school, then learned Russian and became a translator in the army. Now she is a full professor of engineering at Oakland University. What happened? She learned how to learn the material she was interested in effectively.
An important lesson is to alternate between focused and unfocused attention. Your brain keeps working on problems when you are not concentrating on them. It is important to start studying early and make use of this background processing. Cramming for many hours just before an exam is very ineffective. And practice. Do your homework and practice. You don't develop skills in sports or music by trying something once; you have to practice!
For a brief overview of her book, see her TED talk.


Monday, August 17, 2015

Bridge maintenance as a collective action problem

Collapsed bridge on the I-10
On July 19th a bridge collapsed on the I-10, the highway that connects Phoenix with Los Angeles. Heavy rain caused flash flooding that undermined the eastbound bridge, causing it to give way. As a consequence the I-10, the main highway between Arizona and California, was closed for a week, and travelers had to drive a few hundred extra miles to reach their destinations.
Is the bridge collapse a rare accident, or can we expect more collapses due to the increased intensity of rainfall events and a lack of maintenance of bridges (and infrastructure in general)?
This bridge collapse coincided with my reading of the book "Too Big to Fall" by Barry LePatner, a construction lawyer in New York City. LePatner discusses a number of cases in detail, such as the collapse of the I-35W bridge in Minneapolis, and provides a historical analysis of the incentives to build and maintain infrastructure. Unfortunately, the incentives are tailored to building new roads and bridges (where the costs are shared with the federal government), and maintenance tends to be postponed. Furthermore, there has been a lack of coordination on how inspections need to be done. While there are now standard inspections, new technologies might be used to create smart infrastructure that provides more frequent, relevant information on key indicators of the structural integrity of bridges.
The book provides interesting material to start looking into the provision of bridge services as a collective action problem, where current perverse incentives lead to an underprovision of maintenance and inspections. Together with the increase in extreme weather events due to climate change, we can expect this to have major consequences for our society. Unfortunately, these topics do not get the attention in political debates that they deserve. Strategies have to be developed for coping with failing infrastructure and increased vulnerability.





Thursday, July 23, 2015

Who rules the world?

The new book by Paul Steinberg, a professor of Political Science and Environmental Policy at Harvey Mudd College, is entitled Who Rules the Earth? The book is an engaging discussion of how rules govern our lives and our interaction with the environment. Contrary to what many people may assume, rules are constructed because individuals care about a problem and try to find solutions. For example, building codes have a potentially big environmental impact, and thus changing those rules or creating new standards (like LEED) can have a major effect. Based on personal observations and stories, Professor Steinberg shows us that rules are everywhere and are key to understanding how we can solve environmental challenges.
His students created a website where you can find animations, news updates, and a game to get immersed even more in rules and the environment. Although rules and regulations may sound like a boring topic, Steinberg does a great job of making them engaging and raising awareness.

Tuesday, June 16, 2015

Eco: how to save the world?

In my previous post I lamented the lack of resource dynamics in (video) games. Some of you let me know about resource constraints in some games, but many of you recognized the lack of ecological realism. Now a new multiplayer game has been announced: Eco, which will focus on the fragility of the world we live in. It is a survival game in a world whose resources you share with others. The ecosystem consists of predators and prey: eat or be eaten. As a human avatar you need to harvest resources to survive, but if you overuse the resources, you and the whole world will be affected. In fact, the developer mentions that the server might be wiped if the world collapses. This might sound dramatic, especially since many lifeforms would outlive humans, but it is an interesting option. No restart is possible, and there are no extra lives.
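Out of curiosity, here is a minimal sketch, certainly not Eco's actual engine, of what harvested predator-prey dynamics can look like; all parameter values are illustrative guesses. Push the harvest rate past what the ecosystem can regenerate and everything unravels:

```python
# A minimal sketch (not Eco's actual engine) of predator-prey dynamics
# with a human harvest term. All parameter values are illustrative guesses.
# Prey grow logistically, predators eat prey, and players harvest prey.

def simulate(harvest_rate, years=200, dt=0.1):
    """Euler integration of a harvested predator-prey system."""
    prey, predators = 500.0, 100.0
    for _ in range(int(years / dt)):
        regrowth = 0.3 * prey * (1 - prey / 1000)   # logistic prey growth
        eaten = 0.001 * prey * predators            # predation losses
        harvested = harvest_rate * prey             # what the players take
        prey += dt * (regrowth - eaten - harvested)
        predators += dt * (0.0005 * prey * predators - 0.15 * predators)
        prey, predators = max(prey, 0.0), max(predators, 0.0)
    return prey, predators

# Low harvest: coexistence. Medium: predators starve. High: total collapse.
for rate in (0.05, 0.25, 0.35):
    prey, predators = simulate(rate)
    print(f"harvest rate {rate:.2f}: prey={prey:.0f}, predators={predators:.0f}")
```

Even in this toy model there is a threshold: below it humans, prey, and predators coexist; above it the predators starve first, and eventually the prey are gone as well.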
I wonder how the developers will deal with trolls, those gamers who purposely want to collapse the system. Anyway, check out the trailer and see whether this would be an interesting alternative to those robust game worlds.

Tuesday, January 13, 2015

Resource dynamics in video games

During the recent winter break some younger family members introduced me to Clash of Clans and Boom Beach, strategy games you can play on an iPad or other devices. You make investment decisions about defensive and offensive infrastructure as well as infrastructure to extract resources. For example, in Boom Beach (see figure below) you occupy an island and use gold and wood to build and support your army (to attack the islands of other players and rob their resources). There is a sawmill that generates construction material without ever reducing the amount of forest on the island. In fact, if you have collected sufficient gold from the unlimited gold supply, you can increase the capacity of the sawmill, which still does not affect the number of trees in the landscape.
So we can conclude that the game has no relevant renewable (let alone non-renewable) resource dynamics. Just invest in better technology to extract the resources and everything will be fine. Why does this bother me? I don't want to spoil the game. In fact, these games are entertaining and addictive, which is exactly why millions of people play them. But the lack of relevant resource dynamics affects people's perception of how to solve resource problems. I understand that it might be more challenging to develop a game with limited resources that attracts millions of players (everyone wants to grow their army to stay in the game). On the other hand, it is like assuming gravity does not exist for the convenience of the game dynamics.
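To make concrete what even a minimal version of such dynamics would look like, here is a small sketch of a forest that regrows logistically while a sawmill of fixed capacity cuts from it; the parameters are illustrative and not taken from the game. Upgrading the sawmill past the forest's maximum sustainable yield does not just slow you down, it eventually leaves you with no forest at all:

```python
# A small sketch of the renewable-resource dynamics the game leaves out:
# a forest that regrows logistically while a sawmill of fixed capacity
# cuts from it. Parameters are illustrative, not taken from Boom Beach.

def forest_after(sawmill_capacity, years=100, growth_rate=0.2,
                 carrying_capacity=1000.0):
    """Forest stock left after `years` of regrowth minus extraction."""
    stock = carrying_capacity  # start with a full forest
    for _ in range(years):
        regrowth = growth_rate * stock * (1 - stock / carrying_capacity)
        harvest = min(sawmill_capacity, stock)  # cannot cut more than exists
        stock = max(stock + regrowth - harvest, 0.0)
    return stock

# Maximum sustainable yield of this forest is r*K/4 = 0.2*1000/4 = 50/year.
for capacity in (30, 50, 60):
    print(f"sawmill capacity {capacity}: stock after 100 years = "
          f"{forest_after(capacity):.0f}")
```

A capacity below the sustainable yield settles at a healthy forest, a capacity right at it drifts toward a tipping point, and a capacity above it wipes the forest out entirely.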
It might be an interesting challenge for the gaming industry to try to capture relevant resource dynamics, such that players learn not only to develop complex strategies to combat other armies but also gain a better understanding of the trade-off between the short-term benefits of resource extraction and the long-term consequences for a livable planet.



Thursday, December 4, 2014

Archiving practice for model code of agent-based models

There is increasing concern about the repeatability and reproducibility of computational science (see also here, here, here, here and here). If the computational scientific enterprise wants to be cumulative, more transparency is required, including the archiving of computer code in public repositories. This also holds for agent-based modeling, an increasingly popular methodology in the social and life sciences.
I show here some initial results of an analysis of the practice of archiving agent-based models. Five journals were selected that regularly publish research using agent-based models: Advances in Complex Systems, Computational and Mathematical Organization Theory, Ecological Modelling, Environmental Modelling & Software, and the Journal of Artificial Societies and Social Simulation. Using the ISI Web of Science, we searched for all articles in those five journals from 2010 to 2014 using the search term "agent-based model*". This resulted in 255 articles as of September 5, 2014, of which 56 were disregarded because they did not discuss an agent-based model itself.
Of the 199 remaining articles, 135 were found not to provide the computational model's source code. 21 articles referred to an institutional or personal homepage; in 5 of those cases the link resulted in a 404 Not Found error and we recorded the code as not available. In 17 cases the code was included as an electronic appendix of the journal. Only 31 articles provided the model code in a public archive, of which 26 were stored in the CoMSES Computational Model Library. The other 5 models were archived in repositories such as Bitbucket, GitHub, Google Code, SourceForge, and the NetLogo community models site.
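As a back-of-the-envelope check, the categories reconcile with the 199 articles if we assume the 5 dead links are counted both among the homepage referrals and among the articles without available code (the raw table is not reproduced here):

```python
# Tally of the archiving categories reported above (counts from the post).
# Assumption: the 5 dead (404) homepage links are counted both in the
# "homepage" category and in the "no code available" category.
counts = {
    "no code available (incl. 5 dead links)": 135,
    "institutional/personal homepage": 21,
    "electronic appendix of the journal": 17,
    "public archive (26 CoMSES + 5 other)": 31,
}
total_articles = 199

for category, n in counts.items():
    print(f"{category}: {n} ({100 * n / total_articles:.1f}%)")

overlap = 5  # dead links double-counted across two categories
print("sum over categories minus overlap:",
      sum(counts.values()) - overlap)  # 204 - 5 = 199
```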
Percentage of archived models per year.
Over the years there has been improvement in model archiving: in 2010, 75% of the models were not archived, whereas in 2014, 50% of the models were archived. The increasing availability of public archives has enabled authors to archive their models more frequently. The majority of those archived models are in OpenABM.
Percentage of archived models per journal.
As we can see, most models are still not archived. One journal has championed model archiving, with more than 50% of its publications associated with a publicly archived model, whereas the other journals have archiving percentages between 10% and 20%.
Since most research is sponsored by tax money, sponsors sometimes explicitly require that data, including software code, be made publicly available. We find that papers acknowledging the two main sponsors (16 by the European Commission and 21 by the National Science Foundation) show low compliance with this best practice. In both cases only 15% of the models are available in public archives, significantly lower than for articles that do not list a sponsor (29%) or that list other sponsors (24%).
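For readers who want to probe such comparisons, a natural tool for samples this small is Fisher's exact test. The sketch below uses hypothetical placeholder counts chosen to roughly match the reported percentages, since the post does not list all the raw numbers:

```python
# Illustrative only: the post reports percentages but not all raw counts,
# so these numbers are hypothetical placeholders chosen to roughly match
# the reported rates (~15% archived for EC/NSF papers, ~29% otherwise).
from scipy.stats import fisher_exact

sponsored = (6, 37)      # (archived, total) for EC+NSF papers, placeholder
unsponsored = (35, 120)  # (archived, total) for other papers, placeholder

table = [
    [sponsored[0], sponsored[1] - sponsored[0]],
    [unsponsored[0], unsponsored[1] - unsponsored[0]],
]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```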
Currently the scope of the analysis is being extended to about 3,000 articles (using the search term "agent-based model*" without restrictions on years and journals). Besides getting a better picture of current archiving practices, we also hope this activity leads to more awareness of the problem and of the need for journals to strengthen requirements for archiving code and documentation in public repositories.



Thursday, September 18, 2014

Open Science: science for the 21st century experiences roadblocks from 19th-century incentive structures


The book Reinventing Discovery by Michael Nielsen is a joy to read. Technological development enables the production of knowledge by large groups of people making small contributions, instead of the isolated inventions of 19th-century science. Nielsen discusses examples like the Polymath Project, where mathematicians (from Fields Medalists to high school students) collaborate to solve problems, Galaxy Zoo (classifying galaxies), and Foldit (folding proteins). These success stories brought large numbers of people together, often non-specialists, to solve problems faster and better than individuals could. Key components for success are the ability to split the problem into smaller modules and clear measures of performance (such as getting a score when folding proteins).
So citizens have started solving science problems, but a harder problem is to have scientists doing their research in the open (open science), which enables others to build on it. The incentive structures in science are perverse when it comes to stimulating discovery: in fact, we use incentive structures from the 19th century, ignoring the potential increase in knowledge production if we used a 21st-century approach. The networked open science that Nielsen aspires to experiences major challenges because scientists have incentives to publish results in high-profile journals but get no recognition for sharing data and/or computer code, which are often the main outcomes of a project.
As we have experienced in the development of OpenABM, scientists like to download the models of others but are reluctant to archive their own work. Furthermore, journals are reluctant to raise their standards of transparency, and sponsors like the National Science Foundation require data management plans but do not invest sufficiently in the cyberinfrastructure to make this possible.
Although there is self-organization at a small scale in the science community, sponsors need to step up and invest seriously to make science for the 21st century possible.