David Bradley in Spectroscopy.now: “American researchers have used “crowdsourcing” – the cooperation of a large number of interested non-scientists via the internet – to help them identify a new fungus. The species contains unusual metabolites, which were isolated and characterized with the help of vibrational circular dichroism (VCD). One compound shows potential antitumour activity.
So far, a mere 7 percent of the more than 1.5 million species of fungi thought to exist have been identified, and an even smaller fraction of these have been the subject of research seeking bioactive natural products. …Robert Cichewicz of the University of Oklahoma, USA, and his colleagues hoped to remedy this situation by working with a collection of several thousand fungal isolates from three regions: Arctic Alaska, tropical Hawaii, and subtropical to semiarid Oklahoma. Collaborator Susan Mooberry of the University of Texas at San Antonio carried out biological assays on many fungal isolates looking for antitumour activity among the metabolites in Cichewicz’s collection. A number of interesting substances were identified…
However, the researchers quickly realized that the efforts of a single research team were inadequate to obtain and test samples representing the immense diversity of the thousands of fungi they hoped to screen. They thus turned to citizen scientists in a “crowdsourcing” initiative. In this approach, lay people with an interest in science, and even fellow scientists in other fields, were recruited to collect and submit soil from their gardens.
As the samples began to arrive, the team quickly found among them a previously unknown fungal strain – a Tolypocladium species – growing in a soil sample from Alaska. Colleague Andrew Miller of the University of Illinois identified this new fungus, which was found to readily produce new compounds in response to changes in its laboratory growth conditions. Moreover, extraction of the active chemicals from the isolate revealed a unique metabolite which was shown to have significant antitumour activity in laboratory tests. The team suggests that this novel substance may represent a valuable new approach to cancer treatment because it blocks certain biochemical mechanisms that, with conventional drugs, lead to the emergence of drug resistance in cancer…
The researchers point out the essential roles that citizen scientists can play. “Many of the groundbreaking discoveries, theories, and applied research during the last two centuries were made by scientists operating from their own homes,” Cichewicz says. “Although much has changed, the idea that citizen scientists can still participate in research is a powerful means for reinvigorating the public’s interest in science and making important discoveries,” he adds.”
A Bottom-Up Smart City?
Alicia Rouault at Data-Smart City Solutions: “America’s shrinking cities face a tide of disinvestment, abandonment, vacancy, and a shift toward deconstruction and demolition followed by strategic reinvestment, rightsizing, and a host of other strategies designed to renew once-great cities. Thriving megacity regions are experiencing rapid growth in population, offering a different challenge for city planners to redefine density, housing, and transportation infrastructure. As cities shrink and grow, policymakers are increasingly called to respond to these changes by making informed, data-driven decisions. What is the role of the citizen in this process of collecting and understanding civic data?
Writing for Forbes in “Open Sourcing the Neighborhood,” Saskia Sassen, Professor of Sociology at Columbia University, calls for “open source urbanism” as an antidote to the otherwise top-down smart city movement. This form of urbanism involves opening traditional verticals of information within civic and governmental institutions. Citizens can engage with and understand the logic behind decisions by exploring newly opened administrative data. Beyond opening these existing datasets, Sassen points out that citizen experts hold invaluable institutional memory that can serve as an alternate and legitimate resource for policymakers, economists, and urban planners alike.
In 2012, we created a digital platform called LocalData to address the production and use of community-generated data in a municipal context. LocalData is a digital mapping service used globally by universities, non-profits, and municipal governments to gather and understand data at a neighborhood scale. In contrast to traditional Census or administrative data, which is produced by a central agency and collected infrequently, our platform provides a simple method for both community-based organizations and municipal employees to gather real-time data on project-specific indicators: property conditions, building inspections, environmental issues or community assets. Our platform then visualizes data and exports it into formats integrated with existing systems in government to seamlessly provide accurate and detailed information for decision makers.
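The export step described above can be illustrated with a minimal sketch. This is a hypothetical example, not LocalData's actual API: the record fields, the `to_geojson` helper, and the sample coordinates are all invented for illustration. It converts flat, community-collected survey records into GeoJSON, a standard format that municipal GIS systems commonly ingest.

```python
import json

# Hypothetical community-collected survey records: each pairs a location
# with project-specific indicators (here, property condition).
records = [
    {"lon": -83.0458, "lat": 42.3314, "condition": "vacant", "fire_damage": True},
    {"lon": -83.0501, "lat": 42.3329, "condition": "occupied", "fire_damage": False},
]

def to_geojson(records):
    """Convert flat survey records into a GeoJSON FeatureCollection."""
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
            # Everything that is not a coordinate becomes a feature property.
            "properties": {k: v for k, v in r.items() if k not in ("lon", "lat")},
        }
        for r in records
    ]
    return {"type": "FeatureCollection", "features": features}

print(json.dumps(to_geojson(records), indent=2))
```

A file in this shape can be loaded directly by most desktop and web mapping tools, which is what makes hand-off to existing government systems straightforward.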
LocalData began as a project in Detroit, Michigan, where the city was tackling a very real lack of standard, updated, and consistent information on the condition and status of vacant and abandoned properties. Many of these properties were owned by the city and county due to high foreclosure rates. One of Detroit’s strategies for combating crime and stabilizing neighborhoods is to demolish property in a targeted fashion. This strategy serves as a political win as much as an effective way to curb the secondary effects of vacancy: crime, drug use, and arson. Using LocalData, the city mapped critical corridors of emergent commercial property as an analysis tool for where to place investment, and documented thousands of vacant properties to understand where to target demolition.
Vacancy is not unique to the Midwest. Following our work with the Detroit Mayor’s office and planning department, LocalData has been used in dozens of other cities in the U.S. and abroad. Currently the Smart Chicago Collaborative is using LocalData to conduct a similar audit of vacant and abandoned property in southwest Chicago. Though an effective tool for capturing building-specific information, LocalData has also been used to capture behavior and the movement of goods. The MIT Megacities Logistics Lab has used LocalData to map and understand the intensity of urban supply chains by interviewing shop owners and mapping delivery routes in global megacities in Mexico, Colombia, Brazil and the U.S. The resulting information has been used with analytical models to help both city officials and companies design better city logistics policies and operations….”
Using Social Media in Rulemaking: Possibilities and Barriers
New paper by Michael Herz (Cardozo Legal Studies Research Paper No. 417): “Web 2.0” is characterized by interaction, collaboration, non-static web sites, use of social media, and creation of user-generated content. In theory, these Web 2.0 tools can be harnessed not only in the private sphere but as tools for an e-topia of citizen engagement and participatory democracy. Notice-and-comment rulemaking is the pre-digital government process that most approached (while still falling far short of) the e-topian vision of public participation in deliberative governance. The notice-and-comment process for federal agency rulemaking has now changed from a paper process to an electronic one. Expectations for this switch were high; many anticipated a revolution that would make rulemaking not just more efficient, but also more broadly participatory, democratic, and dialogic. In the event, the move online has not produced a fundamental shift in the nature of notice-and-comment rulemaking. At the same time, the online world in general has come to be increasingly characterized by participatory and dialogic activities, with a move from static, text-based websites to dynamic, multi-media platforms with large amounts of user-generated content. This shift has not left agencies untouched. To the contrary, agencies at all levels of government have embraced social media – by late 2013 there were over 1,000 registered federal agency Twitter feeds and over 1,000 registered federal agency Facebook pages, for example – but these have been used much more as tools for broadcasting the agency’s message than for dialogue or obtaining input. All of which invites the question of whether agencies could or should directly rely on social media in the rulemaking process.
This study reviews how federal agencies have been using social media to date and considers the practical and legal barriers to using social media in rulemaking, not just to raise the visibility of rulemakings, which is certainly happening, but to gather relevant input and help formulate the content of rules.
The study was undertaken for the Administrative Conference of the United States and is the basis for a set of recommendations adopted by ACUS in December 2013. Those recommendations overlap with but are not identical to the recommendations set out herein.”
Open Data in Action
- Brightscope, a San Diego-based company that leverages data from the Department of Labor, the Securities and Exchange Commission, and the Census Bureau to rate consumers’ 401(k) plans objectively on performance and fees, so companies can choose better plans and employees can make better decisions about their retirement options.
- AllTuition, a Chicago-based startup that provides services—powered by data from the Department of Education on Federal student financial aid programs and student loans—to help students and parents manage the financial-aid process for college, in part by helping families keep track of deadlines and walking them through the required forms.
- Archimedes, a San Francisco healthcare modeling and analytics company that leverages Federal open data from the National Institutes of Health, the Centers for Disease Control and Prevention, and the Centers for Medicare & Medicaid Services to provide doctors with more effective individualized treatment plans and to enable patients to make informed health decisions.
See also:
Open Government Data: Companies Cash In
Open Budgets Portal
About: “The Open Budgets Portal is the first effort to create a one-stop shop for budget data worldwide with the hope of bringing visibility to countries’ efforts in this field, facilitating access and promoting use of spending data, and motivating other countries into action.
The portal offers the opportunity to showcase a subset of developing countries and subnational entities (identified by blue markers on the map) that have excelled in exploring new frontiers of fiscal transparency by choosing to disseminate their entire public spending datasets in accessible formats (i.e., soft copy), with the expectation that these efforts could motivate other countries into action. Users will be able to download the entire public expenditure landscape of the members of the portal in consolidated files, all of which were rigorously collected, cleaned and verified through the BOOST Initiative.
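To illustrate what working with such a consolidated expenditure file might look like, here is a minimal sketch. The column names (`admin_unit`, `year`, `executed_amount`) and the figures are invented for the example and do not reflect the actual BOOST schema; the point is simply that line-item spending data in a flat file can be aggregated with a few lines of code.

```python
import csv
import io
from collections import defaultdict

# Invented sample rows mimicking a consolidated expenditure file:
# one row per budget line, with an administrative unit and executed amount.
sample_csv = """admin_unit,year,executed_amount
Ministry of Health,2012,1500000
Ministry of Health,2012,250000
Ministry of Education,2012,2200000
"""

def spending_by_unit(csv_text):
    """Sum executed spending per administrative unit."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["admin_unit"]] += float(row["executed_amount"])
    return dict(totals)

print(spending_by_unit(sample_csv))
```

This kind of aggregation is the first step toward the analytical use of spending data that the portal aims to encourage.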
For each of these countries, the site also includes links to their original open data portals, which provide additional important information (e.g., frequencies higher than annual, links to output data and other socioeconomic indicators). While every effort has been made to certify the quality of these databases according to the BOOST approach and methodology, users are encouraged to refer back to the country-owned open data portals to ensure complete consistency of the data with published official figures, as well as to consult the accompanying user manuals for potential caveats on uses of the data.
This portal represents a starting point for building momentum around the growing interest in fiscal transparency and the importance of data for enhanced decision-making processes and improved budget outcomes and accountability. While most initiatives on open budgets rightfully center on the availability of key documents, little focus has been given to the quality of data dissemination and to the importance of its analytical use for incorporation into evidence-based decision-making processes.
This Open Budgets Portal aims to fill this gap by providing access to budget data worldwide, and particularly to the most disaggregated and comprehensive data collected through the BOOST Initiative. The portal combines this information with a variety of tools, manuals, reports and best practices aimed at stimulating use by intermediaries, as well as easier-to-interpret visualizations for non-experts. Our objective is to encourage all potential users of this data to unlock its analytical power.
The Open Budgets Portal was launched at the event “Boosting Fiscal Transparency for Better Policy Outcomes,” held on December 17, 2013 in Washington, DC. The following presentations were shown at the event:
Presentation of the Open Budgets Portal by Massimo Mastruzzi, Senior Economist, Open Government, World Bank.
Building a Citizen’s Budget Understanding – BudgetStories.md by Victoria Vlad, Economist of “Expert-Grup” from the Republic of Moldova.”
Selected Readings on Data Visualization
The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of data visualization was originally published in 2013.
Data visualization is a response to the ever-increasing amount of information in the world. With big data, informatics and predictive analytics, we have an unprecedented opportunity to revolutionize policy-making. Yet data by itself can be overwhelming. New tools and techniques for visualizing information can help policymakers clearly articulate insights drawn from data. Moreover, the rise of open data is enabling those outside of government to create informative and visually arresting representations of public information that can be used to support decision-making by those inside or outside governing institutions.
Selected Reading List (in alphabetical order)
- D.J. Duke, K.W. Brodlie, D.A. Duce and I. Herman — Do You See What I Mean? [Data Visualization] — a paper arguing for a systematic ontology for data visualization.
- Michael Friendly — A Brief History of Data Visualization — a brief overview of the history of data visualization that traces the path from early cartography and statistical graphics to modern visualization practices.
- Alvaro Graves and James Hendler — Visualization Tools for Open Government Data — a paper arguing for wider use of visualization tools to help open government data achieve its full potential for impact.
- César A. Hidalgo — Graphical Statistical Methods for the Representation of the Human Development Index and Its Components — an argument for visualizing complex data to aid in human development initiatives.
- Genie Stowers — The Use of Data Visualization in Government — a report aimed at helping public sector managers make the most of data visualization tools and practices.
Annotated Selected Reading List (in alphabetical order)
Duke, D.J., K.W. Brodlie, D.A. Duce and I. Herman. “Do You See What I Mean? [Data Visualization].” IEEE Computer Graphics and Applications 25, no. 3 (2005): 6–9. http://bit.ly/1aeU6yA.
- In this paper, the authors argue that a more systematic ontology for data visualization is needed to ensure the successful communication of meaning. “Visualization begins when someone has data that they wish to explore and interpret; the data are encoded as input to a visualization system, which may in its turn interact with other systems to produce a representation. This is communicated back to the user(s), who have to assess this against their goals and knowledge, possibly leading to further cycles of activity. Each phase of this process involves communication between two parties. For this to succeed, those parties must share a common language with an agreed meaning.”
- The authors “believe that now is the right time to consider an ontology for visualization,” and note that “as visualization moves from just a private enterprise involving data and tools owned by a research team into a public activity using shared data repositories, computational grids, and distributed collaboration…[m]eaning becomes a shared responsibility and resource. Through the Semantic Web, there is both the means and motivation to develop a shared picture of what we see when we turn and look within our own field.”
Friendly, Michael. “A Brief History of Data Visualization.” In Handbook of Data Visualization, 15–56. Springer Handbooks Comp.Statistics. Springer Berlin Heidelberg, 2008. http://bit.ly/17fM1e9.
- In this paper, Friendly explores the “deep roots” of modern data visualization. “These roots reach into the histories of the earliest map making and visual depiction, and later into thematic cartography, statistics and statistical graphics, medicine and other fields. Along the way, developments in technologies (printing, reproduction), mathematical theory and practice, and empirical observation and recording enabled the wider use of graphics and new advances in form and content.”
- Just as the visualization of data in general is far from a new practice, Friendly shows that the graphical representation of government information has a similarly long history. “The collection, organization and dissemination of official government statistics on population, trade and commerce, social, moral and political issues became widespread in most of the countries of Europe from about 1825 to 1870. Reports containing data graphics were published with some regularity in France, Germany, Hungary and Finland, and with tabular displays in Sweden, Holland, Italy and elsewhere.”
Graves, Alvaro and James Hendler. “Visualization Tools for Open Government Data.” In Proceedings of the 14th Annual International Conference on Digital Government Research, 136–145. Dg.o ’13. New York, NY, USA: ACM, 2013. http://bit.ly/1eNSoXQ.
- In this paper, the authors argue that, “there is a gap between current Open Data initiatives and an important part of the stakeholders of the Open Government Data Ecosystem.” As it stands, “there is an important portion of the population who could benefit from the use of OGD but who cannot do so because they cannot perform the essential operations needed to collect, process, merge, and make sense of the data. The reasons behind these problems are multiple, the most critical one being a fundamental lack of expertise and technical knowledge. We propose the use of visualizations to alleviate this situation. Visualizations provide a simple mechanism to understand and communicate large amounts of data.”
- The authors also describe a prototype of a tool to create visualizations based on OGD with the following capabilities:
- Facilitate visualization creation
- Exploratory mechanisms
- Viralization and sharing
- Repurpose of visualizations
Hidalgo, César A. “Graphical Statistical Methods for the Representation of the Human Development Index and Its Components.” United Nations Development Programme Human Development Reports, September 2010. http://bit.ly/166TKur.
- In this paper for the United Nations Human Development Programme, Hidalgo argues that “graphical statistical methods could be used to help communicate complex data and concepts through universal cognitive channels that are heretofore underused in the development literature.”
- To support his argument, representations are provided that “show how graphical methods can be used to (i) compare changes in the level of development experienced by countries (ii) make it easier to understand how these changes are tied to each one of the components of the Human Development Index (iii) understand the evolution of the distribution of countries according to HDI and its components and (iv) teach and create awareness about human development by using iconographic representations that can be used to graphically narrate the story of countries and regions.”
Stowers, Genie. “The Use of Data Visualization in Government.” IBM Center for The Business of Government, Using Technology Series, 2013. http://bit.ly/1aame9K.
- This report seeks “to help public sector managers understand one of the more important areas of data analysis today — data visualization. Data visualizations are more sophisticated, fuller graphic designs than the traditional spreadsheet charts, usually with more than two variables and, typically, incorporating interactive features.”
- Stowers also offers numerous examples of “visualizations that include geographical and health data, or population and time data, or financial data represented in both absolute and relative terms — and each communicates more than simply the data that underpin it. In addition to these many examples of visualizations, the report discusses the history of this technique, and describes tools that can be used to create visualizations from many different kinds of data sets.”
Lessons in the crowdsourced verification of news from Storyful and Reddit’s Syria forum
Mathew Ingram at GigaOm: “One of the most powerful trends in media over the past year is the crowdsourced verification of news, whether it’s the work of a blogger like Brown Moses or former NPR journalist Andy Carvin. Two other interesting efforts in this area are the “open newsroom” approach taken by Storyful — which specializes in verifying social-media reports for mainstream news entities — and a Reddit forum devoted to crowdsourcing news coverage of the civil war in Syria.
Storyful journalist Joe Galvin recently looked at some of the incidents that the company has helped either debunk or verify over the past year — including a fake tweet from the official account of the Associated Press about explosions at the White House (which sent the Dow Jones index plummeting before it was corrected), a claim from Russian authorities that a chemical attack in Syria had been premeditated, and a report from investigative journalist Seymour Hersh about the same attack that questioned whether the government had been involved….
Reddit, meanwhile, has been conducting some “open newsroom”-style experiments of its own around a number of news events, including the Syrian civil war. The site has come under fire in the past for some of those efforts — including the attempt to identify the bombers in the Boston bombings case, which went badly awry — but the Syrian thread in particular is a good example of how a smart aggregator can make sense of an ongoing news event. In a recent post at a site called Dissected News, one of the moderators behind the /r/SyrianCivilWar sub-Reddit — a 22-year-old law student named Christopher Kingdon (or “uptodatepronto” as he is known on the site) — wrote about his experiences with the forum, which is trying to be a broadly objective source for breaking news and information about the conflict….
Some of what the moderators do in the forum is similar to the kind of verification that Storyful or the BBC’s “user-generated content desk” do — checking photos and video for obvious signs of fakery and hoaxes. But Kingdon also describes how much effort his team of volunteers puts into ensuring that the sub-Reddit doesn’t degenerate into trolling or flame-wars. Strict rules are enforced “to prevent personal attacks, offensive and violent language and racism” and the moderators favor posts that “utilize sources, background information and a dash of common sense.”
The Impact of Innovation Inducement Prizes
From the Compendium of Evidence on Innovation Policy/NESTA: “Innovation inducement prizes are one of the oldest types of innovation policy measure. Their popularity gradually decreased during the early 20th century; however, such prizes have regained some of their popularity since the 1990s, with new prizes awarded by the US X Prize Foundation and with the current US Administration’s efforts to use them in various government departments as an innovation policy instrument. Innovation prizes are also becoming an important innovation policy instrument in the UK. A recent report by McKinsey & Company (2009) estimates the value of prizes awarded to be between £600 million and £1.2 billion. Despite the growing popularity of innovation inducement prizes, the impact of this innovation policy measure is still not well understood. This report brings together the existing evidence on the effects of innovation inducement prizes by drawing on a number of ex-ante and ex-post evaluations as well as a limited academic literature. It focuses on ex-ante innovation inducement prizes, where the aim is to induce investment or attention towards a specific goal or technology; it does not discuss the impact of ex-post recognition prizes, where the prize is given as recognition after the intended outcome happens (e.g. the Nobel Prize).
Innovation inducement prizes have a wide range of rationales, and there is no agreed-upon dominant rationale in the literature. Traditionally, prizes have been seen as an innovation policy instrument that can overcome market failure by creating an incentive for the development of a particular technology or technology application. A second rationale is the implementation of demonstration projects, which target not only the creation of a specific technology but also the demonstration of its feasible application. A third rationale is related to the creation of a technology that will later be put in the public domain to attract subsequent research. Prizes are also increasingly organised for community and leadership building. As prizes probably allow more flexibility than most other innovation policy instruments, there is a large number of different prize characteristics and thus a vast number of prize typologies based on these characteristics.
Evidence on the effectiveness of prizes is scarce. There are only a few evaluations or academic works that deal with the creation of innovation output, and even those that do only rarely deal with additionality. Only a very limited number of studies have looked at whether innovation inducement prizes led to more innovation or innovation outputs. As well as developing a particular technology, innovation inducement prizes create prestige for both the prize sponsor and the entrants. Prizes might also increase public and sectoral awareness of specific technology issues. Related to the prestige gained from prizes is the motivation of participants as a conditioning factor for innovation performance. Design issues are the main concern of the prizes literature, reflecting the importance of careful design for achieving the desired effects (and limiting undesired ones). A relatively large number of studies have investigated the influence of the design of the prize objective on innovation performance. A number of studies point out, mostly on the basis of ex-ante evaluations, that prizes should sometimes be accompanied or followed by other demand-side initiatives to fulfil their objectives. Finally, prizes are also seen as a valuable opportunity for experimentation in innovation policy.
It is evident from the literature we analysed that the evidence on the impact of innovation inducement prizes is scarce. There is also a consensus that innovation inducement prizes are not a substitute for other innovation policy measures but are complementary under certain conditions. Prizes can be effective in creating innovation through more intense competition, the engagement of a wide variety of actors, the distribution of risk across many participants, and the exploitation of more flexible solutions enabled by the less prescriptive definition of the problem in prizes. They can overcome some of the inherent barriers to other instruments, but if prizes are poorly designed, managed and awarded, they may be ineffective or even harmful.”
20 Innovations that Mattered in 2013
New Guide from GovLoop: “The end of the year means two things: setting unrealistic New Year’s resolutions and endless retrospectives. While we can’t force you to put down the cake and pick up a carrot, we can help you to do your job better by highlighting some of the biggest and best innovations to come out of government in the last 365 days.
The past year brought us the Interior Department’s Instagram feed and Colorado’s redesigned website. It also brought us St. Louis’ optimized data analytics that make their city safer and North Carolina’s iCenter that adopted a “try before you buy” policy.
All of these new technologies and tactics saved time and resources, critical outcomes in the current government landscape where budget cuts are making each new purchase risky.
But these were not the only buzzworthy projects for government technology in 2013. In this end-of-year issue, GovLoop analyzed the 20 best innovations in government in four different categories:
- Mobile Apps Movers and Shakers
- Big Data Dynamos
- Social Media Mavericks
- Website Wonders
We also asked two of the most innovative Chief Information Officers in the country to don some Google Glass. In a year when the government shutdown and sequestration brought progress to a screeching halt, many agencies were able to rise above the inauspicious environment and produce groundbreaking and innovative programs.
For instance, when the horrible bombings brought terror to the finish line of the Boston Marathon, the local police department sprang into action. They immediately mobilized their forces on the ground. But then they did something else, too. The Boston Police Department took to Twitter. The social media team was informative, timely, and accurate. The BPD flipped the script on emergency media management. They innovated in a time of crisis and got people the information they needed in a timely and appropriate manner.
We know that oftentimes it is hard to see through the budget cuts and government shutdowns, but government innovation is all around us. Our goal with this guide is to showcase programs that are not only making a difference but also demonstrating risk and reward….”