Next.Data.gov


Nick Sinai at the White House Blog: “Today, we’re excited to share a sneak preview of a new design for Data.gov, called Next.Data.gov. The upgrade builds on the President’s May 2013 Open Data Executive Order that aims to fuse open-data practices into the Federal Government’s DNA. Next.Data.gov is far from complete (think of it as a very early beta), but we couldn’t wait to share our design approach and the technical details behind it – knowing that we need your help to make it even better. Here are some key features of the new design:

[Image: Next.Data.gov preview]

Leading with Data: The Data.gov team at the General Services Administration (GSA), a handful of Presidential Innovation Fellows, and OSTP staff designed Next.Data.gov to put data first. The team studied usage patterns on Data.gov and found that visitors were hungry for examples of how data are used. The team also noticed many sources outside of Data.gov, such as tweets and articles, featuring Federal datasets in action. So Next.Data.gov includes a rich stream that enables each data community to communicate how its datasets are impacting companies and the public.

[Image: Next.Data.gov data stream]

In this dynamic stream, you’ll find blog posts, tweets, quotes, and other features that more fully showcase the wide range of information assets that exist within the vaults of government.
Powerful Search: The backend of Next.Data.gov is CKAN, powered by Solr—a powerful search engine that will make it even easier to find relevant datasets online. Suggested search terms have been added to help users find (and type) things faster. Next.Data.gov will start to index datasets from agencies that publish their catalogs publicly, in line with the President’s Open Data Executive Order. The early preview launching today features datasets from the Department of Health and Human Services—one of the first Federal agencies to publish a machine-readable version of its data catalog.
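To make the search stack concrete: every CKAN instance exposes a package_search action backed by its Solr index. Here is a minimal Python sketch, using the public catalog.data.gov endpoint as an assumed example:

```python
# Minimal sketch: querying a CKAN catalog through its standard action API.
# The base URL is an assumption for illustration; any CKAN instance
# exposes package_search, which is backed by the Solr index noted above.
import json
import urllib.parse
import urllib.request

BASE = "https://catalog.data.gov/api/3/action/package_search"  # assumed URL

def search_datasets(query, rows=5):
    url = f"{BASE}?q={urllib.parse.quote(query)}&rows={rows}"
    with urllib.request.urlopen(url) as resp:
        result = json.load(resp)["result"]
    # CKAN returns the total hit count plus one record per dataset
    return [(pkg["name"], pkg.get("title", "")) for pkg in result["results"]]

for name, title in search_datasets("health"):
    print(f"{name}: {title}")
```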
Rotating Data Visualizations: Building on the theme of leading with data, even the masthead design for Next.Data.gov is an open-data-powered visualization—for now, it’s a cool U.S. Geological Survey earthquake plot showing the magnitudes of earthquakes recorded around the globe over the past week.

[Image: USGS earthquake visualization in the Next.Data.gov masthead]

This particular visualization was built using D3.js. The visualization will be updated periodically to spotlight different ways open data is used and illustrated….
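D3.js handles the rendering on the site itself; as a rough Python sketch of just the data side, the same past-week earthquake data can be pulled from USGS’s public GeoJSON summary feed (feed URL and field names as publicly documented; the plotting step is omitted):

```python
# Sketch: fetching the USGS past-week earthquake feed behind a plot like
# this one. Only the data step is shown; the site's rendering uses D3.js.
import json
import urllib.request

FEED = ("https://earthquake.usgs.gov/earthquakes/feed/"
        "v1.0/summary/all_week.geojson")

with urllib.request.urlopen(FEED) as resp:
    quakes = json.load(resp)["features"]

# Each feature carries magnitude and place in properties, and
# [longitude, latitude, depth] in geometry.coordinates.
for q in quakes[:10]:
    mag = q["properties"]["mag"]
    if mag is None:  # a few events are published without a magnitude
        continue
    lon, lat, _depth = q["geometry"]["coordinates"]
    print(f"M{mag:.1f} at ({lat:.2f}, {lon:.2f}): {q['properties']['place']}")
```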
We encourage you to collaborate in the design process by creating pull requests or providing feedback via Quora or Twitter.”

Metrics for Government Reform


Geoff Mulgan: “How do you measure a programme of government reform? What counts as evidence that it’s working or not? I’ve been asked this question many times, so this very brief note suggests some simple answers – mainly prompted by seeing a few writings on this question which I thought confused some basic points.
Any type of reform programme will combine elements at very different levels. These may include:

  • A new device – for example, adjusting the wording in an official letter or a call centre script to see what impact this has on such things as tax compliance.
  • A new kind of action – for example a new way of teaching maths in schools, treating patients with diabetes, handling prison leavers.
  • A new kind of policy – for example opening up planning processes to more local input; making welfare payments more conditional.
  • A new strategy – for example a scheme to cut carbon in cities, combining retrofitting of housing with promoting bicycle use; or a strategy for public health.
  • A new approach to strategy – for example making more use of foresight, scenarios or big data.
  • A new approach to governance – for example bringing hitherto excluded groups into political debate and decision-making.

This rough list hopefully shows just how different these levels are in their nature. Generally, as we go down the list, the following increase:

  • The number of variables and the complexity of the processes involved
  • The timescales over which any judgements can be made
  • The difficulty involved in making judgements about causation
  • The importance of qualitative relative to quantitative assessment”

Internet Association's New Website Lets Users Comment on Bills


Mashable: “The Internet Association, the lobbying conglomerate of big tech companies like Google, Amazon and Facebook, has launched a new website that allows users to comment on proposed bills.
The association unveiled its redesigned website on Monday, and it hopes its new, interactive features will give citizens a way to speak up…
In the “Take Action” section of the website, under “Leave Your Mark,” the association plans to upload bills, declarations and other context documents for netizens to peruse and, most importantly, interact with. After logging in, a user can comment on the bill in general, and even make line edits.”

The Durkheim Project


Co.Labs: “A new project launched by DARPA and Dartmouth University is trying something new: data-mining social networks to spot patterns indicating suicidal behavior.
Called The Durkheim Project, named for the French sociologist Émile Durkheim, who pioneered the statistical study of suicide, it is asking veterans to offer their Twitter and Facebook authorization keys for an ambitious effort to match social media behavior with indications of suicidal thought. Veterans’ online behavior is then fed into a real-time analytics dashboard which predicts suicide risks and psychological episodes… The Durkheim Project is led by New Hampshire-based Patterns and Predictions, a Dartmouth University spin-off with close ties to academics there…
The Durkheim Project is part of DARPA’s Detection and Computational Analysis of Psychological Signals (DCAPS) project. DCAPS is a larger effort designed to harness predictive analytics for veteran mental health—and not just from social media. According to the program introduction by DARPA’s Russell Shilling, DCAPS is also developing algorithms that can mine voice communications, daily eating and sleeping patterns, in-person social interactions, facial expressions, and emotional states for signs of suicidal thought. While participants in Durkheim won’t receive mental health assistance directly from the project, their contributions will go a long way toward treating suicidal veterans in the future….
The project launched on July 1; the number of veterans participating is not currently known, but the final number is expected to hover around 100,000.”

https://dai.ly/x11gnun

Why the Share Economy is Important for Disaster Response and Resilience


Patrick Meier at iRevolution: “A unique and detailed survey funded by the Rockefeller Foundation confirms the important role that social and community bonds play vis-à-vis disaster resilience. The new study, which focuses on resilience and social capital in the wake of Hurricane Sandy, reveals how disaster-affected communities self-organized, “with reports of many people sharing access to power, food and water, and providing shelter.” This mutual aid was primarily coordinated face-to-face. This may not always be possible, however. So the “Share Economy” can also play an important role in coordinating self-help during disasters….
In a share economy, “asset owners use digital clearinghouses to capitalize the unused capacity of things they already have, and consumers rent from their peers rather than rent or buy from a company”. During disasters, these asset owners can use the same digital clearinghouses to offer what they have at no cost. For example, over 1,400 kindhearted New Yorkers offered free housing to people heavily affected by the hurricane. They did this using AirBnB, as shown in the short video above. Meanwhile, on the West Coast, the City of San Francisco has just launched a partnership with BayShare, a sharing economy advocacy group in the Bay Area. The partnership’s goal is to “harness the power of sharing to ensure the best response to future disasters in San Francisco”.

https://web.archive.org/web/2000/https://www.youtube.com/watch?v=vIWxAWRq4t0

Open Data Tools: Turning Data into ‘Actionable Intelligence’


Shannon Bohle in SciLogs: “My previous two articles were on open access and open data. They conveyed major changes that are underway around the globe in the methods by which scientific and medical research findings and data sets are circulated among researchers and disseminated to the public. I showed how E-science and ‘big data’ fit into the philosophy of science through a paradigm shift as a trilogy of approaches—deductive, empirical, and computational—which, as was pointed out, provides a logical extension of Robert Boyle’s tradition of scientific inquiry involving “skepticism, transparency, and reproducibility for independent verification” into the computational age…
This third article on open access and open data evaluates new and suggested tools when it comes to making the most of the open access and open data OSTP mandates. According to an article published in The Harvard Business Review’s “HBR Blog Network,” this is because, as its title suggests, “open data has little value if people can’t use it.” Indeed, “the goal is for this data to become actionable intelligence: a launchpad for investigation, analysis, triangulation, and improved decision making at all levels.” Librarians and archivists have key roles to play in not only storing data, but packaging it for proper accessibility and use, including adding descriptive metadata and linking to existing tools or designing new ones for their users. Later, in a comment following the article, the author, Craig Hammer, remarks on the importance of archivists and international standards, “Certified archivists have always been important, but their skillset is crucially in demand now, as more and more data are becoming available. Accessibility—in the knowledge management sense—must be on par with digestibility / ‘data literacy’ as priorities for continuing open data ecosystem development. The good news is that several governments and multilaterals (in consultation with data scientists and – yep! – certified archivists) are having continuing ‘shared metadata’ conversations, toward the possible development of harmonized data standards…If these folks get this right, there’s a real shot of (eventual proliferation of) interoperability (i.e. a data platform from Country A can ‘talk to’ a data platform from Country B), which is the only way any of this will make sense at the macro level.”
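To make “descriptive metadata” and “shared metadata” concrete, a minimal machine-readable dataset record might look like the sketch below. Field names loosely follow the 2013 Project Open Data metadata schema; the values and identifier are invented for illustration:

```python
# Sketch: a minimal machine-readable dataset record. Field names loosely
# follow the 2013 Project Open Data metadata schema; values are invented.
import json

record = {
    "title": "Hospital Readmission Rates",
    "description": "Readmission rates reported by participating hospitals.",
    "keyword": ["health", "hospitals", "readmissions"],
    "modified": "2013-07-01",
    "publisher": "Department of Health and Human Services",
    "identifier": "hhs-readmissions-001",  # hypothetical identifier
    "accessLevel": "public",
}

# Shared, harmonized field names are what let a catalog in Country A
# "talk to" a catalog in Country B: both sides can index the same keys.
print(json.dumps(record, indent=2))
```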

How algorithms rule the world


In The Guardian: “From dating websites and City trading floors, through to online retailing and internet searches (Google’s search algorithm is now a more closely guarded commercial secret than the recipe for Coca-Cola), algorithms are increasingly determining our collective futures. “Bank approvals, store cards, job matches and more all run on similar principles,” says Ball. “The algorithm is the god from the machine powering them all, for good or ill.”…The idea that the world’s financial markets – and, hence, the wellbeing of our pensions, shareholdings, savings etc – are now largely determined by algorithmic vagaries is unsettling enough for some. But, as the NSA revelations exposed, the bigger questions surrounding algorithms centre on governance and privacy. How are they being used to access and interpret “our” data? And by whom?”

Big Data Comes To Boston’s Neighborhoods


WBUR: “In the spring of 1982, social scientists James Q. Wilson and George L. Kelling published a seminal article in The Atlantic Monthly titled “Broken Windows.”
The piece focused public attention on a long-simmering theory in urban sociology: that broken windows, graffiti and other signs of neighborhood decay are correlated with — and may even help cause — some of the biggest problems in America’s cities.
Wilson and Kelling focused on the link to crime, in particular; an abandoned car, they argued, signals that illicit behavior is acceptable on a given block….Some researchers have poked holes in the theory — arguing that broken windows, known in academic circles as “physical disorder,” are more symptom than cause. But there is no disputing the idea’s influence: it’s inspired reams of research and shaped big city policing from New York to Los Angeles…
But a new study out of the Boston Area Research Initiative, a Harvard University-based collaborative of academics and city officials, suggests a new possibility: a cheap, sprawling and easily updated map of the urban condition.
Mining data from Boston’s constituent relationship management (CRM) operation — a hotline, website and mobile app for citizens to report everything from abandoned bicycles to mouse-infested apartment buildings — researchers have created an almost real-time guide to what ails the city…
But a first-of-its-kind measure of civic engagement — how likely are residents of a given block to report a pothole or broken streetlight? — yields more meaningful results.
One early finding: language barriers seem to explain scant reporting in neighborhoods with large populations of Latino and Asian renters; that’s already prompted targeted flyering that’s yielded modest improvements.
The same engagement measure points to another, more hopeful phenomenon: clusters of citizen activists show up not just in wealthy enclaves, as expected, but in low-income areas.”
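As a toy illustration of such an engagement measure, one could normalize CRM report counts by block size. The column names and the plain reports-per-household rate below are assumptions; the actual study used more careful statistical adjustment:

```python
# Toy sketch of a block-level engagement rate from CRM reports.
# Column names and the plain reports-per-household rate are assumptions;
# the BARI study itself used more careful statistical adjustment.
import pandas as pd

reports = pd.DataFrame({
    "block_id": ["A", "A", "B", "C", "C", "C"],
    "issue": ["pothole", "streetlight", "pothole",
              "graffiti", "pothole", "streetlight"],
})
blocks = pd.DataFrame(
    {"households": [120, 80, 200, 50]},
    index=pd.Index(["A", "B", "C", "D"], name="block_id"),
)

# Blocks with few reports relative to their size may signal low
# engagement (e.g., language barriers) rather than few problems.
counts = reports.groupby("block_id").size().reindex(blocks.index, fill_value=0)
rate = counts / blocks["households"]
print(rate.sort_values())
```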

Power of open data reveals global corporate networks


Open Data Institute: “The ODI today welcomed the move by OpenCorporates to release open data visualisations which show the global corporate networks of millions of businesses, demonstrating the power of open data.
See the Maps
OpenCorporates, a company based at the ODI, has produced visuals using several sources, which it has published as open data for the first time:

  • Filings made by large domestic and foreign companies to the U.S. Securities and Exchange Commission
  • Banking data held by the National Information Center of the Federal Reserve System in the U.S.
  • Information about individual shareholders published by the official New Zealand corporate registry

Launched today, the visualisations are available through the main OpenCorporates website.”
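For a feel of the underlying open data, OpenCorporates also offers a public REST API. A minimal sketch follows; the endpoint and response shape are based on its v0.4 API as commonly documented, so treat the details as assumptions and verify against the current docs:

```python
# Sketch: querying the OpenCorporates company search API, the kind of
# open data these network visualisations draw on. Endpoint and response
# shape follow the public v0.4 API; verify against current documentation.
import json
import urllib.parse
import urllib.request

def search_companies(query):
    url = ("https://api.opencorporates.com/v0.4/companies/search?q="
           + urllib.parse.quote(query))
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    for item in data["results"]["companies"]:
        company = item["company"]
        print(company["jurisdiction_code"],
              company["company_number"],
              company["name"])

search_companies("acme")
```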

The Future of Co-Creation and Crowdsourcing


New paper by Nick van Breda and Jan Spruijt: “This article reviews how co-creation is developing around the world and how different businesses are able to use it. To illustrate this, stories from companies, marketers, and trend watchers are used to describe the phenomena of crowdsourcing and co-creation. Marketers have found a method to combine co-creation with existing methods of creating something new. Based on this research, we can now predict how co-creation will develop over the following years.
The evolution of co-creation is more exciting than we previously thought, and we think these results have to do with how the internet and social media have developed. A revolution is coming, and organizations will see an increase in turnover based on fast innovation and participation by the crowd.
We are living in a world with a new dimension: a dimension where large organizations have no reason to exist when customers aren’t satisfied with their purchase, the organization’s service, and most of all their feeling of participation. Consumers feel that they should have the power to change the visions and missions of the old-fashioned marketing way: the manipulative way to earn money. A dimension where being online 24/7, with fast responses to questions and remarks, is the key to success. In this time of continuous change, creativity is a must.”