Stefaan Verhulst

Paper by Gianni Barlacchi et al. in Scientific Data/Nature: “The study of socio-technical systems has been revolutionized by the unprecedented amount of digital records that are constantly being produced by human activities such as accessing Internet services, using mobile devices, and consuming energy and knowledge. In this paper, we describe the richest open multi-source dataset ever released on two geographical areas. The dataset is composed of telecommunications, weather, news, social networks and electricity data from the city of Milan and the Province of Trentino. The unique multi-source composition of the dataset makes it an ideal testbed for methodologies and approaches aimed at tackling a wide range of problems including energy consumption, mobility planning, tourist and migrant flows, urban structures and interactions, event detection, urban well-being and many others….(More)”

A multi-source dataset of urban life in the city of Milan and the Province of Trentino

World Policy Blog: “At first, no one knew why the children of Bagega in Zamfara state were dying. In the spring of 2010, hundreds of kids in and around the northern Nigerian village were falling ill, having seizures and going blind, many of them never to recover. A Médecins Sans Frontières team soon discovered the causes: gold and lead.

With the global recession causing the price of precious metals to soar, impoverished villagers had turned to mining the area’s gold deposits. But the gold veins were mingled with lead, and as a result the villagers’ low-tech mining methods were sending clouds of lead-laced dust into the air. The miners, unknowingly carrying the powerful toxin on their clothes and skin, brought it into their homes where their children breathed it in.

The result was perhaps the worst outbreak of lead poisoning in history, killing over 400 children in Bagega and neighboring villages. In response, the Nigerian government pledged to clean up the lead-contaminated topsoil and provide medical care to the stricken children. But by mid-2012, there was no sign of the promised funds. Digitally savvy activists with the organization Connected Development (CODE) stepped in to make sure that the money was disbursed.

A group of young Nigerians founded CODE in 2010 in the capital Abuja, with the mission of empowering local communities to hold the government to account by improving their access to information and helping their voices to be heard. “In 2010, we were working to connect communities with data for advocacy programs,” says CODE co-founder Oludotun Babayemi, a former country director of a World Wildlife Fund project in Nigeria. “When we heard about Bagega, we thought this was an opportunity for us.”

In 2012, CODE launched a campaign dubbed ‘Follow the Money Nigeria’ aimed at applying pressure on the government to release the promised funds. “Eighty percent of the less developed parts of Nigeria have zero access to Twitter, let alone Facebook, so it’s difficult for them to convey their stories,” says Babayemi. “We collect all the videos and testimonies and take it global.”

CODE members travelled to the lead-afflicted area to gather information. They then posted their findings online, and publicized them with a #SaveBagega hashtag, which they tweeted to members of the government, local and international organizations and the general public. CODE hosted a 48-hour ‘tweet-a-thon’, joined by a senator, to support the campaign….

By July 2014, CODE reported that the clean-up was complete and that over 1,000 children had been screened and enrolled in lead treatment programs. Bagega’s health center has also been refurbished and the village’s roads improved. “There are thousands of communities like Bagega,” says Babayemi. “They just need someone to amplify their voice.”….

Key lessons

  • Revealing information is not enough; change requires a real-world campaign driven by that information and civil society champions who can leverage their status and networks to draw international attention to the issues and maintain pressure.
  • Building relationships with sympathetic members of government is key.
  • Targeted online campaigns can help amplify the message of marginalized communities offline to achieve impact (More)”
Cleaning Up Lead Poisoning One Tweet at a Time

Report by Rebecca Rumbul at mySociety: “This research seeks to begin at the beginning, asking the most basic questions about who actually uses civic technology and why. Gathering data from civic technology groups from around the world, it shows the variations in usage of civic tech across four core countries (US, UK, Kenya and South Africa), and records the attitudes of users towards the platforms they are using.

Download: Who Benefits From Civic Technology? Demographic and public attitudes research into the users of civic technologies (PDF)

Who Benefits From Civic Technology?

Report by Drive Impact: “From Kentucky to Arkansas to New York, government leaders across the United States are leveraging data, technology, and a heightened focus on outcomes to deliver social impact with modern solutions. In Louisville, Kentucky, “smart” asthma inhalers track where attacks happen citywide and feed this data into a government dashboard, helping policymakers identify hot spots to improve air quality and better treat patients. Policy leaders in New York and Texas are reforming Medicaid with “value-based payments” that reward doctors for performing preventive procedures that protect against costly tests and treatments down the road. In Arkansas, a digital government platform called Gov2Go connects citizens with a personalized console that sends reminders to file paperwork, renew registrations, and seek out other relevant government services.

What all of these initiatives share is a smarter approach to policymaking: an operating belief that government can and should reward the best policies and programs by paying for the best outcomes and using the best data and technology to identify solutions that can transform service delivery and strengthen citizens’ connection to government. These transformational policies are smarter government, and America needs more of it. Smarter government uses an outcomes mindset to embrace cutting-edge data and technology, make better funding choices, learn from policy failures and successes, act on new knowledge about what works, and align clear goals with the right incentives to achieve them. Americans need a smarter, outcomes-focused government for the twenty-first century—one that can identify and address systemic barriers to effective service delivery and seek out and promote innovative solutions to our greatest social challenges….(More)”

Smarter Government For Social Impact: A New Mindset For Better Outcomes

Jon Brodkin at Ars Technica: “Government IT departments have a mostly deserved reputation for being behind the times. While private companies keep giving customers new and better ways to buy products and learn about their services, government agencies have generally made it difficult for residents to interact with them via the Internet.

But this is slowly changing, with agencies from the local level to the federal level focusing on fixing broken websites and building new tools for Americans to get what they need from the government….

“Improve Detroit,” a smartphone app launched in April this year using technology from SeeClickFix, has helped Detroiters find out how to get things done. In its first six months of availability, 10,000 complaints were resolved in an average of nine days, “a vast improvement from when problems often languished for years,” the city said in an announcement this month.

Improve Detroit was used to get “more than 3,000 illegal dumping sites cleaned up; 2,092 potholes repaired; 991 complaints resolved related to running water in an abandoned structure; 565 abandoned vehicles removed; 506 water main breaks taken care of; [and] 277 traffic signal issues fixed,” Detroit said….

At the municipal level, Oakland is also planning to pass its hard-earned wisdom on to other cities. “Our goal is to create a roadmap for cities big and small,” Oakland Communications Director Karen Boyd told Ars.

Like Detroit, Oakland partnered with SeeClickFix and Code for America after experiencing tough economic times. “Oakland was particularly hard hit by the mortgage crisis [in 2008], a lot of predatory loans were made to our low-income folks,” Boyd said.

Property tax revenue plummeted and the city lost about a quarter of its government workforce, Boyd said.

“Governments were finding themselves way behind the curve on technology. We looked up and realized this was no longer sensible to try to do more with less. We have to do things differently, and technology is an opportunity,” she said.

Working with Code for America in 2013, Oakland made RecordTrac, a website for requesting public records and tracking records requests. Obtaining government documents is often a convoluted process, but Ars Technica’s own Freedom of Information Act enthusiast Cyrus Farivar told me that RecordTrac “is the best (albeit imperfect) public records process I’ve ever used.”…

One of the best examples of a government agency using the Internet to engage residents comes from NASA. The space agency has had an online presence since the early years of the Web, said Brian Dunbar, who has been the content manager for nasa.gov since 1995.

The website has allowed NASA to distribute huge amounts of photos and videos from missions and broadcast an online TV service. There’s even a live video feed of the Earth from the International Space Station.

NASA is all over social media, with nearly 700,000 subscribers to its YouTube channel, 13 million Twitter followers, 13 million Facebook likes, 5.4 million Instagram followers, and a big presence on several other social networks. That’s not even including individuals like astronaut Scott Kelly, who has been tweeting from the International Space Station.

NASA has nearly 100 people editing its website, with content generally capitalizing on current events such as the recent Pluto flyby. NASA gets a lot of feedback when there are video problems, “but we’ve been lucky in that the problems have not been overwhelming in either number or size, and we get a lot of positive feedback from the public,” Dunbar said.

This is all a natural extension of NASA’s core mission because the legislation that created the agency in 1958 charged it “with disseminating information about its programs to the widest extent practicable,” Dunbar said….(More)”

Slowly but surely, government IT enters the 21st century

National Archives of Australia: “The Digital Continuity 2020 Policy is a whole-of-government approach to digital information governance. It complements the Australian Government’s digital transformation agenda and underpins the digital economy. The policy aims to support efficiency, innovation, interoperability, information re-use and accountability by integrating robust digital information management into all government business processes.

The policy is based on three principles, and for each of them identifies what success looks like and the targets that agencies should reach by 2020. All Digital Continuity 2020 targets are expected to be achieved as part of normal business reviews and ongoing technology maintenance and investment cycles.

The principles

Principle 1 – Information is valued

Focus on governance and people

Agencies will manage their information as an asset, ensuring that it is created, stored and managed for as long as it is required, taking into account business requirements and other needs and risks.
Case study – Parliamentary Budget Office

Principle 2 – Information is managed digitally

Focus on digital assets and processes

Agencies will transition to entirely digital work processes, meaning business processes including authorisations and approvals are completed digitally, and that information is created and managed in digital format.
Case study – Federal Court of Australia

Principle 3 – Information, systems and processes are interoperable

Focus on metadata and standards

Agencies will have interoperable information, systems and processes to improve information quality and enable information to be found, managed, shared and re-used easily and efficiently.
Case study – Opening government data with the NationalMap

View the Digital Continuity 2020 Policy. (More)

Digital Continuity 2020

Paper by Gianluigi Viscusi and Christopher L. Tucci: “In conventional wisdom on crowdsourcing, the number of people defines the crowd and maximizing this number is often assumed to be the goal of any crowdsourcing exercise. However, we propose that there are structural characteristics of the crowd that might be more important than the sheer number of participants. These characteristics include (1) growth rate and its attractiveness to the members, (2) the equality among members, (3) the density within provisional boundaries, (4) the goal orientation of the crowd, and (5) the “seriality” of the interactions between members of the crowd. We then propose a typology that may allow managers to position their companies’ initiatives among four strategic types: crowd crystals, online communities, closed crowd, and open crowd-driven innovation. We show that incumbent companies may prefer a closed and controlled access to the crowd, limiting the potential for gaining results and insights from fully open crowd-driven innovation initiatives. Consequently, we argue that the effects on industries and organizations by open crowds are still to be explored, possibly via the mechanisms of entrepreneurs exploiting open crowds as new entrants, but also for the configuration of industries such as, e.g., finance, pharmaceuticals, or even the public sector where the value created usually comes from interpretation issues and exploratory problem solving…(More).”

Distinguishing ‘Crowded’ Organizations from Groups and Communities: Is Three a Crowd?

FastCoExist: “Most kids learn the grade school civics lesson about how a bill becomes a law. What those lessons usually neglect to show is how legislation today is often birthed on a lobbyist’s desk.

But even for expert researchers, journalists, and government transparency groups, tracing a bill’s lineage isn’t easy—especially at the state level. Last year alone, there were 70,000 state bills introduced in 50 states. It would take one person five weeks to even read them all. Groups that do track state legislation usually focus narrowly on a single topic, such as abortion, or perhaps a single lobby group.

Computers can do much better. A prototype tool, presented in September at Bloomberg’s Data for Good Exchange 2015 conference, mines the Sunlight Foundation’s database of more than 500,000 bills and 200,000 resolutions for the 50 states from 2007 to 2015. It also compares them to 1,500 pieces of “model legislation” written by a few lobbying groups that made their work available, such as the conservative group ALEC (American Legislative Exchange Council) and the liberal group the State Innovation Exchange (formerly called ALICE).

The results are interesting. In one example of the program in use, the team—all from the Data Science for Social Good fellowship program in Chicago—created a graphic (above) that presents the relative influence of ALEC and ALICE in different states. The thickness of each line in the graphic correlates to the percentage of bills introduced in each state that are modeled on either group’s legislation. So a relatively liberal state like New York is mostly ALICE bills, while a “swing” state like Illinois has a lot from both groups….
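As an illustrative aside: the article does not describe the team’s actual matching method, but a minimal sketch of one plausible approach, comparing bill texts to model legislation with TF-IDF vectors and cosine similarity, could look like the following. The directory names and the 0.5 threshold are hypothetical.

```python
# Illustrative sketch only: flag state bills that closely resemble known
# "model legislation" using TF-IDF text vectors and cosine similarity.
# Directory names and the 0.5 threshold are hypothetical; the article does
# not specify the fellowship team's actual method.
from pathlib import Path

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

model_paths = sorted(Path("model_legislation").glob("*.txt"))
bill_paths = sorted(Path("state_bills").glob("*.txt"))
model_texts = [p.read_text(errors="ignore") for p in model_paths]
bill_texts = [p.read_text(errors="ignore") for p in bill_paths]

# Fit one shared vocabulary so bill and model vectors are comparable.
vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 3))
vectorizer.fit(model_texts + bill_texts)
model_vecs = vectorizer.transform(model_texts)
bill_vecs = vectorizer.transform(bill_texts)

# For each bill, find its most similar piece of model legislation.
similarity = cosine_similarity(bill_vecs, model_vecs)
for i, row in enumerate(similarity):
    j = row.argmax()
    if row[j] >= 0.5:  # hypothetical cut-off for "likely modeled on"
        print(f"{bill_paths[i].name} resembles {model_paths[j].name} "
              f"(cosine similarity {row[j]:.2f})")
```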

Along with researchers from the University of Chicago, Wikimedia Foundation, Microsoft Research, and Northwestern University, Walsh is also co-author of another paper presented at the Bloomberg conference that shows how data science can increase government transparency.

Walsh and these co-authors developed software that automatically identifies earmarks in U.S. Congressional bills, showing how representatives are benefiting their own states with pork barrel projects. They verified that it works by comparing it to the results of a massive effort from the U.S. Office of Management and Budget to analyze earmarks for a few limited years. Their results, extended back to 1995 in a public database, showed that there may be many more earmarks than anyone thought.

“Governments are making more data available. It’s something like a needle in a haystack problem, trying to extract all that information out,” says Walsh. “Both of these projects are really about shining light to these dark places where we don’t know what’s going on.”

The state legislation tracker data is available for download here, and the team is working on an expanded system that automatically downloads new state legislation so it can stay up to date…(More)”

When Lobbyists Write Legislation, This Data Mining Tool Traces The Paper Trail

Civicus: “…we’re very happy today to launch “Citizen-Generated Data and Governments: Towards a Collaborative Model”.

This piece explores the idea that governments could host and publish citizen-generated data (CGD) themselves, and whether this could mean that data is applied more widely and in a more sustainable way. It was inspired by a recent meeting in Buenos Aires with Argentine civil society organizations and government representatives, hosted by the City of Buenos Aires Innovation and Open Government Lab (Laboratorio de innovación y Gobierno Abierto de la Ciudad de Buenos Aires).

The meeting was organized to explore how people within government think about citizen-generated data, and discuss what would be needed for them to consider it as a valid method of data generation. One of the most novel and exciting ideas that surfaced was the potential for government open data portals, such as that managed by the Buenos Aires Innovation Lab, to host and publish CGD.

We wrote this report to explore this issue further, looking at existing models of data collaboration and outlining our first thoughts on the benefits and obstacles this kind of model might face. We welcome feedback from those with deeper expertise into different aspects of citizen-generated data, and look forward to refining these thoughts in the future together with the broader community…(More)”

Citizen-Generated Data and Governments: Towards a Collaborative Model

At the Conversation: “It was 1986, and the American space agency, NASA, was reeling from the loss of seven lives. The space shuttle Challenger had broken apart about one minute after its launch.

A Congressional commission was formed to report on the tragedy. The physicist Richard Feynman was one of its members.

NASA officials had testified to Congress that the chance of a shuttle failure was around 1 in 100,000. Feynman wanted to look beyond the official testimony to the numbers and data that backed it up.

After completing his investigation, Feynman summed up his findings in an appendix to the Commission’s official report, in which he declared that NASA officials had “fooled themselves” into thinking that the shuttle was safe.

After a launch, shuttle parts sometimes came back damaged or behaved in unexpected ways. In many of those cases, NASA came up with convenient explanations that minimised the importance of these red flags. The people at NASA badly wanted the shuttle to be safe, and this coloured their reasoning.

To Feynman, this sort of behaviour was not surprising. In his career as a physicist, Feynman had observed that not just engineers and managers, but also basic scientists have biases that can lead to self-deception.

Feynman believed that scientists should constantly remind themselves of their biases. “The first principle” of being a good researcher, according to Feynman, “is that you must not fool yourself, and you are the easiest person to fool”….In the official report to Congress, Feynman and his colleagues recommended an independent oversight group be established to provide a continuing analysis of risk that was less biased than could be provided by NASA itself. The agency needed input from people who didn’t have a stake in the shuttle being safe.

Individual scientists also need that kind of input. The system of science ought to be set up in such a way that researchers subscribing to different theories can give independent interpretations of the same data set.

This would help protect the scientific community from the tendency for individuals to fool themselves into seeing support for their theory that isn’t there.

To me it’s clear: researchers should routinely examine others’ raw data. But in many fields today there is no opportunity to do so.

Scientists communicate their findings to each other via journal articles. These articles provide summaries of the data, often with a good deal of detail, but in many fields the raw numbers aren’t shared. And the summaries can be artfully arranged to conceal contradictions and maximise the apparent support for the author’s theory.

Occasionally, an article is true to the data behind it, showing the warts and all. But we shouldn’t count on it. As the chemist Matthew Todd has said to me, that would be like expecting a real estate agent’s brochure for a property to show the property’s flaws. You wouldn’t buy a house without seeing it with your own eyes. It can be unwise to buy into a theory without seeing the unfiltered data.

Many scientific societies recognise this. For many years now, some of the journals they oversee have had a policy of requiring authors to provide the raw data when other researchers request it.

Unfortunately, this policy has failed spectacularly, at least in some areas of science. Studies have found that when one researcher requests the data behind an article, that article’s authors respond with the data in fewer than half of cases. This is a major deficiency in the system of science, an embarrassment really.

The well-intentioned policy of requiring that data be provided upon request has turned out to be a formula for unanswered emails, for excuses, and for delays. A data-before-request policy, however, can be effective.

A few journals have implemented this, requiring that data be posted online upon publication of the article…(More)”

Science is best when the data is an open book
