Digital Democracy


Digital Democracy is a product of the Institute for Advanced Technology and Public Policy. The new online platform features a searchable database of California state legislative committee hearings, allowing users to search videos by keyword, topic, speaker or date. Digital Democracy is a first-of-its-kind tool because it will transcribe all legislative hearing videos and make the transcriptions available to users in their searchable entirety. These data-rich transcripts represent an entirely new data set that is currently unavailable to the public. Additionally, sophisticated metadata tags attached to the transcripts will enable users to run in-depth analytics to identify trends and relationships. A robust database of all speakers will track individual participants’ testimony, positions, and donation and gift histories.
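The platform itself is not described at the implementation level here, so the following is only a minimal, hypothetical sketch of the kind of query the paragraph describes: filtering transcribed hearing segments by keyword, speaker, topic tag and date range. The schema, field names and sample data are assumptions for illustration, not Digital Democracy's actual design.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class TranscriptSegment:
    """One speaker turn from a transcribed hearing video (hypothetical schema)."""
    committee: str
    speaker: str
    spoken_on: date
    text: str
    video_url: str          # link back to the source clip
    tags: List[str]         # metadata tags attached to the transcript

def search_segments(segments: List[TranscriptSegment],
                    keyword: Optional[str] = None,
                    speaker: Optional[str] = None,
                    topic: Optional[str] = None,
                    start: Optional[date] = None,
                    end: Optional[date] = None) -> List[TranscriptSegment]:
    """Filter transcript segments by keyword, speaker, topic tag, or date range."""
    results = []
    for seg in segments:
        if keyword and keyword.lower() not in seg.text.lower():
            continue
        if speaker and speaker.lower() not in seg.speaker.lower():
            continue
        if topic and topic.lower() not in [t.lower() for t in seg.tags]:
            continue
        if start and seg.spoken_on < start:
            continue
        if end and seg.spoken_on > end:
            continue
        results.append(seg)
    return results

# Example: find 2015 testimony mentioning "water" tagged with the topic "drought".
corpus = [
    TranscriptSegment("Senate Natural Resources", "Jane Doe", date(2015, 3, 12),
                      "Our district's water allocation fell by half this year.",
                      "https://example.org/hearings/123#t=456",
                      ["drought", "agriculture"]),
]
hits = search_segments(corpus, keyword="water", topic="drought",
                       start=date(2015, 1, 1), end=date(2015, 12, 31))
print(len(hits), "matching segment(s)")
```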

This project is pushing beyond the technical challenges of providing mere access to information, instead focusing on how this new data set can be meaningfully interpreted and acted upon. Tools within the system will allow users to quickly and easily search, locate, view, clip, and share this information, and their opinions on it, on Facebook, Twitter, YouTube, Google+, and other social media platforms. The video clips will provide dynamic content for grassroots mobilizers, online media outlets, bloggers, professional associations, and government watchdogs.

Digital Democracy has been deployed as a one-year beta to provide searchable video files of available California state committee hearings for the 2015 legislative year…(More)”

Governing the smart city: a review of the literature on smart urban governance


Review by Albert Meijer and Manuel Pedro Rodríguez Bolívar in International Review of Administrative Sciences: “Academic attention to smart cities and their governance is growing rapidly, but the fragmentation in approaches makes for a confusing debate. This article brings some structure to the debate by analyzing a corpus of 51 publications and mapping their variation. The analysis shows that publications differ in their emphasis on (1) smart technology, smart people or smart collaboration as the defining features of smart cities, (2) a transformative or incremental perspective on changes in urban governance, (3) better outcomes or a more open process as the legitimacy claim for smart city governance. We argue for a comprehensive perspective: smart city governance is about crafting new forms of human collaboration through the use of ICTs to obtain better outcomes and more open governance processes. Research into smart city governance could benefit from previous studies into success and failure factors for e-government and build upon sophisticated theories of socio-technical change. This article highlights that smart city governance is not a technological issue: we should study smart city governance as a complex process of institutional change and acknowledge the political nature of appealing visions of socio-technical governance….(More)”

Cops Increasingly Use Social Media to Connect, Crowdsource


Sara E. Wilson at GovTech: “Law enforcement has long used public tip lines and missing persons bulletins to recruit citizens to help solve crimes and increase public safety. Though the need for police departments to connect with their communities is nothing new, the array of technologies available to do so is growing all the time — as are the ways in which departments use those technologies.

In fact, 81 percent of law enforcement professionals use sites such as Facebook and Twitter on the job, and 25 percent use them daily.

Much of law enforcement is crowdsourced — AMBER Alerts are pushed to smartphones, seeking responses from citizens; officers push wanted information and crime tips to users on Facebook and Twitter in the hopes they can help; and some departments create apps to streamline information sharing.

Take the Johns Creek, Ga., Police Department, which has deployed a tool that allows additional citizen engagement and crowdsourcing…. Using a mobile app — the SunGard Public Sector P2C Converge app, which is branded specifically for Johns Creek PD as JCPD4Me — the department can more smoothly manage public safety announcements and other social media interactions….

Another tool cops use for communicating with citizens is Nixle, which lets agencies publish alerts, advisories, community information and traffic news. Citizens register for free and receive credible, neighborhood-level public safety information via text message and email in real time.

The Oakland, Calif., Police Department (OPD) uses the platform to engage with citizens — an April 17, 2015 post on Oakland PD’s Nixle Community feed informs readers that the department’s Special Victims Section, which is working to put an end to sex trafficking in the city, arrested five individuals for solicitation of prostitution. Since Jan. 1, 2015, OPD has arrested 70 individuals from 27 cities across the state for solicitation of prostitution.

Nixle allows two-way communication as well — the Tip Watch function allows anonymous tipsters to send information to Oakland PD in three ways (text, phone, Web). Now OPD can issue a passcode to tipsters for two-way, anonymous communication to help gather more information.
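Nixle's Tip Watch internals are not described beyond the passcode detail, but the general pattern is easy to sketch: no identifying information is stored, and a randomly issued passcode is the only key linking a tipster to the follow-up conversation. Everything below (class names, in-memory storage) is an illustrative assumption, not Nixle's implementation.

```python
import secrets
from collections import defaultdict

class AnonymousTipBox:
    """Illustrative two-way anonymous tip channel keyed by a random passcode.

    No identifying information is stored; the passcode is the only link
    between a tipster and the follow-up conversation.
    """

    def __init__(self):
        self._threads = defaultdict(list)   # passcode -> list of (sender, message)

    def submit_tip(self, message: str) -> str:
        """Record a new tip and return the passcode the tipster keeps."""
        passcode = secrets.token_hex(4)     # e.g. '9f3a1c2b'
        self._threads[passcode].append(("tipster", message))
        return passcode

    def reply(self, passcode: str, sender: str, message: str) -> None:
        """Append a follow-up from either side of the conversation."""
        if passcode not in self._threads:
            raise KeyError("unknown passcode")
        self._threads[passcode].append((sender, message))

    def read_thread(self, passcode: str):
        """Return the full conversation for a given passcode."""
        return list(self._threads[passcode])

# A tipster reports something, police ask a follow-up, the tipster answers.
box = AnonymousTipBox()
code = box.submit_tip("Suspicious activity near 12th and Main after midnight.")
box.reply(code, "investigator", "Roughly what time, and how many people?")
box.reply(code, "tipster", "Around 1 a.m., three people.")
print(box.read_thread(code))
```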

On the East Coast, the Peabody, Mass., Police Department has used the My Police Department (MyPD) app by WiredBlue since its creation; the app lets citizens submit tips and feedback directly to the department….

During the high-profile manhunt for the Boston Marathon bombers in 2013, the FBI asked the public for eyewitness photo and video evidence. The response from the public was so overwhelming that the server infrastructure couldn’t handle the massive inflow of data.

This large-scale crowdsourcing and data dilemma inspired a new product: the Los Angeles Sheriff’s Department’s Large Emergency Event Digital Information Repository (LEEDIR). Developed by CitizenGlobal Inc. and Amazon Web Services, LEEDIR pairs an app with cloud storage to help police use citizens’ smartphones as tools to gather and investigate evidence. Since its creation, the repository has been used in Santa Barbara, Calif., where it helped investigate the 2014 riots in Isla Vista.
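The article does not detail LEEDIR's architecture beyond the pairing of an app with cloud storage, so the sketch below only illustrates the general pattern: a citizen's app uploads a photo or video plus minimal metadata to a central repository. The bucket name, metadata fields and choice of S3-compatible storage are assumptions made for illustration.

```python
import boto3
from datetime import datetime, timezone
from pathlib import Path

# Assumed S3-compatible bucket acting as the central evidence repository.
EVIDENCE_BUCKET = "example-leedir-style-repository"

def upload_evidence(file_path: str, incident_id: str,
                    latitude: float, longitude: float,
                    submitter_note: str = "") -> str:
    """Upload one piece of citizen-submitted media with minimal metadata.

    Returns the object key under which the evidence was stored.
    """
    s3 = boto3.client("s3")
    timestamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    key = f"{incident_id}/{timestamp}_{Path(file_path).name}"

    with open(file_path, "rb") as media:
        s3.put_object(
            Bucket=EVIDENCE_BUCKET,
            Key=key,
            Body=media,
            # S3 object metadata values must be strings.
            Metadata={
                "incident": incident_id,
                "lat": str(latitude),
                "lon": str(longitude),
                "note": submitter_note[:256],
            },
        )
    return key

# Example call (would require valid AWS credentials and an existing bucket):
# upload_evidence("clip.mp4", "isla-vista-2014", 34.4133, -119.8610,
#                 "Filmed from the corner of Del Playa Drive.")
```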

Proponents of LEEDIR say the crowdsourcing system gives authorities a secure, central repository for the countless electronic tips that can come in during a crisis. Critics, however, claim that privacy issues come into play with this kind of policing. …(More)”

A new approach to measuring the impact of open data


Julia Keserü at the Sunlight Foundation: “Strong evidence on the long-term impact of open data initiatives is incredibly scarce. The lack of compelling proof is partly due to the relative novelty of the open government field, but also to the inherent difficulties in measuring good governance and social change. We know that much of the impact of policy advocacy, for instance, occurs even before a new law or policy is introduced, and is thus incredibly difficult to evaluate. At the same time, it is also very hard to detect the causality between a direct change in the legal environment and the specific activities of a policy advocacy group. Attribution is equally challenging when it comes to assessing behavioral changes – who gets to take credit for increased political engagement and greater participation in democratic processes?

Open government projects tend to operate in an environment where the contribution of other stakeholders and initiatives is essential to achieving sustainable change, which makes it difficult to show causality between a project’s activities and the impact it strives to achieve. These initiatives cannot be described through simple “cause and effect” relationships: they mostly achieve change by contributing to outcomes produced by a complex ecosystem of stakeholders — including journalists, think tanks, civil society organizations, public officials and many more — which makes measuring their direct impact even more challenging.

We at the Sunlight Foundation wanted to tackle some of the methodological challenges of the field by building an evidence base that can support further generalization and advocacy efforts, and by developing a methodological framework to unpack theories of change and to evaluate the impact of open data and digital transparency initiatives. A few weeks ago, we presented our research at the Cartagena Data Festival, and today we are happy to launch the first edition of our paper, which you can read below or on Scribd.

The outputs of this research include:

  • A searchable repository of more than 100 examples of the outputs, outcomes and impacts of open data and digital technology projects;
  • Three distinctive theories of change for open data and digital transparency initiatives from the Global South;
  • A methodological framework to help develop more robust indicators of social and political change for the ecosystem of open data initiatives, by applying and revising the Outcome Mapping approach of IDRC to the field…(You can read the study at: The Social Impact of Open Data by juliakeseru)

Nepal Aid Workers Helped by Drones, Crowdsourcing


Shirley Wang et al. in the Wall Street Journal: “….It is too early to gauge the exact impact of the technology in Nepal relief efforts, which have just begun amid chaos on the ground. Aid organizations have reported that hospitals are overstretched, a shortage of capacity at Katmandu’s airport is crippling aid distribution, and damaged roads and the mountainous country’s difficult terrain make reaching villages difficult.

Still, technology is playing an increasing role in the global response to humanitarian crises. Within hours of Saturday’s 7.8-magnitude temblor, U.S. giants such as Google Inc. and Facebook Inc. were offering their networks for use in verifying survivors and helping worried friends and relatives locate their loved ones.

Advances in online mapping—long used to calculate distances and plot driving routes—and the capabilities of camera-equipped drones are playing an increasingly important role in coordinating emergency responses at ground zero of any disaster.

A community of nonprofit groups uses satellite images, private images and open-source mapping technology to remap areas affected by the earthquake. They mark damaged buildings and roads so rescuers can identify the worst-hit areas and assess how accessible different areas are. The technology complements more traditional intelligence from aircraft.

Such crowdsourced real-time mapping technologies were first used in the 2010 Haiti earthquake, according to Chris Grundy, a professor in Geographical Information Systems at the London School of Hygiene and Tropical Medicine. The technology “has been advancing a little bit every time [every situation where it is used] as we start to see what works,” said Prof. Grundy.

The American Red Cross supplied its relief team on the Wednesday night flight to Nepal from Washington, D.C. with 50 digital maps and an inch-thick pile of paper maps that help identify where the needs are. The charity has a mapping project with the British Red Cross, Doctors Without Borders and the Humanitarian OpenStreetMap Team, a crowdsourced data-sharing group.


Mapping efforts have grown substantially since Haiti, according to Dale Kunce, head of the geographic information systems team at the American Red Cross. In the two months after the Haiti temblor, 600 mapping contributors made 1.5 million edits, while in the first 48 hours after the Nepal earthquake, 2,000 mappers had already made three million edits, Mr. Kunce said.
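To put those figures in rough perspective, here is a quick back-of-the-envelope comparison of the two responses. It uses only the numbers quoted above and treats "two months" as roughly 60 days; the calculation is illustrative, not part of the original reporting.

```python
# Figures quoted above: Haiti (2010) vs. Nepal (2015) crowdsourced mapping responses.
haiti_edits, haiti_mappers, haiti_hours = 1_500_000, 600, 60 * 24   # ~2 months assumed as 60 days
nepal_edits, nepal_mappers, nepal_hours = 3_000_000, 2_000, 48      # first 48 hours

for name, edits, mappers, hours in [
    ("Haiti 2010", haiti_edits, haiti_mappers, haiti_hours),
    ("Nepal 2015", nepal_edits, nepal_mappers, nepal_hours),
]:
    print(f"{name}: {edits / mappers:,.0f} edits per mapper, "
          f"{edits / hours:,.0f} edits per hour")

# Expected output:
# Haiti 2010: 2,500 edits per mapper, 1,042 edits per hour
# Nepal 2015: 1,500 edits per mapper, 62,500 edits per hour
```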

Some 3,400 volunteers from around the world are now inspecting images of Nepal online to identify road networks and conditions, to assess the extent of damage and pinpoint open spaces where displaced persons tend to congregate, according to Nama Budhathoki, executive director of a nonprofit technology company called Katmandu Living Labs.

His group is operating from a cramped but largely undamaged meeting room in a central-Katmandu office building to help coordinate the global effort of various mapping organizations with the needs of agencies like Doctors Without Borders and the international Red Cross community.

In recent days the Nepal Red Cross and Nepalese army have requested and been supplied with updated maps of severely damaged districts, said Dr. Budhathoki….(More)”

Apple Has Plans for Your DNA


Antonio Regalado at MIT Technology Review: “…Apple is collaborating with U.S. researchers to launch apps that would offer some iPhone owners the chance to get their DNA tested, many of them for the first time, according to people familiar with the plans.

The apps are based on ResearchKit, a software platform Apple introduced in March that helps hospitals or scientists run medical studies on iPhones by collecting data from the devices’ sensors or through surveys.

The first five ResearchKit apps, including one called mPower that tracks symptoms of Parkinson’s disease, quickly recruited thousands of participants in a few days, demonstrating the reach of Apple’s platform.

“Apple launched ResearchKit and got a fantastic response. The obvious next thing is to collect DNA,” says Gholson Lyon, a geneticist at Cold Spring Harbor Laboratory, who isn’t involved with the studies.

Nudging iPhone owners to submit DNA samples to researchers would thrust Apple’s devices into the center of a widening battle for genetic information. Universities, large technology companies like Google (see “Google Wants to Store Your Genome”), direct-to-consumer labs, and even the U.S. government (see “U.S. to Develop DNA Study of One Million People”) are all trying to amass mega-databases of gene information to uncover clues about the causes of disease (see “Internet of DNA”).

In two initial studies planned, Apple isn’t going to directly collect or test DNA itself. That will be done by academic partners. The data would be maintained by scientists in a computing cloud, but certain findings could appear directly on consumers’ iPhones as well. Eventually, it’s even possible consumers might swipe to share “my genes” as easily as they do their location….(More)”

The Incredible Jun: A Town that Runs on Social Media


Deb Roy and William Powers in the Huffington Post: “For the last four years, a town in southern Spain has been conducting a remarkable experiment in civic life. Jun (pronounced “hoon”) has been using Twitter as its principal medium for citizen-government communication. Leading the effort is Jun’s Mayor, José Antonio Rodríguez Salas, a passionate believer in the power of technology to solve problems and move society forward.

Since launching the initiative in 2011, Rodríguez Salas has been recruiting his 3,500 townspeople to not only join the social network but have their Twitter accounts locally verified at town hall. This extra step isn’t necessary to participate in the conversation – Twitter is open to anyone – but it helps town employees know they’re dealing with actual residents.

In the most basic scenario, a citizen who has a question, request or complaint tweets it to the mayor or one of his staff, who work to resolve the matter. For instance, in the sequence of tweets shown below (which we pulled from the 2014 Twitter data and translated into English), at 10:48 pm a citizen tells the mayor that a street lamp is out on Maestro Antonio Linares Street. Nine minutes later, the mayor replies that he’ll have the town electrician fix it the next day. The mayor’s tweet includes the Twitter handle of the electrician, who is automatically notified that he’s been mentioned and sees the exchange. That tweet is a public promise that the town will indeed take action, and to underline this it ends with the hashtag #JunGetsMoving. The next day, the electrician tweets a photo of the repaired fixture, thanking the citizen for his help and repeating the hashtag.

A citizen alerts the mayor to a broken street lamp. Two tweets later, it’s fixed.
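The exchange described above is, in effect, a public issue tracker built out of mentions and a hashtag: the report, the assignment and the resolution are all visible to everyone. The sketch below models that feedback loop in plain code; it deliberately avoids Twitter's actual API, and every name in it is illustrative.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

HASHTAG = "#JunGetsMoving"   # the public-commitment tag used in the exchange above

@dataclass
class CivicIssue:
    """One citizen report, handled entirely in public view."""
    reporter: str
    description: str
    updates: List[str] = field(default_factory=list)
    resolved: bool = False

    def log(self, author: str, message: str) -> None:
        """Append a timestamped, publicly visible update to the thread."""
        stamp = datetime.now().strftime("%Y-%m-%d %H:%M")
        self.updates.append(f"[{stamp}] @{author}: {message} {HASHTAG}")

# Citizen reports a broken street lamp; the mayor assigns it; the electrician closes it.
issue = CivicIssue(reporter="citizen123",
                   description="Street lamp out on Maestro Antonio Linares Street")
issue.log("mayor", "Thanks for the report, @electrician will fix it tomorrow")
issue.log("electrician", "Fixed, photo attached. Thanks for the heads-up!")
issue.resolved = True

for update in issue.updates:
    print(update)
```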

Governments have been responding to citizens for centuries. But digital networks have made it possible to build much faster, more efficient feedback loops. Each of the participants in the above transaction wrote a single text of less than 140 characters, and in less than 24 hours the problem was solved….(More)”

Five Headlines from a Big Month for the Data Revolution


Sarah T. Lucas at Post2015.org: “If the history of the data revolution were written today, it would include three major dates. May 2013, when the High-Level Panel on the Post-2015 Development Agenda first coined the phrase “data revolution.” November 2014, when the UN Secretary-General’s Independent Expert Advisory Group (IEAG) set a vision for it. And April 2015, when five headliner stories pushed the data revolution from great idea to a concrete roadmap for action.

The April 2015 Data Revolution Headlines

1. The African Data Consensus puts Africa in the lead on bringing the data revolution to the regional level. The Africa Data Consensus (ADC) envisions “a profound shift in the way that data is harnessed to impact on development decision-making, with a particular emphasis on building a culture of usage.” The ADC finds consensus across 15 “data communities”—ranging from open data to official statistics to geospatial data—and is endorsed by Africa’s ministers of finance. The ADC gets top billing in my book, as the first contribution that truly reflects a large diversity of voices and creates a political hook for action. (Stay tuned for a blog from my colleague Rachel Quint on the ADC).

2. The Sustainable Development Solutions Network (SDSN) gets our minds (and wallets) around the data needed to measure the SDGs. The SDSN Needs Assessment for SDG Monitoring and Statistical Capacity Development maps the investments needed to improve official statistics. My favorite parts are the clear typology of data (see pg. 12), and that the authors are very open about the methods, assumptions, and leaps of faith they had to take in the costing exercise. They also start an important discussion about how advances in information and communications technology, satellite imagery, and other new technologies have the potential to expand coverage, increase analytic capacity, and reduce the cost of data systems.

3. The Overseas Development Institute (ODI) calls on us to find the “missing millions.” ODI’s The Data Revolution: Finding the Missing Millions presents the stark reality of data gaps and what they mean for understanding and addressing development challenges. The authors highlight that even that most fundamental of measures—of poverty levels—could be understated by as much as a quarter. And that’s just the beginning. The report also pushes us to think beyond the costs of data, and focus on how much good data can save. With examples of data lowering the cost of doing government business, the authors remind us to think about data as an investment with real economic and social returns.

4. Paris21 offers a roadmap for putting national statistical offices (NSOs) at the heart of the data revolution. Paris21’s Roadmap for a Country-Led Data Revolution does not mince words. It calls on the data revolution to “turn a vicious cycle of [NSO] underperformance and inadequate resources into a virtuous one where increased demand leads to improved performance and an increase in resources and capacity.” It makes the case for why NSOs are central and need more support, while also pushing them to modernize, innovate, and open up. The roadmap gets my vote for best design. This ain’t your grandfather’s statistics report!

5. The Cartagena Data Festival features real-live data heroes and fosters new partnerships. The Festival featured data innovators (such as terra-i using satellite data to track deforestation), NSOs on the leading edge of modernization and reform (such as Colombia and the Philippines), traditional actors using old data in new ways (such as the Inter-American Development Bank’s fantastic energy database), groups focused on citizen-generated data (such as The Data Shift and UN My World), private firms working with big data for social good (such as Telefónica), and many others—all reminding us that the data revolution is well underway and will not be stopped. Most importantly, it brought these actors together in one place. You could see the sparks flying as folks learned from each other and hatched plans together. The Festival gets my vote for best conference of a lifetime, with the perfect blend of substantive sessions, intense debate, learning, inspiration, new connections, and a lot of fun. (Stay tuned for a post from my colleague Kristen Stelljes and me for more on Cartagena).

This month full of headlines leaves no room for doubt—momentum is building fast on the data revolution. And just in time.

With the Financing for Development (FFD) conference in Addis Ababa in July, the agreement of Sustainable Development Goals in New York in September, and the Climate Summit in Paris in December, this is a big political year for global development. Data revolutionaries must seize this moment to push past vision, past roadmaps, to actual action and results…..(More)”

Monithon


“Moni-thon” comes from “monitor” and “marathon”, and this is precisely what this platform seeks to help with: an intensive activity of observing and reporting on public policies in Italy.

What’s there to monitor? Monithon was born as an independently developed initiative to promote the citizen monitoring of development projects funded both by the Italian government and the EU through the Cohesion (also known as Regional) Policy. Projects include a wide range of interventions such as large transport, digital, research or environmental infrastructures (railroads, highways, broadband networks, waste management systems…), aid to enterprises to support innovation and competitiveness, and other funding for energy efficiency, social inclusion, education and training, employment and worker mobility, tourism, etc.

Citizen monitoring of these projects is possible thanks to a combination of open government data and citizens’ collaboration, united by the goal of checking how the projects are progressing and whether they deliver actual results.

The Italian government releases information on all 800,000+ funded projects (worth almost 100 billion euros) as open data, including the beneficiaries of the subsidies, all the actors involved, and the location and timing of each intervention. All the data is integrated with interactive visualizations on the national portal OpenCoesione, where people can play with the data and find the most interesting projects to follow.

The Monithon initiative takes this transparency further: it asks citizens to actively engage with open government data and to produce valuable information through it.

How does it work? Monithon means active involvement of communities and a shared methodology. Citizens, journalists, experts, researchers, students – or all combined – collect information on a specific project chosen from the OpenCoesione database. This information can then be uploaded to the Monithon platform (based on Ushahidi) by selecting the project from a list, and it can be geo-referenced and enriched with interviews, quantitative data, pictures and videos. The result is a form of civic, bottom-up, collective data storytelling. All the “wannabe monithoners” can download this simple toolkit, a 10-page document that describes the initiative and explains how to pick a project to monitor and get things started. ….
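As a rough illustration of that first step, choosing a project to monitor from the open data, here is a minimal sketch that filters a locally downloaded OpenCoesione export by region and theme. The column names and CSV format are assumptions for the example; the real export may be organized differently.

```python
import csv

def shortlist_projects(csv_path: str, region: str, theme_keyword: str, limit: int = 5):
    """Return a few candidate projects to monitor, filtered by region and theme.

    Column names ('region', 'theme', 'title', 'funding_eur') are placeholders
    for whatever the actual OpenCoesione export uses.
    """
    candidates = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if region.lower() not in row.get("region", "").lower():
                continue
            if theme_keyword.lower() not in row.get("theme", "").lower():
                continue
            candidates.append(row)

    # Largest projects first: a bigger budget usually means more to verify on the ground.
    candidates.sort(key=lambda r: float(r.get("funding_eur", 0) or 0), reverse=True)
    return candidates[:limit]

# Example: five large transport projects in Sicily to consider monitoring.
# for project in shortlist_projects("opencoesione_export.csv", "Sicilia", "transport"):
#     print(project["title"], project["funding_eur"])
```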

How to achieve actual impact? The Monithon platform is a method and a model whereby citizen monitoring can be initiated, and a tool for civic partners to press forward: to report on malpractice, but also to collaborate in making all these projects work, accelerating their completion and understanding whether they actually respond to local demand. ….

Monithon has rapidly evolved from being an innovative new platform into a transferable civic engagement format. Since its launch in September 2013, Monithon has drawn dozens of national and local communities (some formed on purpose, others based on existing associations) and around 500 people into civic monitoring activities, mostly in Southern Italy, where cohesion funds are more concentrated. Specific activities are carried out by established citizen groups, like Libera, a national anti-Mafia association, which became a Monithon partner, focusing their monitoring on the rehabilitation of Mafia-seized properties. Action Aid is now partnering with Monithon to promote citizen empowerment. Existing local groups of activists are using the Monithon methodology to test local transportation systems that benefited from EU funding, while new groups have formed to begin monitoring social innovation and cultural heritage projects.

Now more than 50 “citizen monitoring reports”, which take the form of collective investigations on project development and results, are publicly available on the Monithon website, many of which spurred further dialogue with public administrations….(More)

Data Fusion Heralds City Attractiveness Ranking


Emerging Technology from the arXiv: “The ability of any city to attract visitors is an important metric for town planners, businesses based on tourism, traffic planners, residents, and so on. And there are increasingly varied ways of measuring this thanks to the growing volumes of city-related data generated by social media and location-based services.

So it’s only natural that researchers would like to draw these data sets together to see what kind of insight they can get from this form of data fusion.

And so it has turned out, thanks to the work of Stanislav Sobolevsky at MIT and a few buddies. These guys have fused three wildly different data sets related to the attractiveness of cities, which allows them to rank these places and to understand why people visit them and what they do when they get there.

The work focuses exclusively on cities in Spain using data that is relatively straightforward to gather. The first data set consists of the number of credit and debit card transactions carried out by visitors to cities throughout Spain during 2011. This includes each card’s country of origin, which allows Sobolevsky and co to count only those transactions made by foreign visitors—a total of 17 million anonymized transactions from 8.6 million foreign visitors from 175 different countries.

The second data set consists of over 3.5 million photos and videos taken in Spain and posted to Flickr by people living in other countries. These pictures were taken between 2005 and 2014 by 16,000 visitors from 112 countries.

The last data set consists of around 700,000 geotagged tweets posted in Spain during 2012. These were posted by 16,000 foreign visitors from 112 countries.

Finally, the team defined a city’s attractiveness, at least for the purposes of this study, as the total number of pictures, tweets and card transactions that took place within it……
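Given that definition, attractiveness as the plain sum of pictures, tweets and card transactions recorded in a city, the fusion step itself is easy to sketch. The per-city counts below are invented for illustration; only the formula follows the article.

```python
# Hypothetical per-city counts from the three fused data sets (not the study's numbers).
signals = {
    #  city:        (card_transactions, flickr_photos, geotagged_tweets)
    "Barcelona":    (120_000, 45_000, 30_000),
    "Madrid":       (110_000, 38_000, 28_000),
    "Valencia":     ( 40_000, 12_000,  9_000),
}

def attractiveness(counts):
    """Attractiveness as defined in the study: the plain sum of the three signals."""
    return sum(counts)

# Rank cities from most to least attractive under that definition.
ranking = sorted(signals.items(), key=lambda item: attractiveness(item[1]), reverse=True)
for rank, (city, counts) in enumerate(ranking, start=1):
    print(f"{rank}. {city}: {attractiveness(counts):,}")
```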

That’s interesting work that shows how the fusion of big data sets can provide insights into the way people use cities. It has its limitations, of course. The study does not address the reasons why people find cities attractive and what draws them there in the first place: for example, are they there for tourism, for business, or for some other reason? That would require more specialized data.

But it does provide a general picture of attractiveness that could be a start for more detailed analyses. As such, this work is just a small part of a new science of cities based on big data, but one that shows how much is becoming possible with just a little number crunching.

Ref: arxiv.org/abs/1504.06003: Scaling of city attractiveness for foreign visitors through big data of human economic and social media activity”