OSTP’s Own Open Government Plan


Nick Sinai and Corinna Zarek: “The White House Office of Science and Technology Policy (OSTP) today released its 2014 Open Government Plan. The OSTP plan highlights three flagship efforts as well as the team’s ongoing work to embed the open government principles of transparency, participation, and collaboration into its activities.
OSTP advises the President on the effects of science and technology on domestic and international affairs. The work of the office includes policy efforts encompassing science, environment, energy, national security, technology, and innovation. This plan builds on the 2010 and 2012 Open Government Plans, updating progress on past initiatives and adding new subject areas based on 2014 guidance.
Agencies began releasing biennial Open Government Plans in 2010, with direction from the 2009 Open Government Directive. These plans serve as a roadmap for agency openness efforts, explaining existing practices and announcing new endeavors to be completed over the coming two years. Agencies build these plans in consultation with civil society stakeholders and the general public. Open government is a vital component of the President’s Management Agenda and our overall effort to ensure the government is expanding economic growth and opportunity for all Americans.
OSTP’s 2014 flagship efforts include:

  • Access to Scientific Collections: OSTP is leading agencies in developing policies that will improve the management of and access to scientific collections that agencies own or support. Scientific collections are assemblies of physical objects that are valuable for research and education—including drilling cores from the ocean floor and glaciers, seeds, space rocks, cells, mineral samples, fossils, and more. Agency policies will help make scientific collections and information about scientific collections more transparent and accessible in the coming years.
  • We the Geeks: We the Geeks Google+ Hangouts feature informal conversations with experts to highlight the future of science, technology, and innovation in the United States. Participants can join the conversation on Twitter by using the hashtag #WeTheGeeks and asking questions of the presenters throughout the hangout.
  • “All Hands on Deck” on STEM Education: OSTP is helping lead President Obama’s commitment to an “all-hands-on-deck approach” to providing students with skills they need to excel in science, technology, engineering, and math (STEM). In support of this goal, OSTP is bringing together government, industry, non-profits, philanthropy, and others to expand STEM education engagement and awareness through events like the annual White House Science Fair and the upcoming White House Maker Faire.

OSTP looks forward to implementing the 2014 Open Government Plan over the coming two years to continue building on its strong tradition of transparency, participation, and collaboration—with and for the American people.”

Open Data Is Open for Business


Jeffrey Stinson at Stateline: “Last month, web designer Sean Wittmeyer and colleague Wojciech Magda walked away with a $25,000 prize from the state of Colorado for designing an online tool to help businesses decide where to locate in the state.
The tool, called “Beagle Score,” is a widget that can be embedded in online commercial real estate listings. It can rate a location by taxes and incentives, zoning, even the location of possible competitors – all derived from about 30 data sets posted publicly by the state of Colorado and its municipalities.
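Beagle Score’s internals aren’t published, but the general pattern is simple to sketch: join open data sets on a candidate location, then collapse the factors into one weighted score. In the minimal Python sketch below, the factors, weights, and 0-100 scale are illustrative assumptions, not the actual model:

```python
# Minimal sketch of a location-scoring widget in the spirit of Beagle Score.
# The real scoring model is not public; the factor names, weights, and scale
# below are illustrative assumptions, not Beagle Score's actual logic.
from dataclasses import dataclass

@dataclass
class Location:
    property_tax_rate: float   # e.g. from a county assessor data set
    incentive_value: float     # normalized 0-1, from state incentive listings
    zoning_match: float        # 0-1: does zoning permit the business type?
    competitor_density: float  # competitors per square mile, from license data

# Hypothetical weights; a real tool would calibrate these per business type.
WEIGHTS = {
    "property_tax_rate": -0.3,   # higher taxes lower the score
    "incentive_value": 0.25,
    "zoning_match": 0.35,
    "competitor_density": -0.1,
}

def location_score(loc: Location) -> float:
    """Combine open-data factors into a single 0-100 location score."""
    raw = (
        WEIGHTS["property_tax_rate"] * loc.property_tax_rate
        + WEIGHTS["incentive_value"] * loc.incentive_value
        + WEIGHTS["zoning_match"] * loc.zoning_match
        + WEIGHTS["competitor_density"] * loc.competitor_density
    )
    # Clamp the weighted sum into a display-friendly 0-100 range.
    return max(0.0, min(100.0, 50.0 + 100.0 * raw))

print(location_score(Location(0.02, 0.6, 1.0, 0.3)))  # -> 96.4
```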
The creation of Beagle Score is an example of how states, cities, counties and the federal government are encouraging entrepreneurs to take raw government data posted on “open data” websites and turn the information into products the public will buy.
“The (Colorado contest) opened up a reason to use the data,” said Wittmeyer, 25, of Fort Collins. “It shows how ‘open data’ can solve a lot of challenges. … And absolutely, we can make it commercially viable. We can expand it to other states, and fairly quickly.”
Open-data advocates, such as President Barack Obama’s former information chief Vivek Kundra, estimate that a multibillion-dollar industry can be spawned by taking raw government data files on sectors such as weather, population, energy, housing, commerce or transportation and turning them into products for the public to consume or for other industries to pay for.
They can be as simple as mobile phone apps identifying every stop sign you will encounter on a trip to a different town, or as intricate as taking weather and crops data and turning it into insurance policies farmers can buy.

States, Cities Sponsor ‘Hackathons’

At least 39 states and 46 cities and counties have created open-data sites since the federal government, Utah, California and the cities of San Francisco and Washington, D.C., began opening data in 2009, according to the federal site, Data.gov.
Jeanne Holm, the federal government’s Data.gov “evangelist,” said new sites are popping up and new data are being posted almost daily. The city of Los Angeles, for example, opened a portal last week.
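These portals are not just websites; most expose their catalogs programmatically. Data.gov, for instance, runs on CKAN, whose standard action API supports keyword search. A quick sketch follows; the endpoint and parameters follow CKAN conventions and should be checked against current Data.gov documentation:

```python
# Sketch: searching Data.gov's CKAN catalog API for open data sets.
# URL and parameters follow CKAN's standard action API; verify against
# the current Data.gov developer documentation before relying on them.
import json
import urllib.request

url = ("https://catalog.data.gov/api/3/action/package_search"
       "?q=transportation&rows=5")
with urllib.request.urlopen(url) as resp:
    result = json.load(resp)["result"]

print("matching data sets:", result["count"])
for pkg in result["results"]:
    print("-", pkg["title"])
```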
In March, Democratic New York Gov. Andrew Cuomo said that in the year since it was launched, his state’s site has grown to some 400 data sets with 50 million records from 45 agencies. The data range from horse injuries and deaths at state race tracks to maps of regulated child care centers. The most popular data: top fishing spots in the state.
State and local governments are sponsoring “hackathons,” “data paloozas,” and challenges like Colorado’s, inviting businesspeople, software developers, entrepreneurs or anyone with a laptop and a penchant for manipulating data to take part. Lexington, Kentucky, had a civic hackathon last weekend. The U.S. Transportation Department and members of the Geospatial Transportation Mapping Association had a three-day data palooza that ended Wednesday in Arlington, Virginia.
The goals of the events vary. Some, like Arlington’s transportation event, solicit ideas for how government can present its data more effectively. Others seek ideas for mining it.
Aldona Valicenti, Lexington’s chief information officer, said many cities want advice on how to use the data to make government more responsive to citizens, and to communicate with them on issues ranging from garbage pickups and snow removal to upcoming civic events.
Colorado and Wyoming had a joint hackathon last month sponsored by Google to help solve government problems. Colorado sought apps that might be useful to state emergency personnel in tracking people and moving supplies during floods, blizzards or other natural disasters. Wyoming sought help in making its tax-and-spend data more understandable and usable by its citizens.
Unless there’s prize money, hackers may not make a buck from events like these; they participate for fun, curiosity or a sense of public service. But those who create an app that is useful beyond the boundaries of a particular city or state, or one that is commercially valuable to business, can make serious money – just as Beagle Score plans to do. Colorado will hold onto the intellectual property rights to Beagle Score for a year. But Wittmeyer and his partner will be able to profit from extending it to other states.

States Trail in Open Data

Open data is an outgrowth of the e-government movement of the 1990s, in which government computerized more of the data it collected and began making it available on floppy disks.
States often have trailed the federal government or many cities in adjusting to the computer age and in sharing information, said Emily Shaw, national policy manager for the Sunlight Foundation, which promotes transparency in government. The first big push to share came with public accountability, or “checkbook” sites, that show where government gets its revenue and how it spends it.
The goal was to make government more transparent and accountable by offering taxpayers information on how their money was spent.
The Texas Comptroller of Public Accounts site, established in 2007, offers detailed revenue, spending, tax and contracts data. Republican Comptroller Susan Combs’ office said having a one-stop electronic site also has saved taxpayers about $12.3 million in labor, printing, postage and other costs.
Not all states’ checkbook sites are as transparent and detailed as Texas’s, Shaw said. Nor are their open-data sites. “There’s so much variation between the states,” she said.
Many state legislatures are working to set policies for releasing data. Since the start of 2010, according to the National Conference of State Legislatures, nine states have enacted open-data laws, and more legislation is pending. But California, for instance, has been posting open data for five years without legislation setting policies.
Just as states have lagged in getting data out to the public, less of it has been turned into commercial use, said Joel Gurin, senior adviser at the Governance Lab at New York University and author of the book “Open Data Now.”
Gurin leads Open Data 500, which identifies firms that have made products from open government data and turned them into regional or national enterprises. In April, it listed 500. It soon may expand. “We’re finding more and more companies every day,” he said. “…

Making cities smarter through citizen engagement


Vaidehi Shah at Eco-Business: “Rapidly progressing information and communications technology (ICT) is giving rise to an almost infinite range of innovations that can be implemented in cities to make them more efficient and better connected. However, for technology to yield sustainable solutions, planners must prioritise citizen engagement and strong leadership.
This was the consensus on Tuesday at the World Cities Summit 2014, where representatives from city and national governments, technology firms and private sector organisations gathered in Singapore to discuss strategies and challenges to achieving sustainable cities in the future.
Laura Ipsen, Microsoft corporate vice president for worldwide public sector, identified globalisation, social media, big data, and mobility as the four major technological trends prevailing in cities today, speaking at the plenary session themed “The next urban decade: critical challenges and opportunities”.
Even as these trends accelerate, she cautioned, “technology does not build infrastructure, but it does help better engage citizens and businesses through public-private partnerships”.
For example, “LoveCleanStreets”, an online tool developed by Microsoft and partners, enables London residents to report infrastructure problems such as damaged roads or signs, Ipsen said.
“By engaging citizens through this application, cities can fix problems early, before they get worse,” she said.
In Singapore, the ‘MyWaters’ app of PUB, Singapore’s national water agency, is also a key tool for the government to keep citizens up to date on water quality and safety issues in the country, she added.
Even if governments did not actively develop solutions themselves, simply making the immense amounts of data collected by the city open to businesses and citizens could make a big difference to urban liveability, Mark Chandler, director of the San Francisco Mayor’s Office of International Trade and Commerce, pointed out.
Opening up all of the data collected by San Francisco, for instance, yielded 60 free mobile applications that allow residents to access urban solutions related to public transport, parking, and electricity, among others, he explained. This easy and convenient access to infrastructure and amenities, which are a daily necessity, is integral to “a quality of life that keeps the talented workforce in the city,” Chandler said….”
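Many of those applications sit on top of the city’s Socrata-based portal, data.sfgov.org, which serves its data sets over the SODA API. A minimal sketch using the sodapy client; the dataset identifier below is a placeholder, not a real ID:

```python
# Sketch: pulling records from San Francisco's Socrata-based open data
# portal with sodapy (pip install sodapy). The dataset ID "abcd-1234" is
# a hypothetical placeholder; look up real IDs on data.sfgov.org.
from sodapy import Socrata

client = Socrata("data.sfgov.org", None)  # None = anonymous, rate-limited
rows = client.get("abcd-1234", limit=5)   # hypothetical dataset ID
for row in rows:
    print(row)
```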

Lessons in Mass Collaboration


Elizabeth Walker, Ryan Siegel, Todd Khozein, Nick Skytland, Ali Llewellyn, Thea Aldrich, and Michael Brennan in the Stanford Social Innovation Review: “Significant advances in technology in the last two decades have opened possibilities to engage the masses in ways impossible to imagine centuries ago. Beyond coordination, today’s technological capability permits organizations to leverage and focus public interest, talent, and energy through mass collaborative engagement to better understand and solve today’s challenges. And given the rising public awareness of a variety of social, economic, and environmental problems, organizations have seized the opportunity to leverage and lead mass collaborations in the form of hackathons.
Hackathons emerged in the mid-2000s as a popular approach to leveraging the expertise of large numbers of individuals to address social issues, often through the creation of online technological solutions. Having led hundreds of mass collaboration initiatives for organizations around the world in diverse cultural contexts, we at SecondMuse offer the following lessons as a starting point for others interested in engaging the masses, as well as challenges others may face.

What Mass Collaboration Looks Like

An early example of a mass collaborative endeavor was Random Hacks of Kindness (RHoK), which formed in 2009. RHoK was initially developed in collaboration with Google, Microsoft, Yahoo!, NASA, the World Bank, and later, HP as a volunteer mobilization effort; it aimed to build technology that would enable communities to respond better to crises such as natural disasters. In 2012, nearly 1,000 participants attended 30 events around the world to address 176 well-defined problems.
In 2013, NASA and SecondMuse led the International Space Apps Challenge, which engaged six US federal agencies, 400 partner institutions, and 9,000 global citizens through a variety of local and global team configurations; it aimed to address 58 different challenges to improve life on Earth and in space. In Athens, Greece, for example, in direct response to the challenge of creating a space-deployable greenhouse, a team developed a modular spinach greenhouse designed to survive the harsh Martian climate. Two months later, 11,000 citizens across 95 events participated in the National Day of Civic Hacking in 83 different US cities, ultimately contributing about 150,000 person-hours and addressing 31 federal and several state and local challenges over a single weekend. One result was Keep Austin Fed from Austin, Texas, which leveraged local data to coordinate food donations for those in need.
Strong interest on the part of institutions and an enthusiastic international community has paved the way for follow-up events in 2014.

Benefits of Mass Collaboration

The benefits of this approach to problem-solving are many, including:

  • Incentivizing the use of government data. As institutions push to make data available to the public, mass collaboration can increase the usefulness of that data by creating products from it, as well as inform and streamline future data collection processes.
  • Increasing transparency. Engaging citizens in the process of addressing public concerns educates them about the work that institutions do and advances efforts to meet public expectations of transparency.
  • Increasing outcome ownership. When people engage in a collaborative process of problem solving, they naturally have a greater stake in the outcome. Put simply, the more people who participate in the process, the greater the sense of community ownership. Also, when spearheading new policies or initiatives, the support of a knowledgeable community can be important to long-term success.
  • Increasing awareness. Engaging the populace in addressing challenges of public concern increases awareness of issues and helps develop an active citizenry. As a result, improved public perception and license to operate bolster governmental and non-governmental efforts to address challenges.
  • Saving money. By providing data and structures to the public, and allowing them to build and iterate on plans and prototypes, mass collaboration gives agencies a chance to harness the power of open innovation with minimal time and funds.
  • Harnessing cognitive surplus. The advent of online tools allowing for distributed collaboration enables citizens to use their free time incrementally toward collective endeavors that benefit local communities and the nation.

Challenges of Mass Collaboration

Although the benefits can be significant, agencies planning to lead mass collaborations should be aware of several challenges:

  • Investing time and effort. A mass collaboration is most effective when it is not a one-time event. The up-front investment in building a collaboration of supporting partner organizations, creating a robust framework for action, developing the necessary tools and defining the challenges, and investing in implementation and scaling of the most promising results all require substantial time to secure long-term commitment and strong relationships.
  • Forging an institution-community relationship. Throughout the course of most engagements, the power dynamic between the organization providing the frameworks and challenges and the groupings of individuals responding to the call to action can shift dramatically as the community incorporates the endeavor into their collective identity. Everyone involved should embrace this as they lay the foundation for self-sustaining mass collaboration communities. Once participants develop a firmly entrenched collective identity and sense of ownership, the convening organization can fully tap into its collective genius, as they can work together based on trust and shared vision. Without community ownership, organizers need to allot more time, energy, and resources to keep their initiative moving forward, and to battle against volunteer fatigue, diminished productivity, and substandard output.
  • Focusing follow-up. Turning a massive infusion of creative ideas, concepts, and prototypes into concrete solutions requires a process of focused follow-up. Identifying and nurturing the most promising seeds to fruition requires time, discrete skills, insight, and—depending on the solutions you scale—support from a variety of external organizations.
  • Understanding ROI. Any resource-intensive endeavor where only a few of numerous resulting products ever see the light of day demands deep consideration of what constitutes a reasonable return on investment. For mass collaborations, this means having an initial understanding of the potential tangible and intangible outcomes, and making a frank assessment of whether those outcomes meet the needs of the collaborators.

Technological developments in the last century have enabled relationships between individuals and institutions to blossom into a rich and complex tapestry…”

HHS releases new data and tools to increase transparency on hospital utilization and other trends


Press release: “With more than 2,000 entrepreneurs, investors, data scientists, researchers, policy experts, government employees and more in attendance, the Department of Health and Human Services (HHS) is releasing new data and launching new initiatives at the annual Health Datapalooza conference in Washington, D.C.
Today, the Centers for Medicare & Medicaid Services (CMS) is releasing its first annual update to the Medicare hospital charge data, or information comparing the average amount a hospital bills for services that may be provided in connection with a similar inpatient stay or outpatient visit. CMS is also releasing a suite of other data products and tools aimed at increasing transparency about Medicare payments. The data trove on CMS’s website now includes inpatient and outpatient hospital charge data for 2012, and new interactive dashboards for the CMS Chronic Conditions Data Warehouse and geographic variation data. Also today, the Food and Drug Administration (FDA) will launch a new open data initiative. And before the end of the conference, the Office of the National Coordinator for Health Information Technology (ONC) will announce the winners of two data challenges.
“The release of these data sets furthers the administration’s efforts to increase transparency and support data-driven decision making which is essential for health care transformation,” said HHS Secretary Kathleen Sebelius.
“These public data resources provide a better understanding of Medicare utilization, the burden of chronic conditions among beneficiaries and the implications for our health care system and how this varies by where beneficiaries are located,” said Bryan Sivak, HHS chief technology officer. “This information can be used to improve care coordination and health outcomes for Medicare beneficiaries nationwide, and we are looking forward to seeing what the community will do with these releases. Additionally, the openFDA initiative being launched today will for the first time enable a new generation of consumer facing and research applications to embed relevant and timely data in machine-readable, API-based formats.”
2012 Inpatient and Outpatient Hospital Charge Data
The data posted today on the CMS website provide the first annual update of the hospital inpatient and outpatient data released by the agency last spring. The data include information comparing the average charges for services that may be provided in connection with the 100 most common Medicare inpatient stays at over 3,000 hospitals in all 50 states and Washington, D.C. Hospitals determine what they will charge for items and services provided to patients and these “charges” are the amount the hospital generally bills for those items or services.
With two years of data now available, researchers can begin to look at trends in hospital charges. For example, average charges for medical back problems increased nine percent from $23,000 to $25,000, but the total number of discharges decreased by nearly 7,000 from 2011 to 2012.
In April, ONC launched a challenge – the Code-a-Palooza challenge – calling on developers to create tools that will help patients use the Medicare data to make health care choices. Fifty-six innovators submitted proposals and 10 finalists are presenting their applications during Datapalooza. The winning products will be announced before the end of the conference.
Chronic Conditions Warehouse and Dashboard
CMS recently released new and updated information on chronic conditions among Medicare fee-for-service beneficiaries, including:

  • Geographic data summarized at national, state, county, and hospital referral region levels for the years 2008-2012;
  • Data for examining disparities among specific Medicare populations, such as beneficiaries with disabilities, dual-eligible beneficiaries, and race/ethnic groups;
  • Data on prevalence, utilization of select Medicare services, and Medicare spending;
  • Interactive dashboards that provide customizable information about Medicare beneficiaries with chronic conditions at state, county, and hospital referral region levels for 2012; and
  • Chartbooks and maps.

These public data resources support the HHS Initiative on Multiple Chronic Conditions by providing researchers and policymakers a better understanding of the burden of chronic conditions among beneficiaries and the implications for our health care system.
Geographic Variation Dashboard
The Geographic Variation Dashboards present Medicare fee-for-service per-capita spending at the state and county levels in interactive formats. CMS calculated the spending figures in these dashboards using standardized dollars that remove the effects of the geographic adjustments that Medicare makes for many of its payment rates. The dashboards include total standardized per capita spending, as well as standardized per capita spending by type of service. Users can select the indicator and year they want to display. Users can also compare data for a given state or county to the national average. All of the information presented in the dashboards is also available for download from the Geographic Variation Public Use File.
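For researchers who prefer the raw file to the dashboards, the same comparison is a few lines of pandas. In this sketch the file name and column names are assumptions; the data dictionary shipped with the Public Use File defines the real ones:

```python
# Sketch: comparing a county's standardized per capita Medicare spending
# to the national average, using the Geographic Variation Public Use File.
# The file name and column names below are assumptions; check the data
# dictionary that accompanies the download on cms.gov.
import pandas as pd

puf = pd.read_csv("geo_variation_puf_county_2012.csv")  # hypothetical name

national_avg = puf.loc[puf["county"] == "NATIONAL",
                       "std_per_capita_spending"].iloc[0]
county = puf.loc[puf["county"] == "Larimer, CO"].iloc[0]

pct_diff = 100 * (county["std_per_capita_spending"] - national_avg) / national_avg
print(f"{county['county']}: {pct_diff:+.1f}% vs. national average")
```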
Research Cohort Estimate Tool
CMS also released a new tool that will help researchers and other stakeholders estimate the number of Medicare beneficiaries with certain demographic profiles or health conditions. This tool can assist a variety of stakeholders interested in specific figures on Medicare enrollment. Researchers can also use this tool to estimate the size of their proposed research cohort and the cost of requesting CMS data to support their study.
Digital Privacy Notice Challenge
ONC, with the HHS Office of Civil Rights, will be awarding the winner of the Digital Privacy Notice Challenge during the conference. The winning products will help consumers get notices of privacy practices from their health care providers or health plans directly in their personal health records or from their providers’ patient portals.
OpenFDA
The FDA’s new initiative, openFDA, is designed to make it easier to access large, important public health datasets collected by the agency. OpenFDA will make FDA’s publicly available data accessible in a structured, computer-readable format so that technology specialists, such as mobile application creators, web developers, data visualization artists and researchers, can quickly search, query, or pull massive amounts of information as needed. The initiative is the result of extensive research to identify FDA’s publicly available datasets that are often in demand but traditionally difficult to use. Based on this research, openFDA is beginning with a pilot program involving millions of reports of drug adverse events and medication errors submitted to the FDA from 2004 to 2013. The pilot will later be expanded to include the FDA’s databases on product recalls and product labeling.
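In practice, querying openFDA is a single HTTP call. The sketch below counts the most frequently reported adverse reactions over the pilot window; the search and count field names follow openFDA’s published syntax and should be verified against open.fda.gov:

```python
# Sketch: querying the openFDA drug adverse event API for the most
# frequently reported reactions in the 2004-2013 pilot window.
# Field names follow openFDA's documented syntax; verify them against
# open.fda.gov before relying on this query.
import json
import urllib.request

url = ("https://api.fda.gov/drug/event.json"
       "?search=receivedate:[20040101+TO+20131231]"
       "&count=patient.reaction.reactionmeddrapt.exact")
with urllib.request.urlopen(url) as resp:
    counts = json.load(resp)["results"]

for item in counts[:10]:
    print(f"{item['term']:<30} {item['count']:>8}")
```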
For more information about CMS data products, please visit http://www.cms.gov/Research-Statistics-Data-and-Systems/Research-Statistics-Data-and-Systems.html.
For more information about today’s FDA announcement visit: http://www.fda.gov/NewsEvents/Newsroom/PressAnnouncements/UCM399335 or http://open.fda.gov/

How open data can help shape the way we analyse electoral behaviour


Harvey Lewis (Deloitte), Ulrich Atz, Gianfranco Cecconi, Tom Heath (ODI) in The Guardian: “Even after the local council elections in England and Northern Ireland on 22 May, which coincided with polling for the European Parliament, the next 12 months remain a busy time for the democratic process in the UK.
In September, the people of Scotland make their choice in a referendum on the future of the Union. Then the first fixed-term parliament in Westminster comes to an end, with a general election across Great Britain and Northern Ireland in May 2015.
To ensure that as many people as possible are eligible and able to vote, the government is launching an ambitious programme of Individual Electoral Registration (IER) this summer. This will mean that the traditional, paper-based approach to household registration will shift to a tailored and largely digital process more in keeping with the data-driven demands of the twenty-first century.
Under IER, citizens will need to provide ‘identifying information’, such as date of birth or national insurance number, when applying to register.

Ballots: stuck in the past?

However, despite the government’s attempts through IER to improve the veracity of information captured prior to ballots being posted, little has changed in terms of the vision for capturing, distributing and analysing digital data from election day itself.

Indeed, paper is still the chosen medium for data collection.
Digitising elections is fraught with difficulty, though. In the US, for example, the introduction of new voting machines created much controversy even though they are capable of providing ‘near-perfect’ ballot data.
The UK’s democratic process is not completely blind, though. Numerous opinion surveys are conducted both before and after polling, including the long-running British Election Study, to understand the shifting attitudes of a representative cross-section of the electorate.
But if the government does not retain digital information on the number of people who vote in sufficient geographic detail, how can it learn what is necessary to reverse the long-running decline in turnout?

The effects of lack of data

To add to the debate around democratic engagement, a joint research team of data scientists from Deloitte and the Open Data Institute (ODI) has been attempting to understand what makes voters tick.
Our research has been hampered by a significant lack of relevant data describing voter behaviour at electoral ward level, as well as difficulties in matching what little data is available to other open data sources, such as demographic data from the 2011 Census.
Even though individual ballot papers are collected and verified for counting the number of votes per candidate – the primary aim of elections, after all – the only recent elections for which aggregate turnout statistics have been published at ward level are the 2012 local council elections in England and Wales. In those elections, approximately 3,000 wards, out of a total of more than 8,000, went to the polls.
Data published by the Electoral Commission for the 2013 local council elections in England and Wales purports to be at ward level but is, in fact, for ‘county electoral divisions’, as explained by the Office for National Statistics.
Moreover, important factors related to the accessibility of polling stations – such as the distance from main population centres – could not be assessed because the location of polling stations remains the responsibility of individual local authorities – and only eight of these have so far published their data as open data.
Given these fundamental limitations, drawing any robust conclusions is difficult. Nevertheless, our research shows the potential for forecasting electoral turnout with relatively few census variables, the most significant of which are age and the size of the electorate in each ward.
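A model of that shape is easy to reproduce once ward-level turnout and census data are matched. In this sketch the input file and column names are hypothetical stand-ins; the two predictors mirror the variables the study found most significant:

```python
# Sketch: forecasting ward-level turnout from a few census variables,
# in the spirit of the Deloitte/ODI analysis. The CSV and column names
# are hypothetical stand-ins for ward-matched open data.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

wards = pd.read_csv("ward_turnout_2012.csv")  # hypothetical file

X = wards[["median_age", "electorate_size"]]  # most predictive per the study
y = wards["turnout_pct"]

model = LinearRegression()
r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
model.fit(X, y)

print("cross-validated R^2:", round(r2, 2))
print("coefficients:", dict(zip(X.columns, model.coef_.round(3))))
```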

What role can open data play?

The limited results described above provide a tantalising glimpse into a possible future scenario: where open data provides a deeper and more granular understanding of electoral behaviour.
On the back of more sophisticated analyses, policies for improving democratic engagement – particularly among young people – have the potential to become focused and evidence-driven.
And, although the data captured on election day will always serve primarily to elect the public’s preferred candidate, an important secondary consideration is aggregating and publishing data that can be used more widely.
This may have been prohibitively expensive or too complex in the past but as storage and processing costs continue to fall, and the appetite for such knowledge grows, there is a compelling business case.
The benefits of this future scenario potentially include:

  • tailoring awareness and marketing campaigns to wards and other segments of the electorate most likely to respond positively and subsequently turn out to vote
  • increasing the efficiency with which European, general and local elections are held in the UK
  • improving transparency around the electoral process and stimulating increased democratic engagement
  • enhancing links to the Government’s other significant data collection activities, including the Census.

Achieving these benefits requires commitment to electoral data being collected and published in a systematic fashion at least at ward level. This would link work currently undertaken by the Electoral Commission, the ONS, Plymouth University’s Election Centre, the British Election Study and the more than 400 local authorities across the UK.”

Citizen participation and technology


ICTlogy: “The recent, rapid rise in the use of digital technology is changing relationships between citizens, organizations and public institutions, and expanding political participation. But while technology has the potential to amplify citizens’ voices, it must be accompanied by clear political goals and other factors to increase their clout.
Those are among the conclusions of a new NDI study, “Citizen Participation and Technology,” that examines the role digital technologies – such as social media, interactive websites and SMS systems – play in increasing citizen participation and fostering accountability in government. The study was driven by the recognition that better insights are needed into the relationship between new technologies, citizen participation programs and the outcomes they aim to achieve.
Using case studies from countries such as Burma, Mexico and Uganda, the study explores whether the use of technology in citizen participation programs amplifies citizen voices and increases government responsiveness and accountability, and whether the use of digital technology increases the political clout of citizens.
The research shows that while more people are using technology—social media for mobile organizing, interactive websites and text-messaging systems that let constituents communicate directly with elected officials, and tools for crowdsourcing election day experiences—the type and quality of their political participation, and therefore its impact on democratization, varies. It also suggests that leveraging technology’s potential requires a focus on non-technological areas such as political organizing, leadership skills and political analysis.
For example, the “2% and More Women in Politics” coalition led by Mexico’s National Institute for Women (INMUJERES) used a social media campaign and an online petition to call successfully for reforms that would allocate two percent of political party funding for women’s leadership training. Technology helped the activists reach a wider audience, but women from the different political parties who made up the coalition might not have come together without NDI’s role as a neutral convener.
The study, which was conducted with support from the National Endowment for Democracy, provides an overview of NDI’s approach to citizen participation, and examines how the integration of technologies affects its programs in order to inform the work of NDI, other democracy assistance practitioners, donors, and civic groups.

Key findings:

  1. Technology can be used to readily create spaces and opportunities for citizens to express their voices, but making these voices politically stronger and the spaces more meaningful is a harder challenge that is political and not technological in nature.
  2. Technology that was used to purposefully connect citizens’ groups and amplify their voices had more political impact.
  3. There is a scarcity of data on specific demographic groups’ use of, and barriers to, technology for political participation. Programs seeking to close the digital divide as an instrument of narrowing the political divide should be informed by more research into barriers to access to both politics and technology.
  4. There is a blurring of the meaning between the technologies of open government data and the politics of open government that clouds program strategies and implementation.
  5. Attempts to simply crowdsource public inputs will not result in users self-organizing into politically influential groups, since citizens lack the opportunities to develop leadership, unity, and commitment around a shared vision necessary for meaningful collective action.
  6. Political will and the technical capacity to engage citizens in policy making, or to provide accurate data on government performance, are lacking in many emerging democracies. Technology may have changed institutions’ ability to respond to citizen demands, but its mere presence has not fundamentally changed actual government responsiveness.”

Crowdsourcing platform for museums


Thesis by Kræn Vesterberg Hansen: “This thesis addresses a strategic challenge at the National Museum of Denmark: engaging external people interested in contributing information about its collection of more than half a million coins and medals. This approach of getting outsiders to help complete many small tasks is popularly known as crowdsourcing. It entails transcribing handwritten protocols and establishing references between entries in the protocols and photographs of coins. The coins also reference both structured and non-structured metadata.
Does a digital platform for crowd engagement exist in the museum’s context? And how would such a platform integrate with the museum’s existing infrastructure? The report considers MediaWiki, Amazon’s Mechanical Turk and Zooniverse’s Scribe transcription interface, and finds that MediaWiki meets approximately 70% of the requirements; a sketch of what that integration might look like follows.
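By way of illustration, the MediaWiki route could be scripted against the wiki’s standard action API, for instance with the mwclient library. The wiki host, credentials, page-naming scheme, and template below are illustrative assumptions, not details from the thesis:

```python
# Sketch: creating a transcription task page on a MediaWiki instance via
# mwclient (pip install mwclient). Host, credentials, page title scheme,
# and template markup are hypothetical; only the API calls are standard.
import mwclient

site = mwclient.Site("wiki.example-museum.dk", path="/w/")  # hypothetical host
site.login("bot_user", "bot_password")  # a bot account with edit rights

coin_id = "KP-000123"  # illustrative protocol entry identifier
page = site.pages[f"Transcription:{coin_id}"]
page.save(
    "{{transcription task|image=" + coin_id + ".jpg|status=open}}\n"
    "<!-- Volunteers: transcribe the handwritten protocol entry here. -->",
    summary=f"Create transcription task for coin {coin_id}",
)
```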
Existing cases of successful crowdsourcing projects, national as well as international, are reviewed, and the solution builds on the APIs of existing infrastructure components (such as the existing collection management system GenReg Mønt and the Canto Cumulus digital asset management system) in a modular and reusable architecture.
The report approaches the challenge as a three-part process, greatly inspired by the “reuse-oriented software engineering” process model proposed by Ian Sommerville, Professor of Software Engineering at the University of St Andrews.”

How Big Data Could Undo Our Civil-Rights Laws


Virginia Eubanks in the American Prospect: “From “reverse redlining” to selling out a pregnant teenager to her parents, the advance of technology could render obsolete our landmark civil-rights and anti-discrimination laws.
Big Data will eradicate extreme world poverty by 2028, according to Bono, front man for the band U2. But it also allows unscrupulous marketers and financial institutions to prey on the poor. Big Data, collected from the neonatal monitors of premature babies, can detect subtle warning signs of infection, allowing doctors to intervene earlier and save lives. But it can also help a big-box store identify a pregnant teenager—and carelessly inform her parents by sending coupons for baby items to her home. News-mining algorithms might have been able to predict the Arab Spring. But Big Data was certainly used to spy on American Muslims when the New York City Police Department collected license plate numbers of cars parked near mosques, and aimed surveillance cameras at Arab-American community and religious institutions.
Until recently, debate about the role of metadata and algorithms in American politics focused narrowly on consumer privacy protections and Edward Snowden’s revelations about the National Security Agency (NSA). That Big Data might have disproportionate impacts on the poor, women, or racial and religious minorities was rarely raised. But, as Wade Henderson, president and CEO of the Leadership Conference on Civil and Human Rights, and Rashad Robinson, executive director of ColorOfChange, a civil rights organization that seeks to empower black Americans and their allies, point out in a commentary at TPM Cafe, while big data can change business and government for the better, “it is also supercharging the potential for discrimination.”
In his January 17 speech on signals intelligence, President Barack Obama acknowledged as much, seeking to strike a balance between defending “legitimate” intelligence gathering on American citizens and admitting that our country has a history of spying on dissidents and activists, including, famously, Dr. Martin Luther King, Jr. If this balance seems precarious, it’s because the links between historical surveillance of social movements and today’s uses of Big Data are not lost on the new generation of activists.
“Surveillance, big data and privacy have a historical legacy,” says Amalia Deloney, policy director at the Center for Media Justice, an Oakland-based organization dedicated to strengthening the communication effectiveness of grassroots racial justice groups. “In the early 1960s, in-depth, comprehensive, orchestrated, purposeful spying was used to disrupt political movements in communities of color—the Yellow Peril, the American Indian Movement, the Brown Berets, or the Black Panthers—to create fear and chaos, and to spread bias and stereotypes.”
In the era of Big Data, the danger of reviving that legacy is real, especially as metadata collection renders legal protection of civil rights and liberties less enforceable….
Big Data and surveillance are unevenly distributed. In response, a coalition of 14 progressive organizations, including the ACLU, ColorOfChange, the Leadership Conference on Civil and Human Rights, the NAACP, National Council of La Raza, and the NOW Foundation, recently released five “Civil Rights Principles for the Era of Big Data.” In their statement, they demand:

  • An end to high-tech profiling;
  • Fairness in automated decisions;
  • The preservation of constitutional principles;
  • Individual control of personal information; and
  • Protection of people from inaccurate data.

This historic coalition aims to start a national conversation about the role of big data in social and political inequality. “We’re beginning to ask the right questions,” says O’Neill. “It’s not just about what can we do with this data. How are communities of color impacted? How are women within those communities impacted? We need to fold these concerns into the national conversation.”

Open Data at Core of New Governance Paradigm


GovExec: “Rarely are federal agencies compared favorably with Facebook, Instagram, or other modern models of innovation, but there is every reason to believe they can harness innovation to improve mission effectiveness. After all, Aneesh Chopra, former U.S. Chief Technology Officer, reminded the Excellence in Government 2014 audience that government has a long history of innovation. From nuclear fusion to the Internet, the federal government has been at the forefront of technological development.
According to Chopra, the key to fueling innovation and economic prosperity today is open data. But to make the most of open data, government needs to adapt its culture. Chopra outlined three essential elements of doing so:

  1. Involve external experts – integrating outside ideas is second to none as a source of innovation.
  2. Leverage the experience of those on the front lines – federal employees who directly execute their agency’s mission often have the best sense of what does and does not work, and what can be done to improve effectiveness.
  3. Look to the public as a value multiplier – just as Facebook provides a platform for tens of thousands of developers to provide greater value, federal agencies can provide the raw material for many more to generate better citizen services.

In addition to these three broad elements, Chopra offered four specific levers government can use to help enact this paradigm shift:

  1. Democratize government data – opening government data to the public facilitates innovation. For example, data provided by the National Oceanic and Atmospheric Administration helps generate a $5 billion industry because the agency maintains almost no intellectual-property constraints on its weather data.
  2. Collaborate on technical standards – government can act as a convener of industry members to standardize technological development, and thereby increase the value of data shared.
  3. Issue challenges and prizes – incentivizing the public to get involved and participate in efforts to create value from government data enhances the government’s ability to serve the public.
  4. Launch government startups – programs like the Presidential Innovation Fellows initiative help challenge rigid bureaucratic structures and spread a culture of innovation.

Federal leaders will need a strong political platform to sustain this shift. Fortunately, this blueprint is also bipartisan, says Chopra. Political leaders on both sides of the aisle are already getting behind the movement to bring innovation to the core of government.”