Tech is moving beyond cities to focus on civic engagement in every U.S. county


 at TechCrunch: “While gridlock has taken hold in a paralyzed Washington, D.C., mayors across the country are taking a pragmatic approach to solving local problems and it’s time for tech to reach out to them….

The United States has 3,007 counties. And all of them have an appetite to shift the momentum from the federal government to the communities where people live and work. This can’t just involve coastal cities or urban areas within states. Rather, after Trump’s election, now is the moment to redouble policy efforts in communities across the country, from states to rural counties.

Cities from Chicago and Los Angeles to Boston and New York have been leading the way in thinking about how to provide better services and engagement opportunities. They’ve been exciting places where rich networks of talent, from academia to philanthropy, have been helping foster ecosystems to catalyze new policy solutions….

There are a host of illustrative experiments occurring across communities that are leveraging policy innovation, data, and technology for more responsive and inclusive governance. The engagements that work focus on process to ensure that diverse stakeholders are a part of decision making….

Wisconsin:

In Eau Claire, Wisconsin, a local organization called Clear Vision is teaming up with stakeholders on a poverty summit to reduce the number of people living in poverty and income insecurity and to build more resilient and inclusive communities. Citizen action groups will work on key issues they identify as part of the engagement process.

A key component of this poverty summit is to bring traditionally marginalized communities into the process, including low-income households, the rural poor, youth, and black and Hispanic communities. There is even a community-supported, nonprofit journalism site to support the local work in Eau Claire, Chippewa, and Dunn counties….

Oregon:

In Oregon, a “Kitchen Table” is enabling residents from across the state to contribute ideas, resources, and feedback to inform public policy. The Kitchen Table enables public officials to consult with representatives about key policy areas, crowdfund, and micro-lend for local startups and community businesses….

Another practice in Oregon is the Citizens Initiative Review, where a representative sampling of citizens convenes for deliberations over several days to discuss state ballot measures.  After being established by the state’s bipartisan legislature in 2009, there have been six random representative samples of citizens for multi-day deliberations to draft voting guides written for the people, by their neighbors….

 

This requires tapping into existing networks and civic organizations, leveraging data, technology, and policy innovations, and shifting our focus from federal policy towards building an infrastructure of governance that is durable through collective development and buy-in from people…(More)”

Four steps to precision public health


Scott F. Dowell, David Blazes & Susan Desmond-Hellmann at Nature: “When domestic transmission of Zika virus was confirmed in the United States in July 2016, the entire country was not declared at risk — nor even the entire state of Florida. Instead, precise surveillance defined two at-risk areas of Miami-Dade County, neighbourhoods measuring just 2.6 and 3.9 square kilometres. Travel advisories and mosquito control focused on those regions. Six weeks later, ongoing surveillance convinced officials to lift restrictions in one area and expand the other.

By contrast, a campaign against yellow fever launched this year in sub-Saharan Africa defines risk at the level of entire nations, often hundreds of thousands of square kilometres. More granular assessments have been deemed too complex.

The use of data to guide interventions that benefit populations more efficiently is a strategy we call precision public health. It requires robust primary surveillance data, rapid application of sophisticated analytics to track the geographical distribution of disease, and the capacity to act on such information [1].

The availability and use of precise data is becoming the norm in wealthy countries. But large swathes of the developing world are not reaping its advantages. In Guinea, it took months to assemble enough data to clearly identify the start of the largest Ebola outbreak in history. This should take days. Sub-Saharan Africa has the highest rates of childhood mortality in the world; it is also where we know the least about causes of death…..

The value of precise disease tracking was baked into epidemiology from the start. In 1854, John Snow famously located cholera cases in London. His mapping of the spread of infection through contaminated water dealt a blow to the idea that the disease was caused by bad air. These days, people and pathogens move across the globe swiftly and in great numbers. In 2009, the H1N1 ‘swine flu’ influenza virus took just 35 days to spread from Mexico and the United States to China, South Korea and 12 other countries…

The public-health community is sharing more data faster; expectations are higher than ever that data will be available from clinical trials and from disease surveillance. In the past two years, the US National Institutes of Health, the Wellcome Trust in London and the Gates Foundation have all instituted open data policies for their grant recipients, and leading journals have declared that sharing data during disease emergencies will not impede later publication.

Meanwhile, improved analysis, data visualization and machine learning have expanded our ability to use disparate data sources to decide what to do. A study published last year [4] used precise geospatial modelling to infer that insecticide-treated bed nets were the single most influential intervention in the rapid decline of malaria.

However, in many parts of the developing world, there are still hurdles to the collection, analysis and use of more precise public-health data. Work towards malaria elimination in South Africa, for example, has depended largely on paper reporting forms, which are collected and entered manually each week by dozens of subdistricts, and eventually analysed at the province level. This process would be much faster if field workers filed reports from mobile phones.


…Frontline workers should not find themselves frustrated by global programmes that fail to take into account data on local circumstances. Wherever they live — in a village, city or country, in the global south or north — people have the right to public-health decisions that are based on the best data and science possible, that minimize risk and cost, and maximize health in their communities…(More)”

neveragain.tech


neveragain.tech: “We, the undersigned, are employees of tech organizations and companies based in the United States. We are engineers, designers, business executives, and others whose jobs include managing or processing data about people. We are choosing to stand in solidarity with Muslim Americans, immigrants, and all people whose lives and livelihoods are threatened by the incoming administration’s proposed data collection policies. We refuse to build a database of people based on their Constitutionally-protected religious beliefs. We refuse to facilitate mass deportations of people the government believes to be undesirable…..

Today we stand together to say: not on our watch, and never again.

We commit to the following actions:

  • We refuse to participate in the creation of databases of identifying information for the United States government to target individuals based on race, religion, or national origin.
  • We will advocate within our organizations:
    • to minimize the collection and retention of data that would facilitate ethnic or religious targeting.
    • to scale back existing datasets with unnecessary racial, ethnic, and national origin data.
    • to responsibly destroy high-risk datasets and backups.
    • to implement security and privacy best practices, in particular, for end-to-end encryption to be the default wherever possible.
    • to demand appropriate legal process should the government request that we turn over user data collected by our organization, even in small amounts.
  • If we discover misuse of data that we consider illegal or unethical in our organizations:
    • We will work with our colleagues and leaders to correct it.
    • If we cannot stop these practices, we will exercise our rights and responsibilities to speak out publicly and engage in responsible whistleblowing without endangering users.
    • If we have the authority to do so, we will use all available legal defenses to stop these practices.
    • If we do not have such authority, and our organizations force us to engage in such misuse, we will resign from our positions rather than comply.
  • We will raise awareness and ask critical questions about the responsible and fair use of data and algorithms beyond our organization and our industry….(More)

Solving some of the world’s toughest problems with the Global Open Policy Report


 at Creative Commons: “Open Policy is when governments, institutions, and non-profits enact policies and legislation that make content, knowledge, or data they produce or fund available under a permissive license to allow reuse, revision, remix, retention, and redistribution. This promotes innovation, access, and equity in areas of education, data, software, heritage, cultural content, science, and academia.

For several years, Creative Commons has been tracking the spread of open policies around the world. And now, with the new Global Open Policy Report (PDF) by the Open Policy Network, we’re able to provide a systematic overview of open policy development.

The first-of-its-kind report gives an overview of open policies in 38 countries, across four sectors: education, science, data and heritage. The report includes an Open Policy Index and regional impact and local case studies from Africa, the Middle East, Asia, Australia, Latin America, Europe, and North America. The index measures open policy strength on two scales: policy strength and scope, and level of policy implementation. The index was developed by researchers from CommonSphere, a partner organization of CC Japan.

The Open Policy Index scores were used to classify countries as either Leading, Mid-Way, or Delayed in open policy development. The ten countries with the highest scores are Argentina, Bolivia, Chile, France, Kyrgyzstan, New Zealand, Poland, South Korea, Tanzania, and Uruguay…(More)

Fighting Exclusion, Inequality and Distrust: The Open Government Challenge


Remarks by Manish Bapna delivered at the Open Government Partnership Global Summit: “To the many heads of state, ministers, mayors, civil society colleagues gathered in this great hall, this is an important moment to reflect on the remarkable challenges of the past year.

We have seen the rise of various forms of populism and nationalism in the United States, Britain, the Philippines, Italy, and many other countries. This has led to surprise election results and an increase in anti-immigrant and anti-government movements.

We have seen the tragic results of conflict-driven migration, as captured in the iconic image of a three-year-old boy whose body washed up on the Turkish shore.

We have seen governments struggle to respond to the refugee crisis. Some open their arms while others close their doors.

We have seen deadly terrorist attacks in cities around the world – including this one — that have forced governments to walk a fine line between the need to protect their people and the risk of infringing on their civil liberties.

And we continue to confront two inter-linked challenges: the moral challenge of 700 million people in extreme poverty, living on less than $2 a day, and the existential challenge of a changing climate.

All of these point to a failure of governance and, if we are honest, to a lack of open government that truly connects, engages and meets the needs of all people.

World’s Problems Can’t Be Solved Without Open Government

The crux of the matter is this: While open government alone can’t fix the world’s problems, they can’t be solved without it.

Too many people feel excluded and marginalized. They believe that only elites reap the benefits of growth and globalization. They feel left out of decision-making. They distrust public institutions.

How we collectively confront these challenges will be OGP’s most important test….

Here are five essential steps we can take – we, the people here today – to help accelerate the shift toward open government.

The first step: We must protect civic space – the rights to free speech, assembly and association – because these bedrock rights are at the heart of a functioning society. Serious violations of these rights have been recently reported by CIVICUS in over 100 countries. In 25 active OGP countries, these rights are repressed or obstructed….

The second step: We must foster citizen-centered governance.

We cherish OGP as a unique platform where government and civil society are equal partners in a way that amplifies the concerns of ordinary citizens.

We commend the many OGP countries that have made significant strides. But we recognize that for others, this remains a major struggle.

As heads of state and ministers, we need you to embrace the concept of co-creation. …

The third step: We must make changes that are transformational, not incremental.

Drawing on our commitment to open government and the urgency of this moment, we must be willing to go further, faster…..

Transforming government brings us to the fourth step.

We need to make a real difference in people’s lives.

This is our Partnership’s ultimate aim. Because when open government works, it improves every facet of people’s lives.

• This means giving all people safe drinking water and clean air.
• It means reliable electricity so children can have light to do homework and play.
• It means health clinics where the sick can go to get quality care, where medicines are available.
• And it means building trust in public officials who are untainted by corruption….

The fifth and final step: We need to reinvigorate the Partnership’s political leadership….(More)”

Maybe the Internet Isn’t a Fantastic Tool for Democracy After All


 in New York Magazine: “My favorite story about the internet is the one about the anonymous Japanese guy who liberated Czechoslovakia. In 1989, as open dissent was spreading across the country, dissidents were attempting to coordinate efforts outside the watchful eye of Czechoslovak state security. The internet was a nascent technology, and the cops didn’t use it; modems were banned, and activists were able to use only those they could smuggle over the border, one at a time. Enter our Japanese guy. Bruce Sterling, who first told the story of the Japanese guy in a 1995 Wired article, says he talked to four different people who’d met the quiet stranger, but no one knew his name. What really mattered, anyway, is what he brought with him: “a valise full of brand-new and unmarked 2400-baud Taiwanese modems,” which he handed over to a group of engineering students in Prague before walking away. “The students,” Sterling would later write, “immediately used these red-hot 2400-baud scorcher modems to circulate manifestos, declarations of solidarity, rumors, and riot news.” Unrest expanded, the opposition grew, and within months, the Communist regime collapsed.

Is it true? Were free modems the catalyst for the Velvet Revolution? Probably not. But it’s a good story, the kind whose logic and lesson have become so widely understood — and so foundational to the worldview of Silicon Valley — as to make its truth irrelevant. Isn’t the best way to fortify the town square by giving more people access to it? And isn’t it nice to know, as one storied institution and industry after another falls to the internet’s disrupting sword, that everything will be okay in the end — that there might be some growing pains, but connecting billions of people to one another is both inevitable and good? Free speech will expand, democracy will flower, and we’ll all be rich enough to own MacBooks. The new princes of Silicon Valley will lead us into the rational, algorithmically enhanced, globally free future.

Or, they were going to, until earlier this month. The question we face now is: What happens when the industry destroyed is professional politics, the institutions leveled are the same few that prop up liberal democracy, and the values the internet disseminates are racism, nationalism, and demagoguery?

Powerful undemocratic states like China and Russia have for a while now put the internet to use to mislead the public, create the illusion of mass support, and either render opposition invisible or expose it to targeting…(More)”

From policing to news, how algorithms are changing our lives


Carl Miller at The National: “First, write out the numbers one to 100 in 10 rows. Cross out the one. Then circle the two, and cross out all of the multiples of two. Circle the three, and do likewise. Follow those instructions, and you’ve just completed the first three steps of an algorithm, and an incredibly ancient one. Twenty-three centuries ago, Eratosthenes sat in the great library of Alexandria, using this process (it is called Eratosthenes’ Sieve) to find and separate prime numbers. Algorithms are nothing new; indeed, even the word itself is old. Fifteen centuries after Eratosthenes, Algoritmi de numero Indorum appeared on the bookshelves of European monks, and with it, the word to describe something very simple in essence: follow a series of fixed steps, in order, to achieve a given answer to a given problem. That’s it, that’s an algorithm. Simple.
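Those opening instructions are the whole algorithm in miniature. As a rough illustration (the code below is our own sketch, not something from Miller's article), here is Eratosthenes' Sieve written out in Python, finding every prime up to 100:

```python
def eratosthenes_sieve(limit):
    """Return every prime number up to and including `limit`."""
    # Assume every number from 2 upward is prime until it is crossed out.
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False   # "cross out the one"

    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:                              # "circle" n...
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False           # ...and cross out its multiples

    return [n for n, prime in enumerate(is_prime) if prime]


print(eratosthenes_sieve(100))
# -> [2, 3, 5, 7, 11, ..., 83, 89, 97]
```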

Except that, of course, the story of algorithms is not so simple, nor so humble. In the shocked wake of Donald Trump’s victory in the United States presidential election, a culprit needed to be found to explain what had happened. What had, against the odds, and in the face of thousands of polls, caused this tectonic shift in US political opinion? Soon the finger was pointed. On social media, and especially on Facebook, it was alleged that pro-Trump stories, based on inaccurate information, had spread like wildfire, often eclipsing real news and honestly checked facts.

But no human editor was thrust into the spotlight. What took centre stage was an algorithm: Facebook’s news algorithm. It was this, critics said, that was responsible for allowing the “fake news” to circulate. This algorithm wasn’t humbly finding prime numbers; it was responsible for the news that you saw (and of course didn’t see) on the largest source of news in the world. This algorithm had somehow risen to become more powerful than any newspaper editor in the world, powerful enough to possibly throw an election.

So why all the fuss? Something is now happening in society that is throwing algorithms into the spotlight. They have taken on a new significance, even an allure and mystique. Algorithms are simply tools, but a web of new technologies is vastly increasing the power that these tools have over our lives. The startling leaps forward in artificial intelligence have meant that algorithms have learned how to learn, and to become capable of accomplishing tasks and tackling problems that they were never able to achieve before. Their learning is fuelled with more data than ever before, collected, stored and connected with the constellations of sensors, data farms and services that have ushered in the age of big data.

Algorithms are also doing more things, whether welding, driving or cooking, thanks to robotics. Wherever there is some kind of exciting innovation happening, algorithms are rarely far away. They are being used in more fields, for more things, than ever before and are incomparably, incomprehensibly more capable than the algorithms recognisable to Eratosthenes….(More)”

Big Data Coming In Faster Than Biomedical Researchers Can Process It


Richard Harris at NPR: “Biomedical research is going big-time: Megaprojects that collect vast stores of data are proliferating rapidly. But scientists’ ability to make sense of all that information isn’t keeping up.

This conundrum took center stage at a meeting of patient advocates, called Partnering For Cures, in New York City on Nov. 15.

On the one hand, there’s an embarrassment of riches, as billions of dollars are spent on these megaprojects.

There’s the White House’s Cancer Moonshot (which seeks to make 10 years of progress in cancer research over the next five years), the Precision Medicine Initiative (which is trying to recruit a million Americans to glean hints about health and disease from their data), the BRAIN Initiative (to map the neural circuits and understand the mechanics of thought and memory) and the International Human Cell Atlas Initiative (to identify and describe all human cell types).

“It’s not just that any one data repository is growing exponentially, the number of data repositories is growing exponentially,” said Dr. Atul Butte, who leads the Institute for Computational Health Sciences at the University of California, San Francisco.

One of the most remarkable efforts is the federal government’s push to get doctors and hospitals to put medical records in digital form. That shift to electronic records is costing billions of dollars — including more than $28 billion in federal incentives alone to hospitals, doctors and others to adopt them. The investment is creating a vast data repository that could potentially be mined for clues about health and disease, the way websites and merchants gather data about you to personalize the online ads you see and for other commercial purposes.

But, unlike the data scientists at Google and Facebook, medical researchers have done almost nothing as yet to systematically analyze the information in these records, Butte said. “As a country, I think we’re investing close to zero analyzing any of that data,” he said.

Prospecting for hints about health and disease isn’t going to be easy. The raw data aren’t very robust and reliable. Electronic medical records are often kept in databases that aren’t compatible with one another, at least without a struggle. Some of the potentially revealing details are also kept as free-form notes, which can be hard to extract and interpret. Errors commonly creep into these records….(More)”

Making the Case for Evidence-Based Decision-Making


Jennifer Brooks in Stanford Social Innovation Review: “After 15 years of building linkages between evidence, policy, and practice in social programs for children and families, I have one thing to say about our efforts to promote evidence-based decision-making: We have failed to capture the hearts and minds of the majority of decision-makers in the United States.

I’ve worked with state and federal leadership, as well as program administrators in the public and nonprofit spheres. Most of them just aren’t with us. They aren’t convinced that the payoffs of evidence-based practice (the method that uses rigorous tests to assess the efficacy of a given intervention) are worth the extra difficulty or expense of implementing those practices.

Why haven’t we gotten more traction for evidence-based decision-making? Three key reasons: 1) we have wasted time debating whether randomized control trials are the optimal approach, rather than building demand for more data-based decision-making; 2) we oversold the availability of evidence-based practices and underestimated what it takes to scale them; and 3) we did all this without ever asking what problems decision-makers are trying to solve.

If we want to gain momentum for evidence-based practice, we need to focus more on figuring out how to implement such approaches on a larger scale, in a way that uses data to improve programs on an ongoing basis….

We must start by understanding and analyzing the problem the decision-maker wants to solve. We need to offer more than lists of evidence-based strategies or interventions. What outcomes do the decision-makers want to achieve? And what do data tell us about why we aren’t getting those outcomes with current methods?…

None of the following ideas is rocket science, nor am I the first person to say them, but they do suggest ways that we can move beyond our current approaches in promoting evidence-based practice.

1. We need better data.

As Michele Jolin pointed out recently, few federal programs have sufficient resources to build or use evidence. There are limited resources for evaluation and other evidence-building activities, which too often are seen as “extras.” Moreover, many programs at the local, state, and national level have minimal information to use for program management and even fewer staff with the skills required to use it effectively…

 

2. We should attend equally to practices and to the systems in which they sit.

Systems improvements without changes in practice won’t get outcomes, but without systems reforms, evidence-based practices will have difficulty scaling up. …

3. You get what you pay for.

One fear I have is that we don’t actually know whether we can get better outcomes in our public systems without spending more money. And yet cost-savings seem to be what we promise when we sell the idea of evidence-based practice to legislatures and budget directors….

4. We need to hold people accountable for program results and promote ongoing improvement.

There is an inherent tension between using data for accountability and using it for program improvement….(More)”

The Crowd is Always There: A Marketplace for Crowdsourcing Crisis Response


Presentation by Patrick Meier at the Emergency Social Data Summit organized by the Red Cross… on “Collaborative Crisis Mapping” (the slides are available here): “What I want to expand on is the notion of a “marketplace for crowdsourcing” that I introduced at the Summit. The idea stems from my experience in the field of conflict early warning, the Ushahidi-Haiti deployment and my observations of the Ushahidi-DC and Ushahidi-Russia initiatives.

The crowd is always there. Paid Search & Rescue (SAR) teams and salaried emergency responders aren’t. Nor can they be on the corners of every street, whether that’s in Port-au-Prince, Haiti, Washington DC or Sukkur, Pakistan. But the real first responders, the disaster affected communities, are always there. Moreover, not all communities are equally affected by a crisis. The challenge is to link those who are most affected with those who are less affected (at least until external help arrives).

This is precisely what PIC Net and the Washington Post did when they partnered to deploy this Ushahidi platform in response to the massive snow storm that paralyzed Washington DC earlier this year. They provided a way for affected residents to map their needs and for those less affected to map the resources they could share to help others. You don’t need to be a disaster response professional to help your neighbor dig out their car.

More recently, friends at Global Voices launched the most ambitious crowdsourcing initiative in Russia in response to the massive forest fires. But they didn’t use this Ushahidi platform to map the fires. Instead, they customized the public map so that those who needed help could find those who wanted to help. In effect, they created an online marketplace to crowdsource crisis response. You don’t need professional certification in disaster response to drive someone’s grandparents to the next town over.

There’s a lot that disaster-affected populations can do (and already do) to help each other out in times of crisis. What may help is to combine the crowdsourcing of crisis information with what I call crowdfeeding in order to create an efficient marketplace for crowdsourcing response. By crowdfeeding, I mean taking crowdsourced information and feeding it right back to the crowd. Surely they need that information as much if not more than external, paid responders who won’t get to the scene for hours or days….(More)”
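To make the marketplace idea concrete, here is a deliberately simplified sketch of how crowdsourced "needs" and "offers" might be matched and fed back to the crowd. The code is illustrative only; it is not taken from Meier's presentation or from the Ushahidi platform, and the names, categories and contacts in it are hypothetical:

```python
from dataclasses import dataclass


@dataclass
class Report:
    """A single crowdsourced report: either a need or an offer of help."""
    kind: str        # "need" or "offer"
    category: str    # e.g. "shoveling", "transport", "shelter"
    location: str    # neighbourhood or coarse grid cell
    contact: str     # how to reach the person who filed the report


def match_reports(reports):
    """Pair each need with an offer in the same category and location.

    This is 'crowdfeeding' in miniature: crowdsourced reports are fed
    straight back to the crowd as matches, instead of waiting for
    external responders to arrive.
    """
    offers = [r for r in reports if r.kind == "offer"]
    matches = []
    for need in (r for r in reports if r.kind == "need"):
        for offer in offers:
            if offer.category == need.category and offer.location == need.location:
                matches.append((need, offer))
                offers.remove(offer)  # one offer serves one need in this toy model
                break
    return matches


# Example: a snowed-in resident is matched with a neighbour offering to shovel.
reports = [
    Report("need", "shoveling", "Columbia Heights", "alice@example.org"),
    Report("offer", "shoveling", "Columbia Heights", "bob@example.org"),
    Report("need", "transport", "Petworth", "carol@example.org"),
]
for need, offer in match_reports(reports):
    print(f"{offer.contact} can help {need.contact} with {need.category}")
```

In a real deployment the matching would of course need geolocation, verification and privacy safeguards; the point of the sketch is only that the crowd, via the map, can serve as its own matchmaker before external responders arrive.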