When Does ICT-Enabled Citizen Voice Lead to Government Responsiveness?


Paper by Tiago Peixoto and Jonathan Fox (World Bank): “This paper reviews evidence on the use of 23 information and communication technology (ICT) platforms to project citizen voice to improve public service delivery. This meta-analysis focuses on empirical studies of initiatives in the global South, highlighting both citizen uptake (‘yelp’) and the degree to which public service providers respond to expressions of citizen voice (‘teeth’). The conceptual framework further distinguishes between two trajectories for ICT-enabled citizen voice: Upwards accountability occurs when users provide feedback directly to decision-makers in real time, allowing policy-makers and program managers to identify and address service delivery problems – but at their discretion. Downwards accountability, in contrast, occurs either through real time user feedback or less immediate forms of collective civic action that publicly call on service providers to become more accountable and depends less exclusively on decision-makers’ discretion about whether or not to act on the information provided. This distinction between the ways in which ICT platforms mediate the relationship between citizens and service providers allows for a precise analytical focus on how different dimensions of such platforms contribute to public sector responsiveness. These cases suggest that while ICT platforms have been relevant in increasing policymakers’ and senior managers’ capacity to respond, most of them have yet to influence their willingness to do so….(More)”

Smart Devolution


New report by Eddie Copeland and Cameron Scott at Policy Exchange: “Elected mayors should be required to set up an Office of Data Analytics comprising small, expert teams tasked with using public and privately held data to create smarter and more productive cities.

A new paper, Smart Devolution, by leading think tank Policy Exchange says that most cities have vast quantities of data that, if accessed and used effectively, could help improve public services, optimise transport routes, support the growth of small businesses and even prevent cycling accidents.

The report highlights how every UK city should use the additional powers it receives from Whitehall to replicate New York by employing a small team of data experts to collect and collate information from a range of sources, including councils, emergency services, voluntary organisations, mobile phone networks and payment systems.

These data teams would give city mayors a great opportunity to break down the silos between local authorities and public sector bodies, unlocking information that could save money and improve the public’s standard of living.

Examples of how a better use of data could make our cities smarter include:

  • Preventing cycling accidents: HGVs travelling through city centres should be required to share their GPS data with the city mayor’s Office for Data Analytics. Combining HGV routes with data from cyclists obtained via their mobile phone signals could provide real-time information showing the most common routes shared by large lorries and cyclists. City leaders could then put in place evidence-based policy responses, for example, prioritising spending on new bike lanes or updating cyclists, via an app, on the city’s most dangerous routes (a minimal sketch of this kind of route-overlap analysis follows this list).
  • Spending smarter: cities could save money, and residents benefit, from the analysis of anonymised spending and travel information to understand where investment and services are needed based on real consumer decisions, locating schools, transport links and housing when and where they are needed. This also applies to business investment, with data being harnessed to identify fruitful locations….(More)”
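
The report does not specify an algorithm, but a minimal sketch of how such a route-overlap analysis might work is below, assuming anonymised (lat, lon) GPS points from HGVs and from cyclists’ phones; the grid size, threshold and sample coordinates are illustrative assumptions only.

```python
from collections import Counter

GRID = 0.001  # cell size in degrees, roughly 100 m; an arbitrary illustrative resolution


def to_cell(lat, lon, grid=GRID):
    """Snap a GPS point to a coarse grid cell."""
    return (round(lat / grid), round(lon / grid))


def shared_hotspots(hgv_points, cyclist_points, min_hits=2):
    """Return grid cells heavily used by both HGVs and cyclists, busiest first."""
    hgv_cells = Counter(to_cell(lat, lon) for lat, lon in hgv_points)
    bike_cells = Counter(to_cell(lat, lon) for lat, lon in cyclist_points)
    shared = {
        cell: (hgv_cells[cell], bike_cells[cell])
        for cell in hgv_cells.keys() & bike_cells.keys()
        if hgv_cells[cell] >= min_hits and bike_cells[cell] >= min_hits
    }
    return sorted(shared.items(), key=lambda kv: sum(kv[1]), reverse=True)


if __name__ == "__main__":
    # Tiny synthetic tracks standing in for shared HGV and cyclist GPS feeds.
    hgvs = [(51.5010, -0.1200), (51.5011, -0.1201), (51.5030, -0.1250),
            (51.5010, -0.1199), (51.5031, -0.1251)]
    bikes = [(51.5010, -0.1200), (51.5009, -0.1201), (51.5100, -0.1300),
             (51.5011, -0.1200)]
    for cell, (h, b) in shared_hotspots(hgvs, bikes):
        print(f"cell {cell}: {h} HGV points, {b} cyclist points")
```

Cells that rank highly on both counts would be candidate sites for new bike lanes or in-app warnings; a real system would of course use proper map-matching rather than a crude grid.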

Human-machine superintelligence pegged as key to solving global problems


Ravi Mandalia at Dispatch Tribunal: “Complex global problems such as climate change and geopolitical conflicts need a new approach if we want to solve them, and researchers have suggested that human-machine superintelligence could be the key.

These so-called ‘wicked’ problems are among the most dire ones needing our immediate attention, and researchers from the Human Computation Institute (HCI) and Cornell University have presented, in an article published in the journal Science, a new vision of human computation that could help solve them.

The scientists behind the article cite how the power of human computation has pushed traditional limits to new heights – something that was not achievable until now. Humans are still ahead of machines at a great many things – cognitive ability is one of the key areas – but if their powers are combined with those of machines, the result would be multidimensional collaborative networks that achieve what traditional problem-solving cannot.

Researchers have already shown that micro-tasking can help with some complex problems, including building the world’s most complete map of human retinal neurons; however, this approach isn’t always viable for the much more complex problems of today, and an entirely new and innovative approach is required to solve “wicked problems” – those that involve many interacting systems that are constantly changing, and whose solutions have unforeseen consequences (e.g., corruption resulting from financial aid given in response to a natural disaster).

Recently developed human computation technologies that provide real-time access to crowd-based inputs could enable the creation of more flexible collaborative environments; such setups are better suited to addressing the most challenging issues.

This idea is already taking shape in several human computation projects, including YardMap.org, which was launched by Cornell in 2012 to map global conservation efforts one parcel at a time.

“By sharing and observing practices in a map-based social network, people can begin to relate their individual efforts to the global conservation potential of living and working landscapes,” says Janis Dickinson, Professor and Director of Citizen Science at the Cornell Lab of Ornithology.

YardMap allows participants to interact and build on each other’s work – something that crowdsourcing alone cannot achieve. The project serves as an important model for how such bottom-up, socially networked systems can bring about scalable changes in how we manage residential landscapes.

HCI has recently set out to use crowd-power to accelerate Cornell-based Alzheimer’s disease research. WeCureAlz.com combines two successful microtasking systems into an interactive analytic pipeline that builds blood flow models of mouse brains. The stardust@home system, which was used to search for comet dust in one million images of aerogel, is being adapted to identify stalled blood vessels, which will then be pinpointed in the brain by a modified version of the EyeWire system….(More)”

Can crowdsourcing decipher the roots of armed conflict?


Stephanie Kanowitz at GCN: “Researchers at Pennsylvania State University and the University of Texas at Dallas are proving that there’s accuracy, not just safety, in numbers. The Correlates of War project, a long-standing effort that studies the history of warfare, is now experimenting with crowdsourcing as a way to more quickly and inexpensively create a global conflict database that could help explain when and why countries go to war.

The goal is to facilitate the collection, dissemination and use of reliable data in international relations, but a byproduct has emerged: the development of technology that uses machine learning and natural language processing to efficiently, cost-effectively and accurately create databases from news articles that detail militarized interstate disputes.

The project is in its fifth iteration, having released the fourth set of Militarized Interstate Dispute (MID) data in 2014. To create those earlier versions, researchers paid subject-matter experts such as political scientists to read and hand-code newswire articles about disputes, identifying features of possible militarized incidents. Now, however, they’re soliciting help from anyone and everyone — and finding the results are much the same as what the experts produced, except the results come in faster and with significantly less expense.

As news articles come across the wire, the researchers pull them and formulate questions about them that help evaluate the military events. Next, the articles and questions are loaded onto Amazon Mechanical Turk, a marketplace for crowdsourcing. The project assigns articles to readers, who typically spend about 10 minutes reading an article and responding to the questions. The readers submit the answers to the project researchers, who review them. The project assigns the same article to multiple workers and uses computer algorithms to combine the data into one annotation.

A systematic comparison of the crowdsourced responses with those of trained subject-matter experts showed that the crowdsourced work was accurate for 68 percent of the news reports coded. More important, the aggregation of answers for each article showed that common answers from multiple readers strongly correlated with correct coding. This allowed researchers to easily flag the articles that required deeper expert involvement and process the majority of the news items in near-real time and at limited cost….(more)”
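
The article does not give the aggregation algorithm, but a minimal sketch of the idea described above is shown here: each article gets several worker answers, a simple majority vote yields one annotation, and low-agreement articles are flagged for expert review. The answer labels, agreement threshold and data layout are illustrative assumptions.

```python
from collections import Counter

# Hypothetical worker responses to one coding question per article,
# e.g. "Did the report describe a militarized incident?"
responses = {
    "article-001": ["yes", "yes", "yes", "no", "yes"],
    "article-002": ["no", "yes", "no", "yes"],
    "article-003": ["no", "no", "no"],
}

AGREEMENT_THRESHOLD = 0.75  # arbitrary cut-off separating confident crowd answers


def aggregate(answers):
    """Majority-vote a list of worker answers into one label plus an agreement score."""
    counts = Counter(answers)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(answers)


for article_id, answers in responses.items():
    label, agreement = aggregate(answers)
    if agreement >= AGREEMENT_THRESHOLD:
        print(f"{article_id}: coded '{label}' (agreement {agreement:.0%})")
    else:
        print(f"{article_id}: agreement only {agreement:.0%} -> flag for expert review")
```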

Big Data in U.S. Agriculture


Megan Stubbs at the Congressional Research Service: “Recent media and industry reports have employed the term big data as a key to the future of increased food production and sustainable agriculture. A recent hearing on the private elements of big data in agriculture suggests that Congress too is interested in potential opportunities and challenges big data may hold. While there appears to be great interest, the subject of big data is complex and often misunderstood, especially within the context of agriculture.

There is no commonly accepted definition of the term big data. It is often used to describe a modern trend in which the combination of technology and advanced analytics creates a new way of processing information that is more useful and timely. In other words, big data is just as much about new methods for processing data as about the data themselves. It is dynamic, and when analyzed can provide a useful tool in a decisionmaking process. Most see big data in agriculture at the end use point, where farmers use precision tools to potentially create positive results like increased yields, reduced inputs, or greater sustainability. While this is certainly the more intriguing part of the discussion, it is but one aspect and does not necessarily represent a complete picture.

Both private and public big data play a key role in the use of technology and analytics that drive a producer’s evidence-based decisions. Public-level big data represent records collected, maintained, and analyzed through publicly funded sources, specifically by federal agencies (e.g., farm program participant records and weather data). Private big data represent records generated at the production level and originate with the farmer or rancher (e.g., yield, soil analysis, irrigation levels, livestock movement, and grazing rates). While discussed separately in this report, public and private big data are typically combined to create a more complete picture of an agricultural operation and therefore better decisionmaking tools.

Big data may significantly affect many aspects of the agricultural industry, although the full extent and nature of its eventual impacts remain uncertain. Many observers predict that the growth of big data will bring positive benefits through enhanced production, resource efficiency, and improved adaptation to climate change. While lauded for its potentially revolutionary applications, big data is not without issues. From a policy perspective, issues related to big data involve nearly every stage of its existence, including its collection (how it is captured), management (how it is stored and managed), and use (how it is analyzed and used). It is still unclear how big data will progress within agriculture due to technical and policy challenges, such as privacy and security, for producers and policymakers. As Congress follows the issue a number of questions may arise, including a principal one—what is the federal role?…(More)”

The Future of Behavioural Change: Balancing Public Nudging vs Private Nudging


2nd AIM Lecture by Alberto Alemanno: “Public authorities, including the European Union and its Member States, are increasingly interested in exploiting behavioral insights through public action. They increasingly do so through choice architecture, i.e. the alteration of the environment of choice surrounding a particular decision-making context in areas as diverse as energy consumption, tax collection and public health. In this regard, it is useful to distinguish between two situations. The first is that of a public authority which seeks to steer behaviour in the public interest, taking into account one or more mental shortcuts. Thus, a default enrollment for organ donation leverages the power of inertia to enhance the overall prevalence of organ donors. Placing an emoticon (sad face) or a set of information about average consumption on a prohibitively high energy bill has the potential to nudge consumers towards less energy consumption. I call this pure public nudging. The second situation is when public authorities react to exploitative uses of mental shortcuts by market forces by regulating private nudging. I call this ‘counter-nudging’. Pure public nudging helps people correct mental shortcuts so as to achieve legitimate objectives (e.g. increased availability of organs, environmental protection, etc.), regardless of their exploitative use by market forces.
It is against this proposed taxonomy that the 2nd AIM Lecture examines whether private companies, too, may nudge for good. Are corporations well-placed to nudge their customers towards societal objectives, such as the protection of the environment or the promotion of public health? This is what I call benign corporate nudging.
Their record is far from the most credible. Companies have used behaviourally inspired interventions to maximize profits, which led them to sell more and, in turn, to induce citizens into more consumption. Yet corporate marketing need not always be self-interested. A growing number of companies are using their brand, generally through their packaging and marketing efforts, to ‘nudge for good’. By illustrating some actual examples, this lecture defines the conditions under which companies may genuinely and credibly nudge for good. It argues that benign corporate nudging may have – unlike dominant CSR efforts – a positive long-term, habit-forming effect that influences consumers’ future behaviour ‘for good’….(More)”

 

Open Prescribing


“Every month, the NHS in England publishes anonymised data about the drugs prescribed by GPs. But the raw data files are large and unwieldy, with more than 600 million rows. We’re making it easier for GPs, managers and everyone to explore – supporting safer, more efficient prescribing.
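
As a rough illustration of what working with the raw files involves, the sketch below totals prescribed items per practice for one BNF code prefix while reading a very large monthly CSV in chunks. The file name and column names are assumptions for illustration; consult the published file layout before relying on them.

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
PRESCRIBING_FILE = "prescribing_2015_12.csv"
COLS = ["PRACTICE", "BNF_CODE", "ITEMS"]


def items_by_practice(bnf_prefix, path=PRESCRIBING_FILE, chunksize=1_000_000):
    """Total prescribed items per practice for one BNF prefix,
    streaming the file in chunks instead of loading it all at once."""
    totals = None
    for chunk in pd.read_csv(path, usecols=COLS, chunksize=chunksize):
        matched = chunk[chunk["BNF_CODE"].str.startswith(bnf_prefix)]
        part = matched.groupby("PRACTICE")["ITEMS"].sum()
        totals = part if totals is None else totals.add(part, fill_value=0)
    return totals.sort_values(ascending=False)


# Example: rank practices by items for a hypothetical BNF section prefix.
# print(items_by_practice("0501").head(10))
```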

OpenPrescribing is one of a range of projects built by Ben Goldacre and Anna Powell-Smith at the EBM Data Lab to help make complex medical and scientific data more accessible and more impactful in the real world…..

Data sources

Please read our guide to using the data.

Prescribing data is from the monthly files published by the Health and Social Care Information Centre (HSCIC), used under the terms of the Open Government Licence.

Practice list sizes are from the NHS Business Services Authority’s Information Portal, used under the terms of the Open Government Licence. ASTRO-PUs and STAR-PUs are calculated from list sizes, based on standard formulas.

BNF codes and names are also from the NHS Business Services Authority’s Information Portal, used under the terms of the Open Government Licence.

CCG to practice relations, and practice prescribing settings, are from the HSCIC’s data downloads (epraccur.csv), used under the terms of the Open Government Licence.

CCG names and codes and CCG geographic boundaries are from the Office for National Statistics, used under the terms of the Open Government Licence.

Practice locations are approximate, geocoded using OpenCageData. If you know a better source of practice locations (not including Code-Point Open), please get in touch!…(More)”

Predictive Analytics


Revised book by Eric Siegel: “Prediction is powered by the world’s most potent, flourishing unnatural resource: data. Accumulated in large part as the by-product of routine tasks, data is the unsalted, flavorless residue deposited en masse as organizations churn away. Surprise! This heap of refuse is a gold mine. Big data embodies an extraordinary wealth of experience from which to learn.

Predictive analytics unleashes the power of data. With this technology, the computer literally learns from data how to predict the future behavior of individuals. Perfect prediction is not possible, but putting odds on the future drives millions of decisions more effectively, determining whom to call, mail, investigate, incarcerate, set up on a date, or medicate.
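
A minimal sketch of that idea, not taken from the book: train a model on past examples of individual behaviour, then score new individuals with probabilities that can rank whom to contact first. The synthetic data and choice of classifier are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for historical records: one row per individual,
# behavioural features as columns, label = whether they later churned.
X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Prediction is imperfect, but the model puts odds on each individual,
# which is enough to rank whom to call, mail or investigate first.
odds = model.predict_proba(X_test)[:, 1]
ranked = odds.argsort()[::-1]
print("Top 5 scores:", odds[ranked[:5]].round(3))
print("Held-out accuracy:", round(model.score(X_test, y_test), 3))
```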

In this lucid, captivating introduction — now in its Revised and Updated edition — former Columbia University professor and Predictive Analytics World founder Eric Siegel reveals the power and perils of prediction:

    • What type of mortgage risk Chase Bank predicted before the recession.
    • Predicting which people will drop out of school, cancel a subscription, or get divorced before they even know it themselves.
    • Why early retirement predicts a shorter life expectancy and vegetarians miss fewer flights.
    • Five reasons why organizations predict death — including one health insurance company.
    • How U.S. Bank and Obama for America calculated — and Hillary for America 2016 plans to calculate — the way to most strongly persuade each individual.
    • Why the NSA wants all your data: machine learning supercomputers to fight terrorism.
    • How IBM’s Watson computer used predictive modeling to answer questions and beat the human champs on TV’s Jeopardy!
    • How companies ascertain untold, private truths — how Target figures out you’re pregnant and Hewlett-Packard deduces you’re about to quit your job.
    • How judges and parole boards rely on crime-predicting computers to decide how long convicts remain in prison.
    • 183 examples from Airbnb, the BBC, Citibank, ConEd, Facebook, Ford, Google, the IRS, LinkedIn, Match.com, MTV, Netflix, PayPal, Pfizer, Spotify, Uber, UPS, Wikipedia, and more….(More)”

 

‘Design thinking’ is changing the way we approach problems


Tim Johnson in University Affairs on “Why researchers in various disciplines are using the principles of design to solve problems big and small” : “A product of the same trends and philosophies that gave us smartphones, laptop computers and internet search engines, design thinking is changing the way some academics approach teaching and research, the way architects design classrooms and how leaders seek to solve the world’s most persistent problems.

Cameron Norman is a long-time supporter of design thinking (or DT) and an adjunct lecturer at the University of Toronto’s Dalla Lana School of Public Health. He notes that designers, especially product designers, are typically experts in conceptualizing problems and solving them – ideal skills for tackling a wide range of issues, from building a better kitchen table to mapping out the plans for a large building. “The field of design is the discipline of innovation,” he says. “[Design thinking] is about taking these methods, tools and ideas, and applying them in other areas.”

Design thinking centres on the free flow of ideas – far-out concepts aren’t summarily dismissed – and an unusually enthusiastic embrace of failure. “Design thinkers try to figure out what the key problem is – they look around and try to understand what’s going on, and come up with some wild ideas, thinking big and bold, on how to solve it,” Dr. Norman says. “They assume they’re not going to get it right the first time.”

If you were looking to build a better mousetrap, you’d prototype a model, test it for weaknesses, then either trash it and start again, or identify the problems and seek to correct them. DT does the same thing, but in an increasingly broad array of areas, from social policy to healthcare to business.

Deborah Shackleton, dean of design and dynamic media at Emily Carr University of Art + Design in Vancouver, was an early adopter of DT. “Design thinking is a mindset. You can use it as a tool or a technique. It’s very adaptable,” she says.

In 2005, ECUAD revamped much of its curriculum in design and dynamic media, looking to shift the focus from more traditional methods of research, like literature reviews, to something called “generative research.” “It’s the idea that you would invite the participants – for whom the design is intended – to be part of the creation process,” Dr. Shackleton says. She adds that various tools, like “co-creation kits” (which include a range of activities to engage people on a variety of cognitive and emotional levels) and ethnographic and cultural probes (activities which help participants demonstrate details about their private lives to their design partners), prove very useful in this area.

Collaboration among various fields is an important part of the design thinking process. At the University of Alberta, Aidan Rowe, an associate professor in design studies, is using design thinking to help the City of Edmonton improve services for people who are homeless. “Design is a truly interdisciplinary discipline,” says Dr. Rowe. “We always need someone to design with and for. We don’t design for ourselves.”….

Design thinkers often speak of “human-centered design” and “social innovation,” concepts that flow from DT’s assertion that no single person has the answer to a complex problem. Instead, it focuses on collective goals and places a premium on sustainability, community, culture and the empowerment of people, says Greg Van Alstyne, director of research and co-founder of the Strategic Innovation Lab, or sLab, at OCAD University. “It means you go about your problem-solving in a more holistic way. We can say ‘human-centered,’ but it’s actually ‘life-centered,’” Mr. Van Alstyne explains. “Our brand of design thinking is amenable to working within social systems and improving the lot of communities.”

 

Design thinking is also transforming university campuses in a tangible way. One example is at the University of Calgary’s Taylor Institute for Teaching and Learning, which is undergoing a $40-million renovation. “The whole space is designed to help students connect, communicate, collaborate and create knowledge,” says Lynn Taylor, vice-provost, teaching and learning. “Traditional learning was focused on the facts and concepts and procedures of a discipline, and we’re moving toward the goal of having students think far more deeply about their learning.”

To create this new space within a two-floor, 4,000-square-metre building that formerly served as an art museum, the university turned to Diamond Schmitt Architects, who have designed similar spaces at a number of other Canadian campuses. The new space, scheduled to open in February, prioritizes flexibility, with movable walls and collapsible furniture, and the seamless integration of technology.

Lead architect Don Schmitt observes that in a traditional campus building, which usually contains a long corridor and individual classrooms, conversation tends to gravitate to the only true public space: the hallway. “There’s a sense that more learning probably happens outside the classroom or between the classrooms, than happens inside the classroom,” he says.

Gone is the old-model lecture hall, with fixed podium and chairs. They’ve been replaced by a much more malleable space, which in a single day can act as a dance studio, movie theatre, lecture space, or just a big area for students to get together. “It’s about individual learning happening informally, quiet study, gregarious social activity, group study, group projects, flexible studio environments, changeable, ‘hack-able’ spaces and lots of flexibility to use different places in different ways,” Mr. Schmitt explains….(More)”

Humanity 360: World Humanitarian Data and Trends 2015


OCHA: “WORLD HUMANITARIAN DATA AND TRENDS

Highlights major trends, challenges and opportunities in the nature of humanitarian crises, showing how the humanitarian landscape is evolving in a rapidly changing world.


LEAVING NO ONE BEHIND: HUMANITARIAN EFFECTIVENESS IN THE AGE OF THE SUSTAINABLE DEVELOPMENT GOALS

Exploring what humanitarian effectiveness means in today’s world ‐ better meeting the needs of people in crisis, better moving people out of crisis.


TOOLS FOR DATA COORDINATION AND COLLECTION