Public servants to go on blind coffee dates for innovation


David Donaldson at The Mandarin: “Victorian public servants will have the opportunity to be randomly matched with others for coffee dates, as part of the government’s plan to foster links across silos and bolster innovation.

That is just one of many initiatives planned by Victoria in its new Public Sector Innovation Strategy, released on Tuesday. The plan acknowledges that plenty of innovative thinking is already happening, so the best way to drive further ideas is to connect people better and provide tools and case studies so they can learn from one another.

Six themes repeatedly came up in conversations around innovation in the public service, says the document:

  1. Leaders who enable and reward — too often new ideas are stifled by leaders who don’t support them;
  2. Employees who feel confident and supported;
  3. Learning well — pockets of innovation exist, and stronger efforts to learn and develop from them will help;
  4. Sharing with each other;
  5. Partnering with the community and other organisations;
  6. Delivering value — don’t innovate on random things. Focus on what makes a difference.

“This strategy helps to find, encourage and support change that adds value across the public sector. We need to unlock good intent and talent, share examples and experiences, and learn from each other,” says Chris Eccles, secretary of the Department of Premier and Cabinet.

“At our best, we all contribute our different skills and roles to generate more public value, shaped by the common purpose of creating a better society.”

To kick off progress, the government has outlined a series of actions it will undertake:

  • A reverse mentoring plan to help executives learn from more junior staff. Due September 2017.
  • Expand a current departmental trial that builds innovation into executive performance development plans. December 2017.
  • Establish a high-profile event to recognise and reward practical innovation across government. March 2018.
  • A practical innovation bank, to provide a common digital space for cross-government sharing of practical resources (case studies, contacts, templates, guides, lessons learned and so on). December 2017.
  • Ideas challenge toolkit to provide guidance on how to run an ideas challenge. December 2017.
  • Learning lab trial, which will provide an incubator environment for cross-government use on a project-by-project basis. March 2018.
  • VPS Academy, a new peer-to-peer learning project, will go through two more pilots to build the case for scaling up. July and December 2017…(More)”.

Open Data Blueprint


ODX Canada: “In Canada, the open data environment should be viewed as a supply chain. The movement of open data from producers to consumers involves many different organizations, people, activities, projects and initiatives, all of which work together to push out a final product. Naturally, if there is a break or hurdle in this supply chain, it doesn’t work efficiently. A fundamental hurdle highlighted by companies across the country was the inability to scale their business at the provincial, national and international levels.

This blueprint aims to address the challenges Canadian entrepreneurs are facing by encouraging municipalities to launch open data initiatives. By sharing best practices, we hope to encourage the accessibility of datasets within existing jurisdictions. The structured recommendations in this Open Data Blueprint are based on feedback and best practices seen in major cities across Canada collected through ODX’s primary research….(More)”

(Read more about the OD150 initiative here)

NIH-funded team uses smartphone data in global study of physical activity


National Institutes of Health: “Using a larger dataset than for any previous human movement study, National Institutes of Health-funded researchers at Stanford University in Palo Alto, California, have tracked physical activity by population for more than 100 countries. Their research follows on a recent estimate that more than 5 million people die each year from causes associated with inactivity.

The large-scale study of daily step data from anonymous smartphone users dials in on how countries, genders, and community types fare in terms of physical activity and what results may mean for intervention efforts around physical activity and obesity. The study was published July 10, 2017, in the advance online edition of Nature.

“Big data is not just about big numbers, but also the patterns that can explain important health trends,” said Grace Peng, Ph.D., director of the National Institute of Biomedical Imaging and Bioengineering (NIBIB) program in Computational Modeling, Simulation and Analysis.

“Data science and modeling can be immensely powerful tools. They can aid in harnessing and analyzing all the personalized data that we get from our phones and wearable devices.”

Almost three quarters of adults in developed countries and half of adults in developing economies carry a smartphone. The devices are equipped with tiny accelerometers, the computer chips that keep the screen correctly oriented, which can also automatically record stepping motions. The users whose data contributed to this study subscribed to the Azumio Argus app, a free application for tracking physical activity and other health behaviors….
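The step recording these accelerometers enable can be approximated with simple threshold-crossing logic over the acceleration magnitude. The sketch below is illustrative only — it is not the Argus app's actual algorithm, and the 11 m/s² threshold is an arbitrary assumption:

```python
import math

def count_steps(samples, threshold=11.0):
    """Count steps as upward crossings of a magnitude threshold.

    `samples` is a list of (ax, ay, az) accelerometer readings in m/s^2;
    a phone at rest reads roughly 9.8 (gravity alone). Each time the total
    acceleration rises above the threshold, we count one step and wait for
    it to fall back below before counting again.
    """
    steps = 0
    above = False  # are we currently above the threshold?
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1
            above = True
        elif mag <= threshold:
            above = False
    return steps
```

Real pedometer pipelines add low-pass filtering and adaptive thresholds to reject noise, but the core idea of detecting periodic acceleration peaks is the same.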

In addition to the step records, the researchers accessed age, gender, and height and weight status of users who registered the smartphone app. They used the same calculation that economists use for income inequality — called the Gini index — to calculate activity inequality by country.
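The Gini index the researchers borrowed is straightforward to compute. This sketch applies a standard Gini formula to hypothetical daily step counts; the sample values are illustrative, not the study's data:

```python
def gini(values):
    """Gini coefficient: 0 = perfect equality, 1 = maximal inequality."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard formula: sum of values weighted by ascending rank (1-indexed).
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n

# Hypothetical daily step counts for five people in one country.
steps = [2000, 4000, 6000, 8000, 10000]
print(round(gini(steps), 3))  # → 0.267
```

Applied per country, a higher value means daily steps are concentrated in fewer people — the "activity-rich" versus "activity-poor" split described below.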

“These results reveal how much of a population is activity-rich, and how much of a population is activity-poor,” Delp said. “In regions with high activity inequality there are many people who are activity poor, and activity inequality is a strong predictor of health outcomes.”…

The researchers investigated the idea that making improvements in a city’s walkability — creating an environment that is safe and enjoyable to walk — could reduce activity inequality and the activity gender gap.

“If you must cross major highways to get from point A to point B in a city, the walkability is low; people rely on cars,” Delp said. “In cities like New York and San Francisco, where you can get across town on foot safely, the city has high walkability.”

Data from 69 U.S. cities showed that higher walkability scores are associated with lower activity inequality. Higher walkability is associated with significantly more daily steps across all age, gender, and body-mass-index categories. However, the researchers found that women recorded comparatively less activity than men in places that are less walkable.

The study exemplifies how smartphones can deliver new insights about key health behaviors, including what the authors categorize as the global pandemic of physical inactivity….(More)”.

Principles and Practices for a Federal Statistical Agency


National Academies of Sciences Report: “Publicly available statistics from government agencies that are credible, relevant, accurate, and timely are essential for policy makers, individuals, households, businesses, academic institutions, and other organizations to make informed decisions. Even more, the effective operation of a democratic system of government depends on the unhindered flow of statistical information to its citizens.

In the United States, federal statistical agencies in cabinet departments and independent agencies are the governmental units whose principal function is to compile, analyze, and disseminate information for such statistical purposes as describing population characteristics and trends, planning and monitoring programs, and conducting research and evaluation. The work of these agencies is coordinated by the U.S. Office of Management and Budget. Statistical agencies may acquire information not only from surveys or censuses of people and organizations, but also from such sources as government administrative records, private-sector datasets, and Internet sources that are judged of suitable quality and relevance for statistical use. They may conduct analyses, but they do not advocate policies or take partisan positions. Statistical purposes for which they provide information relate to descriptions of groups and exclude any interest in or identification of an individual person, institution, or economic unit.

Four principles are fundamental for a federal statistical agency: relevance to policy issues, credibility among data users, trust among data providers, and independence from political and other undue external influence. Principles and Practices for a Federal Statistical Agency: Sixth Edition presents and comments on these principles as they’ve been impacted by changes in laws, regulations, and other aspects of the environment of federal statistical agencies over the past 4 years….(More)”.

Justice in Algorithmic Robes


Editorial by Joseph Savirimuthu of a Special Issue of the International Review of Law, Computers & Technology: “The role and impact of algorithms has attracted considerable interest in the media. Its impact is already being reflected in adjustments made in a number of sectors – entertainment, travel, transport, cities and financial services. From an innovation point of view, algorithms enable new knowledge to be created and identify solutions to problems. The emergence of smart sensing technologies, 3D printing, automated systems and robotics is seamlessly being interwoven into discourses such as ‘the collaborative economy’, ‘governance by platforms’ and ‘empowerment’. Innovations such as body worn cameras, fitness trackers, 3D printing, smart meters, robotics and Big Data hold out the promise of a new algorithmic future. However, the shift in focus from natural and scarce resources towards information also makes individuals its objects, and the mediated construction of access and knowledge infrastructures now provides the conditions for harnessing value from data. The increasing role of algorithms in technologically mediated environments also coincides with growing inter-disciplinary scholarship voicing concerns about the vulnerability of the values we associate with fundamental freedoms and how these are being algorithmically reconfigured or dismantled in a systematic manner. The themed issue, Justice in Algorithmic Robes, is intended to initiate a dialogue on both the challenges and opportunities as digitalization ushers in a period of transformation that has no immediate parallels in terms of scale, speed and reach. The articles provide different perspectives on the transformation taking place in the digital environment. The contributors offer an inter-disciplinary view of how the digital economy is being invigorated and evaluate the regulatory responses – in particular, how these transformations interact with law.
The different spheres covered in Justice in Algorithmic Robes – the relations between the State and individuals, autonomous technology, designing human–computer interactions, infrastructures of trust, accountability in the age of Big Data, and health and wearables – not only reveal the problem of defining spheres of economic, political and social activity, but also highlight how these contexts evolve into structures for dominance, power and control. Re-imagining the role of law does not mean that technology is the problem but the central idea from the contributions is that how we critically interpret and construct Justice in Algorithmic Robes is probably the first step we must take, always mindful of the fact that law may actually reinforce power structures….(Full Issue)”.

Children and the Data Cycle: Rights And Ethics in a Big Data World


Gabrielle Berman and Kerry Albright at UNICEF: “In an era of increasing dependence on data science and big data, the voices of one set of major stakeholders – the world’s children and those who advocate on their behalf – have been largely absent. A recent paper estimates one in three global internet users is a child, yet there has been little rigorous debate or understanding of how to adapt traditional, offline ethical standards for research involving data collection from children, to a big data, online environment (Livingstone et al., 2015). This paper argues that due to the potential for severe, long-lasting and differential impacts on children, child rights need to be firmly integrated onto the agendas of global debates about ethics and data science. The authors outline their rationale for a greater focus on child rights and ethics in data science and suggest steps to move forward, focusing on the various actors within the data chain including data generators, collectors, analysts and end-users. It concludes by calling for a much stronger appreciation of the links between child rights, ethics and data science disciplines and for enhanced discourse between stakeholders in the data chain, and those responsible for upholding the rights of children, globally….(More)”.

Gender Biases in Cyberspace: A Two-Stage Model, the New Arena of Wikipedia and Other Websites


Paper by Shlomit Yanisky-Ravid and Amy Mittelman: “Increasingly, there has been a focus on creating democratic standards and norms in order to best facilitate open exchange of information and communication online―a goal that fits neatly within the feminist aim to democratize content creation and community. Collaborative websites, such as blogs, social networks, and, as focused on in this Article, Wikipedia, represent both a cyberspace community entirely outside the strictures of the traditional (intellectual) proprietary paradigm and one that professes to truly embody the philosophy of a completely open, free, and democratic resource for all. In theory, collaborative websites are the solution for which social activists, intellectual property opponents, and feminist theorists have been waiting. Unfortunately, we are now realizing that this utopian dream does not exist as anticipated: the Internet is neither neutral nor open to everyone. More importantly, these websites are not egalitarian; rather, they facilitate new ways to exclude and subordinate women. This Article innovatively argues that the virtual world excludes women in two stages: first, by controlling websites and filtering out women; and second, by exposing women who survived the first stage to a hostile environment. Wikipedia, as well as other cyber-space environments, demonstrates the execution of the model, which results in the exclusion of women from the virtual sphere with all the implications thereof….(More)”.

Technology is making the world more unequal. Only technology can fix this


Cory Doctorow in The Guardian: “Here’s the good news: technology – specifically, networked technology – makes it easier for opposition movements to form and mobilise, even under conditions of surveillance, and to topple badly run, corrupt states.

Inequality creates instability, and not just because of the resentments the increasingly poor majority harbours against the increasingly rich minority. Everyone has a mix of good ideas and terrible ones, but for most of us, the harm from our terrible ideas is capped by our lack of political power and the checks that others – including the state – impose on us.

As rich people get richer, however, their wealth translates into political influence, and their ideas – especially their terrible ideas – take on outsized importance….

After all, there comes a point when the bill for guarding your wealth exceeds the cost of redistributing some of it, so you won’t need so many guards.

But that’s where technology comes in: surveillance technology makes guarding the elites much cheaper than it’s ever been. GCHQ and the NSA have managed to put the entire planet under continuous surveillance. Less technologically advanced countries can play along: Ethiopia was one of the world’s first “turnkey surveillance states”, a country with a manifestly terrible, looting elite class that has kept guillotines and firing squads at bay through buying in sophisticated spying technology from European suppliers, and using this to figure out which dissidents, opposition politicians and journalists represent a threat, so it can subject them to arbitrary detention, torture and, in some cases, execution….

That’s the bad news.

Now the good news: technology makes forming groups cheaper and easier than it’s ever been. Forming and coordinating groups is the hard problem of the human condition; the reason we have religions and corporations and criminal undergrounds and political parties. Doing work together means doing more than one person could do on their own, but it also means compromising, subjecting yourself to policies or orders from above. It’s costly and difficult, and the less money and time you have, the harder it is to form a group and mobilise it.

This is where networks shine. Modern insurgent groups substitute software for hierarchy, networks for bosses. They are able to come together without agreeing to a crisp agenda that you have to submit to in order to be part of the movement. When it costs less to form a group, it doesn’t matter so much that you aren’t all there for the same reason, and thus are doomed to fall apart. Even a small amount of work done together amounts to more than the tiny cost of admission…

The future is never so normal as we think it will be. The only sure thing about self-driving cars, for instance, is that whether or not they deliver fortunes to oligarchic transport barons, that’s not where it will end. Changing the way we travel has implications for mobility (both literal and social), the environment, surveillance, protest, sabotage, terrorism, parenting …

Long before the internet radically transformed the way we organise ourselves, theorists were predicting we’d use computers to achieve ambitious goals without traditional hierarchies – but it was a rare pundit who predicted that the first really successful example of this would be an operating system (GNU/Linux), and then an encyclopedia (Wikipedia).

The future will see a monotonic increase in the ambitions that loose-knit groups can achieve. My new novel, Walkaway, tries to signpost a territory in our future in which the catastrophes of the super-rich are transformed into something like triumphs by bohemian, anti-authoritarian “walkaways” who build housing and space programmes the way we make encyclopedias today: substituting (sometimes acrimonious) discussion and (sometimes vulnerable) networks for submission to the authority of the ruling elites….(More)”.

Mapping the invisible: Street View cars add air pollution sensors


Environment at Google: “There are 1.3 million miles of natural gas distribution pipelines in the U.S. These pipelines exist pretty much everywhere that people do, and when they leak, the escaping methane — the main ingredient in natural gas — is a potent greenhouse gas, with 84 times the short-term warming effect of carbon dioxide. These leaks can be time-consuming to identify and measure using existing technologies. Utilities are required by law to quickly fix any leaks that are deemed a safety threat, but thousands of others can — and often do — go on leaking for months or years.

To help gas utilities, regulators, and others understand the scale of the challenge and help prioritize the most cost-effective solutions, the Environmental Defense Fund (EDF) worked with Joe von Fischer, a scientist at Colorado State University, to develop technology to detect and measure methane concentrations from a moving vehicle. Initial tests were promising, and EDF decided to expand the effort to more locations.

That’s when the organization reached out to Google. The project needed to scale, and we had the infrastructure to make it happen: computing power, secure data storage, and, most important, a fleet of Street View cars. These vehicles, equipped with high-precision GPS, were already driving around pretty much everywhere, capturing 360-degree photos for Google Maps; maybe they could measure methane while they were at it. The hypothesis, says Karin Tuxen-Bettman of Google Earth Outreach, was that “we had the potential to turn our Street View fleet into an environmental sensing platform.”

Street View cars make at least 2 trips around a given area in order to capture good air quality data. An intake tube on the front bumper collects air samples, which are then processed by a methane analyzer in the trunk. Finally, the data is sent to the Google Cloud for analysis and integration into a map showing the size and location of methane leaks. Since the trial began in 2012, EDF has built methane maps for 11 cities and found more than 5,500 leaks. The results range from one leak for every mile driven (sorry, Bostonians) to one every 200 miles (congrats, Indianapolis, for replacing all those corrosive steel and iron pipes with plastic).

All of us can go on our smartphone and get the weather. But what if you could scroll down and see what the air quality is on the street where you’re walking?…

This promising start inspired the team to take the next step and explore using Street View cars to measure overall air quality. For years, Google has worked on measuring indoor environmental quality across company offices with Aclima, which builds environmental sensor networks. In 2014, we expanded the partnership to the outside world, equipping several more Street View cars with its ‘Environmental Intelligence’ (Ei) mobile platform, including scientific-grade analyzers and arrays of small-scale, low-cost sensors to measure pollutants, including particulate matter, NO2, CO2, black carbon, and more. The new project began with a pilot in Denver, and we’ll finish mapping cities in 3 regions of California by the end of 2016. And today the system is delivering reliable data that corresponds to the U.S. Environmental Protection Agency’s stationary measurement network….

The project began with a few cars, but Aclima’s mobile platform, which has already produced one of the world’s largest data sets on air quality, could also be expanded via deployment on vehicles like buses and mail trucks, on the way to creating a street-level pollution map. This hyper-local data could help people make more informed choices about things like when to let their kids play outside and which changes to advocate for to make their communities healthier….(More)”.

More professionalism, less populism: How voting makes us stupid, and what to do about it


Paper by Benjamin Wittes and Jonathan Rauch: “For several generations, political reform and rhetoric have been entirely one-directional: always more direct democracy, never less. The general belief holds that more public involvement will produce more representative and thus more effective and legitimate governance. But does increasing popular involvement in politics remedy the ills of our government culture? Is it the chicken soup of political reforms?

In a new report, “More professionalism, less populism: How voting makes us stupid, and what to do about it,” Brookings Senior Fellows Jonathan Rauch and Benjamin Wittes argue that the best way forward is to rebalance the reform agenda away from direct participation and toward intermediation and institutions. As the authors write, “Neither theory nor practice supports the idea that more participation will produce better policy outcomes, or will improve the public’s approbation of government, or is even attainable in an environment dominated by extreme partisans and narrow interest groups.”

Populism cannot solve our problems, Rauch and Wittes claim, because its core premises and reforms are self-defeating. Research has shown that voters are “irrationally biased and rationally ignorant,” and do not possess the specialized knowledge necessary to make complex policy judgments. Further, elections provide little by way of substantive guidance for policymakers and, even on its own terms, direct democracy is often unrepresentative. In the words of the authors, “By itself, building more direct input from the public into the functions of government is likely to lead to more fragmentation, more stalemate, more flawed policies—and, paradoxically, less effective representation.”

The authors are not advocating complacency about voter participation, much less for restricting or limiting voting: “We are arguing that participation is not enough, and that overinvesting in it neglects other, more promising paths.”

To truly repair American democracy, Rauch and Wittes endorse a resurgence of political institutions, such as political parties, and substantive professionals, such as career politicians and experts. Drawing on examples like the intelligence oversight community, the authors assert that these intermediaries actually make democracy more inclusive and more representative than direct participation can do by itself. “In complex policy spaces,” the authors write, “properly designed intermediary institutions can act more decisively and responsively on behalf of the public than an army of ‘the people’ could do on its own behalf, [and are] less likely to be paralyzed by factional disputes and distorted by special-interest manipulation.”…(More) (Read the full paper here).