How Digital Transparency Became a Force of Nature


Daniel C. Dennett and Deb Roy in Scientific American: “More than half a billion years ago a spectacularly creative burst of biological innovation called the Cambrian explosion occurred. In a geologic “instant” of several million years, organisms developed strikingly new body shapes, new organs, and new predation strategies and defenses against them. Evolutionary biologists disagree about what triggered this prodigious wave of novelty, but a particularly compelling hypothesis, advanced by University of Oxford zoologist Andrew Parker, is that light was the trigger. Parker proposes that around 543 million years ago, the chemistry of the shallow oceans and the atmosphere suddenly changed to become much more transparent. At the time, all animal life was confined to the oceans, and as soon as the daylight flooded in, eyesight became the best trick in the sea. As eyes rapidly evolved, so did the behaviors and equipment that responded to them.

Whereas before all perception was proximal — by contact or by sensed differences in chemical concentration or pressure waves — now animals could identify and track things at a distance. Predators could home in on their prey; prey could see the predators coming and take evasive action. Locomotion is a slow and stupid business until you have eyes to guide you, and eyes are useless if you cannot engage in locomotion, so perception and action evolved together in an arms race. This arms race drove much of the basic diversification of the tree of life we have today.

Parker’s hypothesis about the Cambrian explosion provides an excellent parallel for understanding a new, seemingly unrelated phenomenon: the spread of digital technology. Although advances in communications technology have transformed our world many times in the past — the invention of writing signaled the end of prehistory; the printing press sent waves of change through all the major institutions of society — digital technology could have a greater impact than anything that has come before. It will enhance the powers of some individuals and organizations while subverting the powers of others, creating both opportunities and risks that could scarcely have been imagined a generation ago.

Through social media, the Internet has put global-scale communications tools in the hands of individuals. A wild new frontier has burst open. Services such as YouTube, Facebook, Twitter, Tumblr, Instagram, WhatsApp and Snapchat generate new media on a par with the telephone or television — and the speed with which these media are emerging is truly disruptive. It took decades for engineers to develop and deploy telephone and television networks, so organizations had some time to adapt. Today a social-media service can be developed in weeks, and hundreds of millions of people can be using it within months. This intense pace of innovation gives organizations no time to adapt to one medium before the arrival of the next.

The tremendous change in our world triggered by this media inundation can be summed up in a word: transparency. We can now see further, faster, and more cheaply and easily than ever before — and we can be seen. And you and I can see that everyone can see what we see, in a recursive hall of mirrors of mutual knowledge that both enables and hobbles. The age-old game of hide-and-seek that has shaped all life on the planet has suddenly shifted its playing field, its equipment and its rules. The players who cannot adjust will not last long.

The impact on our organizations and institutions will be profound. Governments, armies, churches, universities, banks and companies all evolved to thrive in a relatively murky epistemological environment, in which most knowledge was local, secrets were easily kept, and individuals were, if not blind, myopic. When these organizations suddenly find themselves exposed to daylight, they quickly discover that they can no longer rely on old methods; they must respond to the new transparency or go extinct. Just as a living cell needs an effective membrane to protect its internal machinery from the vicissitudes of the outside world, so human organizations need a protective interface between their internal affairs and the public world, and the old interfaces are losing their effectiveness….(More at Medium)”

21st-Century Public Servants: Using Prizes and Challenges to Spur Innovation


Jenn Gustetic at the Open Government Initiative Blog: “Thousands of Federal employees across the government are using a variety of modern tools and techniques to deliver services more effectively and efficiently, and to solve problems that relate to the missions of their Agencies. These 21st-century public servants are accomplishing meaningful results by applying new tools and techniques to their programs and projects, such as prizes and challenges, citizen science and crowdsourcing, open data, and human-centered design.

Prizes and challenges have been a particularly popular tool at Federal agencies. With 397 prizes and challenges posted on challenge.gov since September 2010, there are hundreds of examples of the many different ways these tools can be designed for a variety of goals. For example:

  • NASA’s Mars Balance Mass Challenge: When NASA’s Curiosity rover plummeted through the Martian atmosphere and came to rest on the surface of Mars in 2012, about 300 kilograms of solid tungsten mass had to be jettisoned to ensure the spacecraft was in a safe orientation for landing. In an effort to seek creative concepts for small science and technology payloads that could potentially replace a portion of such jettisoned mass on future missions, NASA released the Mars Balance Mass Challenge. In only two months, over 200 concepts were submitted by over 2,100 individuals from 43 different countries for NASA to review. Proposed concepts ranged from small drones and 3D printers to radiation detectors and pre-positioning supplies for future human missions to the planet’s surface. NASA awarded the $20,000 prize to Ted Ground of Rising Star, Texas, for his idea to use the jettisoned payload to investigate the Mars atmosphere in a way similar to how NASA uses sounding rockets to study Earth’s atmosphere. This was the first time Ted worked with NASA, and NASA was impressed by the novelty and elegance of his proposal: a proposal that NASA likely would not have received through a traditional contract or grant because individuals, as opposed to organizations, are generally not eligible to participate in those types of competitions.
  • National Institutes of Health (NIH) Breast Cancer Startup Challenge (BCSC): The primary goals of the BCSC were to accelerate the process of bringing emerging breast cancer technologies to market, and to stimulate the creation of start-up businesses around nine federally conceived and owned inventions, and one invention from an Avon Foundation for Women portfolio grantee.  While NIH has the capacity to enable collaborative research or to license technology to existing businesses, many technologies are at an early stage and are ideally suited for licensing by startup companies to further develop them into commercial products. This challenge established 11 new startups that have the potential to create new jobs and help promising NIH cancer inventions support the fight against breast cancer. The BCSC turned the traditional business plan competition model on its head to create a new channel to license inventions by crowdsourcing talent to create new startups.

These two examples of challenges are very different in terms of their purpose and the process used to design and implement them. The success they have demonstrated shouldn’t be taken for granted. It takes access to resources (both information and people), mentoring, and practical experience to understand both how to identify opportunities for innovation tools, like prizes and challenges, and how to use them to achieve a desired outcome….

Last month, the Challenge.gov program at the General Services Administration (GSA), the Office of Personnel Management (OPM)’s Innovation Lab, the White House Office of Science and Technology Policy (OSTP), and a core team of Federal leaders in the prize-practitioner community began collaborating with the Federal Community of Practice for Challenges and Prizes to develop the other half of the open innovation toolkit, the prizes and challenges toolkit. In developing this toolkit, OSTP and GSA are thinking not only about the information and process resources that would be helpful to empower 21st-century public servants using these tools, but also how we help connect these people to one another to add another meaningful layer to the learning environment…..

An inventory of skills and knowledge across the 600-person (and growing!) Federal community of practice in prizes and challenges will likely be an important resource in support of a useful toolkit. Prize design and implementation can involve tricky questions, such as:

  • Do I have the authority to conduct a prize or challenge?
  • How should I approach problem definition and prize design?
  • Can agencies own solutions that come out of challenges?
  • How should I engage the public in developing a prize concept or rules?
  • What types of incentives work best to motivate participation in challenges?
  • What legal requirements apply to my prize competition?
  • Can non-Federal employees be included as judges for my prizes?
  • How objective do the judging criteria need to be?
  • Can I partner to conduct a challenge? What’s the right agreement to use in a partnership?
  • Who can win prize money and who is eligible to compete? …(More)

Data Science and Ebola


Inaugural Lecture by Aske Plaat on the acceptance of the position of professor of Data Science at the Universiteit Leiden: “…Today, everybody and everything produces data. People produce large amounts of data in social networks and in commercial transactions. Medical, corporate, and government databases continue to grow. Ten years ago there were a billion Internet users. Now there are more than three billion, most of whom are mobile. Sensors continue to get cheaper and are increasingly connected, creating an Internet of Things. The next three billion users of the Internet will not all be human, and will generate a large amount of data. In every discipline, large, diverse, and rich data sets are emerging, from astrophysics, to the life sciences, to medicine, to the behavioral sciences, to finance and commerce, to the humanities and to the arts. In every discipline people want to organize, analyze, optimize and understand their data to answer questions and to deepen insights. The availability of so much data and the ability to interpret it are changing the way the world operates. The number of sciences using this approach is increasing. The science that is transforming this ocean of data into a sea of knowledge is called data science. In many sciences the impact on the research methodology is profound—some even call it a paradigm shift.

…I will address the question of why there is so much interest in data. I will answer this question by discussing one of the most visible recent challenges to public health: the 2014 Ebola outbreak in West Africa…(More)”

Chinese Air Quality and Social Media


David C. Roberts at Quartz: “Every year, outdoor air pollution kills more people worldwide than malaria and HIV combined. People in China, particularly in its largest cities, are some of the most affected, since the country’s rapid economic growth has come at the cost of air quality. This issue remained largely unaddressed until the US embassy in Beijing began to tweet out air quality data in 2008, providing a remarkable demonstration of the transformative power of democratizing data. The tweets sparked an energetic environmental movement that forced China’s leaders to acknowledge the massive scale of the problem and begin to take measures to combat it.

The initiative to publicize air quality data was subsequently expanded to US consulates in several major Chinese cities, providing a wealth of new scientific data.  I recently worked with Federico San Martini and Christa Hasenkopf (both atmospheric scientists at the US State Department who are involved in this program) to analyze this data…(More)”

User Experience is a Social Justice Issue


Sumana Harihareswara at code4lib: “…Before I worked in open source, I worked in customer service. I saw first-hand how design flaws (in architecture, signage, and websites) could frustrate and drive away customers and make more work for me. Every time I participated in an open source project — AltLaw, GNOME, MediaWiki, and more — I brought that experience with me. I found it particularly striking that small changes on Wikipedia could cause large changes in user behavior, as I discuss in this essay, which is adapted from my keynote speech.
This issue goes beyond software, as I explain with the healthcare and banking examples. The spark that caused me to write the speech was reading Professor Lisa J. Servon’s piece in The Atlantic about the usability of storefront check cashing services; I saw a pattern where poor user experience repels people from crucial and empowering services, and decided, in a flash of anger and inspiration, to write “User Experience is a Human Rights Issue.”…

The Last Mile Problem

The largest hurdles we as technologists face are choosing to make the right things in the first place and choosing to make them usable. In the 1990s, telecommunications companies laid down a lot of fiber to connect big hubs to one another, but often it took years to connect those hubs to the actual houses and schools and shops and offices, because it was expensive, or because companies were not creative enough to do it well. This is called the “last mile problem,” and I think usability has a similar problem. We have to be creative and disciplined enough to actually provide services in a way that people can use them.
When we’re building services for people, we often have a lot more practice seeing things from the computer’s point of view or from the data’s point of view than from another person’s point of view. In tech, we understand how to build arteries better than we understand how to build capillaries. Personally, I think capillaries are more interesting than arteries. Maybe it’s just personal temperament, but I like all the little surprising details of how people end up experiencing the ripple effects of big new systems, and how users actually interact with the user interface of a service, especially ones that we don’t really think of as having a user interface. Like taxes, or healthcare, or hotels. All these big systems end in little capillaries, where people exchange information or get healed or get whatever they need. And when those capillaries aren’t working correctly, then those people just don’t get what they need. The hubs are connected to each other, but people aren’t connected to the hubs.
Over and over, in lots of different fields, we see that bad usability makes a huge difference. When choosing between two services, people will make very different choices, depending on which service actually seems designed around the user’s needs….(More)”

Beta Release of the NETmundial Solutions Map


“…the GovLab is pleased to announce the beta release of the NETmundial Solutions Map for further public comment (from April 1 -May 1, 2015). The release is the culmination of a 6-month engagement and development strategy to ensure that the tool reflects input from a diverse set of global stakeholders. The NETmundial Solutions Map is co-developed by the GovLab and Second Rise, and is facilitated by the Internet Corporation for Assigned Names and Numbers (ICANN).

The tool seeks to support innovation in global governance toward a more distributed Internet Governance approach. It is designed to enable information sharing and collaboration across Internet governance issues. It will serve as a repository of information that links issues, actors, solutions and resources, and help users understand the current landscape of Internet governance.

Today, information about internet governance is scattered and hard to find. At the same time we need more coordination and collaboration to address specific issues. The Map seeks to facilitate a more collaborative and distributed way of solving Internet governance issues by providing users with a baseline of what responses already exist and who is working on what — Stefaan Verhulst, Co-Founder and Chief of Research and Development of the GovLab.

…This beta version of the NETmundial Solutions Map seeks to explore how to map the Internet governance landscape in a useful and sustainable way. Future revisions will continue to be guided by community feedback.

To this end, we welcome your comments on the following (period runs till May 1st):

  • What do you feel works well in the map?
  • What needs improving?
  • How can the map help you in your work?
  • Would you want to be part of the next version as a content provider?”

Thinking Ahead – Essays on Big Data, Digital Revolution, and Participatory Market Society


New book by Dirk Helbing: “The rapidly progressing digital revolution is now touching the foundations of the governance of societal structures. Humans are on the verge of evolving from consumers to prosumers, and old, entrenched theories – in particular sociological and economic ones – are falling prey to these rapid developments. The original assumptions on which they are based are being questioned. Each year we produce as much data as in the entire human history – can we possibly create a global crystal ball to predict our future and to optimally govern our world? Do we need wide-scale surveillance to understand and manage the increasingly complex systems we are constructing, or would bottom-up approaches such as self-regulating systems be a better solution to creating a more innovative, more successful, more resilient, and ultimately happier society? Working at the interface of complexity theory, quantitative sociology and Big Data-driven risk and knowledge management, the author advocates the establishment of new participatory systems in our digital society to enhance coordination, reduce conflict and, above all, reduce the “tragedies of the commons,” resulting from the methods now used in political, economic and management decision-making….(More)”

Inspiring and Informing Citizens Online: A Media Richness Analysis of Varied Civic Education Modalities


Paper by David Brinker, John Gastil, and Robert C. Richards in the Journal of Computer-Mediated Communication (forthcoming): “Public deliberation on the Internet is a promising but unproven practice. Online deliberation can engage large numbers of citizens at relatively low cost, but it is unclear whether such programs have substantial civic impact. One factor in determining their effectiveness may be the communicative features of the online setting in which they occur. Within a Media Richness Theory framework, we conducted a quasi-experiment to assess the civic outcomes of interventions executed online by non-profit organizations prior to the 2012 U.S. presidential election. The results assess the impact of these interventions on issue knowledge and civic attitudes. Comparisons of the interventions illustrate the importance of considering media richness online, and our discussion considers the theoretical and practical implications of these findings….(More)”

Solving the obesity crisis: knowledge, nudge or nanny?


BioMedCentral Blog: ” The 5th Annual Oxford London Lecture (17 March 2015) was delivered by Professor Susan Jebb from Oxford University. The presentation was titled: ‘Knowledge, nudge and nanny: Opportunities to improve the nation’s diet’. In this guest blog Dr Helen Walls, Research Fellow at the London School of Hygiene and Tropical Medicine, covers key themes from this presentation.

“Obesity and related non-communicable diseases such as diabetes, heart disease and cancer pose a significant health, social and economic burden in countries worldwide, including the United Kingdom. Whilst the need for action is clear, the nutrition policy response is a highly controversial topic. Professor Jebb raised the question of how best to achieve dietary change: through ‘knowledge, nudge or nanny’?

Education regarding healthy nutrition is an important strategy, but insufficient. People are notoriously bad at putting their knowledge to work. The inclination to overemphasise the importance of knowledge, whilst ignoring the influence of environmental factors on human behaviours, is termed the ‘fundamental attribution error’. Education may also contribute to widening inequities.

Our choices are strongly shaped by the environments in which we live. So if ‘knowledge’ is not enough, what sort of interventions are appropriate? This raises questions regarding individual choice and the role of government. Here, Professor Jebb introduced the Nuffield Intervention Ladder.

[Figure: the Nuffield Intervention Ladder. Source: Nuffield Council on Bioethics, Public Health: Ethical Issues, London, 2007.]

The Nuffield Intervention Ladder, or what I will refer to as ‘the ladder’, describes intervention types from least to most intrusive on personal choice. In addressing diets and obesity, Professor Jebb believes we need a range of policy types, across the range of rungs on the ladder.

Less intrusive measures on the ladder could include provision of information about healthy and unhealthy foods, and provision of nutritional information on products (which helps knowledge be put into action). More effective than labelling is the signposting of healthier choices.

Taking a few steps up the ladder brings in ‘nudge’, a concept from behavioural economics. A nudge is any aspect of the choice architecture that alters people’s behaviour in a predictable way without forbidding options or significantly changing economic incentives. Nudges are not mandates. Putting fruit at eye level counts as a nudge. Banning junk food does not.

The in-store environment has a huge influence over our choices, and many nudge options would fit here. For example, gondola-end (end of aisle) promotions create a huge uplift in sales. Removing unhealthy products from this position could make a considerable difference to the contents of supermarket baskets.

Nudge could be used to help people make better nutritional choices, but it is also unlikely to be enough. We celebrate the achievement we have made with tobacco control policies and smoking reduction. There, we used a range of intervention types, including many legislative measures – the ‘nanny’ aspect of the title of this presentation….(More)”

Modern Methods for Sentiment Analysis


Review by Michael Czerny: “Sentiment analysis is a common application of Natural Language Processing (NLP) methodologies, particularly classification, whose goal is to extract the emotional content in text. In this way, sentiment analysis can be seen as a method to quantify qualitative data with some sentiment score. While sentiment is largely subjective, sentiment quantification has enjoyed many useful implementations, such as businesses gaining understanding about consumer reactions to a product, or detecting hateful speech in online comments.

The simplest form of sentiment analysis is to use a dictionary of good and bad words. Each word in a sentence has a score, typically +1 for positive sentiment and -1 for negative. Then, we simply add up the scores of all the words in the sentence to get a final sentiment total. Clearly, this has many limitations, the most important being that it neglects context and surrounding words. For example, in our simple model the phrase “not good” may be classified as 0 sentiment, given “not” has a score of -1 and “good” a score of +1. A human would likely classify “not good” as negative, despite the presence of “good”.
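
To make the flaw concrete, here is a minimal sketch of the dictionary approach in Python; the word list and the lexicon_score helper are illustrative assumptions, not code from the review:

```python
# Toy sentiment lexicon: +1 for positive words, -1 for negative ones.
# (Illustrative only; real lexicons contain thousands of entries.)
LEXICON = {"good": 1, "great": 1, "excellent": 1,
           "bad": -1, "terrible": -1, "not": -1}

def lexicon_score(text):
    """Sum the scores of all known words; unknown words contribute 0."""
    return sum(LEXICON.get(word, 0) for word in text.lower().split())

print(lexicon_score("not good"))  # 0: "not" (-1) cancels "good" (+1)
print(lexicon_score("terrible"))  # -1: correctly negative
```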

Another common method is to treat a text as a “bag of words”. We treat each text as a 1 by N vector, where N is the size of our vocabulary. Each column is a word, and the value is the number of times that word appears. For example, the phrase “bag of bag of words” might be encoded as [2, 2, 1]. This could then be fed into a machine learning algorithm for classification, such as logistic regression or SVM, to predict sentiment on unseen data. Note that this requires data with known sentiment to train on in a supervised fashion. While this is an improvement over the previous method, it still ignores context, and the size of the data increases with the size of the vocabulary.
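
A rough sketch of this pipeline is below; the toy corpus, the labels, and the use of scikit-learn are assumptions for illustration rather than the review’s own code:

```python
from collections import Counter

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Counting "bag of bag of words" over the vocabulary (bag, of, words)
# reproduces the [2, 2, 1] encoding from the text.
vocab = ["bag", "of", "words"]
counts = Counter("bag of bag of words".split())
print([counts[w] for w in vocab])  # [2, 2, 1]

# Feeding such count vectors into a classifier requires texts with
# known sentiment labels to train on (supervised learning).
train_texts = ["good movie", "great film", "bad movie", "terrible film"]
train_labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

vectorizer = CountVectorizer()             # builds vocabulary, counts words
X = vectorizer.fit_transform(train_texts)  # sparse document-term matrix
clf = LogisticRegression().fit(X, train_labels)

print(clf.predict(vectorizer.transform(["great movie"])))  # expected: [1]
```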

Word2Vec and Doc2Vec

Recently, Google developed a method called Word2Vec that captures the context of words, while at the same time reducing the size of the data. Word2Vec is actually two different methods: Continuous Bag of Words (CBOW) and Skip-gram. In the CBOW method, the goal is to predict a word given the surrounding words. Skip-gram is the converse: we want to predict a window of words given a single word (see Figure 1). Both methods use artificial neural networks as their classification algorithm. Initially, each word in the vocabulary is a random N-dimensional vector. During training, the algorithm learns the optimal vector for each word using the CBOW or Skip-gram method….(More)
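
As a minimal sketch of how such a model is trained in practice, the open-source gensim library exposes both variants through a single class; gensim is an assumption of this example (the excerpt names no toolkit), and the parameter names follow recent gensim releases:

```python
from gensim.models import Word2Vec

# Each training sentence is a list of tokens; a useful model needs
# millions of words, so this tiny corpus is purely illustrative.
sentences = [
    ["the", "movie", "was", "good"],
    ["the", "film", "was", "great"],
    ["the", "movie", "was", "terrible"],
]

# sg=0 selects CBOW (predict a word from its surrounding context);
# sg=1 selects Skip-gram (predict the context from a single word).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)

vector = model.wv["movie"]  # the learned 50-dimensional vector for "movie"
print(model.wv.most_similar("movie", topn=2))
```

Each word starts as a random vector and is adjusted during training, exactly as the review describes; after training, words that appear in similar contexts end up with nearby vectors.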