We Need To Innovate The Science Business Model


Greg Satell at Forbes: “In 1945, Vannevar Bush, the man who led the nation’s scientific efforts during World War II, delivered a proposal to President Truman for funding scientific research in the post-war world. Titled Science, The Endless Frontier, it led to the formation of the NSF, NIH, DARPA and other agencies….
One assumption inherent in Bush’s proposal was that institutions would be at the center of scientific life. Scientists from disparate labs could read each other’s papers and meet at an occasional conference, but for the most part, they would be dependent on the network of researchers within their organization and those close by.
Sometimes the interplay between institutions had major, even historic, impacts, such as John von Neumann’s sponsorship of Alan Turing, but for the most part the work you did was a function of where you did it. The proximity of Watson, Crick, Rosalind Franklin and Maurice Wilkins, for example, played a major role in the discovery of the structure of DNA.
Yet today, digital technology is changing not only the speed and ease of how we communicate, but the very nature of how we are able to collaborate.  When I spoke to Jonathan Adams, Chief Scientist at Digital Science, which develops and invests in software that makes science more efficient, he noted that there is a generational shift underway and said this:

When you talk to people like me, we’re established scientists who are still stuck in the old system of institutions and conferences.  But the younger scientists are using technology to access networks and they do so on an ongoing, rather than a punctuated basis.  Today, you don’t have to go to a conference or write a paper to exchange ideas.

Evidence would seem to bear this out. The prestigious journal Nature recently noted that the average scientific paper has four times as many authors as it did in the 1950s, when Bush’s career was at its height. Moreover, it’s become common for co-authors to work at far-flung institutions. Scientific practice needs to adapt to this new reality.
There has been some progress in this area. The Internet, in fact, was created for the explicit purpose of scientific collaboration. Yet the way in which scientists report and share their findings remains much the same as it was a century ago.
Moving From Publications To Platforms For Discovery
One especially ripe area for innovation is publishing.  Typically, a researcher with a new discovery waits six months to a year for the peer review process to run its course before the work can be published.  Even then, many of the results are questionable at best.  Nature recently reported that the overwhelming majority of studies can’t be replicated…(More)”

Using Flash Crowds to Automatically Detect Earthquakes & Impact Before Anyone Else


Patrick Meier at iRevolutions: “It is said that our planet has a new nervous system: a digital nervous system comprised of digital veins and intertwined sensors that capture the pulse of our planet in near real-time. Next-generation humanitarian technologies seek to leverage this new nervous system to detect and diagnose the impact of disasters within minutes rather than hours. To this end, LastQuake may be one of the most impressive humanitarian technologies that I have recently come across. Spearheaded by the European-Mediterranean Seismological Center (EMSC), the technology combines “Flashsourcing” with social media monitoring to auto-detect earthquakes before they’re picked up by seismometers or anyone else.


Scientists typically draw on ground-motion prediction algorithms and data on building infrastructure to rapidly assess an earthquake’s potential impact. Alas, ground-motion predictions vary significantly, and infrastructure data are rarely available at sufficient resolution to accurately assess the impact of earthquakes. Moreover, a minimum of three seismometers is needed to calibrate a quake, and that seismic data takes several minutes to generate. This explains why the EMSC uses human sensors to rapidly collect relevant data on earthquakes: they reduce the uncertainties that come with traditional rapid impact assessment methodologies. The Center’s important work clearly demonstrates how the Internet, coupled with social media, is “creating new potential for rapid and massive public involvement by both active and passive means” vis-à-vis earthquake detection and impact assessments. Indeed, the EMSC can automatically detect new quakes within 80-90 seconds of their occurrence while simultaneously publishing tweets with preliminary information on said quakes, like this one:

[Screenshot: an EMSC tweet with preliminary information on a newly detected quake]

In reality, the first human sensors (increases in web traffic) can be detected within 15 seconds (!) of a quake…(More)
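The “flashsourcing” signal here is simply a sudden surge of visits to the EMSC’s earthquake-information site from people who just felt shaking. As a rough illustration of the idea (not the EMSC’s actual algorithm; the window size and threshold below are invented for this sketch), spike detection over per-second hit counts might look like this:

```python
from collections import deque

def detect_spikes(hit_counts, window=60, threshold=5.0):
    """Yield (timestamp, hits, baseline) whenever per-second site
    visits jump far above the recent rolling average.

    hit_counts: iterable of (timestamp, visits_per_second) pairs,
    e.g. derived from web server logs.
    """
    history = deque(maxlen=window)
    for ts, visits in hit_counts:
        if len(history) == window:
            baseline = sum(history) / window
            # People who just felt shaking flood the site within
            # seconds, long before seismic processing completes.
            if baseline > 0 and visits > threshold * baseline:
                yield ts, visits, baseline
        history.append(visits)

if __name__ == "__main__":
    import random
    calm = [(t, random.randint(8, 12)) for t in range(120)]
    quake = [(t, 200) for t in range(120, 125)]  # surge of worried visitors
    for ts, hits, base in detect_spikes(calm + quake):
        print(f"t={ts}s: {hits} hits/s vs. baseline {base:.1f}")
```

In the EMSC’s approach, the geolocation of those surging visitors also gives an early hint of the felt area, well before any seismic solution exists.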

City Governments Are Using Yelp to Tell You Where Not to Eat


Michael Luca and Luther Lowe at HBR Blog: “…in recent years consumer-feedback platforms like TripAdvisor, Foursquare, and Chowhound have transformed the restaurant industry (as well as the hospitality industry), becoming important guides for consumers. Yelp has amassed about 67 million reviews in the last decade. So it’s logical to think that these platforms could transform hygiene awareness too — after all, people who contribute to review sites focus on some of the same things inspectors look for.

It turns out that one way user reviews can transform hygiene awareness is by helping health departments better utilize their resources. The deployment of inspectors is usually fairly random, which means time is often wasted on spot checks at clean, rule-abiding restaurants. Social media can help narrow the search for violators.
Within a given city or area, it’s possible to merge the entire history of Yelp reviews and ratings — some of which contain telltale words or phrases such as “dirty” and “made me sick” — with the history of hygiene violations and feed them into an algorithm that can predict the likelihood of finding problems at reviewed restaurants. Thus inspectors can be allocated more efficiently.
In San Francisco, for example, we broke restaurants into the top half and bottom half of hygiene scores. In a recent paper, one of us (Michael Luca, with coauthor Yejin Choi and her graduate students) showed that we could correctly classify more than 80% of restaurants into these two buckets using only Yelp text and ratings. In the next month, we plan to hold a contest on DrivenData to get even better algorithms to help cities out (we are jointly running the contest). Similar algorithms could be applied in any city and in other sorts of prediction tasks.
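For intuition, here is a minimal sketch of that kind of two-bucket classifier. The specific features and model are my assumptions (TF-IDF over review text plus logistic regression), not necessarily what Luca and Choi used, and the data loading is left hypothetical:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def make_hygiene_model():
    """Bag-of-words sketch: unigrams plus bigrams, so telltale
    phrases like "made me sick" become features."""
    return make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=5, stop_words="english"),
        LogisticRegression(max_iter=1000),
    )

def evaluate(review_texts, labels):
    """review_texts: one concatenated string of Yelp reviews per
    restaurant (hypothetical input); labels: 1 if the restaurant is
    in the bottom half of hygiene scores, else 0."""
    return cross_val_score(make_hygiene_model(), review_texts, labels, cv=5).mean()
```

The published result also used star ratings; appending them as a numeric feature alongside the TF-IDF matrix would be a straightforward extension of this sketch.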
Another means for transforming hygiene awareness is through the sharing of health-department data with online review sites. The logic is simple: Diners should be informed about violations before they decide on a destination, rather than after.
Over the past two years, we have been working with cities to help them share inspection data with Yelp through an open-data standard that Yelp created in 2012 to encourage officials to put their information in places that are more useful to consumers. In San Francisco, Los Angeles, Raleigh, and Louisville, Kentucky, customers now see hygiene data alongside Yelp reviews. There’s evidence that users are starting to pay attention to this data — click-through rates are similar to those for other features on Yelp ….

And there’s no reason this type of data sharing should be limited to restaurant-inspection reports. Why not disclose data about dentists’ quality and regulatory compliance via Yelp? Why not use data from TripAdvisor to help spot bedbugs? Why not use Twitter to understand what citizens are concerned about, and what cities can do about it? Uses of social media data for policy, and widespread dissemination of official data through social media, have the potential to become important means of public accountability. (More)

Making City Hall Leaner


Nigel Jacob at Governing: “…How do we create services that people actually want to use?
The first change is to start thinking about these services as products. What’s the difference? Well, this is where we can learn something from startups. Products are the tools that we build to deliver value to our users.
Products are typically managed by one or more product managers who watch very carefully how users interact with the product so the startup can determine which features to keep and which to toss. Contrast this with traditional government services, which are developed at some point to solve a particular problem but are typically not monitored in a way that reveals whether they are actually adding value; as a result, they quickly fall out of sync with the needs of the people they serve.
Consider government websites that allow people to access their benefits. These sites are typically clunky to use and hard to navigate. This isn’t a small issue. It can be the difference between people getting and not getting the resources they need to survive.
Case in point: CalFresh.
These are services.

Compare this to a site such as Balance, which was designed by watching how people use the CalFresh site, talking to those users about how they would like to access their benefits, and then building a tool that actually responds to their needs. This is a product.
So, we need to be thinking about not only what government is building (in terms of tools), but also how it builds them.
The approach to building high-value products used by startups (and other orgs looking to build better products) is called Agile.
There are many flavors of Agile, each with its own strengths and weaknesses. One of the more recent Agile methodologies to have garnered support in the startup community is Lean, developed by Eric Ries in his book “The Lean Startup.”
Now, a word of caution. Any methodology that is used outside of the context in which it was intended runs the risk of simply not working. However, at its core, Lean is about learning what works and what doesn’t, so I’ll focus on the central elements of Lean since they have much to teach those of us who are working to overhaul local government about how to create value….(More)”

Tired of Being Profiled, a Programmer Turns to Crowdsourcing Cop Reviews


Christopher Moraff at Next City: “…despite the fact that policing is arguably one of the most important and powerful service professions a civilized society can produce, it’s far easier to find out if the plumber you just hired broke someone’s pipe while fixing their toilet than it is to find out if the cop patrolling your neighborhood broke someone’s head while arresting them.
A 31-year-old computer programmer has set out to fix that glitch with a new web-based (and soon to be mobile) crowdsourced rating tool called CopScore that is designed to help communities distinguish police officers who are worthy of praise from those who are not fit to wear the uniform….
CopScore is a work in progress, and, for the time being at least, a one-man show. Hardison does all the coding himself, often working through the night to bring new features online.
Currently in the very early beta stage, the platform works by consolidating information on the service records of individual police officers together with details of their interactions with constituents. The searchable platform includes data gleaned from public sources — such as social media and news articles — cross-referenced with Yelp-style ratings from citizens.

For Hardison, CopScore is as much a personal endeavor as it is a professional one. He says his youthful interest in computer programming — which he took up as a misbehaving fifth-grader under the guiding hand of a concerned teacher — made him the butt of the occasional joke in the predominantly African-American community of North Nashville where he grew up….”(More)

Making emotive games from open data


Katie Collins at WIRED: “Microsoft researcher Kati London’s aim is “to try to get people to think of data in terms of personalities, relationships and emotions”, she tells the audience at the Story Festival in London. Through Project Sentient Data, she uses her background in games development to create fun but meaningful experiences that bridge online interactions and things that are happening in the real world.
One such experience invited children to play against the real-time flow of London traffic through an online game called the Code of Everand. The aim was to test the road-safety knowledge of 9- to 11-year-olds and “make alertness something that kids valued”.
The game’s core mechanic was a normal world populated by little people, containing spirit channels that only kids could see and pass through. Within these spirit channels, everyday lorries and cars from the streets became monsters. The children had to assess what kind of dangers the monsters posed and use their tools to dispel them.
“Games are great ways to blur and observe the ways people interact with real-world data,” says London.
In one of her earlier projects, back in 2005, London used her knowledge of horticulture to bring artificial intelligence to plants. “Almost every workspace I go into has a half-dead plant in it, so we gave plants the ability to tell us what they need.” It was, she says, an exercise in “humanising data” that led to further projects that saw her create self-aware street signs and a dynamic city map that expressed shame neighbourhood by neighbourhood, based on the open dataset of public complaints in New York.
A further project turned complaint data into cartoons on Instagram every week. London praised New York’s open data initiative, but added that for people to access the data, they had to know it existed and where to find it. The cartoons were a “lightweight” form of “civic engagement” that helped integrate hyperlocal issues into everyday conversation.
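The raw material for projects like these is public: New York’s complaint data is queryable through the city’s Socrata open-data API. As a hedged sketch (the dataset id and field names below are assumptions to verify on data.cityofnewyork.us before use), a neighbourhood-level tally might start like this:

```python
from collections import Counter

import requests

# Assumed Socrata endpoint for NYC 311 Service Requests; verify the
# dataset id and field names on data.cityofnewyork.us before relying
# on them.
URL = "https://data.cityofnewyork.us/resource/erm2-nwe9.json"

def complaints_by_borough(limit=5000):
    """Tally recent 311 complaints per borough."""
    rows = requests.get(
        URL,
        params={"$select": "borough,complaint_type", "$limit": limit},
        timeout=30,
    ).json()
    return Counter(row.get("borough", "UNKNOWN") for row in rows).most_common()

if __name__ == "__main__":
    for borough, count in complaints_by_borough():
        print(f"{borough}: {count} complaints")
```

Grouping by a finer-grained field (ZIP code or community board, where the dataset provides one) is what turns a citywide count into the neighbourhood-by-neighbourhood signal London’s map visualised.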
London also gamified community engagement through a project commissioned by the Knight Foundation called Macon Money….(More)”.

Beyond Transparency


Hildy Gottlieb on “How ‘opening up’ can help organizations achieve their missions” in Stanford Social Innovation Review: “…For the past two years, Creating the Future, a social change research and development laboratory, has been experimenting to find the answer to that question. In the process, we have learned that when organizations are more open in their work, it can improve both the work itself and the results in the communities they serve.
In December 2012, Creating the Future’s board voted to open all its board and strategy meetings (including meetings for branding, resource development, and programming) to anyone who wished to attend and participate.
Since our organization is global, we hold our meetings via Google Hangout, and community members participate via a dedicated Twitter hashtag. Everyone is encouraged to participate—through asking questions and sharing observations—as if they are board members, whether or not they are.
This online openness mirrors the kind of inclusive, participatory culture that many grassroots neighborhood groups have fostered in the “real world” for decades. As we’ve studied those groups and experienced open engagement for ourselves, here are some of the things we’ve learned that can apply to any organization, whether they are working at a distance or in person.

What Being Open Makes Possible

1.  Being open adds new thinking to the mix. We can’t overstate this obvious practical benefit for every strategic issue an organization considers. During a recent discussion of employee “paid time off” policies, a participant with no formal relationship to the organization powerfully shifted the board’s conversation and perspectives away from the rigidity of a policy, focusing instead on the values of relationships, outcomes, buy-in, and adaptability. That input helped the board clarify its intent. It ultimately chose to scrap the idea of a certain amount of “paid time off,” in favor of an outcomes-based approach that provides flexibility for both employees and their supervisors.
2. Being open flattens internal communications. Opening all our meetings has led to cross-pollination across every aspect of our organization, providing an ongoing opportunity for sharing information and resources, and for developing everyone’s potential as leaders….
3. Being open walks the talk of the engaged communities we want to see. From the moment we opened the doors to our meetings, people have walked in and found meaningful ways to become part of our work. …
It seems so simple: If we want to engage the community, we just need to open the doors and invite people in!
4. Being open creates meaningful inclusion. Board diversity initiatives are intended to ensure that an organization’s decision-making reflects the experience of the community it serves. In reality, though, there can never be enough seats on a board to accomplish inclusion beyond what often feels like tokenism. Creating the Future’s board doesn’t have to worry about representing the community, because our community members represent themselves. And while this is powerful in an online setting, it is even more powerful when on-the-ground community members are part of a community-based organization’s decision-making fabric.
5. Being open creates more inclusive accountability. During a discussion of cash flow for our young organization, one concerned board member wondered aloud whether adhering to our values might be at cross-purposes with our survival. Our community members went wild via Twitter, expressing that it was that very code of values that drew them to the work in the first place. That reminder helped board members remove scarcity and fear from the conversation so that they could base their decision on what would align with our values and help accomplish the mission.
The needs of our community directly impacted that decision—not because of a bylaws requirement for “voting members” but simply because we encouraged community members to actively take part in the conversation….(More)”

Data for good


NESTA: “This report explores how capturing, sharing and analysing data in new ways can transform how charities work and how social action happens.

Key Findings

  • Citizens Advice (CAB) and DataKind partnered to develop the Civic Dashboard, a tool that mines data from CAB consultations to understand emerging social issues in the UK.
  • Shooting Star Chase volunteers streamlined the referral paths by which children come to be at its hospices, refining a referral system in a way that could save children’s hospices around the country up to £90,000.
  • In a study of open grant-funding data, NCVO identified 33,000 ‘below the radar’ organisations not currently captured in registers and databases on the third sector.
  • In their social media analysis of tweets related to the Somerset floods, Demos found that 39,000 tweets were related to social action.

New ways of capturing, sharing and analysing data have the potential to transform how community and voluntary sector organisations work and how social action happens. However, while analysing and using data is core to how some of the world’s fastest-growing businesses understand their customers and develop new products and services, civil society organisations are still some way off from making the most of this potential.
Over the last 12 months Nesta has grant-funded a number of research projects that explore two dimensions of how big and open data can be used for the common good: firstly, how it can be used by charities to develop better products and services; and secondly, how it can help those interested in civil society better understand social action and civil society activity.

  • Citizens Advice Bureau (CAB) and DataKind, a global community of data scientists interested in how data can be used for social purposes, were grant-funded to explore how a data-driven approach to mining the rich data that CAB holds on social issues in the UK could be used to develop a real-time dashboard for identifying emerging social issues. The project also explored how data-driven methods could better help other charities such as St Mungo’s and Buttle UK, and how data could be shared more effectively between charities as part of this process, to create collaborative data-driven projects.
  • Five organisations (the RSA, Cardiff University, the Demos Centre for Analysis of Social Media, NCVO and European Alternatives) were grant-funded to explore how data-driven methods, such as open data analysis and social media analysis, can help us understand informal social action, often referred to as ‘below the radar’ activity, in new ways.

This paper is not the definitive story of the opportunities in using big and open data for the common good, but it can hopefully provide insight into what can be done and lessons for others interested in exploring these methods….(More).”

Unleashing the Power of Data to Serve the American People


Memorandum: Unleashing the Power of Data to Serve the American People
To: The American People
From: Dr. DJ Patil, Deputy U.S. CTO for Data Policy and Chief Data Scientist

….While there is a rich history of companies using data to their competitive advantage, the disproportionate beneficiaries of big data and data science have been Internet technologies like social media, search, and e-commerce. Yet transformative uses of data in other spheres are just around the corner. Precision medicine and other forms of smarter health care delivery, individualized education, and the “Internet of Things” (which refers to devices like cars or thermostats communicating with each other using embedded sensors linked through wired and wireless networks) are just a few of the ways in which innovative data science applications will transform our future.

The Obama administration has embraced the use of data to improve the operation of the U.S. government and the interactions that people have with it. On May 9, 2013, President Obama signed Executive Order 13642, which made open and machine-readable data the new default for government information. Over the past few years, the Administration has launched a number of Open Data Initiatives aimed at scaling up open data efforts across the government, helping make troves of valuable data — data that taxpayers have already paid for — easily accessible to anyone. In fact, I used data made available by the National Oceanic and Atmospheric Administration to improve numerical methods of weather forecasting as part of my doctoral work. So I know firsthand just how valuable this data can be — it helped get me through school!

Given the substantial benefits that responsibly and creatively deployed data can provide to us and our nation, it is essential that we work together to push the frontiers of data science. Given the importance this Administration has placed on data, along with the momentum that has been created, now is a unique time to establish a legacy of data supporting the public good. That is why, after a long time in the private sector, I am returning to the federal government as the Deputy Chief Technology Officer for Data Policy and Chief Data Scientist.

Organizations are increasingly realizing that in order to maximize their benefit from data, they require dedicated leadership with the relevant skills. Many corporations, local governments, federal agencies, and others have already created such a role, which is usually called the Chief Data Officer (CDO) or the Chief Data Scientist (CDS). The role of an organization’s CDO or CDS is to help their organization acquire, process, and leverage data in a timely fashion to create efficiencies, iterate on and develop new products, and navigate the competitive landscape.

The Role of the First-Ever U.S. Chief Data Scientist

Similarly, my role as the U.S. CDS will be to responsibly source, process, and leverage data in a timely fashion to enable transparency, provide security, and foster innovation for the benefit of the American public, in order to maximize the nation’s return on its investment in data.

So what specifically am I here to do? As I start, I plan to focus on these four activities:

…(More)”

Choosing Not to Choose: Understanding the Value of Choice


New book by Cass Sunstein: “Our ability to make choices is fundamental to our sense of ourselves as human beings, and essential to the political values of freedom-protecting nations. Whom we love; where we work; how we spend our time; what we buy; such choices define us in the eyes of ourselves and others, and much blood and ink has been spilt to establish and protect our rights to make them freely.
Choice can also be a burden. Our cognitive capacity to research and make the best decisions is limited, so every active choice comes at a cost. In modern life the requirement to make active choices can often be overwhelming. So, across broad areas of our lives, from health plans to energy suppliers, many of us choose not to choose. By following our default options, we save ourselves the costs of making active choices. By setting those options, governments and corporations dictate the outcomes for when we decide by default. This is among the most significant ways in which they effect social change, yet we are just beginning to understand the power and impact of default rules. Many central questions remain unanswered: When should governments set such defaults, and when should they insist on active choices? How should such defaults be made? What makes some defaults successful while others fail?….
The onset of big data gives corporations and governments the power to make ever more sophisticated decisions on our behalf, defaulting us to buy the goods we predictably want, or vote for the parties and policies we predictably support. As consumers we are starting to embrace the benefits this can bring. But should we? What will be the long-term effects of limiting our active choices on our agency? And can such personalized defaults be imported from the marketplace to politics and the law? Confronting the challenging future of data-driven decision-making, Sunstein presents a manifesto for how personalized defaults should be used to enhance, rather than restrict, our freedom and well-being. (More)”