Crowdsourcing America’s cybersecurity is an idea so crazy it might just work


At the Washington Post: “One idea that’s starting to bubble up from Silicon Valley is the concept of crowdsourcing cybersecurity. As Silicon Valley venture capitalist Robert R. Ackerman, Jr. has pointed out, due to “the interconnectedness of our society in cyberspace,” cyber networks are best viewed as an asset that we all have a shared responsibility to protect. Push on that concept hard enough and you can see how many of the core ideas from Silicon Valley – crowdsourcing, open source software, social networking, and the creative commons – can all be applied to cybersecurity.

Silicon Valley venture capitalists are already starting to fund companies that describe themselves as crowdsourcing cybersecurity. For example, take Synack, a “crowd security intelligence” company that received $7.5 million in funding from Kleiner Perkins (one of Silicon Valley’s heavyweight venture capital firms), Allegis Ventures, and Google Ventures in 2014. Synack’s two founders are ex-NSA employees, and they are using that experience to inform an entirely new type of business model. Synack recruits and vets a global network of “white hat hackers,” and then offers their services to companies worried about their cyber networks. For a fee, these hackers are able to find and repair any security risks.

So how would crowdsourced national cybersecurity work in practice?

For one, there would be free and transparent sharing of computer code used to detect cyber threats between the government and private sector. In December, the U.S. Army Research Lab added a bit of free source code, a “network forensic analysis framework” known as Dshell, to the mega-popular code sharing site GitHub. Already, there have been 100 downloads and more than 2,000 unique visitors. The goal, says William Glodek of the U.S. Army Research Laboratory, is for this shared code to “help facilitate the transition of knowledge and understanding to our partners in academia and industry who face the same problems.”
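The idea of community-maintained detection code can be illustrated with a minimal sketch. To be clear, the indicator lists and record format below are invented for illustration, and this is not Dshell’s actual API; it simply shows the kind of shared detection logic that a government–industry community could maintain and improve together.

```python
# Illustrative sketch of community-maintained threat detection: scan
# connection records against a shared list of known-bad indicators.
# The indicators and record format are hypothetical, not Dshell's API.

KNOWN_BAD_IPS = {"203.0.113.9", "198.51.100.77"}  # RFC 5737 documentation addresses
SUSPICIOUS_PORTS = {4444, 31337}                   # ports often flagged in shared rule sets

def flag_connections(records):
    """Return the records that match any shared indicator."""
    return [
        rec for rec in records
        if rec["dst_ip"] in KNOWN_BAD_IPS or rec["dst_port"] in SUSPICIOUS_PORTS
    ]

if __name__ == "__main__":
    sample = [
        {"dst_ip": "203.0.113.9", "dst_port": 443},   # matches a bad IP
        {"dst_ip": "93.184.216.34", "dst_port": 80},  # clean
        {"dst_ip": "10.0.0.8", "dst_port": 4444},     # matches a suspicious port
    ]
    print(len(flag_connections(sample)))  # 2
```

The point of open-sourcing such logic is that the indicator sets, not just the scanning code, become something the whole community can review and extend.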

This open sourcing of cyber defense would be enhanced with a scaled-up program of recruiting “white hat hackers” to become officially part of the government’s cybersecurity efforts. Popular annual events such as the DEF CON hacking conference could be used to recruit talented cyber sleuths to work alongside the government.

There have already been examples of communities where people facing a common cyber threat gather together to share intelligence. Perhaps the best-known example is the Conficker Working Group, a security coalition that was formed in late 2008 to share intelligence about malicious Conficker malware. Another example is the Financial Services Information Sharing and Analysis Center, which was created by presidential mandate in 1998 to share intelligence about cyber threats to the nation’s financial system.

Of course, there are some drawbacks to this crowdsourcing idea. For one, such a collaborative approach to cybersecurity might open the door to government cyber defenses being infiltrated by the enemy. Ackerman makes the point that you never really know who’s contributing to any community. Even on a site such as GitHub, it’s theoretically possible that an ISIS hacker or someone like Edward Snowden could download the code, reverse engineer it, and then use it to insert “Trojan Horses” intended for military targets into the code…(More)”

The Metrics Myth


Jed Emerson at BlendedValue: “…Simply because our present, dominant approaches to assessing metrics fall short of our task—How can one measure the full value of a life saved or possible future changed? What, ultimately, is the real impact and value created through the allocation of our capital?—we persist because we know two things:
First, we know we are on a Hero’s Journey of inquiry and innovation. Too often we forget the present system of tracking financial performance (the basis upon which trillions of dollars flow through global capital markets and the foundation upon which too many of us build our lives) is the outcome of over sixty years of development, refinement and debate. In the U.S., GAAP and FASB (the fundamental building blocks of mainstream business and finance) were not created until after World War II; and it was not until the creation of the Environmental Protection Agency in 1970 that business and many nonprofits began tracking and assessing environmental metrics on a consistent basis. And while social metrics have always been a part of the parlance of government and philanthropic funding, many foundations and social investors have not sought to weave performance assessment into their process of allocating funds until recent decades. It is for these reasons I am quite comfortable with the reality that those creating the metrics and evaluation frameworks of tomorrow will need another twenty years to build what is not yet ours, for I know it will come in good time.
Second, we are creating Total Portfolio Reporting frameworks to track the returns of unified investing strategies (capable of reflecting the aggregate performance of philanthropic, social and environmental value creation) because we know it can be done—and indeed, we see the metrics mist clearing by the year.
As initiatives such as

- The Principles for Responsible Investing’s Integrated Reporting work,
- the recently re-organized SROI Network,
- the Sustainable Accounting Standards Board,
- B-Lab’s B-Analytics framework,
- CapRock’s iPar system,
- the ANDE Metrics Working Group,
- and a variety of grassroots initiatives coming together around various sets of common reporting for assessing community impact,

we find one can create a balance between our aspirations for a better world and the challenges of demarcating our progress toward that goal.
In the end, I hate the whole metrics debate.
It is repetitive, mind-numbing and distracting from the critical task of fighting the forces presently destroying our societies and planet. Each time some ignorant (not stupid, mind you, but not fully aware of what they do not know; they are, quite rightly, ignorant) newcomer enters the discussion, we’re all expected to re-group and re-define concepts and issues well documented and explored in the past. The continual, mindless reminders that not everything that counts can be counted leave me frustrated and even angry at those who, for reasons beyond me, don’t seem to understand that such now-trite insights were the very starting place of this journey well more than 25 years ago and that, indeed, as newcomers they are as far behind the current exploration as we are from our goal.
Yet, we make progress despite our doubts and complications.
We advance the practice of both impact investing and performance measurement one step forward and two steps back, as the current “knowledge” of the crowd actually pulls us backward to previous thinking and practice. And we know the appropriate application of metrics brings meaning and insight just as it demonstrates the limitations of such efforts…(More)”

We Need To Innovate The Science Business Model


Greg Satell at Forbes: “In 1945, Vannevar Bush, the man who led the nation’s scientific efforts during World War II, delivered a proposal to President Truman for funding scientific research in the post-war world. Titled Science, The Endless Frontier, it led to the formation of the NSF, NIH, DARPA and other agencies….
One assumption inherent in Bush’s proposal was that institutions would be at the center of scientific life. Scientists from disparate labs could read each other’s papers and meet at an occasional conference, but for the most part, they would be dependent on the network of researchers within their organization and those close by.
Sometimes, the interplay between institutions had major, even historical, impacts, such as John von Neumann’s sponsorship of Alan Turing, but for the most part the work you did was a function of where you did it. The proximity of Watson, Crick, Rosalind Franklin and Maurice Wilkins, for example, played a major role in the discovery of the structure of DNA.
Yet today, digital technology is changing not only the speed and ease of how we communicate, but the very nature of how we are able to collaborate.  When I spoke to Jonathan Adams, Chief Scientist at Digital Science, which develops and invests in software that makes science more efficient, he noted that there is a generational shift underway and said this:

When you talk to people like me, we’re established scientists who are still stuck in the old system of institutions and conferences.  But the younger scientists are using technology to access networks and they do so on an ongoing, rather than a punctuated basis.  Today, you don’t have to go to a conference or write a paper to exchange ideas.

Evidence would seem to bear this out. The prestigious journal Nature recently noted that the average scientific paper has four times as many authors as it did in the 1950s, when Bush’s career was at its height. Moreover, it’s become common for co-authors to work at far-flung institutions. Scientific practice needs to adapt to this reality.
There has been some progress in this area. The Internet, in fact, was created for the explicit purpose of scientific collaboration. Yet the way in which scientists report and share their findings remains much the same as it was a century ago.
Moving From Publications To Platforms For Discovery
One especially ripe area for innovation is publishing.  Typically, a researcher with a new discovery waits six months to a year for the peer review process to run its course before the work can be published.  Even then, many of the results are questionable at best.  Nature recently reported that the overwhelming majority of studies can’t be replicated…(More)”

Unleashing the Power of Data to Serve the American People


Memorandum: Unleashing the Power of Data to Serve the American People
To: The American People
From: Dr. DJ Patil, Deputy U.S. CTO for Data Policy and Chief Data Scientist

….While there is a rich history of companies using data to their competitive advantage, the disproportionate beneficiaries of big data and data science have been Internet technologies like social media, search, and e-commerce. Yet transformative uses of data in other spheres are just around the corner. Precision medicine and other forms of smarter health care delivery, individualized education, and the “Internet of Things” (which refers to devices like cars or thermostats communicating with each other using embedded sensors linked through wired and wireless networks) are just a few of the ways in which innovative data science applications will transform our future.

The Obama administration has embraced the use of data to improve the operation of the U.S. government and the interactions that people have with it. On May 9, 2013, President Obama signed Executive Order 13642, which made open and machine-readable data the new default for government information. Over the past few years, the Administration has launched a number of Open Data Initiatives aimed at scaling up open data efforts across the government, helping make troves of valuable data — data that taxpayers have already paid for — easily accessible to anyone. In fact, I used data made available by the National Oceanic and Atmospheric Administration to improve numerical methods of weather forecasting as part of my doctoral work. So I know firsthand just how valuable this data can be — it helped get me through school!
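The practical payoff of “open and machine-readable by default” is that published data can be consumed directly by code rather than transcribed from documents. A minimal sketch, using an invented CSV of daily temperature observations (not an actual NOAA file format), shows how little code is needed once data is structured:

```python
# Why machine-readable data matters: a structured CSV can be parsed
# and analyzed in a few lines. The data below is hypothetical and is
# not an actual NOAA product format.
import csv
import io

RAW = """date,station,temp_f
2015-02-01,KNYC,28.4
2015-02-02,KNYC,31.1
2015-02-03,KNYC,25.9
"""

rows = list(csv.DictReader(io.StringIO(RAW)))
mean_temp = sum(float(r["temp_f"]) for r in rows) / len(rows)
print(round(mean_temp, 1))  # 28.5
```

The same three lines of analysis would work unchanged on a file with millions of observations, which is exactly the leverage open-data initiatives aim to provide.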

Given the substantial benefits that responsibly and creatively deployed data can provide to us and our nation, it is essential that we work together to push the frontiers of data science. Given the importance this Administration has placed on data, along with the momentum that has been created, now is a unique time to establish a legacy of data supporting the public good. That is why, after a long time in the private sector, I am returning to the federal government as the Deputy Chief Technology Officer for Data Policy and Chief Data Scientist.

Organizations are increasingly realizing that in order to maximize their benefit from data, they require dedicated leadership with the relevant skills. Many corporations, local governments, federal agencies, and others have already created such a role, which is usually called the Chief Data Officer (CDO) or the Chief Data Scientist (CDS). The role of an organization’s CDO or CDS is to help their organization acquire, process, and leverage data in a timely fashion to create efficiencies, iterate on and develop new products, and navigate the competitive landscape.

The Role of the First-Ever U.S. Chief Data Scientist

Similarly, my role as the U.S. CDS will be to responsibly source, process, and leverage data in a timely fashion to enable transparency, provide security, and foster innovation for the benefit of the American public, in order to maximize the nation’s return on its investment in data.

So what specifically am I here to do? As I start, I plan to focus on these four activities:

…(More)”

Choosing Not to Choose: Understanding the Value of Choice


New book by Cass Sunstein: “Our ability to make choices is fundamental to our sense of ourselves as human beings, and essential to the political values of freedom-protecting nations. Whom we love; where we work; how we spend our time; what we buy; such choices define us in the eyes of ourselves and others, and much blood and ink has been spilt to establish and protect our rights to make them freely.
Choice can also be a burden. Our cognitive capacity to research and make the best decisions is limited, so every active choice comes at a cost. In modern life the requirement to make active choices can often be overwhelming. So, across broad areas of our lives, from health plans to energy suppliers, many of us choose not to choose. By following our default options, we save ourselves the costs of making active choices. By setting those options, governments and corporations dictate the outcomes for when we decide by default. This is among the most significant ways in which they effect social change, yet we are just beginning to understand the power and impact of default rules. Many central questions remain unanswered: When should governments set such defaults, and when should they insist on active choices? How should such defaults be made? What makes some defaults successful while others fail?….
The onset of big data gives corporations and governments the power to make ever more sophisticated decisions on our behalf, defaulting us to buy the goods we predictably want, or vote for the parties and policies we predictably support. As consumers we are starting to embrace the benefits this can bring. But should we? What will be the long-term effects on our agency of limiting our active choices? And can such personalized defaults be imported from the marketplace to politics and the law? Confronting the challenging future of data-driven decision-making, Sunstein presents a manifesto for how personalized defaults should be used to enhance, rather than restrict, our freedom and well-being. (More)”

Opinion Mining in Social Big Data


New Paper by Wlodarczak, Peter and Ally, Mustafa and Soar, Jeffrey: “Opinion mining has rapidly gained importance due to the unprecedented amount of opinionated data on the Internet. People share their opinions on products and services; they rate movies, restaurants or vacation destinations. Social media platforms such as Facebook and Twitter have made it easier than ever for users to share their views and make them accessible to anybody on the Web. The economic potential has been recognized by companies that want to improve their products and services, detect new trends and business opportunities, or find out how effective their online marketing efforts are. However, opinion mining using social media faces many challenges due to the amount and heterogeneity of the available data. Spam and fake opinions have also become a serious issue. There are also language-related challenges, such as the use of slang and jargon on social media, or special characters like smileys that are widely adopted on social media sites.
These challenges create many interesting research problems, such as determining the influence of social media on people’s actions, understanding opinion dissemination, or determining the online reputation of a company. Not surprisingly, opinion mining using social media has become a very active area of research, and a lot of progress has been made over the past few years. This article describes the current state of research and the technologies that have been used in recent studies….(More)”
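The language challenges the authors mention, slang spellings and emoticons, can be made concrete with a minimal lexicon-based opinion scorer. This is a sketch with a tiny made-up lexicon, not the paper’s method; real systems use large curated lexicons and machine-learned models:

```python
# Minimal lexicon-based opinion scorer for social-media text.
# The lexicon is illustrative only; note the entries for an
# emoticon and a slang spelling, two problems the paper raises.

LEXICON = {
    "great": 1, "love": 1, "awesome": 1,
    "bad": -1, "hate": -1, "awful": -1,
    ":)": 1, ":(": -1,   # emoticons carry strong sentiment signal
    "gr8": 1,            # slang spelling of "great"
}

def sentiment(text):
    """Sum lexicon scores over whitespace tokens; >0 positive, <0 negative."""
    return sum(
        LEXICON.get(tok.strip(".,!?").lower(), 0)
        for tok in text.split()
    )

print(sentiment("I love this movie :)"))   # 2
print(sentiment("Service was awful :("))   # -2
```

Even this toy example shows why preprocessing matters: a tokenizer that stripped all punctuation would destroy the emoticons, and a dictionary without slang entries would miss “gr8” entirely.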
 

The Trouble With Disclosure: It Doesn’t Work


Jesse Eisinger at ProPublica: “Louis Brandeis was wrong. The lawyer and Supreme Court justice famously declared that sunlight is the best disinfectant, and we have unquestioningly embraced that advice ever since.
All this sunlight is blinding. As new scholarship is demonstrating, the value of all this information is unproved. Paradoxically, disclosure can be useless — and sometimes actually harmful or counterproductive.
“We are doing disclosure as a regulatory move all over the board,” says Adam J. Levitin, a law professor at Georgetown. “The funny thing is, we are doing this despite very little evidence of its efficacy.”…
Of course, some disclosure works. Professor Levitin cites two examples. The first is an olfactory disclosure: methane doesn’t have any scent, so a foul smell is added to alert people to a gas leak. The second is ATM fees. A study in Australia showed that once fees were disclosed, people avoided the high-fee machines and took out more cash when they did have to use them.
But to Omri Ben-Shahar, co-author of a recent book, “More Than You Wanted to Know: The Failure of Mandated Disclosure,” these are cherry-picked examples in a world awash in useless disclosures. Of course, information is valuable. But disclosure as a regulatory mechanism doesn’t work nearly well enough, he argues.
First, it really works only when things are simple. As soon as transactions become complex, disclosure starts to stumble. Buying a car, for instance, turns out to be several transactions: the purchase itself, the financing, perhaps the trade-in of an old car, and various insurance and warranty decisions. These are all subject to various disclosure rules, but making the choices clear and useful has proved nigh impossible.
In complex transactions, we then must rely on intermediaries to give us advice. Because they are often conflicted, they, too, become subject to disclosure obligations. Ah, even more boilerplate to puzzle over!
And then there’s the harm. Over the years, banks that sold complex securities often stuck impossible-to-understand clauses deep in prospectuses that “disclosed” what was really going on. When the securities blew up, as they often did, banks then fended off lawsuits by arguing they had done everything the law required and were therefore not liable.
“That’s the harm of disclosure,” Professor Ben-Shahar said. “It provides a safe harbor for practices that smell bad. It sanitizes every bad practice.”
The anti-disclosure movement is taking on the “Nudge” school, embraced by the Obama administration and promoted most prominently by Cass R. Sunstein, a scholar at Harvard, and Richard H. Thaler, an economist at the University of Chicago. These nudgers believe that small policies will prod people to do what’s in their best interests.
The real-world evidence in favor of nudging is thin. …
The ever-alluring notion is that we are just one or two changes away from having meaningful disclosure. If we could only have annual Securities and Exchange Commission filings in plain English, we could finally understand what’s going on at corporations. A University of San Diego Law School professor, Frank Partnoy, and I called for better bank disclosure in an article in The Atlantic a few years ago.
Professor Ben-Shahar mocks it. “‘Plain English!’ ‘Make it simple.’ That is the deus ex machina, the god that will solve everything,” he said.
Complex things are, sadly, complex. A mortgage is not an easy transaction to understand. People are not good at predicting their future behavior and so don’t know what options are best for them. “The project of simplification is facing a very poor empirical track record and a very powerful theoretical problem,” he said.
What to do instead? Hard and fast rules. If lawmakers want to end a bad practice, ban it. Having them admit it is not enough. (More)”

Holding Data Hostage: The Perfect Internet Crime?


Tom Simonite at MIT Technology Review: “Every so often someone invents a new way of making money on the Internet that earns wild profits, attracts countless imitators, and reshapes what it means to be online. Unfortunately, such a shift took place last year in the world of online crime, with the establishment of sophisticated malicious software known as ransomware as a popular and reliable business model for criminals.

After infecting a computer, perhaps via an e-mail attachment or a malicious website, ransomware automatically encrypts files, which may include precious photos, videos, and business documents, and issues an electronic ransom note. Getting those files back means paying a fee to the criminals who control the malware—and hoping they will keep their side of the bargain by decrypting them.

The money that can be made with ransomware has encouraged technical innovations. The latest ransomware requests payment via the hard-to-trace cryptocurrency Bitcoin and uses the anonymizing Tor network. Millions of home and business computers were infected by ransomware in 2014. Computer crime experts say the problem will only get worse, and some believe mobile devices will be the next target….

The recent rise of ransomware prompted the FBI to issue a report last month in which it warned that the crime poses a threat not only to home computer users but also to “businesses, financial institutions, government agencies, academic institutions, and other organizations.”

Some security researchers predict that 2015 will see significant efforts by criminals to get ransomware working on smartphones and tablets as well. These devices often contain highly prized personal files such as photos and videos….(More)”

What Is the Purpose of Society?


Mark Bittman in the New York Times:“….Think about it this way: There are two kinds of operating systems, hard and soft. A clock is a hard system. We know what it’s for, we know when it isn’t working, and we know that 10 clock experts would agree on how to fix it — and could do so.
Soft systems, like agriculture and economics, are more complex. We don’t all agree on goals, and we don’t agree on whether things are working or in need of repair. For example, is contemporary American agriculture a system for nourishing people and providing a livelihood for farmers? Or is it one for denuding the nation’s topsoil while poisoning land, water, workers and consumers and enriching corporations? Our collective actions would indicate that our principles favor the latter; that has to change.
Defining goals that matter to people is critical, because the most powerful way to change a complex, soft system is to change its purpose. For example, if we had a national agreement that food is not just a commodity, a way to make money, but instead a way to nourish people and the planet and a means to safeguard our future, we could begin to reconfigure the system for that purpose. More generally, if we agreed that human well-being was a priority, creating more jobs would not ring so hollow.
Sadly, even if we did agree, complex systems are not subject to clever fixes. Rather, changes often have unexpected results (that shouldn’t happen with a clock), so change necessarily remains incremental. But without an agreement on goals, without statements of purpose, we are going to continue to see changes that are not in the interest of the majority. Increasingly, it’s corporations and not governments that are determining how the world works. As unrepresentative as government might seem right now, there is at least a chance of improving it, whereas corporations will always act in their own interests.
It’s been adequately demonstrated that more than minor tweaks are needed to improve life for most people. Let’s try to make sense of where the world is now instead of relying on outdated doctrines like “capitalism” and “socialism” created by people who had no idea what the 21st century would look like. Let’s ambitiously and publicly philosophize — as the conservatives do — and think about what shape a sensible political economy might take.
The big ideas and strategies for how we should manage society and thrive with the planet are not a set of rules handed down from on high. To develop them for now and the future is a major challenge, and we — progressives and our allies — have to work harder at it. No one is going to figure it out for us….(More)”.

Special HBR Collection on Innovation in Governance


“A Special Collection of Harvard Business Review, produced in collaboration with the Government Summit, consisting of a number of selected articles. It is led by a special article by HH Sheikh Mohammed Bin Rashid Al Maktoum, UAE Vice President, Prime Minister and Ruler of Dubai, discussing how government can reinvent itself through innovation. The remaining articles explore a range of topics in the field of innovation, including service delivery, types of innovation, the spread of digital technology, big data, and the role of talent in organizational success and growth.”