The Future of Internet Governance: 90 Places to Start


Council on Foreign Relations Blog: “The open, global Internet, which has created untold wealth and empowered billions of individuals, is in jeopardy. Around the world, “nations are reasserting sovereignty and territorializing cyberspace” to better control the political, economic, and social activities of their citizens, and the content they can access. These top-down efforts undermine the Internet’s existing decentralized, multi-stakeholder system of governance and threaten its fragmentation into multiple national intranets. To preserve an open system that reflects its interests and values while remaining both secure and resilient, the United States must unite a coalition of like-minded states committed to free expression and free markets and prepared to embrace new strategies to combat cyber crime and rules to govern cyber warfare.
These are the core messages of the just-released CFR report, Defending an Open, Global, Resilient, and Secure Internet. The product of a high-level task force, chaired by former Director of National Intelligence John D. Negroponte and former IBM Chairman Samuel J. Palmisano, the report opens by describing the epochal transformation the Internet has wrought on societies and economies worldwide—particularly in the developing world.
Facilitating this unprecedented connectivity has been a framework based not on governmental (or intergovernmental) fiat but on “self-regulation, private sector leadership, and a bottom-up policy process.” By leaving regulation in the hands of technical experts, private sector actors, civil society groups, and end-users, the pioneers of the early Internet ensured that it would “reflect a broad range of perspectives and keep pace with rapidly changing technology.” They also ensured that rights of free expression and privacy would emerge as dominant norms….
Given current trends, can the United States possibly preserve the open, global Internet? Yes, but the first step is getting its own house in order. Distressingly, the U.S. government lacks a coherent strategic vision, an adequate policy coordination framework, and the requisite legislative authorities to develop and implement a national cyberspace policy, undercutting its global leadership.
Beyond this general guidance, the CFR task force offers some ninety (!) recommendations for U.S. policymakers.”

Time we all learned how to program the world we want


Editorial of NewScientist: “OUR world is written in code. These days, almost anything electrical or mechanical requires many thousands of lines of code to work. Consider a modern car: you could argue that from the driver’s perspective, it’s now a computer that gives you control over an engine, drivetrain and wheels. And with cars beginning to drive themselves, the code will soon be in even more control.
But who controls the code? Those who write the programs behind the machines have become hugely lionised. Silicon Valley courts software developers with huge salaries and copious stock options, throwing in perks ranging from gourmet food to free haircuts. The rest of us can only look on, excluded by esoteric arguments about the merits of rival programming techniques and languages. Like the clerics who once controlled written language, programmers have a vested interest in keeping the status quo…”

Checkbook NYC advances civic open source


Karl Fogel at OpenSource.com: “New York City Comptroller John Liu is about to do something we need to see more often in government. This week, his office is open sourcing the code behind Checkbook NYC, the citywide financial transparency site—but the open-sourcing itself is not what I’m referring to. After all, lots of governments open source code these days.
Rather, the release of the Checkbook NYC code, planned for this Thursday, is significant because of a larger initiative that accompanies it. Long before the code release, the Comptroller’s Office started a serious planning process to ensure that the code could be easily adopted by other municipalities, supported by other vendors, and eventually become a long-term multi-stakeholder project—in other words, the very model that advocates of civic open source always cheer for but only rarely see happen in practice.
I have no knowledge (and do not claim) that this is the first instance of a government agency doing such long-range planning for an open source release. But it will at least be an important instance: CheckbookNYC.com is the main financial transparency site for the largest city in the United States, a city with an annual budget of $70 billion. Giving other cities a chance to offer the same user interface and API support, at a fraction of what it would have cost to build it themselves, is already good news. But it’s even more important to show that the project is a safe long-term bet, both for those considering adoption and those considering participation in development.”

First, they gave us targeted ads. Now, data scientists think they can change the world


In Gigaom: “The best minds of my generation are thinking about how to make people click ads … That sucks.” – Jeff Hammerbacher, co-founder and chief scientist, Cloudera
Well, something has to pay the bills. Thankfully, there’s also a sweeping trend in the data science world right now around bringing those skills to bear on some really meaningful problems, …
We’ve already covered some of these efforts, including the SumAll Foundation’s work on modern-day slavery and future work on child pornography. Closely related is the effort — led by Google.org’s deep pockets — to create an international hotline network for reporting human trafficking and collecting data. Microsoft, in particular Microsoft Research’s danah boyd, has been active in helping fight child exploitation using technology.
This week, I came across two new efforts on different ends of the spectrum. One is ActivityInfo, which describes itself on its website as “an online humanitarian project monitoring tool” — developed by Unicef and a consulting firm called BeDataDriven — that “helps humanitarian organizations to collect, manage, map and analyze indicators….
The other effort I came across is DataKind, specifically its work helping the New York City Department of Parks and Recreation, or NYC Parks, quantify the benefits of a strategic tree-pruning program. Founded by renowned data scientists Drew Conway and Jake Porway (who’s also the host of the National Geographic channel’s The Numbers Game), DataKind exists for the sole purpose of helping non-profit organizations and small government agencies solve their most-pressing data problems.”

Filling Power Vacuums in the New Global Legal Order


Paper by Anne-Marie Slaughter in the latest issue of Boston College Law Review: “In her Keynote Address at the October 12, 2012 Symposium, Filling Power Vacuums in the New Global Legal Order, Anne-Marie Slaughter describes the concepts of “power over” and “power with” in the global world of law. Power over is the ability to achieve the outcomes you want by commanding or manipulating others. Power with is the ability to mobilize people to do things. In the globalized world, power operates much more through power with than through power over. In contrast to the hierarchical power of national governments, globally it is more important to be central in the horizontal system of multiple sovereigns. This Address illustrates different examples of power over and power with. It concludes that in this globalized world, lawyers are ideally trained and positioned to exercise power.”

Is Crowdsourcing the Future for Crime Investigation?


Joe Harris in IFSEC Global: “Following April’s Boston Marathon bombings, many people around the world wanted to help in any way they could. Previously, there would have been little but financial assistance that they could have offered.
However, with the advent of high-quality cameras on smartphone devices, and services such as YouTube and Flickr, it was not long before the well-known online collectives such as Reddit and 4chan mobilized members of the public to ask them to review hundreds of thousands of photos and videos taken on the day to try to identify potential suspects…. Here in the UK, we recently had the successful launch of Facewatch, and we have seen other regional attempts — such as Greater Manchester Police’s services and appeals app — to use the goodwill of members of the public to help trace, identify, or report suspected criminals and the crimes that they commit.
Does this herald a new era in transparency? Are we seeing the first steps towards a more transparent future where rapid information flow means that there really is nowhere to hide? Or are we instead falling into some Orwellian society construct where people are scared to speak out or think for themselves?”

Techs and the City


Alec Appelbaum, who teaches at Pratt Institute in The New York Times: “THIS spring New York City is rolling out its much-ballyhooed bike-sharing program, which relies on a sophisticated set of smartphone apps and other digital tools to manage it. The city isn’t alone: across the country, municipalities are buying ever more complicated technological “solutions” for urban life.

But higher tech is not always essential tech. Cities could instead be making savvier investments in cheaper technology that may work better to stoke civic involvement than the more complicated, expensive products being peddled by information-technology developers….

To be sure, big tech can zap some city weaknesses. According to I.B.M., its predictive-analysis technology, which examines historical data to estimate the next crime hot spots, has helped Memphis lower its violent crime rate by 30 percent.
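The technology behind I.B.M.'s product is proprietary, but the core idea it describes — mining historical incident data to estimate likely future hot spots — can be sketched in a few lines. The grid coordinates and incident data below are invented for illustration; a real system would use far richer features than raw counts.

```python
from collections import Counter

# Illustrative sketch only: rank city grid cells by historical incident
# counts to flag likely "hot spots". All data here is made up.
incidents = [
    (3, 7), (3, 7), (3, 8), (1, 2), (3, 7),  # (grid_x, grid_y) of past crimes
    (5, 5), (1, 2), (3, 8), (3, 7), (5, 5),
]

def hot_spots(events, top_n=3):
    """Return the top_n grid cells ranked by historical incident count."""
    counts = Counter(events)
    return counts.most_common(top_n)

print(hot_spots(incidents))
```

Production systems layer time-of-day, seasonality, and geographic smoothing on top of such counts, but the ranking step is recognizably the same.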

But many problems require a decidedly different approach. Take the seven-acre site in Lower Manhattan called the Seward Park Urban Renewal Area, where 1,000 mixed-income apartments are set to rise. A working-class neighborhood that fell to bulldozers in 1969, it stayed bare as co-ops nearby filled with affluent families, including my own.

In 2010, with the city ready to invite developers to bid for the site, long-simmering tensions between nearby public-housing tenants and wealthier dwellers like me turned suddenly — well, civil.

What changed? Was it some multimillion-dollar “open democracy” platform from Cisco, or a Big Data program to suss out the community’s real priorities? Nope. According to Dominic Pisciotta Berg, then the chairman of the local community board, it was plain old e-mail, and the dialogue it facilitated. “We simply set up an e-mail box dedicated to receiving e-mail comments” on the renewal project, and organizers would then “pull them together by comment type and then consolidate them for display during the meetings,” he said. “So those who couldn’t be there had their voices considered and those who were there could see them up on a screen and adopted, modified or rejected.”

Through e-mail conversations, neighbors articulated priorities — permanently affordable homes, a movie theater, protections for small merchants — that even a supercomputer wouldn’t necessarily have identified in the data.
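The consolidation step Berg describes — collecting e-mailed comments, tagging them by type, and grouping them for on-screen display — is simple enough to sketch. The comment texts and type labels below are hypothetical, loosely echoing the priorities the neighbors raised.

```python
from collections import defaultdict

# Hypothetical sketch of grouping e-mailed public comments by type for
# display at a community meeting. Comments and labels are invented.
comments = [
    ("affordability", "Keep a share of the apartments permanently affordable."),
    ("retail", "Protect the small merchants already on the block."),
    ("affordability", "Mixed-income housing should stay mixed for good."),
    ("amenities", "The neighborhood needs a movie theater."),
]

def consolidate(items):
    """Group (type, text) comment pairs by type."""
    grouped = defaultdict(list)
    for comment_type, text in items:
        grouped[comment_type].append(text)
    return dict(grouped)

for ctype, texts in consolidate(comments).items():
    print(f"{ctype}: {len(texts)} comment(s)")
```

The point of the passage survives the sketch: the hard part was the civic dialogue, not the tooling, which here amounts to a dictionary of lists.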

The point is not that software is useless. But like anything else in a city, it’s only as useful as its ability to facilitate the messy clash of real human beings and their myriad interests and opinions. And often, it’s the simpler software, the technology that merely puts people in contact and steps out of the way, that works best.”

"A bite of me"


Federico Zannier @ Kickstarter: “I’ve data mined myself. I’ve violated my own privacy. Now I am selling it all. But how much am I worth?

I spend hours every day surfing the internet. Meanwhile, companies like Facebook and Google have been using my online information (the websites I visit, the friends I have, the videos I watch) for their own benefit.
In 2012, advertising revenue in the United States was around $30 billion. That same year, I made exactly $0 from my own data. But what if I tracked everything myself? Could I at least make a couple bucks back?
I started looking at the terms of service for the websites I often use. In their privacy policies, I have found sentences like this: “You grant a worldwide, non-exclusive, royalty-free license to use, copy, reproduce, process, adapt, modify, publish, transmit, display and distribute such content in any and all media or distribution methods (now known or later developed).” I’ve basically agreed to give away a lifelong, international, sub-licensable right to use my personal data….
Check out myprivacy.info (http://myprivacy.info) to see some of the visualizations I’ve made.”

Principles and Practices for a Federal Statistical Agency


New National Academies Publication : “Publicly available statistics from government agencies that are credible, relevant, accurate, and timely are essential for policy makers, individuals, households, businesses, academic institutions, and other organizations to make informed decisions. Even more, the effective operation of a democratic system of government depends on the unhindered flow of statistical information to its citizens.
In the United States, federal statistical agencies in cabinet departments and independent agencies are the governmental units whose principal function is to compile, analyze, and disseminate information for such statistical purposes as describing population characteristics and trends, planning and monitoring programs, and conducting research and evaluation. The work of these agencies is coordinated by the U.S. Office of Management and Budget. Statistical agencies may acquire information not only from surveys or censuses of people and organizations, but also from such sources as government administrative records, private-sector datasets, and Internet sources that are judged of suitable quality and relevance for statistical use. They may conduct analyses, but they do not advocate policies or take partisan positions. Statistical purposes for which they provide information relate to descriptions of groups and exclude any interest in or identification of an individual person, institution, or economic unit.
Four principles are fundamental for a federal statistical agency: relevance to policy issues, credibility among data users, trust among data providers, and independence from political and other undue external influence. Principles and Practices for a Federal Statistical Agency: Fifth Edition explains these four principles in detail.”

Life and Death of Tweets Not so Random After All


MIT Technology Review: “MIT assistant professor Tauhid Zaman and two other researchers (Emily Fox at the University of Washington and Eric Bradlow at the University of Pennsylvania’s Wharton School) have come up with a model that can predict, minutes after a tweet is posted, how many times it will ultimately be retweeted. The model was built by collecting retweets on a slew of topics and examining when each original tweet was posted and how fast it spread. From that, the model predicts a new tweet’s eventual popularity based on how many times it is retweeted shortly after it is first posted.
The researchers’ findings were explained in a paper submitted to the Annals of Applied Statistics. In the paper, the authors note that “understanding retweet behavior could lead to a better understanding of how broader ideas spread in Twitter and in other social networks,” and such data may be helpful in a number of areas, like marketing and political campaigning.
You can check out the model here.”
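The researchers' actual model is a Bayesian time-series model; the sketch below is not that model, only the simplest possible version of the underlying idea — learn, from tweets whose lifecycles have finished, how early retweet counts relate to final totals, then extrapolate for a new tweet. All numbers are invented.

```python
# Illustration only, not the Zaman/Fox/Bradlow model: extrapolate a new
# tweet's eventual retweet count from its early trajectory.

# Training data: (retweets in first 10 minutes, eventual total retweets)
history = [(5, 60), (12, 130), (3, 40), (20, 210), (8, 95)]

def fit_multiplier(data):
    """Average ratio of final retweets to early retweets."""
    return sum(final / early for early, final in data) / len(data)

def predict_final(early_retweets, multiplier):
    """Extrapolate a new tweet's eventual retweet count."""
    return early_retweets * multiplier

m = fit_multiplier(history)
print(round(predict_final(10, m)))
```

A single average multiplier ignores topic, author, and time-of-day effects, which is exactly the variation the researchers' richer statistical model is designed to capture.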