Big Data and Chicago's Traffic-cam Scandal


Holman Jenkins in the Wall Street Journal: “The danger is microscopic regulation that we invite via the democratic process.
Big data techniques are new in the world. It will take time to know how to feel about them and whether and how they should be legally corralled. For sheer inanity, though, there’s no beating a recent White House report quivering about the alleged menace of “digital redlining,” or the use of big-data marketing tactics in ways that supposedly disadvantage minority groups.
This alarm rests on an extravagant misunderstanding. Redlining was a crude method banks used to avoid losses in bad neighborhoods even at the cost of missing some profitable transactions—exactly the inefficiency big data is meant to improve upon. Failing to lure an eligible customer into a sale, after all, is hardly the goal of any business.
The real danger of the new technologies lies elsewhere, which the White House slightly touches upon in some of its fretting about police surveillance. The danger is microscopic regulation of our daily activities that we will invite on ourselves through the democratic process.
Soon it may be impossible to leave our homes without our movements being tracked by traffic and security cameras able to read license plates, identify faces and pull up data about any individual, from social media postings to credit reports.
Private businesses are just starting to use these techniques to monitor shoppers in front of shelves of goodies. Towns and cities have already embraced such techniques as revenue grabs, encouraged by private contractors peddling automated traffic cameras.
Witness a festering Chicago scandal. This month came federal indictments of a former city bureaucrat, an outside consultant, and the former CEO of Redflex Traffic Systems, the company that operated the city’s traffic cameras until last year….”

The wisest choices depend on instinct and careful analysis


John Kay in the Financial Times: “Moneyball, Michael Lewis’s 2003 book on the science of picking baseball teams, was perhaps written to distract himself from his usual work of attacking the financial services industry. Even after downloading the rules of baseball, I still could not fully understand what was going on. But I caught the drift: sabermetrics, the statistical analysis of the records of players, proved a better guide than the accumulated wisdom of experienced coaches.
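The statistics behind sabermetrics can be strikingly simple. As an illustrative aside (the player totals below are invented, not real records), on-base percentage — the measure made famous by Moneyball — weighs walks as heavily as hits, which is exactly what the experienced coaches undervalued:

```python
# Minimal sketch of a sabermetric statistic: on-base percentage (OBP).
# OBP = (hits + walks + hit-by-pitch) / (at-bats + walks + hit-by-pitch + sac flies)
# The two players below are hypothetical, chosen to show how a patient
# hitter can out-produce a higher-average one.

def on_base_percentage(h, bb, hbp, ab, sf):
    """Compute OBP from a player's season totals."""
    return (h + bb + hbp) / (ab + bb + hbp + sf)

players = {
    "Player A": dict(h=130, bb=90, hbp=5, ab=450, sf=4),  # walks a lot
    "Player B": dict(h=160, bb=25, hbp=2, ab=520, sf=6),  # higher batting average
}

for name, s in players.items():
    print(f"{name}: OBP = {on_base_percentage(**s):.3f}")
```

Player A bats only .289 to Player B's .308, yet reaches base more often — the kind of result the records reveal and intuition tends to miss.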

Another lesson, important for business strategy, was the brevity of the benefits gained by the Oakland A’s, Lewis’s sporting heroes. If the only source of competitive advantage is better quantitative analysis – whether in baseball or quant strategies in the financial sector – such an advantage can be rapidly and accurately imitated.

At the same time, another genre of books proclaims the virtues of instinctive decision-making. Malcolm Gladwell’s Blink (2005) begins with accounts of how experts could identify the Getty kouros – a statue of a naked youth purported to be of ancient Greek provenance and purchased in 1985 for $9m – as fake immediately, even though it had supposedly been authenticated through extended scientific tests.

Gary Klein, a cognitive psychologist, has for many years monitored the capabilities of experienced practical decision makers – firefighters, nurses and military personnel – who make immediate judgments that are vindicated by the more elaborate assessments possible only with hindsight.
Of course, there is no real inconsistency between the two propositions. The experienced coaches disparaged by sabermetrics enthusiasts were right to believe they knew a lot about spotting baseball talent; they just did not know as much as they thought they did. The art experts and firefighters who made instantaneous, but accurate, judgments were not hearing voices in the air. But no expert can compete with chemical analysis and carbon dating in assessing the age of a work of art.
There are two ways of reconciling expertise with analysis. One takes the worst of both worlds, combining the overconfidence of experience with the naive ignorance of the quant. The resulting bogus rationality seeks to objectivise expertise by fitting it into a template.
It is exemplified in the processes by which interviewers for jobs, and managers who make personnel assessments, are required to complete checklists explaining how they reached their conclusion using prescribed criteria….”

Riding the Second Wave of Civic Innovation


Jeremy Goldberg at Governing: “Innovation and entrepreneurship in local government increasingly require mobilizing talent from many sectors and skill sets. Fortunately, the opportunities for nurturing cross-pollination between the public and private sectors have never been greater, thanks in large part to the growing role of organizations such as Bayes Impact, Code for America, Data Science for Social Good and Fuse Corps.
Indeed, there’s reason to believe that we might be entering an even more exciting period of public-private collaboration. As one local-government leader recently put it to me when talking about the critical mass of pro-bono civic-innovation efforts taking place across the San Francisco Bay area, “We’re now riding the second wave of civic pro-bono and civic innovation.”
As an alumnus of Fuse Corps’ executive fellows program, I’m convinced that the opportunities initiated by it and similar organizations are integral to civic innovation. Fuse Corps brings civic entrepreneurs with experience across the public, private and nonprofit sectors to work closely with government employees to help them negotiate project design, facilitation and management hurdles. The organization’s leadership training emphasizes “smallifying” — building innovation capacity by breaking big challenges down into smaller tasks in a shorter timeframe — and making “little bets” — low-risk actions aimed at developing and testing an idea.
Since 2012, I have managed programs and cross-sector networks for the Silicon Valley Talent Partnership. I’ve witnessed a groundswell of civic entrepreneurs from across the region stepping up to participate in discussions and launch rapid-prototyping labs focused on civic innovation.
Cities across the nation are creating new roles and programs to engage these civic start-ups. They’re learning that what makes these projects, and specifically civic pro-bono programs, work best is a process of designing, building, operationalizing and bringing them to scale. If you’re setting out to create such a program, here’s a short list of best practices:
Assets: Explore existing internal resources and knowledge to understand the history, departmental relationships and overall functions of the relevant agencies or departments. Develop a compendium of current service/volunteer programs.
City policies/legal framework: Determine what the city charter, city attorney’s office or employee-relations rules and policies say about procurement, collective bargaining and public-private partnerships.
Leadership: The support of the city’s top leadership is especially important during the formative stages of a civic-innovation program, so it’s important to understand how the city’s form of government will impact the program. For example, in a “strong mayor” government, definitive decisions on a public-private collaboration may not face the same scrutiny as they would under a “council/mayor” government.
Cross-departmental collaboration: This is essential. Without the support of city staff across departments, innovation projects are unlikely to take off. Convening a “tiger team” of individuals who are early adopters of such initiatives is an important step. Ultimately, city staffers best understand the needs and demands of their departments or agencies.
Partners from corporations and philanthropy: Leveraging existing partnerships will help to bring together an advisory group of cross-sector leaders and executives to participate in the early stages of program development.
Business and member associations: For the Silicon Valley Talent Partnership, the Silicon Valley Leadership Group has been instrumental in advocating for pro-bono volunteerism with the cities of Fremont, San Jose and Santa Clara….”

The Changing Nature of Privacy Practice


Numerous commenters have observed that Facebook, among many marketers (including political campaigns like U.S. President Barack Obama’s), regularly conducts A-B tests and other research to measure how consumers respond to different products, messages and messengers. So what makes the Facebook-Cornell study different from what goes on all the time in an increasingly data-driven world? After all, the ability to conduct such testing continuously on a large scale is considered one of the special features of big data.
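At bottom, an A-B test of this kind is a randomized comparison of response rates between two groups. As an illustrative aside (the counts below are invented, and this is a generic two-proportion z-test, not any company’s actual methodology), the statistics reduce to a few lines:

```python
# Minimal sketch of the statistics behind an A-B test: compare the
# conversion rates of two message variants with a two-proportion z-test.
# The counts are hypothetical, chosen only for illustration.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for H0: rate_A == rate_B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 * P(Z > |z|)
    return z, p_value

# Variant A: 540 conversions in 10,000 views; variant B: 610 in 10,000.
z, p = two_proportion_z(conv_a=540, n_a=10_000, conv_b=610, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests the variants differ
```

Run continuously at web scale, even tiny differences in response become detectable — which is precisely why the practice raises the questions discussed here.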
The answer calls for broader judgments than parsing the language of privacy policies or managing compliance with privacy laws and regulations. Existing legal tools such as notice-and-choice and use limitations are simply too narrow to address the array of issues presented and inform the judgment needed. Deciding whether Facebook ought to participate in research like its newsfeed study is not really about what the company can do but what it should do.
As Omer Tene and Jules Polonetsky, CIPP/US, point out in an article on Facebook’s research study, “Increasingly, corporate officers find themselves struggling to decipher subtle social norms and make ethical choices that are more befitting of philosophers than business managers or lawyers.” They add, “Going forward, companies will need to create new processes, deploying a toolbox of innovative solutions to engender trust and mitigate normative friction.” Tene and Polonetsky themselves have proposed a number of such tools. In recent comments on Consumer Privacy Bill of Rights legislation filed with the Commerce Department, the Future of Privacy Forum (FPF) endorsed the use of internal review boards along the lines of those used in academia for human-subject research. The FPF also submitted an initial framework for benefit-risk analysis in the big data context “to understand whether assuming the risk is ethical, fair, legitimate and cost-effective.” Increasingly, companies and other institutions are bringing to bear more holistic review of privacy issues. Conferences and panels on big data research ethics are proliferating.
The expanding variety and complexity of data uses also call for a broader public policy approach. The Obama administration’s Consumer Privacy Bill of Rights (of which I was an architect) adapted existing Fair Information Practice Principles to a principles-based approach that is intended not as a formalistic checklist but as a set of principles that work holistically in ways that are “flexible” and “dynamic.” In turn, much of the commentary submitted to the Commerce Department on the Consumer Privacy Bill of Rights addressed the question of the relationship between these principles and a “responsible use framework” as discussed in the White House Big Data Report….”

Enchanted Objects


Book by David Rose: “Some believe the future will look like more of the same—more smartphones, tablets, screens embedded in every conceivable surface. David Rose has a different vision: technology that atomizes, combining itself with the objects that make up the very fabric of daily living. Such technology will be woven into the background of our environment, enhancing human relationships and channeling desires for omniscience, long life, and creative expression. The enchanted objects of fairy tales and science fiction will enter real life.
Groundbreaking, timely, and provocative, Enchanted Objects is a blueprint for a better future, where efficient solutions come hand in hand with technology that delights our senses. It is essential reading for designers, technologists, entrepreneurs, business leaders, and anyone who wishes to understand the future and stay relevant in the Internet of Things. Download the prologue here.”

Detroit and Big Data Take on Blight


Susan Crawford in Bloomberg View: “The urban blight that has been plaguing Detroit was, until very recently, made worse by a dearth of information about the problem. No one could tell how many buildings needed fixing or demolition, or how effectively city services were being delivered to them (or not). Today, thanks to the combined efforts of a scrappy small business, tech-savvy city leadership and substantial philanthropic support, the extent of the problem is clear.
The question now is whether Detroit has the heart to use the information to make hard choices about its future.
In the past, when the city foreclosed on properties for failure to pay back taxes, it had no sense of where those properties were clustered. The city would auction off the houses for the bargain-basement price of $500 each, but the auction was entirely undocumented, so neighbors were unaware of investment opportunities, big buyers were gaming the system, and, as often as not, arsonists would then burn the properties down. The result of this blind spot was lost population, lost revenue and even more blight.
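The blind spot described here is, at its core, a missing aggregation over parcel records. As an illustrative aside (the records and neighborhood names below are hypothetical; MotorCityMapping’s real data is far richer, down to photographs), the basic computation is simple:

```python
# Minimal sketch of the aggregation the city lacked: count tax-foreclosed
# parcels per neighborhood to see where foreclosures cluster.
# All records here are invented for illustration.
from collections import Counter

parcels = [
    {"address": "123 Elm St",  "neighborhood": "Brightmoor", "foreclosed": True},
    {"address": "456 Oak Ave", "neighborhood": "Brightmoor", "foreclosed": True},
    {"address": "789 Pine Rd", "neighborhood": "Eastside",   "foreclosed": False},
    {"address": "12 Maple Ct", "neighborhood": "Osborn",     "foreclosed": True},
]

by_hood = Counter(p["neighborhood"] for p in parcels if p["foreclosed"])

# Neighborhoods with the most foreclosures surface first.
for hood, count in by_hood.most_common():
    print(f"{hood}: {count} foreclosed parcel(s)")
```

The hard part in Detroit was never the arithmetic — it was collecting the parcel data in the first place, which is what the survey described below accomplished.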
Then along came Jerry Paffendorf, a San Francisco transplant, who saw what was needed. His company, Loveland Technologies, started mapping all the tax-foreclosed and auctioned properties. Impressed with Paffendorf’s zeal, the city’s Blight Task Force, established by President Barack Obama and funded by foundations and the state Housing Development Authority, hired his team to visit every property in the city. That led to MotorCityMapping.org, the first user-friendly collection of information about all the attributes of every property in Detroit — including photographs.
Paffendorf calls this map a “scan of the genome of the city.” It shows more than 84,000 blighted structures and vacant lots; in eight neighborhoods, crime, fires and other torments have led to the abandonment of more than a third of houses and businesses. To demolish all those houses, as recommended by the Blight Task Force, will cost almost $2 billion. Still more money will then be needed to repurpose the sites….”

Open Intellectual Property Casebook


New book by James Boyle & Jennifer Jenkins: “…This book, the first in a series of Duke Open Coursebooks, is available for free download under a Creative Commons license. It can also be purchased in a glossy paperback print edition for $29.99, $130 cheaper than other intellectual property casebooks.
This book is an introduction to intellectual property law, the set of private legal rights that allows individuals and corporations to control intangible creations and marks—from logos to novels to drug formulae—and the exceptions and limitations that define those rights. It focuses on the three main forms of US federal intellectual property—trademark, copyright and patent—but many of the ideas discussed here apply far beyond those legal areas and far beyond the law of the United States.
The book is intended to be a textbook for the basic Intellectual Property class, but because it is an open coursebook, which can be freely edited and customized, it is also suitable for an undergraduate class, or for a business, library studies, communications or other graduate school class. Each chapter contains cases and secondary readings and a set of problems or role-playing exercises involving the material. The problems range from a video of the Napster oral argument to counseling clients about search engines and trademarks, applying the First Amendment to digital rights management and copyright, or commenting on the Supreme Court’s new rulings on gene patents.
Intellectual Property: Law & the Information Society is current as of August 2014. It includes discussions of such issues as the Redskins trademark cancelations, the Google Books case and the America Invents Act. Its illustrations range from graphs showing the growth in patent litigation to comic book images about copyright. The best way to get some sense of its coverage is to download it. In coming weeks, we will provide a separate fuller webpage with a table of contents and individual downloadable chapters.
The Center has also published an accompanying supplement of statutory and treaty materials that is available for free download and low cost print purchase.”

Our future government will work more like Amazon


Michael Case in The Verge: “There is a lot of government in the United States. Several hundred federal agencies, 535 voting members in two houses of Congress, more than 90,000 state and local governments, and over 20 million Americans involved in public service.

We say we have a government for and by the people. But the way American government conducts its day-to-day business does not feel like anything we, the people weaned on the internet, would design in 2014. Most interactions with the US government don’t resemble anything else we’re used to in our daily lives….

But if the government is ever going to completely retool itself to provide sensible services to a growing, aging, diversifying American population, it will have to do more than bring in a couple innovators and throw data at the public. At the federal level, these kinds of adjustments will require new laws to change the way money is allocated to executive branch agencies so they can coordinate the purchase and development of a standard set of tools. State and local governments will have to agree on standard tools and data formats as well so that the mayor of Anchorage can collaborate with the governor of Delaware.

Technology is the answer to a lot of American government’s current operational shortcomings. Not only are the tools and systems most public servants use outdated and suboptimal, but the organizations and processes themselves have also calcified around similarly out-of-date thinking. So the real challenge won’t be designing cutting edge software or high tech government facilities — it’s going to be conjuring the will to overcome decades of old thinking. It’s going to be convincing over 90,000 employees to learn new skills, coaxing a bitterly divided Congress to collaborate on something scary, and finding a way to convince a timid and distracted White House to put its name on risky investments that won’t show benefits for many years.

But! If we can figure out a way for governments across the country to perform their basic functions and provide often life-saving services, maybe we can move on to chase even more elusive government tech unicorns. Imagine voting from your smartphone, having your taxes calculated and filed automatically with a few online confirmations, or filing for your retirement at a friendly tablet kiosk at your local government outpost. Government could — feasibly — be not only more effective, but also a pleasure to interact with someday. Someday.”

Crowd-Sourced, Gamified Solutions to Geopolitical Issues


Gamification Corp: “Daniel Green, co-founder and CTO of Wikistrat, spoke at GSummit 2014 on an intriguing topic: How Gamification Motivates All Age Groups: Or How to Get Retired Generals to Play Games Alongside Students and Interns.

Wikistrat, a crowdsourced consulting company, leverages a worldwide network of experts from various industries to solve some of the world’s geopolitical problems through the power of gamification. Wikistrat also leverages fun, training, mentorship, and networking as core concepts in their company.

Dan (@wsdan) spoke with TechnologyAdvice host Clark Buckner about Wikistrat’s work, origins, what clients can expect from working with Wikistrat, and how gamification correlates with big data and business intelligence. Listen to the podcast and read the summary below:

Wikistrat aims to solve a common problem faced by most governments and organizations when generating strategies: “groupthink.” Such entities can devise a diverse set of strategies, but they always seem to find their resolution in the most popular answer.

In order to break group thinking, Wikistrat carries out geopolitical simulations that work around “collaborative competition.” The process involves:

  • Securing analysts: Wikistrat recruits a diverse group of analysts who are experts in certain fields and located in different strategic places.

  • Competing with ideas: These analysts are placed in an online environment where, instead of competing with each other, one analyst contributes an idea, then other analysts create 2-3 more ideas based on the initial idea.

  • Breaking group thinking: Now the competition becomes only about ideas. People champion the ideas they care about rather than arguing with other analysts. That’s when Wikistrat breaks group thinking and helps their clients discover ideas they may have never considered before.

Gamification occurs when analysts create different scenarios for a specific angle or question the client raises. Plus, Wikistrat’s global analyst coverage is so good that they tout having at least one expert in every country. They accomplished this by allowing anyone—not just four-star generals—to register as an analyst. However, applicants must submit a resume and a writing sample, as well as pass a face-to-face interview….”

Beyond just politics: A systematic literature review of online participation


Paper by Christoph Lutz, Christian Pieter Hoffmann, and Miriam Meckel in First Monday: “This paper presents a systematic literature review of the current state of research on online participation. The review draws on four databases and is guided by the application of six topical search terms. The analysis strives to differentiate distinct forms of online participation and to identify salient discourses within each research field. We find that research on online participation is highly segregated into specific sub-discourses that reflect disciplinary boundaries. Research on online political participation and civic engagement is identified as the most prominent and extensive research field. Yet research on other forms of participation, such as cultural, business, education and health participation, provides distinct perspectives and valuable insights. We outline both field-specific and common findings and derive propositions for future research.”