Procurement needs better data now


Howard Rolfe, procurement director for East of England NHS Collaborative Procurement Hub, in The Guardian: “Knowledge management is fundamental to any organisation and procurement in the NHS is no exception. Current systems are not joined up and don’t give the level of information that should be expected. Management in many NHS trusts cannot say how effective procurement is within their organisation because they don’t have a dashboard of information that tells them, for example, the biggest spend areas, who is placing the order, what price is paid and how that price compares.
Systems now exist that could help answer these questions and increase board and senior management focus on this area of huge spend…. The time for better data is now, the opportunity is at the top of political and management agendas and the need is overwhelming. What is the solution? The provision of effective knowledge management systems is key and will facilitate improvements in information, procurement and collaborative aggregation by providing greater visibility of spend and reduction of administrative activity.”
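The dashboard Rolfe describes is, at bottom, an aggregation over purchase-order data. As a rough illustration only (the categories, buyers and figures below are invented, not drawn from any NHS system), a few lines of Python can surface the biggest spend areas, who is placing the orders, and how each price paid compares with a reference price:

```python
from collections import defaultdict

# Hypothetical purchase-order records:
# (category, buyer, item, price_paid, reference_price)
orders = [
    ("Examination gloves", "Trust A", "Box of 100", 6.20, 5.80),
    ("Examination gloves", "Trust B", "Box of 100", 5.75, 5.80),
    ("IT hardware",        "Trust A", "Laptop",     640.00, 610.00),
]

# Biggest spend areas
spend_by_category = defaultdict(float)
for category, buyer, item, paid, reference in orders:
    spend_by_category[category] += paid
for category, total in sorted(spend_by_category.items(), key=lambda kv: -kv[1]):
    print(f"{category}: £{total:,.2f}")

# Who placed each order, what was paid, and how that compares
for category, buyer, item, paid, reference in orders:
    premium = (paid - reference) / reference * 100
    print(f"{buyer} paid £{paid:.2f} for {item} ({premium:+.1f}% vs reference)")
```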

The Dangers of Surveillance


Paper by Neil M. Richards in Harvard Law Review. Abstract:  “From the Fourth Amendment to George Orwell’s Nineteen Eighty-Four, our culture is full of warnings about state scrutiny of our lives. These warnings are commonplace, but they are rarely very specific. Other than the vague threat of an Orwellian dystopia, as a society we don’t really know why surveillance is bad, and why we should be wary of it. To the extent the answer has something to do with “privacy,” we lack an understanding of what “privacy” means in this context, and why it matters. Developments in government and corporate practices have made this problem more urgent. Although we have laws that protect us against government surveillance, secret government programs cannot be challenged until they are discovered.
… I propose a set of four principles that should guide the future development of surveillance law, allowing for a more appropriate balance between the costs and benefits of government surveillance. First, we must recognize that surveillance transcends the public-private divide. Even if we are ultimately more concerned with government surveillance, any solution must grapple with the complex relationships between government and corporate watchers. Second, we must recognize that secret surveillance is illegitimate, and prohibit the creation of any domestic surveillance programs whose existence is secret. Third, we should recognize that total surveillance is illegitimate and reject the idea that it is acceptable for the government to record all Internet activity without authorization. Fourth, we must recognize that surveillance is harmful. Surveillance menaces intellectual privacy and increases the risk of blackmail, coercion, and discrimination; accordingly, we must recognize surveillance as a harm in constitutional standing doctrine.”

How to Clean Up Social News


David Talbot in MIT Technology Review: “New platforms for fact-checking and reputation scoring aim to better channel social media’s power in the wake of a disaster…Researchers from the Masdar Institute of Technology and the Qatar Computing Research Institute plan to launch Verily, a platform that aims to verify social media information, in a beta version this summer. Verily aims to enlist people in collecting and analyzing evidence to confirm or debunk reports. As an incentive, it will award reputation points—or dings—to its contributors.
Verily will join services like Storyful that use various manual and technical means to fact-check viral information, and apps such as SwiftRiver that, among other things, let people set up filters on social media to provide more weight to trusted users in the torrent of posts following major events…Reputation scoring has worked well for e-commerce sites like eBay and Amazon and could help to clean up social media reports in some situations.”
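The reputation mechanic described above (points for contributions that hold up, “dings” for those that don’t, and more weight for trusted users) can be sketched in a few lines. The sketch below is a hypothetical illustration of the general idea, not Verily’s or Storyful’s actual scoring model: contributors’ votes on a report are weighted by their current reputation, and reputations are adjusted once the report is confirmed or debunked.

```python
from collections import defaultdict

# Every contributor starts with a neutral reputation weight.
reputation = defaultdict(lambda: 1.0)

def weighted_verdict(votes):
    """votes: list of (contributor, believes_report_is_true).
    Returns the reputation-weighted provisional verdict on the report."""
    score = sum(reputation[user] if believes_true else -reputation[user]
                for user, believes_true in votes)
    return score > 0

def settle(votes, report_was_true, reward=0.5, ding=1.0):
    """Once the report is confirmed or debunked, award points or dings."""
    for user, believes_true in votes:
        if believes_true == report_was_true:
            reputation[user] += reward
        else:
            reputation[user] = max(0.1, reputation[user] - ding)

votes = [("alice", True), ("bob", True), ("carol", False)]
print(weighted_verdict(votes))       # provisional, reputation-weighted call: True
settle(votes, report_was_true=True)  # later confirmed; alice and bob gain, carol is dinged
print(dict(reputation))
```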

Sanitation Hackathon


New York Times: “Because of the rapid spread of cellular phones, mobile technology has previously been used to address a variety of problems in the developing world, including access to financial services, health care information and education. But toilets were another matter….Building on a process that had previously been employed to address problems in supplying clean water to people in poor areas, the World Bank turned its attention to sanitation. Over six months last year, it solicited ideas from experts in the field, as well as software developers. The process culminated in early December with the actual hackathon — two days in which more than 1,000 developers gathered in 40 cities worldwide to work on their projects….After the event in Washington, the winners of the hackathon are set to travel to Silicon Valley for meetings with venture capitalists and entrepreneurs who are interested in the issue. The World Bank does not plan to invest in the projects, but hopes that others might.”
See also http://www.sanitationhackathon.org/

The Rise of Big Data


Kenneth Neil Cukier and Viktor Mayer-Schoenberger in Foreign Affairs: “Everyone knows that the Internet has changed how businesses operate, governments function, and people live. But a new, less visible technological trend is just as transformative: “big data.” Big data starts with the fact that there is a lot more information floating around these days than ever before, and it is being put to extraordinary new uses. Big data is distinct from the Internet, although the Web makes it much easier to collect and share data. Big data is about more than just communication: the idea is that we can learn from a large body of information things that we could not comprehend when we used only smaller amounts.”
Gideon Rose, editor of Foreign Affairs, sits down with Kenneth Cukier, data editor of The Economist (video).

Investigating Terror in the Age of Twitter


Michael Chertoff and Dallas Lawrence in WSJ: “A dozen years ago when the terrorists struck on 9/11, there was no Facebook or Twitter or i-anything on the market. Cellphones were relatively common, but when cell networks collapsed in 2001, many people were left disconnected and wanting for immediate answers. Last week in Boston, when mobile networks became overloaded following the bombings, the social-media-savvy Boston Police Department turned to Twitter, using the platform as a makeshift newsroom to alert media and concerned citizens to breaking news.
Law-enforcement agencies around the world will note how social media played a prominent role both in telling the story and writing its eventual conclusion. Some key lessons have emerged.”

China identified as main source behind cyber-espionage


Washington Post: “Analyses of hundreds of documented data breaches found that hackers affiliated with the Chinese government were by far the most energetic and successful cyberspies in the world last year, according to a report to be issued Tuesday by government and industry investigators.
Although hackers with financial motives are the most common source of data breaches worldwide, China dominated the category of state-affiliated cyber-espionage of intellectual property, said the 2013 Data Breach Investigations Report. The report was issued by Verizon’s RISK Team and 18 partners, including officials from the United States and several foreign governments.
Of 120 incidents of government cyber-espionage detailed in the report, 96 percent came from China; the source of the other 4 percent was unknown, it said.”

Knowing Where to Focus the Wisdom of Crowds


Nick Bilton in NYT: “It looks as if the theory of the “wisdom of crowds” doesn’t apply to terrorist manhunts. Last week after the Boston Marathon bombings, the Internet quickly offered to help find the people responsible. In a scene metaphorically reminiscent of a movie in which vigilantes swarm the streets with pitchforks and lanterns, people took to Reddit, the popular community and social news Web site, and started scouring images posted online from the bombings.
One Reddit forum told users to search for “people carrying black bags,” and noted that “if they look suspicious, then post them. Then people will try and follow their movements using all the images.” In the process, each time a scrap of information was discovered — the color of a hat, the type of straps on a backpack, the weighted droop of a bag — it was passed out on Twitter like “Wanted” posters tacked to lampposts. It didn’t matter whether it was right, wrong or even completely made up (some images posted to forums had been manipulated) — off it went, fiction and fact indistinguishable. Some misinformation online landed on the front page of The New York Post, incorrectly identifying an innocent high school student as a suspect. Later in the week, the Web wrongly identified one of the suspects as a student from Brown University who went missing earlier this month…
Perhaps the scariest aspect of these crowd-like investigations is that when information is incorrect, no one is held responsible.
As my colleague David Carr noted in his column this week, “even good reporters with good sources can end up with stories that go bad.” But the difference between CNN, The Associated Press or The New York Post getting it wrong, is that those names are held accountable when they publish incorrect news. No one is going to remember, or punish, the users on Reddit or Twitter who incorrectly identify random high school runners and missing college students as terrorists.”

Cognitive Overhead


/ˈkɑgnɪtɪv ˈoʊvərˌhɛd/

“How many logical connections or jumps your brain has to make in order to understand or contextualize the thing you’re looking at.”

In an earlier post, we reviewed Cass Sunstein’s latest book on the need for government to simplify processes so as to be more effective and participatory. David Lieb, co-founder and CEO of Bump, recently expanded upon this call for simplicity in a blog post at TechCrunch, arguing that anyone trying to engage with the public “should first and foremost minimize the Cognitive Overhead of their products, even though it often comes at the cost of simplicity in other areas.”

When explaining what Cognitive Overhead means, David Lieb uses the definition coined by Chicago web designer and engineer David Demaree: cognitive overhead describes “how many logical connections or jumps your brain has to make in order to understand or contextualize the thing you’re looking at.”

David Lieb says: “Minimizing cognitive overhead is imperative when designing for the mass market. Why? Because most people haven’t developed the pattern matching machinery in their brains to quickly convert what they see in your product (app design, messaging, what they heard from friends, etc.) into meaning and purpose.”

In many ways, the concept resonates with the so-called Cognitive Load Theory (CLT), which draws on educational psychology and has been used widely in the design of multimedia and other learning materials (to prevent overload). CLT focuses on identifying the conditions best aligned with human cognitive architecture, in which short-term memory can hold only a limited number of elements simultaneously. John Sweller, the founder of CLT, and others have therefore focused on the role of acquiring schemata (mind maps) in learning.

So how can we provide for cognitive simplicity? According to Lieb, we need to:

  • “Put the user in the middle of your flow. Make them press an extra button, make them provide some inputs, let them be part of the service-providing, rather than a bystander to it”;
  • Give the user real-time feedback;
  • “Slow down provisioning. Studies have shown that intentionally slowing down results on travel search websites can actually increase perceived user value — people realize and appreciate that the service is doing a lot of work searching all the different travel options on their behalf.” (A brief sketch of this pacing idea follows the list.)
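As a rough sketch of that third point (invented for illustration; the sources, function names and timings are not from Lieb’s post), a search flow can deliberately pace its results while streaming progress messages, so the user sees the work being done on their behalf:

```python
import time

def search_fares(route, sources=("AirCo", "BudgetJet", "FlyNow"), pause=0.5):
    """Query several (fictional) fare sources, reporting progress as we go and
    pausing briefly between sources so the effort is visible to the user."""
    results = []
    for source in sources:
        print(f"Searching {source} for {route} ...")  # real-time feedback
        time.sleep(pause)                             # intentional pacing, not real work
        results.append(f"{source}: 2 fares found")    # placeholder result
    return results

for line in search_fares("LHR -> BOS"):
    print(line)
```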

It seems imperative that anyone who wants to engage with the public (to tap into what Clay Shirky calls the “cognitive surplus” of the crowd) must focus, for instance when defining the problem to be solved, on the cognitive overhead of their engagement platform and message.

Reinvent Regulation


Reinvent Roundtable taking place on April 23, 2013, at 11:00 am PT: “Tim O’Reilly has some big ideas about how to dramatically modernize the entire notion of government regulation, particularly “algorithmic regulation” that harnesses computer power, much like top tech companies in Silicon Valley, to oversee the financial industry, which is using those same tools. This roundtable features some top talent from the Valley to apply their brains to figuring out how we could reinvent much more iterative regulation that constantly gets refined through analyzing data and processing feedback loops – much like Google refines its search techniques. In fact, we’ll have a top person from Google Search as well as someone from the US Treasury Department to work on these ideas.”
http://reinventors.net/roundtables/reinvent-regulation/
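To make the idea of iterative, data-driven regulation a little more concrete, here is a deliberately simplified sketch (my own illustration, not O’Reilly’s proposal or any actual rule): a feedback loop that measures an outcome from reported data, compares it with a target, and nudges a regulatory parameter accordingly, then repeats the next period.

```python
def adjust_capital_ratio(current_ratio, observed_default_rate,
                         target_default_rate=0.02, gain=0.5):
    """Tighten the (hypothetical) capital requirement when defaults run above
    target and relax it when they run below; the gain damps each correction."""
    error = observed_default_rate - target_default_rate
    return max(0.04, current_ratio + gain * error)

ratio = 0.08
for quarter, default_rate in enumerate([0.015, 0.030, 0.045, 0.020], start=1):
    ratio = adjust_capital_ratio(ratio, default_rate)
    print(f"Q{quarter}: observed defaults {default_rate:.1%} -> capital ratio {ratio:.2%}")
```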