White House: "We Want Your Input on Building a More Open Government"


Nick Sinai at the White House Blog: “…We are proud of this progress, but recognize that there is always more we can do to build a more efficient, effective, and accountable government. In that spirit, the Obama Administration has committed to develop a second National Action Plan on Open Government: “NAP 2.0.”
In order to develop a Plan with the most creative and ambitious solutions, we need all-hands-on-deck. That’s why we are asking for your input on what should be in the NAP 2.0:

  1. How can we better encourage and enable the public to participate in government and increase public integrity? For example, in the first National Action Plan, we required Federal enforcement agencies to make publicly available compliance information easily accessible, downloadable and searchable online – helping the public to hold the government and regulated entities accountable.
  • What other kinds of government information should be made more available to help inform decisions in your communities or in your lives?
  • How would you like to be able to interact with Federal agencies making decisions which impact where you live?
  • How can the Federal government better ensure broad feedback and public participation when considering a new policy?
  2. The American people must be able to trust that their Government is doing everything in its power to stop wasteful practices and earn a high return on every tax dollar that is spent. How can the government better manage public resources?
  • What suggestions do you have to help the government achieve savings while also improving the way that government operates?
  • What suggestions do you have to improve transparency in government spending?
  3. The American people deserve a Government that is responsive to their needs, makes information readily accessible, and leverages Federal resources to help foster innovation in both the public and private sectors. How can the government more effectively work in collaboration with the public to improve services?
  • What are your suggestions for ways the government can better serve you when you are seeking information or help in trying to receive benefits?
  • In the past few years, the government has promoted the use of “grand challenges,” ambitious yet achievable goals to solve problems of national priority, and incentive prizes, where the government identifies challenging problems and provides prizes and awards to the best solutions submitted by the public.  Are there areas of public services that you think could be especially benefited by a grand challenge or incentive prize?
  • What information or data could the government make more accessible to help you start or improve your business?

Please think about these questions and send your thoughts to [email protected] by September 23. We will post a summary of your submissions online in the future.”

How Mechanical Turkers Crowdsourced a Huge Lexicon of Links Between Words and Emotion


The Physics arXiv Blog: “Sentiment analysis on the social web depends on how a person’s state of mind is expressed in words. Now a new database of the links between words and emotions could provide a better foundation for this kind of analysis.


One of the buzzphrases associated with the social web is sentiment analysis. This is the ability to determine a person’s opinion or state of mind by analysing the words they post on Twitter, Facebook or some other medium.
Much has been promised with this method—the ability to measure satisfaction with politicians, movies and products; the ability to better manage customer relations; the ability to create dialogue for emotion-aware games; the ability to measure the flow of emotion in novels; and so on.
The idea is to entirely automate this process—to analyse the firehose of words produced by social websites using advanced data mining techniques to gauge sentiment on a vast scale.
But all this depends on how well we understand the emotion and polarity (whether negative or positive) that people associate with each word or combinations of words.
Today, Saif Mohammad and Peter Turney at the National Research Council Canada in Ottawa unveil a huge database of words and their associated emotions and polarity, which they have assembled quickly and inexpensively using Amazon’s Mechanical Turk crowdsourcing website. They say this crowdsourcing mechanism makes it possible to increase the size and quality of the database quickly and easily…. The result is a comprehensive word–emotion lexicon for over 10,000 words or two-word phrases, which they call EmoLex….
The bottom line is that sentiment analysis can only ever be as good as the database on which it relies. With EmoLex, analysts have a new tool for their box of tricks.”
Ref: arxiv.org/abs/1308.6297: Crowdsourcing a Word-Emotion Association Lexicon
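The basic use of a word–emotion lexicon is straightforward to sketch. The toy entries below are invented for illustration and are not actual EmoLex data (the real lexicon covers over 10,000 terms, each tagged with polarity and eight basic emotions):

```python
# Toy illustration of lexicon-based sentiment analysis in the style of
# EmoLex. The entries below are invented examples, not real EmoLex rows.
from collections import Counter

LEXICON = {
    "delay":  {"polarity": "negative", "emotions": {"anger", "anticipation"}},
    "shout":  {"polarity": "negative", "emotions": {"anger"}},
    "gift":   {"polarity": "positive", "emotions": {"joy", "surprise"}},
    "rescue": {"polarity": "positive", "emotions": {"joy", "trust"}},
}

def score_text(text):
    """Count polarity and emotion hits for each lexicon word in the text."""
    polarity = Counter()
    emotions = Counter()
    for word in text.lower().split():
        entry = LEXICON.get(word.strip(".,!?"))
        if entry:
            polarity[entry["polarity"]] += 1
            emotions.update(entry["emotions"])
    return polarity, emotions

polarity, emotions = score_text("Another delay, and staff who shout. No gift of a day.")
# polarity: 2 negative hits vs 1 positive; emotions: anger dominates
```

As the article notes, analysis like this can only ever be as good as the lexicon behind it, which is why the size and quality of EmoLex matter.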

Smaller, Better, Faster, Stronger: Remaking government for the digital age


New Report by PolicyExchange (UK): “The government could save as much as £70 billion by 2020 if it adopted plans to eliminate paper and digitise its activities, work smarter with fewer staff in Whitehall, shop around for the best procurement deals, and accelerate the use of data and analytics.
Smaller, Better, Faster, Stronger shows how the government is wasting billions of pounds by relying on paper-based public services. The Crown Prosecution Service prints one million sheets of paper every day, while two articulated trucks loaded with letters and paperwork pull into the Driver and Vehicle Licensing Agency (DVLA) daily. Even when a passport application form is completed online, the Passport Office prints the form out and posts it back for the individual to sign and send back.
In the near future, everything the government does should be online, unless a face-to-face interaction is essential. The UK is already a nation of internet users, with nearly 6 in 10 people accessing the internet via a smartphone. People expect even simple government services like tax returns or driving licences to be online. Fully transforming government with digital technologies could help close the gap between productivity in the public and private sectors.
The report also calls for stronger digital and data skills in Whitehall, making the point that senior officials will make or break this agenda by the interest they take in digital and their willingness to keep up with the times.”

The Other Side of Open is Not Closed


Dazza Greenwood at Civics.com: “Impliedly, the opposite of “open” is “closed,” but the other side of open data, open APIs and open access is usually still about enabling access, only when allowed or required. Open government also needs to include adequate methods to access and work with data and other resources that are not fully open. In fact, many (most?) instances of high-value, mission-critical and societally important data access are restricted in some way. If a data set is not fully public record, then a good practice is to think of it as “protected” and to ensure access according to proper controls.
As a metaphorical illustration, you could look at an open data system like a village square or agora that is architected and intended to be broadly accessible. On the other side of the spectrum, you could see a protected data system more like a castle or garrison, that is architected to be secure from intruders but features guarded gates and controlled access points in order to function.
In fact, this same conceptual approach applies well beyond data and includes everything you could consider a resource on the Internet. In other words, any asset, service, process or other item that can exist at a URL (or URI) is a resource and can be positioned somewhere on a spectrum from openly accessible to access-protected. It is easy to forget that the “R” in URL stands for “Resource,” and the whole wonderful web connects to resources of every nature and description. Data – structured, raw or otherwise – is just the tip of the iceberg.
Resources on the web could be apps and other software, or large-scale enterprise network services, or just a single text file with a few lines of HTML. The concept of enabling access to “protected resources” on the web is the cornerstone of OAuth2 and is now being extended by the OpenID Connect standard, the User Managed Access protocol and other specifications to enable a powerful array of REST-based authorization possibilities…”
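The "protected resource" pattern Greenwood describes boils down to the bearer-token mechanics of OAuth2 (RFC 6750): the client presents an access token with each request, and the resource server checks it before releasing anything. A minimal sketch, with a placeholder URL and token rather than a real endpoint:

```python
# Sketch of the OAuth2 bearer-token pattern for protected resources
# (RFC 6750): the access token travels in the Authorization header.
# The URL and token below are placeholders, not a live API.
import urllib.request

def protected_request(url, access_token):
    """Build a request for a protected resource using a bearer token."""
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {access_token}")
    return req

req = protected_request("https://api.example.org/records/42", "TOKEN123")
# Sending this request (urllib.request.urlopen(req)) would succeed or fail
# depending on whether the server's gatekeeper accepts the token.
```

The castle-and-gate metaphor maps directly: the URL is the gate, and the token is the credential the guard inspects.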

Citizen science versus NIMBY?


Ethan Zuckerman’s latest blog: “Safecast is a remarkable project born out of a desire to understand the health and safety implications of the release of radiation from the Fukushima Daiichi nuclear power plant in the wake of the March 11, 2011 earthquake and tsunami. Unsatisfied with limited and questionable information about radiation released by the Japanese government, Joi Ito, Peter, Sean and others worked to design, build and deploy GPS-enabled geiger counters which could be used by concerned citizens throughout Japan to monitor alpha, beta and gamma radiation and understand what parts of Japan have been most affected by the Fukushima disaster.

The Safecast project has produced an elegant map that shows how complicated the Fukushima disaster will be for the Japanese government to recover from. While there are predictably elevated levels of radiation immediately around the Fukushima plant and in the 18 mile exclusion zones, there is a “plume” of increased radiation south and west of the reactors. The map is produced from millions of radiation readings collected by volunteers, who generally take readings while driving – Safecast’s bGeigie meter automatically takes readings every few seconds and stores them along with associated GPS coordinates for later upload to the server.
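Zuckerman doesn't detail how Safecast turns millions of drive-by readings into a map, but the general technique is familiar: snap each geo-tagged reading to a grid cell and aggregate per cell. A hedged sketch with invented readings and an assumed cell size:

```python
# Hedged sketch of binning geo-tagged radiation readings (as counts per
# minute, CPM) into grid cells for mapping. The readings and the 0.01-degree
# cell size are invented for illustration, not Safecast's actual pipeline.
from collections import defaultdict

def grid_cell(lat, lon, cell_deg=0.01):
    """Snap a coordinate to a grid cell roughly 1 km across."""
    return (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)

def average_by_cell(readings):
    """readings: iterable of (lat, lon, cpm) tuples. Returns mean CPM per cell."""
    cells = defaultdict(list)
    for lat, lon, cpm in readings:
        cells[grid_cell(lat, lon)].append(cpm)
    return {cell: sum(v) / len(v) for cell, v in cells.items()}

means = average_by_cell([
    (37.4215, 141.0327, 120.0),  # two nearby readings fall in one cell
    (37.4213, 141.0329, 100.0),
    (35.6895, 139.6917, 30.0),   # a distant reading lands in another
])
```

Averaging many independent volunteer readings per cell is also what lets a crowdsourced map smooth over the noise of individual consumer-grade sensors.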
This long and thoughtful blog post about the progress of government decontamination efforts, the cost-benefit of those efforts, and the government’s transparency or opacity around cleanup gives a sense for what Safecast is trying to do: provide ways for citizens to check and verify government efforts and understand the complexity of decisions about radiation exposure. This is especially important in Japan, as there’s been widespread frustration over the failures of TEPCO to make progress on cleaning up the reactor site, leading to anger and suspicion about the larger cleanup process.
For me, Safecast raises two interesting questions:
– If you’re not getting trustworthy or sufficient information from your government, can you use crowdsourcing, citizen science or other techniques to generate that data?
– How does collecting data relate to civic engagement? Is it a path towards increased participation as an engaged and effective citizen?
To have some time to reflect on these questions, I decided I wanted to try some of my own radiation monitoring. I borrowed Joi Ito’s bGeigie and set off for my local Spent Nuclear Fuel and Greater-Than-Class C Low Level Radioactive Waste dry cask storage facility…

Projects like Safecast – and the projects I’m exploring this coming year under the heading of citizen infrastructure monitoring – have a challenge. Most participants aren’t going to uncover Ed Snowden-calibre information by driving around with a geiger counter or mapping wells in their communities. Lots of the data collected is going to reveal that governments and corporations are doing their jobs, as my data suggests. It’s easy to trace a path from collecting groundbreaking data to getting involved with deeper civic and political issues – but will collecting data showing that the local nuclear plant is apparently safe get me more involved with issues of nuclear waste disposal?
It just might. One of the great potentials of citizen science and citizen infrastructure monitoring is the possibility of reducing the exotic to the routine….”

Employing digital crowdsourced information resources: Managing the emerging information commons


New Paper by Robin Mansell in the International Journal of the Commons: “This paper examines the ways loosely connected online groups and formal science professionals are responding to the potential for collaboration using digital technology platforms and crowdsourcing as a means of generating data in the digital information commons. The preferred approaches of each of these groups to managing information production, circulation and application are examined in the light of the increasingly vast amounts of data that are being generated by participants in the commons. Crowdsourcing projects initiated by both groups in the fields of astronomy, environmental science and crisis and emergency response are used to illustrate some of the barriers and opportunities for greater collaboration in the management of data sets initially generated for quite different purposes. The paper responds to claims in the literature about the incommensurability of emerging approaches to open information management as practiced by formal science and many loosely connected online groups, especially with respect to authority and the curation of data. Yet, in the wake of technological innovation and diverse applications of crowdsourced data, there are numerous opportunities for collaboration. This paper draws on examples employing different social technologies of authority to generate and manage data in the commons. It suggests several measures that could provide incentives for greater collaboration in the future. It also emphasises the need for a research agenda to examine whether and how changes in social technologies might foster collaboration in the interests of reaping the benefits of increasingly large data resources for both shorter term analysis and longer term accumulation of useful knowledge.”

Mapping the Twitterverse


Phys.org: “What does your Twitter profile reveal about you? More than you know, according to Chris Weidemann. The GIST master’s student has developed an application that follows geospatial footprints.
You start your day at your favorite breakfast spot. When your order of strawberry waffles with extra whipped cream arrives, it’s too delectable not to share with your Twitter followers. You snap a photo with your smartphone and hit send. Then, it’s time to hit the books.
You tweet your friends that you’ll be at the library on campus. Later that day, palm trees silhouette a neon-pink sunset. You can’t resist. You tweet a picture with the hashtag #ILoveLA.
You may not realize that when you tweet those breezy updates and photos of food, you are sharing information about your location.
Chris Weidemann, a graduate student in the Geographic Information Science and Technology (GIST) online master’s program at USC Dornsife, investigated just how much public geospatial data is generated by Twitter users and how their information – available through Twitter’s application programming interface (API) – could potentially be used by third parties. His study was published in June 2013 in the International Journal of Geoinformatics.
Twitter has approximately 500 million active users, and reports show that 6 percent of users opt in to allow the platform to broadcast their location using global positioning technology with each tweet they post. That’s about 30 million people sending geo-tagged data out into the Twitterverse. In their tweets, people can choose whether their information is displayed as a city and state, an address, or their precise latitude and longitude.
That’s only part of their geospatial footprint. Information contained in a post may reveal a user’s location. Depending upon how the account is set up, profiles may include details about their hometown, time zone and language.”
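To make the "geospatial footprint" concrete, here is a sketch of the location signals a single tweet can carry, using field names from Twitter's classic v1.1 JSON format (`coordinates`, `place`, and user-profile fields); the sample tweet itself is fabricated:

```python
# Sketch of the location signals one tweet object can expose, using
# Twitter's classic v1.1 JSON field names. The sample tweet is invented.
def geo_footprint(tweet):
    """Collect whatever location signals are present in one tweet dict."""
    signals = {}
    if tweet.get("coordinates"):                     # opt-in GPS point (lon, lat)
        lon, lat = tweet["coordinates"]["coordinates"]
        signals["gps"] = (lat, lon)
    if tweet.get("place"):                           # city/neighborhood tag
        signals["place"] = tweet["place"].get("full_name")
    user = tweet.get("user", {})
    for field in ("location", "time_zone", "lang"):  # profile-level hints
        if user.get(field):
            signals[field] = user[field]
    return signals

sample = {
    "coordinates": {"type": "Point", "coordinates": [-118.2851, 34.0224]},
    "place": {"full_name": "Los Angeles, CA"},
    "user": {"location": "Los Angeles",
             "time_zone": "Pacific Time (US & Canada)", "lang": "en"},
}
footprint = geo_footprint(sample)
```

Even without the opt-in GPS point, the profile fields alone often narrow a user down to a city and time zone, which is Weidemann's point about unintended disclosure.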
 

Government Is a Good Venture Capitalist


Wall Street Journal: “In a knowledge-intensive economy, innovation drives growth. But what drives innovation? In the U.S., most conservatives believe that economically significant new ideas originate in the private sector, through either the research-and-development investments of large firms with deep pockets or the inspiration of obsessive inventors haunting shabby garages. In this view, the role of government is to secure the basic conditions for honest and efficient commerce—and then get out of the way. Anything more is bound to be “wasteful” and “burdensome.”
The real story is more complex and surprising. For more than four decades, R&D magazine has recognized the top innovations—100 each year—that have moved past the conceptual stage into commercial production and sales. Economic sociologists Fred Block and Matthew Keller decided to ask a simple question: Where did these award-winning innovations come from?
The data indicated seven kinds of originating entities: Fortune 500 companies; small and medium enterprises (including startups); collaborations among private entities; government laboratories; universities; spinoffs started by researchers at government labs or universities; and a grab bag of other public and nonprofit agencies.
Messrs. Block and Keller randomly selected three years in each of the past four decades and analyzed the resulting 1,200 innovations. About 10% originated in foreign entities; the sociologists focused on the domestic innovations, more than 1,050.
Two of their findings stand out. First, the number of award winners originating in Fortune 500 companies—either working alone or in collaboration with others—has declined steadily and sharply, from an annual average of 44 in the 1970s to only nine in the first decade of this century.
Second, the number of top innovations originating in federal laboratories, universities or firms formed by former researchers in those entities rose dramatically, from 18 in the 1970s to 37 in the 1980s and 55 in the 1990s before falling slightly to 49 in the 2000s. Without the research conducted in federal labs and universities (much of it federally funded), commercial innovation would have been far less robust…”

AeroSee: Crowdsourcing Rescue using Drones


“The AeroSee experiment is an exciting new project where you can become a virtual mountain rescue assistant from the comfort of your own home, simply by using your computer. Every year Patterdale Mountain Rescue assist hundreds of injured and missing persons from around the Ullswater area in the North of the Lake District. The average search takes several hours and can require a large team of volunteers to set out in often poor weather conditions. This experiment is to see how the use of UAV (or ‘Drone’) technology, together with your ‘crowd-sourced’ help can reduce the time taken to locate and rescue a person in distress.
Civilian Use of UAVs
Here at UCLan we are interested in investigating the civilian applications of unmanned systems. They offer a rich and exciting source of educational and research material for our students and research staff. As regulations for their use are being developed and matured by aviation authorities, it is important that research is conducted to maximise their benefits to society.
Our Partners
We are working with e-Migs, a light-UAV operator who are providing and operating the aircraft during the search.
How aeroSee Works
Upon receiving a rescue call, the Mountain Rescue services plan their search area and we dispatch our unmanned, remotely piloted aircraft to begin a search for the missing persons. Using a real-time video link, the aircraft transmits pictures of terrain to our website, where you can help by viewing the images and tagging the photos if you spot something that the Mountain Rescue services need to investigate, such as a possible sighting of an injured party. We use computer algorithms to process the tagged data that we receive and pass this processed data on to the Mountain Rescue Control Centre for a final opinion and dispatch of search teams.
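The project doesn't say what those processing algorithms are, but a natural baseline is consensus voting: only forward an image region to rescuers when enough independent volunteers tag roughly the same spot. A minimal sketch under that assumption, with invented tag data:

```python
# Hedged sketch of crowd-tag consensus (not AeroSee's actual algorithm):
# flag an image region only when at least `min_votes` volunteers tagged
# pixels falling in the same cell of a coarse grid.
from collections import defaultdict

def consensus_sightings(tags, cell=50, min_votes=3):
    """tags: (image_id, x, y) pixel tags from volunteers.
    Returns regions that at least `min_votes` volunteers agreed on."""
    votes = defaultdict(int)
    for image_id, x, y in tags:
        votes[(image_id, x // cell, y // cell)] += 1
    return [region for region, n in votes.items() if n >= min_votes]

hits = consensus_sightings([
    ("img7", 410, 220), ("img7", 432, 205), ("img7", 428, 238),  # agreement
    ("img7", 900, 40),                                           # lone tag
])
```

Filtering like this keeps lone mis-taps from diverting a search team while letting genuine agreement surface quickly.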
We believe this approach can reduce time and save lives, and we need your help to prove it. Once you are signed up, you can practice using the site by choosing ‘Practice Mission’ from the menu bar. Fancy yourself as a Virtual Search agent? If you have not already done so, sign up here:
Get Started!

Weather Channel Now Also Forecasts What You'll Buy


Katie Rosman in the Wall Street Journal: “The Weather Channel knows the chance for rain in St. Louis on Friday, what the heat index could reach in Santa Fe on Saturday and how humid Baltimore may get on Sunday.
It also knows when you’re most likely to buy bug spray.
The enterprise is transforming from a cable network viewers flip to during hurricane season into an operation that forecasts consumer behavior by analyzing when, where and how often people check the weather. Last fall the Weather Channel Cos. renamed itself the Weather Co. to reflect the growth of its digital-data business.

The Atlanta-based company has amassed more than 75 years’ worth of information: temperatures, dew points, cloud-cover percentages and much more, across North America and elsewhere.
The company supplies information for many major smartphone weather apps and has invested in data-crunching algorithms. It uses this analysis to appeal to advertisers who want to fine-tune their pitches to consumers….
Weather Co. researchers are now diving into weather-sentiment analysis – how local weather makes people feel, and then act – in different regions of the country. To cull this data, Mr. Walsh’s weather-analytics team directly polls visitors to the Weather.com website, asking them about their moods and purchases on specific days.
In a series of polls conducted between June 3 and Nov. 4 last year, residents of the Northeast region responded to the question, “Yesterday, what was your mood for most of the day?”