The Platform for Political Innovation


The Platform for Political Innovation “…is a project aiming to strengthen the role and efficiency of Civil Society in Greece, focusing on the imperative need for innovation in policy-making. The project is based on the collaboration of three Greek Civil Society Organizations, which combine and build on their experience and know-how in order to maximize the impact of activities for Political Innovation. From October 2014 to May 2015, experimental applications of innovative processes and tools are being initiated in different Greek cities, focusing on the re-design of decision-making processes at the local and national level. The proposed action plan constitutes Phase B of the wider social project POLITEIA 2.0, which won the Audience Award in the 2012 European Investment Bank Social Innovation Tournament.

The activities of the Platform for Political Innovation focus on Research, Networking, Training, Digital Tools and the development of Innovation Workshops in four Greek cities, including:

Syntagma 2.0: workshops for the participatory design of a new Constitution for Greece by its citizens.

Pedio_Agora: workshops for the participatory design of a public space. Focus area: Varvakeios Square, Athens….(More)”

Why Entrepreneurs Should Go Work for Government


Michael Blanding interviewing Mitchell B. Weiss for HBS Working Knowledge:  “…In the past five years, cities around the world have increasingly become laboratories of innovation, producing idea labs that partner with outside businesses and nonprofits to solve thorny public policy problems—and along the way deal with challenges of knowing when to follow the established ways of government and when to break the mold. States and federal government, too, have been reaching out to designers, engineers, and entrepreneurs to help redo their operations. The new US Digital Service, for example, follows other federal efforts like 18F and the Presidential Innovation Fellows to streamline government websites and electronic records—adapting from models in the UK and elsewhere.

“We have many talented people in government, but by and large they have tended to be analysts and strategists, rather than inventors and builders,” says Weiss, who hopes his course can help change that. “One reason we didn’t have them is we weren’t training them. At policy schools we had not been training people to be all that entrepreneurial, and at business schools, we were not prepping or prodding entrepreneurial people to enter the public sector or even just to invent for the public realm.”

“Government should be naturals at crowdsourcing”

Government entrepreneurship takes many forms. There are “public-public entrepreneurs” who work within government agencies, as well as “private-public entrepreneurs” who establish private businesses that sell to government agencies or sometimes to citizens directly.

In Philadelphia, for example, Textizen enables citizens to communicate with city health and human services agencies by text messages, leading to new enforcement on air pollution controls. In California, OpenCounter streamlined registration for small businesses and provided zoning clearances in a fraction of the usual time. In New York, Mark43 is developing software to analyze crime statistics and organize law enforcement records. And in Boston, Bridj developed an on-demand bus service for routes underserved by public transportation.

The innovations are happening at a scale large enough to even attract venture capital investment, despite past VC skepticism about funding public projects.

“There was this paradox—on the one hand, government is the biggest customer in the world; on the other hand, 90 out of 100 VCs would say they don’t back business models that sell to government,” says Weiss. “Though that’s starting to change as startups and government are starting to change.” OpenGov received a $15 million round of funding last spring led by Andreessen Horowitz, and $17 million was pumped into civic social-networking app MindMixer last fall….

Governments could attract even more capital by examining their procurement rules to speed buying, says Weiss, adopting the same sense of urgency and lean-startup practices needed to succeed in entrepreneurial projects…(More)”

Crowdsourcing America’s cybersecurity is an idea so crazy it might just work


at the Washington Post: “One idea that’s starting to bubble up from Silicon Valley is the concept of crowdsourcing cybersecurity. As Silicon Valley venture capitalist Robert R. Ackerman, Jr. has pointed out, due to “the interconnectedness of our society in cyberspace,” cyber networks are best viewed as an asset that we all have a shared responsibility to protect. Push on that concept hard enough and you can see how many of the core ideas from Silicon Valley – crowdsourcing, open source software, social networking, and the creative commons – can all be applied to cybersecurity.

Silicon Valley venture capitalists are already starting to fund companies that describe themselves as crowdsourcing cybersecurity. For example, take Synack, a “crowd security intelligence” company that received $7.5 million in funding from Kleiner Perkins (one of Silicon Valley’s heavyweight venture capital firms), Allegis Ventures, and Google Ventures in 2014. Synack’s two founders are ex-NSA employees, and they are using that experience to inform an entirely new type of business model. Synack recruits and vets a global network of “white hat hackers,” and then offers their services to companies worried about their cyber networks. For a fee, these hackers are able to find and repair any security risks.

So how would crowdsourced national cybersecurity work in practice?

For one, there would be free and transparent sharing of computer code used to detect cyber threats between the government and private sector. In December, the U.S. Army Research Lab added a bit of free source code, a “network forensic analysis framework” known as Dshell, to the mega-popular code sharing site GitHub. Already, there have been 100 downloads and more than 2,000 unique visitors. The goal, says William Glodek of the U.S. Army Research Laboratory, is for this shared code to “help facilitate the transition of knowledge and understanding to our partners in academia and industry who face the same problems.”

This open sourcing of cyber defense would be enhanced with a scaled-up program of recruiting “white hat hackers” to become officially part of the government’s cybersecurity efforts. Popular annual events such as the DEF CON hacking conference could be used to recruit talented cyber sleuths to work alongside the government.

There have already been examples of communities where people facing a common cyber threat gather together to share intelligence. Perhaps the best-known example is the Conficker Working Group, a security coalition that was formed in late 2008 to share intelligence about malicious Conficker malware. Another example is the Financial Services Information Sharing and Analysis Center, which was created by presidential mandate in 1998 to share intelligence about cyber threats to the nation’s financial system.

Of course, there are some drawbacks to this crowdsourcing idea. For one, such a collaborative approach to cybersecurity might open the door to government cyber defenses being infiltrated by the enemy. Ackerman makes the point that you never really know who’s contributing to any community. Even on a site such as GitHub, it’s theoretically possible that an ISIS hacker or someone like Edward Snowden could download the code, reverse engineer it, and then use it to insert “Trojan horses” intended for military targets into the code…(More)”

A lot of private-sector data is also used for public good


Josh New in Computerworld: “As the private sector continues to invest in data-driven innovation, the capacity for society to benefit from this data collection grows as well. Much has been said about how the private sector is using the data it collects to improve corporate bottom lines, but positive stories about how that data contributes to the greater public good are largely unknown.
This is unfortunate, because data collected by the private sector is being used in a variety of important ways, including to advance medical research, to help students make better academic decisions and to provide government agencies and nonprofits with actionable insights. However, overzealous actions by government to restrict the collection and use of data by the private sector are likely to have a chilling effect on such data-driven innovation.
Companies are working to advance medical research with data sharing. Personal genetics company 23andMe, which offers its customers inexpensive DNA test kits, has obtained consent from three-fourths of its 800,000 customers to donate their genetic information for research purposes. 23andMe has partnered with pharmaceutical companies, such as Genentech and Pfizer, to advance genomics research by providing scientists with the data needed to develop new treatments for diseases like Crohn’s and Parkinson’s. The company has also worked with researchers to leverage its network of customers to recruit patients for clinical trials more effectively than through previous protocols.
Private-sector data is also helping students make more informed decisions about education. With the cost of attending college rising, data that helps make this investment worthwhile is incredibly valuable. The social networking company LinkedIn has built tools that provide prospective college students with valuable information about their potential career path, field of study and choice of school. By analyzing the education tracks and careers of its users, LinkedIn can offer students critical data-driven insights into how to make the best out of the enormous and costly decision to go to college. Through LinkedIn’s higher-education tools, students now have an unprecedented resource to develop data-supported education and career plans….(More)”

At Universities, a Push for Data-Driven Career Services


at The New York Times: “Officials at the University of California, San Diego, had sparse information on the career success of their graduates until they set up a branded page for the university on LinkedIn a couple of years ago.

“Back then, we had records on 125,000 alumni, but we had good employment information on less than 10,000 of them,” recalled Armin Afsahi, who oversees alumni relations as the university’s associate vice chancellor for advancement. “Aside from Qualcomm, which is in our back yard, we didn’t know who employed our alumni.”

Within three months of setting up the university page, LinkedIn connections surfaced information on 92,000 alumni, Mr. Afsahi said.

The LinkedIn page of University of California, San Diego.

….

“The old models of alumni relations don’t work,” Mr. Afsahi said. “We have to be a data-driven, intelligence-oriented organization to create the engagement and value” that students and alumni expect.

In an article on Sunday, I profiled two analytics start-ups, EverTrue and Graduway, which aim to help colleges and universities identify their best prospective donors or student mentors by scanning their graduates’ social networking activities. Each start-up taps into LinkedIn profiles of alumni — albeit in different ways — to help institutions of higher education stay up-to-date with their graduates’ contact information and careers.

Since 2013, however, LinkedIn has offered its own proprietary service, called University Pages, where schools can create hubs for alumni outreach and networking. About 25,000 institutions of higher learning around the world now have official university pages on the site…(More).”

USDA Opens VIVO Research Networking Tool to Public


Sharon Durham at the USDA: “VIVO, a Web application used internally by U.S. Department of Agriculture (USDA) scientists since 2012 to allow better national networking across disciplines and locations, is now available to the public. USDA VIVO will be a “one-stop shop” for Federal agriculture expertise and research outcomes. “USDA employs over 5,000 researchers to ensure our programs are based on sound public policy and the best available science,” said USDA Chief Scientist and Undersecretary for Research, Education, and Economics Dr. Catherine Woteki. “USDA VIVO provides a powerful Web search tool for connecting interdisciplinary researchers, research projects and outcomes with others who might bring a different approach or scope to a research project. Inviting private citizens to use the system will increase the potential for collaboration to solve food- and agriculture-related problems.”
The idea behind USDA VIVO is to link researchers with peers and potential collaborators to ignite synergy among our nation’s best scientific minds and to spark unique approaches to some of our toughest agricultural problems. This efficient networking tool enables scientists to easily locate others with a particular expertise. VIVO also makes it possible to quickly identify scientific expertise and respond to emerging agricultural issues, like specific plant and animal disease or pests.
USDA’s Agricultural Research Service (ARS), Economic Research Service, National Institute of Food and Agriculture, National Agricultural Statistics Service and Forest Service are the first five USDA agencies to participate in VIVO. The National Agricultural Library, which is part of ARS, will host the Web application. USDA hopes to add other agencies in the future.
VIVO was in part developed under a $12.2 million grant from the National Center for Research Resources, part of the National Institutes of Health (NIH). The grant, made under the 2009 American Recovery and Reinvestment Act, was provided to the University of Florida and collaborators at Cornell University, Indiana University, Weill Cornell Medical College, Washington University in St. Louis, the Scripps Research Institute and the Ponce School of Medicine.
VIVO’s underlying database draws information about research being conducted by USDA scientists from official public systems of record and then makes it uniformly available for searching. The data can then be easily leveraged in other applications. In this way, USDA is also making its research projects and related impacts available to the Federal RePORTER tool, released by NIH on September 22, 2014. Federal RePORTER is part of a collaborative effort between Federal entities and other research institutions to create a repository that will be useful to assess the impact of Federal research and development investments.”

Can Government Mine Tweets to Assess Public Opinion?


at Government Technology: “What if instead of going to a city meeting, you could go on Twitter, tweet your opinion, and still be heard by those in government? New research suggests this is a possibility.
The Urban Attitudes Lab at Tufts University has conducted research on accessing “big data” on social networking sites for civic purposes, according to Justin Hollander, associate professor in the Department of Urban and Environmental Policy and Planning at Tufts.
About six months ago, Hollander began researching new ways of assessing how people think about the places they live, work and play. “We’re looking to see how tapping into social media data to understand attitudes and opinions can benefit both urban planning and public policy,” he said.
Harnessing natural comments — there are about one billion tweets per day — could help governments learn what people are saying and feeling, said Hollander. And while formal types of data can be used as proxies for how happy people are, people openly share their sentiments on social networking sites.
Twitter and other social media sites can also provide information in an unobtrusive way. “The idea is that we can capture a potentially more valid and reliable view [of people’s] opinions about the world,” he said. As an inexact science, social science relies on a wide range of data sources to inform research, including surveys, interviews and focus groups; but people respond to being the subject of study, possibly affecting outcomes, Hollander said.
Hollander is also interested in extracting data from social sites because it can be done on a 24/7 basis, which means not having to wait for government to administer surveys, like the Decennial Census. Information from Twitter can also be connected to place; Hollander has approximated that about 10 percent of all tweets are geotagged to location.
In its first study earlier this year, the lab looked at using big data to learn about people’s sentiments and civic interests in New Bedford, Mass., comparing Twitter messages with the city’s published meeting minutes.
To extract tweets over a six-week period from February to April, researchers used the lab’s own software to capture 122,186 tweets geotagged within the city that also contained words pertaining to the New Bedford area. Hollander said anyone can use Twitter’s API to mine data from an area as small as a neighborhood containing a couple hundred houses.
Researchers used IBM’s SPSS Modeler software, compared against custom-designed software, to apply a sentiment dictionary of nearly 3,000 words, assigning a sentiment score to each phrase, ranging from -5 for awful feelings to +5 for feelings of elation. The lab did this for the Twitter messages, and found that about 7 percent were positive versus 5.5 percent negative; correspondingly, in the minutes, 1.7 percent were positive and 0.7 percent negative. In total, about 11,000 messages contained sentiments.
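The dictionary approach described here can be sketched in a few lines. The snippet below uses a toy dictionary of invented entries (the lab's roughly 3,000-word dictionary and its exact tokenization are not published in this excerpt), scoring each message and bucketing it as positive, negative, or neutral:

```python
import re

# Toy sentiment dictionary; the real one scores nearly 3,000 words/phrases
# from -5 (awful feelings) to +5 (elation). These entries are illustrative.
SENTIMENT = {"love": 3, "great": 3, "happy": 4,
             "awful": -5, "terrible": -4, "sad": -2}

def score_message(text):
    """Sum the dictionary scores of every token in one message."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return sum(SENTIMENT.get(tok, 0) for tok in tokens)

def classify(messages):
    """Bucket messages as positive, negative, or neutral (no scored words)."""
    counts = {"positive": 0, "negative": 0, "neutral": 0}
    for msg in messages:
        s = score_message(msg)
        if s > 0:
            counts["positive"] += 1
        elif s < 0:
            counts["negative"] += 1
        else:
            counts["neutral"] += 1
    return counts

tweets = ["Love the new park, great job!",
          "Traffic downtown is awful today",
          "City council meets at 7pm"]
print(classify(tweets))  # {'positive': 1, 'negative': 1, 'neutral': 1}
```

Note that most messages land in the neutral bucket, consistent with the study's finding that only about 11,000 of the 122,186 captured tweets contained sentiments at all.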
The lab also used NVivo qualitative software to analyze 24 key words in a one-year sample of the city’s meeting minutes. By searching for the same words in Twitter posts, the researchers found that “school,” “health,” “safety,” “parks,” “field” and “children” were used frequently across both mediums.
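The cross-medium keyword comparison can be approximated with plain frequency counting. A minimal sketch, using the six keywords named above and invented sample texts in place of the actual minutes and tweets:

```python
import re
from collections import Counter

# The six keywords the study found frequent in both mediums.
KEYWORDS = {"school", "health", "safety", "parks", "field", "children"}

def keyword_counts(docs, keywords):
    """Count occurrences of each tracked keyword across a corpus."""
    counts = Counter()
    for doc in docs:
        for tok in re.findall(r"[a-z]+", doc.lower()):
            if tok in keywords:
                counts[tok] += 1
    return counts

# Invented stand-ins for the two corpora being compared.
minutes = ["The school committee discussed playground safety.",
           "Parks department budget for field maintenance."]
tweets = ["New school opening near the parks!",
          "Health fair this weekend."]

# Counter intersection keeps only keywords appearing in BOTH corpora.
shared = keyword_counts(minutes, KEYWORDS) & keyword_counts(tweets, KEYWORDS)
print(sorted(shared))  # ['parks', 'school']
```

`Counter & Counter` takes the element-wise minimum, so a keyword survives only if it occurs at least once in each corpus, which mirrors the study's question of which topics span both mediums.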
….
Next up for the lab is a new study contrasting Twitter posts from four Massachusetts cities with the recent election results.

Finding Collaborators: Toward Interactive Discovery Tools for Research Network Systems


New paper by Charles D Borromeo, Titus K Schleyer, Michael J Becich, and Harry Hochheiser: “Background: Research networking systems hold great promise for helping biomedical scientists identify collaborators with the expertise needed to build interdisciplinary teams. Although efforts to date have focused primarily on collecting and aggregating information, less attention has been paid to the design of end-user tools for using these collections to identify collaborators. To be effective, collaborator search tools must provide researchers with easy access to information relevant to their collaboration needs.
Objective: The aim was to study user requirements and preferences for research networking system collaborator search tools and to design and evaluate a functional prototype.
Methods: Paper prototypes exploring possible interface designs were presented to 18 participants in semistructured interviews aimed at eliciting collaborator search needs. Interview data were coded and analyzed to identify recurrent themes and related software requirements. Analysis results and elements from paper prototypes were used to design a Web-based prototype using the D3 JavaScript library and VIVO data. Preliminary usability studies asked 20 participants to use the tool and to provide feedback through semistructured interviews and completion of the System Usability Scale (SUS).
Results: Initial interviews identified consensus regarding several novel requirements for collaborator search tools, including chronological display of publication and research funding information, the need for conjunctive keyword searches, and tools for tracking candidate collaborators. Participant responses were positive (SUS score: mean 76.4, SD 13.9). Opportunities for improving the interface design were identified.
Conclusions: Interactive, timeline-based displays that support comparison of researcher productivity in funding and publication have the potential to effectively support searching for collaborators. Further refinement and longitudinal studies may be needed to better understand the implications of collaborator search tools for researcher workflows.”
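For context, SUS scores like the one reported in the results are computed with the standard formula: ten items rated 1-5, odd (positively worded) items contribute (rating - 1), even (negatively worded) items contribute (5 - rating), and the sum is multiplied by 2.5 to give a 0-100 score. A quick sketch with hypothetical ratings:

```python
def sus_score(responses):
    """Standard System Usability Scale score from 10 ratings on a 1-5 scale.

    Odd-numbered items (index 0, 2, ...) contribute (rating - 1);
    even-numbered items contribute (5 - rating); the sum is scaled
    by 2.5 so the result falls between 0 and 100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected 10 ratings between 1 and 5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# One participant's hypothetical ratings for items 1 through 10:
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```

The reported mean of 76.4 is thus on a 0-100 scale rather than a percentage, sitting comfortably above the commonly cited average of 68.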

How social media is reshaping news


Monica Anderson And Andrea Caumont at Pew Research Center: “The ever-growing digital native news world now boasts about 5,000 digital news sector jobs, according to our recent calculations, 3,000 of which are at 30 big digital-only news outlets. Many of these digital organizations emphasize the importance of social media in storytelling and engaging their audiences. As journalists gather for the annual Online News Association conference, here are answers to five questions about social media and the news.
1. How do social media sites stack up on news? When you take into account both the total reach of a site (the share of Americans who use it) and the proportion of users who get news on the site, Facebook is the obvious news powerhouse among the social media sites. Roughly two-thirds (64%) of U.S. adults use the site, and half of those users get news there, amounting to 30% of the general population….
2. How do social media users participate in news? Half of social network site users have shared news stories, images or videos, and nearly as many (46%) have discussed a news issue or event. In addition to sharing news on social media, a small number are also covering the news themselves, by posting photos or videos of news events. Pew Research found that in 2014, 14% of social media users posted their own photos of news events to a social networking site, while 12% had posted videos. This practice has played a role in a number of recent breaking news events, including the riots in Ferguson, Mo.
3. How do social media users discover news? Facebook is an important source of website referrals for many news outlets, but the users who arrive via Facebook spend far less time and consume far fewer pages than those who arrive directly. The same is true of users arriving by search. Our analysis of comScore data found visitors who go to a news media website directly spend roughly three times as long as those who wind up there through search or Facebook, and they view roughly five times as many pages per month. This higher level of engagement from direct visitors is evident whether a site’s traffic is driven by search or social sharing, and it has big implications for news organizations that are experimenting with digital subscriptions while endeavoring to build a loyal audience.
4. What’s the news experience like on Facebook? Our study of news consumption on Facebook found Facebook users are experiencing a relatively diverse array of news stories on the site; roughly half of Facebook users regularly see six different topic areas. The most common news people see is entertainment news: 73% of Facebook users regularly see this kind of content on the site. Unlike Twitter, where a core function is the distribution of information as news breaks, Facebook is not yet a place many turn to for learning about breaking news. …
5. How does social media impact the discussion of news events? Our recent survey revealed social media doesn’t always facilitate conversation around the important issues of the day. In fact, we found people were less willing to discuss their opinion on the Snowden-NSA story on social media than they were in person. And Facebook and Twitter users were less likely to want to share their opinions in many face-to-face settings, especially if they felt their social audience disagreed with them.”

Crowd-Sourced, Gamified Solutions to Geopolitical Issues


Gamification Corp: “Daniel Green, co-founder and CTO of Wikistrat, spoke at GSummit 2014 on an intriguing topic: How Gamification Motivates All Age Groups: Or How to Get Retired Generals to Play Games Alongside Students and Interns.

Wikistrat, a crowdsourced consulting company, leverages a worldwide network of experts from various industries to solve some of the world’s geopolitical problems through the power of gamification. Wikistrat also leverages fun, training, mentorship, and networking as core concepts in their company.

Dan (@wsdan) spoke with TechnologyAdvice host Clark Buckner about Wikistrat’s work, origins, what clients can expect from working with Wikistrat, and how gamification correlates with big data and business intelligence. Listen to the podcast and read the summary below:

Wikistrat aims to solve a common problem faced by most governments and organizations when generating strategies: “groupthink.” Such entities can devise a diverse set of strategies, but they always seem to find their resolution in the most popular answer.

In order to break groupthink, Wikistrat carries out geopolitical simulations built around “collaborative competition.” The process involves:

  • Securing analysts: Wikistrat recruits a diverse group of analysts who are experts in certain fields and located in different strategic places.

  • Competing with ideas: These analysts are placed in an online environment where, instead of competing with each other, one analyst contributes an idea, then other analysts create 2-3 more ideas based on the initial idea.

  • Breaking groupthink: Now the competition becomes only about ideas. People champion the ideas they care about rather than arguing with other analysts. That’s when Wikistrat breaks groupthink and helps their clients discover ideas they may have never considered before.

Gamification occurs when analysts create different scenarios for a specific angle or question the client raises. Plus, Wikistrat’s global analyst coverage is so good that they tout having at least one expert in every country. They accomplished this by allowing anyone—not just four-star generals—to register as an analyst. However, applicants must submit a resume and a writing sample, as well as pass a face-to-face interview….”