What cybersecurity can learn from citizen science


But as anyone who has observed an online forum thread dissecting the minutiae of geek culture can attest, hobbyists can be remarkably thorough in their exploration of topics they are passionate about. And it is often a point of pride to pick the subject that is the least conventional or popular.

The idea of citizen science is to include amateur science enthusiasts in the collection and processing of data. Thanks to the Internet, we’ve seen a surge in the number of self-taught experts in a variety of subjects. New participation platforms are social and gamified – utilizing people’s desire to compete or collaborate with others who share their passion.

How this process plays out differs from one project to the next, according to its needs: StarDust@Home asks volunteers to help sort through samples captured by the Stardust spacecraft when it flew through the coma of comet Wild 2 in 2004. They do this by viewing movies of the contents of the aerogel tiles that were used as collectors.

The security community is ripe for using citizen science in similar ways. Most antimalware vendors make use of customer samples to add detection and cleaning to their products. Many security companies use customers’ reports to gather file reputation, telemetry and prevalence data. And bug reports come from researchers of all ages and education levels – not just professional security researchers. “Month of Bug” events are a more controversial way that security is gamified. Could security companies or organizations be doing more to engage enthusiasts to help improve our response to security issues?

It could be argued that the stuff of security research – especially malware research – is potentially harmful in the hands of amateurs and should be handled only by seasoned professionals. Not only that, security is an adversarial system where the criminals would likely try to game the system to improve their profits. These are important concerns that would need to be addressed.

But the citizen science approach provides good lessons…(More)”

Advancing Collaboration Theory: Models, Typologies, and Evidence


New book edited by John C. Morris and Katrina Miller-Stevens: “The term collaboration is widely used but not clearly understood or operationalized. However, collaboration is playing an increasingly important role between and across public, nonprofit, and for-profit sectors. Collaboration has become a hallmark in both intragovernmental and intergovernmental relationships. As collaboration scholarship rapidly emerges, it diverges into several directions, resulting in confusion about what collaboration is and what it can be used to accomplish. This book provides much-needed insight into existing ideas and theories of collaboration, advancing a revised theoretical model and accompanying typologies that further our understanding of collaborative processes within the public sector.

Organized into three parts, the book devotes each chapter to a different theoretical approach to public problems, valuing the collective insights that result from honoring many individual perspectives. Case studies in collaboration, split across three levels of government, offer additional perspectives on unanswered questions in the literature. Contributions are made by authors from a variety of backgrounds, including an attorney, a career educator, a federal executive, a human resource administrator, a police officer, and a self-employed entrepreneur, as well as scholars of public administration and public policy. Drawing upon the individual experiences offered by these perspectives, the book emphasizes the commonalities of collaboration. It is from this common ground, the shared experiences forged among seemingly disparate interactions, that advances in collaboration theory arise.

Advancing Collaboration Theory offers a unique compilation of collaborative models and typologies that enhance the existing understanding of public sector collaboration….(More)”

Big Data’s Impact on Public Transportation


InnovationEnterprise: “Getting around any big city can be a real pain. Traffic jams seem to be a constant complaint, and simply getting to work can turn into a chore, even on the best of days. With more people than ever before flocking to the world’s major metropolitan areas, the issues of crowding and inefficient transportation only stand to get much worse. Luckily, the traditional methods of managing public transportation could be on the verge of changing thanks to advances in big data. While big data use cases have been a part of the business world for years now, city planners and transportation experts are quickly realizing how valuable it can be when making improvements to city transportation. That hour-long commute may no longer be something travellers will have to worry about in the future.

In much the same way that big data has transformed businesses around the world by offering greater insight into the behavior of their customers, it can also provide a deeper look at travellers. Like retail customers, commuters have certain patterns they like to keep to when on the road or riding the rails. Travellers also have their own motivations and desires, and getting to the heart of their actions is all part of what big data analytics is about. By analyzing these actions and the factors that go into them, transportation experts can gain a better understanding of why people choose certain routes or why they prefer one method of transportation over another. Based on these findings, planners can then figure out where to focus their efforts and respond to the needs of millions of commuters.

Gathering the accurate data needed to make knowledgeable decisions regarding city transportation can be a challenge in itself, especially considering how many people commute to work in a major city. New methods of data collection have made that effort easier and a lot less costly. One way that’s been implemented is through the gathering of call data records (CDR). From regular transactions made from mobile devices, information about location, time, and duration of an action (like a phone call) can give data scientists the necessary details on where people are traveling to, how long it takes them to get to their destination, and other useful statistics. The valuable part of this data is the sample size, which provides a much bigger picture of the transportation patterns of travellers.
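
As a rough illustration of the kind of processing involved, the hedged Python sketch below pairs consecutive tower sightings per caller into origin-destination counts by hour. The input file and column names (caller_id, cell_tower, timestamp) are assumptions invented for the example, not any real CDR schema.

```python
# A minimal sketch of turning call data records (CDRs) into coarse travel patterns.
# The input file and column names (caller_id, cell_tower, timestamp) are hypothetical.
import pandas as pd

# Each row: one transaction (e.g. a phone call) observed at a given cell tower.
cdrs = pd.read_csv("cdr_sample.csv", parse_dates=["timestamp"])

# Order each caller's records in time, then pair consecutive towers
# to approximate a movement from one area of the city to another.
cdrs = cdrs.sort_values(["caller_id", "timestamp"])
cdrs["next_tower"] = cdrs.groupby("caller_id")["cell_tower"].shift(-1)
trips = cdrs.dropna(subset=["next_tower"])
trips = trips[trips["cell_tower"] != trips["next_tower"]]

# Count origin-destination pairs by hour of day: a crude picture of
# where people travel and when, of the kind planners could build on.
trips["hour"] = trips["timestamp"].dt.hour
od_counts = (trips.groupby(["cell_tower", "next_tower", "hour"])
                  .size()
                  .reset_index(name="trips"))
print(od_counts.sort_values("trips", ascending=False).head(10))
```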

That’s not the only way cities are using big data to improve public transportation though. Melbourne in Australia has long been considered one of the world’s best cities for public transit, and much of that is thanks to big data. With big data and ad hoc analysis, Melbourne’s acclaimed tram system can automatically reconfigure routes in response to sudden problems or challenges, such as a major city event or natural disaster. Data is also used in this system to fix problems before they turn serious. Sensors located in equipment like tram cars and tracks can detect when maintenance is needed on a specific part. Crews are quickly dispatched to repair what needs fixing, and the tram system continues to run smoothly. This is similar to the idea of the Internet of Things, wherein embedded sensors collect data that is then analyzed to identify problems and improve efficiency.
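
The article describes Melbourne’s system only at a high level; as a minimal sketch of the underlying idea, the hypothetical Python snippet below flags an asset whose latest sensor reading drifts well outside its recent range. The field names, the three-sigma rule and the thresholds are all assumptions for illustration, not details of the actual tram system.

```python
# A minimal sketch of sensor-based maintenance alerts, assuming a stream of
# readings per asset (e.g. vibration on a tram axle). The asset names and the
# three-sigma rule used here are illustrative, not Melbourne's actual system.
from statistics import mean, stdev
from collections import defaultdict

history = defaultdict(list)  # asset_id -> recent readings

def needs_maintenance(asset_id, reading, window=100, sigmas=3.0):
    """Return True when a reading falls well outside the asset's recent norm."""
    past = history[asset_id]
    alert = False
    if len(past) >= 10:  # need some history before judging
        mu, sd = mean(past), stdev(past)
        if sd > 0 and abs(reading - mu) > sigmas * sd:
            alert = True
    past.append(reading)
    if len(past) > window:
        past.pop(0)  # keep only the most recent readings
    return alert

# Example: a sudden jump in vibration triggers a dispatch decision.
for value in [0.9, 1.0, 1.1, 1.0, 0.95, 1.05, 1.0, 0.98, 1.02, 1.0, 4.2]:
    if needs_maintenance("tram-17-axle-2", value):
        print("dispatch crew: reading", value, "is out of range")
```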

Sao Paulo, Brazil is another city that sees the value of using big data for its public transportation. The city’s efforts concentrate on improving the management of its bus fleet. With big data collected in real time, the city can get a more accurate picture of just how many people are riding the buses, which routes are on time, how drivers respond to changing conditions, and many other factors. Based on this information, Sao Paulo can optimize its operations, providing added vehicles where demand is genuine whilst finding which routes are the most efficient. Without big data analytics, this process would have taken a very long time and would likely be hit-or-miss in terms of accuracy, but now, big data provides more certainty in a shorter amount of time….(More)”
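
A comparable, purely illustrative sketch for a bus fleet might compute on-time performance and average ridership per route from real-time arrival records; the column names and the five-minute on-time window below are assumptions, not Sao Paulo’s actual rules.

```python
# A minimal sketch of computing on-time performance per bus route from
# hypothetical real-time arrival records; the file and column names are invented.
import pandas as pd

arrivals = pd.read_csv("bus_arrivals.csv",
                       parse_dates=["scheduled_time", "actual_time"])

# Assumption: a bus counts as on time if it arrives within five minutes of schedule.
delay_min = (arrivals["actual_time"] - arrivals["scheduled_time"]).dt.total_seconds() / 60
arrivals["on_time"] = delay_min.between(-1, 5)

# On-time rate and average ridership per route, to suggest where to add vehicles.
summary = arrivals.groupby("route").agg(on_time_rate=("on_time", "mean"),
                                        avg_riders=("passenger_count", "mean"))
print(summary.sort_values("on_time_rate"))
```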

Handbook: How to Catalyze Humanitarian Innovation in Computing Research Institutes


Patrick Meier: “The handbook below provides practical collaboration guidelines for both humanitarian organizations & computing research institutes on how to catalyze humanitarian innovation through successful partnerships. These actionable guidelines are directly applicable now and draw on extensive interviews with leading humanitarian groups and CRIs, including the International Committee of the Red Cross (ICRC), United Nations Office for the Coordination of Humanitarian Affairs (OCHA), United Nations Children’s Fund (UNICEF), United Nations High Commissioner for Refugees (UNHCR), UN Global Pulse, Carnegie Mellon University (CMU), International Business Machines (IBM), Microsoft Research, the Data Science for Social Good Program at the University of Chicago, and others.

This handbook, which is the first of its kind, also draws directly on years of experience and lessons learned from the Qatar Computing Research Institute’s (QCRI) active collaboration and unique partnerships with multiple international humanitarian organizations. The aim of this blog post is to actively solicit feedback on this first, complete working draft, which is available here as an open and editable Google Doc. …(More)”

Want to fix the world? Start by making clean energy a default setting


Chris Mooney in the Washington Post: “In recent years, psychologists and behavioral scientists have begun to decipher why we make the choices that we do when it comes to using energy. And the bottom line is that it’s hard to characterize those choices as fully “rational.”

Rather than acting like perfect homo economicuses, they’ve found, we’re highly swayed by the energy use of our neighbors and friends — peer pressure, basically. At the same time, we’re also heavily biased by the status quo — we delay in switching to new energy choices, even when they make a great deal of economic sense.

All of which has led to the popular idea of “nudging”: the notion that you can subtly sway people to change their behavior by changing, say, the environment in which they make choices, or the kinds of information they receive. Not in a coercive way, but rather through gentle tweaks and prompts. And now, a major study in Nature Climate Change demonstrates that one very popular form of energy-use nudging, which might be called “default switching” or the “default effect,” does indeed work, and could possibly work at a very large scale.

“This is the first demonstration of a large-scale nudging effect using defaults in the domain of energy choices,” says Sebastian Lotz of Stanford University and the University of Lausanne in Switzerland, who conducted the research with Felix Ebeling of the University of Cologne in Germany….(More)”

Government data does not mean data governance: Lessons learned from a public sector application audit


Paper by Nik Thompson, Ravi Ravindran, and Salvatore Nicosia: “Public sector agencies routinely store large volumes of information about individuals in the community. The storage and analysis of this information benefits society, as it enables relevant agencies to make better-informed decisions and to address the individual’s needs more appropriately. Members of the public often assume that the authorities are well equipped to handle personal data; however, due to implementation errors and lack of data governance, this is not always the case. This paper reports on an audit conducted in Western Australia, focusing on findings in the Police Firearms Management System and the Department of Health Information System. In the case of the Police, the audit revealed numerous data protection issues, leading the auditors to report that they had no confidence in the accuracy of information on the number of people licensed to possess firearms or the number of licensed firearms. Similarly alarming conclusions were drawn in the Department of Health as auditors found that they could not determine which medical staff member was responsible for clinical data entries made. The paper describes how these issues often do not arise from existing business rules or the technology itself, but from a lack of sound data governance. Finally, a discussion section presents key data governance principles and best practices that may guide practitioners involved in data management. These cases highlight the very real data management concerns, and the associated recommendations provide the context to spark further interest in the applied aspects of data protection….(More)”

 

Civic open data at a crossroads: Dominant models and current challenges


Renee E. Sieber and Peter A. Johnson in Government Information Quarterly: “As open data becomes more widely provided by government, it is important to ask questions about the future possibilities and forms that government open data may take. We present four models of open data as they relate to changing relations between citizens and government. These models include: a status quo ‘data over the wall’ form of government data publishing; a form of ‘code exchange’, with government acting as an open data activist; open data as a civic issue tracker; and participatory open data. These models represent multiple end points that can be currently viewed from the unfolding landscape of government open data. We position open data at a crossroads, with significant concerns of the conflicting motivations driving open data, the shifting role of government as a service provider, and the fragile nature of open data within the government space. We emphasize that the future of open data will be driven by the negotiation of the ethical-economic tension that exists between provisioning governments, citizens, and private sector data users….(More)”

 

Confidence in U.S. Institutions Still Below Historical Norms


Jeffrey M. Jones at Gallup: “Americans’ confidence in most major U.S. institutions remains below the historical average for each one. Only the military (72%) and small business (67%) — the highest-rated institutions in this year’s poll — are currently rated higher than their historical norms, based on the percentage expressing “a great deal” or “quite a lot” of confidence in the institution.

[Chart: Confidence in U.S. Institutions, 2015 vs. Historical Average for Each Institution]

These results are based on a June 2-7 Gallup poll that included Gallup’s latest update on confidence in U.S. institutions. Gallup first measured confidence ratings in 1973 and has updated them each year since 1993.

Americans’ confidence in most major institutions has been down for many years as the nation has dealt with prolonged wars in Iraq and Afghanistan, a major recession and sluggish economic improvement, and partisan gridlock in Washington. In fact, 2004 was the last year most institutions were at or above their historical average levels of confidence. Perhaps not coincidentally, 2004 was also the last year Americans’ satisfaction with the way things are going in the United States averaged better than 40%. Currently, 28% of Americans are satisfied with the state of the nation.

From a broad perspective, Americans’ confidence in all institutions over the last two years has been the lowest since Gallup began systematic updates of a larger set of institutions in 1993. The average confidence rating of the 14 institutions asked about annually since 1993 — excluding small business, asked annually since 2007 — is 32% this year. This is one percentage point above the all-institution average of 31% last year. Americans were generally more confident in all institutions in the late 1990s and early 2000s as the country enjoyed a strong economy and a rally in support for U.S. institutions after the 9/11 terrorist attacks.

[Chart: Average Confidence Rating Across All Institutions, by Year]

Confidence in Political, Financial and Religious Institutions Especially Low

Today’s confidence ratings of Congress, organized religion, banks, the Supreme Court and the presidency show the greatest deficits compared with their historical averages, all running at least 10 points below that mark. Americans’ frustration with the government’s performance has eroded the trust they have in all U.S. political institutions….(More)”

The Climatologist’s Almanac


Clara Chaisson at onEarth: “Forget your weather app with its five- or even ten-day forecasts—a supercomputer at NASA has just provided us with high-resolution climate projections through the end of the century. The massive new 11-terabyte data set combines historical daily temperatures and precipitation measurements with climate simulations under two greenhouse gas emissions scenarios. The project spans from 1950 to 2100, but users can easily zero in on daily timescales for their own locales—which is precisely the point.

The projections can be found on Amazon, free for all to see and plan by. The space agency hopes that developing nations and poorer communities that may not have any spare supercomputers lying around will use the info to predict and prepare for climate change. …(More)”
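
To give a feel for how such a data set is typically consumed, here is a hedged Python sketch that subsets a gridded daily projection to a single locale using the xarray library. The file name, variable name, coordinate names and units are placeholders; the real NASA files and their conventions may differ.

```python
# A hedged sketch of pulling one locale's daily values out of a gridded
# climate projection file. The file name, the variable name (tasmax = daily
# maximum temperature), the coordinate names (lat/lon) and the assumption
# that values are in degrees C are all placeholders for illustration.
import xarray as xr

ds = xr.open_dataset("climate_projection_2050.nc")

# Pick the grid cell nearest a city of interest and one decade of days.
city = ds.sel(lat=19.43, lon=-99.13, method="nearest")   # e.g. Mexico City
decade = city.sel(time=slice("2050-01-01", "2059-12-31"))

# Summarize: how many days per year exceed 35 degrees C under this scenario?
hot_days = (decade["tasmax"] > 35.0).groupby("time.year").sum()
print(hot_days.to_series())
```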

Why open data should be central to Fifa reform


Gavin Starks in The Guardian: “Over the past two weeks, Fifa has faced mounting pressure to radically improve its transparency and governance in the wake of corruption allegations. David Cameron has called for reforms including expanding the use of open data.

Open data is information made available by governments, businesses and other groups for anyone to read, use and share. Data.gov.uk was launched as the home of UK open government data in January 2010 and now has almost 21,000 published datasets, including on government spending.
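
Because data.gov.uk is built on the CKAN catalogue software, its datasets can also be searched programmatically. The Python sketch below assumes the standard CKAN package_search endpoint and is meant only to illustrate how accessible the catalogue is; the query term and the fields printed are arbitrary choices.

```python
# A minimal sketch of searching the data.gov.uk catalogue for spending datasets.
# Assumes the standard CKAN "package_search" endpoint; the query term and the
# fields printed are illustrative only.
import requests

resp = requests.get(
    "https://data.gov.uk/api/action/package_search",
    params={"q": "spending", "rows": 5},
    timeout=30,
)
resp.raise_for_status()
result = resp.json()["result"]

print("matching datasets:", result["count"])
for dataset in result["results"]:
    publisher = (dataset.get("organization") or {}).get("title")
    print("-", dataset.get("title"), "|", publisher)
```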

Allowing citizens to freely access data related to the institutions that govern them is essential to a well-functioning democratic society. It is the first step towards holding leaders to account for failures and wrongdoing.

Fifa has a responsibility for the shared interests of millions of fans around the world. Football’s popularity means that Fifa’s governance has wide-ranging implications for society, too. This is particularly true of decisions about hosting the World Cup, which is often tied to large-scale government investment in infrastructure and even extends to law-making. Brazil spent up to £10bn hosting the 2014 World Cup and had to legalise the sale of beer at matches.

Following Sepp Blatter’s resignation, Fifa will gather its executive committee in July to plan for a presidential election, expected to take place in mid-December. Open data should form the cornerstone of any prospective candidate’s manifesto. It can help Fifa make better spending decisions and ensure partners deliver value for money, and help restore the trust of the international football community.

Fifa’s lengthy annual financial report gives summaries of financial expenditure, budgeted at £184m for operations and governance alone in 2016, but individual transactions are not published. Publishing spending data incentivises better spending decisions. If all Fifa’s outgoings – which totalled around £3.5bn between 2011 and 2014 – were made open, it would encourage much more efficiency….(more)”
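
To make the point concrete: if transaction-level data were published, anyone could run simple checks like the hypothetical Python sketch below, which totals payments by supplier and category. The input file and its columns are invented for illustration; no such Fifa dataset exists.

```python
# A minimal sketch of the kind of scrutiny open spending data invites.
# The input file and its columns (supplier, category, amount_gbp) are
# hypothetical; no such Fifa dataset is published.
import pandas as pd

spend = pd.read_csv("fifa_transactions.csv")

# Total outgoings per supplier and per spending category, largest first,
# the sort of view that makes outliers and duplicate payments easy to spot.
by_supplier = spend.groupby("supplier")["amount_gbp"].sum().sort_values(ascending=False)
by_category = spend.groupby("category")["amount_gbp"].sum().sort_values(ascending=False)

print(by_supplier.head(10))
print(by_category)
```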