Breaking Public Administrations’ Data Silos. The Case of Open-DAI, and a Comparison between Open Data Platforms.


Paper by Raimondo Iemma, Federico Morando, and Michele Osella: “Open reuse of public data and tools can turn the government into a powerful ‘platform’ that also involves external innovators. However, the typical information system of a public agency is not open by design. Several public administrations have started adopting technical solutions to overcome this issue, typically in the form of middleware layers operating as ‘buses’ between data centres and the outside world. Open-DAI is an open source platform designed to expose data as services, directly pulling from legacy databases of the data holder. The platform is the result of an ongoing project funded under the EU ICT PSP call 2011. We present the rationale and features of Open-DAI, also through a comparison with three other open data platforms: the Socrata Open Data portal, CKAN, and ENGAGE…(More)”

US government and private sector developing ‘precrime’ system to anticipate cyber-attacks


Martin Anderson at The Stack: “The USA’s Office of the Director of National Intelligence (ODNI) is soliciting the involvement of the private and academic sectors in developing a new ‘precrime’ computer system capable of predicting cyber-incursions before they happen, based on the processing of ‘massive data streams from diverse data sets’ – including social media and possibly deanonymised Bitcoin transactions….
At its core, the predictive technologies to be developed in association with the private sector and academia over 3-5 years are charged with the mission ‘to invest in high-risk/high-payoff research that has the potential to provide the U.S. with an overwhelming intelligence advantage over our future adversaries’.
The R&D program is intended to generate completely automated, human-free prediction systems for four categories of event: unauthorised access, Denial of Service (DoS), malicious code, and scans and probes seeking access to systems.
The CAUSE project is an unclassified program, and participating companies and organisations will not be granted access to NSA intercepts. The scope of the project, in any case, seems focused on the analysis of publicly available Big Data, including web searches, social media exchanges and trawling ungovernable avalanches of information in which clues to future maleficent actions are believed to be discernible.
Program manager Robert Rahmer says: “It is anticipated that teams will be multidisciplinary and might include computer scientists, data scientists, social and behavioral scientists, mathematicians, statisticians, content extraction experts, information theorists, and cyber-security subject matter experts having applied experience with cyber capabilities.”
Battelle, one of the concerns interested in participating in CAUSE, is interested in employing Hadoop and Apache Spark as an approach to the data mountain, and includes in its preliminary proposal an intent to ‘de-anonymize Bitcoin sale/purchase activity to capture communication exchanges more accurately within threat-actor forums…’.
Identifying and categorising quality signal in the ‘white noise’ of Big Data is a central plank in CAUSE, and IARPA maintains several offices to deal with different aspects of it. Its pointedly named ‘Office for Anticipating Surprise’ frames the CAUSE project best, since it initiated it. The OAS is occupied with ‘Detecting and forecasting the emergence of new technical capabilities’, ‘Early warning of social and economic crises, disease outbreaks, insider threats, and cyber attacks’ and ‘Probabilistic forecasts of major geopolitical trends and rare events’.
Another department involved is the Office of Incisive Analysis, which is attempting to break down the ‘data static’ problem into manageable mission stages:
1) Large data volumes and varieties – “Providing powerful new sources of information from massive, noisy data that currently overwhelm analysts”
2) Social-Cultural and Linguistic Factors – “Analyzing language and speech to produce insights into groups and organizations.”
3) Improving Analytic Processes – “Dramatic enhancements to the analytic process at the individual and group level.”
The Office of Smart Collection develops ‘new sensor and transmission technologies’, seeking ‘Innovative approaches to gain access to denied environments’ as part of its core mission, while the Office of Safe and Secure Operations concerns itself with ‘Revolutionary advances in science and engineering to solve problems intractable with today’s computers’.
The CAUSE program, which attracted 150 developers, organisations, academics and private companies to the initial event, will announce specific figures about funding later in the year, and practice ‘predictions’ from participants will begin in the summer, in an accelerating and stage-managed program over five years….(More)”

Netpolitik: What the Emergence of Networks Means for Diplomacy and Statecraft


Charlie Firestone and Leshuo Dong at the Aspen Journal of Ideas: “…The network is emerging as a dominant form of organization for our age of complexity. This is supported by technological and economic trends. Furthermore, enemies are networks, players are networks, even governments are becoming networks. It makes sense to understand network principles and apply them for use in the world of diplomacy. Accordingly, governments, organizations and individuals should heed these recommendations:

  • Understand and apply two-way communications and network principles to all forms of diplomacy with the aim of earning the sympathy, empathy and where applicable, the loyalty of future generations. This is a mindset shift for governments, diplomats and citizens around the world.
  • This means engaging the world’s populations to communicate with each other. That will entail physical connections to the global common medium, an ability to have what you send be received by others in the form you send it, end to end, and literacy in the communications methods of the day. The world’s population should have a meaningful right to connect.
  • Of course, if there is to be a global communications network, it needs to be safe, so governments remain in the role of protector of the environment needed for users to trust in their networks. States have a role to protect against cyberwar, cybercrimes, and loss of a person’s identity, i.e., security and privacy online. But these protections cannot be a screen for illegitimate governmental controls over, or unwarranted surveillance of, their citizens. Nor can governments be expected to shoulder that burden alone. Everyone will need to practice a basic level of Net hygiene and literacy as an element of their digital citizenship.

As networks proliferate, principles of netpolitik will emerge. Governments, businesses, non-governmental organizations, and every citizen would be well advised to be thinking in these terms in the years ahead….(More).”

Open data could turn Europe’s digital desert into a digital rainforest


Joanna Roberts interviews Dirk Helbing, Professor of Computational Social Science at ETH Zurich at Horizon: “…If we want to be competitive, Europe needs to find its own way. How can we differentiate ourselves and make things better? I believe Europe should not engage in the locked data strategy that we see in all these huge IT giants. Instead, Europe should engage in open data, open innovation, and value-sensitive design, particularly approaches that support informational self-determination. So everyone can use this data, generate new kinds of data, and build applications on top. This is going to create ever more possibilities for everyone else, so in a sense that will turn a digital desert into a digital rainforest full of opportunities for everyone, with a rich information ecosystem.’…
The Internet of Things is the next big emerging information communication technology. It’s based on sensors. In smartphones there are about 15 sensors; for light, for noise, for location, for all sorts of things. You could also buy additional external sensors for humidity, for chemical substances and almost anything that comes to your mind. So basically this allows us to measure the environment and all the features of our physical, biological, economic, social and technological environment.
‘Imagine if there was one company in the world controlling all the sensors and collecting all the information. I think that might potentially be a dystopian surveillance nightmare, because you couldn’t take a single step or speak a single word without it being recorded. Therefore, if we want the Internet of Things to be consistent with a stable democracy then I believe we need to run it as a citizen web, which means to create and manage the planetary nervous system together. The citizens themselves would buy the sensors and activate them or not, would decide themselves what sensor data they would share with whom and for what purpose, so informational self-determination would be at the heart, and everyone would be in control of their own data.’….
A lot of exciting things will become possible. We would have a real-time picture of the world and we could use this data to be more aware of what the implications of our decisions and actions are. We could avoid mistakes and discover opportunities we would otherwise have missed. We will also be able to measure what’s going on in our society and economy and why. In this way, we will eventually identify the hidden forces that determine the success or failure of a company, of our economy or even our society….(More)”

Why Information Grows: The Evolution of Order, from Atoms to Economies


Forthcoming book: “In Why Information Grows, rising star César Hidalgo offers a radical interpretation of global economics. While economists often turn to measures like GDP or per-capita income, Hidalgo turns to information theory to explain the success or failure of a country’s economic performance. Through a radical rethinking of what the economy is, Hidalgo shows that natural constraints on our ability to accumulate knowledge, knowhow and information explain the evolution of social and economic complexity. This is a rare tour de force linking economics, sociology, physics, biology and information theory to explain the evolution of social and economic systems as a consequence of the physical embodiment of information in a world where knowledge is quite literally power.
César Hidalgo leads the Macro Connections group at the MIT Media Lab. A trained statistical physicist and an expert on Networks and Complex Systems, he also has extensive experience in the field of economic development and has pioneered research on how big data impacts economic decision-making….(More)”

Measuring government impact in a social media world


Arthur Mickoleit & Ryan Androsoff at OECD Insights: “There is hardly a government around the world that has not yet felt the impact of social media on how it communicates and engages with citizens. And while the most prominent early adopters in the public sector have tended to be politicians (think of US President Barack Obama’s impressive use of social media during his 2008 campaign), government offices are also increasingly jumping on the bandwagon. Yes, we are talking about those – mostly bricks-and-mortar – institutions that often toil away from the public gaze, managing the public administration in our countries. As the world changes, they too are increasingly engaging in a very public way through social media.
Research from our recent OECD working paper “Social Media Use by Governments” shows that as of November 2014, out of 34 OECD countries, 28 have a Twitter account for the office representing the top executive institution (head of state, head of government, or government as a whole), and 21 have a Facebook account….
 
But what is the impact governments can or should expect from social media? Is it all just vanity and peer pressure? Surely not.
Take the Spanish national police force (e.g. on Twitter, Facebook & YouTube), a great example of using social media to build long-term engagement, trust and a better public service. That is the thing so many governments yearn for, and the Spanish police seem to have managed it well.
Or take the Danish “tax daddy” on Twitter – @Skattefar. It started out as the national tax administration’s quest to make it easier for everyone to submit correct tax filings; it is now one of the best examples around of a tax agency gone social.
Government administrations can use social media for internal purposes too. The Government of Canada used public platforms like Twitter and internal platforms like GCpedia and GCconnex to conduct a major employee engagement exercise (Blueprint 2020) to develop a vision for the future of the Canadian federal public service.
And when it comes to raising efficiency in the public sector, read this account of a Dutch research facility’s Director who decided to stop email. Not reduce it, but stop it altogether and replace it with social media.
There are so many other examples that could be cited. But the major question is how can we even begin to appraise the impact of these different initiatives? Because as we’ve known since the 19th century, “if you cannot measure it, you cannot improve it” (quote usually attributed to Lord Kelvin). Some aspects of impact measurement for social media can be borrowed from the private sector with regards to presence, popularity, penetration, and perception. But it’s around purpose that impact measurement agendas will split between the private sector and government. Virtually all companies will want to calculate the return on social media investments based on whether it helps them improve their financial returns. That’s different in the public sector where purpose is rarely defined in commercial terms.
A good impact assessment for social media in the public sector therefore needs to be built around its unique purpose-orientation. This is much more difficult to measure and it will involve a mix of quantitative data (e.g. reach of target audience) and qualitative data (e.g. case studies describing tangible impact). Social Media Use by Governments proposes a framework to start looking at social media measurement in gradual steps – from measuring presence, to popularity, to penetration, to perception, and finally, to purpose-orientation. The aim of this framework is to help governments develop truly relevant metrics and start treating social media activity by governments with the same public management rigour that is applied to other government activities.
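The first three levels of the framework lend themselves to simple quantitative indicators. The sketch below is our own toy reading of them, not the OECD paper's definitions: it assumes "presence" means having any official account, "popularity" a raw follower total, and "penetration" the share of the population reached; all figures are invented.

```python
# Illustrative sketch only: the OECD framework names five levels
# (presence, popularity, penetration, perception, purpose-orientation);
# the metric definitions below are simplified assumptions of our own,
# covering just the three most readily quantifiable levels.

def social_media_indicators(accounts, followers, citizens_reached, population):
    """Return toy metrics for presence, popularity, and penetration."""
    return {
        "presence": len(accounts) > 0,                 # any official account?
        "popularity": sum(followers.values()),         # total follower count
        "penetration": citizens_reached / population,  # share of citizens reached
    }

# Hypothetical government office with two accounts and invented figures.
metrics = social_media_indicators(
    accounts=["twitter", "facebook"],
    followers={"twitter": 120_000, "facebook": 45_000},
    citizens_reached=900_000,
    population=5_600_000,
)
print(metrics)
```

Perception and purpose-orientation resist this kind of arithmetic, which is exactly the paper's point: they call for qualitative evidence such as case studies rather than counters.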
This is far from an exact science, but we are beginning the work collaborating with member and partner governments to develop a toolkit that will help decision-makers implement the OECD Recommendation on Digital Government Strategies, including on the issue of social media metrics…(More)”.

Platform lets patients contribute to their own medical records


Springwise: “Those with complex medical conditions often rely heavily on their own ability to communicate their symptoms in short — and sometimes stressful — healthcare visits. We have recently seen Ginger.io, a smartphone app which uses big data to improve communication between patients and clinicians in between visits, and now OurNotes is a Commonwealth Fund grant-funded program that will enable patients to contribute to their own electronic medical records.
The scheme, currently being researched at Beth Israel Deaconess Medical Center in Boston and four other sites in the US, is part of a countrywide initiative called OpenNotes, which has already enabled five million patients to read their medical records online. Since an initial pilot scheme in 2012, OpenNotes has met with great success — creating improved communication between patients and doctors, and making patients feel more in control of their healthcare and treatments.
The new OurNotes scheme is expected to have particular benefits for medically complex patients who have multiple chronic health conditions. It will enable patients to make notes on an upcoming visit, listing topics and questions they want to cover. In turn, this presents doctors with an opportunity to prepare and research tricky or niche questions before meeting their patient…(More)”

Data-Driven Development Pathways for Progress


Report from the World Economic Forum: “Data is the lifeblood of sustainable development and holds tremendous potential for transformative positive change, particularly for lower- and middle-income countries. Yet despite the promise of a “Data Revolution”, progress is not a certainty. Lack of clarity on privacy and ethical issues, asymmetric power dynamics and an array of entangled societal and commercial risks threaten to hinder progress.
Written by the World Economic Forum Global Agenda Council on Data-Driven Development, this report serves to clarify how big data can be leveraged to address the challenges of sustainable development. Providing a blueprint for balancing competing tensions, areas of focus include: addressing the data deficit of the Global South, establishing resilient governance and strengthening capacities at the community and individual level. (PDF)”

Urban technology analysis matrix


New Paper by Pablo Emilio Branchi, Carlos Fernández-Valdivielso, and Ignacio Raúl Matías: “Our objective is to develop a method for better analyzing the utility and impact of new technologies on Smart Cities. We have designed a tool that will evaluate new technologies according to a three-pronged scoring system that considers the impact on physical space, environmental issues, and city residents. The purpose of this tool is to be used by city planners as part of a strategic approach to the implementation of a Smart City initiative in order to reduce unnecessary public spending and ensure the optimal allocation of city resources….

The paper provides a list of the different elements to be analyzed in Smart Cities in the form of a matrix and develops the methodology to evaluate them in order to obtain a final score for technologies prior to their application in cities….Traditional technological scenarios have been challenged, and Smart Cities have become the center of urban competitiveness. A lack of clarity has been detected in the way of describing what Smart Cities are, and we try to establish a methodology for urban policy makers to do so. As a dynamic process that affects several aspects, researchers are encouraged to test the proposed solution further. (More)”

 

Using Flash Crowds to Automatically Detect Earthquakes & Impact Before Anyone Else


Patrick Meier at iRevolutions: “It is said that our planet has a new nervous system; a digital nervous system comprised of digital veins and intertwined sensors that capture the pulse of our planet in near real-time. Next generation humanitarian technologies seek to leverage this new nervous system to detect and diagnose the impact of disasters within minutes rather than hours. To this end, LastQuake may be one of the most impressive humanitarian technologies that I have recently come across. Spearheaded by the European-Mediterranean Seismological Center (EMSC), the technology combines “Flashsourcing” with social media monitoring to auto-detect earthquakes before they’re picked up by seismometers or anyone else.


Scientists typically draw on ground-motion prediction algorithms and data on building infrastructure to rapidly assess an earthquake’s potential impact. Alas, ground-motion predictions vary significantly and infrastructure data are rarely available at sufficient resolutions to accurately assess the impact of earthquakes. Moreover, a minimum of three seismometers is needed to calibrate a quake, and that seismic data take several minutes to generate. This explains why the EMSC uses human sensors to rapidly collect relevant data on earthquakes, as these reduce the uncertainties that come with traditional rapid impact assessment methodologies. Indeed, the Center’s important work clearly demonstrates how the Internet coupled with social media are “creating new potential for rapid and massive public involvement by both active and passive means” vis-a-vis earthquake detection and impact assessments. The EMSC can automatically detect new quakes within 80-90 seconds of their occurrence while simultaneously publishing tweets with preliminary information on said quakes, like this one:

[Screenshot: EMSC tweet with preliminary information on a detected quake]

In reality, the first human sensors (increases in web traffic) can be detected within 15 seconds (!) of a quake…(More)
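The core of the "flashsourcing" idea is that a sudden surge of visits to an earthquake website is itself a detection signal: people who just felt shaking rush to the site before any seismometer solution exists. The sketch below illustrates that idea with a naive surge detector over an invented per-second hit series; the threshold, window, and data are all assumptions of ours, and EMSC's actual detector is certainly more sophisticated.

```python
# Hypothetical sketch of flashsourcing-style detection: flag the first
# second whose web traffic far exceeds the recent rolling average.
# Window, factor, and traffic figures are invented for illustration;
# this is not EMSC's actual algorithm.

def detect_surge(hits_per_second, baseline_window=5, factor=4.0):
    """Return the index (seconds from series start) of the first second
    whose hit count exceeds `factor` times the mean of the preceding
    `baseline_window` seconds, or None if no surge occurs."""
    for t in range(baseline_window, len(hits_per_second)):
        baseline = sum(hits_per_second[t - baseline_window:t]) / baseline_window
        if hits_per_second[t] > factor * baseline:
            return t
    return None

traffic = [12, 10, 11, 13, 12, 11, 95, 240, 310]  # invented hit counts; surge at t=6
print(detect_surge(traffic))
```

Because only seconds of traffic are needed to establish a baseline, a detector of this shape can fire well before three seismometers have calibrated a quake, which is consistent with the 15-second figure the article reports.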