Forging Trust Communities: How Technology Changes Politics


Book by Irene S. Wu: “Bloggers in India used social media and wikis to broadcast news and bring humanitarian aid to tsunami victims in South Asia. Terrorist groups like ISIS pour out messages and recruit new members on websites. The Internet is the new public square, bringing to politics a platform on which to create community at both the grassroots and bureaucratic levels. Drawing on historical and contemporary case studies from more than ten countries, Irene S. Wu’s Forging Trust Communities argues that the Internet, and the technologies that predate it, catalyze political change by creating new opportunities for cooperation. The Internet does not simply enable faster and easier communication, but makes it possible for people around the world to interact closely, reciprocate favors, and build trust. The information and ideas exchanged by members of these cooperative communities become key sources of political power akin to military might and economic strength.

Wu illustrates the rich world history of citizens and leaders exercising political power through communications technology. People in nineteenth-century China, for example, used the telegraph and newspapers to mobilize against the emperor. In 1970, Taiwanese cable television gave voice to a political opposition demanding democracy. Both Qatar (in the 1990s) and Great Britain (in the 1930s) relied on public broadcasters to enhance their influence abroad. Additional case studies from Brazil, Egypt, the United States, Russia, India, the Philippines, and Tunisia reveal how various technologies function to create new political energy, enabling activists to challenge institutions while allowing governments to increase their power at home and abroad.

Forging Trust Communities demonstrates that the way people receive and share information through network communities reveals as much about their political identity as their socioeconomic class, ethnicity, or religion. Scholars and students in political science, public administration, international studies, sociology, and the history of science and technology will find this to be an insightful and indispensable work….(More)”

Creating Value through Open Data


Press Release: “Capgemini Consulting, the global strategy and transformation consulting arm of the Capgemini Group, today published two new reports on the state of play of Open Data in Europe, to mark the launch of the European Data Portal. The first report addresses “Open Data Maturity in Europe 2015: Insights into the European state of play” and the second focuses on “Creating Value through Open Data: Study on the Impact of Re-use of Public Data Resources.” The countries covered by these assessments include the EU28 countries plus Iceland, Liechtenstein, Norway, and Switzerland – commonly referred to as the EU28+ countries. The reports were requested by the European Commission within the framework of the Connecting Europe Facility program, supporting the deployment of European Open Data infrastructure.

Open Data refers to information collected, produced, or paid for by public bodies that can be freely used, modified, and shared by anyone. For the period 2016-2020, the direct market size for Open Data in Europe is estimated at EUR 325 billion. Capgemini’s study “Creating Value through Open Data” illustrates how Open Data can create economic value in multiple ways, from increased market transactions and job creation from producing services and products based on Open Data, to cost savings and efficiency gains. For instance, effective use of Open Data could help save 629 million hours of unnecessary waiting time on the roads in the EU and help reduce energy consumption by 16%. The accumulated cost savings for public administrations making use of Open Data across the EU28+ in 2020 are predicted to reach EUR 1.7 billion. Reaping these benefits requires reaching a high level of Open Data maturity.

In order to address the accessibility and the value of Open Data across European countries, the European Union has launched the Beta version of the European Data Portal. The Portal addresses the whole Data Value Chain, from data publishing to data re-use. Over 240,000 data sets from 34 European countries are referenced on the Portal. It offers seamless access to public data across Europe, organized into 13 content categories ranging from health and education to transport, science, and justice. Anyone – citizens, businesses, journalists, or administrations – can search, access, and re-use the full data collection. A wide range of data is available, from crime records in Helsinki and labor mobility in the Netherlands to forestry maps in France and the impact of digitization in Poland…..The study, “Open Data Maturity in Europe 2015: Insights into the European state of play”, uses two key indicators: Open Data Readiness and Portal Maturity. These indicators cover both the maturity of national policies supporting Open Data and an assessment of the features made available on national data portals. The study shows that the EU28+ have completed just 44% of the journey towards achieving full Open Data Maturity, with large discrepancies across countries. A third of European countries (32%) are leading the way with solid policies, licensing norms, good portal traffic, and many local initiatives and events to promote Open Data and its re-use….(More)”
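
To make “search, access and re-use” concrete, here is a minimal Python sketch of querying such a catalogue programmatically. It assumes the portal exposes a standard CKAN-style package_search endpoint, as many European open data portals do; the base URL and response fields are illustrative assumptions, not documented details of the European Data Portal’s API.

```python
import requests

# Hypothetical endpoint: a CKAN-style catalogue search API. The real portal's
# base URL and parameters may differ -- check its documentation before use.
API = "https://www.europeandataportal.eu/data/api/3/action/package_search"

resp = requests.get(API, params={"q": "forestry", "rows": 5}, timeout=30)
resp.raise_for_status()
result = resp.json()["result"]  # CKAN wraps search output in a "result" object

print(f'{result["count"]} datasets match "forestry"')
for dataset in result["results"]:
    print("-", dataset.get("title", dataset["name"]))
```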

How Big Data is Helping to Tackle Climate Change


Bernard Marr at DataInformed: “Climate scientists have been gathering a great deal of data for a long time, but analytics technology has only recently caught up. Now that cloud, distributed storage, and massive amounts of processing power are affordable for almost everyone, those data sets are being put to use. On top of that, the growing number of Internet of Things devices we carry around adds to the amount of data we are collecting. And the rise of social media means more and more people are reporting environmental data and uploading photos and videos of their environment, which also can be analyzed for clues.

Perhaps one of the most ambitious projects that employ big data to study the environment is Microsoft’s Madingley, which is being developed with the intention of creating a simulation of all life on Earth. The project already provides a working simulation of the global carbon cycle, and it is hoped that, eventually, everything from deforestation to animal migration, pollution, and overfishing will be modeled in a real-time “virtual biosphere.” Just a few years ago, the idea of a simulation of the entire planet’s ecosphere would have seemed like ridiculous, pie-in-the-sky thinking. But today it’s something into which one of the world’s biggest companies is pouring serious money. Microsoft is doing this because it believes that analytical technology has finally caught up with the ability to collect and store data.

Another data giant that is developing tools to facilitate analysis of climate and ecological data is EMC. Working with scientists at Acadia National Park in Maine, the company has developed platforms to pull in crowd-sourced data from citizen science portals such as eBird and iNaturalist. This allows park administrators to monitor the impact of climate change on wildlife populations as well as to plan and implement conservation strategies.

Last year, the United Nations, under its Global Pulse data analytics initiative, launched the Big Data Climate Challenge, a competition aimed at promoting innovative data-driven climate change projects. Among the first to receive recognition under the program is Global Forest Watch, which combines satellite imagery, crowd-sourced witness accounts, and public datasets to track deforestation around the world, which is believed to be a leading man-made cause of climate change. The project has been promoted as a way for ethical businesses to ensure that their supply chain is not complicit in deforestation.

Other initiatives are targeted at a more personal level – for example, analyzing the transit routes available for an individual journey in Google Maps and making recommendations based on the carbon emissions of each route.
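
As a rough sketch of the idea (not any specific service’s method), the snippet below ranks candidate routes for a single journey by estimated CO2. The emission factors are assumed round figures in grams of CO2 per passenger-kilometre, chosen only for illustration.

```python
# Illustrative emission factors, in grams of CO2 per passenger-kilometre.
# These are assumptions for the sketch, not authoritative figures.
EMISSION_G_PER_KM = {"car": 180, "bus": 70, "rail": 40, "bike": 0}

def route_emissions(distance_km: float, mode: str) -> float:
    """Estimated CO2 in grams for one traveller over the whole route."""
    return distance_km * EMISSION_G_PER_KM[mode]

# Hypothetical candidate routes for the same journey.
routes = [
    {"mode": "car", "distance_km": 12.0},
    {"mode": "bus", "distance_km": 13.5},
    {"mode": "rail", "distance_km": 15.0},
]

# Recommend routes in order of climate impact, lowest emissions first.
for r in sorted(routes, key=lambda r: route_emissions(r["distance_km"], r["mode"])):
    grams = route_emissions(r["distance_km"], r["mode"])
    print(f'{r["mode"]:>4}: {grams:6.0f} g CO2')
```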

The idea of “smart cities” is central to the concept of the Internet of Things – the idea that everyday objects and tools are becoming increasingly connected, interactive, and intelligent, and capable of communicating with each other independently of humans. Many of the ideas put forward by smart-city pioneers are grounded in climate awareness, such as reducing carbon dioxide emissions and energy waste across urban areas. Smart metering allows utility companies to increase or restrict the flow of electricity, gas, or water to reduce waste and ensure adequate supply at peak periods. Public transport can be efficiently planned to avoid wasted journeys and provide a reliable service that will encourage citizens to leave their cars at home.

These examples raise an important point: It’s apparent that data – big or small – can tell us if, how, and why climate change is happening. But, of course, this is only really valuable to us if it also can tell us what we can do about it. Some projects, such as Weathersafe, which helps coffee growers adapt to changing weather patterns and soil conditions, are designed to help humans deal with climate change. Others are designed to tackle the problem at the root, by highlighting the factors that cause it in the first place and showing us how we can change our behavior to minimize damage….(More)”

Remaking Participation: Science, Environment and Emergent Publics


Book edited by Jason Chilvers and Matthew Kearnes: “Changing relations between science and democracy – and controversies over issues such as climate change, energy transitions, genetically modified organisms and smart technologies – have led to a rapid rise in new forms of public participation and citizen engagement. While most existing approaches adopt fixed meanings of ‘participation’ and are consumed by questions of method or critiquing the possible limits of democratic engagement, this book offers new insights that rethink public engagements with science, innovation and environmental issues as diverse, emergent and in the making. Bringing together leading scholars on science and democracy, working between science and technology studies, political theory, geography, sociology and anthropology, the volume develops relational and co-productionist approaches to studying and intervening in spaces of participation. New empirical insights into the making, construction, circulation and effects of participation across cultures are illustrated through examples ranging from climate change and energy to nanotechnology and mundane technologies, from institutionalised deliberative processes to citizen-led innovation and activism, and from the global north to global south. This new way of seeing participation in science and democracy opens up alternative paths for reconfiguring and remaking participation in more experimental, reflexive, anticipatory and responsible ways….(More)”

How big data and The Sims are helping us to build the cities of the future


The Next Web: “By 2050, the United Nations predicts that around 66 percent of the world’s population will be living in urban areas. It is expected that the greatest expansion will take place in developing regions such as Africa and Asia. Cities in these parts will be challenged to meet the needs of their residents, and provide sufficient housing, energy, waste disposal, healthcare, transportation, education and employment.

So, understanding how cities will grow – and how we can make them smarter and more sustainable along the way – is a high priority among researchers and governments the world over. We need to get to grips with the inner mechanisms of cities, if we’re to engineer them for the future. Fortunately, there are tools to help us do this. And even better, using them is a bit like playing SimCity….

Cities are complex systems. Increasingly, scientists studying cities have gone from thinking about “cities as machines” to approaching “cities as organisms”. Viewing cities as complex, adaptive organisms – similar to natural systems like termite mounds or slime mould colonies – allows us to gain unique insights into their inner workings. …So, if cities are like organisms, it follows that we should examine them from the bottom up, and seek to understand how unexpected large-scale phenomena emerge from individual-level interactions. Specifically, we can simulate how the behaviour of individual “agents” – whether they are people, households, or organisations – affects the urban environment, using a set of techniques known as “agent-based modelling”….These days, increases in computing power and the proliferation of big data give agent-based modelling unprecedented power and scope. One of the most exciting developments is the potential to incorporate people’s thoughts and behaviours. In doing so, we can begin to model the impacts of people’s choices on present circumstances, and the future.

For example, we might want to know how changes to the road layout might affect crime rates in certain areas. By modelling the activities of individuals who might try to commit a crime, we can see how altering the urban environment influences how people move around the city, the types of houses that they become aware of, and consequently which places have the greatest risk of becoming the targets of burglary.
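
The toy sketch below captures the flavour of that example: a handful of offender agents wander a street grid, and the cells they pass most often become the best-known – and, in this toy model, highest-risk – targets. Every number and rule is an illustrative assumption rather than a published crime model, but re-running the simulation with parts of the grid blocked off shows how a changed road layout shifts where risk concentrates.

```python
import random

# Toy agent-based model: offender agents take random walks on a street grid,
# and each cell they pass through becomes better known to them. All values
# below are illustrative assumptions, not calibrated parameters.
GRID, STEPS, N_AGENTS = 20, 200, 10
random.seed(42)  # fixed seed so runs are reproducible

awareness = [[0] * GRID for _ in range(GRID)]  # visits per cell

def step(pos):
    """Move one cell in a random compass direction, staying on the grid."""
    x, y = pos
    dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    return min(max(x + dx, 0), GRID - 1), min(max(y + dy, 0), GRID - 1)

agents = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(N_AGENTS)]
for _ in range(STEPS):
    agents = [step(a) for a in agents]
    for x, y in agents:
        awareness[y][x] += 1  # houses here become known to a potential offender

# In this toy model, the most-passed cell is the likeliest burglary target.
count, cell = max((awareness[y][x], (x, y)) for y in range(GRID) for x in range(GRID))
print(f"highest-risk cell {cell}, passed {count} times")
```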

To fully realise the goal of simulating cities in this way, models need a huge amount of data. For example, to model the daily flow of people around a city, we need to know what kinds of things people spend their time doing, where they do them, who they do them with, and what drives their behaviour.

Without good-quality, high-resolution data, we have no way of knowing whether our models are producing realistic results. Big data could offer researchers a wealth of information to meet these twin needs. The kinds of data that are exciting urban modellers include:

  • Electronic travel cards that tell us how people move around a city.
  • Twitter messages that provide insight into what people are doing and thinking.
  • Mobile-telephone density, which hints at the presence of crowds.
  • Loyalty and credit-card transactions that reveal consumer behaviour.
  • Participatory mapping of hitherto unknown urban spaces, such as OpenStreetMap.

These data can often be refined to the level of a single person. As a result, models of urban phenomena no longer need to rely on assumptions about the population as a whole – they can be tailored to capture the diversity of a city full of individuals, who often think and behave differently from one another….(More)”

The Human Face of Big Data


A film by Sandy Smolan [56 minutes]: “Big Data is defined as the real-time collection, analysis, and visualization of vast amounts of information. In the hands of Data Scientists, this raw information is fueling a revolution which many people believe may have as big an impact on humanity going forward as the Internet has had over the past two decades. It enables us to sense, measure, and understand aspects of our existence in ways never before possible.

The Human Face of Big Data captures an extraordinary revolution sweeping, almost invisibly, through business, academia, government, healthcare, and everyday life. It’s already enabling us to provide a healthier life for our children. To provide our seniors with independence while keeping them safe. To help us conserve precious resources like water and energy. To alert us to tiny changes in our health, weeks or years before we develop a life-threatening illness. To peer into our own individual genetic makeup. To create new forms of life. And soon, as many predict, to re-engineer our own species. And we’ve barely scratched the surface…

This massive gathering and analyzing of data in real time is allowing us to address some of humanity’s biggest challenges. Yet, as Edward Snowden and the release of the NSA documents have shown, the accessibility of all this data can come at a steep price….(More)”

A multi-source dataset of urban life in the city of Milan and the Province of Trentino


Paper by Gianni Barlacchi et al in Scientific Data/Nature: “The study of socio-technical systems has been revolutionized by the unprecedented amount of digital records that are constantly being produced by human activities such as accessing Internet services, using mobile devices, and consuming energy and knowledge. In this paper, we describe the richest open multi-source dataset ever released on two geographical areas. The dataset is composed of telecommunications, weather, news, social networks and electricity data from the city of Milan and the Province of Trentino. The unique multi-source composition of the dataset makes it an ideal testbed for methodologies and approaches aimed at tackling a wide range of problems including energy consumption, mobility planning, tourist and migrant flows, urban structures and interactions, event detection, urban well-being and many others….(More)”
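
As a hypothetical example of the kind of cross-source analysis the dataset invites, the sketch below joins a telecommunications file with a weather file on their timestamps and asks whether rainfall correlates with call activity. File names and column names are assumptions made for illustration; the released data documents its own schema.

```python
import pandas as pd

# Assumed inputs (illustrative names and columns, not the dataset's real schema):
#   milan_telecom.csv: timestamp, grid_cell, call_volume
#   milan_weather.csv: timestamp, temperature, rainfall
calls = pd.read_csv("milan_telecom.csv", parse_dates=["timestamp"])
weather = pd.read_csv("milan_weather.csv", parse_dates=["timestamp"])

# Aggregate call activity city-wide per hour, then align it with the weather record.
hourly_calls = (
    calls.set_index("timestamp")
         .resample("1h")["call_volume"]
         .sum()
         .to_frame()
)
merged = hourly_calls.join(weather.set_index("timestamp"), how="inner")

# One question a multi-source dataset makes easy to ask:
# does telephone activity change when it rains?
print(merged[["call_volume", "rainfall"]].corr())
```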

Demystifying the hackathon


Ferry Grijpink, Alan Lau, and Javier Vara at McKinsey: “The “hackathon” has become one of the latest vogue terms in business. Typically used in reference to innovation jams like those seen at Rails Rumble or TechCrunch Disrupt, it describes an event that pools eager entrepreneurs and software developers into a confined space for a day or two and challenges them to create a cool killer app. Yet hackathons aren’t just for the start-up tech crowd. Businesses are employing the same principles to break through organizational inertia and instill more innovation-driven cultures. That’s because they offer a baptism by fire: a short, intense plunge that assaults the senses and allows employees to experience creative disruption in a visceral way.

For large organizations in particular, hackathons can be adapted to greatly accelerate the process of digital transformation. They are less about designing new products and more about “hacking” away at old processes and ways of working. By giving management and others the ability to kick the tires of collaborative design practices, 24-hour hackathons can show that big organizations are capable of delivering breakthrough innovation at start-up speed. And that’s never been more critical: speed and agility are today central to driving business value, making hackathons a valuable tool for accelerating organizational change and fostering a quick-march, customer-centric, can-do culture.

What it takes to do a good 24-hour hackathon

A 24-hour hackathon differs from more established brainstorming sessions in that it is all about results and jump-starting a way of working, not just idea generation. Done well, it can shave 25 to 50 percent from the time it takes to bring a service or product to market. The best 24-hour hackathons share several characteristics. They are:

  • Centered on the customer. A hackathon is focused on a single customer process or journey and supports a clear business target—for example, speed, revenue growth, or a breakthrough customer experience. It goes from the front to the back, starting with the customer experience and moving through various organizational and process steps that come into play to deliver on that interaction and the complete customer journey.
  • Deeply cross-functional. This is not just for the IT crowd. Hackathons bring together people from across the business to force different ways of working a problem. In addition to IT and top management, whose involvement as participants or as sponsors is critical, hackathon participants can include frontline personnel, brand leaders, user-experience specialists, customer service, sales, graphic designers, and coders. That assortment forces a range of perspectives to keep groupthink at bay, while intense deadlines dispense with small talk and force quick, deep collaboration.
  • Starting from scratch. Successful hackathons deliberately challenge participants to reimagine an idealized method for addressing a given customer need, such as taking a paper-based, offline account-opening procedure and turning it into a simple, single-step, self-service online process. There’s an intentional irreverence in this disruption, too. Participants go in knowing that everything can and should be challenged. That’s liberating. The goal is to toss aside traditional notions of how things are done and reimagine the richest, most efficient way to improve the customer experience.
  • Concrete and focused on output. Sessions start with ideas but end with a working prototype that people can see and touch, such as clickable apps or a 3-D printed product. Output also includes a clear development path that highlights all the steps needed, including regulatory, IT, and other considerations, to accelerate production and implementation. After an intense design workshop, which includes sketching a minimum viable product and overnight coding and development of the prototype, a 24-hour hackathon typically concludes with an experiential presentation to senior leaders. This management showcase includes a real-life demonstration of the new prototype and a roadmap of IT and other capabilities needed to bring the final version to market in under 12 weeks.
  • Iterative and continuous. Once teams agree on a basic experience, designers and coders go to work creating a virtual model that the group vets, refines and re-releases in continual cycles until the new process or app meets the desired experience criteria. When hackathons end, there is usually a surge of enthusiasm and energy. But that energy can dissipate unless management puts in place new processes to sustain the momentum. That includes creating mechanisms for frontline employees to report back on progress and rewards for adopting new behaviors….(More)”

The Internet of Things: Frequently Asked Questions


Eric A. Fischer at the Congressional Research Service: “Internet of Things” (IoT) refers to networks of objects that communicate with other objects and with computers through the Internet. “Things” may include virtually any object for which remote communication, data collection, or control might be useful, such as vehicles, appliances, medical devices, electric grids, transportation infrastructure, manufacturing equipment, or building systems. In other words, the IoT potentially includes huge numbers and kinds of interconnected objects. It is often considered the next major stage in the evolution of cyberspace. Some observers believe it might even lead to a world where cyberspace and human space would seem to effectively merge, with unpredictable but potentially momentous societal and cultural impacts.

Two features make objects part of the IoT – a unique identifier and Internet connectivity. Such “smart” objects each have a unique Internet Protocol (IP) address to identify the object sending and receiving information. Smart objects can form systems that communicate among themselves, usually in concert with computers, allowing automated and remote control of many independent processes and potentially transforming them into integrated systems. Those systems can potentially impact homes and communities, factories and cities, and every sector of the economy, both domestically and globally. Although the full extent and nature of the IoT’s impacts remain uncertain, economic analyses predict that it will contribute trillions of dollars to economic growth over the next decade. Sectors that may be particularly affected include agriculture, energy, government, health care, manufacturing, and transportation.
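
A minimal sketch of a “smart object” in this sense, in Python: a device with a unique identifier publishes a sensor reading to a topic, and any other object or computer subscribed to that topic receives it. The broker address and topic layout are placeholder assumptions, and the example assumes the paho-mqtt package; real deployments add security, authentication, and error handling.

```python
import json
import time

from paho.mqtt import publish  # pip install paho-mqtt

# The object's unique identifier; real devices might use a hardware serial
# number or a provisioned ID alongside their IP address.
DEVICE_ID = "thermostat-0042"
TOPIC = f"building/floor2/{DEVICE_ID}/temperature"

reading = {"device": DEVICE_ID, "celsius": 21.5, "ts": time.time()}

# Every object or computer subscribed to TOPIC receives this message -- the
# mechanism by which independent smart objects behave as one integrated system.
publish.single(
    TOPIC,
    payload=json.dumps(reading),
    hostname="broker.example.org",  # placeholder broker, not a real host
    client_id=DEVICE_ID,
)
```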

The IoT can contribute to more integrated and functional infrastructure, especially in “smart cities,” with projected improvements in transportation, utilities, and other municipal services. The Obama Administration announced a smart-cities initiative in September 2015. There is no single federal agency that has overall responsibility for the IoT. Agencies may find IoT applications useful in helping them fulfill their missions. Each is responsible for the functioning and security of its own IoT, although some technologies, such as drones, may fall under the jurisdiction of other agencies as well. Various agencies also have relevant regulatory, sector-specific, and other mission-related responsibilities, such as the Departments of Commerce, Energy, and Transportation, the Federal Communications Commission, and the Federal Trade Commission.

Security and privacy are often cited as major issues for the IoT, given the perceived difficulties of providing adequate cybersecurity for it, the increasing role of smart objects in controlling components of infrastructure, and the enormous increase in potential points of attack posed by the proliferation of such objects. The IoT may also pose increased risks to privacy, with cyberattacks potentially resulting in exfiltration of identifying or other sensitive information about an individual. With an increasing number of IoT objects in use, privacy concerns also include questions about the ownership, processing, and use of the data they generate….(More)”

US Administration Celebrates Five-Year Anniversary of Challenge.gov


White House Fact Sheet: “Today, the Administration is celebrating the five-year anniversary of Challenge.gov, a historic effort by the Federal Government to collaborate with members of the public through incentive prizes to address our most pressing local, national, and global challenges. True to the spirit of the President’s charge from his first day in office, Federal agencies have collaborated with more than 200,000 citizen solvers—entrepreneurs, citizen scientists, students, and more—in more than 440 challenges, on topics ranging from accelerating the deployment of solar energy, to combating breast cancer, to increasing resilience after Hurricane Sandy.

Highlighting continued momentum from the President’s call to harness the ingenuity of the American people, the Administration is announcing:

  • Nine new challenges from Federal agencies, ranging from commercializing NASA technology, to helping students navigate their education and career options, to protecting marine habitats.
  • Expanding support for use of challenges and prizes, including new mentoring support from the General Services Administration (GSA) for interested agencies and a new $244 million innovation platform opened by the U.S. Agency for International Development (USAID) with over 70 partners.

In addition, multiple non-governmental institutions are announcing 14 new challenges, ranging from improving cancer screenings, to developing better technologies to detect, remove, and recover excess nitrogen and phosphorus from water, to increasing the resilience of island communities….

Expanding the Capability for Prize Designers to find one another

The GovLab and MacArthur Foundation Research Network on Opening Governance will launch an expert network for prizes and challenges. The Governance Lab (GovLab) and the Research Network will develop and launch the Network of Innovators (NoI) expert networking platform. NoI will make easily searchable the know-how of innovators on topics ranging from developing prize-backed challenges to opening up data and using crowdsourcing for the public good. Platform users will answer questions about their skills and experiences, creating a profile that matches them with those who have complementary knowledge, enabling mutual support and learning. A beta version for user testing within the Federal prize community will launch in early October, with a full launch at the end of October. NoI will be open to civil servants around the world…(More)”