Civic Crowd Analytics: Making sense of crowdsourced civic input with big data tools


Paper that: “… examines the impact of crowdsourcing on a policymaking process by using a novel data analytics tool called Civic CrowdAnalytics, applying Natural Language Processing (NLP) methods such as concept extraction, word association and sentiment analysis. By drawing on data from a crowdsourced urban planning process in the City of Palo Alto in California, we examine the influence of civic input on the city’s Comprehensive City Plan update. The findings show that the impact of citizens’ voices depends on the volume and the tone of their demands: a higher demand with a stronger tone results in more policy changes. We also found an interesting and unexpected result: the city government in Palo Alto more or less mirrors the online crowd’s voice, while citizen representatives filter rather than mirror the crowd’s will. While NLP methods show promise in making the analysis of crowdsourced input more efficient, there are several issues. The accuracy rates should be improved, and there is still a considerable amount of human work involved in training the algorithm….(More)”
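
The tool itself is not documented here, but the kind of sentiment analysis the paper names can be illustrated with a short, hedged sketch. The example below uses NLTK’s VADER analyzer to score the tone of a few invented civic comments; it is a generic illustration, not the Civic CrowdAnalytics pipeline.

```python
# Minimal sketch: scoring the tone of crowdsourced civic comments with
# NLTK's VADER sentiment analyzer. The comments are invented placeholders,
# not data from the Palo Alto process.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

comments = [
    "We urgently need more protected bike lanes on El Camino Real.",
    "The proposed parking changes will hurt small businesses downtown.",
    "I like the new park design.",
]

analyzer = SentimentIntensityAnalyzer()
for text in comments:
    scores = analyzer.polarity_scores(text)  # neg / neu / pos / compound
    if scores["compound"] > 0.05:
        tone = "positive"
    elif scores["compound"] < -0.05:
        tone = "negative"
    else:
        tone = "neutral"
    print(f"{tone:8s} {scores['compound']:+.2f}  {text}")
```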

Ten Actions to Implement Big Data Initiatives: A Study of 65 Cities


IBM Center for the Business of Government: “Professor Ho conducted a survey and phone interviews with city officials responsible for Big Data initiatives. Based on his research, the report presents a framework for Big Data initiatives which consists of two major cycles: the data cycle and the decision-making cycle. Each cycle is described in the report.

The trend toward Big Data initiatives is likely to accelerate in future years. In anticipation of the increased use of Big Data, Professor Ho identified factors that are likely to influence its adoption by local governments. He identified three organizational factors that influence adoption: leadership attention, adequate staff capacity, and pursuit of partners. In addition, he identified four organizational strategies that influence adoption: governance structures, team approach, incremental initiatives, and Big Data policies.

Based on his research findings, Professor Ho sets forth 10 recommendations for those responsible for implementing cities’ Big Data initiatives—five recommendations are directed to city leaders and five to city executives. A key recommendation is that city leaders should think about a “smart city system,” not just data. Another key recommendation is that city executives should develop a multi-year strategic data plan to enhance the effectiveness of Big Data initiatives….(More)”

Artificial Intelligence can streamline public comment for federal agencies


John Davis at the Hill: “…What became immediately clear to me was that — although not impossible to overcome — the lack of consistency and shared best practices across all federal agencies in accepting and reviewing public comments was a serious impediment. Fulfilling the promise of Natural Language Processing and cognitive computing to make the public comment process light-years faster and more transparent becomes that much more difficult without a consensus among federal agencies on what type of data is collected – and how.

“There is a whole bunch of work we have to do around getting government to be more customer friendly and making it at least as easy to file your taxes as it is to order a pizza or buy an airline ticket,” President Obama recently said in an interview with WIRED. “Whether it’s encouraging people to vote or dislodging Big Data so that people can use it more easily, or getting their forms processed online more simply — there’s a huge amount of work to drag the federal government and state governments and local governments into the 21st century.”

…expanding the discussion around Artificial Intelligence and regulatory processes to include how the technology should be leveraged to ensure fairness and responsiveness in the very basic processes of rulemaking – in particular public notices and comments. These technologies could also enable us to consider not just public comments formally submitted to an agency, but the entire universe of statements made through social media posts, blogs, chat boards — and conceivably every other electronic channel of public communication.

Obviously, an anonymous comment on the Internet should not carry the same credibility as a formally submitted, personally signed statement, just as sworn testimony in court holds far greater weight than a grapevine rumor. But so much public discussion today occurs on Facebook pages, in Tweets, on news website comment sections, etc. Anonymous speech enjoys explicit protection under the Constitution, based on a justified expectation that certain sincere statements of sentiment might result in unfair retribution from the government.

Should we simply ignore the valuable insights about actual public sentiment on specific issues made possible through the power of Artificial Intelligence, which can ascertain meaning from an otherwise unfathomable ocean of relevant public conversations? With certain qualifications, I believe Artificial Intelligence, or AI, should absolutely be employed in the critical effort to gain insights from public comments – signed or anonymous.
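
As a concrete illustration of what such machine assistance might look like, one common first step is collapsing near-duplicate, form-letter submissions so analysts can focus on substantively distinct input. A rough sketch with scikit-learn, using invented comment texts and an arbitrary similarity threshold:

```python
# Sketch: flag near-duplicate public comments with TF-IDF cosine similarity.
# scikit-learn is assumed to be installed; the comments are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

comments = [
    "I oppose the proposed rule because it burdens small farms.",
    "I oppose this proposed rule because it burdens small farms.",
    "Please strengthen the emissions limits in section 4.",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(comments)
sim = cosine_similarity(tfidf)  # pairwise similarity matrix

THRESHOLD = 0.9  # tunable; higher means a stricter definition of "duplicate"
for i in range(len(comments)):
    for j in range(i + 1, len(comments)):
        if sim[i, j] >= THRESHOLD:
            print(f"Comments {i} and {j} look like near-duplicates "
                  f"(similarity {sim[i, j]:.2f})")
```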

“In the criminal justice system, some of the biggest concerns with Big Data are the lack of data and the lack of quality data,” the NSTC report authors state. “AI needs good data. If the data is incomplete or biased, AI can exacerbate problems of bias.” As a former federal criminal prosecutor and defense attorney, I am well familiar with the absolute necessity to weigh the relative value of various forms of evidence – or in this case, data…(More)

Privacy Preservation in the Age of Big Data


A survey and primer by John S. Davis II and Osonde Osoba at RAND: “Anonymization or de-identification techniques are methods for protecting the privacy of subjects in sensitive data sets while preserving the utility of those data sets. The efficacy of these methods has come under repeated attacks as the ability to analyze large data sets becomes easier. Several researchers have shown that anonymized data can be re-identified to reveal the identity of the data subjects via approaches such as so-called “linking.” In this report, we survey the anonymization landscape of approaches for addressing re-identification and we identify the challenges that still must be addressed to ensure the minimization of privacy violations. We also review several regulatory policies for disclosure of private data and tools to execute these policies….(More)”.
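
The report covers the formal machinery, but the intuition behind one of the most common anonymization criteria, k-anonymity, fits in a few lines: any record whose quasi-identifiers (for example ZIP code, birth year and gender) place it in a group of fewer than k records is especially exposed to linking attacks. A minimal sketch with pandas, using invented records:

```python
# Sketch: find records that violate k-anonymity over a set of quasi-identifiers.
# pandas is assumed to be installed; the data below is invented.
import pandas as pd

k = 2  # every quasi-identifier combination should cover at least k records
quasi_identifiers = ["zip", "birth_year", "gender"]

df = pd.DataFrame([
    {"zip": "94301", "birth_year": 1980, "gender": "F", "diagnosis": "A"},
    {"zip": "94301", "birth_year": 1980, "gender": "F", "diagnosis": "B"},
    {"zip": "94306", "birth_year": 1975, "gender": "M", "diagnosis": "C"},
])

# Size of each quasi-identifier group, aligned back to the original rows.
group_sizes = df.groupby(quasi_identifiers)["diagnosis"].transform("size")
at_risk = df[group_sizes < k]  # rows whose group has fewer than k members
print(f"{len(at_risk)} record(s) fail {k}-anonymity:")
print(at_risk)
```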

Remote Data Collection: Three Ways to Rethink How You Collect Data in the Field


Magpi: “As mobile devices have gotten less and less expensive – and as millions worldwide have climbed out of poverty – it has become quite common to see a mobile phone in every person’s hand, or at least in every family. This means we can use approaches to data collection that simply were not possible before….

In our Remote Data Collection Guide, we discuss these new technologies and:

  • The key benefits of remote data collection in each of three different situations.
  • The direct impact of remote data collection on reducing the cost of your efforts.
  • How to start the process of choosing the right option for your needs….(More)”

Three Use Cases How Big Data Helps Save The Earth


DataFloq: “The earth has been having a difficult time for quite a while now. Deforestation is still happening at a large scale across the globe; in Brazil alone, 40,200 hectares were deforested in the past year. The Great Pacific garbage patch is still growing, and smog in Beijing is more common than a clear day. Unfortunately, this is nothing new. What is new is a possible solution: for the past few years, scientists, companies and governments have been turning to Big Data to solve such problems, or even to prevent them from happening. It turns out that Big Data can help save the earth and, if done correctly, this could lead to significant results in the coming years. Let’s have a look at some fascinating use cases of how Big Data can contribute:

Monitoring Biodiversity Across the Globe

Conservation International, a non-profit environmental organization with a mission to protect nature and its biodiversity, crunches vast amounts of image data to monitor biodiversity around the world. At 16 important sites across the continents, it has installed over 1,000 smart cameras. Thanks to motion sensors, these cameras capture images as soon as the sensor is triggered by animals passing by; per site, the cameras cover approximately 2,000 square kilometres…. They automatically determine which species appear in the images and enrich the data with other information, such as climate data, flora and fauna data and land-use data, to better understand how animal populations change over time…. The Wildlife Picture Index (WPI) Analytics System is a project dashboard and analytics tool for visualizing user-friendly, near real-time, data-driven insights on biodiversity. The WPI monitors ground-dwelling tropical medium and large mammals and birds, species that are important economically, aesthetically and ecologically.
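
The published WPI rests on formal occupancy modelling, but its core summary statistic can be approximated crudely: estimate each species’ occupancy (here simply the fraction of camera sites where it was detected) and take the geometric mean of occupancy relative to a baseline year. A simplified sketch with invented detection counts:

```python
# Simplified sketch of a Wildlife Picture Index-style summary statistic.
# The real WPI uses formal occupancy models; here occupancy is crudely
# approximated as the fraction of camera sites with >= 1 detection.
# All numbers below are invented.
from math import prod

N_SITES = 60  # camera-trap sites at one monitoring location

# species -> {year: number of sites with at least one detection}
detections = {
    "ocelot":   {2010: 24, 2014: 18},
    "agouti":   {2010: 40, 2014: 42},
    "curassow": {2010: 15, 2014: 9},
}

def occupancy(sites_detected: int) -> float:
    return sites_detected / N_SITES

def wpi(year: int, baseline: int = 2010) -> float:
    """Geometric mean, across species, of occupancy relative to the baseline year."""
    ratios = [occupancy(d[year]) / occupancy(d[baseline])
              for d in detections.values()]
    return prod(ratios) ** (1 / len(ratios))

print(f"WPI 2014 vs 2010 baseline: {wpi(2014):.2f}")  # values below 1 indicate decline
```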

Using Satellite Imagery to Combat Deforestation

Mapping deforestation is becoming a lot easier today thanks to Big Data. Imagery analytics allows environmentalists and policymakers to monitor, almost in real time, the status of forests around the globe with the help of satellite imagery. New tools like Global Forest Watch use massive amounts of high-resolution NASA satellite imagery to help conservation organizations, governments and concerned citizens monitor deforestation in “near-real time.”…

But that’s not all. Planet Labs has developed a tiny satellite that it is currently sending into space dozens at a time. Each satellite measures only 10 by 10 by 30 centimeters but is outfitted with the latest technology. The company aims to create a high-resolution image of every spot on the earth, updated daily. Once available, this will generate massive amounts of data that Planet Labs will open-source so that others can develop applications to improve the earth.
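
Production systems such as Global Forest Watch rely on trained classifiers over long image time series, but the underlying idea of spotting vegetation loss in multispectral imagery can be sketched with NDVI, the normalized difference vegetation index computed from the red and near-infrared bands. A toy NumPy example with invented pixel values:

```python
# Toy sketch of NDVI-based vegetation-loss flagging between two dates.
# Real deforestation monitoring uses trained classifiers over time series;
# the pixel values below are invented.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Two tiny 2x2 "scenes" of the same area, before and after.
red_before = np.array([[0.05, 0.06], [0.05, 0.07]])
nir_before = np.array([[0.45, 0.50], [0.48, 0.52]])
red_after  = np.array([[0.05, 0.20], [0.05, 0.22]])
nir_after  = np.array([[0.44, 0.22], [0.47, 0.25]])

loss = ndvi(nir_before, red_before) - ndvi(nir_after, red_after)
flagged = loss > 0.3  # threshold is arbitrary here; real systems calibrate it
print("Pixels flagged as possible vegetation loss:\n", flagged)
```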

Monitoring and Predicting with Smart Oceans

More than two-thirds of the world consists of oceans, and these too can be monitored. Earlier this year, IBM Canada and Ocean Networks Canada announced a three-year program to better understand British Columbia’s oceans. Using the latest technology and sensors, they want to predict offshore accidents, natural disasters and tsunamis, and forecast the impact of these incidents. Using hundreds of cabled marine sensors, they can monitor waves, currents, water quality and vessel traffic in some of the major shipping channels….These are just three examples of how Big Data can help save the planet. There are of course many more fascinating examples, and here is a list of ten such use cases….(More)”

Seeing Cities Through Big Data


Book edited by Thakuriah, Piyushimita (Vonu), Tilahun, Nebiyou, and Zellner, Moira: “… introduces the latest thinking on the use of Big Data in the context of urban systems, including research and insights on human behavior, urban dynamics, resource use, sustainability and spatial disparities, where it promises improved planning, management and governance in the urban sectors (e.g., transportation, energy, smart cities, crime, housing, urban and regional economies, public health, public engagement, urban governance and political systems), as well as Big Data’s utility in decision-making, and the development of indicators to monitor economic and social activity, and for urban sustainability, transparency, livability, social inclusion, place-making, accessibility and resilience…(More)”

The challenges and limits of big data algorithms in technocratic governance


Paper by Marijn Janssen and George Kuk in Government Information Quarterly: “Big data is driving the use of algorithms in governing mundane but mission-critical tasks. Algorithms seldom operate on their own, and their (dis)utilities are dependent on the everyday aspects of data capture, processing and utilization. However, as algorithms become increasingly autonomous and invisible, it becomes harder for the public to detect them and scrutinize their impartiality. Algorithms can systematically introduce inadvertent bias, reinforce historical discrimination, favor a political orientation or reinforce undesired practices. Yet it is difficult to hold algorithms accountable, as they continuously evolve with technologies, systems, data and people, the ebb and flow of policy priorities, and the clashes between new and old institutional logics. Greater openness and transparency do not necessarily improve understanding. In this editorial we argue that by unraveling the imperceptibility, materiality and governmentality of how algorithms work, we can better tackle the inherent challenges in the curatorial practice of data and algorithms. Fruitful avenues for further research on using algorithms to harness the merits and utilities of a computational form of technocratic governance are presented….(More)

 

Helping Smart Cities Harness Big Data


Linda Poon in CityLab: “Harnessing the power of open data is key to developing the smart cities of the future. But not all governments have the capacity—be that funding or human capital—to collect all the necessary information and turn it into a tool. That’s where Mapbox comes in.

Mapbox offers open-source mapping platforms, and is no stranger to turning complex data into visualizations cities can use, whether it’s mapping traffic fatalities in the U.S. or the conditions of streets in Washington, D.C., during last year’s East Coast blizzard. As part of the White House Smart Cities Initiative, which announced this week that it would make more than $80 million in tech investments this year, the company is rolling out Mapbox Cities, a new “mentorship” program that, for now, will give three cities the tools and support they need to solve some of their most pressing urban challenges. It issued a call for applications earlier this week, and responses have poured in from across the globe, says Christina Franken, who specializes in smart cities at Mapbox.

“It’s very much an experimental approach to working with cities,” she says. “A lot of cities have open-data platforms but they don’t really do something with the data. So we’re trying to bridge that gap.”

[Image: During Hurricane Sandy, Mapbox launched a tool to help New Yorkers figure out if they were in an evacuation zone. (Mapbox)]

But the company isn’t approaching the project blindly. In a way, Mapbox has the necessary experience to help cities jumpstart their own projects. Its resume includes, for example, a map that visualizes the sheer quantity of traffic fatalities along any commuting route in the U.S., showcasing its ability to turn a whopping five years’ worth of data into a public-safety tool. During 2012’s Hurricane Sandy, they created a disaster-relief tool to help New Yorkers find shelter.

And that’s just in the United States. Mapbox also recently started a group focused primarily on humanitarian issues, bringing its mapping and data-collection tools to aid organizations all over the world in times of crisis. It provides free access to its vast collection of resources and works closely with collaborators to help them customize maps to specific needs….(More)”

Social Machines: The Coming Collision of Artificial Intelligence, Social Networking, and Humanity


Book by James Hendler and Alice Mulvehill: “Will your next doctor be a human being—or a machine? Will you have a choice? If you do, what should you know before making it?

This book introduces the reader to the pitfalls and promises of artificial intelligence in its modern incarnation and the growing trend of systems to “reach off the Web” into the real world. The convergence of AI, social networking, and modern computing is creating an historic inflection point in the partnership between human beings and machines with potentially profound impacts on the future not only of computing but of our world.

AI experts and researchers James Hendler and Alice Mulvehill explore the social implications of AI systems in the context of a close examination of the technologies that make them possible. The authors critically evaluate the utopian claims and dystopian counterclaims of prognosticators. Social Machines: The Coming Collision of Artificial Intelligence, Social Networking, and Humanity is your richly illustrated field guide to the future of your machine-mediated relationships with other human beings and with increasingly intelligent machines.

What you’ll learn

• What the concept of a social machine is and how the activities of non-programmers are contributing to machine intelligence

• How modern artificial intelligence technologies, such as Watson, are evolving and how they process knowledge from both carefully produced information (such as Wikipedia or journal articles) and big data collections

• The fundamentals of neuromorphic computing

• The fundamentals of knowledge graph search and linked data as well as the basic technology concepts that underlie networking applications such as Facebook and Twitter

• How the change in attitudes towards cooperative work on the Web, especially in the younger demographic, is critical to the future of Web applications…(More)”