Socialbots and Their Friends: Digital Media and the Automation of Sociality


Book edited by Robert W. Gehl and Maria Bakardjieva: “Many users of the Internet are aware of bots: automated programs that work behind the scenes to come up with search suggestions, check the weather, filter emails, or clean up Wikipedia entries. More recently, a new software robot has been making its presence felt in social media sites such as Facebook and Twitter – the socialbot. However, unlike other bots, socialbots are built to appear human. While a weatherbot will tell you if it’s sunny and a spambot will incessantly peddle Viagra, socialbots will ask you questions, have conversations, like your posts, retweet you, and become your friend. All the while, if they’re well-programmed, you won’t know that you’re tweeting and friending with a robot.

Who benefits from the use of software robots? Who loses? Does a bot deserve rights? Who pulls the strings of these bots? Who has the right to know what about them? What does it mean to be intelligent? What does it mean to be a friend? Socialbots and Their Friends: Digital Media and the Automation of Sociality is one of the first academic collections to critically consider the socialbot and tackle these pressing questions….(More)”


Artificial Intelligence Could Help Colleges Better Plan What Courses They Should Offer


Jeffrey R. Young at EdSurge: Big data could help community colleges better predict how industries are changing so they can tailor their IT courses and other programs. After all, if Amazon can forecast what consumers will buy and pre-stock items in its warehouses to meet the expected demand, why can’t colleges do the same thing when planning their curricula, using predictive analytics to make sure new degree or certificate programs are started just in time for expanding job opportunities?

That’s the argument made by Gordon Freedman, president of the nonprofit National Laboratory for Education Transformation. He’s part of a new center that will do just that, by building a data warehouse that brings together up-to-date information on what skills employers need and what colleges currently offer—and then applying artificial intelligence to attempt to predict when sectors or certain employment needs might be expanding.

He calls the approach “opportunity engineering,” and the center boasts some heavy-hitting players to assist in the effort, including the University of Chicago, the San Diego Supercomputer Center and Argonne National Laboratory. It’s called the National Center for Opportunity Engineering & Analysis.
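
The center’s forecasting pipeline isn’t publicly documented, but the basic idea – trend the demand for each skill from a stream of job postings and compare it against what local programs already cover – can be sketched in a few lines of Python. The skill names, posting counts, coverage list and growth threshold below are invented purely for illustration.

```python
# Hypothetical sketch: flag skills whose local job-posting demand is trending
# upward but which no existing college program covers. All data is invented.

def trend_slope(series):
    """Least-squares slope of a monthly series (postings added per month)."""
    n = len(series)
    xs = list(range(n))
    x_mean = sum(xs) / n
    y_mean = sum(series) / n
    cov = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series))
    var = sum((x - x_mean) ** 2 for x in xs)
    return cov / var if var else 0.0

# Monthly counts of local job postings mentioning each skill (hypothetical).
posting_counts = {
    "cloud administration": [40, 44, 51, 60, 72, 85],
    "help desk support":    [90, 88, 91, 87, 89, 90],
    "data analytics":       [25, 30, 38, 47, 55, 70],
}

# Skills already covered by existing certificate programs (hypothetical).
covered_skills = {"help desk support"}

GROWTH_THRESHOLD = 5.0  # postings/month of added demand before flagging

for skill, series in posting_counts.items():
    slope = trend_slope(series)
    if slope >= GROWTH_THRESHOLD and skill not in covered_skills:
        print(f"Consider a new program: {skill} ({slope:+.1f} postings/month)")
```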

Ian Roark, vice president of workforce development at Pima Community College in Arizona, is among those eager for this kind of “opportunity engineering” to emerge.

He explains that when colleges want to start new programs, they face a long haul—it takes time to develop a new curriculum, put it through an internal review, and then send it through an accreditor….

Other players are already trying to translate the job market into a giant data set to spot trends. LinkedIn sits on one of the biggest troves of data, with hundreds of millions of job profiles, and ambitions to create what it calls the “economic graph” of the economy. But not everyone is on LinkedIn, which attracts mainly those in white-collar jobs. And companies such as Burning Glass Technologies scan hundreds of thousands of job listings and attempt to provide real-time intelligence on what employers say they’re looking for. Those sources still don’t paint the full picture, Freedman argues – for instance, what new jobs are forming at companies.

“We need better information from the employer, better information from the job seeker and better information from the college, and that’s what we’re going after,” Freedman says…(More)”.

Can you crowdsource water quality data?


Pratibha Mistry at The Water Blog (World Bank): “The recently released Contextual Framework for Crowdsourcing Water Quality Data lays out a strategy for citizen engagement in decentralized water quality monitoring, enabled by the “mobile revolution.”

According to the WHO, 1.8 billion people lack access to safe drinking water worldwide. Poor source water quality, non-existent or insufficient treatment, and defects in water distribution systems and storage mean these consumers use water that often doesn’t meet the WHO’s Guidelines for Drinking Water Quality.

The crowdsourcing framework develops a strategy to engage citizens in measuring and learning about the quality of their own drinking water. Through their participation, citizens provide utilities and water supply agencies with cost-effective water quality data in near-real time. The framework follows a typical crowdsourcing model: consumers use their mobile phones to report water quality information to a central service. That service receives the information, then repackages and shares it via mobile phone messages, websites, dashboards, and social media. Individual citizens can thus be educated about their water quality, and water management agencies and other stakeholders can use the data to improve water management; it’s a win-win.
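
A minimal sketch of the central service described above might look like the following: ingest citizen-submitted readings, group them by location, and flag guideline exceedances for utilities and dashboards. The field names, phone numbers and readings are invented; the only real reference point is the WHO guideline that E. coli should be undetectable in any 100 ml drinking-water sample.

```python
# Illustrative sketch of the crowdsourcing flow: citizens report readings from
# their phones; a central service aggregates them and flags unsafe locations.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class WaterQualityReport:
    location: str                 # e.g. ward or water-point ID (hypothetical scheme)
    phone: str                    # reporter's number, for a confirmation message
    ecoli_cfu_per_100ml: float
    chlorine_mg_per_l: float

ECOLI_LIMIT = 0.0  # WHO guideline: E. coli undetectable in any 100 ml sample

def summarise(reports):
    """Group incoming reports by location and flag guideline exceedances."""
    by_location = defaultdict(list)
    for r in reports:
        by_location[r.location].append(r)
    summary = {}
    for location, rs in by_location.items():
        unsafe = sum(1 for r in rs if r.ecoli_cfu_per_100ml > ECOLI_LIMIT)
        summary[location] = {"reports": len(rs), "unsafe": unsafe, "alert": unsafe > 0}
    return summary

# Example: two citizen submissions from the same ward (invented data).
reports = [
    WaterQualityReport("ward-7", "+255700000001", 0.0, 0.4),
    WaterQualityReport("ward-7", "+255700000002", 3.0, 0.1),
]
for location, stats in summarise(reports).items():
    status = "ALERT" if stats["alert"] else "ok"
    print(f"{location}: {stats['unsafe']}/{stats['reports']} unsafe readings [{status}]")
```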

[Figure: A well-implemented crowdsourcing project both depends on and benefits end users. Source: modified from Hutchings, M., Dev, A., Palaniappan, M., Srinivasan, V., Ramanathan, N., and Taylor, J. 2012. “mWASH: Mobile Phone Applications for the Water, Sanitation, and Hygiene Sector.” Pacific Institute, Oakland, California.]

Several groups, from the private sector to academia to non-profits, have recently taken an interest in developing a variety of so-called mWASH apps (mobile phone applications for the water, sanitation, and hygiene, or WASH, sector). A recent academic study analyzed how mobile phones might facilitate the flow of water quality data between water suppliers and public health agencies in Africa. USAID has invested in piloting a mobile application in Tanzania to help consumers test their water for E. coli….(More)”

The Government Isn’t Doing Enough to Solve Big Problems with AI


Mike Orcutt at MIT Technology Review: “The government should play a bigger role in developing new tools based on artificial intelligence, or we could miss out on revolutionary applications because they don’t have obvious commercial upside.

That was the message from prominent AI technologists and researchers at a Senate committee hearing last week. They agreed that AI is in a crucial developmental moment, and that government has a unique opportunity to shape its future. They also said that the government is in a better position than technology companies to invest in AI applications aimed at broad societal problems.

Today just a few companies, led by Google and Facebook, account for the lion’s share of AI R&D in the U.S. But Eric Horvitz, technical fellow and managing director of Microsoft Research, told the committee members that there are important areas that are rich and ripe for AI innovation, such as homelessness and addiction, where the industry isn’t making big investments. The government could help support those pursuits, Horvitz said.

For a more specific example, take the plight of a veteran seeking information online about medical options, says Andrew Moore, dean of the School of Computer Science at Carnegie Mellon University. If an application that could respond to freeform questions, search multiple government data sets at once, and provide helpful information about a veteran’s health care options were commercially attractive, it might be available already, he says.

There is a “real hunger for basic research,” says Greg Brockman, cofounder and chief technology officer of the nonprofit research company OpenAI, because technologists understand that they haven’t made the most important advances yet. If we continue to leave the bulk of it to industry, we could miss out not only on useful applications but also on the chance to adequately explore urgent scientific questions about ethics, safety, and security while the technology is still young, says Brockman. Since the field of AI is growing “exponentially,” it’s important to study these things now, he says, and the government could make that a “top line thing that they are trying to get done.”….(More)”.

Too Much Democracy in All the Wrong Places: Toward a Grammar of Participation


Christopher M. Kelty at Current Anthropology: “Participation is a concept and practice that governs many aspects of new media and new publics. There is a wide range of attempts to create more of it and a surprising lack of theorization. In this paper I attempt to present a “grammar” of participation by looking at three cases where participation has been central in the contemporary moment of new and social media and the Internet, as well as in the past, stretching back to the 1930s: citizen participation in public administration, workplace participation, and participatory international development. Across these three cases I demonstrate that the grammar of participation shifts from a language of normative enthusiasm to one of critiques of co-optation and bureaucratization and back again. I suggest that this perpetually aspirational logic results in the problem of “too much democracy in all the wrong places.”…(More)”

What does Big Data mean to public affairs research?


Ines Mergel, R. Karl Rethemeyer, and Kimberley R. Isett at LSE’s The Impact Blog: “…Big Data promises access to vast amounts of real-time information from public and private sources that should allow insights into behavioral preferences, policy options, and methods for public service improvement. In the private sector, marketing preferences can be aligned with customer insights gleaned from Big Data. In the public sector, however, government agencies are by design less responsive and agile in their real-time interactions – instead using time for deliberation to respond to broader public goods. The responsiveness Big Data promises is a virtue in the private sector but could be a vice in the public.

Moreover, we raise several important concerns with respect to relying on Big Data as a decision- and policymaking tool. While in the abstract Big Data is comprehensive and complete, in practice today’s version of Big Data has several features that should give public sector practitioners and scholars pause. First, most of what we think of as Big Data is really ‘digital exhaust’ – that is, data collected for purposes other than public sector operations or research. Data sets that might be publicly available from social networking sites such as Facebook or Twitter were designed for purely technical reasons. The degree to which this data lines up conceptually and operationally with public sector questions is purely coincidental. Use of digital exhaust for purposes not previously envisioned can go awry. A good example is Google’s attempt to predict the flu based on search terms.

Second, we believe there are ethical issues that may arise when researchers use data that was created as a byproduct of citizens’ interactions with each other or with a government social media account. Citizens are not able to understand or control how their data is used and have not given consent for storage and re-use of their data. We believe that research institutions need to examine their institutional review board processes to help researchers and their subjects understand important privacy issues that may arise. Too often it is possible to infer individual-level insights about private citizens from a combination of data points and thus predict their behaviors or choices.

Lastly, Big Data can only represent those who spend some part of their life online. Yet we know that certain segments of society opt in to life online (by using social media or network-connected devices), opt out (either knowingly or passively), or lack the resources to participate at all. The demography of the internet matters. For instance, researchers tend to use Twitter data because its API allows data collection for research purposes, but many forget that Twitter users are not representative of the overall population. Instead, as a recent Pew Social Media 2016 update shows, only 24% of all online adults use Twitter. Internet participation generally is biased in terms of age, educational attainment, and income – all of which correlate with gender, race, and ethnicity. We believe therefore that predictive insights are potentially biased toward certain parts of the population, making generalisations highly problematic at this time….(More)”
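
One standard, partial remedy for that skew is post-stratification: reweight the online sample so its demographic composition matches the population before computing an estimate. The sketch below uses invented shares and support rates solely to show how the unweighted and reweighted figures diverge; weighting cannot, of course, recover the views of people who are not online at all.

```python
# Illustrative post-stratification with invented numbers: reweight an online
# sample so its age mix matches the adult population before estimating support.

sample_share     = {"18-29": 0.45, "30-49": 0.35, "50+": 0.20}   # share of online sample
population_share = {"18-29": 0.20, "30-49": 0.35, "50+": 0.45}   # share of all adults

# Hypothetical support for some policy, measured within each age group.
support = {"18-29": 0.70, "30-49": 0.55, "50+": 0.40}

naive    = sum(sample_share[g] * support[g] for g in support)
weighted = sum(population_share[g] * support[g] for g in support)

print(f"Unweighted (sample-composition) estimate: {naive:.2f}")     # 0.59
print(f"Post-stratified estimate:                 {weighted:.2f}")  # 0.51
```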

Future of e-government: learning from the past


Special issue of SOCRATES edited by Manoj Dixit: “We are living in an era of digitization and thus moving towards digital government. The use of ICT in public administration is beneficial, and it is not a mere coincidence that the top 10 countries in e-government implementation (according to the UN E-Government Survey 2016) are flourishing democracies. There has been a sharp rise in the number of countries using e-government to provide public services online through a one-stop platform. According to the 2016 survey, 90 countries now offer one or more single-entry portals on public information or online services, or both, and 148 countries provide at least one form of online transaction service. More and more countries are making efforts through e-government to ensure and increase inclusiveness, effectiveness, accountability and transparency in their public institutions. Across the globe, data for public information and security is being opened up. The 2016 survey shows that 128 countries now provide datasets on government spending in machine-readable formats. E-government and innovation seem to have provided significant opportunities to transform public administration into an instrument of sustainable development. Governments around the globe are rapidly transforming. The use of information and communication technology in public administration – combined with organizational change and new skills – seems to be improving public services and democratic processes and strengthening support to public policies. There has been an increased effort to utilize advanced electronic and mobile services that benefit all. Fixed and wireless broadband subscriptions have increased unevenly across regions, with Europe leading and Africa still lagging behind. We have to focus on these substantial regional disparities and the growing divide. All countries agreed, in SDG 9, that a major effort is required to ensure universal access to the internet in the least developed countries. The rise of social media and its easy access seems to have enabled an increasing number of countries to move towards participatory decision making, in which developed European countries are among the top 50 performers. But diminishing collective thinking and rising individual thinking are issues we will have to deal with in the future. There are more sensitive issues, such as the new classification of citizens into literate and illiterate, e-literate and e-illiterate, that governments need to address. It is a good sign that many developing countries are making good progress. Enhanced e-participation can support the realization of the SDGs by enabling more participatory decision making, but the success of e-government will ultimately depend on our ability to resolve, with sensitivity, the contrasting issues raised by this transition.

In this issue of SOCRATES we have discussed this new era of digital government. We have focused on what we have learned from the past and the future we want. From discussions on the role of e-governance within local government settings in a modern democratic state to the experience of academia with online examinations, we have tried to include every possible aspect of e-government….(More)”

A Guide to Data Innovation for Development – From idea to proof-of-concept


Press Release: “UNDP and UN Global Pulse today released a comprehensive guide on how to integrate new sources of data into development and humanitarian work.

New and emerging data sources such as mobile phone data, social media, remote sensors and satellites have the potential to improve the work of governments and development organizations across the globe.

Entitled ‘A Guide to Data Innovation for Development – From idea to proof-of-concept,’ this publication was developed by practitioners for practitioners. It provides staff of UN agencies and international non-governmental organizations with step-by-step guidance for working with new sources of data.

The guide is the result of a collaboration between UNDP and UN Global Pulse, with support from UN Volunteers. Led by UNDP innovation teams in Europe and Central Asia and the Arab States, six UNDP offices – in Armenia, Egypt, Kosovo[1], fYR Macedonia, Sudan and Tunisia – each completed data innovation projects applicable to development challenges on the ground.

The publication builds on these successful case trials and on the expertise of data innovators from UNDP and UN Global Pulse who managed the design and development of those projects.

It provides practical guidance for jump-starting a data innovation project, from the design phase through the creation of a proof-of-concept.

The guide is structured into three sections – (I) Explore the Problem & System, (II) Assemble the Team and (III) Create the Workplan. Each section comprises a series of tools for completing the steps needed to initiate and design a data innovation project, to engage the right partners and to make sure that adequate privacy and protection mechanisms are applied.

…Download ‘A Guide to Data Innovation for Development – From idea to proof-of-concept’ here.”

Just good enough data: Figuring data citizenships through air pollution sensing and data stories


Jennifer Gabrys, Helen Pritchard, and Benjamin Barratt in Big Data & Society: “Citizen sensing, or the use of low-cost and accessible digital technologies to monitor environments, has contributed to new types of environmental data and data practices. Through a discussion of participatory research into air pollution sensing with residents of northeastern Pennsylvania concerned about the effects of hydraulic fracturing, we examine how new technologies for generating environmental data also give rise to new problems for analysing and making sense of citizen-gathered data. After first outlining the citizen data practices we collaboratively developed with residents for monitoring air quality, we then describe the data stories that we created along with citizens as a method and technique for composing data. We further mobilise the concept of ‘just good enough data’ to discuss the ways in which citizen data gives rise to alternative ways of creating, valuing and interpreting datasets. We specifically consider how environmental data raises different concerns and possibilities in relation to Big Data, which can be distinct from security or social media studies. We then suggest ways in which citizen datasets could generate different practices and interpretive insights that go beyond the usual uses of environmental data for regulation, compliance and modelling to generate expanded data citizenships….(More)”

Towards Scalable Governance: Sensemaking and Cooperation in the Age of Social Media


Iyad Rahwan in Philosophy & Technology: “Cybernetics, or self-governance of animal and machine, requires the ability to sense the world and to act on it in an appropriate manner. Likewise, self-governance of a human society requires groups of people to collectively sense and act on their environment. I argue that the evolution of political systems is characterized by a series of innovations that attempt to solve (among others) two ‘scalability’ problems: scaling up a group’s ability to make sense of an increasingly complex world, and to cooperate in increasingly larger groups. I then explore some recent efforts toward using the Internet and social media to provide alternative means for addressing these scalability challenges, under the banners of crowdsourcing and computer-supported argumentation. I present some lessons from those efforts about the limits of technology, and the research directions more likely to bear fruit….(More)”