What top technologies should the next generation know how to use?


Lottie Waters at Devex: “Technology provides some great opportunities for global development, and a promising future. But for the next generation of professionals to succeed, it’s vital they stay up to date with the latest tech, innovations, and tools.

In a recent report produced by Devex in collaboration with the United States Agency for International Development and DAI, some 86 percent of survey respondents believe the technology, skills, and approaches development professionals will be using in 10 years’ time will be significantly different to today’s.

In fact, “technology for development” is regarded as the sector that will see the most development progress, but is also cited as the one that will see the biggest changes in skills required, according to the survey.

“As different technologies develop, new possibilities will open up that we may not even be aware of yet. These opportunities will bring new people into the development sector and require those in it to be more agile in adapting technologies to meet development challenges,” said one survey respondent.

While “blockchain,” “artificial intelligence,” and “drones” may be the current buzzwords surrounding tech in global development, geographical information systems, or GIS, and big data are actually the top technologies respondents believe the next generation of development professionals should learn how to utilize.

So, how are these technologies currently being used in development, how might this change in the near future, and what will their impact be in the next 10 years? Devex spoke with experts in the field who are already integrating these technologies into their work to find out….(More)”

Algorithms are taking over – and woe betide anyone they class as a ‘deadbeat’


Zoe Williams at The Guardian: “The radical geographer and equality evangelist Danny Dorling tried to explain to me once why an algorithm could be bad for social justice.

Imagine if email inboxes became intelligent: your messages would be prioritised on arrival, so if the recipient knew you and often replied to you, you’d go to the top; I said that was fine. That’s how it works already. If they knew you and never replied, you’d go to the bottom, he continued. I said that was fair – it would teach me to stop annoying that person.

If you were a stranger, but typically other people replied to you very quickly – let’s say you were Barack Obama – you’d sail right to the top. That seemed reasonable. And if you were a stranger who others usually ignored, you’d fall off the face of the earth.

“Well, maybe they should get an allotment and stop emailing people,” I said.

“Imagine how angry those people would be,” Dorling said. “They already feel invisible and they [would] become invisible by design.”…
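As a rough, purely illustrative sketch of Dorling's thought experiment (the scoring rule, field names, and data below are invented for illustration and do not describe any real email product), the prioritisation he describes might look like this:

```python
from dataclasses import dataclass

@dataclass
class Sender:
    address: str
    replies_from_recipient: int   # how often this recipient has replied to them
    global_reply_rate: float      # share of other people who reply to them

def priority(sender: Sender) -> float:
    """Toy scoring rule: known-and-answered senders float to the top,
    ignored strangers sink to the bottom."""
    if sender.replies_from_recipient > 0:
        return 1.0 + sender.replies_from_recipient   # people you answer rank highest
    return sender.global_reply_rate                  # strangers rank by how others treat them

inbox = [
    Sender("colleague@example.org", replies_from_recipient=12, global_reply_rate=0.40),
    Sender("president@example.gov", replies_from_recipient=0,  global_reply_rate=0.95),
    Sender("ignored@example.net",   replies_from_recipient=0,  global_reply_rate=0.01),
]

for s in sorted(inbox, key=priority, reverse=True):
    print(f"{s.address:28s} score={priority(s):.2f}")
```

The point of the illustration is the last entry: anyone whom nobody answers is pushed permanently to the bottom, "invisible by design".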

All our debates about the use of big data have centred on privacy, and all seem a bit distant: I care, in principle, whether or not Ocado knows what I bought on Amazon. But in my truest heart, I don’t really care whether or not my Frube vendor knows that I also like dystopian fiction of the 1970s.

I do, however, care that a program exists that will determine my eligibility for a loan by how often I call my mother. I care if landlords are using tools to rank their tenants by compliant behaviour, to create a giant, shared platform of desirable tenants, who never complain about black mould and greet each rent increase with a basket of muffins. I care if the police in Durham are using Experian credit scores to influence their custodial decisions, an example – as you may have guessed by its specificity – that is already real. I care that the same credit-rating company has devised a Mosaic score, which splits households into comically bigoted stereotypes: if your name is Liam and you are an “avid texter”, that puts you in “disconnected youth”, while if you’re Asha you’re in “crowded kaleidoscope”. It’s not a privacy issue so much as a profiling one, although, as anyone who has ever been the repeated victim of police stop-and-search could have told me years ago, these are frequently the same thing.

Privacy isn’t the right to keep secrets: it’s the right to be an individual, not a type; the right to make a choice that’s entirely your own; the right to be private….(More)”.

Ethics as Methods: Doing Ethics in the Era of Big Data Research—Introduction


Introduction to the Special Issue of Social Media + Society on “Ethics as Methods: Doing Ethics in the Era of Big Data Research”: Building on a variety of theoretical paradigms (i.e., critical theory, [new] materialism, feminist ethics, theory of cultural techniques) and frameworks (i.e., contextual integrity, deflationary perspective, ethics of care), the Special Issue contributes specific cases and fine-grained conceptual distinctions to ongoing discussions about ethics in data-driven research.

In the second decade of the 21st century, a grand narrative is emerging that posits knowledge derived from data analytics as true, because of the objective qualities of data, their means of collection and analysis, and the sheer size of the data set. The by-product of this grand narrative is that the qualitative aspects of behavior and experience that form the data are diminished, and the human is removed from the process of analysis.

This situates data science as a process of analysis performed by the tool, which obscures human decisions in the process. The scholars involved in this Special Issue problematize the assumptions and trends in big data research and point out the crisis in accountability that emerges from using such data to make societal interventions.

Our collaborators offer a range of answers to the question of how to configure ethics through a methodological framework in the context of the prevalence of big data, neural networks, and automated, algorithmic governance of much of human socia(bi)lity…(More)”.

Let’s make private data into a public good


Article by Mariana Mazzucato: “The internet giants depend on our data. A new relationship between us and them could deliver real value to society….We should ask how the value of these companies has been created, how that value has been measured, and who benefits from it. If we go by national accounts, the contribution of internet platforms to national income (as measured, for example, by GDP) is represented by the advertisement-related services they sell. But does that make sense? It’s not clear that ads really contribute to the national product, let alone to social well-being—which should be the aim of economic activity. Measuring the value of a company like Google or Facebook by the number of ads it sells is consistent with standard neoclassical economics, which interprets any market-based transaction as signaling the production of some kind of output—in other words, no matter what the thing is, as long as a price is received, it must be valuable. But in the case of these internet companies, that’s misleading: if online giants contribute to social well-being, they do it through the services they provide to users, not through the accompanying advertisements.

This way we have of ascribing value to what the internet giants produce is completely confusing, and it’s generating a paradoxical result: their advertising activities are counted as a net contribution to national income, while the more valuable services they provide to users are not.

Let’s not forget that a large part of the technology and necessary data was created by all of us, and should thus belong to all of us. The underlying infrastructure that all these companies rely on was created collectively (via the tax dollars that built the internet), and it also feeds off network effects that are produced collectively. There is indeed no reason why the public’s data should not be owned by a public repository that sells the data to the tech giants, rather than vice versa. But the key issue here is not just sending a portion of the profits from data back to citizens but also allowing them to shape the digital economy in a way that satisfies public needs. Using big data and AI to improve the services provided by the welfare state—from health care to social housing—is just one example.

Only by thinking about digital platforms as collective creations can we construct a new model that offers something of real value, driven by public purpose. We’re never far from a media story that stirs up a debate about the need to regulate tech companies, which creates a sense that there’s a war between their interests and those of national governments. We need to move beyond this narrative. The digital economy must be subject to the needs of all sides; it’s a partnership of equals where regulators should have the confidence to be market shapers and value creators….(More)”.

Big Data for the Greater Good


Book edited by Ali Emrouznejad and Vincent Charles: “This book highlights some of the most fascinating current uses, thought-provoking changes, and biggest challenges that Big Data means for our society. The explosive growth of data and advances in Big Data analytics have created a new frontier for innovation, competition, productivity, and well-being in almost every sector of our society, as well as a source of immense economic and societal value. From the derivation of customer feedback-based insights to fraud detection and preserving privacy; better medical treatments; agriculture and food management; and establishing low-voltage networks – many innovations for the greater good can stem from Big Data. Given the insights it provides, this book will be of interest to both researchers in the field of Big Data, and practitioners from various fields who intend to apply Big Data technologies to improve their strategic and operational decision-making processes….(More)”.

Data infrastructure literacy


Paper by Jonathan Gray, Carolin Gerlitz and Liliana Bounegru at Big Data & Society: “A recent report from the UN makes the case for “global data literacy” in order to realise the opportunities afforded by the “data revolution”. Here and in many other contexts, data literacy is characterised in terms of a combination of numerical, statistical and technical capacities. In this article, we argue for an expansion of the concept to include not just competencies in reading and working with datasets but also the ability to account for, intervene around and participate in the wider socio-technical infrastructures through which data is created, stored and analysed – which we call “data infrastructure literacy”. We illustrate this notion with examples of “inventive data practice” from previous and ongoing research on open data, online platforms, data journalism and data activism. Drawing on these perspectives, we argue that data literacy initiatives might cultivate sensibilities not only for data science but also for data sociology, data politics as well as wider public engagement with digital data infrastructures. The proposed notion of data infrastructure literacy is intended to make space for collective inquiry, experimentation, imagination and intervention around data in educational programmes and beyond, including how data infrastructures can be challenged, contested, reshaped and repurposed to align with interests and publics other than those originally intended….(More)”

Small Wars, Big Data: The Information Revolution in Modern Conflict


Book by Eli Berman, Joseph H. Felter & Jacob N. Shapiro: “The way wars are fought has changed starkly over the past sixty years. International military campaigns used to play out between large armies at central fronts. Today’s conflicts find major powers facing rebel insurgencies that deploy elusive methods, from improvised explosives to terrorist attacks. Small Wars, Big Data presents a transformative understanding of these contemporary confrontations and how they should be fought. The authors show that a revolution in the study of conflict—enabled by vast data, rich qualitative evidence, and modern methods—yields new insights into terrorism, civil wars, and foreign interventions. Modern warfare is not about struggles over territory but over people; civilians—and the information they might choose to provide—can turn the tide at critical junctures.

The authors draw practical lessons from the past two decades of conflict in locations ranging from Latin America and the Middle East to Central and Southeast Asia. Building an information-centric understanding of insurgencies, the authors examine the relationships between rebels, the government, and civilians. This approach serves as a springboard for exploring other aspects of modern conflict, including the suppression of rebel activity, the role of mobile communications networks, the links between aid and violence, and why conventional military methods might provide short-term success but undermine lasting peace. Ultimately the authors show how the stronger side can almost always win the villages, but why that does not guarantee winning the war.

Small Wars, Big Data provides groundbreaking perspectives for how small wars can be better strategized and favorably won to the benefit of the local population….(More)”.

Sentiment Analysis of Big Data: Methods, Applications, and Open Challenges


Paper by Shahid Shayaa et al. at IEEE: “With the development of IoT technologies and the widespread acceptance of social media tools and applications, new doors of opportunity have opened for using data analytics to gain meaningful insights from unstructured information. In the era of big data, opinion mining and sentiment analysis (OMSA) has been used to categorize opinions into different sentiments and, more generally, to evaluate the mood of the public. Moreover, different OMSA techniques have been developed over the years on different datasets and applied in various experimental settings. In this regard, this study presents a comprehensive systematic literature review that discusses both the technical aspects of OMSA (techniques and types) and the non-technical aspects in the form of application areas. Furthermore, the study highlights both the technical challenges in developing OMSA techniques and the non-technical challenges arising mainly from its application. These challenges are presented as directions for future research….(More)”.
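As a minimal, purely illustrative sketch of the kind of opinion categorisation the paper surveys (a naive lexicon-based approach; the word lists and posts below are invented, and real OMSA systems use far larger lexicons or trained models), sentiment scoring over social media posts might look like this:

```python
from collections import Counter

# Tiny illustrative sentiment lexicon (real systems use large lexicons or trained models)
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "angry"}

def classify(text: str) -> str:
    """Label a post positive, negative, or neutral by counting lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

posts = [
    "I love the new transit service, it is excellent",
    "Terrible queues again today, really bad experience",
    "The office opens at nine",
]

# Aggregate labels as a crude proxy for the overall "mood of the public"
mood = Counter(classify(p) for p in posts)
print(mood)
```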

Migration Data using Social Media


European Commission JRC Technical Report: “Migration is a top political priority for the European Union (EU). Data on international migrant stocks and flows are essential for effective migration management. In this report, we estimated the number of expatriates in 17 EU countries based on the number of Facebook Network users who are classified by Facebook as “expats”. To this end, we proposed a method for correcting the over- or under-representativeness of Facebook Network users compared to countries’ actual populations.

This method uses Facebook penetration rates by age group and gender in the country of previous residence and country of destination of a Facebook expat. The purpose of Facebook Network expat estimations is not to reproduce migration statistics, but rather to generate separate estimates of expatriates, since migration statistics and Facebook Network expats estimates do not measure the same quantities of interest.
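A minimal sketch of this kind of correction, assuming the simplest possible adjustment (the observed expat count for each age-gender group divided by that group's Facebook penetration rate in the destination country); the figures below are invented, and the actual JRC method also weights by penetration in the country of previous residence:

```python
# Invented example data: Facebook users and census population by (age group, gender)
fb_users_destination = {("25-34", "F"): 1_200_000, ("25-34", "M"): 1_100_000}
population_destination = {("25-34", "F"): 2_000_000, ("25-34", "M"): 2_200_000}

# Facebook-classified "expats" from one origin country observed in the destination country
fb_expats = {("25-34", "F"): 18_000, ("25-34", "M"): 15_000}

def adjusted_expats(expats, users, population):
    """Scale observed expat counts by each group's Facebook penetration rate."""
    total = 0.0
    for group, count in expats.items():
        penetration = users[group] / population[group]   # share of the group on Facebook
        total += count / penetration                     # correct for under/over-coverage
    return total

print(round(adjusted_expats(fb_expats, fb_users_destination, population_destination)))
```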

Estimates of social media application users who are classified as expats can be a timely, low-cost, and almost globally available source of information for estimating stocks of international migrants. Our methodology allowed for the timely capture of the increase of Venezuelan migrants in Spain. However, there are important methodological and data integrity issues with using social media data sources for studying migration-related phenomena. For example, our methodology led us to significantly overestimate the number of expats from the Philippines in Spain and in Italy, and there is no evidence that this overestimation is valid. While research on the use of big data sources for migration is in its infancy, and the diffusion of internet technologies in less developed countries is still limited, the use of big data sources can unveil useful insights on quantitative and qualitative characteristics of migration….(More)”.

My City Forecast: Urban planning communication tool for citizen with national open data


Paper by Y. Hasegawa, Y. Sekimoto, T. Seto, Y. Fukushima et al in Computers, Environment and Urban Systems: “In urban management, the importance of citizen participation is being emphasized more than ever before. This is especially true in countries where depopulation has become a major concern for urban managers and many local authorities are working on revising city master plans, often incorporating the concept of the “compact city.” In Japan, for example, the implementation of compact city plans means that each local government decides on how to designate residential areas and promotes citizens moving to these areas in order to improve budget effectiveness and the vitality of the city. However, implementing a compact city is possible in various ways. Given that there can be some designated withdrawal areas for budget savings, compact city policies can include disadvantages for citizens. At this turning point for urban structures, citizen–government mutual understanding and cooperation is necessary for every step of urban management, including planning.

Concurrently, along with the recent rapid growth of big data utilization and computer technologies, a new conception of cooperation between citizens and government has emerged. With emerging technologies based on civic knowledge, citizens have started to obtain the power to engage directly in urban management by obtaining information, thinking about their city’s problems, and taking action to help shape the future of their city themselves (Knight Foundation, 2013). This development is also supported by the open government data movement, which promotes the availability of government information online (Kingston, Carver, Evans, & Turton, 2000). CityDashboard is one well-known example of real-time visualization and distribution of urban information. CityDashboard, a web tool launched in 2012 by University College London, aggregates spatial data for cities around the UK and displays the data on a dashboard and a map. These new technologies are expected to enable both citizens and government to see their urban situation in an interface presenting an overhead view based on statistical information.

However, usage of statistics and governmental data is as yet limited in the actual process of urban planning…

To help improve this situation and increase citizen participation in urban management, we have developed a web-based urban planning communication tool using open government data for enhanced citizen–government cooperation. The main aim of the present research is to evaluate the effect of our system on users’ awareness of and attitude toward the urban situation. We have designed and developed an urban simulation system, My City Forecast (http://mycityforecast.net), that enables citizens to understand how their environment and region are likely to change as a result of urban management in the future (up to 2040)….(More)”.
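As a toy sketch of the kind of district-level forecast such a tool might present to citizens (a naive constant-growth projection from two census readings out to 2040; the districts, figures, and method below are invented for illustration and are not the system described in the paper):

```python
# Invented census readings per district: population in 2010 and 2015
census = {
    "Central":   (52_000, 51_000),
    "Riverside": (34_000, 31_500),
    "Hillside":  (21_000, 18_800),
}

def project(pop_2010: int, pop_2015: int, target_year: int = 2040) -> int:
    """Extend the 2010-2015 five-year growth ratio forward to the target year."""
    ratio = pop_2015 / pop_2010
    periods = (target_year - 2015) / 5
    return round(pop_2015 * ratio ** periods)

for district, (p2010, p2015) in census.items():
    print(f"{district:10s} 2015: {p2015:6d}  projected 2040: {project(p2010, p2015):6d}")
```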