Paper by Nicolas Pflanzl, Tadeu Classe, Renata Araujo, and Gottfried Vossen: “One of the challenges envisioned for eGovernment is how to actively involve citizens in the improvement of public services, allowing governments to offer better services. However, citizen involvement in public service design through ICT is not an easy goal. Services have been deployed internally in public organizations, making it difficult for citizens, specifically those without an IT background, to leverage them. This research moves towards decreasing the gap between public services process opacity and complexity and citizens’ lack of interest or competencies to understand them. The paper discusses game design as an approach to motivate, engage and change citizens’ behavior with respect to public services improvement. The design of a sample serious game is proposed; benefits and challenges are discussed using a public service delivery scenario from Brazil….(More)”
Questioning Big Data: Crowdsourcing crisis data towards an inclusive humanitarian response
Paper: “The aim of this paper is to critically explore whether crowdsourced Big Data enables an inclusive humanitarian response at times of crisis. We argue that all data, including Big Data, are socially constructed artefacts that reflect the contexts and processes of their creation. To support our argument, we qualitatively analysed the process of ‘Big Data making’ that occurred by way of crowdsourcing through open data platforms, in the context of two specific humanitarian crises, namely the 2010 earthquake in Haiti and the 2015 earthquake in Nepal. We show that the process of creating Big Data from local and global sources of knowledge entails the transformation of information as it moves from one distinct group of contributors to the next. The implication of this transformation is that locally based, affected people and often the original ‘crowd’ are excluded from the information flow, and from the interpretation process of crowdsourced crisis knowledge, as used by formal responding organizations, and are marginalized in their ability to benefit from Big Data in support of their own means. Our paper contributes a critical perspective to the debate on participatory Big Data, by explaining the process of inclusion and exclusion during data making, towards more responsive humanitarian relief….(More)”
Counterterrorism and Counterintelligence: Crowdsourcing Approach
Literature review by Sanket Subhash Khanwalkar: “Despite heavy investment by the United States and several other national governments, terrorism-related problems are rising at an alarming rate. Lone-wolf terrorism, in particular, has caused 70% of all terrorism-related deaths in the US and the West over the last decade. This literature survey describes lone-wolf terrorism in detail to analyse its structure, characteristics, strengths and weaknesses. It also investigates crowdsourcing intelligence, as an unorthodox approach to counter lone-wolf terrorism, by reviewing the current state of the art and identifying areas for improvement….(More)”
Smart Economy in Smart Cities
Book edited by T. M. Vinod Kumar: “The present book highlights studies that show how smart cities promote urban economic development. The book surveys the state of the art of smart city economic development through a literature survey. It uses 13 in-depth city case studies from 10 countries across North America, Europe, Africa and Asia to explain how a smart economy changes the urban spatial system and vice versa. The book focuses on exploratory city studies in different countries, which investigate how urban spatial systems adapt to the specific needs of a smart urban economy. The theory of smart city economic development is not yet entirely understood and applied in metropolitan regional plans. Smart urban economies are largely the result of the influence of ICT applications on all aspects of the urban economy, which in turn changes the land-use system. The book points out that the dynamics of smart city GDP creation take ‘different paths,’ which need further empirical study, hypothesis testing and mathematical modelling. Although there are hypotheses on how smart cities generate wealth and social benefits for nations, there are no significant empirical studies on how they generate urban economic development through urban spatial adaptation. This book, with its 13 city research studies, is one attempt to fill that gap in the knowledge base….(More)”
Make Data Sharing Routine to Prepare for Public Health Emergencies
Jean-Paul Chretien, Caitlin M. Rivers, and Michael A. Johansson in PLOS Medicine: “In February 2016, Wellcome Trust organized a pledge among leading scientific organizations and health agencies encouraging researchers to release data relevant to the Zika outbreak as rapidly and widely as possible [1]. This initiative echoed a September 2015 World Health Organization (WHO) consultation that assessed data sharing during the recent West Africa Ebola outbreak and called on researchers to make data publicly available during public health emergencies [2]. These statements were necessary because the traditional way of communicating research results—publication in peer-reviewed journals, often months or years after data collection—is too slow during an emergency.
The acute health threat of outbreaks provides a strong argument for more complete, quick, and broad sharing of research data during emergencies. But the Ebola and Zika outbreaks suggest that data sharing cannot be limited to emergencies without compromising emergency preparedness. To prepare for future outbreaks, the scientific community should expand data sharing for all health research….
Open data deserves recognition and support as a key component of emergency preparedness. Initiatives to facilitate discovery of datasets and track their use [40–42]; provide measures of academic contribution, including data sharing that enables secondary analysis [43]; establish common platforms for sharing and integrating research data [44]; and improve data-sharing capacity in resource-limited areas [45] are critical to improving preparedness and response.
Research sponsors, scholarly journals, and collaborative research networks can leverage these new opportunities with enhanced data-sharing requirements for both nonemergency and emergency settings. A proposal to amend the International Health Regulations with clear codes of practice for data sharing warrants serious consideration [46]. Any new requirements should allow scientists to conduct and communicate the results of secondary analyses, broadening the scope of inquiry and catalyzing discovery. Publication embargo periods, such as one under consideration for genetic sequences of pandemic-potential influenza viruses [47], may lower barriers to data sharing but may also slow the timely use of data for public health.
Integrating open science approaches into routine research should make data sharing more effective during emergencies, but this evolution is more than just practice for emergencies. The cause and context of the next outbreak are unknowable; research that seems routine now may be critical tomorrow. Establishing openness as the standard will help build the scientific foundation needed to contain the next outbreak.
Recent epidemics were surprises—Zika and chikungunya sweeping through the Americas; an Ebola epidemic with more than 10,000 deaths; the emergence of severe acute respiratory syndrome and Middle East respiratory syndrome; and an influenza pandemic (influenza A[H1N1]pdm09) originating in Mexico—and we can be sure there are more surprises to come. Opening all research provides the best chance to accelerate discovery and development that will help during the next surprise….(More)”
Everyday ‘Placebo Buttons’ Create Semblance of Control

Each of these seemingly disconnected everyday buttons you pressed may have something in common: it is quite possible that none of them did a thing to influence the world around you. Any perceived impact may simply have been imaginary, a placebo effect giving you the illusion of control.
In the early 2000s, New York City transportation officials finally admitted what many had suspected: the majority of crosswalk buttons in the city are completely disconnected from the traffic light system. Thousands of these initially worked to request a signal change but most no longer do anything, even if their signage suggests otherwise.
Naturally, a number of street art projects have popped up around the humorous futility of pedestrians pressing placebo buttons:
Crosswalk buttons were originally introduced to NYC during the 1960s. At the time, there was less congestion and it made sense to leave green lights on for major thoroughfares until cross traffic came along … or until a pedestrian wanting to cross the street pushed a button.
Today, a combination of carefully orchestrated automation and higher traffic has made most of these buttons obsolete. Citywide, there are around 100 crosswalk buttons that still work in NYC but close to 1,000 more that do nothing at all. So why not take them down? Removing the remaining nonfunctional buttons would cost the city millions, a potential waste of already limited funds for civic infrastructure….(More)”
How Big Data Analytics is Changing Legal Ethics
Renee Knake at Bloomberg Law: “Big data analytics are changing how lawyers find clients, conduct legal research and discovery, draft contracts and court papers, manage billing and performance, predict the outcome of a matter, select juries, and more. Ninety percent of corporate legal departments, law firms, and government lawyers note that data analytics are applied in their organizations, albeit in limited ways, according to a 2015 survey. The Legal Services Corporation, the largest funder of civil legal aid for low-income individuals in the United States, recommended in 2012 that all states collect and assess data on case progress/outcomes to improve the delivery of legal services. Lawyers across all sectors of the market increasingly recognize how big data tools can enhance their work.
A growing literature advocates for businesses and governmental bodies to adopt data ethics policies, and many have done so. It is not uncommon to find data-use policies prominently displayed on company or government websites, or required as part of a click-through consent before gaining access to a mobile app or webpage. Data ethics guidelines can help avoid controversies, especially when analytics are used in potentially manipulative or exploitative ways. Consider, for example, Target’s data analytics that uncovered a teen’s pregnancy before her father did, or Orbitz’s data analytics that offered pricier hotels to Mac users. These are just two of numerous examples in recent years where companies faced criticism for how they used data analytics.
While some law firms and legal services organizations follow data-use policies or codes of conduct, many do not. Perhaps this is because the legal profession was not transformed as early or rapidly as other industries, or because until now, big data in legal was largely limited to e-discovery, where the data use is confined to the litigation and is subject to judicial oversight. Another reason may be that lawyers believe their rules of professional conduct provide sufficient guidance and protection. Unlike other industries, lawyers are governed by a special code of ethical obligations to clients, the justice system, and the public. In most states, this code is based in part upon the American Bar Association (ABA) Model Rules of Professional Conduct, though rules often vary from jurisdiction to jurisdiction. Several of the Model Rules are relevant to big data use. That said, the Model Rules are insufficient for addressing a number of fundamental ethical concerns.
At the moment, legal ethics for big data analytics is at best an incomplete mix of professional conduct rules and informal policies adopted by some, but not all law practices. Given the increasing prevalence of data analytics in legal services, lawyers and law students should be familiar not only with the relevant professional conduct rules, but also the ethical questions left unanswered. Listed below is a brief summary of both, followed by a proposed legal ethics agenda for data analytics. …
Questions Unanswered by Lawyer Ethics Rules
Access/Ownership. Who owns the original data — the individual source or the holder of the pooled information? Who owns the insights drawn from its analysis? Who should receive access to the data compilation and the results?
Anonymity/Identity. Should all personally identifiable or sensitive information be removed from the data? What protections are necessary to respect individual autonomy? How should individuals be able to control and shape their electronic identity?
Consent. Should individuals affirmatively consent to use of their personal data? Or is it sufficient to provide notice, perhaps with an opt-out provision?
Privacy/Security. Should privacy be protected beyond the professional obligation of client confidentiality? How should data be secured? The ABA called upon private and public sector lawyers to implement cyber-security policies, including data use, in a 2012 resolution and produced a cyber-security handbook in 2013.
Process. How involved should lawyers be in the process of data collection and analysis? In the context of e-discovery, for example, a lawyer is expected to understand how documents are collected, produced, and preserved, or to work with a specialist. Should a similar level of knowledge be required for all forms of data analytics use?
Purpose. Why was the data first collected from individuals? What is the purpose for the current use? Is there a significant divergence between the original and secondary purposes? If so, is it necessary for the individuals to consent to the secondary purpose? How will unintended consequences be addressed?
Source. What is the source of the data? Did the lawyer collect it directly from clients, or is the lawyer relying upon a third-party source? Client-based data is, of course, subject to the lawyer’s professional conduct rules. Data from any source should be trustworthy, reasonable, timely, complete, and verifiable….(More)”
Open Data for Social Change and Sustainable Development
Special issue of the Journal of Community Informatics edited by Raed M. Sharif and Francois Van Schalkwyk: “As the second phase of the Emerging Impacts of Open Data in Developing Countries (ODDC) drew to a close, discussions started on a possible venue for publishing some of the papers that emerged from the research conducted by the project partners. In 2012 the Journal of Community Informatics published a special issue titled ‘Community Informatics and Open Government Data’. Given the journal’s previous interest in the field of open data, its established reputation and the fact that it is a peer-reviewed open access journal, the Journal of Community Informatics was approached and agreed to a second special issue with a focus on open data. A closed call for papers was sent out to the project research partners. Shortly afterwards, the first Open Data Research Symposium was held ahead of the International Open Data Conference 2015 in Ottawa, Canada. For the first time, a forum was provided to academics and researchers to present papers specifically on open data. Again there were discussions about an appropriate venue to publish selected papers from the Symposium. The decision was taken by the Symposium Programme Committee to invite the twenty plus presenters to submit full papers for consideration in the special issue.
The seven papers published in this special issue are those that were selected through a double-blind peer review process. Researchers are often given a rough ride by open data advocates – the research community is accused of taking too long, not being relevant enough and of speaking in tongues unintelligible to social movements and policy-makers. And yet nine years after the ground-breaking meeting in Sebastopol at which the eight principles of open government data were penned, seven after President Obama injected political legitimacy into a movement, and five after eleven nation states formed the global Open Government Partnership (OGP), which has grown six-fold in membership; an email crosses our path in which the authors of a high-level report commit to developing a comprehensive understanding of a continental open data ecosystem through an examination of open data supply. Needless to say, a single example is not necessarily representative of global trends in thinking about open data. Yet, the focus on government and on the supply of open data by open data advocates – with little consideration of open data use, the differentiation of users, intermediaries, power structures or the incentives that propel the evolution of ecosystems – is still all too common. Empirical research has already revealed the limitations of ‘supply it and they will use it’ open data practices, and has started to fill critical knowledge gaps to develop a more holistic understanding of the determinants of effective open data policy and practice.
As open data policies and practices evolve, the need to capture the dynamics of this evolution and to trace unfolding outcomes becomes critical to advance a more efficient and progressive field of research and practice. The trajectory of the existing body of literature on open data and the role of public authorities, both local and national, in the provision of open data is logical and needed in light of the central role of government in producing a wide range of types and volumes of data. At the same time, the complexity of open data ecosystem and the plethora of actors (local, regional and global suppliers, intermediaries and users) makes a compelling case for opening avenues for more diverse discussion and research beyond the supply of open data. The research presented in this special issue of the Journal of Community Informatics touches on many of these issues, sets the pace and contributes to the much-needed knowledge base required to promote the likelihood of open data living up to its promise. … (More)”
How Medical Crowdsourcing Empowers Patients & Doctors
Rob Stretch at Rendia: “Whether you’re a solo practitioner in a rural area, or a patient who’s bounced from doctor to doctor with a difficult-to-diagnose condition, there are many reasons why you might seek out expert medical advice from a larger group. Fortunately, in 2016, seeking feedback from other physicians or getting a second opinion is as easy as going online.
“Medical crowdsourcing” sites and apps are gathering steam, from provider-only forums like SERMOsolves and Figure 1, to patient-focused sites like CrowdMed. They share the same mission of empowering doctors and patients, reducing misdiagnosis, and improving medicine. Is crowdsourcing the future of medicine? Read on to find out more.
Fixing misdiagnosis
An estimated 10 to 20 percent of medical cases are misdiagnosed, a figure exceeding that of drug errors and surgery on the wrong patient or body part, according to the National Center for Policy Analysis. And diagnostic errors are the leading cause of malpractice litigation. Doctors often report that, for many of their patient cases, they would benefit from the support and advice of their peers.
The photo-sharing app for health professionals, Figure 1, is filling that need. Since we reported on it last year, the app has reached 1 million users and added a direct-messaging feature. The app is geared towards verified medical professionals, and goes to great lengths to protect patient privacy in keeping with HIPAA laws. According to co-founder and CEO Gregory Levey, an average of 10,000 unique users check in to Figure 1 every hour, and medical professionals and students in 190 countries currently use the app.
Using Figure 1 to crowdsource advice from the medical community has saved at least one life. Emily Nayar, a physician assistant in rural Oklahoma and a self-proclaimed “Figure 1 addict,” told Wired magazine that because of photos she’d seen on the app, she was able to correctly diagnose a patient with shingles meningitis. Another doctor had misdiagnosed him previously, and the wrong medication could have killed him.
Collective knowledge at zero cost
In addition to serving as “virtual colleagues” for isolated medical providers, crowdsourcing forums can pool knowledge from an unprecedented number of doctors in different specialties and even countries, and can do so very quickly.
When we first reported on SERMO, the company billed itself as a “virtual doctors’ lounge.” Now, the global social network with 600,000 verified, credentialed physician members has pivoted to medical crowdsourcing with SERMOsolves, one of its most popular features, according to CEO Peter Kirk.
“Crowdsourcing patient cases through SERMOsolves is an ideal way for physicians to gain valuable information from the collective knowledge of hundreds of physicians instantly,” he said in a press release. According to SERMO, 3,500 challenging patient cases were posted in 2014, viewed 700,000 times, and received 50,000 comments. Most posted cases received responses within 1.5 hours and were resolved within a day. “We have physicians from more than 96 specialties and subspecialties posting on the platform, working together to share their valuable insights at zero cost to the healthcare system.”
While one early user of SERMO wrote on KevinMD.com that he felt the site’s potential was overshadowed by anonymous rants and complaining, other users have noted that the medical crowdsourcing site has, like Figure 1, directly benefitted patients.
In an article on PhysiciansPractice.com, Richard Armstrong, M.D., cites the example of a family physician in Canada who posted a case of a young girl with an E. coli infection. “Physicians from around the world immediately began to comment and the recommendations resulted in a positive outcome for the patient. This instance offered cross-border learning experiences for the participating doctors, not only regarding the specific medical issue but also about how things are managed in different health systems,” wrote Dr. Armstrong.
Patients get proactive
While patients have long turned to social media to (questionably) crowdsource their medical queries, there are now more reputable sources than Facebook.
Tech entrepreneur Jared Heyman launched the health startup CrowdMed in 2013 after his sister endured a “terrible, undiagnosed medical condition that could have killed her,” he told the Wall Street Journal. She saw about 20 doctors over three years, racking up six-figure medical bills. The NIH Undiagnosed Diseases Program finally gave her a diagnosis: fragile X-associated primary ovarian insufficiency, a rare disease that affects just 1 in 15,000 women. A hormone patch resolved her debilitating symptoms….(More)”
How Technology Can Restore Our Trust in Democracy
Cenk Sidar in Foreign Policy: “The travails of the Arab Spring, the rise of the Islamic State, and the upsurge of right-wing populism throughout the West all demonstrate a rising frustration with the liberal democratic order in the years since the 2008 financial crisis. There is a growing intellectual consensus that the world is sailing into uncharted territory: a realm marked by authoritarianism, shallow populism, and extremism.
One way to overcome this global resentment is to use the best tools we have to build a more inclusive and direct democracy. Could new technologies such as Augmented Reality (AR), Virtual Reality (VR), data analytics, crowdsourcing, and Blockchain help to restore meaningful dialogue and win back people’s hearts and minds?
Underpinning our unsettling current environment is an irony: Thanks to modern communication technology, the world is more connected than ever — but average people feel more disconnected. In the United States, polls show that trust in government is at a 50-year low. Frustrated Trump supporters and the Britons who voted for Brexit both have a sense of having “lost out” as the global elite consolidates its power and becomes less responsive to the rest of society. This is not an irrational belief: Branko Milanovic, a leading inequality scholar, has found that people in the lower and middle parts of rich countries’ income distributions have been the losers of the last 15 years of globalization.
The same 15 years have also brought astounding advances in technology, from the rise of the Internet to the growing ubiquity of smartphones. And Western society has, to some extent, struggled to find its bearings amid this transition. Militant groups seduce young people through social media. The Internet enables consumers to choose only the news that matches their preconceived beliefs, offering a bottomless well of partisan fury and conspiracy theories. Cable news airing 24/7 keeps viewers in a state of agitation. In short, communication technologies that are meant to bring us together end up dividing us instead (and not least because our politicians have chosen to game these tools for their own advantage).
It is time to make technology part of the solution. More urgently than ever, leaders, innovators, and activists need to open up the political marketplace to allow technology to realize its potential for enabling direct citizen participation. This is an ideal way to restore trust in the democratic process.
As the London School of Economics’ Mary Kaldor put it recently: “The task of global governance has to be reconceptualized to make it possible for citizens to influence the decisions that affect their lives — to reclaim substantive democracy.” One notable exception to the technological disconnect has been fundraising, as candidates have tapped into the Internet to enable millions of average voters to donate small sums. With the right vision, however, technological innovation in politics could go well beyond asking people for money….(More)”