Springwise: “Safe & the City is a free app designed to help users identify which streets are safe for them. Sexual harassment and violent crimes against women in particular are a big problem in many urban environments. This app uses crowdsourced data and crime statistics to help female pedestrians stay safe.
It builds on traditional navigation apps: instead of simply providing the fastest route, it also offers information on the safest one. The Live Map relies on user data. Victims can report harassment or assault in the app, and the information then becomes available to other users to warn them of a potential threat in the area. Incidents can be ranked by severity, from a feeling of discomfort or threat, to verbal harassment, to physical assault. Whilst navigating, the Live Map can also alert users to potentially dangerous intersections ahead. This reminds people to stay alert and not focus only on their phone while walking.
The Safe Sites feature is also a way of incorporating the community. Businesses and organisations can register to be Safe Sites. They will then receive training from SafeSeekers in how to provide the best support and assistance in emergency situations. The locations of such Sites will be available on the app, should a user need one.
The iOS app launched in March 2018 on International Women’s Day. It is currently only available in London…(More)”
Richard Harris at NPR: “A startup genetics company says it’s now offering to sequence your entire genome at no cost to you. In fact, you would own the data and may even be able to make money off it.
Nebula Genomics, created by the prominent Harvard geneticist George Church and his lab colleagues, seeks to upend the usual way genomic information is owned.
Today, companies like 23andMe make some of their money by scanning your genetic patterns and then selling that information to drug companies for use in research. (You choose whether to opt in.)
Church says his new enterprise leaves ownership and control of the data in an individual’s hands. And the genomic analysis Nebula will perform is much more detailed than what 23andMe and similar companies offer.
Nebula will do a full genome sequence, rather than a snapshot of key gene variants. That wider range of genetic information makes the data more appealing to biologists and to biotech and pharmaceutical companies….
Church’s approach is part of a trend that’s pushing back against the multibillion-dollar industry to buy and sell medical information. Right now, companies reap those profits and control the data.
“Patients should have the right to decide for themselves whether they want to share their medical data, and, if so, with whom,” Adam Tanner, at Harvard’s Institute for Quantitative Social Science, says in an email. “Efforts to empower people to fine-tune the fate of their medical information are a step in the right direction.” Tanner, author of a book on the subject of the trade in medical data, isn’t involved in Nebula.
The current system is “very paternalistic,” Church says. He aims to give people complete control over who gets access to their data, and let individuals decide whether or not to sell the information, and to whom.
“In this case, everything is private information, stored on your computer or a computer you designate,” Church says. It can be encrypted so nobody can read it, even you, if that’s what you want.
Drug companies interested in studying, say, diabetes patients would ask Nebula to identify people in their system who have the disease. Nebula would then identify those individuals by launching an encrypted search of participants.
People who have indicated they’re interested in selling their genetic data to a company would then be given the option of granting access to the information, along with medical data that person has designated.
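The consent-gated flow described above can be sketched in plain Python. This is purely illustrative: the participant records, field names and function are hypothetical, and Nebula’s actual system performs an encrypted search, which this mock does not attempt to implement.

```python
# Hypothetical mock of the consent-gated query flow described in the article.
# In the real system the search would run over encrypted data; here records
# are plain dictionaries for illustration only.

participants = [
    {"id": "p1", "conditions": {"diabetes"}, "open_to_offers": True},
    {"id": "p2", "conditions": {"diabetes"}, "open_to_offers": False},
    {"id": "p3", "conditions": {"asthma"},   "open_to_offers": True},
]

def find_candidates(condition):
    """Return IDs of participants who have the condition AND have indicated
    they are open to selling access. Matching is only the first step: each
    candidate would still be asked to grant or deny access before any
    genetic or medical data is shared."""
    return [p["id"] for p in participants
            if condition in p["conditions"] and p["open_to_offers"]]

print(find_candidates("diabetes"))  # only p1: has diabetes AND opted in
```

The point of the two-stage design is that matching a query never by itself releases data; the individual remains the gatekeeper at every step.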
Other companies are also springing up to help people control — and potentially profit from — their medical data. EncrypGen lets people offer up their genetic data, though customers have to provide their own DNA sequence. Hu-manity.co is also trying to establish a system in which people can sell their medical data to pharmaceutical companies….(More)”.
Blog by Giovanni Buttarelli: “…There are few authorities monitoring the impact of new technologies on fundamental rights so closely and intensively as data protection and privacy commissioners. At the International Conference of Data Protection and Privacy Commissioners, the 40th ICDPPC (which the EDPS had the honour to host), they continued the discussion on AI which began in Marrakesh two years ago with a reflection paper prepared by EDPS experts. In the meantime, many national data protection authorities have invested considerable efforts and provided important contributions to the discussion. To name only a few, the data protection authorities from Norway, France, the UK and Schleswig-Holstein have published research and reflections on AI, ethics and fundamental rights. We all see that some applications of AI raise immediate concerns about data protection and privacy; but it also seems generally accepted that there are far wider-reaching ethical implications, as a group of AI researchers also recently concluded.
Data protection and privacy commissioners have now made a forceful intervention by adopting a declaration on ethics and data protection in artificial intelligence which spells out six principles for the future development and use of AI – fairness, accountability, transparency, privacy by design, empowerment and non-discrimination – and demands concerted international efforts to implement such governance principles. Conference members will contribute to these efforts, including through a new permanent working group on Ethics and Data Protection in Artificial Intelligence.
The ICDPPC was also chosen by an alliance of NGOs and individuals, The Public Voice, as the moment to launch its own Universal Guidelines on Artificial Intelligence (UGAI). The twelve principles laid down in these guidelines extend and complement those of the ICDPPC declaration.
We are only at the beginning of this debate. More voices will be heard: think tanks such as CIPL are coming forward with their suggestions, and so will many other organisations.
At international level, the Council of Europe has invested efforts in assessing the impact of AI, and has announced a report and guidelines to be published soon. The European Commission has appointed an expert group which will, among other tasks, give recommendations on future-related policy development and on ethical, legal and societal issues related to AI, including socio-economic challenges.
As I already pointed out in an earlier blogpost, it is our responsibility to ensure that the technologies which will determine the way we and future generations communicate, work and live together, are developed in such a way that the respect for fundamental rights and the rule of law are supported and not undermined….(More)”.
Gabriella Capone at apolitical (a winner of the 2018 Apolitical Young Thought Leaders competition): “Rain ravaged Gdańsk in 2016, taking the lives of two residents and causing millions of euros in damage. Despite its 700-year history of flooding, the city was overwhelmed by these especially devastating floods. Gdańsk also lies on one of the European coastlines most exposed to rising sea levels. It needed a new approach to avoid similar outcomes in the next, inevitable encounter with this worsening problem.
Bringing in citizens to tackle such a difficult issue was not the obvious course of action. Yet this was the proposal of Dr. Marcin Gerwin, an advocate from a neighbouring town who paved the way for Poland’s first participatory budgeting experience.
Mayor Adamowicz of Gdańsk agreed and, within a year, they welcomed about 60 people to the first Citizens Assembly on flood mitigation. Implemented by Dr. Gerwin and a team of coordinators, the Assembly convened over four Saturdays, heard expert testimony, and devised solutions.
The Assembly was not only deliberative and educational, it was action-oriented. Mayor Adamowicz committed to implement proposals on which 80% or more of participants agreed. The final 16 proposals included the investment of nearly US$40 million in monitoring systems and infrastructure, subsidies to incentivise individuals to improve water management on their property, and an educational “Do Not Flood” campaign to highlight emergency resources.
It may seem risky to outsource the solving of difficult issues to citizens. Yet, when properly designed, public problem-solving can produce creative resolutions to formidable challenges. Beyond Poland, public problem-solving initiatives in Mexico and the United States are making headway on pervasive issues, from flooding to air pollution, to technology in public spaces.
The GovLab, with support from the Tinker Foundation, is analysing what makes for more successful public problem-solving as part of its City Challenges program. Below, I provide a glimpse into the types of design choices that can amplify the impact of public problem-solving….(More)
Aaron Smith at the Pew Research Center: “Algorithms are all around us, utilizing massive stores of data and complex analytics to make decisions with often significant impacts on humans. They recommend books and movies for us to read and watch, surface news stories they think we might find relevant, estimate the likelihood that a tumor is cancerous and predict whether someone might be a criminal or a worthwhile credit risk. But despite the growing presence of algorithms in many aspects of daily life, a Pew Research Center survey of U.S. adults finds that the public is frequently skeptical of these tools when used in various real-life situations.
This skepticism spans several dimensions. At a broad level, 58% of Americans feel that computer programs will always reflect some level of human bias – although 40% think these programs can be designed in a way that is bias-free. And in various contexts, the public worries that these tools might violate privacy, fail to capture the nuance of complex situations, or simply put the people they are evaluating in an unfair situation. Public perceptions of algorithmic decision-making are also often highly contextual. The survey shows that otherwise similar technologies can be viewed with support or suspicion depending on the circumstances or on the tasks they are assigned to do….
The following are among the major findings.
The public expresses broad concerns about the fairness and acceptability of using computers for decision-making in situations with important real-world consequences
By and large, the public views these examples of algorithmic decision-making as unfair to the people the computer-based systems are evaluating. Most notably, only around one-third of Americans think that the video job interview and personal finance score algorithms would be fair to job applicants and consumers. When asked directly whether they think the use of these algorithms is acceptable, a majority of the public says that they are not acceptable. Two-thirds of Americans (68%) find the personal finance score algorithm unacceptable, and 67% say the computer-aided video job analysis algorithm is unacceptable….
Attitudes toward algorithmic decision-making can depend heavily on context
Despite the consistencies in some of these responses, the survey also highlights the ways in which Americans’ attitudes toward algorithmic decision-making can depend heavily on the context of those decisions and the characteristics of the people who might be affected….
When it comes to the algorithms that underpin the social media environment, users’ comfort level with sharing their personal information also depends heavily on how and why their data are being used. A 75% majority of social media users say they would be comfortable sharing their data with those sites if it were used to recommend events they might like to attend. But that share falls to just 37% if their data are being used to deliver messages from political campaigns.
In other instances, different types of users offer divergent views about the collection and use of their personal data. For instance, about two-thirds of social media users younger than 50 find it acceptable for social media platforms to use their personal data to recommend connecting with people they might want to know. But that view is shared by fewer than half of users ages 65 and older….(More)”.
Book by Rob Reich: “Is philanthropy, by its very nature, a threat to today’s democracy? Though we may laud wealthy individuals who give away their money for society’s benefit, Just Giving shows how such generosity not only isn’t the unassailable good we think it to be but might also undermine democratic values and set back aspirations of justice. Big philanthropy is often an exercise of power, the conversion of private assets into public influence. And it is a form of power that is largely unaccountable, often perpetual, and lavishly tax-advantaged. The affluent—and their foundations—reap vast benefits even as they influence policy without accountability. And small philanthropy, or ordinary charitable giving, can be problematic as well. Charity, it turns out, does surprisingly little to provide for those in need and sometimes worsens inequality.
These outcomes are shaped by the policies that define and structure philanthropy. When, how much, and to whom people give is influenced by laws governing everything from the creation of foundations and nonprofits to generous tax exemptions for donations of money and property. Rob Reich asks: What attitude and what policies should democracies have concerning individuals who give money away for public purposes? Philanthropy currently fails democracy in many ways, but Reich argues that it can be redeemed. Differentiating between individual philanthropy and private foundations, he argues that the aim of mass giving should be the decentralization of power in the production of public goods, such as the arts, education, and science. For foundations, the goal should be what Reich terms “discovery,” or long-time-horizon innovations that enhance democratic experimentalism. Philanthropy, when properly structured, can play a crucial role in supporting a strong liberal democracy.
Just Giving investigates the ethical and political dimensions of philanthropy and considers how giving might better support democratic values and promote justice….(More)”
Lorena Abad and Lucas van der Meer in Information: “Stimulating non-motorized transport has been a key point on sustainable mobility agendas for cities around the world. Lisbon is no exception, as it invests in the implementation of new bike infrastructure. Quantifying the connectivity of such a bicycle network can help evaluate its current state and highlight specific challenges that should be addressed. Therefore, the aim of this study is to develop an exploratory score that allows a quantification of the bicycle network connectivity in Lisbon based on open data.
For each part of the city, a score was computed based on how many common destinations (e.g., schools, universities, supermarkets, hospitals) were located within an acceptable biking distance when using only bicycle lanes and roads with low traffic stress for cyclists. Taking a weighted average of these scores resulted in an overall score for the city of Lisbon of only 8.6 out of 100 points. This shows, at a glance, that the city still has a long way to go before achieving its objectives regarding bicycle use in the city….(More)”.
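The scoring logic described above can be sketched as a short Python example. The zone data, destination list, acceptable distance and population weights below are invented for illustration; they are assumptions, not the authors’ actual data or methodology.

```python
# Minimal sketch of a destination-accessibility score with a
# population-weighted city average. All numbers are illustrative
# assumptions, not the study's real inputs.

ACCEPTABLE_DISTANCE_KM = 2.5  # assumed acceptable biking distance

# For each zone: distance (km) to each common destination type, measured
# only along bicycle lanes and low-stress roads; inf = unreachable.
zones = {
    "zone_a": {"school": 0.8, "supermarket": 1.2,
               "hospital": float("inf"), "university": 3.0},
    "zone_b": {"school": float("inf"), "supermarket": float("inf"),
               "hospital": float("inf"), "university": float("inf")},
}
zone_population = {"zone_a": 12_000, "zone_b": 4_000}

def zone_score(distances):
    """Share of destination types reachable within the acceptable
    distance, expressed on a 0-100 scale."""
    reachable = sum(1 for d in distances.values()
                    if d <= ACCEPTABLE_DISTANCE_KM)
    return 100 * reachable / len(distances)

def city_score(zones, populations):
    """Population-weighted average of the per-zone scores."""
    total_pop = sum(populations.values())
    return sum(zone_score(zones[z]) * populations[z]
               for z in zones) / total_pop

print(round(city_score(zones, zone_population), 1))  # → 37.5
```

A low city-wide number, as in Lisbon’s 8.6/100, falls out of this structure naturally: if most zones reach few or no destinations over low-stress infrastructure, even well-connected pockets barely move the weighted average.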
But a growing body of research has raised concerns that many of these discoveries suffer from severe biases of their own. Specifically, the vast majority of what we know about human psychology and behavior comes from studies conducted with a narrow slice of humanity – college students, middle-class respondents living near universities and highly educated residents of wealthy, industrialized and democratic nations.
Blue countries represent the locations of 93 percent of studies published in Psychological Science in 2017. Dark blue is U.S., blue is Anglophone colonies with a European descent majority, light blue is western Europe. Regions sized by population.
To illustrate the extent of this bias, consider that more than 90 percent of studies recently published in psychological science’s flagship journal come from countries representing less than 15 percent of the world’s population.
If people thought and behaved in basically the same ways worldwide, selective attention to these typical participants would not be a problem. Unfortunately, in those rare cases where researchers have reached out to a broader range of humanity, they frequently find that the “usual suspects” most often included as participants in psychology studies are actually outliers. They stand apart from the vast majority of humanity in things like how they divvy up windfalls with strangers, how they reason about moral dilemmas and how they perceive optical illusions.
Given that these typical participants are often outliers, many scholars now describe them and the findings associated with them using the acronym WEIRD, for Western, educated, industrialized, rich and democratic.
WEIRD isn’t universal
Because so little research has been conducted outside this narrow set of typical participants, anthropologists like me cannot be sure how pervasive or consequential the problem is. A growing body of case studies suggests, though, that assuming such typical participants are the norm worldwide is not only scientifically suspect but can also have practical consequences….(More)”.
Book by Kevin Werbach: “The blockchain entered the world on January 3, 2009, introducing an innovative new trust architecture: an environment in which users trust a system—for example, a shared ledger of information—without necessarily trusting any of its components. The cryptocurrency Bitcoin is the most famous implementation of the blockchain, but hundreds of other companies have been founded and billions of dollars invested in similar applications since Bitcoin’s launch. Some see the blockchain as offering more opportunities for criminal behavior than benefits to society. In this book, Kevin Werbach shows how a technology resting on foundations of mutual mistrust can become trustworthy.
The blockchain, built on open software and decentralized foundations that allow anyone to participate, seems like a threat to any form of regulation. In fact, Werbach argues, law and the blockchain need each other. Blockchain systems that ignore law and governance are likely to fail, or to become outlaw technologies irrelevant to the mainstream economy. That, Werbach cautions, would be a tragic waste of potential. If, however, we recognize the blockchain as a kind of legal technology that shapes behavior in new ways, it can be harnessed to create tremendous business and social value….(More)”
Report by the World Bank: “…Decisions based on data can greatly improve people’s lives. Data can uncover patterns, unexpected relationships and market trends, making it possible to address previously intractable problems and leverage hidden opportunities. For example, tracking genes associated with certain types of cancer to improve treatment, or using commuter travel patterns to devise public transportation that is affordable and accessible for users, as well as profitable for operators.
Data is clearly a precious commodity, and the report points out that people should have greater control over the use of their personal data. Broadly speaking, there are three possible answers to the question “Who controls our data?”: firms, governments, or users. No global consensus yet exists on the extent to which private firms that mine data about individuals should be free to use the data for profit and to improve services.
Users’ willingness to share data in return for benefits and free services – such as virtually unrestricted use of social media platforms – varies widely by country. In addition, early internet adopters, who grew up with the internet and are now age 30–40, are the most willing to share (GfK 2017).
Are you willing to share your data? (source: GfK 2017)
On the other hand, data can worsen the digital divide – the data poor, who leave no digital trail because they have limited access, are most at risk from exclusion from services, opportunities and rights, as are those who lack a digital ID, for instance.
Firms and Data
For private sector firms, particularly those in developing countries, the report suggests how they might expand their markets and improve their competitive edge. Companies are already developing new markets and making profits by analyzing data to better understand their customers. This is transforming conventional business models. For years, telecommunications was funded by users paying for phone calls. Today, advertisers paying for users’ data and attention fund the internet, social media, and other platforms, such as apps, reversing the value flow.
Governments and Data
For governments and development professionals, the report provides guidance on how they might use data more creatively to help tackle key global challenges, such as eliminating extreme poverty, promoting shared prosperity, or mitigating the effects of climate change. The first step is developing appropriate guidelines for data sharing and use, and for anonymizing personal data. Governments are already beginning to use the huge quantities of data they hold to enhance service delivery, though they still have far to go to catch up with the commercial giants, the report finds.
Data for Development
The Information and Communications for Development report analyses how the data revolution is changing the behavior of governments, individuals, and firms and how these changes affect economic, social, and cultural development. This is a topic of growing importance that cannot be ignored, and the report aims to stimulate wider debate on the unique challenges and opportunities of data for development. It will be useful for policy makers, but also for anyone concerned about how their personal data is used and how the data revolution might affect their future job prospects….(More)”.