Smart Cities, Smart Governments and Smart Citizens: A Brief Introduction


Paper by Gabriel Puron Cid et al. in the International Journal of E-Planning Research (IJEPR): “Although the field of study surrounding the “smart city” is in an embryonic phase, the use of information and communication technologies (ICT) in urban settings is not new (Dameri and Rosenthal-Sabroux, 2014; Toh and Low, 1993; Tokmakoff and Billington, 1994). Since ancient times, cities and metropolitan areas have propelled social transformation and economic prosperity in many societies (Katz and Bradley, 2013). Many modern urban sites and metros have leveraged the success and competitiveness of ICTs (Caragliu, Del Bo and Nijkamp, 2011). At least in part, the recent growth of smart city initiatives can be attributed to the rapid adoption of mobile and sensor technologies, as well as the diversity of available Internet applications (Nam and Pardo, 2011; Oberti and Pavesi, 2013).

The effective use of technological innovations in urban sites has been embraced by the emergent term “smart city”, with a strong focus on improving living conditions, safeguarding the sustainability of the natural environment, and engaging with citizens more effectively and actively (Dameri and Rosenthal-Sabroux, 2014). Also known as smart city, digital city, or intelligent city, many of these initiatives have been introduced as strategies to improve the utilization of physical infrastructure (e.g., roads and utility grids), engage citizens in active local governance and decision making, foster sustainable growth, and help government officials learn and innovate as the environment changes….(More)”

Big Data. Big Obstacles.


Dalton Conley et al. in the Chronicle of Higher Education: “After decades of fretting over declining response rates to traditional surveys (the mainstay of 20th-century social research), an exciting new era would appear to be dawning thanks to the rise of big data. Social contagion can be studied by scraping Twitter feeds; peer effects are tested on Facebook; long-term trends in inequality and mobility can be assessed by linking tax records across years and generations; social-psychology experiments can be run on Amazon’s Mechanical Turk service; and cultural change can be mapped by studying the rise and fall of specific Google search terms. In many ways there has been no better time to be a scholar in sociology, political science, economics, or related fields.

However, what should be an opportunity for social science is now threatened by a three-headed monster of privatization, amateurization, and Balkanization. A coordinated public effort is needed to overcome all of these obstacles.

While the availability of social-media data may obviate the problem of declining response rates, it introduces all sorts of problems with the level of access that researchers enjoy. Although some data can be culled from the web—Twitter feeds and Google searches—other data sit behind proprietary firewalls. And as individual users tune up their privacy settings, the typical university or independent researcher is increasingly locked out. Unlike federally funded studies, there is no mandate for Yahoo or Alibaba to make its data publicly available. The result, we fear, is a two-tiered system of research. Scientists working for or with big Internet companies will feast on humongous data sets—and even conduct experiments—and scholars who do not work in Silicon Valley (or Alley) will be left with proverbial scraps….

To address this triple threat of privatization, amateurization, and Balkanization, public social science needs to be bolstered for the 21st century. In the current political and economic climate, social scientists are not waiting for huge government investment like we saw during the Cold War. Instead, researchers have started to knit together disparate data sources by scraping, harmonizing, and geocoding any and all information they can get their hands on.

Currently, many firms employ some well-trained social and behavioral scientists who are free to pursue their own research; likewise, some companies have programs by which scholars can apply to be in residence or work with their data extramurally. However, as Facebook states, its program is “by invitation only and requires an internal Facebook champion.” And while Google provides services like Ngram to the public, such limited efforts at data sharing are not enough for truly transparent and replicable science….(More)”

 

You Can’t Handle the (Algorithmic) Truth


at Slate: “Critics of “algorithms” are everywhere. Algorithms tell you how to vote. Algorithms can revoke your driver’s license and terminate your disability benefits. Algorithms predict crimes. Algorithms ensured you didn’t hear about #FreddieGray on Twitter. Algorithms are everywhere, and, to hear critics, they are trouble. What’s the problem? Critics allege that algorithms are opaque, automatic, emotionless, and impersonal, and that they separate decision-makers from the consequences of their actions. Algorithms cannot appreciate the context of structural discrimination, are trained on flawed datasets, and are ruining lives everywhere. There needs to be algorithmic accountability. Otherwise, who is to blame when a computational process suddenly deprives someone of his or her rights and livelihood?

But at heart, criticism of algorithmic decision-making makes an age-old argument about impersonal, automatic corporate and government bureaucracy. The machinelike bureaucracy has simply become the machine. Instead of a quest for accountability, much of the rhetoric and discourse about algorithms amounts to a surrender—an unwillingness to fight the ideas and bureaucratic logic driving the algorithms that critics find so creepy and problematic. Algorithmic transparency and accountability can only be achieved if critics understand that transparency (no modifier is needed) is the issue. If the problem is that a bureaucratic system is impersonal, unaccountable, creepy, and uses flawed or biased decision criteria, then why fetishize and render mysterious the mere mechanical instrument of the system’s will?…(More)”

Platforms connect talented Instagrammers with good causes


Springwise: “….a number of charities using social media campaigns to spread awareness about their causes. Only last week we covered the #EndangeredEmoji campaign from WWF, which uses Twitter and emojis to highlight the plight of seventeen endangered species. Now, two startups — Gramforacause and Gramming For Good — are turning to the photocentric platform Instagram to connect photographers with non-profits, helping spread the word about their causes through social photography.

Each startup curates a service tailored to the needs of the nonprofit — whether that be providing photos to populate the feed, an Instagram takeover or simply spreading the word on a photographer’s own account. When matching a photographer, both companies consider preferred photography type — phone, DSLR or film — their expected rate of pay and where their passions lie….

Website: www.grammingforgood.com & www.gramforacause.com “

Selected Readings on Data Governance


Jos Berens (Centre for Innovation, Leiden University) and Stefaan G. Verhulst (GovLab)

The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of data governance was originally published in 2015.

Context
The field of Data Collaboratives is premised on the idea that sharing and opening up private sector datasets has great – and yet untapped – potential for promoting social good. At the same time, the potential of data collaboratives depends on the level of societal trust in the exchange, analysis and use of the data exchanged. Strong data governance frameworks are essential to ensure responsible data use. Without such governance regimes, the emergent data ecosystem will be hampered and the (perceived) risks will dominate the (perceived) benefits. Further, without adopting a human-centered approach to the design of data governance frameworks, including iterative prototyping and careful consideration of the user experience, the responses may fail to be flexible and targeted to real needs.

Annotated Selected Readings List (in alphabetical order)

Better Place Lab, “Privacy, Transparency and Trust.” Mozilla, 2015. Available from: http://www.betterplace-lab.org/privacy-report.

  • This report looks specifically at the risks involved when the social sector has access to datasets, and at the main risks development organizations should focus on in developing a responsible data-use practice.
  • Focusing on five specific countries (Brazil, China, Germany, India and Indonesia), the report presents country profiles, followed by a comparative analysis centering on privacy, transparency, online behavior and trust.
  • Some of the key findings mentioned are:
    • A general concern about the importance of privacy, with cultural differences influencing conceptions of what privacy is.
    • Cultural differences determining how transparency is perceived, and how much value is attached to achieving it.
    • To build trust, individuals need to feel a personal connection or get a personal recommendation – it is hard to build trust regarding automated processes.

de Montjoye, Yves-Alexandre; Kendall, Jake; and Kerry, Cameron F. “Enabling Humanitarian Use of Mobile Phone Data.” The Brookings Institution, 2014. Available from: http://www.brookings.edu/research/papers/2014/11/12-enabling-humanitarian-use-mobile-phone-data.

  • Focusing in particular on mobile phone data, this paper explores ways of mitigating privacy harms involved in using call detail records for social good.
  • Key takeaways are the following recommendations for using data for social good:
    • Engaging companies, NGOs, researchers, privacy experts, and governments to agree on a set of best practices for new privacy-conscientious metadata sharing models.
    • Accepting that no framework for maximizing data for the public good will offer perfect protection for privacy, but there must be a balanced application of privacy concerns against the potential for social good.
    • Establishing systems and processes for recognizing trusted third-parties and systems to manage datasets, enable detailed audits, and control the use of data so as to combat the potential for data abuse and re-identification of anonymous data.
    • Simplifying the process for governments in developing countries regarding the collection and use of mobile phone metadata for research and public good purposes.

Center for Democracy & Technology, “Health Big Data in the Commercial Context.” Center for Democracy & Technology, 2015. Available from: https://cdt.org/insight/health-big-data-in-the-commercial-context/.

  • Focusing particularly on the privacy issues related to using data generated by individuals, this paper explores the overlap in privacy questions this field has with other data uses.
  • The authors note that although the Health Insurance Portability and Accountability Act (HIPAA) has proven a successful approach in ensuring accountability for health data, most of these standards do not apply to developers of the new technologies used to collect these new data sets.
  • For non-HIPAA-covered, consumer-facing technologies, the paper proposes an alternative framework for considering privacy issues, based on the Fair Information Practice Principles and three rounds of stakeholder consultations.

Center for Information Policy Leadership, “A Risk-based Approach to Privacy: Improving Effectiveness in Practice.” Centre for Information Policy Leadership, Hunton & Williams LLP, 2015. Available from: https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/white_paper_1-a_risk_based_approach_to_privacy_improving_effectiveness_in_practice.pdf.

  • This white paper is part of a project aiming to explain what is often referred to as a new, risk-based approach to privacy, and the development of a privacy risk framework and methodology.
  • With the pace of technological progress often outstripping the capabilities of privacy officers to keep up, this method aims to offer the ability to approach privacy matters in a structured way, assessing privacy implications from the perspective of possible negative impact on individuals.
  • With the intended outcomes of the project being “materials to help policy-makers and legislators to identify desired outcomes and shape rules for the future which are more effective and less burdensome”, insights from this paper might also feed into the development of innovative governance mechanisms aimed specifically at preventing individual harm.

Centre for Information Policy Leadership, “Data Governance for the Evolving Digital Marketplace.” Centre for Information Policy Leadership, Hunton & Williams LLP, 2011. Available from: http://www.huntonfiles.com/files/webupload/CIPL_Centre_Accountability_Data_Governance_Paper_2011.pdf.

  • This paper argues that as a result of the proliferation of large scale data analytics, new models governing data inferred from society will shift responsibility to the side of organizations deriving and creating value from that data.
  • Noting the challenge corporations face in enabling agile and innovative data use, the paper observes: “In exchange for increased corporate responsibility, accountability [and the governance models it mandates, ed.] allows for more flexible use of data.”
  • Proposed as a means to shift responsibility to the side of data-users, the accountability principle has been researched by a worldwide group of policymakers. Tracing the history of the accountability principle, the paper argues that it “(…) requires that companies implement programs that foster compliance with data protection principles, and be able to describe how those programs provide the required protections for individuals.”
  • The following essential elements of accountability are listed:
    • Organisation commitment to accountability and adoption of internal policies consistent with external criteria
    • Mechanisms to put privacy policies into effect, including tools, training and education
    • Systems for internal, ongoing oversight and assurance reviews and external verification
    • Transparency and mechanisms for individual participation
    • Means of remediation and external enforcement

Crawford, Kate; Schultz, Jason. “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms.” NYU School of Law, 2014. Available from: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2325784&download=yes.

  • Considering the privacy implications of large-scale analysis of numerous data sources, this paper proposes the implementation of a ‘procedural data due process’ mechanism to arm data subjects against potential privacy intrusions.
  • The authors acknowledge that some existing privacy protection structures already include similar mechanisms. However, due to the “inherent analytical assumptions and methodological biases” of big data systems, the authors argue for a more rigorous framework.

Letouzé, Emmanuel; and Vinck, Patrick. “The Ethics and Politics of Call Data Analytics.” Data-Pop Alliance, 2015. Available from: http://static1.squarespace.com/static/531a2b4be4b009ca7e474c05/t/54b97f82e4b0ff9569874fe9/1421442946517/WhitePaperCDRsEthicFrameworkDec10-2014Draft-2.pdf.

  • Focusing on the use of Call Detail Records (CDRs) for social good in development contexts, this whitepaper explores both the potential of these datasets – in part by detailing recent successful efforts in the space – and political and ethical constraints to their use.
  • Drawing from the Menlo Report Ethical Principles Guiding ICT Research, the paper explores how these principles might be unpacked to inform an ethics framework for the analysis of CDRs.

Data for Development External Ethics Panel, “Report of the External Ethics Review Panel.” Orange, 2015. Available from: http://www.d4d.orange.com/fr/content/download/43823/426571/version/2/file/D4D_Challenge_DEEP_Report_IBE.pdf.

  • This report presents the findings of the external expert panel overseeing the Orange Data for Development Challenge.
  • Several types of issues faced by the panel are described, along with the various ways in which the panel dealt with those issues.

Federal Trade Commission Staff Report, “Mobile Privacy Disclosures: Building Trust Through Transparency.” Federal Trade Commission, 2013. Available from: www.ftc.gov/os/2013/02/130201mobileprivacyreport.pdf.

  • This report looks at ways to address privacy concerns regarding mobile phone data use. Specific advice is provided for the following actors:
    • Platforms, or operating systems providers
    • App developers
    • Advertising networks and other third parties
    • App developer trade associations, along with academics, usability experts and privacy researchers

Mirani, Leo. “How to use mobile phone data for good without invading anyone’s privacy.” Quartz, 2015. Available from: http://qz.com/398257/how-to-use-mobile-phone-data-for-good-without-invading-anyones-privacy/.

  • This article considers the privacy implications of using call detail records for social good, and ways to mitigate the risks of privacy intrusion.
  • Taking the example of the Orange D4D challenge and the anonymization strategy employed there, the article describes how classic ‘anonymization’ is often not enough, and then lists further measures that can be taken to ensure adequate privacy protection.

Bernholz, Lucy. “Several Examples of Digital Ethics and Proposed Practices.” Stanford Ethics of Data Conference, 2014. Available from: http://www.scribd.com/doc/237527226/Several-Examples-of-Digital-Ethics-and-Proposed-Practices.

  • This list of readings prepared for Stanford’s Ethics of Data conference lists some of the leading available literature regarding ethical data use.

Abrams, Martin. “A Unified Ethical Frame for Big Data Analysis.” The Information Accountability Foundation, 2014. Available from: http://www.privacyconference2014.org/media/17388/Plenary5-Martin-Abrams-Ethics-Fundamental-Rights-and-BigData.pdf.

  • Going beyond privacy, this paper discusses the following elements as central to developing a broad framework for data analysis:
    • Beneficial
    • Progressive
    • Sustainable
    • Respectful
    • Fair

Lane, Julia; Stodden, Victoria; Bender, Stefan; and Nissenbaum, Helen. “Privacy, Big Data and the Public Good.” Cambridge University Press, 2014. Available from: http://www.dataprivacybook.org.

  • This book addresses the privacy issues surrounding the use of big data for promoting the public good.
  • The questions being asked include the following:
    • What are the ethical and legal requirements for scientists and government officials seeking to serve the public good without harming individual citizens?
    • What are the rules of engagement?
    • What are the best ways to provide access while protecting confidentiality?
    • Are there reasonable mechanisms to compensate citizens for privacy loss?

Richards, Neil M.; and King, Jonathan H. “Big Data Ethics.” Wake Forest Law Review, 2014. Available from: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2384174.

  • This paper describes the growing impact of big data analytics on society, and argues that because of this impact, a set of ethical principles to guide data use is called for.
  • The four proposed themes are: privacy, confidentiality, transparency and identity.
  • Finally, the paper discusses how big data can be integrated into society, going into multiple facets of this integration, including the law, roles of institutions and ethical principles.

OECD, “OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data”. Available from: http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.

  • A globally used set of principles to inform thinking about the handling of personal data, the OECD privacy guidelines serve as one of the leading standards for informing privacy policies and data governance structures.
  • The basic principles of national application are the following:
    • Collection Limitation Principle
    • Data Quality Principle
    • Purpose Specification Principle
    • Use Limitation Principle
    • Security Safeguards Principle
    • Openness Principle
    • Individual Participation Principle
    • Accountability Principle

The White House Big Data and Privacy Working Group, “Big Data: Seizing Opportunities, Preserving Values.” White House, 2014. Available from: https://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf.

  • Documenting the findings of the White House big data and privacy working group, this report lists, among others, the following key recommendations regarding data governance:
    • Bringing greater transparency to the data services industry
    • Stimulating international conversation on big data, with multiple stakeholders
    • With regard to educational data: ensuring data is used for the purpose it is collected for
    • Paying attention to the potential for big data to facilitate discrimination, and expanding technical understanding to stop discrimination

Hoffman, William. “Pathways for Progress.” World Economic Forum, 2015. Available from: http://www3.weforum.org/docs/WEFUSA_DataDrivenDevelopment_Report2015.pdf.

  • This paper identifies the lack of well-defined and balanced governance mechanisms as one of the key obstacles preventing corporate-sector data, in particular, from being shared in a controlled space.
  • An approach that balances the benefits against the risks of large-scale data usage in a development context, building trust among all stakeholders in the data ecosystem, is viewed as key.
  • Furthermore, this whitepaper notes that new governance models are required not just by the growing amount of data, increased analytical capacity, and more refined methods of analysis. The current “super-structure” of information flows between institutions is also seen as one of the key reasons to develop alternatives to the current – outdated – approaches to data governance.

Launching the Police Data Initiative


Megan Smith and Roy L. Austin, Jr. at the White House: “Last December, President Obama launched the Task Force on 21st Century Policing to better understand specific policing challenges and help communities identify actions they can take to improve law enforcement and enhance community engagement. Since that time, we have seen law enforcement agencies around the country working harder than ever to make the promise of community policing real.

Many of the Task Force’s recommendations emphasize the opportunity for departments to better use data and technology to build community trust. As a response, the White House has launched the Police Data Initiative, which has mobilized 21 leading jurisdictions across the country to take fast action on concrete deliverables responding to these Task Force recommendations in the area of data and technology. Camden is one such jurisdiction.

By finding innovative work already underway in these diverse communities and bringing their leaders together with top technologists, researchers, data scientists and design experts, the Police Data Initiative is helping accelerate progress around data transparency and analysis, toward the goal of increased trust and impact. Through the Initiative, key stakeholders are establishing a community of practice that will allow for knowledge sharing, community-sourced problem solving, and the establishment of documented best practices that can serve as examples for police departments nationwide….

Commitment highlights include:

1. Use open data to build transparency and increase community trust.

  • All 21 police departments have committed to release a combined total of 101 data sets that have not been released to the public. The types of data include uses of force, police pedestrian and vehicle stops, officer involved shootings and more, helping the communities gain visibility into key information on police/citizen encounters.
    • Code for America and others are helping on this. For information on how Police Departments can jumpstart their open police data efforts, click here.
  • To make police open data easy to find and use, the Police Foundation and ESRI are building a public safety open data portal to serve, in part, as a central clearinghouse option for police open data, making it easily accessible to law enforcement agencies, community groups and researchers.
  • Code for America and CI Technologies will work together to build an open source software tool to make it easier for the 500+ U.S. law enforcement agencies using IA Pro police integrity software to extract and open up data.
  • To make it easier for agencies to share data with the public about policing, Socrata will provide technical assistance to cities and agencies who are working toward increased transparency.
  • To help this newly released data come alive for communities through mapping, visualizations and other tools, city leaders, non-profit organizations, and private sector partners will host open data hackathons in cities around the country. In New Orleans, Operation Spark, a non-profit organization that teaches at-risk New Orleans youth software development skills, will work with data opened by the New Orleans Police Department at a weeklong code academy.
  • Presidential Innovation Fellows working with the U.S. Chief Technology Officer and Chief Data Scientist will work collaboratively with key stakeholders, such as Code for America and the Sunlight Foundation, to develop and release an Open Data Playbook for police departments that they can use as a reference for open data best practices and case studies.
  • The Charlotte-Mecklenburg Police Department is working with the Southern Coalition for Social Justice to use open data to provide a full picture of key policing activities, including stops, searches and use-of-force trends, information and demographics on neighborhoods patrolled, and more. This partnership will build on a website and tools already developed by the Southern Coalition for Social Justice which provide visualization and search functions to make this data easily accessible and understandable.
  • The International Association of Chiefs of Police, the Police Foundation, and Code for America have committed to helping grow a community of practice for law enforcement agencies and technologists around open data and transparency in police community interactions.

2. Internal accountability and effective data analysis.

  • While many police departments have systems in place, often called “early warning systems,” to identify officers who may be having challenges in their interactions with the public and link them with training and other assistance, there has been little to no research to determine which indicators are most closely linked to bad outcomes. To tackle this issue, twelve police departments committed to sharing data on police/citizen encounters with data scientists for in-depth data analysis, strengthening the ability of police to intervene early and effectively: Austin, TX; Camden, NJ; Charlotte, NC; Dallas, TX; Indianapolis, IN; Knoxville, TN; LA City; LA County; Louisville, KY; New Orleans, LA; Philadelphia, PA; and Richmond, CA….(More)

Chicago uses new technology to solve this very old urban problem


 at Fortune: “Chicago has spent 12 years collecting data on resident complaints. Now the city is harnessing that data to control the rat population, stopping infestations before residents spot rats in the first place.

For the past three years, Chicago police have been analyzing 911 calls to better predict crime patterns across the city and, in one case, actually forecasted a shootout minutes before it occurred.

Now, the city government is turning its big data weapons on the city’s rat population.

The city has 12 years of data on resident complaints, ranging from calls about rodent sightings to graffiti. Those clusters of data lead city engineers to the spots where rats can potentially breed. The report is shared with the city’s sanitation team, which then cleans up the rat-infested areas.

“We discovered a really interesting relationship that led to developing an algorithm about rodent prediction,” says Brenna Berman, Chicago’s chief information officer. “It involved 31 variables related to calls about overflowing trash bins and food poisoning in restaurants.”

The new approach, Berman says, is 20% more efficient than the old responsive model.

Governing cities in the 21st century is a difficult task. It needs political and economic support. In America, it was only in the early 1990s—when young adults started moving from the suburbs back to the cities—that the academic and policy consensus shifted back toward urban centers. Since then, cities have faced an influx of new residents, overwhelming the agencies that provide services. To meet that demand amid recent budget sequestration, cities like New York, San Francisco, Philadelphia, and Chicago are constantly elevating the art of governance through innovative policies.

Thanks to this new model, in Chicago you might never even spot a rat. The city’s Department of Innovation and Technology analyzes big chunks of data so effectively that a likely rodent infestation can be thwarted seven days ahead of resident rat sightings…(More)”
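The 31-variable model Berman describes is not public, but the general approach (scoring each area by a weighted count of leading-indicator complaint types, then prioritizing the highest-risk areas for sanitation crews) can be sketched in a few lines. The categories and weights below are illustrative stand-ins, not Chicago's actual coefficients:

```python
from collections import Counter

# Hypothetical complaint categories and weights. Chicago's actual model
# reportedly uses 31 variables; these three are illustrative stand-ins.
WEIGHTS = {
    "overflowing_trash_bin": 0.5,
    "restaurant_food_poisoning": 0.3,
    "rodent_sighting": 0.2,
}

def rodent_risk(complaints):
    """Score an area by a weighted count of recent 311 complaint types."""
    counts = Counter(c["type"] for c in complaints)
    return sum(WEIGHTS.get(kind, 0.0) * n for kind, n in counts.items())

def rank_areas(complaints_by_area):
    """Order areas by descending predicted rodent risk."""
    return sorted(complaints_by_area,
                  key=lambda area: rodent_risk(complaints_by_area[area]),
                  reverse=True)
```

A production system would learn the weights from historical baiting and inspection outcomes rather than fixing them by hand, which is presumably where the city's data scientists earned their 20% efficiency gain.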

The Future of Citizen Engagement: five trends transforming government


Catherine Andrews at GovLoop: “Every year, citizen engagement seems to improve. New technologies are born; new innovations connect citizens with the government; new ideas start to take root.

It’s 2015, and citizen engagement has gone far beyond basic social media and town halls. As we make our way through the 21st century, citizen engagement is continuing to evolve. New platforms and concepts such as geographic information systems (GIS), GitHub, open data, human-centered design and novel uses of social media have challenged the traditional notions of citizen engagement and pushed government into uncharted territories. As citizens become more tech-savvy, this growth is only continuing.

This GovLoop guide will dive into five of the latest and newest trends in citizen engagement. From the customer experience to the Internet of Things, we’ll highlight the most innovative ways federal, state and local governments are connecting with citizens….(More)”

Detroit Revitalizes City with 311 App


Jason Shueh at Government Technology: “In the wake of the Detroit bankruptcy, blight besieged parts of the city as its populace exited. The fallout was typical. There was a run of vandalism, thefts, arson and graffiti. Hard times pushed throngs of looters into scores of homes to scavenge for anything that wasn’t bolted down — and often, even for the things that were…. For solutions, Detroit Mayor Mike Duggan and DWSD’s CIO Dan Rainey partnered with SeeClickFix. The company, based in New Haven, Conn., is known for its mobile platform that’s embedded itself as a conduit between city service departments and citizen non-emergency — or 311 — requests. Duggan saw the platform as an opportune answer to address more than a single issue. Instead, the mayor asked how the app could be integrated throughout the city. Potholes, downed trees, graffiti, missing signage, streetlight outages — the mayor wanted a bundled solution to handle an array of common challenges.

Improve Detroit was his answer. The city app, officially available since April, allows citizens to report problems using photos, location data and request type. Notifications on progress follow and residents can even pay utility bills through the app. For departments, it’s ingrained into work orders and workflows, while analytics provide data for planning, and filters permit a deep-dive analysis….
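The article doesn't detail Improve Detroit's internals, but many civic 311 apps of this kind expose the Open311 GeoReport v2 interface, in which a citizen report is a POST of a small service-request payload. A minimal sketch, with a hypothetical service code (a real deployment publishes its own category list at GET /services.json):

```python
# Sketch of an Open311 GeoReport v2 service request, the open standard many
# civic 311 apps implement. The service code and values are hypothetical.
def build_service_request(service_code, lat, lon, description, media_url=None):
    """Assemble the payload a client would POST to /requests.json."""
    request = {
        "service_code": service_code,  # category ID, e.g. a pothole code
        "lat": lat,
        "long": lon,                   # Open311 uses "long", not "lon"
        "description": description,
    }
    if media_url is not None:
        request["media_url"] = media_url  # photo attached by the resident
    return request
```

The server's response includes a service_request_id, which is what enables the progress notifications the article mentions: the app polls that ID as the work order moves through the department's workflow.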


Detroit now sits among many metropolitan cities pioneering such 311 apps. San Francisco, New York, Los Angeles, Philadelphia and Chicago are just a few of them. And there are a host of equally adroit tech providers supplying and supporting the apps — companies like Salesforce, CitySourced, PublicStuff, Fix 311 and others. Some cities have even developed their own apps through their internal IT departments.

What’s unique in Detroit is the city’s ambition to leverage a 311 app against major blight while the city works to demolish more than 20,000 abandoned homes — susceptible to fire, flooding, pest infestations and criminal activity. Beyond this, Lingholm said the initiative doubles as a tool to rejuvenate public trust. Data from the app is fed to the city’s new open data portal, and departments have set goals to ensure responsiveness….(More)

How to use mobile phone data for good without invading anyone’s privacy


Leo Mirani in Quartz: “In 2014, when the West African Ebola outbreak was at its peak, some academics argued that the epidemic could have been slowed by using mobile phone data.

Their premise was simple: call-data records show the true nature of social networks and human movement. Understanding social networks and how people really move—as seen from phone movements and calls—could give health officials the ability to predict how a disease will move and where a disease will strike next, and prepare accordingly.

The problem is that call-data records are very hard to get a hold of. The files themselves are huge, there are enormous privacy risks, and the process of making the records safe for distribution is long.
First, the technical basics

Every time you make a phone call from your mobile phone to another mobile phone, the network records the following information (note: this is not a complete list):

  • The number from which the call originated
  • The number at which the call terminated
  • Start time of the call
  • Duration of the call
  • The ID number of the phone making the call
  • The ID number of the SIM card used to make the call
  • The code for the antenna used to make the call
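In GSM terms, the phone and SIM identifiers in that list are the IMEI and IMSI. As a rough sketch, a record with those fields might be modeled like this (the field names and comma-separated layout are illustrative; real CDR schemas vary by operator):

```python
from dataclasses import dataclass

@dataclass
class CallDetailRecord:
    """One call, with the fields listed above. Field names and the
    comma-separated layout are illustrative; real schemas vary by operator."""
    caller: str      # number from which the call originated
    callee: str      # number at which the call terminated
    start_time: str  # timestamp of call start
    duration_s: int  # duration of the call, in seconds
    imei: str        # ID of the handset making the call
    imsi: str        # ID of the SIM card used
    cell_id: str     # code of the antenna that carried the call

def parse_cdr(line):
    """Parse one comma-separated CDR line into a record."""
    caller, callee, start, dur, imei, imsi, cell = line.strip().split(",")
    return CallDetailRecord(caller, callee, start, int(dur), imei, imsi, cell)
```

Note the absence of any content field: everything here is metadata, which is exactly why the files are both operationally essential and privacy-sensitive.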

On their own, these records are not creepy. Indeed, without them, networks would be unable to connect calls or bill customers. But it is easy to see why operators aren’t rushing to share this information. Even though the data includes none of the actual content of a phone call, simply knowing which number is calling which, and from where and when, is usually more than enough to identify people.
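A toy illustration of that point: even after phone numbers are replaced with opaque pseudonyms, the pattern of antennas a subscriber uses can act as a fingerprint. The sketch below (not any operator's actual method) counts how many users are uniquely identified within a dataset by their most-visited antennas alone:

```python
from collections import Counter

def location_signature(records, n=2):
    """A user's n most-visited antennas: a crude location fingerprint."""
    counts = Counter(r["cell_id"] for r in records)
    return tuple(sorted(cell for cell, _ in counts.most_common(n)))

def unique_fraction(records_by_user, n=2):
    """Fraction of pseudonymous users whose location signature is unique.
    Toy illustration only, not any operator's actual re-identification test."""
    signatures = Counter(location_signature(recs, n)
                         for recs in records_by_user.values())
    unique = sum(1 for recs in records_by_user.values()
                 if signatures[location_signature(recs, n)] == 1)
    return unique / len(records_by_user)
```

Research on real CDRs has found that as few as four spatio-temporal points suffice to uniquely identify the large majority of individuals in a mobility dataset, which is why simply stripping numbers is not considered adequate anonymization.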
So how can network operators use this valuable data for good while also protecting their own interests and those of their customers? A good example can be found in Africa, where Orange, a French mobile phone network with interests across several African countries, has for the second year run its “Data for Development” (D4D) program, which offers researchers a chance to mine call data for clues on development problems.

Steps to safe sharing

After a successful first year in Ivory Coast, Orange this year ran the D4D program in Senegal. The aim of the program is to give researchers and scientists at universities and other research labs access to data in order to find novel ways to aid development in health, agriculture, transport or urban planning, energy, and national statistics….(More)”