Improving patient care by bridging the divide between doctors and data scientists


At The Conversation: “While wonderful new medical discoveries and innovations are in the news every day, doctors struggle daily with using information and techniques available right now while carefully adopting new concepts and treatments. As a practicing doctor, I deal with uncertainties and unanswered clinical questions all the time….At the moment, a report from the National Academy of Medicine tells us, most doctors base most of their everyday decisions on guidelines from (sometimes biased) expert opinions or small clinical trials. It would be better if they were from multicenter, large, randomized controlled studies, with tightly controlled conditions ensuring the results are as reliable as possible. However, those are expensive and difficult to perform, and even then often exclude a number of important patient groups on the basis of age, disease and sociological factors.

Part of the problem is that health records are traditionally kept on paper, making them hard to analyze en masse. As a result, most of what medical professionals might have learned from experiences was lost – or at least was inaccessible to another doctor meeting with a similar patient.

A digital system would collect and store as much clinical data as possible from as many patients as possible. It could then use information from the past – such as blood pressure, blood sugar levels, heart rate and other measurements of patients’ body functions – to guide future doctors to the best diagnosis and treatment of similar patients.

Industrial giants such as Google, IBM, SAP and Hewlett-Packard have also recognized the potential for this kind of approach, and are now working on how to leverage population data for the precise medical care of individuals.

Collaborating on data and medicine

At the Laboratory of Computational Physiology at the Massachusetts Institute of Technology, we have begun to collect large amounts of detailed patient data in the Medical Information Mart for Intensive Care (MIMIC). It is a database containing information from 60,000 patient admissions to the intensive care units of the Beth Israel Deaconess Medical Center, a Boston teaching hospital affiliated with Harvard Medical School. The data in MIMIC has been meticulously scoured so individual patients cannot be recognized, and is freely shared online with the research community.

But the database itself is not enough. We bring together front-line clinicians (such as nurses, pharmacists and doctors) to identify questions they want to investigate, and data scientists to conduct the appropriate analyses of the MIMIC records. This gives caregivers and patients the best individualized treatment options in the absence of a randomized controlled trial.
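The kind of analysis this pairing produces can be sketched in miniature. The snippet below is a purely hypothetical illustration (the table, columns and numbers are invented and do not reflect MIMIC's actual schema): it compares outcomes between two treatments among past patients who resemble the current one.

```python
import pandas as pd

# Hypothetical, de-identified ICU admissions. Column names and values are
# invented for illustration; MIMIC's real schema is far richer.
admissions = pd.DataFrame({
    "patient_id": [1, 2, 3, 4, 5, 6],
    "mean_bp":    [65, 70, 72, 68, 90, 88],   # mean arterial pressure (mmHg)
    "treatment":  ["A", "B", "A", "B", "A", "B"],
    "survived":   [0, 1, 1, 1, 1, 1],
})

# Restrict to patients similar to the one in front of us (low blood pressure),
# then compare survival rates between the two candidate treatments.
low_bp = admissions[admissions["mean_bp"] < 75]
survival_by_treatment = low_bp.groupby("treatment")["survived"].mean()
print(survival_by_treatment)
```

With real data at MIMIC's scale, the same pattern of query lets a clinician ask "among patients like mine, what worked?" without waiting for a randomized trial.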

Bringing data analysis to the world

At the same time we are working to bring these data-enabled medical decision support systems to countries with limited health care resources, where research is considered an expensive luxury. Often these countries have few or no medical records – even on paper – to analyze. We can help them collect health data digitally, creating the potential to significantly improve medical care for their populations.

This task is the focus of Sana, a collection of technical, medical and community experts from across the globe that is also based in our group at MIT. Sana has designed a digital health information system specifically for use by health providers and patients in rural and underserved areas.

At its core is an open-source system that uses cellphones – common even in poor and rural nations – to collect, transmit and store all sorts of medical data. It can handle not only basic patient data such as height and weight, but also photos and X-rays, ultrasound videos, and electrical signals from a patient’s brain (EEG) and heart (ECG).

Partnering with universities and health organizations, Sana organizes training sessions (which we call “bootcamps”) and collaborative workshops (called “hackathons”) to connect nurses, doctors and community health workers at the front lines of care with technology experts in or near their communities. In 2015, we held bootcamps and hackathons in Colombia, Uganda, Greece and Mexico. The bootcamps teach students in technical fields like computer science and engineering how to design and develop health apps that can run on cellphones. Immediately following the bootcamp, the medical providers join the group and the hackathon begins…At the end of the day, though, the purpose is not the apps….(More)

An App to Save Syria’s Lost Generation? What Technology Can and Can’t Do


In Foreign Affairs: “In January this year, when the refugee and migrant crisis in Europe had hit its peak—more than a million had crossed into Europe over the course of 2015—the U.S. State Department and Google hosted a forum of over 100 technology experts. The goal was to “bridge the education gap for Syrian refugee children.” Speaking to the group assembled at Stanford University, Deputy Secretary of State Antony Blinken announced a $1.7 million prize “to develop a smartphone app that can help Syrian children learn how to read and improve their wellbeing.” The competition, known as EduApp4Syria, is being run by the Norwegian Agency for Development Cooperation (Norad) and is supported by the Australian government and the French mobile company Orange.

Less than a month later, a group called Techfugees brought together over 100 technologists for a daylong brainstorm in New York City focused exclusively on education solutions. “We are facing the largest refugee crisis since World War II,” said U.S. Ambassador to the United Nations Samantha Power to open the conference. “It is a twenty-first-century crisis and we need a twenty-first-century solution.” Among the more promising, according to Power, were apps that enable “refugees to access critical services,” new “web platforms connecting refugees with one another,” and “education programs that teach refugees how to code.”

For example, the nonprofit PeaceGeeks created the Services Advisor app for the UN Refugee Agency, which maps the location of shelters, food distribution centers, and financial services in Jordan….(More)”

Building Data Responsibility into Humanitarian Action


Stefaan Verhulst at The GovLab: “Next Monday, May 23rd, governments, non-profit organizations and citizen groups will gather in Istanbul at the first World Humanitarian Summit. A range of important issues will be on the agenda, not least of which is the refugee crisis confronting the Middle East and Europe. Also on the agenda will be an issue of growing importance and relevance, even if it does not generate front-page headlines: the increasing potential (and use) of data in the humanitarian context.

To explore this topic, a new paper, “Building Data Responsibility into Humanitarian Action,” is being released today, and will be presented tomorrow at the Understanding Risk Forum. This paper is the result of a collaboration between the United Nations Office for the Coordination of Humanitarian Affairs (OCHA), The GovLab (NYU Tandon School of Engineering), the Harvard Humanitarian Initiative, and Leiden University’s Centre for Innovation. It seeks to identify the potential benefits and risks of using data in the humanitarian context, and begins to outline an initial framework for the responsible use of data in humanitarian settings.

Both anecdotal and more rigorously researched evidence points to the growing use of data to address a variety of humanitarian crises. The paper discusses a number of case studies of data use and risk, including the use of call data to fight malaria in Africa; satellite imagery to identify security threats on the border between Sudan and South Sudan; and transaction data to increase the efficiency of food delivery in Lebanon. These early examples (along with a few others discussed in the paper) have begun to show the opportunities offered by data and information. More importantly, they also help us better understand the risks, especially those posed to privacy and security.

One of the broader goals of the paper is to integrate the specific and the theoretical, in the process building a bridge between the deep, contextual knowledge offered by initiatives like those discussed above and the broader needs of the humanitarian community. To that end, the paper builds on its discussion of case studies to begin establishing a framework for the responsible use of data in humanitarian contexts. It identifies four “Minimum Humanitarian Standards for the Responsible Use of Data” and four “Characteristics of Humanitarian Organizations That Use Data Responsibly.” Together, these eight attributes can serve as a roadmap or blueprint for humanitarian groups seeking to use data. In addition, the paper also provides a four-step practical guide for a data responsibility framework (see also earlier blog)….(More)” Full Paper: Building Data Responsibility into Humanitarian Action

Crowdsourcing corruption in India’s maternal health services


Joan Okitoi-Heisig at DW Akademie: “…The Mera Swasthya Meri Aawaz (MSMA) project is the first of its kind in India to track illicit maternal fees demanded in government hospitals located in the northern state of Uttar Pradesh.

MSMA (“My Health, My Voice”) is part of SAHAYOG, a non-governmental umbrella organization that helped launch the project. MSMA uses an Ushahidi platform to map and collect data on unofficial fees that plague India’s ostensibly “free” maternal health services. It is one of the many projects showcased in DW Akademie’s recently launched Digital Innovation Library. SAHAYOG works closely with grassroots organizations to promote gender equality and women’s health issues from a human rights perspective…

SAHAYOG sees women’s maternal health as a human rights issue. Key to the MSMA project is exposing government facilities that extort bribes from among the poorest and most vulnerable in society.

Sandhya and her colleagues are convinced that promoting transparency and accountability through the data collected can empower the women. If they’re aware of their entitlements, she says, they can demand their rights and in the process hold leaders accountable.

“Information is power,” Sandhya explains. Without this information, she says, “they aren’t in a position to demand what is rightly theirs.”

Health care providers hold a certain degree of power when entrusted with taking care of expectant mothers. Many women give in to bribe demands for fear of otherwise being neglected or abused.

With the MSMA project, however, poor rural women have technology that is easy to use and accessible on their mobile phones, and that empowers them to make complaints and report bribes for services that are supposed to be free.

MSMA is an innovative data-driven platform that combines a toll free number, an interactive voice response system (IVRS) and a website that contains accessible reports. In addition to enabling poor women to air their frustrations anonymously, the project aggregates actionable data which can then be used by the NGO as well as the government to work towards improving the situation for mothers in India….(More)”

Society’s biggest problems need more than a nudge


At The Conversation: “So-called “nudge units” are popping up in governments all around the world.

The best-known examples include the U.K.’s Behavioural Insights Team, created in 2010, and the White House-based Social and Behavioral Sciences Team, introduced by the Obama administration in 2014. Their mission is to leverage findings from behavioral science so that people’s decisions can be nudged in the direction of their best intentions without curtailing their ability to make choices that don’t align with their priorities.

Overall, these – and other – governments have made important strides when it comes to using behavioral science to nudge their constituents into better choices.

Yet, the same governments have done little to improve their own decision-making processes. Consider big missteps like the Flint water crisis. How could officials in Michigan decide to place an essential service – safe water – and almost 100,000 people at risk in order to save US$100 per day for three months? No defensible decision-making process should have allowed this call to be made.

When it comes to many of the big decisions faced by governments – and the private sector – behavioral science has more to offer than simple nudges.

Behavioral scientists who study decision-making processes could also help policy-makers understand why things went wrong in Flint, and how to get their arms around a wide array of society’s biggest problems – from energy transitions to how to best approach the refugee crisis in Syria.

When nudges are enough

The idea of nudging people in the direction of decisions that are in their own best interest has been around for a while. But it was popularized in 2008 with the publication of the bestseller “Nudge” by Richard Thaler of the University of Chicago and Cass Sunstein of Harvard.

A common nudge goes something like this: if we want to eat better but are having a hard time doing it, choice architects can reengineer the environment in which we make our food choices so that healthier options are intuitively easier to select, without making it unrealistically difficult to eat junk food if that’s what we’d rather do. So, for example, we can shelve healthy foods at eye level in supermarkets, with less-healthy options relegated to the shelves nearer to the floor….

Sometimes a nudge isn’t enough

Nudges work for a wide array of choices, from ones we face every day to those that we face infrequently. Likewise, nudges are particularly well-suited to decisions that are complex with lots of different alternatives to choose from. And, they are advocated in situations where the outcomes of our decisions are delayed far enough into the future that they feel uncertain or abstract. This describes many of the big decisions policy-makers face, so it makes sense to think the solution must be more nudge units.

But herein lies the rub. For every context where a nudge seems like a realistic option, there’s at least another context where the application of passive decision support would either be impossible – or, worse, a mistake.

Take, for example, the question of energy transitions. These transitions are often characterized by the move from infrastructure based on fossil fuels to renewables to address all manner of risks, including those from climate change. These are decisions that society makes infrequently. They are complex. And, the outcomes – which are based on our ability to meet conflicting economic, social and environmental objectives – will be delayed.

But, absent regulation that would place severe restrictions on the kinds of options we could choose from – and which, incidentally, would violate the freedom-of-choice tenet of choice architecture – there’s no way to put renewable infrastructure options at proverbial eye level for state or federal decision-makers, or their stakeholders.

Simply put, a nudge for a decision like this would be impossible. In these cases, decisions have to be made the old-fashioned way: with a heavy lift instead of a nudge.

Complex policy decisions like this require what we call active decision support….(More)”

Jakarta’s plans for predictive government


At GovInsider: “Jakarta is predicting floods and traffic using complaints data, and plans to do so for dengue as well.

Its Smart City Unit has partnered with startup Qlue to build a dashboard, analysing data from online complaints, sensors and traffic apps. “Our algorithms can predict several things related to our reports such as flood, traffic, and others”, Qlue co-founder and CEO Rama Raditya told GovInsider.

Take floods, for instance. Using trends in complaints from citizens, water level history from sensors and weather data, it can predict the intensity of floods in specific locations next year. “They can predict what will happen when they compare the weather with the flood conditions from last year”, he said.

The city will start to predict dengue hotspots from next year, Rama said. The dashboard was not originally looking at dengue, but after receiving “thousands of complaints on dengue locations”, the government is now looking into this data. “Next year our algorithm will allow the government to know before it happens so they can prepare the amount of medication and so on within each district,” he said.
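As a rough, purely illustrative sketch of this kind of prediction (the features, numbers and model below are invented; Qlue's actual algorithms and data pipeline are not public), one could regress observed flood depth on complaint counts, sensor water levels and rainfall, then score new conditions:

```python
# Toy illustration: predict flood intensity from complaint counts, sensor
# water levels and rainfall via ordinary least squares. All feature names
# and numbers are made up; this is not Qlue's actual model.
import numpy as np

# One row per (location, week): [complaint_count, water_level_cm, rainfall_mm]
X = np.array([
    [ 2,  40,  10],
    [ 5,  55,  30],
    [12,  80,  90],
    [20, 110, 150],
])
y = np.array([0.0, 0.5, 2.0, 3.5])  # observed flood depth (m)

# Fit a linear model with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(complaints, water_cm, rain_mm):
    """Predicted flood depth (m) for a new week at a location."""
    return coef @ np.array([1.0, complaints, water_cm, rain_mm])

# Score a new week with many complaints, high water and heavy rain.
print(round(predict(15, 95, 120), 2))
```

A production system would of course use far richer features and validation, but the shape of the task – combining citizen reports with sensor and weather histories to anticipate next year's hotspots – is the same.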

The dashboard is paired with an app. The app started with collecting citizens’ complaints and has been expanding with new features. It now has a virtual reality section to explore tourist sites in the city. Next week it is launching an augmented reality feature giving directions to nearby ATMs, restaurants, mosques and parks, Rama said.

Qlue has become a strategic part of the Jakarta administration, with the Governor himself using it to decide who to fire and promote. Following its rise in the capital city, it is now being used by 12 other cities across Indonesia: Bandung, Makassar, Bali, Manado, Surabaya, Bogor, Depok, Palembang, Bekasi, Yogyakarta, Riau and Semarang….(More)

Open Data Supply: Enriching the usability of information


Report by Phoensight: “With the emergence of increasing computational power, high cloud storage capacity and big data comes an eager anticipation of one of the biggest IT transformations of our society today.

Open data has an instrumental role to play in our digital revolution by creating unprecedented opportunities for governments and businesses to leverage previously unavailable information to strengthen their analytics and decision making for new client experiences. Whilst virtually every business recognises the value of data and the importance of the analytics built on it, the ability to realise the potential for maximising revenue and cost savings is not straightforward. The discovery of valuable insights often involves the acquisition of new data and an understanding of it. As we move towards an increasing supply of open data, technological and other entrepreneurs will look to better utilise government information for improved productivity.

This report uses a data-centric approach to examine the usability of information by considering ways in which open data could better facilitate data-driven innovations and further boost our economy. It assesses the state of open data today and suggests ways in which data providers could supply open data to optimise its use. A number of useful measures of information usability, such as accessibility, quantity, quality and openness, are presented, which together contribute to the Open Data Usability Index (ODUI). For the first time, a comprehensive assessment of open data usability has been developed and is expected to be a critical step in taking the open data agenda to the next level.
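As a rough illustration of how a composite index like this might combine its components (the weights and scores below are invented; Phoensight's actual ODUI methodology may weight and normalise its measures differently), one can picture a weighted average of component scores:

```python
# Hypothetical composite usability index: a weighted average of component
# scores, each normalised to [0, 1]. Weights and scores are invented for
# illustration only and are not the ODUI's actual methodology.
WEIGHTS = {"accessibility": 0.3, "quantity": 0.2, "quality": 0.3, "openness": 0.2}

def usability_index(scores, weights=WEIGHTS):
    """Combine per-dimension scores into a single [0, 1] usability figure."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[k] * scores[k] for k in weights)

# A hypothetical country's open data portal, scored on each dimension.
country = {"accessibility": 0.8, "quantity": 0.6, "quality": 0.7, "openness": 0.9}
print(round(usability_index(country), 2))
```

An index of this shape makes the report's country comparisons possible: each national portfolio of datasets collapses to one comparable number, while the per-dimension scores still show where improvement is needed.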

With over two million government datasets assessed against the open data usability framework and models developed to link entire countries’ datasets to key industry sectors, never before has such an extensive analysis been undertaken. Government open data across Australia, Canada, Singapore, the United Kingdom and the United States reveal that most countries have the capacity for improvements in their information usability. It was found that for 2015 the United Kingdom led the way followed by Canada, Singapore, the United States and Australia. The global potential of government open data is expected to reach 20 exabytes by 2020, provided governments are able to release as much data as possible within legislative constraints….(More)”

The Open Data Barometer (3rd edition)


The Open Data Barometer: “Once the preserve of academics and statisticians, data has become a development cause embraced by everyone from grassroots activists to the UN Secretary-General. There’s now a clear understanding that we need robust data to drive democracy and development — and a lot of it.

Last year, the world agreed the Sustainable Development Goals (SDGs) — seventeen global commitments that set an ambitious agenda to end poverty, fight inequality and tackle climate change by 2030. Recognising that good data is essential to the success of the SDGs, the Global Partnership for Sustainable Development Data and the International Open Data Charter were launched as the SDGs were unveiled. These alliances mean the “data revolution” now has over 100 champions willing to fight for it. Meanwhile, Africa adopted the African Data Consensus — a roadmap to improving data standards and availability in a region that has notoriously struggled to capture even basic information such as birth registration.

But while much has been made of the need for bigger and better data to power the SDGs, this year’s Barometer follows the lead set by the International Open Data Charter by focusing on how much of this data will be openly available to the public.

Open data is essential to building accountable and effective institutions, and to ensuring public access to information — both goals of SDG 16. It is also essential for meaningful monitoring of progress on all 169 SDG targets. Yet the promise and possibilities offered by opening up data to journalists, human rights defenders, parliamentarians, and citizens at large go far beyond even these….

At a glance, here are this year’s key findings on the state of open data around the world:

    • Open data is entering the mainstream. The majority of the countries in the survey (55%) now have an open data initiative in place and a national data catalogue providing access to datasets available for re-use. Moreover, new open data initiatives are getting underway or are promised for the near future in a number of countries, including Ecuador, Jamaica, St. Lucia, Nepal, Thailand, Botswana, Ethiopia, Nigeria, Rwanda and Uganda. Demand is high: civil society and the tech community are using government data in 93% of countries surveyed, even in countries where that data is not yet fully open.
    • Despite this, there’s been little to no progress on the number of truly open datasets around the world. Even with the rapid spread of open government data plans and policies, too much critical data remains locked in government filing cabinets. For example, only two countries publish acceptable detailed open public spending data. Of all 1,380 government datasets surveyed, almost 90% are still closed — roughly the same as in the last edition of the Open Data Barometer (when only 130 out of 1,290 datasets, or 10%, were open). What is more, much of the approximately 10% of data that meets the open definition is of poor quality, making it difficult for potential data users to access, process and work with it effectively.
    • “Open-washing” is jeopardising progress. Many governments have advertised their open data policies as a way to burnish their democratic and transparent credentials. But open data, while extremely important, is just one component of a responsive and accountable government. Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and supported by a legal framework. Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries. Until all these factors are in place, open data cannot be a true SDG accelerator.
    • Implementation and resourcing are the weakest links. Progress on the Barometer’s implementation and impact indicators has stalled or even gone into reverse in some cases. Open data can result in net savings for the public purse, but getting individual ministries to allocate the budget and staff needed to publish their data is often an uphill battle, and investment in building user capacity (both inside and outside of government) is scarce. Open data is not yet entrenched in law or policy, and the legal frameworks supporting most open data initiatives are weak. This is a symptom of the tendency of governments to view open data as a fad or experiment with little to no long-term strategy behind its implementation. This results in haphazard implementation, weak demand and limited impact.
    • The gap between data haves and have-nots needs urgent attention. Twenty-six of the top 30 countries in the ranking are high-income countries. Half of open datasets in our study are found in just the top 10 OECD countries, while almost none are in African countries. As the UN pointed out last year, such gaps could create “a whole new inequality frontier” if allowed to persist. Open data champions in several developing countries have launched fledgling initiatives, but too often those good open data intentions are not adequately resourced, resulting in weak momentum and limited success.
    • Governments at the top of the Barometer are being challenged by a new generation of open data adopters. Traditional open data stalwarts such as the USA and UK have seen their rate of progress on open data slow, signalling that new political will and momentum may be needed as more difficult elements of open data are tackled. Fortunately, a new generation of open data adopters, including France, Canada, Mexico, Uruguay, South Korea and the Philippines, are starting to challenge the ranking leaders and are adopting a leadership attitude in their respective regions. The International Open Data Charter could be an important vehicle to sustain and increase momentum in challenger countries, while also stimulating renewed energy in traditional open data leaders….(More)”

The era of development mutants


Giulio Quaggiotto at Nesta: “If you were looking for the cutting edge of the development sector, where would you go these days? You would probably look at startups like Premise who have predicted food trends 25 days faster than national statistics in Brazil, or GiveDirectly who are pushing the boundaries on evidence – from RCTs to new ways of mapping poverty – to fast track the adoption of cash transfers.

Or perhaps you might draw your attention to PetaJakarta who are experimenting with new responses to crises by harnessing human sensor networks. You might be tempted to consider Airbnb’s Disaster Response programme as an indicator of an emerging alternative infrastructure for disaster response (and perhaps raising questions about the political economy of this all).

And could Bitnation’s Refugee Emergency programme in response to the European refugee crisis be the possible precursor of future solutions for transnational issues – among the development sector’s hardest challenges? Are the business models of One Acre Fund, which provides services for smallholder farmers, or Floodtags, which analyses citizen data during floods for water and disaster managers, an indicator of future pathways to scale – that elusive development unicorn?

If you want to look at the future of procuring solutions for the development sector, should you be looking at initiatives like Citymart, which works with municipalities across the world to rethink traditional procurement and unleash the expertise and innovation capabilities of their citizens? By the same token, projects like Pathogen Box, Poverty Stoplight or Patient Innovation point to a brave new world where lead-user innovation and harnessing ‘sticky’ local knowledge becomes the norm, rather than the exception. You would also be forgiven for thinking that social movements across the world are the place to look for signs of future mechanisms for harnessing collective intelligence – Kawal Pemilu’s “citizen experts” self-organising around the Indonesian elections in 2014 is a textbook case study in this department.

The list could go on and on: welcome to the era of development mutants. While established players in the development sector are engrossed in soul-searching and their fitness for purpose is being scrutinised from all quarters, a whole new set of players is emerging, unfettered by legacy and borrowing from a variety of different disciplines. They point to a potentially different future – indeed, many potentially different futures – for the sector…..

But what if we wanted to invert this paradigm? How could we move from denial to fruitful collaboration with the ‘edgeryders’ of the development sector and accelerate its transformation?

Adopting new programming principles

Based on our experience working with development organisations, we believe that partnering with the mutants involves two types of shifts for traditional players: at the programmatic and the operational level. At the programmatic level, our work on the ground led us to articulate the following emerging principles:

  1. Mapping what people have, not what they need: even though approaches like jugaad and positive deviance have been around for a long time, unfortunately the default starting point for many development projects is still mapping needs, not assets. Inverting this paradigm allows for potentially disruptive project design and partnerships to emerge. (Signs of the future: Patient Innovation, Edgeryders, Community Mirror, Premise)

  2. Getting ready for multiple futures: When distributed across an organisation and not limited to a centralised function, the discipline of scanning the horizon for emergent solutions that contradict the dominant paradigm can help move beyond the denial phase and develop new interfaces to collaborate with the mutants. Here the link between analysis (to understand not only what is probable, but also what is possible) and action is critical – otherwise this remains purely an academic exercise. (Signs of the future: OpenCare, Improstuctures, Seeds of Good Anthropocene, Museum of the Future)

  3. Running multiple parallel experiments: According to Dave Snowden, in order to intervene in a complex system “you need multiple parallel experiments and they should be based on different and competing theories/hypotheses”. Unfortunately, many development projects are still based on linear narratives and assumptions such as “if only we run an awareness raising campaign citizens will change their behaviour”. Turning linear narratives into hypotheses to be tested (without becoming religious on a specific approach) opens up the possibility to explore the solution landscape and collaborate with non-obvious partners that bring new approaches to the table. (Signs of the future: Chukua Hatua, GiveDirectly, Finnish PM’s Office of Experiments, Ideas42, Cognitive Edge)

  4. Embracing obliquity: A deep, granular understanding of local assets and dynamics along with system mapping (see point 5 below) and pairing behavioural experts with development practitioners can help identify entry points for exploring new types of intervention based on obliquity principles. Mutants are often faster in adopting this approach and partnering with them is a way to bypass organisational inertia and explore nonlinear interventions. (Signs of the future: Sardex, social prescriptions, forensic architecture)

  5. From projects to systems: development organisations genuinely interested in developing new partnerships need to make the shift from the project logic to system investments. This involves, among other things, shifting the focus from providing solutions to helping every actor in the system to develop a higher level of consciousness about the issues they are facing and to take better decisions over time. It also entails partnering with mutants to explore entirely new financial mechanisms. (Signs of the future: Lankelly Chase, Indonesia waste banks, Dark Matter Labs)

Adopting new interfaces for working with the mutants

Harvard Business School professor Carliss Baldwin argued that most bureaucracies these days have a ‘non-contractible’ problem: they don’t know where smart people are, or how to evaluate how good they are. Most importantly, most smart people don’t want to work for them because they find them either too callous, unrewarding or slow (or a combination of all of these)….(More)”

Website Seeks to Make Government Data Easier to Sift Through


Steve Lohr at the New York Times: “For years, the federal government, states and some cities have enthusiastically made vast troves of data open to the public. Acres of paper records on demographics, public health, traffic patterns, energy consumption, family incomes and many other topics have been digitized and posted on the web.

This abundance of data can be a gold mine for discovery and insights, but finding the nuggets can be arduous, requiring special skills.

A project coming out of the M.I.T. Media Lab on Monday seeks to ease that challenge and to make the value of government data available to a wider audience. The project, called Data USA, bills itself as “the most comprehensive visualization of U.S. public data.” It is free, and its software code is open source, meaning that developers can build custom applications by adding other data.

Cesar A. Hidalgo, an assistant professor of media arts and sciences at the M.I.T. Media Lab who led the development of Data USA, said the website was devised to “transform data into stories.” Those stories are typically presented as graphics, charts and written summaries….Type “New York” into the Data USA search box, and a drop-down menu presents choices — the city, the metropolitan area, the state and other options. Select the city, and the page displays an aerial shot of Manhattan with three basic statistics: population (8.49 million), median household income ($52,996) and median age (35.8).

Lower on the page are six icons for related subject categories, including economy, demographics and education. If you click on demographics, one of the so-called data stories appears, based largely on data from the American Community Survey of the United States Census Bureau.

Using colorful graphics and short sentences, it shows the median age of foreign-born residents of New York (44.7) and of residents born in the United States (28.6); the most common countries of origin for immigrants (the Dominican Republic, China and Mexico); and the percentage of residents who are American citizens (82.8 percent, compared with a national average of 93 percent).

Data USA features a selection of data results on its home page. They include the gender wage gap in Connecticut; the racial breakdown of poverty in Flint, Mich.; the wages of physicians and surgeons across the United States; and the institutions that award the most computer science degrees….(More)