Open Data as Open Educational Resources: Case studies of emerging practice


Book edited by Javiera Atenas and Leo Havemann: “…is the outcome of a collective effort that has its origins in the 5th Open Knowledge Open Education Working Group call, in which the idea of using Open Data in schools was mentioned. It occurred to us that Open Data and open educational resources seemed almost to exist in separate open worlds.

We decided to seek out evidence in the use of open data as OER, initially by conducting a bibliographical search. As we could not find published evidence, we decided to ask educators if they were, in fact, using open data in this way, and wrote a post for this blog (with Ernesto Priego) explaining our perspective, called The 21st Century’s Raw Material: Using Open Data as Open Educational Resources. We ended the post with a link to an exploratory survey, the results of which indicated a need for more awareness of the existence and potential value of Open Data amongst educators…

…the case studies themselves. They have been provided by scholars and practitioners from different disciplines and countries, and they reflect different approaches to the use of open data. The first case study presents an approach to educating both teachers and students in the use of open data for civil monitoring via Scuola di OpenCoesione in Italy, and has been written by Chiara Ciociola and Luigi Reggi. The second case, by Tim Coughlan from the Open University, UK, showcases practical applications in the use of local and contextualised open data for the development of apps. The third case, written by Katie Shamash, Juan Pablo Alperin & Alessandra Bordini from Simon Fraser University, Canada, demonstrates how publishing students can engage, through data analysis, in very current debates around scholarly communications and be encouraged to publish their own findings. The fourth case, by Alan Dix from Talis and the University of Birmingham, UK, and Geoffrey Ellis from the University of Konstanz, Germany, is unique because the data discussed in this case is self-produced, indeed ‘quantified self’ data, which was used with students as material for class discussion and, separately, as source data for another student’s dissertation project. Finally, the fifth case, presented by Virginia Power from the University of the West of England, UK, examines strategies to develop data and statistical literacies in future librarians and knowledge managers, aiming to support and extend their theoretical understanding of the concept of the ‘knowledge society’ through the use of Open Data….(More)

The book can be downloaded here: Open Data as Open Educational Resources.

Building Trust and Protecting Privacy: Progress on the President’s Precision Medicine Initiative


The White House: “Today, the White House is releasing the Privacy and Trust Principles for the President’s Precision Medicine Initiative (PMI). These principles are a foundation for protecting participant privacy and building trust in activities within PMI.

PMI is a bold new research effort to transform how we characterize health and treat disease. PMI will pioneer a new model of patient-powered research that promises to accelerate biomedical discoveries and provide clinicians with new tools, knowledge, and therapies to select which treatments will work best for which patients. The initiative includes development of a new voluntary research cohort by the National Institutes of Health (NIH), a novel regulatory approach to genomic technologies by the Food and Drug Administration, and new cancer clinical trials by the National Cancer Institute at NIH.  In addition, PMI includes aligned efforts by the Federal government and private sector collaborators to pioneer a new approach for health research and healthcare delivery that prioritizes patient empowerment through access to information and policies that enable safe, effective, and innovative technologies to be tested and made available to the public.

Following President Obama’s launch of PMI in January 2015, the White House Office of Science and Technology Policy worked with an interagency group to develop the Privacy and Trust Principles that will guide the Precision Medicine effort. The White House convened experts from within and outside of government over the course of many months to discuss their individual viewpoints on the unique privacy challenges associated with large-scale health data collection, analysis, and sharing. This group reviewed the bioethics literature, analyzed privacy policies for large biobanks and research cohorts, and released a draft set of Principles for public comment in July 2015…

The Privacy and Trust Principles are organized into 6 broad categories:

  1. Governance that is inclusive, collaborative, and adaptable;
  2. Transparency to participants and the public;
  3. Respecting participant preferences;
  4. Empowering participants through access to information;
  5. Ensuring appropriate data sharing, access, and use;
  6. Maintaining data quality and integrity….(More)”

2015 Digital Cities: Winners Experiment with Forward-Thinking Tech Projects


List of winners at Govtech:

1st place // City of Philadelphia, Pa.

A savvy mix of data-driven citizen engagement, tech modernization and outside-the-box thinking powered Philadelphia to its first-place ranking. A new city website launched last year is designed to provide new levels of user convenience. For instance, three navigation options are squeezed into the top of the site — a search bar, a list of common actions like “report a problem” or “pay a bill,” and a menu of city functions arranged topically — giving citizens multiple ways to find what they need. The site was created using agile principles, launching as a work in progress in December and shaped by user feedback. The city also is broadening its use of open data as a citizen-engagement tool. A new generation of civic apps relies on open data sets to give residents easy access to property tax calculations, property ownership information and detailed maps of various city resources. These improvements in customer-facing services have been facilitated by upgrades to back-end systems that are improving reliability and reducing staff support requirements. The city estimates that half of its IT systems now are procured as a service. Finally, an interesting pilot involving the city IT department and a local middle school is aimed at drawing more kids into STEM-related careers. Students met weekly in the city Innovation Lab for a series of hands-on experiences led by members of the Philadelphia Office of Information Technology.

2nd place // City of Los Angeles, Calif.

Second-ranked Los Angeles is developing a new model for funding innovative ideas, leveraging private-sector platforms to improve services, streamlining internal processes and closing the broadband gap. The city established a $1 million innovation fund late last year to seed pilot projects generated by city employees’ suggestions. More than a dozen projects have been launched so far. Through open APIs, the city trades traffic information with Google’s Waze traffic app. The app consumes city traffic data to warn drivers about closed roads, hazards and dangerous intersections, while the city transportation department uses information submitted by Waze users to identify potholes, illegal road construction and traffic patterns. MyPayLA, launched by the LA Controller’s Office and the city Information Technology Agency, is a mobile app that lets city employees view their payroll information on a mobile device. And the CityLinkLA broadband initiative is designed to attract broadband providers to the city with expedited permitting and access to existing assets like streetlights, real estate and fiber.

2nd place // City of Louisville, Ky.

Louisville’s mobile-friendly Web portal garnered the city a second-place finish in the Center for Digital Government’s Best of the Web awards earlier this year. Now, Louisville has a No. 2 ranking in the 2015 Digital Cities Survey to add to its trophy case. Besides running an excellent website — built on the open source Drupal platform and hosted in the cloud — Louisville is equipping its entire police force with body-worn cameras and expects to be finished by the end of 2015. Video from 1,000 officers, as well as footage from Metro Watch cameras placed around the city, will be stored in the cloud. Louisville’s Metro Police Department, one of 21 cities involved in the White House Police Data Initiative, also became one of the first in the nation to release data sets on assaulted officers, arrests and citations, and hate crimes on the city’s open data portal. In addition, a public-private partnership called Code Louisville offers free technology training to local residents. More than 500 people have taken 12-week classes to learn Web or mobile development skills.

3rd place // City of Kansas City, Mo.

Kansas City’s Art of Data initiative may be one of the nation’s most creative attempts to engage citizens through open data. The city selected 10 local artists earlier this year to turn information from its open data site into visual art. The artists pulled information from 10 different data sets, ranging from life expectancy by ZIP code to citizen satisfaction with the safety of their neighborhoods. The exhibit drew a large crowd when it opened in June, according to the city, and more than 3,000 residents eventually viewed the works of art. Kansas City also was chosen to participate in a new HUD digital inclusion program called ConnectHome, which will offer broadband access, training, digital literacy programs and devices for residents in assisted housing units. And the city is working with a local startup business, RFP365, to simplify its RFP process. Through a pilot partnership, Kansas City will use the RFP365 platform — which lets buyers track and receive bids from vendors and suppliers — to make the government purchasing process easier and more transparent.

3rd place // City of Phoenix, Ariz.

The development of a new citywide transportation plan in Phoenix offers a great example of how to use digital engagement tools. Using the MindMixer platform, the city developed a website to let citizens suggest ideas for new transit services and street infrastructure, as well as discuss a range of transportation-related issues. Using polling, mapping, open-ended questions and discussion prompts, residents directly helped to develop the plan. The engagement process reached more than 3,700 residents and generated hundreds of comments online. In addition, a city-led technology summit held late last year brought together big companies, small businesses and citizens to discuss how technology could improve city operations and boost economic development. And new court technology lets attorneys receive hearing notifications on a mobile device and enables Web and interactive voice response (IVR) payments for a variety of cases.

…(More)”

Politics and the New Machine


Jill Lepore in the New Yorker on “What the turn from polls to data science means for democracy”: “…The modern public-opinion poll has been around since the Great Depression, when the response rate—the number of people who take a survey as a percentage of those who were asked—was more than ninety per cent. The participation rate—the number of people who take a survey as a percentage of the population—is far lower. Election pollsters sample only a minuscule portion of the electorate, not uncommonly something on the order of a couple of thousand people out of the more than two hundred million Americans who are eligible to vote. The promise of this work is that the sample is exquisitely representative. But the lower the response rate the harder and more expensive it becomes to realize that promise, which requires both calling many more people and trying to correct for “non-response bias” by giving greater weight to the answers of people from demographic groups that are less likely to respond. Pollster.com’s Mark Blumenthal has recalled how, in the nineteen-eighties, when the response rate at the firm where he was working had fallen to about sixty per cent, people in his office said, “What will happen when it’s only twenty? We won’t be able to be in business!” A typical response rate is now in the single digits.
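
The weighting Lepore describes is, at bottom, simple arithmetic: respondents from groups that rarely answer are made to count for more. A minimal sketch of that post-stratification idea, with invented numbers, assuming the true population share of each demographic group is known:

```python
# Post-stratification sketch: reweight respondents so each demographic group
# counts in proportion to its share of the electorate. All numbers invented.
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}

# Younger people answer far less often, so they are underrepresented here.
sample_counts = {"18-34": 50, "35-64": 400, "65+": 550}
support = {"18-34": 0.40, "35-64": 0.50, "65+": 0.60}  # share backing a candidate

n = sum(sample_counts.values())

# The unweighted estimate overweights the groups most willing to respond.
unweighted = sum(sample_counts[g] * support[g] for g in support) / n

# Weight each group by (population share) / (sample share).
weights = {g: population_share[g] / (sample_counts[g] / n) for g in sample_counts}
weighted = sum(sample_counts[g] * weights[g] * support[g] for g in support) / n

print(f"unweighted: {unweighted:.3f}, weighted: {weighted:.3f}")  # 0.550 vs 0.490
```

Note how the fifty youngest respondents end up speaking for thirty per cent of the electorate: the lower a group’s response rate, the heavier each answer weighs, and the more fragile and expensive the correction becomes.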

Meanwhile, polls are wielding greater influence over American elections than ever….

Still, data science can’t solve the biggest problem with polling, because that problem is neither methodological nor technological. It’s political. Pollsters rose to prominence by claiming that measuring public opinion is good for democracy. But what if it’s bad?

A “poll” used to mean the top of your head. Ophelia says of Polonius, “His beard as white as snow: All flaxen was his poll.” When voting involved assembling (all in favor of Smith stand here, all in favor of Jones over there), counting votes required counting heads; that is, counting polls. Eventually, a “poll” came to mean the count itself. By the nineteenth century, to vote was to go “to the polls,” where, more and more, voting was done on paper. Ballots were often printed in newspapers: you’d cut one out and bring it with you. With the turn to the secret ballot, beginning in the eighteen-eighties, the government began supplying the ballots, but newspapers kept printing them; they’d use them to conduct their own polls, called “straw polls.” Before the election, you’d cut out your ballot and mail it to the newspaper, which would make a prediction. Political parties conducted straw polls, too. That’s one of the ways the political machine worked….

Ever since Gallup, two things have been called polls: surveys of opinions and forecasts of election results. (Plenty of other surveys, of course, don’t measure opinions but instead concern status and behavior: Do you own a house? Have you seen a doctor in the past month?) It’s not a bad idea to reserve the term “polls” for the kind meant to produce election forecasts. When Gallup started out, he was skeptical about using a survey to forecast an election: “Such a test is by no means perfect, because a preelection survey must not only measure public opinion in respect to candidates but must also predict just what groups of people will actually take the trouble to cast their ballots.” Also, he didn’t think that predicting elections constituted a public good: “While such forecasts provide an interesting and legitimate activity, they probably serve no great social purpose.” Then why do it? Gallup conducted polls only to prove the accuracy of his surveys, there being no other way to demonstrate it. The polls themselves, he thought, were pointless…

If public-opinion polling is the child of a strained marriage between the press and the academy, data science is the child of a rocky marriage between the academy and Silicon Valley. The term “data science” was coined in 1960, one year after the Democratic National Committee hired Simulmatics Corporation, a company founded by Ithiel de Sola Pool, a political scientist from M.I.T., to provide strategic analysis in advance of the upcoming Presidential election. Pool and his team collected punch cards from pollsters who had archived more than sixty polls from the elections of 1952, 1954, 1956, 1958, and 1960, representing more than a hundred thousand interviews, and fed them into a UNIVAC. They then sorted voters into four hundred and eighty possible types (for example, “Eastern, metropolitan, lower-income, white, Catholic, female Democrat”) and sorted issues into fifty-two clusters (for example, foreign aid). Simulmatics’ first task, completed just before the Democratic National Convention, was a study of “the Negro vote in the North.” Its report, which is thought to have influenced the civil-rights paragraphs added to the Party’s platform, concluded that between 1954 and 1956 “a small but significant shift to the Republicans occurred among Northern Negroes, which cost the Democrats about 1 per cent of the total votes in 8 key states.” After the nominating convention, the D.N.C. commissioned Simulmatics to prepare three more reports, including one that involved running simulations about different ways in which Kennedy might discuss his Catholicism….

Data science may well turn out to be as flawed as public-opinion polling. But a stage in the development of any new tool is to imagine that you’ve perfected it, in order to ponder its consequences. I asked Hilton to suppose that there existed a flawless tool for measuring public opinion, accurately and instantly, a tool available to voters and politicians alike. Imagine that you’re a member of Congress, I said, and you’re about to head into the House to vote on an act—let’s call it the Smeadwell-Nutley Act. As you do, you use an app called iThePublic to learn the opinions of your constituents. You oppose Smeadwell-Nutley; your constituents are seventy-nine per cent in favor of it. Your constituents will instantly know how you’ve voted, and many have set up an account with Crowdpac to make automatic campaign donations. If you vote against the proposed legislation, your constituents will stop giving money to your reëlection campaign. If, contrary to your convictions but in line with your iThePublic, you vote for Smeadwell-Nutley, would that be democracy? …(More)”


How Satellite Data and Artificial Intelligence could help us understand poverty better


Maya Craig at Fast Company: “Governments and development organizations currently measure poverty levels by conducting door-to-door surveys. The new partnership will test the use of AI to supplement these surveys and increase the accuracy of poverty data. Orbital said its AI software will analyze satellite images to see if characteristics such as building height and rooftop material can effectively indicate wealth.
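
Neither Orbital nor the World Bank has published the model, but the approach the article describes — extracting building characteristics from imagery and testing whether they predict survey-measured wealth — can be sketched with standard tools. Everything below (feature names, data, model choice) is invented for illustration:

```python
# Hypothetical sketch: do imagery-derived features (building height, rooftop
# material, built-up area) predict a survey-based wealth index? Data are toy.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_villages = 500

# Stand-ins for what an image-analysis pipeline might output per village.
X = np.column_stack([
    rng.gamma(2.0, 2.0, n_villages),   # mean building height (m)
    rng.uniform(0, 1, n_villages),     # share of metal rooftops
    rng.uniform(0, 1, n_villages),     # built-up area fraction
])
# Toy ground truth: a wealth index from door-to-door surveys.
wealth = 0.5 * X[:, 0] + 3.0 * X[:, 1] + rng.normal(0, 1, n_villages)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, wealth, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
# A high out-of-sample R^2 would suggest imagery can stand in for surveys
# between survey rounds; a low one would argue for keeping the door-knocking.
```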

The pilot study will be conducted in Sri Lanka. If the pilot is successful, the World Bank hopes to scale the approach worldwide. A recent study conducted by the organization found that more than 50 countries lack legitimate poverty estimates, which limits the ability of the development community to support the world’s poorest populations.

“Data deprivation is a serious issue, especially in many of the countries where we need it most,” says David Newhouse, senior economist at the World Bank. “This technology has the potential to help us get that data more frequently and at a finer level of detail than is currently possible.”

The announcement is the latest in an emerging industry of AI analysis of satellite photos. A growing number of investors and entrepreneurs are betting that the convergence of these fields will have far-reaching impacts on business, policy, resource management and disaster response.

Wall Street’s biggest hedge-fund businesses have begun using the technology to improve investment strategies. The Pew Charitable Trusts employs the method to monitor oceans for illegal fishing activities. And startups like San Francisco-based Mavrx use similar analytics to optimize crop harvests.

The commercial earth-imaging satellite market, valued at $2.7 billion in 2014, is predicted to grow by 14% each year through the decade, according to a recent report.

As recently as two years ago, there were just four commercial earth-imaging satellites operating in the U.S., and government contracts accounted for about 70% of imagery sales. By 2020, there will be hundreds of private-sector “smallsats” in orbit capturing imagery that will be easily accessible online. Companies like Skybox Imaging and Planet Labs have the first of these smallsats already active, with plans for more.

The images generated by these companies will be among the world’s largest data sets. And recent breakthroughs in AI research have made it possible to analyze these images to inform decision-making…(More)”

How smartphones are solving one of China’s biggest mysteries


Ana Swanson at the Washington Post: “For decades, China has been engaged in a building boom of a scale that is hard to wrap your mind around. In the last three decades, 260 million people have moved from the countryside to Chinese cities — equivalent to around 80 percent of the population of the U.S. To make room for all of those people, the size of China’s built-up urban areas nearly quintupled between 1984 and 2010.

Much of that development has benefited people’s lives, but some has not. In a breathless rush to boost growth and development, some urban areas have built vast, unused real estate projects — China’s infamous “ghost cities.” These eerie, shining developments are complete except for one thing: people to live in them.

China’s ghost cities have sparked a lot of debate over the last few years. Some argue that the developments are evidence of the waste in top-down planning, or the result of too much cheap funding for businesses. Some blame the lack of other good places for average people to invest their money, or the desire of local officials to make a quick buck — land sales generate a lot of revenue for China’s local governments.

Others say the idea of ghost cities has been overblown. They espouse a “build it and they will come” philosophy, pointing out that, with time, some ghost cities fill up and turn into vibrant communities.

It’s been hard to evaluate these claims, since most of the research on ghost cities has been anecdotal. Even the most rigorous research methods leave a lot to be desired — for example, investment research firms sending poor junior employees out to remote locations to count how many lights are turned on in buildings at night.

Now new research from Baidu, one of China’s biggest technology companies, provides one of the first systematic looks at Chinese ghost cities. Researchers from Baidu’s Big Data Lab and Peking University in Beijing used the kind of location data gathered by mobile phones and GPS receivers to track how people moved in and out of suspected ghost cities, in real time and on a national scale, over a period of six months. You can see the interactive project here.

Google has been blocked in China for years, and Baidu dominates the market in terms of search, mobile maps and other offerings. That gave the researchers a huge database to work with — 770 million users, a hefty chunk of China’s 1.36 billion people.

To identify potential ghost cities, the researchers created an algorithm that identifies urban areas with a relatively sparse population. They define a ghost city as an urban region with a population density of fewer than 5,000 people per square kilometer – about half the density recommended by the Chinese Ministry of Housing and Urban-Rural Development….(More)”
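
The article doesn’t give Baidu’s full method, but the density test it describes is easy to state precisely: aggregate each user’s inferred home location into grid cells, then flag built-up cells that fall below the 5,000-residents-per-square-kilometer line. A simplified sketch, in which the grid scheme, helper names, and sample points are all assumptions:

```python
# Simplified sketch of the density test: bin inferred home locations into
# 1 km x 1 km cells and flag built-up cells below 5,000 residents per km^2.
from collections import Counter

THRESHOLD = 5000  # residents per km^2, per the paper's ghost-city definition
CELL_KM = 1.0     # cell side length, so each cell covers 1 km^2

def cell_of(lat: float, lon: float) -> tuple[int, int]:
    # ~111 km per degree of latitude; crude, ignores longitude convergence.
    return (int(lat * 111 / CELL_KM), int(lon * 111 / CELL_KM))

def ghost_cells(home_locations, urban_cells):
    """home_locations: one inferred (lat, lon) home per user; urban_cells:
    cells known to be built-up from land-use data. Returns sparse cells."""
    residents = Counter(cell_of(lat, lon) for lat, lon in home_locations)
    return {c for c in urban_cells if residents[c] / CELL_KM**2 < THRESHOLD}

# Toy usage: one cell with 6,000 inferred residents, one with none.
homes = [(39.90, 116.40)] * 6000
urban = {cell_of(39.90, 116.40), cell_of(31.23, 121.47)}
print(ghost_cells(homes, urban))  # only the empty built-up cell is flagged
```

At Baidu’s scale the same test runs over hundreds of millions of users — which is what lets it replace sending junior analysts out to count lit windows.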

New traffic app and disaster prevention technology road tested


Phys.org: “A new smartphone traffic app tested by citizens in Dublin, Ireland allows users to give feedback on traffic incidents, enabling traffic management centres to respond quicker when collisions and other incidents happen around the city. The ‘CrowdAlert’ app, which is now available for download, is one of the key components utilised in the EU-funded INSIGHT project and a good example of how smartphones and social networks can be harnessed to improve public services and safety.

‘We are witnessing an explosion in the quantity, quality, and variety of available information, fuelled in large part by advances in sensor networking, the availability of low-cost sensor-enabled devices and by the widespread adoption of powerful smart-phones,’ explains project coordinator Professor Dimitrios Gunopulos from the National and Kapodistrian University of Athens. ‘These revolutionary technologies are driving the development and adoption of applications where mobile devices are used for continuous data sensing and analysis.’

The project also developed a novel citywide real-time traffic monitoring tool, the ‘INSIGHT System’, which was tested in real conditions in the Dublin City control room, along with nationwide disaster monitoring technologies. The INSIGHT system was shown to provide early warnings to experts at situation centres, enabling them to monitor situations in real-time, including disasters with potentially nation-wide impacts such as severe weather conditions, floods and subsequent knock-on events such as fires and power outages.

The project’s results will be of interest to public services, which have until now lacked the necessary infrastructure for handling and integrating miscellaneous data streams, including data from static and mobile sensors as well as information coming from social network sources, in real-time. Providing cities with the ability to manage emergency situations with enhanced capabilities will also open up new markets for network technologies….(More)”

Good Governance by All Means


At Huffington Post: “Citizens today have higher expectations and demand effective solutions to everyday issues and challenges. From climate change to expedient postal services, governments are required to act with transparency and diligence. Public accountability demands that we, public servants, act with almost no margin of error, using the most open and transparent means available to achieve our goals. The name of the game is simple: government efforts should focus on building stronger, better and healthier relationships with civil society. Nobody should be left behind when tailoring public policy. For the Mexican Government, it is crystal clear that such an endeavor is no longer the State’s monopoly, and thus that there is a pressing need for governments to use smarter and more efficient toolboxes, such as the one that the Open Government Partnership (OGP) provides. The buzzword is good governance by all means.

The High Level Segment of the 70th Session of the United Nations General Assembly was a milestone for the open government community. It allowed the 13 countries taking part in the OGP Steering Committee and several civil society organizations to endorse the Joint Declaration: Open Government for the Implementation of the 2030 Agenda for Sustainable Development. This declaration highlights the paramount importance of promoting the principles of open government (transparency, accountability, citizen participation and innovation) as key enablers of the Sustainable Development Goals. The Declaration particularly embraces Agenda 2030’s Goal 16 as a common target for all 66 OGP member countries. Our common goal is to continue building stronger institutions while weaving peaceful and inclusive societies. Our meeting in New York also allowed us to work with key players to develop the Open Data Charter, which recognizes the value of having timely, comprehensive, accessible, and comparable data for the promotion of greater citizen engagement, triggering development and innovation….(More)

Advancing Open and Citizen-Centered Government


The White House: “Today, the United States released our third Open Government National Action Plan, announcing more than 40 new or expanded initiatives to advance the President’s commitment to an open and citizen-centered government….In the third Open Government National Action Plan, the Administration both broadens and deepens efforts to help government become more open and more citizen-centered. The plan includes new and impactful steps the Administration is taking to openly and collaboratively deliver government services and to support open government efforts across the country. These efforts prioritize a citizen-centric approach to government, including improved access to publicly available data to provide everyday Americans with the knowledge and tools necessary to make informed decisions.

One example is the College Scorecard, which shares data through application programming interfaces (APIs) to help students and families make informed choices about education. Open APIs help create an ecosystem around government data in which civil society can provide useful visual tools that make this data more accessible, and commercial developers can extract even more value to further empower students and their families. In addition to these newer approaches, the plan also highlights significant longstanding open government priorities such as access to information, fiscal transparency, and records management, and continues to push for greater progress in that work.
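
To make the API point concrete, here is a minimal sketch of the kind of query a developer could run against the Scorecard (the endpoint follows api.data.gov conventions; the field names shown are illustrative of the API’s dotted naming scheme and should be checked against the current documentation):

```python
# Minimal sketch of a College Scorecard API query. Requires a free
# api.data.gov key; verify field names against the current documentation.
import requests

API_KEY = "YOUR_API_DATA_GOV_KEY"
URL = "https://api.data.gov/ed/collegescorecard/v1/schools"

params = {
    "api_key": API_KEY,
    "school.state": "PA",  # filter: schools in Pennsylvania
    "fields": "school.name,latest.cost.tuition.in_state",
    "per_page": 5,
}
resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
for school in resp.json()["results"]:
    print(school["school.name"], school.get("latest.cost.tuition.in_state"))
```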

The plan also focuses on supporting implementation of the landmark 2030 Agenda for Sustainable Development, which sets out a vision and priorities for global development over the next 15 years and was adopted last month by 193 world leaders including President Obama. The plan includes commitments to harness open government and progress toward the Sustainable Development Goals (SDGs) both in the United States and globally, including in the areas of education, health, food security, climate resilience, science and innovation, and justice and law enforcement. It also includes a commitment to take stock of existing U.S. government data that relates to the 17 SDGs, and to create and use data to support progress toward the SDGs.

Some examples of open government efforts newly included in the plan:

  • Promoting employment by unlocking workforce data, including training, skill, job, and wage listings.
  • Enhancing transparency and participation by expanding available Federal services to the Open311 platform currently available to cities, giving the public a seamless way to report problems and request assistance (see the API sketch after this list).
  • Releasing public information from the electronically filed tax forms of nonprofit and charitable organizations (990 forms) as open, machine-readable data.
  • Expanding access to justice through the White House Legal Aid Interagency Roundtable.
  • Promoting open and accountable implementation of the Sustainable Development Goals….(More)”
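
Open311’s GeoReport v2 specification is what would make that “seamless way to report problems” portable: the same two calls work in any city that deploys the standard. A sketch of the pattern, with a placeholder base URL and key (each city publishes its own endpoint and service codes):

```python
# Sketch of the Open311 GeoReport v2 pattern: discover a city's service
# types, then submit a request against one. URL and key are placeholders.
import requests

BASE = "https://example-city.gov/open311/v2"  # placeholder endpoint

# 1. What does this city accept reports about?
services = requests.get(f"{BASE}/services.json", timeout=30).json()
for s in services[:3]:
    print(s["service_code"], "-", s["service_name"])

# 2. Report a problem against one of those service codes.
report = {
    "service_code": "POTHOLE",  # taken from the services list above
    "lat": 38.8977, "long": -77.0365,
    "description": "Deep pothole in the curb lane.",
    "api_key": "YOUR_CITY_ISSUED_KEY",
}
resp = requests.post(f"{BASE}/requests.json", data=report, timeout=30)
print(resp.json())  # per the spec, returns the new request's id or a token
```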

Can Mobile Phone Surveys Identify People’s Development Priorities?


Ben Leo and Robert Morello at the Center for Global Development: “Mobile phone surveys are fast, flexible, and cheap. But, can they be used to engage citizens on how billions of dollars in donor and government resources are spent? Over the last decade, donor governments and multilateral organizations have repeatedly committed to support local priorities and programs. Yet, how are they supposed to identify these priorities on a timely, regular basis? Consistent discussions with the local government are clearly essential, but so are feeding ordinary people’s views into those discussions. However, traditional tools, such as household surveys or consultative roundtables, present a range of challenges for high-frequency citizen engagement. That’s where mobile phone surveys could come in, enabled by the exponential rise in mobile coverage throughout the developing world.

Despite this potential, there have been only a handful of studies into whether mobile surveys are a reliable and representative tool across a broad range of developing-country contexts. Moreover, there have been almost none that specifically look at collecting information about people’s development priorities. Along with Tiago Peixoto, Steve Davenport, and Jonathan Mellon, who focus on promoting citizen engagement and open government practices at the World Bank, we sought to address this policy research gap. Through a study focused on four low-income countries (Afghanistan, Ethiopia, Mozambique, and Zimbabwe), we rigorously tested the feasibility of interactive voice response (IVR) surveys for gauging citizens’ development priorities.

Specifically, we wanted to know whether respondents’ answers are sensitive to a range of different factors, such as (i) the specified executing actor (national government or external partners); (ii) time horizons; or (iii) question formats. In other words, can we be sufficiently confident that surveys about people’s priorities can be applied more generally to a range of development actors and across a range of country contexts?
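
Operationally, that sensitivity test is a randomized framing experiment: each caller hears one randomly assigned variant of the question, and answer distributions are compared across variants. A toy sketch of the assignment logic (the variants, answer options, and uniform simulated answers are all invented):

```python
# Toy sketch of the framing experiment: randomly assign each IVR respondent
# one question variant, then compare priority rankings across variants.
# A real analysis would test whether the distributions differ significantly.
import random
from collections import Counter

VARIANTS = [
    "What should the national government focus on?",
    "What should foreign donors focus on?",
]
PRIORITIES = ["jobs", "health", "education", "roads", "security"]

random.seed(1)
answers = {v: Counter() for v in VARIANTS}
for _ in range(2000):                  # simulated respondents
    variant = random.choice(VARIANTS)  # random assignment at call time
    answers[variant][random.choice(PRIORITIES)] += 1

for v, counts in answers.items():
    print(v, "->", [p for p, _ in counts.most_common(3)])
```

If the rankings coincide across variants — as the study’s hypothesis predicts — the same survey can speak for governments and donors alike.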

Several of these potential sensitivity concerns were raised in response to an earlier CGD working paper, which found that US foreign aid is only modestly aligned with Africans’ and Latin Americans’ most pressing concerns. This analysis relied upon Afrobarometer and Latinobarometro survey data (see explanatory note below). For instance, some argued that people’s priorities for their own government might be far less relevant for donor organizations. Put differently, the World Bank or USAID shouldn’t prioritize job creation in Nigeria simply because ordinary Nigerians cite it as a pressing government priority. Our hypothesis was that development priorities would likely transcend all development actors, and possibly different timeframes and question formats as well. But, we first needed to test these assumptions.

So, what did we find? We’ve included some of the key highlights below. For a more detailed description of the study and the underlying analysis, please see our new working paper. Along with our World Bank colleagues, we also published an accompanying paper that considers a range of survey method issues, including survey representativeness….(More)”