Crowdsourcing Ideas to Accelerate Economic Growth and Prosperity through a Strategy for American Innovation


Jason Miller and Tom Kalil at the White House Blog: “America’s future economic growth and international competitiveness depend crucially on our capacity to innovate. Creating the jobs and industries of the future will require making the right investments to unleash the unmatched creativity and imagination of the American people.
We want to gather bold ideas for how we as a nation can build on and extend into the future our historic strengths in innovation and discovery. Today we are calling on thinkers, doers, and entrepreneurs across the country to submit their proposals for promising new initiatives or pressing needs for renewed investment to be included in next year’s updated Strategy for American Innovation.
What will the next Strategy for American Innovation accomplish? In part, it’s up to you. Your input will help guide the Administration’s efforts to catalyze the transformative innovation in products, processes, and services that is the hallmark of American ingenuity.
Today, we released a set of questions for your comment, which you can access here and on Quora – an online platform that allows us to crowdsource ideas from the American people.

  Calling on America’s Inventors and Innovators for Great Ideas
Among the questions we are posing today to innovators across the country are:

  • What specific policies or initiatives should the Administration consider prioritizing in the next version of the Strategy for American Innovation?
  • What are the biggest challenges to, and opportunities for, innovation in the United States that will generate long-term economic growth and rising standards of living for more Americans?
  • What additional opportunities exist to develop high-impact platform technologies that reduce the time and cost associated with the “design, build, test” cycle for important classes of materials, products, and systems?
  • What investments, strategies, or technological advancements, across both the public and private sectors, are needed to rebuild the U.S. “industrial commons” (i.e., regional manufacturing capabilities) and ensure the latest technologies can be produced here?
  • What partnerships or novel models for collaboration between the Federal Government and regions should the Administration consider in order to promote innovation and the development of regional innovation ecosystems?

 
In today’s world of rapidly evolving technology, the Administration is adapting its approach to innovation-driven economic growth to reflect the emergence of new and exciting possibilities. Now is the time to gather input from the American people in order to envision and shape the innovations of the future. The full Request for Information can be found here and the 2011 Strategy for American Innovation can be found here. Comments are due by September 23, 2014, and can be sent to innovationstrategy@ostp.gov.  We look forward to hearing your ideas!”

Unleashing Climate Data to Empower America’s Agricultural Sector


Secretary Tom Vilsack and John P. Holdren at the White House Blog: “Today, in a major step to advance the President’s Climate Data Initiative, the Obama administration is inviting leaders of the technology and agricultural sectors to the White House to discuss new collaborative steps to unleash data that will help ensure our food system is resilient to the effects of climate change.

More intense heat waves, heavier downpours, and severe droughts and wildfires out west are already affecting the nation’s ability to produce and transport safe food. The recently released National Climate Assessment makes clear that these kinds of impacts are projected to become more severe over this century.

Food distributors, agricultural businesses, farmers, and retailers need accessible, useable data, tools, and information to ensure the effectiveness and sustainability of their operations – from water availability, to timing of planting and harvest, to storage practices, and more.

Today’s convening at the White House will include formal commitments by a host of private-sector companies and nongovernmental organizations to support the President’s Climate Data Initiative by harnessing climate data in ways that will increase the resilience of America’s food system and help reduce the contribution of the nation’s agricultural sector to climate change.

Microsoft Research, for instance, will grant 12 months of free cloud-computing resources to winners of a national challenge to create a smartphone app that helps farmers increase the resilience of their food production systems in the face of weather variability and climate change; the Michigan Agri-Business Association will soon launch a publicly available web-based mapping tool for use by the state’s agriculture sector; and the U.S. dairy industry will test and pilot four new modules – energy, feed, nutrient, and herd management – on the data-driven Farm Smart environmental-footprint calculation tool by the end of 2014. These are just a few among dozens of exciting commitments.

And the federal government is also stepping up. Today, anyone can log onto climate.data.gov and find new features that make data about the risks of climate change to food production, delivery, and nutrition accessible and usable – including current and historical data from the Census of Agriculture on production, supply, and distribution of agricultural products, and data on climate-change-related risks such as storms, heat waves, and drought.

These steps are a direct response to the President’s call for all hands on deck to generate further innovation to help prepare America’s communities and businesses for the impacts of climate change.

We are delighted about the steps being announced by dozens of collaborators today, and we can’t wait to see what further tools, apps, and services are developed as the Administration and its partners continue to unleash data to make America’s agriculture enterprise stronger and more resilient than ever before.

Read a fact sheet about all of today’s Climate Data Initiative commitments here.”

Sharing Data Is a Form of Corporate Philanthropy


Matt Stempeck in HBR Blog:  “Ever since the International Charter on Space and Major Disasters was signed in 1999, satellite companies like DMC International Imaging have had a clear protocol with which to provide valuable imagery to public actors in times of crisis. In a single week this February, DMCii tasked its fleet of satellites on flooding in the United Kingdom, fires in India, floods in Zimbabwe, and snow in South Korea. Official crisis response departments and relevant UN departments can request on-demand access to the visuals captured by these “eyes in the sky” to better assess damage and coordinate relief efforts.

DMCii is a private company, yet it provides enormous value to the public and social sectors simply by periodically sharing its data.
Back on Earth, companies create, collect, and mine data in their day-to-day business. This data has quickly emerged as one of this century’s most vital assets. Public sector and social good organizations may not have access to the same amount, quality, or frequency of data. This imbalance has inspired a new category of corporate giving foreshadowed by the 1999 Space Charter: data philanthropy.
The satellite imagery example is an area of obvious societal value, but data philanthropy holds even stronger potential closer to home, where a wide range of private companies could give back in meaningful ways by contributing data to public actors. Consider two promising contexts for data philanthropy: responsive cities and academic research.
The centralized institutions of the 20th century allowed for the most sophisticated economic and urban planning to date. But in recent decades, the information revolution has helped the private sector speed ahead in data aggregation, analysis, and applications. It’s well known that there’s enormous value in real-time usage of data in the private sector, but there are similarly huge gains to be won in the application of real-time data to mitigate common challenges.
What if sharing economy companies shared their real-time housing, transit, and economic data with city governments or public interest groups? For example, Uber maintains a “God’s Eye view” of every driver on the road in a city.
Imagine combining this single data feed with an entire portfolio of real-time information. An early leader in this space is the City of Chicago’s urban data dashboard, WindyGrid. The dashboard aggregates an ever-growing variety of public datasets to allow for more intelligent urban management.
Over time, we could design responsive cities that react to this data. A responsive city is one where services, infrastructure, and even policies can flexibly respond to the rhythms of its denizens in real-time. Private sector data contributions could greatly accelerate these nascent efforts.
Data philanthropy could similarly benefit academia. Access to data remains an unfortunate barrier to entry for many researchers. The result is that only researchers with access to certain data, such as full-volume social media streams, can analyze and produce knowledge from this compelling information. Twitter, for example, sells access to a range of real-time APIs to marketing platforms, but the price point often exceeds researchers’ budgets. To accelerate the pursuit of knowledge, Twitter has piloted a program called Data Grants offering access to segments of their real-time global trove to select groups of researchers. With this program, academics and other researchers can apply to receive access to relevant bulk data downloads, such as a period of time before and after an election, or a certain geographic area.
Humanitarian response, urban planning, and academia are just three sectors in which private data can be donated to improve the public condition. There are many more possible applications, but few examples to date. For companies looking to expand their corporate social responsibility initiatives, sharing data should be part of the conversation…
Companies considering data philanthropy can take the following steps:

  • Inventory the information your company produces, collects, and analyzes. Consider which data would be easy to share and which data will require long-term effort.
  • Think who could benefit from this information. Who in your community doesn’t have access to this information?
  • Who could be harmed by the release of this data? If the datasets are about people, have they consented to their release? (i.e. don’t pull a Facebook emotional manipulation experiment).
  • Begin conversations with relevant public agencies and nonprofit partners to get a sense of the sort of information they might find valuable and their capacity to work with the formats you might eventually make available.
  • If you expect an onslaught of interest, an application process can help qualify partnership opportunities to maximize positive impact relative to time invested in the program.
  • Consider how you’ll handle distribution of the data to partners. Even if you don’t have the resources to set up an API, regular releases of bulk data could still provide enormous value to organizations used to relying on less-frequently updated government indices.
  • Consider your needs regarding privacy and anonymization. Strip the data of anything remotely resembling personally identifiable information (here are some guidelines).
  • If you’re making data available to researchers, plan to allow researchers to publish their results without obstruction. You might also require them to share the findings with the world under Open Access terms….”
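The anonymization step in the checklist above can be sketched in code. This is a minimal illustration in Python; the column names and the `PII_COLUMNS` list are hypothetical, and a real release should be driven by a privacy review and guidelines like those linked above, not a hard-coded set:

```python
import csv
import hashlib

# Columns treated as personally identifiable -- a hypothetical list for
# illustration; a real pipeline would come from a formal privacy review.
PII_COLUMNS = {"name", "email", "phone", "address"}

def anonymize_row(row):
    """Drop direct identifiers and replace the user id with a one-way pseudonym.

    Note: a truncated hash is illustrative only, not a privacy guarantee --
    linkage attacks on the remaining fields are still possible.
    """
    cleaned = {k: v for k, v in row.items() if k.lower() not in PII_COLUMNS}
    if "user_id" in cleaned:
        cleaned["user_id"] = hashlib.sha256(cleaned["user_id"].encode()).hexdigest()[:12]
    return cleaned

def anonymize_csv(src_path, dst_path):
    """Produce a bulk-release copy of a CSV with PII columns stripped."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        rows = [anonymize_row(r) for r in reader]
        writer = csv.DictWriter(dst, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```

Even a simple bulk-release script like this covers the last two checklist items: regular CSV drops for partners, with identifiers removed before anything leaves the building.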

How Thousands Of Dutch Civil Servants Built A Virtual 'Government Square' For Online Collaboration


Federico Guerrini at Forbes: “Democracy needs a reboot, or as the founders of Democracy Os, an open source platform for political debate say, “a serious upgrade”. They are not alone in trying to change the way citizens and governments communicate with each other. Not long ago, I covered on this blog a Greek platform, VouliWatch, which aims at boosting civic engagement following the model of other similar initiatives in countries like Germany, France and Austria, all running thanks to a software called Parliament Watch.
Other decision-making tools used by activists and organizations trying to reduce the distance between the people and their representatives include Liquid Feedback and Airesis. But the quest for disintermediation isn’t limited to the relationship between governments and citizens: it’s changing the way public organisations work internally as well. Civil servants are starting to develop and use their own internal “social networks” to exchange ideas, discuss issues, and collaborate on projects.
One such experiment is under way in the Netherlands: thousands of civil servants from across government organizations have built their own “intranet” using Pleio (“government square” in Dutch), a platform that runs on the open source networking engine Elgg.
It all started in 2010, thanks to the work of a group of four founders, Davied van Berlo, Harrie Custers, Wim Essers and Marcel Ziemerink. Growth has been steady, and now Pleio can count on some 75,000 users spread across about 800 subsites. The nice thing about the platform is that it is modular: subscribers can collaborate in a group and then start a subgroup to go into more depth with a smaller team. To learn a little more about this unique experience, I reached out to van Berlo, who kindly answered a few questions. Check the interview below.
Where did the Pleio idea come from? Were you inspired by other experiences?

The idea came mainly from the developments around us: the whole web 2.0 movement at the time. This has shown us the power of platforms to connect people, bring them together and let them cooperate. I noticed that civil servants were looking for ways of collaborating across organisational borders and many were using the new online tools. That’s why I started the Civil Servant 2.0 network, so they could exchange ideas and experiences in this new way of working.
However, these tools are not always the ideal solution. They’re commercial for one, which can get in the way of the public goals we work for. They’re often American, where other laws and practices apply. You can’t change them or add to them. Usually you have to get another tool (and login) for different functionalities. And they were outright forbidden by some government agencies. I noticed there was a need for a platform where different tools were integrated, where people from different organisations and outside government could work together and where all information would remain in the Netherlands and in the hands of the original owner. Since there was no such platform we started one of our own….”

Open Data for economic growth: the latest evidence


Andrew Stott at the World Bank Open Data blog: “One of the key policy drivers for Open Data has been to drive economic growth and business innovation. There’s a growing amount of evidence and analysis not only for the total potential economic benefit but also for some of the ways in which this is coming about. This evidence is summarised and reviewed in a new World Bank paper published today.
There’s a range of studies that suggest that the potential prize from Open Data could be enormous – including an estimate of $3-5 trillion a year globally from McKinsey Global Institute and an estimate of $13 trillion cumulative over the next 5 years in the G20 countries.  There are supporting studies of the value of Open Data to certain sectors in certain countries – for instance $20 billion a year to Agriculture in the US – and of the value of key datasets such as geospatial data.  All these support the conclusion that the economic potential is at least significant – although with a range from “significant” to “extremely significant”!
At least some of this benefit is already being realised by new companies that have sprung up to deliver new, innovative, data-rich services and by older companies improving their efficiency by using open data to optimise their operations. Five main business archetypes have been identified – suppliers, aggregators, enrichers, application developers and enablers. What’s more, there are at least four companies which did not exist ten years ago, which are driven by Open Data, and which are each now valued at around $1 billion or more. Somewhat surprisingly, the drive to exploit Open Data is coming from outside the traditional “ICT sector” – although the ICT sector is supplying many of the tools required.
It’s also becoming clear that if countries want to maximise their gain from Open Data the role of government needs to go beyond simply publishing some data on a website. Governments need to be:

  • Suppliers – of the data that business need
  • Leaders – making sure that municipalities, state owned enterprises and public services operated by the private sector also release important data
  • Catalysts – nurturing a thriving ecosystem of data users, coders and application developers and incubating new, data-driven businesses
  • Users – using Open Data themselves to overcome the barriers to using data within government and innovating new ways to use the data they collect to improve public services and government efficiency.

Nevertheless, most of the evidence for big economic benefits for Open Data comes from the developed world. So on Wednesday the World Bank is holding an open seminar to examine critically “Can Open Data Boost Economic Growth and Prosperity” in developing countries. Please join us and join the debate!

New Commerce Department report explores huge benefits, low cost of government data


Mark Doms, Under Secretary for Economic Affairs, in a blog: “Today we are pleased to roll out an important new Commerce Department report on government data. “Fostering Innovation, Creating Jobs, Driving Better Decisions: The Value of Government Data,” arrives as our society increasingly focuses on how the intelligent use of data can make our businesses more competitive, our governments smarter, and our citizens better informed.

And when it comes to data, as the Under Secretary for Economic Affairs, I have a special appreciation for the Commerce Department’s two preeminent statistical agencies, the Census Bureau and the Bureau of Economic Analysis. These agencies inform us on how our $17 trillion economy is evolving and how our population (318 million and counting) is changing, data critical to our country. Although “Big Data” is all the rage these days, the government has been in this business for a long time: the first Decennial Census was in 1790, gathering information on close to four million people, a huge dataset for its day, and not too shabby by today’s standards as well.

Just how valuable is the data we provide? Our report seeks to answer this question by exploring the range of federal statistics and how they are applied in decision-making. Examples of our data include gross domestic product, employment, consumer prices, corporate profits, retail sales, agricultural supply and demand, population, international trade and much more.

Clearly, as shown in the report, the value of this information to our society far exceeds its cost – and not just because the price tag is shockingly low: three cents, per person, per day. Federal statistics guide trillions of dollars in annual investments at an average annual cost of $3.7 billion: just 0.02 percent of our $17 trillion economy covers the massive amount of data collection, processing and dissemination. With a statistical system that is comprehensive, consistent, confidential, relevant and accessible, the federal government is uniquely positioned to provide a wide range of statistics that complement the vast and growing sources of private sector data.

Our federally collected information is frequently “invisible,” because attribution is not required. But it flows daily into myriad commercial products and services. Today’s report identifies the industries that intensively use our data and provides a rough estimate of the size of this sector. The lower-bound estimate suggests government statistics help private firms generate revenues of at least $24 billion annually – more than six times what we spend for the data. The upper-bound estimate suggests annual revenues of $221 billion!

This report takes a first crack at putting an actual dollars and cents value to government data. We’ve learned a lot from this initial study, and look forward to homing in even further on that figure in our next report.”

Eigenmorality


Blog from Scott Aaronson: “This post is about an idea I had around 1997, when I was 16 years old and a freshman computer-science major at Cornell.  Back then, I was extremely impressed by a research project called CLEVER, which one of my professors, Jon Kleinberg, had led while working at IBM Almaden.  The idea was to use the link structure of the web itself to rank which web pages were most important, and therefore which ones should be returned first in a search query.  Specifically, Kleinberg defined “hubs” as pages that linked to lots of “authorities,” and “authorities” as pages that were linked to by lots of “hubs.”  At first glance, this definition seems hopelessly circular, but Kleinberg observed that one can break the circularity by just treating the World Wide Web as a giant directed graph, and doing some linear algebra on its adjacency matrix.  Equivalently, you can imagine an iterative process where each web page starts out with the same hub/authority “starting credits,” but then in each round, the pages distribute their credits among their neighbors, so that the most popular pages get more credits, which they can then, in turn, distribute to their neighbors by linking to them.
I was also impressed by a similar research project called PageRank, which was proposed later by two guys at Stanford named Sergey Brin and Larry Page.  Brin and Page dispensed with Kleinberg’s bipartite hubs-and-authorities structure in favor of a more uniform structure, and made some other changes, but otherwise their idea was very similar.  At the time, of course, I didn’t know that CLEVER was going to languish at IBM, while PageRank (renamed Google) was going to expand to roughly the size of the entire world’s economy.
In any case, the question I asked myself about CLEVER/PageRank was not the one that, maybe in retrospect, I should have asked: namely, “how can I leverage the fact that I know the importance of this idea before most people do, in order to make millions of dollars?”
Instead I asked myself: “what other ‘vicious circles’ in science and philosophy could one unravel using the same linear-algebra trick that CLEVER and PageRank exploit?”  After all, CLEVER and PageRank were both founded on what looked like a hopelessly circular intuition: “a web page is important if other important web pages link to it.”  Yet they both managed to use math to defeat the circularity.  All you had to do was find an “importance equilibrium,” in which your assignment of “importance” to each web page was stable under a certain linear map.  And such an equilibrium could be shown to exist—indeed, to exist uniquely.
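The iterative process described above is compact enough to sketch in a few lines. Here is a toy version of the hubs-and-authorities iteration on a made-up four-page graph (an illustrative sketch, not the actual CLEVER code):

```python
import numpy as np

# Adjacency matrix of a tiny directed web graph: A[i, j] = 1 if page i links to page j.
# Page 2 is linked to by pages 0, 1, and 3, so it should emerge as the top authority.
A = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

# Everyone starts with the same hub/authority "starting credits"; iterate to equilibrium.
hubs = np.ones(4)
auths = np.ones(4)
for _ in range(100):
    auths = A.T @ hubs   # a page is an authority if many hubs link to it
    hubs = A @ auths     # a page is a hub if it links to many authorities
    auths /= np.linalg.norm(auths)  # renormalize so credits stay finite
    hubs /= np.linalg.norm(hubs)

# The fixed point is the principal eigenvector of A^T A (authorities) and A A^T (hubs).
```

The normalization is what makes the "importance equilibrium" well defined: power iteration converges to the unique principal eigenvector, exactly the linear-algebra escape from circularity described above.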
Searching for other circular notions to elucidate using linear algebra, I hit on morality.  Philosophers from Socrates on, I was vaguely aware, had struggled to define what makes a person “moral” or “virtuous,” without tacitly presupposing the answer.  Well, it seemed to me that, as a first attempt, one could do a lot worse than the following:

A moral person is someone who cooperates with other moral people, and who refuses to cooperate with immoral people.

Obviously one can quibble with this definition on numerous grounds: for example, what exactly does it mean to “cooperate,” and which other people are relevant here?  If you don’t donate money to starving children in Africa, have you implicitly “refused to cooperate” with them?  What’s the relative importance of cooperating with good people and withholding cooperation with bad people, of kindness and justice?  Is there a duty not to cooperate with bad people, or merely the lack of a duty to cooperate with them?  Should we consider intent, or only outcomes?  Surely we shouldn’t hold someone accountable for sheltering a burglar, if they didn’t know about the burgling?  Also, should we compute your “total morality” by simply summing over your interactions with everyone else in your community?  If so, then can a career’s worth of lifesaving surgeries numerically overwhelm the badness of murdering a single child?
For now, I want you to set all of these important questions aside, and just focus on the fact that the definition doesn’t even seem to work on its own terms, because of circularity.  How can we possibly know which people are moral (and hence worthy of our cooperation), and which ones immoral (and hence unworthy), without presupposing the very thing that we seek to define?
Ah, I thought—this is precisely where linear algebra can come to the rescue!  Just like in CLEVER or PageRank, we can begin by giving everyone in the community an equal number of “morality starting credits.”  Then we can apply an iterative update rule, where each person A can gain morality credits by cooperating with each other person B, and A gains more credits the more credits B has already.  We apply the rule over and over, until the number of morality credits per person converges to an equilibrium.  (Or, of course, we can shortcut the process by simply finding the principal eigenvector of the “cooperation matrix,” using whatever algorithm we like.)  We then have our objective measure of morality for each individual, solving a 2400-year-old open problem in philosophy….”
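The equilibrium computation itself is only a few lines. A toy sketch with hypothetical cooperation counts (this omits the "refuses to cooperate with immoral people" clause, since negative entries complicate the eigenvector analysis):

```python
import numpy as np

# C[i, j] = number of times person i cooperated with person j (symmetric here).
# Person 2 cooperates widely; person 3 cooperates only with person 0.
C = np.array([
    [0, 1, 2, 1],
    [1, 0, 2, 0],
    [2, 2, 0, 0],
    [1, 0, 0, 0],
], dtype=float)

credits = np.ones(4)         # equal "morality starting credits"
for _ in range(200):
    credits = C @ credits    # cooperating with high-credit people earns more credits
    credits /= credits.sum() # renormalize each round so the totals stay finite

# credits now approximates the principal eigenvector of the cooperation matrix.
```

Because the matrix is nonnegative and the cooperation graph is connected, Perron–Frobenius guarantees the equilibrium exists and is unique, mirroring the existence-and-uniqueness claim for web-page importance.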

15 Ways to bring Civic Innovation to your City


Chris Moore at AcuitasGov: “In my previous blog post I wrote about a desire to see our Governments transform to be part of the 21st century.  I saw a recent reference to how governments across Canada have lost their global leadership and how government at all levels is providing analog services to a digital society.  I couldn’t agree more.  I have been thinking lately about some practical ways that Mayors and City Managers could innovate in their communities.  I realize that there are a number of municipal elections happening this fall across Canada, a time when leadership changes and new ideas emerge.  So this blog is also for Mayoral candidates who have a sense that technology and innovation have a role to play in their city and in their administration.
I thought I would identify 15 initiatives that cities could pursue as part of their Civic Innovation Strategy.  For the last 50 years, technology in local government in Canada has been viewed as an expense, as a necessary evil, not always understood by elected officials and senior administrators.  Information and technology are part of every aspect of a city and are critical to delivering services.  It is time to think of them not just as an expense but as an investment: a way to innovate, reduce costs, enhance citizen service delivery and transform government operations.
Here are my top 15 ways to bring Civic Innovation to your city:
1. Build 21st Century Digital Infrastructure like the Chattanooga Gig City Project.
2. Build WiFi networks like the City of Edmonton on your own and in partnership with others.
3. Provide technology and internet to children and youth in need like the City of Toronto.
4. Connect to a national Education and Research network like Cybera in Alberta and CANARIE.
5. Create a Mayor’s Task Force on Innovation and Technology leveraging your city’s resources.
6. Run a hackathon or two or three like the City of Glasgow or maybe host a hacking health event like the City of Vancouver.
7. Launch a Startup incubator like Startup Edmonton or take it to the next level and create a civic lab like the City of Barcelona.
8. Develop an Open Government Strategy; I like the Open City Strategy from Edmonton.
9. If Open Government is too much then just start with Open Data, Edmonton has one of the best.
10. Build a Citizen Dashboard to showcase your city’s services and commitment to the public.
11. Put your Crime data online like the Edmonton Police Service.
12. Consider a pilot project with sensor technology for parking like the City of Nice or for waste management like the City of Barcelona.
13. Embrace Car2Go, Modo and UBER as ways to move people in your city.
14. Consider turning your IT department into the Innovation and Technology Department like they did at the City of Chicago.
15. Partner with other nearby local governments to create a shared Innovation and Technology agency.
Now more than ever before cities need to find ways to innovate, to transform and to create a foundation that is sustainable.  Now is the time for both courage and innovations in government.  What is your city doing to move into the 21st Century?”

How Long Is Too Long? The 4th Amendment and the Mosaic Theory


Law and Liberty Blog: “Volume 8.2 of the NYU Journal of Law and Liberty has been sent to the printer and physical copies will be available soon, but the articles in the issue are already available online here. One article that has gotten a lot of attention so far is by Steven Bellovin, Renee Hutchins, Tony Jebara, and Sebastian Zimmeck titled “When Enough is Enough: Location Tracking, Mosaic Theory, and Machine Learning.” A direct link to the article is here.
The mosaic theory is a modern corollary accepted by some academics – and by the D.C. Circuit Court of Appeals in U.S. v. Maynard – as a twenty-first century extension of the Fourth Amendment’s prohibition on unreasonable searches and seizures. Proponents of the mosaic theory argue that at some point enough individual data collections, compiled and analyzed together, become a Fourth Amendment search. Thirty years ago the Supreme Court upheld the warrantless use of a tracking device for three days; since then, however, the proliferation of GPS tracking in cars and smartphones has made it significantly easier for the police to access a treasure trove of information about our location at any given time.
It is easy to see why this theory has attracted some support. Humans are creatures of habit – if our public locations are tracked for a few days, weeks, or a month, it is pretty easy for machines to learn our ways and assemble a fairly detailed report for the government about our lives. Machines could basically predict when you will leave your house for work, what route you will take, and when and where you go grocery shopping, all before you even do it, once they know your habits. A policeman could observe you moving about in public without a warrant, of course, but limited manpower will always reduce the probability of continuous mass surveillance. With current technology, a handful of trained experts could easily monitor hundreds of people at a time from behind a computer screen, and gather even more information than most searches requiring a warrant. The Supreme Court indicated a willingness to consider the mosaic theory in U.S. v. Jones, but has yet to embrace it…”
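How little sophistication such habit-learning requires can be shown with a toy sketch (entirely hypothetical pings, not data from the article): bucket a week of location fixes by day-of-week and hour, and "predict" by taking the most frequent place seen in each bucket:

```python
from collections import Counter, defaultdict

# Hypothetical week of (day_of_week, hour, place) location pings.
pings = [
    ("Mon", 8, "home"), ("Mon", 9, "office"), ("Mon", 18, "gym"),
    ("Tue", 8, "home"), ("Tue", 9, "office"), ("Tue", 18, "gym"),
    ("Wed", 9, "office"), ("Wed", 18, "grocery"),
    ("Thu", 9, "office"), ("Fri", 9, "office"),
    ("Sat", 11, "grocery"), ("Sun", 11, "park"),
]

# Tally where the subject has been seen in each (day, hour) bucket.
buckets = defaultdict(Counter)
for day, hour, place in pings:
    buckets[(day, hour)][place] += 1

def predict(day, hour):
    """Most frequent place ever observed in this bucket, or None if unseen."""
    seen = buckets.get((day, hour))
    return seen.most_common(1)[0][0] if seen else None
```

A frequency table is all it takes to separate weekday routines from weekend ones, which is why one week of collection is a plausible candidate for where the mosaic line falls.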

The article in Law & Liberty details the need to determine at which point machine learning creates an intrusion into our reasonable expectations of privacy, and even discusses an experiment that could be run to determine how long data collection can proceed before it is an intrusion. If there is a line at which individual data collection becomes a search, we need to discover where that line is. One of the article’s authors, Steven Bellovin, has argued that the line is probably at one week – at that point your weekday and weekend habits would be known. The nation’s leading legal expert on criminal law, Professor Orin Kerr, fired back on the Volokh Conspiracy that Bellovin’s one-week argument is not in line with previous iterations of the mosaic theory.

Open Government Data: Helping Parents Find the Best School for Their Kids


Radu Cucos at the Open Government Partnership blog: “…This challenge – finding the right school – is probably one of the most important decisions in many parents’ lives. Parents are looking for answers to questions such as which schools are located in safe neighborhoods, which ones have the best teacher-to-student ratio, which schools have the best funding, which schools have the best premises, or which ones have the highest average grades.
It is rarely an easy decision, and it is made doubly difficult in the case of migrants. People who have lived in the same place for a long time know, more or less, which are the best educational institutions in their city, town, or village. For migrants, the situation is exactly the opposite: they have to spend extra time and resources identifying relevant information about schools.
Open Government Data is an effective solution to the lack of accessible information about schools in a particular country or location. By adopting an Open Government Data policy in education, governments release data about grades, funding, and student and teacher numbers – data generated over time by schools, colleges, universities, and other educational institutions.
Developers then use this data to create applications that present the information in easily accessible formats. Three of the best apps I have come across are highlighted below:

  • Discover Your School, developed under the Open Data Initiative of the Canadian province of British Columbia, is a platform for parents interested in finding a school for their children, learning about school districts, or comparing schools in the same area. The application provides comprehensive information, such as the number of students enrolled in each school per year, class sizes, teaching language, disaster readiness, results of skills assessments, and student and parent satisfaction. Information and data can be viewed in interactive formats, including maps. In addition, Discover Your School engages parents in policy making and in initiatives such as Erase Bullying and the British Columbia Education Plan.
  • The School Portal, developed under the Moldova Open Data Initiative, uses data made public by the Ministry of Education of Moldova to offer comprehensive information about 1529 educational institutions in the Republic of Moldova. Users of the portal can access information about schools’ yearly budgets, budget execution, expenditures, ratings, students’ grades, and schools’ infrastructure and communications. The School Portal has a tool that allows visitors to compare schools on different criteria – infrastructure, student performance, or annual budgets. An additional value of the portal is that it serves as an advertising platform for private-sector vendors of school supplies. The School Portal also allows parents to interact online with the Ministry of Education of Moldova, or with a psychologist, if they need additional information or have concerns about their children’s education.
  • RomaScuola, developed under the umbrella of the Italian Open Data Initiative, allows visitors to obtain valuable information about all schools in the Rome region. What distinguishes it from the two apps above is the ability to compare schools on such facets as frequency of teacher absence, internet connectivity, use of IT equipment in teaching, frequency of student transfers to other schools, and quality of education as measured by the percentage of diplomas issued.
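As a rough illustration of what such comparison tools do under the hood, the sketch below ranks a few school records by a chosen open-data criterion. The field names and figures are invented for the example and are not drawn from any of the portals above.

```python
# Hypothetical records mimicking an open-data release on schools;
# every field name and value here is illustrative only.
schools = [
    {"name": "North High", "students_per_teacher": 14.2, "avg_grade": 8.1},
    {"name": "East Lyceum", "students_per_teacher": 18.9, "avg_grade": 7.4},
    {"name": "River School", "students_per_teacher": 12.5, "avg_grade": 8.6},
]

def rank_schools(records, criterion, ascending=False):
    """Order school records by one open-data criterion,
    e.g. highest average grade or smallest classes first."""
    return sorted(records, key=lambda r: r[criterion], reverse=not ascending)

best_by_grade = rank_schools(schools, "avg_grade")
smallest_classes = rank_schools(schools, "students_per_teacher", ascending=True)
```

The same ranking logic applies whatever criteria a portal exposes; the hard part in practice is publishing clean, comparable data in the first place.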

Open data on schools has great value not only for parents but also for the educational system in general. If education is considered a product, each country has its own school market, and perfect information about products is one of the defining characteristics of a competitive market. From this perspective, giving parents access to information about schools’ characteristics will make the school market more competitive: educational institutions will have an incentive to improve their performance in order to attract more students…”