Innovation in Philanthropy is not a Hack-a-thon


Sam McAfee in Medium: “…Antiquated funding models and lack of a rapid data-driven evaluation process aren’t the only issues though. Most of the big ideas in the technology-for-social-impact space are focused either on incremental improvements to existing service models, perhaps leveraging online services or mobile applications to improve cost-efficiency marginally, or on solving only a very narrow niche problem for a small audience, often applying a technology that was already in development and just happened to find a solution in the field.

Innovation Requires Disruption

When you look at innovation in the commercial world, like the Ubers and AirBnBs of the world, what you see is a clear and substantive break from previous modes of thinking about transportation and accommodation. And it’s not the technology itself that is all that impressive. There is nothing ground-breaking technically under the hood of either of those products that wasn’t already lying around for a decade. What makes them different is that they created business models that stepped completely out of the existing taxi and hotel verticals, and simply used technology to leverage existing frustrations with those antiquated models and harness latent demands, to produce a new, vibrant commercial ecosystem.

Now, let’s imagine the same framework in the social sector, where there are equivalent long-standing traditional modes of providing resources. To find new ways of meeting human needs that disrupt those models requires both safe-to-fail experimentation and rapid feedback and iteration in the field, with clear success criteria. Such rapid development can only be accomplished by a sharp, nimble and multifaceted team of thinkers and doers who are passionate about the problem, yes, but also empowered and enabled to break a few institutional eggs on the way to the creative omelet.

Agile and Lean are Proven Methods

It turns out that there are proven working models for cultivating and fostering this kind of innovative thinking and experimentation. As I mentioned above, agile and lean are probably the tech sector’s single greatest contribution to the world, far more impactful than any particular technology it has produced. Small, cross-functional teams working on tight, iterative timeframes, using a data-informed methodology, can create new and disruptive solutions to big, difficult problems. They are able to do this precisely because they are unhindered by the hulking bureaucratic structures of the old guard. This is precisely why so many Fortune 500 companies are experimenting with innovation and R&D laboratories: they know their existing staff, structures, and processes cannot produce innovation within those constraints. Only small, nimble teams can do it, and they can only do it if they are kept separate from, even protected from, the traditional production systems of the previous product cycle.

Yet big philanthropy has still barely experimented with this model, trying it in only a few isolated instances. Here at Neo, for example, we are working on a project for teachers funded by a forward-thinking foundation. What our client is trying to disrupt is no less than the entire US education system, and with goals and measurements developed by teachers for teachers, not by Silicon Valley hotshots who have no clue how to fix education.


To start with, the project was funded in iterations of six weeks at a time, each with a distinct and measurable goal. We built a small cross-functional team to tackle some of the tougher issues faced by teachers trying to raise the level of excellence in their classrooms. The team was empowered to talk directly to teachers and to incorporate their feedback into new versions of the project, released on an almost daily basis. We have iterated the design more than sixteen times in less than four months, and it’s starting to really take shape.

We have no idea whether this particular project will be successful in the long run. But what we do know is that the client and their funder have had the courage to step out of the traditional project funding models and apply agile and lean thinking to a very tough problem. And we’re proud to be invited along for the ride.

The vast majority of the social sector is still trying to tackle social problems with program and funding models that were pioneered early in the last century. Agile and lean methods hold the key to finally breaking the mold of the old, traditional model of resourcing social change initiatives. The philanthropic community should be interested in the agile and lean methods produced by the technology sector, not the money produced by it, and should start reorganizing project teams, resource allocation strategies, and timelines in line with this proven innovation model.

Only then will we be in a position to really innovate for social change.”

Data revolution: How the UN is visualizing the future


Kate Krukiel at Microsoft Government: “…world leaders met in New York for the 69th session of the United Nations (UN) General Assembly. Progress toward achieving the eight Millennium Development Goals (MDGs) by the December 2015 target date—just 454 days away—was top of mind. So was the post-2015 agenda, which will pick up where the MDGs leave off. Ahead of the meetings, the UN Millennium Campaign asked Microsoft to build real-time visualizations of the progress on each goal—based on data spanning 21 targets, 60 indicators, and about 190 member countries. With the data visualizations we created (see them at http://www.mdgleaders.org/), UN and global leaders can decide where to focus in the next 15 months and, more importantly, where change needs to happen post-2015. Their experience offers three lessons for governments:

1. Data has a shelf life.

Since the MDGs were launched in 2000, the UN has relied on annual reports to assess its progress. But in August, UN Secretary-General Ban Ki-moon called for a “data revolution for sustainable development”, which in effect makes real-time data visualization a requirement, not just for tracking the MDGs, but for everything from Ebola to climate change….

2. Governments need visualization tools.

Just as the UN is using data visualization to track its progress and plan for the future, you can use the technology to better understand the massive amounts of data you collect—on everything from water supply and food prices to child mortality and traffic jams. Data visualization technology makes it possible to pull insights from historical data, develop forecasts, and spot gaps in your data far more easily than you could with raw data. As they say, a picture is worth a thousand words. To get a better idea of what’s possible, check out the MDG visualizations Microsoft created for the UN using our Power BI tool.
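Power BI is a point-and-click tool, but the underlying pattern — plot the history, fit a trend, compare it to the target — is simple enough to sketch. A minimal illustration with invented indicator values (not actual MDG data):

```python
# Illustrative sketch, not the UN's actual Power BI work: plot a
# hypothetical MDG-style indicator and extrapolate a linear trend to
# show how visualization exposes forecasts and gaps at a glance.
import numpy as np
import matplotlib.pyplot as plt

years = np.array([2000, 2002, 2004, 2006, 2008, 2010, 2012, 2014])
indicator = np.array([47.0, 44.5, 41.8, 39.6, 36.9, 34.1, 31.8, 29.5])  # invented values

# Fit a linear trend and project it to the 2015 target date.
slope, intercept = np.polyfit(years, indicator, 1)
future = np.arange(2000, 2016)
trend = slope * future + intercept

plt.plot(years, indicator, "o-", label="Observed (hypothetical data)")
plt.plot(future, trend, "--", label="Linear trend to 2015")
plt.axhline(23.5, color="gray", label="Target: halve the 2000 level")
plt.xlabel("Year")
plt.ylabel("Indicator value (%)")
plt.legend()
plt.title("The gap between trend and target is visible at a glance")
plt.show()
```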

3. The private sector can help.

The UN called on the private sector to assist in determining exact MDG progress and to help inspire ongoing global efforts. …

Follow the UN’s lead and join the #datarevolution now, if you haven’t already. It’s an opportunity to work across silos and political boundaries to address the world’s most pressing problems. It takes citizens’ points of view into account through What People Want. And it extends to the private sector, where expertise in using technology to create a sustainable future already exists. I encourage all government leaders to engage. To follow where the UN takes its revolution, watch for updates on the Data Revolution Group website or follow them on Twitter @data_rev….”

Killer Apps in the Gigabit Age


New Pew report: “The age of gigabit connectivity is dawning and will advance in coming years. The only question is how quickly it might become widespread. A gigabit connection can deliver 1,000 megabits of information per second (Mbps). Globally, cloud service provider Akamai reports that the average global connection speed in the first quarter of 2014 was 3.9 Mbps, with South Korea reporting the highest average connection speed, 23.6 Mbps, and the US at 10.5 Mbps.1
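To make those figures concrete, here is a quick back-of-the-envelope comparison (ours, not Pew’s) of how long a single 5 GB download would take at each of the speeds cited:

```python
# Back-of-the-envelope comparison of download times at the speeds cited
# above (speeds in megabits per second; file size chosen for illustration).
FILE_SIZE_GB = 5  # e.g., a high-definition movie
file_size_megabits = FILE_SIZE_GB * 8 * 1000  # 1 GB is roughly 8,000 megabits

speeds_mbps = {
    "Gigabit connection": 1000,
    "US average (Q1 2014)": 10.5,
    "Global average (Q1 2014)": 3.9,
}

for label, mbps in speeds_mbps.items():
    seconds = file_size_megabits / mbps
    print(f"{label}: {seconds / 60:.1f} minutes")
# Gigabit: ~0.7 min; US average: ~63 min; global average: ~171 min
```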
In some respects, gigabit connectivity is not a new development. The US scientific community has been using hyper-fast networks for several years, changing the pace of data sharing and enabling levels of collaboration in scientific disciplines that were unimaginable a generation ago.
Gigabit speeds for the “average Internet user” are just arriving in select areas of the world. In the US, Google ran a competition in 2010 for communities to pitch themselves for the construction of the first Google Fiber network running at 1 gigabit per second—Internet speeds 50-100 times faster than the majority of Americans now enjoy. Kansas City was chosen among 1,100 entrants, and residents are now signing up for the service. The firm has announced plans to build a gigabit network in Austin, Texas, and perhaps 34 other communities. In response, AT&T has said it expects to begin building gigabit networks in up to 100 US cities.2 The cities of Chattanooga, Tennessee; Lafayette, Louisiana; and Bristol, Virginia, have super speedy networks, and pockets of gigabit connectivity are in use in parts of Las Vegas, Omaha, Santa Monica, and several Vermont communities.3 There are also other regional efforts: Falcon Broadband in Colorado Springs, Colorado; Brooklyn Fiber in New York; Monkey Brains in San Francisco; MINET Fiber in Oregon; Wicked Fiber in Lawrence, Kansas; and Sonic.net in California, among others.4 NewWave expects to launch gigabit connections in 2015 in Poplar Bluff, Missouri, and in Monroe, Rayville, Delhi, and Tallulah, Louisiana; and Suddenlink Communications has launched Operation GigaSpeed.5
In 2014, Google and Verizon were among the innovators announcing that they are testing the capabilities for currently installed fiber networks to carry data even more efficiently—at 10 gigabits per second—to businesses that handle large amounts of Internet traffic.
To explore the possibilities of the next leap in connectivity we asked thousands of experts and Internet builders to share their thoughts about likely new Internet activities and applications that might emerge in the gigabit age. We call this a canvassing because it is not a representative, randomized survey. Its findings emerge from an “opt in” invitation to experts, many of whom play active roles in Internet evolution as technology builders, researchers, managers, policymakers, marketers, and analysts. We also invited comments from those who have made insightful predictions to our previous queries about the future of the Internet. (For more details, please see the section “About this Canvassing of Experts.”)…”

Beyond the “Good Governance” mantra


Alan Hudson at Global Integrity: “…The invocation of “Good Governance” is something that happens a lot, including in ongoing discussions of whether and how governance – or governance-related issues – should be addressed in the post-2015 development framework. Rather than simply squirm uncomfortably every time someone invokes the “Good Governance” mantra, I thought it would be more constructive to explain – again (see here and here) – why I find the phrase problematic, and to outline why I think that “Open Governance” might be a more helpful formulation.
My primary discomfort with the “Good Governance” mantra is that it obscures and wishes away much of the complexity about governance. Few would disagree with the idea that: i) governance arrangements have distributional consequences; ii) governance arrangements play a role in shaping progress towards development outcomes; and iii) effective governance arrangements – forms of governance – will vary by context. But the “Good Governance” mantra, it seems to me, unhelpfully side-steps these key issues, avoiding, or at least postponing, a number of key questions: good from whose perspective, good for what, good for where?
Moreover, the notion of “Good Governance” risks giving the impression that “we” – which tends to mean people outside of the societies in question – know what good governance is, and further still that “we” know what needs to happen to make governance good. On both counts, the evidence is that this is seldom the case.
These are not new points. A number of commentators including Merilee Grindle, Matt Andrews, Mushtaq Khan and, most recently, Brian Levy, have pointed out the problems with a “Good Governance” agenda for many years. But, despite their best efforts, in policy discussions, including around post-2015, their warnings are too rarely heeded.
However, rather than drop the language of governance entirely, I do think that there is value in a more flexible, perhaps less normative – or differently normative, more focused on function than form – notion of governance. One that centers on transparency, participation and accountability. One that is about promoting the ability of communities in particular places to address the governance challenges relating to the specific priorities that they face, and which puts people in those places – rather than outsiders – center-stage in improving governance in ways that work for them. Indeed, the targets in the Open Working Group’s Goal 16 include important elements of this.
The “Good Governance” mantra may be hard to shake, but I remain hopeful that open governance – a more flexible framing which is about empowering people and governments with information so that they can work together to tackle problems they prioritize, in their particular places – may yet win the day. The sooner that happens, the better.”

New Data for a New Energy Future


(This post originally appeared on the blog of the U.S. Chamber of Commerce Foundation.)

Two growing concerns—climate change and U.S. energy self-sufficiency—have accelerated the search for affordable, sustainable approaches to energy production and use. In this area, as in many others, data-driven innovation is a key to progress. Data scientists are working to help improve energy efficiency and make new forms of energy more economically viable, and are building new, profitable businesses in the process.
In the same way that government data has been used by other kinds of new businesses, the Department of Energy is releasing data that can help energy innovators. At a recent “Energy Datapalooza” held by the department, John Podesta, counselor to the President, summed up the rationale: “Just as climate data will be central to helping communities prepare for climate change, energy data can help us reduce the harmful emissions that are driving climate change.” With electric power accounting for one-third of greenhouse gas emissions in the United States, the opportunities for improvement are great.
The GovLab has been studying the business applications of public government data, or “open data,” for the past year. The resulting study, the Open Data 500, now provides structured, searchable information on more than 500 companies that use open government data as a key business driver. A review of those results shows four major areas where open data is creating new business opportunities in energy and is likely to build many more in the near future.

Commercial building efficiency
Commercial buildings are major energy consumers, and energy costs are a significant business expense. Despite programs like LEED Certification, many commercial buildings waste large amounts of energy. Now a company called FirstFuel, based in Boston, is using open data to drive energy efficiency in these buildings. At the Energy Datapalooza, Swap Shah, the company’s CEO, described how analyzing energy data together with geospatial, weather, and other open data can give a very accurate view of a building’s energy consumption and ways to reduce it. (Sometimes the solution is startlingly simple: According to Shah, the largest source of waste is running heating and cooling systems at the same time.) Other companies are taking on the same kind of task – like Lucid, which provides an operating system that can track a building’s energy use in an integrated way.
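FirstFuel’s methods are proprietary, but a standard building-analytics technique of the kind Shah describes is weather normalization: regress daily consumption on heating and cooling degree-days, then flag days that use far more than the weather predicts. A minimal sketch with made-up meter readings:

```python
# A minimal sketch of weather-normalized energy analysis, a common technique
# in building analytics (not FirstFuel's proprietary method). We regress
# daily consumption on heating/cooling degree-days, then flag days whose
# usage is well above the weather-predicted baseline.
import numpy as np

BASE_TEMP_F = 65.0

def degree_days(avg_temp_f):
    """Return (heating, cooling) degree-days for one day's average temperature."""
    return max(BASE_TEMP_F - avg_temp_f, 0.0), max(avg_temp_f - BASE_TEMP_F, 0.0)

# Hypothetical daily data: average outdoor temperature (F) and metered kWh.
temps = np.array([30, 40, 50, 60, 65, 70, 80, 90])
kwh = np.array([820, 700, 590, 500, 460, 520, 640, 780])

hdd, cdd = np.array([degree_days(t) for t in temps]).T
X = np.column_stack([np.ones_like(hdd), hdd, cdd])
coef, *_ = np.linalg.lstsq(X, kwh, rcond=None)  # [baseload, kWh/HDD, kWh/CDD]

predicted = X @ coef
residuals = kwh - predicted
print("Estimated baseload: %.0f kWh/day" % coef[0])
print("Days using >10% more than weather predicts:",
      np.where(residuals > 0.10 * predicted)[0].tolist())
```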

Home energy use
A number of companies are finding data-driven solutions for homeowners who want to save money by reducing their energy usage. A key to success is putting together measurements of energy use in the home with public data on energy efficiency solutions. PlotWatt, for example, promises to help consumers “save money with real-time energy tracking” through the data it provides. One of the best-known companies in this area, Opower, uses a psychological strategy: it simultaneously gives people access to their own energy data and lets them compare their energy use to their neighbors’ as an incentive to save. Opower partners with utilities to provide this information, and the Virginia-based company has been successful enough to open offices in San Francisco, London, and Singapore. Soon more and more people will have access to data on their home energy use: Green Button, a government-promoted program implemented by utilities, now gives about 100 million Americans data about their energy consumption.
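Opower’s production models are surely more sophisticated, but the neighbor-comparison nudge itself is easy to illustrate. A toy sketch with hypothetical usage figures:

```python
# A toy illustration of the neighbor-comparison idea described above
# (not Opower's actual algorithm): rank a household's monthly usage
# against comparable nearby homes and report where it falls.
def neighbor_comparison(my_kwh, neighbor_kwh):
    """Return the share of comparable neighbors using more energy than you."""
    using_more = sum(1 for k in neighbor_kwh if k > my_kwh)
    return using_more / len(neighbor_kwh)

neighbors = [620, 540, 710, 480, 650, 590, 800, 560]  # hypothetical kWh/month
mine = 505

share = neighbor_comparison(mine, neighbors)
print(f"You used less energy than {share:.0%} of comparable nearby homes.")
# -> "You used less energy than 88% of comparable nearby homes."
```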

Solar power and renewable energy
As solar power becomes more efficient and affordable, a number of companies are emerging to support this energy technology. Clean Power Finance, for example, uses its database to connect solar entrepreneurs with sources of capital. In a different way, a company called Solar Census is analyzing publicly available data to find exactly where solar power can be produced most efficiently. The kind of analysis that used to require an on-site survey over several days can now be done in less than a minute with their algorithms.
Other kinds of geospatial and weather data can support other forms of renewable energy. The data will make it easier to find good sites for wind power stations, water sources for small-scale hydroelectric projects, and the best opportunities to tap geothermal energy.
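As a rough illustration of this kind of siting analysis (our toy version, not Solar Census’s algorithm), one can rank candidate sites by average irradiance pulled from an open dataset:

```python
# A toy siting analysis: rank candidate sites by average solar irradiance
# from an open dataset. All values here are invented for illustration.
candidate_sites = {
    # site name: average daily irradiance, kWh per square meter per day
    "Site A (desert plateau)": 6.8,
    "Site B (coastal valley)": 4.9,
    "Site C (northern slope)": 3.7,
}

PANEL_AREA_M2 = 10_000     # hypothetical array size
SYSTEM_EFFICIENCY = 0.18   # typical panel-plus-inverter efficiency

for site, irradiance in sorted(candidate_sites.items(),
                               key=lambda kv: kv[1], reverse=True):
    daily_kwh = irradiance * PANEL_AREA_M2 * SYSTEM_EFFICIENCY
    print(f"{site}: ~{daily_kwh:,.0f} kWh/day")
```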

Supporting new energy-efficient vehicles
The Tesla and other electric vehicles are becoming commercially viable, and we will soon see even more efficient vehicles on the road. Toyota has announced that its first fuel-cell cars, which run on hydrogen, will be commercially available by mid-2015, and other auto manufacturers have announced plans to develop fuel-cell vehicles as well. But these vehicles can’t operate without a network to supply power, be it electricity for a Tesla battery or hydrogen for a fuel cell.
It’s a chicken-and-egg problem: People won’t buy large numbers of electric or fuel-cell cars unless they know they can power them, and power stations will be scarce until there are enough vehicles to support their business. Now some new companies are facilitating this transition by giving drivers data-driven tools to find and use the power sources they need. Recargo, for example, provides tools to help electric car owners find charging stations and operate their vehicles.
The development of new energy sources will involve solving social, political, economic, and technological issues. Data science can help develop solutions and bring us more quickly to a new kind of energy future.
Joel Gurin is senior advisor at the GovLab and project director of the Open Data 500. He also currently serves as a fellow of the U.S. Chamber of Commerce Foundation.

Crowdsourcing and collaborative translation: mass phenomena or silent threat to translation studies?


Article by Alberto Fernandez Costales: “This article explores the emerging phenomenon of amateur translation and tries to shed some light on the implications this process may have both for Translation Studies as an academic discipline and for the translation industry itself. The paper comments on the main activities included within the concept of fan translation and approaches the terminological issues concerning the categorization of “non-professional translation”. In addition, the article focuses on the existing differences between collaborative translation and crowdsourcing, and posits new hypotheses regarding the development of these initiatives and the possible erosion of the boundaries which separate them. The question of who does what in the translation industry is a major issue to be addressed in order to gain a clear view of the global state of translation today.”

Real-time information about public transport's position using crowdsourcing


Paper by Nikos Souliotis et al., published in PCI ’14: Proceedings of the 18th Panhellenic Conference on Informatics: “Nowadays there is a multitude of mobile and tablet applications being developed to facilitate or disrupt everyday tasks. Many of these are location-based. One technique for providing information and content is crowdsourcing. This technique is based on the public contributing information or resources, giving them the opportunity to become both service providers and recipients at the same time.
Taking into account the above, and after observing passengers using the public transport system, we came to the conclusion that it would be useful to be able to determine which transport medium (i.e., which bus line out of a number running concurrently) is nearest at any given moment. This information allows for better decision making and choice of transportation.
For this we propose the development of an application that shows the position of a selected transport vehicle. The position will be calculated from geo-tracking data provided by passengers aboard the vehicle. This will give application users real-time information so that they can determine their optimal route.”
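The paper does not publish code, but the core idea is straightforward to sketch. Assuming a simple data model of (line, latitude, longitude, timestamp) reports — our assumption, not the authors’ — a service could estimate a line’s current position by averaging recent passenger fixes:

```python
# A minimal sketch of the crowdsourced position estimate the paper proposes
# (the data model below is our assumption): passengers on a vehicle
# periodically report a GPS fix, and the service averages recent reports
# per bus line to approximate the vehicle's position.
import time
from collections import defaultdict
from dataclasses import dataclass

REPORT_TTL_SECONDS = 60  # ignore fixes older than this

@dataclass
class Report:
    line: str        # bus line identifier, e.g. "12"
    lat: float
    lon: float
    timestamp: float

reports = defaultdict(list)  # line -> list of Report

def submit_report(report):
    """Called when a boarded passenger's phone sends a GPS fix."""
    reports[report.line].append(report)

def estimated_position(line, now=None):
    """Average the recent passenger fixes for a line; None if no fresh data."""
    now = now if now is not None else time.time()
    recent = [r for r in reports[line]
              if now - r.timestamp <= REPORT_TTL_SECONDS]
    if not recent:
        return None
    return (sum(r.lat for r in recent) / len(recent),
            sum(r.lon for r in recent) / len(recent))

# Example: two passengers on line "12" report fixes a few seconds apart.
now = time.time()
submit_report(Report("12", 38.246, 21.735, now - 10))
submit_report(Report("12", 38.247, 21.736, now - 5))
print(estimated_position("12"))  # -> approximately (38.2465, 21.7355)
```

A production system would also need to cluster reports so that two buses running concurrently on the same line are not averaged into one phantom position — which is exactly the “which bus is nearest” question the authors raise.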

MIT launches Laboratory for Social Machines with major Twitter investment


MIT Press Release: “The MIT Media Lab today announced the creation of the Laboratory for Social Machines (LSM), funded by a five-year, $10 million commitment from Twitter. As part of the new program, Twitter will also provide full access to its real-time, public stream of tweets, as well as the archive of every tweet dating back to the first.
The new initiative, based at the Media Lab, will focus on the development of new technologies to make sense of semantic and social patterns across the broad span of public mass media, social media, data streams, and digital content. Pattern discovery and data visualization will be explored to reveal interaction patterns and shared interests in relevant social systems, while collaborative tools and mobile apps will be developed to enable new forms of public communication and social organization.
A main goal for the LSM will be to create new platforms for both individuals and institutions to identify, discuss, and act on pressing societal problems. Though funded by Twitter, the LSM will have complete operational and academic independence. In keeping with the academic mission of LSM, students and staff will work across many social media and mass media platforms — including, but not limited to, Twitter.
“The Laboratory for Social Machines will experiment in areas of public communication and social organization where humans and machines collaborate on problems that can’t be solved manually or through automation alone,” says Deb Roy, an associate professor at the Media Lab who will lead the LSM, and who also serves as Twitter’s chief media scientist. “Social feedback loops based on analysis of public media and data can be an effective catalyst for increasing accountability and transparency — creating mutual visibility among institutions and individuals.”
“With this investment, Twitter is seizing the opportunity to go deeper into research to understand the role Twitter and other platforms play in the way people communicate, the effect that rapid and fluid communication can have and apply those findings to complex societal issues,” says Dick Costolo, CEO of Twitter…”

Uncovering State And Local Gov’s 15 Hidden Successes


Emily Jarvis at GovLoop: “From garbage trucks to vacant lots, cities and states are often tasked with the thankless job of cleaning up a community’s mess. These are tasks that are often overlooked, but are critical to keeping a community vibrant.
But even in these sometimes thankless jobs, there are real innovations happening. Take Miami-Dade County, where they are using hybrid garbage trucks to save the community millions of dollars in fuel every year and make the environment a little cleaner. Or head over to Milwaukee, where the city is turning vacant and abandoned lots into urban farms.
Those are just two of the fifteen examples GovLoop uncovered in our new guide, From the State House to the County Clerk – 15 Challenges and Success Stories.
We have broken the challenges into four categories:

  • Internal Best Practices
  • Tech Challenges
  • Health and Safety
  • Community Engagement and Outreach

Here’s another example: the open data movement has the potential to affect governing and civic engagement at the state and local government levels. But today very few agencies are actively providing open data. In fact, only 46 U.S. cities and counties have open data sites. One of the cities on the leading edge of the open data movement is Fort Worth, Texas.

“When I came into office, that was one of my campaign promises, that we would get Fort Worth into this century on technology and that we would take a hard look at open records requests and requests for data,” Mayor Betsy Price said in an interview with the Star-Telegram. “It goes a lot further to being transparent and letting people participate in their government and see what we are doing. It is the people’s data, and it should be easy to access.”

The website, data.fortworthtexas.gov, offers data and documents such as certificates of occupancy, development permits and residential permits for download in several formats, including Excel and PDF. Not all datasets are available yet — the city said its priority was to put the most-requested data on the portal first. Next up? Crime data, code violations, restaurant ratings and capital projects progress.
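Portals like this one are typically built on platforms such as Socrata, whose SODA API also exposes each dataset as JSON over HTTP. A hedged sketch, assuming a Socrata-style endpoint and a hypothetical dataset identifier (browse the portal for real ones):

```python
# A sketch of pulling records from a Socrata-style open data portal such as
# data.fortworthtexas.gov. The dataset identifier below is hypothetical.
import requests

PORTAL = "https://data.fortworthtexas.gov"
DATASET_ID = "abcd-1234"  # hypothetical; Socrata datasets use a 4x4 ID

resp = requests.get(
    f"{PORTAL}/resource/{DATASET_ID}.json",
    params={"$limit": 100},  # SODA API paging parameter
    timeout=30,
)
resp.raise_for_status()
records = resp.json()  # list of dicts, one per record
print(f"Fetched {len(records)} records")
```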

City officials’ ultimate goal is to create and adopt a full open data policy. As part of the launch, they are also looking for local software developers and designers who want to help guide the open data initiative. Those interested in participating can sign up online to receive more information….”

UN Data Revolution Group


Website: “UN Secretary-General Ban Ki-moon has asked an Independent Expert Advisory Group to make concrete recommendations on bringing about a data revolution in sustainable development. Here you can find out more about the work of the group, and feed into the process by adding your comments to this site or sending a private consultation submission.

Consultation Areas…”