Charities Try New Ways to Test Ideas Quickly and Polish Them Later


Ben Gose in the Chronicle of Philanthropy: “A year ago, a division of TechSoup Global began working on an app to allow donors to buy a hotel room for victims of domestic violence when no other shelter is available. Now that app is a finalist in a competition run by a foundation that combats human trafficking—and a win could mean a grant worth several hundred thousand dollars. The app’s evolution—adding a focus on sex slaves to the initial emphasis on domestic violence—was hardly accidental.
Caravan Studios, the TechSoup division that created the app, has embraced a new management approach popular in Silicon Valley known as “lean start-up.”
The principles, which are increasingly popular among nonprofits, emphasize experimentation over long-term planning and urge groups to get products and services out to clients as early as possible so the organizations can learn from feedback and make changes.
When the app, known as SafeNight, was still early in the design phase, Caravan posted details about the project on its website, including applications for grants that Caravan had not yet received. In lean-start-up lingo, Caravan put out a “minimum viable product” and hoped for feedback that would lead to a better app.
Caravan soon heard from antitrafficking organizations, which were interested in the same kind of service. Caravan eventually teamed up with the Polaris Project and the State of New Jersey, which were working on a similar app, to jointly create an app for the final round of the antitrafficking contest. Humanity United, the foundation sponsoring the contest, plans to award $1.8-million to as many as three winners later this month.
Marnie Webb, CEO of Caravan, which is building an array of apps designed to curb social problems, says lean-start-up principles help Caravan work faster and meet real needs.
“The central idea is that any product that we develop will get better if it lives as much of its life as possible outside of our office,” Ms. Webb says. “If we had kept SafeNight inside and polished it and polished it, it would have been super hard to bring on a partner because we would have invested too much.”….
Nonprofits developing new tech tools are among the biggest users of lean-start-up ideas.
Upwell, an ocean-conservation organization founded in 2011, scans the web for lively ocean-related discussions and then pushes to turn them into full-fledged movements through social-media campaigns.
Lean principles urge groups to steer clear of “vanity metrics,” such as site visits, that may sound impressive but don’t reveal much. Upwell tracks only one number—“social mentions”—a count of the much smaller group of people who actually say something about an issue online.
After identifying a hot topic, Upwell tries to assemble a social-media strategy within 24 hours—what it calls a “minimum viable campaign.”
“We do the least amount of work to get something out the door that will get results and information,” says Rachel Dearborn, Upwell’s campaign director.
Campaigns that don’t catch on are quickly scrapped. But campaigns that do catch on get more time, energy, and money from Upwell.
After Hurricane Sandy, in 2012, a prominent writer on ocean issues and others began pushing the idea that revitalizing the oyster beds near New York City could help protect the shore from future storm surges. Upwell’s “I (Oyster) New York” campaign featured a catchy logo and led to an even bigger spike in attention.

‘Build-Measure-Learn’

Some organizations that could hardly be called start-ups are also using lean principles. GuideStar, the 20-year-old aggregator of financial information about charities, is using the lean approach to more quickly develop tools that meet the needs of its users.
The lean process promotes short “build-measure-learn” cycles, in which a group frequently updates a product or service based on what it hears from its customers.
GuideStar and the Nonprofit Finance Fund have developed a tool called Financial Scan that allows charities to see how they compare with similar groups on various financial measures, such as their mix of earned revenue and grant funds.
When it analyzed who was using the tool, GuideStar found heavy interest from both foundations and accounting firms, says Evan Paul, GuideStar’s senior director of products and marketing.
In the future, he says, GuideStar may create three versions of Financial Scan to meet the distinct interests of charities, foundations, and accountants.
“We want to get more specific about how people are using our data to make decisions so that we can help make those decisions better and faster,” Mr. Paul says….


Lean Start-Up: A Glossary of Terms for a Hot New Management Approach

Build-Measure-Learn

Instead of spending considerable time developing a product or service for a big rollout, organizations should consider using a continuous feedback loop: “build” a program or service, even if it is not fully fleshed out; “measure” how clients are affected; and “learn” by improving the program or going in a new direction. Repeat the cycle.

Minimum Viable Product

An early version of a product or service that may be lacking some features. This approach allows an organization to obtain feedback from clients and quickly determine the usefulness of a product or service and how to improve it.

Get Out of the Building

To determine whether a product or service is needed, talk to clients and share your ideas with them before investing heavily.

A/B Testing

Create two versions of a product or service, show them to different groups, and see which performs best.
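As a sketch of the idea, using invented numbers for a hypothetical donation page (nothing here comes from the article), an A/B comparison can be as simple as computing each version’s conversion rate and a rough significance score:

```python
# A minimal sketch of an A/B comparison on invented data: Version A and
# Version B are shown to separate groups of visitors, and we compare
# how each converts.
from math import sqrt

def conversion_rate(conversions, visitors):
    """Fraction of visitors who completed the desired action."""
    return conversions / visitors

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-score: how surprising is the observed gap if
    both versions actually perform identically? Values above ~1.96 are
    conventionally treated as statistically significant."""
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (conv_b / n_b - conv_a / n_a) / se

# Invented counts: A converts 120 of 2,400 visitors, B converts 168 of 2,400.
rate_a = conversion_rate(120, 2400)
rate_b = conversion_rate(168, 2400)
z = two_proportion_z(120, 2400, 168, 2400)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z = {z:.2f}")  # A: 5.0%  B: 7.0%  z = 2.92
```

Here B’s higher rate clears the conventional significance bar, so B “performs best” in the glossary’s sense.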

Failing Fast

By quickly realizing that a product or service isn’t viable, organizations save time and money and gain valuable information for their next effort.

Pivot

Making a significant change in strategy when the early testing of a minimum viable product shows that the product or service isn’t working or isn’t needed.

Vanity Metrics

Measures that seem to provide a favorable picture but don’t accurately capture the impact of a product. An example might be a tally of website page views. A more meaningful measure—or an “actionable metric,” in the lean lexicon—might be the number of active users of an online service.
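A toy illustration of the distinction, using an invented event log: raw page views (the vanity metric) can look healthy while the actionable count of users who did something meaningful stays small:

```python
# Contrast a vanity metric (raw page views) with an actionable metric
# (distinct users who actually engaged). The event log is invented.
events = [
    # (user_id, action)
    ("u1", "view"), ("u1", "view"), ("u1", "comment"),
    ("u2", "view"), ("u2", "view"), ("u2", "view"),
    ("u3", "view"), ("u3", "share"),
    ("u4", "view"),
]

page_views = sum(1 for _, action in events if action == "view")
active_users = {user for user, action in events if action in ("comment", "share")}

print(f"page views (vanity metric): {page_views}")        # 7
print(f"active users (actionable): {len(active_users)}")  # 2
```

Seven page views but only two engaged users: the second number is the one that tells you whether to keep investing.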
Sources: The Lean Startup, by Eric Ries; The Ultimate Dictionary of Lean for Social Good, a publication by Lean Impact”

After the Protests


Zeynep Tufekci in the New York Times on why social media is fueling a boom-and-bust cycle of political protest: “Last Wednesday, more than 100,000 people showed up in Istanbul for a funeral that turned into a mass demonstration. No formal organization made the call. The news had come from Twitter: Berkin Elvan, 15, had died. He had been hit in the head by a tear-gas canister on his way to buy bread during the Gezi protests last June. During the 269 days he spent in a coma, Berkin’s face had become a symbol of civic resistance shared on social media from Facebook to Instagram, and the response, when his family tweeted “we lost our son” and then a funeral date, was spontaneous.

Protests like this one, fueled by social media and erupting into spectacular mass events, look like powerful statements of opposition against a regime. And whether these take place in Turkey, Egypt or Ukraine, pundits often speculate that the days of a ruling party or government, or at least its unpopular policies, must be numbered. Yet often these huge mobilizations of citizens inexplicably wither away without the impact on policy you might expect from their scale.

This muted effect is not because social media isn’t good at what it does, but, in a way, because it’s very good at what it does. Digital tools make it much easier to build up movements quickly, and they greatly lower coordination costs. This seems like a good thing at first, but it often results in an unanticipated weakness: Before the Internet, the tedious work of organizing that was required to circumvent censorship or to organize a protest also helped build infrastructure for decision making and strategies for sustaining momentum. Now movements can rush past that step, often to their own detriment….

But after all that, in the approaching local elections, the ruling party is expected to retain its dominance.

Compare this with what it took to produce and distribute pamphlets announcing the Montgomery bus boycott in 1955. Jo Ann Robinson, a professor at Alabama State College, and a few students sneaked into the duplicating room and worked all night to secretly mimeograph 52,000 leaflets to be distributed by hand with the help of 68 African-American political, religious, educational and labor organizations throughout the city. Even mundane tasks like coordinating car pools (in an era before there were spreadsheets) required endless hours of collaborative work.

By the time the United States government was faced with the March on Washington in 1963, the protest amounted to not just 300,000 demonstrators but the committed partnerships and logistics required to get them all there — and to sustain a movement for years against brutally enforced Jim Crow laws. That movement had the capacity to leverage boycotts, strikes and demonstrations to push its cause forward. Recent marches on Washington of similar sizes, including the 50th anniversary march last year, also signaled discontent and a desire for change, but just didn’t pose the same threat to the powers that be.

Social media can provide a huge advantage in assembling the strength in numbers that movements depend on. Those “likes” on Facebook, derided as slacktivism or clicktivism, can have long-term consequences by defining which sentiments are “normal” or “obvious” — perhaps among the most important levers of change. That’s one reason the same-sex marriage movement, which uses online and offline visibility as a key strategy, has been so successful, and it’s also why authoritarian governments try to ban social media.

During the Gezi protests, Prime Minister Recep Tayyip Erdogan called Twitter and other social media a “menace to society.” More recently, Turkey’s Parliament passed a law greatly increasing the government’s ability to censor online content and expand surveillance, and Mr. Erdogan said he would consider blocking access to Facebook and YouTube. It’s also telling that one of the first moves by President Vladimir V. Putin of Russia before annexing Crimea was to shut down the websites of dissidents in Russia.
Media in the hands of citizens can rattle regimes. It makes it much harder for rulers to maintain legitimacy by controlling the public sphere. But activists, who have made such effective use of technology to rally supporters, still need to figure out how to convert that energy into greater impact. The point isn’t just to challenge power; it’s to change it.”

Climate Data Initiative Launches with Strong Public and Private Sector Commitments


John Podesta and Dr. John P. Holdren at the White House blog:  “…today, delivering on a commitment in the President’s Climate Action Plan, we are launching the Climate Data Initiative, an ambitious new effort bringing together extensive open government data and design competitions with commitments from the private and philanthropic sectors to develop data-driven planning and resilience tools for local communities. This effort will help give communities across America the information and tools they need to plan for current and future climate impacts.
The Climate Data Initiative builds on the success of the Obama Administration’s ongoing efforts to unleash the power of open government data. Since data.gov, the central site to find U.S. government data resources, launched in 2009, the Federal government has released troves of valuable data that were previously hard to access in areas such as health, energy, education, public safety, and global development. Today these data are being used by entrepreneurs, researchers, tech innovators, and others to create countless new applications, tools, services, and businesses.
Data from NOAA, NASA, the U.S. Geological Survey, the Department of Defense, and other Federal agencies will be featured on climate.data.gov, a new section within data.gov that opens for business today. The first batch of climate data being made available will focus on coastal flooding and sea level rise. NOAA and NASA will also be announcing an innovation challenge calling on researchers and developers to create data-driven simulations to help plan for the future and to educate the public about the vulnerability of their own communities to sea level rise and flood events.
These and other Federal efforts will be amplified by a number of ambitious private commitments. For example, Esri, the company that produces the ArcGIS software used by thousands of city and regional planning experts, will be partnering with 12 cities across the country to create free and open “maps and apps” to help state and local governments plan for climate change impacts. Google will donate one petabyte—that’s 1,000 terabytes—of cloud storage for climate data, as well as 50 million hours of high-performance computing with the Google Earth Engine platform. The company is challenging the global innovation community to build a high-resolution global terrain model to help communities build resilience to anticipated climate impacts in decades to come. And the World Bank will release a new field guide for the Open Data for Resilience Initiative, which is working in more than 20 countries to map millions of buildings and urban infrastructure….”

How Open Data Policies Unlock Innovation


Tim Cashman at Socrata: “Several trends made the Web 2.0 world we now live in possible. Arguably, the most important of these has been the evolution of online services as extensible technology platforms that enable users, application developers, and other collaborators to create value that extends far beyond the original offering itself.

The Era of ‘Government-as-a-Platform’

The same principles that have shaped the consumer web are now permeating government. Forward-thinking public sector organizations are catching on to the idea that, to stay relevant and vital, governments must go beyond offering a few basic services online. Some have even come to the realization that they are custodians of an enormously valuable resource: the data they collect through their day-to-day operations.  By opening up this data for public consumption online, innovative governments are facilitating the same kind of digital networks that consumer web services have fostered for years.  The era of government as a platform is here, and open data is the catalyst.

The Role of Open Data Policy in Unlocking Innovation in Government

The open data movement continues to transition from an emphasis on transparency to measuring the civic and economic impact of open data programs. As part of this transition, governments are realizing the importance of creating a formal policy to define strategic goals, describe the desired benefits, and provide the scope for data publishing efforts over time.  When well executed, open data policies yield a clear set of benefits. These range from spurring slow-moving bureaucracies into action to procuring the necessary funding to sustain open data initiatives beyond a single elected official’s term.

Four Types of Open Data Policies

There are four main types of policy levers currently in use regarding open data: executive orders, non-binding resolutions, internal regulations, and codified laws. Each of these tools has specific advantages and potential limitations.

Executive Orders

The prime example of an open data executive order in action is President Barack Obama’s Open Data Initiative. While this executive order was short – only four paragraphs on two pages – the real policy magic was a mandate-by-reference that required all U.S. federal agencies to comply with a detailed set of time-bound actions. All of these requirements are publicly viewable on a GitHub repository – a free hosting service for open source software development projects – which is revolutionary in and of itself. Detailed discussions on government transparency took place not in closed-door boardrooms, but online for everyone to see, edit, and improve.

Non-Binding Resolutions

A classic example of a non-binding resolution can be found by doing an online search for the resolution of Palo Alto, California. Short and sweet, this town-square-like exercise delivers additional attention to the movement inside and outside of government. The lightweight policy tool also has the benefit of lasting a bit longer than any particular government official, though, like the countless resolutions that have come out of small towns everywhere, it endures only as long as people’s memory of it.

Internal Regulations

The New York State Handbook on Open Data is a great example of internal regulations put to good use. Originating from the Office of Information Technology Resources, the handbook is a comprehensive, clear, and authoritative guide on how open data is actually supposed to work. Also available on GitHub, the handbook resembles the federal open data project in many ways.

Codified Laws

The archetypal example of open data law comes from San Francisco.
Interestingly, what started as an “Executive Directive” from Mayor Gavin Newsom later turned into legislation and brought with it the power of stronger department mandates and a significant budget. Once enacted, laws are generally hard to revise. However, in the case of San Francisco, the city council has already revised the law twice in four years.
At the federal government level, the Digital Accountability and Transparency Act, or DATA Act, was introduced in both the U.S. House of Representatives (H.R. 2061) and the U.S. Senate (S. 994) in 2013. The act mandates the standardization and publication of a wide variety of the federal government’s financial reports as open data. Although the House voted to pass the DATA Act, it still awaits a vote in the Senate.

The Path to Government-as-a-Platform

Open data policies are an effective way to motivate action and provide clear guidance for open data programs. But they are not a precondition for public-sector organizations to embrace the government-as-a-platform model. In fact, the first step does not involve technology at all. Instead, it involves government leaders realizing that public data belongs to the people. And, it requires the vision to appreciate this data as a shared resource that only increases in value the more widely it is distributed and re-used for analytics, web and mobile apps, and more.
The consumer web has shown the value of open data networks in spades (think Facebook). Now, it’s government’s turn to create the next web.”

Expanding Opportunity through Open Educational Resources


Hal Plotkin and Colleen Chien at the White House: “Using advanced technology to dramatically expand the quality and reach of education has long been a key priority for the Obama Administration.
In December 2013, the President’s Council of Advisors on Science and Technology (PCAST) issued a report exploring the potential of Massive Open Online Courses (MOOCs) to expand access to higher education opportunities. Last month, the President announced a $2B down payment, and another $750M in private-sector commitments to deliver on the President’s ConnectEd initiative, which will connect 99% of American K-12 students to broadband by 2017 at no cost to American taxpayers.
This week, we are happy to be joining with educators, students, and technologists worldwide to recognize and celebrate Open Education Week.
Open Educational Resources (“OER”) are educational resources that are released with copyright licenses allowing for their free use, continuous improvement, and modification by others. The world is moving fast, and OER enables educators and students to access, customize, and remix high-quality course materials reflecting the latest understanding of the world and materials that incorporate state of the art teaching methods – adding their own insights along the way. OER is not a silver bullet solution to the many challenges that teachers, students and schools face. But it is a tool increasingly being used, for example by players like edX and the Khan Academy, to improve learning outcomes and create scalable platforms for sharing educational resources that reach millions of students worldwide.
Launched at MIT in 2001, OER became a global movement in 2007 when thousands of educators around the globe endorsed the Cape Town Declaration on Open Educational Resources. Another major milestone came in 2011, when Secretary of Education Arne Duncan and then-Secretary of Labor Hilda Solis unveiled the four-year, $2B Trade Adjustment Assistance Community College and Career Training Grant Program (TAACCCT). It was the first Federal program to leverage OER to support the development of a new generation of affordable, post-secondary educational programs that can be completed in two years or less to prepare students for careers in emerging and expanding industries….
Building on this record of success, OSTP and the U.S. Agency for International Development (USAID) are exploring an effort to inspire and empower university students through multidisciplinary OER focused on one of the USAID Grand Challenges, such as securing clean water, saving lives at birth, or improving green agriculture. This effort promises to be a stepping stone towards leveraging OER to help solve other grand challenges such as the NAE Grand Challenges in Engineering or Grand Challenges in Global Health.
This is great progress, but there is more work to do. We look forward to keeping the community updated right here. To see the winning videos from the U.S. Department of Education’s “Why Open Education Matters” Video Contest, click here.”

Why the wealthiest countries are also the most open with their data


Emily Badger in the Washington Post: “The Oxford Internet Institute this week posted a nice visualization of the state of open data in 70 countries around the world, reflecting the willingness of national governments to release everything from transportation timetables to election results to machine-readable national maps. The tool is based on the Open Knowledge Foundation’s Open Data Index, an admittedly incomplete but telling assessment of who willingly publishes updated, accurate national information on, say, pollutants (Sweden) and who does not (ahem, South Africa).

Tally up the open data scores for these 70 countries, and the picture looks like this, per the Oxford Internet Institute:
…With apologies for the tiny, tiny type (and the fact that many countries aren’t listed here at all), a couple of broad trends are apparent. For one, there’s a prominent global “openness divide,” in the words of the Oxford Internet Institute. The high scores mostly come from Europe and North America, the low scores from Asia, Africa and Latin America. Wealth is strongly correlated with “openness” by this measure, whether we look at World Bank income groups or Gross National Income per capita. By the OII’s calculation, wealth accounts for about a third of the variation in these Open Data Index scores.
Perhaps this is an obvious correlation, but the reasons why open data looks like the luxury of rich economies are many, and they point to the reality that poor countries face a lot more obstacles to openness than do places like the United States. For one thing, openness is also closely correlated with Internet penetration. Why open your census results if people don’t have ways to access them (or the means to demand them)? It’s no easy task to do this, either.”
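The “third of the variation” figure above is a statement about r², the squared correlation between wealth and openness. A minimal sketch of that calculation, with invented country data standing in for the OII’s actual dataset:

```python
# Sketch of a variance-explained calculation: the squared Pearson
# correlation (r^2) between national income and an openness score.
# The six (income, score) pairs below are invented for illustration;
# the real analysis used the full Open Data Index and World Bank data.
from math import sqrt

def r_squared(xs, ys):
    """Share of variance in ys accounted for by a linear fit on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return (cov / sqrt(var_x * var_y)) ** 2

gni_per_capita = [1_000, 5_000, 12_000, 25_000, 40_000, 55_000]
openness_score = [150, 210, 320, 380, 520, 480]
print(f"r^2 = {r_squared(gni_per_capita, openness_score):.2f}")
```

An r² of about 0.33 on the real data would correspond to the article’s “about a third of the variation.”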

The intelligent citizen


John Bell in Al Jazeera: “A quarter century after the demise of the Soviet Union, is the Western model of government under threat? …. The pressures are coming from several directions.
All states are feeling the pressure from unregulated global flows of capital that create obscene concentrations of wealth, and an inability of the nation-state to respond. Relatedly, citizens either ignore or distrust traditional institutions, and ethnic groups demand greater local autonomy.
A recent Pew survey shows that Americans aged 18-33 mostly identify as political independents and distrust institutions. The classic model is indeed frayed, and new developments have made it feel like very old cloth.
One natural reflex is to assert even greater control, a move suited to the authoritarians, such as China, Russia or General Abdel Fattah al-Sisi‘s Egypt: Strengthen the nation by any means to withstand the pressures. The reality, however, is that all systems, democracies or otherwise, were designed for an industrial age, and the management of an anonymous mass, and cannot cope with globalised economics and the information world of today.
The question remains: What can effectively replace the Western model? The answer may not lie only in the invention of new structures, but in the improvement of a key component found in all: The citizen.
The citizen today is mostly a consumer, focused on the purchase of goods or services, or the insistent consumption of virtual information, often as an ersatz politics. Occasionally, when a threat rises, he or she also becomes a demandeur of absolute security from the state. Indeed, some are using the new technologies for democratic purposes: they are better informed, criticise abuse or corruption, and organise and rally movements.
But, the vast majority is politically disengaged and cynical of political process; many others are too busy trying to survive to even care. A grand apathy has set in, the stage left vacant for a very few extremists, or pied pipers of the old tunes of nationalisms and tribal belonging disguised as leaders. The citizen is either invisible in this circus, an endangered species in the 21st century, or increasingly drawn to dark and polarising forces.
Some see the glass as half full and believe that technology and direct democracy can bridge the gaps. Indeed, the internet provides a plethora of information and a greater sense of empowerment. Lesser-known protests in Bosnia have led to direct democracy plenums, and the Swiss do revert to national referenda. However, whether direct or representative, democracy will still depend on the quality of the citizen, and his or her decisions.
Opinion, dogma and bias
Opinion, dogma and bias remain the common political operating system and, as a result, our politics are still an unaffordable game of chance. The optimists may be right, but discussions in social media on issues ranging from Ukraine to gun control reveal deep bias and the lure of excitement more than the pursuit of a constructive answer.
People crave excitement in their politics. Whether it is through asserting their own opinion or in battling others, politics offers a great ground for this high. The cost, however, comes in poor judgment and dangerous decisions. George W. Bush was elected twice, Vladimir Putin has much support, climate change is denied, and an intoxicated Mayor of Toronto, Rob Ford, may be re-elected.
Few are willing to admit their role in this state of affairs, but they will gladly see the ill in others. Even fewer, including often myself, will admit that they don’t really know how to think through a challenge, political or otherwise. This may seem absurd (thinking feels as natural as walking), but the formation of political opinion is a complex affair, a flawed duet between our minds and outside input. Media, government propaganda, family, culture, and our own unique set of personal experiences, from traumas to chance meetings, all play into the mix. High states of emotion, “excitement”, also weigh in, making us dumb and easily manipulated….
This step may also be a precursor for another that involves the ordinary person. Today being a citizen involves occasional voting, politics as spectator sport, and, among some, being a watchdog against corruption or street activism. What may be required is more citizens’ participation in local democracy, not just in assemblies or casting votes, but in local management and administration.
This will help people understand the complexities of politics, gain greater responsibility, and mitigate some of the vices of centralisation and representative democracy. It may also harmonise with the information age, where everyone, it seems, wishes to be empowered.
Do people have time in their busy lives? A rotational involvement in local affairs can help solve this, and many might even find the engagement enjoyable. This injection of a real citizen into the mix may improve the future of politics while large institutions continue to hum their tune.
In the end, a citizen who has responsibility for his actions can probably make any structure work, while rejecting any that would endanger his independence and dignity. The rise of a more intelligent and committed citizen may clarify politics, improve liberal democracies, and make populism and authoritarianism less appealing and likely paths.”

Protect the open web and the promise of the digital age


Richard Waters in the Financial Times:  “There is much to be lost if companies and nations put up fences around our digital open plains
For all the drawbacks, it is not hard to feel nostalgic about the early days of the web. Surfing between slow-loading, badly designed sites on a dial-up internet connection running at 56 kilobits per second could be frustrating. No wonder it was known as the “world wide wait”. But the “wow” factor was high. There was unparalleled access to free news and information, even if some of it was deeply untrustworthy. Then came that first, revelatory search on Google, which untangled the online jumble with almost miraculous speed.
Later, an uproarious outbreak of blogs converted what had been a passive medium into a global rant. And, with YouTube and Facebook, a mass audience found digital self-expression for the first time.
As the world wide web turns 25, it is easy to take all this for granted. For a generation that has grown up digital, it is part of the fabric of life.
It is also easy to turn away without too many qualms. More than 80 per cent of time spent on smartphones and tablets does not involve the web at all: it is whiled away in apps, which offer the instant gratification that comes from a tap or swipe of a finger.
Typing a URL on a small device, trying to stretch or shrink a web page to fit the small screen, browsing through Google links in a mobile browser: it is all starting to seem so, well, anachronistic.
But if the world wide web is coming to play a smaller part in everyday life, the significance of its relative decline should be kept in perspective. After all, the web is only one of the applications that rides on top of the internet: it is the standards and technologies of the internet itself that provide the main foundation for the modern, connected world. As long as all bits flow freely (and cheaply), the promise of the digital age will remain intact.
Before declaring the web era over and moving on, however, it is worth dwelling on what it represents – and what could be lost if this early manifestation of digital life were to be consigned to history.
Sir Tim Berners-Lee, who wrote the technical paper a quarter of a century ago that laid out the architecture of the web, certainly senses the threat. The open technical standards and open access that lie at the heart of the web – based on the freedom to link any online document to any other – are not guaranteed. What is needed, he argued this week, is nothing less than a digital bill of rights: a statement that would enshrine the ideals on which the medium was founded.
As this suggests, the web has always been much more than a technology. It is a state of mind, a dream of participation, a call to digital freedom that transcends geography. What place it finds in the connected world of tomorrow will help define what it means to be a digital citizen…”

Computational Social Science: Exciting Progress and Future Directions


Duncan Watts in The Bridge: “The past 15 years have witnessed a remarkable increase in both the scale and scope of social and behavioral data available to researchers. Over the same period, and driven by the same explosion in data, the study of social phenomena has increasingly become the province of computer scientists, physicists, and other “hard” scientists. Papers on social networks and related topics appear routinely in top science journals and computer science conferences; network science research centers and institutes are sprouting up at top universities; and funding agencies from DARPA to NSF have moved quickly to embrace what is being called computational social science.
Against these exciting developments stands a stubborn fact: in spite of many thousands of published papers, there’s been surprisingly little progress on the “big” questions that motivated the field of computational social science—questions concerning systemic risk in financial systems, problem solving in complex organizations, and the dynamics of epidemics or social movements, among others.
Of the many reasons for this state of affairs, I concentrate here on three. First, social science problems are almost always more difficult than they seem. Second, the data required to address many problems of interest to social scientists remain difficult to assemble. And third, thorough exploration of complex social problems often requires the complementary application of multiple research traditions—statistical modeling and simulation, social and economic theory, lab experiments, surveys, ethnographic fieldwork, historical or archival research, and practical experience—many of which will be unfamiliar to any one researcher. In addition to explaining the particulars of these challenges, I sketch out some ideas for addressing them….”

How Maps Drive Decisions at EPA


Joseph Marks at NextGov: “The Environmental Protection Agency office charged with taking civil and criminal actions against water and air polluters used to organize its enforcement targeting meetings and conference calls around spreadsheets and graphs.

[Image: The USA National Wetlands Inventory is one of the interactive maps produced by the Geoplatform.gov tool.]

Those spreadsheets detailed places with large oil and gas production and other possible pollutants where EPA might want to focus its own inspection efforts or reach out to state-level enforcement agencies.
During the past two years, the agency has largely replaced those spreadsheets and tables with digital maps, said Harvey Simon, EPA’s geospatial information officer. The maps make it easier for participants to visualize precisely where the top polluting areas are and how those areas correspond to population centers, allowing the agency to focus inspections and enforcement efforts where they will do the most good.
“Rather than verbally going through tables and spreadsheets you have a lot of people who are not [geographic information systems] practitioners who are able to share map information,” Simon said. “That’s allowed them to take a more targeted and data-driven approach to deciding what to do where.”
The change is a result of the EPA Geoplatform, a tool built off Esri’s ArcGIS Online product, which allows companies and government agencies to build custom Web maps using base maps provided by Esri mashed up with their own data.
When the EPA Geoplatform launched in May 2012 there were about 250 people registered to create and share mapping data within the agency. That number has grown to more than 1,000 during the past 20 months, Simon said.
“The whole idea of the platform effort is to democratize the use of geospatial information within the agency,” he said. “It’s relatively simple now to make a Web map and mash up data that’s useful for your work, so many users are creating Web maps themselves without any support from a consultant or from a GIS expert in their office.”
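Under the hood, an ArcGIS Online web map is essentially a JSON document that combines an Esri-provided base map with one or more “operational layers” of the organization’s own data. The sketch below, with hypothetical layer URLs and titles, illustrates the kind of mashup the article describes; it builds a minimal web-map specification as a plain Python dictionary and is not a definitive rendering of Esri’s full schema.

```python
import json

def build_web_map(basemap_url, data_layer_url, data_layer_title):
    """Return a minimal web-map spec: an Esri base map plus one
    operational layer of the agency's own data (a sketch only)."""
    return {
        "baseMap": {
            "baseMapLayers": [
                {
                    # Esri-provided base map tiles
                    "url": basemap_url,
                    "layerType": "ArcGISTiledMapServiceLayer",
                }
            ],
            "title": "Topographic",
        },
        "operationalLayers": [
            {
                # the organization's own data, "mashed up" on top
                "url": data_layer_url,
                "title": data_layer_title,
                "layerType": "ArcGISMapServiceLayer",
            }
        ],
        "version": "2.0",
    }

# Hypothetical example: enforcement-targeting data over a world topo base map.
web_map = build_web_map(
    "https://services.arcgisonline.com/ArcGIS/rest/services/World_Topo_Map/MapServer",
    "https://example.com/arcgis/rest/services/EnforcementTargets/MapServer",
    "Possible pollution sources",
)
print(json.dumps(web_map, indent=2))
```

Because the map is just a shareable JSON document, a non-specialist can assemble one in a web interface without GIS tooling, which is the “democratization” Simon describes.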
A governmentwide Geoplatform launched in 2012, spurred largely by agencies’ frustrations with the difficulty of sharing mapping data after the 2010 explosion of the Deepwater Horizon oil rig in the Gulf of Mexico. The platform’s goal was twofold. First, officials wanted to share mapping data more widely between agencies so they could avoid duplicating each other’s work and exchange data more easily during an emergency.
Second, the government wanted to simplify the process for viewing and creating Web maps so they could be used more easily by nonspecialists.
EPA’s geoplatform has essentially the same goals. The majority of the maps the agency builds using the platform aren’t publicly accessible, so the EPA doesn’t have to worry about scrubbing maps of data that could reveal personal information about citizens or proprietary data about companies. It publishes some maps that don’t pose any privacy concerns on EPA websites, on the national geoplatform, and on Data.gov, the government data repository.
Once ArcGIS Online is judged compliant with the Federal Information Security Management Act, or FISMA, which is expected this month, EPA will be able to share significantly more nonpublic maps through the national geoplatform and rely on more maps produced by other agencies, Simon said.
EPA’s geoplatform has also made it easier for the agency’s environmental justice office to share common data….”