Springwise: “When a train arrives at a station, travelers are often not spread evenly along the platform but huddled in the same spot. This is annoying for both commuters and operators, because some carriages get full while others are left empty, which leads to longer boarding times. In the Netherlands, the NS Reisplanner Xtra app has already offered train users a way to find a seat using their smartphone. Now the country’s Edenspiekermann design agency has developed a platform-length LED display which provides real-time information on carriage crowdedness and other details.
Created for train operators ProRail and NS with the help of design researchers STBY, the service consists of a 180-meter-long color LED strip that spans the length of the platform. The display aims to give commuters all the information they need to know where they should wait to board the right carriage. Numbers show whether the carriage is first or standard class, and the exact positions of the doors are also marked. Symbols show which carriages are best for bikes, buggies, wheelchairs and large luggage, as well as which are quiet carriages. The boards also work with infrared sensors located on each train that detect how full each carriage is. A green strip means there are seats available, a yellow strip indicates that the carriage is fairly crowded, and a red strip means it’s full.
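The article does not spell out how the sensor readings map onto the green/yellow/red strips; as a minimal sketch, assuming hypothetical occupancy thresholds and a made-up per-carriage data structure (none of this reflects ProRail/NS internals), the logic might look like this:

```python
# Toy sketch of the crowdedness-to-color logic described above.
# Thresholds and data structures are assumptions, not ProRail/NS specifics.

def carriage_color(occupied_seats: int, total_seats: int) -> str:
    """Map an infrared occupancy estimate to an LED strip color."""
    if total_seats == 0:
        return "red"
    load = occupied_seats / total_seats
    if load < 0.6:          # plenty of seats left
        return "green"
    if load < 0.9:          # fairly crowded
        return "yellow"
    return "red"            # effectively full

# Example: occupancy readings for an arriving train, one entry per carriage.
train = [
    {"class": 1, "occupied": 12, "seats": 40, "features": ["quiet"]},
    {"class": 2, "occupied": 55, "seats": 72, "features": ["bikes"]},
    {"class": 2, "occupied": 70, "seats": 72, "features": ["wheelchair"]},
]

for i, c in enumerate(train, start=1):
    print(f"carriage {i}: class {c['class']}, "
          f"{carriage_color(c['occupied'], c['seats'])}, {c['features']}")
```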
Website: www.edenspiekermann.com”
As Data Overflows Online, Researchers Grapple With Ethics
The New York Times: “Scholars are exhilarated by the prospect of tapping into the vast troves of personal data collected by Facebook, Google, Amazon and a host of start-ups, which they say could transform social science research.
Once forced to conduct painstaking personal interviews with subjects, scientists can now sit at a screen and instantly play with the digital experiences of millions of Internet users. It is the frontier of social science — experiments on people who may never even know they are subjects of study, let alone explicitly consent.
“This is a new era,” said Jeffrey T. Hancock, a Cornell University professor of communication and information science. “I liken it a little bit to when chemistry got the microscope.”
But the new era has brought some controversy with it. Professor Hancock was a co-author of the Facebook study in which the social network quietly manipulated the news feeds of nearly 700,000 people to learn how the changes affected their emotions. When the research was published in June, the outrage was immediate…
Such testing raises fundamental questions. What types of experiments are so intrusive that they need prior consent or prompt disclosure after the fact? How do companies make sure that customers have a clear understanding of how their personal information might be used? Who even decides what the rules should be?
Existing federal rules governing research on human subjects, intended for medical research, generally require consent from those studied unless the potential for harm is minimal. But many social science scholars say the federal rules never contemplated large-scale research on Internet users and provide inadequate guidance for it.
For Internet projects conducted by university researchers, institutional review boards can be helpful in vetting projects. However, corporate researchers like those at Facebook don’t face such formal reviews.
Sinan Aral, a professor at the Massachusetts Institute of Technology’s Sloan School of Management who has conducted large-scale social experiments with several tech companies, said any new rules must be carefully formulated.
“We need to understand how to think about these rules without chilling the research that has the promise of moving us miles and miles ahead of where we are today in understanding human populations,” he said. Professor Aral is planning a panel discussion on ethics at an M.I.T. conference on digital experimentation in October. (The professor also does some data analysis for The New York Times Company.)
Mary L. Gray, a senior researcher at Microsoft Research and associate professor at Indiana University’s Media School, who has worked extensively on ethics in social science, said that too often, researchers conducting digital experiments work in isolation with little outside guidance.
She and others at Microsoft Research spent the last two years setting up an ethics advisory committee and training program for researchers in the company’s labs who are working with human subjects. She is now working with Professor Hancock to bring such thinking to the broader research world.
“If everyone knew the right thing to do, we would never have anyone hurt,” she said. “We really don’t have a place where we can have these conversations.”…
Public Innovation through Collaboration and Design
New book edited by Christopher Ansell and Jacob Torfing: “While innovation has long been a major topic of research and scholarly interest for the private sector, it is still an emerging theme in the field of public management. While ‘results-oriented’ public management may be here to stay, scholars and practitioners are now shifting their attention to the process of management and to how the public sector can create ‘value’.
This volume provides insights for practitioners who are interested in developing an innovation strategy for their city, agency, or administration and will be essential reading for scholars, practitioners and students in the field of public policy and public administration.
Knowledge is Beautiful
New book by David McCandless: “In this mind-blowing follow-up to the bestselling Information is Beautiful, the undisputed king of infographics David McCandless uses stunning and unique visuals to reveal unexpected insights into how the world really works. Every minute of every hour of every day we are bombarded with information – be it on television, in print or online. How can we relate to this mind-numbing overload? Enter David McCandless and his amazing infographics: simple, elegant ways to understand information too complex or abstract to grasp any way but visually. McCandless creates dazzling displays that blend the facts with their connections, contexts and relationships, making information meaningful, entertaining – and beautiful. Knowledge is Beautiful is an endlessly fascinating spin through the world of visualized data, all of it bearing the hallmark of David McCandless’s ground-breaking signature style. Taking infographics to the next level, Knowledge is Beautiful offers a deeper, more wide-ranging look at the world and its history. Covering everything from dog breeds and movie plots to the most commonly used passwords and crazy global warming solutions, Knowledge is Beautiful is guaranteed to enrich your understanding of the world.”
Delivering a Customer-Focused Government Through Smarter IT
White House: “As technology changes, government must change with it to address new challenges and take advantage of new opportunities. This Administration has made important strides in modernizing government so that it serves its constituents more effectively and efficiently, but we know there is much more to do.
Last year, a group of digital and technology experts from the private sector helped us fix HealthCare.gov – a turnaround that enabled millions of Americans to sign up for quality health insurance. This effort also reminded us why the President’s commitment to bringing more of the nation’s top information technology (IT) talent into government is so critical to delivering the best possible results for our customers – the American people.
A core part of the President’s Management Agenda is improving the value we deliver to citizens through Federal IT. That’s why, today, the Administration is formally launching the U.S. Digital Service. The Digital Service will be a small team made up of our country’s brightest digital talent that will work with agencies to remove barriers to exceptional service delivery and help remake the digital experience that people and businesses have with their government…
The Digital Service will also collaborate closely with 18F, an exciting new unit of the U.S. General Services Administration (GSA). GSA’s 18F houses a growing group of talented developers and digital professionals who are designing and building the actual digital platforms and providing services across the government….
Leveraging Best Practices with the Digital Services Playbook
To help the Digital Service achieve its mission, today the Administration is releasing the initial version of a Digital Services Playbook that lays out best practices for building effective digital services like web and mobile applications and will serve as a guide for agencies across government. To increase the success of government digital service projects, this playbook outlines 13 key “plays” drawn from private and public-sector best practices that, if followed together, will help federal agencies deliver services that work well for users and require less time and money to develop and operate.
The technologies used to create digital services are changing rapidly. The Playbook is designed to encourage the government to adopt the best of these advances into our own work. To further strengthen this important tool, we encourage folks across the public and private sectors to provide feedback on the Playbook.
Using Agile Processes to Procure Digital Services with the TechFAR Handbook
To ensure government has the right tech tools to do its job, the Administration is also today launching the TechFAR Handbook, a guide that explains how agencies can execute key plays in the Playbook in ways consistent with the Federal Acquisition Regulation (FAR), which governs how the government must buy services from the private sector….
Stay informed — sign up here to monitor the latest news from the U.S. Digital Service.”
The city as living laboratory: A playground for the innovative development of smart city applications
Paper by Carina Veeckman and Shenja van der Graaf: “Nowadays the smart-city concept is shifting from a top-down, merely technological approach towards bottom-up processes that are based on the participation of creative citizens, research organisations and companies. Here, the city acts as an urban innovation ecosystem in which smart applications, open government data and new modes of participation are fostering innovation in the city. However, detailed analyses of how to manage smart city initiatives, as well as descriptions of the underlying challenges and barriers, still seem scarce. Therefore, this paper investigates four collaborative smart city initiatives in Europe to learn how cities can optimize citizens’ involvement in the context of open innovation. The analytical framework focuses on the innovation ecosystem and the civic capacities to engage in the public domain. Findings show that public service delivery can be co-designed between the city and citizens, if different toolkits aligned with the specific capacities and skills of the users are provided. By providing the right tools, even ordinary citizens can take a much more active role in the evolution of their cities and generate solutions from which both the city and everyday urban life can possibly benefit.”
Reality Mining: Using Big Data to Engineer a Better World
New book by Nathan Eagle and Kate Greene: “Big Data is made up of lots of little data: numbers entered into cell phones, addresses entered into GPS devices, visits to websites, online purchases, ATM transactions, and any other activity that leaves a digital trail. Although the abuse of Big Data—surveillance, spying, hacking—has made headlines, it shouldn’t overshadow the abundant positive applications of Big Data. In Reality Mining, Nathan Eagle and Kate Greene cut through the hype and the headlines to explore the positive potential of Big Data, showing the ways in which the analysis of Big Data (“Reality Mining”) can be used to improve human systems as varied as political polling and disease tracking, while considering user privacy.”
What Cars Did for Today’s World, Data May Do for Tomorrow’s
Quentin Hardy in the New York Times: “New technology products head at us constantly. There’s the latest smartphone, the shiny new app, the hot social network, even the smarter thermostat.
As great (or not) as all these may be, each thing is a small part of a much bigger process that’s rarely admired. They all belong inside a world-changing ecosystem of digital hardware and software, spreading into every area of our lives.
Thinking about what is going on behind the scenes is easier if we consider the automobile, also known as “the machine that changed the world.” Cars succeeded through the widespread construction of highways and gas stations. Those things created a global supply chain of steel plants and refineries. Seemingly unrelated things, including suburbs, fast food and drive-time talk radio, arose from that success.
Today’s dominant industrial ecosystem is relentlessly acquiring and processing digital information. It demands newer and better ways of collecting, shipping, and processing data, much the way cars needed better road building. And it’s spinning out its own unseen businesses.
A few recent developments illustrate the new ecosystem. General Electric plans to announce Monday that it has created a “data lake” method of analyzing sensor information from industrial machinery in places like railroads, airlines, hospitals and utilities. G.E. has been putting sensors on everything it can for a couple of years, and now it is out to read all that information quickly.
The company, working with an outfit called Pivotal, said that in the last three months it has looked at information from 3.4 million miles of flights by 24 airlines using G.E. jet engines. G.E. said it figured out things like possible defects 2,000 times as fast as it could before.
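The article doesn’t describe G.E.’s analysis pipeline. Purely as an illustrative sketch, with invented sensor names, readings, and thresholds (nothing here reflects G.E.’s or Pivotal’s actual software), the kind of fleet-wide screening for possible defects might look like:

```python
# Illustrative sketch only: screen a hypothetical table of per-flight engine
# readings for values that drift well outside the fleet-wide norm.
import pandas as pd

# Hypothetical columns; real jet-engine sensor schemas are proprietary.
flights = pd.DataFrame({
    "engine_id":      ["E1", "E1", "E2", "E2", "E3"],
    "exhaust_temp_c": [612, 618, 607, 655, 610],
    "vibration_mm_s": [2.1, 2.3, 2.0, 4.8, 2.2],
})

stats = flights[["exhaust_temp_c", "vibration_mm_s"]].agg(["mean", "std"])

def possible_defect(row, k=1.5):
    """Flag rows where any reading is more than k standard deviations from the fleet mean."""
    return any(
        abs(row[col] - stats.loc["mean", col]) > k * stats.loc["std", col]
        for col in stats.columns
    )

flights["possible_defect"] = flights.apply(possible_defect, axis=1)
print(flights[flights["possible_defect"]])  # rows worth a closer look
```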
The company has to, since it’s getting so much more data. “In 10 years, 17 billion pieces of equipment will have sensors,” said William Ruh, vice president of G.E. software. “We’re only one-tenth of the way there.”
It hardly matters if Mr. Ruh is off by five billion or so. Billions of humans are already augmenting that number with their own packages of sensors, called smartphones, fitness bands and wearable computers. Almost all of that will get uploaded someplace too.
Shipping that data creates challenges. In June, researchers at the University of California, San Diego announced a method of engineering fiber optic cable that could make digital networks run 10 times faster. The idea is to get more parts of the system working closer to the speed of light, without involving the “slow” processing of electronic semiconductors.
“We’re going from millions of personal computers and billions of smartphones to tens of billions of devices, with and without people, and that is the early phase of all this,” said Larry Smarr, director of the California Institute for Telecommunications and Information Technology, located inside U.C.S.D. “A gigabit a second was fast in commercial networks; now we’re at 100 gigabits a second. A terabit a second will come and go. A petabit a second will come and go.”
In other words, Mr. Smarr thinks commercial networks will eventually be 10,000 times as fast as today’s best systems. “It will have to grow, if we’re going to continue what has become our primary basis of wealth creation,” he said.
Add computation to collection and transport. Last month, U.C. Berkeley’s AMP Lab, created two years ago for research into new kinds of large-scale computing, spun out a company called Databricks, which uses new kinds of software for fast data analysis on a rental basis. Databricks plugs into the one million-plus computer servers inside the global system of Amazon Web Services, and will soon work inside similar-size megacomputing systems from Google and Microsoft.
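Databricks’ analysis software is built around Apache Spark, the cluster-computing engine that came out of the AMP Lab. As a minimal PySpark sketch of the kind of rented, large-scale aggregation described here (the storage path and column names are made up for illustration):

```python
# Minimal PySpark sketch; dataset, path, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("digest-sketch").getOrCreate()

# In practice this would point at cloud storage holding billions of records.
events = spark.read.json("s3://example-bucket/clickstream/*.json")

daily_users = (
    events
    .groupBy(F.to_date("timestamp").alias("day"), "country")
    .agg(F.countDistinct("user_id").alias("unique_users"))
    .orderBy("day")
)

daily_users.show()
```

The point of the paragraph above is less the query itself than that such a job can be rented by the hour on Amazon’s, Google’s, or Microsoft’s clusters rather than requiring owned hardware.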
It was the second company out of the AMP Lab this year. The first, called Mesosphere, enables a kind of pooling of computing services, building the efficiency of even million-computer systems….”
How you can help build a more agile government
Luke Fretwell at GovFresh: “Earlier this year, I began doing research work with CivicActions on agile development in government — who was doing it, how, and what was needed to deploy it successfully.
After the HealthCare.gov launch mishaps, calls for agile practices as the panacea for all of government’s IT woes reached a high. While treating agile as the ultimate solution oversimplifies the issue, we’ve evolved as professions (both software development and public service) to the point where moving toward an iterative approach to operations is the way of the future.
My own formal introduction to agile began with my work with CivicActions, so the research coincided with an introductory immersion into how government is using it. Having been involved with startups for the past 15 years, I treat iterative development as the norm; the added layer of project management processes, however, has forced me to become a better professional overall.
What I’ve found through many discussions and interviews is that you can’t just snap your fingers and execute agile within the framework of government bureaucracy. There are a number of issues — from procurement to project management training to executive-level commitment to organization-wide culture change — that hinder its adoption. For IT, launching a new website or app is the easy part. Changing IT operational processes and culture is often overlooked or avoided, especially by short-term executives, because it reaches into the granular organizational challenges most people don’t want to bother with.
After talking with a number of agile practitioners in government and the private sector, it was clear there was enthusiasm around how agile could be applied to fundamentally change the way government works. Beyond just execution by project management professionals, everyone I spoke with talked about how deploying agile gives them a stronger sense of public service.
What came from these discussions is the desire to have a stronger community of practitioners and those interested in deploying it to better support one another.
To meet that need, a group of federal, state, local government and private sector professionals have formed Agile for Gov, a “community-powered network of agile government professionals.”…
Monitoring Arms Control Compliance With Web Intelligence
Chris Holden and Maynard Holliday at Commons Lab: “Traditional monitoring of arms control treaties, agreements, and commitments has required the use of National Technical Means (NTM)—large satellites, phased array radars, and other technological solutions. NTM was a good solution when the treaties focused on large items for observation, such as missile silos or nuclear test facilities. As the targets of interest have shrunk by orders of magnitude, the need for other, more ubiquitous, sensor capabilities has increased. The rise in web-based, or cloud-based, analytic capabilities will have a significant influence on the future of arms control monitoring and the role of citizen involvement.
Since 1999, the U.S. Department of State has had at its disposal the Key Verification Assets Fund (V Fund), which was established by Congress. The Fund helps preserve critical verification assets and promotes the development of new technologies that support the verification of and compliance with arms control, nonproliferation, and disarmament requirements.
Sponsored by the V Fund to advance web-based analytic capabilities, Sandia National Laboratories, in collaboration with Recorded Future (RF), synthesized open-source data streams from a wide variety of traditional and nontraditional web sources in multiple languages along with topical texts and articles on national security policy to determine the efficacy of monitoring chemical and biological arms control agreements and compliance. The team used novel technology involving linguistic algorithms to extract temporal signals from unstructured text and organize that unstructured text into a multidimensional structure for analysis. In doing so, the algorithm identifies the underlying associations between entities and events across documents and sources over time. Using this capability, the team analyzed several events that could serve as analogs to treaty noncompliance, technical breakout, or an intentional attack. These events included the H7N9 bird flu outbreak in China, the Shanghai pig die-off and the fungal meningitis outbreak in the United States last year.
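The paper itself (linked below) does not publish the algorithm. Purely as a toy illustration of the general idea (pulling dates and watched entities out of unstructured text and indexing them over time), a sketch might be:

```python
# Toy sketch only: extract (date, entity) pairs from free text and build a
# simple time-indexed structure. The Sandia/Recorded Future algorithms are
# far more sophisticated; nothing here reflects their actual implementation.
import re
from collections import defaultdict

ENTITIES = ["H7N9", "hospitalization", "fatality", "outbreak"]   # assumed watch list
DATE_RE = re.compile(r"\b(\d{4}-\d{2}-\d{2})\b")                 # ISO dates only

def index_documents(docs):
    """Map each extracted date to the entities mentioned alongside it."""
    timeline = defaultdict(set)
    for doc in docs:
        dates = DATE_RE.findall(doc)
        mentioned = [e for e in ENTITIES if e.lower() in doc.lower()]
        for d in dates:
            timeline[d].update(mentioned)
    return dict(timeline)

docs = [
    "2013-04-02: Shanghai reports new H7N9 hospitalization figures.",
    "2013-04-05: Officials confirm another fatality linked to the outbreak.",
]
print(index_documents(docs))
```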
For H7N9 we found that open source social media were the first to report the outbreak and give ongoing updates. The Sandia RF system was able to roughly estimate lethality based on temporal hospitalization and fatality reporting. For the Shanghai pig die-off the analysis tracked the rapid assessment by Chinese authorities that H7N9 was not the cause of the pig die-off as had been originally speculated. Open source reporting highlighted a reduced market for pork in China due to the very public dead pig display in Shanghai. Possible downstream health effects were predicted (e.g., contaminated water supply and other overall food ecosystem concerns). In addition, legitimate U.S. food security concerns were raised based on the Chinese purchase of the largest U.S. pork producer (Smithfield) because of a fear of potential import of tainted pork into the United States….
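A back-of-the-envelope version of the lethality estimate described above would be a naive case fatality ratio computed from cumulative hospitalization and fatality counts over time; the numbers below are invented for illustration, not the actual H7N9 data:

```python
# Naive case-fatality estimate from cumulative counts reported over time.
# The figures are invented for illustration only.
reports = [
    # (date, cumulative_hospitalizations, cumulative_fatalities)
    ("2013-04-01", 14, 4),
    ("2013-04-08", 33, 9),
    ("2013-04-15", 63, 14),
]

for date, hospitalized, deaths in reports:
    cfr = deaths / hospitalized
    print(f"{date}: rough case fatality ratio = {cfr:.0%}")
```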
To read the full paper, please click here.”