What is CitySDK?: “Helping cities to open their data and giving developers the tools they need, CitySDK aims for a step change in how services are delivered in urban environments. With governments around the world looking at open data as a kick-start for their economies, CitySDK provides better and easier ways for cities throughout Europe to release their data in a format that is easy for developers to re-use.
Drawing on best practices from around the world, the project foresees the development of a toolkit – CitySDK v1.0 – that can be used by any city looking to create a sustainable infrastructure of “city apps”.
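To make the idea of “easy to re-use” concrete, here is a minimal sketch of how a developer might consume a CitySDK-style open-data endpoint. The host, path, parameters and response fields below are hypothetical placeholders, not the actual CitySDK API specification.

```python
# Hypothetical example: querying a CitySDK-style open-data endpoint.
# The URL, parameters and response fields are illustrative assumptions,
# not the actual CitySDK API specification.
import requests

BASE_URL = "https://api.example-city.eu/citysdk/v1"  # placeholder host

def fetch_points_of_interest(category, limit=10):
    """Request a list of points of interest in a given category."""
    response = requests.get(
        f"{BASE_URL}/pois",
        params={"category": category, "limit": limit},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # assumed to be a JSON list of POI records

if __name__ == "__main__":
    for poi in fetch_points_of_interest("museum"):
        # 'name' and 'location' are assumed field names for illustration
        print(poi.get("name"), poi.get("location"))
```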
Piloting the CitySDK
The Project focuses on three Pilot domains: Smart Participation, Smart Mobility and Smart Tourism. Within each of the three domains, a large-scale Lead Pilot is carried out in one city. The experiences of the Lead Pilot will be applied in the Replication Pilots in other Partner cities.
Funding
CitySDK is a 6.8 million euro project, part-funded by the European Commission. It is a Pilot Type B within the ICT Policy Support Programme of the Competitiveness and Innovation Framework Programme. It runs from January 2012 to October 2014.”
As Data Overflows Online, Researchers Grapple With Ethics
The New York Times: “Scholars are exhilarated by the prospect of tapping into the vast troves of personal data collected by Facebook, Google, Amazon and a host of start-ups, which they say could transform social science research.
Once forced to conduct painstaking personal interviews with subjects, scientists can now sit at a screen and instantly play with the digital experiences of millions of Internet users. It is the frontier of social science — experiments on people who may never even know they are subjects of study, let alone explicitly consent.
“This is a new era,” said Jeffrey T. Hancock, a Cornell University professor of communication and information science. “I liken it a little bit to when chemistry got the microscope.”
But the new era has brought some controversy with it. Professor Hancock was a co-author of the Facebook study in which the social network quietly manipulated the news feeds of nearly 700,000 people to learn how the changes affected their emotions. When the research was published in June, the outrage was immediate…
Such testing raises fundamental questions. What types of experiments are so intrusive that they need prior consent or prompt disclosure after the fact? How do companies make sure that customers have a clear understanding of how their personal information might be used? Who even decides what the rules should be?
Existing federal rules governing research on human subjects, intended for medical research, generally require consent from those studied unless the potential for harm is minimal. But many social science scholars say the federal rules never contemplated large-scale research on Internet users and provide inadequate guidance for it.
For Internet projects conducted by university researchers, institutional review boards can be helpful in vetting projects. However, corporate researchers like those at Facebook don’t face such formal reviews.
Sinan Aral, a professor at the Massachusetts Institute of Technology’s Sloan School of Management who has conducted large-scale social experiments with several tech companies, said any new rules must be carefully formulated.
“We need to understand how to think about these rules without chilling the research that has the promise of moving us miles and miles ahead of where we are today in understanding human populations,” he said. Professor Aral is planning a panel discussion on ethics at an M.I.T. conference on digital experimentation in October. (The professor also does some data analysis for The New York Times Company.)
Mary L. Gray, a senior researcher at Microsoft Research and associate professor at Indiana University’s Media School, who has worked extensively on ethics in social science, said that too often, researchers conducting digital experiments work in isolation with little outside guidance.
She and others at Microsoft Research spent the last two years setting up an ethics advisory committee and training program for researchers in the company’s labs who are working with human subjects. She is now working with Professor Hancock to bring such thinking to the broader research world.
“If everyone knew the right thing to do, we would never have anyone hurt,” she said. “We really don’t have a place where we can have these conversations.”…
Knowledge is Beautiful
New book by David McCandless: “In this mind-blowing follow-up to the bestselling Information is Beautiful, the undisputed king of infographics David McCandless uses stunning and unique visuals to reveal unexpected insights into how the world really works. Every minute of every hour of every day we are bombarded with information – be it on television, in print or online. How can we relate to this mind-numbing overload? Enter David McCandless and his amazing infographics: simple, elegant ways to understand information too complex or abstract to grasp any way but visually. McCandless creates dazzling displays that blend the facts with their connections, contexts and relationships, making information meaningful, entertaining – and beautiful. Knowledge is Beautiful is an endlessly fascinating spin through the world of visualized data, all of it bearing the hallmark of David McCandless’s ground-breaking signature style. Taking infographics to the next level, Knowledge is Beautiful offers a deeper, more wide-ranging look at the world and its history. Covering everything from dog breeds and movie plots to the most commonly used passwords and crazy global warming solutions, Knowledge is Beautiful is guaranteed to enrich your understanding of the world.”
The city as living laboratory: A playground for the innovative development of smart city applications
Paper by Veeckman, Carina and van der Graaf, Shenja: “Nowadays the smart-city concept is shifting from a top-down, mere technological approach towards bottom-up processes that are based on the participation of creative citizens, research organisations and companies. Here, the city acts as an urban innovation ecosystem in which smart applications, open government data and new modes of participation are fostering innovation in the city. However, detailed analyses on how to manage smart city initiatives as well as descriptions of underlying challenges and barriers seem still scarce. Therefore, this paper investigates four collaborative smart city initiatives in Europe to learn how cities can optimize the citizen’s involvement in the context of open innovation. The analytical framework focuses on the innovation ecosystem and the civic capacities to engage in the public domain. Findings show that public service delivery can be co-designed between the city and citizens, if different toolkits aligned with the specific capacities and skills of the users are provided. By providing the right tools, even ordinary citizens can take a much more active role in the evolution of their cities and generate solutions from which both the city and everyday urban life can possibly benefit.”
Reality Mining: Using Big Data to Engineer a Better World
New book by Nathan Eagle and Kate Greene: “Big Data is made up of lots of little data: numbers entered into cell phones, addresses entered into GPS devices, visits to websites, online purchases, ATM transactions, and any other activity that leaves a digital trail. Although the abuse of Big Data—surveillance, spying, hacking—has made headlines, it shouldn’t overshadow the abundant positive applications of Big Data. In Reality Mining, Nathan Eagle and Kate Greene cut through the hype and the headlines to explore the positive potential of Big Data, showing the ways in which the analysis of Big Data (“Reality Mining”) can be used to improve human systems as varied as political polling and disease tracking, while considering user privacy.”
Monitoring Arms Control Compliance With Web Intelligence
Chris Holden and Maynard Holliday at Commons Lab: “Traditional monitoring of arms control treaties, agreements, and commitments has required the use of National Technical Means (NTM)—large satellites, phased array radars, and other technological solutions. NTM was a good solution when the treaties focused on large items for observation, such as missile silos or nuclear test facilities. As the targets of interest have shrunk by orders of magnitude, the need for other, more ubiquitous, sensor capabilities has increased. The rise in web-based, or cloud-based, analytic capabilities will have a significant influence on the future of arms control monitoring and the role of citizen involvement.
Since 1999, the U.S. Department of State has had at its disposal the Key Verification Assets Fund (V Fund), which was established by Congress. The Fund helps preserve critical verification assets and promotes the development of new technologies that support the verification of and compliance with arms control, nonproliferation, and disarmament requirements.
Sponsored by the V Fund to advance web-based analytic capabilities, Sandia National Laboratories, in collaboration with Recorded Future (RF), synthesized open-source data streams from a wide variety of traditional and nontraditional web sources in multiple languages along with topical texts and articles on national security policy to determine the efficacy of monitoring chemical and biological arms control agreements and compliance. The team used novel technology involving linguistic algorithms to extract temporal signals from unstructured text and organize that unstructured text into a multidimensional structure for analysis. In doing so, the algorithm identifies the underlying associations between entities and events across documents and sources over time. Using this capability, the team analyzed several events that could serve as analogs to treaty noncompliance, technical breakout, or an intentional attack. These events included the H7N9 bird flu outbreak in China, the Shanghai pig die-off and the fungal meningitis outbreak in the United States last year.
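The paper does not publish the underlying code, but the basic idea of pulling temporal signals and entity associations out of unstructured text can be sketched briefly. The sketch below is a simplified illustration of that general technique, not the Sandia/Recorded Future system: a regular expression stands in for date extraction and a fixed watch-list of entities stands in for the project’s linguistic algorithms.

```python
# Simplified sketch of temporal/entity extraction from unstructured text.
# This illustrates the general technique only; the date pattern and entity
# watch-list are placeholders for real linguistic models.
import re
from collections import defaultdict

DATE_PATTERN = re.compile(r"\b(\d{4})-(\d{2})-(\d{2})\b")  # ISO-style dates
WATCH_LIST = {"H7N9", "Shanghai", "meningitis"}            # example entities

def index_documents(documents):
    """Group entity mentions by the dates that appear in the same document."""
    timeline = defaultdict(lambda: defaultdict(int))
    for doc in documents:
        dates = DATE_PATTERN.findall(doc)
        entities = {e for e in WATCH_LIST if e.lower() in doc.lower()}
        for year, month, day in dates:
            for entity in entities:
                timeline[f"{year}-{month}-{day}"][entity] += 1
    return timeline

docs = [
    "2013-04-02: Hospitals in Shanghai report new H7N9 infections.",
    "Officials said on 2013-04-05 that the pig die-off was unrelated to H7N9.",
]
for date, counts in sorted(index_documents(docs).items()):
    print(date, dict(counts))
```

Aggregating such date/entity co-occurrences over many sources is what lets an analyst see when reporting about an event begins and how it spreads, which is the signal the project used for its outbreak analogs.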
For H7N9 we found that open source social media were the first to report the outbreak and give ongoing updates. The Sandia RF system was able to roughly estimate lethality based on temporal hospitalization and fatality reporting. For the Shanghai pig die-off the analysis tracked the rapid assessment by Chinese authorities that H7N9 was not the cause of the pig die-off as had been originally speculated. Open source reporting highlighted a reduced market for pork in China due to the very public dead pig display in Shanghai. Possible downstream health effects were predicted (e.g., contaminated water supply and other overall food ecosystem concerns). In addition, legitimate U.S. food security concerns were raised based on the Chinese purchase of the largest U.S. pork producer (Smithfield) because of a fear of potential import of tainted pork into the United States….
To read the full paper, please click here.”
EU-funded tool to help our brain deal with big data
EU Press Release: “Every single minute, the world generates 1.7 million billion bytes of data, equal to 360,000 DVDs. How can our brain deal with increasingly big and complex datasets? EU researchers are developing an interactive system which not only presents data the way you like it, but also changes the presentation constantly in order to prevent brain overload. The project could enable students to study more efficiently or journalists to cross check sources more quickly. Several museums in Germany, the Netherlands, the UK and the United States have already shown interest in the new technology.
Data is everywhere: it can either be created by people or generated by machines, such as sensors gathering climate information, satellite imagery, digital pictures and videos, purchase transaction records, GPS signals, etc. This information is a real gold mine. But it is also challenging: today’s datasets are so huge and complex to process that they require new ideas, tools and infrastructures.
Researchers within CEEDs (@ceedsproject) are transposing big data into an interactive environment to allow the human mind to generate new ideas more efficiently. They have built what they are calling an eXperience Induction Machine (XIM) that uses virtual reality to enable a user to ‘step inside’ large datasets. This immersive multi-modal environment – located at Pompeu Fabra University in Barcelona – also contains a panoply of sensors which allows the system to present the information in the right way to the user, constantly tailored according to their reactions as they examine the data. These reactions – such as gestures, eye movements or heart rate – are monitored by the system and used to adapt the way in which the data is presented.
Jonathan Freeman, Professor of Psychology at Goldsmiths, University of London and coordinator of CEEDs, explains: “The system acknowledges when participants are getting fatigued or overloaded with information. And it adapts accordingly. It either simplifies the visualisations so as to reduce the cognitive load, thus keeping the user less stressed and more able to focus. Or it will guide the person to areas of the data representation that are not as heavy in information.”
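One way to picture that adaptation loop is a controller that maps a cognitive-load estimate (derived from signals such as heart rate or gaze) onto a level of visual detail. The sketch below is a hypothetical simplification of the idea described above, not the CEEDs/XIM software; the load metric and thresholds are invented for illustration.

```python
# Hypothetical sketch of an adaptive-visualisation loop in the spirit of the
# XIM described above. The load estimate and thresholds are invented for
# illustration; the real system fuses gestures, eye movements and heart rate
# through far more sophisticated models.
from dataclasses import dataclass

@dataclass
class UserState:
    heart_rate: float       # beats per minute
    gaze_dispersion: float  # 0.0 (focused) .. 1.0 (scattered)

def estimate_load(state: UserState) -> float:
    """Crude proxy for cognitive load on a 0..1 scale."""
    hr_component = min(max((state.heart_rate - 60) / 60, 0.0), 1.0)
    return 0.5 * hr_component + 0.5 * state.gaze_dispersion

def choose_detail_level(load: float) -> str:
    """Reduce visual detail as the estimated load rises."""
    if load > 0.75:
        return "overview"   # simplify aggressively, guide to sparser regions
    if load > 0.4:
        return "medium"     # hide secondary annotations
    return "full"           # show the complete visualisation

state = UserState(heart_rate=110, gaze_dispersion=0.8)
print(choose_detail_level(estimate_load(state)))  # -> "overview"
```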
Neuroscientists were the first group the CEEDs researchers tried their machine on (BrainX3). It took the typically huge datasets generated in this scientific discipline and animated them with visual and sound displays. By providing subliminal clues, such as flashing arrows, the machine guided the neuroscientists to areas of the data that were potentially more interesting to each person. First pilots have already demonstrated the power of this approach in gaining new insights into the organisation of the brain….”
The Emergence of Government Innovation Teams
Hollie Russon Gilman at TechTank: “A new global currency is emerging. Governments understand that people at home and abroad evaluate them based on how they use technology and innovative approaches in their service delivery and citizen engagement. This raises opportunities, and critical questions about the role of innovation in 21st century governance.
Bloomberg Philanthropies and Nesta, the UK’s innovation foundation, recently released a global report highlighting 20 government innovation teams. Importantly, the study included teams that were established and funded by all levels of government (city, regional and national) and that aim to find creative solutions to seemingly intractable problems. The report covers 20 teams across six continents and identifies some basic principles and commonalities that are instructive for all types of innovators, inside and outside of government.
Using Government to Locally Engage
One of the challenges of representative democracy is that elected officials and civil servants spend their time in bureaucracies, isolated from the very people they aim to serve. Perhaps there can be different models. For example, Seoul’s Innovation Bureau is engaging citizens to re-design and re-imagine public services. Seoul is dedicated to becoming a Sharing City, including Tool Kit Centers where citizens can borrow machinery they would rarely use themselves but that benefits the whole community. This approach puts citizens at the center of their communities and leverages government to work for the people…
As I’ve outlined in an earlier TechTank post, there are institutional constraints on governments trying the unknown: potential electoral costs, greater disillusionment, and gaps in vital service delivery. Yet, despite all of these barriers, there are a variety of promising tools. For example, Finland has Sitra, an innovation fund whose mission is to foster experimentation to transform a diverse set of policy issues, including sustainable energy and healthcare. Sitra invests both in practical research and experiments that advance public-sector issues and in early-stage companies.
We need a deeper understanding of the opportunities, and challenges, of innovation in government. Luckily, many researchers, think tanks, and organizations are beginning this analysis. For example, Professor and Associate Dean Anita McGahan of the Rotman School of Management at the University of Toronto calls for a more strategic approach toward understanding the use of innovation, including big data, in the public sector…”
Using technology, data and crowdsourcing to hack infrastructure problems
Courtney M. Fowler at CAFWD.ORG: “Technology has become a way of life for most Americans, not just for communication but also for many daily activities. However, there’s more that can be done than just booking a trip or crushing candy. With a majority of Americans now owning smartphones, it’s only becoming more obvious that there’s room for governments to engage the public and provide more bang for their buck via technology.
CA Fwd has been putting on an “Open Data roadshow” around the state to highlight ways the marriage of tech and info can make government more efficient and transparent.
Jurisdictions have also been discovering that using technology and smartphone apps can be beneficial in the pursuit of improving infrastructure. Saving any amount of money on such projects is especially important for California, where it’s been estimated the state will only have half of the $765 billion needed for infrastructure investments over the next decade.
One of the best examples of applying technology to infrastructure problems comes from South Carolina, where an innovative bridge-monitoring system is producing real savings, despite being in use on only eight bridges.
Girder sensors placed on each bridge measure its carrying capacity and can be monitored 24/7. Although the monitors don’t eliminate the need for inspections, the technology does make them significantly less frequent. Data from the monitors also led the South Carolina Department of Transportation to correct one bridge’s problems with a $100,000 retrofit, rather than spending $800,000 to replace it…”
In total, having the monitors on just eight bridges, at a cost of about $50,000 per bridge, saved taxpayers $5 million.
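The article gives only the headline numbers (about $50,000 per bridge on eight bridges, roughly a $400,000 outlay, against $5 million in avoided work), not the monitoring logic itself. Purely as a sketch of how a capacity threshold check might look, with invented load readings and ratings and only the cost figures taken from the article:

```python
# Illustrative sketch of a girder-sensor threshold check. The capacity rating,
# load readings and alert threshold are invented for illustration; only the
# cost figures come from the article.
SENSOR_COST_PER_BRIDGE = 50_000   # ~$50,000 per bridge (from the article)
MONITORED_BRIDGES = 8             # eight bridges monitored (from the article)
REPORTED_SAVINGS = 5_000_000      # ~$5 million saved (from the article)

def assess_bridge(load_readings_tons, rated_capacity_tons, alert_ratio=0.9):
    """Flag a bridge when its measured load approaches the rated carrying capacity."""
    peak = max(load_readings_tons)
    utilisation = peak / rated_capacity_tons
    if utilisation >= alert_ratio:
        return f"ALERT: peak load at {utilisation:.0%} of capacity - schedule inspection"
    return f"OK: peak load at {utilisation:.0%} of capacity"

# Hypothetical readings (in tons) streamed from girder sensors on one bridge.
print(assess_bridge([31.0, 28.4, 37.5], rated_capacity_tons=40.0))
print(f"Program cost: ${SENSOR_COST_PER_BRIDGE * MONITORED_BRIDGES:,} "
      f"vs. reported savings: ${REPORTED_SAVINGS:,}")
```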
That kind of innovation and savings is exactly what California needs to ensure that infrastructure projects happen in a more timely and efficient fashion in the future. It’s also what is driving civic innovators to bring together technology and crowdsourcing and to make sure infrastructure projects are also results-oriented.
Opportunities for strengthening open meetings with open data
Alisha Green at the Sunlight Foundation Blog: “Governments aren’t alone in thinking about how open data can help improve the open meetings process. There are an increasing number of tools governments can use to help bolster open meetings with open data. From making public records generated by meetings more easily accessible and reusable online to inviting the public to participate in the decision-making process from wherever they may be, these tools allow governments to upgrade open meetings for the opportunities and demands of the 21st Century.
Improving open meetings with open data may involve taking advantage of simple solutions already freely available online, developing new tools within government, using open-source tools, or investing in new software, but it can all help serve the same goal: bringing more information online where it’s easily accessible to the public….
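One of those simple solutions can be as basic as publishing the records a meeting already generates in a machine-readable format. A hypothetical sketch follows; the field names and file layout are invented for illustration and do not reference any particular open-data standard.

```python
# Hypothetical sketch: exporting open-meeting records as machine-readable JSON
# so they can be reused by the public. Field names and values are invented and
# do not follow any particular standard.
import json

meetings = [
    {
        "body": "City Council",
        "date": "2014-09-15",
        "location": "Council Chambers, City Hall",
        "agenda": ["Budget amendment", "Transit corridor study"],
        "minutes_url": "https://example.gov/meetings/2014-09-15/minutes.pdf",
    },
]

with open("meetings.json", "w") as fh:
    json.dump(meetings, fh, indent=2)

print("Published", len(meetings), "meeting record(s) to meetings.json")
```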
It’s not just about making open meetings more accessible, either. More communities are thinking about how they can bring government to the people. Open meetings are typically held in government-designated buildings at specified times, but are those locations and times truly accessible for most of the public or for those who may be most directly impacted by what’s being discussed?
Technology presents opportunities for governments to engage with the public outside of regularly scheduled meetings. Tools like Speakup and Textizen, for example, are being used to increase public participation in the general decision-making process. A continually increasing array of tools provides new ways for government and the public to identify issues, share ideas, and work toward solutions, even outside of open meetings. Boston, for example, took an innovative approach to this issue with its City Hall To Go truck and other efforts, bringing government services to locations around the city rather than requiring people to come to a government building…”