The Smart City and its Citizens


Paper by Carlo Francesco Capra on “Governance and Citizen Participation in Amsterdam Smart City”: “Smart cities are associated almost exclusively with modern technology and infrastructure. However, smart cities also have the potential to enhance the involvement and contribution of citizens to urban development. This work explores the role of governance as one of the factors influencing the participation of citizens in smart city projects. Governance characteristics play a major role in explaining different typologies of citizen participation. Through a focus on the Amsterdam Smart City program as a specific case study, this research examines the characteristics of governance present in the overall program and within a selected sample of projects, and how they relate to different typologies of citizen participation. The analysis and comprehension of governance characteristics play a crucial role both for a better understanding and management of citizen participation, especially in complex settings where multiple actors are interacting….(More)”

The Open (Data) Market


Sean McDonald at Medium: “Open licensing privatizes technology and data usability. How does that affect equality and accessibility?…The open licensing movement (open data, open source software, etc.) predicates its value on increasing accessibility and transparency by removing legal and ownership restrictions on use. The groups that advocate for open data and open source code, especially in government and publicly subsidized industries, often come from transparency, accountability, and freedom of information backgrounds. These efforts, however, significantly underestimate the costs of refining, maintaining, targeting, defining a value proposition, marketing, and presenting both data and products in ways that are effective and useful for the average person. Recent research suggests the primary beneficiaries of civic technologies — those specifically built on government data or services — are privileged populations. The World Bank’s recent World Development Report goes further to point out that public digitization can be a driver of inequality.

The dynamic of self-replicating privilege in both technology and open markets is not a new phenomenon. Social science research refers to it as the Matthew Effect, which says that in open or unregulated spaces, the privileged tend to become more privileged, while the poor become poorer. While there’s no question the advent of technology brings massive potential, it is already creating significant access and achievement divides. According to the Federal Communications Commission’s annual Broadband Progress report in 2015, 42% of students in the U.S. struggle to do their homework because of web access — and 39% of rural communities don’t even have a broadband option. Internet access skews toward urban, wealthy communities, with income, geography, and demographics all playing a role in adoption. Even further, research suggests that the rich and poor use technology differently. This runs counter to the narrative of Internet eventualism, which insists that it’s simply a (small) matter of time before these access (and skills) gaps close. Evidence suggests that for upper and middle income groups access is almost universal, but the gaps for low income groups are growing…(More)”

Translator Gator


Yulistina Riyadi & Lalitia Apsar at Global Pulse: “Today Pulse Lab Jakarta launches Translator Gator, a new language game to support research initiatives in Indonesia. Players can earn phone credit by translating words between English and six common Indonesian languages. The database of keywords generated by the game will be used by researchers on topics ranging from computational social science to public policy.

Translator Gator is inspired by the need to socialise the 17 Sustainable Development Goals (SDGs), currently being integrated into the Government of Indonesia’s programme, and the need to better monitor progress against the varied indicators. Thus, Translator Gator will raise awareness of the SDGs and develop a taxonomy of keywords to inform research.

An essential element of public policy research is to pay attention to citizens’ feedback, both active and passive, for instance, citizens’ complaints to governments through official channels and on social media. To do this in a computational manner, researchers need a set of keywords, or ‘taxonomy’, by topic or government priorities for example.

Given the rich linguistic and cultural diversity of Indonesia, however, this poses some difficulties: many languages and dialects are used across different provinces and islands. On social media, such variations – including jargon – make building a list of keywords more challenging, as words, context and, by extension, meaning change from region to region. …(More)”
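For illustration only, a keyword taxonomy of the kind described above can be as simple as a mapping from a research topic to its regional and jargon variants, so that text from different provinces resolves to the same category. The topics and terms below are hypothetical placeholders, not Translator Gator output:

```python
# Minimal sketch of a keyword taxonomy mapping regional variants and
# jargon onto a single topic. Topics and terms are illustrative
# placeholders, not actual Translator Gator data.

TAXONOMY = {
    "flood": {"banjir", "air bah", "flood"},        # hypothetical variants
    "food_price": {"harga pangan", "food price"},   # hypothetical variants
}

def tag_topics(text):
    """Return the set of topics whose keyword variants appear in the text."""
    lowered = text.lower()
    return {
        topic
        for topic, variants in TAXONOMY.items()
        if any(variant in lowered for variant in variants)
    }

print(tag_topics("Jalan utama tertutup karena banjir pagi ini"))
# -> {'flood'}
```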

Idea to retire: Leaders can’t take risks or experiment


David Bray at TechTank: “Technology is rapidly changing our world. Traditionally, a nation’s physical borders marked the beginning of its sovereign space, but in the early to mid-20th century airplanes challenged this notion. Later on, space-based satellites began flying above all nations. By the early 21st century, smartphone technologies costing $100 or so gave individuals computational capabilities that dwarfed those of the multi-million dollar computers operated by large nation-states just three decades earlier.

In this period of exponential change, all of us across the public sector must work together, enabling more inclusive work across government workers, citizen-led contributions, and public-private partnerships. Institutions must empower positive change agents on the inside of public service to pioneer new ways of delivering superior results. Institutions must also open their data for greater public interaction, citizen-led remixing, and discussions.

Taken together, these actions will transform public service to truly be “We the (mobile, data-enabled, collaborative) People” working to improve our world. These actions all begin with creating spaces that give public service professionals opportunities to experiment and explore new ways of delivering superior results to the public.

21st Century Reality #1: Public service must include workspaces for those who want to experiment and explore new ways of delivering results.

The world we face now is dramatically different from the world of 50, 100, or 200 years ago. More technological change is expected to occur in the next five years than in the last 15 years combined. Advances in technology have blurred what traditionally was considered government, and consequently we must experiment and explore new ways of delivering results.

21st Century Reality #2: Public service agencies need, within reason, to be allowed to have things fail, and be allowed to take risks.

The words “expertise” and “experiments” have the same etymological root, which is “exper,” meaning “out of danger.” Whereas the motto in Silicon Valley and other innovation hubs around the world might be “fail fast and fail often,” such a model is not going to work for public service, where certain endeavors absolutely must succeed and cannot waste taxpayer funds.

The only way public sector technologists will gain the expertise needed to respond to and take advantage of the digital disruptions occurring globally will be to do “dangerous experiments” as positive change agents akin to what entrepreneurs in Silicon Valley also do….

21st Century Reality #3: Public service cannot be done solely by government professionals in a top-down fashion.

With the communication capabilities provided by smartphones, social media, and freely available apps, individual members of the public can voluntarily access, analyze, remix, and choose to contribute data and insights to better inform public service. Recognizing this shift from top-down to bottom-up activities represents the first step to the resiliency of our legacy institutions….

Putting a cultural shift into practice

Senior executives need to shift from managing those who report to them to championing and creating spaces for creativity within their organizations. Within any organization, change agents should be able to approach an executive, pitch new ideas, bring data to support those ideas, and, if a venture is approved, move forward with speed to transform public service away from our legacy approaches….

The work of public service also can be done by public-private partnerships acting beyond their own corporate interests to benefit the nation and local communities. Historically, the U.S. has lagged behind other nations, like Singapore or the U.K., in exploring innovative forms of public-private partnerships. This could change by examining the pressing issues of the day and considering how the private sector might solve challenging issues, or complement the efforts of government professionals. It could include rotations of both government and private sector professionals within public-private partnerships, so that public service might be done more collaboratively, effectively, and innovatively using alternative forms of organizational design and delivery.

If public service returns to first principles – namely, what “We the People” choose to do together – new forms of organizing, collaborating, incentivizing, and delivering results will emerge. Our exponential era requires such transformational partnerships for the future ahead….(More)”

Open Data Is Changing the World in Four Ways…


At The GovLab Blog: “New repository of case studies documents the impact of open data globally: odimpact.org.


Despite global commitments to and increasing enthusiasm for open data, little is actually known about its use and impact. What kinds of social and economic transformation has open data brought about, and what is its future potential? How—and under what circumstances—has it been most effective? How have open data practitioners mitigated risks and maximized social good?

Even as proponents of open data extol its virtues, the field continues to suffer from a paucity of empirical evidence. This limits our understanding of open data and its impact.

Over the last few months, The GovLab (@thegovlab), in collaboration with Omidyar Network (@OmidyarNetwork), has worked to address these shortcomings by developing 19 detailed open data case studies from around the world. The case studies have been selected for their sectoral and geographic representativeness. They are built in part from secondary sources (“desk research”), and also from more than 60 first-hand interviews with important players and key stakeholders. In a related collaboration with Omidyar Network, Becky Hogge (@barefoot_techie), an independent researcher, has developed an additional six open data case studies, all focused on the United Kingdom. Together, these case studies seek to provide a more nuanced understanding of the various processes and factors underlying the demand, supply, release, use and impact of open data.

Today, after receiving and integrating comments from dozens of peer reviewers through a unique open process, we are delighted to share an initial batch of 10 case studies, as well as three of Hogge’s UK-based stories. These are being made available at a new custom-built repository, Open Data’s Impact (http://odimpact.org), which will eventually house all the case studies, key findings across the studies, and additional resources related to the impact of open data. All this information will be stored in machine-readable HTML and PDF format, and will be searchable by area of impact, sector and region….(More)

Big-data analytics: the power of prediction


Rachel Willcox in Public Finance: “The ability to anticipate demands will improve planning and financial efficiency, and collecting and analysing data will enable the public sector to look ahead…

Hospitals around the country are well accustomed to huge annual rises in patient numbers as winter demand hits accident and emergency departments. But Wrightington, Wigan and Leigh NHS Foundation Trust (WWL) had to rethink service planning after unprecedented A&E demand during a sunny July 2014, which saw ambulances queuing outside the hospital. The trust now employs computer analysis to help predict and prepare for peaks in demand.

As public sector organisations grapple with ever-tighter savings targets, analysis of a broad range of historical data – big data analytics – offers an opportunity to pre-empt service requirements and so help the public sector manage demand more effectively and target scarce resources better. However, working with data to gain insight and save money is not without its challenges.

At WWL, a partnership with business support provider NHS Shared Business Services – a 50:50 joint venture between the Department of Health and technology firm Sopra Steria – resulted in a project that uses an analysis of historical data and complex algorithms to predict the most likely scenarios. In September, the partners launched HealthIntell, a suite of data reporting tools for A&E, procurement and finance.

The suite includes an application designed to help hospitals better cope with A&E pressures and meet waiting time targets. HealthIntell presents real-time data on attendances at A&E departments to doctors and other decision makers. It can predict demand on a daily and hourly basis, and allows trusts to use their own data to identify peaks and troughs – for example, the likely rise in attendances due to bad weather or major sporting events – to help deploy the right people with the right expertise at the right time….
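The underlying HealthIntell models are not described in the article; purely as a sketch of the kind of historical analysis involved, a baseline hourly demand profile can be built by averaging past attendances by weekday and hour, which is often enough to expose recurring peaks and troughs:

```python
# Illustrative baseline only: average historical A&E attendances by
# weekday and hour to expose recurring peaks. The real HealthIntell
# models are not public; this is a generic sketch.
from collections import defaultdict
from datetime import datetime
from statistics import mean

def hourly_profile(arrivals):
    """arrivals: iterable of (timestamp, count) pairs for past days.
    Returns {(weekday, hour): mean attendances} as a naive forecast."""
    buckets = defaultdict(list)
    for ts, count in arrivals:
        buckets[(ts.weekday(), ts.hour)].append(count)
    return {key: mean(vals) for key, vals in buckets.items()}

# Example: expected load for a Saturday (weekday 5) at 20:00
history = [
    (datetime(2015, 7, 4, 20), 31),
    (datetime(2015, 7, 11, 20), 28),
    (datetime(2015, 7, 18, 20), 35),
]
profile = hourly_profile(history)
print(profile[(5, 20)])  # -> 31.333...
```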

Rikke Duus, a senior teaching fellow at University College London’s School of Management, agrees strongly that an evidence-based approach to providing services, using data that is already available, is key to efficiency gains. Although the use of big data across the public sector is trailing well behind that in the private sector, pressure is mounting for it to catch up. Consumers’ experiences with private sector organisations – in particular the growing personalisation of services – are raising expectations about the sort of public services people expect to receive.

Transparency, openness and integration can benefit consumers, Duus says. “It’s about reinventing the business model to cut costs and improve efficiency. We have to use data to predict and prevent. The public-sector mindset is getting there and the huge repositories of data held across the public sector offer a great starting point, but often they don’t know how to get into it and skills are an issue,” Duus says.

Burgeoning demand for analytics expertise in retail, banking and finance has created a severe skills shortage that is allowing big-data professionals to command an average salary of £55,000 – 31% higher than the average IT position, according to a report published in November 2014 by the Tech Partnership employers’ network and business analytics company SAS. More than three quarters of posts were considered “fairly” or “very” difficult to fill, and the situation is unlikely to have eased in the interim.

Professor Robert Fildes, director of the Lancaster Centre for Forecasting, part of Lancaster University Management School, warns that public sector organisations are at a distinct disadvantage when it comes to competing for such sought-after skills.

The centre has worked on a number of public sector forecasting projects, including a Department of Health initiative to predict pay drift for its non-medical workforce and a scheme commissioned by NHS Blackpool to forecast patient activity.

“The other constraint is data,” Fildes observes. “People talk about data as if it is a uniform value. But the Department of Health doesn’t have any real data on the demand for, say, hip operations. They only have data on the operations they’ve done. The data required for analysis isn’t good enough,” he says….

Despite the challenges, projects are reaping rewards across a variety of public sector organisations. Since 2008, the London Fire Brigade (LFB) has been using software from SAS to prioritise the allocation of fire prevention resources, even pinpointing specific households most at risk of fire. The software brings together around 60 data inputs including demographic information, geographical locations, historical data, land use and deprivation levels to create lifestyle profiles for London households.

Deaths caused by fire in the capital fell by almost 50% between 2010 and 2015, according to the LFB. It attributes much of the reduction to better targeting of around 90,000 home visits the brigade carries out each year, to advise on fire safety….(More)”
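The LFB/SAS model itself, with its roughly 60 data inputs, is not public; the general idea of combining weighted risk factors into a household score that drives the prioritisation of home visits can nonetheless be sketched as follows (feature names and weights are invented for illustration):

```python
# Generic weighted risk-score sketch; the LFB/SAS model, its ~60 inputs
# and its weights are not public. Feature names and weights here are
# invented purely for illustration.

WEIGHTS = {
    "single_occupancy": 0.8,
    "older_resident": 1.2,
    "prior_incident_in_area": 1.5,
    "high_deprivation_index": 1.0,
}

def risk_score(household):
    """Sum the weights of the risk factors present for one household."""
    return sum(w for feature, w in WEIGHTS.items() if household.get(feature))

def prioritise(households):
    """Order households for home visits, highest risk first."""
    return sorted(households, key=risk_score, reverse=True)

homes = [
    {"id": "A", "older_resident": True, "single_occupancy": True},
    {"id": "B", "high_deprivation_index": True},
]
for home in prioritise(homes):
    print(home["id"], risk_score(home))
# -> A 2.0, then B 1.0
```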

 

Yahoo Releases the Largest-ever Machine Learning Dataset for Researchers


Suju Rajan at Yahoo Labs: “Data is the lifeblood of research in machine learning. However, access to truly large-scale datasets is a privilege that has been traditionally reserved for machine learning researchers and data scientists working at large companies – and out of reach for most academic researchers.

Research scientists at Yahoo Labs have long enjoyed working on large-scale machine learning problems inspired by consumer-facing products. This has enabled us to advance the thinking in areas such as search ranking, computational advertising, information retrieval, and core machine learning. A key aspect of interest to the external research community has been the application of new algorithms and methodologies to production traffic and to large-scale datasets gathered from real products.

Today, we are proud to announce the public release of the largest-ever machine learning dataset to the research community. The dataset stands at a massive ~110B events (13.5TB uncompressed) of anonymized user-news item interaction data, collected by recording the user-news item interactions of about 20M users from February 2015 to May 2015.

The Yahoo News Feed dataset is a collection based on a sample of anonymized user interactions on the news feeds of several Yahoo properties, including the Yahoo homepage, Yahoo News, Yahoo Sports, Yahoo Finance, Yahoo Movies, and Yahoo Real Estate.

Our goals are to promote independent research in the fields of large-scale machine learning and recommender systems, and to help level the playing field between industrial and academic research. The dataset is available as part of the Yahoo Labs Webscope data-sharing program, which is a reference library of scientifically-useful datasets comprising anonymized user data for non-commercial use.

In addition to the interaction data, we are providing categorized demographic information (age range, gender, and generalized geographic data) for a subset of the anonymized users. On the item side, we are releasing the title, summary, and key-phrases of the pertinent news article. The interaction data is timestamped with the relevant local time and also contains partial information about the device on which the user accessed the news feeds, which allows for interesting work in contextual recommendation and temporal data mining….(More)”
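The exact file layout is documented in the Webscope release itself. Purely as a rough illustration of the kind of record the description implies (a timestamped user-item interaction joined with coarse demographics and item text features), one might model it like this; field names here are assumptions, not the official schema:

```python
# Illustrative only: field names are assumptions, not the official
# Webscope schema. Shows the kind of joined view the description implies:
# a timestamped user-news item interaction plus coarse demographics and
# item text features.
import json
from dataclasses import dataclass

@dataclass
class Interaction:
    user_id: str          # anonymized identifier
    item_id: str
    timestamp: int        # local-time epoch seconds
    device: str           # partial device information
    age_range: str        # categorized demographics (subset of users)
    gender: str
    item_title: str
    item_keyphrases: list

line = '{"user_id": "u123", "item_id": "n456", "timestamp": 1430000000,' \
       ' "device": "mobile", "age_range": "25-34", "gender": "F",' \
       ' "item_title": "Example headline", "item_keyphrases": ["example"]}'

record = Interaction(**json.loads(line))
print(record.item_title, record.age_range)
```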

Don’t let transparency damage science


Stephan Lewandowsky and Dorothy Bishop explain in Nature “how the research community should protect its members from harassment, while encouraging the openness that has become essential to science:…

Transparency has hit the headlines. In the wake of evidence that many research findings are not reproducible, the scientific community has launched initiatives to increase data sharing, transparency and open critique. As with any new development, there are unintended consequences. Many measures that can improve science — shared data, post-publication peer review and public engagement on social media — can be turned against scientists. Endless information requests, complaints to researchers’ universities, online harassment, distortion of scientific findings and even threats of violence: these were all recurring experiences shared by researchers from a broad range of disciplines at a Royal Society-sponsored meeting last year that we organized to explore this topic. Orchestrated and well-funded harassment campaigns against researchers working in climate change and tobacco control are well documented. Some hard-line opponents to other research, such as that on nuclear fallout, vaccination, chronic fatigue syndrome or genetically modified organisms, although less resourced, have employed identical strategies….(More)”

 

Iowa fights snow with data


Patrick Marshall at GCN: “Most residents of the Mid-Atlantic states, now digging out from the recent record-setting snowstorm, probably don’t know how soon their streets will be clear.  If they lived in Iowa, however, they could simply go to the state’s Track a Plow website to see in near real time where snow plows are and in what direction they’re heading.

In fact, the Track a Plow site — the first iteration of which launched three years ago — shows much more than just the location and direction of the state’s more than 900 plows. Because they are equipped with geolocation equipment and a variety of sensors, the plows also provide information on road conditions, road closures and whether trucks are applying liquid or solid materials to counter snow and ice. That data is regularly uploaded to Track a Plow, which also offers near-real-time video and photos of conditions.


According to Eric Abrams, geospatial manager at the Iowa Department of Transportation, the service is very popular and is being used for a variety of purposes. “It’s been one of the greatest public interface things that DOT has ever done,” he said. In addition to citizens considering travel, Abrams said, the site’s heavy users include news stations, freight companies routing vehicles, and school districts determining whether to delay opening or cancel classes.

How it works

While Track a Plow launched with just location information, it has been frequently enhanced over the past two years, beginning with the installation of video cameras.  “The challenge was to find a cost-effective way to put cams in the plows and then get those images not just to supervisors but to the public,” Abrams said.  The solution he arrived at was dashboard-mounted iPhones that transmit time and location data in addition to images.  These were especially cost-effective because they were free with the department’s Verizon data plan. “Our IT division built a custom iPhone app that is configurable for how often it sends pictures back to headquarters here, where we process them and get them out to the feed,” he explained….(More)”
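Iowa DOT has not published its internal data formats or pipeline; conceptually, though, each plow report pairs a location fix with sensor readings and an optional image reference, and a back-end step like the following sketch (field names assumed) could turn such reports into a GeoJSON-style public feed:

```python
# Conceptual sketch only: Iowa DOT's actual formats and pipeline are not
# public, and field names are assumed. Converts incoming plow reports
# into a GeoJSON FeatureCollection suitable for a public web map.
import json

def to_geojson(reports):
    features = []
    for r in reports:
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [r["lon"], r["lat"]]},
            "properties": {
                "plow_id": r["plow_id"],
                "heading": r["heading"],          # direction of travel
                "material": r.get("material"),    # liquid/solid treatment
                "photo_url": r.get("photo_url"),  # latest dash-cam image
                "reported_at": r["reported_at"],
            },
        })
    return {"type": "FeatureCollection", "features": features}

sample = [{"plow_id": "IA-0042", "lat": 41.59, "lon": -93.62,
           "heading": 270, "material": "solid",
           "reported_at": "2016-01-20T14:05:00-06:00"}]
print(json.dumps(to_geojson(sample), indent=2))
```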

Opening Governance – Change, Continuity and Conceptual Ambiguity


Introduction to special issue of IDS Bulletin by Rosemary McGee and Duncan Edwards: “Open government and open data are new areas of research, advocacy and activism that have entered the governance field alongside the more established areas of transparency and accountability. This article reviews recent scholarship in these areas, pinpointing contributions to more open, transparent, accountable and responsive governance via improved practice, projects and programmes. The authors set the rest of the articles from this IDS Bulletin in the context of the ideas, relationships, processes, behaviours, policy frameworks and aid funding practices of the last five years, and critically discuss questions and weaknesses that limit the effectiveness and impact of this work. Identifying conceptual ambiguity as a key problem, they offer a series of definitions to help overcome the technical and political difficulties this causes. They also identify hype and euphemism, and offer a series of conclusions to help restore meaning and ideological content to work on open government and open data in transparent and accountable governance….(More)”