Alisha Green at the Sunlight Foundation Blog: “Governments aren’t alone in thinking about how open data can help improve the open meetings process. There are an increasing number of tools governments can use to help bolster open meetings with open data. From making public records generated by meetings more easily accessible and reusable online to inviting the public to participate in the decision-making process from wherever they may be, these tools allow governments to upgrade open meetings for the opportunities and demands of the 21st Century.
Improving open meetings with open data may involve taking advantage of simple solutions already freely available online, developing new tools within government, using open-source tools, or investing in new software, but it can all help serve the same goal: bringing more information online where it’s easily accessible to the public….
It’s not just about making open meetings more accessible, either. More communities are thinking about how they can bring government to the people. Open meetings are typically held in government-designated buildings at specified times, but are those locations and times truly accessible for most of the public or for those who may be most directly impacted by what’s being discussed?
Technology presents opportunities for governments to engage with the public outside of regularly scheduled meetings. Tools like Speakup and Textizen, for example, are being used to increase public participation in the general decision-making process. A continually increasing array of tools provide new ways for government and the public to identify issues, share ideas, and work toward solutions, even outside of open meetings. Boston, for example, took an innovative approach to this issue with its City Hall To Go truck and other efforts, bringing government services to locations around the city rather than requiring people to come to a government building…”
Thousands Can Fact-Check The News With Grasswire
Cat Zakrzewski in TechCrunch: “We all know you can’t believe everything you read on the Internet. But with Grasswire, you can at least “refute” it.
Austen Allred’s new venture allows news junkies to confirm and refute posts about breaking news. The “real-time newsroom controlled by everyone” divides posts into popular news topics, such as the Malaysia Airlines Crash in Ukraine and the Israeli-Palestinian conflict.
Once you select a topic, you then can upvote posts like Reddit to make them appear at the top of the page. If you see something that is incorrect, you can refute it by posting a source URL to information that disproves it. You can do the same to confirm a report. When you share the post on social media, all of these links are shared with it….
“Obviously there are some journalists who think turning journalism over to people who aren’t professional journalists is dangerous, but we disagree with those people,” Allred said. “I feel like the ability to refute something is not that incredibly difficult. The real power of journalism is when we have massive amounts of people trying to scrutinize whether or not that is accurate enough.”…
But despite these flaws, other attempts to fact-check breaking news online have fared no better. We still see false reports tweeted by verified accounts all the time, for instance. Something like Grasswire could serve the same role as a correction or a revision posted on an article. By linking to source material that continues to appear every time the post is shared, it is much like an article with an editor’s note that explains why something has been altered or changed.
For journalists trying to balance old-school ethics with new media tools, this option could be crucial. If executed correctly, it could lead to far fewer false reports, because thousands of people could be fact-checking information, not just a handful in a newsroom….”
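To make the confirm/refute mechanism described above concrete, here is a minimal sketch of how such posts and their source links might be modelled. The names and structure are assumptions made for illustration, not Grasswire’s actual implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Post:
    """A breaking-news post that readers can vote on and source-check (hypothetical model)."""
    text: str
    upvotes: int = 0
    confirmations: List[str] = field(default_factory=list)  # URLs of sources supporting the post
    refutations: List[str] = field(default_factory=list)    # URLs of sources disputing the post

    def upvote(self) -> None:
        self.upvotes += 1

    def confirm(self, source_url: str) -> None:
        self.confirmations.append(source_url)

    def refute(self, source_url: str) -> None:
        self.refutations.append(source_url)

    def share_payload(self) -> Dict[str, object]:
        # When a post is shared to social media, its confirming and refuting links travel with it.
        return {
            "text": self.text,
            "confirmed_by": list(self.confirmations),
            "refuted_by": list(self.refutations),
        }

def rank_by_upvotes(posts: List[Post]) -> List[Post]:
    """Posts within a topic surface by upvotes, Reddit-style."""
    return sorted(posts, key=lambda p: p.upvotes, reverse=True)
```

Ranking by upvotes surfaces the most-watched claims, and carrying the confirming and refuting URLs along on every share is what lets corrections travel with the claim.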
Complexity, Governance, and Networks: Perspectives from Public Administration
Paper by Naim Kapucu: “Complex public policy problems require a productive collaboration among different actors from multiple sectors. Networks are widely applied as a public management tool and strategy. This warrants a deeper analysis of networks and network management in public administration. There is strong interest in networks in both the practice and the theory of public administration. This requires an analysis of complex networks within public governance settings. In this essay I briefly discuss research streams on complex networks, network governance, and current research challenges in public administration.”
App enables citizens to report water waste in drought regions
Springwise: “Rallying citizens to take part in looking after the community they live in has become easier thanks to smartphones. In the past, the Creek Watch app has enabled anyone to help monitor their local water quality by sending data back to the state water board. Now Everydrop LA wants to use similar techniques to help address the drought in California, encouraging residents to report incidents of water wastage.
According to the team behind the app — which also created the CitySourced platform for engaging users in civic issues — even the smallest amount of water wastage can lead to meaningful losses over time. A faucet that drips just once a second will lose over 2,000 gallons of drinkable water each year. Using the Everydrop LA app, citizens can report the location of leaking faucets and fire hydrants as well as occurrences of blatant water wastage. They can also see how much water is being wasted in their local area and learn about what they can do to cut their own water usage. In times when drought is a risk, the app notifies users to conserve. Cities and counties can use the data in their reports and learn more about how water wastage is affecting their jurisdiction.”
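As a rough illustration of the kind of citizen report and per-area rollup such an app could support, here is a minimal sketch; the schema, field names and numbers are hypothetical, not Everydrop LA’s actual data model.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class WasteReport:
    """One citizen report of observed water waste (hypothetical schema)."""
    kind: str                   # e.g. "leaking faucet", "open hydrant"
    neighborhood: str           # coarse location used for local summaries
    est_gallons_per_day: float  # rough estimate attached to the report

def gallons_wasted_by_area(reports: List[WasteReport]) -> Dict[str, float]:
    """Roll individual reports up into per-neighborhood daily totals."""
    totals: Dict[str, float] = defaultdict(float)
    for r in reports:
        totals[r.neighborhood] += r.est_gallons_per_day
    return dict(totals)

reports = [
    WasteReport("leaking faucet", "Echo Park", 5.0),
    WasteReport("open hydrant", "Echo Park", 250.0),
    WasteReport("leaking faucet", "Venice", 5.0),
]
print(gallons_wasted_by_area(reports))  # {'Echo Park': 255.0, 'Venice': 5.0}
```

Totals of this kind are what would let residents see how much water is being wasted in their area and let cities fold the reports into their own analysis.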
Digital Footprints: Opportunities and Challenges for Online Social Research
Paper by Golder, Scott A. and Macy, Michael for the Annual Review of Sociology: “Online interaction is now a regular part of daily life for a demographically diverse population of hundreds of millions of people worldwide. These interactions generate fine-grained time-stamped records of human behavior and social interaction at the level of individual events, yet are global in scale, allowing researchers to address fundamental questions about social identity, status, conflict, cooperation, collective action, and diffusion, both by using observational data and by conducting in vivo field experiments. This unprecedented opportunity comes with a number of methodological challenges, including generalizing observations to the offline world, protecting individual privacy, and solving the logistical challenges posed by “big data” and web-based experiments. We review current advances in online social research and critically assess the theoretical and methodological opportunities and limitations. [J]ust as the invention of the telescope revolutionized the study of the heavens, so too by rendering the unmeasurable measurable, the technological revolution in mobile, Web, and Internet communications has the potential to revolutionize our understanding of ourselves and how we interact…. [T]hree hundred years after Alexander Pope argued that the proper study of mankind should lie not in the heavens but in ourselves, we have finally found our telescope. Let the revolution begin. —Duncan Watts”
Time for 21st century democracy
Martin Smith and Dave Richards at Policy Network (UK): “…The way that the world has changed is leading to a clash between two contrasting cultures. Traditional, top-down, elite models of democracy and accountability are no longer sustainable in an age of a digitally more open society. As the recent Hansard Society report into PMQs clearly reveals, people see politicians as out of touch and remote. What we need are two major changes. The first is the recognition by institutions that they are now making decisions in an open world: even if they make decisions in private (which in certain cases they clearly have to), they should recognise that at some point those decisions may need to be justified. Therefore every decision should be made on the basis that, if it were open, it would be deemed legitimate.
The second is the development of bottom-up accountability – we have to develop mechanisms where accountability is not mediated through institutions (as is the case with parliamentary accountability). In its conclusion, the Hansard Society report proposes that new technology could be used to allow citizens rather than MPs to ask questions at Prime Minister’s question time. This is one of many forms of citizen-led accountability that could reinforce the openness of decision making.
New technology creates the opportunity to move away from 19th-century democracy. Technology can be used to change the way decisions are made, how citizens are involved and how institutions are held to account. This is already happening, with social groups using social media, online petitions and mobile technologies as part of their campaigns. However, this process needs to be formalised (such as in the Hansard Society’s suggestion for citizens’ questions). There is also a need for more user-friendly ways of analysing big data around government performance. Big data creates many new ways in which decisions can be opened up and critically reviewed. We also need much more explicit policies on leaking and whistleblowing, so that those who do reveal the inner workings of governments are not criminalised….”
Fundamentally, the real change is about treating citizens as grown-ups, recognising that they can be privy to the details of the policy-making process. There is a great irony in the playground behaviour of Prime Minister’s question time and the patronising attitudes of political elites towards voters (attitudes which infantilise citizens, as if they lacked the expertise to fully participate). The most important change is that institutions start to act as if they are operating in an open society where they are directly accountable, and hence are in a position to start regaining the trust of the people. The closed world of institutions is no longer viable in a digital age.
Transforming Performance Measurement for the 21st Century
Paper by Harry P. Hatry at the Urban Institute: “While substantial progress has been made in spreading performance measurement across the country and world, much of the information from performance measurement systems has been shallow. Modern technology and the considerable demand for information on progress in achieving the outcomes of public programs and policies are creating major opportunities for considerably improving the usefulness of performance information. This report provides a number of recommendations to help public and private service organizations take advantage of these opportunities, particularly for: (a) selecting appropriate performance indicators and data collection procedures; (b) analyzing and reporting the information; and (c) using the information to improve services. Read the complete document (PDF).”
Fifteen open data insights
Tim Davies from ODRN: “…below are the 15 points from the three-page briefing version, and you can find a full write-up of these points for download. You can also find reports from all the individual project partners, including a collection of quick-read research posters over on the Open Data Research Network website.
15 insights into open data supply, use and impacts
(1) There are many gaps to overcome before open data availability can lead to widespread effective use and impact. Open data can lead to change through a ‘domino effect’, or by creating ripples of change that gradually spread out. However, often many of the key ‘domino pieces’ are missing, and local political contexts limit the reach of ripples. Poor data quality, low connectivity, scarce technical skills, weak legal frameworks and political barriers may all prevent open data triggering sustainable change. Attentiveness to all the components of open data impact is needed when designing interventions.
(2) There is a frequent mismatch between open data supply and demand in developing countries. Counting datasets is a poor way of assessing the quality of an open data initiative. The datasets published on portals are often the datasets that are easiest to publish, not the datasets most in demand. Politically sensitive datasets are particularly unlikely to be published without civil society pressure. Sometimes the gap is on the demand side – as potential open data users often do not articulate demands for key datasets.
(3) Open data initiatives can create new spaces for civil society to pursue government accountability and effectiveness. The conversation around transparency and accountability that ideas of open data can support is as important as the datasets in some developing countries.
(4) Working on open data projects can change how government creates, prepares and uses its own data. The motivations behind an open data initiative shape how government uses the data itself. Civil society and entrepreneurs interacting with government through open data projects can help shape government data practices. This makes it important to consider which intermediaries gain insider roles shaping data supply.
(5) Intermediaries are vital to both the supply and the use of open data. Not all data needed for governance in developing countries comes from government. Intermediaries can create data, articulate demands for data, and help translate open data visions from political leaders into effective implementations. Traditional local intermediaries are an important source of information, in particular because they are trusted parties.
(6) Digital divides create data divides in both the supply and use of data. In some developing countries key data is not digitised, or a lack of technical staff has left data management patchy and inconsistent. Where Internet access is scarce, few citizens can have direct access to data or services built with it. Full access is needed for full empowerment, but offline intermediaries, including journalists and community radio stations, also play a vital role in bridging the gaps between data and citizens.
(7) Where information is already available and used, the shift to open data involves data evolution rather than data revolution. Many NGOs and intermediaries already access the information which is now becoming available as data. Capacity building should start from existing information and data practices in organisations, and should look for the step-by-step gains to be made from a data-driven approach.
(8) Officials’ fears about the integrity of data are a barrier to more machine-readable data being made available. The publication of data as PDF or in scanned copies is often down to a misunderstanding of how open data works. Only copies can be changed, and originals can be kept authoritative. Helping officials understand this may help increase the supply of data.
(9) Very few datasets are clearly openly licensed, and there is low understanding of what open licenses entail. There are mixed opinions on the importance of a focus on licensing in different contexts. Clear licenses are important to building a global commons of interoperable data, but may be less relevant to particular uses of data on the ground. In many countries, wider conversations about licensing are yet to take place.
(10) Privacy issues are not on the radar of most developing country open data projects, although commercial confidentiality does arise as a reason preventing greater data transparency. Much state-held data is collected either from citizens or from companies. Few countries in the ODDC study have strong privacy laws and frameworks in place, yet participants in the studies raised few personal privacy concerns. By contrast, a lack of clarity, and officials’ concerns, about potential breaches of commercial confidentiality when sharing data gathered from firms was a barrier to opening data.
(11) There is more to open data than policies and portals. Whilst central open data portals act as a visible symbol of open data initiatives, a focus on portal building can distract attention from wider reforms. Open data elements can also be built on existing data sharing practices, and data made available through the locations where citizens, NGOs and businesses already go to access information.
(12) Open data advocacy should be aware of, and build upon, existing policy foundations in specific countries and sectors. Sectoral transparency policies for local government, budget and energy industry regulation, amongst others, could all have open data requirements and standards attached, drawing on existing mechanisms to secure sustainable supplies of relevant open data in developing countries. In addition, open data conversations could help make existing data collection and disclosure requirements fit better with the information and data demands of citizens.
(13) Open data is not just a central government issue: local government data, city data, and data from the judicial and legislative branches are all important. Many open data projects focus on the national level, and only on the executive branch. However, local government is closer to citizens, urban areas bring together many of the key ingredients for successful open data initiatives, and transparency in other branches of government is important to secure citizens’ democratic rights.
(14) Flexibility is needed in the application of definitions of open data to allow locally relevant and effective open data debates and advocacy to emerge. Open data is made up of various elements, including proactive publication, machine-readability and permissions to re-use. Countries at different stages of open data development may choose to focus on one or more of these, recognising that trying to adopt all elements at once could hinder progress. It is important to find ways both to define open data clearly and to avoid a reductive debate that does not recognise progressive steps towards greater openness.
(15) There are many different models for an open data initiative: including top-down, bottom-up and sector-specific. Initiatives may also be state-led, civil society-led and entrepreneur-led in their goals and how they are implemented – with consequences for the resources and models required to make them sustainable. There is no one-size-fits-all approach to open data. More experimentation, evaluation and shared learning on the components, partners and processes for putting open data ideas into practice must be a priority for all who want to see a world where open-by-default data drives real social, political and economic change.
You can read more about each of these points in the full report.”
Using the Wisdom of the Crowd to Democratize Markets
David Weidner at the Wall Street Journal: “For years investors have largely depended on three sources to distill the relentless onslaught of information about public companies: the companies themselves, Wall Street analysts and the media.
Each of these has its strengths, but each may have even bigger weaknesses. Companies spin. Analysts have conflicts of interest. The financial media is under deadline pressure and ill-equipped to act as a catch-all watchdog.
But in recent years, the tech whizzes out of Silicon Valley have been trying to democratize the markets. In 2010 I wrote about an effort called Moxy Vote, an online system for shareholders to cast ballots in proxy contests. Moxy Vote had some initial success but ran into regulatory trouble and failed to gain traction.
Some newer efforts are more promising, mostly because they depend on users, or some form of crowdsourcing, for their content. Crowdsourcing is when a need is turned over to a large group, usually an online community, rather than traditional paid employees or outside providers….
Estimize.com is one. It was founded in 2011 by former trader Leigh Drogen, but recently has undergone some significant expansion, adding crowd-sourced predictions for mergers and acquisitions. Estimize also boasts a track record. It claims it beats Wall Street analysts 65.9% of the time during earnings season. Like SeekingAlpha, Estimize does, however, lean heavily on pros or semi-pros. Nearly 5,000 of its contributors are analysts.
Closer to the social networking world there’s scutify.com, a website and mobile app that aggregates what’s being said about individual stocks on social networks, blogs and other sources. It highlights trending stocks and links to chatter on social networks. (The site is owned by Cody Willard, a contributor to MarketWatch, which is owned by Dow Jones, the publisher of The Wall Street Journal.)
Perhaps the most intriguing startup is TwoMargins.com. The site allows investors, analysts, average Joes — anyone, really — to annotate company releases. In that way, Two Margins potentially can tap the power of the crowd to provide a fourth source for the marketplace.
Two Margins, a startup funded by Bloomberg L.P.’s venture capital fund, borrows annotation technology that’s already in use on other sites such as genius.com and scrible.com. Participants can sign in with their Twitter or Facebook accounts and post to those networks from the site. (Dow Jones competes with Bloomberg in the provision of news and financial data.)
At this moment, Two Margins isn’t a game changer. Founders Gniewko Lubecki and Akash Kapur said the site is in a pre-beta phase, which is to say it’s sort of up and running and being constantly tweaked.
Right now there’s nothing close to the critical mass needed for an exhaustive look at company filings. There’s just a handful of users and less than a dozen company releases and filings available.
Still, in the first moments after Twitter Inc.’s earnings were released Tuesday, Two Margins’ most loyal users began to scour the release. “Looks like Twitter is getting significantly better at monetizing users,” wrote a user named “George” who had annotated the revenue line from the company’s financial statement. Another user, “Scott Paster,” noted Twitter’s stock option grants to executives were nearly as high as its reported loss.
“The sum is greater than its parts when you pull together a community of users,” Mr. Kapur said. “Widening access to these documents is one goal. The other goal is broadening the pool of knowledge that’s brought to bear on these documents.”
In the end, this new wave of tech-driven services may never capture enough users to make it into the investing mainstream. They all struggle with uninformed and inaccurate content, especially if they gain critical mass. Vetting is a problem.
For those reasons, it’s hard to predict whether these new entries will flourish or even survive. That’s not a bad thing. The march of technology will either improve on the idea or come up with a new one.
Ultimately, technology is making possible what hasn’t been possible before: free discussion, access and analysis of information. Some may see it as a threat to Wall Street, which has always charged for expert analysis. Really, though, these efforts are good for markets, which pride themselves on being fair and transparent.
It’s not just companies that should compete, but ideas too.”
Quantifying the Interoperability of Open Government Datasets
Paper by Pieter Colpaert, Mathias Van Compernolle, Laurens De Vocht, Anastasia Dimou, Miel Vander Sande, Peter Mechant, Ruben Verborgh, and Erik Mannens, to be published in Computer: “Open Governments use the Web as a global dataspace for datasets. It is in the interest of these governments to be interoperable with other governments worldwide, yet there is currently no way to identify relevant datasets to be interoperable with and there is no way to measure the interoperability itself. In this article we discuss the possibility of comparing identifiers used within various datasets as a way to measure semantic interoperability. We introduce three metrics to express the interoperability between two datasets: the identifier interoperability, the relevance and the number of conflicts. The metrics are calculated from a list of statements which indicate for each pair of identifiers in the system whether they identify the same concept or not. While a lot of effort is needed to collect these statements, the return is high: not only are relevant datasets identified, but machine-readable feedback is also provided to the data maintainer.”
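The abstract does not spell out the formulas, but as a rough illustration the three metrics could be computed from pairwise “same concept” statements along the following lines. The definitions in this sketch are assumptions made for illustration, not the authors’ own.

```python
def interoperability_metrics(ids_a, ids_b, statements):
    """Illustrative metrics for two identifier sets (assumed definitions).

    ids_a, ids_b -- sets of identifiers used in datasets A and B
    statements   -- dict: frozenset({id_from_a, id_from_b}) -> list of bools,
                    each bool one judgement that the pair names the same concept
    """
    matched_a, matched_b, conflicts = set(), set(), 0
    for pair, judgements in statements.items():
        if True in judgements and False in judgements:
            conflicts += 1                    # contradictory judgements about one pair
        elif judgements and all(judgements):  # agreed match
            matched_a |= pair & ids_a
            matched_b |= pair & ids_b
    identifier_interoperability = len(matched_a) / len(ids_a) if ids_a else 0.0
    relevance = len(matched_b) / len(ids_b) if ids_b else 0.0
    return identifier_interoperability, relevance, conflicts

# Example: two tiny datasets with one agreed mapping and one disputed pair.
ids_a = {"a:city/1", "a:city/2"}
ids_b = {"b:gemeente/9", "b:gemeente/7"}
statements = {
    frozenset({"a:city/1", "b:gemeente/9"}): [True, True],
    frozenset({"a:city/2", "b:gemeente/7"}): [True, False],
}
print(interoperability_metrics(ids_a, ids_b, statements))  # (0.5, 0.5, 1)
```

Here identifier interoperability is read as the share of A’s identifiers that map onto B, relevance as the share of B actually reached by those mappings, and a conflict as any pair about which the collected statements disagree; the paper’s actual definitions may differ.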