New Paper by several authors from Rensselaer Polytechnic Institute, including Jim Hendler, Marie Joan Kristine Gloria, Dominic DiFranzo and Marco Fernando Navarro: “As the discipline of Web Science matures, its interdisciplinary claim has many researchers unsure about its core theory and methodology. Instead, we often see results that are more multi-disciplinary than interdisciplinary. The following contribution attempts to recast our understanding of the current methodologies and tools leveraged by the Web Science community. Specifically, we review the Semantic Web and Linked Data technologies not just from a technical perspective, but through a critical reading of key social theories such as Goffman’s theory of performance. Our goal is to re-conceptualize the performativity of semantic web tools, their boundaries, and any potential avenues for future research.”
The Art of Data Visualization (video)
PBS Off Book: “Humans have a powerful capacity to process visual information, skills that date far back in our evolutionary lineage. And since the advent of science, we have employed intricate visual strategies to communicate data, often utilizing design principles that draw on these basic cognitive skills. In a modern world where we have far more data than we can process, the practice of data visualization has gained even more importance. From scientific visualization to pop infographics, designers are increasingly tasked with incorporating data into the media experience. Data has emerged as such a critical part of modern life that it has entered into the realm of art, where data-driven visual experiences challenge viewers to find personal meaning from a sea of information, a task that is increasingly present in every aspect of our information-infused lives.”
Informed, Structured Citizen Networks Benefit Government Best
Dan Bevarly in his blog, aheadofideas: “An online collective social process based on the Group Forming Networks (GFN) model with third-party facilitation (perhaps via a community foundation or other local nonprofit) offers an effective solution for successful resident engagement for public policy making. It is essential that the process be accepted by elected officials and other policy making agencies that must contribute information and data for the networks, and accept the collaboration of their subgroups and participants as valid, deliberative civic engagement.
Residents will become engaged around a policy discussion (and perhaps join a network on the topic) based on certain variables, including:
- interest, existing knowledge or expertise in the subject matter;
- personal or community impact or relevance from decisions surrounding the policy topic(s); and
- belief that participation will lead to real or visible outcome or resolution.
Government (as policy maker) must support these networks by providing objective, in-depth information about a policy issue, project or challenge to establish and feed a knowledge base for citizen/resident education.
Government needs informed citizen participation that helps address its many challenges with new ideas and knowledge. It is in its best interest to embrace structured networks to increase resident participation and consensus in the policy making process, and to increase efficiency in providing programs and services. But it should not be responsible for maintaining these networks….”
The Declassification Engine
Wired: “The CIA offers an electronic search engine that lets you mine about 11 million agency documents that have been declassified over the years. It’s called CREST, short for CIA Records Search Tool. But this represents only a portion of the CIA’s declassified materials, and if you want unfettered access to the search engine, you’ll have to physically visit the National Archives at College Park, Maryland….
Now a new project launched by a team of historians, mathematicians, and computer scientists at Columbia University in New York City aims to change that. Led by Matthew Connelly — a Columbia professor trained in diplomatic history — the project is known as The Declassification Engine, and it seeks to provide a single online database for declassified documents from across the federal government, including the CIA, the State Department, and potentially any other agency.
The project is still in the early stages, but the team has already assembled a database of documents that stretches back to the 1940s, and it has begun building new tools for analyzing these materials. In aggregating all documents into a single database, the researchers hope not only to provide quicker access to declassified materials, but also to glean far more information from these documents than we otherwise could.
In the parlance of the day, the project is tackling these documents with the help of Big Data. If you put enough of this declassified information in a single place, Connelly believes, you can begin to predict what government information is still being withheld.”
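To make the idea of inferring what is withheld a bit more concrete, here is a toy Python sketch of one plausible building block: diffing two releases of the same document to surface spans that one version redacts. This is only an illustration of the general approach, not the Declassification Engine’s actual method, and the sample sentences are invented.

```python
# Toy illustration (not the project's actual pipeline): compare two releases
# of the same declassified document and report where they diverge. Run over
# many document pairs, such divergences hint at what tends to be withheld.
import difflib

release_a = "The station chief in [REDACTED] reported on 4 May.".split()
release_b = "The station chief in Vienna reported on 4 May.".split()

matcher = difflib.SequenceMatcher(None, release_a, release_b)
for tag, i1, i2, j1, j2 in matcher.get_opcodes():
    if tag != "equal":  # 'replace', 'delete', or 'insert'
        print(f"{tag}: {release_a[i1:i2]} -> {release_b[j1:j2]}")
```

Running this prints `replace: ['[REDACTED]'] -> ['Vienna']`, the kind of signal that, aggregated at scale, could feed the predictive analysis Connelly describes.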
Digital Strategy: Delivering Better Results for the Public
The White House Blog: “Today marks one year since we released the Digital Government Strategy (PDF/HTML5), as part of the President’s directive to build a 21st Century Government that delivers better services to the American people.
The Strategy is built on the proposition that all Americans should be able to access information from their Government anywhere, anytime, and on any device; that open government data – data that are publicly accessible in easy-to-use formats – can fuel innovation and economic growth; and that technology can make government more transparent, more efficient, and more effective.
A year later, there’s a lot to be proud of:
Information Centric
In twelve months, the Federal Government has significantly shifted how it thinks about digital information – treating data as a valuable national asset that should be open and available to the public, to entrepreneurs, and others, instead of keeping it trapped in government systems. …
Shared Platform
The Federal Government and the American people cannot afford to have each agency build isolated and duplicative technology solutions. Instead, we must use modern platforms for digital services that can be shared across agencies….
Customer-Centric
Citizens shouldn’t have to struggle to access the information they need. To ensure that the American people can easily find government services, we implemented a government-wide Digital Analytics Program across all Federal websites….
Security and Privacy
Throughout all of these efforts, maintaining cyber security and protecting privacy have been paramount….
In the end, the digital strategy is all about connecting people to government resources in useful ways. And by “connecting” we mean a two-way street….
Learn more at: http://www.whitehouse.gov/digitalgov/strategy-milestones and http://www.whitehouse.gov/digitalgov/deliverables.”
Deepbills project
Cato Institute: “The Deepbills project takes the raw XML of Congressional bills (available at FDsys and Thomas) and adds semantic information to them inside the text.
You can download the continuously-updated data at http://deepbills.cato.org/download…
Congress already produces machine-readable XML of almost every bill it proposes, but that XML is designed primarily for formatting a paper copy, not for extracting information. For example, it’s not currently possible to find every mention of an Agency, every legal reference, or even every spending authorization in a bill without having a human being read it….
Currently the following information is tagged:
- Legal citations…
- Budget Authorities (both Authorizations of Appropriations and Appropriations)…
- Agencies, bureaus, and subunits of the federal government.
- Congressional committees
- Federal elective officeholders (Congressmen)”
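As a rough illustration of what this extra markup enables, here is a short Python sketch that tallies the semantic annotations in a downloaded bill. It assumes, based on the project’s documentation, that annotations appear as `cato:entity` elements carrying an `entity-type` attribute; the input file name is a placeholder, so verify both against the actual files from the download link above.

```python
# Hedged sketch: count Deepbills annotations in one bill's XML.
# Assumes tags of the form <cato:entity entity-type="...">...</cato:entity>;
# the input file name is hypothetical.
import xml.etree.ElementTree as ET
from collections import Counter

def extract_entities(path):
    """Yield (entity_type, text) for every annotation element in the bill."""
    for elem in ET.parse(path).iter():
        # Match on the local tag name so we don't have to hard-code the
        # cato namespace URI; skip comments, whose tag is not a string.
        if isinstance(elem.tag, str) and elem.tag.rsplit("}", 1)[-1] == "entity":
            yield elem.get("entity-type", "unknown"), "".join(elem.itertext())

counts = Counter(etype for etype, _ in extract_entities("hr1234.xml"))
for etype, n in counts.most_common():
    print(f"{etype}: {n}")
```

With data like this, “every mention of an Agency” or “every spending authorization” becomes a simple query instead of a human read-through.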
How We Imagined the Internet Before the Internet Even Existed
Matt Novak in Paleofuture: “‘In a few years, men will be able to communicate more effectively through a machine than face to face.’ Sounds obvious today. But in 1968, a full year before ARPANET made its first connection? It was downright clairvoyant…
The paper was written by J.C.R. Licklider and Robert Taylor, illustrated by Rowland B. Wilson, and appeared in the April 1968 issue of Science and Technology. The article includes some of the most amazingly accurate predictions for what networked computing would eventually allow….
The article rather boldly predicts that the computerized networks of the future will be even more important for communication than the “printing press and the picture tube”—another idea not taken for granted in 1968:
Creative, interactive communication requires a plastic or moldable medium that can be modeled, a dynamic medium in which premises will flow into consequences, and above all a common medium that can be contributed to and experimented with by all.
Such a medium is at hand—the programmed digital computer. Its presence can change the nature and value of communication even more profoundly than did the printing press and the picture tube, for, as we shall show, a well-programmed computer can provide direct access both to informational resources and to the processes for making use of the resources.
The paper predicts that the person-to-person interaction that a networked computer system allows for will not only build relationships between individuals, but will build communities.
What will on-line interactive communities be like? In most fields they will consist of geographically separated members, sometimes grouped in small clusters and sometimes working individually. They will be communities not of common location, but of common interest. In each field, the overall community of interest will be large enough to support a comprehensive system of field-oriented programs and data.
…In the end, Licklider and Taylor predict that all of this interconnectedness will make us happier and even make unemployment a thing of the past. Their vision of everyone sitting at a console, working “through the network” is stunningly accurate for an information-driven society that fifty years ago would’ve looked far less tech-obsessed.
When people do their informational work “at the console” and “through the network,” telecommunication will be as natural an extension of individual work as face-to-face communication is now. The impact of that fact, and of the marked facilitation of the communicative process, will be very great—both on the individual and on society.
First, life will be happier for the on-line individual because the people with whom one interacts most strongly will be selected more by commonality of interests and goals than by accidents of proximity. Second, communication will be more effective and productive, and therefore more enjoyable. Third, much communication and interaction will be with programs and programmed models, which will be (a) highly responsive, (b) supplementary to one’s own capabilities, rather than competitive, and (c) capable of representing progressively more complex ideas without necessarily displaying all the levels of their structure at the same time—and which will therefore be both challenging and rewarding. And, fourth, there will be plenty of opportunity for everyone (who can afford a console) to find his calling, for the whole world of information, with all its fields and disciplines, will be open to him—with programs ready to guide him or to help him explore.
(You can read the entire paper online [PDF].)”
Collaborate.org launches new platform to map the world
Dan Farber in CNET: “Collaborate.org wants to bring geospatial data to the masses, beyond where Google Earth has gone. The company, which launched Wednesday at the Future in Review conference here, is built around a geospatial visualizer, with more than 2 million data layers that can be overlaid on maps, and a broad set of collaboration tools.
“We want to harness the collective knowledge of the online global community, sharing expertise and enthusiasm,” said company CEO Kevin Montgomery. “We are providing worldwide geospatial infrastructure to empower people.” Collaborate.org grew out of Intelesense, a company headed by Montgomery that provides monitoring products for wireless sensor networks and a spatial data exchange.
Collaborate.org is built around World Wind, an open-source spatial visualization platform developed by NASA. Google Earth, by contrast, doesn’t allow the global community to contribute data or modify the code…
Collaborate.org is currently in private beta. The company will offer the service in a “freemium” model, Montgomery said. Users can upload data for free as long as it is publicly available. The company will charge fees for storing private data. In addition, Montgomery said Collaborate.org will generate revenue from consulting and possibly from custom data services. Versions of Collaborate.org for mobile devices also are in development.”
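Collaborate.org itself is built on World Wind, but the core idea of stacking shareable data layers on a base map can be sketched in a few lines with any open mapping library. The example below uses Python’s folium (a different stack than Collaborate.org’s, chosen only for brevity); the GeoJSON URL is a placeholder.

```python
# Minimal sketch of the "data layers over a world map" idea, using folium
# rather than NASA World Wind. The layer URL below is hypothetical.
import folium
import requests

world = folium.Map(location=[20, 0], zoom_start=2)  # whole-earth view

# Fetch a public data layer and overlay it on the base map.
layer = requests.get("https://example.org/earthquakes.geojson").json()
folium.GeoJson(layer, name="Earthquakes").add_to(world)

folium.LayerControl().add_to(world)  # let viewers toggle layers on and off
world.save("map.html")               # open in a browser to explore
```

Each additional `folium.GeoJson(...)` call adds another toggleable layer, which is the same mental model as Collaborate.org’s two million overlays, just at toy scale.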
Introducing: Project Open Data
White House Blog: “Technology evolves rapidly, and it can be challenging for policy and its implementation to evolve at the same pace. Last week, President Obama launched the Administration’s new Open Data Policy and Executive Order aimed at ensuring that data released by the government will be as accessible and useful as possible. To make sure this tech-focused policy can keep up with the speed of innovation, we created Project Open Data.
Project Open Data is an online, public repository intended to foster collaboration and promote the continual improvement of the Open Data Policy. We wanted to foster a culture change in government where we embrace collaboration and where anyone can help us make open data work better. The project is published on GitHub, an open source platform that allows communities of developers to collaboratively share and enhance code. The resources and plug-and-play tools in Project Open Data can help accelerate the adoption of open data practices. For example, one tool instantly converts spreadsheets and databases into APIs for easier consumption by developers. The idea is that anyone, from Federal agencies to state and local governments to private citizens, can freely use and adapt these open source tools—and that’s exactly what’s happening.
Within the first 24 hours after Project Open Data was published, more than two dozen contributions (or “pull requests” in GitHub speak) were submitted by the public. The submissions included everything from fixing broken links, to providing policy suggestions, to contributing new code and tools. One pull request even included new code that translates geographic data from locked formats into open data that is freely available for use by anyone…”
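For a sense of what the spreadsheet-to-API conversion mentioned above looks like in practice, here is a minimal Flask sketch. It is not the actual Project Open Data tool: the input file name and the query-parameter filtering are invented for illustration.

```python
# Hedged sketch: serve a CSV file as a filterable JSON API.
# Not the real Project Open Data converter; data.csv is a placeholder.
import csv

from flask import Flask, jsonify, request

app = Flask(__name__)

with open("data.csv", newline="") as f:
    ROWS = list(csv.DictReader(f))  # each row becomes a dict keyed by header

@app.route("/api")
def api():
    # Filter rows by any column passed as a query string, e.g. /api?state=NY
    rows = ROWS
    for key, value in request.args.items():
        rows = [r for r in rows if r.get(key) == value]
    return jsonify(rows)

if __name__ == "__main__":
    app.run()
```

The appeal of the real tool is the same as this sketch’s: developers get a queryable endpoint without the publishing agency having to write a custom backend.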
IRS: Turn Over a New Leaf, Open Up Data
Beth Simone Noveck and Stefaan Verhulst in Forbes: “The core task for Danny Werfel, the new acting commissioner of the IRS, is to repair the agency’s tarnished reputation and achieve greater efficacy and fairness in IRS investigations. Mr. Werfel can show true leadership by restructuring how the IRS handles its tax-exempt enforcement processes.
One of Mr. Werfel’s first actions on the job should be the immediate implementation of the groundbreaking Presidential Executive Order and Open Data policy, released last week, which requires that data captured and generated by the government be made available in open, machine-readable formats. Doing so will make the IRS a beacon to other agencies in how to use open data to screen for wrongdoing and strengthen law enforcement.
By sharing readily available IRS data on tax-exempt organizations, encouraging Congress to pass a budget proposal that mandates release of all tax-exempt returns in a machine-readable format, and increasing the transparency of its own processes, the agency can begin to turn the page on this scandal and help rebuild trust and partnership between government and its citizens.”