Transforming Our Conversation of Information Architecture with Structure


Nathaniel Davis: “Information architecture has been characterized as both an art and a science. Because there’s more evidence of the former than the latter, the academic and research community is justified in hesitating to give the practice of information architecture more attention.
If you probe the history of information architecture for the web, its foundation appears to be rooted in library science. But you’ll also find a pattern of borrowing methods and models from many other disciplines like architecture and urban planning, linguistics and ethnography, cognition and psychology, to name a few. This history leads many to wonder if the practice of information architecture is anything other than an art of induction for solving problems of architecture and design for the web…
Certainly, there is one concept that has persisted under the radar for many years with limited exploration. It is littered throughout countless articles, books and papers and is present in the most cited IA practice definitions. It may be the single concept that truly bridges practitioner and academic interests around a central and worthwhile topic. That concept is structure.”

Crowdsourcing—Harnessing the Masses to Advance Health and Medicine


A systematic review of the literature in the Journal of General Internal Medicine: “Crowdsourcing research allows investigators to engage thousands of people to provide either data or data analysis. However, prior work has not documented the use of crowdsourcing in health and medical research. We sought to systematically review the literature to describe the scope of crowdsourcing in health research and to create a taxonomy to characterize past uses of this methodology for health and medical research.
Twenty-one health-related studies utilizing crowdsourcing met eligibility criteria. Four distinct types of crowdsourcing tasks were identified: problem solving, data processing, surveillance/monitoring, and surveying. …
Utilizing crowdsourcing can improve the quality, cost, and speed of a research project while engaging large segments of the public and creating novel science. Standardized guidelines are needed on crowdsourcing metrics that should be collected and reported to provide clarity and comparability in methods.”

Open Data Tools: Turning Data into ‘Actionable Intelligence’


Shannon Bohle in SciLogs: “My previous two articles were on open access and open data. They conveyed major changes that are underway around the globe in the methods by which scientific and medical research findings and data sets are circulated among researchers and disseminated to the public. I showed how E-science and ‘big data’ fit into the philosophy of science through a paradigm shift as a trilogy of approaches: deductive, empirical, and computational, which, it was pointed out, provides a logical extension of Robert Boyle’s tradition of scientific inquiry involving “skepticism, transparency, and reproducibility for independent verification” to the computational age…
This third article on open access and open data evaluates new and suggested tools when it comes to making the most of the open access and open data OSTP mandates. According to an article published in The Harvard Business Review’s “HBR Blog Network,” this is because, as its title suggests, “open data has little value if people can’t use it.” Indeed, “the goal is for this data to become actionable intelligence: a launchpad for investigation, analysis, triangulation, and improved decision making at all levels.” Librarians and archivists have key roles to play in not only storing data, but packaging it for proper accessibility and use, including adding descriptive metadata and linking to existing tools or designing new ones for their users. Later, in a comment following the article, the author, Craig Hammer, remarks on the importance of archivists and international standards, “Certified archivists have always been important, but their skillset is crucially in demand now, as more and more data are becoming available. Accessibility—in the knowledge management sense—must be on par with digestibility / ‘data literacy’ as priorities for continuing open data ecosystem development. The good news is that several governments and multilaterals (in consultation with data scientists and – yep! – certified archivists) are having continuing ‘shared metadata’ conversations, toward the possible development of harmonized data standards…If these folks get this right, there’s a real shot of (eventual proliferation of) interoperability (i.e. a data platform from Country A can ‘talk to’ a data platform from Country B), which is the only way any of this will make sense at the macro level.”

The Science of Familiar Strangers: Society’s Hidden Social Network


The Physics arXiv Blog: “We’ve all experienced the sense of being familiar with somebody without knowing their name or even having spoken to them. These so-called “familiar strangers” are the people we see every day on the bus on the way to work, in the sandwich shop at lunchtime, or in the local restaurant or supermarket in the evening.
These people are the bedrock of society and a rich source of social potential as neighbours, friends, or even lovers.
But while many researchers have studied the network of intentional links between individuals—using mobile-phone records, for example—little work has been done on these unintentional links, which form a kind of hidden social network.
Today, that changes thanks to the work of Lijun Sun at the Future Cities Laboratory in Singapore and a few pals who have analysed the passive interactions between 3 million residents on Singapore’s bus network (about 55 per cent of the city’s population). “This is the first time that such a large network of encounters has been identified and analyzed,” they say.
The results are a fascinating insight into this hidden network of familiar strangers and the effects it has on people….
Perhaps the most interesting result involves the way this hidden network knits society together. Lijun and co say that the data hints that the connections between familiar strangers grow stronger over time. So seeing each other more often increases the chances that familiar strangers will become socially connected.
That’s a fascinating insight into the hidden social network in which we are all embedded. It’s important because it has implications for our understanding of the way things like epidemics can spread through cities.
Perhaps more interesting is the insight it gives into how links form within communities and how these can be strengthened. With the widespread adoption of smart cards on transport systems throughout the world, this kind of study can easily be repeated in many cities, which may help to tease apart some of the factors that make them so different.”
Ref: arxiv.org/abs/1301.5979: Understanding Metropolitan Patterns of Daily Encounters
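The kind of encounter network the paper describes can be sketched from anonymised smart-card records: passengers who share the same bus trip get an edge, and repeated co-occurrences strengthen the tie. The records and field names below are illustrative toy data, not Sun et al.’s actual dataset or pipeline:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical smart-card tap records: (passenger_id, trip_id).
taps = [
    ("p1", "bus7_mon_0800"), ("p2", "bus7_mon_0800"), ("p3", "bus7_mon_0800"),
    ("p1", "bus7_tue_0800"), ("p2", "bus7_tue_0800"),
    ("p1", "bus9_mon_1730"), ("p4", "bus9_mon_1730"),
]

# Group passengers by shared trip.
trips = defaultdict(set)
for passenger, trip in taps:
    trips[trip].add(passenger)

# Count pairwise co-presence: each repeated encounter between the
# same two riders strengthens their "familiar stranger" tie.
encounters = defaultdict(int)  # undirected edge weights
for riders in trips.values():
    for a, b in combinations(sorted(riders), 2):
        encounters[(a, b)] += 1

# p1 and p2 rode the 08:00 bus together on two days, so theirs
# is the strongest tie in this toy network.
strongest = max(encounters, key=encounters.get)
print(strongest, encounters[strongest])  # → ('p1', 'p2') 2
```

At city scale the same counting step, run over millions of taps, yields the weighted encounter graph whose temporal dynamics the authors analyse.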

Social: Why Our Brains are Wired to Connect


Book by Matthew D. Lieberman: “Why are we influenced by the behaviour of complete strangers? Why does the brain register similar pleasure when I perceive something as ‘fair’ or when I eat chocolate? Why can we be so profoundly hurt by bereavement? What are the evolutionary benefits of these traits? The young discipline of ‘social cognitive neuroscience’ has been exploring this fascinating interface between brain science and human behaviour since the late 1990s. Now one of its founding pioneers, Matthew D. Lieberman, presents the discoveries that he and fellow researchers have made. Using fMRI scanning and a range of other techniques, they have been able to see that the brain responds to social pain and pleasure the same way as physical pain and pleasure; and that unbeknown to ourselves, we are constantly ‘mindreading’ other people so that we can fit in with them. It is clear that our brains are designed to respond to and be influenced by others. For good evolutionary reasons, he argues, we are wired to be social. The implications are numerous and profound. Do we have to rethink what we understand by identity, and free will? How can managers improve the way their teams relate and perform? Could we organize large social institutions in ways that would work far better? And could there be whole new methods of education?”

Citizen Science Profile: SeaSketch


Blog entry from the Commons Lab within the Science and Technology Innovation Program of the Woodrow Wilson International Center for Scholars: “As part of the Commons Lab’s ongoing initiative to highlight the intersection of emerging technologies and citizen science, we present a profile of SeaSketch, a marine management software that makes complex spatial planning tools accessible to everyone. This was prepared with the gracious assistance of Will McClintock, director of the McClintock Lab.
The SeaSketch initiative highlights key components of successful citizen science projects. The end product is a result of an iterative process where the developers applied previous successes and learned from mistakes. The tool was designed to allow people without technical training to participate, expanding access to stakeholders. MarineMap had a quantifiable impact on California marine protected areas, increasing their size from 1 percent to 16 percent of the coastline. The subsequent version, SeaSketch, is uniquely suited to scale out worldwide, addressing coastal and land management challenges. By emphasizing iterative development, non-expert accessibility and scalability, SeaSketch offers a model of successful citizen science….
SeaSketch succeeded as a citizen science initiative by focusing on three project priorities:

  • Iterative Development: The current version of SeaSketch’s PGIS software is the result of seven years of trial and error. Doris and MarineMap helped the project team learn what worked and adjust accordingly. The final result would have been impossible without a sustained commitment to the project and regular product assessments.
  • Non-Expert Accessibility: GIS software is traditionally limited to those with technical expertise. SeaSketch was developed anticipating that stakeholders without GIS training would use the software. New features allow users to contribute spatial surveys, sharing their knowledge of the area to better inform planning. This ease of use means the project is outward facing: More people can participate, meaning the analyses better reflect community priorities.
  • Scalability: Although MarineMap was built specifically to guide the MLPA process, the concept is highly flexible. SeaSketch is being used to support oceanic management issues worldwide, including in areas of international jurisdiction. The software can support planning with legal implications as well as cooperative agreements. SeaSketch’s project team believes it can also be used for freshwater and terrestrial management issues.”

Analyzing the Analyzers


An Introspective Survey of Data Scientists and Their Work, by Harlan Harris, Sean Murphy, and Marck Vaisman: “There has been intense excitement in recent years around activities labeled “data science,” “big data,” and “analytics.” However, the lack of clarity around these terms and, particularly, around the skill sets and capabilities of their practitioners has led to inefficient communication between “data scientists” and the organizations requiring their services. This lack of clarity has frequently led to missed opportunities. To address this issue, we surveyed several hundred practitioners via the Web to explore the varieties of skills, experiences, and viewpoints in the emerging data science community.

We used dimensionality reduction techniques to divide potential data scientists into five categories based on their self-ranked skill sets (Statistics, Math/Operations Research, Business, Programming, and Machine Learning/Big Data), and four categories based on their self-identification (Data Researchers, Data Businesspeople, Data Engineers, and Data Creatives). Further examining the respondents based on their division into these categories provided additional insights into the types of professional activities, educational background, and even scale of data used by different types of Data Scientists.
In this report, we combine our results with insights and data from others to provide a better understanding of the diversity of practitioners, and to argue for the value of clearer communication around roles, teams, and careers.”
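The survey’s clustering step can be sketched in miniature: self-ranked skill scores are mean-centred and projected onto their top principal components, so respondents with similar profiles land near each other before any grouping into archetypes. The ratings matrix below is invented toy data, and this SVD-based PCA is only a generic stand-in for whatever dimensionality-reduction method the authors actually used:

```python
import numpy as np

# Hypothetical self-ranked skills (rows: respondents; columns:
# Statistics, Math/OR, Business, Programming, ML/Big Data), 1-5 scale.
ratings = np.array([
    [5, 4, 1, 2, 3],   # leans "Data Researcher"
    [5, 5, 1, 3, 4],
    [2, 1, 5, 2, 1],   # leans "Data Businessperson"
    [1, 2, 5, 1, 2],
    [2, 2, 1, 5, 5],   # leans "Data Engineer"
    [3, 2, 2, 5, 4],
], dtype=float)

# PCA via SVD on the mean-centred matrix: project each respondent
# onto the top two principal components.
centred = ratings - ratings.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
coords = centred @ vt[:2].T   # shape (6, 2)

# Respondents with similar skill profiles end up close together in
# this reduced space, which is what makes archetype grouping possible.
print(coords.round(2))
```

In the reduced space, respondents 0 and 1 (both statistics-heavy) sit much closer to each other than either does to the business-oriented respondent 2, which is the geometric basis for dividing the sample into skill-based categories.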

Sensing and Shaping Emerging Conflicts


A new report of a Joint Workshop of the National Academy of Engineering and the United States Institute of Peace: Roundtable on Technology, Science, and Peacebuilding: “Technology has revolutionized many aspects of modern life, from how businesses operate, to how people get information, to how countries wage war. Certain technologies in particular, including not only cell phones and the Internet but also satellites, drones, and sensors of various kinds, are transforming the work of mitigating conflict and building peaceful societies. Rapid increases in the capabilities and availability of digital technologies have put powerful communications devices in the hands of most of the world’s population.
These technologies enable one-to-one and one-to-many flows of information, connecting people in conflict settings to individuals and groups outside those settings and, conversely, linking humanitarian organizations to people threatened by violence. Communications within groups have also intensified and diversified as the group members use new technologies to exchange text, images, video, and audio. Monitoring and analysis of the flow and content of this information can yield insights into how violence can be prevented or mitigated. In this way technologies and the resulting information can be used to detect and analyze, or sense, impending conflict or developments in ongoing conflict.”

Can Silicon Valley Save the World?


Charles Kenny and Justin Sandefur in Foreign Policy: “Not content with dominating IPOs on Wall Street, Silicon Valley entrepreneurs are taking their can-do, failure-conquering, technology-enabled tactics to the challenge of global poverty. And why not? If we can look up free Khan Academy math lectures using the cheap, kid-friendly computers handed out by the folks at One Laptop per Child, who needs to worry about the complexities of education reform? With a lamp lit up by an electricity-generating soccer ball in every hut, who needs coal-fired power stations and transmission lines? And if even people in refugee camps can make money transcribing outsourced first-world dental records, who needs manufacturing or the roads and port systems required to export physical goods? No wonder the trendiest subject these days for TED talks is cracking the code on digital-era do-gooding, with 100 recent talks and counting just on the subjects of Africa and development…
But entrepreneurial spirit and even the fanciest of gadgets will only get you so far. All the technological transformation of the last 200 years hasn’t come close to wiping out global poverty. More than half the planet still lives on less than $4 a day, and 2.4 billion people live on less than $2 a day. And that’s after a decade that saw the biggest drop in extreme poverty ever. What’s more, millions and millions of people still die annually from easily and cheaply preventable or treatable diseases like diarrhea and pneumonia. None of this is for a lack of science; often it isn’t even for lack of money. It is because parents don’t follow simple health practices like washing their hands, government bureaucrats can’t or won’t provide basic water and sanitation programs, and arbitrary immigration restrictions prevent the poor from moving to places with better opportunities.
Sorry, but no iPhone, even one loaded with the coolest apps, is going to change all that….
SO WHAT CAN BE DONE to harness technological innovation, filter the good ideas from the bad, and spread a little of Silicon Valley’s fairy dust on the world’s poorer regions? The answer, according to Harvard economist Michael Kremer, is market discipline and rigorous testing. Kremer is a MacArthur “genius” grant winner whose name pops up in speculation about future Nobel Prize contenders. He thinks that technological fixes can dramatically improve the lives of the global poor, but markets won’t provide the right innovations without support.”

Targeting Transparency


New paper by David Weil, Mary Graham, and Archon Fung in Science Magazine: “When rules, taxes, or subsidies prove impractical as policy tools, governments increasingly employ “targeted transparency,” compelling disclosure of information as an alternative means of achieving specific objectives. For example, the U.S. Affordable Care Act of 2010 requires calories be posted on menus to enlist both restaurants and patrons in the effort to reduce obesity. It is crucial to understand when and how such targeted transparency works, as well as when it is inappropriate. Research about its use and effectiveness has begun to take shape, drawing on social and behavioral scientists, economists, and legal scholars. We explore questions central to the performance of targeted transparency policies.

Targeted transparency differs from broader “right-to-know” and “open-government” policies that span from the 1966 Freedom of Information Act to the Obama Administration’s “open-government” initiative encouraging officials to make existing data sets readily available and easy to parse as an end in itself (1, 2). Targeted transparency offers a more focused approach often used to introduce new scientific evidence of public risks into market choices. Government compels companies or agencies to disclose information in standardized formats to reduce specific risks, to ameliorate externalities arising from a failure of consumers or producers to fully consider social costs associated with a product, or to improve provision of public goods and services. Such policies are more light-handed than conventional regulation, relying on the power of information rather than on enforcement of rules and standards or financial inducements….”

See also the Transparency Policy Project at http://transparencypolicy.net/