Big Data Visualization: Review of 20 Tools


Edoardo L’Astorina at BluFrame: “Big Data is amazing. It describes our everyday behavior, keeps track of the places we go, stores what we like to do and how much time we spend doing our favorite activities.

Big Data is made of numbers, and I think we all agree when we say: Numbers are difficult to look at. Enter Big Data visualization….Data visualization lets you interact with data. It goes beyond analysis. Visualization brings a presentation to life. It keeps your audience’s eyes on the screen. And gets people interested….

We made everything easy for you and prepared a series of reviews that cover all the features of the best data visualization tools out there. And we divided our reviews in two sections: data visualization tools for presentations and data visualization tools for developers.

Here are reviews of our 20 best tools for Big Data visualization.

Data Visualization Tools for Presentations: Zero Coding Required…

Tableau… is the big data visualization tool for the corporate world. Tableau lets you create charts, graphs, maps and many other graphics. A desktop app is available for visual analytics….

Infogram… lets you link its visualizations and infographics to real-time big data…

ChartBlocks… is an easy-to-use online tool that requires no coding, and builds visualizations from spreadsheets, databases… and live feeds….

Datawrapper… is aimed squarely at publishers and journalists…

Plotly…will help you create a sharp and slick chart in just a few minutes, starting from a simple spreadsheet….

RAW… boasts on its homepage to be “the missing link between spreadsheets and vector graphics”….

Visual.ly… is a visual content service….

Data Visualization Tools for Developers: JavaScript libraries

D3.js… runs on JavaScript and uses HTML, CSS and SVG. D3.js is open-source and applies data-driven transformations to a webpage and – as you can see from their examples – allows for beautiful and fast visualizations….

Ember Charts is – as the name suggests – based on the Ember.js framework and uses D3.js under the hood….

NVD3… runs on top of D3.js – surprise, surprise – and aims to build reusable charts and components….

Google Charts… runs on HTML5 and SVG and aims at Android, iOS and total cross-browser compatibility, including older Internet Explorer versions supported via VML….

FusionCharts is – according to their site – the most comprehensive JavaScript charting library, and includes over 90 charts and 900 maps….

Highcharts…is a JavaScript API that integrates easily with jQuery and boasts being used by 61 out of the world’s 100 largest companies….

Chart.js… For a small chart project, Chart.js is your go-to choice….

Leaflet… leverages OpenStreetMap data and adds HTML5/CSS3 visualizations and interactivity on top to ensure everything is responsive and mobile-ready….

Chartist.js… is born out of a community effort to blow all other JavaScript charting libraries out of the water…

n3-charts…is for the AngularJS lovers out there….

Sigma JS…is what you want for interactivity….

Polymaps…visualizes…. you guessed it: maps….

Processing.js …is a JavaScript library that sits on top of the Processing visual programming language…(More)

The impact of a move towards Open Data in West Africa


At the Georgetown Journal of International Affairs: “The concept of “open data” is not new, but its definition is quite recent. Since computers began communicating through networks, engineers have been developing standards to share data. The open data philosophy holds that some data should be freely available to use, reuse, distribute and publish without copyright and patent controls. Several mechanisms can also limit access to data, such as restricted database access, the use of proprietary technologies or encryption. Ultimately, open data buttresses government initiatives to boost innovation, support transparency, empower citizens, encourage accountability, and fight corruption.

West Africa is primed for open data. The region experienced 6% growth in 2014, according to the African Development Bank. Its Internet user base is also growing: 17% of the sub-Saharan population owned a unique smartphone in 2013, a figure projected to grow to 37% by 2020, according to the GSMA. To improve the quality of governance and services in the digital age, the region must develop new infrastructures, revise digital strategies, simplify procurement procedures, adapt legal frameworks, and allow access to public data. Open data can enhance local economies and the standard of living.

This paper speaks to the impact of open data in West Africa. First, it assesses open data as a positive tool for governance and civil society. Then, it analyzes the current state of open data across the region. Finally, it highlights specific best practices for enhancing impact in the future….(More)”

#BuildHereNow


Crowdsourcing Campaign by Strong Towns: “Nearly every urban neighborhood in this country — whether small town or big city — has properties that could use a little love. This week at Strong Towns we’re talking about the federal rules that have made that love difficult to find, tilting the playing field so that capital and expertise flow away from walkable, mixed-use neighborhoods. Over eighty years of this distortion has created a lot of opportunity for Americans to make good, high-returning investments in our core cities and neighborhoods.

WE NEED YOUR HELP TO SHOW JUST HOW MUCH POTENTIAL IS OUT THERE.

We all know that empty lot, that underutilized building, that is just waiting for the right person to come along and knit it back into the fabric of the neighborhood. Imagine that right person could actually get the financing — that the rules weren’t rigged against them — and all they needed was your encouragement. This week, let’s provide that encouragement.

Let’s shine a huge spotlight on these spaces. They don’t need expensive utilities, a new road or a tax subsidy. They just need a fair shake.

HOW CAN I PARTICIPATE?

  • Get outside and take pictures of the vacant or underutilized properties in your town.
  • Upload your photos to Twitter or Instagram with the hashtag #BuildHereNow
  • Bonus points if you include the location and a suggestion of what you would like to see built there. (Note that turning on location services will also greatly aid us in mapping out these posts all over the country.)…(More)”

New #ODimpact Release: How is Open Data Creating Economic Opportunities and Solving Public Problems?


Andrew Young at The GovLab: “Last month, the GovLab and Omidyar Network launched Open Data’s Impact (odimpact.org), a custom-built repository offering a range of in-depth case studies on global open data projects. The initial launch of the project featured the release of 13 open data impact case studies – ten undertaken by the GovLab, as well as three case studies from Becky Hogge (@barefoot_techie), an independent researcher collaborating with Omidyar Network. Today, we are releasing a second batch of 12 case studies – nine case studies from the GovLab and three from Hogge…

The case studies being released today examine two additional dimensions of impact. They find that:

  • Open data is creating new opportunities for citizens and organizations, by fostering innovation and promoting economic growth and job creation.
  • Open data is playing a role in solving public problems, primarily by allowing citizens and policymakers access to new forms of data-driven assessment of the problems at hand. It also enables data-driven engagement, producing more targeted interventions and enhanced collaboration.

The specific impacts revealed by today’s release of case studies are wide-ranging, and include both positive and negative transformations. We have found that open data has enabled:

  • The creation of new industries built on open weather data released by the United States National Oceanic and Atmospheric Administration (NOAA).
  • The generation of billions of dollars of economic activity as a result of the Global Positioning System (GPS) being opened to the global public in the 1980s, and the United Kingdom’s Ordnance Survey geospatial offerings.
  • A more level playing field for small businesses in New York City seeking market research data.
  • The coordinated sharing of data among government and international actors during the response to the Ebola outbreak in Sierra Leone.
  • The identification of discriminatory water access decisions in the case Kennedy v. City of Zanesville, resulting in a $10.9 million settlement for the African-American plaintiffs.
  • Increased awareness among Singaporeans about the location of hotspots for dengue fever transmission.
  • Improved, data-driven emergency response following earthquakes in Christchurch, New Zealand.
  • Troubling privacy violations on Eightmaps related to Californians’ political donation activity….(More)”

All case studies available at odimpact.org.

 

A Tale of Four Algorithms


At Slate: “Algorithms don’t just power search results and news feeds, shaping our experience of Google, Facebook, Amazon, Spotify, and Tinder. Algorithms are widely – and largely invisibly – integrated into American political life, policymaking, and program administration.

Algorithms can terminate your Medicaid benefits, exclude you from air travel, purge you from voter rolls, or predict if you are likely to commit a crime in the future. They make decisions about who has access to public services, who undergoes extra scrutiny, and where we target scarce resources.

But are all algorithms created equal? Does the kind of algorithm used by government agencies have anything to do with who it is aimed at?

Bias can enter algorithmic processes through many doors. Discriminatory data collection can mean extra scrutiny for whole communities, creating a feedback cycle of “garbage in, garbage out.” For example, much of the initial data that populated CalGang, an intelligence database used to target and track suspected gang members, was collected by the notorious Community Resources Against Street Hoodlums units of the LAPD, including in the scandal-ridden Rampart division. Algorithms can also mirror and reinforce entrenched cultural assumptions. For example, as Wendy Hui Kyong Chun has written, Googling “Asian + woman” a decade ago turned up more porn sites in the first 10 hits than a search for “pornography.”
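The “garbage in, garbage out” feedback cycle described above can be caricatured in a few lines of code. This is a toy simulation with made-up parameters (the `true_rate` value, the district figures, the linear update rule are all illustrative assumptions, not drawn from the article or from CalGang data):

```python
# Toy simulation of a "garbage in, garbage out" feedback loop: the district
# with slightly more recorded incidents attracts proportionally more
# scrutiny, and new records scale with scrutiny rather than with the true
# underlying rate. All parameters are illustrative assumptions.

def run_feedback_loop(recorded, true_rate=0.1, rounds=10):
    """Return the round-by-round history of recorded incidents per district."""
    history = [list(recorded)]
    for _ in range(rounds):
        total = sum(recorded)
        scrutiny = [r / total for r in recorded]   # attention follows past data
        recorded = [r + s * true_rate * 100 for r, s in zip(recorded, scrutiny)]
        history.append(list(recorded))
    return history

# Two districts with the SAME true underlying rate; district A merely
# starts with a couple of extra recorded incidents.
history = run_feedback_loop([12.0, 10.0])
gap_start = history[0][0] - history[0][1]
gap_end = history[-1][0] - history[-1][1]
print(gap_start, round(gap_end, 2))  # the initial gap only widens
```

Because scrutiny is allocated in proportion to past records, the initially over-recorded district keeps pulling ahead even though both districts are identical underneath, which is the feedback dynamic the article describes.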

But can automated policy decisions be class-biased? Let’s look at four algorithmic systems dedicated to one purpose – identifying and decreasing fraud, waste, and abuse in federal programs – each aimed at a different economic class. We’ll investigate the algorithms in terms of their effectiveness at protecting key American political values – efficacy, transparency, fairness, and accountability – and see which ones make the grade.

Below, I’ve scored each of the four policy algorithms on a scale of 1 to 5, with 1 being very low and 5 being high…

Of course this ad hoc survey is merely suggestive, not conclusive. But it indicates a reality that those of us who talk about data-driven policy rarely address: All algorithms are not created equal. Policymakers and programmers make inferences about their targets that get baked into the code of both legislation and high-tech administrative tools – that SNAP recipients are sneakier than other people and deserve less due process protection, for example….(More)

Privacy as a Public Good


Joshua A.T. Fairfield & Christoph Engel in Duke Law Journal: “Privacy is commonly studied as a private good: my personal data is mine to protect and control, and yours is yours. This conception of privacy misses an important component of the policy problem. An individual who is careless with data exposes not only extensive information about herself, but about others as well. The negative externalities imposed on nonconsenting outsiders by such carelessness can be productively studied in terms of welfare economics. If all relevant individuals maximize private benefit, and expect all other relevant individuals to do the same, neoclassical economic theory predicts that society will achieve a suboptimal level of privacy. This prediction holds even if all individuals cherish privacy with the same intensity. As the theoretical literature would have it, the struggle for privacy is destined to become a tragedy.

But according to the experimental public-goods literature, there is hope. Like in real life, people in experiments cooperate in groups at rates well above those predicted by neoclassical theory. Groups can be aided in their struggle to produce public goods by institutions, such as communication, framing, or sanction. With these institutions, communities can manage public goods without heavy-handed government intervention. Legal scholarship has not fully engaged this problem in these terms. In this Article, we explain why privacy has aspects of a public good, and we draw lessons from both the theoretical and the empirical literature on public goods to inform the policy discourse on privacy…(More)”
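The neoclassical prediction the authors invoke is usually formalized as a linear public-goods game. The sketch below is a minimal illustration of that standard model, with made-up parameters (endowment 10, multiplier 1.6), not anything taken from the article:

```python
# Linear public-goods game: n players each hold an endowment and choose a
# contribution; the pooled contributions are multiplied and shared equally.
# With a per-capita return below 1 (multiplier/n = 0.4 here) but a group
# return above 1 (multiplier = 1.6 > 1), contributing is individually
# irrational yet collectively optimal -- the "tragedy" the article
# describes for privacy. All numbers are illustrative.

def payoff(contributions, endowment=10.0, multiplier=1.6):
    """Payoff to each player: what they kept plus an equal share of the pot."""
    n = len(contributions)
    share = sum(contributions) * multiplier / n
    return [endowment - c + share for c in contributions]

n = 4
free_ride = payoff([0.0] * n)                 # everyone defects (Nash equilibrium)
cooperate = payoff([10.0] * n)                # everyone contributes (social optimum)
defect = payoff([0.0, 10.0, 10.0, 10.0])      # lone defector among cooperators

print(free_ride[0], cooperate[0], defect[0])  # 10.0 16.0 22.0
```

The lone defector (22.0) outperforms both universal cooperation (16.0) and universal defection (10.0), so self-interest pushes everyone toward defection even though all-cooperate beats all-defect; this is the suboptimal equilibrium that, per the article, institutions like communication, framing, or sanction can help groups escape.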

See also:

Privacy, Public Goods, and the Tragedy of the Trust Commons: A Response to Professors Fairfield and Engel, Dennis D. Hirsch

Response to Privacy as a Public Good, Priscilla M. Regan

Data Collaboratives: Matching Demand with Supply of (Corporate) Data to solve Public Problems


Blog by Stefaan G. Verhulst, Iryna Susha and Alexander Kostura: “Data Collaboratives refer to a new form of collaboration, beyond the public-private partnership model, in which participants from different sectors (private companies, research institutions, and government agencies) share data to help solve public problems. Several of society’s greatest challenges – from climate change to poverty – require greater access to big (but not always open) data sets, more cross-sector collaboration, and increased capacity for data analysis. Participants at the workshop and breakout session explored the various ways in which data collaboratives can help meet these needs.

Matching supply and demand of data emerged as one of the most important and overarching issues facing the big and open data communities. Participants agreed that more experimentation is needed so that new, innovative and more successful models of data sharing can be identified.

How to discover and enable such models? When asked how the international community might foster greater experimentation, participants indicated the need to develop the following:

· A responsible data framework that serves to build trust in sharing data. It would be based upon existing frameworks but would also accommodate emerging technologies and practices, and it would need to be sensitive to public opinion and perception.

· Increased insight into different business models that may facilitate the sharing of data. As experimentation continues, the data community should map emerging practices and models of sharing so that successful cases can be replicated.

· Capacity to tap into the potential value of data. On the demand side, capacity refers to the ability to pose good questions, understand current data limitations, and seek new data sets responsibly. On the supply side, this means seeking shared value in collaboration, thinking creatively about public use of private data, and establishing norms of responsibility around security, privacy, and anonymity.

· Transparent stock of available data supply, including an inventory of what corporate data exist that can match multiple demands and that is shared through established networks and new collaborative institutional structures.

· Mapping emerging practices and models of sharing. Corporate data offers value not only for humanitarian action (which was a particular focus at the conference) but also for a variety of other domains, including science, agriculture, health care, urban development, environment, media and arts, and others. Gaining insight into the practices that emerge across sectors could broaden the spectrum of what is feasible and how.

In general, it was felt that understanding the business models underlying data collaboratives is of utmost importance in order to achieve win-win outcomes for both private and public sector players. Moreover, issues of public perception and trust were raised as important concerns of government organizations participating in data collaboratives….(More)”

Technology and the Future of Cities


Mark Gorenberg, Craig Mundie, Eric Schmidt and Marjory Blumenthal at PCAST: “Growing urbanization presents the United States with an opportunity to showcase its innovation strength, grow its exports, and help to improve citizens’ lives – all at once. Seizing this triple opportunity will involve a concerted effort to develop and apply new technologies to enhance the way cities work for the people who live there.

A new report released today by the President’s Council of Advisors on Science and Technology (PCAST), Technology and the Future of Cities, lays out why now is a good time to promote technologies for cities: more (and more diverse) people are living in cities; people are increasingly open to different ways of using space, living, working, and traveling across town; physical infrastructures for transportation, energy, and water are aging; and a wide range of innovations are in reach that can yield better infrastructures and help in the design and operation of city services.

There are also new ways to collect and use information to design and operate systems and services. Better use of information can help make the most of limited resources – whether city budgets or citizens’ time – and help make sure that the neediest as well as the affluent benefit from new technology.

Although the vision of technology’s promise applies city-wide, PCAST suggests that a practical way for cities to adopt infrastructural and other innovation is by starting in a discrete area – a district, the dimensions of which depend on the innovation in question. Experiences in districts can help inform decisions elsewhere in a given city – and in other cities. PCAST urges broader sharing of information about, and tools for, innovation in cities.

Such sharing is already happening in isolated pockets focused on either specific kinds of information or recipients of specific kinds of funding. A more comprehensive City Web, achieved through broader interconnection, could inform and impel urban innovation. A systematic approach to developing open-data resources for cities is recommended, too.

PCAST recommends a variety of steps to make the most of the Federal Government’s engagement with cities. To begin, it calls for more – and more effective – coordination among Federal agencies that are key to infrastructural investments in cities.  Coordination across agencies, of course, is the key to place-based policy. Building on the White House Smart Cities Initiative, which promotes not only R&D but also deployment of IT-based approaches to help cities solve challenges, PCAST also calls for expanding research and development coordination to include the physical, infrastructural technologies that are so fundamental to city services.

A new era of city design and city life is emerging. If the United States steers Federal investments in cities in ways that foster innovation, the impacts can be substantial. The rest of the world has also seen the potential, with numerous cities showcasing different approaches to innovation. The time to aim for leadership in urban technologies and urban science is now….(More)”

Public-Private Partnerships for Statistics: Lessons Learned, Future Steps


Report by Nicholas Robin, Thilo Klein and Johannes Jütting for Paris 21: “Non-official sources of data, big data in particular, are currently attracting enormous interest in the world of official statistics. An impressive body of work focuses on how different types of big data (telecom data, social media, sensors, etc.) can be used to fill specific data gaps, especially with regard to the post-2015 agenda and the associated technology challenges. The focus of this paper is on a different aspect, but one that is of crucial importance: what are the perspectives of the commercial operations and national statistical offices which respectively produce and might use this data, and which incentives, business models and protocols are needed in order to leverage non-official data sources within the official statistics community?

Public-private partnerships (PPPs) offer significant opportunities such as cost effectiveness, timeliness, granularity and new indicators, but also present a range of challenges that need to be surmounted. These comprise technical difficulties, risks related to data confidentiality, as well as a lack of incentives.

Nevertheless, a number of collaborative projects have already emerged and can be classified into four ideal types: namely the in-house production of statistics by the data provider, the transfer of private data sets to the end user, the transfer of private data sets to a trusted third party for processing and/or analysis, and the outsourcing of national statistical office functions (the only model which is not centred around a data-sharing dimension). In developing countries, a severe lack of resources and particular statistical needs (to adopt a system-wide approach within national statistical systems and fill statistical gaps which are relevant to national development plans) highlight the importance of harnessing the private sector’s resources and point to the most holistic models (in-house and third party) in which the private sector contributes to the processing and analysis of data. The following key lessons are drawn from four case studies….(More)”

Drones better than human rescuers at following mountain pathways


Springwise: “Every year in Switzerland, emergency centers respond to around 1,000 call outs for lost and injured hikers. It can often take hours and significant manpower to locate lost mountaineers, but new software for quadcopter drones is making the hunt quicker and easier, and has the potential to help find human survivors in disaster zones around the world.

The drone runs a computer algorithm called a deep neural network, developed by researchers at the University of Zurich and the Dalle Molle Institute for Artificial Intelligence. The drone uses the algorithm to learn trails and paths through a pair of small cameras, interpreting the images and recognizing man-made pathways. Even on a previously unseen trail, it was able to guess the correct direction in 85 percent of cases. The drones’ speed and accuracy make them more effective than human trackers.

The researchers hope that eventually multiple small drones could be combined with human search-and-rescue missions, to cover more terrain and find people faster. The drones can cover terrain quickly and check hazardous areas to minimize risk to human workers, and their AI can identify paths and avoid crashing without any human involvement….(More)”
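The core decision described above – interpret camera images and pick a direction along the trail – can be caricatured as a three-class classifier. The sketch below is purely a placeholder: one random, untrained linear layer with a softmax, not the trained deep neural network from the Zurich / Dalle Molle research, and the eight “features” merely stand in for real image input:

```python
import math
import random

# Caricature of the trail-following decision: map image features to a
# probability distribution over {left, straight, right} and steer toward
# the most likely class. The "network" is one random linear layer -- an
# illustrative placeholder, NOT the trained model from the research.

CLASSES = ["turn_left", "go_straight", "turn_right"]

def softmax(logits):
    m = max(logits)                          # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict(features, weights, biases):
    """One linear layer plus softmax over the three steering classes."""
    logits = [sum(w * f for w, f in zip(row, features)) + b
              for row, b in zip(weights, biases)]
    probs = softmax(logits)
    best = max(range(len(CLASSES)), key=lambda i: probs[i])
    return CLASSES[best], probs

random.seed(0)
features = [random.random() for _ in range(8)]               # stand-in image features
weights = [[random.uniform(-1, 1) for _ in range(8)] for _ in CLASSES]
biases = [0.0] * len(CLASSES)

label, probs = predict(features, weights, biases)
print(label, [round(p, 3) for p in probs])
```

In the real system the weights come from training on labeled trail imagery; the argmax over the three class probabilities is what would translate into a steering command.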