Paper by Frank L. K. Ohemeng and Kwaku Ofosu-Adarkwa in the Government Information Quarterly: “In recent years the necessity for governments to develop new public values of openness and transparency, and thereby increase their citizenries’ sense of inclusiveness, and their trust in and confidence about their governments, has risen to the point of urgency. The decline of trust in governments, especially in developing countries, has been unprecedented and continuous. A new paradigm that signifies a shift to citizen-driven initiatives over and above state- and market-centric ones calls for innovative thinking that requires openness in government. The need for this new synergy notwithstanding, Open Government cannot be considered truly open unless it also enhances citizen participation and engagement. The Ghana Open Data Initiative (GODI) project strives to create an open data community that will enable government (supply side) and civil society in general (demand side) to exchange data and information. We argue that the GODI is too narrowly focused on the supply side of the project, and suggest that it should generate an even platform to improve interaction between government and citizens to ensure a balance in knowledge sharing with and among all constituencies….(More)”
Accur8Africa
“Accur8Africa aims to be the leading platform supporting the accuracy of data in the continent. If we intend to meet the Sustainable Development Goals (SDGs) in the next fifteen years, accurate data remains a non-negotiable necessity. Accur8Africa recognizes that nothing less than a data revolution is required. To achieve this, we are building the statistical capacity of institutions across Africa and encouraging the use of data-driven decisions alongside better development metrics for key sectors such as gender equality, climate change, equity and social inclusion, and health.
Africa has data in abundance, but it exists in a fragmented and disorganized manner. As a result, the achievements of the Millennium Development Goals will be largely unquantifiable. As we transition from the MDGs to the Sustainable Development Goals, and national governments meet to discuss the 17 goals that could transform the world by 2030, we believe that the African continent deserves better and more accurate data…..Africa has a great role to play in the next fifteen years. The United Nations development agenda has generated momentum for a worldwide “data revolution,” shining a much-needed light on the need for better development data in Africa and elsewhere. Governments, international institutions, and donors need accurate data on basic development metrics such as inflation, vaccination coverage, and school enrolment in order to accurately plan, budget, and evaluate their activities. Governments, citizens, and civil society can then use this data as a “currency” for accountability. When statistical systems function properly, good-quality data are exchanged freely amongst all stakeholders, ensuring that funding and development efforts are producing the desired results….(More)”
Weathernews thinks crowdsourcing is the future of weather
Andrew Freedman at Mashable: “The weather forecast of the future will be crowdsourced, if one Japanese weather firm sees its vision fulfilled.
On Monday, Weathernews Inc. of Japan announced a partnership with the Chinese firm Moji to bring Weathernews’ technology to the latter company’s popular MoWeather app.
The benefit for Weathernews, in addition to more users and entry into the Chinese market, is access to more data that can then be turned into weather forecasts.
The company says that this additional user base, when added to its existing users, will make Weathernews “the largest crowdsourced weather service in the world,” with 420 million users across 175 countries.
…So far, though, mobile phones have not proven to be more reliable weather sensors than the network of thousands of far more expensive and specialized surface weather observation sites throughout the world, but crowdsourcing’s day in the sun may be close at hand. As Weathernews leaders were quick to point out to Mashable in an interview, the existing weather observing network on which most forecasts rely has significant drawbacks that make crowdsourcing especially appealing outside the U.S.
For example, most surface weather stations are in wealthy nations, primarily in North America and Europe. There’s a giant forecasting blind spot over much of Africa, where many countries lack a national weather agency. However, these countries do have rapidly growing mobile phone networks that, if utilized in certain ways, could provide a way to fill in data gaps and make weather forecasts more accurate, too.
“At Weathernews, we have a core belief that more weather data is better,” said Weathernews managing director Tomohiro Ishibashi.
“So having access to the additional datasets from MoWeather’s vast user community allows us to provide more accurate and safer weather forecasting for all,” he said. “Our advanced algorithms analyze these new datasets and put them in our existing computer forecasting models.”
Weathernews is trying to use observations that most weather companies might regard as interesting but not worth the effort to tailor for computer modeling. For example, photos of clouds are a potential way to ground truth weather satellite imagery, Ishibashi told Mashable.
“For us the picture of the sky… has a lot of information,” he said. (The company’s website refers to such observations as “eye-servation.”)…
Compared to Weathernews’ ambitions, AccuWeather’s recent decision to incorporate crowdsourced data into its iOS app seems more traditional, like a TV weather forecaster adding a few new “weather watchers” to their station’s network during local television’s heyday in the 1980s and 90s.
Now, we’re all weather watchers….(More)”
Scientists Are Hoarding Data And It’s Ruining Medical Research
Ben Goldacre at Buzzfeed: “We like to imagine that science is a world of clean answers, with priestly personnel in white coats, emitting perfect outputs, from glass and metal buildings full of blinking lights.
The reality is a mess. A collection of papers published on Wednesday — on one of the most commonly used medical treatments in the world — show just how bad things have become. But they also give hope.
The papers are about deworming pills that kill parasites in the gut, at extremely low cost. In developing countries, battles over the usefulness of these drugs have become so contentious that some people call them “The Worm Wars.”…
This “deworm everybody” approach has been driven by a single, hugely influential trial published in 2004 by two economists, Edward Miguel and Michael Kremer. This trial, done in Kenya, found that deworming whole schools improved children’s health, school performance, and school attendance. What’s more, these benefits apparently extended to children in schools several miles away, even when those children didn’t get any deworming tablets (presumably, people assumed, by interrupting worm transmission from one child to the next).
A decade later, in 2013, these two economists did something that very few researchers have ever done. They handed over their entire dataset to independent researchers on the other side of the world, so that their analyses could be checked in public. What happened next has every right to kick through a revolution in science and medicine….
This kind of statistical replication is almost vanishingly rare. A recent study set out to find all well-documented cases in which the raw data from a randomized trial had been reanalysed. It found just 37, out of many thousands. What’s more, only five were conducted by entirely independent researchers, people not involved in the original trial.
These reanalyses were more than mere academic fun and games. The ultimate outcomes of the trials changed, with terrifying frequency: One-third of them were so different that the take-home message of the trial shifted.
This matters. Medical trials aren’t conducted out of an abstract philosophical interest, for the intellectual benefit of some rarefied class in ivory towers. Researchers do trials as a service, to find out what works, because they intend to act on the results. It matters that trials get an answer that is not just accurate, but also reliable.
So here we have an odd situation. Independent reanalysis can improve the results of clinical trials, and help us not go down blind alleys, or give the wrong treatment to the wrong people. It’s pretty cheap, compared to the phenomenal administrative cost of conducting a trial. And it spots problems at an alarmingly high rate.
And yet, this kind of independent check is almost never done. Why not? Partly, it’s resources. But more than that, when people do request raw data, all too often the original researchers duck, dive, or simply ignore requests….
Two years ago I published a book on problems in medicine. Front and center in this howl was “publication bias,” the problem of clinical trial results being routinely and legally withheld from doctors, researchers, and patients. The best available evidence — from dozens of studies chasing results for completed trials — shows that around half of all clinical trials fail to report their results. The same is true of industry trials, and academic trials. What’s more, trials with positive results are about twice as likely to post results, so we see a biased half of the literature.
This is a cancer at the core of evidence-based medicine. When half the evidence is withheld, doctors and patients cannot make informed decisions about which treatment is best. When I wrote about this, various people from the pharmaceutical industry cropped up to claim that the problem was all in the past. So I befriended some campaigners, we assembled a group of senior academics, and started the AllTrials.net campaign with one clear message: “All trials must be registered, with their full methods and results reported.”
Dozens of academic studies had been published on the issue, and that alone clearly wasn’t enough. So we started collecting signatures, and we now have more than 85,000 supporters. At the same time we sought out institutional support. Eighty patient groups signed up in the first month, with hundreds more since then. Some of the biggest research funders, and even government bodies, have now signed up.
This week we’re announcing support from a group of 85 pension funds and asset managers, representing more than 3.5 trillion euros in funds, who will be asking the pharma companies they invest in to make plans to ensure that all trials — past, present, and future — report their results properly. Next week, after two years of activity in Europe, we launch our campaign in the U.S….(More)”
Local Governments Need Financial Transparency Tools
Cities of the Future: “Comprehensive financial transparency — allowing anyone to look up the allocation of budgets, expenses by department, and even the ledger of each individual expense as it happens — can help local governments restore residents’ confidence, help manage the budget efficiently and make more informed decisions for new projects and money allocation.
A few weeks ago, we had municipal elections in Spain. Many local governments changed hands and the new administrations had to review the current budgets, see where money was being spent and, on occasion, discovered expenses they were not expecting.
As costs rise and cities find it more difficult to provide the same services without raising taxes, citizens, among others, are demanding full disclosure of income and expenses.
Tools such as the OpenGov platform are helping cities accomplish that goal…Earlier this year the city of Beaufort (pop. 13,000), South Carolina’s second-oldest city, known for its historic charm and moss-laden oak trees, decided to implement OpenGov. It rolled out the platform to the public last February, becoming the first city in the state to provide the public with in-depth, comprehensive financial data (spanning five budget years).
The reception by the city council and residents was extremely positive. Residents can now look up where their tax money goes down to itemized expenses. They can also see up-to-date charts of every part of the budget, how it is being spent, and what remains to be used. City council members can monitor the administration’s performance and ask informed questions at town meetings about the budget use, right down to the smallest expenses….
Many cities are now implementing open data tools to share information on different aspects of city services, such as transit information, energy use, water management, etc. But those tools are difficult to use and do not provide comprehensive financial information about the use of public money. …(More)”
Geek Heresy
Book by Kentaro Toyama “…, an award-winning computer scientist, moved to India to start a new research group for Microsoft. Its mission: to explore novel technological solutions to the world’s persistent social problems. Together with his team, he invented electronic devices for under-resourced urban schools and developed digital platforms for remote agrarian communities. But after a decade of designing technologies for humanitarian causes, Toyama concluded that no technology, however dazzling, could cause social change on its own.
Technologists and policy-makers love to boast about modern innovation, and in their excitement, they exuberantly tout technology’s boon to society. But what have our gadgets actually accomplished? Over the last four decades, America saw an explosion of new technologies – from the Internet to the iPhone, from Google to Facebook – but in that same period, the rate of poverty stagnated at a stubborn 13%, only to rise in the recent recession. So, a golden age of innovation in the world’s most advanced country did nothing for our most prominent social ill.
Toyama’s warning resounds: Don’t believe the hype! Technology is never the main driver of social progress. Geek Heresy inoculates us against the glib rhetoric of tech utopians by revealing that technology is only an amplifier of human conditions. By telling the moving stories of extraordinary people like Patrick Awuah, a Microsoft millionaire who left his lucrative engineering job to open Ghana’s first liberal arts university, and Tara Sreenivasa, a graduate of a remarkable South Indian school that takes children from dollar-a-day families into the high-tech offices of Goldman Sachs and Mercedes-Benz, Toyama shows that even in a world steeped in technology, social challenges are best met with deeply social solutions….(More)”
Using Twitter as a data source: An overview of current social media research tools
Wasim Ahmed at the LSE Impact Blog: “I have a social media research blog where I find and write about tools that can be used to capture and analyse data from social media platforms. My PhD looks at Twitter data for health, such as the Ebola outbreak in West Africa. I am increasingly asked why I am looking at Twitter, and what tools and methods there are of capturing and analysing data from other platforms such as Facebook, or even less traditional platforms such as Amazon book reviews. Brainstorming a couple of responses to this question by talking to members of the New Social Media New Social Science network, there are at least six reasons:
- Twitter is a popular platform in terms of the media attention it receives and it therefore attracts more research due to its cultural status
- Twitter makes it easier to find and follow conversations (i.e., by both its search feature and by tweets appearing in Google search results)
- Twitter has hashtag norms which make it easier to gather, sort, and expand searches when collecting data
- Twitter data is easy to retrieve, as major incidents, news stories and events on Twitter tend to be centred around a hashtag
- The Twitter API is more open and accessible compared to other social media platforms, which makes Twitter more favourable to developers creating tools to access data. This consequently increases the availability of tools to researchers.
- Many researchers themselves are using Twitter and because of their favourable personal experiences, they feel more comfortable with researching a familiar platform.
It is probable that a combination of responses 1 to 6 has led to more research on Twitter. However, this raises another distinct but closely related question: when research is focused so heavily on Twitter, what (if any) are the implications of this on our methods?
As for the methods currently used in analysing Twitter data (sentiment analysis, time series analysis to examine peaks in tweets, network analysis, etc.), can these be applied to other platforms, or are different tools, methods and techniques required? In addition to qualitative methods such as content analysis, I have used the following four methods in analysing Twitter data for the purposes of my PhD; below I consider whether these would work for other social media platforms:
- Sentiment analysis works well with Twitter data, as tweets are consistent in length (i.e., <= 140 characters). Would sentiment analysis work as well with, for example, Facebook data, where posts may be longer?
- Time series analysis is normally used to examine tweets over time and see when a peak of tweets occurs. Would examining time stamps in Facebook posts, or Instagram posts, for example, produce the same results? Or is this only a viable method because of the real-time nature of Twitter data?
- Network analysis is used to visualize the connections between people and to better understand the structure of the conversation. Would this work as well on other platforms where users may not be connected to each other (e.g., public Facebook pages)?
- Machine learning methods may work well with Twitter data due to the length of tweets (i.e., <= 140 characters), but would these work for longer posts and for platforms that are not text-based (e.g., Instagram)?
It may well be that at least some of these methods can be applied to other platforms; however, they may not be the best fit, and new methods, techniques, and tools may need to be formulated.
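To make the first of these methods concrete, here is a minimal sketch of lexicon-based sentiment analysis, the simplest flavour of the technique discussed above. The tiny word lists are illustrative stand-ins for a real sentiment lexicon (such as AFINN); the scoring rule is an assumption for demonstration, not the method used in the PhD work itself.

```python
# Minimal lexicon-based sentiment scorer. The word sets below are
# illustrative placeholders for a real sentiment lexicon.
POSITIVE = {"good", "great", "happy", "love", "recover", "safe"}
NEGATIVE = {"bad", "sad", "fear", "outbreak", "death", "crisis"}

def sentiment(text):
    """Return a score in [-1, 1]: among sentiment-bearing words,
    the fraction positive minus the fraction negative."""
    words = [w.strip(".,!?#@").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

tweets = [
    "Great news: patient declared safe after treatment #Ebola",
    "Fear and crisis as the outbreak spreads",
]
for t in tweets:
    print(round(sentiment(t), 2), t)
```

Because short tweets rarely mix registers, a crude scorer like this behaves tolerably; on longer Facebook posts, where a single post can contain several shifts in tone, sentence-level scoring or a trained classifier would likely be needed, which is exactly the methodological question raised above.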
So, what are some of the tools available to social scientists for social media data? In the table below I provide an overview of some of the tools I have been using (which require no programming knowledge and can be used by social scientists):…(More)”
Open Innovation, Open Science, Open to the World
Speech by Carlos Moedas, EU Commissioner for Research, Science and Innovation: “On 25 April this year, an earthquake of magnitude 7.3 hit Nepal. To get real-time geographical information, the response teams used an online mapping tool called Open Street Map. Open Street Map has created an entire online map of the world using local knowledge, GPS tracks and donated sources, all provided on a voluntary basis. It is open license for any use.
Open Street Map was created in 2004 by a 24-year-old computer science student at University College London; it now has 2 million users and has been used for many digital humanitarian and commercial purposes, from the earthquakes in Haiti and Nepal to the Ebola outbreak in West Africa.
This story is one of many that demonstrate that we are moving into a world of open innovation and user innovation. A world where the digital and physical are coming together. A world where new knowledge is created through global collaborations involving thousands of people from across the world and from all walks of life.
Ladies and gentlemen, over the next two days I would like us to chart a new path for European research and innovation policy. A new strategy that is fit for purpose for a world that is open, digital and global. And I would like to set out at the start of this important conference my own ambitions for the coming years….
Open innovation is about involving far more actors in the innovation process, from researchers, to entrepreneurs, to users, to governments and civil society. We need open innovation to capitalise on the results of European research and innovation. This means creating the right ecosystems, increasing investment, and bringing more companies and regions into the knowledge economy. I would like to go further and faster towards open innovation….
I am convinced that excellent science is the foundation of future prosperity, and that openness is the key to excellence. We are often told that it takes many decades for scientific breakthroughs to find commercial application.
Let me tell you a story which shows the opposite. Graphene was first isolated in the laboratory by Profs. Geim and Novoselov at the University of Manchester in 2003 (Nobel Prizes 2010). The development of graphene has since benefitted from major EU support, including ERC grants for Profs. Geim and Novoselov. So I am proud to show you one of the new graphene products that will soon be available on the market.
This light bulb uses the unique thermal dissipation properties of graphene to achieve greater energy efficiency and a longer lifetime than LED bulbs. It was developed by a spin-out company from the University of Manchester, called Graphene Lighting, and is expected to go on sale by the end of the year.
But we must not be complacent. If we look at indicators of the most excellent science, we find that Europe is not top of the rankings in certain areas. Our ultimate goal should always be to promote excellence not only through ERC and Marie Skłodowska-Curie but throughout the entire H2020.
For such an objective we have to move forward on two fronts:
First, we are preparing a call for a European Science Cloud Project in order to explore the possibility of creating a cloud for our scientists. We need more open access to research results and the underlying data. Open access publication is already a requirement under Horizon 2020, but we now need to look seriously at open data…
When innovators like LEGO start fusing real bricks with digital magic, when citizens conduct their own R&D through online community projects, when doctors start printing live tissues for patients … Policymakers must follow suit…(More)”
Forging Trust Communities: How Technology Changes Politics
Book by Irene S. Wu: “Bloggers in India used social media and wikis to broadcast news and bring humanitarian aid to tsunami victims in South Asia. Terrorist groups like ISIS pour out messages and recruit new members on websites. The Internet is the new public square, bringing to politics a platform on which to create community at both the grassroots and bureaucratic level. Drawing on historical and contemporary case studies from more than ten countries, Irene S. Wu’s Forging Trust Communities argues that the Internet, and the technologies that predate it, catalyze political change by creating new opportunities for cooperation. The Internet does not simply enable faster and easier communication, but makes it possible for people around the world to interact closely, reciprocate favors, and build trust. The information and ideas exchanged by members of these cooperative communities become key sources of political power akin to military might and economic strength.
Wu illustrates the rich world history of citizens and leaders exercising political power through communications technology. People in nineteenth-century China, for example, used the telegraph and newspapers to mobilize against the emperor. In 1970, Taiwanese cable television gave voice to a political opposition demanding democracy. Both Qatar (in the 1990s) and Great Britain (in the 1930s) relied on public broadcasters to enhance their influence abroad. Additional case studies from Brazil, Egypt, the United States, Russia, India, the Philippines, and Tunisia reveal how various technologies function to create new political energy, enabling activists to challenge institutions while allowing governments to increase their power at home and abroad.
Forging Trust Communities demonstrates that the way people receive and share information through network communities reveals as much about their political identity as their socioeconomic class, ethnicity, or religion. Scholars and students in political science, public administration, international studies, sociology, and the history of science and technology will find this to be an insightful and indispensable work…(More)”
Exploring Open Energy Data in Urban Areas
The Worldbank: “…Energy efficiency – using less energy input to deliver the same level of service – has been described by many as the ‘first fuel’ of our societies. However, lack of adequate data to accurately predict and measure energy efficiency savings, particularly at the city level, has limited the realization of its promise over the past two decades.
Why Open Energy Data?
Open Data can be a powerful tool to reduce information asymmetry in markets, increase transparency and help achieve local economic development goals. Several sectors like transport, public sector management and agriculture have started to benefit from Open Data practices. Energy markets are often characterized by less-than-optimal conditions with high system inefficiencies, misaligned incentives and low levels of transparency. As such, the sector has a lot to potentially gain from embracing Open Data principles.
The United States is a leader in this field with its ‘Energy Data’ initiative. This initiative makes data easy to find, understand and apply, helping to fuel a clean energy economy. For example, the Energy Information Administration’s (EIA) open application programming interface (API) has more than 1.2 million time series of data and is frequently visited by users from the private sector, civil society and media. In addition, the Green Button initiative is empowering American citizens to have access to their own energy usage data, and OpenEI.org is an Open Energy Information platform to help people find energy information, share their knowledge and connect to other energy stakeholders.
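As a rough illustration of how an open API like the EIA’s can be consumed, the sketch below builds a query URL for one time series and extracts date/value pairs from a decoded response. The endpoint path, parameter names, series ID, and JSON shape follow the API’s documented v1 “series” pattern but should be treated as assumptions here; live calls require a free key from eia.gov.

```python
import json
from urllib.parse import urlencode

# Sketch of consuming the EIA open API. The endpoint and parameter
# names reflect the v1 "series" API; the key and series ID below are
# placeholders, not real credentials.
BASE_URL = "http://api.eia.gov/series/"

def build_series_url(api_key, series_id):
    """Construct a query URL for one EIA time series."""
    return BASE_URL + "?" + urlencode({"api_key": api_key,
                                       "series_id": series_id})

def extract_points(payload):
    """Pull (period, value) pairs out of a decoded series response."""
    return [tuple(point) for point in payload["series"][0]["data"]]

# A trimmed example of the JSON shape such an endpoint returns.
sample = json.loads("""
{"series": [{"series_id": "ELEC.GEN.ALL-US-99.M",
             "data": [["201501", 340.1], ["201502", 321.9]]}]}
""")

url = build_series_url("DEMO_KEY", "ELEC.GEN.ALL-US-99.M")
points = extract_points(sample)
print(url)
print(points)
```

Because responses arrive as plain JSON over HTTP, a journalist, researcher, or civic developer can get from “find a series” to a usable table of numbers in a few lines, which is precisely the low barrier to entry that open data initiatives aim for.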
Introducing the Open Energy Data Assessment
To address this data gap in emerging and developing countries, the World Bank is conducting a series of Open Energy Data Assessments in urban areas. The objective is to identify important energy-related data, raise awareness of the benefits of Open Data principles and improve the flow of data between traditional energy stakeholders and others interested in the sector.
The first cities we assessed were Accra, Ghana and Nairobi, Kenya. Both are among the fastest-growing cities in the world, with dynamic entrepreneurial and technology sectors, and both are capitals of countries with an ongoing National Open Data Initiative. The two cities have also been selected to be part of the Negawatt Challenge, a World Bank international competition supporting technology innovation to solve local energy challenges.
The ecosystem approach
The starting point for the exercise was to consider the urban energy sector as an ecosystem, comprised of data suppliers, data users, key datasets, a legal framework, funding mechanisms, and ICT infrastructure. The methodology that we used adapted the established World Bank Open Data Readiness Assessment (ODRA), which highlights valuable connections between data suppliers and data demand. The assessment showcases how to match pressing urban challenges with the opportunity to release and use data to address them, creating a longer-term commitment to the process. Mobilizing key stakeholders to provide quick, tangible results is also key to this approach….(More) …See also World Bank Open Government Data Toolkit.”