Handbook by Douwe Korff and Marie Georges: “This Handbook was prepared for and is used in the EU-funded “T4DATA” training-of-trainers programme. Part I explains the history and development of European data protection law and provides an overview of European data protection instruments including the Council of Europe Convention and its “Modernisation” and the various EU data protection instruments relating to Justice and Home Affairs, the CFSP and the EU institutions, before focusing on the GDPR in Part II. The final part (Part III) consists of detailed practical advice on the various tasks of the Data Protection Officer now institutionalised by the GDPR. Although produced for the T4DATA programme that focusses on DPOs in the public sector, it is hoped that the Handbook will also be useful to anyone else interested in the application of the GDPR, including DPOs in the private sector….(More)”.
Centre for Humanitarian Data: “Survey and needs assessment data, or what is known as ‘microdata’, is essential for providing an adequate response to crisis-affected people. However, collecting this information does present risks. Even when great effort is taken to remove unique identifiers such as names and phone numbers from microdata so that no individual persons or communities are exposed, combining key variables such as location or ethnicity can still allow for re-identification of individual respondents. Statistical Disclosure Control (SDC) is one method for reducing this risk.
The Centre has developed a Guidance Note on Statistical Disclosure Control that outlines the steps involved in the SDC process, potential applications for its use, case studies and key actions for humanitarian data practitioners to take when managing sensitive microdata. Along with an overview of what SDC is and what tools are available, the Guidance Note outlines how the Centre is using this process to mitigate risk for datasets shared on HDX. …(More)”.
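The re-identification risk that SDC addresses can be made concrete with a small sketch. The records, variable names, and threshold below are purely hypothetical, and the check shown (counting how many respondents share each combination of quasi-identifiers, a basic k-anonymity test) is just one common SDC technique, not necessarily the procedure in the Guidance Note:

```python
from collections import Counter

# Hypothetical survey microdata: names are already removed, but these
# quasi-identifiers, in combination, could still single out a respondent.
records = [
    {"location": "District A", "ethnicity": "Group 1", "sex": "F"},
    {"location": "District A", "ethnicity": "Group 1", "sex": "F"},
    {"location": "District A", "ethnicity": "Group 2", "sex": "M"},
    {"location": "District B", "ethnicity": "Group 1", "sex": "M"},
    {"location": "District B", "ethnicity": "Group 1", "sex": "M"},
    {"location": "District B", "ethnicity": "Group 2", "sex": "F"},
]

QUASI_IDENTIFIERS = ("location", "ethnicity", "sex")

def risky_records(records, keys=QUASI_IDENTIFIERS, k=2):
    """Return records whose quasi-identifier combination is shared by
    fewer than k respondents, i.e. records violating k-anonymity."""
    counts = Counter(tuple(r[key] for key in keys) for r in records)
    return [r for r in records if counts[tuple(r[key] for key in keys)] < k]

at_risk = risky_records(records, k=2)
# Two combinations occur only once, so those two records would need
# treatment (e.g. suppressing or coarsening a variable) before release.
```

In practice a data manager would apply such a check before sharing a dataset, then suppress, aggregate, or perturb the offending variables until the residual risk is acceptable.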
Book edited by Sébastien Lechevalier: “The major purpose of this book is to clarify the importance of non-technological factors in innovation to cope with complex contemporary societal issues while critically reconsidering the relations between science, technology, innovation (STI), and society. For a few decades now, innovation—mainly derived from technological advancement—has been considered a driving force of economic and societal development and prosperity.
With that in mind, the following questions are dealt with in this book: What are the non-technological sources of innovation? What can the progress of STI bring to humankind? What roles will society be expected to play in the new model of innovation? The authors argue that the majority of so-called technological innovations are actually socio-technical innovations, requiring huge resources for financing activities, adapting regulations, designing adequate policy frames, and shaping new uses and new users while having the appropriate interaction with society.
This book gathers multi- and trans-disciplinary approaches in innovation that go beyond technology and take into account the inter-relations with social and human phenomena. Illustrated by carefully chosen examples and based on broad and well-informed analyses, it is highly recommended to readers who seek an in-depth and up-to-date integrated overview of innovation in its non-technological dimensions….(More)”.
Paper by Jaehyuk Park et al: “…One of the most popular concepts for policy makers and business economists seeking to understand the structure of the global economy is the “cluster”: the geographical agglomeration of interconnected firms, such as Silicon Valley, Wall Street, and Hollywood. By studying those well-known clusters, we come to understand the advantage firms gain by participating in a geo-industrial cluster and how it relates to the economic growth of a region.
However, the existing definition of the geo-industrial cluster is not systematic enough to reveal the whole picture of the global economy. Often, once defined as a group of firms in a certain area, geo-industrial clusters are treated as independent of one another. But just as we must consider the interaction between the accounting team and the marketing team to understand the organizational structure of a firm, the relationships among geo-industrial clusters are an essential part of the whole picture….
In this new study, my colleagues and I at Indiana University — with support from LinkedIn — have finally overcome these limitations by defining geo-industrial clusters through labor flow and constructing a global labor flow network from LinkedIn’s individual-level job history dataset. Our access to this data was made possible by our selection as one of 11 teams to participate in the LinkedIn Economic Graph Challenge.
The transitioning of workers between jobs and firms — also known as labor flow — is considered central in driving firms towards geo-industrial clusters due to knowledge spillover and labor market pooling. In response, we mapped the cluster structure of the world economy based on labor mobility between firms during the last 25 years, constructing a “labor flow network.”
To do this, we leverage LinkedIn’s data on professional demographics and employment histories from more than 500 million people between 1990 and 2015. The network, which captures approximately 130 million job transitions between more than 4 million firms, is the first-ever flow network of global labor.
The resulting “map” allows us to:
- identify geo-industrial clusters systematically and organically using network community detection
- verify the importance of region and industry in labor mobility
- compare the relative importance between the two constraints at different hierarchical levels
- reveal the practical advantage of the geo-industrial cluster as a unit of future economic analyses.
- show a better picture of which industry in which region leads the economic growth of that industry or region, and
- find out emerging and declining skills based on how strongly they are represented in growing and declining geo-industrial clusters…(More)”.
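The core idea above — letting clusters emerge organically from labor flows via network community detection — can be sketched with a toy example. The firm names, transition counts, and the simple label-propagation routine below are hypothetical illustrations of the general technique, not the study’s actual method or data (the real network spans millions of firms):

```python
from collections import defaultdict

# Hypothetical job moves between firms: (from_firm, to_firm, count),
# a miniature stand-in for the LinkedIn job-history transitions.
transitions = [
    ("FirmA1", "FirmA2", 3), ("FirmA1", "FirmA3", 2), ("FirmA2", "FirmA3", 2),
    ("FirmB1", "FirmB2", 3), ("FirmB1", "FirmB3", 2), ("FirmB2", "FirmB3", 2),
    ("FirmA3", "FirmB1", 1),  # a single cross-cluster move
]

def build_graph(transitions):
    """Undirected weighted graph; edge weight = number of job moves."""
    graph = defaultdict(dict)
    for src, dst, count in transitions:
        graph[src][dst] = graph[src].get(dst, 0) + count
        graph[dst][src] = graph[dst].get(src, 0) + count
    return graph

def label_propagation(graph, max_iters=20):
    """Basic community detection: each firm repeatedly adopts the label
    carrying the most labor flow among its neighbours (ties -> smallest
    label, and nodes are visited in sorted order, for determinism)."""
    labels = {node: node for node in graph}
    for _ in range(max_iters):
        changed = False
        for node in sorted(graph):
            scores = defaultdict(float)
            for neighbour, weight in graph[node].items():
                scores[labels[neighbour]] += weight
            best = max(scores.values())
            new = min(l for l, s in scores.items() if s == best)
            if new != labels[node]:
                labels[node], changed = new, True
        if not changed:
            break
    return labels

labels = label_propagation(build_graph(transitions))
# The A-firms converge on one community label and the B-firms on another,
# despite the bridging move between FirmA3 and FirmB1.
```

The point of the sketch is that no geographic or industry boundary is imposed in advance: the clusters fall out of where workers actually move.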
Katie Langin at Science: “With more than 30,000 academic journals now in circulation, academics can have a hard time figuring out where to submit their work for publication. The decision is made all the more difficult by the sky-high pressure of today’s academic environment—including working toward tenure and trying to secure funding, which can depend on a researcher’s publication record. So, what does a researcher prioritize?
According to a new study posted on the bioRxiv preprint server, faculty members say they care most about whether the journal is read by the people they most want to reach—but they think their colleagues care most about journal prestige. Perhaps unsurprisingly, prestige also held more sway for untenured faculty members than for their tenured colleagues.
“I think that it is about the security that comes with being later in your career,” says study co-author Juan Pablo Alperin, an assistant professor in the publishing program at Simon Fraser University in Vancouver, Canada. “It means you can stop worrying so much about the specifics of what is being valued; there’s a lot less at stake.”
According to a different preprint that Alperin and his colleagues posted on PeerJ in April, 40% of research-intensive universities in the United States and Canada explicitly mention that journal impact factors can be considered in promotion and tenure decisions. Many more likely do so unofficially, with faculty members using journal names on a CV as a kind of shorthand for how “good” a candidate’s publication record is. “You can’t ignore the fact that journal impact factor is a reality that gets looked at,” Alperin says. But some argue that journal prestige and impact factor are overemphasized and harm science, and that academics should focus on the quality of individual work rather than journal-wide metrics.
In the new study, only 31% of the 338 faculty members who were surveyed—all from U.S. and Canadian institutions and from a variety of disciplines, including 38% in the life and physical sciences and math—said that journal prestige was “very important” to them when deciding where to submit a manuscript. The highest priority was journal readership, which half said was very important. Fewer respondents felt that publication costs (24%) and open access (10%) deserved the highest importance rating.
But, when those same faculty members were asked to assess how their colleagues make the same decision, journal prestige shot to the top of the list, with 43% of faculty members saying that it was very important to their peers when deciding where to submit a manuscript. Only 30% of faculty members thought the same thing about journal readership—a drop of 20 percentage points compared with how faculty members assessed their own motivations….(More)”.
Joi Ito at Wired: “If you looked at how many people check books out of libraries these days, you would see failure. Circulation, an obvious measure of success for an institution established to lend books to people, is down. But if you only looked at that figure, you’d miss the fascinating transformation public libraries have undergone in recent years. They’ve taken advantage of grants to become makerspaces, classrooms, research labs for kids, and trusted public spaces in every way possible. Much of the most effective funding encouraged creative librarians to experiment, iterate, scale what worked, and share their learnings with others. If we had focused our funding on just increasing the number of books people were borrowing, we would have missed the opportunity to fund and witness these positive changes.
I serve on the boards of the MacArthur Foundation and the Knight Foundation, which have made grants that helped transform our libraries. I’ve also worked over the years with dozens of philanthropists and investors—those who put money into ventures that promise environmental and public health benefits in addition to financial returns. All of us have struggled to measure the effectiveness of grants and investments that seek to benefit the community, the environment, and so forth. My own research interest in the practice of change has converged with the research of those who are trying to quantify this change, and so recently, my colleague Louis Kang and I have begun to analyze the ways in which people currently measure impact, in the hope of finding better methods to measure the impact of these investments….(More)”.
Chapter by Michael Howlett and Stuti Rawat: “Behavioral science consists of the systematic analysis of processes underlying human behavior through experimentation and observation, drawing on knowledge, research, and methods from a variety of fields such as economics, psychology, and sociology. Because policymaking involves efforts to modify or alter the behavior of policy-takers and centers on the processes of decision-making in government, it has always been concerned with behavioral psychology. Classic studies of decision-making in the field derived their frameworks and concepts from psychology, and the founder of policy sciences, Harold Lasswell, was himself trained as a behavioral political scientist. Hence, it should not be surprising that the use of behavioral science is a feature of many policy areas, including climate change policy.
This is given extra emphasis, however, because climate change policymaking and the rise of climate change as a policy issue coincides with a resurgence in behaviorally inspired policy analysis and design brought about by the development of behavioral economics. Thus efforts to deal with climate change have come into being at a time when behavioral governance has been gaining traction worldwide under the influence of works by, among others, Kahneman and Tversky, Thaler, and Sunstein. Such behavioral governance studies have focused on the psychological and cognitive behavioral processes in individuals and collectives, in order to inform, design, and implement different modes of governing. They have been promoted by policy scholars, including many economists working in the area who prefer its insights to those put forward by classical or neoclassical economics.
In the context of climate change policy, behavioral science plays two key roles—through its use of behaviorally premised policy instruments as new modes of public policy being used or proposed to be used, in conjunction with traditional climate change policy tools; and as a way of understanding some of the barriers to compliance and policy design encountered by governments in combating the “super wicked problem” of climate change. Five kinds of behavioral tools have been found to be most commonly used in relation to climate change policy: provision of information, use of social norms, goal setting, default rules, and framing. A large proportion of behavioral tools has been used in the energy sector, because of its importance in the context of climate change action and the fact that energy consumption is easy to monitor, thereby facilitating impact assessment….(More)”.
EU report by Rene Van Bavel et al: “Recognising that advances in the behavioural, decision and social sciences demonstrate that we are not purely rational beings, this report brings new insights into our political behaviour; this understanding has the potential to address some of the current crises in our democracies. Sixty experts from across the globe, working in the behavioural and social sciences as well as the humanities, contributed to the research underpinning this JRC report, which argues that evidence-informed policymaking must not be taken for granted. A chapter is dedicated to each key finding, outlining the latest scientific thinking along with an overview of the possible implications for policymaking. The key findings are:
- Misperception and Disinformation: Our thinking skills are challenged by today’s information environment and make us vulnerable to disinformation. We need to think more about how we think.
- Collective Intelligence: Science can help us re-design the way policymakers work together to take better decisions and prevent policy mistakes.
- Emotions: We can’t separate emotion from reason. Better information about citizens’ emotions and greater emotional literacy could improve policymaking.
- Values and Identities: These drive political behaviour but are not properly understood or debated.
- Framing, Metaphor and Narrative: Facts don’t speak for themselves. Framing, metaphors and narratives need to be used responsibly if evidence is to be heard and understood.
- Trust and Openness: The erosion of trust in experts and in government can only be addressed by greater honesty and public deliberation about interests and values.
- Evidence-informed policymaking: The principle that policy should be informed by evidence is under attack. Politicians, scientists and civil society need to defend this cornerstone of liberal democracy….(More)”
Conference Paper by Christine Meschede and Tobias Siebenlist: “Since the adoption of the United Nations’ Sustainable Development Goals (SDGs) in 2015 – an ambitious agenda to end poverty, combat environmental threats and ensure prosperity for everyone – some effort has been made to adequately measure progress on its targets. As the crucial point is the availability of sufficient, comparable information, open data can play a key role. The coverage of open data, i.e., data that is machine-readable, freely available and reusable for everyone, is assessed by several measurement tools. We propose the use of open governmental data to make progress toward the SDGs easy and transparent to measure. For this purpose, a mapping of open data categories to the SDGs is presented. Further, we argue that the SDGs need to be tackled in particular at the city level. For analyzing the current applicability of open data for measuring progress on the SDGs, we provide a small-scale case study on German open data portals and the embedded data categories and datasets. The results suggest that further standardization is needed in order to be able to use open data for comparing cities and their progress towards the SDGs….(More)”.
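A category-to-SDG mapping of the kind the paper proposes can be sketched as a simple lookup. The portal categories and goal assignments below are purely illustrative assumptions, not the mapping from the paper itself:

```python
# Hypothetical mapping from open-data portal categories to SDG numbers.
# The category names and goal assignments are illustrative only.
CATEGORY_TO_SDGS = {
    "environment": {13, 14, 15},  # climate action, life below water / on land
    "education": {4},             # quality education
    "health": {3},                # good health and well-being
    "transport": {9, 11},         # infrastructure, sustainable cities
    "energy": {7, 13},            # affordable clean energy, climate action
}

def sdg_coverage(portal_categories, mapping=CATEGORY_TO_SDGS):
    """Which SDGs a city's open data portal can, in principle, help measure:
    the union of goals mapped to each category the portal publishes."""
    covered = set()
    for category in portal_categories:
        covered |= mapping.get(category, set())  # unmapped categories ignored
    return covered

city_a = sdg_coverage(["environment", "health"])
city_b = sdg_coverage(["education", "unknown-category"])
```

Comparing such coverage sets across city portals is one way to make the paper’s point concrete: without standardized categories, the same lookup cannot be applied uniformly across cities.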
Paper by Wolfgang Kerber and Daniel Moeller: “The need for regulatory solutions for access to in-vehicle data and resources of connected cars is one of the major controversial and unresolved policy issues. Last year the EU revised the Motor Vehicle Type Approval Regulation, which already entailed a FRAND-like solution for access to repair and maintenance information (RMI) to protect competition in the automotive aftermarkets. However, the transition to connected cars changes the technological conditions for this regulatory solution significantly. This paper analyzes the reform of the type approval regulation and shows that the regulatory solutions for access to RMI are so far insufficient for dealing with the challenges that come with increased connectivity, e.g. the new remote diagnostic, repair, and maintenance services. Therefore, an important result of the paper is that the transition to connected cars will require a further reform of the rules for regulated access to RMI (esp. with regard to data access, interoperability, and safety/security issues). However, our analysis also suggests that the basic approach of the current regulated access regime for RMI in the type approval regulation can also be a model for developing general solutions to the currently unsolved problems of access to in-vehicle data and resources in the ecosystem of connected driving….(More)”.