Living Labs As A Collaborative Framework For Changing Perceptions And Goals


Co-Val: “In the…Report on cross-country comparison on existing innovation and living labs, Lars Fuglsang and Anne Vorre Hansen from Roskilde University describe various applications of living labs to decision-making. The two basic examples are living labs as a collaborative framework for changing perceptions and goals, and living labs as an ecosystem for policy innovation.

Living labs can involve a change in mindset and goals, as expressed in one paper on public sector innovation labs (Carstensen & Bason, 2012). Carstensen and Bason (2012) recount the story of the Danish Mindlab (2002-2018) – a cross-governmental innovation lab involving public sector organisations, citizens and businesses in creating new solutions for society. They argue that innovation labs are designed to foster collaboration, since labs are platforms where multiple stakeholders can engage in interaction, dialogue, and development activities. Innovation requires a different approach than everyday activities, as well as a shift in employees’ mindset and culture towards thinking more systematically about innovation. Mindlab’s methodologies are anchored in design thinking, qualitative research and policy development, with the aim of capturing the subjective reality experienced by both citizens and businesses in the development of new solutions. Carstensen and Bason (2012) list the following key principles of Mindlab: take charge of on-going renewal, maintain top management backing, create professional empathy, insist on collaboration, do – don’t just think, recruit and develop likeable people, don’t be too big, communicate.

Also, Buhr et al. (2016) show how living labs can be important for developing and implementing collective goals and creating new opportunities for citizens to influence public affairs. They describe two cases in two suburban areas (located in Sweden and Finland), where the living lab approach was used to improve the feeling of belonging in a community. In one of the two suburbs studied, a living lab approach was used to change the lighting on a pathway that seemed unsafe; in the other, it was used to strengthen the social community by renovating a kiosk and organizing varied activities for the citizens. Both living labs motivated residents to work on societal goals for sustainability and to choose solutions. The study indicates that a living lab approach can be used to gain support for change and thereby increase citizens’ appreciation of a local area. Further, living labs may give citizens a feeling that they are being listened to. Living labs can thus create opportunities for citizens to develop the city together with municipal policy-makers and other stakeholders and enable policy-makers to respond to the expressed needs of the citizens….(More)”

Data Science for Local Government


Report by Jonathan Bright, Bharath Ganesh, Cathrine Seidelin and Thomas Vogl: “The Data Science for Local Government project was about understanding how the growth of ‘data science’ is changing the way that local government works in the UK. We define data science as a dual shift which involves both bringing new decision-making and analytical techniques into local government work (e.g. machine learning and predictive analytics, artificial intelligence and A/B testing) and expanding the types of data local government makes use of (for example, by repurposing administrative data, harvesting social media data, or working with mobile phone companies). The emergence of data science is facilitated by the growing availability of free, open-source tools for both collecting data and performing analysis.
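
To make the “dual shift” concrete, here is a minimal, purely illustrative sketch of the kind of predictive analytics the report describes. The dataset, feature names and outcome below are invented stand-ins for the sort of repurposed administrative data a local authority might hold; they are not taken from the project.

```python
# Illustrative sketch: predicting which service requests are likely to escalate,
# using synthetic stand-in data (a real project would use repurposed
# administrative records, suitably anonymised and governed).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5_000
# Hypothetical features: prior contacts, days open, deprivation decile.
X = np.column_stack([
    rng.poisson(2, n),          # prior contacts with the council
    rng.exponential(10, n),     # days the case has been open
    rng.integers(1, 11, n),     # deprivation decile of the address
])
# Synthetic outcome: does the case escalate to a statutory complaint?
logits = 0.4 * X[:, 0] + 0.05 * X[:, 1] - 0.1 * X[:, 2] - 1.5
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```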

Based on extensive documentary review, a nationwide survey of local authorities, and in-depth interviews with over 30 practitioners, we have sought to produce a comprehensive guide to the different types of data science being undertaken in the UK, the types of opportunities and benefits created, and also some of the challenges and difficulties being encountered.

Our aim was to provide a basis for people working in local government to start on their own data science projects, both by providing a library of dozens of ideas which have been tried elsewhere and also by providing hints and tips for overcoming key problems and challenges….(More)”

Dark Data Plagues Federal Organizations


Brandi Vincent at NextGov: “While government leaders across the globe are excited about unleashing artificial intelligence in their organizations, most are struggling to deploy it for their missions because they can’t wrangle their data, a new study suggests.

In a survey released this week, Splunk and TRUE Global Intelligence polled 1,365 global business managers and IT leaders across seven countries. The research indicates that the majority of organizations’ data is “dark,” or unquantified, untapped and usually generated by systems, devices or interactions.

AI runs on data and yet few organizations seem to be able to tap into its value—or even find it.

“Neglected by business and IT managers, dark data is an underused asset that demands a more sophisticated approach to how organizations collect, manage and analyze information,” the report said. “Yet respondents also voiced hesitance about diving in.”

A third of respondents said more than 75% of their organizations’ data is dark and only one in every nine people reports that less than a quarter of their organizations’ data is dark.

Many of the global respondents said a lack of interest from their leadership makes it hard to recover dark data. Another 60% also said more than half of their organizations’ data is not captured and “much of it is not even understood to exist.”

Research also suggests that while almost 100% of respondents believe data skills are critical for jobs in the future, more than half feel too old to learn new skills and 69% are content to keep doing what they are doing, even if it means they won’t be promoted.

“Many say they’d be content to let others take the lead, even at the expense of their own career progress,” the report said.

More than half of the respondents said they don’t understand AI well, as it’s still in its early stages, and 39% said their colleagues and industry don’t get it either. They said few organizations are deploying the new tech right now, but the majority of respondents do see its potential….(More)”.

The Ruin of the Digital Town Square


Special Issue of The New Atlantis: “Across the political spectrum, a consensus has arisen that Twitter, Facebook, YouTube, and other digital platforms are laying ruin to public discourse. They trade on snarkiness, trolling, outrage, and conspiracy theories, and encourage tribalism, information bubbles, and social discord. How did we get here, and how can we get out? The essays in this symposium seek answers to the crisis of “digital discourse” beyond privacy policies, corporate exposés, and smarter algorithms.

The Inescapable Town Square
L. M. Sacasas on how social media combines the worst parts of past eras of communication

Preserving Real-Life Childhood
Naomi Schaefer Riley on why decency online requires raising kids who know life offline

How Not to Regulate Social Media
Shoshana Weissmann on proposed privacy and bot laws that would do more harm than good

The Four Facebooks
Nolen Gertz on misinformation, manipulation, dependency, and distraction

Do You Know Who Your ‘Friends’ Are?
Ashley May on why treating others well online requires defining our relationships

The Distance Between Us
Micah Meadowcroft on why we act badly when we don’t speak face-to-face

The Emergent Order of Twitter
Andy Smarick on why the platform should be fixed from the bottom up, not the top down

Imagine All the People
James Poulos on how the fantasies of the TV era created the disaster of social media

Making Friends of Trolls
Caitrin Keiper on finding familiar faces behind the black mirror…(More)”

Surround Sound


Report by the Public Affairs Council: “Millions of citizens and thousands of organizations contact Congress each year to urge Senators and House members to vote for or against legislation. Countless others weigh in with federal agencies on regulatory issues ranging from healthcare to livestock grazing rights. Congressional and federal agency personnel are inundated with input. So how do staff know what to believe? Who do they trust? And which methods of communicating with government seem to be most effective? To find out, the Public Affairs Council teamed up with Morning Consult in an online survey of 173 congressional and federal employees. Participants were asked for their views on social media, fake news, influential methods of communication and trusted sources of policy information.

When asked to compare the effectiveness of different advocacy techniques, congressional staff rate personal visits to Washington, D.C., (83%) or district offices (81%), and think tank reports (81%) at the top of the list. Grassroots advocacy techniques such as emails, phone calls and postal mail campaigns also score above 75% for effectiveness.

Traditional in-person visits from lobbyists are considered effective by a strong majority (75%), as are town halls (73%) and lobby days (72%). Of the 13 options considered, the lowest score goes to social media posts, which are still rated effective by 57% of survey participants.

Despite their unpopularity with the general public, corporate CEOs are an asset when it comes to getting meetings scheduled with members of Congress. Eighty-three percent (83%) of congressional staffers say their boss would likely meet with a CEO from their district or state when that executive comes to Washington, D.C., compared with only 7% who say their boss would be unlikely to take the meeting….(More)”.

New Report Examines Reproducibility and Replicability in Science, Recommends Ways to Improve Transparency and Rigor in Research


National Academies of Sciences: “While computational reproducibility in scientific research is generally expected when the original data and code are available, lack of ability to replicate a previous study — or obtain consistent results looking at the same scientific question but with different data — is more nuanced and occasionally can aid in the process of scientific discovery, says a new congressionally mandated report from the National Academies of Sciences, Engineering, and Medicine.  Reproducibility and Replicability in Science recommends ways that researchers, academic institutions, journals, and funders should help strengthen rigor and transparency in order to improve the reproducibility and replicability of scientific research.

Defining Reproducibility and Replicability

The terms “reproducibility” and “replicability” are often used interchangeably, but the report uses each term to refer to a separate concept.  Reproducibility means obtaining consistent computational results using the same input data, computational steps, methods, code, and conditions of analysis.  Replicability means obtaining consistent results across studies aimed at answering the same scientific question, each of which has obtained its own data.   

Reproducing research involves using the original data and code, while replicating research involves new data collection and similar methods used in previous studies, the report says.  Even when a study was rigorously conducted according to best practices, correctly analyzed, and transparently reported, it may fail to be replicated. 
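
To illustrate the distinction in computational terms (this example is ours, not the committee’s): a reproducible analysis pins down its data, code and random seed so the same inputs always yield the same output, while a replication gathers its own data and re-asks the question. The sketch below uses simulated data and assumes nothing about any particular study.

```python
# Illustrative sketch of the reproducibility/replicability distinction.
import numpy as np

def run_study(seed: int) -> float:
    """Simulate one 'study': draw a sample and estimate a treatment effect."""
    rng = np.random.default_rng(seed)
    control = rng.normal(loc=0.0, scale=1.0, size=200)
    treated = rng.normal(loc=0.3, scale=1.0, size=200)
    return treated.mean() - control.mean()

# Reproducibility: the same data, code, and seed give the identical result.
assert run_study(seed=42) == run_study(seed=42)

# Replicability: a new study collects its own data (a different seed here),
# so the estimate will differ somewhat even if the underlying effect is real.
print("original study:", round(run_study(seed=42), 3))
print("replication:   ", round(run_study(seed=7), 3))
```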

“Being able to reproduce the computational results of another researcher starting with the same data and replicating a previous study to test its results facilitate the self-correcting nature of science, and are often cited as hallmarks of good science,” said Harvey Fineberg, president of the Gordon and Betty Moore Foundation and chair of the committee that conducted the study.  “However, factors such as lack of transparency of reporting, lack of appropriate training, and methodological errors can prevent researchers from being able to reproduce or replicate a study.  Research funders, journals, academic institutions, policymakers, and scientists themselves each have a role to play in improving reproducibility and replicability by ensuring that scientists adhere to the highest standards of practice, understand and express the uncertainty inherent in their conclusions, and continue to strengthen the interconnected web of scientific knowledge — the principal driver of progress in the modern world.”….(More)”.

The future of work? Work of the future!


European Commission: “While historical evidence suggests that previous waves of automation have been overwhelmingly positive for the economy and society, AI is in a different league, with the potential to be much more disruptive. It builds upon other digital technologies but also brings about and amplifies major socioeconomic changes of its own.

What do recent technological developments in AI and robotisation mean for the economy, businesses and jobs? Should we be worried or excited? Which jobs will be destroyed and which new ones created? What should education systems, businesses, governments and social partners do to manage the coming transition successfully?
These are some of the questions considered by Michel Servoz, Senior Adviser on Artificial Intelligence, Robotics and the Future of Labour, in this in-depth study requested by European Commission President Jean-Claude Juncker….(More)”.

GIS and the 2020 Census


ESRI: “GIS and the 2020 Census: Modernizing Official Statistics provides statistical organizations with the most recent GIS methodologies and technological tools to support census workers’ needs at all the stages of a census. Learn how to plan and carry out census work with GIS using new technologies for field data collection and operations management. International case studies illustrate concepts in practice….(More)”.

Data Pools: Wi-Fi Geolocation Spoofing


AH Projects: “DataPools is a Wi-Fi geolocation spoofing project that virtually relocates your phone to the latitudes and longitudes of Silicon Valley success. It includes a catalog and a SkyLift device with 12 pre-programmed locations. DataPools was produced for the Tropez summer art event in Berlin and in collaboration with Anastasia Kubrak.

DataPools catalog pool index

Weren’t invited to Jeff Bezos’s summer pool party? No problem. DataPools uses the SkyLift device to mimic the Wi-Fi network infrastructure at the homes of 12 top Silicon Valley CEOs, causing your phone to show up, approximately, at their pool. Because Wi-Fi spoofing affects the core geolocation services of iOS and Android smartphones, all apps on the phone, and the metadata they generate, will be located in the spoofed location…

Data Pools is a metaphor for a store of wealth that is private. The luxurious pools and mansions of Silicon Valley are financed by the mechanisms of economic surveillance and ownership of our personal information. Yet, the geographic locations of these premises are often concealed, hidden, and removed from open source databases. What if we could reverse this logic and plunge into the pools of ludicrous wealth, both virtually and physically? Could we apply the same methods of data extraction to highlight the ridiculous inequalities between CEOs and platform users?

Comparison of wealth distribution among top Silicon Valley CEOs

Data

Technically, DataPools uses a Wi-Fi microcontroller programmed with the BSSIDs and SSIDs from the target locations, which were all obtained using openly published information from web searches and wigle.net. This data is then programmed onto the firmware of the SkyLift device. One SkyLift device contains all 12 pool locations. However, improvements were made throughout the installation, and the updated firmware now uses one main location with multiple sub-locations to cover a larger area. This method was more effective at spoofing many phones in a large area and is ideal for installations….(More)”.
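
As a rough illustration of the idea, the sketch below broadcasts Wi-Fi beacon frames advertising a set of BSSIDs and SSIDs, which is what leads nearby phones to believe they are near those access points. The actual SkyLift device does this in microcontroller firmware; this sketch instead uses Python and the scapy library on a Linux laptop with a monitor-mode wireless interface, and the network names and addresses are made-up placeholders rather than the project’s data.

```python
# Hedged sketch: broadcasting spoofed Wi-Fi beacon frames with scapy.
# Assumes a Linux host, a wireless card in monitor mode (e.g. wlan0mon),
# and root privileges. BSSIDs/SSIDs here are invented placeholders.
from scapy.all import RadioTap, Dot11, Dot11Beacon, Dot11Elt, sendp

# One "location" = the set of access points observed there (e.g. via wigle.net).
LOCATION = [
    ("de:ad:be:ef:00:01", "PoolHouse-Guest"),
    ("de:ad:be:ef:00:02", "Cabana-5G"),
]

frames = []
for bssid, ssid in LOCATION:
    frame = (
        RadioTap()
        / Dot11(type=0, subtype=8,              # management frame, beacon
                addr1="ff:ff:ff:ff:ff:ff",      # broadcast destination
                addr2=bssid, addr3=bssid)
        / Dot11Beacon(cap="ESS")
        / Dot11Elt(ID="SSID", info=ssid.encode())
    )
    frames.append(frame)

# Re-send the beacons continuously; phones relying on Wi-Fi geolocation may
# then resolve their position to wherever these BSSIDs were originally seen.
sendp(frames, iface="wlan0mon", inter=0.1, loop=1, verbose=False)
```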

The Blockchain Game: A great new tool for your classroom


IBM Blockchain Blog: “Blockchain technology can be a game-changer for accounting, supply chain, banking, contract law, and many other fields. But it will only be useful if lots and lots of non-technical managers and leaders trust and adopt it. And right now, just understanding what blockchain is can be difficult, even for the brightest in these fields. Enter The Blockchain Game, a hands-on exercise that explains blockchain’s core principles and serves as a launching pad for discussion of blockchain’s real-world applications.

In The Blockchain Game students act as nodes and miners on a blockchain network for storing student grades at a university. Participants record the grade and course information, and then “build the block” by calculating a unique identifier (a hash) to secure the grade ledger, and miners get rewarded for their work. As the game is played, the audience learns about hashes, private keys, and what uses are appropriate for a blockchain ledger.
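
For a rough sense of what “building the block” involves (this sketch is illustrative and not part of the game’s materials), each block’s hash can be computed from its grade record plus the previous block’s hash, so tampering with any earlier grade invalidates every later hash:

```python
# Minimal sketch of hash-chaining grade records, in the spirit of the game.
import hashlib
import json

def block_hash(prev_hash: str, record: dict) -> str:
    """Hash the previous block's hash together with this block's record."""
    payload = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

grades = [
    {"student": "A. Student", "course": "CHEM 101", "grade": "B+"},
    {"student": "B. Learner", "course": "MATH 220", "grade": "A-"},
]

chain = []
prev = "0" * 64                      # genesis value
for record in grades:
    h = block_hash(prev, record)
    chain.append({"record": record, "hash": h})
    prev = h                         # each block points at the one before it

# Tampering with the first grade no longer matches the recorded hash.
grades[0]["grade"] = "A+"
assert block_hash("0" * 64, grades[0]) != chain[0]["hash"]
for block in chain:
    print(block["hash"][:16], block["record"])
```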

Basics of the Game

  • A hands-on simulation centering on a blockchain for academic scores, including a discussion at the end of the simulation about whether storing grades would be a good application for blockchain.
  • No computers. Participants are the computers and calculate the blocks.
  • The game seeks to teach core concepts about a distributed ledger but can be modified to whichever use case the educator wishes to explore: smart contracts, supply chain applications, and others.
  • Additional elements can be added if instructors want to facilitate the game on a computer….(More)”.