Artificial Intelligence Could Help Colleges Better Plan What Courses They Should Offer


Jeffrey R. Young at EdSurge: Big data could help community colleges better predict how industries are changing so they can tailor their IT courses and other programs. After all, if Amazon can forecast what consumers will buy and prestock items in its warehouses to meet the expected demand, why can’t colleges do the same thing when planning their curricula, using predictive analytics to make sure new degree or certificate programs are started just in time for expanding job opportunities?

That’s the argument made by Gordon Freedman, president of the nonprofit National Laboratory for Education Transformation. He’s part of a new center that will do just that, by building a data warehouse that brings together up-to-date information on what skills employers need and what colleges currently offer—and then applying artificial intelligence to attempt to predict when sectors or certain employment needs might be expanding.
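The kind of trend detection Freedman describes can be sketched in a few lines. The sketch below is purely illustrative (the article does not describe the center's actual models): it fits a least-squares slope to monthly job-posting counts per skill and flags skills that are growing, with the skill names, counts, and threshold all invented for the example.

```python
# Illustrative sketch only -- not the center's actual system.
# Flag skills whose monthly job-posting counts show a rising linear trend.

def trend_slope(counts):
    """Least-squares slope of counts over time (index = month)."""
    n = len(counts)
    xs = range(n)
    mean_x = (n - 1) / 2
    mean_y = sum(counts) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, counts))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def expanding_skills(postings_by_skill, min_slope=5.0):
    """Return skills whose posting counts grow faster than min_slope/month."""
    return [skill for skill, counts in postings_by_skill.items()
            if trend_slope(counts) > min_slope]

# Invented example data: six months of posting counts per skill.
postings = {
    "cloud administration": [40, 55, 70, 90, 110, 130],
    "desktop support":      [80, 78, 81, 79, 77, 80],
}
print(expanding_skills(postings))  # flags only the growing skill
```

A real system would of course work from the kind of employer and job-seeker data the center plans to warehouse, and would need far more robust models than a straight-line fit.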

He calls the approach “opportunity engineering,” and the center boasts some heavy-hitting players to assist in the efforts, including the University of Chicago, the San Diego Supercomputer Center and Argonne National Laboratory. It’s called the National Center for Opportunity Engineering & Analysis.

Ian Roark, vice president of workforce development at Pima Community College in Arizona, is among those eager for this kind of “opportunity engineering” to emerge.

He explains that when colleges want to start new programs, they face a long haul—it takes time to develop a new curriculum, put it through an internal review, and then send it through an accreditor….

Other players are already trying to translate the job market into a giant data set to spot trends. LinkedIn sits on one of the biggest troves of data, with hundreds of millions of job profiles, and ambitions to create what it calls the “economic graph” of the economy. But not everyone is on LinkedIn, which attracts mainly those in white-collar jobs. And companies such as Burning Glass Technologies have scanned hundreds of thousands of job listings in an attempt to provide real-time intelligence on what employers say they’re looking for. Even those sources don’t paint the full picture, Freedman argues, such as which new jobs are forming at companies.

“We need better information from the employer, better information from the job seeker and better information from the college, and that’s what we’re going after,” Freedman says…(More)”.

Solving some of the world’s toughest problems with the Global Open Policy Report


 at Creative Commons: “Open Policy is when governments, institutions, and non-profits enact policies and legislation that make content, knowledge, or data they produce or fund available under a permissive license to allow reuse, revision, remix, retention, and redistribution. This promotes innovation, access, and equity in areas of education, data, software, heritage, cultural content, science, and academia.

For several years, Creative Commons has been tracking the spread of open policies around the world. And now, with the new Global Open Policy Report (PDF) by the Open Policy Network, we’re able to provide a systematic overview of open policy development.

The first-of-its-kind report gives an overview of open policies in 38 countries, across four sectors: education, science, data and heritage. The report includes an Open Policy Index and regional impact and local case studies from Africa, the Middle East, Asia, Australia, Latin America, Europe, and North America. The index measures open policy strength on two scales: policy strength and scope, and level of policy implementation. The index was developed by researchers from CommonSphere, a partner organization of CC Japan.
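As a rough illustration of how a two-scale index like this might sort countries into the report's three buckets, here is a hypothetical sketch; the report's actual weighting, score ranges, and thresholds are not given in the excerpt, so every number below is an assumption.

```python
# Hypothetical sketch of combining the two sub-scales into one bucket.
# Weighting, score range, and thresholds are all invented assumptions.

def classify_country(strength_scope, implementation, max_score=10):
    """Average the two sub-scores and bucket the composite result."""
    composite = (strength_scope + implementation) / 2
    if composite >= 0.7 * max_score:
        return "Leading"
    if composite >= 0.4 * max_score:
        return "Mid-Way"
    return "Delayed"

print(classify_country(8, 9))  # → Leading
print(classify_country(5, 4))  # → Mid-Way
print(classify_country(3, 2))  # → Delayed
```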

The Open Policy Index scores were used to classify countries as either Leading, Mid-Way, or Delayed in open policy development. The ten countries with the highest scores are Argentina, Bolivia, Chile, France, Kyrgyzstan, New Zealand, Poland, South Korea, Tanzania, and Uruguay…(More)

The data-driven social worker


NESTA: “Newcastle City Council has been using data to change the way it delivers long-term social work to vulnerable children and families.

Social workers have data analysts working alongside them. This is helping them to identify common factors among types of cases, understand the root causes of social problems and create more effective (and earlier) interventions.

What is it?

Social work teams have an embedded data analyst, whose role is to generate hypotheses to test and perform analyses that offer insight into how best to support families.

Their role is not purely quantitative; they are expected to identify patterns, and undertake deep-dive or case study analysis. The data analysts also test what works, measuring the success of externally commissioned services, along with cost information.

While each social worker only has knowledge of their own individual cases, data analysts have a bird’s-eye view of the whole team’s activity, enabling them to look across sets of families for common patterns.

How does it work?

Data analysts are responsible for maintaining ChildStat, a data dashboard that social workers use to help manage their caseloads. The data insights found by the embedded analysts can highlight the need to work in a different way.

For example, one unit works with children at risk of physical abuse. Case file analysis of the mental health histories of the parents found that 20% of children had parents with a personality disorder, while 60-70% had a parent who had experienced sexual or physical abuse as children themselves.
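The case-file analysis described above is, at its core, a prevalence count across structured case records. A minimal sketch, with field names and records invented for the example:

```python
# Illustrative sketch of the case-file aggregation described above.
# Field names and records are invented, not Newcastle's actual data.

cases = [
    {"parent_personality_disorder": True,  "parent_abused_as_child": True},
    {"parent_personality_disorder": False, "parent_abused_as_child": True},
    {"parent_personality_disorder": False, "parent_abused_as_child": True},
    {"parent_personality_disorder": False, "parent_abused_as_child": False},
    {"parent_personality_disorder": True,  "parent_abused_as_child": False},
]

def prevalence(cases, factor):
    """Share of cases in which a given risk factor is present."""
    hits = sum(1 for c in cases if c[factor])
    return hits / len(cases)

for factor in ("parent_personality_disorder", "parent_abused_as_child"):
    print(f"{factor}: {prevalence(cases, factor):.0%}")
```

The point of embedding analysts is precisely that such counts, trivial once the data are structured, are invisible to a social worker who sees only their own caseload.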

Traditional social work methods may not have uncovered this insight, which led Newcastle to look for new responses to working with these types of families.

Data analysis has also helped to identify the factors that are most predictive of a child becoming NEET (not in education, employment or training), enabling the team to review their approach to working with families and focus on earlier intervention….(More)”
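One simple way an analyst might rank which factors are "most predictive" of a NEET outcome is relative risk: how much more likely the outcome is when a factor is present than when it is absent. The measure, field names, and records below are illustrative assumptions, not Newcastle's actual method.

```python
# Hedged sketch: ranking factors by association with a NEET outcome
# using relative risk. All data below are invented for illustration.

def relative_risk(records, factor, outcome="neet"):
    """P(outcome | factor present) / P(outcome | factor absent)."""
    with_f = [r for r in records if r[factor]]
    without_f = [r for r in records if not r[factor]]
    p_with = sum(r[outcome] for r in with_f) / len(with_f)
    p_without = sum(r[outcome] for r in without_f) / len(without_f)
    return p_with / p_without

records = [
    {"persistent_absence": True,  "exclusion": False, "neet": True},
    {"persistent_absence": True,  "exclusion": True,  "neet": True},
    {"persistent_absence": False, "exclusion": False, "neet": False},
    {"persistent_absence": False, "exclusion": True,  "neet": False},
    {"persistent_absence": False, "exclusion": False, "neet": True},
    {"persistent_absence": False, "exclusion": False, "neet": False},
]
print(relative_risk(records, "persistent_absence"))  # 4.0
```

A factor with a high relative risk is a candidate for earlier intervention, though in practice confounding and small samples would call for more careful modelling.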

Data Literacy – What is it and how can we make it happen?


Introduction by Mark Frank, Johanna Walker, Judie Attard, Alan Tygel of Special Issue on Data Literacy of The Journal of Community Informatics: “With the advent of the Internet and particularly Open Data, data literacy (the ability of non-specialists to make use of data) is rapidly becoming an essential life skill comparable to other types of literacy. However, it is still poorly defined and there is much to learn about how best to increase data literacy amongst both children and adults. This issue addresses both the definition of data literacy and current efforts on increasing and sustaining it. A feature of the issue is the range of contributors. While there are important contributions from the UK, Canada and other Western countries, these are complemented by several papers from the Global South where there is an emphasis on grounding data literacy in context and relating it to the issues and concerns of communities. (Full Text: PDF)

See also:

Creating an Understanding of Data Literacy for a Data-driven Society by Annika Wolff, Daniel Gooch, Jose J. Cavero Montaner, Umar Rashid, Gerd Kortuem

Data Literacy defined pro populo: To read this article, please provide a little information by David Crusoe

Data literacy conceptions, community capabilities by Paul Matthews

Urban Data in the primary classroom: bringing data literacy to the UK curriculum by Annika Wolff, Jose J Cavero Montaner, Gerd Kortuem

Contributions of Paulo Freire for a Critical Data Literacy: a Popular Education Approach by Alan Freihof Tygel, Rosana Kirsch

DataBasic: Design Principles, Tools and Activities for Data Literacy Learners by Catherine D’Ignazio, Rahul Bhargava

Perceptions of ICT use in rural Brazil: Factors that impact appropriation among marginalized communities by Paola Prado, J. Alejandro Tirado-Alcaraz, Mauro Araújo Câmara

Graphical Perception of Value Distributions: An Evaluation of Non-Expert Viewers’ Data Literacy by Arkaitz Zubiaga, Brian Mac Namee

How to Hold Algorithms Accountable


Nicholas Diakopoulos and Sorelle Friedler at MIT Technology Review:  Algorithms are now used throughout the public and private sectors, informing decisions on everything from education and employment to criminal justice. But despite the potential for efficiency gains, algorithms fed by big data can also amplify structural discrimination, produce errors that deny services to individuals, or even seduce an electorate into a false sense of security. Indeed, there is growing awareness that the public should be wary of the societal risks posed by over-reliance on these systems and work to hold them accountable.

Various industry efforts, including a consortium of Silicon Valley behemoths, are beginning to grapple with the ethics of deploying algorithms that can have unanticipated effects on society. Algorithm developers and product managers need new ways to think about, design, and implement algorithmic systems in publicly accountable ways. Over the past several months, we and some colleagues have been trying to address these goals by crafting a set of principles for accountable algorithms….

Accountability implies an obligation to report and justify algorithmic decision-making, and to mitigate any negative social impacts or potential harms. We’ll consider accountability through the lens of five core principles: responsibility, explainability, accuracy, auditability, and fairness.

Responsibility. For any algorithmic system, there needs to be a person with the authority to deal with its adverse individual or societal effects in a timely fashion. This is not a statement about legal responsibility but, rather, a focus on avenues for redress, public dialogue, and internal authority for change. This could be as straightforward as giving someone on your technical team the internal power and resources to change the system, making sure that person’s contact information is publicly available.

Explainability. Any decisions produced by an algorithmic system should be explainable to the people affected by those decisions. These explanations must be accessible and understandable to the target audience; purely technical descriptions are not appropriate for the general public. Explaining risk assessment scores to defendants and their legal counsel would promote greater understanding and help them challenge apparent mistakes or faulty data. Some machine-learning models are more explainable than others, but just because there’s a fancy neural net involved doesn’t mean that a meaningful explanation can’t be produced.

Accuracy. Algorithms make mistakes, whether because of data errors in their inputs (garbage in, garbage out) or statistical uncertainty in their outputs. The principle of accuracy suggests that sources of error and uncertainty throughout an algorithm and its data sources need to be identified, logged, and benchmarked. Understanding the nature of errors produced by an algorithmic system can inform mitigation procedures.
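In practice, the "identify, log, and benchmark" step can start as simply as recording every prediction next to the eventual outcome, so error rates can be computed later. The record schema below is an assumption for illustration, not a standard:

```python
# Sketch of the logging practice the accuracy principle suggests:
# store each prediction with its eventual outcome, then benchmark.
# The schema and model name here are invented assumptions.
import time

error_log = []

def log_prediction(model_name, inputs, prediction, actual=None):
    """Append a structured record so errors can be audited later."""
    error_log.append({
        "timestamp": time.time(),
        "model": model_name,
        "inputs": inputs,
        "prediction": prediction,
        "actual": actual,
    })

def error_rate(log, model_name):
    """Share of scored records where prediction disagreed with outcome."""
    scored = [r for r in log
              if r["model"] == model_name and r["actual"] is not None]
    wrong = sum(1 for r in scored if r["prediction"] != r["actual"])
    return wrong / len(scored)

log_prediction("risk_score_v2", {"age": 34}, "high", actual="low")
log_prediction("risk_score_v2", {"age": 51}, "low", actual="low")
print(error_rate(error_log, "risk_score_v2"))  # 0.5
```

Keeping such a log is also what makes the auditability principle below workable: there is something concrete for a third party to inspect.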

Auditability. The principle of auditability states that algorithms should be developed to enable third parties to probe and review their behavior. Enabling algorithms to be monitored, checked, and criticized would lead to more conscious design and course correction in the event of failure. While there may be technical challenges in allowing public auditing while protecting proprietary information, private auditing (as in accounting) could provide some public assurance. Where possible, even limited access (e.g., via an API) would allow the public a valuable chance to audit these socially significant algorithms.

Fairness. As algorithms increasingly make decisions based on historical and societal data, existing biases and historically discriminatory human decisions risk being “baked in” to automated decisions. All algorithms making decisions about individuals should be evaluated for discriminatory effects. The results of the evaluation and the criteria used should be publicly released and explained….(More)”

New Data Portal to analyze governance in Africa


Governance and Service Delivery: Practical Applications of Social Accountability Across Sectors


Book edited by Derick W. Brinkerhoff, Jana C. Hertz, and Anna Wetterberg: “…Historically, donors and academics have sought to clarify what makes sectoral projects effective and sustainable contributors to development. Among the key factors identified have been (1) the role and capabilities of the state and (2) the relationships between the state and citizens, phenomena often lumped together under the broad rubric of “governance.” Given the importance of a functioning state and positive interactions with citizens, donors have treated governance as a sector in its own right, with projects ranging from public sector management reform, to civil society strengthening, to democratization (Brinkerhoff, 2008). The link between governance and sectoral service delivery was highlighted in the World Bank’s 2004 World Development Report, which focused on accountability structures and processes (World Bank, 2004).

Since then, sectoral specialists’ awareness that governance interventions can contribute to service delivery improvements has increased substantially, and there is growing recognition that both technical and governance elements are necessary facets of strengthening public services. However, expanded awareness has not reliably translated into effective integration of governance into sectoral programs and projects in, for example, health, education, water, agriculture, or community development. The bureaucratic realities of donor programming offer a partial explanation…. Beyond bureaucratic barriers, though, lie ongoing gaps in practical knowledge of how best to combine attention to governance with sector-specific technical investments. What interventions make sense, and what results can reasonably be expected? What conditions support or limit both improved governance and better service delivery? How can citizens interact with public officials and service providers to express their needs, improve services, and increase responsiveness? Various models and compilations of best practices have been developed, but debates remain, and answers to these questions are far from settled. This volume investigates these questions and contributes to building understanding that will enhance both knowledge and practice. In this book, we examine six recent projects, funded mostly by the United States Agency for International Development and implemented by RTI International, that pursued several different paths to engaging citizens, public officials, and service providers on issues related to accountability and sectoral services…(More)”

Digital Kenya: An Entrepreneurial Revolution in the Making


(Open Access) book edited by Bitange Ndemo and Tim Weiss: “Presenting rigorous and original research, this volume offers key insights into the historical, cultural, social, economic and political forces at play in the creation of world-class ICT innovations in Kenya. Following the arrival of fiber-optic cables in 2009, Digital Kenya examines why the initial entrepreneurial spirit and digital revolution has begun to falter despite support from motivated entrepreneurs, international investors, policy experts and others. Written by engaged scholars and professionals in the field, the book offers 15 eye-opening chapters and 14 one-on-one conversations with entrepreneurs and investors to ask why establishing ICT start-ups on a continental and global scale remains a challenge on the “Silicon Savannah”. The authors present evidence-based recommendations to help Kenya to continue producing globally impactful ICT innovations that improve the lives of those still waiting on the side-lines, and to inspire other nations to do the same….(More)”

Make Democracy Great Again: Let’s Try Some ‘Design Thinking’


Ken Carbone in the Huffington Post: “Allow me to begin with the truth. I’ve never studied political science, run for public office nor held a position in government. For the last forty years I’ve led a design agency working with enduring brands across the globe. As with any experienced person in my profession, I have used research, deductive reasoning, logic and “design thinking” to solve complex problems and create opportunities. Great brands that are showing their age turn to our agency to get back on course. In this light, I believe American democracy is a prime target for some retooling….

The present campaign cycle has left many voters wondering how such divisiveness and national embarrassment could be happening in the land of the free and home of the brave. This could be viewed as symptomatic of deeper structural problems in our tradition-bound, 240-year-old democracy. Great brands operate on an “innovate or die” model to ensure success. The continual improvement of how a business operates and adapts to market conditions is a sound and critical practice.

Although the current election frenzy will soon be over, I want to examine three challenges to our election process and propose possible solutions for consideration. I’ll use the same diagnostic thinking I use with major corporations:

Term Limits…

Voting and Voter registration…

Political Campaigns…

In June of this year I attended the annual leadership conference of AIGA, the professional association for design, in Raleigh, NC. A provocative question posed to a select group of designers was “What would you do if you were Secretary of Design?” The responses addressed issues concerning positive social change, education and Veterans Affairs. The audience was full of several hundred trained professionals whose everyday problem-solving methods encourage divergent thinking to explore many solutions (possible or impossible) and then use convergent thinking to select and realize the best resolution. This is the very definition of “design thinking.” That leads to progress….(More)”.

We All Need Help: “Big Data” and the Mismeasure of Public Administration


Essay by Stephane Lavertu in Public Administration Review: “Rapid advances in our ability to collect, analyze, and disseminate information are transforming public administration. This “big data” revolution presents opportunities for improving the management of public programs, but it also entails some risks. In addition to potentially magnifying well-known problems with public sector performance management—particularly the problem of goal displacement—the widespread dissemination of administrative data and performance information increasingly enables external political actors to peer into and evaluate the administration of public programs. The latter trend is consequential because external actors may have little sense of the validity of performance metrics and little understanding of the policy priorities they capture. The author illustrates these potential problems using recent research on U.S. primary and secondary education and suggests that public administration scholars could help improve governance in the data-rich future by informing the development and dissemination of organizational report cards that better capture the value that public agencies deliver….(More)”.