Colleen Flaherty at Inside Higher Ed: “What a difference preparation makes when it comes to doing research in Arctic-level air-conditioned academic libraries (or ones that are otherwise freezing — or not air-conditioned at all). Luckily, Megan L. Cook, assistant professor of English at Colby College, published a crowdsourced document called “How Cold Is that Library?” ….
Cook, who was not immediately available for comment, has said the document was a group effort. Juliet Sperling, a faculty fellow in American art at Colby, credited her colleague’s “brilliance” but said the document was “generally inspired by conversations we’ve had as co-fellows” in the Andrew W. Mellon Society of Fellows in Critical Bibliography. The society brings together 60-some scholars of rare books and material texts from a variety of disciplinary or institutional approaches, she said, “so collectively, we’ve all spent quite a bit of time in libraries of various climates all over the world.” In addition to library temperatures, lighting and even humidity levels, the scholars trade research destinations’ photo policies and nearby eateries and watering holes, among other tips. A spreadsheet opens up that resource to others, Sperling said. …(More)”.
IBM Blockchain Blog: “Blockchain technology can be a game-changer for accounting, supply chain, banking, contract law, and many other fields. But it will only be useful if lots and lots of non-technical managers and leaders trust and adopt it. And right now, just understanding what blockchain is can be difficult, even for the brightest in these fields. Enter The Blockchain Game, a hands-on exercise that explains blockchain’s core principles and serves as a launching pad for discussion of blockchain’s real-world applications.
In The Blockchain Game, students act as nodes and miners on a blockchain network for storing student grades at a university. Participants record the grade and course information, and then “build the block” by calculating a unique identifier (a hash) to secure the grade ledger; miners get rewarded for their work. As the game is played, the audience learns about hashes, private keys, and what uses are appropriate for a blockchain ledger.
Basics of the Game
A hands-on simulation centered on a blockchain for academic scores, ending with a discussion of whether storing grades would be a good application for blockchain.
No computers required: participants are the computers and calculate the blocks themselves.
The game seeks to teach core concepts about a distributed ledger but can be adapted to whichever use case the educator chooses — smart contracts, supply chain applications, and others.
Additional elements can be added if instructors want to facilitate the game on a computer….(More)”.
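For instructors who do move the game onto a computer, the block-building step that participants perform by hand can be sketched in a few lines of Python. This is a simplified illustration, not part of the game’s published materials: the record fields, the SHA-256 choice, and the chaining format are all assumptions.

```python
import hashlib

def build_block(student, course, grade, prev_hash):
    """Compute a unique identifier (hash) for one grade record,
    chained to the previous block's hash -- the calculation
    participants carry out by hand in the game."""
    record = f"{student}|{course}|{grade}|{prev_hash}"
    return hashlib.sha256(record.encode()).hexdigest()

# A tiny three-block grade ledger: each block's hash depends on the
# previous one, so altering any earlier grade would change every
# hash that follows -- which is what makes tampering detectable.
ledger = []
prev_hash = "0" * 64  # placeholder "genesis" hash
for student, course, grade in [("Ada", "CS101", "A"),
                               ("Ben", "CS101", "B+"),
                               ("Cleo", "MA201", "A-")]:
    h = build_block(student, course, grade, prev_hash)
    ledger.append({"student": student, "course": course,
                   "grade": grade, "hash": h})
    prev_hash = h

print(len(ledger), ledger[-1]["hash"])
```

Recomputing the first block with a tampered grade yields a different hash, which would in turn invalidate every block after it — the property the in-class discussion of grade ledgers turns on.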
Blog by Jennifer Latson for Arnold Ventures: “When you buy a car, you want to know it will get you where you’re going. Before you invest in a certain model, you check its record. How does it do in crash tests? Does it have a history of breaking down? Are other owners glad they bought it?
Students choosing between college programs can’t do the same kind of homework. Much of the detailed data we demand when we buy a car isn’t available for postsecondary education — data such as how many students find jobs in the fields they studied, what they earn, how much debt they accumulate, and how quickly they repay it — yet choosing a college is a much more important financial decision.
The most promising solution to filling in the gaps, according to data advocates, is the College Transparency Act, which would create a secure, comprehensive national data network with information on college costs, graduation rates, and student career paths — and make this data publicly available. The bill, which will be discussed in Congress this year, has broad support from both Republicans and Democrats in the House and the Senate in part because it includes precautions to protect privacy and secure student data….
The data needed to answer questions about student success already exists but is scattered among various agencies and institutions: the Department of Education for data on student loan repayment; the Treasury Department for earnings information; and schools themselves for graduation rates.
“We can’t connect the dots to find out how these programs are serving certain students, and that’s because the Department of Education isn’t allowed to connect all the information these places have already collected,” says Amy Laitinen, director for higher education at New America, a think tank collaborating with IHEP to promote educational transparency. And until recently, publicly available federal postsecondary data only included full-time students who’d never enrolled in a college program before, ignoring the more than half of the higher ed population made up of students who attend school part time or who transfer from one institution to another….(More)”.
CCCBLab: “In recent years we have been witnessing a constant trickle of news on artificial intelligence, machine learning and computer vision. We are told that machines learn, see, create… and all this builds up a discourse based on novelty, on a possible future and on a series of worries and hopes. It is difficult, sometimes, to figure out in this landscape which are real developments, and which are fantasies or warnings. And, undoubtedly, this fog that surrounds it forms part of the power that we grant, both in the present and on credit, to these tools, and of the negative and positive concerns that they arouse in us. Many of these discourses may fall into the field of false debates or, at least, of the return of old debates. Thus, in the classical artistic field, associated with the discourse on creation and authorship, there is discussion regarding the entity to be awarded to the images created with these tools. (Yet wasn’t the argument against photography in art that it was an image created automatically and without human participation? And wasn’t that also an argument in favour of taking it and using it to put an end to a certain idea of art?)
Metaphors are essential in the discourse on all digital tools and the power that they have. Are expressions such as “intelligence”, “vision”, “learning”, “neural” and the entire range of similar words the most adequate for defining these types of tools? Probably not, above all if their metaphorical nature is sidestepped. We would not understand them in the same way if we called them tools of probabilistic classification or if instead of saying that an artificial intelligence “has painted” a Rembrandt, we said that it has produced a statistical reproduction of his style (something which is still surprising, and to be celebrated, of course). These names construct an entity for these tools that endows them with a supposed autonomy and independence upon which their future authority is based.
Because that is what it’s about in many discourses: constructing a characterisation that legitimises an objective or non-human capacity in data analysis….
We now find ourselves at what is, probably, the point of the first cultural reception of these tools. From their development in research fields and the applications already derived from them, they are moving into the public discourse. It is in this situation and context, where we do not fully know the breadth and characteristics of these technologies (meaning fears are more abstract and diffuse and, thus, more present and powerful), that it is especially important to understand what we are talking about, to appropriate the tools and to intervene in the discourses. Before their possibilities are restricted and solidified until they seem indisputable, it is necessary to experiment with them and reflect on them, taking advantage of the fact that we can still easily perceive them as in creation, malleable and open.
In our projects The Bad Pupil. Critical pedagogy for artificial intelligences and Latent Spaces. Machinic Imaginations we have tried to approach these tools and their imaginary. In the statement of intentions of the former, we expressed our desire, in the face of the regulatory context and the metaphor of machine learning, to defend the bad pupil as one who escapes the norm. We also argued that, faced with an artificial intelligence that seeks to replicate the human on inhuman scales, it is necessary to defend and construct a non-mimetic one that produces unexpected relations and images.
Fragment of De zeven werken van barmhartigheid, Meester van Alkmaar, 1504 (Rijksmuseum, Amsterdam) analysed with YOLO9000 | The Bad Pupil – Estampa
Both projects are also attempts to appropriate these tools, which means, first of all, escaping industrial barriers and their standards. In this field in which mass data are an asset within reach of big companies, employing quantitatively poor datasets and non-industrial computing capacities is not just a necessity but a demand….(More)”.
About: “The Public Interest Technology Universities Network is a partnership that fosters collaboration between 21 universities and colleges committed to building the nascent field of public interest technology and growing a new generation of civic-minded technologists. Through the development of curricula, research agendas, and experiential learning programs in the public interest technology space, these universities are trying innovative tactics to produce graduates with multiple fluencies at the intersection of technology and policy. By joining PIT-UN, members commit to field building on campus. Members may choose to focus on some or all of these elements, in addition to other initiatives they deem relevant to establishing public interest technology on campus:
Support curriculum and faculty development to enable interdisciplinary and cross-disciplinary education of students, so they can critically assess the ethical, political, and societal implications of new technologies, and design technologies in service of the public good.
Develop experiential learning opportunities such as clinics, fellowships, apprenticeships, and internships, with public and private sector partners in the public interest technology space.
Find ways to support graduates who pursue careers working in public interest technology, recognizing that financial considerations may make careers in this area unaffordable to many.
Create mechanisms for faculty to receive recognition for the research, curriculum development, teaching, and service work needed to build public interest technology as an arena of inquiry.
Provide institutional data that will allow us to measure the effectiveness of our interventions in helping to develop the field of public interest technology….(More)”.
Brookings: “Policymakers under President Obama implemented behaviorally-informed policies to improve college access, completion, and affordability. Given the complexity of the college application process, many of these policies aimed to simplify college and financial aid application processes and reduce informational barriers that students face when evaluating college options. Katharine Meyer and Kelly Ochs Rosinger summarize empirical evidence on these policies and conclude that behaviorally-informed policies play an important role, especially as supplements to (rather than replacements for) broader structural changes. For example, recent changes in the FAFSA filing timeline provided students with more time to complete the form. But this large shift may be more effective in changing behavior when accompanied by informational campaigns and nudges that improve students’ understanding of the new system. Governments and colleges can leverage behavioral science to increase awareness of student support services if more structural policy changes occur to provide the services in the first place….(More)”.
Book edited by Bill Johnston, Sheila MacNeill and Keith Smyth: “Despite the increasing ubiquity of the term, the concept of the digital university remains diffuse and indeterminate. This book examines what the term ‘digital university’ should encapsulate and the resulting challenges, possibilities and implications that digital technology and practice brings to higher education. Critiquing the current state of definition of the digital university construct, the authors propose a more holistic, integrated account that acknowledges the inherent diffuseness of the concept. The authors also question the extent to which digital technologies and practices can allow us to re-think the location of universities and curricula; and how they can extend higher education as a public good within the current wider political context. Framed inside a critical pedagogy perspective, this volume debates the role of the university in fostering the learning environments, skills and capabilities needed for critical engagement, active open participation and reflection in the digital age. This pioneering volume will be of interest and value to students and scholars of digital education, as well as policy makers and practitioners….(More)”
By Alexandra Shaw, Michelle Winowatan, Andrew Young, and Stefaan Verhulst
The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on open data and was originally published in 2018.
Stefaan Verhulst at Apolitical: “2018 will probably be remembered as the year the blockchain hype went bust. Yet even as cryptocurrencies continue to sink in value and popular interest, the potential of using blockchain technologies to achieve social ends remains important to consider but poorly understood.
In 2019, business will continue to explore blockchain for sectors as disparate as finance, agriculture, logistics and healthcare. Policymakers and social innovators should also leverage 2019 to become more sophisticated about blockchain’s real promise, limitations and current practice.
In a recent report I prepared with Andrew Young, with the support of the Rockefeller Foundation, we looked at the potential risks and challenges of using blockchain for social change — or “Blockchan.ge.” A number of implementations and platforms are already demonstrating potential social impact.
In an illustration of the breadth of current experimentation, Stanford’s Center for Social Innovation recently analysed and mapped nearly 200 organisations and projects trying to create positive social change using blockchain. Likewise, the GovLab is developing a mapping of blockchange implementations across regions and topic areas; it currently contains 60 entries.
All these examples provide impressive — and hopeful — proof of concept. Yet despite the very clear potential of blockchain, there has been little systematic analysis. For what types of social impact is it best suited? Under what conditions is it most likely to lead to real social change? What challenges does blockchain face, what risks does it pose and how should these be confronted and mitigated?
These are just some of the questions our report, which builds its analysis on 10 case studies assembled through original research, seeks to address.
While the report is focused on identity management, it contains a number of lessons and insights that are applicable more generally to the subject of blockchange.
In particular, it contains seven design principles that can guide individuals or organisations considering the use of blockchain for social impact. We call these the Genesis principles, and they are outlined at the end of this article…(More)”.
Aaron Smith at the Pew Research Center: “Algorithms are all around us, utilizing massive stores of data and complex analytics to make decisions with often significant impacts on humans. They recommend books and movies for us to read and watch, surface news stories they think we might find relevant, estimate the likelihood that a tumor is cancerous and predict whether someone might be a criminal or a worthwhile credit risk. But despite the growing presence of algorithms in many aspects of daily life, a Pew Research Center survey of U.S. adults finds that the public is frequently skeptical of these tools when used in various real-life situations.
This skepticism spans several dimensions. At a broad level, 58% of Americans feel that computer programs will always reflect some level of human bias – although 40% think these programs can be designed in a way that is bias-free. And in various contexts, the public worries that these tools might violate privacy, fail to capture the nuance of complex situations, or simply put the people they are evaluating in an unfair situation. Public perceptions of algorithmic decision-making are also often highly contextual. The survey shows that otherwise similar technologies can be viewed with support or suspicion depending on the circumstances or on the tasks they are assigned to do….
The following are among the major findings.
The public expresses broad concerns about the fairness and acceptability of using computers for decision-making in situations with important real-world consequences
By and large, the public views these examples of algorithmic decision-making as unfair to the people the computer-based systems are evaluating. Most notably, only around one-third of Americans think that the video job interview and personal finance score algorithms would be fair to job applicants and consumers. When asked directly whether they think the use of these algorithms is acceptable, a majority of the public says that they are not acceptable. Two-thirds of Americans (68%) find the personal finance score algorithm unacceptable, and 67% say the computer-aided video job analysis algorithm is unacceptable….
Attitudes toward algorithmic decision-making can depend heavily on context
Despite the consistencies in some of these responses, the survey also highlights the ways in which Americans’ attitudes toward algorithmic decision-making can depend heavily on the context of those decisions and the characteristics of the people who might be affected….
When it comes to the algorithms that underpin the social media environment, users’ comfort level with sharing their personal information also depends heavily on how and why their data are being used. A 75% majority of social media users say they would be comfortable sharing their data with those sites if it were used to recommend events they might like to attend. But that share falls to just 37% if their data are being used to deliver messages from political campaigns.
In other instances, different types of users offer divergent views about the collection and use of their personal data. For instance, about two-thirds of social media users younger than 50 find it acceptable for social media platforms to use their personal data to recommend connecting with people they might want to know. But that view is shared by fewer than half of users ages 65 and older….(More)”.