Late one lonely, dark night in the 1980s, out on the rugged frontier of knowledge, Library Science and Artificial Intelligence found each other. In yearning and need they reached out, beyond the familiar. LS was drawn to AI’s raw power and bold, adventurous spirit. AI was attracted by LS’s ample stacks of data, bursting with meaning, earthy and redolent with real, precious knowledge. Once together, their union was unbreakable. The spawn of that union? Informatics.
During the Manhattan Project at Los Alamos in the 1940s, nuclear fission was modeled by semi-automated computation: adding machines driven by expert human operators. Through electronics, computers became increasingly powerful, automated mathematical calculators. Early “programming” referred to the hardware configuration of digital or analog electronic circuits. With data storage via magnetic tape and punch cards, “software” was born. And Electrical Engineering begat Computer Science. Computer Science branched into the theory of languages and algorithms, and allowed us to reconsider the computer not just as a super calculator, but as a symbol processor, maybe even a cognition engine.
Others might have courted and canoodled with Library Science but were too dazzled by their own sounds and lights. Computer Science was ascendant, exciting, in demand. The Computer Age was upon us, and Computer Science had prodigious attitude (though the adoption of “science” in its name reflected a certain insecurity). Data, it was thought, is data, only a single glyph on a complex flowchart of algorithms, symbols, formalized logic, the languages of cyber cognition.
Two tipsy bar patrons settle a dispute about the roster of the 1969 Mets using their smartphones in a few seconds, then move on to another crucial topic. During a keynote lecture at a scientific conference, a student in the audience checks factual claims on her laptop and kindly offers corrections to the (one hopes) grateful expert. Third graders know how to look things up “on the Internet” and are evolving a cultural understanding of this shared and readily accessible knowledge base, quite new to our species. The research of the third grader or drunken debater is not so much analogous to the literature research of the professional scientist; it is much the same thing. Of course, not everything you read on the web is correct. Neither is everything you read in scientific journals. This equivalence can inform our progress as we all face similar challenges in improving online resources and our shared cyberinfrastructure and cyberculture. Wikipedia and Google are hyper-examples, too big to be only examples of Informatics. Google’s mission statement, "to organize the world's information and make it universally accessible and useful," describes what librarians have done since librarians began. Arguably there is an analogy between Wikipedia and Google and the pre-Gutenberg great libraries (e.g. Alexandria), which preserved, promoted, and propagated knowledge and culture through scrolls and books. Like the ancient libraries, these modern structures convey huge volumes of information and thereby assume special, though not unique, roles in the culture and collective knowledge.
Philosophy and Mathematics are understandably self-satisfied in their supremacy in the knowledge hierarchy. The immutable, perfect forms of Plato, the numbers and theorems of Pythagoras and Euclid, continue to be worshipped by lesser fields. Why condescend to the level of data? Data, as Library Science will readily admit, is error ridden, inconsistent, noisy, untidy. But there is beauty in tidying.
The meaning of “Artificial Intelligence” has evolved and deserves explanation. The Turing test concept of artificial intelligence as indistinguishable from human intelligence is an important reference, but it can now be applied and refined given the variety of modern computing methodologies. Some computation may be regarded as superior to human intelligence (e.g. chess playing). However, visions of computers learning in the flexible, adaptable way of human toddlers seem far from realization. Computer cognition is an important kind of intelligence whether or not it resembles human intelligence. It is important now, in 2014, in very concrete ways, because it already controls much of what goes on in our world. Beyond questions of human intelligence and computer intelligence, what about combinations, as humans develop devices that enhance cognition and memory and, maybe dramatizing a bit, co-evolve toward some cognitive sym-cyber-biote?
Natural Sciences enjoy their disdain for Social Sciences. How could belief and behavior possibly be as rigorous and scientific as the mass and velocity of an electron? However, Artificial Intelligence eventually recognized that to develop computer cognition it would need to better understand human cognition and social science. Neuropsychologists are fond of scanned, quantified brain activity, but must also infer cognitive state in very human ways. To the discomfort of some, Philosophy is not far from questions like: What does it mean to know? How can we learn? What is the fundamental particle of knowledge?
What is “informatics”? And why might “informatics” be a very big deal right now? Why might “informatics” need quotes? Because, as with any new field, its terminology is in flux, obfuscated and distorted by hype. Popular magazines and scientific journals alike note that we are in “The Information Age”, the era of “Big Data”, and that science has evolved a “4th Paradigm”. The Internet and World Wide Web have apparently given rise to “Web 2.0” and the “Semantic Web.” We apparently suffer from “information overload” and “data loss” due to limited “Cyberinfrastructure”. We must be “data driven” and avoid “data silos” in our “4th Paradigm” thinking. Thanks to modern social networking technology, we can upload, in a few touches of a handheld, megabytes of images (happy hour hijinks, a new gas grill, a cat making a face, a homebrew recipe) to “The Cloud”, our cultural computing commonwealth. As if we didn’t have enough data to worry about, we should also mind our “metadata” and, beyond that, our “ontologies”. Data loss, data corruption, and cybersecurity issues keep us nervous, as do concerns that we are falling behind the technology curve and that our smartphones are not keeping up with the neighbors’.
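To make two of those terms slightly more concrete, here is a minimal, illustrative sketch in Python of the distinction between data, metadata, and an ontology; the record, fields, and vocabulary below are invented purely for the example and do not follow any particular standard.

    # Toy illustration (all values invented): data vs. metadata vs. ontology.

    # Data: the content itself.
    data = "Late one lonely, dark night in the 1980s..."

    # Metadata: structured statements about the data.
    metadata = {
        "title": "Informatics",
        "format": "text/plain",
        "created": "2014",
    }

    # Ontology: a shared vocabulary of classes and relations that lets
    # independent systems interpret the metadata the same way.
    ontology = {
        "classes": ["Document", "Person"],
        "relations": {"creator": ("Document", "Person")},
    }

    # With all three, a machine can answer questions the raw content alone
    # cannot, e.g. "find all Documents created in 2014" across many collections.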
The emergence and ascendance of Informatics is unnoticed by some, admired and even revered by others. There is a predictable tendency of some to elevate and expand Informatics beyond even its great importance. When overhyped and overstated, Informatics becomes synonymous with Knowledge, or Science, or Language, Culture or Civilization. All good, but we don’t need new names for them.
From Big Data, sophisticated computer models produce economic forecasts, or suggest what we should consume for better health and longer life. Simultaneously, data can be selected and shaped to statistically support almost any claim. Scientists and citizens alike are vulnerable to misrepresentations and abuse of data, both accidental and intentional. Computer models now dominate fields from stock trading to baseball management. For those who must take risks on future outcomes, Informatics is a game changer, separating winners and losers (e.g. the 2012 election, multiple professional sports leagues).
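As a small, hedged illustration of how selection can shape a statistical story, the numbers below are invented purely for the demonstration: the same series shows an upward or a downward trend depending on which slice of it you choose to fit.

    # Minimal sketch (invented numbers): cherry-picking a window flips the trend.
    x = list(range(10))                     # e.g. years 0..9
    y = [5, 3, 4, 2, 3, 4, 5, 6, 7, 8]      # a noisy measurement over those years

    def slope(xs, ys):
        """Ordinary least-squares slope of ys against xs."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        den = sum((a - mx) ** 2 for a in xs)
        return num / den

    print(slope(x, y))           # full record: positive slope (upward trend)
    print(slope(x[:4], y[:4]))   # early window only: negative slope (downward)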
The epic union of Library Science and Artificial Intelligence may well inspire religions and myths, if history is a guide. There is a resonant truth in the indivisibility of memory, information, knowledge, cognition, and computation. All logic is expressed in symbols built upon elemental experience. All experience is defined by logical frameworks of cognition. Somehow the ultimate learning genius, a human baby, in the ultimate boot-strapping algorithm, begins it all with the force and warmth of lips on nipple, and develops a system of knowledge that encompasses quantum physics and abstract art.
Keywords: Library science, artificial intelligence, information systems, automated reasoning, machine learning, computer science, electrical engineering, cognition, 4th Paradigm, Big Data, The Information Age, Internet, World Wide Web, Semantic Web, semantic technology, data science.