The concept of links between documents begins to be discussed as a paradigm for organizing textual material and knowledge.
ZIP (Zone Improvement Plan) codes are introduced by the US Post Office.
The first full-text searching of documents by computer is demonstrated.
The National Bureau of Standards (now NIST) publishes tables and properties of many higher mathematical functions.
The first version of the MeSH medical lexicon goes into use.
US President Lyndon Johnson signs the Freedom of Information Act into law, mandating public access to government records.
Roger Tomlinson initiates the Canada Geographic Information System, the first GIS.
British SBN codes are introduced, later generalized to ISBN in 1970.
The ASCII standard defines a bit representation for every character used in English text.
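A minimal sketch of the idea, using Python's built-in ord (the sample characters are arbitrary):

```python
# A minimal sketch (not part of the standard itself): each ASCII character
# maps to a 7-bit numeric code.
for ch in "Aa0 !":
    code = ord(ch)  # numeric code; for these characters it is the ASCII value
    print(f"{ch!r} -> {code:3d} -> {code:07b}")  # decimal and 7-bit binary
```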
The DIALOG online information retrieval system becomes accessible from remote locations.
Eugene Garfield publishes the first edition of the Science Citation Index, which indexes scientific literature through references in papers.
Henriette Avram creates the MARC (MAchine-Readable Cataloging) system at the Library of Congress, defining metadata standards for books.
Dun & Bradstreet begins to assign a unique number to every company.
Relational databases and query languages allow huge amounts of data to be stored in a way that makes certain common kinds of queries efficient enough to be done as a routine part of business.
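As a hedged illustration of what such routine queries look like, here is a minimal sketch using Python's built-in sqlite3 module; the orders table, its columns, and the data are hypothetical:

```python
import sqlite3

# A minimal sketch of relational storage and querying; the table,
# columns, and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("acme", 120.00), ("acme", 75.50), ("globex", 42.00)],
)

# The kind of common aggregate query that relational systems make routine:
for customer, spend in conn.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer"
):
    print(customer, spend)
```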
Lexis provides full-text records of US court opinions in an online retrieval system.
With the emergence of progressively cheaper computers, it becomes possible to do computations on the fly, integrating them into the everyday process of working with knowledge.
Neil Sloane begins to catalog "interesting" sequences of integers.
Largely as an offshoot of AI, expert systems attempt to capture the knowledge of human experts in specialized domains, using logic-based inference systems.
The UPC standard for barcodes is launched.
With precursors in the 1940s, neural networks emerge in the 1980s as a concept for storing and manipulating various types of knowledge using connections reminiscent of nerve cells.
Cyc is a long-running project to encode common-sense facts in a computable form.
Walter Goad at Los Alamos founds GenBank to collect the genome sequences being discovered.
Mathematica is created to provide a uniform system for all forms of algorithmic computation: it defines a symbolic language to represent arbitrary constructs, then assembles a huge web of consistent algorithms to operate on them.
The Domain Name System for hierarchical Internet addresses is created; in 1984, .com and other top-level domains (TLDs) are named.
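A small sketch of hierarchical name resolution in Python, using the standard socket module (requires network access; example.com is a reserved example domain):

```python
import socket

# A minimal sketch: a hierarchical DNS name resolves to an IP address.
# Labels nest right to left: com -> example -> www.
name = "www.example.com"
print(name, "->", socket.gethostbyname(name))
```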
The web grows to provide billions of pages of freely available information from all corners of civilization.
The Internet Movie Database is launched.
Gopher provides a menu-based system for finding material on computers connected to the Internet.
Ti Kan creates CDDB, a database indexing CDs, which later becomes Gracenote.
The Unicode standard assigns a numerical code to every character in every human written language.
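A minimal sketch in Python, whose strings are Unicode; the sample characters from several scripts are arbitrary:

```python
# A minimal sketch: Unicode assigns each character a numeric code point,
# regardless of script (sample characters are arbitrary).
for ch in "AΩא中あ":
    print(f"{ch} -> U+{ord(ch):04X}")
```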
Brewster Kahle founds the Internet Archive to begin systematically capturing and storing the state of the web.
Tim Berners-Lee creates the WWW Virtual Library, the first systematic catalog of the web.
Google and other search engines provide highly efficient capabilities to do textual searches across the whole content of the web.
Quick Response (QR) scannable barcodes are created in Japan, encoding information for machines to read.
The Sloan Digital Sky Survey spends nearly a decade automatically mapping hundreds of millions of visible astronomical objects across a large fraction of the sky.
The Human Genome Project is declared complete, having established a reference DNA sequence for the human genome.
Social networking and other collaborative websites define mechanisms for collectively assembling information by and about people.
Facebook begins to capture social relations between people on a large scale.
On Wikipedia, volunteer contributors assemble millions of pages of encyclopedia material, providing textual descriptions of practically all areas of human knowledge.
Steve Coast initiates OpenStreetMap, a project to create a crowdsourced street-level map of the world.
Stephen Wolfram explores the universe of possible simple programs and shows that knowledge about many natural and artificial processes can be represented in terms of surprisingly simple programs.
Wolfram|Alpha is launched as a website that computes answers to natural-language queries based on a large collection of algorithms and curated data.