Upper Ontology Summit
March 15, 2006
 The theory and technology of knowledge representation, reasoning and conceptual modeling have advanced to a stage where meanings of terms can be formally specified in computer systems with great detail and precision.
 With the success and expansion of the internet, the potential for achieving semantic interoperability across interconnected applications has become widely recognized, and the number of teams and individuals creating knowledge classifications of varying degrees of logical formality has dramatically increased. As this technology develops further, it will enable deployment of computer applications with increasing ability to make reliable knowledge-based decisions that currently require human effort. Programs with such enhanced capacity will increase the speed, efficiency, and sophistication of automated information analysis and exploitation.
 Much recent emphasis has been focused on creating common syntactic formalisms for representing knowledge, but syntactic formalisms alone do not provide an effective way of describing what counts most: the semantic content.
 The complementary technology for effectively representing the semantic content of complex, widely used concepts is also available, but agreement on standardized conceptual building blocks has not yet been reached.
 The need for such agreement is increasing rapidly as many isolated projects of varying complexity have been initiated to capture knowledge in computer-interpretable formalisms. Without the means for specifying intended meaning by means of well-understood conceptual building blocks clearly related and contrasted with each other, the great potential for sharing knowledge usable for computer reasoning will not be realized.
 Several candidate upper ontologies are available, reflecting decades of research and development.
 Each upper ontology has an existing community of users, but each community only has access to a fraction of the total resources available.
 To promote interoperability and the exploitation of these upper ontologies, we intend to find a principled means of articulating the relationships (including differences) among them. As a result, this initiative will significantly enhance the value of the knowledge in each of the communities whose knowledge bases are linked to these interrelated upper ontologies.
 These upper ontologies are available and should be rigorously and independently evaluated. They must also be easy for developers of domain ontologies and applications to use and assess.
 For the foreseeable future, we anticipate there will be multiple upper ontologies. We will articulate the commonalities among them and the reasons for their major differences.
Hereby unanimously supported by the following upper ontology custodians:
John Bateman – Spatial Cognition, Robotics and Natural Language
Aldo Gangemi – DOLCE – Description & Situation extensions
Michael Gruninger – PSL / ISO 18629
Nicola Guarino – DOLCE
Doug Lenat – OpenCyc
Adam Pease – SUMO
Barry Smith – BFO
Matthew West – ISO 15926
Project Mission: To rally a joint Ontolog community effort to author a response to the (US) National Health Information Network Request for Information of 2004.11.15, and to use that as the beginning of a collaborative effort through which the Ontolog community can contribute professionally to the NHIN initiative.
New: [health-ont] Re: Proceedings: Ontolog Forum Panel Discussion Aug. 25, 2005: Health Informatics
Peter Yim comments:
Good turnout … great presentations … and wonderful discussions exploring the healthcare informatics landscape! Kudos to Rex Brooks for proposing, organizing and moderating this session; and to Bob Smith for helping frame the discussion.
Appreciation to panelists Marc Wine, Bob Smith, Mark Musen, Ram Sriram, David Whitten and Brand Niemann for sharing their insights with the community (and the world at large). Thanks should also go to all who took the time to participate and to contribute to the discussion. Proceedings of the session are captured at the wiki session page, at:
In particular, full audio recording of the session is now available too – see: