For IT Professionals

For IT professionals, software developers, potential investors and others seeking a more in-depth understanding of the semantic and programming concepts underlying Hiveware, Hiveware Inc. offers the following documents for examination.


Explanation of the Hiveware Architecture
A general introduction to Hiveware technology for both IT specialists and non-specialists.

SGML Implementation with 100% Tag Omission
This paper explains how Hiveware technology eliminates the need for explicit markup tags around elements such as titles and paragraphs.
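The paper itself is not reproduced here, but the general idea of tag omission can be illustrated generically. The sketch below is an assumption-laden stand-in, not Hiveware's actual algorithm: it infers implied elements from untagged text, much as an SGML parser using the OMITTAG feature can supply start and end tags the author never typed.

```python
# Illustrative sketch (NOT Hiveware's algorithm): inferring document
# structure from plain text, in the spirit of SGML's OMITTAG feature.

def infer_structure(text: str) -> list[tuple[str, str]]:
    """Split plain text into (element, content) pairs without explicit tags.

    The first non-empty block is treated as an implied TITLE element;
    every following block separated by a blank line is an implied P element.
    """
    blocks = [b.strip() for b in text.split("\n\n") if b.strip()]
    if not blocks:
        return []
    tree = [("TITLE", blocks[0])]
    tree += [("P", b) for b in blocks[1:]]
    return tree

doc = """Some Title

First paragraph, written with no markup at all.

Second paragraph."""

# Re-emit the inferred tags to show what the parser supplied on its own.
for element, content in infer_structure(doc):
    print(f"<{element}>{content}</{element}>")
```

The structural rules live in the parser (here, hard-coded; in SGML, declared in the DTD), so the author's text can omit 100% of the tags.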

HIVEWARE: A Synchronous Distributed Context, Distributed Content Groupware Architecture
An explanation of the theory behind the technology.

Distributed Context as Long-running Parse Tree
An academic introduction to the subject from a parse-tree perspective. (Windows users: click here to view the video clip associated with the paper.)

Computer-Supported Methodically Formed Networks
Critical comments on decision-making during the Hurricane Katrina disaster of 2005 - a potential real-world application of Hiveware. Includes a response to an article by Denning and Hayes-Roth (see below).

Decision Making in Very Large Networks
by Peter Denning and Rick Hayes-Roth. An article in Communications of the ACM, November 2006/Vol. 49, No. 11. A case study of FEMA decision-making during Hurricane Katrina (see above entry for Robert Tischer’s comments on this article).

Technical Reference US Patent 7,124,362
by Robert Tischer. Using Hiveware for Word as an embodiment of the software invention, a scenario is illustrated and annotated in which one author signs up a second author into a SmallBusProposal hive. The population and content delegates are fully described and related to the claims of the patent.

NOTE: about computer science's distributed compiling problem

Ever since Noam Chomsky pronounced in his 1957 "Syntactic Structures" that meaning was not necessary to determine linguistic structure, software compilers have been stuck in analytic limbo. Today's compilers can all generate intermediate code and subsequently optimized object code, but they cannot establish language meaning and associate it with the object code that runs on today's von Neumann CPU architecture. This is unfortunate, since problems require the meaning dimension to be truly solved. I liken today's software compiler to the Antikythera clockwork mechanism found by a Greek sponge diver off the tiny island of Antikythera in 1900. At first scientists thought that the ancient Greek mechanism was an astronomical computer capable of predicting the positions of the sun and moon in the zodiac on any given date. Additional research has shown that the device was specifically designed to model a particular form of "epicyclic" motion: the Greeks believed in an earth-centric universe and accounted for the motion of celestial bodies using elaborate models based on epicycles. (See The Economist's "The Clockwork Computer" article.)


Analogous to the ancient Greeks and their earth-centric universe, computer scientists believe only in analytic compiling, despite the fact that linguistics, and therefore meaning, lies at the heart of designing software tools that are linguistic extensions of the groups of people who use them.
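As a concrete reference point for what "analytic" compiling means in this note, the sketch below shows a generic textbook pipeline (not any particular compiler): tokenize, parse, and emit object code for a stack machine. No stage consults meaning or context outside the source text itself.

```python
# A deliberately conventional, purely analytic compiler pipeline for
# arithmetic expressions: tokenize -> parse -> emit stack code.
import re

def tokenize(src: str) -> list[str]:
    return re.findall(r"\d+|[-+*/()]", src)

def parse(tokens: list[str]):
    """Recursive-descent parse into a nested tuple AST: (op, left, right)."""
    def expr(i):
        node, i = term(i)
        while i < len(tokens) and tokens[i] in "+-":
            op = tokens[i]
            right, i = term(i + 1)
            node = (op, node, right)
        return node, i
    def term(i):
        node, i = atom(i)
        while i < len(tokens) and tokens[i] in "*/":
            op = tokens[i]
            right, i = atom(i + 1)
            node = (op, node, right)
        return node, i
    def atom(i):
        if tokens[i] == "(":
            node, i = expr(i + 1)
            return node, i + 1  # skip the closing ")"
        return int(tokens[i]), i + 1
    node, _ = expr(0)
    return node

def codegen(ast) -> list[str]:
    """Emit postfix 'object code' for a stack machine."""
    if isinstance(ast, int):
        return [f"PUSH {ast}"]
    op, left, right = ast
    ops = {"+": "ADD", "-": "SUB", "*": "MUL", "/": "DIV"}
    return codegen(left) + codegen(right) + [ops[op]]

print(codegen(parse(tokenize("2 + 3 * (4 - 1)"))))
```

Every transformation here is driven by syntax alone; nothing in the pipeline could attach a purpose or meaning to the emitted instructions.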

NOTE: about computer science's push to use artificial intelligence

Big technology corporations have been pushing Artificial Intelligence (AI) as the next big thing. This is yet another false narrative that merely perpetuates Computer Science's grip on centralized computing architectures. AI anthropomorphizes computing power, so the technical narrative focuses on whether the computer will take over once it becomes smarter than humans. Computer Science has not yet discovered the Humanities, in particular Language Psychology, and as a result its solutions are all hierarchical and non-scalable in the long run: their cost always increases as usage increases. Computer Science has yet to discover that the File and the Folder are computer artifacts, not categories in the brain, which does not bode well for AI.

Hiveware, in contrast, seeks to be an extension of the brain, not a cheap replacement for it. It helps the human by doing what the von Neumann architecture does best: keeping track of items. And by using the Hiveware platform, any and all components can become digital assets by imbuing them with security, ownership and privacy traits.
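To make the last claim concrete, the sketch below shows one hypothetical way a component could be imbued with those three traits. All names here are illustrative inventions, not Hiveware's API: ownership as an owner field, privacy as a reader list, and security as a tamper-evident content hash.

```python
# Hypothetical sketch (illustrative names, NOT Hiveware's API): wrapping an
# arbitrary component with security, ownership and privacy traits so it can
# be tracked as a digital asset.
from dataclasses import dataclass, field
import hashlib

@dataclass
class DigitalAsset:
    owner: str                                   # ownership trait
    content: bytes                               # the wrapped component
    private: bool = True                         # privacy trait
    readers: set[str] = field(default_factory=set)

    def fingerprint(self) -> str:
        """Security trait: a tamper-evident SHA-256 hash of the content."""
        return hashlib.sha256(self.content).hexdigest()

    def readable_by(self, who: str) -> bool:
        return who == self.owner or (not self.private) or who in self.readers

asset = DigitalAsset(owner="alice", content=b"SmallBusProposal draft")
asset.readers.add("bob")
print(asset.fingerprint()[:12], asset.readable_by("bob"), asset.readable_by("eve"))
```

The point of the sketch is only that the traits travel with the component itself rather than living in a centralized access-control service.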

Dynamic Syntax Compiling
by Robert Tischer. This Master's thesis also explains the role of the software developer as a solo activity (see Figure 2: Basic conventional compiler linguistics components). Its thesis is that conventional compiling is, in essence, functionally non-distributed. For a live example of how distributed compiling can work, Windows users may click here to view a video clip showing how compiled functionality can be interjected into a larger running and distributed program.
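The video clip is not available here, but the core move it describes can be sketched in a single process. The code below is an assumption-laden stand-in for the thesis's mechanism, not a reproduction of it: new source text arrives while the program is running, is compiled on the spot, and is injected into the program's dispatch table.

```python
# Illustrative sketch (NOT the thesis's mechanism): compiling new source at
# runtime and interjecting the resulting function into an already-running
# program -- a single-process stand-in for distributed compiling.
registry = {}  # functions the running program can dispatch to

def inject(source: str, name: str) -> None:
    """Compile `source` and register the function it defines under `name`."""
    namespace = {}
    exec(compile(source, f"<injected:{name}>", "exec"), namespace)
    registry[name] = namespace[name]

# The program is already running; new behavior arrives as source text.
inject("def greet(who):\n    return f'hello, {who}'", "greet")
print(registry["greet"]("hive"))
```

In a genuinely distributed setting the source would arrive over the network and the registry would span machines; the sketch only shows the compile-then-interject step.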
