For IT professionals

For IT professionals, software developers, potential investors and others seeking a more in-depth understanding of the semantic and programming concepts underlying Hiveware, Hiveware Inc offers the following documents for examination.


 

Explanation of the Hiveware Architecture
A general introduction to Hiveware technology for both IT specialists and non-specialists.

SGML Implementation with 100% Tag Omission
This paper explains how the Hiveware technology eliminates the need for markup tags such as <title>Some Title</title> or even <p>....</p>. (A short illustration of SGML tag omission follows this document list.)

HIVEWARE: An Asynchronous Distributed-Context, Distributed-Content Groupware Architecture
An explanation of the theory behind the technology.

Distributed Context as Long-running Parse Tree
An academic introduction to the subject from a parse-tree perspective. (Windows users: click here to view a video clip associated with the paper.)

Computer-supported Methodically Formed Networks
Critical comments on decision-making during the 2005 Hurricane Katrina disaster, a potential real-world application of Hiveware. Includes a response to an article by Denning and Hayes-Roth (see below).

Decision Making in Very Large Networks
by Peter Denning and Rick Hayes-Roth. An article in Communications of the ACM, Vol. 49, No. 11, November 2006. A case study of FEMA decision-making during Hurricane Katrina (see the entry above for Robert Tischer's comments on this article).

Technical Reference: US Patent 7,124,362
by Robert Tischer. Using Hiveware for Word as an embodiment of the software invention, the paper illustrates and comments on a scenario in which one author signs up a second author into a SmallBusProposal hive. The population and content delegates are fully described and related to the claims of the patent.
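For readers unfamiliar with SGML tag minimization, the generic example below (not an excerpt from the paper above) shows the standard mechanism that the tag-omission paper starts from: with OMITTAG enabled, a validating SGML parser infers omitted tags from the DTD's content models. The paper's contribution is to push this omission to 100 percent, so that even the remaining start-tags become unnecessary.

    <!DOCTYPE doc [
      <!-- "- O" after an element name means its start-tag is required but
           its end-tag may be omitted and inferred by the parser. -->
      <!ELEMENT doc   - - (title, p+)>
      <!ELEMENT title - O (#PCDATA)>
      <!ELEMENT p     - O (#PCDATA)>
    ]>
    <doc>
    <title>Some Title
    <p>No end-tags appear in this instance; the parser
    <p>infers each one from the content models above.
    </doc>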

NOTE 1: about computer science and Big Tech's distributed compiling problem

Ever since Noam Chomsky pronounced in his 1957 "Syntactic Structures" that meaning was not necessary to determine linguistic structure, software compilers have been stuck in analytic limbo. All of today's compilers can generate intermediate code and subsequently optimized object code, but they cannot establish a language's meaning and associate it with the object code that runs on today's von Neumann CPU architecture. This is quite sad, since problems require the meaning dimension to be truly solved. I liken today's software compiler to the Antikythera clockwork mechanism, found by a Greek sponge diver off the tiny island of Antikythera in 1900. At first, scientists thought the ancient Greek mechanism was an astronomical computer capable of predicting the positions of the sun and moon in the zodiac on any given date. Additional research has shown that the device was specifically designed to model a particular form of "epicyclic" motion: the Greeks believed in an earth-centric universe and accounted for the motion of celestial bodies using elaborate models based on epicycles. (See The Economist's "The Clockwork Computer" article.)
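As a concrete illustration of that analytic pipeline, here is a deliberately tiny sketch in Python (all names in it are hypothetical, invented for this example): the source text is tokenized, parsed into a tree, and lowered to intermediate code, yet no stage represents what the program means to the people who will use it.

    # A toy analytic compiler pipeline: syntax in, intermediate code out.
    import re

    def tokenize(source):
        # Purely syntactic: split the source into tokens.
        return re.findall(r"[A-Za-z_]\w*|\d+|[=+]", source)

    def parse(tokens):
        # Build a parse tree for the fixed grammar: NAME '=' NUMBER '+' NUMBER.
        name, equals, left, plus, right = tokens
        return ("assign", name, ("add", ("num", int(left)), ("num", int(right))))

    def lower(tree):
        # Emit three-address intermediate code from the parse tree.
        _, name, (_, (_, a), (_, b)) = tree
        return ["t0 = {} + {}".format(a, b), "{} = t0".format(name)]

    for instruction in lower(parse(tokenize("x = 1 + 2"))):
        print(instruction)
    # Prints "t0 = 1 + 2" then "x = t0" -- correct object-level output,
    # with no trace of why x matters to anyone.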

(Image: the Antikythera clockwork computer)

Analogous to the ancient Greeks and their earth-centric universe, computer scientists believe only in analytic compiling, despite the fact that linguistics, and therefore meaning, is at the heart of designing software tools that are linguistic extensions of the groups of people who use them.


NOTE 2: about computer science's push to use artificial intelligence

Big technology corporations have been pushing Artificial Intelligence (AI) as the next big thing. AI attempts to anthropomorphize computing power. As a result, the technical narrative focuses on whether the computer will take over once it becomes smarter than humans, and on how much our biases will affect a machine's learning corpus. Computer Science has not yet discovered the Humanities, in particular language psychology, and as a result its solutions are hierarchical (client/server) and non-scalable in the long run: their cost always increases as usage increases. Computer Science has yet to discover that the file and the folder are computer artifacts, not categories in the brain, which doesn't bode well for AI. Hierarchical computer systems represent the ultimate shortcut and have created an unimaginable technical debt.

Hiveware, in contrast, seeks to be an extension of the brain, not a cheap replacement for it. It helps the human by doing what the von Neumann architecture does best: keeping track of items. And by using the Hiveware platform, any and all components can become digital assets by imbuing them with security, ownership and privacy traits.
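As an illustration only (the type and field names below are hypothetical, not Hiveware's actual API), the idea of imbuing a component with those three traits so that it becomes a digital asset might be sketched like this:

    # Hypothetical sketch: a plain component wrapped with ownership,
    # privacy and security traits. Not Hiveware's actual API.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class DigitalAsset:
        content: str      # the component itself
        owner: str        # ownership trait: whom the asset belongs to
        private: bool     # privacy trait: visible only to the owner?
        signature: str    # security trait: integrity check on the content

        def readable_by(self, user: str) -> bool:
            # Privacy rule: private assets are readable only by their owner.
            return (not self.private) or user == self.owner

    def sign(content: str, owner: str) -> str:
        # Stand-in for a real cryptographic signature.
        return "sig({})".format(hash((content, owner)))

    paragraph = "Draft section 2 of the proposal."
    asset = DigitalAsset(paragraph, owner="author1", private=True,
                         signature=sign(paragraph, "author1"))
    print(asset.readable_by("author1"))  # True
    print(asset.readable_by("author2"))  # False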

NOTE 3: about Yuval Harari's prognosis for artificial intelligence in Homo Deus

In his book, Harari declares the imminent death of individualism, dubbing those whose jobs have been overtaken by intelligent machines the "useless class". He derives his claim by extrapolating from the gains AI has made in recent years. Although brilliantly articulated in Homo Deus, that is not what is going to happen. Harari is missing knowledge about epigenesis, the scientific term behind the long-standing nature-versus-nurture debate. Had he understood it better, he would have realized that the human brain's adult intelligence is not fully determined by its set of genes at birth. By journalistic extrapolation, Harari predicts that the human brain will be irrevocably tampered with, thus creating super-sapiens that render Homo sapiens superfluous.

Harari underestimates the human brain's ability to cope with accelerating environmental change, including AI. A more detailed defense of this position can be found in the Danish master's (equivalent) thesis "The Anatomy of Context" ("Udsigelses anatomi" in Danish) by Robert Tischer, who develops the concepts of tool thinking and language thinking. These concepts were originally developed by the Danish psychologist Bøje Katzenelson. For an English translation of the abstract, see Appendix C in the Dynamic Syntax Compiling link below.

Context Intelligence (CI) is the best term for the new era ushered in by software like Hiveware. It adds a third tier to Katzenelson's psychological epochs of tool thinking and language thinking, which regeneratively (as in, a positive feedback loop) unleashed the Homo sapiens brain over millions of years. I (Robert Tischer) propose that this new era be called virtual object thinking, aka CI. Although the collaborative internet has had an arguably powerful effect on the sapiens brain, that will be nothing in comparison to the cumulative effect virtual object thinking will have. Think of the difference as that between white light and laser light. This prediction stands in stark contrast to Harari's apocalyptic prediction of the demise of the average sapiens. Professor Katzenelson would term this new era "double double folding" (where folding refers to the regenerative and cumulative effect of the tools/objects we create on the future individuals who use them).

Dynamic Syntax Compiling
by Robert Tischer. A master's (equivalent) thesis that also explains the role of the software developer as a solo activity (see Figure 2: Basic conventional compiler linguistics components). Its thesis is that conventional compiling is, in essence, functionally non-distributed. For a live example of how distributed compiling can work, Windows users may click here to view a video clip showing how compiled functionality can be interjected into a larger running, distributed program.
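For readers who cannot view the clip, the general idea can be sketched in Python (the names are hypothetical; this illustrates the concept of interjecting freshly compiled functionality into a running program, not Hiveware's actual mechanism):

    # Hypothetical sketch: compile new functionality at runtime and swap it
    # into a program that keeps running throughout.
    handlers = {"greet": lambda name: "Hello, " + name}

    def serve(request, name):
        # The long-running program dispatches through a mutable table,
        # so its behavior can change without stopping it.
        return handlers[request](name)

    print(serve("greet", "Ada"))            # Hello, Ada

    # Later, while the program is still running, compile and inject new code.
    new_source = 'def greet(name):\n    return "Goodbye, " + name'
    namespace = {}
    exec(compile(new_source, "<injected>", "exec"), namespace)
    handlers["greet"] = namespace["greet"]  # hot-swap the compiled function

    print(serve("greet", "Ada"))            # Goodbye, Ada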
