Tuesday, November 14, 2006

Meta Programming to create a modelling system

A requirement of this research is that open standard semantic languages are used to represent information, both as input to and output from the model. These languages are based on XML. The same open standard languages can also be used for developing the program code of the models themselves. It is proposed that software, and the information it represents, be kept separate but expressed in the same open standard, searchable way. Software and the information it manipulates are both simply information put to different uses; there is no reason why a model must be represented differently from the taxonomy that represents it. XML can therefore serve both as the information processed by the application and as the application itself, which enables a recursive relationship between the model and the taxonomy. This recursion makes meta-programming possible. Meta-programming is the writing of programs by other programs.

The purpose of this is to provide a cascading series of layers that translate a relatively easy to use visual representation of the problem to be modelled into code that can be run by present day compilers and interpreters. This makes it easier for computer literate non-programmers to specify instructions to a computer without learning and writing code in computer languages. To achieve this, any layer of software or information must be able to read the code or the information represented in any other layer. Code and information are separated only as a matter of design choice, to aid human comprehension; both can be represented in the same way using the same kinds of open standard languages.
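As a purely illustrative sketch, not the proposed system itself, the fragment below treats a small piece of XML both as information and as the description of a program: a Python meta-program reads the XML model description and writes out source code that the interpreter can then run. The element names ("model", "parameter", "formula") and the generated function are hypothetical, chosen only to show one layer writing the code of another.

    # Minimal sketch only: the XML is both information (a taxonomy-like
    # description) and the specification of a program. All names here are
    # illustrative assumptions, not an existing standard.
    import xml.etree.ElementTree as ET

    MODEL_XML = """<model name="heat_loss">
        <parameter name="area"/>
        <parameter name="u_value"/>
        <parameter name="delta_t"/>
        <formula>area * u_value * delta_t</formula>
    </model>"""

    def generate_code(xml_text):
        """Meta-program: read the XML description and write Python source."""
        model = ET.fromstring(xml_text)
        params = [p.get("name") for p in model.findall("parameter")]
        formula = model.findtext("formula").strip()
        return "def {0}({1}):\n    return {2}\n".format(
            model.get("name"), ", ".join(params), formula)

    source = generate_code(MODEL_XML)
    print(source)                    # the generated program, readable as text
    namespace = {}
    exec(source, namespace)          # ...and runnable by the interpreter
    print(namespace["heat_loss"](area=10.0, u_value=0.3, delta_t=20.0))

Because the model description and the generated program are both plain, searchable text, either could in turn be read, edited, or regenerated by another layer of software, which is the recursive relationship described above.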

The diagram below illustrates the aim of having a two-way translation between all levels in a hierarchy of translation between human and computer, and between different software environments. This definition from Wikipedia [1], also used in the paper by Simons and Parmee [2], explains the aim: 'a kind of action that occurs as two or more objects have an effect on each other. The idea of a two-way effect is essential to the concept of interaction, as opposed to a one way causal effect. Combinations of many simple interactions can lead to surprising emergent phenomena'. This two-way communication could improve opportunities for end user modelling and programming, for the sharing of information, and for the education of both users and computer software. The analogy of educating computer software to do what the user intends is called programming by demonstration in Watch What I Do: Programming by Demonstration [3]. The user takes the role of an educator, and the software acts as an apprentice that learns what is required; by demonstrating, the user is able to instruct the software and so to program it (a small sketch of this idea follows the figure below).

The diagram shows the communication from the high level representation the user provides, down to the computer, and back to the user.

Figure - Translation Process
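The following is a minimal sketch of the apprenticeship idea in programming by demonstration, not the method described in [3]: the user demonstrates a few concrete actions, the software records them, and the recorded sequence becomes a small program that can be replayed on new inputs. The Recorder class and the example actions are hypothetical, introduced only for illustration.

    # Hypothetical sketch of programming by demonstration: the software
    # watches demonstrated steps and turns them into a reusable procedure.
    class Recorder:
        def __init__(self):
            self.steps = []

        def demonstrate(self, action, *args):
            """The user shows one step; the apprentice remembers it."""
            self.steps.append((action, args))

        def build_program(self):
            """Turn the remembered steps into a program that can be replayed."""
            steps = list(self.steps)
            def program(target):
                for action, args in steps:
                    action(target, *args)
            return program

    # Two demonstrated actions on a simple record (illustrative only).
    def rename(item, new_name):
        item["name"] = new_name

    def tag(item, label):
        item.setdefault("tags", []).append(label)

    recorder = Recorder()
    recorder.demonstrate(rename, "final")
    recorder.demonstrate(tag, "reviewed")

    # The learned program is applied to a new item without further instruction.
    learned = recorder.build_program()
    item = {"name": "untitled"}
    learned(item)
    print(item)   # {'name': 'final', 'tags': ['reviewed']}

Real programming by demonstration systems go further, generalising from the demonstrated examples rather than replaying them literally, but the role of the user as educator of the software is the same.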

1 Wikipedia, http://en.wikipedia.org/wiki/Interaction, Interaction (2006).

2 C. L. Simons, I. C. Parmee, http://www.cems.uwe.ac.uk/~clsimons/Publications/CooperativeInteraction.pdf, A manifesto for cooperative human / machine interaction (2006).

3 A. Cypher, http://www.acypher.com/wwid/, Watch What I Do: Programming by Demonstration, MIT Press, ISBN 0262032139 (1993).
