Information Science Discussion Papers Series: 1998 Abstracts

98/01: User defined spatial business rules: Storage, management and implementation—A pipe network case study

S. Cockcroft

The application of business rules as a means of ensuring data quality is an accepted approach in information systems development. Rules, defined by the user, are stored and manipulated in a repository or data dictionary. The repository stores the system design, including rules that arise from constraints in the user's environment, and enforces those rules at runtime. The work presented here applies this approach to spatial information system design, using an integrated spatial software engineering tool (ISSET) with a repository at its core.
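
The repository idea described above (user-defined rules stored centrally and enforced whenever data is written) can be sketched in a few lines. This is an illustrative fragment only, not the ISSET tool itself; the rule names and pipe attributes are hypothetical.

```python
class RuleRepository:
    """Sketch of a repository holding user-defined integrity rules and
    enforcing them whenever a record is written."""
    def __init__(self):
        self.rules = {}  # rule name -> predicate over a record

    def define(self, name, predicate):
        self.rules[name] = predicate

    def validate(self, record):
        """Return the names of all rules the record violates."""
        return [name for name, ok in self.rules.items() if not ok(record)]

repo = RuleRepository()
repo.define("positive-diameter", lambda pipe: pipe["diameter_mm"] > 0)
repo.define("known-material", lambda pipe: pipe["material"] in {"PVC", "steel"})

violations = repo.validate({"diameter_mm": -10, "material": "PVC"})
```

A record that satisfies every stored rule returns an empty violation list; the point is that the rules live in one place rather than being scattered through application code.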

Keywords: spatial information systems development, integrity constraints, business rules, topological relationships


98/02: Electronic security

H. B. Wolfe

Electronic security in this day and age covers a wide variety of techniques. One of the most important areas that must be addressed is commerce on the Internet. The Internet is an insecure medium, to say the least. Every message sent must pass through many computers, most likely controlled by unrelated and untrusted organizations, before it reaches its final destination. At any one of these relays the information within the message can be scrutinized, analyzed and/or copied for later reference. There are documented and suspected instances of surveillance of Internet traffic. It has been suggested that several of the major communication switches (through which 90% or more of Internet traffic must pass) have permanent surveillance in place.

Another insidious but less obvious fact about Internet use is that messages, once sent, are neither discarded nor do they disappear forever. Usually, at one or more relays, copies of messages are archived and kept for differing periods. Most ordinary users are unaware that a message sent six months ago may still be retrievable. That fact could have serious legal ramifications for the sender.

At this time, cryptography is really the only effective method for protecting Internet transactions and communications from unauthorized interception. “Unauthorized” means anyone to whom you have not expressly given permission to read your private communications. Cryptography is the art or science of hidden writing. Plain text (your message in readable form) is modified using an algorithm (like a mathematical equation) that requires at least one special variable (your private key, which no one else knows) to create cipher text (your message in unreadable form). At the destination, the person for whom the message is intended must have the special key in order to unlock the ciphered message.
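
The plain-text/key/cipher-text relationship just described can be shown with a toy repeating-key XOR. This is strictly an illustration of the roles involved, not a cipher anyone should use in practice.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR every byte of the message with a repeating key.
    Applying the same key a second time restores the plain text."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

plain = b"meet me at noon"
key = b"private key"                 # the shared secret

cipher = xor_cipher(plain, key)      # unreadable without the key
recovered = xor_cipher(cipher, key)  # the same key "unlocks" the message
```

Real systems use vetted algorithms with proper key management; a repeating-key XOR is trivially breakable and serves only to show how a key turns readable text into cipher text and back.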

Not all encryption is created equal, nor does it necessarily provide equivalent security. It would be wrong to suggest that merely using “encryption” to protect your communication is enough. Other factors are at work here as well, and they have to do with the politics of privacy. I have often heard it said in New Zealand that “if you have nothing to hide then it shouldn’t matter who reads your communications”. Of course, that opinion is naïve and does not represent reality in any meaningful way.


98/03: Looking for a new AI paradigm: Evolving connectionist and fuzzy connectionist systems—Theory and applications for adaptive, on-line intelligent systems

N. Kasabov

The paper introduces a paradigm of neuro-fuzzy techniques and an approach to building on-line, adaptive intelligent systems, called evolving connectionist systems (ECOS). ECOS evolve through incremental, on-line learning, both supervised and unsupervised. They can accommodate new input data, including new features and new classes. The ECOS framework is presented and illustrated on a particular type of evolving neural network, the evolving fuzzy neural network. ECOS are three to six orders of magnitude faster than multilayer perceptrons or fuzzy neural networks trained with the backpropagation algorithm or with a genetic programming technique. ECOS belong to a new generation of adaptive intelligent systems. This is illustrated on several real-world problems in adaptive, on-line classification, prediction, decision making and control: phoneme-based speech recognition; moving-person identification; wastewater flow time-series prediction and control; intelligent agents; and financial time-series prediction and control. The principles of recurrent ECOS and reinforcement learning are also outlined.
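
The incremental, node-creating style of learning described above can be caricatured in a few lines. The sketch below is not the ECOS or EFuNN algorithm, only a minimal illustration of one-pass learning in which prototype nodes are adjusted or added as examples arrive; all names and parameters are invented.

```python
import math

class EvolvingClassifier:
    """One-pass, incremental learner: a prototype node is adjusted when a
    new example falls near an existing node of its class, and a new node
    is created ("evolved") otherwise."""
    def __init__(self, radius=1.0, lr=0.2):
        self.radius, self.lr = radius, lr
        self.nodes = []  # list of (centre vector, class label)

    def learn(self, x, label):
        near = [(math.dist(c, x), i)
                for i, (c, l) in enumerate(self.nodes) if l == label]
        if near and min(near)[0] <= self.radius:
            _, i = min(near)
            c, l = self.nodes[i]
            # nudge the winning node's centre toward the new example
            self.nodes[i] = ([ci + self.lr * (xi - ci)
                              for ci, xi in zip(c, x)], l)
        else:
            self.nodes.append((list(x), label))  # evolve a new node

    def predict(self, x):
        return min(self.nodes, key=lambda n: math.dist(n[0], x))[1]

clf = EvolvingClassifier()
for x, y in [((0.0, 0.0), "a"), ((0.3, 0.1), "a"), ((5.0, 5.0), "b")]:
    clf.learn(x, y)
```

Because each example is seen once and either refines or extends the network, new classes can be accommodated at any time without retraining from scratch, which is the property the abstract emphasises.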

Keywords: evolving neuro-fuzzy systems, fuzzy neural networks, on-line adaptive control, on-line decision making, intelligent agents


98/04: Connectionist methods for classification of fruit populations based on visible-near infrared spectrophotometry data

J. Kim and N. Kasabov and A. Mowat and P. Poole

Variation in fruit maturation can influence harvest timing and duration, post-harvest fruit attributes and consumer acceptability. Present methods of managing and identifying lines of fruit with specific attributes, both in commercial fruit production systems and in breeding programs, are limited by a lack of suitable tools to characterise fruit attributes at different stages of development in order to predict fruit behaviour at harvest, during storage or in relation to consumer acceptance. With visible-near infrared (VNIR) reflectance spectroscopy, a vast array of analytical information is collected rapidly with a minimum of sample pre-treatment. VNIR spectra contain information about the amount and the composition of constituents within fruit. This information can be obtained from intact fruit at different stages of development. Spectroscopic data are processed using chemometrics techniques such as principal component analysis (PCA), discriminant analysis and/or connectionist approaches in order to extract qualitative and quantitative information for classification and predictive purposes. In this paper, we illustrate the effectiveness of connectionist and hybrid modelling approaches for fruit quality classification problems.
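
To give a flavour of the chemometrics step mentioned above, the following sketch projects mean-centred "spectra" onto their first principal component using power iteration. It is a stand-in for a real PCA routine, and the toy two-band data are invented.

```python
def mean_center(X):
    """Subtract the column means, as done before PCA."""
    n = len(X)
    means = [sum(col) / n for col in zip(*X)]
    return [[x - m for x, m in zip(row, means)] for row in X]

def first_pc(X, iters=200):
    """Leading principal component via power iteration on X^T X."""
    d = len(X[0])
    v = [1.0] * d
    for _ in range(iters):
        s = [sum(xj * vj for xj, vj in zip(row, v)) for row in X]           # X v
        w = [sum(X[i][j] * s[i] for i in range(len(X))) for j in range(d)]  # X^T (X v)
        norm = sum(wi * wi for wi in w) ** 0.5
        v = [wi / norm for wi in w]
    return v

# toy "spectra" with two bands; band 2 carries most of the variation
spectra = [[0.1, 1.0], [0.2, 2.1], [0.3, 2.9], [0.4, 4.2]]
Xc = mean_center(spectra)
pc1 = first_pc(Xc)
scores = [sum(x * v for x, v in zip(row, pc1)) for row in Xc]  # 1-D summary per fruit
```

The per-sample scores compress hundreds of correlated wavelengths into a handful of components, which is what makes the downstream classification tractable.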


98/05: A fuzzy neural network model for the estimation of the feeding rate to an anaerobic waste water treatment process

J. Kim and R. Kozma and N. Kasabov and B. Gols and M. Geerink and T. Cohen

Biological processes are among the most challenging to predict and control. It has been recognised that developing an intelligent system for the recognition, prediction and control of process states in a complex, nonlinear biological process is difficult. Such unpredictable system behaviour requires an advanced, intelligent control system that learns from observations of the process dynamics and takes appropriate control action to avoid collapse of the biological culture. In the present study, a hybrid system called a fuzzy neural network is considered, in which the role of the fuzzy neural network is to estimate the correct feed demand as a function of the process responses. The feed material is an organic and/or inorganic mixture of chemical compounds for the bacteria to grow on. Small amounts of the feed sources must be added and the response of the bacteria measured. This is no easy task, because the process sensors used are non-specific and their responses vary during the developmental stages of the process. This hybrid control strategy retains the advantages of both neural networks and fuzzy control: fast and accurate learning, good generalisation capabilities, excellent explanation facilities in the form of semantically meaningful fuzzy rules, and the ability to accommodate both numerical data and existing expert knowledge about the problem under consideration. The application to the estimation and prediction of the correct feed demand shows the power of this strategy compared with conventional fuzzy control.
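
A minimal stand-in for the fuzzy-inference half of such a hybrid system is sketched below: a zero-order Sugeno controller that maps a measured process response (here labelled TOC) to a feed rate. The membership functions, rule set and numbers are illustrative choices, not the paper's trained network.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def feed_rate(toc):
    """Zero-order Sugeno inference: each rule pairs a fuzzy condition on
    the measured response with a crisp feed-rate output; the result is
    the firing-strength-weighted average of the rule outputs."""
    rules = [
        (tri(toc, -1, 0, 50),    10.0),  # response low    -> feed more
        (tri(toc, 25, 50, 75),    5.0),  # response medium -> feed moderately
        (tri(toc, 50, 100, 101),  1.0),  # response high   -> feed little
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

In the hybrid scheme the abstract describes, rules like these would be learned from process data by the neural network rather than hand-written, which is what gives the approach both adaptivity and an explainable rule form.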

Keywords: fuzzy neural networks, hybrid learning, knowledge extraction and insertion, estimation, biological process and control, bacterial system, total organic carbon (TOC)


98/06: Induction of labour for post term pregnancy: An observational study

E. Parry and D. Parry and N. Pattison

The aim of the study was to compare two management protocols for postterm pregnancy: elective induction of labour at 42 weeks’ gestation, and continuing the pregnancy with fetal monitoring while awaiting spontaneous labour. A retrospective observational study compared a cohort of 360 pregnancies in which labour was induced with 486 controls. All pregnancies were postterm (>294 days) by an early ultrasound scan. Induction of labour was achieved with either prostaglandin vaginal pessaries or gel, or forewater rupture and Syntocinon infusion. The control group consisted of women with postterm pregnancies who were not induced routinely and who usually had twice-weekly fetal assessment with cardiotocography and/or ultrasound. Women who had their labour induced differed from those who awaited spontaneous labour. Nulliparas (OR 1.54; 95% CI 1.24-1.83) and married women (OR 1.76; 95% CI 1.45-2.06) were more likely to have their labour induced. There was no association between the type of caregiver and induction of labour. Induction of labour was associated with a reduced incidence of normal vaginal delivery (OR 0.63; 95% CI 0.43-0.92) and an increased incidence of operative vaginal delivery (OR 1.46; 95% CI 1.34-2.01). There was no difference in the overall rate of Caesarean section. There was no difference in fetal or neonatal outcomes. Parity had a major influence on delivery outcomes from a policy of induction of labour. Nulliparas in the induced group had worse outcomes, with only 43% achieving a normal vaginal delivery (OR 0.78; 95% CI 0.65-0.95). In contrast, multiparas in the induced group had better outcomes, with fewer Caesarean sections (OR 0.88; 95% CI 0.81-0.96). This retrospective observational study of current clinical practice shows that induction of labour for postterm pregnancy appears to be favoured by nulliparous married women. It suggests that induction of labour may improve delivery outcomes for multigravidas but has an adverse effect for nulliparas.
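
For readers wanting to reproduce the style of analysis reported above, an odds ratio with a Woolf 95% confidence interval can be computed from a 2x2 table as follows; the counts in the example are invented and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% confidence interval for a 2x2 table:
       a = group 1 with outcome, b = group 1 without,
       c = group 2 with outcome, d = group 2 without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# invented counts, for illustration only
or_, lo, hi = odds_ratio_ci(60, 40, 50, 50)
```

An interval that excludes 1.0, as in several of the study's reported results, indicates an association unlikely to be due to chance at the 5% level.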


98/07: Automating information processing tasks: An agent-based architecture

S. Cranefield and B. McKinlay and E. Moreale and M. K. Purvis

This paper describes an agent-based architecture designed to provide automation support for users who perform information processing tasks using a collection of distributed and disparate software tools and on-line resources. The architecture extends previous work on agent-based software interoperability. The unique features of the information processing domain compared to distributed information retrieval are discussed and a novel extension of hierarchical task network (HTN) planning to support this domain is presented.
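
Hierarchical task network planning, in its simplest form, decomposes compound tasks into primitive actions via stored methods. The sketch below is a bare-bones illustration of that idea, not the paper's extended HTN planner, and the workflow names are hypothetical.

```python
def htn_plan(task, methods, plan=None):
    """Depth-first HTN decomposition: a compound task is expanded via its
    stored method until only primitive actions remain."""
    if plan is None:
        plan = []
    if task not in methods:        # primitive task: append to the plan
        plan.append(task)
        return plan
    for subtask in methods[task]:  # compound task: decompose in order
        htn_plan(subtask, methods, plan)
    return plan

# hypothetical information-processing workflow
methods = {
    "summarise-report": ["gather-sources", "process", "deliver"],
    "gather-sources":   ["query-database", "fetch-web-pages"],
    "process":          ["extract-text", "merge-results"],
}
plan = htn_plan("summarise-report", methods)
```

A real planner would also handle preconditions, alternative methods and backtracking; the point here is only the recursive task-to-subtask structure that makes HTN planning a natural fit for multi-step information processing.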


98/08: Spatial isomorphism

A. Holt and S. MacDonell and G. Benwell

This research continues current innovative geocomputational research trends that aim to provide enhanced spatial analysis tools. The coupling of case-based reasoning (CBR) with GIS provides the focus of this paper. This coupling allows the retrieval, reuse, revision and retention of previous similar spatial cases. CBR is therefore used to develop more complex spatial data modelling methods (by using the CBR modules for improved spatial data manipulation) and to provide enhanced exploratory geographical analysis tools (to find and assess certain patterns and relationships that may exist in spatial databases). This paper details the manner in which spatial similarity is assessed for the purpose of re-using previous spatial cases. The authors consider similarity assessment a useful concept for retrieving and analysing spatial information, as it may help researchers describe and explore a certain phenomenon, its immediate environment and its relationships to other phenomena. This paper addresses the following questions: What makes phenomena similar? What is the definition of similarity? What principles govern similarity? How can similarity be measured?

Generally, phenomena are similar when they share common attributes and circumstances. The degree of similarity depends on the type and number of commonalities they share. Within this research, similarity is examined from a spatial perspective. Spatial similarity is broadly defined by the authors as spatial matching and ranking according to a specific context and scale. More specifically, similarity is governed by context (function, use, reason, goal, the user’s frame of mind), scale (coarse or fine level), repository (the application, local domain, site and data specifics), techniques (the available technology for searching, retrieving and recognising data) and measurement and ranking systems.

The degree of match is the score between a source and a target; in spatial matching, a source and a target could be a pixel, region or coverage. The principles that govern spatial similarity involve not just the attributes of two phenomena but also the relationships between them. This is one reason why coupling CBR with a GIS is advantageous. A GIS is used symbiotically to extract spatial variables that CBR can use to determine similar spatial relations between phenomena. These spatial relations are used to assess the similarity between two phenomena (for example, through proximity and neighbourhood analysis). Developing the concept of spatial similarity could assist with analysing spatial databases by developing techniques to match similar areas, helping to maximise the information that can be extracted from spatial databases. From an exploratory perspective, spatial similarity serves as an organising principle by which spatial phenomena are classified, relationships identified and generalisations made from previous bona fide experiences or knowledge. This paper investigates the spatial similarity concept.
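
The notion of a weighted degree of match between a source and a target case can be sketched as follows; the attributes, weights and matching functions are illustrative choices, not the authors' measure.

```python
def spatial_similarity(source, target, weights):
    """Weighted degree of match between a source and a target case.
    Numeric attributes score by relative closeness; categorical ones by
    exact equality; the weights encode the context of the comparison."""
    score, total = 0.0, 0.0
    for attr, w in weights.items():
        s, t = source.get(attr), target.get(attr)
        if s is None or t is None:
            continue                       # attribute missing: skip it
        if isinstance(s, (int, float)):
            rng = max(abs(s), abs(t), 1e-9)
            match = 1.0 - min(abs(s - t) / rng, 1.0)
        else:
            match = 1.0 if s == t else 0.0
        score += w * match
        total += w
    return score / total if total else 0.0

site_a = {"landuse": "orchard", "slope": 5.0, "elevation": 120}
site_b = {"landuse": "orchard", "slope": 4.0, "elevation": 150}
weights = {"landuse": 0.5, "slope": 0.3, "elevation": 0.2}
```

Changing the weight dictionary is the simplest way to model the paper's claim that context and scale govern similarity: the same pair of sites can rank as near-identical under one weighting and quite different under another.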


98/09: Development of a generic system for modelling spatial processes

A. Marr and R. Pascoe and G. Benwell and S. Mann

This paper proposes a structure for the development of a generic graphical system for modelling spatial processes (SMSP). This system seeks to integrate the spatial data handling operations of a GIS with specialist numerical modelling functionality through the description of the processes involved. A conceptual framework is described, the foundation of which is six defined modules (or services) considered a minimum requirement for basic system operation. The services are identified following a description of the three key components of systems integration and an examination of the preferred integrating structure. The relationship of the integration components to sample commentary on the future requirements of integration is discussed, and the benefits and deficiencies of an implemented system for modelling spatial processes are noted.


98/10: Neuro-fuzzy methods for environmental modelling

M. K. Purvis and N. Kasabov and G. Benwell and Q. Zhou and F. Zhang

This paper describes combined approaches of data preparation, neural network analysis, and fuzzy inferencing techniques (which we collectively call neuro-fuzzy engineering) to the problem of environmental modelling. The overall neuro-fuzzy architecture is presented, and specific issues associated with environmental modelling are discussed. A case study that shows how these techniques can be combined is presented for illustration. We also describe our current software implementation that incorporates neuro-fuzzy analytical tools into commercially available geographical information system software.
