<div class="sectionTitle">Information Science Discussion Papers Series: 1996 Abstracts</div>

<hr>

<a name="dp1996-01"></a><h3>96/01: Using data models to estimate required effort in creating a spatial information system</h3> 

<h4>G. L. Benwell and S. G. MacDonell</h4> 

<p>The creation of spatial information systems can be viewed from many directions. One such view is to see the creation in terms of data collection, data modelling, codifying spatial processes, information management, analysis and presentation. The amount of effort required to create such systems is frequently under-estimated; this is true for each aspect of the above view. The accuracy of the assessment of effort will vary for each aspect. This paper concentrates on the effort required to create the code for spatial processes and analysis. Recent experience has indicated that this is an area where considerable under-estimation is occurring. The function point analysis presented in this paper provides a reliable metric for spatial systems developers to assess required effort based on spatial data models.</p> 
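
<p>As a rough illustration of the kind of calculation a function point approach involves, the sketch below counts system components with average-complexity IFPUG-style weights and converts the total to effort using a purely hypothetical productivity rate; the paper&rsquo;s actual mapping from spatial data models to function points is not reproduced here.</p>

<pre>
# Minimal sketch of an unadjusted function point count (illustrative only;
# the component counts and the hours-per-FP rate below are hypothetical).
IFPUG_WEIGHTS = {                        # average-complexity weights
    "external_input": 4,
    "external_output": 5,
    "external_inquiry": 4,
    "internal_logical_file": 10,
    "external_interface_file": 7,
}

def unadjusted_function_points(counts):
    """counts: dict mapping component type to number of occurrences."""
    return sum(IFPUG_WEIGHTS[kind] * n for kind, n in counts.items())

# Hypothetical counts derived from a spatial data model.
counts = {"external_input": 12, "external_output": 8, "external_inquiry": 5,
          "internal_logical_file": 6, "external_interface_file": 2}
ufp = unadjusted_function_points(counts)
print(ufp, ufp * 8)   # 182 function points, 1456 hours at an assumed 8 hours per FP
</pre>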


<hr>

<a name="dp1996-02"></a><h3>96/02: The use of a metadata repository in spatial database development</h3> 

<h4>S. K. S. Cockcroft</h4> 

<p>Database schemas currently used to define spatial databases are deficient in that they do not incorporate facilities to specify business rules/integrity constraints. This shortcoming has been noted by G&#252;nther and Lamberts [G&#252;nther &amp; Lamberts, 1994] who commented that geographical information systems (GIS) do not generally offer any functionality to preserve semantic integrity. It is desirable that this functionality be incorporated for reasons of consistency and so that an estimate of the accuracy of data entry can be made. Research into constraints upon spatial relationships at the conceptual level is well documented. A number of researchers have shown that the transition from conceptual to logical spatial data models is possible [Firns, 1994; Hadzilacos &amp; Tryfona, 1995]. The algorithmic accomplishment of this transition is a subject of current research. This paper presents one approach to incorporating spatial business rules in spatially referenced database schemas by means of a repository. It is demonstrated that the repository has an important role to play in spatial data management and in particular automatic schema generation for spatially referenced databases.</p> 

<p><a href="papers/dp1996-02.pdf">Download</a> (PDF, 188 KB)</p> 

<hr>

<a name="dp1996-03"></a><h3>96/03: Connectionist-based information systems: A proposed research theme</h3> 

<h4>N. K. Kasabov and M. K. Purvis and P. J. Sallis</h4> 

<p>General Characteristics of the Theme</p>

<ul>
<li>Emerging technology with rapidly growing practical applications</li>
<li>Nationally and internationally recognised leadership of the University of Otago</li>
<li>Already established organisation for research and working teams</li>
<li>Growing number of postgraduate students working on the theme</li>
<li>Growing number of research projects in this area</li>
<li>Growing number of publications by members of the team</li>
</ul>

<p><a href="papers/dp1996-03.pdf">Download</a> (PDF, 180 KB)</p> 

<hr>

<a name="dp1996-04"></a><h3>96/04: Agent modelling with Petri nets</h3> 

<h4>M. K. Purvis and S. J. S. Cranefield</h4> 

<p>The use of intelligent software agents is a modelling paradigm that is gaining increasing attention in the applications of distributed systems. This paper identifies essential characteristics of agents and shows how they can be mapped into a coloured Petri net representation so that the coordination of activities both within agents and between interacting agents can be visualised and analysed. The detailed structure and behaviour of an individual agent in terms of coloured Petri nets is presented, as well as a description of how such agents interact. A key notion is that the essential functional components of an agent are explicitly represented by means of coloured Petri net constructs in this representation.</p> 
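
<p>To make the coloured Petri net idea concrete, the sketch below plays a minimal &lsquo;token game&rsquo; in which tokens are typed (coloured) values held in places, and a transition fires when its guard is satisfied by tokens in its input places; the places, guard and tokens are invented for illustration and are not the agent nets described in the paper.</p>

<pre>
# Minimal coloured Petri net "token game" (illustrative only).
places = {"inbox": [("request", "task-42")], "beliefs": [("idle",)]}

def fire(transition, net):
    guard, inputs, outputs = transition
    bindings = [net[p].pop(0) for p in inputs]        # consume one token per input place
    if guard(*bindings):
        for place, token in outputs(*bindings):       # produce output tokens
            net[place].append(token)
        return True
    for p, tok in zip(inputs, bindings):              # guard failed: restore tokens
        net[p].insert(0, tok)
    return False

# Transition: if an idle agent receives a request, it starts processing the task.
start_task = (
    lambda msg, state: msg[0] == "request" and state[0] == "idle",
    ["inbox", "beliefs"],
    lambda msg, state: [("beliefs", ("busy", msg[1]))],
)
fire(start_task, places)
print(places)   # {'inbox': [], 'beliefs': [('busy', 'task-42')]}
</pre>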


<hr>

<a name="dp1996-05"></a><h3>96/05: A comparison of alternatives to regression analysis as model building techniques to develop predictive equations for software metrics</h3> 

<h4>A. R. Gray and S. G. MacDonell</h4> 

<p>The almost exclusive use of regression analysis to derive predictive equations for software development metrics found in papers published before 1990 has recently been complemented by increasing numbers of studies using non-traditional methods, such as neural networks, fuzzy logic models, case-based reasoning systems, rule-based systems, and regression trees. There has also been an increasing level of sophistication in the regression-based techniques used, including robust regression methods, factor analysis, resampling methods, and more effective and efficient validation procedures. This paper examines the implications of using these alternative methods and provides some recommendations as to when they may be appropriate. A comparison between standard linear regression, robust regression, and the alternative techniques is also made in terms of their modelling capabilities with specific reference to software metrics.</p> 


<hr>

<a name="dp1996-06"></a><h3>96/06: Process management for geographical information system development</h3> 

<h4>S. G. MacDonell and G. L. Benwell</h4> 

<p>The controlled management of software processes, an area of ongoing research in the business systems domain, is equally important in the development of geographical information systems (GIS). Appropriate software processes must be defined, used and managed in order to ensure that, as much as possible, systems are developed to quality standards on time and within budget. However, specific characteristics of geographical information systems, in terms of their inherent need for graphical output, render some process management tools and techniques less appropriate. This paper examines process management activities that are applicable to GIS, and suggests that it may be possible to extend such developments into the visual programming domain. A case study concerned with development effort estimation is presented as a precursor to a discussion of the implications of system requirements for significant graphical output.</p> 

<p><a href="papers/dp1996-06.pdf">Download</a> (PDF, 180 KB)</p> 

<hr>

<a name="dp1996-07"></a><h3>96/07: Experimental transformation of a cognitive schema into a display structure</h3> 

<h4>W. B. L. Wong and D. P. O&rsquo;Hare and P. J. Sallis</h4> 

<p>The purpose of this paper is to report on an experiment conducted to evaluate the feasibility of an empirical approach for translating a cognitive schema into a display structure. This experiment is part of a series of investigations aimed at determining how information about dynamic environments should be portrayed to facilitate decision making. Studies to date have generally derived an information display organisation that is largely based on a designer&rsquo;s experience, intuition and understanding of the processes. In this study we report on how we attempted to formalise this design process so that if the procedures were adopted, other less experienced designers would still be able to objectively formulate a display organisation that is just as effective. This study is based on the first stage of the emergency dispatch management process, the call-taking stage. The participants in the study were ambulance dispatch officers from the Dunedin-based Southern Regional Communications Centre of the St. John&rsquo;s Ambulance Service in New Zealand.</p> 


<hr>

<a name="dp1996-08"></a><h3>96/08: Neuro-fuzzy engineering for spatial information processing</h3> 

<h4>N. K. Kasabov and M. K. Purvis and F. Zhang and G. L. Benwell</h4> 

<p>This paper proposes neuro-fuzzy engineering as a novel approach to spatial data analysis and to building decision-making systems based on spatial information processing, and presents the authors&rsquo; development of this approach. The approach has been implemented as a software environment and is illustrated on a case study problem.</p> 

<p><a href="papers/dp1996-08.pdf">Download</a> (PDF, 376 KB)</p> 

<hr>

<a name="dp1996-09"></a><h3>96/09: Improved learning strategies for multimodular fuzzy neural network systems: A case study on image classification</h3> 

<h4>S. A. Israel and N. K. Kasabov</h4> 

<p>This paper explores two different methods for improved learning in multimodular fuzzy neural network systems for classification. It demonstrates these methods on a case study of satellite image classification using 3 spectral inputs and 10 coastal vegetation covertype outputs. The classification system is a multimodular one; it has one fuzzy neural network per output. All the fuzzy neural networks are trained in parallel for a small number of iterations. Then, the system performance is tested on new data to determine the types of interclass confusion. Two strategies are developed to improve classification performance. First, the individual modules are additionally trained for a very small number of iterations on a subset of the data to decrease the false positive and the false negative errors. The second strategy is to create new units, &lsquo;experts&rsquo;, which are individually trained to discriminate only the ambiguous classes. So, if the main system classifies a new input into one of the ambiguous classes, then the new input is passed to the &lsquo;experts&rsquo; for final classification. Two learning techniques are presented and applied to both classification performance enhancement strategies; the first one reduces omission, or false negative, error; the second reduces commission, or false positive, error. Considerable improvement is achieved by using these learning techniques, thus making it feasible to incorporate them into a real adaptive system that improves during operation.</p> 
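
<p>For reference, the two error types targeted by these strategies can be computed per class as below; the cover-type labels and predictions are invented for illustration and are not the satellite data used in the paper.</p>

<pre>
# Per-class omission (false negative) and commission (false positive) rates
# (illustrative only; labels are hypothetical covertype classes).
def omission_commission(actual, predicted, cls):
    fn = sum(1 for a, p in zip(actual, predicted) if a == cls and p != cls)
    fp = sum(1 for a, p in zip(actual, predicted) if a != cls and p == cls)
    n_actual = sum(1 for a in actual if a == cls)
    n_predicted = sum(1 for p in predicted if p == cls)
    omission = fn / n_actual if n_actual else 0.0
    commission = fp / n_predicted if n_predicted else 0.0
    return omission, commission

actual    = ["marsh", "marsh", "dune", "scrub", "dune"]
predicted = ["marsh", "dune",  "dune", "scrub", "scrub"]
print(omission_commission(actual, predicted, "dune"))   # (0.5, 0.5)
</pre>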

<p><a href="papers/dp1996-09.pdf">Download</a> (PDF, 440 KB)</p> 

<hr>

<a name="dp1996-10"></a><h3>96/10: Politics and techniques of data encryption</h3> 

<h4>H. B. Wolfe</h4> 

<p>Cryptography is the art or science, depending on how you look at it, of keeping messages secure. It has been around for a couple of thousand years in various forms. The Spartan Lysander and even Caesar made use of cryptography in some of their communications. Others in history include Roger Bacon, Edgar Allan Poe, Geoffrey Chaucer, and many more. By today&rsquo;s standards, the cryptographic techniques used through the ages right up to the end of World War I were pretty primitive. With the development of electro-mechanical devices, cryptography came of age. The subsequent evolution of the computer has raised the level of security that cryptography can provide in communications and data storage.</p> 
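
<p>As a reminder of how simple the classical techniques alluded to here were, a Caesar-style shift cipher can be written in a few lines (a minimal sketch, not taken from the paper):</p>

<pre>
# Classical Caesar shift cipher: replace each letter by the letter
# a fixed number of positions further along the alphabet.
def caesar(text, shift):
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
        else:
            out.append(ch)
    return "".join(out)

print(caesar("ATTACK AT DAWN", 3))    # DWWDFN DW GDZQ
</pre>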


<hr>

<a name="dp1996-11"></a><h3>96/11: Reasonable security safeguards for small to medium organisations</h3> 

<h4>H. B. Wolfe</h4> 

<p>In today&rsquo;s world most businesses, large and small, depend on their computer(s) to provide vital functions consistently and without interruption. In many organisations the loss of the computer function could mean the difference between continued operation and shutdown. Reliability and continuity, therefore, become the critical aspects of any computer system currently in use. This paper attempts to describe some of the most important issues any organisation should address in order to reduce its risk of computer-related failure.</p> 


<hr>

<a name="dp1996-12"></a><h3>96/12: Information warfare: Where are the threats?</h3> 

<h4>H. B. Wolfe</h4> 

<p>(No abstract.)</p> 

<p><a href="papers/dp1996-12.pdf">Download</a> (PDF, 72 KB)</p> 

<hr>

<a name="dp1996-13"></a><h3>96/13: Interactive visualisation tools for analysing NIR data</h3> 

<h4>H. Munro and K. Novins and G. L. Benwell and A. Moffat</h4> 

<p>This paper describes a tool being developed to allow users to visualise the ripening characteristics of fruit. These characteristics, such as sugar, acid and moisture content, can be measured using non-destructive Near Infrared Reflectance (NIR) analysis techniques. The four-dimensional nature of the NIR data introduces some interesting visualisation problems. The display device only provides two dimensions, making it necessary to design two-dimensional methods for representing the data. In order to help the user fully understand the dataset, a graphical display system is created with an interface that provides flexible visualisation tools.</p> 

<p><strong>Keywords:</strong> NIR spectroscopy, Polhemus FasTrak&#8482;, interaction, interactive graphics, interfaces, visualisation, scientific visualisation</p> 


<hr>

<a name="dp1996-14"></a><h3>96/14: A connectionist computational architecture based on an optical thin-film model</h3> 

<h4>M. K. Purvis and X. Li</h4> 

<p>A novel connectionist architecture that differs from conventional architectures based on the neuroanatomy of biological organisms is described. The proposed scheme is based on the model of multilayered optical thin-films, with the thicknesses of the individual thin-film layers serving as adjustable &lsquo;weights&rsquo; for the training. A discussion of training techniques for this model and some sample simulation calculations in the area of pattern recognition are presented. These results are shown to compare with results when the same training data are used in connection with a feed-forward neural network with back propagation training. A physical realization of this architecture could largely take advantage of existing optical thin-film deposition technology.</p> 


<hr>

<a name="dp1996-15"></a><h3>96/15: Measurement of database systems: An empirical study</h3> 

<h4>S. G. MacDonell and M. J. Shepperd and P. J. Sallis</h4> 

<p>There is comparatively little work, other than function points, that tackles the problem of building prediction systems for software that is dominated by data considerations, in particular systems developed using 4GLs. We describe an empirical investigation of 70 such systems. Various easily obtainable counts were extracted from data models (e.g. number of entities) and from specifications (e.g. number of screens). Using simple regression analysis, prediction systems of implementation size with accuracy of MMRE=21% were constructed. Our work shows that it is possible to develop simple and effective prediction systems based upon metrics easily derived from functional specifications and data models.</p> 
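
<p>For readers unfamiliar with the accuracy figure quoted above, MMRE is the mean magnitude of relative error between actual and predicted values; the sketch below computes it on invented size data (the 21% figure comes from the paper&rsquo;s own 70-system data set, which is not reproduced here).</p>

<pre>
# Mean Magnitude of Relative Error (MMRE); the size values are made up
# purely to illustrate the calculation.
def mmre(actual, predicted):
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

actual    = [1200, 800, 450, 2000]     # e.g. implementation size of four 4GL systems
predicted = [1000, 900, 500, 2300]
print(round(mmre(actual, predicted), 2))   # 0.14, i.e. a prediction system with MMRE = 14%
</pre>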

<p><strong>Keywords:</strong> metrics, entity-relationship models, 4GL, empirical, prediction</p> 

<p><a href="papers/dp1996-15.pdf">Download</a> (PDF, 200 KB)</p> 

<hr>

<a name="dp1996-16"></a><h3>96/16: Teaching a diploma in medical informatics using the World Wide Web</h3> 

<h4>R. T. Pascoe and D. Abernathy</h4> 

<p>This paper discusses the preliminary development of a Diploma in Medical Informatics comprising courses offered entirely through the Internet in the form of World Wide Web documents and electronic mail. The proposed use of such educational technology for delivering these courses within a distance learning environment is based upon a conversational framework developed by Laurillard (1993) and an associated classification of this technology according to the extent to which elements within the conversational framework are supported.</p> 

<p><strong>Keywords:</strong> Diploma in Medical Informatics, World Wide Web (WWW), distance learning, educational technology</p> 


<hr>

<a name="dp1996-17"></a><h3>96/17: Alternatives to regression models for estimating software projects</h3> 

<h4>S. G. MacDonell and A. R. Gray</h4> 

<p>The use of &lsquo;standard&rsquo; regression analysis to derive predictive equations for software development has recently been complemented by increasing numbers of analyses using less common methods, such as neural networks, fuzzy logic models, and regression trees. This paper considers the implications of using these methods and provides some recommendations as to when they may be appropriate. A comparison of techniques is also made in terms of their modelling capabilities with specific reference to function point analysis.</p> 

<p><a href="papers/dp1996-17.pdf">Download</a> (PDF, 232 KB)</p> 

<hr>

<a name="dp1996-18"></a><h3>96/18: A goal-oriented approach for designing decision support displays in dynamic environments</h3> 

<h4>W. B. L. Wong and D. P. O&rsquo;Hare and P. J. Sallis</h4> 

<p>This paper reports on how the Critical Decision Method, a cognitive task analysis technique, was employed to identify the goal states of tasks performed by dispatchers in a dynamic environment, the Sydney Ambulance Co-ordination Centre. The analysis identified five goal states: Notification; Situation awareness; Planning resource to task compatibility; Speedy response; Maintain history of developments. These goals were then used to guide the development of display concepts that support decision strategies invoked by dispatchers in this task environment.</p> 

<p><strong>Keywords:</strong> critical decision method (CDM), cognitive task analysis, cognitive engineering, ambulance dispatch, command and control, information portrayal, display design, decision support</p> 

<p><a href="papers/dp1996-18.pdf">Download</a> (PDF, 148 KB)</p> 

<hr>

<a name="dp1996-19"></a><h3>96/19: Software process engineering for measurement-driven software quality programs&mdash;Realism and idealism</h3> 

<h4>S. G. MacDonell and A. R. Gray</h4> 

<p>This paper brings together a set of commonsense recommendations relating to the delivery of software quality, with some emphasis on the adoption of realistic perspectives for software process/product stakeholders in the area of process improvement. The use of software measurement is regarded as an essential component for a quality development program, in terms of prediction, control, and adaptation as well as the communication necessary for stakeholders&rsquo; realistic perspectives. Some recipes for failure are briefly considered so as to enable some degree of contrast between what is currently perceived to be good and bad practices. This is followed by an evaluation of the quality-at-all-costs model, including a brief pragmatic investigation of quality in other, more mature, disciplines. Several programs that claim to assist in the pursuit of quality are examined, with some suggestions made as to how they may best be used in practice.</p> 

<p><a href="papers/dp1996-19.pdf">Download</a> (PDF, 240 KB)</p> 

<hr>

<a name="dp1996-20"></a><h3>96/20: Using genetic algorithms for an optical thin-film learning model</h3> 

<h4>X. Li and M. K. Purvis</h4> 

<p>A novel connectionist architecture based on an optical thin-film multilayer model (OTFM) is described. The architecture is explored as an alternative to the widely used neuron-inspired models, with the thin-film thicknesses serving as adjustable &lsquo;weights&rsquo; for the computation. The use of genetic algorithms for training the thin-film model, along with experimental results on the parity problem and the iris data classification are presented.</p> 
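
<p>The abstract does not give the genetic algorithm&rsquo;s details, but a generic GA loop of the kind used to adjust layer thicknesses might look like the sketch below; the target profile and fitness function here are stand-ins, since the real OTFM fitness evaluates the optical model&rsquo;s outputs against training targets.</p>

<pre>
# Generic genetic-algorithm training loop (illustrative only; the fitness
# function is a stand-in, not the optical thin-film computation).
import random

def fitness(thicknesses):
    target = [0.3, 0.7, 0.5, 0.9]                   # hypothetical ideal layer profile
    return -sum((t - g) ** 2 for t, g in zip(thicknesses, target))

def evolve(pop_size=30, n_layers=4, generations=200, mutation=0.05):
    pop = [[random.random() for _ in range(n_layers)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]               # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_layers)      # one-point crossover
            child = [t + random.gauss(0, mutation)   # Gaussian mutation
                     for t in a[:cut] + b[cut:]]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print([round(t, 2) for t in evolve()])               # approaches [0.3, 0.7, 0.5, 0.9]
</pre>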

<p><a href="papers/dp1996-20.pdf">Download</a> (PDF, 448 KB)</p> 

<hr>

<a name="dp1996-21"></a><h3>96/21: Early experiences in measuring multimedia systems development effort</h3> 

<h4>T. Fletcher and W. B. L. Wong and S. G. MacDonell</h4> 

<p>The development of multimedia information systems must be managed and controlled just as it is for other generic system types. This paper proposes an approach for assessing multimedia component and system characteristics with a view to ultimately using these features to estimate the associated development effort. Given the different nature of multimedia systems, existing metrics do not appear to be entirely useful in this domain; however, some general principles can still be applied in analysis. Some basic assertions concerning the influential characteristics of multimedia systems are made and a small preliminary set of data is evaluated.</p> 

<p><strong>Keywords:</strong> multimedia, management, metrics</p> 

<p><a href="papers/dp1996-21.pdf">Download</a> (PDF, 220 KB)</p> 

<hr>

<a name="dp1996-22"></a><h3>96/22: Applying soft systems methodology to multimedia systems requirements analysis</h3> 

<h4>D. Z. Butt and T. Fletcher and S. G. MacDonell and B. E. Norris and W. B. L. Wong</h4> 

<p>The Soft Systems Methodology (SSM) was used to identify requirements for the development of one or more information systems for a local company. The outcome of using this methodology was the development of three multimedia information systems. This paper discusses the use of the SSM when developing for multimedia environments. Namely, this paper covers the problems with traditional methods of requirements analysis (which the SSM addresses), how the SSM can be used to elicit multimedia information system requirements, and our personal experience of the method. Our personal experience is discussed in terms of the systems we developed using the SSM.</p> 

<p><strong>Keywords:</strong> multimedia information systems, Soft Systems methodology, systems development lifecycle</p> 


<hr>

<a name="dp1996-23"></a><h3>96/23: FuNN/2&mdash;A fuzzy neural network architecture for adaptive learning and knowledge acquisition</h3> 

<h4>N. K. Kasabov and J. Kim and M. J. Watts and A. R. Gray</h4> 

<p>Fuzzy neural networks have several features that make them well suited to a wide range of knowledge engineering applications. These strengths include fast and accurate learning, good generalisation capabilities, excellent explanation facilities in the form of semantically meaningful fuzzy rules, and the ability to accommodate both data and existing expert knowledge about the problem under consideration. This paper investigates adaptive learning, rule extraction and insertion, and neural/fuzzy reasoning for a particular model of a fuzzy neural network called FuNN. As well as providing for representing a fuzzy system with an adaptable neural architecture, FuNN also incorporates a genetic algorithm in one of its adaptation modes. A version of FuNN&mdash;FuNN/2, which employs triangular membership functions and correspondingly modified learning and adaptation algorithms, is also presented in the paper.</p> 
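
<p>As a point of reference for the membership functions mentioned above, a standard triangular membership function can be written as below; the linguistic labels and breakpoints are illustrative, and the actual FuNN/2 parameterisation is not given here.</p>

<pre>
# Standard triangular membership function (illustrative only; the centres
# and labels are hypothetical, not FuNN/2's learned parameters).
def triangular_mf(x, a, b, c):
    """Degree of membership of x in a triangle with feet a, c and peak b (a, b, c strictly increasing)."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

# Fuzzifying a crisp input value against three overlapping labels.
for label, (a, b, c) in {"low": (-5, 0, 5), "medium": (0, 5, 10), "high": (5, 10, 15)}.items():
    print(label, triangular_mf(3.0, a, b, c))   # low 0.4, medium 0.6, high 0.0
</pre>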


<hr>

<a name="dp1996-24"></a><h3>96/24: An agent-based architecture for software tool coordination</h3> 

<h4>S. J. S. Cranefield and M. K. Purvis</h4> 

<p>This paper presents a practical multi-agent architecture for assisting users to coordinate the use of both special and general purpose software tools for performing tasks in a given problem domain. The architecture is open and extensible, being based on the techniques of agent-based software interoperability (ABSI), where each tool is encapsulated by a KQML-speaking agent. The work reported here adds facilities for the user to describe the problem domain, the tasks that are commonly performed in that domain and the ways in which various software tools are commonly used by the user. Together, these features provide the computer with a degree of autonomy in the user&rsquo;s problem domain in order to help the user achieve tasks through the coordinated use of disparate software tools.</p>

<p>This research focuses on the representational and planning capabilities required to extend the existing benefits of the ABSI architecture to include domain-level problem-solving skills. In particular, the paper proposes a number of standard ontologies that are required for this type of problem, and discusses a number of issues related to planning the coordinated use of agent-encapsulated tools.</p> 

<p><a href="papers/dp1996-24.pdf">Download</a> (PDF, 72 KB)</p> 

<hr>

<a name="dp1996-25"></a><h3>96/25: Special issue: GeoComputation &rsquo;96</h3> 

<h4>Various</h4> 

<p>A collection of papers authored by members of the Information Science department and presented at the 1st International Conference on GeoComputation, Leeds, United Kingdom. It comprises the nine papers listed below.</p> 

<a name="dp1996-25a"></a><h3>96/25a: Spatial databases&mdash;Creative future concepts and use</h3> 

<h4>G. L. Benwell</h4> 

<p>There is continuing pressure to develop spatial information systems. This paper develops two concepts that could emerge. The first is a new spatial paradigm&mdash;an holistic model&mdash;which is less of an abstraction from reality than current models. The second is the concept of federated databases for improved and transparent access to data by disparate users. The latter concept is hardly new and is included in this paper to emphasise its growing importance. These two developments are presented after an introductory discussion of the present state of the discipline of geographical information systems and spatial analysis.</p> 

<p style="color: green;">(Not in the electronic version.)</p>

<a name="dp1996-25b"></a><h3>96/25b: First experiences in implementing a spatial metadata repository</h3> 

<h4>S. Cockcroft</h4> 

<p>Integrated software engineering environments (ISEE) for traditional non-spatial information systems are well developed, incorporating Database Management Systems (DBMS) and Computer Aided Software Engineering (CASE) tools. The core component of the ISEE is the repository. It brings all the other components together and provides a common area to which all tools can link. In this fashion it also provides a central point for control. No such facility exists for the management of spatial data. This paper describes the development of such a facility in the form of a spatial metadata repository.</p> 

<p style="color: green;">(Not in the electronic version.)</p>

<a name="dp1996-25c"></a><h3>96/25c: Incorporating a new computational reasoning approach to spatial modelling</h3> 

<h4>A. Holt</h4> 

<p>Decision support systems, statistics and expert systems were some of the mainstay techniques used for modelling environmental phenomena. Now modelling systems utilise artificial intelligence (AI) techniques for the extra computational analysis they provide. By operating in a toolbox environment and adopting AI techniques, geographic information system (GIS) modellers have greater options available for solving problems. This paper outlines a new approach to applying artificial intelligence techniques to solve spatial problems. The approach combines case-based reasoning (CBR) with geographic information systems and allows both techniques to be applied to solve spatial problems. More specifically, this paper examines techniques applied to the problem of soil classification. Spatial cases are defined and analysed using the case-based reasoning techniques of retrieve, reuse, revise and retain. Once the structure of cases is defined, a case base is compiled. When the case base is of sufficient size, the problem of soil classification is tested using this new approach. The problem is solved by searching the case base for another spatial phenomenon similar to the one at hand; the knowledge from that retrieved case is then used to formulate an answer to the problem. A comparison of the results obtained by this approach and a traditional method of soil classification is then undertaken. This paper also documents the data-saving concept involved in translating from decision trees to CBR. The logistics of the problems that are characteristic of case-based reasoning systems are discussed: for example, how should the spatial domain of an environmental phenomenon best be represented in a case base? What are the constraints of CBR, what data are lost, and what functions are gained? Finally, the following question is posed: &ldquo;to what real-world level can the environment be modelled using GIS and case-based reasoning techniques?&rdquo;</p> 
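
<p>The retrieve step of the CBR cycle described above can be sketched as a nearest-neighbour search over stored cases, as below; the attributes, cases and soil classes are invented for illustration and are not the paper&rsquo;s case base.</p>

<pre>
# Sketch of the "retrieve" step of case-based reasoning for soil
# classification (illustrative only; revise and retain steps omitted).
def distance(query, case):
    return sum((query[k] - case[k]) ** 2 for k in query) ** 0.5

case_base = [
    ({"slope": 2.0,  "rainfall": 600,  "elevation": 20},  "alluvial"),
    ({"slope": 15.0, "rainfall": 1200, "elevation": 450}, "podzol"),
    ({"slope": 8.0,  "rainfall": 900,  "elevation": 150}, "brown earth"),
]

def retrieve(query):
    # Reuse the classification of the most similar stored case.
    return min(case_base, key=lambda item: distance(query, item[0]))[1]

print(retrieve({"slope": 3.0, "rainfall": 650, "elevation": 40}))   # alluvial
</pre>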

<p><a href="papers/dp1996-25c.pdf">Download</a> (PDF, 1 MB)</p> 

<a name="dp1996-25d"></a><h3>96/25d: GIS, expert systems and interoperability</h3> 

<h4>L. Lilburne and G. Benwell and R. Buick</h4> 

<p>How should geographic information systems be developed? There is a strong demand from users for enhanced functionality and power. Vendors can and do respond to these demands. But where will this lead? Will the result be one all-embracing and all-conquering program or geographic information system (GIS)? A GIS could grow to incorporate all statistical functions, all visualisation techniques, all data management functions etc. It is possible to perceive a scenario in which GIS is developed to &lsquo;bloatware&rsquo; proportions.</p>

<p>An alternative scenario is one in which a GIS is interfaced with other software systems. Embedding database bridges and other product-specific links, providing data import and export routines, and system calls are all ways of interfacing GIS with other systems. GIS vendors could opt to produce a &lsquo;linkware&rsquo; GIS, interfaced to as many third party systems as possible.</p>

<p>Given these two alternatives to GIS development, an interesting set of questions arises. How far do vendors go with enhancing their systems compared with interfacing with third party systems? Is there a balance? Or do GIS users just keep calling for &lsquo;more&rsquo;, regardless of the solution set?</p>

<p>There is a balance. GIS is likely to be developed by being enhanced AND by being interfaced with third party software. In a way, this is a third developmental track leading to an increasingly functional GIS whose ability to interact with other systems is greatly improved. This interoperable GIS allows flexible combinations of systems components while still providing a comprehensive range of spatial operations and analytical functions.</p>

<p>Of these three developmental tracks, this paper presents an example of what can be achieved with the interoperable GIS. Expert systems are introduced along with the client/server and object-oriented paradigms. By using these paradigms, a generic, spatial, rule-based toolbox called SES (spatial expert shell) has been created. SES is described using examples and contrasted with other documented expert system-GIS linkages. But first integration is modelled in three dimensions to highlight the need for improvements in how GISs can interact with other systems.</p> 

<p><a href="papers/dp1996-25d.pdf">Download</a> (PDF, 564 KB)</p> 

<a name="dp1996-25e"></a><h3>96/25e: Environmental decisions with spatial process modelling</h3> 

<h4>S. Mann</h4> 

<p>This paper first describes the difficulties inherent in supporting a class of environmental problems, those involved in Regional Environmental Decision Making. A set of conceptual criteria is presented, along with discussion of how the criteria might be approached. It is shown that a major obstacle is the need for a system that integrates components of Geographic Information Systems with process modelling functions. A new approach, Spatial Process Modelling, is proposed. More detailed design criteria for this system are developed and then used to build a prototype system. The system is described and its benefits and limitations discussed.</p> 

<p><a href="papers/dp1996-25e.pdf">Download</a> (PDF, 348 KB)</p> 

<a name="dp1996-25f"></a><h3>96/25f: GIS maturity and integration</h3> 

<h4>A. J. Marr and G. L. Benwell</h4> 

<p>This paper discusses the concept of maturity in the use of GIS and then formulates a computational method for measuring an organisation&rsquo;s maturity level through the construction of a surrogate indicator. Generation of this model is made under the proposition that maturity is linked to the extent to which GIS has been integrated and utilised on an organisation-wide basis in day-to-day activities. The research focuses on New Zealand local government and incorporates parallel studies of conventional information technology (IT) with recently collected data to provide support for the concepts and techniques used. It is postulated that, due to similarities of function found in other local authorities, the model has the potential, with further research, for wide application.</p> 

<p><a href="papers/dp1996-25f.pdf">Download</a> (PDF, 500 MB)</p> 

<a name="dp1996-25g"></a><h3>96/25g: Wildlife population analysis with GIS: Conservation management of royal albatross</h3> 

<h4>B. R. McLennan and M. K. Purvis and C. J. R. Robertson</h4> 

<p>This paper describes the use of a prototype spatial information system to facilitate exploratory analyses of 60 years of scientific observation data concerning a breeding population of royal albatrosses at Taiaroa Head, on the east coast of New Zealand&rsquo;s South Island. This system will form the basis of an ongoing data collection, management and analysis effort. Incorporation of breeding records with spatial and landscape data permits the investigation of spatial interactions between the location of nest sites and other phenomena. Three example analyses that explore these interactions are described and discussed.</p> 

<p style="color: green;">(Not in the electronic version.)</p>

<a name="dp1996-25h"></a><h3>96/25h: Data sharing using the X.500 directory</h3> 

<h4>R. T. Pascoe</h4> 

<p>Sharing geographical data sets is highly desirable for economic and technical reasons. In this paper the author describes the development of an agency for sharing geographical data which is based on the use of the ISODE implementation of the X.500 Directory Service and a collection of software agents which collaborate with each other to perform the various tasks associated with sharing data.</p> 

<p style="color: green;">(Not in the electronic version.)</p>

<a name="dp1996-25i"></a><h3>96/25i: Spatial data acquisition from motion video</h3> 

<h4>M. Williams</h4> 

<p>Geographic information systems are an important tool for the field of geocomputing. A key component of every system is the data&mdash;spatial data has traditionally been labour-intensive to collect, and hence expensive. This paper establishes a new method of acquiring spatial data from motion video. The proposed method is based upon the principles of photogrammetry, but allows position to be calculated with feature tracking rather than point correspondence. By doing so, it avoids many constraints imposed by previous solutions. The new method is demonstrated with linear and rotational motion.</p> 

<p><a href="papers/dp1996-25i.pdf">Download</a> (PDF, 808 KB)</p>