In the current era of enormous geospatial data (geodata), the contents of such data often remain poorly comprehensible to people, and the data increasingly contain features and phenomena unknown to their users. Yet effective use of these data, and understanding the phenomena related to the geographic features they describe, require comprehending exactly this unknown information. The task is challenging for human interpreters because, in these unknown cases, they do not know specifically what they are searching for; rather, they seek previously familiar patterns and shapes in the data. Finding such patterns and shapes requires examining the data from different viewpoints, for which tools of geovisual analytics and exploration (geoexploration) are often utilised. Nonetheless, the use of these tools can fall short because of inadequate classification and parametrisation choices in geovisualisation, a problem to which this research project seeks answers. Methods of human-aided artificial intelligence (AI) will be investigated for creating such tools and a prototype map user interface, allowing the diverse characteristics of unknown features in geodata to be brought out effectively in the future.
This research project aims 1) to create knowledge about the state of the art of AI methods in geoexploration; 2) to determine suitable AI methodology for geoexplorative AI tools; and 3) to develop such tools and a prototype map user interface. For user interaction, eye tracking will be investigated in addition to traditional control devices. To this end, geographic information systems as well as software for user interfaces and AI will be employed. The AI solutions created, and their needs for further development, will be assessed through scientific user research, for example in the use case of comprehending features in terrain geodata. The project will produce information and publications on principles for creating geoexplorative AI systems that will enable unknown features in geodata to be examined and comprehended in the future.