Locational intelligence through Geoinformatics and AI?

People use computers in their everyday lives. From navigation systems to shopping planners and smart homes, our lives are increasingly shaped by intelligent systems. But how do such systems work, and what do they mean for society?

Artificial intelligence (AI) is the ability of a computer programme or a machine to think logically, solve problems and learn. In practice, however, most applications have focused on problems that computers handle well. Searching databases and performing calculations are things computers do better than people. On the other hand, "perceiving its environment" in any real sense is well beyond present-day computing. Generally speaking, an ideal intelligent machine mimics human cognition. AI draws on many different fields, including computer science, mathematics, linguistics, psychology, neuroscience, and philosophy.

Researchers from the Department of Geoinformatics - Z_GIS at the University of Salzburg and computer scientists from the University of Klagenfurt critically discussed recent developments in AI for locational intelligence. The main goal of the workshop was to describe and analyze various “intelligent” methods related to natural language processing (NLP) and spatial information. They identified five major topics:

Sentiment analysis
Sentiment analysis is an essential part of natural language processing: determining whether the attitude of a piece of text towards a particular topic is positive, negative or neutral. Markus’ work focuses on evaluating the most cutting-edge technologies in sentiment analysis. In particular, he developed a benchmark tool that measures accuracy, performance and the ratio between the two, and supports both English and German.
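The basic idea of classifying a text as positive, negative or neutral can be sketched with a minimal lexicon-based approach. This is an illustrative toy example, not Markus’ benchmark tool; the word lists and scoring rule are assumptions:

```python
# Minimal lexicon-based sentiment classifier (illustrative sketch only).
# The polarity word lists below are made up for this example.
POSITIVE = {"good", "great", "excellent", "love", "happy"}
NEGATIVE = {"bad", "terrible", "awful", "hate", "sad"}

def sentiment(text: str) -> str:
    """Label a text positive, negative or neutral by counting
    matches against small polarity word lists."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("What a great and happy day"))    # positive
print(sentiment("This was a terrible experience"))  # negative
```

Real benchmark systems replace the word lists with trained models, but the input/output contract (text in, one of three labels out) stays the same.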

Topic Modeling
A system was presented that collects news from various sources (blogs, news sites, etc.) and categorizes them based on the specific topic they refer to. Topics are derived by crawling a predefined set of sources several times a day, which yields a corpus of headlines. The process extends the traditional bag-of-words search of articles with queries that take word order into account and with various regular-expression techniques.
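The combination of a bag-of-words check with a word-order-sensitive regular expression might look like the following sketch. The topic definitions and sample headlines are assumptions for illustration:

```python
import re

def matches_topic(headline, keywords, phrase_pattern=None):
    """Bag-of-words match (all keywords present in any order),
    optionally tightened by a regex that enforces word order."""
    tokens = set(re.findall(r"\w+", headline.lower()))
    if not keywords <= tokens:      # bag-of-words: every keyword must occur
        return False
    if phrase_pattern:              # word order enforced via regular expression
        return re.search(phrase_pattern, headline, re.IGNORECASE) is not None
    return True

# Plain bag-of-words: keyword order does not matter.
print(matches_topic("Flood warning issued for Salzburg", {"flood", "salzburg"}))
# Ordered-phrase query: same keywords, but "flood warning" must appear as a phrase.
print(matches_topic("Salzburg museum opens flood exhibit",
                    {"flood", "salzburg"}, r"flood\s+warning"))
```

The regex step is what distinguishes a headline that merely contains the words from one that uses them in the queried order.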

Localization based on textual information
Only in some cases can text messages be unambiguously assigned to a location. While there are many methods to retrieve the meaning or the sentiment of a short message such as a tweet, localization remains challenging. Recent research demonstrates that for extreme events such as floods or earthquakes it is possible to construct a spatial footprint from a large body of messages mentioning such events or related keywords.
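One simple way to build such a footprint from geotagged messages is to count keyword mentions per grid cell. This is a deliberately coarse sketch; the messages, coordinates and one-degree grid are assumptions:

```python
from collections import Counter

def spatial_footprint(messages, keyword, cell_size=1.0):
    """Count keyword-bearing messages per lat/lon grid cell.
    messages: iterable of (text, lat, lon) tuples."""
    counts = Counter()
    for text, lat, lon in messages:
        if keyword in text.lower():
            cell = (int(lat // cell_size), int(lon // cell_size))
            counts[cell] += 1
    return counts

# Made-up geotagged messages around Salzburg and Vienna.
msgs = [
    ("Flood water rising fast!", 47.8, 13.0),
    ("Streets flooded downtown", 47.9, 13.1),
    ("Nice weather today", 48.2, 16.4),
]
print(spatial_footprint(msgs, "flood"))  # both flood messages land in cell (47, 13)
```

The resulting cell counts form a simple heat map of where the event is being talked about, which is the intuition behind the spatial footprints mentioned above.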

Pattern Recognition of Places
Traditional search for places (e.g. using Google Maps) relies on simple sets of keywords. This approach is limited because it cannot capture features of a place that go beyond its name, such as activities and purposes. To address this, a novel place search process is proposed based on pattern-based representations. Patterns are derived through a combination of knowledge extraction from narratives and probabilistic machine learning on real-world data.
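The contrast between name-based and pattern-based search can be sketched as matching places by the activities they support rather than by their names. The place data and scoring below are illustrative assumptions, not the proposed system:

```python
# Toy place database: each place carries a set of activity tags
# (hypothetical data for illustration).
PLACES = {
    "Mirabell Gardens": {"walking", "photography", "relaxing"},
    "Hangar-7": {"dining", "exhibition"},
    "Kapuzinerberg": {"hiking", "walking", "photography"},
}

def search_by_pattern(pattern):
    """Rank places by how many requested activities they support,
    instead of matching against place names."""
    scored = [(len(pattern & acts), name) for name, acts in PLACES.items()]
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

print(search_by_pattern({"walking", "photography"}))
```

A keyword search for "Kapuzinerberg" would only find the name; the pattern query finds every place where walking and photography are plausible activities.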

Human Trajectories
Most humans generate spatial traces as they move. Telecom operators always know where their customers are but are, of course, reluctant to give away personal information. However, some providers have started to offer generalized and anonymized data, which can still serve as the basis for analyses that aim to understand flows in cities. Many users also provide their spatial footprints voluntarily or unknowingly through their activities on social media. These data help scientists understand movement patterns but also trigger research on geo-privacy.
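Understanding flows in cities from anonymized traces often boils down to counting transitions between zones. A minimal sketch, with made-up zone names:

```python
from collections import Counter

def od_flows(traces):
    """Count origin-destination transitions between consecutive zones
    across a set of anonymized trajectories."""
    flows = Counter()
    for trace in traces:
        for origin, dest in zip(trace, trace[1:]):
            if origin != dest:
                flows[(origin, dest)] += 1
    return flows

# Hypothetical anonymized traces: sequences of city zone IDs, no user IDs.
traces = [
    ["old_town", "station", "airport"],
    ["station", "airport"],
    ["old_town", "station"],
]
print(od_flows(traces))  # e.g. two trips old_town -> station, two station -> airport
```

Because only aggregated zone-to-zone counts survive, no individual trajectory is recoverable, which is exactly the kind of generalization the providers mentioned above rely on.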