Machine learning alone doesn't solve the natural language problem. DEEPSEARCH's technology does.

Semantic Spaces

Semantic spaces are built from specific, language-agnostic concepts of the real world and their relationships to one another (a knowledge graph).

Concepts are described in different languages by words, phrases, word combinations and collocations. The closer these terms are to a concept's center, the higher their impact.
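As a hedged sketch of that structure (the class names, fields and the "dog" example below are illustrative assumptions, not DEEPSEARCH's actual data model), a concept can be modeled as a language-agnostic node that carries weighted surface forms per language plus typed relations to other concepts:

```python
from dataclasses import dataclass, field


@dataclass
class SurfaceForm:
    """A word, phrase, or collocation expressing a concept in one language."""
    text: str
    language: str
    weight: float  # closer to the concept's center -> higher impact


@dataclass
class Concept:
    """A language-agnostic node in the knowledge graph."""
    concept_id: str
    surface_forms: list = field(default_factory=list)
    relations: dict = field(default_factory=dict)  # relation type -> set of concept ids

    def add_relation(self, relation: str, other_id: str) -> None:
        self.relations.setdefault(relation, set()).add(other_id)


# An illustrative concept with multilingual surface forms and graph relations.
dog = Concept(
    concept_id="concept:dog",
    surface_forms=[
        SurfaceForm("dog", "en", weight=1.0),    # core term, near the center
        SurfaceForm("hound", "en", weight=0.6),  # further from the center, lower impact
        SurfaceForm("Hund", "de", weight=1.0),
        SurfaceForm("chien", "fr", weight=1.0),
    ],
)
dog.add_relation("is_a", "concept:animal")
dog.add_relation("related_to", "concept:leash")
```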

In the natural language domain, semantic spaces aim to create representations of language that are capable of capturing meaning. The original motivation for semantic spaces stems from two core challenges of natural language:

1. Vocabulary mismatch (the fact that the same meaning can be expressed in many ways) and

2. Ambiguity of natural language (the fact that the same term can have several meanings).

Example "Noise disturbance"

DEEPSEARCH's powerful knowledge base comprises 120,000+ "concepts" that deepen its understanding of the real world.

Simplified example of the concept "dog"

Easy-to-handle "semantic building blocks" point to their place in the knowledge base and therefore always know their context.

Building blocks used in a utility business context
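As a minimal sketch of that idea (the BuildingBlock type, its field names and the utility-domain examples below are assumptions for illustration, not DEEPSEARCH's actual model), each building block carries a pointer back to its concept in the knowledge base, so different wordings in a utility business context resolve to the same meaning:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class BuildingBlock:
    """A reusable unit of meaning that always knows where it sits in the knowledge base."""
    text: str        # the surface text as it appears in a document or query
    concept_id: str  # pointer to its concept node in the knowledge graph
    domain: str      # business context in which the block is used


# Building blocks as they might appear in a utility business context.
blocks = [
    BuildingBlock("meter reading", "concept:meter_reading", domain="utility"),
    BuildingBlock("power outage", "concept:power_outage", domain="utility"),
    BuildingBlock("Zählerstand", "concept:meter_reading", domain="utility"),  # German form, same concept
]

# Because each block points back to a concept, two different wordings
# can be recognised as the same thing.
assert blocks[0].concept_id == blocks[2].concept_id
```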