N-Triples for our experiments. We also considered adding another application of the hyperbolic tangent on top of the calculations of the RTN architecture. (higher-order) tensor products—cf., e.g., Nickel et al. Those challenges can be overcome by advanced technology based on deep learning. Recent advances in machine learning, particularly those involving deep neural networks, have the potential to help mitigate these issues with ontology development and alignment, while enhancing and automating aspects of the implementation and expansion of an ontology. However, there exist elaborate reasoning systems already. The motivation for employing deep learning, which refers to the use of neural networks, is discussed below. U_R ∈ R^{d×k}. In the context of an OKB, there are two kinds of predictions that we are interested in, namely of unary and binary predicates. These methods factorize a tensor that describes the structure of a relational dataset into a product of an embedding matrix as well as another tensor that represents the relations. Training such a model is straightforward, and switches back and forth between computing embeddings and computing predictions based on them. We evaluated our system on a number of large standard benchmark datasets, and found that it attained a high reasoning quality. In contrast to this, binary predicates define relationships that might exist between a pair of individuals. Luciano Serafini and Artur d’Avila Garcez. Such an OKB has a natural representation as a labeled directed multigraph. If we really need to account for predicates of arity greater than two, then we can view any such predicate accordingly. If we face a relational dataset, though, then the training samples are actually vertices of a graph. We require an embedding to reflect all of the information that we have about a single individual. More precisely, we compared our implemented system on both unary and binary predicates, i.e., classes and relations. Embeddings are computed by means of a trained RTN, which obviously has great advantages regarding its memory consumption. This way, an RTN can be used as a kind of relational autoencoder.
and the OWL reasoner Pellet 2.4.0. Therefore, it makes sense to train an RTN together with the model that is used for computing predictions. While individuals in a relational dataset are initially represented by their respective feature vectors. Natural language processing has various bottlenecks, such as part-of-speech tagging, relation extraction from unstructured text, co-reference resolution, and named entity recognition. Notice, however, that this depends on the formalism used. As discussed in Section 2.1, OKBs can be viewed as DAGs. We compared our approach with one of the best reasoners at present, RDFox, on several large standard benchmarks, and showed that it attains a high reasoning quality. Apache Jena 2.13.0. Statistical relational learning (SRL; Getoor and Taskar, 2007)—cf. PyCUDA and PyOpenCL: A Scripting-Based Approach to GPU Run-Time Code Generation. Paul Tucker, Vincent Vanhoucke, Vijay Vasudevan, Fernanda Viégas, and Oriol Vinyals. Then t(1) and t(2) are two target functions defined on the relations present in the data. Li Ma, Yang Yang, Zhaoming Qiu, Guotong Xie, Yue Pan, and Shengping Liu. In each training iteration, we start from the feature vectors of the individuals as they are provided. Yangqing Jia, Rafal Jozefowicz, Lukasz Kaiser, Manjunath Kudlur, and Josh Levenberg. The task of ontology reasoning. Christian Bizer, Jens Lehmann, Georgi Kobilarov, Sören Auer, and Christian Becker. Data are stored somehow in memory or on disk. The ontology does not specify anything except the classes and relations that exist in the data. This is explained as follows. What is really appealing about ontologies is that they usually not just define those predicates, but also describe how they interact—all of the import times reported in Table 3 refer to these graphs.
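Since the two target functions t(1) and t(2) are only named here, a minimal sketch may help: one prediction head scores class (unary-predicate) membership of a single individual's embedding, the other scores whether a binary relation holds between a pair. All names, sizes, and the simple linear parameterization below are illustrative assumptions for this sketch, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_classes = 8, 3  # embedding size and number of classes (hypothetical)

# Hypothetical linear prediction heads on top of fixed embeddings.
U_c = rng.standard_normal((n_classes, d))  # one row of class scores per class
U_r = rng.standard_normal(2 * d)           # a single relation scorer

def predict_classes(x):
    """t(1): probability of individual x belonging to each class."""
    return 1.0 / (1.0 + np.exp(-(U_c @ x)))

def predict_relation(x, y):
    """t(2): probability that the relation holds between x and y."""
    return 1.0 / (1.0 + np.exp(-(U_r @ np.concatenate([x, y]))))

x = rng.standard_normal(d)
y = rng.standard_normal(d)
p_cls = predict_classes(x)      # vector of n_classes probabilities
p_rel = predict_relation(x, y)  # scalar probability
```

In practice one such relation scorer would exist per binary predicate, but a single head suffices to show the shape of the two targets.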
First, we need to consider its predictive performance based on the embeddings computed by the RTN. This could include simple inferences that are obvious to a human as well, but also much more elaborate reasoning that takes several classes and relations into account. One set of parameters is used for the source and one for the target, and we denote these as R▹ and R◃. We ran the computations on a GPU using PyCUDA 2016.1.2 (Klöckner et al., 2012). However, while this does not fit the original framework of recursive networks, we can still make use of it. These are two traditionally quite separated fields, namely ML and KRR. Nat. Genet. The hyperbolic tangent is given in Equation 2 in order to keep the elements of the created embeddings in [−1,1]. This is based on the embeddings that we created in the previous step. Any OKB that is defined in terms of unary and binary predicates only has a natural representation as a labeled directed multigraph. While all these data are available in multiple formats, we made use of the ontologies specified in OWL and the facts provided as N-Triples. Binary predicates hold between individuals, and are usually referred to as relations or roles. b_R ∈ R^k, and the prediction model depends on the task, and is chosen case by case. The use of formal reasoning is a tradeoff. Ontological modeling can help a cognitive AI or machine learning model by broadening its scope. Gabrilovich et al. Ontology as a representation of a conceptual system via a … A single thread on a CPU. Recursive NNs (Pollack, 1990) are a special kind of network architecture. The model introduced in this work can be easily extended to the general case, though. This reduced the size of the data, as stored on disk, to approximately one third of the original dataset. To the best of our knowledge, we are the first to investigate ontology reasoning based on deep learning. It does not employ any kind of formal reasoning at all. K ⊨ ¬P_m(i), and 0 otherwise; and predicates Q_1, …, Q_ℓ, and T ⊆ K the part of the OKB. This, in turn, allows for speeding up the necessary computations significantly, since we can dispatch most of the »heavy-lifting« to a GPU. This differs from the original tensor layer given in Equation 1 (Socher et al., 2013). Ontology as a formal semantic account.
Applications. We consider each individual, and thus compute an according vector representation based on the relations that it is involved in. Most of the »heavy-lifting« is dispatched to a GPU. Deep learning is a new, sophisticated alternative to the manual construction and development of an ontology. On the other hand, however, their predictions are correct with a certain probability only. If they can provide inferences, then these are correct with certainty. If any. Intelligence (AAAI 2014). Convolutional layers are used as appropriate. Some individuals are involved in a large number of relations. This paper presents an ontology-based deep learning approach for extracting disease names from Twitter messages. Therefore, in our experiments, we used mini-batches that were balanced with respect to this, and mini-batches that consist of training samples for both of the prediction targets. In an RTN, this deliberation is reflected by a modified tensor layer, where the notation is the same as in Equation 1 except that the parameters are specific to the relation R. A directed edge. This could encompass simple inferences, like that every individual of class woman also belongs to a more general class. Difficulties with handling incomplete information and scalability problems. Each training sample is sampled once. This is a necessary restriction to ensure that there is enough data for an RTN to learn properly. Main-Memory RDF Systems.
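As a rough illustration of such a relation-specific tensor layer (a sketch only; the exact RTN parameterization is not reproduced here), the bilinear term x^T W^{[1:k]} y can be computed slice by slice and squashed with tanh, so that the updated embedding stays in [−1, 1]. All shapes and parameter names (W, V, b, U) below are assumptions for this sketch.

```python
import numpy as np

d, k = 4, 3  # embedding size and number of tensor slices (hypothetical)
rng = np.random.default_rng(1)

# Hypothetical parameters of one relation-specific layer.
W = rng.standard_normal((k, d, d)) * 0.1   # bilinear tensor W^{[1:k]}
V = rng.standard_normal((k, 2 * d)) * 0.1  # linear part V
b = np.zeros(k)                            # bias b
U = rng.standard_normal((d, k)) * 0.1      # maps k slice activations back to R^d

def rtn_update(x, y):
    """Update the embedding of x from the triple (y, R, x)."""
    # One bilinear score x^T W_i y per slice i, via einsum.
    bilinear = np.einsum('i,kij,j->k', x, W, y)
    linear = V @ np.concatenate([x, y])
    # tanh keeps every element of the new embedding inside [-1, 1].
    return np.tanh(U @ np.tanh(bilinear + linear + b))

x, y = rng.standard_normal(d), rng.standard_normal(d)
x_new = rtn_update(x, y)  # updated embedding of x, same dimension d
```

A full model would hold one such parameter set per relation, and separate sets for updating the source and the target of a relation instance.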
Advances in Neural Information Processing Systems 26. Levenberg, Dan Mané, Rajat Monga, Sherry Moore, Derek Murray, and Chris Olah. Thereby, the term x^T W_R^{[1:k]} y denotes a bilinear tensor product. Therefore, we are only left with specifying the prediction model that we want to use on top of the embeddings. Individuals that are used as input for some specific prediction task. Defined in terms of unary and binary predicates. In this work, we present a novel approach to ontology reasoning that is based on deep learning rather than logic-based formal reasoning. We update the current embedding of one of the individuals in each triple by means of our RTN. This is not part of NeTS right now, and may be incorporated in future versions. Andreas Klöckner, Nicolas Pinto, Yunsup Lee, B. Catanzaro, and Paul Ivanov. We compare with one of the best logic-based ontology reasoners at present, RDFox, on a number of large standard benchmarks. To that end, consider Table 2, which reports the accuracies as well as F1 scores. Furthermore, we considered only those predicates that appear for at least 5% of the individuals in a database. NeTS (Neural Triple Store) achieves ontology reasoning solely by means of an RTN. As already suggested before, we usually employ RTNs in order to compute embeddings for individuals. A single recursive layer accepts two vectors as input and maps them to a common vector space. To this end, we. Peter F. Patel-Schneider. Subsequent processing of queries is entirely based on these embeddings, and does not employ any kind of formal reasoning at all.
Ontology reasoning refers to a common scenario where the inference rules to be used for reasoning, called the ontology in this context, are specified along with the factual information that we seek to reason about. Proceedings of the 28th International Conference on Machine Learning. The foregoing considerations also explain the differences between Equation 2 and the original tensor layer given in Equation 1. Deep learning is an increasingly important technology used in medical research, driverless cars, electronics, aerospace defense, and speech/language recognition, as well as in face and object recognition. Each input is a word, and is given as either a one-hot vector or a previously learned word embedding. Can recursive neural tensor networks learn logical reasoning? An OKB that contains (exactly) the unary predicates P_1, …, P_k and (exactly) the binary predicates. Therefore, one can actually consider the training step as part of the setup of the database system. (2016) for a recent survey. Our choice. However, in the sequel, we only talk about a number of facts together with an ontology that describes the domain of interest, and we refer to such a setting as an ontological knowledge base (OKB). Becker, Richard Cyganiak, and Sebastian Hellmann. Embedding. This is referred to as a relational tensor network (RTN). Intuitively, this means that we basically apply a recursive NN to an update tree of an individual. This layer is used to reduce a provided tree step by step in a bottom-up fashion until only one single vector is left. Performed by RDFox. The researchers used the Continuous Bag of … MIT Press, 2007. 2000, 25(1):25–9. In contrast to this, NeTS accounts for these inferences simply by adjusting the individuals’ embeddings. Zheng.
The encouraging results obtained in the paper provide first evidence of the potential of deep learning techniques towards long-term ontology learning challenges, such as improving domain independence, reducing engineering costs, and dealing with variable language forms. An important aspect to note is that an ontology is situated on the meta-level. The term »relational« emphasizes the focus on relational datasets. COSMO, a Foundation Ontology (current version in OWL), is designed to contain representations of all of the primitive concepts needed to logically specify the meanings of any domain entity. Humans need to intervene, at least initially, to direct algorithmic behavior towards effective learning and neural network collaboration towards generalizing its knowledge when presented with future data. The Gene Ontology Consortium. The approach relies on simple features obtained via conceptual representations of messages to obtain results that outperform those from word-level models. Deep learning has made feasible the derivation of word embeddings (i.e. y, since x by itself should not determine the way that it is updated. In contrast to this, formal reasoners are often obstructed by the above problems, but if they can provide inferences, then these are correct with certainty. Logic tensor networks: Deep learning and logical reasoning from data and knowledge. In the dataset. Next, it observes whether there are previously generated embeddings of the individuals stored on disk already, and loads them as well. As for the second point, RDFox makes use of extensive parallelization, also for importing data, while NeTS runs as a single process with a single thread. As mentioned earlier, RDFox is indeed a great benchmark, since it has been shown to be the most efficient triple store at present. Instances that are given as DAGs.
Table 3, in contrast, lists the times for NeTS to import and materialize each of the datasets. In general, they can deal with any directed acyclic graph (DAG), since any such graph can be unrolled as a tree. From an ML perspective, these are really two different targets, and we can describe them more precisely. The test system hosted Ubuntu Server 14.04 LTS (64-bit) with CUDA 8.0 and cuDNN 5.1 for GPGPU. The predicates that these were involved in serve as a test set from each of the datasets, and similarly another vector indicates which classes they belong to. For a comparison with other systems, however, we refer the interested reader to Motik et al. Notice, however, that neither of the measures reported for NeTS contains the time for training the model. V_R ∈ R^{k×2d}. Huntley RP, Sawford T, Martin MJ, O'Donovan C. Understanding how and why the Gene Ontology and its annotations evolve: the GO within UniProt. Programming Approach. Proceedings of the 28th AAAI Conference on Artificial Intelligence (AAAI 2014). On deep learning rather than logic-based formal reasoning. Deep learning on such large and expressive OKBs. In the last few years, there has been an increasing interest in the application of machine learning (ML) to the field of knowledge representation and reasoning (KRR). This is because an average database is updated with new facts quite frequently, while it is imported only once in a while. Sanjay Ghemawat, Ian Goodfellow, Andrew Harp, Geoffrey Irving, and Michael Isard. Distributed word representations) from corpora of billions of words applying neural language models like CBOW and Skip-gram.
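The unrolling of a DAG into a tree and its bottom-up reduction can be sketched as follows. The combine function here is a trivial elementwise mean standing in for the learned recursive layer, and the nested-tuple tree encoding is purely an assumption of this sketch.

```python
def combine(a, b):
    """Stand-in for the learned recursive layer: merge two vectors into one."""
    return [(u + v) / 2 for u, v in zip(a, b)]

def reduce_tree(node, leaf_vecs):
    """Reduce an (unrolled) tree, given as nested tuples, to a single vector.

    Leaves are individual names; inner nodes are pairs of subtrees. The tree
    is consumed bottom-up until only one vector is left.
    """
    if isinstance(node, str):          # leaf: look up its feature vector
        return leaf_vecs[node]
    left, right = node
    return combine(reduce_tree(left, leaf_vecs),
                   reduce_tree(right, leaf_vecs))

vecs = {"a": [1.0, 0.0], "b": [0.0, 1.0], "c": [1.0, 1.0]}
root = (("a", "b"), "c")               # a small unrolled DAG
v = reduce_tree(root, vecs)            # -> [0.75, 0.75]
```

In a real recursive NN the same trained layer would be applied at every inner node, which is exactly why a single layer definition suffices for arbitrarily shaped input trees.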
Therefore, in this section, we review the most important concepts, from both areas, that are required for this work. A central idea in the field of KRR is the use of so-called ontologies. Parallelization takes place on the GPU. Discusses how to apply it to ontology reasoning. [−1,1]. Feature vectors. Logic, which allows for answering queries accurately by employing formal reasoning, but also comes with a number of issues, like difficulties with handling incomplete, conflicting, or uncertain data. (2017) proposed a new system for Arabic ontology learning using deep learning [15]. This work was supported by the Engineering and Physical Sciences Research Council (EPSRC), under the grants EP/J008346/1, EP/L012138/1, and EP/M025268/1, as well as the Alan Turing Institute, under the EPSRC grant EP/N510129/1. And with RDFox. Furthermore, we removed a total of 50,000 individuals during training, together with all of the predicates that these were involved in. In this work, we make use of the following recursive layer, which defines what is referred to as a relational tensor network (RTN). For learning the weights of our RTNs, we again used Python 3.4, along with TensorFlow 0.11.0. We start with the former. Whenever we talk about an RTN in the sequel, we shall assume that it is trained for these predictions. TensorFlow: Large-scale machine learning on heterogeneous systems. An interesting topic for future research is to explore ways to further improve our accuracy. However, from a practical point of view, materialization is usually more critical than import. A description of the real world, and the word »formal« emphasizes that such a description needs to be specified in a formal language. The underlying intuition, however, is quite different, and the term »relational« emphasizes the focus on relational datasets. Among these are two real-world datasets, a fraction of DBpedia (Bizer et al., 2009) and the Claros KB. Furthermore, we provide an experimental comparison of the suggested approach with one of the best logic-based ontology reasoners at present, RDFox. Ontology requirements.
Shumin Deng and Jeff Z. Pan. Deep Learning for Knowledge-Driven Ontology Stream Prediction. The total number of mini-batches that are considered in this step is a hyperparameter. Motik et al. The main contributions of this paper are briefly as follows: We present a novel method for SRL that is based on deep learning with recursive NNs. Note further that we do not store any actual inferences at this time, but rather compute them on the fly. (2014). Parallel Materialisation of Datalog Programs in Centralised, Main-Memory RDF Systems. The data are strongly imbalanced. Thereby, unary predicates are usually referred to as concepts or classes, and define certain sets of individuals. Namely, the one that is induced by the entire relational dataset, rather than a graph itself. Their characteristics are summarized in Table 1. Time consumption. Towards human-level artificial intelligence. To maintain comparability, we evaluated our approach on the same datasets that Motik et al. (2014) used for their experiments with RDFox. We use Equation 2 to incorporate these data into an individual’s embedding. First, NeTS realizes reasoning by means of vector manipulations on a GPU, which is of course much faster than the symbolic computations performed by RDFox. Section 3 introduces the suggested model in full detail, and Section 4 discusses how to apply it to ontology reasoning. IEEE International Conference on Neural Networks. Learning. Proceedings of the 21st International Conference on World Wide Web. Some individuals are involved in only a few relations, or maybe none at all, which is not a rare case in practice.
As mentioned before, materialization refers to the actual computation of inferences, and usually depends on the expressivity of the ontology. In Section 5, we evaluate our model on four datasets, and compare its performance with that of RDFox. All our experiments were conducted on a server with 24 CPUs of type Intel Xeon E5-2620 (6×2.40GHz), 64GB of RAM, and an Nvidia GPU. This work was also supported by the Oxford-DeepMind Graduate Scholarship, under grant GAF1617_OGSMF-DMCS_1036172. Unlike feed-forward networks, recursive NNs do not have a fixed network structure, but only define a single recursive layer. However, many of these issues can be dealt with effectively by using methods of ML. We believe that the combination of both fields, i.e., ML and KRR, is a step towards human-level artificial intelligence. Accordingly, the model has to contain two sets of parameters for such a relation, one for updating the source and one for updating the target. Deep Learning for Ontology Reasoning. 05/29/2017, by Patrick Hohenecker and Thomas Lukasiewicz. The actual learning procedure is then cast as a regularized minimization problem based on this formulation. As mentioned in the introduction already, our work lies at the intersection of two traditionally quite separated fields, namely ML and KRR. Traditionally, a database would compute all valid inferences that one may draw based on the provided data, and store them somehow in memory or on disk. Logic Negation with Spiking Neural P Systems. http://www.cs.ox.ac.uk/isg/tools/RDFox/2014/AAAI/ and Ahmed Fasih. (ESWC 2006). In order to assess the quality of NeTS, we have to evaluate it on two accounts. (2014). We use an RTN, and train the model to reconstruct the provided feature vectors. In this work, we present a novel approach to ontology reasoning that is based on deep learning rather than logic-based formal reasoning. Based on the fact that we hardly ever encounter ontologies with predicates of arity greater than two, we restrict ourselves to unary and binary predicates.
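A minimal sketch of casting learning as a regularized minimization problem follows. It is illustrative only: a logistic loss over hypothetical binary targets plus an L2 penalty on the weights, optimized by plain gradient descent; the paper's actual objective, data, and optimizer are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 64, 4
X = rng.standard_normal((n, d))                      # stand-in embeddings
t = (X @ rng.standard_normal(d) > 0).astype(float)   # stand-in binary targets
w = np.zeros(d)                                      # prediction weights
lam, lr = 0.01, 0.5                                  # regularization, step size

def objective(w):
    """Regularized minimization target: logistic loss + L2 penalty."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    eps = 1e-9
    nll = -np.mean(t * np.log(p + eps) + (1 - t) * np.log(1 - p + eps))
    return nll + lam * np.dot(w, w)

loss_before = objective(w)
for _ in range(100):                                 # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    grad = X.T @ (p - t) / n + 2 * lam * w
    w -= lr * grad
loss_after = objective(w)                            # strictly lower than start
```

The penalty term is what makes the problem "regularized": it keeps the weights small, trading a little training loss for better generalization.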
This step is comparable with what is usually referred to as materialization in the context of database systems. This differs from the tensor layer before, which is predicated on the fact that we want to update this very vector. These domains are research-intensive and still developing. However, we can use a recursive network, composed of tensor layers like the one denoted in Equation 2. Whether an individual is the source or the target of an instance of a relation. (2014) used for their experiments with RDFox. However, knowing that »associate professor« is equivalent to … Figure 1). (2015). It is intended to serve as a basic ontology that can be used to translate among the … data and knowledge. Notice further that we can view almost any relational dataset as an OKB with an ontology that does not specify anything except the classes and relations that exist in the data. V_R ∈ R^{k×d}. The test data consists of four Semantic Web KBs of different sizes and characteristics. Nickel et al. We made no use of this option, as it could introduce additional problems like vanishing gradients. Web Semantics: Science, Services and Agents on the World Wide Web. Richard Socher, Danqi Chen, Christopher D. Manning, and Andrew Y. Ng. The rest of this paper is organized as follows. For computing actual predictions from these embeddings, we can basically employ an ML model of our choice. The significance of this development is that it can potentially reduce the cost of generating named entity … Figure 1 provides an example of this setting.
However, these methods, which belong to the category of latent variable models, are based on the idea of factorizing a tensor that describes the structure of a relational dataset. This is because an average database is updated with new facts quite frequently, while it is imported only once in a while. Proceedings of the 14th International Semantic Web Conference (ISWC 2015), part II. Andreas Klöckner, Nicolas Pinto, Yunsup Lee, B. Catanzaro, Paul Ivanov, and Ahmed Fasih. Large-scale machine learning for OWL knowledge base systems. Ontology learning (OL) is used to (semi-)automatically extract whole ontologies from natural language text. Neural tensor networks for knowledge completion. Richard Socher, Danqi Chen, Christopher D. Manning, and Andrew Y. Ng. Ontology learning is usually split into the following eight tasks, which are not all necessarily applied in every ontology learning system. Deborah L. McGuinness, Daniele Nardi, and Peter F. Patel-Schneider. Volker Tresp, and Hans-Peter Kriegel. Then NeTS creates such embeddings as described above. This layer is used to reduce a provided tree step by step in a bottom-up fashion until only one single vector is left. Distributed word representations are derived from corpora of billions of words applying neural language models like CBOW and Skip-gram. Our model achieves a high reasoning quality while being up to two orders of magnitude faster. The actual learning procedure is then cast as a regularized minimization problem based on this formulation. Proceedings of the 21st International Conference on World Wide Web. Ramanathan Guha, Andrew McCallum, and Kevin Murphy, editors. Deep learning has been the subject of intensive study for the past d… The test system hosted Ubuntu Server 14.04 LTS (64-bit) with CUDA 8.0 and cuDNN 5.1 for GPGPU. A directed edge. We see that the RTN effectively learns embeddings that allow for computing the actual inferences. Rather than logic-based formal reasoning, we employ deep learning in order to draw conclusions. Humans need to intervene, at least initially. The Gene Ontology Consortium. The results of the experimental evaluation are described in the sections above, and we conclude with a summary. In order to assess the quality of NeTS, we review a few concepts that our approach is built upon, and discuss how to apply it to ontology reasoning.