Patent application number | Description | Published |
20110301942 | Method and Apparatus for Full Natural Language Parsing - A method and apparatus for discriminative natural language parsing use a deep convolutional neural network adapted for text together with structured tag inference in a graph. In the method and apparatus, a trained recursive convolutional graph transformer network, formed by the deep convolutional neural network and the graph, predicts “levels” of a parse tree based on the predictions of previous levels. | 12-08-2011 |
20120253792 | Sentiment Classification Based on Supervised Latent N-Gram Analysis - A method for sentiment classification of a text document using high-order n-grams utilizes a multilevel embedding strategy to project n-grams into a low-dimensional latent semantic space where the projection parameters are trained in a supervised fashion together with the sentiment classification task. Using, for example, a deep convolutional neural network, the semantic embedding of n-grams, the bag-of-occurrence representation of text from n-grams, and the classification function from each review to the sentiment class are learned jointly in one unified discriminative framework. | 10-04-2012 |
20120275690 | DISTRIBUTED ARTIFICIAL INTELLIGENCE SERVICES ON A CELL PHONE - A cell phone having distributed artificial intelligence services is provided. The cell phone includes a neural network for performing a first pass of object recognition on an image to identify objects of interest therein based on one or more criteria. The cell phone also includes a patch generator for deriving patches from the objects of interest. Each of the patches includes a portion of a respective one of the objects of interest. The cell phone additionally includes a transmitter for transmitting the patches to a server for further processing in place of the entirety of the image to reduce network traffic. | 11-01-2012 |
20120310627 | DOCUMENT CLASSIFICATION WITH WEIGHTED SUPERVISED N-GRAM EMBEDDING - Methods and systems for document classification include embedding n-grams from an input text in a latent space, embedding the input text in the latent space based on the embedded n-grams and weighting said n-grams according to spatial evidence of the respective n-grams in the input text, classifying the document along one or more axes, and adjusting weights used to weight the n-grams based on the output of the classifying step. | 12-06-2012 |
20140122388 | QUERY GENERATION AND TIME DIFFERENCE FEATURES FOR SUPERVISED SEMANTIC INDEXING - Semantic indexing methods and systems are disclosed. One such method is directed to training a semantic indexing model by employing an expanded query. The query can be expanded by merging the query with documents that are relevant to the query for purposes of compensating for a lack of training data. In accordance with another exemplary aspect, time difference features can be incorporated into a semantic indexing model to account for changes in query distributions over time. | 05-01-2014 |
20140236577 | Semantic Representations of Rare Words in a Neural Probabilistic Language Model - Systems and methods are disclosed for representing a word by extracting n dimensions for the word from an original language model; if the word has been previously processed, using the values previously chosen to define an (n+m)-dimensional vector, and otherwise randomly selecting m values to define the (n+m)-dimensional vector; and applying the (n+m)-dimensional vector to represent words that are not well represented in the language model. | 08-21-2014 |
20140236578 | Question-Answering by Recursive Parse Tree Descent - Systems and methods are disclosed to answer free-form questions using a recursive neural network (RNN) by defining feature representations at every node of the parse trees of questions and supporting sentences, applied recursively starting from token vectors produced by a neural probabilistic language model, and extracting answers to arbitrary natural language questions from the supporting sentences. | 08-21-2014 |
20140310218 | High-Order Semi-RBMs and Deep Gated Neural Networks for Feature Interaction Identification and Non-Linear Semantic Indexing - Systems and methods are disclosed for determining complex interactions among system inputs by using semi-Restricted Boltzmann Machines (RBMs) with factorized gated interactions of different orders to model complex interactions among system inputs; applying the semi-RBMs to train a deep neural network with high-order within-layer interactions for learning a distance metric and a feature mapping; and tuning the deep neural network by minimizing margin violations between positive query-document pairs and corresponding negative pairs. | 10-16-2014 |
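To make the rare-word scheme of application 20140236577 concrete, the following is a minimal Python sketch of its core step: keep a word's n values from the original language model, then append m cached values for previously processed words or m freshly drawn random values for new ones. The function name, the cache structure, and the default m are hypothetical illustrations, not part of the filing.

```python
import numpy as np

def extend_embedding(word, base_model, cache, m=2, rng=None):
    """Extend an n-dimensional word vector to (n+m) dimensions.

    base_model: dict-like map from word to its n-dimensional vector
                (a stand-in for the original language model).
    cache:      dict storing the m extra values chosen for each word,
                so a previously processed word reuses the same values.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    base = np.asarray(base_model[word], dtype=float)   # the original n dims
    if word not in cache:
        cache[word] = rng.standard_normal(m)           # new word: random m dims
    return np.concatenate([base, cache[word]])         # (n+m)-dimensional vector
```

Because the extra m values are cached, repeated calls for the same word yield an identical (n+m)-dimensional vector, matching the abstract's "values previously chosen" behavior.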