Patent application number | Description | Published |
--- | --- | --- |
20080221878 | FAST SEMANTIC EXTRACTION USING A NEURAL NETWORK ARCHITECTURE - A system and method for semantic extraction using a neural network architecture includes indexing each word in an input sentence into a dictionary and using these indices to map each word to a d-dimensional vector (the features of which are learned). In addition, position information for a word of interest (the word to be labeled) and a verb of interest (the verb for which the semantic role is being predicted), relative to each given word, is also used. These positions are integrated by a linear layer that is adapted to the input sentence. Several linear transformations and squashing functions are then applied to output class probabilities for semantic role labels. All the weights of the architecture are trained by backpropagation. | 09-11-2008 |
20090171868 | Method and Apparatus for Early Termination in Training of Support Vector Machines - Disclosed is a method for early termination in training support vector machines. A support vector machine is iteratively trained based on training examples using an objective function having primal and dual formulations. At each iteration, a termination threshold is calculated based on the current SVM solution. The termination threshold increases with the number of training examples. The termination threshold can be calculated based on the observed variance of the loss for the current SVM solution. The termination threshold is compared to a duality gap between primal and dual formulations at the current SVM solution. When the duality gap is less than the termination threshold, the training is terminated. | 07-02-2009 |
20090204556 | Large Scale Manifold Transduction - A method for training a learning machine for use in discriminative classification and regression includes randomly selecting, in a first computer process, an unclassified datapoint associated with a phenomenon of interest; determining, in a second computer process, a set of datapoints associated with the phenomenon of interest that is likely to be in the same class as the selected unclassified datapoint; predicting, in a third computer process, a class label for the selected unclassified datapoint; predicting, in a fourth computer process, a class label for the set of datapoints; combining the predicted class labels, in a fifth computer process, to predict a composite class label that describes the selected unclassified datapoint and the set of datapoints; and using the composite class label to adjust at least one parameter of the learning machine in a sixth computer process. | 08-13-2009 |
20090204558 | METHOD FOR TRAINING A LEARNING MACHINE HAVING A DEEP MULTI-LAYERED NETWORK WITH LABELED AND UNLABELED TRAINING DATA - A method for training a learning machine having a deep network with a plurality of layers, includes applying a regularizer to one or more of the layers of the deep network; training the regularizer with unlabeled data; and training the deep network with labeled data. Also, an apparatus for use in discriminative classification and regression, including an input device for inputting unlabeled and labeled data associated with a phenomenon of interest; a processor; and a memory communicating with the processor. The memory includes instructions executable by the processor for implementing a learning machine having a deep network structure and training the learning machine by applying a regularizer to one or more of the layers of the deep network; training the regularizer with unlabeled data; and training the deep network with labeled data. | 08-13-2009 |
20090204605 | Semantic Search Via Role Labeling - A method and system for searching for information contained in a database of documents each include an offline part and an online part. The offline part includes predicting, in a first computer process, semantic data for sentences of the documents contained in the database and storing this data in a database. The online part includes querying the database for information with a semantically-sensitive query, predicting, in a real time computer process, semantic data for the query, and determining, in a second computer process, a matching score against all the documents in the database, which incorporates the semantic data for the sentences and the query. | 08-13-2009 |
20090210218 | Deep Neural Networks and Methods for Using Same - A method and system for labeling a selected word of a sentence using a deep neural network includes, in one exemplary embodiment, determining an index term corresponding to each feature of the word, transforming the index term or terms of the word into a vector, and predicting a label for the word using the vector. The method and system, in another exemplary embodiment, includes determining, for each word in the sentence, an index term corresponding to each feature of the word, transforming the index term or terms of each word in the sentence into a vector, applying a convolution operation to the vector of the selected word and at least one of the vectors of the other words in the sentence, to transform the vectors into a matrix of vectors, each of the vectors in the matrix including a plurality of row values, constructing a single vector from the vectors in the matrix, and predicting a label for the selected word using the single vector. | 08-20-2009 |
20100179933 | SUPERVISED SEMANTIC INDEXING AND ITS EXTENSIONS - A system and method for determining a similarity between a document and a query includes building a weight vector for each of a plurality of documents in a corpus of documents stored in memory and building a weight vector for a query input into a document retrieval system. A weight matrix is generated which distinguishes between relevant documents and lower-ranked documents by comparing document/query tuples using a gradient step approach. A similarity score is determined between weight vectors of the query and documents in a corpus by determining a product of a document weight vector, a query weight vector and the weight matrix. | 07-15-2010 |
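Several of the abstracts above describe concrete numerical pipelines. As a rough illustration of the first (20080221878) — dictionary indexing, learned d-dimensional word vectors, relative-position features for the word and verb of interest, then linear transformations with squashing to produce class probabilities — the NumPy sketch below uses a made-up dictionary, invented layer sizes, and random (untrained) weights; it shows only the data flow, not the patented system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dictionary and embedding table (d = 4); in the filing
# these vector features are learned by backpropagation.
vocab = {"the": 0, "cat": 1, "sat": 2, "<unk>": 3}
d = 4
embeddings = rng.normal(size=(len(vocab), d))

def word_features(sentence, word_pos, verb_pos):
    """Index each word into the dictionary, map it to its d-dim vector,
    and append its position relative to the word and verb of interest."""
    feats = []
    for i, w in enumerate(sentence):
        idx = vocab.get(w, vocab["<unk>"])
        rel = np.array([i - word_pos, i - verb_pos], dtype=float)
        feats.append(np.concatenate([embeddings[idx], rel]))
    return np.stack(feats)

def predict_roles(x, n_labels=3):
    """Linear transforms + tanh squashing + softmax over role labels."""
    w1 = rng.normal(size=(x.shape[1], 8))
    w2 = rng.normal(size=(8, n_labels))
    h = np.tanh(x @ w1)                       # squashing function
    scores = h @ w2
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)   # class probabilities

sent = ["the", "cat", "sat"]
probs = predict_roles(word_features(sent, word_pos=1, verb_pos=2))
```

In the actual architecture all of these weights, including the embedding table, would be trained jointly by backpropagation rather than drawn at random.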
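The early-termination idea in 20090171868 — stop SVM training once the duality gap between the primal and dual objectives falls below a threshold that grows with the number of training examples — can be sketched with a small linear SVM trained by dual coordinate ascent. The data, the ascent scheme, and the simple `eps * n` threshold here are illustrative assumptions (the abstract's variance-based threshold is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)
n, dim, C = 200, 2, 1.0
X = rng.normal(size=(n, dim))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=n))   # noisy linear labels

def primal(w):
    # P(w) = 0.5*||w||^2 + C * sum of hinge losses
    hinge = np.maximum(0.0, 1.0 - y * (X @ w))
    return 0.5 * (w @ w) + C * hinge.sum()

def dual(alpha, w):
    # D(alpha) = sum(alpha) - 0.5*||w||^2, with w = sum_i alpha_i y_i x_i
    return alpha.sum() - 0.5 * (w @ w)

alpha = np.zeros(n)
w = np.zeros(dim)
sq = (X * X).sum(axis=1)       # per-example squared norms
threshold = 1e-3 * n           # termination threshold grows with n

for epoch in range(1000):
    for i in range(n):         # one dual coordinate ascent sweep
        g = y[i] * (X[i] @ w) - 1.0
        a_new = min(max(alpha[i] - g / sq[i], 0.0), C)
        w += (a_new - alpha[i]) * y[i] * X[i]
        alpha[i] = a_new
    gap = primal(w) - dual(alpha, w)   # duality gap at current solution
    if gap < threshold:                # early termination
        break
```

Because the dual iterate is always feasible, the gap is nonnegative and upper-bounds the primal suboptimality, so the check is a safe stopping criterion.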
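Application 20090210218 describes a convolution over the vectors of the selected word and its neighbors, producing a matrix of vectors that is then collapsed into a single vector for label prediction. A minimal sketch of that convolve-then-collapse step, with invented sizes, random weights, and a max over positions as the collapsing operation:

```python
import numpy as np

rng = np.random.default_rng(2)
d, win, h_dim = 4, 3, 6
sent_vecs = rng.normal(size=(5, d))   # one vector per word in the sentence

# Convolution: slide a window over the sentence, concatenate the window's
# word vectors, and apply a shared linear map at every position.
pad = np.zeros((win // 2, d))
padded = np.vstack([pad, sent_vecs, pad])
Wc = rng.normal(size=(win * d, h_dim))
conv = np.stack([np.concatenate(padded[i:i + win]) @ Wc
                 for i in range(len(sent_vecs))])   # matrix of vectors

single = conv.max(axis=0)   # collapse the matrix into a single vector
```

The `single` vector would then feed the label-prediction layers; taking a per-row maximum is one common way to build a fixed-size vector from a variable-length sentence.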
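For 20100179933, the similarity score is the product of a query weight vector, a learned weight matrix, and a document weight vector, with the matrix tuned by gradient steps that separate relevant from lower-ranked documents. The toy bag-of-words vectors, margin, and learning rate below are illustrative assumptions, not values from the filing:

```python
import numpy as np

rng = np.random.default_rng(3)
vocab_n = 6

def weight_vector(term_ids):
    # Toy bag-of-words weight vector over a 6-term vocabulary.
    v = np.zeros(vocab_n)
    for t in term_ids:
        v[t] += 1.0
    return v

W = np.eye(vocab_n) + 0.01 * rng.normal(size=(vocab_n, vocab_n))

def score(q, d):
    return q @ W @ d   # similarity = q^T W d

q = weight_vector([0, 1])        # query
d_pos = weight_vector([1, 2])    # relevant document
d_neg = weight_vector([3])       # lower-ranked document

lr, margin = 0.1, 2.0
before = score(q, d_pos) - score(q, d_neg)
loss = max(0.0, margin - before)           # margin ranking loss on the tuple
if loss > 0:
    # Gradient step: raise the relevant document's score, lower the other's.
    W += lr * (np.outer(q, d_pos) - np.outer(q, d_neg))
after = score(q, d_pos) - score(q, d_neg)
```

One such step widens the score gap on this document/query tuple; repeating it over many tuples is the gradient-step training the abstract describes.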