Patent application number | Description | Published |
20110053527 | UPLINK TRANSMIT DIVERSITY ENHANCEMENT - A method for uplink transmit diversity enhancement is disclosed. A subset of two or more potential uplink transmission configurations is determined, and that subset is evaluated. An uplink transmission configuration is selected based on the evaluation. Metrics of the selected uplink transmission configuration are determined. The selected uplink transmission configuration is applied for an extended use period. | 03-03-2011 |
20120027112 | Antenna Switching in a Closed Loop Transmit Diversity System - A method for closed loop transmit diversity is disclosed. Data from user equipment (UE) that was transmitted using multiple transmit antennas is received. A new transmit antenna is selected for the UE. A new cycle period for the UE is determined. A new transmit antenna index and a testing indication based on the new cycle period are sent to the UE. | 02-02-2012 |
20120028627 | Method and Apparatus for Biasing a Handoff Decision Based on a Backhaul Link - A decision whether to perform a handover between a relay and a base station may depend, at least in part, on a backhaul link between the relay and the donor base station serving the relay. That is, the relay may provide information relating to a characteristic of the backhaul link to the user equipment, and the user equipment may utilize this information to bias its measurements of signals transmitted from the relay and the base station in accordance with the characteristic of the backhaul link. In this way, if the backhaul link suffers such that it becomes a bottleneck, the handover decision between the relay and the base station is better informed than a decision based solely on the transmissions from the relay and from the base station. | 02-02-2012 |
20120113834 | SYSTEMS, APPARATUS AND METHODS TO FACILITATE EFFICIENT REPEATER USAGE - In accordance with aspects of the disclosure, a method, apparatus, and computer program product are provided for wireless communication. The method, apparatus, and computer program product may be provided for detecting a change in power of received signals and adjusting amplification of the received signals based on the detected change in power prior to transmitting the signals. | 05-10-2012 |
20120140695 | System and Method for Wireless Communication Diversity Retransmission Relay - A wireless communication system transmits a data packet from a cell to a terminal and to a diversity relay. The diversity relay stores the data packet and, based on a state of an acknowledgement signal from the terminal indicating a failure of the terminal to receive and decode the data packet, determines that a subsequent data packet will be a retransmission and, in response, retrieves and transmits the stored data packet to the terminal, cooperating with the retransmission of the data packet from the cell to the terminal. Optionally, the diversity relay transmits pilot signals to the terminal and, optionally, modifies channel quality reports sent from the terminal to the cell. | 06-07-2012 |
20120265716 | MACHINE LEARNING OF KNOWN OR UNKNOWN MOTION STATES WITH SENSOR FUSION - Example methods, apparatuses, or articles of manufacture are disclosed herein that may be utilized, in whole or in part, to facilitate or support one or more operations or techniques for machine learning of known or unknown motion states with sensor fusion. | 10-18-2012 |
20130024409 | METHOD AND APPARATUS OF ROBUST NEURAL TEMPORAL CODING, LEARNING AND CELL RECRUITMENTS FOR MEMORY USING OSCILLATION - Certain aspects of the present disclosure support a technique for robust neural temporal coding, learning and cell recruitments for memory using oscillations. Methods are proposed for distinguishing temporal patterns, in contrast to other “temporal pattern” methods that detect merely coincidence of inputs or order of inputs. Moreover, the present disclosure proposes practical methods that are biologically inspired and consistent but reduced in complexity, and that are capable of coding, decoding, recognizing, and learning temporal spike signal patterns. In this disclosure, extensions are proposed to a scalable temporal neural model for robustness, confidence or integrity coding, and recruitment of cells for efficient temporal pattern memory. | 01-24-2013 |
20130024410 | METHOD AND APPARATUS OF NEURONAL FIRING MODULATION VIA NOISE CONTROL - Certain aspects of the present disclosure support a technique for neuronal firing modulation via noise control. The response curve of a typical neuron with a threshold can transition from not firing to always firing with a very small change in the neuron's input, thus limiting the range of excitable input patterns for the neuron. By introducing local, regional, and global noise terms, the slope of the neuron's response curve can be reduced. This may enable a larger set of input spike patterns to be effective in causing the neuron to fire, i.e., the neuron can be responsive to a large range of input patterns instead of an inherently small set of patterns in a noiseless situation. | 01-24-2013 |
20130046716 | METHOD AND APPARATUS FOR NEURAL TEMPORAL CODING, LEARNING AND RECOGNITION - Certain aspects of the present disclosure support a technique for neural temporal coding, learning and recognition. A method of neural coding of large or long spatial-temporal patterns is also proposed. Further, generalized neural coding and learning with temporal and rate coding is disclosed in the present disclosure. | 02-21-2013 |
20130073501 | METHOD AND APPARATUS FOR STRUCTURAL DELAY PLASTICITY IN SPIKING NEURAL NETWORKS - Certain aspects of the present disclosure relate to a technique for adaptive structural delay plasticity applied in spiking neural networks. With the proposed method of structural delay plasticity, the requirement of modeling multiple synapses with different delays can be avoided. In this case, far fewer potential synapses should be modeled for learning. | 03-21-2013 |
20130103626 | METHOD AND APPARATUS FOR NEURAL LEARNING OF NATURAL MULTI-SPIKE TRAINS IN SPIKING NEURAL NETWORKS - Certain aspects of the present disclosure support a technique for neural learning of natural multi-spike trains in spiking neural networks. A synaptic weight can be adapted depending on a resource associated with the synapse, which can be depleted by weight change and can recover over time. In one aspect of the present disclosure, the weight adaptation may depend on a time since the last significant weight change. | 04-25-2013 |
20130117210 | METHODS AND APPARATUS FOR UNSUPERVISED NEURAL REPLAY, LEARNING REFINEMENT, ASSOCIATION AND MEMORY TRANSFER: NEURAL COMPONENT REPLAY - Certain aspects of the present disclosure support techniques for unsupervised neural replay, learning refinement, association and memory transfer. | 05-09-2013 |
20130117211 | METHODS AND APPARATUS FOR UNSUPERVISED NEURAL REPLAY, LEARNING REFINEMENT, ASSOCIATION AND MEMORY TRANSFER: NEURAL COMPONENT MEMORY TRANSFER - Certain aspects of the present disclosure support techniques for unsupervised neural replay, learning refinement, association and memory transfer. | 05-09-2013 |
20130117212 | METHODS AND APPARATUS FOR UNSUPERVISED NEURAL REPLAY, LEARNING REFINEMENT, ASSOCIATION AND MEMORY TRANSFER: NEURAL ASSOCIATIVE LEARNING, PATTERN COMPLETION, SEPARATION, GENERALIZATION AND HIERARCHICAL REPLAY - Certain aspects of the present disclosure support techniques for unsupervised neural replay, learning refinement, association and memory transfer. | 05-09-2013 |
20130117213 | METHODS AND APPARATUS FOR UNSUPERVISED NEURAL REPLAY, LEARNING REFINEMENT, ASSOCIATION AND MEMORY TRANSFER: STRUCTURAL PLASTICITY AND STRUCTURAL CONSTRAINT MODELING - Certain aspects of the present disclosure support techniques for unsupervised neural replay, learning refinement, association and memory transfer. | 05-09-2013 |
20130204814 | METHODS AND APPARATUS FOR SPIKING NEURAL COMPUTATION - Certain aspects of the present disclosure provide methods and apparatus for spiking neural computation of general linear systems. One example aspect is a neuron model that codes information in the relative timing between spikes. However, synaptic weights are unnecessary. In other words, a connection may either exist (significant synapse) or not (insignificant or non-existent synapse). Certain aspects of the present disclosure use binary-valued inputs and outputs and do not require post-synaptic filtering. However, certain aspects may involve modeling of connection delays (e.g., dendritic delays). A single neuron model may be used to compute any general linear transformation x=AX+BU to any arbitrary precision. This neuron model may also be capable of learning, such as learning input delays (e.g., corresponding to scaling values) to achieve a target output delay (or output value). Learning may also be used to determine a logical relation of causal inputs. | 08-08-2013 |
20130204819 | METHODS AND APPARATUS FOR SPIKING NEURAL COMPUTATION - Certain aspects of the present disclosure provide methods and apparatus for spiking neural computation of general linear systems. One example aspect is a neuron model that codes information in the relative timing between spikes. However, synaptic weights are unnecessary. In other words, a connection may either exist (significant synapse) or not (insignificant or non-existent synapse). Certain aspects of the present disclosure use binary-valued inputs and outputs and do not require post-synaptic filtering. However, certain aspects may involve modeling of connection delays (e.g., dendritic delays). A single neuron model may be used to compute any general linear transformation x=AX+BU to any arbitrary precision. This neuron model may also be capable of learning, such as learning input delays (e.g., corresponding to scaling values) to achieve a target output delay (or output value). Learning may also be used to determine a logical relation of causal inputs. | 08-08-2013 |
20130204820 | METHODS AND APPARATUS FOR SPIKING NEURAL COMPUTATION - Certain aspects of the present disclosure provide methods and apparatus for spiking neural computation of general linear systems. One example aspect is a neuron model that codes information in the relative timing between spikes. However, synaptic weights are unnecessary. In other words, a connection may either exist (significant synapse) or not (insignificant or non-existent synapse). Certain aspects of the present disclosure use binary-valued inputs and outputs and do not require post-synaptic filtering. However, certain aspects may involve modeling of connection delays (e.g., dendritic delays). A single neuron model may be used to compute any general linear transformation x=AX+BU to any arbitrary precision. This neuron model may also be capable of learning, such as learning input delays (e.g., corresponding to scaling values) to achieve a target output delay (or output value). Learning may also be used to determine a logical relation of causal inputs. | 08-08-2013 |
20130226851 | METHOD AND APPARATUS FOR MODELING NEURAL RESOURCE BASED SYNAPTIC PLASTICITY - Certain aspects of the present disclosure support a method of designing the resource model in hardware (or software) for learning spiking neural networks. The present disclosure comprises accounting for resources in a different domain (e.g., negative log lack-of-resources instead of availability of resources), modulating weight changes for multiple spike events upon a single trigger, and strategically advancing or retarding the resource replenishment or decay (respectively) to overcome the limitation of single event-based triggering. | 08-29-2013 |
20130304681 | METHOD AND APPARATUS FOR STRATEGIC SYNAPTIC FAILURE AND LEARNING IN SPIKING NEURAL NETWORKS - Certain aspects of the present disclosure support a technique for strategic synaptic failure and learning in spiking neural networks. A synaptic weight for a synaptic connection between a pre-synaptic neuron and a post-synaptic neuron can be first determined (e.g., according to a learning rule). Then, one or more failures of the synaptic connection can be determined based on a set of characteristics of the synaptic connection. The one or more failures can be omitted from computation of a neuronal behavior of the post-synaptic neuron. | 11-14-2013 |
20130325765 | CONTINUOUS TIME SPIKING NEURAL NETWORK EVENT-BASED SIMULATION - Certain aspects of the present disclosure provide methods and apparatus for a continuous-time neural network event-based simulation that includes a multi-dimensional multi-schedule architecture with ordered and unordered schedules and accelerators to provide for faster event sorting; and a formulation of modeling event operations as anticipating (the future) and advancing (update/jump ahead/catch up) rules or methods to provide a continuous-time neural network model. In this manner, the advantages include faster simulation of spiking neural networks (order(s) of magnitude); and a method for describing and modeling continuous time neurons, synapses, and general neural network behaviors. | 12-05-2013 |
20130325767 | DYNAMICAL EVENT NEURON AND SYNAPSE MODELS FOR LEARNING SPIKING NEURAL NETWORKS - Certain aspects of the present disclosure provide methods and apparatus for a continuous-time neural network event-based simulation. This model is flexible, has rich behavioral options, can be solved directly, and is low complexity. One example method generally includes determining a first state of a neuron model at or shortly after a first event, wherein the neuron model has a closed-form solution in continuous time; and determining a second state of the neuron model at or shortly after a second event, based on the first state. Dynamics of the first and second states are coupled to the neuron model only at the first and second events, respectively, and are decoupled between the first and second events. | 12-05-2013 |
20130339280 | LEARNING SPIKE TIMING PRECISION - Certain aspects of the present disclosure provide methods and apparatus for learning or determining delays between neuron models so that the uncertainty in input spike timing is accounted for in the margin of time between a delayed pre-synaptic input spike and a post-synaptic spike. In this manner, a neural network can correctly match patterns (even in the presence of significant jitter) and correctly distinguish between different noisy patterns. One example method generally includes determining an uncertainty associated with a first pre-synaptic spike time of a first neuron model for a pattern to be learned; and determining a delay based on the uncertainty, such that the delay added to a second pre-synaptic spike time of the first neuron model results in a causal margin of time between the delayed second pre-synaptic spike time and a post-synaptic spike time of a second neuron model. | 12-19-2013 |
20140074761 | DYNAMICAL EVENT NEURON AND SYNAPSE MODELS FOR LEARNING SPIKING NEURAL NETWORKS - Certain aspects of the present disclosure support a technique for updating the state of an artificial neuron. A first state of the artificial neuron can be first determined, wherein a neuron model for the artificial neuron has a closed-form solution in continuous time and wherein state dynamics of the neuron model are divided into two or more regimes. An operating regime for the artificial neuron can be determined based, at least in part, on the first state. The state of the artificial neuron can be updated based, at least in part, on the first state of the artificial neuron and the determined operating regime. | 03-13-2014 |
20140143193 | METHOD AND APPARATUS FOR DESIGNING EMERGENT MULTI-LAYER SPIKING NETWORKS - Certain aspects of the present disclosure support a technique for designing an emergent multi-layer spiking neural network. Parameters of the neural network can be first determined based upon one or more desired functional features of the neural network. Then, the one or more functional features can be developed towards the desired functional features as the determined parameters are further adapted, tuned and updated. The parameters can comprise at least one of time constants of neuron circuits of the neural network, time constants of synapse connections of the neural network, timing parameters of the neural network, or timing aspects of learning in the neural network. The one or more functional features can comprise at least one of feature detection in a layer of the multi-layer spiking neural network or saliency detection in another layer of the multi-layer spiking neural network. | 05-22-2014 |
20150046383 | BEHAVIORAL HOMEOSTASIS IN ARTIFICIAL NERVOUS SYSTEMS USING DYNAMICAL SPIKING NEURON MODELS - Methods and apparatus are provided for implementing behavioral homeostasis in artificial neurons that use a dynamical spiking neuron model. The homeostatic mechanism may be driven by neuron state, rather than by neuron spiking rate, and this mechanism may drive changes to the neuron temporal dynamics, rather than to contributions of input or weights. As a result, certain aspects of the present disclosure are a more natural fit with spiking neural networks and have many functional and computational advantages. One example method for implementing homeostasis of an artificial nervous system generally includes determining one or more state variables of a neuron model used by an artificial neuron, based at least in part on dynamics of the neuron model; determining one or more conditions based at least in part on the state variables; and adjusting the dynamics based at least in part on the conditions. | 02-12-2015 |
20150052094 | POST GHOST PLASTICITY - Methods and apparatus are provided for inferring and accounting for missing post-synaptic events (e.g., a post-synaptic spike that is not associated with any pre-synaptic spikes) at an artificial neuron and adjusting spike-timing dependent plasticity (STDP) accordingly. One example method generally includes receiving, at an artificial neuron, a plurality of pre-synaptic spikes associated with a synapse, tracking a plurality of post-synaptic spikes output by the artificial neuron, and determining at least one of the post-synaptic spikes is associated with none of the plurality of pre-synaptic spikes. According to certain aspects, inferring missing post-synaptic events may be accomplished by using a flag, counter, or other variable that is updated on post-synaptic firings. If this post-ghost variable changes between pre-synaptic-triggered adjustments, then the artificial nervous system can determine there was a missing post-synaptic pairing. | 02-19-2015 |
20150081607 | IMPLEMENTING STRUCTURAL PLASTICITY IN AN ARTIFICIAL NERVOUS SYSTEM - Methods and apparatus are provided for implementing structural plasticity in an artificial nervous system. One example method for altering a structure of an artificial nervous system generally includes determining a synapse in the artificial nervous system for reassignment, determining a first artificial neuron and a second artificial neuron for connecting via the synapse, and reassigning the synapse to connect the first artificial neuron with the second artificial neuron. Another example method for operating an artificial nervous system generally includes determining a synapse in the artificial nervous system for assignment; determining a first artificial neuron and a second artificial neuron for connecting via the synapse, wherein at least one of the synapse or the first and second artificial neurons are determined randomly or pseudo-randomly; and assigning the synapse to connect the first artificial neuron with the second artificial neuron. | 03-19-2015 |
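Several of the spiking-network entries above (e.g., 20130325765, 20130325767, and 20140074761) rest on the same core idea: between events, a neuron model's state has a closed-form solution in continuous time, so a simulator only needs to advance the state when an event actually arrives rather than stepping on a fixed clock. The sketch below illustrates that event-based update using a plain leaky integrate-and-fire model; the class name, parameter values, and reset rule are illustrative assumptions, not the specific models claimed in these filings.

```python
import math

class EventDrivenLIF:
    """Minimal event-driven leaky integrate-and-fire neuron (illustrative).

    Between input events the membrane potential follows the closed-form
    solution v(t) = v0 * exp(-(t - t0) / tau), so the state is advanced
    only when an event arrives instead of at every fixed time step.
    """

    def __init__(self, tau=20.0, threshold=1.0):
        self.tau = tau            # membrane time constant (ms)
        self.threshold = threshold
        self.v = 0.0              # membrane potential
        self.last_t = 0.0         # time of the last processed event (ms)

    def receive(self, t, weight):
        """Advance the state to time t, apply the input, report a spike."""
        # Closed-form decay from the previous event to this one.
        self.v *= math.exp(-(t - self.last_t) / self.tau)
        self.last_t = t
        self.v += weight
        if self.v >= self.threshold:
            self.v = 0.0          # reset after firing (assumed reset rule)
            return True
        return False

neuron = EventDrivenLIF(tau=20.0, threshold=1.0)
# Two closely spaced inputs integrate and fire; a later isolated input
# decays too much before the next one arrives at t=100 ms.
spikes = [neuron.receive(t, 0.6) for t in (0.0, 5.0, 100.0, 101.0)]
# spikes == [False, True, False, True]
```

Because the inter-event solution is exact, cost scales with the number of spikes rather than with simulated time, which is the source of the order-of-magnitude speedups the simulation filings above describe for sparse spike trains.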