Patent application number | Description | Published |
20080300787 | METHOD AND APPARATUS FOR ON-VEHICLE CALIBRATION AND ORIENTATION OF OBJECT-TRACKING SYSTEMS - A method for simultaneously tracking a plurality of objects and registering a plurality of object-locating sensors mounted on a vehicle relative to the vehicle is based upon collected sensor data, historical sensor registration data, historical object trajectories, and a weighted algorithm based upon geometric proximity to the vehicle and sensor data variance. | 12-04-2008 |
20080306666 | Method and apparatus for rear cross traffic collision avoidance - A rear cross-traffic collision avoidance system provides an action, such as a driver alert or automatic braking, in the event of a collision threat from cross-traffic. The system includes object detection sensors for detecting objects in the cross-traffic and vehicle sensors for detecting that the vehicle is turning. A controller uses the signals from the object detection sensors and the vehicle sensors to determine and identify object tracks that may interfere with the subject vehicle based on the vehicle turning. | 12-11-2008 |
20100017128 | Radar, Lidar and camera enhanced methods for vehicle dynamics estimation - A system for estimating vehicle dynamics, including vehicle position and velocity, using a stationary object. The system includes an object sensor that provides object signals of the stationary object. The system also includes in-vehicle sensors that provide signals representative of vehicle motion. The system also includes an association processor that receives the object signals, and provides object tracking through multiple frames of data. The system also includes a longitudinal state estimation processor that receives the object signals and the sensor signals, and provides a correction of the vehicle speed in a forward direction. The system also includes a lateral state estimation processor that receives the object signals and the sensor signals, and provides a correction of the vehicle speed in the lateral direction. | 01-21-2010 |
20100191391 | MULTIOBJECT FUSION MODULE FOR COLLISION PREPARATION SYSTEM - A method for controlling a vehicle operating during a dynamic vehicle event includes monitoring a first input image, monitoring a first tracked object within the first input image in a first tracking cycle, monitoring a second input image, monitoring a second tracked object within the second input image in a second tracking cycle, and determining a dissimilarity measure comparing the first tracked object to the second tracked object. The dissimilarity measure estimates whether the first tracked object and the second tracked object represent a single tracked object proximate to the vehicle. The method further includes associating the first tracked object and the second tracked object based upon the dissimilarity measure, and utilizing the associated objects in a collision preparation system to control operation of the vehicle. | 07-29-2010 |
20100191461 | SYSTEM AND METHOD OF LANE PATH ESTIMATION USING SENSOR FUSION - A method for estimating a projected path of travel for a vehicle on a road includes monitoring a plurality of sensor inputs, determining a road geometry in front of the vehicle based upon the monitored sensor inputs, determining a vehicle position in relation to the road geometry based upon the monitored sensor inputs, determining a plurality of particle points in front of the vehicle representing a potential path of travel from the road geometry and the vehicle position, and utilizing iteratively determined ones of the plurality of particle points to navigate the vehicle including omitting ones of the plurality of particle points passed by the vehicle. | 07-29-2010 |
20100198513 | Combined Vehicle-to-Vehicle Communication and Object Detection Sensing - A vehicle awareness system for monitoring remote vehicles relative to a host vehicle. The vehicle awareness system includes at least one object sensing device and a vehicle-to-vehicle communication device. A data collection module is provided for obtaining a sensor object data map and vehicle-to-vehicle object data map. A fusion module merges the sensor object data map and vehicle-to-vehicle object data map for generating a cumulative object data map. A tracking module estimates the relative position of the remote vehicles to the host vehicle. | 08-05-2010 |
20100245171 | METHOD AND APPARATUS FOR PRECISE RELATIVE POSITIONING IN MULTIPLE VEHICLES - A system and method for determining the position and velocity of a plurality of vehicles relative to a host vehicle using GPS data. The method includes building a graph of the vehicles that define weighted baselines between each of the vehicles and the host vehicle and each of the vehicles where the weighted baselines define a geometric dilution of precision between the vehicles. The method then determines the optimal baseline between the host vehicle and each of the other vehicles using the weighted baselines based on the lowest geometric dilution of precision. The method then computes the relative position and velocity between all of the vehicles and the host vehicle using the optimal baselines. | 09-30-2010 |
20100250132 | OPTIMAL CODING OF GPS MEASUREMENTS FOR PRECISE RELATIVE POSITIONING - A system for coding GPS measurements in a vehicle satellite communications system. The system includes a stand-alone position and velocity estimator that generates an estimated latent state vector from GPS measurements received at a first time and a prediction of a latent state vector from a previous time. The system also includes an observation prediction model that calculates an observation prediction from the estimated latent state vector. The system further includes a first differencer that provides a difference between the observation prediction and the GPS measurements, and a first Huffman encoder that provides a coded output from the difference. The system also includes a state prediction model that provides the predicted latent state vector and a second differencer that provides a difference between the estimated latent state vector and the predicted latent state vector. A second Huffman encoder encodes the difference from the second differencer. | 09-30-2010 |
20110096165 | AUTOMATIC CAMERA CALIBRATION USING GPS AND SOLAR TRACKING - A system and method for automatically calibrating the aiming direction of a camera onboard a vehicle. The method uses GPS and solar almanac data to estimate the location of the sun relative to the vehicle, and compares this to the solar position in an image from the camera. The comparison is performed with an ongoing recursive calculation which yields an improved estimate of the sun position relative to the vehicle, while simultaneously calibrating the aiming direction of the camera. The aiming calibration data is used to improve the accuracy of the camera's images as used for all of the camera's other functions. | 04-28-2011 |
20110098894 | AUTOMATIC CONTROLLER FOR POWERED RETRACTABLE SUN VISOR - A system and method for controlling a vehicle sun visor that uses commonly existing onboard sensors and systems to provide all necessary inputs. The method uses GPS and solar almanac data to determine the location of the sun relative to the vehicle, driver side view mirror angle data to determine the position of the driver's eyes within the vehicle, and an existing outside light metering device to determine whether the sun is actually shining on the vehicle, and uses this information to calculate the optimum position of the sun visor. If a forward-looking camera is available on the vehicle, camera images can be used to improve the estimate of the sun position relative to the vehicle. | 04-28-2011 |
20110133917 | CROSS TRAFFIC COLLISION ALERT SYSTEM - A cross traffic collision alert system is provided that includes an image-based detection device for capturing motion data of objects in a region of a direction of travel. A motion analysis module monitors the captured motion data. A salient region detection module identifies regions of interest within the captured motion data. A predictive object detection module is responsive to the outputs of the motion analysis module and the salient region detection module for generating at least one object candidate. Non-imaging object detection sensors sense a region for detecting the at least one object candidate. An object tracking module tracks the at least one object candidate and fuses the at least one object candidate transitioning between the regions. A threat assessment module determines a threat assessment of a potential collision between a tracked object candidate and the driven vehicle and determines whether a warning should be provided. | 06-09-2011 |
20110282581 | OBJECT AND VEHICLE DETECTION AND TRACKING USING 3-D LASER RANGEFINDER - A method and system for detecting and tracking objects near a vehicle using a three dimensional laser rangefinder. The method receives points from the laser rangefinder, where the points represent locations in space where the rangefinder senses that some object exists. An algorithm first estimates the location of a ground plane, based on a previous ground plane location, data from onboard sensors, and an eigenvector calculation applied to the point data. Next, a plan view occupancy map and elevation map are computed for stationary objects, based on point data in relation to the ground plane. Finally, dynamic objects are detected and tracked, sensing objects which are moving, such as other vehicles, pedestrians, and animals. The output of the method is a set of stationary and dynamic objects, including their shape, range, and velocity. This output can be used by downstream applications. | 11-17-2011 |
20120022739 | ROBUST VEHICULAR LATERAL CONTROL WITH FRONT AND REAR CAMERAS - A method and system for closed-loop vehicle lateral control, using image data from front and rear cameras and information about a leading vehicle's position as input. A host vehicle includes cameras at the front and rear, which can be used to detect lane boundaries such as curbs and lane stripes, among other purposes. The host vehicle also includes a digital map system and a system for sensing the location of a vehicle travelling ahead of the host vehicle. A control strategy is developed which steers the host vehicle to minimize the deviation of the host vehicle's path from a lane reference path, where the lane reference path is computed from the lane boundaries extracted from the front and rear camera images and from the other inputs. The control strategy employs feed-forward and feedback elements, and uses a Kalman filter to estimate the host vehicle's state variables. | 01-26-2012 |
20120062747 | LANE FUSION SYSTEM USING FORWARD-VIEW AND REAR-VIEW CAMERAS - A method and system for computing lane curvature and a host vehicle's position and orientation relative to lane boundaries, using image data from forward-view and rear-view cameras and vehicle dynamics sensors as input. A host vehicle includes cameras at the front and rear, which can be used to detect lane boundaries such as curbs and lane stripes, among other purposes. The host vehicle also includes vehicle dynamics sensors including vehicle speed and yaw rate. A method is developed which computes lane curvature and the host vehicle's position relative to a lane reference path, where the lane reference path is derived from the lane boundaries extracted from a fusion of the front and rear camera images. Mathematical models provided in the disclosure include a Kalman filter tracking routine and a particle filter tracking routine. | 03-15-2012 |
20120116662 | System and Method for Tracking Objects - A method for tracking objects by generating kinematic models corresponding to the objects using a computerized object-tracking system including a data collection module receiving scan data associated with an object and generating, using the scan data, a new frame F, associated with a new time t+dt and including new points X. A clustering module identifies a new group G of new points X | 05-10-2012 |
20120140061 | Multi-Object Appearance-Enhanced Fusion of Camera and Range Sensor Data - A transportation vehicle configured to track an object external to the vehicle. The vehicle includes a camera, a range sensor, and an on-board computer. The on-board computer includes a processor and a tangible, non-transitory, computer-readable medium comprising instructions that, when executed by the processor, cause the processor to perform select steps. The steps include determining that new-object data corresponding to the object is available based on input received from the sensor sub-system of the vehicle. The steps also include registering the new-object data and estimating an expected location and an expected appearance for the object according to a prediction algorithm to generate a predicted track corresponding to the object. The steps also include analyzing motion for the object including comparing the predicted track with any existing track associated with the object and stored in a database of the on-board computer. | 06-07-2012 |
20120150437 | Systems and Methods for Precise Sub-Lane Vehicle Positioning - A vehicle having an on-board computer, vehicle sensors, a satellite-positioning unit, and a database storing a lane-level map performs a method to determine a new pose of the vehicle using map matching. The method includes the on-board computer of the vehicle receiving new data from at least one of the vehicle sensors and collecting measurements from the vehicle sensors. The method also includes the on-board computer of the vehicle computing propagation of vehicle pose with respect to consecutive time instances and performing a curve-fitting process. The method further includes the on-board computer of the vehicle performing a sub-routine of updating at least one observation model based on results of the curve-fitting process, performing a tracking sub-routine including using a probability distribution to update the vehicle pose in terms of data particles, and performing a particle filtering sub-routine based on the data particles to compute the new vehicle pose. | 06-14-2012 |
20120221168 | REDUNDANT LANE SENSING SYSTEMS FOR FAULT-TOLERANT VEHICULAR LATERAL CONTROLLER - A vehicle lateral control system includes a lane marker module configured to determine a heading and displacement of a vehicle in response to images received from a secondary sensing device, a lane information fusion module configured to generate vehicle and lane information in response to data received from heterogeneous vehicle sensors and a lane controller configured to generate a collision free vehicle path in response to the vehicle and lane information from the lane information fusion module and an object map. | 08-30-2012 |
20120256764 | Method and Apparatus for an Object Detection System Using Two Modulated Light Sources - An object detection tracking system for a vehicle includes a first modulated light source for transmitting a first light signal with a first propagation pattern into a target region and a second modulated light source for transmitting a second light signal with a second propagation pattern into the target region. A sensor measures light reflected off objects in the target region. A controller demodulates the measured light to detect when the first or second light signals are received. The controller determines respective ranges for generating a first set of range data for the first light signal and a second set of range data for the second light signal. The controller maintains an object tracking list that includes a position of detected objects relative to the vehicle. The position of each object is determined using trilateration of the first and second sets of range data. | 10-11-2012 |
20120288138 | SYSTEM AND METHOD FOR TRAFFIC SIGNAL DETECTION - A method and system may determine a location of a vehicle, collect an image using a camera associated with the vehicle, analyze the image in conjunction with the location of the vehicle and/or previously collected information on the location of traffic signals or other objects (e.g., traffic signs), and using this analysis locate an image of a traffic signal within the collected image. The position (e.g., a geographic position) of the signal may be determined, and stored for later use. The identification of the signal may be used to provide an output such as the status of the signal (e.g., a green light). | 11-15-2012 |
20120290169 | NOVEL SENSOR ALIGNMENT PROCESS AND TOOLS FOR ACTIVE SAFETY VEHICLE APPLICATIONS - A method and tools for virtually aligning object detection sensors on a vehicle without having to physically adjust the sensors. A sensor misalignment condition is detected during normal driving of a host vehicle by comparing different sensor readings to each other. At a vehicle service facility, the host vehicle is placed in an alignment target fixture, and alignment of all object detection sensors is compared to ground truth to determine alignment calibration parameters. Alignment calibration can be further refined by driving the host vehicle in a controlled environment following a leading vehicle. Final alignment calibration parameters are authorized and stored in system memory, and applications which use object detection data henceforth adjust the sensor readings according to the calibration parameters. | 11-15-2012 |
20120310516 | SYSTEM AND METHOD FOR SENSOR BASED ENVIRONMENTAL MODEL CONSTRUCTION - A method and system may determine the estimated location of a vehicle, measure a location of an object relative to the vehicle using a sensor associated with the vehicle, and determine an updated vehicle location using the measured relative object location in conjunction with previously stored object locations. The estimated vehicle location may be determined using a system different from that associated with the sensor, for example a GPS system. The object location may be measured relative to a sub-map corresponding to the location of the vehicle. | 12-06-2012 |
20130030606 | AUTONOMOUS CONVOYING TECHNIQUE FOR VEHICLES - A method of autonomously convoying vehicles traveling along a route with a leader vehicle being in communication with at least one follower vehicle. The at least one follower vehicle receives a communication relating to a target offset position and route data. Tracking data is generated and derived from on-board sensing devices of the at least one follower vehicle that includes a traveled path of the leader vehicle sensed by the at least one follower vehicle. The route data is compared to the tracking data for identifying accuracy between the route data relative to the tracking data. An adjusted target offset position and a set of trajectory points that provides a trajectory path of travel from a current position of the at least one follower vehicle to the adjusted target offset position are determined based on the accuracy between the route data and the tracking data. | 01-31-2013 |
20130113935 | ACTIVE VISION SYSTEM WITH SUBLIMINALLY STEERED AND MODULATED LIGHTING - An active vision system includes an image capture device for capturing images in a region exterior of a vehicle and a headlamp control unit for controlling a vehicle headlamp beam for illuminating an environment exterior of a vehicle. The headlamp control unit is configured to selectively illuminate between making a path of travel of a road visible to a driver of the vehicle and making the region exterior of the vehicle visible for capturing images by the image capture device. The headlamp control unit utilizes a duty cycle for controlling a first cycle time that the headlamp beam illuminates the path of travel for making the road visible to the driver and for controlling a second cycle time that the headlamp beam makes the captured region visible for capturing images by the image capture device. | 05-09-2013 |
20130116854 | LANE TRACKING SYSTEM - A lane tracking system for tracking the position of a vehicle within a lane includes a camera configured to provide a video feed representative of a field of view and a video processor configured to receive the video feed from the camera and to generate latent video-based position data indicative of the position of the vehicle within the lane. The system further includes a vehicle motion sensor configured to generate vehicle motion data indicative of the motion of the vehicle, and a lane tracking processor. The lane tracking processor is configured to receive the video-based position data, updated at a first frequency; receive the sensed vehicle motion data, updated at a second frequency; estimate the position of the vehicle within the lane from the sensed vehicle motion data; and fuse the video-based position data with the estimate of the vehicle position within the lane using a Kalman filter. | 05-09-2013 |
20130211657 | COUPLED RANGE AND INTENSITY IMAGING FOR MOTION ESTIMATION - A method and system may operate a coupled range imager and intensity imager to obtain a sequence of at least two automatically registered images, each automatically registered image including concurrently acquired range data and intensity data with regard to an imaged scene. The registered range and intensity data is analyzed to yield an estimated motion of an element of an imaged object of the imaged scene. | 08-15-2013 |
20130236047 | ENHANCED DATA ASSOCIATION OF FUSION USING WEIGHTED BAYESIAN FILTERING - A method of associating targets from at least two object detection systems. An initial prior correspondence matrix is generated based on prior target data from a first object detection system and a second object detection system. Targets are identified in a first field-of-view of the first object detection system based on a current time step. Targets are identified in a second field-of-view of the second object detection system based on the current time step. The prior correspondence matrix is adjusted based on respective targets entering and leaving the respective fields-of-view. A posterior correspondence matrix is generated as a function of the adjusted prior correspondence matrix. A correspondence is identified in the posterior correspondence matrix between a respective target of the first object detection system and a respective target of the second object detection system. | 09-12-2013 |
20130242284 | METHODS AND APPARATUS OF FUSING RADAR/CAMERA OBJECT DATA AND LiDAR SCAN POINTS - A system and method for fusing the outputs from multiple LiDAR sensors on a vehicle that includes cueing the fusion process in response to an object being detected by a radar sensor and/or a vision system. The method includes providing object files for objects detected by the LiDAR sensors at a previous sample time, where the object files identify the position, orientation and velocity of the detected objects. The method projects object models in the object files from the previous sample time to provide predicted object models. The method also includes receiving a plurality of scan returns from objects detected in the field-of-view of the sensors at a current sample time and constructing a point cloud from the scan returns. The method then segments the scan points in the point cloud into predicted scan clusters, where each cluster identifies an object detected by the sensors. | 09-19-2013 |
20130242285 | METHOD FOR REGISTRATION OF RANGE IMAGES FROM MULTIPLE LiDARS - A system and method for registering range images from objects detected by multiple LiDAR sensors on a vehicle. The method includes aligning frames of data from at least two LiDAR sensors having overlapping fields-of-view in a sensor signal fusion operation so as to track objects detected by the sensors. The method defines a transformation value for at least one of the LiDAR sensors that identifies an orientation angle and position of the sensor and provides target scan points from the objects detected by the sensors where the target scan points for each sensor provide a separate target point map. The method projects the target point map from the at least one sensor to another one of the LiDAR sensors using a current transformation value to overlap the target scan points from the sensors. | 09-19-2013 |
20130246020 | BAYESIAN NETWORK TO TRACK OBJECTS USING SCAN POINTS USING MULTIPLE LiDAR SENSORS - A system and method for fusing the outputs from multiple LiDAR sensors on a vehicle. The method includes providing object files for objects detected by the sensors at a previous sample time, where the object files identify the position, orientation and velocity of the detected objects. The method also includes receiving a plurality of scan returns from objects detected in the field-of-view of the sensors at a current sample time and constructing a point cloud from the scan returns. The method then segments the scan points in the point cloud into predicted clusters, where each cluster initially identifies an object detected by the sensors. The method matches the predicted clusters with predicted object models generated from objects being tracked during the previous sample time. The method creates new object models, deletes dying object models and updates the object files based on the object models for the current sample time. | 09-19-2013 |
20140002650 | WIDE BASELINE BINOCULAR OBJECT MATCHING METHOD USING MINIMAL COST FLOW NETWORK | 01-02-2014 |
20140032108 | ANCHOR LANE SELECTION METHOD USING NAVIGATION INPUT IN ROAD CHANGE SCENARIOS - A method for selecting an anchor lane for tracking in a vehicle lane tracking system. Digital map data and leading vehicle trajectory data are used to predict lane information ahead of a vehicle. Left and right lane boundary markers are also detected, where available, using a vision system. The lane marker data from the vision system is combined with the lane information from the digital map data and the leading vehicle trajectory data in a lane curvature fusion calculation. The left and right lane marker data from the vision system are also evaluated for conditions such as parallelism and sudden jumps in offsets, while considering the presence of entrance or exit lanes as indicated by the map data. An anchor lane for tracking is selected based on the evaluation of the vision system data, using either the fused curvature calculation or the digital map and leading vehicle trajectory data. | 01-30-2014 |
20140035775 | FUSION OF OBSTACLE DETECTION USING RADAR AND CAMERA - A vehicle obstacle detection system includes an imaging system for capturing objects in a field of view and a radar device for sensing objects in a substantially same field of view. The substantially same field of view is partitioned into an occupancy grid having a plurality of observation cells. A fusion module receives radar data from the radar device and imaging data from the imaging system. The fusion module projects the occupancy grid and associated radar data onto the captured image. The fusion module extracts features from each corresponding cell using sensor data from the radar device and imaging data from the imaging system. A primary classifier determines whether an extracted feature extracted from a respective observation cell is an obstacle. | 02-06-2014 |
20140121932 | SYSTEMS AND METHODS FOR VEHICLE CRUISE CONTROL - An exemplary cruise control system includes an application that integrates curvature speed control, speed limit control, and adaptive speed control and generates an optimized speed profile that is used to control the vehicle. | 05-01-2014 |
20140142800 | METHOD AND APPARATUS FOR STATE OF HEALTH ESTIMATION OF OBJECT SENSING FUSION SYSTEM - A method and system for estimating the state of health of an object sensing fusion system. Target data from a vision system and a radar system, which are used by an object sensing fusion system, are also stored in a context queue. The context queue maintains the vision and radar target data for a sequence of many frames covering a sliding window of time. The target data from the context queue are used to compute matching scores, which are indicative of how well vision targets correlate with radar targets, and vice versa. The matching scores are computed within individual frames of vision and radar data, and across a sequence of multiple frames. The matching scores are used to assess the state of health of the object sensing fusion system. If the fusion system state of health is below a certain threshold, one or more faulty sensors are identified. | 05-22-2014 |
20140347207 | PROBABILISTIC TARGET SELECTION AND THREAT ASSESSMENT METHOD AND APPLICATION TO INTERSECTION COLLISION ALERT SYSTEM - A system and method for providing target selection and threat assessment for vehicle collision avoidance purposes that employ probability analysis of radar scan returns. The system determines a travel path of a host vehicle and provides a radar signal transmitted from a sensor on the host vehicle. The system receives multiple scan return points from detected objects, processes the scan return points to generate a distribution signal defining a contour of each detected object, and processes the scan return points to provide a position, a translation velocity and an angular velocity of each detected object. The system selects the objects that may enter the travel path of the host vehicle, and makes a threat assessment of those objects by comparing a number of scan return points that indicate that the object may enter the travel path to the number of the scan points that are received for that object. | 11-27-2014 |
20150081211 | SENSOR-AIDED VEHICLE POSITIONING SYSTEM - A method and system for localizing a vehicle in a digital map includes generating GPS coordinates of the vehicle on the traveled road and retrieving from a database a digital map of a region traveled by the vehicle based on the location of the GPS coordinates. The digital map includes a geographic mapping of a traveled road and registered roadside objects. The registered roadside objects are positionally identified in the digital map by longitudinal and lateral coordinates. Roadside objects in the region traveled are sensed by the vehicle. The sensed roadside objects are identified on the digital map. A vehicle position on the traveled road is determined utilizing coordinates of the sensed roadside objects identified in the digital map. The position of the vehicle is localized in the road as a function of the GPS coordinates and the determined vehicle position utilizing the coordinates of the sensed roadside objects. | 03-19-2015 |
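Several of the lane-tracking abstracts above (e.g., 20120062747 and 20130116854) describe fusing high-rate vehicle-motion data with lower-rate video-based lane measurements via a Kalman filter. The sketch below illustrates that predict/update pattern in one dimension; the constant-drift motion model, noise values, and update rates are illustrative assumptions, not details taken from the patent texts.

```python
# Minimal 1-D Kalman filter sketch for lane-position fusion: a vehicle-motion
# model predicts the lateral offset within the lane between video frames, and
# a video-based lane-offset measurement corrects the prediction when it arrives.
# State, noise values, and rates are assumptions made for this example.

def predict(x, P, lateral_rate, dt, Q=0.01):
    """Propagate lateral offset using sensed vehicle motion (process noise Q)."""
    x_new = x + lateral_rate * dt   # simple kinematic propagation
    P_new = P + Q * dt              # uncertainty grows between video frames
    return x_new, P_new

def update(x, P, z_video, R=0.04):
    """Fuse a video-based lane-offset measurement z_video (measurement noise R)."""
    K = P / (P + R)                 # Kalman gain
    x_new = x + K * (z_video - x)   # correct the prediction toward the measurement
    P_new = (1.0 - K) * P           # uncertainty shrinks after fusion
    return x_new, P_new

if __name__ == "__main__":
    x, P = 0.0, 1.0                 # initial lateral offset (m) and variance
    # motion sensors at 100 Hz, video at 10 Hz: ten predicts per update
    for _ in range(10):
        x, P = predict(x, P, lateral_rate=0.05, dt=0.01)
    x, P = update(x, P, z_video=0.12)
    print(round(x, 3), round(P, 3))
```

The same structure generalizes to the multi-state filters the abstracts describe (curvature, heading, offset) by replacing the scalars with vectors and matrices.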
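Application 20120256764 determines object position by trilateration from two sets of range data, one per modulated light source. With two sources at known positions, this reduces to intersecting two range circles; the sketch below shows that geometry. The source spacing, the forward-facing (larger-y) disambiguation, and all numbers are assumptions for the example, not taken from the patent text.

```python
import math

def trilaterate_2d(p1, r1, p2, r2):
    """Intersect range circles (p1, r1) and (p2, r2); return the forward solution."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)                      # baseline between the two sources
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return None                             # circles do not intersect
    a = (d * d + r1 * r1 - r2 * r2) / (2 * d)   # distance from p1 along the baseline
    h = math.sqrt(max(r1 * r1 - a * a, 0.0))    # offset perpendicular to the baseline
    mx, my = p1[0] + a * dx / d, p1[1] + a * dy / d
    cand1 = (mx + h * dy / d, my - h * dx / d)  # the two circle intersections
    cand2 = (mx - h * dy / d, my + h * dx / d)
    # keep the candidate ahead of the vehicle (larger y by convention here)
    return cand1 if cand1[1] >= cand2[1] else cand2

if __name__ == "__main__":
    # sources mounted 1 m apart on the bumper; object actually at (1.0, 2.0)
    print(trilaterate_2d((-0.5, 0.0), 2.5, (0.5, 0.0), math.sqrt(4.25)))
```

In practice each range would come from demodulating the corresponding light signal, and the two-candidate ambiguity is resolved by the sensor's field of view.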
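Application 20100250132 describes differencing GPS measurements against a model prediction and Huffman-encoding the residuals, which are small and repetitive and therefore compress well. The sketch below shows that predict-then-encode idea; the integer quantization, the example data, and the simple code construction are assumptions made for illustration, not the patent's actual coder.

```python
import heapq
from collections import Counter

def huffman_codes(symbols):
    """Build a prefix code giving shorter bit strings to more frequent symbols."""
    freq = Counter(symbols)
    if len(freq) == 1:                      # degenerate single-symbol alphabet
        return {next(iter(freq)): "0"}
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)                         # tie-breaker keeps heap tuples comparable
    while len(heap) > 1:
        w1, _, n1 = heapq.heappop(heap)
        w2, _, n2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in n1.items()}   # left subtree gets a 0 prefix
        merged.update({s: "1" + c for s, c in n2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

def encode_residuals(measured, predicted):
    """Difference measurements against predictions, then Huffman-code the residuals."""
    residuals = [m - p for m, p in zip(measured, predicted)]
    codes = huffman_codes(residuals)
    bits = "".join(codes[r] for r in residuals)
    return bits, codes

if __name__ == "__main__":
    measured  = [100, 102, 104, 105, 108, 110, 112, 113]
    predicted = [100, 102, 103, 105, 107, 110, 112, 112]  # residuals: mostly 0 or 1
    bits, codes = encode_residuals(measured, predicted)
    print(len(bits), codes)
```

Because the residual alphabet is tiny compared with raw GPS coordinates, the encoded stream is far shorter than transmitting the measurements directly, which is the point of the predict-then-encode structure.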