Patent application number | Description | Published |
20090043881 | CACHE EXPIRY IN MULTIPLE-SERVER ENVIRONMENT - In a multiple-server or multiple-process environment where each server has a local cache, data in one cache may become obsolete because of changes to a data store performed by another server or entity. The present invention provides techniques for efficiently notifying servers of cache expiry indications, which indicate that their local cache data is out of date and should not be used. A cache expiry manager receives cache expiry indications from servers, and sends cache expiry indications to servers in conjunction with client requests or in response to certain trigger events. The need for broadcasting cache expiry notifications to all servers is eliminated, as each server can be informed of cache expiry indications the next time it is given a client request that relates to the cache in question. Extraneous and duplicative cache expiry notifications are reduced or eliminated. | 02-12-2009
20090254707 | Partial Content Caching - A network device, known as an appliance, is located in the data path between a client and a server. The appliance includes a cache that is used to cache static and near-static cacheable content items. When a request is received, the appliance determines whether any portion of the requested data is available in its cache; if so, that portion can be serviced by the appliance. If any portion of the requested content is dynamic and cannot be serviced by the cache, the dynamic portion is generated by the appliance or obtained from another source such as an application server. The appliance integrates the content retrieved from the cache, the dynamically generated content, and the content received from other sources to generate a response to the original content request. The present invention thus implements partial content caching for content that has a cached portion and a portion to be dynamically generated. | 10-08-2009 |
20090276488 | Extensible, Asynchronous, Centralized Analysis And Optimization Of Server Responses To Client Requests - An optimizer for messaging systems learns the purpose and context of each message and combines that information with knowledge of the specific client that will be rendering the response, such as a specific HTML browser. Any of a number of optimization factors can be applied, singly or in any combination. Messages are analyzed offline until a configurable threshold is reached, indicating that enough data has been sampled to develop a valid instruction set, to be applied to the responses that a server generates for a particular request. Responses are parsed into tokens and instructions for each type of token are compiled into instruction sets that are stored. These instruction sets continue to be iteratively improved as more data is collected, until the configurable sampling threshold is reached. | 11-05-2009
20110295979 | Accelerating HTTP Responses In A Client/Server Environment - HTTP responses are accelerated to optimize performance and response time when presenting content in a client/server environment. An optimization technique allows a client to begin requesting additional resources and/or rendering content before the entire response is completed on the server. When a request is received at a proxy device, the proxy device transmits, to the client, links to external resources that will be needed to render the page. This allows the client to begin obtaining external resources before the remaining content is sent to the client, and even before the content has been fully composed by the server, thus improving response time and overall performance. | 12-01-2011 |
20120131141 | Asynchronous Context Data Messaging - An acceleration engine that stores context data is operatively disposed between a network and at least one web server. Incoming requests from the network are inspected by the acceleration engine and passed on to the web server. If the inspection reveals a reference to context data, the acceleration engine retrieves the context data and asynchronously sends the context data to the web server. The web server synchronizes the request and context data and generates a response message accordingly. The response message is forwarded back to the initiator of the request with or without interception by the acceleration engine. Should context data be generated during processing of the request, such context data is sent to the acceleration engine for updating purposes. | 05-24-2012
20120194519 | Image Optimization - Viewing of web pages is improved by prioritizing image rendering based on positioning of images within a web page. For example, for images that are likely to be initially viewable upon presentation of the web page (i.e., prior to scrolling), compressed proxy versions are made available so that the images can be transferred and rendered more quickly. These compressed proxy images are later replaced with better quality renderings of the same images. Fetching of images that are not initially visible can be deferred until after other, more important page resources are loaded. Prioritization of page loading in this manner helps to ensure that the page becomes operational earlier, resulting in improved perceived speed and responsiveness, and greater ease of navigation. | 08-02-2012 |
20120233318 | In-Line Network Device For Storing Application-Layer Data, Processing Instructions, And/Or Rule Sets - A network device located in the data path between a user computer and a server stores application data, processing instructions, and/or rule sets. By storing user computer-specific application data, processing instructions, and/or rule sets in the data path between the user computer and the server, the invention reduces the complexity of the web server, improves the handling of server failure, and increases the overall scalability and performance of the system. | 09-13-2012 |
20120303697 | OPTIMIZED RENDERING OF DYNAMIC CONTENT - In a client/server environment, rendering of web-based content is separated into two phases, so as to improve the applicability of HTML response caching. Static portion(s) of a web page are cached and delivered immediately in response to an HTTP request, concurrently with sending a request for a full page and extracting dynamic portion(s) therefrom. Dynamic portion(s) are filled in at the client as they become available. The system and method of the present invention enable optimization of the user experience to occur without requiring any recoding of the original page content. | 11-29-2012 |
20130073609 | MOBILE RESOURCE ACCELERATOR - In a client/server environment wherein resources are returned in response to client requests, a resource can be in-lined the first time it is requested, and then cached locally for use in connection with subsequent requests. When a user returns to the page for a subsequent visit, the resource requests are served from the local cache, thus avoiding the need for re-transmission with each response. According to various embodiments, the system and method of the present invention can be implemented in connection with delivery of any content in a client/server system, including for example HTML responses to requests for web pages. In at least one embodiment, the techniques described herein are tailored to mobile data network constraints; however, these techniques can be applied to any data network. | 03-21-2013 |
20130346483 | SYSTEM AND METHOD FOR CREATION, DISTRIBUTION, APPLICATION, AND MANAGEMENT OF SHARED COMPRESSION DICTIONARIES FOR USE IN SYMMETRIC HTTP NETWORKS - A method and system for creating, distributing, and managing shared compression dictionaries. The system comprises a compressor configured to generate at least one shared compression dictionary based on a context of data streams flowing between a client web browser and an origin server; an origin accelerator communicatively connected to the origin server and configured to encode an encountered data stream to a compressed form based on the at least one shared compression dictionary; and an edge accelerator communicatively connected to the client web browser and configured to decode the compressed form of the data stream to an uncompressed form using the at least one shared compression dictionary. | 12-26-2013
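The cache expiry scheme described in entry 20090043881 above (piggybacking expiry indications on the next routed client request instead of broadcasting them) can be sketched in a few lines. This is a minimal illustration, not the patented implementation; the class name `CacheExpiryManager`, the server identifiers, and the cache-key format are all illustrative assumptions.

```python
class CacheExpiryManager:
    """Records which cache keys each server has not yet learned are stale,
    and delivers those expiry indications alongside the next request routed
    to that server, rather than broadcasting to every server immediately."""

    def __init__(self, servers):
        # One pending-expiry set per server in the pool.
        self._pending = {server: set() for server in servers}

    def report_expiry(self, reporting_server, key):
        # A server changed the data store: mark `key` stale for every
        # *other* server; the reporter already knows its cache is current.
        for server, pending in self._pending.items():
            if server != reporting_server:
                pending.add(key)

    def route_request(self, server, request):
        # Hand the server its request together with any expiry indications
        # it has not yet seen, then clear them (no duplicate notifications).
        expiries = self._pending[server]
        self._pending[server] = set()
        return request, expiries
```

For example, if server "a" reports that key "user:42" changed, server "b" only learns of the expiry when the next request is routed to it, and the indication is delivered exactly once.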
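The symmetric shared-dictionary arrangement in entry 20130346483 (an origin-side accelerator compressing against a dictionary that the edge-side accelerator also holds) can be illustrated with Python's `zlib` preset-dictionary support. This is a sketch under simplifying assumptions: `SHARED_DICT`, `origin_encode`, and `edge_decode` are invented names, and a real dictionary would be derived from observed traffic rather than hard-coded.

```python
import zlib

# Hypothetical shared dictionary built from substrings commonly seen in
# the HTML flowing between origin and edge (contents are illustrative).
SHARED_DICT = b'<html><head><title></title></head><body><div class="content">'

def origin_encode(payload: bytes, zdict: bytes) -> bytes:
    """Origin accelerator: compress a data stream using the shared dictionary."""
    c = zlib.compressobj(zdict=zdict)
    return c.compress(payload) + c.flush()

def edge_decode(blob: bytes, zdict: bytes) -> bytes:
    """Edge accelerator: restore the stream using the same shared dictionary."""
    d = zlib.decompressobj(zdict=zdict)
    return d.decompress(blob) + d.flush()

page = (b'<html><head><title>Demo</title></head>'
        b'<body><div class="content">Hello</div></body></html>')
blob = origin_encode(page, SHARED_DICT)
assert edge_decode(blob, SHARED_DICT) == page
```

Both ends must hold the identical dictionary (hence the distribution and management machinery the abstract describes); with a dictionary matched to the traffic, short repetitive payloads compress far better than with dictionary-less DEFLATE.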
Patent application number | Description | Published |
20100320752 | JOINT ASSEMBLIES - A joint assembly for coupling a first conduit to a second conduit includes a first collar configured to be mounted on the first conduit and a second collar configured to be mounted on the second conduit. The second collar may have a central longitudinal axis that is generally aligned with a z-axis. An inner bushing may be coupled to the first collar and have a generally convex surface, and an outer bushing may be coupled to the second collar and have a generally concave surface. The generally convex surface and the generally concave surface may mate with one another and be configured for relative movement such that the first collar gimbals in an xy-plane. | 12-23-2010 |
20130064410 | FLEXIBLE STRAP LISTENING DEVICE AID - An apparatus comprising a flexible strap, having a portion thereof in communication with at least a portion of a headphone cord is provided. The flexible strap is secured to an article of clothing for wearing the headphone cord in combination with the article of clothing. | 03-14-2013 |
20130064411 | LISTENING AID DEVICE - An earphone plug for receiving sound signals, having a first headphone cord connected to the earphone plug having a first length and a second headphone cord connected to the earphone plug having a second length is provided. A first earphone body connected to the first headphone cord is provided for converting the sound signals to audible sounds. A second earphone body, connected to the second headphone cord, is provided for converting the sound signals to audible sounds. A flexible strap, having a portion thereof in communication with at least a portion of one of the first headphone cord and the second headphone cord, is also provided. | 03-14-2013
20130064412 | EAR PHONE LISTENING DEVICE - An earphone plug for receiving sound signals, having a first headphone cord connected to the earphone plug having a first length, and a second headphone cord connected to the earphone plug having a second length is provided. A first earphone body connected to the first headphone cord is provided for converting the sound signals to audible sounds. A second earphone body, connected to the second headphone cord, is provided for converting the sound signals to audible sounds. An attachment mechanism is coupled to at least a portion of the first headphone cord or the second headphone cord for securing the earphone to an article of clothing. | 03-14-2013
20150308350 | MULTI-AXIS ACCESSORY GEARBOXES OF MECHANICAL DRIVE SYSTEMS AND GAS TURBINE ENGINES INCLUDING THE SAME - Gas turbine engines including multi-axis accessory gearboxes of mechanical drive systems are provided. The gearbox comprises a housing, a drive shaft, bevel pinion and drive shaft bevel gears, and a side bevel gear set operable to directly drive at least one side accessory device. Housing is disposed about a towershaft operatively coupled to main engine shaft and operable to rotate about a first axis. Drive shaft is skewed to main shaft and operable to rotate about a second axis that intersects the first axis at a first angle. Bevel pinion gear is mounted on the towershaft. Drive shaft bevel gear is mounted on the drive shaft. Side bevel gear set comprises an input gear meshing with one or more side bevel gears each having a side bevel gear axis at a second angle to the first axis and independently positionable relative to input gear. Second angle is independent of other angles. | 10-29-2015