|Publication number||USRE46309 E1|
|Publication type||Grant|
|Application number||US 14/666,298|
|Publication date||Feb. 14, 2017|
|Filing date||Mar. 23, 2015|
|Priority date||Oct. 24, 2007|
|Inventors||Alexander Say Go, Vladimir Petter|
|Original Assignee||Sococo, Inc.|
This application is an application for reissue of U.S. Pat. No. 8,407,605 and relates to the following co-pending patent applications, the entirety of each of which is incorporated herein by reference:
U.S. patent application Ser. No. 12/354,709, filed Jan. 15, 2009;
U.S. patent application Ser. No. 61/120,372, filed Dec. 5, 2008;
U.S. patent application Ser. No. 61/042,714, filed Apr. 5, 2008;
U.S. patent application Ser. No. 11/923,629, filed Oct. 24, 2007; and
U.S. patent application Ser. No. 11/923,634, filed Oct. 24, 2007.
When face-to-face communications are not practical, people often rely on one or more technological solutions to meet their communications needs. These solutions typically are designed to simulate one or more aspects of face-to-face communications. Traditional telephony systems enable voice communications between callers. Instant messaging (also referred to as “chat”) communications systems enable users to communicate text messages in real time through instant message computer clients that are interconnected by an instant message server. Some instant messaging systems additionally allow users to be represented in a virtual environment by user-controllable graphic objects (referred to as “avatars”). Interactive virtual reality communication systems enable users in remote locations to communicate over multiple real-time channels and to interact with each other by manipulating their respective avatars in three-dimensional virtual spaces. Each of these modes of communications typically can handle some form of data sharing between the communicants.
A common form of data sharing is application sharing, which involves transmitting application data from one node (referred to as the “sharer node”) to one or more other nodes (referred to as “viewer nodes”). Application sharing has a variety of useful applications, including providing remote technical support, remote collaboration, and remote presentation of demonstrations, documents, and images. In some proposed systems, an application sharing program on the sharer node periodically collects drawing commands (e.g., GDI calls for drawing lines and curves, rendering fonts and handling palettes) from a chained display driver process on the sharer node, packages the drawing commands into an order packet, and sends the order packet to a respective counterpart application sharing program on each of the viewer nodes that accurately constructs the shared view of the sharer's display. Such an application sharing approach, however, requires each viewer node to render its own version of the shared application by passing the drawing commands in the order packet to a display process (e.g., the GDI interface provided by a Microsoft® Windows® operating system).
What are needed are improved application sharing apparatus and methods.
In one aspect, the invention features a method in accordance with which ones of the windows that are associated with a software process are identified in a screen layout on a local display of a sharer network node. On the sharer network node, a composite image of the identified windows as they are arranged in the screen layout and free of obscuration by any other windows in the screen layout is generated. The composite image is transmitted from the sharer network node to a viewer network node.
In one aspect, the invention features a method in accordance with which locally-generated commands that are derived from local input device events on a sharer network node are received. Remotely-generated commands that are derived from remote input device events on a remote viewer network node also are received. The received commands are processed into a command sequence. The command sequence is passed to a shared process executing on the sharer network node. In a screen layout on a local display of the sharer network node, one or more windows that are associated with the shared process are presented in accordance with the received command sequence. An image of the one or more windows as they are presented in the screen layout is generated. The image is transmitted from the sharer network node to the viewer network node.
The invention also features apparatus operable to implement the inventive methods described above and computer-readable media storing computer-readable instructions causing a computer to implement the inventive methods described above.
Other features and advantages of the invention will become apparent from the following description, including the drawings and the claims.
In the following description, like reference numbers are used to identify like elements. Furthermore, the drawings are intended to illustrate major features of exemplary embodiments in a diagrammatic manner. The drawings are not intended to depict every feature of actual embodiments nor relative dimensions of the depicted elements, and are not drawn to scale.
A “window” is a visual area of a display that typically includes a user interface. A window typically displays the output of a software process and typically enables a user to input commands or data for the software process. A window that has a parent is called a “child window.” A window that has no parent, or whose parent is the desktop window, is called a “top-level window.” A “desktop” is a system-defined window that paints the background of a graphical user interface (GUI) and serves as the base for all windows displayed by all software processes.
The term “window scraping” refers to a process of extracting data from the display output of another software process. The extraction process is performed by a “scraper” software process.
A “layering order” (also referred to as the “z-order”) is an ordering of overlapping two-dimensional objects, such as windows in a graphical user interface (GUI), along an axis (typically referred to as the “z-axis”) that is perpendicular to a display on which the GUI is presented.
“Compositing” is the combining of visual elements from separate sources into a single composite image (also referred to herein as a “frame”).
The term “obscuration” means the act or process of concealing or hiding by or as if by covering.
A “communicant” is a person who communicates or otherwise interacts with other persons over one or more network connections, where the communication or interaction may or may not occur in the context of a virtual area. A “user” is a communicant who is operating a particular network node that defines a particular perspective for descriptive purposes. A “sharer” is a communicant who is operating a sharer network node. A “viewer” is a communicant who is operating a viewer network node.
A “realtime contact” of a user is a communicant or other person who has communicated with the user via a realtime communications platform.
A “computer” is any machine, device, or apparatus that processes data according to computer-readable instructions that are stored on a computer-readable medium either temporarily or permanently. An “operating system” is a software component of a computer system that manages and coordinates the performance of tasks and the sharing of computing and hardware resources. A “software process” (also referred to as software, an application, computer software, a computer application, a program, and a computer program) is a set of instructions that a computer can interpret and execute to perform one or more specific tasks. A software process may have one or more “threads” of execution. A “shared software process” is a software process whose output is shared with a viewer network node. A “computer data file” is a block of information that durably stores data for use by a software application.
A “database” is an organized collection of records that are presented in a standardized format that can be searched by computers. A database may be stored on a single computer-readable data storage medium on a single computer or it may be distributed across multiple computer-readable data storage media on one or more computers.
A “data sink” (referred to herein simply as a “sink”) is any of a device (e.g., a computer), part of a device, or software that receives data.
A “data source” (referred to herein simply as a “source”) is any of a device (e.g., a computer), part of a device, or software that originates data.
A “network node” (also referred to simply as a “node”) is a junction or connection point in a communications network. Exemplary network nodes include, but are not limited to, a terminal, a computer, and a network switch. A “server” network node is a host computer on a network that responds to requests for information or service. A “client” network node is a computer on a network that requests information or service from a server. A “network connection” is a link between two communicating network nodes. The term “local network node” refers to a network node that currently is the primary subject of discussion. The term “remote network node” refers to a network node that is connected to a local network node by a network communications link. A “connection handle” is a pointer or identifier (e.g., a uniform resource identifier (URI)) that can be used to establish a network connection with a communicant, resource, or service on a network node. A “sharer network node” is a network node that is sharing content with another network node, which is referred to as a “viewer network node.” A “network communication” can include any type of information (e.g., text, voice, audio, video, electronic mail message, data file, motion data stream, and data packet) that is transmitted or otherwise conveyed from one network node to another network node over a network connection.
A “communicant interaction” is any type of direct or indirect action or influence between a communicant and another network entity, which may include for example another communicant, a virtual area, or a network service. Exemplary types of communicant interactions include communicants communicating with each other in realtime, a communicant entering a virtual area, and a communicant requesting access to a resource from a network service.
“Presence” refers to the ability and willingness of a networked entity (e.g., a communicant, service, or device) to communicate, where such willingness affects the ability to detect and obtain information about the state of the entity on a network and the ability to connect to the entity.
A “realtime data stream” is data that is structured and processed in a continuous flow and is designed to be received with no delay or only imperceptible delay. Realtime data streams include digital representations of voice, video, user movements, facial expressions and other physical phenomena, as well as data within the computing environment that may benefit from rapid transmission, rapid execution, or both rapid transmission and rapid execution, including for example, avatar movement instructions, text chat, realtime data feeds (e.g., sensor data, machine control instructions, transaction streams and stock quote information feeds), and file transfers.
A “virtual area” (also referred to as an “area” or a “place”) is a representation of a computer-managed space or scene. Virtual areas typically are one-dimensional, two-dimensional, or three-dimensional representations; although in some embodiments a virtual area may correspond to a single point. Oftentimes, a virtual area is designed to simulate a physical, real-world space. For example, using a traditional computer monitor, a virtual area may be visualized as a two-dimensional graphic of a three-dimensional computer-generated space. However, virtual areas do not require an associated visualization to implement switching rules. A virtual area typically refers to an instance of a virtual area schema, where the schema defines the structure and contents of a virtual area in terms of variables and the instance defines the structure and contents of a virtual area in terms of values that have been resolved from a particular context.
A “virtual area application” (also referred to as a “virtual area specification”) is a description of a virtual area that is used in creating a virtual environment. The virtual area application typically includes definitions of geometry, physics, and realtime switching rules that are associated with one or more zones of the virtual area.
A “virtual environment” is a representation of a computer-managed space that includes at least one virtual area and supports realtime communications between communicants.
A “zone” is a region of a virtual area that is associated with at least one switching rule or governance rule. A “switching rule” is an instruction that specifies a connection or disconnection of one or more realtime data sources and one or more realtime data sinks subject to one or more conditions precedent. A switching rule controls switching (e.g., routing, connecting, and disconnecting) of realtime data streams between network nodes communicating in the context of a virtual area. A governance rule controls a communicant's access to a resource (e.g., an area, a region of an area, or the contents of that area or region), the scope of that access, and follow-on consequences of that access (e.g., a requirement that audit records relating to that access must be recorded). A “renderable zone” is a zone that is associated with a respective visualization.
A “position” in a virtual area refers to a location of a point or an area or a volume in the virtual area. A point typically is represented by a single set of one-dimensional, two-dimensional, or three-dimensional coordinates (e.g., Cartesian coordinates, polar coordinates, or spherical coordinates) that define a spot in the virtual area. A coordinate can be defined as any single or plurality of numbers that establish location. An area typically is represented by the three-dimensional coordinates of three or more coplanar vertices that define a boundary of a closed two-dimensional shape in the virtual area. A volume typically is represented by the three-dimensional coordinates of four or more non-coplanar vertices that define a closed boundary of a three-dimensional shape in the virtual area.
A “spatial state” is an attribute that describes where a user has presence in a virtual area. The spatial state attribute typically has a respective value (e.g., a zone_ID value) for each of the zones in which the user has presence.
A “placemark” is a stored reference (e.g., a hyperlink) to a location in a virtual area. A placemark typically can be selected to present a view of the associated location in the virtual area to a user. The verb “placemark” means the act or operation of creating a placemark.
In the context of a virtual area, an “object” is any type of discrete element in a virtual area that may be usefully treated separately from the geometry of the virtual area. Exemplary objects include doors, portals, windows, view screens, and speakerphones. An object typically has attributes or properties that are separate and distinct from the attributes and properties of the virtual area. An “avatar” is an object that represents a communicant in a virtual area.
The term “double-click” refers to the act or operation of entering or inputting an execution command (e.g., double-clicking the left computer mouse button or by single-clicking a user interface button associated with an execute command, e.g., enter zone or view object). The term “shift-click” refers to the act or operation of entering or inputting a selection command (e.g., clicking the left computer mouse button) while the Shift key of an alphanumeric input device is activated. The term “shift-double-click” refers to the act or operation of entering or inputting an execution command while the Shift key of an alphanumeric input device is activated.
As used herein, the term “includes” means includes but not limited to; the term “including” means including but not limited to. The term “based on” means based at least in part on.
The embodiments that are described herein enable application sharing with high fidelity, realtime performance, viewer immersion, and privacy protection. In some embodiments, the screen content that is associated with each thread of a software process can be determined and composited into a respective composite image (or frame) that is free of other window content. The contents of the windows that are associated with one or more software application threads on a network node can be broadcast to other network nodes without risk of obscuration by the screen content (e.g., windows containing application content, messages, or dialog boxes) that might be generated by other software processes, thereby preventing corruption of shared window content by overlapping screen content that sometimes is generated by processes that are outside of the user's immediate control. This feature avoids the need for sharers to interrupt a presentation in order to remove obscuring screen content, thereby creating a more immersive collaboration experience for viewers of the shared window content. In addition, sharers don't have to worry that private information inadvertently will be shared along with the intended screen content, thereby maintaining the sharer's privacy.
Some embodiments also enable multichannel application sharing in which two or more communicants share applications and screen content with each other at the same time. These embodiments typically include an interface that allows each receiver to distinguish one shared window from another.
Embodiments of the communications application may implement one or more of the following application sharing modes:
In accordance with the method of
The communications application 26 generates a composite image of the identified windows as they are arranged in the screen layout and free of obscuration by any other windows in the screen layout (
In addition to preventing obscuration by other windows, some embodiments also prevent obscuration of the selected windows as a result of the windows being partially or completely off-screen (e.g., outside the visible desktop window that contains the screen layout). For example, a respective image of each of the identified windows is stored in a respective memory buffer, and the process of generating the composite image involves retrieving each of the images from the respective memory buffers and compositing the retrieved images into the composite image. For example, in some exemplary embodiments, each of the windows is a layered window whose screen data is stored in a respective memory buffer through a programmatic call to the Microsoft® Win32 application programming interface (API), which is available in Microsoft® Windows® operating systems versions 2000 and later. These operating systems provide an extended window style that is invoked by setting the WS_EX_LAYERED window style bit. The WS_EX_LAYERED style bit that is associated with a particular window may be set by the shared software process at window creation time (e.g., via the CreateWindowEx API call) or it may be set by the communications application 26 after creation time (e.g., via SetWindowLong API call with GWL_EXSTYLE). With the WS_EX_LAYERED window style bit set for a window, the operating system redirects the drawing of the window into an off-screen bitmap and buffer, which can then be accessed by the communications application 26 for generating the composite image. Similar layered windows functionality is available from other operating systems (e.g., X-Windows on a UNIX based operating system).
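The compositing step described above can be illustrated with a minimal sketch. This is not the patent's actual implementation; it assumes each window's pixels have already been captured into an off-screen buffer (e.g., via the layered-window redirection described above) along with the window's screen position, and it paints the buffers bottom-to-top in z-order so that only windows belonging to the shared process appear in the frame:

```python
# Illustrative sketch: composite per-window off-screen buffers into a single
# frame, free of obscuration by other processes' windows. Buffers are plain
# 2D lists of pixel values; real implementations would use bitmap objects.

def composite(windows, width, height, background=0):
    """windows: list of (x, y, buffer) tuples ordered bottom-to-top in
    z-order, where buffer is a 2D list of pixels captured off-screen."""
    frame = [[background] * width for _ in range(height)]
    for x, y, buf in windows:
        for row, scanline in enumerate(buf):
            for col, pixel in enumerate(scanline):
                px, py = x + col, y + row
                if 0 <= px < width and 0 <= py < height:  # clip to frame
                    frame[py][px] = pixel
    return frame
```

Because each window's content comes from its own off-screen buffer rather than from the visible desktop, an overlapping window from an unrelated process can never leak into the composite.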
After the composite image has been generated, the communications application 26 transmits the composite image to the viewer one of the first and second network nodes (i.e., the one of the first and second network nodes that receives the composite image) 12, 14 (
B. Application Sharing Embodiments
In some embodiments, application sharing is initiated after a sharer network node has published one or more applications or documents that are available to share and at least one viewer has subscribed to at least one of the published applications or documents. In some embodiments, the sharer can publish a shared application or document to a viewscreen object that is associated with a virtual area and the viewer can subscribe to the shared content by activating the viewscreen object in the virtual area (e.g., by double-clicking on the viewscreen object with a user input device).
The viewer typically is granted one of two types of access to the shared content: view access, which allows the viewer to only passively view the shared content; and control access, which allows the viewer to view, control, edit, and manipulate the shared content. The type of access that is granted to the viewer can be set by the sharer or by one or more governance rules that are associated with the context in which the sharing takes place (e.g., a governance rule that is associated with a zone of a virtual area, as described below in section IV).
The shared content typically is streamed from the sharer network node to the viewer network node in the form of streaming bitmaps of the windows on the sharer's display that are associated with the shared application or document. The bitmap of each window can be streamed separately or already composited. The bitmaps typically are compressed before being streamed. If the viewer has only view access, then the viewer can only passively view images of the shared windows on the sharer's display. If the viewer has control access, then the viewer network node can transmit remote control commands that are generated by a user input device (e.g., a keyboard, a computer mouse, a touchpad, and a touch screen) to the sharer network node for controlling, editing, and manipulating the shared content on the sharer's network node.
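The compress-before-streaming step can be sketched as follows. The function names are illustrative, and zlib stands in for whatever codec an actual implementation would use; screen content is dominated by flat-colored regions, so lossless compression of this kind typically shrinks the payload substantially:

```python
import zlib

def prepare_sample(bitmap_bytes):
    """Sharer side: compress a captured window bitmap before streaming
    it to the subscribing viewer network nodes."""
    return zlib.compress(bitmap_bytes, 6)

def receive_sample(payload):
    """Viewer side: decompress a streamed sample back into bitmap bytes
    for presentation on the viewer's display."""
    return zlib.decompress(payload)
```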
2. Application Sharing Service
In some embodiments, the application sharing functionality of the communications application 26 is provided by a Scraper module, which is a plug-in for an Application Sharing Service that implements a platform specific part of the application sharing. This embodiment implements the application sharing mode in which all windows created by a shared application automatically are shared with subscribing network nodes. This section describes an exemplary embodiment of the Scraper module and the Application Sharing Service that are implemented in a Microsoft® Windows® application environment that provides layered windows functionality.
(i) The Methods Start and Stop
Before the Application Sharing Service calls any method on the Scraper module it calls the start method and, during shutdown, it calls the stop method. In the illustrated embodiment no calls can be made before the start method is called or after the stop method is called.
In the start method, the Scraper module starts the thread that listens for WinEvents. The Scraper module listens for WinEvents to get notifications when windows and menus are created/destroyed and shown/hidden.
In the stop method, the Scraper module stops all the application monitors and then shuts down the thread that is listening for WinEvents.
When a WinEvent notification is received, the Scraper module gets the thread identifier (ID) and the process ID for the window. The Scraper module then looks up a monitor by the process ID and notifies the application monitor about the event. In response to a notification that a window was destroyed, the Scraper module notifies all of the application monitors of the event since the process and thread IDs are not available.
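The dispatch logic just described can be sketched as follows. The class and method names are hypothetical, not the Scraper module's actual interface; the point is the routing rule: ordinary events are delivered to the one monitor keyed by the window's process ID, while destroy events, for which the process and thread IDs are unavailable, are broadcast to every monitor:

```python
# Illustrative sketch of the Scraper module's WinEvent dispatch. Names
# are hypothetical; monitors need only expose a notify(event) method.

EVENT_DESTROYED = "destroyed"

class EventDispatcher:
    def __init__(self):
        self.monitors = {}  # process ID -> application monitor

    def register(self, pid, monitor):
        self.monitors[pid] = monitor

    def on_win_event(self, event, pid=None):
        if event == EVENT_DESTROYED:
            # Process/thread IDs are not available for destroyed windows,
            # so notify all application monitors of the event.
            for monitor in self.monitors.values():
                monitor.notify(event)
        elif pid in self.monitors:
            self.monitors[pid].notify(event)
```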
(ii) The Method Get_Sharable_Application
As shown in
(iii) The Method Start_Share_Application
(iv) The Method Stop_Share_Application
(v) The Method Get_Shared_Applications
This method returns a list of applications being shared.
(vi) The Method Subscribe
(vii) The Method Unsubscribe
c. Window Scraping
Each application monitor has a thread that wakes up on a periodic basis and performs window scraping.
In accordance with the method of
The scraper module determines a bounding rectangle that encompasses all the windows that are associated with the shared software process (
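The bounding-rectangle step is a simple union of the window rectangles. A minimal sketch, assuming each rectangle is given as (left, top, right, bottom) screen coordinates:

```python
def bounding_rectangle(rects):
    """Return the smallest rectangle (left, top, right, bottom) that
    encompasses every window rectangle of the shared software process."""
    lefts, tops, rights, bottoms = zip(*rects)
    return (min(lefts), min(tops), max(rights), max(bottoms))
```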
For each of the windows that are associated with the shared software process, the Scraper module calls a scrape function (
allows individual window compressed bitmaps to be sent, instead of just the compressed composited bitmap.
After completing the scraping process, the Scraper module creates a device independent bitmap of the composite image (
The Scraper module starts with an initial list of all top-level windows that are associated with the shared software process (
The Scraper module determines a z-order of all the windows that currently are associated with the shared software process (
The Scraper module sorts the initial list of top-level windows that are associated with the software process according to the determined z-order (
The Scraper module appends to the sorted list any of the windows in the initial list of top-level windows that are not included in the sorted list (
The Scraper module replaces the initial list with the sorted list of top-level windows (
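The list-maintenance steps above (sort by the determined z-order, append any windows missing from the sorted list, then replace the initial list) can be sketched in one function. This is an illustrative reading of the procedure, not the Scraper module's actual code; windows absent from the current z-order (e.g., ones not presently mapped on screen) keep their previous relative order at the end of the list:

```python
def update_window_list(initial, z_order):
    """Sort the shared process's top-level windows by the current z-order,
    append windows not found in the z-order, and return the new list that
    replaces the initial one for the next scraping pass."""
    rank = {w: i for i, w in enumerate(z_order)}
    sorted_windows = sorted((w for w in initial if w in rank),
                            key=rank.get)
    sorted_windows += [w for w in initial if w not in rank]
    return sorted_windows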
d. Remote Access
If the viewers have only view access, then the Application Sharing Service on the sharer network node only transmits the composite images of the shared window content (in the form of samples) to the subscribing ones of the viewer network nodes. The viewers on the viewer network nodes can only passively view the composite images of the shared windows on the sharer's display.
If the viewers have control access, on the other hand, then the Application Sharing Service on the sharer network node transmits the composite images of the shared window content (in the form of samples) to the subscribing ones of the viewer network nodes. In addition, the Scraper module combines commands that are received from the viewer network nodes with the commands that are generated by the sharer on the sharer network node, and passes the combined set of commands to the shared application. This allows the viewers to control, edit, and manipulate the shared application on the sharer network node. The commands typically are derived from events that are generated by one or more user input devices (e.g., a keyboard, a computer mouse, a touchpad, and a touch screen) on the viewer and sharer network nodes.
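The command-combining step can be sketched as a merge of timestamped input streams. This is an illustrative reduction of the behavior described above, assuming each stream delivers its commands in timestamp order as (timestamp, command) pairs:

```python
import heapq

def merge_commands(sharer_cmds, viewer_cmds):
    """Interleave the sharer's locally generated input commands with the
    commands received from viewer network nodes into a single sequence,
    ordered by timestamp, before passing them to the shared application.
    Each input stream must already be sorted by timestamp."""
    return [cmd for _, cmd in heapq.merge(sharer_cmds, viewer_cmds)]
```

Ordering the combined sequence this way lets the shared application see one coherent stream of events, as if a single user were operating it.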
The display processes 620, 628 provide the display facilities of the sharer network node 610 and the viewer network node 612, respectively. The display facilities control the writing of visual contents on the sharer and viewer displays 618, 626. In some embodiments, the display facilities include a graphic device interface (e.g., the GDI available in Microsoft® Windows® application environments) that provides functions that can be called by software processes in order to present visual content on the displays 618, 626.
The network layers 624, 632 provide the networking facilities of the sharer network node 610 and the viewer network node 612, respectively. The network facilities include, for example, networking communications protocol stacks and networking hardware that perform processes associated with sending and receiving information over the network 18.
The communications applications 622, 630 respectively provide various communications facilities (including application sharing facilities) to the sharer network node 610 and the viewer network node 612. In the illustrated embodiment, the communications application 622 on the sharer network node 610 generates a composite image 634 of the shared window content 614 on the sharer's display, transmits the composite image 634 over the network 18 to the viewer network node 612 for presentation on the viewer's display 626, and grants the viewer remote control access to the shared window content 614. The communications application 630 on the viewer network node 612 controls the presentation of the composite image 634 on the display 626, transforms user input into commands, and transmits the commands to the sharer network node 610.
In accordance with the method of
The communications application 622 receives commands that are derived from local input device events generated on the sharer network node 610 (
The operating systems on the sharer and viewer network nodes 610, 612 typically convert the pointer input device events into user commands, where the location parameter values are defined with respect to the coordinate system of the main window of the graphical user interface (e.g., the desktop window in a Microsoft® Windows® application environment). The sharer's input commands that are received by the communications application 622 (
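The coordinate translation implied here can be sketched as follows. The function and parameters are hypothetical: it assumes a pointer position reported in the viewer's rendering of the composite image must be scaled by any display zoom and offset by the shared region's top-left corner to land in the sharer's desktop coordinate system:

```python
def translate_pointer(viewer_x, viewer_y, share_origin, scale=1.0):
    """Map a pointer position in the viewer's view of the composite image
    into the sharer's desktop coordinates: undo the viewer-side zoom, then
    offset by the shared region's top-left corner on the sharer's desktop."""
    ox, oy = share_origin
    return (ox + int(viewer_x / scale), oy + int(viewer_y / scale))
```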
The communications application 622 processes the received commands into a command sequence (
The communications application 622 passes the command sequence to the shared process 616 (
The shared process 616 calls one or more graphical device interface functions that are provided by the display process 620 to present the windows that are associated with the shared software process 616 on the sharer display 618 in accordance with the received command sequence (
The process (
A. System Architecture
In some embodiments, the network infrastructure service environment 30 manages sessions of the first and second client nodes 12, 14 in a virtual area 32 in accordance with a virtual area application 34. The virtual area application 34 is hosted by the virtual area 32 and includes a description of the virtual area 32. The communications applications 26 operating on the first and second client network nodes 12, 14 present respective views of the virtual area 32 in accordance with data received from the network infrastructure service environment 30 and provide respective interfaces for receiving commands from the communicants. The communicants typically are represented in the virtual area 32 by respective avatars, which move about the virtual area 32 in response to commands that are input by the communicants at their respective network nodes. Each communicant's view of the virtual area 32 typically is presented from the perspective of the communicant's avatar, which increases the level of immersion experienced by the communicant. Each communicant typically is able to view any part of the virtual area 32 around his or her avatar. In some embodiments, the communications applications 26 establish realtime data stream connections between the first and second client network nodes 12, 14 and other network nodes sharing the virtual area 32 based on the positions of the communicants' avatars in the virtual area 32.
The network infrastructure service environment 30 also maintains a relationship database 36 that contains records 38 of interactions between communicants. Each interaction record 38 describes the context of an interaction between a pair of communicants.
2. Network Environment
The network 18 may include any of a local area network (LAN), a metropolitan area network (MAN), and a wide area network (WAN) (e.g., the internet). The network 18 typically includes a number of different computing platforms and transport facilities that support the transmission of a wide variety of different media types (e.g., text, voice, audio, and video) between network nodes.
The communications application 26 (see
3. Network Infrastructure Services
The network infrastructure service environment 30 typically includes one or more network infrastructure services that cooperate with the communications applications 26 in the process of establishing and administering network connections between the client nodes 12, 14 and other network nodes (see
The account service manages communicant accounts for the virtual environment. The account service also manages the creation and issuance of authentication tokens that can be used by client network nodes to authenticate themselves to any of the network infrastructure services.
The security service controls communicants' access to the assets and other resources of the virtual environment. The access control method implemented by the security service typically is based on one or more of capabilities (where access is granted to entities having proper capabilities or permissions) and an access control list (where access is granted to entities having identities that are on the list). After a particular communicant has been granted access to a resource, that communicant typically uses the functionality provided by the other network infrastructure services to interact in the network communications environment 10.
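The two access control methods named above can be sketched in a few lines. This is a hypothetical illustration only; the function name and data shapes are assumptions, not the platform's actual API.

```python
# Hedged sketch of the two access control methods described above; the
# function name and data shapes are illustrative assumptions, not the
# platform's actual implementation.
def has_access(entity, resource, capabilities, acl):
    """Grant access if the entity holds a capability for the resource
    (capability-based control) or its identity appears on the resource's
    access control list."""
    holds_capability = resource in capabilities.get(entity, set())
    on_list = entity in acl.get(resource, set())
    return holds_capability or on_list
```

Either method alone, or a combination of both, matches the "one or more of" language above.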
The area service administers virtual areas. In some embodiments, the area service remotely configures the communications applications 26 operating on the first and second client network nodes 12, 14 in accordance with the virtual area application 34 subject to a set of constraints 47 (see
The area service also manages network connections that are associated with the virtual area subject to the capabilities of the requesting entities, maintains global state information for the virtual area, and serves as a data server for the client network nodes participating in a shared communication session in a context defined by the virtual area 32. The global state information includes a list of all the objects that are in the virtual area and their respective locations in the virtual area. The area service sends instructions that configure the client network nodes. The area service also registers and transmits initialization information to other client network nodes that request to join the communication session. In this process, the area service may transmit to each joining client network node a list of components (e.g., plugins) that are needed to render the virtual area 32 on the client network node in accordance with the virtual area application 34. The area service also ensures that the client network nodes can synchronize to a global state if a communications fault occurs. The area service typically manages communicant interactions with virtual areas via governance rules that are associated with the virtual areas.
The rendezvous service manages the collection, storage, and distribution of presence information and provides mechanisms for network nodes to communicate with one another (e.g., by managing the distribution of connection handles) subject to the capabilities of the requesting entities. The rendezvous service typically stores the presence information in a presence database. The rendezvous service typically manages communicant interactions with each other via communicant privacy preferences.
The interaction service maintains the relationship database 36 that contains the records 38 of interactions between communicants. For every interaction between communicants, one or more services of the network infrastructure service environment 30 (e.g., the area service) transmit interaction data to the interaction service. In response, the interaction service generates one or more respective interaction records and stores them in the relationship database. Each interaction record describes the context of an interaction between a pair of communicants. For example, in some embodiments, an interaction record contains an identifier for each of the communicants, an identifier for the place of interaction (e.g., a virtual area instance), a description of the hierarchy of the interaction place (e.g., a description of how the interaction room relates to a larger area), start and end times of the interaction, and a list of all files and other data streams that are shared or recorded during the interaction. Thus, for each realtime interaction, the interaction service tracks when it occurred, where it occurred, and what happened during the interaction in terms of communicants involved (e.g., entering and exiting), objects that are activated/deactivated, and the files that were shared.
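The fields enumerated above can be collected into a simple record type. This is a minimal sketch; the field names and types are assumptions for illustration, not the patent's actual data layout.

```python
# A minimal sketch of an interaction record with the fields enumerated above;
# the field and type names are assumptions for illustration only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class InteractionRecord:
    communicant_ids: List[str]   # identifiers of the pair of communicants
    place_id: str                # identifier of the place of interaction
    place_hierarchy: List[str]   # how the interaction room relates to a larger area
    start_time: float            # start of the interaction (epoch seconds)
    end_time: float              # end of the interaction (epoch seconds)
    shared_files: List[str] = field(default_factory=list)  # files/streams shared or recorded

# The relationship database, reduced here to an in-memory list that the
# interaction service appends to when it receives interaction data.
relationship_database: List[InteractionRecord] = []

def record_interaction(record: InteractionRecord) -> None:
    relationship_database.append(record)

record_interaction(InteractionRecord(
    communicant_ids=["alice", "bob"],
    place_id="acme/main-room",
    place_hierarchy=["Acme", "Main Room"],
    start_time=0.0,
    end_time=3600.0,
    shared_files=["slides.pdf"],
))
```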
The interaction service also supports queries on the relationship database 36 subject to the capabilities of the requesting entities. The interaction service presents the results of queries on the interaction database records in a sorted order (e.g., most frequent or most recent) based on virtual area. The query results can be used to drive a frequency sort of contacts whom a communicant has met in which virtual areas, as well as sorts of whom the communicant has met regardless of virtual area and sorts of the virtual areas the communicant frequents most often. The query results also may be used by application developers as part of a heuristic system that automates certain tasks based on relationships. An example of a heuristic of this type is a heuristic that permits communicants who have visited a particular virtual area more than five times to enter without knocking by default, or a heuristic that allows communicants who were present in an area at a particular time to modify and delete files created by another communicant who was present in the same area at the same time. Queries on the relationship database 36 can be combined with other searches. For example, queries on the relationship database may be combined with queries on contact history data generated for interactions with contacts using a communication system (e.g., Skype, Facebook, and Flickr) that is outside the domain of the network infrastructure service environment 30.
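The frequency sort described above can be sketched as follows. Interaction records are reduced here to (communicant, partner, area) tuples, an assumed simplification of the full record.

```python
# Hedged sketch of the frequency sort described above. Interaction records
# are reduced to (communicant, partner, area) tuples, an assumed
# simplification of the full interaction record.
from collections import Counter

interactions = [
    ("alice", "bob", "acme"),
    ("alice", "bob", "acme"),
    ("alice", "carol", "acme"),
    ("alice", "bob", "help-area"),
]

def contacts_by_frequency(records, communicant, area=None):
    """Partners sorted most-frequent first; area=None sorts regardless of
    virtual area, matching the two kinds of sorts described above."""
    counts = Counter(
        partner
        for who, partner, where in records
        if who == communicant and (area is None or where == area)
    )
    return [partner for partner, _ in counts.most_common()]
```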
4. Virtual Areas
The communications application 26 and the network infrastructure service environment 30 typically administer the realtime connections with network nodes in a communication context that is defined by an instance of a virtual area. The virtual area instance may correspond to an abstract (non-geometric) virtual space that is defined with respect to abstract coordinates. Alternatively, the virtual area instance may correspond to a visual virtual space that is defined with respect to one-, two- or three-dimensional geometric coordinates that are associated with a particular visualization. Abstract virtual areas may or may not be associated with respective visualizations, whereas visual virtual areas are associated with respective visualizations.
As explained above, communicants typically are represented by respective avatars in a virtual area that has an associated visualization. The avatars move about the virtual area in response to commands that are input by the communicants at their respective network nodes. In some embodiments, the communicant's view of a virtual area instance typically is presented from the perspective of the communicant's avatar, and each communicant typically is able to view any part of the visual virtual area around his or her avatar, increasing the level of immersion that is experienced by the communicant.
As explained in detail below, the virtual area 66 includes zones 74, 76, 78, 80, 82 that are associated with respective rules that govern the switching of realtime data streams between the network nodes that are represented by the avatars 68-72 in the virtual area 66. (During a typical communication session, the dashed lines demarcating the zones 74-82 in
A virtual area is defined by a specification that includes a description of geometric elements of the virtual area and one or more rules, including switching rules and governance rules. The switching rules govern realtime stream connections between the network nodes. The governance rules control a communicant's access to resources, such as the virtual area itself, regions within the virtual area, and objects within the virtual area. In some embodiments, the geometric elements of the virtual area are described in accordance with the COLLADA-Digital Asset Schema Release 1.4.1 April 2006 specification (available from http://www.khronos.org/collada/), and the switching rules are described using an extensible markup language (XML) text format (referred to herein as a virtual space description format (VSDL)) in accordance with the COLLADA Streams Reference specification described in U.S. application Ser. Nos. 11/923,629 and 11/923,634.
The geometric elements of the virtual area typically include physical geometry and collision geometry of the virtual area. The physical geometry describes the shape of the virtual area. The physical geometry typically is formed from surfaces of triangles, quadrilaterals, or polygons. Colors and textures are mapped onto the physical geometry to create a more realistic appearance for the virtual area. Lighting effects may be provided, for example, by painting lights onto the visual geometry and modifying the texture, color, or intensity near the lights. The collision geometry describes invisible surfaces that determine the ways in which objects can move in the virtual area. The collision geometry may coincide with the visual geometry, correspond to a simpler approximation of the visual geometry, or relate to application-specific requirements of a virtual area designer.
The switching rules typically include a description of conditions for connecting sources and sinks of realtime data streams in terms of positions in the virtual area. Each rule typically includes attributes that define the realtime data stream type to which the rule applies and the location or locations in the virtual area where the rule applies. In some embodiments, each of the rules optionally may include one or more attributes that specify a required role of the source, a required role of the sink, a priority level of the stream, and a requested stream handling topology. In some embodiments, if there are no explicit switching rules defined for a particular part of the virtual area, one or more implicit or default switching rules may apply to that part of the virtual area. One exemplary default switching rule is a rule that connects every source to every compatible sink within an area, subject to policy rules. Policy rules may apply globally to all connections between the client nodes or only to respective connections with individual client nodes. An example of a policy rule is a proximity policy rule that only allows connections of sources with compatible sinks that are associated with respective objects that are within a prescribed distance (or radius) of each other in the virtual area.
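The exemplary default switching rule combined with a proximity policy rule can be sketched as follows. All names here are illustrative assumptions; this is not the patent's actual switching implementation.

```python
# Illustrative sketch (all names assumed) of the default switching rule
# combined with a proximity policy rule: every source connects to every
# compatible sink within a prescribed radius.
import math

def default_switch(sources, sinks, positions, radius):
    """sources/sinks map object ids to sets of stream types; positions maps
    object ids to coordinates. Returns permitted (source, sink, stream) triples."""
    connections = []
    for src, src_types in sources.items():
        for snk, snk_types in sinks.items():
            if src == snk:
                continue  # an object does not switch streams to itself
            if math.dist(positions[src], positions[snk]) > radius:
                continue  # proximity policy rule: objects too far apart
            for stream in sorted(src_types & snk_types):  # compatible types only
                connections.append((src, snk, stream))
    return connections
```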
In some embodiments, governance rules are associated with a virtual area to control who has access to the virtual area, who has access to its contents, what is the scope of that access to the contents of the virtual area (e.g., what can a user do with the contents), and what are the follow-on consequences of accessing those contents (e.g., record keeping, such as audit logs, and payment requirements). In some embodiments, an entire virtual area or a zone of the virtual area is associated with a “governance mesh.” In some embodiments, a governance mesh is implemented in a way that is analogous to the implementation of the zone mesh described in U.S. application Ser. Nos. 11/923,629 and 11/923,634. A governance mesh enables a software application developer to associate governance rules with a virtual area or a zone of a virtual area. This avoids the need for the creation of individual permissions for every file in a virtual area and avoids the need to deal with the complexity that potentially could arise when there is a need to treat the same document differently depending on the context.
In some embodiments, a virtual area is associated with a governance mesh that associates one or more zones of the virtual area with a digital rights management (DRM) function. The DRM function controls access to one or more of the virtual area or one or more zones within the virtual area or objects within the virtual area. The DRM function is triggered every time a communicant crosses a governance mesh boundary within the virtual area. The DRM function determines whether the triggering action is permitted and, if so, what is the scope of the permitted action, whether payment is needed, and whether audit records need to be generated. In an exemplary implementation of a virtual area, the associated governance mesh is configured such that if a communicant is able to enter the virtual area he or she is able to perform actions on all the documents that are associated with the virtual area, including manipulating the documents, viewing the documents, downloading the documents, deleting the documents, modifying the documents and re-uploading the documents. In this way, the virtual area can become a repository for information that was shared and discussed in the context defined by the virtual area.
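The boundary-triggered DRM check described above might be sketched as follows, assuming a governance mesh can be reduced to a mapping from zone to permitted actions. Zone names and actions are illustrative.

```python
# A minimal sketch, assuming a governance mesh can be reduced to a mapping
# from zone to permitted actions; the DRM function runs on every boundary
# crossing. Zone names and actions are illustrative assumptions.
governance_mesh = {
    "lobby":    {"view"},
    "workroom": {"view", "download", "modify", "delete", "upload"},
}

def cross_boundary(communicant, from_zone, to_zone, action):
    """DRM check triggered when a communicant crosses a governance mesh
    boundary: decides whether the triggering action is permitted in the
    destination zone and emits an audit record of the decision."""
    permitted = action in governance_mesh.get(to_zone, set())
    audit_record = {
        "communicant": communicant,
        "from": from_zone,
        "to": to_zone,
        "action": action,
        "permitted": permitted,
    }
    return permitted, audit_record
```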
Additional details regarding the specification of a virtual area are described in U.S. application Ser. No. 61/042,714 (which was filed on Apr. 4, 2008), Ser. No. 11/923,629 (which was filed on Oct. 24, 2007), and Ser. No. 11/923,634 (which was filed on Oct. 24, 2007).
5. Communications Application
In some embodiments, the communications application 26 includes:
a. local Human Interface Devices (HIDs) and audio playback devices;
b. a So3D graphical display, avatar, and physics engine;
c. a system database and storage facility.
a. Local Human Interface Devices (HIDS) and Audio Playback Devices
The local HIDs enable a communicant to input commands and other signals into the client network node while participating in a virtual area communications session. Exemplary HIDs include a computer keyboard, a computer mouse, a touch screen display, and a microphone.
The audio playback devices enable a communicant to playback audio signals that are received during a virtual area communications session. Exemplary audio playback devices include audio processing hardware (e.g., a sound card) for manipulating (e.g., mixing and applying special effects) audio signals, and speakers for outputting sounds.
b. So3D Graphical Display, Avatar, and Physics Engine
The So3D engine is a three-dimensional visualization engine that controls the presentation of a respective view of a virtual area and objects in the virtual area on a display monitor. The So3D engine typically interfaces with a graphical user interface driver and the HID devices to present the views of the virtual area and to allow the communicant to control the operation of the communications application 26.
In some embodiments, the So3D engine receives graphics rendering instructions from the area service. The So3D engine also may read a local communicant avatar database that contains images needed for rendering the communicant's avatar in the virtual area. Based on this information, the So3D engine generates a visual representation (i.e., an image) of the virtual area and the objects in the virtual area from the point of view (position and orientation) of the communicant's avatar in the virtual area. The visual representation typically is passed to the graphics rendering components of the operating system, which drive the graphics rendering hardware to render the visual representation of the virtual area on the client network node.
The communicant can control the presented view of the virtual area by inputting view control commands via a HID device (e.g., a computer mouse). The So3D engine updates the view of the virtual area in accordance with the view control commands. The So3D engine also updates the graphic representation of the virtual area on the display monitor in accordance with updated object position information received from the area service.
c. System Database and Storage Facility
The system database and storage facility stores various kinds of information that is used by the platform. Exemplary information that typically is stored by the storage facility includes the presence database, the relationship database, an avatar database, a real user id (RUID) database, an art cache database, and an area application database. This information may be stored on a single network node or it may be distributed across multiple network nodes.
6. Client Node Architecture
A communicant typically connects to the network 18 from a client network node. The client network node typically is implemented by a general-purpose computer system or a dedicated communications computer system (or “console”, such as a network-enabled video game console). The client network node executes communications processes that establish realtime data stream connections with other network nodes and typically executes visualization rendering processes that present a view of each virtual area entered by the communicant.
A communicant may interact (e.g., input commands or data) with the computer system 120 using one or more input devices 130 (e.g., one or more keyboards, computer mice, microphones, cameras, joysticks, physical motion sensors such as Wii input devices, and touch pads). Information may be presented through a graphical user interface (GUI) that is presented to the communicant on a display monitor 132, which is controlled by a display controller 134. The computer system 120 also may include other input/output hardware (e.g., peripheral output devices, such as speakers and a printer). The computer system 120 connects to other network nodes through a network adapter 136 (also referred to as a "network interface card" or NIC).
A number of program modules may be stored in the system memory 124, including application programming interfaces 138 (APIs), an operating system (OS) 140 (e.g., the Windows XP® operating system available from Microsoft Corporation of Redmond, Wash. U.S.A.), the communications application 26, drivers 142 (e.g., a GUI driver), network transport protocols 144, and data 146 (e.g., input data, output data, program data, a registry, and configuration settings).
7. Server Node Architecture
In some embodiments, the one or more server network nodes of the virtual environment creator 16 are implemented by respective general-purpose computer systems of the same type as the client network node 120, except that each server network node typically includes one or more server software applications.
In other embodiments, the one or more server network nodes of the virtual environment creator 16 are implemented by respective network devices that perform edge services (e.g., routing and switching).
B. Exemplary Communication Session
Referring back to
In some embodiments, the area service maintains global state information that includes a current specification of the virtual area, a current register of the objects that are in the virtual area, and a list of any stream mixes that currently are being generated by the network node hosting the area service. The objects register typically includes for each object in the virtual area a respective object identifier (e.g., a label that uniquely identifies the object), a connection handle (e.g., a URI, such as an IP address) that enables a network connection to be established with a network node that is associated with the object, and interface data that identifies the realtime data sources and sinks that are associated with the object (e.g., the sources and sinks of the network node that is associated with the object). The objects register also typically includes one or more optional role identifiers for each object; the role identifiers may be assigned explicitly to the objects by either the communicants or the area service, or may be inferred from other attributes of the objects or the user. In some embodiments, the objects register also includes the current position of each of the objects in the virtual area as determined by the area service from an analysis of the realtime motion data streams received from the network nodes associated with objects in the virtual area. In this regard, the area service receives realtime motion data streams from the network nodes associated with objects in the virtual area and tracks the communicants' avatars and other objects that enter, leave, and move around in the virtual area based on the motion data. The area service updates the objects register in accordance with the current locations of the tracked objects.
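An objects register entry with the fields listed above might look like the following. All keys and values are assumptions for illustration.

```python
# Hedged sketch of an objects register entry with the fields listed above;
# all keys and values are assumptions for illustration.
objects_register = {
    "avatar-alice": {
        "connection_handle": "192.0.2.10",  # e.g., a URI such as an IP address
        "sources": ["audio", "motion"],     # realtime data sources of the node
        "sinks": ["audio", "chat"],         # realtime data sinks of the node
        "roles": ["member"],                # optional role identifiers
        "position": (0.0, 0.0),             # current position in the virtual area
    },
}

def apply_motion_update(register, object_id, new_position):
    """Update the register with the current location of a tracked object,
    as the area service does when realtime motion data arrives."""
    register[object_id]["position"] = new_position

apply_motion_update(objects_register, "avatar-alice", (2.5, 4.0))
```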
In the process of administering realtime data stream connections with other network nodes, the area service maintains for each of the client network nodes a set of configuration data, including interface data, a zone list and the positions of the objects that currently are in the virtual area. The interface data includes for each object associated with each of the client network nodes a respective list of all the sources and sinks of realtime data stream types that are associated with the object. The zone list is a register of all the zones in the virtual area that currently are occupied by the avatar associated with the corresponding client network node. When a communicant first enters a virtual area, the area service typically initializes the current object positions database with position initialization information. Thereafter, the area service updates the current object positions database with the current positions of the objects in the virtual area as determined from an analysis of the realtime motion data streams received from the other client network nodes sharing the virtual area.
C. Interfacing with a Spatial Virtual Communication Environment
In addition to the local Human Interface Device (HID) and audio playback devices, the So3D graphical display, avatar, and physics engine, and the system database and storage facility, the communications application 26 also includes a graphical navigation and interaction interface (referred to herein as a “seeker interface”) that interfaces the user with the spatial virtual communication environment. The seeker interface includes navigation controls that enable the user to navigate the virtual environment and interaction controls that enable the user to control his or her interactions with other communicants in the virtual communication environment. The navigation and interaction controls typically are responsive to user selections that are made using any type of input device, including a computer mouse, a touch pad, a touch screen display, a keyboard, and a video game controller. The seeker interface is an application that operates on each client network node. The seeker interface is a small, lightweight interface that a user can keep up and running all the time on his or her desktop. The seeker interface allows the user to launch virtual area applications and provides the user with immediate access to realtime contacts and realtime collaborative places (or areas). The seeker interface is integrated with realtime communications applications and/or realtime communications components of the underlying operating system such that the seeker interface can initiate and receive realtime communications with other network nodes. 
A virtual area is integrated with the user's desktop through the seeker interface such that the user can upload files into the virtual environment created by the virtual environment creator 16, use files stored in association with the virtual area using the native client software applications independently of the virtual environment while still present in a virtual area, and more generally treat presence and position within a virtual area as an aspect of their operating environment analogous to other operating system functions rather than just one of several applications.
The spatial virtual communication environment typically can be modeled as a spatial hierarchy of places (also referred to herein as “locations”) and objects. The spatial hierarchy includes an ordered sequence of levels ranging from a top level to a bottom level. Each of the places in a successive one of the levels of the spatial hierarchy is contained in a respective one of the places in a preceding one of the levels. Each of the objects in the spatial hierarchy is contained in a respective one of the places. The levels of the spatial hierarchy typically are associated with respective visualizations that are consistent with a geographical, architectural, or urban metaphor, and are labeled accordingly. The zones of each virtual area are defined by respective meshes, some of which define elements of a physical environment (e.g., spaces, such as rooms and courtyards, that are associated with a building) that may contain objects (e.g., avatars and props, such as view screen objects and conferencing objects).
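Because each place is contained in exactly one place at the preceding level and each object is contained in a place, a single parent map suffices to recover the full containment path. The names below are assumed for illustration.

```python
# Sketch of the spatial hierarchy described above: a single parent map
# (place and object names assumed) recovers the containment path from the
# top level down to any place or object.
parent = {
    "Main Room":    "Acme",      # a room is contained in an area
    "Courtyard":    "Acme",
    "Acme":         "My Areas",  # an area is contained in a top-level view
    "avatar-alice": "Courtyard", # an object is contained in a place
}

def depth_path(node):
    """Containment path from the top level down to the given place or object."""
    path = [node]
    while path[-1] in parent:
        path.append(parent[path[-1]])
    return list(reversed(path))
```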
The navigational controls of the seeker interface allow the user to traverse a path through the virtual environment in accordance with a navigational model that is tied to the underlying spatial hierarchy of places and objects. The network infrastructure service environment 30 records the path traversed by the user. In some embodiments, the network infrastructure service environment 30 records a history that includes a temporally ordered list of views of the virtual area that are presented to the user as the user navigates through the virtual area. Each view typically corresponds to a view of a respective renderable zone of the virtual area. In these embodiments, the navigation controls enable the user to move to selected ones of the zones in the history. The navigation controls also include a graphical representation of a depth path that shows the location in the spatial hierarchy that corresponds to the user's current view of the virtual area. In some embodiments, the graphical representation of the depth path includes a respective user-selectable link to a respective view of each of the preceding levels in the spatial hierarchical model of the virtual area above the current view.
The interaction controls of the seeker interface allow the user to manage interactions with other communicants. The interaction options that are available to the user typically depend on the zones in which the user has a presence. In some embodiments, the interaction options that are available to communicants who have presence in a particular zone are different from the options that are available to other communicants who do not have presence in that zone. The level of detail and interactivity of the user typically depend on whether or not the user has a presence in the particular zone. In one exemplary embodiment, if the user is outside the virtual area, the user is provided with a minimal level of detail of the interactions occurring within the virtual area (e.g., the user can see an outline of the floorplan, background textures, and plants of the area, but the user cannot see where other communicants are present in the area); if the user is within the virtual area but outside a particular zone of the area, the user is provided with a medium level of detail of the interactions occurring within the particular zone (e.g., the user can see where other communicants are present in the area, see a visualization of their current states—talking, typing a chat message, whether or not their headphones and microphones are turned-on—and see whether any of the view screens are active); if the user is within the particular zone of the area, the user is provided with a full level of detail of the interactions occurring within the particular zone (e.g., the user can see a thumbnail of the file being shared on a view screen, hear and speak with other communicants in the area, and see elements of a log of chat messages that were generated by communicants in the zone).
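The three levels of detail in the exemplary embodiment above reduce to a simple mapping on where the user has presence. The level names come from the text; the function shape is an assumption.

```python
# Illustrative mapping (level names taken from the text above, logic assumed)
# of the three levels of detail to where the user has presence.
def detail_level(in_area, in_zone):
    if not in_area:
        return "minimal"  # floorplan outline only; no communicant positions
    if not in_zone:
        return "medium"   # positions and state visualizations, no thumbnails
    return "full"         # thumbnails, audio, and the zone's chat log
```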
In some embodiments, the switching and governance rules that are associated with the zones of the virtual area control how the network infrastructure services distinguish between those who have presence in the particular zone from those who do not.
Each of the tabs 164 typically is associated with a respective view of the virtual environment. In the illustrated embodiment the view presented in the tab 164 (labeled "My Areas") is associated with a respective set of virtual areas, which may be a default set of virtual areas in the virtual environment or it may be a set of virtual areas that is identified by a respective filter on the interaction database. In particular, the tab 164 is associated with a set of three virtual areas (i.e., Acme, Sococo Help Area, and Personal Space), which may be a default set of areas that are associated with the user or may be identified by a filter that identifies all of the areas that are associated with the user (e.g., all of the areas in which the user has interacted). Additional tabs may be created by selecting the "+" button 170.
The browsing area 166 of each tab shows graphic representations of the elements of the virtual environment that are associated with the tab. For example, in the illustrated embodiment, the browsing area 166 shows top-level views 172, 174, 176 of the virtual areas that are associated with the tab 164. The user may navigate to the next lower level in the spatial hierarchical model of any of the virtual areas by selecting the corresponding graphic representation of the virtual area.
The toolbar 168 includes an adaptive set of navigational and interaction tools that automatically are selected by the seeker interface based on the current location of the user in the virtual environment. In the illustrated embodiment, the toolbar 168 includes a back button 178, a forward button 180, a placemarks button 182, and a home button 184. The back button 178 corresponds to a backward control that enables the user to incrementally move backward to preceding ones of the zones in the history of the zones that were traversed by the user. The forward button 180 corresponds to a forward control that enables the user to incrementally move forward to successive ones of the zones in the history of the zones that were traversed by the user. The placemarks button 182 provides a placemarking control for storing links to zones and a placemark navigation control for viewing a list of links to previously placemarked zones. In response to a user selection of the placemarking control, a placemark is created by storing an image of the location shown in the current view in association with a hyperlink to the corresponding location in the virtual area. In response to a user selection of the placemark navigation control, a placemarks window is presented to the user. The placemarks window includes live visualizations of all locations that have been placemarked by the user. Each of the images in the placemarks window is associated with a respective user-selectable hyperlink. In response to user selection of one of the hyperlinks in the placemarks window, a view of the virtual area corresponding to the location associated with the selected hyperlink is automatically displayed in the browsing area 166 of the seeker interface window 162. The home button 184 corresponds to a control that returns the user to a view of a designated "home" location in the virtual environment (e.g., the view shown in
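The back/forward history and placemark controls described above can be sketched with two stacks. The class and method names are assumptions for illustration, not the seeker interface's actual API.

```python
# A minimal sketch (class and method names assumed) of the back/forward
# history and placemark controls: two stacks give incremental backward and
# forward movement through the zones the user has traversed.
class SeekerNavigation:
    def __init__(self, home):
        self.current = home
        self.back_stack = []
        self.forward_stack = []
        self.placemarks = {}  # zone -> stored link (an image would accompany it)

    def go(self, zone):
        """Navigate to a new zone; a fresh move clears the forward history."""
        self.back_stack.append(self.current)
        self.forward_stack.clear()
        self.current = zone

    def back(self):
        """Move incrementally backward to the preceding zone in the history."""
        if self.back_stack:
            self.forward_stack.append(self.current)
            self.current = self.back_stack.pop()
        return self.current

    def forward(self):
        """Move incrementally forward to the successive zone in the history."""
        if self.forward_stack:
            self.back_stack.append(self.current)
            self.current = self.forward_stack.pop()
        return self.current

    def placemark(self):
        """Store a link to the current zone, as the placemarking control does."""
        self.placemarks[self.current] = "link:" + self.current
```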
The seeker interface shows a top or floorplan view of the Acme virtual area in the browsing area 166 of the tab 164 and provides the user with a default set of interaction options. In the illustrated embodiment, a presence automatically is established in a courtyard zone 190 of the virtual area, and the user's microphone and default speakers (e.g., headphones) are turned-on. In the floorplan view shown in
In addition to the backward button 178, the forward button 180, the placemarks button 182, and the home button 184, the toolbar 168 also includes a series of one or more breadcrumb buttons 207 that originate from and include the home button 184. The breadcrumb buttons 207 correspond to a hierarchical sequence of successive, user-selectable links. Each of the successive links corresponds to a view of a respective level in the hierarchical model of the virtual area in which each successive level is contained by preceding ones of the levels. In the illustrated embodiment, the breadcrumb buttons 207 include the home button 184 and an Acme button 208 that corresponds to the current view of the Acme virtual area shown in
When an area is selected or in focus, the button 210 appears as an iconographic representation of two people and is labeled “members”; it allows members and moderators to see the list of people associated with the area. When an audio or chat zone is in focus, the button 210 has a different image (e.g., an image of an arrow pointing downward onto a plane to represent the operation of getting) and is labeled “get”. In response to a user selection of the button 210, a list of all the members of the Acme virtual area is displayed in a user interface. The user may select any of the communicants in the list and click a get button that is presented in the user interface; in response, the platform transmits an invitation to the selected communicant to join the user in a designated one of the zones.
The settings button 212 provides the user with access to a set of controls for specifying default settings that are associated with the current area.
The user may navigate from the view of the Acme virtual area shown in
The user may navigate to any of the zones of the Acme virtual area. In some embodiments, in order to move to a zone, the user transmits a command to execute one of the zones displayed on the monitor (e.g., by selecting the zone and then clicking the Enter button in the toolbar or, as a shortcut, double-clicking the zone) and, in response, the platform depicts the user's avatar in the zone corresponding to the zone object. In response to the zone execution command, the seeker interface outlines the zone (indicating to the user that it is selected) and updates the breadcrumb buttons 207 to show the selected zone location in the hierarchy. Toolbar buttons specific to the selection will also appear to the right of the breadcrumb buttons 207.
The user also may interact with any objects (e.g., a screen, table, or file) that are present in a zone. In some embodiments, in order to interact with an object, the user transmits a command to execute one of the objects displayed on the monitor (e.g., by selecting the object and then clicking the View button in the toolbar or, as a shortcut, double-clicking the object) and, in response, the platform performs an operation with respect to the object (e.g., presenting a zoomed-in view of the object or opening an interaction interface window). In response to the object execution command, the seeker interface outlines or otherwise highlights the prop (indicating to the user that it is selected) and updates the breadcrumb buttons 207 to show the selected object location in the hierarchy. Toolbar buttons specific to the selection will also appear to the right of the breadcrumb buttons 207.
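The two execution commands share a common pattern: highlight the selection and update the breadcrumb buttons, then branch on whether the target is a zone (establish presence there) or an object (open an interaction view). A toy sketch of that dispatch, with class and attribute names that are illustrative assumptions rather than the patent's terminology:

```python
from dataclasses import dataclass


@dataclass
class Target:
    name: str
    kind: str    # "zone" or "object"
    path: list   # containing levels in the hierarchy, outermost first


class SeekerInterface:
    """Minimal stand-in for the seeker interface's response to an
    execution command. Illustrative only."""

    def __init__(self):
        self.selected = None
        self.breadcrumbs = ["Home"]
        self.avatar_zone = None
        self.opened = None

    def execute(self, target):
        # Common response to either command: outline the selection and
        # update the breadcrumb trail to show its place in the hierarchy.
        self.selected = target.name
        self.breadcrumbs = ["Home"] + target.path
        if target.kind == "zone":
            self.avatar_zone = target.name   # avatar depicted in the zone
        else:
            self.opened = target.name        # e.g., zoomed-in view of the prop
```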
In response to the user's command to execute the wall object 290, the seeker interface presents in the browsing area 166 of the tab 164 a 2.5-dimensional view of the contents of the wall object 290 and areas of the Main space 213 surrounding the wall object 290. In the embodiment shown in
The interface also shows in the minimap 256 a view of the Main space 213 and areas of the Acme space surrounding the Main space 213. The minimap 256 also shows a highlighted view 292 of the selected North Wall object 290 in the Main space 213.
The breadcrumb buttons 207 shown in the toolbar 168 of the tab 164 include a North Wall button 294 that corresponds to the current level in the hierarchical spatial model of the virtual area. The toolbar 168 includes a rotate left button 296 and a rotate right button 298 that allow the user to rotate the current view left and right by ninety degrees (90°) so that the user can view the contents of different walls of the Main space in the central viewing area of the 2.5-dimensional view of the Main space 213. The user also can double-click a different one of the walls that are shown in the minimap 256 in order to change the contents that are presented in the central viewing area of the 2.5-dimensional view of the Main space 213.
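Stepping the 2.5-dimensional view through the four walls with the rotate buttons amounts to indexing into the walls modulo four. A small sketch; the wall names and the assumption that "right" steps clockwise are illustrative:

```python
# The four walls of the Main space, assumed to be in clockwise order.
WALLS = ["North Wall", "East Wall", "South Wall", "West Wall"]


def rotate(current, direction):
    """Return the wall shown in the central viewing area after one
    90-degree rotation via button 296 ("left") or 298 ("right")."""
    i = WALLS.index(current)
    step = 1 if direction == "right" else -1
    return WALLS[(i + step) % len(WALLS)]
```

Four successive rotations in either direction return the view to its starting wall.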
One or more viewers can subscribe to the windows of the shared application showing the contents of the selected data file by clicking (or double-clicking) on the thumbnail image shown on the viewscreen object 291. Each viewer can view, control, edit, and manipulate the shared window content presented on the viewscreen object 291 subject to any governance rules that are associated with the selected data file or the zone containing the viewscreen object 291. If a viewer has control access to the shared window content, the viewer can input commands to the shared process executing on the sharer network node by using one or more input devices on the viewer's network node as described above in section IV. Assuming that realtime performance can be achieved over the respective network connections between the sharer network node and the viewer network nodes, the edits and other manipulations of the shared data file typically will appear to each of the collaborators as if they were made on the same network node.
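The subscribe-then-manipulate flow above can be modeled as a shared object that any viewer may subscribe to, with edits gated by the governance rules. A toy model, assuming a simple set-based rule representation (the class and attribute names are not from the patent):

```python
class Viewscreen:
    """Toy model of a viewscreen object (cf. 291): viewers subscribe to
    the shared window, and control access is gated by governance rules."""

    def __init__(self, controllers):
        self.subscribers = set()
        self.controllers = set(controllers)  # who may edit, per the rules
        self.content = None

    def subscribe(self, viewer):
        # E.g., the viewer clicked the thumbnail shown on the viewscreen.
        self.subscribers.add(viewer)

    def edit(self, viewer, new_content):
        # Only viewers with control access may manipulate the shared
        # content; every subscriber then sees the same result.
        if viewer not in self.controllers:
            raise PermissionError(f"{viewer} lacks control access")
        self.content = new_content
```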
The embodiments that are described herein enable application sharing with high fidelity, realtime performance, viewer immersion, and privacy protection. Some embodiments also enable multichannel application sharing in which two or more communicants share applications and screen content with each other at the same time. These embodiments typically include an interface that allows each receiver to distinguish one shared window from another.
Other embodiments are within the scope of the claims.
|US20090259948||15 avr. 2008||15 oct. 2009||Hamilton Ii Rick A||Surrogate avatar control in a virtual universe|
|US20090286605||19 mai 2008||19 nov. 2009||Hamilton Ii Rick A||Event determination in a virtual universe|
|US20090288007||27 juil. 2009||19 nov. 2009||Social Communications Company||Spatial interfaces for realtime networked communications|
|US20090300521||30 mai 2008||3 déc. 2009||International Business Machines Corporation||Apparatus for navigation and interaction in a virtual meeting place|
|US20090307620||10 juin 2008||10 déc. 2009||Hamilton Ii Rick A||System for concurrently managing multiple avatars|
|US20100058202||Feb. 19, 2009||Mar. 4, 2010||Mohamed Rostom||Method system and program product for providing enabling an interactive and social search engine|
|US20100058229 *||Sep. 2, 2008||Mar. 4, 2010||Palm, Inc.||Compositing Windowing System|
|US20100077034||Sep. 22, 2008||Mar. 25, 2010||International Business Machines Corporation||Modifying environmental chat distance based on avatar population density in an area of a virtual world|
|US20100131868 *||Nov. 26, 2008||May 27, 2010||Cisco Technology, Inc.||Limitedly sharing application windows in application sharing sessions|
|US20100146085||Dec. 4, 2009||Jun. 10, 2010||Social Communications Company||Realtime kernel|
|US20100146118||Dec. 4, 2009||Jun. 10, 2010||Social Communications Company||Managing interactions in a network communications environment|
|US20100162121||Dec. 22, 2008||Jun. 24, 2010||Nortel Networks Limited||Dynamic customization of a virtual world|
|US20100164956||Dec. 28, 2008||Jul. 1, 2010||Nortel Networks Limited||Method and Apparatus for Monitoring User Attention with a Computer-Generated Virtual Environment|
|US20100169796||Dec. 28, 2008||Jul. 1, 2010||Nortel Networks Limited||Visual Indication of Audio Context in a Computer-Generated Virtual Environment|
|US20100169799||Dec. 30, 2008||Jul. 1, 2010||Nortel Networks Limited||Method and Apparatus for Enabling Presentations to Large Numbers of Users in a Virtual Environment|
|US20100169801||Dec. 22, 2009||Jul. 1, 2010||Aol Llc||Multiple avatar personalities|
|US20100169837||Dec. 29, 2008||Jul. 1, 2010||Nortel Networks Limited||Providing Web Content in the Context of a Virtual Environment|
|US20100211880||Feb. 13, 2009||Aug. 19, 2010||International Business Machines Corporation||Virtual world viewer|
|US20100221693||Mar. 31, 2006||Sep. 2, 2010||Rakesh Kumar Gupta||Instant Messaging For A Virtual Learning Community|
|US20100228547||Mar. 6, 2009||Sep. 9, 2010||At&T Intellectual Property I, Lp.||Method and apparatus for analyzing discussion regarding media programs|
|US20100228560||Mar. 4, 2009||Sep. 9, 2010||Avaya Inc.||Predictive buddy list-reorganization based on call history information|
|US20100235501||Mar. 15, 2010||Sep. 16, 2010||Avaya Inc.||Advanced Availability Detection|
|US20100241432||Mar. 17, 2009||Sep. 23, 2010||Avaya Inc.||Providing descriptions of visually presented information to video teleconference participants who are not video-enabled|
|US20100246570||Mar. 11, 2010||Sep. 30, 2010||Avaya Inc.||Communications session preparation method and apparatus|
|US20100246571||Mar. 29, 2010||Sep. 30, 2010||Avaya Inc.||System and method for managing multiple concurrent communication sessions using a graphical call connection metaphor|
|US20100246800||Mar. 29, 2010||Sep. 30, 2010||Avaya Inc.||System and method for managing a contact center with a graphical call connection metaphor|
|US20100251119||Mar. 29, 2010||Sep. 30, 2010||Avaya Inc.||System and method for managing incoming requests for a communication session using a graphical connection metaphor|
|US20100251124||Mar. 29, 2010||Sep. 30, 2010||Avaya Inc.||System and method for mode-neutral communications with a widget-based communications metaphor|
|US20100251127||Mar. 29, 2010||Sep. 30, 2010||Avaya Inc.||System and method for managing trusted relationships in communication sessions using a graphical metaphor|
|US20100251142||Mar. 29, 2010||Sep. 30, 2010||Avaya Inc.||System and method for persistent multimedia conferencing services|
|US20100251158||Mar. 29, 2010||Sep. 30, 2010||Avaya Inc.||System and method for graphically managing communication sessions|
|US20100251177||Mar. 29, 2010||Sep. 30, 2010||Avaya Inc.||System and method for graphically managing a communication session with a context based contact set|
|US20100257450||Apr. 3, 2009||Oct. 7, 2010||Social Communications Company||Application sharing|
|US20100262550||Apr. 8, 2009||Oct. 14, 2010||Avaya Inc.||Inter-corporate collaboration overlay solution for professional social networks|
|US20100274848||Jun. 29, 2010||Oct. 28, 2010||Social Communications Company||Managing network communications between network nodes and stream transport protocol|
|US20100287274||May 8, 2009||Nov. 11, 2010||Canon Kabushiki Kaisha||Efficient network utilization using multiple physical interfaces|
|US20100322395||Jun. 22, 2009||Dec. 23, 2010||Avaya Inc.||Unified communications appliance|
|US20110029898||Oct. 11, 2010||Feb. 3, 2011||At&T Intellectual Property I, L.P.||Merging Instant Messaging (IM) Chat Sessions|
|US20110029907||Mar. 31, 2010||Feb. 3, 2011||Bakhash E Eddie||System and method for providing three-dimensional graphical user interface|
|US20110106662||Oct. 30, 2009||May 5, 2011||Matthew Stinchcomb||System and method for performing interactive online shopping|
|US20110169927||Jan. 12, 2011||Jul. 14, 2011||Coco Studios||Content Presentation in a Three Dimensional Environment|
|US20110196930||Apr. 20, 2011||Aug. 11, 2011||Jitendra Chawla||Methods and apparatuses for reporting based on attention of a user during a collaboration session|
|US20110231781||Mar. 19, 2010||Sep. 22, 2011||International Business Machines Corporation||System and method for virtual object sharing and management in virtual worlds|
|US20110274104||Jun. 21, 2011||Nov. 10, 2011||Social Communications Company||Virtual area based telephony communications|
|US20110302509||Aug. 15, 2011||Dec. 8, 2011||Social Communications Company||Promoting communicant interactions in a network communications environment|
|US20120066306||Sep. 9, 2011||Mar. 15, 2012||Social Communications Company||Relationship based presence indicating in virtual area contexts|
|US20120124486 *||Dec. 21, 2011||May 17, 2012||Addnclick, Inc.||Linking users into live social networking interactions based on the users' actions relative to similar content|
|US20120179672||Mar. 16, 2012||Jul. 12, 2012||Social Communications Company||Shared virtual area communication environment based apparatus and methods|
|US20120216131||Feb. 17, 2012||Aug. 23, 2012||Social Communications Company||Persistent network resource and virtual area associations for realtime collaboration|
|US20120246582||Jun. 2, 2012||Sep. 27, 2012||Social Communications Company||Interfacing with a spatial virtual communications environment|
|US20120254858||Mar. 28, 2012||Oct. 4, 2012||Social Communications Company||Creating virtual areas for realtime communications|
|US20130073978||Sep. 5, 2012||Mar. 21, 2013||Social Communications Company||Capabilities based management of virtual areas|
|US20130109418||Dec. 18, 2012||May 2, 2013||Research In Motion Limited||Method for creating a peer-to-peer immediate messaging solution without using an instant messaging server|
|US20140129644 *||Nov. 4, 2013||May 8, 2014||Rosebud LMS, Inc||Method and software for enabling n-way collaborative work over a network of computers|
|US20140213309||Mar. 6, 2014||Jul. 31, 2014||Blackberry Limited||Method for Creating a Peer-to-Peer Immediate Messaging Solution Without Using an Instant Messaging Server|
|CN1678994A||May 16, 2003||Oct. 5, 2005||微软公司||System and method for providing access to user interface information|
|CN1701568A||Nov. 13, 2002||Nov. 23, 2005||英特尔公司||Multi-modal web interaction over wireless network|
|CN101499080A||Feb. 1, 2008||Aug. 5, 2009||网秦无限（北京）科技有限公司||Method and system for fast acquiring information service on mobile terminal|
|EP1964597A1||Oct. 3, 2007||Sep. 3, 2008||Sony Computer Entertainment Europe Ltd.||Apparatus and method of modifying an online environment|
|EP1964597B1||Oct. 3, 2007||Feb. 20, 2013||Sony Computer Entertainment Europe Ltd.||Apparatus and method of modifying an online environment|
|EP2237537A1||Mar. 30, 2010||Oct. 6, 2010||Avaya Inc.||System and method for graphically managing a communication session with a context based contact set|
|EP2239930A1||Mar. 30, 2010||Oct. 13, 2010||Avaya Inc.||System and method for managing a contact center with a graphical call connection metaphor|
|JP2002123479A||Title not available|
|JP2002149580A||Title not available|
|JP2002224447A||Title not available|
|JP2003067317A||Title not available|
|JP2004272579A||Title not available|
|JP2005182331A||Title not available|
|JP2005286749A||Title not available|
|JP2007184871A||Title not available|
|JP2007251380A||Title not available|
|JP2007506309A||Title not available|
|JP2008182670A||Title not available|
|JP2010535363A||Title not available|
|KR19990078775A||Title not available|
|KR20000030491A||Title not available|
|KR20010100589A||Title not available|
|KR20030054874A||Title not available|
|KR20040011825A||Title not available|
|KR20060060788A||Title not available|
|KR20070105088A||Title not available|
|KR20090016692A||Title not available|
|WO2006127429A3||May 19, 2006||Jan. 10, 2008||Albert Alexandrov||Selectively sharing screen data|
|WO2008008806A2||Jul. 11, 2006||Jan. 17, 2008||Igor Khalatian||One-click universal screen sharing|
|WO2008106196A1||Feb. 27, 2008||Sep. 4, 2008||Sony Computer Entertainment America Inc.||Virtual world avatar control, interactivity and communication interactive messaging|
|WO2009000028A1||Jun. 23, 2008||Dec. 31, 2008||Global Coordinate Software Limited||Virtual 3d environments|
|1||*||"Microsoft Computer Dictionary", Third Edition, 1997. Definitions for "audit," "auditing," and "audit trail" (p. 36), "log" (p. 288), "security log" (p. 426), "user account," and "user profile" (p. 488).|
|2||01-Allowance Notice dated Oct. 9, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.|
|3||02-IDS dated Sep. 20, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.|
|4||03-RCE dated Sep. 20, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.|
|5||04-Allowance Notice dated Aug. 28, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.|
|6||05-Amendment dated Mar. 18, 2012 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.|
|7||06-Non-final Office Action dated Sep. 7, 2011 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.|
|8||07-Non-final Office action dated Dec. 27, 2011 in related U.S. Appl. No. 12/354,709, filed Jan. 15, 2009.|
|9||11-Final Office Action dated Sep. 10, 2012 in related U.S. Appl. No. 12/509,658, filed Jul. 27, 2009.|
|10||12-Amendment dated May 30, 2012 in related U.S. Appl. No. 12/509,658, filed Jul. 27, 2009.|
|11||13-Non-Final Office Action dated Mar. 1, 2012 in related U.S. Appl. No. 12/509,658, filed Jul. 27, 2009.|
|12||14-Pre-Appeal Brief Request for Review dated Nov. 10, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.|
|13||15-Advisory Action dated Oct. 16, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.|
|14||16-Response After Final dated Sep. 14, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.|
|15||17-Final Rejection dated Sep. 10, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.|
|16||18-Amendment dated Jul. 20, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.|
|17||19-Non-Final Office Action dated Apr. 22, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.|
|18||20-Notice of Allowance dated Jun. 20, 2016 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.|
|19||21-Appeal Brief dated Dec. 8, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.|
|20||22-Pre-Brief Appeal Conference Decision dated Nov. 27, 2015 in related U.S. Appl. No. 13/666,717, filed Nov. 1, 2012.|
|21||International Search Report and Written Opinion issued in counterpart International Application No. PCT/US2012/030766 (mailed Oct. 19, 2012).|
|22||International search report and written opinion received in counterpart International application No. PCT/US2010/028088, mailed Oct. 27, 2010.|
|23||Search report and written opinion issued on Aug. 13, 2010, in counterpart PCT Application No. PCT/US2010/020596.|
|International Classification||G06F3/00, H04L12/18, G09G5/14, G06F9/44, G06F3/0481, G06F15/16, G06F3/14, G06F3/048, H04L29/08, H04L29/06|
|Cooperative Classification||G06F2209/545, G06F3/0481, G06F3/1454, H04L12/1813, G06F3/04815, H04L67/38, G06F9/4443, H04L67/025, G09G5/14|
|Jul. 11, 2015||AS||Assignment|
Owner name: SOCIAL COMMUNICATIONS COMPANY, OREGON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GO, ALEXANDER SAY;PETTER, VLADIMIR;SIGNING DATES FROM 20090331 TO 20090413;REEL/FRAME:036063/0414