WO2008043182A1 - System for supporting collaborative work - Google Patents

System for supporting collaborative work

Info

Publication number
WO2008043182A1
Authority
WO
WIPO (PCT)
Prior art keywords
collaborative work
application
supporting
collaborative
collaboration
Application number
PCT/CA2007/001826
Other languages
French (fr)
Other versions
WO2008043182A8 (en)
Inventor
Mohamed Cheriet
Pierre Dumas
Maarouf Saad
Louis Villardier
Saliah-Hassane Hamadou
Gilles Saint-Amant
Samir Hadjout
Original Assignee
Ets (Ecole De Technologie Superieure)
Priority claimed from CA2563866A1
Application filed by Ets (Ecole De Technologie Superieure)
Priority to CA2702509A1
Publication of WO2008043182A1
Publication of WO2008043182A8

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00: Data switching networks
    • H04L 12/02: Details
    • H04L 12/16: Arrangements for providing special services to substations
    • H04L 12/18: Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L 12/1813: Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/10: Office automation; Time management
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/40: Support for services or applications
    • H04L 65/401: Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
    • H04L 65/4015: Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/14: Systems for two-way working
    • H04N 7/15: Conference systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/40: Support for services or applications
    • H04L 65/403: Arrangements for multi-party communication, e.g. for conferences
    • H04L 65/4053: Arrangements for multi-party communication, e.g. for conferences without floor control
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/10: Protocols in which an application is distributed across nodes in the network
    • H04L 67/104: Peer-to-peer [P2P] networks
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/14: Systems for two-way working
    • H04N 7/15: Conference systems
    • H04N 7/157: Conference systems defining a virtual conference space and using avatars or agents

Definitions

  • the present invention generally relates to a system for supporting collaborative work. More specifically, but not exclusively, the present invention is concerned with a multimodal/multi-sensorial integrated system that supports collaborative work using common tools and applications, whereby geographically-disseminated participants thereof can interact in real-time with each other.
  • a purpose of collaborative work and collaborative research is to support cost-effective creative solutions to complex collaborative problems that reduce risk, time and space constraints by means of intensive use of information and communication technologies.
  • Several client/server based products have been developed for collaborative work and are offered on the market.
  • there is a need for an improved system. For example, there is a need for robustness and reliability of the system, a need for higher performance for real-time collaborative applications and for sharing expensive or non-accessible equipment, a need for higher quality of multimedia, a need for integration of different applications together, a need for adaptability of the system to changes, and a need for more visibility of all the participants along with the services and objects to be shared.
  • the client/server based systems for collaborative work have a central failure point in the server.
  • an object of the present invention is to provide a system for supporting collaborative work that is designed, structured and provided with features adapted to overcome the above discussed drawbacks of the current systems.
  • the system for supporting collaborative work presented in the present invention allows disseminated participants at remote sites to interact in real-time with each other through a fault tolerant network in a peer-to-peer configuration.
  • the system comprises: (i) a plurality of collaboration platforms having a layered architecture to support collaborative applications such as conferencing, instant messaging, audio/video, shared whiteboards, shared data and files, collaborative browsing, application sharing, a virtual laboratory application, slide show and audio/video stream sharing with an annotation feature; and (ii) a plurality of hardware platforms and equipment consisting of: a workstation, a multimedia room equipped with a whiteboard, cameras and videoconferencing devices, a projection screen with an associated video projector and a smartboard, a laboratory room and an adapted operating room comprising a haptic device and a remote machine such as a robot.
  • a system for supporting collaborative work for allowing disseminated participants to interact in real-time with each other through a network comprising: a plurality of collaboration platforms having a layered architecture to support at least one collaborative application and accessible by the disseminated participants at remote sites;
  • the layered architecture includes at least one layer for interconnecting the plurality of collaboration platforms through the network in a peer-to-peer configuration, and wherein the plurality of collaboration platforms are configured so that, when a new collaboration platform interconnects with the plurality of collaboration platforms, the new collaboration platform inherits the at least one collaborative application.
  • the system for supporting collaborative work includes an annotation feature allowing the disseminated participants to annotate the at least one collaborative application.
  • Figures 1a and 1b are block diagrams schematically illustrating a system for supporting collaborative work, in accordance with an illustrative embodiment of the present invention;
  • Figure 2 is a schematic diagram illustrating a collaboration platform architecture of the system for supporting collaborative work of Figures 1a and 1b;
  • Figure 3 is a block diagram schematically illustrating a deployment example of the infrastructure of the system for supporting collaborative work of Figures 1a and 1b;
  • Figure 4 is a schematic diagram illustrating a plurality of devices found in a multimedia room of the system for supporting collaborative work of Figures 1a and 1b;
  • Figure 5 is a block diagram schematically illustrating a system for supporting collaborative work comprising a laboratory room, in accordance with another illustrative embodiment of the present invention;
  • Figure 6 is a flow chart illustrating a method used for dynamically integrating new objects or applications in the system for supporting collaborative work of Figure 5;
  • Figure 7 is a block diagram schematically illustrating an application of the system for supporting collaborative work of Figures 1a and 1b for real-time telesurgery and telementoring;
  • Figure 8 is an image of a graphical participant interface of the system for supporting collaborative work of Figures 1a and 1b;
  • Figure 9 is a block diagram schematically illustrating an implementation of the annotation feature, which enriches the system for supporting collaborative work of Figures 1a and 1b.
  • the present invention is generally concerned with a system for supporting collaborative work.
  • the system has an architecture that encompasses a plurality of layers, the integration of which results in a multimodal environment having the ability of gathering and integrating multi-sensorial information from a plurality of remote sites, this being described as "telepresence".
  • the system can comprise several collaborative tools and applications, such as conferencing, instant messaging, chatting, audio/video, shared whiteboards, shared data and files, collaborative browsing, slide show, audio/video stream sharing, application sharing, eDrive, a multimedia, virtual laboratory and a virtual operating room.
  • the system for supporting collaborative work also comprises a broadband fault tolerant network that can connect together several multimedia rooms, intended for sending to other remote sites, receiving from other remote sites and processing multi-sensorial information.
  • the system can have recourse to various video, audio and/or haptic human-machine interfaces; it can make use of various media, such as text, voice, sound, touch feeling, graphics, gesture and motion, handwriting and shared applications.
  • the system for supporting collaborative work encompasses several collaborative tools and applications, all of which are integrated in the same work space. Moreover, its distributed architecture makes it possible to integrate new tools or applications thereinto.
  • the system is synchronous, which means that the course of the pertaining events is governed by a common phenomenon, such as a clock.
  • This synchronicity property allows the participants of a collaborative work session to simultaneously interact with each other; a collaborative work session is characterized by a group of participants using a set of collaborative tools and/or applications during a fixed time frame to manipulate or produce artifacts, i.e. databases, documents, files, etc.
  • the system for supporting collaborative work offers a multimedia support to receive or deliver information under various forms, particularly but not exclusively under visual, video, audio and haptic forms.
  • Participants can connect to the system either via the Internet, or by any other network, including an intranet or a local network.
  • the system for supporting collaborative work may comprise one or a plurality of multimedia rooms, a multimedia room being (i) defined as a physical site from where one or many participants can access the system, and (ii) potentially equipped with a plurality of interfaces, i.e. video interface, audio interface, haptic interface, etc.
  • the system for supporting collaborative work might also comprise a remote facility, the functions of which can be remotely controlled by the participants.
  • this remote facility can be a laboratory, or a medical or a surgical room.
  • the system for supporting collaborative work might include a specially dedicated multimedia room from where participants can enter the collaborative work session for learning purposes, this being described as "telementoring".
  • the system for supporting collaborative work makes intensive use of, for example, the multimodal perceptual processes, the transmission of information via broadband networks, the integration of multi-sensor interfaces to a multiple interaction network for receiving and transmitting both data and control signals, for which the applications are suitable to enhance collaborative work, for example collaborative research.
  • rather than using a conventional Client/Server architecture, the collaborative environment 10, as illustrated in Figures 1a and 1b, has an open overlay self-scaling network architecture based on the distributed Peer-to-Peer (P2P) computing paradigm, which integrates various perceptual modes of information. It enables remote access, as well as the direct sharing of resources, responsibilities and services in a synchronous/asynchronous manner among geographically disseminated participants such as 22 (Figure 1b).
  • the system for supporting collaborative work 10 may be used in various contexts that need to support synchronous collaborative work, such as eLearning, eCommerce, eHealth (eDiagnostics, teleoperation and telementoring), eDesign, eGovernance, eBusiness, eBanking, eProcurement, eNegotiation, etc.
  • Figure 1a is an illustration of a system for supporting collaborative work 10 that comprises six geographically separate sites, numbered site 1 to site 6, linked together by a network 12.
  • a participant 22 (as shown in Figure 1b) can join a collaborative work session either from a multimedia room 14a-14c or from a work station 16a-16b.
  • the system for supporting collaborative work 10 also comprises a laboratory room 18, the equipment of which can be remotely controlled by any participant 22 at sites 1-4 or 6.
  • the network 12 is any network that links together the six sites and may comprise, for instance, Internet links, intranet links, local networks, private networks, etc.
  • Figure 1b is a more detailed view of a system for supporting collaborative work 10 comprising sites 1-5, wherein each site 1 to 3 comprises a collaboration platform 20. Participants 22 interact with the system for supporting collaborative work 10 either through a multimedia room 14, as in sites 1 and 2, or through a workstation 16 as in site 3.
  • site 3 is linked to the network 12 through the Internet.
  • the system for supporting collaborative work 10 includes a laboratory room 18 at site 4, which comprises laboratory servers (not shown), and a multicast video bridge 24 for managing all the cameras (not shown) that may be disseminated at the various sites 1-4.
  • the collaboration platform 20 comprises a multi-layer architecture, comprising the following layers:
  • the participant interface layer 26 represents various participant interfaces; it is the collection of graphical objects, i.e. menus, commands, controls, etc., that enable participants 22 to interact with the collaboration platform 20 and to use the various provided collaborative tools.
  • the participant interface 102, as illustrated in Figure 8, may use a relaxed WYSIWIS (What You See Is What I See) interface metaphor and is characterized by optimal management of the displaying space, flexibility in the presentation and ease of access to collaborative services and functionalities.
  • the application layer 28 constitutes the bulk of the collaboration platform 20 and includes the collection of applicative logics. It uses the services of the middleware layer 30 to build collaborative tools, toolkits and applications, in order to satisfy the needs of the participants 22 in terms of communications, data sharing and collaboration. It may comprise the following key components:
  • a whiteboard tool which is a collaborative drawing tool using a vector-based representation for offering a drawing space of several pages and a graphic toolbox;
  • an eDrive tool or virtual drive enabling data, file and audio/video stream sharing and exchanging (downloading);
  • an audio/video stream sharing tool with an annotation feature (pointing and highlighting a specific region of the running video) and enabling collaborative control (moving, zooming, etc) of a remote camera.
  • the middleware layer 30 provides the required implementations to develop a distributed system that ensures communication between the participating processes through a duplicated object system (DoS) architecture.
  • a duplicated object is a type of decentralized, distributed object supporting the above enumerated collaborative tools, toolkits and applications, in order to satisfy the needs of the participants 22 in terms of communications, data sharing and collaboration. These objects contain datasets describing all the network-relevant information about a tool, toolkit or application.
  • a duplicated object can be either a duplication master, which is the controlling instance of the object, or a duplica, which is a complete copy of the master object.
  • the middleware layer 30 also provides a set of services, such as coherence, session and a collection of specialized network services, all of which facilitate the development and implementation of collaborative functionalities of the application layer 28. Finally, the middleware layer 30 provides a distributed clock that ensures the temporal order of information during communication processes.
  • the network layer 32 handles all functions related to communication networks, including communication and routing protocols, bandwidth management, and quality of service.
  • This layer 32 is characterized by the following key elements: an automatic message routing mechanism which can allow a peer to join another peer via a third peer, a configurable transport protocol, a hierarchical and cascading message passing model based on multicast groups, and message bundling techniques.
  • the application layer 28 allows control sharing (collaborative manipulation commands), result sharing (the results of experimentations) and experimentation visualization (video and sound), as well as the control of the devices or cameras used for visualization.
  • the annotation feature, which enriches collaborative tools such as the visioning tool, the application sharing tool and the audio/video stream sharing tool by allowing pointing, highlighting and drawing over a presentation or a running video, may be implemented according to the schematic diagram of Figure 9.
  • a transparent GUI control 132 is superposed over the application automation GUI control 136.
  • the graphical annotations (i.e. pointers, highlights, drawings, etc.) are translated to duplicated objects 134 at the application layer 28 and published over the session in a separate channel, i.e. published over Net-Z.
  • a synchronization is performed between the duplicated objects 134 and the automation objects (data and user commands) 138. With this method, the shared content (for example a presentation) is not altered.
  • Net-Z refers to the Net-Z middleware of Quazal inc., which was originally dedicated to the development of multi-participant games.
  • the collaboration platform 20 may use, for example, the Net-Z middleware from Quazal inc. for duplicating objects and ensuring that all participants 22 of a collaborative work session have, on their respective work station, a copy of all objects that are in use. Every copy of a duplicated object is made from the same master copy so as to (i) ensure coherence, (ii) maintain the pertaining functionalities and (iii) control the object's instances.
  • the collaboration platform 20 is decentralized and symmetric, since it is based on a peer-to-peer (P2P) distributed architecture, and comprises a broadband network 12 that can connect a plurality of laboratory rooms 18 or multimedia rooms 14 together.
  • the collaboration platform 20 aggregates various aspects of multimodal perceptual information in order to support collaborative work in different fields of activity.
  • This P2P distributed architecture means that the collaboration platform 20 does not depend on a central server; rather, the application components and the responsibilities of maintaining network services are distributed across multiple work stations. This renders the collaboration platform 20 more robust and tolerant to failure or disconnection of any work station within the collaborative work session. In the event of a failure or disconnection, the duplication master objects will migrate to another station and let the collaborative work session seamlessly carry on.
  • in Figure 3, an example of deployment of the infrastructure of the collaboration platform 20 is illustrated. Each site 1-5 is connected to the network 12 with GigaEthernet links 40 and via a switch 42. Of course, other types of connections can be used and would be obvious to a person of ordinary skill in the art. For example, site 4 is connected to the switch 42 via an optical carrier 44. Also, a testbed network 46 can be present and connected to the network 12 for monitoring and performance analysis purposes.
  • one of the sites 1-5 will include a multicast video bridge (not shown) for managing all the cameras.
  • the multimedia room 14 can comprise and integrate into the collaborative environment 10 a plurality of features, the information of all of which can be accessed by the participants 22.
  • the features can comprise, amongst others, whiteboards 50, smart boards (handwritten recognition acquisition and understanding) 52, screens 54, cameras 56, haptic devices 58, databases and applications, etc. Those features are schematically illustrated in Figure 4.
  • the multimedia room 14 can also be equipped with high-quality accessories and devices enabling intensive computing, high-definition (HD) visualization, 2D and 3D image processing (recognition), group communication, and videoconference services for multiple participants 22. Thanks to the high degree of integration of such intelligent interactive devices, such multimedia rooms 14 are innovative in the field of collaborative work. For instance, during teleconferencing, intelligent cameras 56 are autonomously guided to track the person 60 who is currently speaking by keeping him continuously in their field of vision. The speaker 60 may create his drawing directly on a shared whiteboard 50 by using the smart board 52, which provides handwriting recognition acquisition and understanding.
  • the graphical participant interface 102 may also be projected onto a wide main screen 54, in a large high-definition (HD) format, by means of a projector assembly 64.
  • the multimedia room 14 may also comprise a digitization center (not shown in Figure 4) for storing, distributing and sharing digital data.
  • the collaboration platform 20 may comprise one or several haptic devices 58.
  • a haptic device 58 is a technology which interfaces with the participant 22 through the sense of touch. By enabling a force-feedback output, a haptic device 58 makes it possible to actuate teleoperators.
  • the haptic devices 58 may be coupled to stereoscopic spectacles in order to perceive depth of field and, in this manner, add an extra information parameter that helps the participant 22 to properly perform a telemanipulation.
  • two examples of use of a haptic device 58 are (i) conducting an experiment in a remote laboratory room 18 and (ii) performing a surgery on a remote patient.
  • a haptic device 58 may comprise a Phantom Desktop™ used in connection with two pantins and a universal joint wrist, e.g. a Phantom 1.5A™ and a Freedom 6S™.
  • haptic devices 58 allow for controlling remote machines or robots.
  • the haptic devices 58 make it possible to start experimentations using different robots or to perform simulations of different machines.
  • the haptic devices 58 allow a surgeon to manipulate surgical robots and instruments.
  • the collaboration platform 20 may encompass a laboratory room 18, which comprises a set of collaborative applications for performing experiments on remote equipment that is actuated by teleoperators.
  • the laboratory room 18 allows interactive experiments to be performed remotely, possibly by using haptic devices 58.
  • it is also suitable for distant learning, as it enables many geographically disseminated participants 22 to simultaneously conduct an experiment, to get and share the results thereof and, possibly, to visualize in real-time a 3D video of the experiment.
  • the laboratory room 18 also offers the possibility to simulate an experiment from relevant models and databases.
  • the laboratory room 18 can also be equipped with HD visualization and high quality sound systems such as those found in the multimedia rooms 14.
  • in Figure 5, a collaborative environment comprising a laboratory room 18 is shown.
  • a plurality of sites 1-6, each of which comprises a collaboration platform 20 and either a multimedia room 14 or a workstation 16, as well as the laboratory room 18, are connected to the network 12.
  • the network 12 can be a private network, shared by a group of universities and linking them together, for example.
  • the laboratory room 18 comprises a virtual laboratory application 70 and a virtual laboratory application using LabVIEW™ 72.
  • the virtual laboratory application 70 is completely integrated into the collaboration platform 20 and includes a machine 76 or a robot, connected to a data acquisition device 78 or card in order to monitor and save the results from experimentations. Those results are accessible at site 6 and can be shared with any participant 22 involved in and connected to a collaborative work session.
  • the laboratory room 18 also comprises a virtual laboratory application using LabVIEW™ 72.
  • This virtual laboratory application 72 comprises a LabVIEW™ server 74, linked to a data acquisition device 78 or card, which is connected to a machine 76 to obtain data from that machine 76. Also, a machine 76 can be directly connected to the LabVIEW™ server 74 for data processing.
  • the LabVIEW™ server 74 is linked to the network 12 so that it is accessible to all the participants 22 located at sites 1-6. Therefore, the disseminated participants 22 can perform on-line remote experimentations and share their results.
  • one of the sites 1-3 includes a multicast video bridge (not shown) for managing all the cameras.
  • the collaboration platform 20 allows for adding new virtual laboratory applications 70 or 72 during a collaborative work session.
  • in Figure 6, there is shown a flow diagram of an illustrative example of a process 80 executed when adding new virtual laboratory applications 72 during a collaborative work session.
  • the steps of the process 80 are indicated by blocks 82 to 88.
  • the process 80 starts at block 82, where a participant 22 can create a new virtual laboratory duplicated object, which contains description information (name, type, version, etc.) and the address of the laboratory application server 74 (a LabVIEW™ server, for example). Then, at block 84, the collaboration platform 20 publishes the duplicated object over Net-Z and activates remotely, at block 86, the virtual laboratory application 72 within LabVIEW™. Following this, at block 88, the new virtual laboratory application 72 is opened within the shared workspace dedicated to the virtual laboratory tool(s).
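  • as a purely illustrative sketch of blocks 82 to 88, the integration of a new virtual laboratory application could be expressed as follows; every class and function name is an assumption, and no real LabVIEW™ or Net-Z call is made here:

```python
# Sketch of process 80 (blocks 82-88), with every name assumed: create a
# virtual-laboratory duplicated object, publish it to the session, activate the
# application on the remote laboratory server, then open it in the shared workspace.

from dataclasses import dataclass


@dataclass
class VirtualLabObject:                      # block 82: description + server address
    name: str
    lab_type: str
    version: str
    server_address: str


def add_virtual_lab(session, workspace, obj: VirtualLabObject):
    session.publish(obj)                                      # block 84: publish to the session
    session.activate_remote(obj.server_address, obj.name)     # block 86: remote activation
    workspace.open_panel(obj)                                 # block 88: open in shared workspace


class FakeSession:                           # stand-ins so the sketch runs end to end
    def publish(self, obj): print("published", obj.name)
    def activate_remote(self, addr, name): print("activated", name, "on", addr)


class FakeWorkspace:
    def open_panel(self, obj): print("opened panel for", obj.name)


if __name__ == "__main__":
    obj = VirtualLabObject("dc-motor-bench", "LabVIEW", "1.0", "labserver.example:1234")
    add_virtual_lab(FakeSession(), FakeWorkspace(), obj)
```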
  • a new virtual laboratory application 72 is developed and deployed within the LabVIEW™ development environment without affecting the collaboration platform 20.
  • the developers can produce new shared remote control applications, which are called virtual laboratories, in accordance with their particular needs, and use them in a collaborative work session.
  • one embodiment of the present invention is its use for performing telemedicine. More specifically, the present invention can be used for performing real-time medical or surgical interventions at a distance, for diagnostic or curative purposes, whether or not in a telementoring context.
  • Figure 7 illustrates a telesurgery environment.
  • a surgeon 90 is located in a medically-adapted multimedia room 14 with a collaboration platform 20 at site 2 while a distant patient 92 is located in a remote, adapted operating room 94, which incorporates the features of a multimedia room 14 and a collaboration platform 20, at site 1.
  • One or several assistants 96 may also be present in the adapted operating room 94 to assist the surgeon 90 in his work.
  • the medically-adapted multimedia room 14 and the adapted operating room 94 are both equipped, particularly but not exclusively, with two-way cameras 56 and haptic devices 58 (see Figure 4).
  • the cameras 56 allow (i) the surgeon 90 and the assistant(s) 96 to see each other and (ii) the surgeon 90 to see the distant patient 92, especially the area on which the surgery is performed.
  • the haptic devices 58 allow the surgeon 90 to receive a touch feeling feedback from the contact of the instruments he manipulates. For example, this feedback to the surgeon 90 can be the pressure exerted by the patient's body on an instrument the surgeon 90 manipulates thereon or therein.
  • the surgeon 90 performs the surgery on the distant patient 92 via a remote robot 98 placed in the adapted operating room 94.
  • the surgeon 90 uses a laboratory room 18 comprising a virtual laboratory application 72 and the LabVIEW™ server 74, and interacts with the robot manipulator with embedded camera 98 via the haptic devices 58 and a data acquisition device 78 or card, which is connected to the LabVIEW™ server 74.
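  • the general teleoperation loop implied by this arrangement can be sketched as follows; the contact model, stiffness value and class names are assumptions for illustration only and do not describe the actual surgical system:

```python
# Assumed sketch of a teleoperation loop: the surgeon's haptic position is sent to the
# remote robot, and the measured contact force is sent back and rendered on the haptic
# device. The linear contact model and all numbers are illustrative.

class RemoteRobot:
    def __init__(self, tissue_stiffness=200.0):   # N/m, assumed contact model
        self.stiffness = tissue_stiffness
        self.position = 0.0

    def move_to(self, commanded_position):
        self.position = commanded_position
        penetration = max(0.0, self.position - 0.01)   # contact assumed at 10 mm
        return self.stiffness * penetration            # reaction force, newtons


def teleoperation_step(haptic_position, robot, haptic_render):
    force = robot.move_to(haptic_position)   # command the remote manipulator
    haptic_render(force)                     # feed the contact force back to the operator
    return force


if __name__ == "__main__":
    robot = RemoteRobot()
    for pos in (0.005, 0.012, 0.020):        # metres of hand motion
        teleoperation_step(pos, robot, lambda f: print(f"feedback force: {f:.1f} N"))
```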
  • the camera embedded in the robot manipulator 98 captures a detailed video of the operation, which is displayed in high resolution on a main projection screen such as the screen 54 in Figure 4.
  • Audio/video tools facilitate the communications between the surgeon 90 and the assistant(s) 96.
  • resident students 100, located at remote site 3 comprising a medically-adapted multimedia room 14, a laboratory room 18 and a collaboration platform 20, may participate in the operation session for the purpose of telementoring and learning.
  • several surgeons 90 may collaborate together and teleoperate on the same patient 92.
  • the dedicated broadband network 12 guarantees secure real-time transmission and an acceptable latency for this kind of task. The high standards of communication security and fast response which are needed in a telesurgery context are thus provided.
  • one of the sites 1-3 includes a multicast video bridge (not shown) for managing all the cameras.
  • in Figure 8, there is shown the graphical interface 102 of the collaboration platform 20 as seen by a participant 22.
  • a plurality of applications are integrated into one single window or workspace.
  • the space 104 allows for displaying all the connected and disseminated participants 22.
  • the space 106 shows the available sharing documents from different participants 22.
  • the space 108 allows for chatting and instant messaging to all the connected participants 22.
  • the space 110 is a whiteboard for drawing, writing, taking notes, calculating, etc.
  • the space 112 allows participants 22 to browse the web and do searches.
  • the spaces 114 and 116 allow for displaying pictures and videos from different machines in remote virtual laboratory rooms 18. They also allow for displaying the results of experimentations via graphs or tables.
  • the space 118 allows the remote control, sharing and annotation of applications such as, for example, MS-Office or other applications.
  • a participant 22 added annotations 119, which may be viewed by all other participants 22.
  • the space 120 allows audio/video streaming with an annotation feature (pointing and highlighting a specific region of the running video) and enables collaborative control (moving, zooming, etc.) of a remote camera.
  • a menu bar 122 shows all the different commands available to the participants 22.
  • the collaboration platform 20 features the group awareness functionality.
  • Group awareness is defined as enabling each participant 22 in a collaborative work session to perceive the actions, activities and states of the other participants 22. Awareness is also sometimes referred to as "ubiquity".
  • Group awareness management is made possible through a dedicated space for awareness information, comprising parameters such as presence information, identification, localization, availability, activity level and history.
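  • as an illustration only, the awareness parameters listed above could be carried in a record such as the following; the field names are assumptions, not the platform's actual data model:

```python
# Assumed sketch of an awareness record that each peer could publish to the dedicated
# awareness space: presence, identification, localization, availability, activity
# level and history, as listed in the paragraph above.

from dataclasses import dataclass, field
from typing import List


@dataclass
class AwarenessInfo:
    participant_id: str          # identification
    present: bool                # presence information
    location: str                # localization (site or multimedia room)
    available: bool              # availability
    activity_level: int = 0      # crude proxy: number of recorded recent actions
    history: List[str] = field(default_factory=list)   # recent actions and tools used

    def record(self, action: str):
        self.history.append(action)
        self.activity_level = len(self.history)


if __name__ == "__main__":
    info = AwarenessInfo("participant-22", present=True, location="site-3", available=True)
    info.record("opened the shared whiteboard")
    print(info.participant_id, info.location, info.activity_level, info.history)
```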
  • returning to Figure 8, which shows the graphical interface 102 of the collaboration platform 20, the space 104 displays the names of all the currently connected participants 22, along with their respective activities and tools which are available to be shared.
  • the space 106 allows each participant 22 to know who wants to share what. It is to be understood that an administrator of the collaboration platform 20 may restrict the use and/or control of certain tools to specific participants 22.
  • inheritance allows a late-comer participant 22 to join and inherit all shared objects or collaborative objects that have been created or produced during an on-going collaborative work session. Moreover, the collaborative work session is not affected if its initiator disconnects therefrom.
  • the inheritance property is implemented by using the Net-Z middleware of Quazal inc., which was originally dedicated to the development of multi-participant games.
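  • the inheritance property can be pictured with the following sketch, in which every name is assumed (this is not the Net-Z API): a late-comer receives a copy of every shared object created so far, and the session keeps working even if its initiator leaves:

```python
# Assumed sketch of the inheritance property: late-comers inherit duplicas of all
# shared objects produced so far, and the session does not depend on its initiator.

import copy


class Session:
    def __init__(self, initiator):
        self.peers = {initiator: {}}         # peer -> its copies of shared objects
        self.shared_objects = {}             # objects created during the session

    def create_object(self, name, data):
        self.shared_objects[name] = data
        for copies in self.peers.values():
            copies[name] = copy.deepcopy(data)

    def join(self, late_comer):
        # The late-comer inherits duplicas of everything produced so far.
        self.peers[late_comer] = {n: copy.deepcopy(d) for n, d in self.shared_objects.items()}

    def leave(self, peer):
        # Removing any peer, including the initiator, does not end the session.
        self.peers.pop(peer, None)


if __name__ == "__main__":
    s = Session("site-1")
    s.create_object("whiteboard", {"strokes": 12})
    s.join("site-4")                          # late-comer inherits the whiteboard
    s.leave("site-1")                         # initiator disconnects; session carries on
    print(s.peers)                            # {'site-4': {'whiteboard': {'strokes': 12}}}
```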
  • the middleware layer 30 provides a set of specialized services, such as coherence, synchronization and session management, which facilitate the development and implementation of collaborative functionalities for the application layer 28.
  • the middleware layer 30 makes it possible for the collaboration platform 20 to inherit the performance and properties thereof, such as object duplication and migration, duplicated spaces, data extrapolation, remote procedure calls (RPC), etc.
  • the middleware layer 30 allows the collaboration platform 20 to inherit the following characteristics: a descriptive approach with a high level of abstraction, fault tolerance, reliability, flexibility (supporting P2P, Client/Server and hybrid architectures), interoperability (it can easily interoperate with other middleware or software components or systems), data integrity (via data encryption), use of compression, ability to evolve (1024 available classes and 4 million instantiations) and ability to perform load balancing.

Abstract

A system for supporting collaborative work allowing disseminated participants at remote sites to interact in real-time with each other through a fault tolerant network in a peer-to-peer configuration. The system comprises: (i) a plurality of collaboration platforms having a layered architecture to support collaborative applications such as conferencing, instant messaging, audio/video, shared whiteboards, shared data and files, collaborative browsing, application sharing, a virtual laboratory application, slide show and audio/video stream sharing with an annotation feature; and (ii) a plurality of hardware platforms and equipment consisting of: a workstation, a multimedia room equipped with a whiteboard, cameras and videoconferencing devices, a projection screen with an associated video projector and a smartboard, a laboratory room and an adapted operating room comprising a haptic device and a remote machine such as a robot.

Description

TITLE OF THE INVENTION
System for supporting collaborative work
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. provisional patent application No. 60/851,304, filed October 13, 2006, and Canadian patent application No. 2,563,866, filed October 13, 2006, which are hereby incorporated by reference.
FIELD OF THE INVENTION
[0002] The present invention generally relates to a system for supporting collaborative work. More specifically, but not exclusively, the present invention is concerned with a multimodal/multi-sensorial integrated system that supports collaborative work using common tools and applications, whereby geographically-disseminated participants thereof can interact in real-time with each other.
BACKGROUND OF THE INVENTION
[0003] A purpose of collaborative work and collaborative research is to support cost-effective creative solutions to complex collaborative problems that reduce risk, time and space constraints by means of intensive use of information and communication technologies. Several client/server based products have been developed for collaborative work and are offered on the market. However, there is a need for an improved system. For example, there is a need for robustness and reliability of the system, a need for higher performance for real-time collaborative applications and for sharing expensive or non-accessible equipment, a need for higher quality of multimedia, a need for integration of different applications together, a need for adaptability of the system to changes, and a need for more visibility of all the participants along with the services and objects to be shared.
[0004] The client/server based systems for collaborative work have a central failure point in the server. If the server fails or becomes unavailable, all the participants are affected. Also, the client/server based systems do not provide sufficient flexibility: since the server controls all the information, participants cannot directly share files and documents with each other. Therefore, Peer-to-Peer (P2P) based systems for collaborative work have been developed to offer more flexibility and robustness.
[0005] Collaborative work is becoming more and more attractive and relevant, especially among the circle of researchers, where real-time applications, specialized devices and expensive equipment are not always affordable to everybody. Therefore, the possibility of sharing a virtual laboratory to perform (collaborative) experimentations becomes very interesting.
[0006] Furthermore, current collaborative systems are generally decoupled in their applications. For example, one has to switch from one window to the other, or open another window, in order to access the browser to surf the Internet or to use any other application. None is known to integrate all the collaborative applications together in one single window (or workspace).
[0007] Since technologies change fast, collaborative systems need to be adaptable and allow for adding or incorporating new applications and tools both easily and efficiently.
[0008] Finally, most of the collaborative systems usually indicate only the presence of their participants. No further information concerning, for example, services and objects to be shared, or activity level and history, is provided. Therefore, there is a need for an improved group awareness functionality in such collaborative systems.
[0009] In order to overcome the above discussed drawbacks, there is a need for an improved system for supporting collaborative work.
OBJECTS OF THE INVENTION
[0010] Therefore, an object of the present invention is to provide a system for supporting collaborative work that is designed, structured and provided with features adapted to overcome the above discussed drawbacks of the current systems.
[0011] The foregoing and other objects, advantages and features of the present invention will become more apparent upon reading of the following non-restrictive description of illustrative embodiments thereof, given by way of example only, with reference to the accompanying drawings.
SUMMARY OF THE INVENTION
[0012] The system for supporting collaborative work presented in the present invention allows disseminated participants at remote sites to interact in real-time with each other through a fault tolerant network in a peer-to-peer configuration. The system comprises: (i) a plurality of collaboration platforms having a layered architecture to support collaborative applications such as conferencing, instant messaging, audio/video, shared whiteboards, shared data and files, collaborative browsing, application sharing, a virtual laboratory application, slide show and audio/video stream sharing with an annotation feature; and (ii) a plurality of hardware platforms and equipment consisting of: a workstation, a multimedia room equipped with a whiteboard, cameras and videoconferencing devices, a projection screen with an associated video projector and a smartboard, a laboratory room and an adapted operating room comprising a haptic device and a remote machine such as a robot.
[0013] In accordance with an aspect of the present invention there is provided a system for supporting collaborative work for allowing disseminated participants to interact in real-time with each other through a network, the system comprising: a plurality of collaboration platforms having a layered architecture to support at least one collaborative application and accessible by the disseminated participants at remote sites;
wherein the layered architecture includes at least one layer for interconnecting the plurality of collaboration platforms through the network in a peer-to-peer configuration, and wherein the plurality of collaboration platforms are configured so that, when a new collaboration platform interconnects with the plurality of collaboration platforms, the new collaboration platform inherits the at least one collaborative application.
[0014] In accordance with another aspect of the present invention, the system for supporting collaborative work includes an annotation feature allowing the disseminated participants to annotate the at least one collaborative application.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] In the appended drawings:
[0016] Figures 1a and 1b are block diagrams schematically illustrating a system for supporting collaborative work, in accordance with an illustrative embodiment of the present invention;
[0017] Figure 2 is a schematic diagram illustrating a collaboration platform architecture of the system for supporting collaborative work of Figures 1a and 1b;
[0018] Figure 3 is a block diagram schematically illustrating a deployment example of the infrastructure of the system for supporting collaborative work of Figures 1a and 1b;
[0019] Figure 4 is a schematic diagram illustrating a plurality of devices found in a multimedia room of the system for supporting collaborative work of Figures 1a and 1b;
[0020] Figure 5 is a block diagram schematically illustrating a system for supporting collaborative work comprising a laboratory room, in accordance with another illustrative embodiment of the present invention;
[0021] Figure 6 is a flow chart illustrating a method used for integrating dynamically new objects or applications in the system for supporting collaborative work of Figure 5;
[0022] Figure 7 is a block diagram schematically illustrating an application of the system for supporting collaborative work of Figures 1a and 1b for real-time telesurgery and telementoring;
[0023] Figure 8 is an image of a graphical participant interface of the system for supporting collaborative work of Figures 1a and 1b; and
[0024] Figure 9 is a block diagram schematically illustrating an implementation of the annotation feature, which enriches the system for supporting collaborative work of Figures 1a and 1b.
DETAILED DESCRIPTION
[0025] Non-restrictive illustrative embodiments of the system for supporting collaborative work in accordance with the present invention will now be described.
General description
[0026] The present invention is generally concerned with a system for supporting collaborative work. The system has an architecture that encompasses a plurality of layers, the integration of which results in a multimodal environment having the ability of gathering and integrating multi-sensorial information from a plurality of remote sites, this being described as "telepresence". The system can comprise several collaborative tools and applications, such as conferencing, instant messaging, chatting, audio/video, shared whiteboards, shared data and files, collaborative browsing, slide show, audio/video stream sharing, application sharing, eDrive, a multimedia, virtual laboratory and a virtual operating room. The system for supporting collaborative work also comprises a broadband fault tolerant network that can connect together several multimedia rooms, intended for sending to other remote sites, receiving from other remote sites and processing multi-sensorial information. The system can have recourse to various video, audio and/or haptic human-machine interfaces; it can make use of various media, such as text, voice, sound, touch feeling, graphics, gesture and motion, handwriting and shared applications.
[0027] The system for supporting collaborative work encompasses several collaborative tools and applications, all of which are integrated in the same work space. Moreover, its distributed architecture makes it possible to integrate new tools or applications thereinto.
[0028] The system is synchronous, which means that the course of the pertaining events is governed by a common phenomenon, such as a clock. This synchronicity property allows the participants of a collaborative work session to simultaneously interact with each other; a collaborative work session is characterized by a group of participants using a set of collaborative tools and/or applications during a fixed time frame to manipulate or produce artifacts, i.e. databases, documents, files, etc.
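The patent does not specify how this common clock is realized. As a purely illustrative sketch, a Lamport-style logical clock (the class and method names below are assumptions, not part of the described system) is one well-known way to give collaborative events a consistent temporal order across peers without a shared wall clock:

```python
# Illustrative only: a logical clock that keeps a temporal order of collaborative
# events among peers; this is not necessarily the mechanism used by the platform.

class LogicalClock:
    """Per-peer logical clock used to timestamp collaborative events."""

    def __init__(self):
        self.time = 0

    def local_event(self) -> int:
        # Any local action (e.g. a whiteboard stroke) advances the clock.
        self.time += 1
        return self.time

    def stamp_outgoing(self) -> int:
        # Attach the current timestamp to a message sent to other peers.
        return self.local_event()

    def merge_incoming(self, remote_time: int) -> int:
        # On receipt, keep the maximum of both clocks, then advance.
        self.time = max(self.time, remote_time) + 1
        return self.time


if __name__ == "__main__":
    peer_a, peer_b = LogicalClock(), LogicalClock()
    t = peer_a.stamp_outgoing()          # peer A annotates a slide
    peer_b.merge_incoming(t)             # peer B orders it after its own events
    print(peer_a.time, peer_b.time)      # 1 2
```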
[0029] The system for supporting collaborative work offers a multimedia support to receive or deliver information under various forms, particularly but not exclusively under visual, video, audio and haptic forms.
[0030] Participants can connect to the system either via the Internet, or by any other network, including an intranet or a local network. The system for supporting collaborative work may comprise one or a plurality of multimedia rooms, a multimedia room being (i) defined as a physical site from where one or many participants can access the system, and (ii) potentially equipped with a plurality of interfaces, i.e. video interface, audio interface, haptic interface, etc.
[0031] The system for supporting collaborative work might also comprise a remote facility, the functions of which can be remotely controlled by the participants. As special cases, this remote facility can be a laboratory, or a medical or a surgical room.
[0032] Also, the system for supporting collaborative work might include a specially dedicated multimedia room from where participants can enter the collaborative work session for learning purposes, this being described as "telementoring".
System for supporting collaborative work
[0033] The system for supporting collaborative work makes intensive use of, for example, the multimodal perceptual processes, the transmission of information via broadband networks, the integration of multi-sensor interfaces to a multiple interaction network for receiving and transmitting both data and control signals, for which the applications are suitable to enhance collaborative work, for example collaborative research. Rather than using a conventional Client/Server architecture, the collaborative environment 10, as illustrated in Figures 1a and 1b, has an open overlay self-scaling network architecture based on the distributed Peer-to-Peer (P2P) computing paradigm, which integrates various perceptual modes of information. It enables remote access, as well as the direct sharing of resources, responsibilities and services in a synchronous/asynchronous manner among geographically disseminated participants such as 22 (Figure 1b).
[0034] The system for supporting collaborative work 10 may be used in various contexts that need to support synchronous collaborative work, such as eLearning, eCommerce, eHealth (eDiagnostics, teleoperation and telementoring), eDesign, eGovernance, eBusiness, eBanking, eProcurement, eNegotiation, etc.
[0035] Figure 1a is an illustration of a system for supporting collaborative work 10 that comprises six geographically separate sites, numbered site 1 to site 6, linked together by a network 12. A participant 22 (as shown in Figure 1b) can join a collaborative work session either from a multimedia room 14a-14c or from a work station 16a-16b. In the illustrative example of Figure 1a, the system for supporting collaborative work 10 also comprises a laboratory room 18, the equipment of which can be remotely controlled by any participant 22 at sites 1-4 or 6. The network 12 is any network that links together the six sites and may comprise, for instance, Internet links, intranet links, local networks, private networks, etc.
[0036] Figure 1b is a more detailed view of a system for supporting collaborative work 10 comprising sites 1-5, wherein each site 1 to 3 comprises a collaboration platform 20. Participants 22 interact with the system for supporting collaborative work 10 either through a multimedia room 14, as in sites 1 and 2, or through a workstation 16 as in site 3. In Figure 1b, site 3 is linked to the network 12 through the Internet. Furthermore, the system for supporting collaborative work 10 includes a laboratory room 18 at site 4, which comprises laboratory servers (not shown), and a multicast video bridge 24 for managing all the cameras (not shown) that may be disseminated at the various sites 1-4.
Multi-layer conception of the system for supporting collaborative work
[0037] As illustrated in Figure 2, the collaboration platform 20 comprises a multi-layer architecture, comprising the following layers:
(i) a participant interface layer 26;
(ii) an application layer 28;
(iii) a middleware layer 30; and
(iv) a network layer 32.
[0038] The participant interface layer 26 represents various participant interfaces; it is the collection of graphical objects, i.e. menus, commands, controls, etc., that enable participants 22 to interact with the collaboration platform 20 and to use the various provided collaborative tools. The participant interface 102, as illustrated in Figure 8, may use a relaxed WYSIWIS (What You See Is What I See) interface metaphor and is characterized by optimal management of the displaying space, flexibility in the presentation and ease of access to collaborative services and functionalities.
[0039] The application layer 28 constitutes the bulk of the collaboration platform 20 and includes the collection of applicative logics. It uses the services of the middleware layer 30 to build collaborative tools, toolkits and applications, in order to satisfy the needs of the participants 22 in terms of communications, data sharing and collaboration. It may comprise the following key components:
• a conference metaphor for grouping functions related to work session management and for creating the illusion of a real meeting conference;
• an instant messaging tool or chat service enabling the exchange of text messages in public, in group or in private;
• an audio/video tool for synchronous multipoint videoconference (for example compliant with the Standard H.323);
• a whiteboard tool which is a collaborative drawing tool using a vector-based representation for offering a drawing space of several pages and a graphic toolbox;
• an eDrive tool or virtual drive enabling data, file and audio/video stream sharing and exchanging (downloading);
• a visioning tool enabling distance team slide show presentation with an annotation feature;
• a navigator tool enabling collaborative navigation (or browsing) for example on the Web or on Intranet;
• a distant shared on-line laboratory (virtual laboratory) versatile toolkit grouping a collection of collaborative applications, enabling participants to undertake experimentations (either in real mode or in simulation mode) and to control remote machines;
• an application sharing tool enabling remote control and sharing (in control and viewing mode) applications (for example MS-Office or other applications) with an annotation feature; and
• an audio/video stream sharing tool with an annotation feature (pointing and highlighting a specific region of the running video) and enabling collaborative control (moving, zooming, etc) of a remote camera.
[0040] The middleware layer 30 provides the required implementations to develop a distributed system that ensures communication between the participating processes through a duplicated object system (DoS) architecture. A duplicated object is a type of decentralized, distributed object supporting the above enumerated collaborative tools, toolkits and applications, in order to satisfy the needs of the participants 22 in terms of communications, data sharing and collaboration. These objects contain datasets describing all the network-relevant information about a tool, toolkit or application. A duplicated object can be either a duplication master, which is the controlling instance of the object, or a duplica, which is a complete copy of the master object.
[0041] The middleware layer 30 also provides a set of services, such as coherence, session and a collection of specialized network services, all of which facilitate the development and implementation of collaborative functionalities of the application layer 28. Finally, the middleware layer 30 provides a distributed clock that ensures the temporal order of information during communication processes.
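The duplication-master/duplica idea can be pictured with the following minimal sketch; the class and function names are assumptions made for illustration only and do not correspond to the actual DoS or Net-Z API:

```python
# Hypothetical sketch of paragraphs [0040]-[0041]: one controlling "duplication
# master" per object, and complete copies ("duplicas") on every other peer's station.

import copy


class DuplicatedObject:
    def __init__(self, name, payload, owner_peer):
        self.name = name            # e.g. "whiteboard-page-1"
        self.payload = payload      # network-relevant dataset of the tool
        self.owner_peer = owner_peer
        self.is_master = True       # created as the controlling instance

    def make_duplica(self, peer):
        dup = copy.deepcopy(self)
        dup.owner_peer = peer
        dup.is_master = False       # a duplica is a complete, non-controlling copy
        return dup


def publish(master, session_peers):
    """Give every other peer in the session its own duplica of the master object."""
    return {p: master.make_duplica(p) for p in session_peers if p != master.owner_peer}


if __name__ == "__main__":
    wb = DuplicatedObject("whiteboard-page-1", {"strokes": []}, owner_peer="site-1")
    duplicas = publish(wb, ["site-1", "site-2", "site-3"])
    print(sorted(duplicas))        # ['site-2', 'site-3']
```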
[0042] The network layer 32 handles all functions related to communication networks, including communication and routing protocols, bandwidth management, and quality of service. This layer 32 is characterized by the following key elements: an automatic message routing mechanism which can allow a peer to join another peer via a third peer, a configurable transport protocol, a hierarchical and cascading message passing model based on multicast groups, and message bundling techniques.
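As a hedged illustration of two of these elements, the sketch below shows a path to a peer being found through an intermediate peer and several small messages being bundled into one payload; the topology, function names and bundle format are assumptions, not the actual protocol:

```python
# Illustrative only: relay-path discovery over known peer links and simple message
# bundling; nothing here is taken from the platform's real transport.

import json


def find_path(source, target, links):
    """Breadth-first search for a relay path from source to target over known peer links."""
    frontier, seen = [[source]], {source}
    while frontier:
        path = frontier.pop(0)
        if path[-1] == target:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None


def bundle(messages):
    """Pack several small messages into one payload to cut per-message overhead."""
    return json.dumps({"bundle": messages}).encode()


if __name__ == "__main__":
    links = {"site-1": ["site-2"], "site-2": ["site-1", "site-3"], "site-3": ["site-2"]}
    print(find_path("site-1", "site-3", links))   # ['site-1', 'site-2', 'site-3']
    print(bundle([{"chat": "hi"}, {"pointer": [10, 20]}]))
```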
[0043] Furthermore, the application layer 28 allows control sharing
(collaborative manipulation commands), result sharing (the results of experimentations), experimentation visualization (video and sound) as well as the control of the devices or cameras used for visualization. Such a tool offers various laboratory options (some of them use the DataSocket Transfer Protocol of LabVIEW™) and allows the integration of new experimentations without requiring an extra development effort and without affecting the platform's other components.
[0044] The annotation feature, which enriches collaborative tools such as the visioning tool, the application sharing tool and the audio/video stream sharing tool by allowing pointing, highlighting and drawing over a presentation or a running video, may be implemented according to the schematic diagram of Figure 9. At the interface layer 26, a transparent GUI control 132 is superposed over the application automation GUI control 136. The graphical annotations (i.e. pointers, highlights, drawings, etc.) are translated to duplicated objects 134 at the application layer 28 and published over the session in a separate channel, i.e. published over Net-Z. A synchronization is performed between the duplicated objects 134 and the automation objects (data and user commands) 138. With this method, the shared content (for example a presentation) is not altered. Net-Z refers to the Net-Z middleware of Quazal inc., which was originally dedicated to the development of multi-participant games.
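A minimal sketch of this annotation path is given below, under assumed class names (Annotation and AnnotationChannel are illustrative, not the platform's own types): strokes captured on the transparent overlay become small shared objects published on a channel separate from the presentation itself, so the shared content is never modified:

```python
# Assumed sketch of the Figure 9 annotation path: overlay strokes are turned into
# shared records and published on a separate channel to every remote overlay.

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Annotation:
    kind: str                       # "pointer", "highlight" or "drawing"
    points: List[Tuple[int, int]]   # coordinates on the transparent overlay
    author: str


@dataclass
class AnnotationChannel:
    """Separate publication channel: the underlying document is left untouched."""
    published: List[Annotation] = field(default_factory=list)
    subscribers: list = field(default_factory=list)

    def publish(self, annotation: Annotation):
        self.published.append(annotation)
        for deliver in self.subscribers:
            deliver(annotation)     # each remote overlay redraws the annotation


if __name__ == "__main__":
    channel = AnnotationChannel()
    channel.subscribers.append(lambda a: print("remote overlay draws", a.kind, "by", a.author))
    channel.publish(Annotation("highlight", [(120, 80), (240, 80)], author="site-2"))
```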
Duplication of objects
[0045] The collaboration platform 20 may use, for example, the Net-Z middleware from Quazal inc. for duplicating objects and ensuring that all participants 22 of a collaborative work session have, on their respective work stations, a copy of all objects that are in use. Every copy of a duplicated object is made from the same master copy so as to (i) ensure coherence, (ii) maintain the pertaining functionalities and (iii) control the object's instances.
Decentralized management of applications and distributed architecture
[0046] The collaboration platform 20 is decentralized and symmetric, since it is based on a peer-to-peer (P2P) distributed architecture, and comprises a broadband network 12 that can connect a plurality of laboratory rooms 18 or multimedia rooms 14 together. The collaboration platform 20 aggregates various aspects of multimodal perceptual information in order to support collaborative work in different fields of activity.
[0047] This P2P distributed architecture means that the collaboration platform 20 does not depend on a central server; rather, the application components and the responsibilities of maintaining network services are distributed across multiple work stations. This renders the collaboration platform 20 more robust and tolerant to the failure or disconnection of any work station within the collaborative work session. In the event of a failure or disconnection, the duplication master objects migrate to another station and allow the collaborative work session to seamlessly carry on.

[0048] Figure 3 illustrates an example of deployment of the infrastructure of the collaboration platform 20. Each site 1-5 is connected to the network 12 with GigaEthernet links 40 and via a switch 42. Of course, other types of connections can be used and would be obvious to a person of ordinary skill in the art. For example, site 4 is connected to the switch 42 via an optical carrier 44. Also, a testbed network 46 can be present and connected to the network 12 for monitoring and performance analysis purposes.
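A minimal sketch of such a migration is given below, assuming a deterministic "lowest surviving peer identifier" policy that is not specified in the present description; every peer applies the same rule, so they all agree on the new master without a central coordinator.

def migrate_masters(objects, failed_peer, alive_peers):
    """objects: dict mapping object_id -> owner peer id. Returns migrated object ids."""
    successor = min(alive_peers)          # every peer picks the same successor
    migrated = []
    for object_id, owner in objects.items():
        if owner == failed_peer:
            objects[object_id] = successor
            migrated.append(object_id)
    return migrated


session_objects = {"whiteboard-1": "site-2", "virtual-lab-1": "site-4"}
print(migrate_masters(session_objects, failed_peer="site-2",
                      alive_peers=["site-1", "site-3", "site-4"]))
print(session_objects)   # whiteboard-1 is now mastered by site-1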
[0049] It is to be understood that if cameras are used, one of the sites 1-5 will include a multicast video bridge (not shown) for managing all the cameras.
Multimedia rooms
[0050] Turning now to Figure 4, there is illustrated a multimedia room
14, which provides an adapted, versatile environment that replicates real collaborative situations and can also provide a general-purpose virtual laboratory facility for large-scale experimentation on engineering problems in the context of collaborative research, for instance by using the LabVIEW™ software.
[0051] The multimedia room 14 can comprise and integrate into the collaborative environment 10 a plurality of features, the information of all of which can be accessed by the participants 22. The features can comprise, amongst others, whiteboards 50, smart boards (handwriting recognition acquisition and understanding) 52, screens 54, cameras 56, haptic devices 58, databases and applications, etc. Those features are schematically illustrated in Figure 4.
[0052] The multimedia room 14 can also be equipped with high-quality accessories and devices enabling intensive computing, high-definition (HD) visualization, 2D and 3D image processing (recognition), group communication, and videoconference services for multiple participants 22. Thanks to the high degree of integration of such intelligent interactive devices, such multimedia rooms 14 are innovative in the field of collaborative work. For instance, during teleconferencing, intelligent cameras 56 are autonomously guided to track the person 60 who is currently speaking, keeping him or her continuously in their field of vision. The speaker 60 may create his drawing directly on a shared whiteboard 50 by using the smart board 52, which provides handwriting recognition acquisition and understanding. He can also use a conventional whiteboard 50 to write notes, which can be captured by the cameras 56, which generally have a high resolution provided by CCD (Charge-Coupled Device) matrices. Furthermore, he can also undertake a demo in which he manipulates a remote machine (not shown), such as a robot for example, by using haptic devices 58.
[0053] By means of the graphical participant interface 102 (Figure 8) and a private camera 62 for each participant 22 in this multimedia room 14, distant participants 22 (from other multimedia rooms 14) can communicate and interact with the speaker 60 as well as with each other, for example by using the annotation feature in conjunction with the visioning tool, the application sharing tool and the audio/video stream sharing tool. Also, a sound system composed of a distributed microphone network (not shown) allows for accurate and smooth acquisition of voice, and a collection of loudspeakers (not shown) enables high-quality sound diffusion.
[0054] The graphical participant interface 102 (see Figure 8) may also be projected onto a wide main screen 54, in a large high-definition (HD) format, by means of a projector assembly 64. In addition, there is a digitization center (not shown in Figure 4) for storing, distributing and sharing digital data.
[0055] It is to be understood that the above description is a non-restrictive example of the multiple possibilities of a multimedia room 14.
Haptic devices and robots

[0056] The collaboration platform 20 may comprise one or several haptic devices 58. A haptic device 58 is a device that interfaces with the participant 22 through the sense of touch. By providing a force-feedback output, a haptic device 58 makes it possible to actuate teleoperators. The haptic devices 58 may be coupled to stereoscopic spectacles in order to perceive depth of field and, in this manner, add an extra information parameter that helps the participant 22 to properly perform a telemanipulation.
[0057] Two examples of use of a haptic device 58 are (i) conducting an experiment in a remote laboratory room 18 and (ii) performing a surgery on a remote patient. For instance, a haptic device 58 may comprise a Phantom Desktop™ used in connection with two pantins and a universal joint wrist, e.g. a Phantom 1.5A™ and a Freedom 6S™. Generally speaking, haptic devices 58 allow for controlling remote machines or robots. In the example of a remote laboratory room 18, the haptic devices 58 enable participants to start experimentations using different robots or to perform simulations of different machines. In the case of a virtual clinic, the haptic devices 58 allow a surgeon to manipulate surgical robots and instruments.
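By way of illustration, the following sketch shows a simplified teleoperation step in which the haptic position is sent to the remote robot and the returned contact force is rendered locally; the interfaces and the stubbed robot are assumptions for this sketch, not the APIs of the Phantom™ or Freedom 6S™ devices.

def teleoperation_step(haptic_position, robot):
    """One cycle of the control loop: send a position, render the force feedback."""
    command = {"target_position": haptic_position}    # would be scaled and clamped in practice
    feedback = robot.apply(command)                   # runs at the remote site
    return feedback["contact_force"]                  # rendered on the haptic arm


class FakeRobot:
    """Stub standing in for the remote manipulator and its force sensor."""
    def apply(self, command):
        x, y, z = command["target_position"]
        # pretend the tool touches a surface at z = 0 and pushes back proportionally
        return {"contact_force": (0.0, 0.0, max(0.0, -z) * 50.0)}


print(teleoperation_step((0.10, 0.02, -0.004), FakeRobot()))   # small push-back force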
Virtual laboratory
[0058] As previously mentioned, the collaboration platform 20 may encompass a laboratory room 18, which comprises a set of collaborative applications for performing experiments on remote equipment actuated by teleoperators. The laboratory room 18 allows participants to remotely perform interactive experiments, possibly by using haptic devices 58. Moreover, it is also suitable for distance learning, as it enables many geographically disseminated participants 22 to simultaneously conduct an experiment, to get and share its results and, possibly, to visualize a 3D video of the experiment in real time. The laboratory room 18 also offers the possibility to simulate an experiment from relevant models and databases. Of course, the laboratory room 18 can also be equipped with HD visualization and high-quality sound systems such as those found in the multimedia rooms 14.
[0059] Turning to Figure 5, a collaborative environment comprising a laboratory room 18 is shown. A plurality of sites 1-6, each comprising a collaboration platform 20 and either a multimedia room 14 or a workstation 16, as well as the laboratory room 18, are connected to the network 12. The network 12 can be a private network, shared by a group of universities and linking them together, for example. The laboratory room 18 comprises a virtual laboratory application 70 and a virtual laboratory application using LabVIEW™ 72.
[0060] The virtual laboratory application 70 is completely integrated into the collaboration platform 20 and includes a machine 76 or a robot connected to a data acquisition device 78 or card in order to monitor and save the results from experimentations. Those results are accessible at site 6 and can be shared with any participant 22 involved in and connected to a collaborative work session.
[0061] The laboratory room 18 also comprises a virtual laboratory application using LabVIEW™ 72. This virtual laboratory application 72 comprises a LabVIEW™ server 74, linked to a data acquisition device 78 or card, which is connected to a machine 76 to obtain data from that machine 76. Also, a machine 76 can be directly connected to the LabVIEW™ server 74 for data processing. Finally, the LabVIEW™ server 74 is linked to the network 12 so that it is accessible to all the participants 22 located at sites 1-6. Therefore, the disseminated participants 22 can perform on-line remote experimentations and share their results.
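As an illustrative sketch only, the snippet below shows an assumed shape for the shared result log: samples acquired at the laboratory site are appended to a collection that every connected participant can read; the field names and example values are placeholders, not LabVIEW™ data types.

import time

shared_results = []            # in the platform this would be a duplicated object

def acquire_sample(channel_values):
    """Called at the laboratory site each time the acquisition card produces data."""
    record = {"t": time.time(), "values": channel_values}
    shared_results.append(record)
    return record

def latest_results(n=10):
    """Any participant at sites 1-6 can render the most recent samples."""
    return shared_results[-n:]

# Illustrative values only
acquire_sample({"motor_speed": 1520.0, "torque": 0.82})
acquire_sample({"motor_speed": 1518.5, "torque": 0.81})
print(latest_results(2))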
[0062] It is to be understood that one of the sites 1-3 includes a multicast video bridge (not shown) for managing all the cameras.
[0063] The flexibility needed for performing on-line remote experiments is assured by the ability to dynamically integrate new objects and applications. For example, the collaboration platform 20 allows new virtual laboratory applications 70 or 72 to be added during a collaborative work session.
[0064] Referring now to Figure 6, there is shown a flow diagram of an illustrative example of a process 80 executed when adding new virtual laboratory applications 72 during a collaborative work session. The steps of the process 80 are indicated by blocks 82 to 88.
[0065] The process 80 starts at block 82, where a participant 22 creates a new virtual laboratory duplicated object, which contains description information (name, type, version, etc.) and the address of the laboratory application server 74 (a LabVIEW™ server, for example). Then, at block 84, the collaboration platform 20 publishes the duplicated object over Net-Z and, at block 86, remotely activates the virtual laboratory application 72 within LabVIEW™. Finally, at block 88, the new virtual laboratory application 72 is opened within the shared workspace dedicated to the virtual laboratory tool(s).
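The following Python sketch maps the four blocks of process 80 onto hypothetical platform services (publish_object, activate_remote_lab, open_in_workspace); these helpers stand in for the actual middleware calls, which are not detailed in the present description.

def add_virtual_laboratory(session, name, lab_type, version, server_address):
    # block 82: create the duplicated object describing the new laboratory
    lab_object = {"name": name, "type": lab_type, "version": version,
                  "server": server_address}
    # block 84: publish the duplicated object to every participant of the session
    session.publish_object(lab_object)
    # block 86: remotely start the laboratory application on its server
    session.activate_remote_lab(server_address, name)
    # block 88: open it inside the shared workspace reserved for the virtual laboratory tool(s)
    session.open_in_workspace("virtual-laboratory", lab_object)
    return lab_object


class _SessionStub:
    """Stand-in for the platform services used above."""
    def publish_object(self, obj): print("published", obj["name"])
    def activate_remote_lab(self, addr, name): print("activated", name, "at", addr)
    def open_in_workspace(self, space, obj): print("opened", obj["name"], "in", space)


add_virtual_laboratory(_SessionStub(), "dc-motor-bench", "LabVIEW", "1.0",
                       "lab-server.example:9000")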
[0066] Thus, a new virtual laboratory application 72 is developed and deployed within the LabVIEW™ development environment without affecting the collaboration platform 20. In this way, developers can produce new shared remote control applications, called virtual laboratories, in accordance with their particular needs, and use them in a collaborative work session.
[0067] It is to be understood that a similar process may be used when adding new collaborative tools, toolkits and applications, i.e. creating a duplicated object of the new collaborative tool, toolkit or application, publishing the duplicated object over Net-Z and opening the new collaborative tool, toolkit or application within the shared workspace dedicated to the collaborative tool, toolkit or application.
Telemedicine

[0068] One embodiment of the present invention is its use for performing telemedicine. More specifically, the present invention can be used for performing real-time medical or surgical interventions at a distance, for diagnosis or curative purposes, whether or not in a telementoring context. Figure 7 illustrates a telesurgery environment. In this example, a surgeon 90 is located in a medically-adapted multimedia room 14 with a collaboration platform 20 at site 2, while a distant patient 92 is located in a remote, adapted operating room 94, which incorporates the features of a multimedia room 14 and a collaboration platform 20, at site 1. One or several assistants 96 may also be present in the adapted operating room 94 to assist the surgeon 90 in his work.
[0069] The medically-adapted multimedia room 14 and the adapted operating room 94 are both equipped, particularly but not exclusively, with two-way cameras 56 and haptic devices 58 (see Figure 4). The cameras 56 allow (i) the surgeon 90 and the assistant(s) 96 to see each other and (ii) the surgeon 90 to see the distant patient 92, especially the area on which the surgery is performed. The haptic devices 58 allow the surgeon 90 to receive touch feedback from the contact of the instruments he manipulates. For example, this feedback to the surgeon 90 can be the pressure exerted by the patient's body on an instrument the surgeon 90 manipulates thereon or therein. More specifically, the surgeon 90 performs the surgery on the distant patient 92 via a remote robot 98 placed in the adapted operating room 94. To accomplish this task, the surgeon 90 uses a laboratory room 18 comprising a virtual laboratory application 72 and the LabVIEW™ server 74, and interacts with the robot manipulator with embedded camera 98 via the haptic devices 58 and a data acquisition device 78 or card, which is connected to the LabVIEW™ server 74. The camera embedded in the robot manipulator 98 captures a detailed video of the operation, which is displayed in high resolution on a main projection screen such as the screen 54 in Figure 4.
[0070] Audio/video tools facilitate the communications between the surgeon 90 and the assistant(s) 96. Furthermore, resident students 100, located at remote site 3 comprising a medically-adapted multimedia room 14, a laboratory room 18 and a collaboration platform 20, may participate in the operation session for the purpose of telementoring and learning. In another embodiment of the invention, several surgeons 90 may collaborate together and teleoperate on the same patient 92. The dedicated broadband network 12 guarantees secure real-time transmission and an acceptable latency for this kind of task. The high standards of communication security and fast response which are needed in a telesurgery context are thereby provided.
[0071] It is to be understood that one of the sites 1-3 includes a multicast video bridge (not shown) for managing all the cameras.
[0072] Turning now to Figure 8, there is shown the graphical interface 102 of the collaboration platform 20 as seen by a participant 22. A plurality of applications are integrated into one single window or workspace. Indeed, the space 104 allows for displaying all the connected and disseminated participants 22. The space 106 shows the available sharing documents from the different participants 22. The space 108 allows for chatting and instant messaging with all the connected participants 22. The space 110 is a whiteboard for drawing, writing, taking notes, calculating, etc. The space 112 allows participants 22 to browse the web and do searches. The spaces 114 and 116 allow for displaying pictures and videos from different machines in remote virtual laboratory rooms 18. They also allow for displaying the results of experimentations via graphs or tables. Those results can be manipulated and processed afterwards by each participant 22. The space 118 allows the remote control, sharing and annotation of applications such as, for example, MS-Office or other applications. For example, a participant 22 added annotations 119, which may be viewed by all other participants 22. As for the space 120, it allows audio/video streaming with an annotation feature (pointing and highlighting a specific region of the running video) and enables collaborative control (moving, zooming, etc.) of a remote camera. Finally, a menu bar 122 shows all the different commands available to the participants 22.

Group awareness
[0073] The collaboration platform 20 features a group awareness functionality. Group awareness is defined as enabling each participant 22 in a collaborative work session to perceive the actions, activities and states of the other participants 22. Awareness is also sometimes referred to as "ubiquity". Group awareness management is made possible through a dedicated space for awareness information, comprising parameters such as presence information, identification, localization, availability, activity level and history. As illustrated in Figure 8, which shows the graphical interface 102 of the collaboration platform 20, the space 104 displays the names of all the currently connected participants 22, along with their respective activities and the tools which are available to be shared. Furthermore, the space 106 allows each participant 22 to know who wants to share what. It is to be understood that an administrator of the collaboration platform 20 may restrict the use and/or control of certain tools to specific participants 22.
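For illustration, the sketch below shows one assumed structure for the awareness information kept per participant; the field names mirror the parameters listed above but are otherwise hypothetical.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class AwarenessRecord:
    participant: str
    present: bool
    location: str                 # site or room identifier
    available: bool
    activity: str                 # e.g. "editing whiteboard", "idle"
    shareable_tools: List[str] = field(default_factory=list)
    history: List[str] = field(default_factory=list)

    def update_activity(self, activity: str) -> None:
        # archive the previous activity so the history parameter can be displayed
        self.history.append(f"{datetime.now().isoformat()} {self.activity}")
        self.activity = activity


alice = AwarenessRecord("alice", True, "site-3", True, "idle",
                        shareable_tools=["slide show", "virtual laboratory"])
alice.update_activity("annotating shared video")
print(alice.activity, alice.shareable_tools)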
Inheritance
[0074] Another interesting feature of the collaboration platform 20 is the inheritance functionality. For example, inheritance allows a late-comer participant 22 to join and inherit all shared objects or collaborative objects that have been created or produced during an on-going collaborative work session. Moreover, the collaborative work session is not affected if its initiator disconnects therefrom.
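A minimal sketch of the inheritance behaviour is given below, under the assumption that joining simply copies the descriptions of all existing shared objects into the new peer's local store; the function and variable names are hypothetical.

def join_session(session_objects, new_peer, peer_stores):
    """session_objects: shared object descriptions already created in the session."""
    # the late-comer inherits a duplica of every object produced so far
    peer_stores[new_peer] = [dict(obj) for obj in session_objects]
    return len(peer_stores[new_peer])


objects = [{"id": "whiteboard-1"}, {"id": "slides-3"}, {"id": "virtual-lab-1"}]
stores = {}
print(join_session(objects, "late-comer-site-5", stores))   # 3 inherited objects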
[0075] In one embodiment, the inheritance property is implemented by using the Net-Z middleware of Quazal inc., which was originally dedicated to the development of multi-participant games. Referring back to Figure 2, the middleware layer 30 provides a set of specialized services, such as coherence, synchronization and session services, which facilitate the development and implementation of collaborative functionalities for the application layer 28. The middleware layer 30 makes it possible for the collaboration platform 20 to inherit the performance and properties thereof, such as object duplication and migration, duplicated spaces, data extrapolation, remote procedure calls (RPC), etc. Moreover, the use of the middleware layer 30 allows the collaboration platform 20 to inherit the following characteristics: a descriptive approach with a high level of abstraction, fault tolerance, reliability, flexibility (supporting P2P, Client/Server, and Hybrid architectures), interoperability (it can easily interoperate with other middleware or software components or systems), data integrity (via data encryption), use of compression, ability to evolve (1024 available classes and 4 million instantiations) and ability to perform load balancing.
[0076] Although the present invention has been described hereinabove by way of non-restrictive, illustrative embodiments thereof, these embodiments can be modified at will, within the scope of the appended claims, without departing from the spirit and nature of the subject invention.

Claims

WHAT IS CLAIMED IS:
1. A system for supporting collaborative work for allowing disseminated participants to interact in real-time with each other through a network, the system comprising:
a plurality of collaboration platforms having a layered architecture to support at least one collaborative application and accessible by the disseminated participants at remote sites;
wherein the layered architecture includes at least one layer for interconnecting the plurality of collaboration platforms through the network in a peer-to-peer configuration and wherein the plurality of collaboration platforms are configured so that when a new collaboration platform interconnects with the plurality of collaboration platforms the new collaboration platform inherits the at least one collaborative application.
2. A system for supporting collaborative work as defined in claim 1, wherein the layered architecture includes a participant interface layer, an application layer, a middleware layer and a network layer.
3. A system for supporting collaborative work as defined in claim 2, wherein the at least one layer for interconnecting the plurality of collaboration platforms includes the middleware layer and the network layer.
4. A system for supporting collaborative work as defined in claim 2, wherein the at least one collaborative application is supported by the application layer.
5. A system for supporting collaborative work as defined in claim 2, wherein when a new collaborative application is added at one of the plurality of collaboration platforms, the middleware layer a) creates a duplicated object of the new collaborative application; b) publishes the duplicated object to the plurality of collaboration platforms; and c) opens the new collaborative application within the plurality of collaboration platforms.
6. A system for supporting collaborative work as defined in claim 1, wherein the at least one collaborative application includes at least one element from the group consisting of: conferencing, instant messaging, chatting, audio/video, shared whiteboards, shared data and files, collaborative browsing, slide show, audio/video stream sharing, application sharing and eDrive.
7. A system for supporting collaborative work as defined in claim 1, wherein the network includes a broadband fault tolerant network.
8. A system for supporting collaborative work as defined in claim 1, wherein at least one of the plurality of collaboration platforms is associated with at least one element from the group consisting of: a workstation, a multimedia room, a laboratory room and an adapted operating room.
9. A system for supporting collaborative work as defined in claim 1, wherein at least one of the plurality of collaboration platforms includes at least one camera.
10. A system for supporting collaborative work as defined in claim 9, further comprising a multicast video bridge.
11. A system for supporting collaborative work as defined in claim 1, wherein at least one of the plurality of collaboration platforms includes at least one haptic device.
12. A system for supporting collaborative work as defined in claim 1, wherein at least one of the plurality of collaboration platforms includes at least one element from the group consisting of: a white board, a projection screen with an associated video projector and a smartboard.
13. A system for supporting collaborative work as defined in claim 1, wherein at least one of the plurality of collaboration platforms includes a virtual laboratory application.
14. A system for supporting collaborative work as defined in claim 13, wherein the virtual laboratory application includes a LabVIEW server.
15. A system for supporting collaborative work as defined in claim 14, wherein the layered architecture includes a middleware layer and wherein when a new virtual laboratory is added, the middleware layer a) creates a duplicated object of the new virtual laboratory; b) publishes the duplicated object to the plurality of collaboration platforms; c) activates remotely the virtual laboratory within the LabVIEW server; and d) opens a new virtual laboratory application within the plurality of collaboration platforms.
16. A system for supporting collaborative work as defined in claim 1, wherein at least one of the plurality of collaboration platforms includes a remote machine.
17. A system for supporting collaborative work as defined in claim 16, wherein the remote machine includes a robot.
18. A system for supporting collaborative work as defined in claim 1, wherein at least one of the collaborative applications includes an annotation feature allowing the disseminated participants to annotate the at least one of the collaborative applications.
19. A system for supporting collaborative work as defined in claim 18, wherein the layered architecture includes an application layer and wherein when at least one of the collaborative applications is annotated, the application layer a) creates a duplicated object of the annotation; and b) publishes the duplicated object to the plurality of collaboration platforms.
20. A system for supporting collaborative work as defined in claim 1, wherein the plurality of collaboration platforms include a graphical interface integrating the at least one collaborative application into a single window.
21. A system for supporting collaborative work as defined in claim 20, wherein the graphical interface is supported by a participant interface layer.
22. A system for supporting collaborative work as defined in claim 20, wherein the graphical interface displays the disseminated participants accessing the plurality of collaboration platforms.
23. A system for supporting collaborative work as defined in claim 20, wherein the graphical interface displays the available sharing documents from the disseminated participants.
PCT/CA2007/001826 2006-10-13 2007-10-15 System for supporting collaborative work WO2008043182A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA2702509A CA2702509A1 (en) 2006-10-13 2007-10-15 System for supporting collaborative work

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US85130406P 2006-10-13 2006-10-13
CA2,563,866 2006-10-13
US60/851,304 2006-10-13
CA002563866A CA2563866A1 (en) 2006-10-13 2006-10-13 System for supporting collaborative work

Publications (2)

Publication Number Publication Date
WO2008043182A1 true WO2008043182A1 (en) 2008-04-17
WO2008043182A8 WO2008043182A8 (en) 2008-07-03

Family

ID=39282380

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2007/001826 WO2008043182A1 (en) 2006-10-13 2007-10-15 System for supporting collaborative work

Country Status (2)

Country Link
CA (1) CA2702509A1 (en)
WO (1) WO2008043182A1 (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6338086B1 (en) * 1998-06-11 2002-01-08 Placeware, Inc. Collaborative object architecture
US6640241B1 (en) * 1999-07-19 2003-10-28 Groove Networks, Inc. Method and apparatus for activity-based collaboration by a computer system equipped with a communications manager
US20030018719A1 (en) * 2000-12-27 2003-01-23 Ruths Derek Augustus Samuel Data-centric collaborative computing platform
US6741904B1 (en) * 2001-02-16 2004-05-25 Benjamin C. Gage Real time design, development and delivery collaborative apparel solution digital platform
US20040181592A1 (en) * 2001-02-22 2004-09-16 Sony Corporation And Sony Electronics, Inc. Collaborative computer-based production system including annotation, versioning and remote interaction
US20020129106A1 (en) * 2001-03-12 2002-09-12 Surgency, Inc. User-extensible system for manipulating information in a collaborative environment
US20020183878A1 (en) * 2001-03-23 2002-12-05 Valentin Chartier Collaborative design
US20030217105A1 (en) * 2002-05-17 2003-11-20 Groove Networks, Inc. Method and apparatus for connecting a secure peer-to-peer collaboration system to an external system
US20050028006A1 (en) * 2003-06-02 2005-02-03 Liquid Machines, Inc. Computer method and apparatus for managing data objects in a distributed context
US20050193062A1 (en) * 2004-02-27 2005-09-01 International Business Machines Corporation Collaboration server, collaboration system, and method and program for collaboration server and system

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2311255A1 (en) * 2008-08-14 2011-04-20 Telefonaktiebolaget L M Ericsson (PUBL) Sharing media in a communication network
DE102008056917A1 (en) * 2008-11-12 2010-06-02 Universität Konstanz Cooperation window / wall
US9674286B2 (en) 2011-01-25 2017-06-06 Mitel Networks Corporation Collaboration system and method
WO2012100335A1 (en) * 2011-01-25 2012-08-02 Aastra Technologies Limited Collaboration system and method
WO2013035308A3 (en) * 2011-09-05 2013-05-23 Panasonic Corporation Television communication system, terminal, and method
US9124761B2 (en) 2011-09-05 2015-09-01 Panasonic Intellectual Property Management Co., Ltd. Television communication system, terminal, and method
FR2986931A1 (en) * 2012-02-10 2013-08-16 Damien Guerin Method for transmitting audio-visual content produced during surgical operation, involves reproducing audio-visual output stream at terminal of requesting user, where transmission and reproduction of output stream takes place in real-time
US9984245B2 (en) 2012-07-25 2018-05-29 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for providing a secure virtual research space
US10754491B1 (en) 2013-01-25 2020-08-25 Steelcase Inc. Emissive surfaces and workspaces method and apparatus
US11775127B1 (en) 2013-01-25 2023-10-03 Steelcase Inc. Emissive surfaces and workspaces method and apparatus
US11327626B1 (en) 2013-01-25 2022-05-10 Steelcase Inc. Emissive surfaces and workspaces method and apparatus
US9759420B1 (en) 2013-01-25 2017-09-12 Steelcase Inc. Curved display and curved display support
US9804731B1 (en) 2013-01-25 2017-10-31 Steelcase Inc. Emissive surfaces and workspaces method and apparatus
US11102857B1 (en) 2013-01-25 2021-08-24 Steelcase Inc. Curved display and curved display support
US10154562B1 (en) 2013-01-25 2018-12-11 Steelcase Inc. Curved display and curved display support
US10983659B1 (en) 2013-01-25 2021-04-20 Steelcase Inc. Emissive surfaces and workspaces method and apparatus
US10977588B1 (en) 2013-01-25 2021-04-13 Steelcase Inc. Emissive shapes and control systems
US10652967B1 (en) 2013-01-25 2020-05-12 Steelcase Inc. Curved display and curved display support
US11246193B1 (en) 2013-01-25 2022-02-08 Steelcase Inc. Curved display and curved display support
US11443254B1 (en) 2013-01-25 2022-09-13 Steelcase Inc. Emissive shapes and control systems
US20150261917A1 (en) * 2013-03-15 2015-09-17 Douglas K. Smith Federated Collaborative Medical Records System Utilizing Cloud Computing Network and Methods
US20170140105A1 (en) * 2013-03-15 2017-05-18 Douglas K. Smith Federated Collaborative Medical Records System
EP3021552A1 (en) * 2014-11-14 2016-05-18 Orange Method and apparatus for communicating via a shared interaction space
FR3028700A1 (en) * 2014-11-14 2016-05-20 Orange METHOD AND DEVICE FOR COMMUNICATING VIA A SHARED INTERACTION SPACE
US10638090B1 (en) 2016-12-15 2020-04-28 Steelcase Inc. Content amplification system and method
US11190731B1 (en) 2016-12-15 2021-11-30 Steelcase Inc. Content amplification system and method
US10897598B1 (en) 2016-12-15 2021-01-19 Steelcase Inc. Content amplification system and method
US11652957B1 (en) 2016-12-15 2023-05-16 Steelcase Inc. Content amplification system and method
US10264213B1 (en) 2016-12-15 2019-04-16 Steelcase Inc. Content amplification system and method
WO2021252023A1 (en) * 2020-06-08 2021-12-16 Dropbox, Inc. Intelligently generating and managing third-party sources within a contextual hub
US11461428B2 (en) 2020-06-08 2022-10-04 Dropbox, Inc. Intelligently generating and managing third-party sources within a contextual hub
US11853380B2 (en) 2020-06-08 2023-12-26 Dropbox, Inc. Intelligently generating and managing third-party sources within a contextual hub
US11893075B2 (en) 2020-06-08 2024-02-06 Dropbox, Inc. Intelligently generating and managing third-party sources within a contextual hub

Also Published As

Publication number Publication date
CA2702509A1 (en) 2008-04-17
WO2008043182A8 (en) 2008-07-03

Similar Documents

Publication Publication Date Title
WO2008043182A1 (en) System for supporting collaborative work
Pang et al. Collaborative 3D visualization with CSpray
CN106295107A (en) A kind of medical image that realizes synchronizes the method and system of the consultation of doctors
US9386271B2 (en) System and method for synthesizing and preserving consistent relative neighborhood position in multi-perspective multi-point tele-immersive environments
US9826196B2 (en) System and method for synthesizing and preserving consistent relative neighborhood position in multi-perspective multi-point tele-immersive environments
Kaeri et al. Agent-based system architecture supporting remote collaboration via an internet of multimedia things approach
Buckingham Shum et al. Lyceum: Internet voice groupware for distance learning
Intapong et al. Modular web-based collaboration platform
CA2563866A1 (en) System for supporting collaborative work
Valin et al. Sharing viewpoints in collaborative virtual environments
Singh et al. Towards environment-to-environment (e2e) multimedia communication systems
Vertegaal et al. Look who's talking: the GAZE groupware system
DeFanti et al. Technologies for virtual reality/tele-immersion applications: issues of research in image display and global networking
Koleva et al. Experiencing a presentation through a mixed reality boundary
Kirk Turn it this way: Remote gesturing in video-mediated communication
Drira et al. A multi-paradigm layered architecture for synchronous distance learning
Guo et al. Tangible video communication over the Internet
ter Haar et al. Remote Expert Assistance System for Mixed-HMD Clients over 5G Infrastructure
Billinghurst et al. Collaboration with wearable computers
Lucca et al. An overview of systems enabling computer supported collaborative learning requiring immersive presence (CSCLIP)
Villemur et al. Multimedia tools supporting the work of distributed synchronous cooperative groups
Geyer et al. Integrating support for collaboration-unaware VRML models into cooperative applications
Sun et al. Implementing three-party desktop videoconferencing
Törlind Distributed engineering: tools and methods for collaborative product development
Kong et al. Next-generation collaboration environments for interactive tele-medical consultation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07815977

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07815977

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2702509

Country of ref document: CA