US20100146395A1 - Method and System for Exploiting Interactions Via A Virtual Environment - Google Patents


Info

Publication number
US20100146395A1
US20100146395A1 (application US12/329,905)
Authority
US
United States
Prior art keywords
virtual representation
virtual
avatar
response
components
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/329,905
Inventor
Gustavo De Los Reyes
Sanjay MacWan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AT&T Intellectual Property I LP
Original Assignee
AT&T Intellectual Property I LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AT&T Intellectual Property I LP filed Critical AT&T Intellectual Property I LP
Priority to US12/329,905
Assigned to AT&T INTELLECTUAL PROPERTY I, L.P. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MACWAN, SANJAY; DE LOS REYES, GUSTAVO
Publication of US20100146395A1
Priority to US15/362,401 (granted as US10943397B2)
Priority to US17/169,936 (published as US20210166488A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • H04L 12/282 Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L 67/125 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/131 Protocols for games, networked simulations or virtual reality

Definitions

  • FIG. 4 shows a virtual logical view 300 of a video distribution channel providing video to the multimedia center 22 .
  • the video distribution channel includes servers 233 and 224 (or applications on the servers 233 and 224 ) that generate the video to be distributed, router 250 that receives the video signal from servers 233 and 224 and routes it to server 212 that is responsible for distribution of the video signal.
  • the video signal is then sent through switch 280 to residential gateway 105 for distribution to the computer 110 and/or the television 120 via set top box 125 .
  • the virtual logical view 300 is only exemplary and that many different logical views may be built to model the distribution of many different types of signals.
  • the logical view 300 may not include all the components in the video distribution channel.
  • for example, between the switch 280 that is in the server room 32 and the residential gateway 105 , there may be other network components such as additional servers, routers, switches, repeaters, etc.
  • the user may build the virtual logical view in any manner that the user is comfortable interacting with the view.
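The video distribution channel of the virtual logical view 300 can be sketched as a directed graph of components. This is a minimal illustration, not part of the patent: the component identifiers follow the reference numerals in the text, while the graph representation and search helper are assumptions.

```python
# Sketch: virtual logical view 300 as a directed graph. Edges follow the
# signal flow described in the text (servers 233/224 -> router 250 ->
# server 212 -> switch 280 -> residential gateway 105 -> end devices).
VIDEO_CHANNEL = {
    "server_233": ["router_250"],
    "server_224": ["router_250"],
    "router_250": ["server_212"],
    "server_212": ["switch_280"],
    "switch_280": ["residential_gateway_105"],
    "residential_gateway_105": ["computer_110", "set_top_box_125"],
    "set_top_box_125": ["television_120"],
    "computer_110": [],
    "television_120": [],
}

def delivery_path(source, destination, channel=VIDEO_CHANNEL):
    """Depth-first search for the chain of components from source to destination."""
    stack = [(source, [source])]
    while stack:
        node, path = stack.pop()
        if node == destination:
            return path
        for nxt in channel.get(node, []):
            if nxt not in path:          # avoid revisiting components
                stack.append((nxt, path + [nxt]))
    return None                          # no route in this logical view

print(delivery_path("server_233", "television_120"))
```

A logical view that omits intermediate components (as the text allows) would simply collapse some of these edges.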
  • the user may be considered to be a person associated with the entity that is distributing the multimedia signals who is responsible for security applications.
  • the user is attempting to run a test that is designed to disrupt the video signal that is being distributed to the multimedia center 22 of the home environment 20 . Accordingly, the user may select the video distribution channel virtual logical view 300 from a list of virtual logical views.
  • the virtual environment 10 will display the virtual logical view 300 to the user.
  • one or both of the virtual physical views 100 and 200 may also be displayed.
  • the virtual environment 10 may include multiple displays or multiple display panes to display multiple virtual views.
  • the virtual physical views 100 and 200 may be correlated to the virtual logical view 300 .
  • the boxes illustrating the various components of the virtual logical view 300 may be outlined in a specific color.
  • the components illustrated in the virtual physical views 100 and 200 may be colored in the same manner to illustrate the components that are involved in the selected logical view 300 .
  • displaying two views (e.g., the physical view and the logical view) is not required.
  • a user may be comfortable with only the physical view or the logical view and it may not be necessary to show the other view in the virtual environment 10 .
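The color correlation between views described above can be sketched as a set intersection: components that appear in both a physical view and the selected logical view receive the same outline color. The view contents follow the figures; the highlight format and color are illustrative assumptions.

```python
# Sketch: correlating virtual physical views 100/200 with virtual logical
# view 300 by outlining shared components in one color.
PHYSICAL_VIEW_100 = {"residential_gateway_105", "computer_110", "television_120",
                     "set_top_box_125", "stereo_130", "chairs_140", "desk_150"}
PHYSICAL_VIEW_200 = {"server_212", "server_224", "server_233",
                     "router_250", "switch_280", "router_240", "switch_270"}
LOGICAL_VIEW_300 = {"server_233", "server_224", "router_250", "server_212",
                    "switch_280", "residential_gateway_105", "computer_110",
                    "television_120", "set_top_box_125"}

def highlight(physical_view, logical_view, color="orange"):
    """Return {component: color} for components present in both views."""
    return {comp: color for comp in sorted(physical_view & logical_view)}

print(highlight(PHYSICAL_VIEW_200, LOGICAL_VIEW_300))
```

Components outside the selected logical view (e.g., the stereo 130 or desk 150) are left unhighlighted, which is how a user would pick out the video distribution channel within a full room model.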
  • there is an actual physical connection between the virtual environment 10 (e.g., the workstation 15 executing the virtual environment 10 ) and the physical components modeled by the virtual views 100 - 300 .
  • the virtual physical view 100 may show the video signal on the television 120 or computer 110 that is actually being displayed on the television screen or computer screen in the home environment 20 .
  • the components illustrated in the virtual environment 10 may have various functionalities associated with the components that the user may select to perform.
  • the associated functionalities may be selected, for example, using a dropdown menu associated with the component (displayed in either the physical views 100 and 200 or logical view 300 ).
  • the server 233 may be generating a first video stream and the server 224 may be generating a second video stream.
  • a user, by selecting one of the servers 233 and 224 , may be able to toggle the video stream that is being displayed on the television screen 120 .
  • this control is exerted both in the virtual environment (e.g., what is being displayed on the television 120 screen of virtual physical view 100 ) and on the actual television screen in multimedia center 22 .
  • the virtual environment 10 may be used to allow a user to interact and exert control over components in the real world environment.
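The mirrored control described above (a selection in the virtual view changing both the modeled and the actual device) can be sketched as follows. The class and method names are illustrative; the patent does not specify an API.

```python
# Sketch: selecting server 233 or 224 from a component's dropdown toggles the
# stream shown on both the virtual and the actual television 120.
class Television:
    """Stands in for either the modeled or the actual television 120."""
    def __init__(self, name):
        self.name = name
        self.stream = None
    def show(self, stream):
        self.stream = stream

class VirtualEnvironment:
    def __init__(self):
        self.virtual_tv = Television("virtual television 120")
        self.actual_tv = Television("actual television 120")  # real-world link
        # Assumed stream labels for the two generating servers:
        self.streams = {"server_233": "stream A", "server_224": "stream B"}
    def select_server(self, server):
        """Dropdown action: route the chosen server's stream to both screens."""
        stream = self.streams[server]
        self.virtual_tv.show(stream)   # update virtual physical view 100
        self.actual_tv.show(stream)    # exert control in multimedia center 22

env = VirtualEnvironment()
env.select_server("server_224")
print(env.virtual_tv.stream, "|", env.actual_tv.stream)  # stream B | stream B
```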
  • This integrating of the virtual environment 10 with the real world environment may lead to many unforeseen interactions and consequences.
  • the creation of avatars that can “roam” freely within the virtual environment 10 may be used by a system developer, system administrator, etc. to explore these interactions in the virtual environment 10 to determine both the capabilities and vulnerabilities of the components within the real world environment.
  • the following provides examples of avatars within the exemplary virtual environment 10 .
  • the exemplary avatars will be used to produce security exploits that may, for example, lead to new security mechanisms.
  • the avatars do not need to be limited to security applications, but may be used for any type of functionality that may be deployed or tested for a real world environment network or component, such as provisioning, network throughput, troubleshooting, etc.
  • a first exemplary avatar may appear to the virtual environment 10 as an Internet Protocol (“IP”) packet that will be able to “flow” through a system from end-to-end. As it goes through each element, it will be able to morph into the correct packet that will enable it to pass through any security checkpoints, such as firewalls, and reach its destination. At that point, it may be able to “own” the destination system by exploiting other vulnerabilities.
  • a user may select the virtual logical view 300 shown in FIG. 4 and select to inject the IP packet avatar into the server 233 .
  • the IP packet avatar may then flow to the router 250 , the server 212 , the switch 280 , the residential gateway 105 and each of the computer 110 and the set top box 125 .
  • the IP packet avatar could morph into the correct configuration to pass through each device. That is, the model of the devices in the virtual environment 10 will include the same functionalities as the actual devices that the virtual environment 10 is modeling.
  • the virtual components will process IP packets in the same manner as the actual devices.
  • each component will process an IP packet in accordance with the protocol stack included in the device.
  • Each layer of a protocol stack will strip away various information from the IP packet to process the specific functionality associated with the layer so that the IP packet may then be repackaged and forwarded to the next device along the path to the final destination.
  • the layers of different devices may require different information to continue to process an IP packet.
  • the residential gateway 105 may include a firewall to exclude malicious IP packets from entering the residence. The firewall looks for specific information in each IP packet before allowing the IP packet to enter the residence.
  • the IP packet avatar will have the ability to mimic any of this information in order to pass through each of the devices. Then, when the IP packet avatar reaches the final destination (e.g., the computer 110 ), the IP packet avatar would now be able to exploit any security vulnerabilities of that destination device. Thus, a user can inject the IP packet avatar to determine what security breaches a malicious IP packet can exploit in the system.
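The IP packet avatar's behavior, morphing at each hop so its headers satisfy that device's checkpoint (such as the firewall in the residential gateway 105), can be sketched as below. The per-device rules and header fields are invented for the demonstration; the patent describes the capability, not a concrete rule format.

```python
# Sketch: an IP packet avatar that adopts whatever header values each device's
# checkpoint expects, then reports which hops it passed.
DEVICE_RULES = {
    "router_250": {"ttl_min": 1},
    "switch_280": {"vlan": 10},
    "residential_gateway_105": {"dst_port": 80, "flags": "ACK"},  # firewall
}

class IPPacketAvatar:
    def __init__(self):
        self.headers = {"ttl": 64}
        self.trace = []
    def morph_for(self, rules):
        """Mimic the information the device's checkpoint is looking for."""
        for field, value in rules.items():
            if field != "ttl_min":
                self.headers[field] = value
    def traverse(self, path):
        for device in path:
            rules = DEVICE_RULES.get(device, {})
            self.morph_for(rules)
            passed = (self.headers["ttl"] >= rules.get("ttl_min", 0) and
                      all(self.headers.get(f) == v
                          for f, v in rules.items() if f != "ttl_min"))
            self.trace.append((device, passed))
        return all(p for _, p in self.trace)

avatar = IPPacketAvatar()
print(avatar.traverse(["router_250", "switch_280", "residential_gateway_105"]))
```

Reaching the final destination with every checkpoint passed is the condition under which the avatar would then probe that device's own vulnerabilities.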
  • another exemplary avatar is a monitoring avatar that can monitor the virtual environment 10 for signs of malicious avatars, such as the above-described IP packet avatar, or other misbehaving avatars.
  • the monitoring avatar may also include the functionality to respond to the malicious avatars in order to protect the system.
  • a user may implement an attack on the system with one or more malicious avatars and determine whether the monitoring avatars are able to identify the malicious avatars and take the proper corrective action to protect against the malicious avatars.
  • a user who is attempting to protect the physical components of the network can launch a variety of attacks in the virtual environment 10 to determine if the network security measures implemented in the actual network and devices (as modeled by the monitoring avatar in the virtual environment 10 ) can protect against the various attacks without having to launch an actual attack against the physical network.
  • a basic attack avatar may embody any known attack that can be used within a network. Examples may include denial of service attacks, eavesdropping, data modification, IP spoofing, sniffer attacks, etc.
  • the basic attack avatar may be modified as new types of attacks are developed allowing a user to launch attacks within the virtual environment 10 to continuously assess the vulnerability of the actual network.
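The interplay between a basic attack avatar and a monitoring avatar can be sketched with a denial of service example: the attack avatar floods a target, and the monitoring avatar identifies the flood and takes the corrective action of blocking the source. The threshold, traffic format, and addresses are assumptions made for the demonstration.

```python
# Sketch: a monitoring avatar detecting a denial-of-service flood launched by
# a basic attack avatar in the virtual environment, then blocking the source.
from collections import Counter

class MonitoringAvatar:
    def __init__(self, flood_threshold=100):
        self.flood_threshold = flood_threshold
        self.blocked = set()
    def observe(self, events):
        """events: iterable of (source_ip, destination) pairs seen in the model."""
        per_source = Counter(src for src, _ in events)
        for src, count in per_source.items():
            if count >= self.flood_threshold:   # looks like a DoS flood
                self.blocked.add(src)           # corrective action: block it
        return self.blocked

# A basic attack avatar floods server 212 from one address; normal traffic
# from another address should be left alone.
attack_traffic = [("10.0.0.66", "server_212")] * 150
background = [("10.0.0.5", "server_212")] * 3

monitor = MonitoringAvatar()
print(monitor.observe(attack_traffic + background))  # {'10.0.0.66'}
```

Running such attacks only in the virtual environment is what lets the user assess the real network's defenses without launching an actual attack.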
  • another example of an attack avatar may be a social engineering avatar.
  • the social engineering avatar may launch an attack in the virtual environment 10 , for example, by interacting with other avatars.
  • Social engineering attacks are those aimed to get proprietary information by conning others.
  • a user can launch the social engineering avatar to interact with other avatars to determine the types of attacks to which other users are vulnerable.
  • the social engineering avatar may engage another user's avatar in a chat session and ask a series of questions aimed at obtaining private information from the other user.
  • the social engineering avatar may then provide the user that launched the attack information on the effectiveness of certain techniques to obtain user's private information.
  • the social engineering avatar may also be able to use the user's private information to launch other attacks.
  • an avatar may be a discontinuity avatar that continuously probes the virtual environment 10 in search of discontinuities that may be exploitable. For example, this avatar can automatically walk the entire virtual environment 10 while “clicking” or otherwise exercising its powers in order to determine if there is an unexpected response. Any unexpected response may indicate a software bug that may be exploitable. This avatar will especially probe the edges of the world where there may be programming discontinuities. A special case of the discontinuity avatar may be able to “see” everything in the virtual environment 10 . It will use its powers of teleporting its vision in order to get into secure areas to carry out security exploits.
  • a final exemplary avatar may be a self-developing avatar. This avatar will increase in knowledge by virtue of its interaction with other avatars and with the virtual environment 10 . It will develop independently of its “master” in the real world environment. It will learn the tricks of hacking the virtual environment 10 just as real-world hackers learn their trade. This self-developing avatar could turn out to be the most powerful avatar because it may exhibit the most unpredictable behavior.
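The discontinuity avatar's probing, walking the entire virtual environment while exercising its powers and flagging any unexpected response, can be sketched on a toy world. The world map, response values, and edge behavior are invented purely to illustrate the idea that misbehavior concentrates at programming discontinuities.

```python
# Sketch: a discontinuity avatar that walks every location in a toy virtual
# environment, "clicks", and records unexpected responses that may indicate
# exploitable bugs.
EXPECTED = {"ok", "locked", "empty"}

def world_response(x, y, width=4, height=4):
    """Toy environment: the interior answers 'ok'; the edges misbehave."""
    if 0 < x < width - 1 and 0 < y < height - 1:
        return "ok"
    return "segfault"        # programming discontinuity at the world's edge

def probe_world(width=4, height=4):
    findings = []
    for x in range(width):          # walk the entire environment
        for y in range(height):
            response = world_response(x, y, width, height)
            if response not in EXPECTED:
                findings.append(((x, y), response))
    return findings

print(len(probe_world()))   # → 12 suspicious edge locations in a 4x4 world
```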
  • FIG. 5 provides an exemplary method 400 showing the deployment of an avatar within the virtual environment 10 .
  • the exemplary avatar being deployed is shown as a security related avatar, but this is only exemplary.
  • the real world environment is virtualized to create the virtual environment 10 .
  • the exemplary avatar is deployed within the virtual environment 10 .
  • the avatar will then perform its functionality. For example, if the deployed avatar is the IP packet avatar, the avatar will perform according to the description of such an avatar described above.
  • In step 430 it is determined whether the avatar exploited any security issues within the virtual environment. For example, was the IP packet avatar able to access a user's environment and gain access to a user's component? In another example, it may be determined whether the social engineering avatar was able to obtain a user's private information. If the avatar is not able to exploit any security issues in step 430 , the results may be displayed to the user in step 450 . The results in such a case may simply be that the avatar has been deployed, but that it has not been able to cause any problems within the virtual environment 10 .
  • this information may also be displayed to the user in step 450 .
  • the display may show the user, for example, the type of vulnerability that the avatar exploited or the type of successful attack launched by the avatar.
  • the method may continue to step 440 to determine if any countermeasures deployed in the network can resolve the issue created by the avatar. For example, if the avatar is a basic attack avatar that launches a denial of service attack, the network may include countermeasures such as blocking traffic from a certain IP address or range of addresses, disconnecting offending nodes or edge devices of the network, etc. to combat such an attack.
  • In step 440 it is determined whether these countermeasures are successful, or whether any countermeasures were attempted to deal with the issue at all. If the countermeasures are successful, the results are displayed to the user in step 450 .
  • the display may indicate the type of countermeasure that was used and how it mitigated the attack. If the countermeasure was unsuccessful or not deployed at all, the display may indicate this information and may also indicate a potential type of countermeasure that may be used.
  • security type avatars are not the only type of avatars that may be deployed within the virtual environment. For example, there may be an avatar that is related to network routing that runs through various network routing scenarios based on different network loadings and other factors to determine a best route for packets within the network. This avatar may run constantly in the virtual environment and the results may be used to alter the routing tables of routing devices in the real world environment. Thus, the deployment method for different avatars may be different depending on the functionality provided by the avatars.
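The flow of exemplary method 400 can be sketched end to end: virtualize the environment, deploy the avatar, let it act, check whether it exploited a security issue, try countermeasures, and display the results. Only steps 430, 440, and 450 are numbered in the excerpt; the callables stand in for the avatar and countermeasure behavior the text describes.

```python
# Sketch of exemplary method 400 (FIG. 5). deploy_avatar returns a description
# of an exploited issue, or None if nothing was exploited.
def run_method_400(deploy_avatar, countermeasures):
    results = []
    environment = {"virtualized": True}          # virtualize the real world
    exploit = deploy_avatar(environment)         # deploy avatar; it performs
    if exploit is None:                          # step 430: issue exploited?
        results.append("avatar deployed; no security issue exploited")
    else:
        results.append(f"exploited: {exploit}")
        resolved = any(cm(exploit) for cm in countermeasures)  # step 440
        results.append("countermeasure succeeded" if resolved
                       else "no successful countermeasure; consider new ones")
    return results                               # step 450: display to user

# Example: a DoS attack avatar, countered by blocking the offending address.
dos_avatar = lambda env: "denial of service from 10.0.0.66"
block_source = lambda exploit: "10.0.0.66" in exploit
print(run_method_400(dos_avatar, [block_source]))
```

A non-security avatar (such as the routing avatar mentioned above) would replace the exploit check with its own success criterion, so the deployment loop differs per avatar functionality.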

Abstract

A method on a computer readable storage medium operable to store a virtual representation of a plurality of physical components, introduce a component having a predefined functionality to interact with the virtual representation and generate indications of a response of the virtual representation to the interactions of the component. A system having a memory configured to store a virtual representation of a plurality of physical components and a processor configured to introduce a component having a predefined functionality to interact with the virtual representation and generate indications of a response of the virtual representation to the interactions of the component.

Description

    BACKGROUND
  • As embedded devices (e.g., any device that includes a processor, controller, micro-controller or other type of computing device) become ubiquitous in a variety of environments such as the home and the workplace, users have a desire for better interaction with such devices. One manner of providing better interaction with a variety of devices or a series of devices that are providing a specific functionality for the user (e.g., multiple devices that make up a multimedia center) is to create a virtual environment that represents the physical devices. Some of these virtual environments allow a user to control the physical devices by interacting with the virtual environment. An example of such a virtual environment is described in U.S. Provisional Patent Application 61/096,960 entitled “Method and System for Controlling Physical Components Via A Virtual Environment” filed on September 15, 2008.
  • SUMMARY OF THE INVENTION
  • A computer readable storage medium storing a set of instructions that are executable by a processor, the set of instructions being operable to store a virtual representation of a plurality of physical components, introduce a component having a predefined functionality to interact with the virtual representation and generate indications of a response of the virtual representation to the interactions of the component.
  • A system having a memory configured to store a virtual representation of a plurality of physical components and a processor configured to introduce a component having a predefined functionality to interact with the virtual representation and generate indications of a response of the virtual representation to the interactions of the component.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a schematic representation of various environments according to an exemplary embodiment of the present invention.
  • FIG. 2 shows a virtual physical view of a multimedia center of a home environment according to an exemplary embodiment of the present invention.
  • FIG. 3 shows a virtual physical view of a server room of a central office environment for generating and distributing the multimedia information destined for the multimedia center according to an exemplary embodiment of the present invention.
  • FIG. 4 shows a virtual logical view of a video distribution channel providing video from the central office environment to the multimedia center of the home environment according to an exemplary embodiment of the present invention.
  • FIG. 5 provides an exemplary method showing the deployment of an avatar within a virtual environment according to an exemplary embodiment of the present invention.
  • DETAILED DESCRIPTION
  • The exemplary embodiments of the present invention may be further understood with reference to the following description and the related appended drawings, wherein like elements are provided with the same reference numerals. The exemplary embodiments of the present invention are related to systems and methods for controlling and/or interacting with a virtual environment that models physical devices and/or applications. The mechanism for interacting with the virtual environment will be referred to as an “avatar.” In the normal computing lexicon an avatar is a computer user's representation of himself/herself or alter ego, whether in the form of a three-dimensional model used in computer games, a two-dimensional icon used on Internet forums and other communities, or a text construct. Thus, an avatar is an “object” representing the embodiment of the user. As used throughout this description, the term “avatar” may refer to the meaning associated with the normal computing usage. However, the term “avatar” may also be used to describe a functionality that does not necessarily need to mimic a user. For example, an avatar may be able to take advantage of special powers that the avatar enjoys by virtue of its existence in the virtual world. Examples of these special powers will be described in greater detail below.
  • However, prior to describing the exemplary embodiments of the avatars, an exemplary virtual environment will be described within which the avatars may be deployed. The exemplary embodiment of a virtual environment will be described with reference to a virtual environment that models a server network providing multimedia capabilities to a home environment. However, those skilled in the art will understand from the following description that the avatars may be deployed within any virtual environment.
  • FIG. 1 shows a schematic representation of various environments. The environments include a home environment 20 that includes a multimedia center 22, a kitchen 24 and an HVAC system 26. In this example, the home environment is discretely broken up into physical locations (e.g., multimedia center 22 and kitchen 24) or physical function (e.g., HVAC system 26) where controllable components exist. A more detailed view of the multimedia center will be provided below. However, it should be noted that the present invention is not limited to multimedia components, but may be implemented to control any physical device that is capable of being controlled (e.g., a thermostat, an oven, etc.).
  • Another exemplary embodiment of an environment is a central office environment 30 of a company that is distributing multimedia signals to the home environment 20. The central office environment 30 includes a server room 32 and a laboratory 34. In addition, a virtual environment 10 is also illustrated. The virtual environment 10 is illustrated as including a workstation 15. Those of skill in the art will understand that a more proper understanding of the interaction between the virtual environment 10 and the workstation 15 is that the virtual environment 10 is being executed by the workstation 15. That is, the workstation 15 may be any computing device that is capable of executing the software necessary for carrying out the functionality described herein for the virtual environment 10. For example, the computing device may be a desktop computer, server station, laptop computer, mobile computing device such as a mobile phone, etc.
  • As shown in FIG. 1, the central office environment 30 and the home environment 20 may exchange signals. An example of the signals may be a request from the multimedia center 22 for a particular video file (e.g., a movie) from the central office environment 30. The server room 32 of the central office environment 30 may receive the request and fulfill the request by sending the requested video signal to the multimedia center 22. In addition, in this example, it is shown that there is an interaction between the virtual environment 10 and both the home environment 20 and the central office environment 30. As will be described in greater detail below, this interaction allows a user working in the virtual environment 10 to exercise control over physical components included in the home environment 20 and/or the central office environment 30 using the interface provided by the virtual environment 10.
  • FIG. 2 shows a virtual physical view 100 of the multimedia center 22 of the home environment 20. The virtual physical view 100 is created in either the virtual environment 10 or in some other computing device executing a commercially available simulation or virtual world software program to model the actual multimedia center 22 of the home environment and then loaded or stored in the virtual environment 10. The virtual physical view 100 includes multimedia components such as a residential gateway 105, a computer 110, a television 120, a set top box 125 and a stereo 130. The virtual physical view 100 also includes physical entities such as chairs 140 and desk 150 to model the actual physical environment of the multimedia center 22. The physical view 100 may be a replica of the multimedia center 22 (e.g., the physical floor plan shown in the virtual physical view 100 is nearly exactly the same as the actual floor plan of the multimedia center 22) or it may be an abstract representation of the multimedia center 22 (e.g., the multimedia components may be displayed, but not in their exact locations or layouts). The user may select and build the type of physical view with which they are comfortable.
  • FIG. 3 shows a virtual physical view 200 of a server room 32 of the central office environment 30 for generating and distributing the multimedia information destined for the multimedia center 22. The physical view 200 is similar to physical view 100, except that it shows the other end of the distribution network for the multimedia information. The physical view 200 shows server rack 210 including servers 211-216, server rack 220 including servers 221-226 and server rack 230 including servers 231-236. The physical view 200 also includes other network components such as routers 240-260 and switches 270 and 280. As will be described in greater detail below, the components in the actual server room 32 are responsible for generating and distributing the multimedia signals that are consumed by the multimedia center of the user's home.
  • FIG. 4 shows a virtual logical view 300 of a video distribution channel providing video to the multimedia center 22. In this exemplary embodiment, the video distribution channel includes servers 233 and 224 (or applications on the servers 233 and 224) that generate the video to be distributed, router 250 that receives the video signal from servers 233 and 224 and routes it to server 212 that is responsible for distribution of the video signal. The video signal is then sent through switch 280 to residential gateway 105 for distribution to the computer 110 and/or the television 120 via set top box 125. Those skilled in the art will understand that the virtual logical view 300 is only exemplary and that many different logical views may be built to model the distribution of many different types of signals. Thus, there may be many types of logical views that are created and stored in the virtual environment 10. In addition, the logical view 300 may not include all the components in the video distribution channel. For example, between the switch 280 that is in the server room 32 and residential gateway 105, there may be other network components such as additional servers, routers, switches, repeaters, etc. Again, the user may build the virtual logical view in any manner with which the user is comfortable interacting.
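The chain of components making up a virtual logical view such as view 300 can be thought of as a small data structure. The following is a minimal sketch in Python, with the two feeding servers simplified to a single source; the class and the string component identifiers are illustrative, not part of the patent:

```python
class LogicalView:
    """A named logical view: an ordered chain of virtual component ids."""

    def __init__(self, name, path):
        self.name = name
        self.path = list(path)  # component ids in signal-flow order

    def downstream_of(self, component):
        """Return the components that follow `component` in the channel."""
        i = self.path.index(component)
        return self.path[i + 1:]

# Model the exemplary video distribution channel of FIG. 4 (simplified).
video_channel = LogicalView(
    "video-distribution",
    ["server-233", "router-250", "server-212",
     "switch-280", "gateway-105", "set-top-box-125"],
)

print(video_channel.downstream_of("switch-280"))
# → ['gateway-105', 'set-top-box-125']
```

A view selected from a "list of virtual logical views", as described above, would simply be one such object chosen by name.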
  • The following will provide an example of the use of the virtual views 100-300 and a user's interaction with the virtual views 100-300. In this example, the user may be considered to be a person, associated with the entity that is distributing the multimedia signals, who is responsible for security applications. In this example, the user is attempting to run a test that is designed to disrupt the video signal that is being distributed to the multimedia center 22 of the home environment 20. Accordingly, the user may select the video distribution channel virtual logical view 300 from a list of virtual logical views.
  • Once the logical view 300 for the video distribution channel is selected, the virtual environment 10 will display the virtual logical view 300 to the user. In addition, one or both of the virtual physical views 100 and 200 may also be displayed. For example, the virtual environment 10 may include multiple displays or multiple display panes to display multiple virtual views. The virtual physical views 100 and 200 may be correlated to the virtual logical view 300. For example, the boxes illustrating the various components of the virtual logical view 300 may be outlined in a specific color. The components illustrated in the virtual physical views 100 and 200 may be colored in the same manner to illustrate the components that are involved in the selected logical view 300. However, it should be noted that two views (e.g., the physical view and the logical view) are not required. For example, a user may be comfortable with only the physical view or the logical view and it may not be necessary to show the other view in the virtual environment 10.
  • As described above, there is an actual physical connection between the virtual environment 10 (e.g., the workstation 15 executing the virtual environment 10) and the physical components modeled by the virtual views 100-300. Thus, when the user selects the video distribution channel logical view 300, the virtual physical view 100 may show the video signal on the television 120 or computer 110 that is actually being displayed on the television screen or computer screen in the home environment 20.
  • The components illustrated in the virtual environment 10 may have various functionalities associated with them that the user may select to perform. The associated functionalities may be selected, for example, using a dropdown menu associated with the component (displayed in either the physical views 100 and 200 or logical view 300). For example, the server 233 may be generating a first video stream and the server 224 may be generating a second video stream. A user, by selecting one of the servers 233 and 224, may be able to toggle the video stream that is being displayed on the television screen 120. As noted above, this control is exerted both in the virtual environment (e.g., what is being displayed on the television 120 screen of virtual physical view 100) and on the actual television screen in multimedia center 22.
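The idea of components exposing selectable functionalities (e.g., via a dropdown menu) can be sketched as a registry of named actions per virtual component. This is a hypothetical illustration; the component names and the toggle action are invented for the example, and the real system would also drive the physical device:

```python
class VirtualComponent:
    """A modeled component exposing named actions, as a dropdown menu might."""

    def __init__(self, name):
        self.name = name
        self._actions = {}

    def register_action(self, label, fn):
        self._actions[label] = fn

    def actions(self):
        """Labels that would populate the component's dropdown menu."""
        return sorted(self._actions)

    def invoke(self, label):
        # In the patent's scheme this would also exert control over the
        # corresponding real-world device; here we only return the result.
        return self._actions[label]()

tv = VirtualComponent("television-120")
state = {"stream": "server-233"}  # which server's stream is displayed

def toggle_stream():
    state["stream"] = "server-224" if state["stream"] == "server-233" else "server-233"
    return state["stream"]

tv.register_action("toggle video stream", toggle_stream)
print(tv.invoke("toggle video stream"))  # → server-224
```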
  • Thus, as can be seen from the above example, the virtual environment 10 may be used to allow a user to interact and exert control over components in the real world environment. This integration of the virtual environment 10 with the real world environment may lead to many unforeseen interactions and consequences. The creation of avatars that can “roam” freely within the virtual environment 10 may be used by a system developer, system administrator, etc. to explore these interactions in the virtual environment 10 to determine both the capabilities and vulnerabilities of the components within the real world environment.
  • The following will provide examples of the use of avatars within the exemplary virtual environment 10. It should be noted that the exemplary avatars will be used to produce security exploits to, for example, lead to new security mechanisms. However, the avatars do not need to be limited to security applications, but may be used for any type of functionality that may be deployed or tested for a real world environment network or component, such as provisioning, network throughput, troubleshooting, etc.
  • A first exemplary avatar may appear to the virtual environment 10 as an Internet Protocol (“IP”) packet that will be able to “flow” through a system from end-to-end. As it goes through each element, it will be able to morph into the correct packet that will enable it to pass through any security checkpoints, such as firewalls, and reach its destination. At that point, it may be able to “own” the destination system by exploiting other vulnerabilities.
  • For example, a user may select the virtual logical view 300 shown in FIG. 4 and select to inject the IP packet avatar into the server 233. The IP packet avatar may then flow to the router 250, the server 212, the switch 280, the residential gateway 105 and each of the computer 110 and the set top box 125. As described above, as the IP packet avatar flows through the system, it morphs into the correct configuration to pass through each device. That is, the model of the devices in the virtual environment 10 will include the same functionalities as the actual devices that the virtual environment 10 is modeling. Thus, the virtual components will process IP packets in the same manner as the actual devices. Those skilled in the art will understand that each component will process an IP packet in accordance with the protocol stack included in the device. Each layer of a protocol stack will strip away various information from the IP packet to process the specific functionality associated with the layer so that the IP packet may then be repackaged and forwarded to the next device along the path to the final destination. The layers of different devices may require different information to continue to process an IP packet. For example, the residential gateway 105 may include a firewall to exclude malicious IP packets from entering the user's residence. The firewall looks for specific information in each IP packet to allow the IP packet to enter the residence. The IP packet avatar will have the ability to mimic any of this information in order to pass through each of the devices. Then, when the IP packet avatar reaches the final destination (e.g., the computer 110), the IP packet avatar is able to exploit any security vulnerabilities of that destination device. Thus, a user can inject the IP packet avatar to determine what security breaches a malicious IP packet can exploit in the system.
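The morphing behavior described above, in which the avatar adopts whatever information the next device's checks require, can be sketched as a toy model. The assumptions here are invented for illustration (each device reduced to a set of required header fields, the avatar to a set of headers it can extend at will); this is not the patent's implementation:

```python
class Device:
    """A virtual device that forwards a packet only if it carries the
    header fields this device's protocol stack / firewall expects."""

    def __init__(self, name, required_fields):
        self.name = name
        self.required = set(required_fields)

    def accepts(self, packet_headers):
        return self.required <= set(packet_headers)

class PacketAvatar:
    """An avatar that inspects each device and morphs its headers so that
    every checkpoint along the path accepts it."""

    def __init__(self):
        self.headers = {"src", "dst"}
        self.trace = []

    def traverse(self, path):
        for device in path:
            # Morph: adopt whatever fields the next hop requires.
            self.headers |= device.required
            if not device.accepts(self.headers):
                return False
            self.trace.append(device.name)
        return True

# Illustrative two-hop path with different per-device requirements.
path = [Device("router-250", {"src", "dst", "ttl"}),
        Device("gateway-105", {"src", "dst", "session-token"})]
avatar = PacketAvatar()
print(avatar.traverse(path), avatar.trace)
```

Because the avatar can always adopt the required fields, it reaches the destination; a fixed (non-morphing) packet with only `src`/`dst` would be dropped at the first hop.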
  • Another example of an avatar is a monitoring avatar that can monitor the virtual environment 10 for signs of the malicious avatars such as the above described IP packet avatar or other misbehaving avatars. The monitoring avatar may also include the functionality to respond to the malicious avatars in order to protect the system. Thus, in this manner a user may implement an attack on the system with one or more malicious avatars and determine whether the monitoring avatars are able to identify the malicious avatars and take the proper corrective action to protect against the malicious avatars. Again, in this manner, a user who is attempting to protect the physical components of the network can launch a variety of attacks in the virtual environment 10 to determine if the network security measures implemented in the actual network and devices (as modeled by the monitoring avatar in the virtual environment 10) can protect against the various attacks without having to launch an actual attack against the physical network.
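A monitoring avatar's detection step might, for instance, be modeled as flagging avatars that generate repeated anomalous events. The following sketch assumes a simple event log and a hypothetical threshold rule; both are invented for the example:

```python
from collections import Counter

def classify(events, threshold=3):
    """Flag avatar ids that produced at least `threshold` anomalous events.
    A stand-in for the monitoring avatar's detection logic."""
    counts = Counter(e["avatar"] for e in events if e["anomalous"])
    return sorted(a for a, n in counts.items() if n >= threshold)

# Illustrative log: one noisy malicious avatar, one mostly-benign user.
events = (
    [{"avatar": "ip-packet-1", "anomalous": True}] * 4
    + [{"avatar": "user-7", "anomalous": False}] * 10
    + [{"avatar": "user-7", "anomalous": True}]
)
print(classify(events))  # → ['ip-packet-1']
```

A full monitoring avatar would pair such detection with the corrective action described above (e.g., isolating the flagged avatar).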
  • As described above, the user can launch a variety of attacks within the virtual environment 10 using different avatars. A basic attack avatar may embody any known attack that can be used within a network. Examples may include denial of service attacks, eavesdropping, data modification, IP spoofing, sniffer attacks, etc. The basic attack avatar may be modified as new types of attacks are developed allowing a user to launch attacks within the virtual environment 10 to continuously assess the vulnerability of the actual network.
  • Another example of an attack avatar may be a social engineering avatar. The social engineering avatar may launch an attack in the virtual environment 10, for example, by interacting with other avatars. Social engineering attacks are those aimed at obtaining proprietary information by deceiving others. Thus, a user can launch the social engineering avatar to interact with other avatars to determine the types of attacks to which other users are vulnerable. For example, the social engineering avatar may engage another user's avatar in a chat session and ask a series of questions aimed at obtaining private information from the other user. The social engineering avatar may then provide the user who launched the attack with information on the effectiveness of certain techniques for obtaining users' private information. The social engineering avatar may also be able to use the users' private information to launch other attacks.
  • Another example of an avatar may be a discontinuity avatar that continuously probes the virtual environment 10 in search of discontinuities that may be exploitable. For example, this avatar can automatically walk the entire virtual environment 10 while “clicking” or otherwise exercising its powers in order to determine if there is an unexpected response. Any unexpected response may indicate a software bug that may be exploitable. This avatar will especially probe the edges of the world where there may be programming discontinuities. A special case of the discontinuity avatar may be able to “see” everything in the virtual environment 10. It will use its powers of teleporting its vision in order to get into secure areas to carry out security exploits.
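The discontinuity avatar's exhaustive walk-and-"click" probing resembles a brute-force fuzzing loop over the world's positions. A toy sketch follows, with the virtual environment reduced to a grid function and the set of expected responses invented for illustration:

```python
def probe(world, width, height):
    """Walk every cell of a toy world grid, 'click' it, and record
    positions whose response is unexpected -- a crude stand-in for the
    discontinuity avatar. `world(x, y)` returns a response string."""
    expected = {"ok", "empty"}
    return [(x, y) for x in range(width) for y in range(height)
            if world(x, y) not in expected]

def toy_world(x, y):
    # Illustrative programming discontinuity at the edge of the world.
    return "error" if (x, y) == (4, 0) else "ok"

print(probe(toy_world, 5, 5))  # → [(4, 0)]
```

As the description notes, any unexpected response flagged this way may indicate an exploitable software bug, with the world's edges being especially likely locations.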
  • A final exemplary avatar may be a self-developing avatar. This avatar will increase in knowledge by virtue of its interaction with other avatars and with the virtual environment 10. It will develop independently of its “master” in the real world environment. It will learn the tricks of hacking the virtual environment 10 just as real-world hackers learn their trade. This self-developing avatar could turn out to be the most powerful avatar because it may exhibit the most unpredictable behavior.
  • Those skilled in the art will understand that the above avatars are only exemplary and that many different avatars having many different types of functionalities within the virtual environment may be developed. The integration of the virtual environment with the real world environment will lead to many unforeseen interactions and consequences. The interaction between the virtual environment and the real world environment, and the creation of avatars that can be exercised extensively to explore some of the many interactions, will allow users to better understand the real world possibilities of the network, both potential problems and potential benefits. For example, the results may be used to explore new attacks and then to design new security mechanisms that are effective without unduly constraining creativity.
  • FIG. 5 provides an exemplary method 400 showing the deployment of an avatar within the virtual environment 10. Again, the exemplary avatar being deployed is shown as a security related avatar, but this is only exemplary. In step 410, the real world environment is virtualized to create the virtual environment 10. In step 420, the exemplary avatar is deployed within the virtual environment 10. The avatar will then perform its functionality. For example, if the deployed avatar is the IP packet avatar, the avatar will perform according to the description of such an avatar described above.
  • In step 430, it is determined whether the avatar exploited any security issues within the virtual environment. For example, it may be determined whether the IP packet avatar was able to access a user's environment and gain access to a user's component. In another example, it may be determined if the social engineering avatar was able to obtain a user's private information. If the avatar is not able to exploit any security issues in step 430, the results may be displayed to the user in step 450. The results in such a case may simply be that the avatar has been deployed, but that it has not been able to cause any problems within the virtual environment 10.
  • If the avatar is able to exploit a security issue in step 430, this information may also be displayed to the user in step 450. The display may show the user, for example, the type of vulnerability that the avatar exploited or the type of successful attack launched by the avatar. In addition, the method may continue to step 440 to determine if any countermeasures deployed in the network can resolve the issue created by the avatar. For example, if the avatar is a basic attack avatar that launches a denial of service attack, the network may include countermeasures such as blocking traffic from a certain IP address or range of addresses, disconnecting offending nodes or edge devices of the network, etc. to combat such an attack. In step 440, it is determined whether any countermeasures were attempted to deal with the issue and whether they were successful. If the countermeasures are successful, the results are displayed to the user in step 450. For example, the display may indicate the type of countermeasure that was used and how it mitigated the attack. If the countermeasure was unsuccessful or not deployed at all, the display may indicate this information and may also indicate a potential type of countermeasure that may be used.
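The decision flow of steps 430-450 can be summarized as a small function returning the message that step 450 would display. The branch conditions follow the description above; the return strings and parameter names are illustrative:

```python
def run_avatar_test(exploited, countermeasure_succeeded):
    """Sketch of the decision flow of method 400 (steps 430-450).

    exploited: did the deployed avatar exploit a security issue? (step 430)
    countermeasure_succeeded: True/False, or None if none was deployed
    (step 440). Returns the result message for display (step 450)."""
    if not exploited:                      # step 430: no issue found
        return "avatar deployed; no vulnerability exploited"
    if countermeasure_succeeded is None:   # step 440: nothing deployed
        return "vulnerability exploited; no countermeasure deployed"
    if countermeasure_succeeded:
        return "vulnerability exploited; countermeasure mitigated the attack"
    return "vulnerability exploited; countermeasure failed"

print(run_avatar_test(True, False))
# → vulnerability exploited; countermeasure failed
```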
  • It should be noted that once an avatar is deployed within the virtual environment 10, it may remain active for extended periods of time waiting to exploit vulnerabilities within the network as the network is changed, e.g., new hardware is added, new applications are added, etc. In addition, as noted multiple times above, security type avatars are not the only type of avatars that may be deployed within the virtual environment. For example, there may be an avatar that is related to network routing that runs through various network routing scenarios based on different network loadings and other factors to determine a best route for packets within the network. This avatar may run constantly in the virtual environment and the results may be used to alter the routing tables of routing devices in the real world environment. Thus, the deployment method for different avatars may be different depending on the functionality provided by the avatars.
  • It will be apparent to those skilled in the art that various modifications may be made in the present invention, without departing from the spirit or the scope of the invention. Thus, it is intended that the present invention cover modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims (20)

1. A computer readable storage medium storing a set of instructions that are executable by a processor, the set of instructions being operable to:
store a virtual representation of a plurality of physical components;
introduce a component having a predefined functionality to interact with the virtual representation; and
generate indications of a response of the virtual representation to the interactions of the component.
2. The computer readable storage medium of claim 1, wherein the set of instructions are further operable to:
store the indications of the response.
3. The computer readable storage medium of claim 1, wherein the set of instructions are further operable to:
display the indications of the response.
4. The computer readable storage medium of claim 1, wherein the response of the virtual representation is based on an actual response of one or more of the plurality of physical components.
5. The computer readable storage medium of claim 1, wherein the functionality includes mimicking an Internet Protocol (IP) packet that travels between representations of the physical components.
6. The computer readable storage medium of claim 1, wherein the functionality includes scanning the virtual representation for other components having predefined functionalities and reacting to the other components.
7. The computer readable storage medium of claim 1, wherein the functionality includes attacking one or more of the plurality of components.
8. The computer readable storage medium of claim 1, wherein the functionality includes interacting with users within the virtual representation and attempting to collect information from these users.
9. The computer readable storage medium of claim 1, wherein the functionality includes monitoring the virtual representation for changes.
10. The computer readable storage medium of claim 1, wherein the component generates additional functionality based on the interactions with the virtual representation and the response of the virtual representation.
11. A system, comprising:
a memory configured to store a virtual representation of a plurality of physical components; and
a processor configured to introduce a component having a predefined functionality to interact with the virtual representation and generate indications of a response of the virtual representation to the interactions of the component.
12. The system of claim 11, wherein the memory is further configured to store the indications of the response.
13. The system of claim 11, further comprising:
a display to display the indications of the response.
14. The system of claim 11, wherein the response of the virtual representation is based on an actual response of one or more of the plurality of physical components.
15. The system of claim 11, wherein the functionality includes mimicking an Internet Protocol (IP) packet that travels between representations of the physical components.
16. The system of claim 11, wherein the functionality includes scanning the virtual representation for other components having predefined functionalities and reacting to the other components.
17. The system of claim 11, wherein the functionality includes attacking one or more of the plurality of components.
18. The system of claim 11, wherein the functionality includes interacting with users within the virtual representation and attempting to collect information from these users.
19. The system of claim 11, wherein the functionality includes monitoring the virtual representation for changes.
20. The system of claim 11, wherein the component generates additional functionality based on the interactions with the virtual representation and the response of the virtual representation.
US12/329,905 2008-12-08 2008-12-08 Method and System for Exploiting Interactions Via A Virtual Environment Abandoned US20100146395A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/329,905 US20100146395A1 (en) 2008-12-08 2008-12-08 Method and System for Exploiting Interactions Via A Virtual Environment
US15/362,401 US10943397B2 (en) 2008-12-08 2016-11-28 Method and system for exploiting interactions via a virtual environment
US17/169,936 US20210166488A1 (en) 2008-12-08 2021-02-08 Method and system for exploiting interactions via a virtual environment


Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/362,401 Continuation US10943397B2 (en) 2008-12-08 2016-11-28 Method and system for exploiting interactions via a virtual environment

Publications (1)

Publication Number Publication Date
US20100146395A1 true US20100146395A1 (en) 2010-06-10

Family

ID=42232454

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/329,905 Abandoned US20100146395A1 (en) 2008-12-08 2008-12-08 Method and System for Exploiting Interactions Via A Virtual Environment
US15/362,401 Active 2030-05-12 US10943397B2 (en) 2008-12-08 2016-11-28 Method and system for exploiting interactions via a virtual environment
US17/169,936 Abandoned US20210166488A1 (en) 2008-12-08 2021-02-08 Method and system for exploiting interactions via a virtual environment


Country Status (1)

Country Link
US (3) US20100146395A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110088073A1 (en) * 2009-10-12 2011-04-14 At&T Intellectual Property I, L.P. User-configured background channels in internet-protocol television
US20110107329A1 (en) * 2009-11-05 2011-05-05 International Business Machines Corporation Method and system for dynamic composing and creating 3d virtual devices
US20120272131A1 (en) * 2011-04-21 2012-10-25 International Business Machines Corporation Handling unexpected responses to script executing in client-side application
US20120324591A1 (en) * 2011-06-14 2012-12-20 International Business Machines Corporation System and method to protect a resource using an active avatar
US20140267564A1 (en) * 2011-07-07 2014-09-18 Smart Internet Technology Crc Pty Ltd System and method for managing multimedia data
CN104699476A (en) * 2014-07-01 2015-06-10 北京邮电大学 Simulation method, simulation device and simulation system
US20170063567A1 (en) * 2014-09-05 2017-03-02 Sharp Kabushiki Kaisha Heating cooking system
US9818228B2 (en) 2015-08-07 2017-11-14 Microsoft Technology Licensing, Llc Mixed reality social interaction
US9922463B2 (en) 2015-08-07 2018-03-20 Microsoft Technology Licensing, Llc Virtually visualizing energy

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5808612A (en) * 1996-01-12 1998-09-15 International Business Machines Corporation Virtual office with connections between source data machine, and a viewer objects
US6292830B1 (en) * 1997-08-08 2001-09-18 Iterations Llc System for optimizing interaction among agents acting on multiple levels
US20030182582A1 (en) * 2002-03-19 2003-09-25 Park Jong Sou Network security simulation system
US20050022014A1 (en) * 2001-11-21 2005-01-27 Shipman Robert A Computer security system
US6971026B1 (en) * 1999-09-29 2005-11-29 Hitachi, Ltd. Method and apparatus for evaluating security and method and apparatus for supporting the making of security countermeasure
US20060212932A1 (en) * 2005-01-10 2006-09-21 Robert Patrick System and method for coordinating network incident response activities
US7308394B2 (en) * 2005-02-24 2007-12-11 Ultravision Security Systems, Inc. Method for modeling and testing a security system
US20080222731A1 (en) * 2000-01-14 2008-09-11 Secure Computing Corporation Network security modeling system and method
US20090007270A1 (en) * 2007-06-26 2009-01-01 Core Sdi, Inc System and method for simulating computer network attacks

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734373A (en) * 1993-07-16 1998-03-31 Immersion Human Interface Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
AU1328597A (en) * 1995-11-30 1997-06-19 Virtual Technologies, Inc. Tactile feedback man-machine interface device
US6219032B1 (en) * 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US6028593A (en) * 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
JP4332964B2 (en) * 1999-12-21 2009-09-16 ソニー株式会社 Information input / output system and information input / output method
US7774440B1 (en) * 2001-07-25 2010-08-10 Scalable Network Technologies, Inc. Method and system for enhancing performance of a physical network under real-time control using simulation of a reference model
KR100609710B1 (en) * 2004-11-25 2006-08-08 한국전자통신연구원 Network simulation apparatus and method for abnormal traffic analysis
US20070132785A1 (en) * 2005-03-29 2007-06-14 Ebersole John F Jr Platform for immersive gaming
US7864168B2 (en) * 2005-05-25 2011-01-04 Impulse Technology Ltd. Virtual reality movement system
US20080215994A1 (en) * 2007-03-01 2008-09-04 Phil Harrison Virtual world avatar control, interactivity and communication interactive messaging
WO2009062153A1 (en) * 2007-11-09 2009-05-14 Wms Gaming Inc. Interaction with 3d space in a gaming system
US8638301B2 (en) * 2008-07-15 2014-01-28 Immersion Corporation Systems and methods for transmitting haptic messages
US8303406B2 (en) * 2008-11-24 2012-11-06 Disney Enterprises, Inc. System and method for providing an augmented reality experience
DK180470B1 (en) * 2017-08-31 2021-05-06 Apple Inc Systems, procedures, and graphical user interfaces for interacting with augmented and virtual reality environments
US10635895B2 (en) * 2018-06-27 2020-04-28 Facebook Technologies, Llc Gesture-based casting and manipulation of virtual content in artificial-reality environments
US10712901B2 (en) * 2018-06-27 2020-07-14 Facebook Technologies, Llc Gesture-based content sharing in artificial reality environments
US10783712B2 (en) * 2018-06-27 2020-09-22 Facebook Technologies, Llc Visual flairs for emphasizing gestures in artificial-reality environments


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110088073A1 (en) * 2009-10-12 2011-04-14 At&T Intellectual Property I, L.P. User-configured background channels in internet-protocol television
US8907981B2 (en) * 2009-11-05 2014-12-09 International Business Machines Corporation Method and system for dynamic composing and creating 3D virtual devices
US20110107329A1 (en) * 2009-11-05 2011-05-05 International Business Machines Corporation Method and system for dynamic composing and creating 3d virtual devices
US20120272131A1 (en) * 2011-04-21 2012-10-25 International Business Machines Corporation Handling unexpected responses to script executing in client-side application
US9026902B2 (en) * 2011-04-21 2015-05-05 International Business Machines Corporation Handling unexpected responses to script executing in client-side application
US8880993B2 (en) 2011-04-21 2014-11-04 International Business Machines Corporation Handling unexpected responses to script executing in client-side application
US20120324591A1 (en) * 2011-06-14 2012-12-20 International Business Machines Corporation System and method to protect a resource using an active avatar
US10229280B2 (en) * 2011-06-14 2019-03-12 International Business Machines Corporation System and method to protect a resource using an active avatar
US20140267564A1 (en) * 2011-07-07 2014-09-18 Smart Internet Technology Crc Pty Ltd System and method for managing multimedia data
US9420229B2 (en) * 2011-07-07 2016-08-16 Smart Internet Technology Crc Pty Ltd System and method for managing multimedia data
CN104699476A (en) * 2014-07-01 2015-06-10 北京邮电大学 Simulation method, simulation device and simulation system
US20170063567A1 (en) * 2014-09-05 2017-03-02 Sharp Kabushiki Kaisha Heating cooking system
US11258626B2 (en) * 2014-09-05 2022-02-22 Sharp Kabushiki Kaisha Heating cooking system
US9818228B2 (en) 2015-08-07 2017-11-14 Microsoft Technology Licensing, Llc Mixed reality social interaction
US9922463B2 (en) 2015-08-07 2018-03-20 Microsoft Technology Licensing, Llc Virtually visualizing energy

Also Published As

Publication number Publication date
US20170076506A1 (en) 2017-03-16
US20210166488A1 (en) 2021-06-03
US10943397B2 (en) 2021-03-09

Similar Documents

Publication Publication Date Title
US20210166488A1 (en) Method and system for exploiting interactions via a virtual environment
US11666817B2 (en) Mission-based, game-implemented cyber training system and method
US9680867B2 (en) Network stimulation engine
US9729567B2 (en) Network infrastructure obfuscation
US11411920B2 (en) Method and system for creating a secure public cloud-based cyber range
WO2018175551A1 (en) Mission-based, game-implemented cyber training system and method
Bellekens et al. From cyber-security deception to manipulation and gratification through gamification
Jafarian et al. Delivering Honeypots as a Service.
Nguyen et al. Analyzing moving target defense for resilient campus private cloud
Sallés et al. Security of runtime extensible virtual environments
Johnson et al. Learn DDoS attacks with a game
Nidd et al. Tool-based risk assessment of cloud infrastructures as socio-technical systems
Bhuiyan et al. Service Store Model and Tools for Fresco Applications
Aleem et al. A review of the security architecture for SDN in light of its security issues
Cifranic et al. Decepti-SCADA
Smith Taking back the Internet: Defeating DDoS and adverse network conditions via reactive BGP routing
Beltran et al. The Use of a game-based interface for home network security
Aybar Developing simulated cyber attack scenarios against virtualized adversary networks
Murillo et al. and Álvaro A. Cárdenas
Kulmala Improving network security with software-defined networking
Hand Toward An Active Network Security Architecture

Legal Events

Date Code Title Description
AS Assignment

Owner name: AT&T INTELLECTUAL PROPERTY I, L.P., NEVADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REYES, GUSTAVO DE LOS;MACWAN, SANJAY;REEL/FRAME:021971/0655

Effective date: 20081205

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION