WO2013024364A2 - Systems and methods for virtual viewing of physical events - Google Patents

Systems and methods for virtual viewing of physical events Download PDF

Info

Publication number
WO2013024364A2
WO2013024364A2 (PCT/IB2012/002129)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
data
position data
virtual environment
rendering
Prior art date
Application number
PCT/IB2012/002129
Other languages
French (fr)
Other versions
WO2013024364A3 (en)
Inventor
Fabio GALLO
Juan Manuel Rejen
Original Assignee
Iopener Media Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Iopener Media Gmbh filed Critical Iopener Media Gmbh
Priority to EP12787858.5A (EP3114650A2)
Publication of WO2013024364A2
Publication of WO2013024364A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00Indexing scheme for image rendering
    • G06T2215/16Using real world measurements to influence rendering

Definitions

  • the methods and systems described herein relate generally to graphical virtualization of physical events.
  • the methods and systems described herein relate to data capture of physical events, generation of a virtual representation of the event, and display of the virtual representation of the event.
  • GPS technology has been integrated into many events, such as sporting events, with GPS receivers placed on participants, able to record and provide position and velocity information to external servers.
  • Formula One race cars may include GPS receivers and transmit GPS data to local receivers to be interpreted by pit crews, media broadcasters, race supervisors, or others. This information may also be provided to spectators, allowing them to see near real-time positions of race cars on a map of the racetrack.
  • the present application is directed to systems and methods for generating virtual viewing of physical events.
  • Event participants may have GPS receivers and data collection modules installed in vehicles, such as cars for vehicle racing events, boats for sailing races, planes for air races or acrobatic shows, or similar entities, or within equipment, such as helmets, padding, packs, or similar gear, as well as transmitters capable of sending the GPS data and collected data to local receivers for collation and processing.
  • the data may be provided to a virtualization engine, which may generate a virtual environment modeled on the real world environment in which the event occurs.
  • the virtualization engine may use the received data to generate virtual objects representing each event participant and/or their vehicle, and may place the virtual objects within the virtual environment at locations determined by positioning data.
  • the virtualization engine may further generate one or more viewpoints or virtual cameras, and may place the cameras anywhere within the virtual environment, including within the virtual objects.
  • the virtualization engine may then render, in real time or near real time, a realistic view of the virtual environment and virtual objects.
  • the rendered view may thus comprise a realistic simulated view of the physical event. Unlike physical cameras, however, the rendered view may be arbitrarily positioned, including above participants, below participants, inside participants or their vehicles, in the middle of a track or path of participants, or anywhere else within the virtual environment.
  • the virtual cameras may be moved or rotated during the event, and the rendered simulation may be paused, rewound, or slowed.
  • viewers may be able to have a customizable, personal view of an event.
  • broadcasters may be able to show rendered simulations of additional views for commentary purposes, such as pausing action during a turn to allow a view of the distance between two vehicle bumpers; providing a simulated view from within a peloton of the Tour de France; providing a simulated view of the inside of a vehicle during a crash, even if the vehicle has no camera; or any other view.
  • live broadcasts can be enhanced in ways previously impossible.
  • the present application is directed to a system for virtualization of a physical event.
  • the system includes a computing device comprising a processor configured to execute a tracking server and a virtualization engine.
  • the tracking server is configured for receiving position data from one or more additional computing devices.
  • the virtualization engine is configured for (a) generating a virtual environment comprising: (i) a virtual object corresponding to each of the one or more additional computing devices, each virtual object having a position within the virtual environment corresponding to the received position data for the corresponding computing device, and (ii) a virtual camera within the virtual environment; and (b) rendering an image of the virtual environment corresponding to a view of the virtual camera.
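  • To make the relationship between received position data, virtual objects, and the virtual camera concrete, the following is a minimal illustrative sketch in Python. It is not taken from the application; the names (VirtualObject, render_view, etc.) and the flat data layout are assumptions made for the example.

```python
# Minimal sketch (not from the application): virtual objects placed from
# received position data, plus a virtual camera, in a virtual environment.
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    device_id: str
    position: tuple        # (x, y, z) in virtual-environment coordinates
    heading: float = 0.0   # degrees
    speed: float = 0.0     # metres per second

@dataclass
class VirtualCamera:
    position: tuple
    direction: tuple       # unit view vector

@dataclass
class VirtualEnvironment:
    objects: dict = field(default_factory=dict)   # device_id -> VirtualObject
    camera: VirtualCamera = None

    def update_from_position_data(self, device_id, position):
        """Place or move the virtual object for a tracked device."""
        obj = self.objects.get(device_id)
        if obj is None:
            self.objects[device_id] = VirtualObject(device_id, position)
        else:
            obj.position = position

def render_view(env: VirtualEnvironment):
    """Stand-in for the rendering step: return what the camera would see."""
    return [(o.device_id, o.position) for o in env.objects.values()]
```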
  • the tracking server is further configured for receiving direction and speed data from the one or more computing devices; and each virtual object has a direction and speed within the virtual environment corresponding to the received position data.
  • the virtual camera has a position, speed, and direction within the virtual environment.
  • the position, speed, and direction of the virtual camera are based on the position, speed, and direction of a virtual object.
  • the virtualization engine is configured for transmitting the rendered image to a client device for display; and rendering the image comprises rendering a wireframe, untextured, or other reduced quality representation or image of the virtual environment and/or virtual object, responsive to a processing capability of the client device being less than a predetermined threshold.
  • the virtualization engine is configured for transmitting the rendered image to a client device for display; and rendering the image comprises rendering a wireframe, untextured, or other reduced quality representation or image of the virtual environment and/or virtual object, responsive to a bandwidth of a connection to the client device being less than a predetermined threshold.
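  • As an illustration of the threshold-based quality selection described in the two preceding paragraphs, the sketch below chooses a reduced-quality rendering mode when client bandwidth or processing capability falls below a configured threshold. The threshold values and mode names are invented for the example and are not specified by the application.

```python
# Hypothetical sketch: pick a reduced-quality rendering mode when the client's
# bandwidth or processing capability falls below a configured threshold.
BANDWIDTH_THRESHOLD_KBPS = 1500     # assumed values, not from the application
PROCESSING_THRESHOLD_SCORE = 10

def select_render_mode(bandwidth_kbps: float, processing_score: float) -> str:
    if (bandwidth_kbps < BANDWIDTH_THRESHOLD_KBPS
            or processing_score < PROCESSING_THRESHOLD_SCORE):
        return "wireframe"      # or "untextured" / other reduced-quality mode
    return "full_textured"

# Example: a slow cellular connection gets the wireframe rendering.
assert select_render_mode(bandwidth_kbps=400, processing_score=50) == "wireframe"
```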
  • the tracking server is further configured for receiving position data from the one or more additional computing devices via a cellular network, and is configured for interpolating a direction for each of the one or more additional computing devices based on the received position data and prior received position data.
  • the virtual camera has a position above the virtual environment, and the rendered image comprises a map.
  • the tracking server is further configured for receiving a first set of low temporal resolution position data from a second computing device via a first network and for receiving a second set of high temporal resolution position data from the second computing device via a second network at a subsequent time; and the virtualization engine is configured for rendering a first image comprising a map displaying a virtual object having a position corresponding to the first set of low temporal resolution position data, and rendering a second image comprising a three-dimensional view displaying a second virtual object having a position corresponding to the second set of high temporal resolution position data at the subsequent time.
  • the first virtual object comprises an icon
  • the second virtual object comprises a three-dimensional object corresponding to a vehicle or participant carrying the second computing device.
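  • One way such a hybrid system might choose between the map icon and the three-dimensional object is sketched below; the 10 Hz cut-off separating low and high temporal resolution is an assumption made only for illustration.

```python
# Illustrative only: represent a participant as a 2D map icon when only
# low-temporal-resolution position data (e.g. one fix per 10 s via cellular)
# is available, and as a 3D object once high-resolution radio data arrives.
def choose_representation(update_interval_s: float) -> str:
    HIGH_RES_MAX_INTERVAL_S = 0.1   # assumed: >= 10 Hz counts as high resolution
    if update_interval_s <= HIGH_RES_MAX_INTERVAL_S:
        return "3d_vehicle_model"   # second virtual object: full 3D view
    return "map_icon"               # first virtual object: icon on a map

print(choose_representation(10.0))    # -> map_icon (cellular/satellite updates)
print(choose_representation(1 / 30))  # -> 3d_vehicle_model (short-range radio)
```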
  • the present application is directed to a method for virtualization of a physical event.
  • the method includes receiving position data, by a tracking server executed by a processor of a computing device, from one or more additional computing devices.
  • the method also includes generating a virtual environment, by a virtualization engine executed by the processor of the computing device, the virtual environment comprising: (i) a virtual object corresponding to each of the one or more additional computing devices, each virtual object having a position within the virtual environment corresponding to the received position data for the corresponding computing device, and (ii) a virtual camera within the virtual environment.
  • the method further includes rendering an image of the virtual environment, by the virtualization engine, corresponding to a view of the virtual camera.
  • the method includes receiving direction and speed data by the tracking server from the one or more computing devices; and each virtual object has a direction and speed within the virtual environment corresponding to the received position data.
  • the method includes the virtual camera having a position, speed, and direction within the virtual environment.
  • the position, speed, and direction of the virtual camera are based on the position, speed, and direction of a virtual object.
  • the method includes transmitting the rendered image to a client device for display; and rendering the image comprises rendering a wireframe, untextured, or other reduced quality representation or image of the virtual environment and/or virtual object, responsive to a processing capability of the client device being less than a predetermined threshold.
  • the method includes transmitting the rendered image to a client device for display; and rendering the image comprises rendering a wireframe image of the virtual environment, responsive to a bandwidth of a connection to the client device being less than a predetermined threshold.
  • the method includes receiving position data from the one or more additional computing devices via a cellular network, and interpolating a direction for each of the one or more additional computing devices based on the received position data and prior received position data.
  • the method includes the virtual camera having a position above the virtual environment, and the rendered image comprises a map.
  • the method includes receiving, by the tracking server, a first set of low temporal resolution position data from a second computing device via a first network.
  • the method also includes rendering, by the virtualization engine, a first image comprising a map displaying a virtual object having a position corresponding to the first set of low temporal resolution position data.
  • the method further includes receiving, by the tracking server, a second set of high temporal resolution position data from the second computing device via a second network at a subsequent time.
  • the method also includes rendering, by the virtualization engine, a second image comprising a three-dimensional view displaying a second virtual object having a position corresponding to the second set of high temporal resolution position data at the subsequent time.
  • the first virtual object comprises an icon
  • the second virtual object comprises a three-dimensional object corresponding to a vehicle or participant carrying the second computing device.
  • Figure 1A is a block diagram illustrative of an embodiment of a networked environment useful for the systems and methods described in this document;
  • Figure 1B is a block diagram illustrative of a certain embodiment of a computing machine for practicing the methods and systems described herein;
  • Figure 2A is a block diagram of an embodiment of a system for virtualization of a physical event
  • Figure 2B is another block diagram of an embodiment of a virtualization system
  • Figure 3A is a block diagram of an embodiment of a data capture and transmission system
  • Figure 3B is a block diagram of an embodiment of a virtualization system
  • Figure 4 is a block diagram of an embodiment of a hybrid short/long range data capture and transmission system.
  • Figure 5 is a flow chart of an embodiment of a method for providing a virtual display of a physical event utilizing a hybrid short/long range data capture and transmission system.
  • the networked environment 101 includes one or more client machines 102A-102N (generally referred to herein as “client machine(s) 102" or “client(s) 102") in communication with one or more servers 106A-106N (generally referred to herein as “server machine(s) 106" or “server(s) 106") over a network 104.
  • the client machine(s) 102 can, in some embodiments, be referred to as a single client machine 102 or a single group of client machines 102, while server(s) 106 may be referred to as a single server 106 or a single group of servers 106.
  • any number of clients 102 may be in communication with any number of servers 106.
  • a single client machine 102 communicates with more than one server 106, while in another embodiment a single server 106 communicates with more than one client machine 102.
  • a single client machine 102 communicates with a single server 106.
  • a single network 104 is shown connecting client machines 102 to server machines 106, it should be understood that multiple, separate networks may connect a subset of client machines 102 to a subset of server machines 106.
  • the computing environment 101 can include an appliance (not shown in FIG. 1A).
  • This appliance can manage client/server connections, and in some cases can load balance connections made by client machines 102 to server machines 106.
  • Suitable appliances are manufactured by any one of the following companies: the Citrix Systems, Inc. Application Networking Group; Silver Peak Systems, Inc., both of Santa Clara, California; Riverbed Technology, Inc. of San Francisco, California; F5 Networks, Inc. of Seattle, Washington; or Juniper Networks, Inc. of Sunnyvale, California.
  • Clients 102 and server 106 may be provided as a computing device 100, a specific embodiment of which is illustrated in Figure 1B. Included within the computing device 100 is a system bus 150 that communicates with the following components: a central processing unit 121; a main memory 122; storage memory 128; an input/output (I/O) controller 123; display devices 124A-124N; an installation device 116; and a network interface 118.
  • the storage memory 128 includes: an operating system, software routines, and a client agent 120.
  • the I/O controller 123 is further connected to one or more input devices. As shown in Figure 1B, the I/O controller 123 is connected to a camera 125, a keyboard 126, a pointing device 127, and a microphone 129.
  • Embodiments of the computing machine 100 can include a central processing unit 121 characterized by any one of the following component configurations: logic circuits that respond to and process instructions fetched from the main memory unit 122; a microprocessor unit, such as: those manufactured by Intel Corporation; those manufactured by Motorola Corporation; those manufactured by Transmeta Corporation of Santa Clara, California; the RS/6000 processor such as those manufactured by International Business Machines; a processor such as those manufactured by Advanced Micro Devices; or any other combination of logic circuits. Still other embodiments of the central processing unit 122 may include any combination of the following: a microprocessor, a microcontroller, a central processing unit with a single processing core, a central processing unit with two processing cores, or a central processing unit with more than one processing core.
  • Figure I B illustrates a computing device 100 that includes a single central processing unit 121
  • the computing device 100 can include one or more processing units 121.
  • the computing device 100 may store and execute firmware or other executable instructions that, when executed, direct the one or more processing units 121 to simultaneously execute instructions or to simultaneously execute instructions on a single piece of data.
  • the computing device 100 may store and execute firmware or other executable instructions that, when executed, direct the one or more processing units to each execute a section of a group of instructions. For example, each processing unit 121 may be instructed to execute a portion of a program or a particular module within a program.
  • the processing unit 121 can include one or more processing cores.
  • the processing unit 121 may have two cores, four cores, eight cores, etc.
  • the processing unit 121 may comprise one or more parallel processing cores.
  • the processing cores of the processing unit 121 may in some embodiments access available memory as a global address space, or in other embodiments, memory within the computing device 100 can be segmented and assigned to a particular core within the processing unit 121.
  • the one or more processing cores or processors in the computing device 100 can each access local memory.
  • memory within the computing device 100 can be shared amongst one or more processors or processing cores, while other memory can be accessed by particular processors or subsets of processors.
  • the multiple processing units can be included in a single integrated circuit (IC).
  • IC integrated circuit
  • the processors can execute a single instruction simultaneously on multiple pieces of data (SIMD), or in other embodiments can execute multiple instructions simultaneously on multiple pieces of data (MIMD).
  • SIMD single instruction simultaneously on multiple pieces of data
  • MIMD multiple instructions simultaneously on multiple pieces of data
  • the computing device 100 can include any number of SIMD and MIMD processors.
  • the computing device 100 can include a graphics processor or a graphics processing unit (not shown).
  • the graphics processing unit can include any combination of software and hardware, and can further input graphics data and graphics instructions, render a graphic from the inputted data and instructions, and output the rendered graphic.
  • the graphics processing unit can be included within the processing unit 121.
  • the computing device 100 can include one or more processing units 121, where at least one processing unit 121 is dedicated to processing and rendering graphics.
  • One embodiment of the computing device 100 provides support for any one of the following installation devices 116: a CD-ROM drive, a CD-R/RW drive, a DVD-ROM drive, tape drives of various formats, USB device, a bootable medium, a bootable CD, a bootable CD for GNU/Linux distribution such as KNOPPIX®, a hard-drive or any other device suitable for installing applications or software.
  • Applications can in some embodiments include a client agent 120, or any portion of a client agent 120.
  • the computing device 100 may further include a storage device 128 that can be either one or more hard disk drives, or one or more redundant arrays of independent disks; where the storage device is configured to store an operating system, software, programs, applications, or at least a portion of the client agent 120.
  • A further embodiment of the computing device 100 includes an installation device 116 that is used as the storage device 128.
  • Embodiments of the computing device 100 include any one of the following I/O devices 130A-130N: a camera 125; a keyboard 126; a pointing device 127; a microphone 129; mice; or any other input or output device.
  • An I/O controller 123 may in some embodiments connect to multiple I/O devices 130A-130N to control the one or more I/O devices. Some embodiments of the I/O devices 130A-130N may be configured to provide storage or an installation medium 116, while others may provide a universal serial bus (USB) interface for receiving USB storage devices such as the USB Flash Drive line of devices manufactured by Twintech Industry, Inc.
  • an I/O device 130 may be a bridge between the system bus 150 and an external communication bus, such as: a USB bus; an Apple Desktop Bus; an RS-232 serial connection; a SCSI bus; a FireWire bus; a FireWire 800 bus; an Ethernet bus; an AppleTalk bus; a Gigabit Ethernet bus; an Asynchronous Transfer Mode bus; a HIPPI bus; a Super HIPPI bus; a SerialPlus bus; a SCI/LAMP bus; a FibreChannel bus; or a Serial Attached small computer system interface bus.
  • the computing machine 100 can execute any operating system, while in other embodiments the computing machine 100 can execute any of the following operating systems: versions of the MICROSOFT WINDOWS operating systems such as WINDOWS 3.x; WINDOWS 95; WINDOWS 98; WINDOWS 2000; WINDOWS NT 3.51; WINDOWS NT 4.0; WINDOWS CE; WINDOWS XP; WINDOWS VISTA; and WINDOWS 7; the different releases of the Unix and Linux operating systems; any version of the MAC OS manufactured by Apple Computer; OS/2, manufactured by International Business Machines; any embedded operating system; any real-time operating system; any open source operating system; any proprietary operating system; any operating systems for mobile computing devices; or any other operating system.
  • the computing machine 100 can execute multiple operating systems. For example, the computing machine 100 can execute PARALLELS or another virtualization platform that can execute or manage a virtual machine executing a first operating system, while the computing machine 100 executes a second operating system different from the first operating system.
  • the computing machine 100 can be embodied in any one of the following computing devices: a computing workstation; a desktop computer; a laptop or notebook computer; a server; a handheld computer; a mobile telephone; a portable telecommunication device; a media playing device; a gaming system; a mobile computing device; a netbook; a device of the IPOD family of devices manufactured by Apple Computer; any one of the PLAYSTATION family of devices manufactured by the Sony Corporation; any one of the Nintendo family of devices manufactured by Nintendo Co; any one of the XBOX family of devices manufactured by the Microsoft Corporation; or any other type and/or form of computing, telecommunications or media device that is capable of communication and that has sufficient processor power and memory capacity to perform the methods and systems described herein.
  • the computing machine 100 can be a mobile device such as any one of the following mobile devices: a JAVA-enabled cellular telephone or personal digital assistant (PDA), such as the i55sr, i58sr, i85s, i88s, i90c, i95cl, or the im1100, all of which are manufactured by Motorola Corp; the 6035 or the 7135, manufactured by Kyocera; the i300 or i330, manufactured by Samsung Electronics Co., Ltd; the TREO 180, 270, 600, 650, 680, 700p, 700w, or 750 smart phone manufactured by Palm, Inc.; any computing device that has different processors, operating systems, and input devices consistent with the device; or any other mobile computing device capable of performing the methods and systems described herein.
  • the computing device 100 can be any one of the following mobile computing devices: any one series of Blackberry, or other handheld device manufactured by Research In Motion Limited; the iPhone manufactured by Apple Computer; Palm Pre; a Pocket PC; a Pocket PC Phone; or any other handheld mobile device.
  • the computing device 100 may be a smart phone or tablet computer, including products such as the iPhone or iPad manufactured by Apple, Inc. of Cupertino, CA; the BlackBerry devices manufactured by Research in Motion, Ltd. of Waterloo, Ontario, Canada; Windows Mobile devices manufactured by Microsoft Corp. of Redmond, WA; the Xoom manufactured by Motorola, Inc. of Libertyville, IL; devices capable of running the Android platform provided by Google, Inc. of Mountain View, CA; or any other type and form of portable computing device.
  • the computing device 100 can be a virtual machine.
  • the virtual machine can be any virtual machine managed by a hypervisor.
  • the virtual machine can be managed by a hypervisor executing on a server 106 or a hypervisor executing on a client 102.
  • the computing device 100 can in some embodiments execute, operate or otherwise provide an application that can be any one of the following: software; an application or program; executable instructions; a virtual machine; a hypervisor; a web browser; a web-based client; a client-server application; an ActiveX control; a Java applet; software related to voice over internet protocol (VoIP) communications like a soft IP telephone; an application for streaming video and/or audio or receiving and playing streamed video and/or audio; an application for facilitating real-time-data communications; an HTTP client; an FTP client; or any other set of executable instructions.
  • Still other embodiments include a client device 102 that displays application output generated by an application remotely executing on a server 106 or other remotely located machine. In these embodiments, the client device 102 can display the application output in an application window, a browser, or other output window.
  • the computing device 100 may further include a network interface 118 to interface to a Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (e.g., 802.11, T1, T3, 56kb, X.25, SNA, DECNET), broadband connections (e.g., ISDN, Frame Relay, ATM, Gigabit Ethernet, Ethernet-over-SONET), wireless connections, or some combination of any or all of the above. Connections can also be established using a variety of communication protocols (e.g., TCP/IP, IPX, SPX, NetBIOS, Ethernet, ARCNET, SONET, SDH, Fiber Distributed Data Interface (FDDI), or other protocols).
  • the network 104 can comprise one or more sub-networks, and can be installed between any combination of the clients 102, servers 106, computing machines and appliances included within the computing environment 101.
  • the network 104 can be: a local-area network (LAN); a metropolitan area network (MAN); a wide area network (WAN); a primary network 104 comprised of multiple sub-networks 104 located between the client machines 102 and the servers 106; a primary public network 104 with a private sub-network 104; a primary private network 104 with a public sub-network 104; or a primary private network 104 with a private sub-network 104.
  • the network topology of the network 104 can differ within different embodiments; possible network topologies include: a bus network topology; a star network topology; a ring network topology; a repeater-based network topology; or a tiered-star network topology.
  • Additional embodiments may include a network 104 of mobile telephone networks that use a protocol to communicate among mobile devices, where the protocol can be any one of the following: AMPS; TDMA; CDMA; GSM; GPRS; UMTS; or any other protocol able to transmit data among mobile devices.
  • the computing environment 101 can include more than one server 106A-106N such that the servers 106A-106N are logically grouped together into a server farm 106.
  • the server farm 106 can include servers 106 that are geographically dispersed and logically grouped together in a server farm 106, servers 106 that are located proximate to each other and logically grouped together in a server farm 106, or several virtual servers executing on physical servers.
  • Geographically dispersed servers 106A-106N within a server farm 106 can, in some embodiments, communicate with each other over one or more networks 104.
  • a physical event 180, such as a race, athletic event, or other event, includes one or more objects 181, such as a race car, boat, airplane, human, bulldozer, police car, etc.
  • the system receives position data 128 for each of the objects 181.
  • the system generates a virtual environment 184 with virtual objects 185 representing each object 181, at a location determined by received position data 128.
  • the system also identifies a position and direction for a virtual camera 186.
  • the system also identifies a zoom and/or focus for the virtual camera 186.
  • the system may identify any parameter for the camera, including white balance, filters, color temperature, bit depth, resolution, or any such parameters.
  • determining the position may include identifying an acceleration, a velocity, a vibration frequency or amplitude, or any other such displacement.
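  • A hypothetical parameter set for such a virtual camera is sketched below; the field names, defaults, and units are illustrative assumptions, not values drawn from the application.

```python
# Hypothetical parameter set for a virtual camera; field names are illustrative.
from dataclasses import dataclass

@dataclass
class VirtualCameraParams:
    position: tuple = (0.0, 0.0, 0.0)      # metres, virtual-environment frame
    direction: tuple = (1.0, 0.0, 0.0)     # unit view vector
    zoom: float = 1.0
    focus_distance_m: float = 50.0
    white_balance_k: float = 5600.0        # colour temperature in kelvin
    resolution: tuple = (1920, 1080)
    bit_depth: int = 8

# Example: a camera placed trackside with a moderate zoom.
params = VirtualCameraParams(position=(120.0, -4.0, 1.5), zoom=2.0)
print(params)
```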
  • the system renders a graphical view from the virtual camera 186 of a virtual event 187, corresponding to the physical event 180.
  • the rendered view from virtual camera 186 may comprise an accurate real-time representation of a view of the physical event from a real-world position corresponding to the position and direction of the virtual camera 186.
  • the rendered view may be provided as part of a media broadcast.
  • a broadcast of a race may use the virtualized event to show a viewpoint from a virtual camera where no physical camera has been placed, or even could be placed.
  • the virtual camera may be placed on the roadway, in a position locked to a vehicle such as a spoiler or bumper, or at any other location.
  • the virtual camera may be placed within a virtual vehicle, showing a simulation of what the driver sees. Because the view is virtualized, in some embodiments, the driver's hand motions may not be rendered. However, views through windshields may be appropriately rendered, providing a realistic simulated view. In a further embodiment, the virtual camera view from inside a vehicle may be used to recreate the driver's view during an accident, or spin-out, even though no actual camera existed inside the vehicle.
  • the virtual camera may be placed on the water's surface, on a mast of a ship, on a virtual chase plane or boat following a virtual ship object, or even in locations previously unfilmable, such as underwater.
  • the water may be rendered substantially more transparent than the real water, allowing an underwater virtual camera to view the positions of ships from distances much greater than would be possible in reality.
  • the event may comprise a military action, real or simulated for training purposes.
  • vehicles and troops may carry GPS transmitters, and the virtualization system may generate a virtual representation of the event, allowing a commander to move a virtual camera around the battlefield to see which areas are hidden from view, potential sniper or ambush locations, etc.
  • the lack of data from an opposing force is not a detriment, as the virtual camera may be used to locate areas that should be investigated by troops.
  • the rendered view may be provided to one or more client devices, including televisions, computers, tablet computers such as iPads, smart phones, or other devices.
  • client devices including televisions, computers, tablet computers such as iPads, smart phones, or other devices.
  • the resolution of the rendered environment may be drastically reduced, allowing real time transfer over very low bandwidth connections or connections with bandwidth below a predetermined threshold, or to devices with reduced processing capability below a predetermined threshold.
  • a high resolution virtual rendered view may be provided to a device capable of receiving and displaying high-definition video.
  • a very low resolution rendered view, a non-textured view, a wireframe view, or other simple virtualizations may be provided to devices capable of receiving and displaying only low resolution video.
  • static images may be delivered to devices, if necessary, or if useful for display purposes. For example, commentators on a media broadcast of an event may use a static rendered image from a particular viewpoint to display and discuss an interaction, such as a close call between two vehicles. The viewpoint may be selected to show the lateral displacement of the vehicle bumpers, for example.
  • the virtual environment may be pre-rendered, reducing processing load on the virtualization system.
  • the track may be mapped and rendered in advance. In one embodiment, satellite map data, such as that provided via Google Maps by Google, Inc. of Mountain View, California, may be used to generate the topology and texture of the virtual environment.
  • the virtual environment may be generic, such as where the event is a water race or an airplane acrobatic show, and the virtual environment need simply be an expanse of open water or sky.
  • GPS receivers and radio transmitters may not be needed.
  • a physical camera may be used to capture real-time images of the event, and an image recognition system may be used to detect participants and locate them within the environment.
  • an image recognition system may detect the locations of different players based on the colors of their uniforms. The players may then be rendered at the detected locations in a virtual environment, allowing a virtual camera to be positioned anywhere on the ice.
  • the system 200 includes car equipment 212 (e.g., a GPS receiver) positioned on the real-world car (i.e., dynamic or real object).
  • the GPS receiver 212 receives signals from multiple GPS satellites 205 and formulates a position of the car periodically throughout a race event 210.
  • the car may be configured with other equipment 212 as shown, such as an inertial measurement unit (IMU), telemetry, a mobile radio, and/or other types of communication (e.g., WiMAX, CDMA, etc.).
  • a base station or communication solution 214 is also provided locally forming a radio communication link with the car's mobile radio.
  • the base station 214 receives information from the car and relays it to a networked server 216.
  • the server 216 can communicate the information from the car to a database 232 via the network 220.
  • components of virtualization system 200 may be interconnected or interact in other configurations.
  • the radio transmitter sends position information and any other telemetry data that may be gathered from the dynamic object to the radio base station 214.
  • the position information is updated rapidly, such as at a rate of at least 30 Hz.
  • other event information 218, such as weather, flags, etc. may also be transmitted to the network server 216 from an event information system (not shown).
  • radio messages for each of the different dynamic vehicles are preferably discernable from each other and may be separated in time or frequency.
  • the communication between the car and the base station 214 is not limited to radio communication but may be any other type of communication, such as WiFi, WiMAX, 802.11, infrared light, laser, etc.
  • an event toolset 234 processes the database 232 to normalize data and/or to identify event scenarios.
  • web services 236 provide a web interface for searching and/or analyzing the database 232.
  • one or more media casters 238 process the database 232 to provide real-time or near real-time data streams for the real-world events to a client device 250.
  • media casters 238 may comprise virtualization and rendering engines, while in other embodiments, virtualization and rendering engines may be part of a server 216 or web server 236.
  • although FIG. 2B refers to auto racing, the systems and methods described herein may be applied to any real world event (e.g., a sport, a game, derby cars, a boat race, a horse race, a motorcycle race, a bike race, a travel simulation, a military action, etc.).
  • a rendered virtual view from the position of a virtual camera may be displayed.
  • the data collection system 300 may comprise a GPS antenna 301 and GPS unit 302.
  • the data collection system 300 may also comprise a programmable control unit or processor 303.
  • the data collection system 300 may also include an inertial measurement unit 304, and/or one or more input/output units 305a-305n.
  • the data collection system 300 may include a radio modem and radio antenna 307.
  • the data collection system 300 may include a storage device 308.
  • Data collection system 300 may further include a power supply unit 309, or connect to a power supply unit 309 of a vehicle.
  • a data collection and transmission system 300 may comprise a GPS antenna 301 and GPS unit 302.
  • GPS receivers are generally available and are used, for example, for navigation on board ships, to assist surveying operations, etc.
  • GPS is based on an older system named Navstar (NAVigation by Satellite Timing And Ranging). The GPS system is operated by U.S. military authorities.
  • GPS receiver unit 302 may comprise any type or form of GPS receiver, such as any of the models of OEMV receivers manufactured by NovAtel of Canada, the Condor family of GPS modules manufactured by Trimble Navigation, Ltd. of Sunnyvale, California, or any other GPS receiver.
  • GPS antenna 301 may comprise any type of single or dual frequency GPS antenna, such as a NovAtel GPS-702L antenna, or any other type and form of antenna.
  • in some embodiments, instead of a GPS antenna 301 and GPS unit 302, different position detection means may be employed. For example, laser measurements, short-range radio transponders placed in the raceway, or other position detection methods may be employed.
  • a camera and image recognition algorithm may be used to visually detect the position of one or more dynamic objects.
  • a data collection and transmission system 300 may comprise a processor or programmable control unit 303.
  • Programmable control unit (PCU) 303 may comprise a programmable computer capable of receiving, processing, and transforming digital and analog sensor data, and transmitting the data as a serial data stream to a radio modem 306.
  • the PCU 303 may comprise any type and form of programmable computer, and may comprise any of the types of computing device discussed above in connection with FIG. 1B.
  • the PCU may capture and transform sensor and GPS data into a serial data stream.
  • the PCU may include a timer and may provide a timestamp for values of the data stream.
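  • As a rough illustration of how a PCU might pack timestamped GPS and sensor values into a serial data stream, the sketch below uses a fixed-length binary record; the field list and encoding are assumptions for the example, not the format described in the application.

```python
# Sketch (assumptions, not the patented format): the PCU packs a timestamp,
# GPS fix, and sensor readings into a fixed-length binary record for the
# serial link to the radio modem.
import struct
import time

# timestamp, lat, lon as doubles; elevation, speed, heading, throttle as floats
RECORD_FORMAT = "<dddffff"

def pack_record(lat, lon, elev, speed, heading, throttle):
    return struct.pack(RECORD_FORMAT, time.time(), lat, lon, elev, speed, heading, throttle)

def unpack_record(raw: bytes):
    ts, lat, lon, elev, speed, heading, throttle = struct.unpack(RECORD_FORMAT, raw)
    return {"timestamp": ts, "lat": lat, "lon": lon, "elevation": elev,
            "speed": speed, "heading": heading, "throttle": throttle}

record = pack_record(45.620, 9.281, 237.0, 61.2, 112.5, 0.8)
print(unpack_record(record))
```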
  • a data collection and transmission system 300 may comprise an inertial measurement unit 304.
  • an inertial measurement unit 304 may comprise a gyroscopic-based attitude and heading reference system (AHRS) for providing drift-free 3D orientation and calibrated 3D acceleration, 3D rate of turn (rate gyro) and 3D earth-magnetic field data.
  • Inertial measurement unit 304 may comprise an MTi IMU from Xsens Motion Technology of the Netherlands, any of the models of iSensor IMUs from Analog Devices Inc. of Norwood, MA, or any other type and form of inertial measurement unit.
  • in some embodiments, the IMU 304 may be mounted in the center of the object, or in any other location. Embodiments utilizing the latter may require recalibration of sensor data.
  • a data collection and transmission system 300 may comprise one or more input/output units 305a-305n, referred to generally as I/O unit(s) 305.
  • an I/O unit 305 may comprise a sensor, such as a temperature sensor; fuel sensor; throttle position sensor; steering wheel, joystick or rudder position sensor; aileron position sensor; tachometer; radio signal strength sensor; odometer or speedometer sensor; transmission position sensor, or any other type and form of sensor.
  • an I/O unit 305 may comprise a switch, such as a brake light switch, headlight switch, or other switch, or receive a signal from or detect position of such a switch.
  • an I/O unit 305 may comprise a microphone or video camera.
  • an I/O unit 305 may further comprise an output interface, such as a display, light, speaker, or other interface for providing a signal to an operator of a dynamic object, such as a driver of a race car.
  • the output may include an indicator that the PCU 303 is receiving a signal from a GPS unit or is broadcasting properly, for example.
  • I/O units 305 may be connected to or comprise sensors or other devices within a controller area network (CAN) or vehicle data bus.
  • a data collection and transmission system 300 may comprise a radio modem 306 and radio antenna 307.
  • Radio modem 306 and radio antenna 307 may provide communication with a ground station or receiver, and may transmit serial data provided by PCU 303.
  • radio modem 306 may comprise an E-A F35 radio modem manufactured by Adeunis RF of France.
  • Radio modem 306 may be single- or multi-channel, and may have any RF power level, including 250 mW, 500 mW, 1 W or any other value.
  • a data collection and transmission system 300 may comprise a storage device 308, such as flash memory, for storing and buffering data, storing sensor calibration values, or storing translation or calculation programs of PCU 303. Any type and form of storage device 308 may be utilized.
  • data collection and transmission system 300 may further comprise a power supply unit 309.
  • power supply unit 309 may comprise a battery pack, solar panel, or other power supply.
  • the data collection system 300 may connect to the vehicle's engine or battery.
  • Data collection and transmission system 300 may be small and lightweight to meet requirements for auto racing or motorcycle racing, or to provide functionality without unduly burdening a human or animal carrying the system.
  • data collection and transmission systems 300 may be sufficiently small to be used by marathon runners, cyclists, camel or horse racers, players in a team sport, or in other such activities.
  • the size of the data collection and transmission system 300 may be reduced in various embodiments by removing unneeded components, such as an interface for a CAN bus when the system is to be used in a race without such a network, such as a motorcycle race.
  • the system may be less than 500 g in some embodiments, and may have dimensions of roughly 100 mm by 90 mm by 30 mm (±10%), or approximately 300 cubic centimeters in volume, such that the system may be easily installed in a vehicle without compromising performance, or may be carried by a person or animal. In other embodiments where space and weight are not at a premium, additional components and/or sensors may be included.
  • the system may be less than 250 g, less than 150 g, or may be less than 750 g, less than 1 kg, or any other such range.
  • the system may be less than 300 cubic centimeters in volume, such as 250 cubic centimeters, 200 cubic centimeters, or any such volume, or may be more than this volume, such as 350 cubic centimeters, 400 cubic centimeters, or any other such volume, and the length, depth, and width of the system may vary accordingly, as well as with respect to each other such that the aspect ratio is different than mentioned above.
  • FIG. 3B illustrates a block diagram of an embodiment of a virtualization engine or virtualization server 330.
  • virtualization server 330 may comprise a network interface 331 for receiving data from data collection and transmission system(s) 300 and providing rendered images or video to client devices; a real-data location module 332 for interpreting and/or collating received data and mapping location data of real objects into a virtual environment 334; a rendering engine 336 for rendering views of virtual objects in the virtual environment; a processor 338; and a storage device 339.
  • Processor 338, storage device 339 and network interface 331 may comprise any type or form of processor, storage devices, and network interfaces discussed above in connection with FIG. 1B.
  • a real-data location module 332 may comprise an application, service, daemon, server, or other executable code for determining a virtual location of a real-data object in the virtual environment 334 based on a real location of the real-data object in the real environment, and responsive to received sensor data such as GPS or IMU sensors.
  • real-data location module 332 may comprise functionality for receiving multiple sets of location or position information from multiple objects in the real environment, such as multiple race cars, and translating the information into virtual code objects for placement within a virtual environment 334.
  • Virtual code objects may comprise data sets of object identifiers, position data, velocity and direction data, heading data, etc., and real-data location module 332 may collate the data and generate a record with the object identifier for processing by the rendering engine 336.
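  • A simple way to translate received GPS fixes into virtual-environment coordinates is an equirectangular approximation around a reference point, sketched below; this particular projection and the function name are illustrative assumptions, not a method prescribed by the application.

```python
# Illustrative sketch, assuming a simple equirectangular approximation: map a
# GPS fix (latitude, longitude, elevation) into a local x/y/z frame anchored
# at a reference point of the virtual environment (e.g. the start line).
import math

EARTH_RADIUS_M = 6_371_000.0

def gps_to_local(lat_deg, lon_deg, elev_m, ref_lat_deg, ref_lon_deg, ref_elev_m):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    x = EARTH_RADIUS_M * (lon - ref_lon) * math.cos(ref_lat)   # east
    y = EARTH_RADIUS_M * (lat - ref_lat)                       # north
    z = elev_m - ref_elev_m                                    # up
    return (x, y, z)

# Example: a car roughly 80 m east and 110 m north of the reference point.
print(gps_to_local(43.7385, 7.4270, 12.0, 43.7375, 7.4260, 10.0))
```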
  • Virtual environment 334 may comprise a simulated virtual environment based on a real environment, such as a race track, expanse of ocean or sky, ground terrain, city environment, outer space or orbit environment, or other real environment.
  • virtual environment 334 may comprise terrain and texture maps, and textures applied to the maps.
  • a representation of the local environment for the event includes position information of static objects (i.e., track).
  • the position information includes latitude, longitude, and elevation of points along the race track.
  • points can be obtained from a topographical map, such as Google Earth, and/or any other map source.
  • Rendering engine 336 may comprise an application, service, daemon, server, routine, or other executable code for rendering a 3D or 2D image from one or more virtual camera viewpoints within a virtual environment 334.
  • rendering engine 336 may utilize ray tracing, ray casting, scanline rendering, z-buffering, or any other rendering techniques, and may generate wireframe, polygon, or textured images.
  • rendering engine 336 may render the view of a virtual camera in real-time.
  • rendering engine 336 may render stereoscopic views from two displaced virtual cameras. Virtual cameras may be placed arbitrarily throughout the virtual environment, including inside virtual objects, and may have static positions or may travel through the environment, for example, following or positioned relative to a particular object.
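  • A chase-style virtual camera positioned relative to a particular virtual object could, for example, be computed as sketched below; the offsets and heading convention are assumptions made only for illustration.

```python
# Hypothetical chase-camera placement: position the virtual camera a fixed
# distance behind and above a virtual object, looking along its heading.
import math

def chase_camera(obj_pos, heading_deg, back_m=8.0, up_m=3.0):
    h = math.radians(heading_deg)
    forward = (math.sin(h), math.cos(h), 0.0)       # heading measured from north
    cam_pos = (obj_pos[0] - back_m * forward[0],
               obj_pos[1] - back_m * forward[1],
               obj_pos[2] + up_m)
    return cam_pos, forward                          # camera position and view direction

pos, view = chase_camera(obj_pos=(120.0, 40.0, 0.0), heading_deg=90.0)
print(pos, view)   # camera 8 m behind the object, 3 m up, looking east
```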
  • rendering engine 336 may render environmental data in the virtual environment, based on real-world data of the environment, including realistic time-of-day lighting, weather, flags or signs, wave height, clouds, or other data. As discussed above, in some embodiments, rendering engine 336 may generate low-resolution rendered images or video, for display by client devices with reduced processing power or slower network connectivity. In some embodiments, rendered images or video may be provided to a media server or HTTP server for streaming or pseudo-streaming to one or more client devices. Multiple media servers or casters may be located in a geographically dispersed arrangement (e.g. worldwide) to provide low-latency connections to client devices. In one embodiment, rendering and/or streaming may be offloaded to a server farm or rendering or streaming engine operated via a cloud service.
  • users may view rendered images or video through a media playback system, such as a television, computer media player application, web page, or other interface.
  • users may view rendered images or video through an interactive application.
  • the application may provide capability for the user to specify a virtual camera position within the virtual environment or otherwise move or rotate the virtual camera, or select from a plurality of currently-rendered virtual cameras.
  • the application may transmit a request to the virtualization system to generate a virtual camera at the specified location and generate a new rendered image or video.
  • the application may allow interaction with images or 3D data of the virtual environment, such as measuring displacement between two virtual objects: pausing, playing back, rewinding, or fast-forwarding video or playing video in slow-motion; zooming in on a part of an image; labeling an image or virtual object; or any other interaction.
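  • For instance, measuring the displacement between two virtual objects reduces to a distance computation on their virtual-environment positions, as in this illustrative sketch (not from the application):

```python
# Sketch of one such interaction: measuring the separation between two virtual
# objects (e.g. two bumpers) directly from their virtual-environment positions.
import math

def separation_m(pos_a, pos_b):
    return math.dist(pos_a, pos_b)   # Euclidean distance, Python 3.8+

gap = separation_m((102.3, 55.0, 0.0), (101.9, 54.2, 0.0))
print(f"separation: {gap:.2f} m")
```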
  • different data collection and transmission systems 300 and/or different communications networks may be used to provide flexibility and reliability, particularly in events that occur across a wider geographic area. For example, many rally races include circuits covering over 50 kilometers. High-bandwidth radio coverage of the entire course may be expensive and impractical. Accordingly, a hybrid system may be implemented to provide reduced data via a wide area communication system, such as satellite phones or cellular modems, and increased or high resolution data at one or more positions along the course via a radio network. Thus, in regions with radio coverage, high resolution data may be obtained from a data capture and transmission system via short or medium range radio, and in regions without radio coverage, lower resolution data may be obtained via cellular or other networks.
  • a vehicle, animal, or participant may carry or be configured with a data acquisition and transmission system 300 which may communicate via short or medium range radio to one or more receivers to provide high-resolution and detailed tracking information to a tracking system 408.
  • Such high-resolution data may include GPS positioning data, inertial measurements, acceleration data, and other sensor data.
  • the vehicle, animal, or participant may carry or be configured with a second mobile device 402, which may communicate position data via a cellular or satellite network.
  • the mobile device 402 may comprise a smart phone, tablet computer, or other such device, such as an Android operating system smart phone or Apple iOS operating system smart phone. Such devices frequently include GPS capability and may provide position data via the cellular or satellite network to tracking system 408. In many embodiments, the mobile device 402 may also comprise a compass and provide directional data to tracking system 408, while in others, only position data may be communicated.
  • position data may be provided to a marshalling server 404 for use in controlling the race or event.
  • Marshalling server 404 may comprise a server or service operated by a computing device for receiving position and/or direction information from mobile devices 402.
  • Such position and/or direction information may be of particular use in large events, such as a rally circuit where vehicles may be widely separated.
  • a marshal may receive position information via a map or other interface provided via a marshalling interface 406 provided by a computing device, such as a tablet or smart phone, said position information compiled by marshalling server 404 and transmitted to the computing device.
  • marshalling server 404 may provide an overview of a race or event including positions of vehicles or participants, allowing marshals to make informed decisions regarding start or hold times, safety commands, etc.
  • the server may also provide marshals with direct feedback in case of an accident, crash, or injury, or may indicate a vehicle or participant has stopped, implying a potential hazard.
  • penalty situations may be automatically identified, such as a vehicle leaving the course to take a hidden short cut.
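  • One hypothetical way to flag such a penalty situation is to compare a reported position against the course centreline, as sketched below; the 20-metre offset threshold and the nearest-sampled-point comparison are assumptions made only for illustration.

```python
# Illustrative penalty check (assumed threshold): flag a participant whose
# reported position strays more than a set distance from the nearest sampled
# point of the course centreline.
import math

def off_course(position, centreline, max_offset_m=20.0):
    """centreline is a list of (x, y) points in local coordinates."""
    nearest = min(math.hypot(position[0] - cx, position[1] - cy)
                  for cx, cy in centreline)
    return nearest > max_offset_m

course = [(0.0, 0.0), (50.0, 0.0), (100.0, 10.0), (150.0, 30.0)]
print(off_course((98.0, 8.0), course))    # False: on course
print(off_course((98.0, 90.0), course))   # True: possible short cut / hazard
```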
  • communication between mobile devices 402 and marshalling server 404 may be bi-directional, allowing transmission of indicators to vehicles or participants, such as hazard indicators, or start indicators.
  • Position data from the mobile device 402 and data acquisition modules 300 may be provided to a tracking system 408, which may comprise an application, server, service, or other logic executed by a computing device.
  • Tracking system 408 may compile data from devices 402, 300, to provide telemetry data for viewing by client devices executing telemetry viewing applications 410, or for rendering by a virtualization server 330 for viewing on client devices 250 as discussed above.
  • a client device 250 may execute a telemetry viewer 410, or the viewer may be executed by a separate device.
  • Telemetry viewer 410 may display a map indicating position and/or direction of vehicles or other participants in an event, with position and/or direction data obtained by mobile devices 402.
  • Low-resolution data may be inadequate for virtualization, but may be displayed on the map, such as when rally vehicles are out of range of radio receivers. Accordingly, the telemetry viewer 410 may be useful for broadcast enrichment, tracking, and marshalling. Conversely, high-resolution data may be virtualized as discussed above and used for high-quality broadcast enrichment, virtual cameras, game play, separation measurement or replay, or other such uses.
  • the module may record or store data for later upload to tracking system 408 and/or virtualization server 330. While this data may not be provided in real-time, it may still be useful for replays with virtual cameras or similar uses.
  • a tracking system may receive tracking data from a device, such as a data acquisition and transmission module or mobile device installed in a vehicle or carried by a participant.
  • the data may be received via a high-bandwidth but limited-range radio network, or via cellular or satellite modem.
  • the data may be limited and not include inertial measurements or control states, or may be sent at low temporal resolution, such as providing position data every 10 seconds or every minute, rather than multiple times per second.
  • the tracking system may identify position information for the vehicle or participant.
  • the tracking system may identify heading or direction information, or may interpolate such heading or direction information based on previous measurements. For example, the tracking system may identify a current position for a vehicle, a previous position for a vehicle, and extend a vector between the positions to identify a likely present heading. Similarly, the tracking system may identify a speed for the vehicle or participant based on distance travelled between successive measurements.
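  • The interpolation described in the preceding paragraph can be sketched as follows; the local east/north coordinate convention and the fix format are assumptions made for the example.

```python
# Sketch of the interpolation described above: derive heading and speed from
# two successive position fixes when the device reports position only.
import math

def interpolate_heading_speed(prev_fix, curr_fix):
    """Each fix is (x_m, y_m, timestamp_s) in local east/north coordinates."""
    dx = curr_fix[0] - prev_fix[0]
    dy = curr_fix[1] - prev_fix[1]
    dt = curr_fix[2] - prev_fix[2]
    heading_deg = math.degrees(math.atan2(dx, dy)) % 360.0   # 0 deg = north
    speed_mps = math.hypot(dx, dy) / dt if dt > 0 else 0.0
    return heading_deg, speed_mps

# A fix 10 s after the previous one, 300 m further east: heading 90 deg, 30 m/s.
print(interpolate_heading_speed((0.0, 0.0, 0.0), (300.0, 0.0, 10.0)))
```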
  • the tracking system may update position information within a database and/or on a map for the vehicle or participant.
  • Position information may be two dimensional or three dimensional, based on capabilities of the mobile device, and the data may be updated accordingly.
  • the tracking system may compile position information for the vehicle or participant with position information of other vehicles or participants and may provide or display position information of a plurality of participants or vehicles to telemetry viewers of client devices.
  • The data may be of high accuracy or resolution.
  • the tracking system may identify position, direction, speed, and/or any other data, including control states or positions, throttle or gear information, or other details.
  • a database and/or map may be updated with the information, previously interpolated values may be updated with measured values, and/or the data may be compiled with data of other vehicles or participants.
  • A virtualization server may render a three-dimensional view from a virtual camera angle to display the vehicle and/or participant for high-quality broadcast enrichment or gaming purposes.
  • The implementation can be provided as a computer program product (i.e., a computer program tangibly embodied in an information carrier).
  • The implementation can, for example, be in a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus.
  • The implementation can, for example, be executed by a programmable processor, a computer, and/or multiple computers.
  • The systems described above may provide multiple ones of any or each of those components, and these components may be provided on either a standalone machine or, in some embodiments, on multiple machines in a distributed system.
  • the systems and methods described above may be implemented as a method, apparatus or article of manufacture using programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof.
  • the systems and methods described above may be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture.
  • The term "article of manufacture" is intended to encompass code or logic accessible from and embedded in one or more computer-readable devices, firmware, programmable logic, memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, SRAMs, etc.), hardware (e.g., an integrated circuit chip, Field Programmable Gate Array (FPGA), Application Specific Integrated Circuit (ASIC), etc.), electronic devices, or a computer-readable non-volatile storage unit (e.g., CD-ROM, floppy disk, hard disk drive, etc.).
  • the article of manufacture may be accessible from a file server providing access to the computer-readable programs via a network transmission line, wireless transmission media, signals propagating through space, radio waves, infrared signals, etc.
  • the article of manufacture may be a flash memory card or a magnetic tape.
  • the article of manufacture includes hardware logic as well as software or programmable code embedded in a computer readable medium that is executed by a processor.
  • The computer-readable programs may be implemented in any programming language or combination of programming languages.
  • the software programs may be stored on or in one or more articles of manufacture as object code.
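As referenced in the tracking discussion above, the following is a minimal, illustrative Python sketch of how a tracking system might interpolate a heading and speed from two successive timestamped GPS fixes; the function and value names are hypothetical, and the flat-earth approximation is only adequate over short distances.

    import math

    def interpolate_heading_and_speed(prev_fix, curr_fix):
        """Estimate heading (degrees from north, clockwise) and speed (m/s)
        from two GPS fixes given as (lat_deg, lon_deg, unix_time_s)."""
        lat1, lon1, t1 = prev_fix
        lat2, lon2, t2 = curr_fix
        # Approximate local metric offsets (equirectangular projection)
        mean_lat = math.radians((lat1 + lat2) / 2.0)
        east = math.radians(lon2 - lon1) * math.cos(mean_lat) * 6371000.0
        north = math.radians(lat2 - lat1) * 6371000.0
        heading = math.degrees(math.atan2(east, north)) % 360.0
        speed = math.hypot(east, north) / max(t2 - t1, 1e-6)
        return heading, speed

    # Two fixes ten seconds apart, roughly 100 m due north of each other
    print(interpolate_heading_and_speed((46.0000, 7.0, 0.0), (46.0009, 7.0, 10.0)))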

Abstract

The present application is directed to systems and methods for generating virtual viewing of physical events. Event participants may have GPS receivers and data collection modules installed in vehicles, such as cars for vehicle racing events, boats for sailing races, planes for air races or acrobatic shows, or similar entities, or within equipment, such as helmets, padding, packs, or similar gear, as well as transmitters capable of sending the GPS data and collected data to local receivers for collation and processing. The data may be provided to a virtualization engine, which may generate a virtual environment modeled on the real world environment in which the event occurs. The virtualization engine may use the received data to generate virtual objects representing each event participant and/or their vehicle, may place the virtual objects within the virtual environment at locations determined by positioning data, and may render images or video from one or more virtual cameras within the virtual environment for display to clients.

Description

This application claims priority to and the benefit of U.S. Provisional Application No. 60/524,742, entitled "Systems and Methods for Virtual Viewing of Physical Events," filed August 17, 2011, the entirety of which is hereby incorporated by reference.
Field of the Invention
The methods and systems described herein relate generally to graphical virtualization of physical events. In particular, the methods and systems described herein relate to data capture of physical events, generation of a virtual representation of the event, and display of the virtual representation of the event.
Background of the Invention
GPS technology has been integrated into many events, such as sporting events, with GPS receivers placed on participants, able to record and provide position and velocity information to external servers. For example, Formula One race cars may include GPS receivers and transmit GPS data to local receivers to be interpreted by pit crews, media broadcasters, race supervisors, or others. This information may also be provided to spectators, allowing them to see near realtime positions of race cars on a map of the racetrack.
However, such visualizations are typically limited to top-down views of the track, with dots or symbols representing each car as they move around the track. Spectators wishing to view the actual race are limited to what they can see in person, or camera views from cameras prepositioned around the track. Additionally, while some vehicles may have internal cameras, allowing viewers to see what the driver sees, most vehicles do not. Accordingly, broadcasts of the event are typically limited to predetermined views.
Summary
The present application is directed to systems and methods for generating virtual viewing of physical events. Event participants may have GPS receivers and data collection modules installed in vehicles, such as cars for vehicle racing events, boats for sailing races, planes for air races or acrobatic shows, or similar entities, or within equipment, such as helmets, padding, packs, or similar gear, as well as transmitters capable of sending the GPS data and collected data to local receivers for collation and processing. The data may be provided to a virtualization engine, which may generate a virtual environment modeled on the real world environment in which the event occurs. The virtualization engine may use the received data to generate virtual objects representing each event participant and/or their vehicle, and may place the virtual objects within the virtual environment at locations determined by positioning data.
The virtualization engine may further generate one or more viewpoints or virtual cameras, and may place the cameras anywhere within the virtual environment, including within the virtual objects. The virtualization engine may then render, in real time or near real time, a realistic view of the virtual environment and virtual objects. The rendered view may thus comprise a realistic simulated view of the physical event. Unlike physical cameras, however, the rendered view may be arbitrarily positioned, including above participants, below participants, inside participants or their vehicles, in the middle of a track or path of participants, or anywhere
else. Additionally, the virtual cameras may be moved or rotated during the event, and the rendered simulation may be paused, rewound, or slowed.
Accordingly, viewers may be able to have a customizable, personal view of an event. Furthermore, broadcasters may be able to show rendered simulations of additional views for commentary purposes, such as pausing action during a turn to allow a view of the distance between two vehicle bumpers; providing a simulated view from within a peloton of the Tour de France; providing a simulated view of the inside of a vehicle during a crash, even if the vehicle has no camera; or any other view. Accordingly, live broadcasts can be enhanced in ways previously impossible.
In one aspect, the present application is directed to a system for virtualization of a physical event. The system includes a computing device comprising a processor configured to execute a tracking server and a virtualization engine. The tracking server is configured for receiving position data from one or more additional computing devices. The virtualization engine is configured for (a) generating a virtual environment comprising: (i) a virtual object corresponding to each of the one or more additional computing devices, each virtual object having a position within the virtual environment corresponding to the received position data for the corresponding computing device, and (ii) a virtual camera within the virtual environment; and (b) rendering an image of the virtual environment corresponding to a view of the virtual camera.
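Purely for illustration, and not as a statement of the claimed subject matter, the division of labour described above might be sketched in Python as follows; all class and method names are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Position:
        x: float
        y: float
        z: float = 0.0

    @dataclass
    class VirtualObject:
        device_id: str
        position: Position

    @dataclass
    class VirtualCamera:
        position: Position
        heading_deg: float = 0.0

    class TrackingServer:
        """Collects the latest reported position for each participant device."""
        def __init__(self):
            self.latest = {}

        def receive(self, device_id, position):
            self.latest[device_id] = position

    class VirtualizationEngine:
        """Builds virtual objects from tracked positions and renders a camera view."""
        def __init__(self, camera):
            self.camera = camera

        def build_objects(self, tracking):
            return [VirtualObject(dev, pos) for dev, pos in tracking.latest.items()]

        def render(self, objects):
            # Stand-in for a real renderer: describe what would be drawn
            return {"camera": self.camera, "objects": objects}

    tracker = TrackingServer()
    tracker.receive("car_7", Position(120.0, 45.5))
    engine = VirtualizationEngine(VirtualCamera(Position(100.0, 40.0, 15.0)))
    frame = engine.render(engine.build_objects(tracker))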
In one embodiment of the system, the tracking server is further configured for receiving direction and speed data from the one or more computing devices; and each virtual object has a direction and speed within the virtual environment corresponding to the received position data. In another embodiment of the system, the virtual camera has a position, speed, and direction within the virtual environment. In a further embodiment, the position, speed, and direction of the virtual camera are based on the position, speed, and direction of a virtual object.
In another embodiment of the system, the virtualization engine is configured for transmitting the rendered image to a client device for display; and rendering the image comprises rendering a wireframe, untextured, or other reduced-quality representation or image of the virtual environment and/or virtual object, responsive to a processing capability of the client device being less than a predetermined threshold. In yet another embodiment of the system, the virtualization engine is configured for transmitting the rendered image to a client device for display; and rendering the image comprises rendering a wireframe, untextured, or other reduced-quality representation or image of the virtual environment and/or virtual object, responsive to a bandwidth of a connection to the client device being less than a predetermined threshold.
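As a small illustrative sketch (the threshold values and names are assumptions, not values from the specification), selecting a reduced-quality representation based on client capability or connection bandwidth might look like:

    def select_render_mode(client_gflops, bandwidth_kbps,
                           min_gflops=50.0, min_kbps=2000.0):
        """Return a render mode given hypothetical capability/bandwidth thresholds."""
        if client_gflops < min_gflops or bandwidth_kbps < min_kbps:
            return "wireframe"   # reduced-quality, untextured representation
        return "textured"        # full-quality representation

    assert select_render_mode(10.0, 5000.0) == "wireframe"
    assert select_render_mode(120.0, 8000.0) == "textured"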
In some embodiments of the system, the tracking server is further configured for receiving position data from the one or more additional computing devices via a cellular network, and is configured for interpolating a direction for each of the one or more additional computing devices based on the received position data and prior received position data. In a further embodiment, the virtual camera has a position above the virtual environment, and the rendered image comprises a map.
In one embodiment of the system, the tracking server is further configured for receiving a first set of low temporal resolution position data from a second computing device via a first network and for receiving a second set of high temporal resolution position data from the second computing device via a second network at a subsequent time; and the virtualization engine is configured for rendering a first image comprising a map displaying a virtual object having a position corresponding to the first set of low temporal resolution position data, and rendering a second image comprising a three-dimensional view displaying a second virtual object having a position corresponding to the second set of high temporal resolution position data at the subsequent time. In a further embodiment, the first virtual object comprises an icon, and the second virtual object comprises a three-dimensional object corresponding to a vehicle or participant carrying the second computing device.
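For illustration only (names and the threshold are assumptions), the choice between a 2D map icon and a full 3D object based on the temporal resolution of the received data could be sketched as:

    def choose_representation(sample_interval_s, high_res_threshold_s=1.0):
        """Show low-rate data (e.g. one fix per 10 s) as a map icon;
        virtualize high-rate data as a three-dimensional object."""
        return "3d_object" if sample_interval_s <= high_res_threshold_s else "map_icon"

    assert choose_representation(10.0) == "map_icon"     # cellular/satellite path
    assert choose_representation(0.033) == "3d_object"   # ~30 Hz radio path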
In another aspect, the present application is directed to a method for virtualization of a physical event. The method includes receiving position data, by a tracking server executed by a processor of a computing device, from one or more additional computing devices. The method also includes generating a virtual environment, by a virtualization engine executed by the processor of the computing device, the virtual environment comprising: (i) a virtual object corresponding to each of the one or more additional computing devices, each virtual object having a position within the virtual environment corresponding to the received position data for the corresponding computing device, and (ii) a virtual camera within the virtual environment. The method further includes rendering an image of the virtual environment, by the virtualization engine, corresponding to a view of the virtual camera.
In one embodiment, the method includes receiving direction and speed data by the tracking server from the one or more computing devices; and each virtual object has a direction and speed within the virtual environment corresponding to the received position data. In another embodiment, the method includes the virtual camera having a position, speed, and direction within the virtual environment. In a further embodiment, the position, speed, and direction of the virtual camera are based on the position, speed, and direction of a virtual object.
In some embodiments, the method includes transmitting the rendered image to a client device for display; and rendering the image comprises rendering a wireframe, untextured, or other reduced-quality representation or image of the virtual environment and/or virtual object, responsive to a processing capability of the client device being less than a predetermined threshold. In other embodiments, the method includes transmitting the rendered image to a client device for display; and rendering the image comprises rendering a wireframe image of the virtual environment, responsive to a bandwidth of a connection to the client device being less than a predetermined threshold.
In one embodiment, the method includes receiving position data from the one or more additional computing devices via a cellular network, and interpolating a direction for each of the one or more additional computing devices based on the received position data and prior received position data. In a further embodiment, the method includes the virtual camera having a position above the virtual environment, and the rendered image comprises a map.
In some embodiments, the method includes receiving, by the tracking server, a first set of low temporal resolution position data from a second computing device via a first network. The method also includes rendering, by the virtualization engine, a first image comprising a map displaying a virtual object having a position corresponding to the first set of low temporal resolution position data. The method further includes receiving, by the tracking server, a second set of high temporal resolution position data from the second computing device via a second network at a subsequent time. The method also includes rendering, by the virtualization engine, a second image comprising a three-dimensional view displaying a second virtual object having a position corresponding to the second set of high temporal resolution position data at the subsequent time. In a further embodiment, the first virtual object comprises an icon, and the second virtual object comprises a three-dimensional object corresponding to a vehicle or participant carrying the second computing device. The details of various embodiments of the invention are set forth in the accompanying drawings and the description below.
Brief Description of the Drawings
The foregoing and other objects, aspects, features, and advantages of the invention will become more apparent and better understood by referring to the following description taken in conjunction with the accompanying drawings, in which:
Figure 1A is a block diagram illustrative of an embodiment of a networked environment useful for the systems and methods described in this document;
Figure 1B is a block diagram illustrative of a certain embodiment of a computing machine for practicing the methods and systems described herein;
Figure 2 A is a block diagram of an embodiment of a system for virtualization of a physical event;
Figure 2B is another block diagram of an embodiment of a virtualization system;
Figure 3 A is a block diagram of an embodiment of a data capture and transmission system;
Figure 3B is a block diagram of an embodiment of a virtualization system;
Figure 4 is a block diagram of an embodiment of a hybrid short/long range data capture and transmission system; and
Figure 5 is a flow chart of an embodiment of a method for providing a virtual display of a physical event utilizing a hybrid short/long range data capture and transmission system.
The features and advantages of the present invention will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements.
Detailed Description
Prior to discussing methods and systems for generating a virtualized representation of a physical event, it may be helpful to discuss embodiments of computing systems useful for practicing these methods and systems. Referring first to Figure 1 A, illustrated is one
embodiment of a networked environment 101 in which a simulated environment can be provided. As shown in FIG. 1A, the networked environment 101 includes one or more client machines 102A-102N (generally referred to herein as "client machine(s) 102" or "client(s) 102") in communication with one or more servers 106A-106N (generally referred to herein as "server machine(s) 106" or "server(s) 106") over a network 104. The client machine(s) 102 can, in some embodiments, be referred to as a single client machine 102 or a single group of client machines 102, while server(s) 106 may be referred to as a single server 106 or a single group of servers 106. Although four client machines 102 and four server machines 106 are depicted in FIG. 1A, any number of clients 102 may be in communication with any number of servers 106. In one embodiment, a single client machine 102 communicates with more than one server 106, while in another embodiment a single server 106 communicates with more than one client machine 102. In yet another embodiment, a single client machine 102 communicates with a single server 106. Further, although a single network 104 is shown connecting client machines 102 to server machines 106, it should be understood that multiple, separate networks may connect a subset of client machines 102 to a subset of server machines 106. In one embodiment, the computing environment 101 can include an appliance (not shown in FIG. 1A) installed between the server(s) 106 and client machine(s) 102. This appliance can manage client/server connections, and in some cases can load balance connections made by client machines 102 to server machines 106. Suitable appliances are manufactured by any one of the following companies: the Citrix Systems, Inc. Application Networking Group and Silver Peak Systems, Inc., both of Santa Clara, California; Riverbed Technology, Inc. of San Francisco, California; F5 Networks, Inc. of Seattle, Washington; or Juniper Networks, Inc. of Sunnyvale, California.
Clients 102 and servers 106 may be provided as a computing device 100, a specific embodiment of which is illustrated in Figure 1B. Included within the computing device 100 is a system bus 150 that communicates with the following components: a central processing unit 121; a main memory 122; storage memory 128; an input/output (I/O) controller 123; display devices 124A-124N; an installation device 116; and a network interface 118. In one embodiment, the storage memory 128 includes: an operating system, software routines, and a client agent 120. The I/O controller 123, in some embodiments, is further connected to one or more input devices. As shown in Figure 1B, the I/O controller 123 is connected to a camera 125, a keyboard 126, a pointing device 127, and a microphone 129.
Embodiments of the computing machine 100 can include a central processing unit 121 characterized by any one of the following component configurations: logic circuits that respond to and process instructions fetched from the main memory unit 122; a microprocessor unit, such as: those manufactured by Intel Corporation; those manufactured by Motorola Corporation; those manufactured by Transmeta Corporation of Santa Clara, California; the RS/6000 processor such as those manufactured by International Business Machines; a processor such as those manufactured by Advanced Micro Devices; or any other combination of logic circuits. Still other embodiments of the central processing unit 122 may include any combination of the following: a microprocessor, a microcontroller, a central processing unit with a single processing core, a central processing unit with two processing cores, or a central processing unit with more than one processing core.
While Figure 1B illustrates a computing device 100 that includes a single central processing unit 121, in some embodiments the computing device 100 can include one or more processing units 121. In these embodiments, the computing device 100 may store and execute firmware or other executable instructions that, when executed, direct the one or more processing units 121 to simultaneously execute instructions or to simultaneously execute instructions on a single piece of data. In other embodiments, the computing device 100 may store and execute firmware or other executable instructions that, when executed, direct the one or more processing units to each execute a section of a group of instructions. For example, each processing unit 121 may be instructed to execute a portion of a program or a particular module within a program.
In some embodiments, the processing unit 121 can include one or more processing cores. For example, the processing unit 121 may have two cores, four cores, eight cores, etc. In one embodiment, the processing unit 121 may comprise one or more parallel processing cores. The processing cores of the processing unit 121 may in some embodiments access available memory as a global address space, or in other embodiments, memory within the computing device 100 can be segmented and assigned to a particular core within the processing unit 121. In one embodiment, the one or more processing cores or processors in the computing device 100 can each access local memory. In still another embodiment, memory within the computing device 100 can be shared amongst one or more processors or processing cores, while other memory can be accessed by particular processors or subsets of processors. In embodiments where the computing device 100 includes more than one processing unit, the multiple processing units can be included in a single integrated circuit (IC). These multiple processors, in some embodiments, can be linked together by an internal high speed bus, which may be referred to as an element interconnect bus.
In embodiments where the computing device 100 includes one or more processing units 121, or a processing unit 121 including one or more processing cores, the processors can execute a single instruction simultaneously on multiple pieces of data (SIMD), or in other embodiments can execute multiple instructions simultaneously on multiple pieces of data (MIMD). In some embodiments, the computing device 100 can include any number of SIMD and MIMD processors.
The computing device 100, in some embodiments, can include a graphics processor or a graphics processing unit (not shown). The graphics processing unit can include any combination of software and hardware, and can further input graphics data and graphics instructions, render a graphic from the inputted data and instructions, and output the rendered graphic. In some embodiments, the graphics processing unit can be included within the processing unit 121. In other embodiments, the computing device 100 can include one or more processing units 121, where at least one processing unit 121 is dedicated to processing and rendering graphics.
One embodiment of the computing device 100 provides support for any one of the following installation devices 116: a CD-ROM drive, a CD-R/RW drive, a DVD-ROM drive, tape drives of various formats, a USB device, a bootable medium, a bootable CD, a bootable CD for a GNU/Linux distribution such as KNOPPIX®, a hard drive, or any other device suitable for installing applications or software. Applications can in some embodiments include a client agent 120, or any portion of a client agent 120. The computing device 100 may further include a storage device 128 that can be either one or more hard disk drives, or one or more redundant arrays of independent disks; where the storage device is configured to store an operating system, software, programs, applications, or at least a portion of the client agent 120. A further embodiment of the computing device 100 includes an installation device 116 that is used as the storage device 128.
Embodiments of the computing device 100 include any one of the following I/O devices 130A-130N: a camera 125; a keyboard 126; a pointing device 127; a microphone 129; mice; trackpads; an optical pen; trackballs; microphones; drawing tablets; video displays; speakers; inkjet printers; laser printers; dye-sublimation printers; a touch screen; or any other input/output device able to perform the methods and systems described herein. An I/O controller 123 may in some embodiments connect to multiple I/O devices 130A-130N to control the one or more I/O devices. Some embodiments of the I/O devices 130A-130N may be configured to provide storage or an installation medium 116, while others may provide a universal serial bus (USB) interface for receiving USB storage devices such as the USB Flash Drive line of devices manufactured by Twintech Industry, Inc. Still other embodiments include an I/O device 130 that may be a bridge between the system bus 150 and an external communication bus, such as: a USB bus; an Apple Desktop Bus; an RS-232 serial connection; a SCSI bus; a FireWire bus; a FireWire 800 bus; an Ethernet bus; an AppleTalk bus; a Gigabit Ethernet bus; an Asynchronous Transfer Mode bus; a HIPPI bus; a Super HIPPI bus; a SerialPlus bus; a SCI/LAMP bus; a FibreChannel bus; or a Serial Attached small computer system interface bus.
In some embodiments, the computing machine 100 can execute any operating system, while in other embodiments the computing machine 100 can execute any of the following operating systems: versions of the MICROSOFT WINDOWS operating systems such as WINDOWS 3.x; WINDOWS 95; WINDOWS 98; WINDOWS 2000; WINDOWS NT 3.51; WINDOWS NT 4.0; WINDOWS CE; WINDOWS XP; WINDOWS VISTA; and WINDOWS 7; the different releases of the Unix and Linux operating systems; any version of the MAC OS manufactured by Apple Computer; OS/2, manufactured by International Business Machines; any embedded operating system; any real-time operating system; any open source operating system; any proprietary operating system; any operating systems for mobile computing devices; or any other operating system. In still another embodiment, the computing machine 100 can execute multiple operating systems. For example, the computing machine 100 can execute
PARALLELS or another virtualization platform that can execute or manage a virtual machine executing a first operating system, while the computing machine 100 executes a second operating system different from the first operating system.
The computing machine 100 can be embodied in any one of the following computing devices: a computing workstation; a desktop computer; a laptop or notebook computer; a server; a handheld computer; a mobile telephone; a portable telecommunication device; a media playing device; a gaming system; a mobile computing device; a netbook; a device of the IPOD family of devices manufactured by Apple Computer; any one of the PLAYSTATION family of devices manufactured by the Sony Corporation; any one of the Nintendo family of devices manufactured by Nintendo Co.; any one of the XBOX family of devices manufactured by the Microsoft Corporation; or any other type and/or form of computing, telecommunications or media device that is capable of communication and that has sufficient processor power and memory capacity to perform the methods and systems described herein. In other embodiments, the computing machine 100 can be a mobile device such as any one of the following mobile devices: a JAVA-enabled cellular telephone or personal digital assistant (PDA), such as the i55sr, i58sr, i85s, i88s, i90c, i95cl, or the im1100, all of which are manufactured by Motorola Corp.; the 6035 or the 7135, manufactured by Kyocera; the i300 or i330, manufactured by Samsung Electronics Co., Ltd.; the TREO 180, 270, 600, 650, 680, 700p, 700w, or 750 smart phone manufactured by Palm, Inc.; any computing device that has different processors, operating systems, and input devices consistent with the device; or any other mobile computing device capable of performing the methods and systems described herein. In still other embodiments, the computing device 100 can be any one of the following mobile computing devices: any one series of Blackberry, or other handheld device manufactured by Research In Motion Limited; the iPhone manufactured by Apple Computer; the Palm Pre; a Pocket PC; a Pocket PC Phone; or any other handheld mobile device. In yet still other embodiments, the computing device 100 may be a smart phone or tablet computer, including products such as the iPhone or iPad manufactured by Apple, Inc. of Cupertino, CA; the BlackBerry devices manufactured by Research in Motion, Ltd. of Waterloo, Ontario, Canada; Windows Mobile devices manufactured by Microsoft Corp., of Redmond, WA; the Xoom manufactured by Motorola, Inc. of Libertyville, IL; devices capable of running the Android platform provided by Google, Inc. of Mountain View, CA; or any other type and form of portable computing device.
In still other embodiments, the computing device 100 can be a virtual machine. The virtual machine can be any virtual machine managed by a hypervisor developed by
XenSolutions, Citrix Systems, IBM, VMware, or any other hypervisor. In still other
embodiments, the virtual machine can be managed by a hypervisor executing on a server 106 or a hypervisor executing on a client 102. In still other embodiments, the computing device 100 can in some embodiments execute, operate or otherwise provide an application that can be any one of the following: software; an application or program; executable instructions; a virtual machine; a hypervisor; a web browser; a web-based client; a client-server application; an ActiveX control; a Java applet; software related to voice over internet protocol (VoIP) communications like a soft IP telephone; an application for streaming video and/or audio or receiving and playing streamed video and/or audio; an application for facilitating real-time-data communications; a HTTP client; a FTP client; or any other set of executable instructions. Still other embodiments include a client device 102 that displays application output generated by an application remotely executing on a server 106 or other remotely located machine. In these embodiments, the client device 102 can display the application output in an application window, a browser, or other output window.
The computing device 100 may further include a network interface 118 to interface to a Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (e.g., 802.11, T1, T3, 56kb, X.25, SNA, DECNET), broadband connections (e.g., ISDN, Frame Relay, ATM, Gigabit Ethernet, Ethernet-over-SONET), wireless connections, or some combination of any or all of the above. Connections can also be established using a variety of communication protocols (e.g., TCP/IP, IPX, SPX, NetBIOS, Ethernet, ARCNET, SONET, SDH, Fiber Distributed Data Interface (FDDI), RS232, RS485, IEEE 802.11, IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, CDMA, GSM, WiMax, and direct asynchronous connections). The network 104 can comprise one or more sub-networks, and can be installed between any combination of the clients 102, servers 106, computing machines and appliances included within the computing environment 101. In some embodiments, the network 104 can be: a local-area network (LAN); a metropolitan area network (MAN); a wide area network (WAN); a primary network 104 comprised of multiple sub-networks 104 located between the client machines 102 and the servers 106; a primary public network 104 with a private sub-network 104; a primary private network 104 with a public sub-network 104; or a primary private network 104 with a private sub-network 104. The network topology of the network 104 can differ within different embodiments; possible network topologies include: a bus network topology; a star network topology; a ring network topology; a repeater-based network topology; or a tiered-star network topology.
Additional embodiments may include a network 104 of mobile telephone networks that use a protocol to communicate among mobile devices, where the protocol can be any one of the following: AMPS; TDMA; CDMA; GSM; GPRS; UMTS; or any other protocol able to transmit data among mobile devices.
The computing environment 101 can include more than one server 106A-106N such that the servers 106A-106N are logically grouped together into a server farm 106. The server farm 106 can include servers 106 that are geographically dispersed and logically grouped together in a server farm 106, servers 106 that are located proximate to each other and logically grouped together in a server farm 106, or several virtual servers executing on physical servers.
Geographically dispersed servers 106A-106N within a server farm 106 can, in some embodiments, communicate using a WAN, MAN, or LAN, where different geographic regions can be characterized as: different continents; different regions of a continent; different countries; different states; different cities; different campuses; different rooms; or any combination of the preceding geographical locations. In some embodiments, the server farm 106 may be administered as a single entity, while in other embodiments the server farm 106 can include multiple server farms 106.
Referring now to FIG. 2A, illustrated is an abstraction of an embodiment of a system for virtualization of a physical event. In brief overview, a physical event 180, such as a race, athletic event, or other event, includes one or more objects 181, such as a race car, boat, airplane, human, bulldozer, police car, etc., that interact with the surrounding environment and with each other. The system receives position data 182 for each of the objects 181. The system generates a virtual environment 184 with virtual objects 185 representing each object 181 at a location determined by received position data 182. The system also identifies a position and direction for a virtual camera 186. In some embodiments, the system also identifies a zoom and/or focus for the virtual camera 186. Thus, the system may identify any parameter for the camera, including white balance, filters, color temperature, bit depth, resolution, or any such parameters. Similarly, determining the position may include identifying an acceleration, a velocity, a vibration frequency or amplitude, or any other such displacement.
Utilizing the virtual objects 185, attributes of the virtual environment 184, and the position, direction, and view attributes of the virtual camera 186, such as focus, resolution, speed, etc., the system renders a graphical view from the virtual camera 186 of a virtual event 187, corresponding to the physical event 180. By constantly updating the position data 182 and corresponding locations and directions of virtual objects 185, the rendered view from virtual camera 186 may comprise an accurate real-time representation of a view of the physical event from a real-world position corresponding to the position and direction of the virtual camera 186.
In some embodiments, the rendered view may be provided as part of a media broadcast. For example, a broadcast of a race may use the virtualized event to show a viewpoint from a virtual camera where no physical camera has been placed, or even could be placed. The virtual camera may be placed on the roadway, in a position locked to a vehicle such as a spoiler or
bumper, in a chase or overhead view, or in any other position. In one embodiment, the virtual camera may be placed within a virtual vehicle, showing a simulation of what the driver sees. Because the view is virtualized, in some embodiments, the driver's hand motions may not be rendered. However, views through windshields may be appropriately rendered, providing a realistic simulated view. In a further embodiment, the virtual camera view from inside a vehicle may be used to recreate the driver's view during an accident or spin-out, even though no actual camera existed inside the vehicle.
In other embodiments, such as where the event is a boat race, the virtual camera may be placed on the water's surface, on a mast of a ship, on a virtual chase plane or boat following a virtual ship object, or even in locations previously unfilmable, such as underwater. In one such embodiment, the water may be rendered substantially more transparent than the real water, allowing an underwater virtual camera to view the positions of ships from distances much greater than would be possible in reality.
In still other embodiments, the event may comprise a military action, real or simulated for training purposes. In such cases, vehicles and troops may carry GPS transmitters, and the virtualization system may generate a virtual representation of the event, allowing a commander to move a virtual camera around the battlefield to see which areas are hidden from view, potential sniper or ambush locations, etc. In such cases, the lack of data from an opposing force is not a detriment, as the virtual camera may be used to locate areas that should be investigated by troops.
The rendered view may be provided to one or more client devices, including televisions, computers, tablet computers such as iPads, smart phones, or other devices. In some
embodiments, due to the fact that the rendered view is a virtual representation, the resolution of the rendered environment may be drastically reduced, allowing real-time transfer over very low bandwidth connections or connections with bandwidth below a predetermined threshold, or to devices with reduced processing capability below a predetermined threshold. For example, a high resolution virtual rendered view may be provided to a device capable of receiving and displaying high-definition video. Conversely, a very low resolution rendered view, a non-textured view, a wireframe view, or other simple virtualizations may be provided to devices capable of receiving and displaying only low resolution video. In a further embodiment, static images may be delivered to devices, if necessary, or if useful for display purposes. For example, commentators on a media broadcast of an event may use a static rendered image from a particular viewpoint to display and discuss an interaction, such as a close call between two vehicles. The viewpoint may be selected to show the lateral displacement of the vehicle bumpers, for example.
In many embodiments, the virtual environment may be pre-rendered, reducing processing load on the virtualization system. For example, in embodiments with races, the track may be mapped and rendered in advance. In one embodiment, satellite map data, such as that provided via Google Maps by Google, Inc. of Mountain View, California, may be used to generate the topology and texture of the virtual environment. In other embodiments, the virtual environment may be generic, such as where the event is a water race or an airplane acrobatic show, and the virtual environment need simply be an expanse of open water or sky.
In a further embodiment, such as where participants in the event are in a relatively small region, GPS receivers and radio transmitters may not be needed. Instead, a physical camera may be used to capture real-time images of the event, and an image recognition system may be used to detect participants and locate them within the environment. For example, if the event is an ice hockey game, a camera may be used to record images of the game and an image recognition system may detect the locations of different players based on the colors of their uniforms. The players may then be rendered at the detected locations in a virtual environment, allowing a virtual camera to be positioned anywhere on the ice.
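A minimal sketch of this color-based detection idea, assuming OpenCV 4.x is available; the HSV range and area threshold are illustrative assumptions, and the pixel centroids would still need to be mapped into rink coordinates (for example, via a calibrated homography):

    import cv2
    import numpy as np

    def locate_players(frame_bgr, hsv_low=(100, 120, 70), hsv_high=(130, 255, 255)):
        """Return pixel centroids of regions matching a team's uniform color range."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        centroids = []
        for contour in contours:
            m = cv2.moments(contour)
            if m["m00"] > 100:  # skip tiny blobs that are likely noise
                centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        return centroids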
Referring now to FIG. 2B, illustrated is a diagram of an embodiment of a virtualization system 200 for an auto racing example. The system 200 includes car equipment 212 (e.g., a GPS receiver) positioned on the real-world car (i.e., dynamic or real object). For example, the GPS receiver 212 receives signals from multiple GPS satellites 205 and formulates a position of the car periodically throughout a race event 210. The car may be configured with other equipment 212 as shown, such as an inertial measurement unit (IMU), telemetry, a mobile radio, and/or other types of communication (e.g., WiMAX, CDMA, etc.). In some embodiments, a base station or communication solution 214 is also provided locally forming a radio communication link with the car's mobile radio. The base station 214 receives information from the car and relays it to a networked server 216. The server 216 can communicate the information from the car to a database 232 via the network 220. Although shown with specific connections, in many embodiments, components of virtualization system 200 may be interconnected or interact in other configurations.
The radio transmitter sends position information and any other telemetry data that may be gathered from the dynamic object to the radio base station 214. Preferably, the position information is updated rapidly, such as at a rate of at least 30 Hz. In some embodiments, other event information 218, such as weather, flags, etc., may also be transmitted to the network server 216 from an event information system (not shown). In some embodiments, radio messages for each of the different dynamic vehicles are preferably discernable from each other and may be separated in time or frequency. The communication between the car and the base station 214 is not limited to radio communication but may be any other type of communication, such as WiFi, WiMAX, 802.11, infrared light, laser, etc.
In some embodiments, an event toolset 234 processes the database 232 to normalize data and/or to identify event scenarios. In one embodiment, web services 236 provide a web interface for searching and/or analyzing the database 232. In some embodiments, one or more media casters 238 process the database 232 to provide real-time or near real-time data streams for the real-world events to a client device 250. In some embodiments, media casters 238 may comprise virtualization and rendering engines, while in other embodiments, virtualization and rendering engines may be part of a server 216 or web server 236.
Although FIG. 2B refers to auto racing, the technology is applicable to virtually any event in which a real world event (e.g., a sport, a game, derby cars, a boat race, a horse race, a motorcycle race, a bike race, a travel simulation, a military action, etc.) may be virtualized and a rendered virtual view from the position of a virtual camera may be displayed.
Referring now to FIG. 3 A, illustrated is a block diagram of an embodiment of a system for data collection and transmission 300. In brief overview, the data collection system 300 may comprise a GPS antenna 301 and GPS unit 302. The data collection system 300 may also comprise a programmable control unit or processor 303. In some embodiments, the data collection system 300 may also include an inertial measurement unit 304, and/or one or more input/output units 305a-305n. In another embodiment, the data collection system 300 may include a radio modem and radio antenna 307. In many embodiments, the data collection system 300 may include a storage device 308. Data collection system 300 may further include a power supply unit 309, or connect to a power supply unit 309 of a vehicle.
Still referring to FIG. 3A and in more detail, in some embodiments, a data collection and transmission system 300 may comprise a GPS antenna 301 and GPS unit 302. GPS receivers are generally available and are used, for example, for navigation on board ships, to assist surveying operations, etc. GPS is based on an older system named Navstar (NAVigation by Satellite Timing And Ranging). The GPS system is operated by U.S. military authorities.
Similar satellite navigation systems may be used, such as the Galileo system being developed by the European Union, the GLONASS system developed by Russia, the IRNSS system developed by India, the Beidou system developed by, or the COMPASS system under development by, the People's Republic of China, or the QZSS system developed by Japan. In some embodiments, for improved accuracy, differential GPS receivers (DGPS) may be used.
Differential GPS utilizes a local reference with an accurately known location. By applying a correction on the GPS data based on the local reference, the general accuracy can be improved significantly. For example, positional accuracy on a decimeter or centimeter level of granularity can be achieved. GPS receiver unit 302 may comprise any type or form of GPS receiver, such as any of the models of OEMV receivers manufactured by NovAtel of Canada, the Condor family of GPS modules manufactured by Trimble Navigation, Ltd. of Sunnyvale, California, or any other GPS receiver. GPS antenna 301 may comprise any type of single or dual frequency GPS antenna, such as a NovAtel GPS-702L antenna, or any other type and form of antenna.
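As a simplified illustration of the differential correction idea (assuming, for the sketch, that the reference station and the rover observe the same error, which holds only approximately in practice):

    def dgps_correct(rover_fix, reference_fix, reference_known):
        """Shift the rover's fix by the error observed at a reference station
        whose true position is known; all arguments are (x, y) in metres."""
        err_x = reference_fix[0] - reference_known[0]
        err_y = reference_fix[1] - reference_known[1]
        return (rover_fix[0] - err_x, rover_fix[1] - err_y)

    # The reference station reads 1.2 m east / 0.8 m north of its surveyed position,
    # so the same offset is subtracted from the rover's measurement.
    print(dgps_correct((500.0, 300.0), (101.2, 200.8), (100.0, 200.0)))  # (498.8, 299.2)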
In other embodiments, instead of a GPS antenna 301 and GPS unit 302, different position detection means may be employed. For example, laser measurements, short-range radio transponders placed in the raceway, or other position detection methods may be employed. As discussed above, in one such embodiment, a camera and image recognition algorithm may be used to visually detect the position of one or more dynamic objects.
In some embodiments, a data collection and transmission system 300 may comprise a processor or programmable control unit 303. Programmable control unit (PCU) 303 may comprise a programmable computer capable of receiving, processing, and transforming digital and analog sensor data, and transmitting the data as a serial data stream to a radio modem 306. The PCU 303 may comprise any type and form of programmable computer, and may comprise any of the types of computing device discussed above in connection with FIG. 1B. The PCU 303 may capture and transform sensor and GPS data into a serial data stream. In some embodiments, the PCU may include a timer and may provide a timestamp for values of the data stream.
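For illustration only, a sketch of how a PCU might pack timestamped GPS and sensor values into fixed-size records for a serial data stream; the field layout is a hypothetical example, not the format used by the system:

    import struct
    import time

    # Hypothetical record: object id, unix timestamp, latitude, longitude, speed, throttle
    RECORD_FMT = "<Hdddff"   # little-endian, 34 bytes
    RECORD_SIZE = struct.calcsize(RECORD_FMT)

    def pack_record(object_id, lat, lon, speed_mps, throttle_pct):
        return struct.pack(RECORD_FMT, object_id, time.time(),
                           lat, lon, speed_mps, throttle_pct)

    frame = pack_record(7, 46.0001, 7.0002, 54.3, 82.0)
    assert len(frame) == RECORD_SIZE
    print(struct.unpack(RECORD_FMT, frame))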
In some embodiments, a data collection and transmission system 300 may comprise an inertial measurement unit 304. In one embodiment, an inertial measurement unit 304 may comprise a gyroscopic-based attitude and heading reference system (AHRS) for providing drift-free 3D orientation and calibrated 3D acceleration, 3D rate of turn (rate gyro), and 3D earth-magnetic field data. Inertial measurement unit 304 may comprise an MTi IMU from Xsens Motion Technology of the Netherlands, any of the models of iSensor IMUs from Analog Devices Inc. of Norwood, MA, or any other type and form of inertial measurement unit. IMU 304 may be mounted in the center of the object, or in any other location. Embodiments utilizing the latter may require recalibration of sensor data.
In some embodiments, a data collection and transmission system 300 may comprise one or more input/output units 305a-305n, referred to generally as I/O unit(s) 305. In some embodiments, an I/O unit 305 may comprise a sensor, such as a temperature sensor; fuel sensor; throttle position sensor; steering wheel, joystick or rudder position sensor; aileron position sensor; tachometer; radio signal strength sensor; odometer or speedometer sensor; transmission position sensor; or any other type and form of sensor. In other embodiments, an I/O unit 305 may comprise a switch, such as a brake light switch, headlight switch, or other switch, or receive a signal from or detect position of such a switch. In still other embodiments, an I/O unit 305 may comprise a microphone or video camera. In yet still other embodiments, an I/O unit 305 may further comprise an output interface, such as a display, light, speaker, or other interface for providing a signal to an operator of a dynamic object, such as a driver of a race car. The output may include an indicator that the PCU 303 is receiving a signal from a GPS unit or is broadcasting properly, for example. In some embodiments, I/O units 305 may be connected to or comprise sensors or other devices within a controller area network (CAN) or vehicle data bus.
In some embodiments, a data collection and transmission system 300 may comprise a radio modem 306 and radio antenna 307. Radio modem 306 and radio antenna 307 may provide communication with a ground station or receiver, and may transmit serial data provided by PCU 303. In one embodiment, radio modem 306 may comprise an E-A F35 radio modem manufactured by Adeunis RF of France. Radio modem 306 may be single- or multi-channel, and may have any RF power level, including 250 mW, 500 mW, 1 W, or any other value.
In many embodiments, a data collection and transmission system 300 may comprise a storage device 308, such as flash memory, for storing and buffering data, storing sensor calibration values, or storing translation or calculation programs of PCU 303. Any type and form of storage device 308 may be utilized.
In some embodiments, data collection and transmission system 300 may further comprise a power supply unit 309. For example, in embodiments in which data collection and transmission system 300 is carried by a person, power supply unit 309 may comprise a battery pack, solar panel, or other power supply. In other embodiments, such as where data collection and transmission system 300 is installed on a vehicle, the data collection system 300 may connect to the vehicle's engine or battery.
Data collection and transmission system 300 may be small and lightweight to meet requirements for auto racing or motorcycle racing, or to provide functionality without unduly burdening a human or animal carrying the system. For example, data collection and transmission systems 300 may be sufficiently small to be used by marathon runners, cyclists, camel or horse racers, players in a team sport, or in other such activities. The size of the data collection and transmission system 300 may be reduced in various embodiments by removing unneeded components, such as an interface for a CAN bus when the system is to be used in a race without such a network, such as a motorcycle race. The system may be less than 500g in some embodiments, and may have dimensions of roughly 100mm by 90mm by 30mm (±10%), or approximately 300 cubic centimeters in volume, such that the system may be easily installed in a vehicle without compromising performance, or may be carried by a person or animal. In other embodiments where space and weight are not at a premium, additional components and/or sensors may be included. For example, the system may be less than 250g, less than 150g, or may be less than 750g, less than 1kg, or any other such range. Similarly, the system may be less than 300 cubic centimeters in volume, such as 250 cubic centimeters, 200 cubic centimeters, or any such volume, or may be more than this volume, such as 350 cubic centimeters, 400 cubic centimeters, or any other such volume, and the length, depth, and width of the system may vary accordingly, as well as with respect to each other such that the aspect ratio is different than mentioned above.
Referring now to FIG. 3B, illustrated is a block diagram of an embodiment of a virtualization engine or virtualization server 330. In brief overview, virtualization server 330 may comprise a network interface 331 for receiving data from data collection and transmission system(s) 300 and providing rendered images or video to client devices; a real-data location module 332 for interpreting and/or collating received data and mapping location data of real objects into a virtual environment 334; a rendering engine 336 for rendering views of virtual objects in the virtual environment; a processor 338; and a storage device 339. Processor 338, storage device 339, and network interface 331 may comprise any type or form of processor, storage devices, and network interfaces discussed above in connection with FIG. 1B.
Still referring to FIG. 3B and in more detail, a real-data location module 332 may comprise an application, service, daemon, server, or other executable code for determining a virtual location of a real-data object in the virtual environment 334 based on a real location of the real-data object in the real environment, and responsive to received sensor data such as GPS or IMU sensors. In one embodiment, real-data location module 332 may comprise functionality for receiving multiple sets of location or position information from multiple objects in the real environment, such as multiple race cars, and translating the information into virtual code objects for placement within a virtual environment 334. Virtual code objects may comprise data sets of object identifiers, position data, velocity and direction data, heading data, etc., and real-data location module 332 may collate the data and generate a record with the object identifier for processing by the rendering engine 336.
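By way of illustration only (all names are hypothetical), the kind of collated record a real-data location module might hand to the rendering engine could be sketched as:

    from dataclasses import dataclass

    @dataclass
    class VirtualCodeObject:
        object_id: str
        position: tuple       # (x, y, z) in virtual-environment coordinates
        velocity_mps: float
        heading_deg: float    # 0 = virtual north, clockwise

    def collate(object_id, samples):
        """Reduce a list of (x, y, z, velocity, heading) samples for one object
        to a single record based on the most recent sample."""
        x, y, z, velocity, heading = samples[-1]
        return VirtualCodeObject(object_id, (x, y, z), velocity, heading)

    record = collate("car_44", [(12.0, 3.5, 0.0, 61.2, 184.0)])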
Virtual environment 334 may comprise a simulated virtual environment based on a real environment, such as a race track, expanse of ocean or sky, ground terrain, city environment, outer space or orbit environment, or other real environment. In some embodiments, virtual environment 334 may comprise terrain and texture maps, and textures applied to the maps.
Many tools exist for creating virtual environments, such as 3D Studio Max, Blender, AutoCAD, Lightwave, Maya, Softimage XSI, Grome, or any other type of 3D editing software. In some examples, a representation of the local environment for the event includes position information of static objects (i.e., the track). For example, the position information includes latitude, longitude, and elevation of points along the race track. Such points can be obtained from a topographical map, such as Google Earth, and/or any other map source.
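A minimal sketch of turning latitude/longitude/elevation track points (for example, points read from a topographical map) into local metric coordinates for a virtual environment; the simple equirectangular projection used here is an assumption adequate for track-sized areas, and a production pipeline might prefer UTM or a full ENU conversion:

    import math

    EARTH_RADIUS_M = 6371000.0

    def to_local_metres(lat_deg, lon_deg, elev_m, origin):
        """Project a geodetic point to (east, north, up) metres relative to an
        origin given as (lat_deg, lon_deg, elev_m)."""
        o_lat, o_lon, o_elev = origin
        east = math.radians(lon_deg - o_lon) * math.cos(math.radians(o_lat)) * EARTH_RADIUS_M
        north = math.radians(lat_deg - o_lat) * EARTH_RADIUS_M
        return east, north, elev_m - o_elev

    track = [(46.0000, 7.0000, 420.0), (46.0005, 7.0010, 423.5)]
    origin = track[0]
    print([to_local_metres(lat, lon, elev, origin) for lat, lon, elev in track])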
Rendering engine 336 may comprise an application, service, daemon, server, routine, or other executable code for rendering a 3D or 2D image from one or more virtual camera viewpoints within a virtual environment 334. In some embodiments, rendering engine 336 may utilize ray tracing, ray casting, scanline rendering, z-buffering, or any other rendering techniques, and may generate wireframe, polygon, or textured images. In many embodiments, rendering engine 336 may render the view of a virtual camera in real-time. In some embodiments, such as for use with 3D televisions, rendering engine 336 may render stereoscopic views from two displaced virtual cameras. Virtual cameras may be placed arbitrarily throughout the virtual environment, including inside virtual objects, and may have static positions or may travel through the environment, for example, following or positioned relative to a particular object.
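As an illustrative sketch (the offsets are arbitrary assumptions), a virtual chase camera positioned a fixed distance behind and above a tracked object, given the object's position and heading:

    import math

    def chase_camera(obj_xy, heading_deg, back_m=12.0, up_m=4.0):
        """Place a camera behind and above an object, looking along its heading.
        Heading convention: 0 degrees = +y (north), increasing clockwise."""
        h = math.radians(heading_deg)
        dx, dy = math.sin(h), math.cos(h)          # unit vector along the heading
        return (obj_xy[0] - back_m * dx, obj_xy[1] - back_m * dy, up_m), heading_deg

    # Object at (100, 50) heading due east: camera sits 12 m to the west, 4 m up
    print(chase_camera((100.0, 50.0), 90.0))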
In some examples, rendering engine 336 may render environmental data in the virtual environment, based on real-world data of the environment, including realistic time-of-day lighting, weather, flags or signs, wave height, clouds, or other data. As discussed above, in some embodiments, rendering engine 336 may generate low-resolution rendered images or video, for display by client devices with reduced processing power or slower network connectivity. In some embodiments, rendered images or video may be provided to a media server or HTTP server for streaming or pseudo-streaming to one or more client devices. Multiple media servers or casters may be located in a geographically dispersed arrangement (e.g., worldwide) to provide low-latency connections to client devices. In one embodiment, rendering and/or streaming may be offloaded to a server farm or rendering or streaming engine operated via a cloud service.
In some embodiments, users may view rendered images or video through a media playback system, such as a television, computer media player application, web page, or other interface. In other embodiments, users may view rendered images or video through an interactive application. The application may provide capability for the user to specify a virtual camera position within the virtual environment or otherwise move or rotate the virtual camera, or select from a plurality of currently-rendered virtual cameras. The application may transmit a request to the virtualization system to generate a virtual camera at the specified location and generate a new rendered image or video.
In some embodiments, the application may allow interaction with images or 3D data of the virtual environment, such as measuring displacement between two virtual objects; pausing, playing back, rewinding, or fast-forwarding video, or playing video in slow-motion; zooming in on a part of an image; labeling an image or virtual object; or any other interaction.
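For example, measuring displacement between two virtual objects reduces to a distance computation in the virtual environment's coordinate frame, as in this minimal sketch (function name assumed):

```python
import math

def displacement(pos_a, pos_b):
    """Straight-line distance between two virtual objects, given (x, y, z)
    positions in the virtual environment's local metric frame."""
    return math.dist(pos_a, pos_b)  # Python 3.8+

# e.g. gap between two cars on track, in metres
print(displacement((120.0, 340.0, 2.0), (150.0, 370.0, 2.5)))
```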
In some embodiments, different data collection and transmission systems 300 and/or different communications networks may be used to provide flexibility and reliability, particularly in events that occur across a wider geographic area. For example, many rally races include circuits covering over 50 kilometers. High-bandwidth radio coverage of the entire course may be expensive and impractical. Accordingly, a hybrid system may be implemented to provide reduced data via a wide area communication system, such as satellite phones or cellular modems, and increased or high resolution data at one or more positions along the course via a radio network. Thus, in regions with radio coverage, high resolution data may be obtained from a data capture and transmission system via short or medium range radio, and in regions without radio coverage, lower resolution data may be obtained via cellular or other networks.
Referring now to Figure 4, illustrated is a block diagram of an embodiment of a hybrid short/long range data capture and transmission system. A vehicle, animal, or participant may carry or be configured with a data acquisition and transmission system 300 which may communicate via short or medium range radio to one or more receivers to provide high-resolution and detailed tracking information to a tracking system 408. Such high-resolution data may include GPS positioning data, inertial measurements, acceleration data including acceleration in any direction, direction heading, positions of throttles or steering wheels, rudder positions, positions of other controls, or other such data. Similarly, the vehicle, animal, or participant may carry or be configured with a second mobile device 402 which may communicate over a cellular or satellite network. In many embodiments, the mobile device 402 may comprise a smart phone, tablet computer, or other such device, such as an Android operating system smart phone or Apple iOS operating system smart phone. Such devices frequently include GPS capability and may provide position data via the cellular or satellite network to tracking system 408. In many embodiments, the mobile device 402 may also comprise a compass and provide directional data to tracking system 408, while in others, only position data may be communicated.
In some embodiments, position data may be provided to a marshalling server 404 for use in controlling the race or event. Marshalling server 404 may comprise a server or service operated by a computing device for receiving position and/or direction information from mobile devices 402. Such position and/or direction information may be of particular use in large events, such as a rally circuit where vehicles may be widely separated. For example, during such races, to warn spectators of approaching vehicles, a marshal may receive position information via a map or other interface of a marshalling interface 406 provided by a computing device, such as a tablet or smart phone, the position information having been compiled by marshalling server 404 and transmitted to the computing device. Accordingly, the marshal may be able to identify approaching vehicles based on their GPS positions while the vehicles are still out of sight, increasing safety. Thus, marshalling server 404 may provide an overview of a race or event including positions of vehicles or participants, allowing marshals to make informed decisions regarding start or hold times, safety commands, etc. The server may also provide marshals with direct feedback in case of an accident, crash, or injury, or may indicate that a vehicle or participant has stopped, implying a potential hazard. In some embodiments, by providing an indication of positions and past positions of vehicles or participants along a course, penalty situations may be automatically identified, such as a vehicle leaving the course to take a hidden shortcut. In some embodiments, communication between mobile devices 402 and marshalling server 404 may be bi-directional, allowing transmission of indicators to vehicles or participants, such as hazard indicators or start indicators.
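A proximity check of the kind such a server might perform could be sketched as follows; the warning radius and names are illustrative assumptions rather than the actual behavior of marshalling server 404.

```python
import math

def approaching_vehicles(marshal_pos, vehicle_positions, warn_radius_m=1000.0):
    """Return ids of vehicles within warn_radius_m of a marshal post so an
    alert can be pushed to the marshalling interface. Positions are (x, y)
    in local metres."""
    return [vehicle_id for vehicle_id, pos in vehicle_positions.items()
            if math.dist(marshal_pos, pos) <= warn_radius_m]
```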
Position data from the mobile device 402 and data acquisition modules 300 may be provided to a tracking system 408, which may comprise an application, server, service, or other logic executed by a computing device. Tracking system 408 may compile data from devices 402, 300 to provide telemetry data for viewing by client devices executing telemetry viewing applications 410, or for rendering by a virtualization server 330 for viewing on client devices 250 as discussed above. Although shown separately, in many embodiments, a client device 250 may execute a telemetry viewer 410, or the viewer may be executed by a separate device. Telemetry viewer 410 may display a map indicating position and/or direction of vehicles or other participants in an event, with position and/or direction data obtained by mobile devices 402. Low-resolution data may be inadequate for virtualization, but may be displayed on the map, such as when rally vehicles are out of range of radio receivers. Accordingly, the telemetry viewer 410 may be useful for broadcast enrichment, tracking, and marshalling. Conversely, high-resolution data may be virtualized as discussed above and used for high-quality broadcast enrichment, virtual cameras, game play, separation measurement or replay, or other such uses.
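One way to sketch this routing, assuming each sample is tagged with the network it arrived on (an assumption for illustration), is:

```python
def route_sample(sample):
    """Route an incoming sample: every sample updates the telemetry map, but
    only high-resolution short/medium range radio data also feeds the
    virtualization server for 3D rendering."""
    destinations = ["telemetry_map"]
    if sample.get("source") == "radio":
        destinations.append("virtualization_server")
    return destinations
```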
In some embodiments, when a data acquisition and transmission module 300 is out of range of a radio receiver, the module may record or store data for later upload to tracking system 408 and/or virtualization server 330. While this data may not be provided in real-time, it may still be useful for replays with virtual cameras or similar uses.
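A minimal store-and-forward sketch of this behavior, with hypothetical method names, might look like:

```python
class StoreAndForwardBuffer:
    """Buffer high-resolution samples while out of radio range and upload them
    once coverage returns, so replays with virtual cameras remain possible."""
    def __init__(self):
        self.pending = []

    def record(self, sample):
        self.pending.append(sample)

    def flush(self, upload):
        """Call with an upload function once a radio receiver is back in range."""
        while self.pending:
            upload(self.pending.pop(0))
```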
Referring now to Figure 5, illustrated is a flow chart of an embodiment of a method for providing a virtual display of a physical event utilizing a hybrid short/long range data capture and transmission system. At step 500, a tracking system may receive tracking data from a device, such as a data acquisition and transmission module or mobile device installed in a vehicle or carried by a participant. The data may be received via a high-bandwidth but limited-range radio network, or via cellular or satellite modem. In some embodiments of the latter, the data may be limited and not include inertial measurements or control states, or may be sent at low temporal resolution, such as providing position data every 10 seconds or every minute, rather than multiple times per second. If the data is of low accuracy or limited, then at step 502, the tracking system may identify position information for the vehicle or participant. In some embodiments, the tracking system may identify heading or direction information, or may interpolate such heading or direction information based on previous measurements. For example, the tracking system may identify a current position for a vehicle, a previous position for the vehicle, and extend a vector between the positions to identify a likely present heading. Similarly, the tracking system may identify a speed for the vehicle or participant based on distance travelled between successive measurements.
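The interpolation of heading and speed from successive measurements described above might be sketched as follows, assuming position fixes already converted to a local metric frame:

```python
import math

def interpolate_heading_and_speed(prev_fix, curr_fix):
    """Estimate heading (degrees clockwise from north) and speed (m/s) from two
    successive fixes, each (x_east, y_north, timestamp_seconds), by extending
    a vector between the two positions."""
    dx = curr_fix[0] - prev_fix[0]
    dy = curr_fix[1] - prev_fix[1]
    dt = curr_fix[2] - prev_fix[2]
    heading = math.degrees(math.atan2(dx, dy)) % 360.0
    speed = math.hypot(dx, dy) / dt if dt > 0 else 0.0
    return heading, speed
```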
At step 504, the tracking system may update position information within a database and/or on a map for the vehicle or participant. Position information may be two dimensional or three dimensional, based on capabilities of the mobile device, and the data may be updated accordingly. The tracking system may compile position information for the vehicle or participant with position information of other vehicles or participants and may provide or display position information of a plurality of participants or vehicles to telemetry viewers of client devices.
In some embodiments, the data may be of high accuracy or resolution, and at step 506, the tracking system may identify position, direction, speed, and/or any other data, including control states or positions, throttle or gear information, or other details. As discussed above, a database and/or map may be updated with the information, previously interpolated values may be updated with measured values, and/or the data may be compiled with data of other vehicles or participants. At step 508, a virtualization server may render a three-dimensional view from a virtual camera angle to display the vehicle and/or participant for high-quality broadcast enrichment or gaming purposes.

The above-described systems and methods can be implemented in digital electronic circuitry, in computer hardware, firmware, and/or software. The implementation can be as a computer program product (i.e., a computer program tangibly embodied in an information carrier). The implementation can, for example, be in a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus. The implementation can, for example, be executed by a programmable processor, a computer, and/or multiple computers.
It should be understood that the systems described above may provide multiple ones of any or each of those components and these components may be provided on either a standalone machine or, in some embodiments, on multiple machines in a distributed system. The systems and methods described above may be implemented as a method, apparatus or article of manufacture using programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. In addition, the systems and methods described above may be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture. The term "article of manufacture" as used herein is intended to encompass code or logic accessible from and embedded in one or more computer-readable devices, firmware, programmable logic, memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, SRAMs, etc.), hardware (e.g., integrated circuit chip, Field Programmable Gate Array (FPGA), Application Specific Integrated Circuit (ASIC), etc.), electronic devices, or a computer-readable non-volatile storage unit (e.g., CD-ROM, floppy disk, hard disk drive, etc.). The article of manufacture may be accessible from a file server providing access to the computer-readable programs via a network transmission line, wireless transmission media, signals propagating through space, radio waves, infrared signals, etc. The article of manufacture may be a flash memory card or a magnetic tape. The article of manufacture includes hardware logic as well as software or programmable code embedded in a computer-readable medium that is executed by a processor. In general, the computer-readable programs may be implemented in any programming language, such as LISP, PERL, C, C++, C#, PROLOG, or in any byte code language such as JAVA. The software programs may be stored on or in one or more articles of manufacture as object code.
While various embodiments of the methods and systems have been described, these embodiments are exemplary and in no way limit the scope of the described methods or systems. Those having skill in the relevant art can effect changes to form and details of the described methods and systems without departing from the broadest scope of the described methods and systems. Thus, the scope of the methods and systems described herein should not be limited by any of the exemplary embodiments and should be defined in accordance with the accompanying claims and their equivalents.

Claims

What is Claimed:
1. A system for virtualization of a physical event, comprising:
a computing device comprising a processor configured to execute a tracking server and a virtualization engine;
wherein the tracking server is configured for receiving position data from one or more additional computing devices; and
wherein the virtualization engine is configured for (a) generating a virtual environment comprising: (i) a virtual object corresponding to each of the one or more additional computing devices, each virtual object having a position within the virtual environment corresponding to the received position data for the corresponding computing device, and (ii) a virtual camera within the virtual environment; and (b) rendering an image of the virtual environment corresponding to a view of the virtual camera.
2. The system of claim 1, wherein the tracking server is further configured for receiving direction and speed data from the one or more computing devices; and wherein each virtual object has a direction and speed within the virtual environment corresponding to the received position data.
3. The system of claim 1, wherein the virtual camera has a position, speed, and direction within the virtual environment.
4. The system of claim 3, wherein the position, speed, and direction of the virtual camera are based on the position, speed, and direction of a virtual object.
5. The system of claim 1, wherein the virtualization engine is configured for transmitting the rendered image to a client device for display; and wherein rendering the image comprises rendering a wireframe image of the virtual environment, responsive to a processing capability of the client device being less than a predetermined threshold.
6. The system of claim 1, wherein the virtualization engine is configured for transmitting the rendered image to a client device for display; and wherein rendering the image comprises rendering a wireframe image of the virtual environment, responsive to a bandwidth of a connection to the client device being less than a predetermined threshold.
7. The system of claim 1, wherein the tracking server is further configured for receiving position data from the one or more additional computing devices via a cellular network, and is configured for interpolating a direction for each of the one or more additional computing devices based on the received position data and prior received position data.
8. The system of claim 7, wherein the virtual camera has a position above the virtual environment, and the rendered image comprises a map.
9. The system of claim 1, wherein the tracking server is further configured for receiving a first set of low temporal resolution position data from a second computing device via a first network and for receiving a second set of high temporal resolution position data from the second computing device via a second network at a subsequent time; and wherein the virtualization engine is configured for rendering a first image comprising a map displaying a virtual object having a position corresponding to the first set of low temporal resolution position data, and rendering a second image comprising a three-dimensional view displaying a second virtual object having a position corresponding to the second set of high temporal resolution position data at the subsequent time.
10. The system of claim 9, wherein the first virtual object comprises an icon, and wherein the second virtual object comprises a three-dimensional object corresponding to a vehicle or participant carrying the second computing device.
11. A method for virtualization of a physical event, comprising:
receiving position data, by a tracking server executed by a processor of a computing device, from one or more additional computing devices;
generating a virtual environment, by a virtualization engine executed by the processor of the computing device, the virtual environment comprising: (i) a virtual object corresponding to each of the one or more additional computing devices, each virtual object having a position within the virtual environment corresponding to the received position data for the corresponding computing device, and (ii) a virtual camera within the virtual environment; and
rendering an image of the virtual environment, by the virtualization engine,
corresponding to a view of the virtual camera.
12. The method of claim 11, further comprising receiving direction and speed data by the tracking server from the one or more computing devices; and wherein each virtual object has a direction and speed within the virtual environment corresponding to the received position data.
13. The method of claim 11, wherein the virtual camera has a position, speed, and direction within the virtual environment.
14. The method of claim 13, wherein the position, speed, and direction of the virtual camera are based on the position, speed, and direction of a virtual object.
15. The method of claim 11, further comprising transmitting the rendered image to a client device for display; and wherein rendering the image comprises rendering a wireframe image of the virtual environment, responsive to a processing capability of the client device being less than a predetermined threshold.
16. The method of claim 11, further comprising transmitting the rendered image to a client device for display; and wherein rendering the image comprises rendering a wireframe image of the virtual environment, responsive to a bandwidth of a connection to the client device being less than a predetermined threshold.
17. The method of claim 11, further comprising receiving position data from the one or more additional computing devices via a cellular network, and interpolating a direction for each of the one or more additional computing devices based on the received position data and prior received position data.
18. The method of claim 17, wherein the virtual camera has a position above the virtual environment, and the rendered image comprises a map.
19. The method of claim 11, further comprising:
receiving, by the tracking server, a first set of low temporal resolution position data from a second computing device via a first network;
rendering, by the virtualization engine, a first image comprising a map displaying a virtual object having a position corresponding to the first set of low temporal resolution position data;
receiving, by the tracking server, a second set of high temporal resolution position data from the second computing device via a second network at a subsequent time; and
rendering, by the virtualization engine, a second image comprising a three-dimensional view displaying a second virtual object having a position corresponding to the second set of high temporal resolution position data at the subsequent time.
20. The method of claim 19, wherein the first virtual object comprises an icon, and wherein the second virtual object comprises a three-dimensional object corresponding to a vehicle or participant carrying the second computing device.
PCT/IB2012/002129 2011-08-17 2012-08-17 Systems and methods for virtual viewing of physical events WO2013024364A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP12787858.5A EP3114650A2 (en) 2011-08-17 2012-08-17 Systems and methods for virtual viewing of physical events

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161524742P 2011-08-17 2011-08-17
US61/524,742 2011-08-17

Publications (2)

Publication Number Publication Date
WO2013024364A2 true WO2013024364A2 (en) 2013-02-21
WO2013024364A3 WO2013024364A3 (en) 2014-12-04

Family

ID=47715528

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2012/002129 WO2013024364A2 (en) 2011-08-17 2012-08-17 Systems and methods for virtual viewing of physical events

Country Status (2)

Country Link
EP (1) EP3114650A2 (en)
WO (1) WO2013024364A2 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016154663A1 (en) * 2015-04-02 2016-10-06 Catapult Group International Ltd Sports virtual reality system
GB2543416A (en) * 2014-03-20 2017-04-19 2Mee Ltd Augmented reality apparatus and method
EP3242273A1 (en) * 2016-05-02 2017-11-08 Facebook, Inc. Systems and methods for presenting content
EP3349184A1 (en) * 2017-01-16 2018-07-18 Keygene N.V. Monitoring plants
CN109417655A (en) * 2016-05-02 2019-03-01 脸谱公司 The system and method for content for rendering
US10357715B2 (en) 2017-07-07 2019-07-23 Buxton Global Enterprises, Inc. Racing simulation
US11321892B2 (en) 2020-05-21 2022-05-03 Scott REILLY Interactive virtual reality broadcast systems and methods
US20220139047A1 (en) * 2019-03-01 2022-05-05 Scorched Ice Inc. Systems and methods for recreating or augmenting real-time events using sensor-based virtual reality, augmented reality, or extended reality

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6080063A (en) * 1997-01-06 2000-06-27 Khosla; Vinod Simulated real time game play with live event
US6124862A (en) * 1997-06-13 2000-09-26 Anivision, Inc. Method and apparatus for generating virtual views of sporting events

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2543416B (en) * 2014-03-20 2018-05-09 2Mee Ltd Augmented reality apparatus and method
GB2543416A (en) * 2014-03-20 2017-04-19 2Mee Ltd Augmented reality apparatus and method
AU2016240390B2 (en) * 2015-04-02 2019-07-11 Catapult Group International Ltd Sports virtual reality system
JP2018518081A (en) * 2015-04-02 2018-07-05 カタプルト グループ インターナショナル リミテッド Sports virtual reality system
WO2016154663A1 (en) * 2015-04-02 2016-10-06 Catapult Group International Ltd Sports virtual reality system
JP2019521547A (en) * 2016-05-02 2019-07-25 フェイスブック,インク. System and method for presenting content
CN109417655B (en) * 2016-05-02 2021-04-20 脸谱公司 System, method, and readable storage medium for presenting content
EP3242273A1 (en) * 2016-05-02 2017-11-08 Facebook, Inc. Systems and methods for presenting content
CN109417655A (en) * 2016-05-02 2019-03-01 脸谱公司 The system and method for content for rendering
WO2018130606A1 (en) * 2017-01-16 2018-07-19 Keygene N.V. Monitoring plants
EP3349184A1 (en) * 2017-01-16 2018-07-18 Keygene N.V. Monitoring plants
US11360068B2 (en) 2017-01-16 2022-06-14 Keygene N V. Monitoring plants
US10357715B2 (en) 2017-07-07 2019-07-23 Buxton Global Enterprises, Inc. Racing simulation
US10953330B2 (en) 2017-07-07 2021-03-23 Buxton Global Enterprises, Inc. Reality vs virtual reality racing
US20220139047A1 (en) * 2019-03-01 2022-05-05 Scorched Ice Inc. Systems and methods for recreating or augmenting real-time events using sensor-based virtual reality, augmented reality, or extended reality
US11321892B2 (en) 2020-05-21 2022-05-03 Scott REILLY Interactive virtual reality broadcast systems and methods

Also Published As

Publication number Publication date
WO2013024364A3 (en) 2014-12-04
EP3114650A2 (en) 2017-01-11

Similar Documents

Publication Publication Date Title
US20140278847A1 (en) Systems and methods for virtualized advertising
EP3338136B1 (en) Augmented reality in vehicle platforms
WO2013024364A2 (en) Systems and methods for virtual viewing of physical events
CN113474825B (en) Method and apparatus for providing immersive augmented reality experience on a mobile platform
JP7210165B2 (en) Method, device and display device for displaying virtual route
EP2950530B1 (en) Marine environment display device
KR101736477B1 (en) Local sensor augmentation of stored content and ar communication
US9235933B2 (en) Wearable display system that displays previous runners as virtual objects on a current runner's path
US11710422B2 (en) Driving analysis and instruction device
CN110057378A (en) Navigation notice based on the helmet
CN109990797A (en) A kind of control method of the augmented reality navigation display for HUD
CN112090064A (en) System, method and apparatus for enabling trace data communication on a chip
US20120287275A1 (en) Time Phased Imagery for an Artificial Point of View
JP6723533B2 (en) Driving simulator
US11270513B2 (en) System and method for attaching applications and interactions to static objects
US11516296B2 (en) Location-based application stream activation
US11841741B2 (en) Composite pose estimate for wearable computing device
US20230201723A1 (en) Methods, systems, apparatuses, and devices for facilitating provisioning of a virtual experience in a gaming environment
US20240053609A1 (en) Methods, systems, apparatuses, and devices for facilitating provisioning of a virtual experience
WO2024031141A1 (en) Systems and methods for generating and/or using 3-dimensional information with one or more moving cameras
CN113450439A (en) Virtual-real fusion method, device and system
CN117135334A (en) Combined vision system based on airborne three-dimensional image engine and vision display method

Legal Events

Date Code Title Description
122 Ep: pct application non-entry in european phase

Ref document number: 12787858

Country of ref document: EP

Kind code of ref document: A2

REEP Request for entry into the european phase

Ref document number: 2012787858

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2012787858

Country of ref document: EP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12787858

Country of ref document: EP

Kind code of ref document: A2