US20070225961A1 - Visual debugging system for 3D user interface program - Google Patents

Visual debugging system for 3D user interface program

Info

Publication number
US20070225961A1
US20070225961A1 (application US11/478,418)
Authority
US
United States
Prior art keywords
application
embedded device
run
simulation engine
icons
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/478,418
Other versions
US8589142B2 (en)
Inventor
James Ritts
Baback Elmieh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Priority to US11/478,418
Assigned to QUALCOMM INCORPORATED reassignment QUALCOMM INCORPORATED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ELMIEH, BABACK, RITTS, JAMES
Publication of US20070225961A1
Assigned to QUALCOMM INCORPORATED reassignment QUALCOMM INCORPORATED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DURNIL, DAVID L., ELMIEH, BABACK, RITTS, JAMES
Application granted granted Critical
Publication of US8589142B2
Expired - Fee Related
Adjusted expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 - Error detection; Error correction; Monitoring
    • G06F11/36 - Preventing errors by testing or debugging software
    • G06F11/3664 - Environments for testing or debugging software
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 - Error detection; Error correction; Monitoring
    • G06F11/36 - Preventing errors by testing or debugging software
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00 - Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/10 - Program control for peripheral devices


Abstract

In one embodiment, apparatus are provided, including an embedded device simulation engine, an application run controller, and a status provider. The embedded device simulation engine is provided to simulate, on a computer platform other than a target embedded device, a 3D application authored for the target embedded device. The application run controller is provided to control the manner in which the 3D application is run in the embedded device simulation engine. The status provider is provided to provide, as the 3D application is run in the simulated environment of the embedded device simulation engine, information regarding statuses of 3D icons in the scene or scenes of the 3D application, of animations defined for the 3D icons in the 3D application, and of events occurring that affect the 3D application.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • Priority is hereby claimed to U.S. Provisional Patent Application No. 60/696,345.
  • COPYRIGHT NOTICE
  • This patent document contains information subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent, as it appears in the US Patent and Trademark Office files or records, but otherwise reserves all copyright rights whatsoever.
  • FIELD OF THE DISCLOSURE
  • Aspects of the present disclosure relate to tools and features to facilitate the development and implementation of 3D content used in embedded devices. The embedded devices may be mobile devices that capture, receive, and/or transmit voice, data, text, and/or images. Other aspects of the present disclosure relate to tools and features to facilitate the debugging of 3D graphical user interface programs for such devices.
  • BACKGROUND OF THE DISCLOSURE
  • Various application development platforms (e.g., the BREW™ platform) have been created and marketed that allow users to author programs for ultimate export to target embedded devices such as mobile phones. Software exists (e.g., the BREW™ Simulator) for simulating the execution of these programs on a generic computer platform before exporting the program to the target embedded device.
  • SUMMARY OF THE DISCLOSURE
  • Apparatus are provided including an embedded device simulation engine to simulate, on a computer platform other than a target embedded device, a 3D application authored for the target embedded device. In addition to the embedded device simulation engine, an application run controller is provided to control the manner in which the application is run in the embedded device simulation engine. A status provider is also provided to provide, as the 3D application is run in the simulated environment of the embedded device simulation engine, information regarding statuses of 3D icons in a scene of the 3D application, of animations defined in the 3D application, and of events occurring that affect the 3D application.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Non-limiting example embodiments of the disclosure are further described in the detailed description, which follows, by reference to the noted drawings, in which like reference numerals represent similar parts throughout the several views of the drawings, and wherein:
  • FIG. 1 is a block diagram of one or more device content development platforms;
  • FIG. 2 is a schematic block diagram of a simulator platform;
  • FIG. 3 is a schematic diagram of one or more platform screens; and
  • FIG. 4 is a diagram of an example of a screen shot of a status window of the illustrated debugging system.
  • DETAILED DESCRIPTION
  • Referring now to the drawings in greater detail, FIG. 1 illustrates a 3D content development system 9. The illustrated system 9 includes one or more device content development platforms 10, and a mobile device 12.
  • Mobile device 12 may, for example, be a mobile phone. The illustrated mobile device 12 is an embedded device, which captures, receives, and/or transmits voice, data, text, and/or images. The illustrated mobile device 12 further includes a display 13 and keys 14, to allow the control of mobile device 12 and the input of information into mobile device 12.
  • The illustrated device content development platform(s) 10 may be a single platform, a distributed platform, or multiple individual platforms. The illustrated platform(s) includes a number of software interfaces which interact with and provide corresponding windows or screens on a computer platform. Each of these software interfaces includes software running on a computer platform. These interfaces and windows include a scripting window 16 a and a corresponding scripting language interface 16 b. A source code window 18 a is provided which corresponds to a source code interface 18 b. A debugging system 20 is provided. The debugging system 20 includes a debugging window 20 a which corresponds to a debugging interface 20 b.
  • The illustrated 3D content development system 9 may be tailored to a system for developing and implementing 3D user interfaces for use on the embedded device. More specifically, the 3D user interface may cause the display of a 3D graphical virtual interface that graphically portrays (on display 13 of mobile device 12) and simulates a physical device with its interface components, and therefore serves as a three-dimensional (3D) user interface, with icons embedded therein.
  • Scripting language interface 16 b is coupled to, and generates, one or more script files 22, which cater to the building of 3D user interfaces. Specifically, those script files 22 provide information for 3D icon and scene definition as well as for programming the animation of the defined 3D icons and scenes. The 3D icons and scenes, as animated, are tied to or associated with mobile device 12, and tools thereof, to control or input and/or to display or output various mobile device operations, settings, events, and/or statuses.
  • Each of the illustrated interfaces 16 b, 18 b, and 20 b is operable through the use of its corresponding window for receiving controls and information via a computer screen and, for displaying information to the user via the same computer screen.
  • Source code interface 18 b, in connection with the source code window 18 a, allows for the creation of a program using source code, typically using commands provided in code supplied for original equipment manufacturers (OEMs).
  • Debugging interface 20 b, interacting with debugging window 20 a, facilitates the simulation of script files 22 for purposes of checking and debugging the script file. More specifically, the debugging interface 20 b may provide, via a computer screen display on debugging window 20 a, information regarding statuses of 3D icons in a scene or in scenes of a given 3D application. The debugging interface may further provide on the debugging window 20 a information regarding statuses of animations defined in the 3D application, and of the events occurring that affect the 3D application.
  • Scripting language interface 16 b produces script files 22, while source code interface 18 b produces source code 24. Either or both of these types of code may be compiled to produce compiled script and/or source code 26. The illustrated device content development platform(s) 10 further includes user interface (UI) assets as well as user interface layout files. These include 3D model files 35, animation files 36, texture files 37, and user interface (UI) layout files 38.
  • A graphics engine layer of a mobile device controls 3D graphical functions on mobile device 12 in accordance with the compiled script and/or source code 26 in connection with any associated UI assets and UI layout files, as stored in files 35, 36, 37, and 38.
  • Debugging system 20 may be implemented in the form of a simulator platform 40 as shown in FIG. 2. The illustrated simulator platform 40 includes a mobile device simulation engine 42, a simulation input/output interface 44, and a platform screen and keyboard 46. The illustrated platform 40 further includes a script file 48 to be debugged, and a script file interpreter 50. The script file interpreter 50 interprets script file 48 so that it can be simulated by mobile device simulation engine 42.
  • The illustrated platform screen and keyboard 46 causes the simulated display 47 of the screen of the simulated mobile device, and provides the ability to receive certain simulated key and command inputs 49 via either or both of the platform screen and keyboard 46.
  • In addition, platform screen and keyboard 46 display a status window 48, and provide, through either computer screen icons or buttons, or through the use of physical control switches or inputs, including, for example, certain keys on the keyboard, pause 50, play 51, and tick 52 controls.
  • Status window 48 provides information to the user as the application is run in a simulated environment of mobile device simulation engine 42. The information that is provided is information regarding statuses of 3D icons in the scene or scenes of the 3D application, of animations defined in the 3D application, and of events occurring that affect the 3D application.
  • A fine-tune mechanism may be provided to allow fine-tuning (i.e., modification) of portions of the 3D application, by allowing a user to change the script. This mechanism may be provided as part of simulator platform 40, or the user may use the scripting interface 16 b of the device content development platform 10. Such fine-tuning of the 3D application may include defining animations of the 3D application, and/or defining interaction flow of the 3D application, including interaction flow of the icons in the 3D user interface.
  • The illustrated status window 48 provides, i.e., displays to a user via a computer screen, extensive information describing statuses of all or a subset of icons in the 3D user interface, all or a subset of animations of objects in the scene, and all or a subset of events. Events occur external to the 3D application. For example, an event may be a state change of the simulated target device, or an event may result from user interaction with the 3D application, either by manipulation of a 3D icon or by the use of device keys in the simulated target device. Examples of events include the statuses of the target device itself, including a command key or a given key being depressed, or a state change in the device, such as a GPS signal being detected or a low battery state in the target device.
  • FIG. 3 shows a schematic diagram of one or more platform screens of simulator platform 40. The illustrated screen(s) 60 include a simulated display 62 and a status window 64. The simulated display 62 presents the 3D user interface being implemented by the 3D application through the use of mobile device simulation engine 42, running the script in script file 48 as interpreted by script file interpreter 50. The example scene depicted in simulator display 62 includes default ambient lighting and a single camera view showing a puppy and a ball on a checkered floor.
  • Status window 64 includes a number of status indications 66. Status indications 66 may, for example, be graphical or textual indications on status window 64 of certain information. Those status indications include, in the embodiment shown in FIG. 3, a current script status indication 68, an event history status indication 70, active animations status indication 72, icons status indication 74, camera status indication 76 and light status indication 78.
  • In the illustrated embodiment, the current script status indication 68 portrays, via status window 64, the following types of script information: an indication of the current state of the animation run controller, represented by the numerical identity of the current animation state within the animation state machine.
  • In this embodiment, event history status indication 70 portrays, via the status window 64, the following types of event history information: a listing of recent events pertinent to the animation, which may include key input notifications, and the starting or ending of individual animation loops.
  • In this embodiment, active animations status indication 72 portrays, via status window 64, the following types of active animation information: for each currently running animation being applied to the scene, to a camera, to a light, or to an icon, an indication is given of the total length of the animation, the subset of the total length being run, and the position of the animation being currently displayed from within that subset.
  • While a single status window 64 is depicted in the illustrated schematic of the platform screen or screens 60, separate status windows may be provided or accessible for one or more portions of the information depicted in the illustrated status window 64. For example, current script status indication information 68 may be depicted in one window, while event history status indication 70 may be depicted in a separate window.
  • The status window 64 further includes graphical tools for allowing a user to activate or deactivate a pause switch 80, a play switch 82, and a tick switch 84. These graphical tools may, for example, be graphical buttons provided on a computer screen within status window 64. The pause switch 80 causes the 3D application to pause at a particular point in its playback, thereby allowing the status information in each of the status indications 66 to portray information relevant to that particular point within the 3D application. The play switch 82 causes the 3D application to be run, or to resume from a paused state. The tick switch 84 can be depressed to cause the application to move from one increment to another in its execution. In other words, the tick switch is provided to step through each of the increments in running the script. The increments may include individual consecutive frames of animations of icons currently being displayed in the scene on the simulated screen.
  • Generally, the status indications 66 monitor, and accordingly portray, via a screen, status information for each of the 3D icons in the scene or scenes of the 3D application. Status information is provided for each of the frames of animations of the objects in the scene. The objects may include one or more lights, one or more 3D icons, a targeted direction or directions of individual lights, one or more cameras, and targeted directions of each of the cameras. The status information may further include the position of each of the lights, icons, light target positions, cameras, and camera target positions in the scene, including whether such object or location is obstructed or has collided with another object, and whether or not a particular object is active or inactive. For example, a light or a camera may be provided for within the script but may be active or inactive at a given point within the execution of the 3D application.
  • FIG. 4 provides an example of a screen shot in accordance with one specific embodiment. The illustrated example screen shot 90 includes buttons towards the bottom of the screen for pause 92, play 94 and tick 96 controls. In addition, the illustrated screen 90 includes script identification information 98, event history information 100, and key press and device state information 102. In addition, information is provided regarding active animations 104. The active animations information 104 includes the current animation position, the total animation length, and the subset of the total animation length being run 106 describing each active animation. In this regard, the particular frame range 108 of the animation is depicted. The present frame 110 within that range, and a description for the animation 112, are also presented.
  • In embodiments herein, a scene is a type of 3D “world” that encapsulates a set of 3D elements. A scene defines a virtual environment, or space, in which 3D elements exist and may be situated and animated. That is, certain properties of an individual 3D element, such as its position and orientation, may be defined as relative to the scene in which the 3D element exists. In the illustrated embodiment, icons, cameras, lights and other 3D elements are each part of a scene. They may be part of a single scene, or of two or more separate scenes.
  • A scene may include nodes. In the illustrated embodiments, the scene includes nodes, each node being a point in the scene to which one or more objects are attached. A node acts as an abstract reference point, or origin, for the positions of its attached objects. The node may itself be animated, in which case any animation performed on the node is propagated to its attached objects.
  • A model is a set of data that describes the appearance and behavior of objects within a scene. A model may constitute a single, independent scene object, or a model may comprise several objects. The data contained within the model may include geometrical data and surface or material properties. In the example shown in FIG. 3, a puppy is depicted fetching a ball. In this case, a single model encapsulates and describes the appearance of the puppy, the ball, and the surface on which the puppy is sitting, as well as a set of animations that can be applied to them.
  • Mesh geometry can be drawn in various ways: it can be painted with a solid color, smoothly shaded between the colors at its vertices, or drawn with a texture map. A texture is a specially formatted image that is “draped” over the geometry represented by a model to give it a detailed surface. Textures are defined in texture files in the illustrated embodiment.
  • Those textures are associated with the geometry they modify, for example, by the manner in which the name of the file is specified. That is, a texture file with the name “puppy.qxt” is associated with the model file “puppy.qxm”.
  • Each scene may have at least one camera. The camera encapsulates the vantage point from which the scene is viewed. The camera itself is a scene object, and may be animated within the scene. A default camera (looking at the center of the world) may be provided for every scene, which is activated if no other camera is turned on.
  • A scene may have one or more lights. In addition, or alternatively, a scene may include default ambient “all-over” lighting. It is possible to bake lighting into the vertex colors and textures of a model to simulate static lighting in this ambient mode. Lifelike dynamic lighting may be achieved by adding a light to a scene. A light is attached to a node, but in addition, it is associated with another node. That is, the association of the light with the other node defines the direction in which the light shines. Accordingly, a light can be pointed like a “flashlight”. In addition, one may include parameters to define the color of the light that is shone into the scene.
  • One or more animation files may be provided that describe how an object is animated. When an animation file is called upon, it is applied to a specific node within the scene. Animation files, in the illustrated embodiment, are a bit like a strip of film (or a timeline in Flash), and contain a set of frames. These frames do not have to represent a continuous sequence, and can contain several completely different animations in the same frame “stack”, which is why, in the illustrated embodiment, when they are called upon, both a start frame and an end frame are specified.
  • When an animation is activated, it is applied to a specific named node that it is meant to animate. By way of example, one animation file may be provided for animating a puppy, while a separate animation file is provided for animating the camera and light. The instructions specified in an animation file pass into the object attached to that node, and the object does whatever those particular frames tell it to do. For example, a puppy may spin on a spot, fly around the scene, or jump up and down.
  • A 4-way navigation key typically provided in a mobile device keyboard can be used to animate a puppy in various ways. For example, in the scene shown in FIG. 3, one may press the right nav key, causing the ball to roll off to the right, shortly followed by the chasing puppy that retrieves it.
  • The processing performed by each of the platforms shown in the figures herein may be performed by a general purpose computer alone or in connection with a specialized processing computer. Such processing may be performed by a single platform, or by a distributed processing platform, or by separate platforms. In addition, such processing can be implemented in the form of special purpose hardware, or in the form of software being run by a general purpose computer. Any data handled in such processing or created as a result of such processing can be stored in any type of memory. By way of example, such data may be stored in a temporary memory, such as in the RAM of a given computer system or subsystems. In addition, or in the alternative, such data may be stored in longer-term storage devices, for example, magnetic discs, rewritable optical discs, and so on. For purposes of the disclosure herein, computer-readable media may comprise any form of data storage mechanism, including such memory technologies as well as hardware or circuit representations of such structures and of such data. An integrated circuit may include one or more parts of the structure and processing disclosed herein.
  • The claims, as originally presented and as they may be amended, encompass variations, alternatives, modifications, improvements, equivalents, and substantial equivalents of the embodiments and teachings disclosed herein, including those that are presently unforeseen or unappreciated, and that, for example may arise from applicants/patentees, and others.

Claims (20)

1. Apparatus comprising:
an embedded device simulation engine to simulate, on a computer platform other than a target embedded device, a 3D application authored for the target embedded device;
an application run controller to control the manner in which the 3D application is run in the embedded device simulation engine; and
a status provider to provide, as the 3D application is run in the simulated environment of the embedded device simulation engine, information regarding statuses of 3D icons in a scene of the 3D application, of animations of 3D icons in the 3D application, and of events occurring that affect the 3D application.
2. The apparatus according to claim 1, wherein the 3D application comprises a 3D user interface application.
3. The apparatus according to claim 1, wherein the application run controller includes a play control to cause the 3D application to be executed using the embedded device simulation engine.
4. The apparatus according to claim 3, wherein the application run controller includes a pause control to pause the 3D application as it is run in the embedded device simulation engine.
5. The apparatus according to claim 1, wherein the application run controller includes a tick control for stepping through increments in the script being executed by the embedded device simulation engine.
6. The apparatus according to claim 5, wherein the increments include individual consecutive frames of an animation of a 3D icon being currently displayed on a simulation screen of the embedded device simulation engine.
7. The apparatus according to claim 1, further comprising a status monitor to monitor status information for each of the 3D objects in the scenes of the 3D application, for each animation frame of the 3D objects.
8. The apparatus according to claim 6, further comprising a status monitor to monitor status information for each of the 3D objects in scenes of the 3D application, for each animation frame of the 3D objects.
9. The apparatus according to claim 8, wherein objects in a scene depicted in the simulation screen include one or more lights, one or more 3D icons, one or more targeted directions of lights, one or more cameras, one or more targeted directions of cameras, positions of each of the objects in the scene, and information regarding whether objects are obstructed or collided and whether select ones of the objects are on or off.
10. The apparatus according to claim 1, wherein the status provider includes a display mechanism to display to the user via a computer screen the information regarding the statuses.
11. The apparatus according to claim 9, wherein the status provider includes a display mechanism to display to the user via a computer screen the information regarding the statuses.
12. The apparatus according to claim 10, wherein the display mechanism displays the information regarding the statuses via a status window.
13. The apparatus according to claim 1, further comprising a fine-tune mechanism to fine-tune portions of the 3D application.
14. The apparatus according to claim 10, further comprising a fine-tune mechanism to fine-tune portions of the 3D application.
15. The apparatus according to claim 14, wherein the fine-tuned portions of the 3D application include defined animations of the 3D application.
16. The apparatus according to claim 15, wherein the fine-tuned portions of the 3D application further include defined interaction flow of the 3D application.
17. The apparatus according to claim 16, wherein the defined interaction flow includes defined interaction flow of 3D objects of the scene to be depicted by the 3D application.
18. A method comprising:
simulating, in a device simulation engine on a computer platform other than a target embedded device, a target embedded device running a 3D application authored for the target embedded device;
controlling the manner in which the 3D application is run in the embedded device simulation engine; and
providing, as the 3D application is run in the simulated environment of the device simulation engine, information regarding statuses of 3D icons in a scene of the 3D application, of animations of 3D icons in the 3D application, and of events occurring that affect the 3D application.
19. Machine-readable media encoded with data, the encoded data being interoperable with a machine to cause:
simulating, in a device simulation engine on a computer platform other than a target embedded device, a target embedded device running a 3D application authored for the target embedded device;
controlling the manner in which the 3D application is run in the embedded device simulation engine; and
providing, as the 3D application is run in the simulated environment of the device simulation engine, information regarding statuses of 3D icons in a scene of the 3D application, of animations of 3D icons in the 3D application, and of events occurring that affect the 3D application.
20. Apparatus comprising:
means for simulating, on a computer platform other than a target embedded device, an engine running a 3D application authored for the target embedded device;
means for controlling the manner in which the 3D application is run by the means for simulating; and
means for providing, as the 3D application is run in a simulated environment caused by the means for simulating, information regarding statuses of 3D icons in a scene of the 3D application, of animations of 3D icons in the 3D application, and of events occurring that affect the 3D application.
US11/478,418 2005-06-29 2006-06-28 Visual debugging system for 3D user interface program Expired - Fee Related US8589142B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/478,418 US8589142B2 (en) 2005-06-29 2006-06-28 Visual debugging system for 3D user interface program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US69634505P 2005-06-29 2005-06-29
US11/478,418 US8589142B2 (en) 2005-06-29 2006-06-28 Visual debugging system for 3D user interface program

Publications (2)

Publication Number Publication Date
US20070225961A1 (en) 2007-09-27
US8589142B2 (en) 2013-11-19

Family

ID=37460282

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/478,418 Expired - Fee Related US8589142B2 (en) 2005-06-29 2006-06-28 Visual debugging system for 3D user interface program

Country Status (8)

Country Link
US (1) US8589142B2 (en)
EP (1) EP1915693A2 (en)
JP (1) JP5237095B2 (en)
KR (1) KR101022075B1 (en)
CN (2) CN101233496A (en)
CA (1) CA2613570C (en)
TW (1) TWI391817B (en)
WO (1) WO2007002952A2 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090066702A1 (en) * 2007-09-06 2009-03-12 Luc Dion Development Tool for Animated Graphics Application
US20090070440A1 (en) * 2007-09-06 2009-03-12 Luc Dion Controlling presentation engine on remote device
US20120154409A1 (en) * 2010-12-15 2012-06-21 Microsoft Corporation Vertex-baked three-dimensional animation augmentation
CN102577404A (en) * 2009-11-06 2012-07-11 索尼公司 Three dimensional (3D) video for two-dimensional (2D) video messenger applications
US20130318453A1 (en) * 2012-05-23 2013-11-28 Samsung Electronics Co., Ltd. Apparatus and method for producing 3d graphical user interface
US11567628B2 (en) * 2018-07-05 2023-01-31 International Business Machines Corporation Cognitive composition of multi-dimensional icons
US11775260B1 (en) * 2022-05-17 2023-10-03 Tsinghua University Method, apparatus, and device for DAS-based custom function expansion, and storage medium

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9792718B2 (en) * 2008-07-25 2017-10-17 Qualcomm Incorporated Mapping graphics instructions to associated graphics data during performance analysis
CN102089784A (en) * 2008-07-25 2011-06-08 高通股份有限公司 Partitioning-based performance analysis for graphics imaging
KR20130043241A (en) * 2008-07-25 2013-04-29 퀄컴 인코포레이티드 Performance analysis during visual creation of graphics images
US8587593B2 (en) 2008-07-25 2013-11-19 Qualcomm Incorporated Performance analysis during visual creation of graphics images
US9411413B2 (en) 2010-08-04 2016-08-09 Apple Inc. Three dimensional user interface effects on a display
US8913056B2 (en) 2010-08-04 2014-12-16 Apple Inc. Three dimensional user interface effects on a display by using properties of motion
US9311427B2 (en) 2012-01-03 2016-04-12 Cimpress Schweiz Gmbh Automated generation of mobile optimized website based on an existing conventional web page description
KR20150101915A (en) * 2014-02-27 2015-09-04 삼성전자주식회사 Method for displaying 3 dimension graphic user interface screen and device for performing the same
US20150347094A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Interactive learning tool using playground
KR101913060B1 (en) * 2018-02-12 2018-10-29 민경현 Mobile device and method for debugging
CN117421251B (en) * 2023-12-18 2024-03-19 武汉天喻信息产业股份有限公司 Method and system for debugging user interface of embedded terminal

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5913052A (en) * 1997-01-24 1999-06-15 Lucent Technologies Inc. System and method for debugging digital signal processor software with an architectural view and general purpose computer employing the same
US20010027387A1 (en) * 2000-03-30 2001-10-04 Hideaki Miyake Debugging supporting apparatus, debugging supporting method and recording medium readable by computer with its programs recorded thereon
US20020169591A1 (en) * 2001-03-12 2002-11-14 Martin Ryzl Module for developing wireless device applications using an integrated emulator
US6514142B1 (en) * 1995-05-24 2003-02-04 Sega Enterprises, Ltd. Picture processing device and game device using the same
US20030236657A1 (en) * 2001-03-12 2003-12-25 Martin Ryzl Method of developing wireless device applications using an integrated emulator and an IDE
US20040150626A1 * 2003-01-30 2004-08-05 Raymond Husman Operator interface panel with control for visibility of displayed objects
US6778190B1 (en) * 1995-10-09 2004-08-17 Nintendo Co., Ltd. Three-dimensional image processing apparatus
US20050130744A1 (en) * 2000-09-18 2005-06-16 Nintendo Co., Ltd Video game distribution network
US20060048006A1 (en) * 2004-08-31 2006-03-02 Wenkwei Lou Wireless remote firmware debugging for embedded wireless device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000040169A (en) 1995-05-24 2000-02-08 Sega Enterp Ltd Image processor and game device using the same
JPH1033145A (en) 1996-07-23 1998-02-10 Ootome Kamaboko Kk Chicken-containing boiled fish paste and its production
WO1998033145A1 (en) 1997-01-24 1998-07-30 Sony Corporation Pattern data generator, pattern data generating method, and its medium
JP3688862B2 (en) 1997-08-06 2005-08-31 アルゼ株式会社 Amusement machine display test equipment
US6113645A (en) 1998-04-22 2000-09-05 Scientific Learning Corp. Simulated play of interactive multimedia applications for error detection
JP3476689B2 (en) 1998-10-02 2003-12-10 株式会社ネクステック Network system
EP1067806A1 (en) 1999-07-09 2001-01-10 CANAL+ Société Anonyme Apparatus for and method of testing applications
JP2001353678A (en) 2000-06-12 2001-12-25 Sony Corp Authoring system and method and storage medium
JP2002049927A (en) 2000-08-01 2002-02-15 Victor Co Of Japan Ltd Information processor
JP4330412B2 (en) 2003-09-25 2009-09-16 株式会社ディンプス Game device and program for causing computer to function
JP2005165873A (en) 2003-12-04 2005-06-23 Masahiro Ito Web 3d-image display system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6514142B1 (en) * 1995-05-24 2003-02-04 Sega Enterprises, Ltd. Picture processing device and game device using the same
US6778190B1 (en) * 1995-10-09 2004-08-17 Nintendo Co., Ltd. Three-dimensional image processing apparatus
US5913052A (en) * 1997-01-24 1999-06-15 Lucent Technologies Inc. System and method for debugging digital signal processor software with an architectural view and general purpose computer employing the same
US20010027387A1 (en) * 2000-03-30 2001-10-04 Hideaki Miyake Debugging supporting apparatus, debugging supporting method and recording medium readable by computer with its programs recorded thereon
US20050130744A1 (en) * 2000-09-18 2005-06-16 Nintendo Co., Ltd Video game distribution network
US20020169591A1 (en) * 2001-03-12 2002-11-14 Martin Ryzl Module for developing wireless device applications using an integrated emulator
US20030236657A1 (en) * 2001-03-12 2003-12-25 Martin Ryzl Method of developing wireless device applications using an integrated emulator and an IDE
US20040150626A1 * 2003-01-30 2004-08-05 Raymond Husman Operator interface panel with control for visibility of displayed objects
US20060048006A1 (en) * 2004-08-31 2006-03-02 Wenkwei Lou Wireless remote firmware debugging for embedded wireless device

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090066702A1 (en) * 2007-09-06 2009-03-12 Luc Dion Development Tool for Animated Graphics Application
US20090070440A1 (en) * 2007-09-06 2009-03-12 Luc Dion Controlling presentation engine on remote device
US8301689B2 (en) 2007-09-06 2012-10-30 Bluestreak Technology, Inc. Controlling presentation engine on remote device
CN102577404A (en) * 2009-11-06 2012-07-11 索尼公司 Three dimensional (3D) video for two-dimensional (2D) video messenger applications
US20120154409A1 (en) * 2010-12-15 2012-06-21 Microsoft Corporation Vertex-baked three-dimensional animation augmentation
US8963927B2 (en) * 2010-12-15 2015-02-24 Microsoft Technology Licensing, Llc Vertex-baked three-dimensional animation augmentation
US20130318453A1 (en) * 2012-05-23 2013-11-28 Samsung Electronics Co., Ltd. Apparatus and method for producing 3d graphical user interface
US11567628B2 (en) * 2018-07-05 2023-01-31 International Business Machines Corporation Cognitive composition of multi-dimensional icons
US11775260B1 (en) * 2022-05-17 2023-10-03 Tsinghua University Method, apparatus, and device for DAS-based custom function expansion, and storage medium

Also Published As

Publication number Publication date
US8589142B2 (en) 2013-11-19
KR20080021824A (en) 2008-03-07
KR101022075B1 (en) 2011-03-17
CA2613570C (en) 2012-10-09
TW200713122A (en) 2007-04-01
CA2613570A1 (en) 2007-01-04
CN101233496A (en) 2008-07-30
EP1915693A2 (en) 2008-04-30
CN102855190A (en) 2013-01-02
WO2007002952A3 (en) 2007-04-05
TWI391817B (en) 2013-04-01
JP2008545207A (en) 2008-12-11
WO2007002952A2 (en) 2007-01-04
JP5237095B2 (en) 2013-07-17

Similar Documents

Publication Publication Date Title
US8589142B2 (en) Visual debugging system for 3D user interface program
US20080049015A1 (en) System for development of 3D content used in embedded devices
US9305403B2 (en) Creation of a playable scene with an authoring system
CN107679188B (en) Method for loading 3D model in webpage
US20080184139A1 (en) System and method for generating graphical user interfaces and graphical user interface models
US20120107790A1 (en) Apparatus and method for authoring experiential learning content
CN102999932A (en) Chart animation
WO2010107624A2 (en) Smooth layout animation of continuous and non-continuous properties
US20090219291A1 (en) Movie animation systems
CN110262791B (en) Visual programming method and device, operator and readable storage medium
CN112711458A (en) Method and device for displaying prop resources in virtual scene
WO2020152189A1 (en) A toy system for augmented reality
KR20070101844A (en) Methods and apparatuses for authoring declarative content for a remote platform
US20110209117A1 (en) Methods and systems related to creation of interactive multimdedia applications
CN114443945A (en) Display method of application icons in virtual user interface and three-dimensional display equipment
CN109513212B (en) 2D mobile game UI (user interface) and scenario editing method and system
US5821946A (en) Interactive picture presenting apparatus
CN116778038A (en) Animation editor and animation design method based on three-dimensional map visualization platform
Felicia Getting started with Unity: Learn how to use Unity by creating your very own “Outbreak” survival game while developing your essential skills
JP2008535070A (en) Method for constructing a multimedia scene comprising at least one pointer object, and corresponding scene rendering method, terminal, computer program, server and pointer object
CN117349917B (en) Construction scheme simulation system, method and storage medium
CN117953128A (en) Three-dimensional image acquisition system
Ball et al. A prototype Hotel Browsing system using Java3D
CN114816411A (en) Augmented reality model conversion method, device, equipment and storage medium
CN117899493A (en) Game editing method and device and electronic equipment

Legal Events

Date Code Title Description
AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RITTS, JAMES;ELMIEH, BABACK;REEL/FRAME:018396/0003

Effective date: 20061002

AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RITTS, JAMES;ELMIEH, BABACK;DURNIL, DAVID L.;SIGNING DATES FROM 20010824 TO 20110825;REEL/FRAME:026830/0978

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20211119