WO2007125512A2 - Method for situational end-user authoring of distributed media presentations - Google Patents


Info

Publication number
WO2007125512A2
Authority
WO
WIPO (PCT)
Prior art keywords
presentation
timeline
devices
distributed
programming device
Application number
PCT/IB2007/051600
Other languages
French (fr)
Other versions
WO2007125512A3 (en)
Inventor
Markus G.L.M. Van Doorn
Original Assignee
Koninklijke Philips Electronics, N.V.
U.S. Philips Corporation
Application filed by Koninklijke Philips Electronics, N.V., U.S. Philips Corporation filed Critical Koninklijke Philips Electronics, N.V.
Publication of WO2007125512A2 publication Critical patent/WO2007125512A2/en
Publication of WO2007125512A3 publication Critical patent/WO2007125512A3/en


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1626 - Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1662 - Details related to the integrated keyboard
    • G06F 1/1671 - Special purpose buttons or auxiliary keyboards, e.g. retractable mini keypads, keypads or buttons that remain accessible at closed laptop
    • G06F 1/1684 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
    • G06F 1/169 - Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02 - Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/022 - Electronic editing of analogue information signals, e.g. audio or video signals
    • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/06 - Cutting and rejoining; Notching, or perforating record carriers otherwise than by recording styli
    • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/36 - Monitoring, i.e. supervising the progress of recording or reproducing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 - Data switching networks
    • H04L 12/28 - Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 - Home automation networks
    • H04L 12/2816 - Controlling appliance services of a home automation network by calling their functionalities
    • H04L 12/282 - Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
    • H04L 2012/2847 - Home automation networks characterised by the type of home appliance used
    • H04L 2012/2849 - Audio/video appliances

Definitions

  • the present invention relates to distributed media/environmental-effect presentations and, more particularly, to the authoring of the presentation on-site.
  • the present invention is directed to a common, mobile programming device for authoring a distributed presentation of environmental effects and/or media respectively timed to immerse the end user in a realistic presentation.
  • the device includes a control for end-user navigation of a common presentation timeline in scheduling devices to present at respective times on the timeline. Further included is an actuator for registering against the timeline in performing the scheduling.
  • the devices to present are spatially distributed in an ambient environment of the end user.
  • the programming device selectively enables the registering respectively when it sufficiently nears a corresponding one of the distributed devices so as to enter its communication range.
  • a system for authoring a timed, distributed presentation of environmental effects and/or media includes programmable devices and a programming device.
  • the latter includes a control for end-user navigation of a common presentation timeline in scheduling the programmable devices to present at respective times on the timeline. Further included is an actuator for registering against the timeline in performing the scheduling.
  • the programmable devices are spatially distributed in an ambient environment of the end user.
  • the scheduling includes setting a given distributed device, in correspondence with the navigating, to operating states in which the given device is being scheduled to present.
  • the setting registers the states against the timeline.
  • a method for authoring a timed, distributed presentation of environmental effects and/or media includes spatially distributing devices in an ambient environment of an end user and scheduling the distributed devices to present at respective times on a common presentation timeline.
  • the scheduling involves the end-user approaching the distributed devices individually to bring a common programming device into range. It also involves the end-user navigating, on the common programming device, the timeline in correspondence with the approaching.
  • a programming device for authoring a timed, distributed presentation of environmental effects and/or media is configured for editing, in time and space, state durations, in the presentation, of respective programmable devices against a presentation timeline common for all of the programmable devices.
  • the editing in space transfers a given duration to another of the programmable devices and involves transport of the programming device to the other device.
  • a programmable device for recording and performing, in concert with at least one other performing device, a timed, distributed presentation of environmental effects and/or media has a perform mode and a record or "rehearse" mode.
  • Switching out of record mode causes the programmable device to output, to a coordinator of the presentation, a time-specified device state just recorded or an identifier of the recorded device state.
  • FIG. 1 is a conceptual diagram providing an overview of a distributed media authoring system according to the present invention
  • FIG. 2 is a timing chart demonstrating editing of programmable device state durations in authoring a presentation according to the present invention
  • FIG. 3 is a front and side view of a programming device according to the present invention.
  • FIG. 4 is a flow chart of a process for authoring a presentation according to the present invention.
  • FIG. 1 portrays, by illustrative and non- limitative example, an overview of a distributed media authoring system 100 according to the present invention.
  • the authoring system 100 includes programmable devices 110, a programming device 120 and a presentation engine 130.
  • the programmable devices 110 shown include an audio unit 110a having speakers 140, a light or lamp 110b, an electric fan 110c, a television (TV) 110d, a set-top box (STB) 110e, a wide screen TV controller 110f of a wide screen TV 150, and a coffee maker 110g.
  • These devices 110 are operable according to a timed script to deliver a multimedia/environmental effect presentation.
  • the fan 110c might come on at a particular point when, for example, the wide screen TV shows a storm tossing a boat at sea.
  • the coffee maker 110g might brew, as another example, to lend realism to a morning wake-up scene and/or audio clip.
  • the presentation engine 130 includes a presentation specification recorder 152 for recording a hypermedia presentation to be made, and a presentation coordinator 153 for coordinating the presentation, possibly spanning multiple devices, as it is being rendered to the end user.
  • the recorder 152 translates communications from the programmable devices 110 into the presentation, for subsequent reading by the coordinator 153.
  • the presentation coordinator 153 may be an amBX™ ("ambient experiences") engine.
  • Each of the programmable devices 110 plays a role in authoring the presentation by means of a record or rehearse mode 154, and has, in addition, a normal or perform mode 155 for delivering or rendering its part of the presentation.
  • the programming device 120 automatically switches the programmable device 110 into record mode when the programming device comes within a near field communication (NFC) range 156 of the programmable device.
  • the range 156 may be merely a few centimeters in some implementations or may be up to 2 meters, and may vary with programmable device 110 as is the case for the lamp 110b and the wide screen TV controller 110f.
  • the range is small enough to require the programming device 120 almost to touch the programmable device 110, as in the Near Field Communication Interface and Protocol (NFCIP-1).
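The proximity-triggered mode switching described above can be sketched as follows. This is a minimal illustration only; the class and function names, and the simple distance model, are assumptions rather than anything specified in the patent:

```python
# Sketch: a programmable device is in record mode only while the
# hand-held programming device is within its NFC range; otherwise
# it stays in perform mode. All names here are illustrative.
from dataclasses import dataclass

@dataclass
class ProgrammableDevice:
    name: str
    position: tuple          # (x, y) in meters
    nfc_range_m: float       # per-device range, e.g. ~0.1 m for NFCIP-1
    mode: str = "perform"    # "perform" or "record"

def update_mode(device, programmer_pos):
    """Switch the device between record and perform mode based on
    the distance to the programming device."""
    dx = device.position[0] - programmer_pos[0]
    dy = device.position[1] - programmer_pos[1]
    in_range = (dx * dx + dy * dy) ** 0.5 <= device.nfc_range_m
    device.mode = "record" if in_range else "perform"
    return device.mode

lamp = ProgrammableDevice("lamp", (0.0, 0.0), nfc_range_m=0.1)
print(update_mode(lamp, (0.05, 0.0)))  # almost touching -> record
print(update_mode(lamp, (2.0, 0.0)))   # out of range -> perform
```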
  • the programming device 120 affords the end user 166 a selection of which device is currently being programmed.
  • the authoring system 100 has a user interface for recording which can be regarded as being divided, in time and space, into respective portions.
  • the audio unit 110a has a user interface portion 157 that includes a tuning knob 158, a light emitting diode (LED) panel 162, and other components (not shown) on the audio unit.
  • the user interface portion 157 further includes the programming device 120.
  • the programming device 120 is shown in broken line outline to suggest temporal existence at this location when the audio unit 110a is being programmed, as opposed to location near the lamp 110b when the lamp is being programmed.
  • the programming device 120 is preferably mobile, and is preferably hand-held by an end user 166 during its operation.
  • FIG. 1 shows another broken line 170 encompassing the user interface portion 157, and another user interface portion 174 for the lamp 110b is shown encompassed by the broken line 178.
  • the user interface portion 174 likewise includes the programming device 120.
  • Some programmable devices 110 might not have their own user interface, in which case the programming device 120, while within the respective range 156, serves as the user interface portion.
  • the user interface for recording in the authoring system 100 is temporally divided into portions 157, 174 that are spatially distributed within an ambient environment of the end-user 166 in correspondence with distributed programmable devices 110.
  • FIG. 2 demonstrates some examples of editing programmable device state durations in authoring a presentation according to the present invention.
  • the presentation will be delivered through the use of two fans 204, 208, a TV 212, a graphic overlay controller 214, a wide-screen TV controller 216, two lights 220, 224 and a coffee maker 228, each of which is approached by the end user 166 individually to schedule its contributory performance in the presentation to be delivered.
  • the end user 166 approaches carrying the programming device 120 into the respective NFC range 156. Entry into the range 156 automatically switches on the record mode 154.
  • the end user 166 approaches the fan 204, putting the fan into record mode 154.
  • states of the fan are time-stamped according to a common presentation timeline 232 navigable by the end user 166 on the programming device 120.
  • the programming device 120 allows control of how fast the timeline 232 progresses or rewinds, and will continue to progress, rewind or hold constant the time at the pace set by the end user 166. Accordingly, the end user 166 may already have navigated to a particular time at the moment of entry in the range 156 of the fan 204, and may have since further navigated while in the range.
  • the end user 166 has switched on the fan 204 at the time 15 seconds on the timeline 232, i.e., the user had navigated to 15 seconds by the time he or she switched on the fan.
  • Switching on the fan 204 causes a change in state from “off” to “on.”
  • Responsive to the state change while in record mode 154, the fan 204 time stamps the on state.
  • the time-stamped state is annotated with an identifier of the fan 204 and logged.
  • the end user has also navigated to between 45 and 50 seconds at the time he or she switched the fan 204 off.
  • the off state is similarly time stamped, annotated and logged.
  • the fan 204 remains on until the end of the timeline unless an off state is logged, the latter being the case here. If the end user 166 performs a save on the programming device 120, all logged entries not already transmitted to the presentation engine 130, or a session identifier of them, is transmitted to the presentation engine for recording by the recorder 152. The same occurs, as an alternative to saving, if the end-user at any time leaves the range 156.
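The record-mode logging just described (navigate the common timeline, change a state, time-stamp the change and annotate it with the device identifier) can be sketched as follows. The class and method names are illustrative assumptions, not from the patent:

```python
# Sketch: while a device is in record mode, each state change is
# stamped with the current timeline position, annotated with the
# device identifier, and logged for later transmission.
class RecordSession:
    def __init__(self, device_id):
        self.device_id = device_id
        self.timeline_s = 0.0   # current navigation point, seconds
        self.log = []

    def navigate(self, t_seconds):
        """End-user scrolls the common timeline on the programming device."""
        self.timeline_s = t_seconds

    def set_state(self, state):
        """Record a state change at the current timeline position."""
        entry = {"device": self.device_id,
                 "time_s": self.timeline_s,
                 "state": state}
        self.log.append(entry)
        return entry

# Matches the fan example: on at 15 s, off between 45 and 50 s.
session = RecordSession("fan204")
session.navigate(15)
session.set_state("on")
session.navigate(47)
session.set_state("off")
print(session.log)
```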
  • This transmission may occur wirelessly or over a wired connection, e.g., the Internet or other wide-area network. Alternatively, the transmission may be sent over a local area network (LAN), wired or wireless.
  • Although FIG. 1 shows the recorder 152 as included in the presentation engine 130, the recorder may be implemented on a machine separate from the presentation engine, e.g., in the programming device 120.
  • the fan 208 was switched on at 1 minute and switched off at 1 1/2 minutes. Subsequent to the setting of these state intervals for the fans 204, 208, a “copy and paste” operation or a “cut and paste” operation is performed. As indicated by the arrow 236, the tail end 240 of the fan 204 “on state” interval is copied or cut, the tail end extending from about 35 seconds onward. This tail end 240 has a state duration of about 10-15 seconds, and is pasted onto the latter part of the fan 208 “on state” interval. This pasting occurs illustratively with overlap, since navigation was merely to the 1 minute and 20 second point at the time of pasting.
  • the pasting could alternatively, and perhaps more typically, have been done to cleanly follow the existing on-state interval for the fan 208, thereby extending it to 1 minute and 40 or 45 seconds. Given the overlapped pasting, the on-state interval of the fan 208 therefore now ends at between 1 minute and 30 seconds and 1 minute and 35 seconds. In the case of “cut and paste,” a further effect is that the fan 204 “on state” interval is truncated to end at 35 seconds. In the case of either operation, the cut or copy is performed by the end user 166 on the programming device 120 while in the range 156 of the fan 204. The end user 166 then approaches the fan 208.
  • the end user 166 uses the programming device 120 to paste the tail end 240, i.e., state duration of about 10-15 seconds, onto the fan 208 "on state" interval.
  • This is an example of editing in time and space, since the pasting sub-operation requires that the programming device 120 be transported into the range 156 of another device.
  • the above- illustrated editing operations preferably are applied to source and target devices that are compatible, e.g., both fans.
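Under the assumption that state intervals are simple [start, end] pairs in seconds, the cut-and-paste of the tail end 240 with overlap merging might look like the sketch below. The function names and data model are illustrative, not from the patent:

```python
# Sketch of "cut and paste" of a state-interval tail end between two
# compatible devices, merging overlap on the target, as in the
# fan 204 -> fan 208 example of FIG. 2.
def cut_tail(intervals, cut_at):
    """Truncate the interval containing cut_at; return the cut-off duration."""
    for iv in intervals:
        if iv[0] <= cut_at < iv[1]:
            duration = iv[1] - cut_at
            iv[1] = cut_at
            return duration
    return 0.0

def paste(intervals, paste_at, duration):
    """Paste a duration starting at paste_at, merging any overlap."""
    new = [paste_at, paste_at + duration]
    merged = []
    for iv in sorted(intervals + [new]):
        if merged and iv[0] <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], iv[1])
        else:
            merged.append(iv)
    return merged

fan204 = [[15.0, 47.0]]         # on-state of fan 204
fan208 = [[60.0, 90.0]]         # on at 1 min, off at 1 1/2 min
d = cut_tail(fan204, 35.0)      # tail end of about 12 s
print(fan204)                   # truncated to end at 35 s
print(paste(fan208, 80.0, d))   # overlapped paste at 1 min 20 s
```

The merged result ends at 92 seconds, i.e., between 1 minute 30 and 1 minute 35 seconds, consistent with the overlapped-paste outcome described above.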
  • the wide-screen TV controller 216 is operated by the end user 166 while in the respective range 156 to show a particular video 244, the operation beginning at time 2 minutes on the timeline 232. After navigating to between 2 minutes and 35 seconds and 2 minutes and 40 seconds, the video 244 ends. Accordingly, delimiting entries are logged to show an on state for the particular video 244 and an off state. This is an example of how a state of a device may involve more than merely the device being on or off. Here, the state includes the particular video 244 chosen for inclusion at this temporal point in the presentation. Similarly, the device state for the graphic overlay controller 214 would relate to the particular graphic or scene overlay intended for the presentation at a particular time, and two separate device states 248, 252 are shown in FIG. 2.
  • the editing operation cuts or copies the entire duration of the state interval, which, as seen from the arrow 248, is pasted onto the same device 216 to start at 15 seconds according to the timeline 232.
  • the lights 220, 224 being in the same neighborhood, have respective ranges 156 that overlap.
  • the end user 166 cuts or copies a tail end 256, although, more generally, any part of a state interval can be cut or copied.
  • both lights 220, 224 are in record mode. The programming device 120 will therefore allow the end user 166 to select between the two devices 220, 224 for the cut or copy sub-operation.
  • the programming device 120 affords the option of pasting to both devices 220, 224 simultaneously in timeline concurrency by means of a single paste sub-operation that selects both devices.
  • This simultaneous pasting is represented by an arrow 260.
  • Simultaneous pasting to devices 110 can alternatively paste to the other devices a relative time difference rather than a state duration.
  • moving a state duration that begins at 20 seconds to the 1 minute, 40 second mark entails a relative time difference of 1 minute 20 seconds for device 220.
  • Simultaneous pasting according to the alternative embodiment would move the existing state duration for the device 224 forward by that relative time difference. Device 224 would therefore begin its on state at 2 minutes 40 seconds and turn off at 3 minutes.
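The relative-time-difference alternative reduces to computing one offset and shifting the other selected devices' intervals by it. A sketch follows, with device 224's original interval assumed to be 1 minute 20 seconds to 1 minute 40 seconds (an assumption consistent with the end times stated above); the function names are illustrative:

```python
# Sketch of the "relative time difference" paste: instead of copying
# a state duration, other selected devices shift their existing
# intervals by the same offset the source duration was moved.
def relative_shift(old_start, new_start):
    """Offset implied by moving a state duration to a new start time."""
    return new_start - old_start

def shift_interval(interval, offset):
    return (interval[0] + offset, interval[1] + offset)

# Moving device 220's duration from 20 s to 1 min 40 s (100 s)
# implies a shift of 1 min 20 s (80 s)...
offset = relative_shift(20, 100)
print(offset)  # 80

# ...so device 224's interval (assumed 80 s to 100 s) moves forward
# to 2 min 40 s - 3 min, as in the text.
print(shift_interval((80, 100), offset))  # (160, 180)
```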
  • In the event the programming device 120 cannot simultaneously be in range of both devices 220, 224, the programming device is carried or otherwise brought into the ranges 156 separately for two correspondingly separate paste sub-operations.
  • the end user 166 selects both devices 220, 224 on the programming device 120.
  • the programming device 120 attaches their respective identifiers to the paste sub-operation, which it then transmits, preferably wirelessly, to the presentation engine 130 for recording.
  • the presentation recorder 152 accordingly pastes the state duration to both devices 220, 224.
  • A further alternative under NFCIP-1 is to provide the end-user 166, on-screen on the programming device 120, with a selectable portable link to the device 110 currently in range.
  • NFCIP-1 allows a mobile device within NFC range of another device to automatically set up a Bluetooth™ connection, which survives as the mobile device leaves the range and thereby breaks the NFC connection.
  • the end-user can select a portable link by which to subsequently perform the simultaneous paste.
  • FIG. 3 is a front and side view of an exemplary embodiment of the programming device 120.
  • the programming device 120, which could be incorporated into a portable media player, such as an MP3 player, or another portable multifunction terminal such as a mobile phone, has a screen 304, an audio speaker 308, a four-way screen-navigating rocker switch 312 having a central selection button 316, a timeline navigation wheel 320, a toggle button 324 for pausing and continuing the timeline 232, and an on/off switch (not shown).
  • the programming device 120 features cut, copy and paste buttons or actuators 328, 332, 336, respectively.
  • the programming device 120 also preferably includes a radio frequency identification (RFID) reader that, when activated, emits a short-range radio signal for powering up a microchip on the tag or transponder of each programmable device 110. This allows for reading data stored on the tag, as when reading states for display on the screen 304.
  • the programmable device 110 likewise preferably has a reader for reading a tag on the programming device so as to receive time stamps identifying current time on the timeline 232.
  • audio cues consisting of regularly-spaced audio announcements emanate from the audio speaker 308 during timeline navigation, and during pauses in timeline navigation.
  • the end user 166 hears, for example, "1 second, 2 seconds, 3 seconds, 4 seconds . . .” from the speaker 308. This way, the end user 166 can experience the presentation timeline 232 without having to look at any screen, allowing him or her to fully focus on the situational (i.e., on-site) experience being built up.
  • the timeline 232 can, in addition or alternatively, be displayed on the screen 304. It can, for instance, roll horizontally according to navigation, with device states being shown alongside as in FIG. 2.
  • programmable devices 110 can be selected on-screen in the event of overlapping ranges 156 or, in one embodiment, in case of simultaneous pasting.
  • FIG. 4 provides a process 400 for authoring a presentation according to the present invention.
  • Programmable devices 110 are spatially distributed in the end-user's ambient environment (step S410).
  • the end user 166 approaches, bringing the programming device 120 into NFC range 156 (step S420). This puts the approached programmable device 110 in record mode 154, provided the programming device 120 has itself been activated, as by means of the on/off switch, or otherwise put into record mode.
  • the programming device 120, which may be embodied in any of a variety of other hand-portable appliances, thus protects against inadvertently switching programmable devices 110 into record mode while passing by.
  • the end user 166 selectively activates and deactivates states of the current programmable device 110 in correspondence with: 1) navigating the common presentation timeline 232 on the programming device 120, with audio cues and/or visual feedback; 2) registering time-specified state changes against the timeline 232 using the programming and programmable devices 120, 110; and 3) optionally editing state durations in time and space using the programming and programmable devices 120, 110 (step S430).
  • When the current programmable device 110 switches out of record mode 154, either because the programming device 120 has moved out of range or because the end user has selected perform mode 155 or another operating mode, the current device outputs a time-specified state or corresponding identifier (step S440).
  • the option of sending the identifier applies to stateful devices, i.e., devices that save their own scheduled device states or "actions.”
  • the identifier can be in the form of a session identifier that includes a device identifier.
  • the presentation coordinator 153 in reading the recorded script, sends the session identifier back to the originating device 110 at the moment the device is scheduled to perform.
  • the device 110 uses the incoming session identifier to retrieve the corresponding action list or action from storage.
  • a stateful device which sends out and receives back an identifier of an action list is preferably configured with a timer so that actions on the list after the first action can be performed on time.
  • Stateless devices, which do not save their states, react to a save by the end user 166 by transmitting the state(s) as an action or action list to the presentation engine 130.
  • the presentation coordinator 153 sends back the action to be performed at that moment, thereby controlling synchronization of the distributed devices 110.
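One way to picture the stateful-device exchange of session identifiers with the coordinator is sketched below. Every class, method, and identifier format here is an assumption for illustration; the patent specifies only that the device stores its recorded actions, hands back an identifier, and performs when the identifier is returned at the scheduled moment:

```python
# Sketch: a stateful device keeps its recorded action list locally,
# gives the coordinator only a session identifier, and performs the
# list when that identifier is sent back at the scheduled time.
class StatefulDevice:
    def __init__(self, device_id):
        self.device_id = device_id
        self.actions = {}     # session_id -> recorded action list

    def end_record(self, session_num, action_list):
        """Leaving record mode: save the actions, return the identifier."""
        session_id = f"{self.device_id}:{session_num}"
        self.actions[session_id] = action_list
        return session_id

    def perform(self, session_id):
        """Coordinator sends the identifier back at show time."""
        return self.actions[session_id]

class Coordinator:
    def __init__(self):
        self.script = []      # (time_s, device, session_id) entries

    def record(self, time_s, device, session_id):
        self.script.append((time_s, device, session_id))

    def run(self):
        """Replay the recorded script in timeline order."""
        performed = []
        for time_s, device, session_id in self.script:
            performed.append((time_s, device.perform(session_id)))
        return performed

fan = StatefulDevice("fan204")
sid = fan.end_record(1, [("on", 15), ("off", 47)])
coord = Coordinator()
coord.record(15, fan, sid)
print(coord.run())
```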
  • the authoring process 400 is carried out for each programmable device 110 (steps S450, S460).

Abstract

Authoring a distributed presentation of environmental effects and/or media respectively timed to immerse the end user in a realistic presentation uses a common programming device (120). The device includes a control for end-user navigation of a common presentation timeline (232) in scheduling programmable devices to present at respective times on the timeline. Further included is an actuator for registering against the timeline in performing the scheduling. The devices to present (110) are spatially distributed in an ambient environment of the end user. The programming device selectively enables the registering respectively when it sufficiently nears a corresponding one of the distributed devices so as to enter its communication range (S420).

Description

METHOD FOR SITUATIONAL END-USER AUTHORING OF DISTRIBUTED MEDIA
PRESENTATIONS
The present invention relates to distributed media/environmental-effect presentations and, more particularly, to the authoring of the presentation on-site.
Authoring of interactive media presentations is well known from digital versatile disc (DVD) authoring and hypertext markup language (HTML) authoring applications such as Apple's iDVD™, Macromedia's Dreamweaver™ and Microsoft's FrontPage™. In addition, markup languages such as Synchronized Multimedia Integration Language
(SMIL) allow an author to synchronize multimedia events.
In ambient intelligence environments, however, several fragments of media such as audio, video, animation, graphics and text and/or environmental effects, such as light, wind, heat and scent can run in parallel. The present inventor has observed that the above-mentioned end-user authoring applications have been designed with a single device in mind, i.e., a personal computer (PC) screen, and that, although languages like SMIL are extendable to distributed devices, an easy and intuitive user interface for authoring does not exist.
In one aspect, the present invention is directed to a common, mobile programming device for authoring a distributed presentation of environmental effects and/or media respectively timed to immerse the end user in a realistic presentation. The device includes a control for end-user navigation of a common presentation timeline in scheduling devices to present at respective times on the timeline. Further included is an actuator for registering against the timeline in performing the scheduling. The devices to present are spatially distributed in an ambient environment of the end user. The programming device selectively enables the registering respectively when it sufficiently nears a corresponding one of the distributed devices so as to enter its communication range.
In another aspect, a system for authoring a timed, distributed presentation of environmental effects and/or media includes programmable devices and a programming device. The latter includes a control for end-user navigation of a common presentation timeline in scheduling the programmable devices to present at respective times on the timeline. Further included is an actuator for registering against the timeline in performing the scheduling. The programmable devices are spatially distributed in an ambient environment of the end user.
The scheduling includes setting a given distributed device, in correspondence with the navigating, to operating states in which the given device is being scheduled to present. The setting registers the states against the timeline. In a further aspect, a method for authoring a timed, distributed presentation of environmental effects and/or media includes spatially distributing devices in an ambient environment of an end user and scheduling the distributed devices to present at respective times on a common presentation timeline. The scheduling involves the end-user approaching the distributed devices individually to bring a common programming device into range. It also involves the end-user navigating, on the common programming device, the timeline in correspondence with the approaching.
In yet another aspect, a programming device for authoring a timed, distributed presentation of environmental effects and/or media is configured for editing, in time and space, state durations, in the presentation, of respective programmable devices against a presentation timeline common for all of the programmable devices. The editing in space transfers a given duration to another of the programmable devices and involves transport of the programming device to the other device.
In a still further aspect, a programmable device for recording and performing, in concert with at least one other performing device, a timed, distributed presentation of environmental effects and/or media, has a perform mode and a record or "rehearse" mode.
Switching out of record mode causes the programmable device to output, to a coordinator of the presentation, a time-specified device state just recorded or an identifier of the recorded device state.
Details of the novel, distributed, multimedia/environmental effect, end-user, situational (i.e., on-site), authoring system and components are set forth below with the aid of the following drawings:
FIG. 1 is a conceptual diagram providing an overview of a distributed media authoring system according to the present invention;
FIG. 2 is a timing chart demonstrating editing of programmable device state durations in authoring a presentation according to the present invention; FIG. 3 is a front and side view of a programming device according to the present invention; and
FIG. 4 is a flow chart of a process for authoring a presentation according to the present invention. FIG. 1 portrays, by illustrative and non-limitative example, an overview of a distributed media authoring system 100 according to the present invention. The authoring system 100 includes programmable devices 110, a programming device 120 and a presentation engine 130.
The programmable devices 110 shown include an audio unit 110a having speakers 140, a light or lamp 110b, an electric fan 110c, a television (TV) 110d, a set-top box (STB) 110e, a wide screen TV controller 110f of a wide screen TV 150, and a coffee maker 110g. These devices 110 are operable according to a timed script to deliver a multimedia/environmental effect presentation. Thus, the fan 110c might come on at a particular point when, for example, the wide screen TV shows a storm tossing a boat at sea. The coffee maker 110g might brew, as another example, to lend realism to a morning wake-up scene and/or audio clip.
The presentation engine 130 includes a presentation specification recorder 152 for recording a hypermedia presentation to be made, and a presentation coordinator 153 for coordinating the presentation, possibly spanning multiple devices, as it is being rendered to the end user. The recorder 152 translates communications from the programmable devices 110 into the presentation, for subsequent reading by the coordinator 153. The presentation coordinator 153 may be an amBX™ ("ambient experiences") engine.
Each of the programmable devices 110 plays a role in authoring the presentation by means of a record or rehearse mode 154, and has, in addition, a normal or perform mode 155 for delivering or rendering its part of the presentation. The programming device 120 automatically switches the programmable device 110 into record mode when the programming device comes within a near field communication (NFC) range 156 of the programmable device. The range 156 may be merely a few centimeters in some implementations or may be up to 2 meters, and may vary with the programmable device 110, as is the case for the lamp 110b and the wide screen TV controller 110f. According to one embodiment, the range is small enough to require the programming device 120 almost to touch the programmable device 110, as in the Near Field Communication Interface and Protocol (NFCIP-1). In the case of overlapping ranges, which might apply to the TV 110d and its set-top box 110e, the programming device 120 affords the end user 166 a selection of which device is currently being programmed.
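The range-driven mode switching just described can be sketched, by way of non-limiting illustration, in Python. The class layout, the one-dimensional position model, and all identifiers below are assumptions made for clarity, not part of the disclosed apparatus:

```python
from dataclasses import dataclass

@dataclass
class ProgrammableDevice:
    name: str
    position: float        # 1-D position in the room, in meters (a simplification)
    nfc_range: float       # per-device NFC range 156, e.g. 0.05 m up to 2 m
    mode: str = "perform"  # normal or perform mode 155 by default

def update_modes(devices, programmer_position):
    """Switch devices whose NFC range covers the programming device into
    record mode; all others revert to perform mode. Returns the in-range
    devices so that, on overlapping ranges, the end user can select one."""
    in_range = []
    for d in devices:
        if abs(d.position - programmer_position) <= d.nfc_range:
            d.mode = "record"
            in_range.append(d)
        else:
            d.mode = "perform"
    return in_range
```

When the returned list holds more than one device, as for the TV 110d and its set-top box 110e, the programming device 120 would present the selection to the end user 166.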
The authoring system 100 has a user interface for recording which can be regarded as being divided, in time and space, into respective portions.
The audio unit 110a, for example, has a user interface portion 157 that includes a tuning knob 158, a light emitting diode (LED) panel 162, and other components (not shown) on the audio unit. The user interface portion 157 further includes the programming device 120. The programming device 120 is shown in broken line outline to suggest temporal existence at this location when the audio unit 110a is being programmed, as opposed to location near the lamp 110b when the lamp is being programmed. The programming device 120 is preferably mobile, and is preferably hand-held by an end user 166 during its operation. For explanatory purposes, FIG. 1 shows another broken line 170 encompassing the user interface portion 157, and another user interface portion 174 for the lamp 110b is shown encompassed by the broken line 178. The user interface portion 174 likewise includes the programming device 120.
Some programmable devices 110, however, such as colored LEDs, might not have their own user interface in which case the programming device 120, while within the respective range 156, serves as the user interface portion.
In particular therefore, the user interface for recording in the authoring system 100 is temporally divided into portions 157, 174 that are spatially distributed within an ambient environment of the end-user 166 in correspondence with distributed programmable devices 110.
FIG. 2 demonstrates some examples of editing programmable device state durations in authoring a presentation according to the present invention. The presentation will be delivered through the use of two fans 204, 208, a TV 212, a graphic overlay controller 214, a wide-screen TV controller 216, two lights 220, 224 and a coffee maker 228, each of which is approached by the end user 166 individually to schedule its contributory performance in the presentation to be delivered. The end user 166 approaches carrying the programming device 120 into the respective NFC range 156. Entry into the range 156 automatically switches on the record mode 154. The end user 166 approaches the fan 204, putting the fan into record mode 154. From this time forward, states of the fan are time-stamped according to a common presentation timeline 232 navigable by the end user 166 on the programming device 120. The programming device 120 allows control of how fast the timeline 232 progresses or rewinds, and the timeline will continue to progress, rewind or hold constant at the pace set by the end user 166. Accordingly, the end user 166 may already have navigated to a particular time at the moment of entry in the range 156 of the fan 204, and may have since further navigated while in the range.
As seen from FIG. 2, the end user 166 has switched on the fan 204 at the time 15 seconds on the timeline 232, i.e., the user had navigated to 15 seconds by the time he or she switched on the fan. Switching on the fan 204 causes a change in state from "off" to "on." Responsive to the state change while in record mode 154, the fan 204 time stamps the on state. The time-stamped state is annotated with an identifier of the fan 204 and logged. The end user has also navigated to between 45 and 50 seconds at the time he or she switched the fan 204 off. The off state is similarly time stamped, annotated and logged. During the presentation, the fan 204 remains on until the end of the timeline unless an off state is logged, the latter being the case here. If the end user 166 performs a save on the programming device 120, all logged entries not already transmitted to the presentation engine 130, or a session identifier of them, is transmitted to the presentation engine for recording by the recorder 152. The same occurs, as an alternative to saving, if the end-user at any time leaves the range 156. This transmission may occur wirelessly or over a wired connection, e.g., the Internet or other wide-area network. Alternatively, the transmission may be sent over a local area network (LAN), wired or wireless. Although FIG. 1 shows the recorder 152 as included in the presentation engine 130, the recorder may be implemented on a machine separate from the presentation engine, e.g., in the programming device 120.
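The time-stamping, annotation and logging of state changes described above may be sketched as follows; the dictionary log-entry layout and the class and method names are illustrative assumptions, not a normative format:

```python
class RecordingDevice:
    """Sketch of a programmable device 110 with a record mode 154."""

    def __init__(self, device_id):
        self.device_id = device_id
        self.mode = "perform"
        self.log = []  # time-stamped state changes, annotated with the device id

    def set_state(self, state, timeline_seconds):
        """Time-stamp a state change against the common presentation
        timeline 232; changes register only while in record mode."""
        if self.mode == "record":
            self.log.append({"device": self.device_id,
                             "time": timeline_seconds,
                             "state": state})

# The fan 204 example: record mode entered, on at 15 s, off at about 47 s.
fan = RecordingDevice("fan 204")
fan.mode = "record"      # programming device 120 entered the NFC range 156
fan.set_state("on", 15)  # user had navigated to 15 s on the timeline
fan.set_state("off", 47) # user had navigated to between 45 and 50 s
```

On a save, or on leaving the range 156, entries such as these (or a session identifier for them) would be transmitted to the presentation engine 130.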
The fan 208 was switched on at 1 minute and switched off at 1 1/2 minutes. Subsequent to the setting of these state intervals for the fans 204, 208, a "copy and paste" operation or a "cut and paste" operation is performed. As indicated by the arrow 236, the tail end 240 of the fan 204 "on state" interval is copied or cut, the tail end extending from about 35 seconds onward. This tail end 240 has a state duration of about 10-15 seconds, and is pasted onto the latter part of the fan 208 "on state" interval. This pasting occurs illustratively with overlap, since navigation was merely to the 1 minute and 20 second point at the time of pasting. The pasting could alternatively, and perhaps more typically, have been done to cleanly follow the existing on-state interval for the fan 208, thereby extending it to 1 minute and 40 or 45 seconds. Given the overlapped pasting, the on-state interval of the fan 208 therefore now ends between 1 minute 30 seconds and 1 minute 35 seconds. In the case of "cut and paste," a further effect is that the fan 204 "on state" interval is truncated to end at 35 seconds. In the case of either operation, the cut or copy is performed by the end user 166 on the programming device 120 while in the range 156 of the fan 204. The end user 166 then approaches the fan 208. While in the range 156 of the fan 208, thereby putting the fan 208 into record mode 154, the end user 166 uses the programming device 120 to paste the tail end 240, i.e., a state duration of about 10-15 seconds, onto the fan 208 "on state" interval. This is an example of editing in time and space, since the pasting sub-operation requires that the programming device 120 be transported into the range 156 of another device. The above-illustrated editing operations preferably are applied to source and target devices that are compatible, e.g., both fans.
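The interval arithmetic of the fan example can be made concrete in a short, non-limiting sketch; intervals are (start, end) pairs in seconds, the helper names are hypothetical, and the fan 204 off time is taken as 47 s within the stated 45-50 second window:

```python
def cut_tail(interval, cut_point):
    """Truncate an on-state interval at cut_point, returning the
    shortened interval and the duration of the removed tail."""
    start, end = interval
    return (start, cut_point), end - cut_point

def paste(interval, clip_duration, paste_time):
    """Paste a clipped duration onto an existing on-state interval at
    paste_time; an overlapping paste merges with, and may extend,
    the existing interval."""
    start, end = interval
    return (min(start, paste_time), max(end, paste_time + clip_duration))

# Fan 204 on from 15 s to about 47 s; cut the tail end 240 at 35 s:
fan204, tail = cut_tail((15, 47), 35)   # tail duration: 12 s
# Fan 208 on from 1 min to 1 1/2 min; overlapping paste at 1 min 20 s:
fan208 = paste((60, 90), tail, 80)      # ends between 1:30 and 1:35
```

A clean, non-overlapping paste at the 1 1/2 minute mark would instead extend the fan 208 interval past 1 minute 40 seconds, matching the alternative described above.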
Although it is possible to paste an audio state onto a video device so as to apply to an audio portion of the device, the reverse operation would encounter the difficulty of pasting a video state onto a non-video device.
The wide-screen TV controller 216 is operated by the end user 166 while in the respective range 156 to show a particular video 244, the operation beginning at time 2 minutes on the timeline 232. After navigating to between 2 minutes and 35 seconds and two minutes and 40 seconds, the video 244 ends. Accordingly, delimiting entries are logged to show an on state for the particular video 244 and an off state. This is an example, on how a state of a device may involve more than merely the device being on or off. Here, the state includes the particular video 244 chosen for inclusion at this temporal point in the presentation. Similarly, the device state for the graphic overlay controller 214 would relate to the particular graphic or scene overlay intended for the presentation at a particular time, and two separate device states 248, 252 are shown in FIG. 2 for the TV 212. Referring again, to the wide screen TV controller video 244, the editing operation cuts or copies the entire duration of the state interval, which, as seen from the arrow 248, is pasted onto the same device 216 to start at 15 seconds according to the timeline 232. The lights 220, 224, being in the same neighborhood, have respective ranges 156 that overlap. As shown, the end user 166 cuts or copies a tail end 256, although, more generally, any part of a state interval can be cut or copied. With the programming device 120 in the overlap between the ranges 156, both lights 220, 224 are in record mode. The programming device 120 will therefore allow the end user 166 to select between the two devices 220, 224 for the cut or copy sub-operation.
Preferably, the programming device 120 affords the option of pasting to both devices 220, 224 simultaneously in timeline concurrency by means of a single paste sub-operation that selects both devices. This simultaneous pasting is represented by an arrow 260.
Simultaneous pasting to devices 110, one of which is the same device cut or copied from, can alternatively paste to the other devices a relative time difference rather than a state duration. Here, for example, moving a state duration that begins at 20 seconds to the 1 minute, 40 second mark entails a relative time difference of 1 minute 20 seconds for device 220. Simultaneous pasting according to the alternative embodiment would move the existing state duration for the device 224 forward by that relative time difference. Device 224 would therefore begin its on state at 2 minutes 40 seconds and turn off at 3 minutes.
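The relative-time-difference alternative amounts to simple interval arithmetic, sketched below by way of non-limiting illustration. The original interval assumed here for the light 224 (1 minute 20 seconds to 1 minute 40 seconds) is inferred from the 2 minute 40 second and 3 minute results stated above:

```python
def shift_interval(interval, delta):
    """Move a device's state interval forward by a relative time
    difference (all values in seconds)."""
    start, end = interval
    return (start + delta, end + delta)

# Device 220's state duration begins at 20 s and is moved to 1 min 40 s,
# a relative time difference of 1 min 20 s:
delta = 100 - 20  # 80 s

# Simultaneous pasting moves device 224's existing interval by the same
# difference, so it begins at 2 min 40 s and ends at 3 min:
light224 = shift_interval((80, 100), delta)
```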
In the event the programming device 120 cannot simultaneously be in range of both devices 220, 224, the programming device is carried or otherwise brought into the ranges 156 separately for two correspondingly separate paste sub-operations.
In an alternative embodiment, even if range overlap does not exist, the end user 166 selects both devices 220, 224 on the programming device 120. The programming device 120 attaches their respective identifiers to the paste sub-operation, which it then transmits, preferably wirelessly, to the presentation engine 130 for recording. The presentation recorder 152 accordingly pastes the state duration to both devices 220, 224.
A further alternative under NFCIP-1 is to provide the end-user 166 on-screen on the programming device 120 with a selectable portable link to the device 110 currently in-range. NFCIP-1 allows a mobile device within NFC range of another device to automatically set up a Bluetooth™ connection which survives as the mobile device leaves the range and thereby breaks the NFC connection. When leaving one of the programmable devices 110 targeted for a paste to approach another of them, the end-user can select a portable link by which to subsequently perform the simultaneous paste.
FIG. 3 is a front and side view of an exemplary embodiment of the programming device 120. The programming device 120, which could be incorporated into a portable media player, such as an MP3 player, or another portable multifunction terminal such as a mobile phone, has a screen 304, an audio speaker 308, a four-way screen-navigating rocker switch 312 having a central selection button 316, a timeline navigation wheel 320, a toggle button 324 for pausing and continuing the timeline 232, and an on/off switch (not shown). On one side, the programming device 120 features cut, copy and paste buttons or actuators 328, 332, 336, respectively. The programming device 120 also preferably includes a radio frequency identification (RFID) reader that, when activated, emits a short-range radio signal for powering up a microchip on the tag or transponder of each programmable device 110. This allows for reading data stored on the tag, as when reading states for display on the screen 304. The programmable device 110 likewise preferably has a reader for reading a tag on the programming device so as to receive time stamps identifying current time on the timeline 232.
Turning the wheel 320 clockwise quickens the pace at which the timeline 232 progresses, or slows the pace at which the timeline goes backward in time. Conversely, turning the wheel 320 counterclockwise quickens backward movement or slows forward movement. Advantageously, audio cues consisting of regularly-spaced audio announcements emanate from the audio speaker 308 during timeline navigation, and during pauses in timeline navigation. In record mode 154, the end user 166 hears, for example, "1 second, 2 seconds, 3 seconds, 4 seconds . . ." from the speaker 308. This way, the end user 166 can experience the presentation timeline 232 without having to look at any screen, allowing him or her to fully focus on the situational (i.e., on-site) experience being built up. If the end user 166 pauses the timeline 232 at a particular point in time, she hears "12 seconds, 12 seconds, 12 seconds . . ." which lets the end user know where she is on the timeline. If she rewinds the timeline 232, she hears "12 seconds, 11 seconds, 10 seconds, 9 seconds, . . ." Here, the announcements are made one second apart. The counting will either speed up or skip some times if navigation forward quickens, and slow down or more finely divide time announcements if navigation slows. The timeline 232 can, in addition or alternatively, be displayed on the screen 304. It can, for instance, roll horizontally according to navigation, with device states being shown alongside as in FIG. 2.
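The pacing of the audio announcements during navigation can be sketched, as a non-limiting simplification, with a single step parameter standing in for the wheel 320 position; the function name and step model are assumptions:

```python
def announcements(start, stop, step=1):
    """Timeline times (in seconds) announced while navigating from start
    toward stop. A step of 1 models the normal pace ("1 second,
    2 seconds, ..."); a larger step models quickened navigation that
    skips times; a negative step models rewinding. The repetition of a
    single time during a pause is omitted from this sketch."""
    out = []
    t = start
    while (step > 0 and t <= stop) or (step < 0 and t >= stop):
        out.append(t)
        t += step
    return out
```

For example, rewinding from 12 seconds would announce 12, 11, 10, 9, ..., while quickened forward navigation with a step of 3 would skip intermediate times.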
The end user can save logged entries by navigating on-screen and then selecting via the central selection button 316, or a separate button can be provided for this purpose. As mentioned above, programmable devices 110 can be selected on-screen in the event of overlapping ranges 156 or, in one embodiment, in case of simultaneous pasting.
In one embodiment, the end user 166 relies on audio cues in navigating the timeline 232, and on moving out of range to save logged entries. The rocker switch 312 and selection button 316 can therefore be eliminated, thereby further simplifying the user interface. FIG. 4 provides a process 400 for authoring a presentation according to the present invention. Programmable devices 110 are spatially distributed in the end-user's ambient environment (step S410). For a current device 110, the end user 166 approaches, bringing the programming device 120 into NFC range 156 (step S420). This puts the approached programmable device 110 in record mode 154, provided the programming device 120 has itself been activated, e.g., by means of the on/off switch, or otherwise put into record mode.
Requiring this of the programming device 120, which may be embodied with any of a variety of other hand-portable appliances, protects against inadvertently switching programmable devices 110 into record mode while passing by.
While in record mode 154, the end user 166 selectively activates and deactivates states of the current programmable device 110 in correspondence with: 1) navigating the common presentation timeline 232 on the programming device 120, with audio cues and/or visual feedback; 2) registering time-specified state changes against the timeline 232 using the programming and programmable devices 120, 110; and 3) optionally editing state durations in time and space using the programming and programmable devices 120, 110 (step S430). When the current programmable device 110 switches out of record mode 154, either because the programming device 120 has moved out of range or has, under end-user control, selected perform mode 155 or another operating mode, the current device outputs a time-specified state or corresponding identifier (step S440). The option of sending the identifier applies to stateful devices, i.e., devices that save their own scheduled device states or "actions." The identifier can be in the form of a session identifier that includes a device identifier. Each time the device 110 sends out an identifier due to saving or exit out of range, a new session is indicated. Thus, at the time of presentation, the presentation coordinator 153, in reading the recorded script, sends the session identifier back to the originating device 110 at the moment the device is scheduled to perform. The device 110 uses the incoming session identifier to retrieve the corresponding action list or action from storage. A stateful device which sends out and receives back an identifier of an action list is preferably configured with a timer so that actions on the list after the first action can be performed on time. On the other hand, stateless devices, which do not save their states, react to a save by the end user 166 by transmitting the state(s) as an action or action list to the presentation engine 130.
In reading the script at presentation time, the presentation coordinator 153 sends back the action to be performed at that moment, thereby controlling synchronization of the distributed devices 110.
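The session-identifier round trip for a stateful device can be sketched as follows, by way of non-limiting illustration; the class names, the session-id format, and the coordinator's script layout are all assumptions for clarity:

```python
class StatefulDevice:
    """Sketch of a stateful programmable device 110 that saves its own
    recorded action list under a session identifier."""

    def __init__(self, device_id):
        self.device_id = device_id
        self.sessions = {}   # session id -> recorded action list
        self.performed = []
        self._counter = 0

    def exit_record_mode(self, actions):
        """On leaving record mode, save the actions locally and output a
        session identifier (embedding the device id) to the coordinator."""
        self._counter += 1
        session_id = f"{self.device_id}-{self._counter}"
        self.sessions[session_id] = actions
        return session_id

    def perform(self, session_id):
        """Retrieve and perform the stored actions for an incoming id."""
        self.performed.extend(self.sessions[session_id])

class Coordinator:
    """Sketch of the presentation coordinator 153 reading a recorded
    script and sending session identifiers back on schedule."""

    def __init__(self):
        self.script = []  # (scheduled time, device, session id)

    def record(self, time, device, session_id):
        self.script.append((time, device, session_id))

    def run(self):
        for _, device, session_id in sorted(self.script, key=lambda e: e[0]):
            device.perform(session_id)
```

A stateless device would instead transmit the action list itself to the presentation engine 130, and the coordinator would send back the actions rather than an identifier.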
The authoring process 400 is carried out for each programmable device 110 (steps S450, S460).
While there have been shown and described and pointed out fundamental novel features of the invention as applied to preferred embodiments thereof, it will be understood that various omissions and substitutions and changes in the form and details of the devices illustrated, and in their operation, may be made by those skilled in the art without departing from the spirit of the invention. For example, different media can be scheduled for the presentation without any special effects. It should be recognized that structures and/or elements and/or method steps shown and/or described in connection with any disclosed form or embodiment of the invention may be incorporated in any other disclosed or described or suggested form or embodiment.


CLAIMS:
1. A common, mobile programming device (120) for authoring a timed, distributed presentation of at least one of environmental effects and media by a plurality of devices (110), the common, mobile programming device comprising: a control (320, 324) for navigating, by an end user, a common presentation timeline in scheduling the plural devices to present at respective times on said timeline; and an actuator (328, 332, 336) for registering against said timeline in performing said scheduling, said plural devices being spatially distributed in an ambient environment of said end user, the common, mobile programming device being configured for selectively enabling said registering respectively when the programming device sufficiently nears a corresponding one of the distributed devices so as to enter a communication range of said corresponding one.
2. The programming device of claim 1, being further configured for providing a regularly-spaced series of audio announcements identifying current time on said timeline during said navigating and during pauses in said navigating (308).
3. The programming device of claim 1, at least some of said scheduling being performed on a user interface temporally divided into portions (157, 174) that are spatially distributed within said ambient environment in correspondence with said distributed devices and that are respectively implemented, at least partially, on the programming device.
4. A system for authoring timed, distributed presentations of at least one of environmental effects and media, said system comprising the programming device of claim 1, wherein said corresponding one is configured for automatically operating in record mode (154) when the programming device is within said range, and for automatically, upon leaving said range, switching out of record mode and into an operating mode (155).
5. A presentation engine (130) for use with the programming device of claim 1, said engine being configured for controlling synchronization of said distributed devices, said presentation engine comprising: a presentation specification recorder (152) for selectively recording said scheduling for subsequent rendering, by said distributed devices, of the authored media presentation according to said scheduling; and a presentation coordinator (153) for, in processing the recorded scheduling, transmitting information to said distributed devices to cause said rendering.
6. The programming device of claim 1, comprising at least said actuator, and optionally one or more additional actuators, so that the programming device is configured for at least one of copying and cutting, for subsequent pasting selectively, along said timeline an operating state of one of said distributed devices over a predetermined time duration so as to reschedule said state for said duration for another of said distributed devices (256, 260).
7. The programming device of claim 1, comprising at least said actuator, and optionally one or more additional actuators, such that the programming device is configured for at least one of copying and cutting, for subsequent pasting selectively, along said timeline a remaining time duration of a particular operating state of one of said distributed devices (236, 240).
8. The programming device of claim 1, wherein said range is a Near Field Communication (NFC) range (156).
9. A presentation engine (130) for use with the programming device of claim 1, said engine being configured for controlling synchronization of said distributed devices, said engine comprising: a presentation specification recorder (152) for selectively recording, to a document, said scheduling or references to said scheduling; and a presentation coordinator (153) for reading the recorded document in coordinating, among said distributed devices, rendering, by said distributed devices, of the authored media presentation according to said scheduling.
10. A system for authoring a timed, distributed presentation of at least one of environmental effects and media, comprising: a plurality of programmable devices (110); and a programming device (120) including a control for navigating, by an end user, a common presentation timeline (232) in scheduling the plural programmable devices to present at respective times on said timeline, and further including an actuator for registering against said timeline in performing said scheduling, said plural programmable devices being spatially distributed in an ambient environment of said end user, wherein said scheduling comprises setting a given device from among said distributed devices, in correspondence with the navigating of said timeline, to a plurality of operating states (212, 252) in which said given device is being scheduled to present, said setting registering said states against said timeline.
11. A method for authoring a timed, distributed presentation of at least one of environmental effects and media, comprising: spatially distributing a plurality of devices (110) in an ambient environment of an end user; and scheduling the distributed devices to present at respective times on a common presentation timeline (232), said scheduling comprising approaching, by said end user, the distributed devices individually to bring a common programming device (120) into range (156) and navigating, on said common programming device, said timeline in correspondence with said approaching.
12. The method of claim 11, further comprising providing a regularly-spaced series of audio announcements (308) identifying current time on said timeline during navigation of said timeline and during pauses in said navigation.
13. The method of claim 11, further comprising: recording said scheduling for subsequent rendering, by said distributed devices, of the authored media presentation according to said scheduling (S430, S440); and, in processing the recorded scheduling, transmitting information to said distributed devices to cause said rendering.
14. The method of claim 11, further comprising at least one of copying and cutting, for subsequent pasting selectively, along said timeline an operating state of one of said distributed devices over a predetermined time duration so as to reschedule said state for said duration for another of said distributed devices (236, 240).
15. The method of claim 11, further comprising at least one of copying and cutting, for subsequent pasting selectively, along said timeline a remaining time duration of a particular operating state of one of said distributed devices (244, 248).
16. The method of claim 11, further comprising: selectively recording, to a document, said scheduling or references to said scheduling (S440); and reading the recorded document in coordinating, among said distributed devices, rendering, by said distributed devices, of the authored media presentation according to said scheduling (153).
17. The method of claim 11, wherein said scheduling comprises setting a given device from among said distributed devices, in correspondence with the navigating of said timeline, to a plurality of operating states in which said given device is being scheduled to present (212, 252).
18. A computer program product (120) for authoring a timed, distributed presentation of at least one of environmental effects and media, said product comprising a computer readable medium embodying a computer program having instructions executable by a processor to perform the method of claim 11.
19. A programming device (120) for authoring a timed, distributed presentation of at least one of environmental effects and media, said programming device being configured for editing, in time and space (236), state durations, in said presentation, of respective programmable devices against a presentation timeline (232) common for all of said programmable devices, the editing in space transferring a given one of said durations to another of said programmable devices and involving transport of said programming device to said another device.
20. The programming device of claim 19, further configured for performing said editing in space so as to designate a specific plurality of said programmable devices (260) and to transfer said given one and the designation to a presentation engine for pasting of said given one to each of the devices of said specific plurality.
21. The programming device of claim 19, being further configured for providing a regularly-spaced series of audio announcements identifying current time on said timeline during navigation of said timeline and during pauses in said navigation (308).
22. A programmable device (110) for recording and performing, in concert with at least one other performing device, a timed, distributed presentation of at least one of environmental effects and media, said programmable device having a perform mode (155) and a record mode (154), switching out of record mode causing said programmable device to output, to a coordinator of said presentation, a time-specified device state just recorded or an identifier of the recorded device state (S440).
PCT/IB2007/051600 2006-05-01 2007-04-30 Method for situational end-user authoring of distributed media presentations WO2007125512A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US79697906P 2006-05-01 2006-05-01
US60/796,979 2006-05-01

Publications (2)

Publication Number Publication Date
WO2007125512A2 true WO2007125512A2 (en) 2007-11-08
WO2007125512A3 WO2007125512A3 (en) 2008-01-10

Family

ID=38544081


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009098550A1 (en) * 2008-02-04 2009-08-13 Sony Ericsson Mobile Communications Ab Intelligent interaction between a wireless portable device and media devices in a local network
US8072905B2 2008-02-04 2011-12-06 Sony Ericsson Mobile Communications Ab Intelligent interaction between devices in a local network

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5086385A (en) * 1989-01-31 1992-02-04 Custom Command Systems Expandable home automation system
US20020035404A1 (en) * 2000-09-14 2002-03-21 Michael Ficco Device control via digitally stored program content
US20030103088A1 (en) * 2001-11-20 2003-06-05 Universal Electronics Inc. User interface for a remote control application
US20050040250A1 (en) * 2003-08-18 2005-02-24 Wruck Richard A. Transfer of controller customizations
US20050148828A1 (en) * 2003-12-30 2005-07-07 Kimberly-Clark Worldwide, Inc. RFID system and method for tracking environmental data
US20060053447A1 (en) * 2002-06-27 2006-03-09 Openpeak Inc. Method, system, and computer program product for managing controlled residential or non-residential environments


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009098550A1 (en) * 2008-02-04 2009-08-13 Sony Ericsson Mobile Communications Ab Intelligent interaction between a wireless portable device and media devices in a local network
US8072905B2 (en) 2008-02-04 2011-12-06 Sony Ericsson Mobile Communications Ab Intelligent interaction between devices in a local network

Also Published As

Publication number Publication date
WO2007125512A3 (en) 2008-01-10

Similar Documents

Publication Publication Date Title
CN103856607B (en) Video on mobile phone terminal is rendered to the method and system that video playback apparatus is play
JP5808433B2 (en) Broadcast program processing device, broadcast program processing method, broadcast station device, information distribution server, program, and information storage medium
EP1796090A4 (en) Reproduction device, reproduction method and program for reproducing graphic data and application in association with each other
US20060190559A1 (en) Multimedia contents moving method and system
US20060034583A1 (en) Media playback device
WO2017036269A1 (en) Method, device and multimedia player for displaying information display item
US9866922B2 (en) Trick playback of video data
CN1111797A (en) Audio and video docking and control system
KR20100050180A (en) Mobile terminal having projector and method for cotrolling display unit thereof
MXPA04001211A (en) Universal remote control capable of simulating a skip search.
JP2004080447A (en) Contents reproducing apparatus, operation control method for contents reproducing apparatus, and program for controlling contents reproduction
CN103959801B (en) Replay device and playback method
CN107690086A (en) Video broadcasting method, playback terminal and computer-readable storage medium
CN104156151A (en) Image display method and image display device
US9886403B2 (en) Content output device for outputting content
WO2007125512A2 (en) Method for situational end-user authoring of distributed media presentations
CN103024505A (en) Method, terminal and system for remotely controlling media playing on basis of application
JP4654604B2 (en) Recorder / player and program
KR101050186B1 (en) Multi media process play system and method
JP2008118328A (en) Content reproduction
JP2007027915A (en) Content output controller, content output system, content output management device, and content output method
CN101790054B (en) Instant message display module and instant message display method
GB0005018D0 (en) Interaction with a television system
US20220222709A1 (en) System and method for controlling an electronic device embedded in a package of a consumer product
JPH08331662A (en) Method for controlling electronic equipment, electronic equipment controller and television receiver

Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 07735710; Country of ref document: EP; Kind code of ref document: A2

NENP Non-entry into the national phase in:
    Ref country code: DE

122 Ep: pct application non-entry in european phase
    Ref document number: 07735710; Country of ref document: EP; Kind code of ref document: A2