WO2014060598A2 - Sensing systems, associated methods and apparatus - Google Patents

Sensing systems, associated methods and apparatus

Info

Publication number
WO2014060598A2
WO2014060598A2 (PCT/EP2013/071892)
Authority
WO
WIPO (PCT)
Prior art keywords
user
data
sensor
processing unit
sensed
Prior art date
Application number
PCT/EP2013/071892
Other languages
French (fr)
Other versions
WO2014060598A3 (en)
Original Assignee
My-View Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by My-View Limited filed Critical My-View Limited
Publication of WO2014060598A2 publication Critical patent/WO2014060598A2/en
Publication of WO2014060598A3 publication Critical patent/WO2014060598A3/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Arrangements For Transmission Of Measured Signals (AREA)

Abstract

There is described a system comprising at least one sensor. The at least one sensor is configured to be positioned at a user, and configured to sense data relating to one or more of a user's senses. The system also comprises a processing unit configured to receive sensed data from the at least one sensor, and configured to process sensed data to produce processed data suitable for output. One of the at least one sensor and the processing unit is configured such that processed data is based on biometric data corresponding to the one or more of the user's senses.

Description

SENSING SYSTEMS, ASSOCIATED METHODS AND APPARATUS
Technical Field
The invention relates to sensing systems, associated methods and apparatus. In particular, the invention relates to sensing systems, methods and apparatus useful in providing data representative of the sensed experiences of a user.
Background
In many applications it is desirable to obtain information or data representative of the experiences of an individual. For example, in military and public order applications, it can be useful to see and/or hear what an individual has seen and/or heard in a given situation. Additionally, in training and coaching applications (e.g. sports coaching), it can be useful to review an individual's experiences and, for example, compare these to the actual actions taken by the individual. Similarly, providing improved video footage or the like, more representative of a user during entertainment events (e.g. data representative of a sports player during sporting events) may provide an additional viewing opportunity to a spectator at that entertainment event.
Some systems may consist of a camera mounted on an individual's clothing, potentially giving a view of what is in front of the individual. However, the images obtained from such systems are unlikely to accurately reflect the individual's experiences, and may bear little resemblance to what was actually seen.
Summary
According to an aspect of the invention, there is a system for providing data relating to, or representative of, a user's sensed experience. The providing of such data may be useful in obtaining a better appreciation of a user's sensed experience, which may have application in training and coaching, recreational activities, reproductions of sport events, public order, etc.
The system may comprise one or more sensors, which may be configured to be positioned at a user. The sensor(s) may be configured to sense data relating to one or more of a user's senses. The system may comprise a processing unit, configured to receive sensed data from one or more sensors, and which may be configured to process sensed data so as to produce processed data suitable for output (e.g. for output to a display, or communication to a further device for output, or the like).
At least one of the sensor and the processing unit may be configured such that processed data is based on biometric data corresponding to, or associated with, the one or more of a user's senses. The biometric data may comprise user-specific biometric data that corresponds to a particular user. In other words, the user-specific biometric data may be data that is specific to that individual user (e.g. not necessarily some or all other users). The biometric data may additionally or alternatively comprise user-generic biometric data that corresponds to more than one user. The user-generic biometric data may be data associated with, for example, a demographic of the user. For example, the user-generic biometric data may comprise data based on the biometrics of users of a particular age, height, gender, or the like. In some examples, the sensor may comprise at least one image sensor (e.g. two or more image sensors). The image sensor(s) may be configured as cameras. In those examples, the biometric data may comprise data relating to the vision of the user. The biometric data may be associated with both eyes of a user individually, or cumulatively. In other words, at least one of the image sensor(s) and the processing unit may be configured such that processed data is based on biometric data corresponding to, or associated with, one or both eyes of a user.
The biometric data may comprise data relating to the field of view (e.g. degrees of side view, degrees of up view, degrees of down view, location of retinal blind spot, etc.) of one or both of a user's eyes. The biometric data may comprise data associated with the user's resolution within the field of view (e.g. the user's resolution within areas in the field of view, when looking directly forward). The biometric data may comprise data associated with the depth of view of a user's eye (e.g. minimum focal length for a given resolution, maximum focal length for a given resolution). The biometric data may comprise data associated with the dynamic range of a user's eye(s). The biometric data may comprise data associated with, for example, the pupillary distance, or the like, of a user's eyes. The pupillary distance may allow for a relative offset between a user's eyes to be determined. This may be used to provide processed data that can be used to construct a stereoscopic image, or the like. The processed data may be based on biometric data, by specifically configuring the sensor so as to obtain sensed data that is based on biometric data. For example, the field of view, focal length, dynamic range, or the like, of the sensor may be configured so as to correspond to the corresponding biometric data for that user.
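Purely by way of illustration, parameters of the kind listed above might be gathered into a per-user profile. The sketch below is hypothetical: the structure, field names and units are not defined anywhere in the application.

```python
from dataclasses import dataclass

@dataclass
class EyeBiometrics:
    """Illustrative per-eye profile; all field names and units are assumptions."""
    side_view_deg: float     # horizontal field of view, in degrees
    up_view_deg: float       # field of view above the line of sight
    down_view_deg: float     # field of view below the line of sight
    min_focal_m: float       # nearest distance resolvable at a given resolution
    max_focal_m: float       # farthest distance resolvable at a given resolution
    dynamic_range_db: float  # luminance dynamic range of the eye

@dataclass
class UserBiometrics:
    """Two per-eye profiles plus the pupillary distance between them."""
    left: EyeBiometrics
    right: EyeBiometrics
    pupillary_distance_m: float  # e.g. roughly 0.063 m for an average adult
```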
Additionally, or alternatively, the processed data may be based on biometric data, by specifically configuring the processing unit to transform, or adapt, sensed data based on the biometric data for that user.
In some examples, the system comprises two image sensors - one corresponding to each eye of a user. In those cases, the image sensor for each eye may be specifically configured based on biometric data for that eye. Additionally, or alternatively the processing unit may be specifically configured to transform or adapt sensed data based on the biometric data for each eye. In other words, the processing unit may be configured to operate on sensed data from each sensor differently, depending on biometric data associated with the corresponding eye of a user.
When using two image sensors, the system may be configured such that the image sensors are positioned at the user such that a lens of each image sensor is positioned with respect to, or adjacent, a corresponding eye of a user. The system may comprise a headset, on which the image sensors are supported. The headset may be configured such that, when worn by a user, the lens of each image sensor is positioned on a corresponding side of the user's head.
The sensor of the system may additionally, or alternatively, comprise one or more audio sensors. In those examples, the biometric data may comprise data relating to an audio response of a user's ears, for example. The system may be configured such that at least one of the audio sensor and the processing unit is configured such that processed data is based on biometric data corresponding to, or associated with, a user's hearing.
Biometric data relating to the audio response of a user's ears may comprise data relating to the audio frequency response, and/or audio level response, of a user's ears or hearing. The biometric data relating to the audio response may comprise directional data (e.g. corresponding to a directional bias of a user's ear). In some examples, the audio sensors may be configured to be beamformed, or weighted, so as to correspond to a directionality of a user's ears.
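The "beamformed, or weighted" arrangement could, for example, take the form of a simple delay-and-sum beamformer; the application does not specify a scheme, so the following two-microphone sketch is only one plausible reading.

```python
import numpy as np

def delay_and_sum(mic_a, mic_b, fs, spacing_m, steer_deg, c=343.0):
    """Steer a two-microphone pair towards steer_deg (0 = straight ahead)
    by delaying one channel and averaging; a minimal stand-in for the
    directional weighting described above."""
    delay_s = spacing_m * np.sin(np.radians(steer_deg)) / c
    shift = int(round(delay_s * fs))       # delay quantised to whole samples
    mic_b_delayed = np.roll(mic_b, shift)  # crude delay; wraps at the edges
    return 0.5 * (mic_a + mic_b_delayed)
```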
In a similar manner as above, the sensor may comprise two audio sensors, each configured to be positioned with respect to, or adjacent, a corresponding ear of the user.
The audio sensors may be supported on a headset, wherein the headset is configured such that, when worn by the user, each of the audio sensors is positioned on a corresponding side of the user's head.
The processed data may comprise directional information based on the sensed data from the two audio sensors. For example, the processing unit may be configured to determine audio direction information based on the sensed data from the two audio sensors on either side of a user's head.
The system may be configured to determine location-based data associated with the location of a user (e.g. GPS data, cellular data, or the like). The processed data may additionally comprise location-based data. The system may comprise a location sensor, for example a GNSS receiver (such as a GPS receiver), and/or a cellular receiver, or the like. The system may be configured such that the sensor (e.g. the image sensor and/or audio sensor) is configured to transmit sensed data to the processing unit over a wireless communications link, such as WiFi, Bluetooth®, cellular communication, NFC, or the like. The system may be configured such that sensed data is stored at a headset (e.g. for subsequent communication to the processing unit, which may be remote from the headset). In some examples, the sensed data may be processed by the processing unit at the headset, or the like, in which case the processed data may be stored at the headset, or the like, for subsequent output to an output device (e.g. a display, or the like).
The processing unit may be configured to output processed data to a display, or may comprise a display configured to display the processed data. The processing unit may be configured to display processed data in real time, and/or to play processed data back at a later time.
The processing unit may be configured to output data corresponding to some or all of the sensors, such that the data can subsequently be used independently. For example, the processing unit may output data corresponding to each sensor so that processed image and/or sound data may be played back through an observer's headset (e.g. a virtual reality headset) comprising a screen located in front of each of a wearer's eyes, and headphones. In those cases, data derived from particular sensors may be output on corresponding displays, or speakers, or the like. The system may comprise at least one image projector. The projector may be configured to be positioned at a user, and configured to project images received by, for example, the processing unit. The system may be configured to project images to a surface so as to be viewed by a user.
Projected images may include pictures, media, videos, documents, or the like. The image projector may be additionally configured as the image sensor, configured to sense data associated with a user's experience.
The system may be able to project an image associated with stored data (e.g. images, media, or the like) associated with a particular user (e.g. that user and/or different users, such as user data from a library). The system may be configured to permit a user to "play back" particular situations, or circumstances, or the like.
The system may be configured to project images so as to guide, or tutor, a user in one or more particular circumstances (e.g. technical, medical circumstances, etc.).
The system may be configured to process data local to the system, or remote from the system, so as to provide a projected image. The system may be connected to a network (e.g. the Internet) to allow streaming of such data, and projections.
According to a further aspect of the invention, there is provided a headset comprising at least one sensor configured to be positioned at a user, and configured to sense data relating to one or more of a user's senses. The headset may be configured such that, when worn by the user, the at least one sensor is positioned adjacent a corresponding sensory organ, such as an eye or ear, of a user. The sensor may be configured such that sensed data is based on biometric data corresponding to the one or more of a user's senses.
The headset may comprise, or be configured to be in communication with (e.g. connectable to), a processing unit. Such a processing unit may be configured to receive sensed data, and configured to process sensed data to produce processed data suitable for output. The processing unit may be configured such that processed data is based on biometric data corresponding to the one or more of a user's senses.
According to a further aspect of the invention, there is provided a processing unit for providing data relating to a user's sensed experience.
The processing unit may comprise a processor configured to process, based on biometric data corresponding to one or more of the user's senses, sensed data from at least one sensor and to provide processed data corresponding to the biometric data.
According to a further aspect of the invention, there is provided a method for providing data relating to a user's sensed experience.
The method may comprise receiving, by a processing unit, sensed data from at least one sensor. The at least one sensor may have been positioned at a user so as to sense data relating to one or more of the user's senses. The method may further comprise processing, by the processing unit, received sensed data so as to produce processed data suitable for output.
At least one of the sensor and the processing unit may be configured such that processed data is based on biometric data corresponding to the one or more of a user's senses.
According to a further aspect of the invention, there is a method of viewing a user's sensed experience.
The method may comprise using processed data to display a user's sensed experience, where the processed data is based on particular biometric data.
The method may comprise viewing a user's sensed experience for training, or coaching purposes (e.g. sports training). The method may comprise viewing a user's sensed experience for public order purposes (e.g. corroborating events). The method may comprise viewing a user's sensed experience for entertainment purposes (e.g. point of view shots, for example, during sporting events). According to a further aspect of the invention, there is a method of providing data relating to a user's sensed experience, where the method may comprise:
obtaining information associated with a user experience,
transforming the information into sensed-user data using biometric data associated with the user.
The information may be transformed at a sensor, and/or at a processing unit using biometric data associated with the user. According to a further aspect of the invention, there is a system comprising:
a sensor configured to be positioned at a user, and configured to sense data relating to one or more of a user's senses;
a processing unit configured to receive sensed data from the sensor, and configured to process sensed data to produce processed data suitable for output; and at least one of the sensor and the processing unit being configured such that processed data is based on biometric data corresponding to the one or more of a user's senses.
According to a further aspect of the invention, there is a system comprising:
a processing unit configured to receive data associated with a particular user task, and
at least one image projector configured to be positioned at a user, and configured to project images received by the processing unit to a surface so as to be viewable by a user.
The system may comprise a headset. The image projector(s) may be configured to project an image from a user (e.g. the headset) to a surface so as to be viewable by a user.
Projected images may include pictures, media, videos, documents, or the like. The image projector may be additionally configured as an image sensor, configured to sense data associated with a user's experience.
The system may be able to project an image associated with stored data (e.g. images, media, or the like) associated with a user (e.g. that user and/or different users, such as user data from a library, database, or the like). The system may be configured to permit a user to "play back" particular situations, or circumstances, or the like.
The system may be configured to project images so as to guide, or tutor, a user in a particular circumstance.
The system may be configured to process data local to the system, or remote from the system, so as to provide a projected image. The system may be connected to a network (e.g. the Internet) to allow streaming of such data, and projections.
According to a further aspect of the invention, there is provided a computer program, configured to provide any of the above methods, or any other described embodiments.
The computer program may be provided on a computer readable medium. The computer program may be a computer program product. The product may comprise a non-transitory computer usable storage medium. The computer program product may have computer-readable program code embodied in the medium configured to perform the method. The computer program product may be configured to cause at least one processor to perform some or all of the method.
The invention includes one or more corresponding aspects, embodiments or features in isolation or in various combinations whether or not specifically stated (including claimed) in that combination or in isolation. For example, features associated with particular recited embodiments relating to the system may be equally appropriate as features of embodiments relating to the headset, or processing unit, and vice versa. Furthermore, features associated with particular recited embodiments relating to methods may be equally appropriate as features of embodiments relating specifically to the apparatus, such as the systems, headsets, etc., and vice versa.
It will be appreciated that one or more embodiments/aspects may be useful in obtaining a better appreciation of a user's sensed experience.
The above summary is intended to be merely exemplary and non-limiting.
Brief description of the drawings
Exemplary embodiments of the invention are described herein with reference to the accompanying drawings, in which:
Figure 1 is a schematic representation of a headset fitted to a user's head;
Figure 2 is a schematic representation of a headset fitted to a user's head;
Figure 3 is a schematic representation of a system for providing data relating to a user's sensed experience;
Figures 4a and 4b illustrate a field of view of cameras fitted to a headset;
Figure 5 is a flow diagram showing a method for providing data relating to a user's sensed experience; and
Figure 6 shows a headset fitted to a user's head.
Description
Generally disclosed herein is a system for providing or using data relating to, or representative of, a user's sensed experience.
In some examples, the system comprises one or more sensors configured to sense data relating to a user's senses or experiences. The data provided from the system can be considered to be based on biometric data. For example, sensed data may be adapted, or transformed, using biometric data relating to a user. In some embodiments, this may be achieved using hardware, firmware, or software (e.g. post-processing), or a combination thereof, so as to provide processed data based on biometric data.
Figures 1 and 2 show an embodiment comprising a headset 10. Here, the headset 10 is configured so as to be fitted to the head of a user. The headset 10 in this particular example comprises two sensor arms 11a, 11b configured to be worn on the head of a user, one sensor arm 11a, 11b resting on each ear. The sensor arms 11a, 11b are connected by a connecting portion 12, configured to fit around the back of the user's head to a union 14.
Each of the sensor arms 11a, 11b is configured to hold one or more sensors. In the exemplary headset of Figures 1 and 2, the sensor arms each hold an image sensor 16a, 16b (e.g. a camera). Here, the headset 10 and the image sensors 16a, 16b are positioned such that each sensor 16a, 16b points essentially in the direction in which the user's head is facing, when the headset 10 is worn. In this particular example, the headset 10 comprises only the sensor arms 11a, 11b (and associated sensors) and the connecting portion 12. As such, the headset 10 can be unobtrusively fitted to the head of the subject, yet be able to record sensed data and, for example, be lightweight. Further, the headset 10 has an ergonomic design, allowing it to be comfortably worn by a user over long periods. Of course, in further examples, the headset 10 may comprise additional, and alternative, components or features.
The sensor arms 11a, 11b are arranged such that, when the headset 10 is worn by a user, the image sensors 16a, 16b are located in the proximity of (e.g. as close as possible to) the eyes of the user. Here, the image sensors 16a, 16b are positioned adjacent a corresponding eye of the user on the sides of the user's head. The sensor arms 11a, 11b may be deformable such that they may be bent to place the image sensors 16a, 16b in the desired locations.
The sensor arms 11a, 11b and the connecting portion 12 may also provide ducting through which cabling related to the image sensors 16a, 16b, and ancillary electronic components, may be routed.
Figure 3 shows a system 30 for providing data relating to a user's sensed experience comprising the headset 10. The system 30 further comprises a processing unit 32 configured to receive sensed data from the image sensors 16a, 16b of the headset 10 and to process the data to produce processed data representative of the user's experience. The processing unit 32 comprises a receiver 34 configured to receive data from the headset 10. The receiver 34 is in electrical communication with a processor 36 configured to process the received sensed data to provide processed data suitable for display by an output 38 (e.g. a display). The processor 36 is also in electrical communication with a memory 37, which is configured to store data relating to the processing of the sensed data received from the image sensors 16a, 16b.
The output 38 may be a display suitable for displaying images captured by the sensors 16a, 16b. In exemplary systems, the processing unit 32 may form part of an electronic computing device, such as a computer, laptop, smart phone, or the like.
In the exemplary system of Figure 3, communication between the headset 10 and the processing unit 32 is by a wireless communications link. Specifically, the headset 10 comprises a wireless transmitter (not shown), which receives data from the sensors 16a, 16b and transmits that sensed data to the processing unit 32.
In other exemplary systems, communication between the headset 10 and the processing unit 32 may be by wired connection. Alternatively, the headset 10 may comprise a memory (not shown) on which sensed data from the cameras 16a, 16b is stored. The sensed data may then be downloaded to the processing unit 32 at a later time for processing.
Alternatively, the processor 36 and the memory 37 may be located on the headset 10. In such a configuration, data may be received from the sensors 16a, 16b by the processor, processed and stored in the memory 37 ready for download to a computing device at a later time.
As mentioned above, the image sensors 16a, 16b are located as proximate as possible to the user's eyes. In addition, in the example shown, the image sensors 16a, 16b are configured to capture images pointing in the same direction as the user's eyes when the user is looking directly forward. In this example, each image sensor 16a, 16b is specifically configured based on biometric data relating to the vision of the user. Here, each of the image sensors 16a, 16b is independently configured for each of the eyes of the user, based on biometric data for that eye of the user. Of course, in alternative embodiments, the image sensors may be configured based on common biometric data for each sensor.
Here, the biometric data comprises data relating to the field of view (e.g. degrees of side view, degrees of up view, degrees of down view, location of retinal blind spot, etc.) for both of the user's eyes, as well as data associated with the depth of view of a user's eyes (e.g. minimum focal length for a given resolution, maximum focal length for a given resolution). In some examples, the biometric data may comprise data associated with the dynamic range of the user's eye(s), as well as other factors such as frequency response (e.g. colour perception).
In some examples, the biometric data may comprise data associated with the user's resolution within the field of view (e.g. the user's resolution within areas in the field of view, when looking directly forward). Because each of the image sensors 16a, 16b is independently configured for each of the eyes of a user, based on biometric data for that eye of the user, the sensed data from each sensor is representative of the image viewed from the user's eye. That is, the user will have a particular angle of field of view, for example, and the processed data relating to images produced by the sensors 16a, 16b will have a corresponding field of view. Configuring the sensors in this manner may be achieved through the use of lenses fitted to the image sensor, apertures of the sensor, and the like, which may be adjustable for different users. Alternatively, this may be achieved in software, hardware and/or firmware processing, undertaken in the processor 36. That is, the processor 36 can be configured to process sensed data so as to produce processed data based on biometric data for each image sensor 16a, 16b. In examples in which the sensed data is processed, and transformed, using biometric data at the processing unit, the biometric data may additionally comprise pupillary distance. The pupillary distance may allow for a relative offset between a user's eyes to be determined, and so may be used to provide processed data that can be used to construct a stereoscopic image, or the like. A pupillary distance, or the like, may be used to calculate, determine or approximate an overlap region of the images from the data of the two sensors.
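To make the pupillary-distance point concrete: under a parallel, pinhole-camera assumption (which the application does not state), the overlap region and the stereo disparity follow from elementary geometry. The functions below are a sketch of that calculation, not part of the described system.

```python
import math

def fov_overlap_width_m(pupillary_distance_m, half_fov_deg, depth_m):
    """Width of the strip at depth_m seen by BOTH cameras, for two parallel,
    forward-facing cameras separated by the pupillary distance."""
    per_camera = 2.0 * depth_m * math.tan(math.radians(half_fov_deg))
    return max(0.0, per_camera - pupillary_distance_m)

def stereo_disparity_px(pupillary_distance_m, depth_m, focal_px):
    """Pixel disparity of a point at depth_m, via the standard pinhole-stereo
    relation disparity = f * baseline / Z; focal_px would come from
    camera calibration."""
    return focal_px * pupillary_distance_m / depth_m
```

For example, with a 0.063 m pupillary distance and a 50-degree half field of view, the two views overlap over roughly 2.3 m of a scene 1 m away.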
The images, or sensed data, obtained from the sensors 16a, 16b may be adapted according to user-specific biometric data, i.e. data relating specifically to that user. Additionally, or alternatively, the sensed data may be adapted according to user-generic biometric data, for example, data associated with a "standard" user, or a user of a corresponding demographic (e.g. one or more of age, gender, height). This may be gathered from tests performed on a number of typical users to determine an average field of view, an average focal distance and average values for any other biometric parameters. Alternatively, the specific individual may be tested to determine their biometric parameters and the sensed data adapted accordingly.
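Building a user-generic profile from such tests could be as simple as averaging each measured parameter over the demographic's test subjects; the dictionary keys in this sketch are illustrative only.

```python
import statistics

def generic_profile(measured_profiles):
    """Average each biometric parameter over a set of test subjects to
    form a user-generic profile (assumes every subject dict shares keys)."""
    keys = measured_profiles[0].keys()
    return {k: statistics.mean(p[k] for p in measured_profiles) for k in keys}

# e.g. generic_profile([{"side_view_deg": 95.0}, {"side_view_deg": 101.0}])
#      -> {"side_view_deg": 98.0}
```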
When in use, the sensors 16a, 16b view the scene in front of the user's face as it would be seen by the user. Referring to Figures 4a and 4b, the field of view 40a, 40b of each sensor 16a, 16b is shown. A series of objects 42 is shown to fall within the fields of view 40a, 40b. Moreover, the fields of view 40a, 40b overlap in the same way as the fields of view of the user's eyes overlap.
Referring to Figure 5, the image sensors 16a, 16b obtain sensed data 50 and pass this to a transmitter on the headset 10. The transmitter transmits the sensed data 52 to the receiver 34 located at the processing unit 32. The receiver 34 passes the sensed data to the processor 36, which, in conjunction with the memory 37, processes the sensed data to provide processed data 54, which is based on the biometric data for that user. In the case of the sensors 16a, 16b, the processed data comprises images that are adapted to the field of view of the user. In this way, the processed images correspond to the actual view of the user and are therefore representative of the user's experiences. As mentioned above, the processed data may be adapted through the use of specific hardware modifications to the sensor, or through algorithms used during the processing. The processor 36 displays the processed images 56 on the output 38, which comprises a screen suitable for showing the images. The images may be streamed to the display in real time. Alternatively, the images may be stored in the memory 37 for playback at a later time.
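One plausible software-side adaptation of the kind just described is a centre crop that narrows each frame's angular extent to the user's field of view. The sketch below assumes a linear angle-to-pixel mapping, which ignores real lens distortion.

```python
def crop_to_user_fov(frame, sensor_fov_deg, user_fov_deg):
    """Centre-crop a frame (a NumPy-style H x W [x C] array) so its angular
    extent matches the user's (horizontal, vertical) field of view."""
    h, w = frame.shape[:2]
    keep_w = int(w * user_fov_deg[0] / sensor_fov_deg[0])
    keep_h = int(h * user_fov_deg[1] / sensor_fov_deg[1])
    x0, y0 = (w - keep_w) // 2, (h - keep_h) // 2
    return frame[y0:y0 + keep_h, x0:x0 + keep_w]
```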
In the example of a sporting event, the images viewed from a player (e.g. a rugby player) may be streamed to a crowd display, and/or may be streamable, or downloadable, to a portable media device (e.g. a smart phone).
It will be appreciated that while the exemplary headset 10 of Figures 1 to 3 comprises two image sensors 16a, 16b, the headset 10 may be implemented with only one image sensor. The same principles apply, in that the images produced by the single sensor can be adapted to correspond to the biometrics of the user. For example, in some cases, the pupillary distance of a user may be provided so as to harvest two independent images, representative of the images presented to each eye, from a single image, and then each of those images may be adapted, modified or transformed, based on biometric data.
Referring to Figure 6, in other exemplary headsets 60, audio sensors 62a, 62b may be included in the sensor arms 11a, 11b. The audio sensors 62a, 62b are located in positions as close as possible to the user's ears. In the headset 60, the audio sensors 62a, 62b are adjacent a corresponding ear on the sides of the user's head. In addition, the processed audio data provided by the audio sensors 62a, 62b and processed by the processing unit 32 is adapted to correspond to the biometric data. That is, the processed audio data may be adapted to have a frequency response corresponding to the frequency response of the user. In addition, the processed audio data may be adapted to be attenuated to correspond to the sound levels that would be heard by the user, based on the biometric data. As above with reference to the image data, the adaptation of the sound data may be undertaken in hardware or in software.
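A frequency-domain sketch of the response-matching step described above: scale each FFT bin by a gain interpolated from the user's measured hearing curve. The measurement format here is assumed; a real system might instead use calibrated FIR filters.

```python
import numpy as np

def shape_to_hearing(audio, fs, response_freqs_hz, response_gain_db):
    """Impose a measured hearing response (frequency, gain-in-dB pairs)
    on a mono recording by weighting its spectrum."""
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / fs)
    gain_db = np.interp(freqs, response_freqs_hz, response_gain_db)
    return np.fft.irfft(spectrum * 10.0 ** (gain_db / 20.0), n=len(audio))
```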
The audio data may also be processed to determine direction information. By measuring the relative times at which audio events are sensed by each of the audio sensors, and the relative audio level at each of the audio sensors, the direction from which the audio event emanates can be determined.
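The time-and-level comparison described here is, in essence, time-difference-of-arrival (TDOA) estimation. A minimal cross-correlation sketch, assuming a far-field source and an illustrative ear spacing (neither figure comes from the application):

```python
import numpy as np

def direction_from_tdoa(left, right, fs, ear_spacing_m=0.18, c=343.0):
    """Estimate source azimuth in radians from the inter-channel delay;
    0 = straight ahead, positive = towards the right ear."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # >0: left lags, source on the right
    tdoa = lag / fs
    # Far-field approximation: tdoa = ear_spacing * sin(azimuth) / c
    return np.arcsin(np.clip(c * tdoa / ear_spacing_m, -1.0, 1.0))
```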
In a similar way to the image sensors 16a, 16b, the audio sensors 62a, 62b obtain audio data and pass this to the transmitter on the headset 60. The transmitter transmits the sensed data to the receiver 34 on the processing unit 32. The receiver 34 passes the sensed audio data to the processor 36, which, in conjunction with the memory 37, processes the sensed audio data to produce processed audio data adapted to the biometrics of the user. The processor 36 outputs the processed audio data to the output 38, which comprises a loudspeaker. In addition, direction information resolved by the processor 36 may be displayed on a display of the output 38.
It will be understood that in some examples the headset 60 may also comprise only a single audio sensor. Furthermore, in some examples, the system may be configured without image sensors (e.g. only audio sensors).
In one exemplary system, the processed image and sound data may be played back through a virtual reality headset comprising a screen located in front of each of a wearer's eyes, and headphones. The processed image data from each of the cameras 16a, 16b may be played back on the corresponding screen of the virtual reality headset. The processed audio data from each of the audio sensors 62a, 62b may be played back through the corresponding left or right headphone of the virtual reality headset. This allows the wearer to "replay" the situation in which the headset user was placed at the time the data was recorded, to get a truer understanding of what the user saw and heard.
In other exemplary headsets, a global navigation satellite system (GNSS) receiver (for example, a GPS receiver) may be incorporated. This provides additional information relating to the position of the user when the image and sound data were recorded. Further, exemplary headsets may comprise an audio output, which allows the user to listen to an audio feed while image and sound data are recorded.
Further exemplary systems may comprise a plurality of sensors, each configured to provide sensed data in a plurality of different directions. For example, the system may comprise a plurality of cameras, each configured to provide images from a different direction. In this way, data may be collected to show the situation all around a user. This may be particularly useful in applications in which the user is working alone in a potentially dangerous situation, such as a security guard or police officer, and where threats may appear from behind the user. The system may comprise a record button that is configured to begin and end the recording and processing of sensed data.
In further embodiments, the system 30 may additionally, or alternatively, be configured to use data to project an image (e.g. pictures, media, videos, documents, or the like). For example, in some cases, the image sensors 16a, 16b may be configured to additionally, or even alternatively, project an image, which can be displayed upon a surface for the user to see (e.g. a surface in front of a user). In some of those examples, the user may be able to adjust the projection length to allow the image, video, etc. to be focused (i.e. presented as a sharp image) on a particular desired surface.
In some cases, the user may be able to project an image associated with stored data in order to reconstruct an image, moving video, or the like, showing captured images associated with that user, or even one or more different users (e.g. from a library). Such a system may be useable to "play back" particular situations, or circumstances. In such embodiments, previously stored data can be used to guide or tutor a user so as to allow them to better perform a task, for example, by allowing the user to follow a replay of a scenario (e.g. allowing the option to play, pause, rewind, fast forward, etc.). Additionally, or alternatively, the system may be configured to present documents (e.g. training manuals, or the like). In such circumstances, the system may be connected to a network (e.g. the Internet) to allow streaming of such data, and projections. It will readily be appreciated that such a configuration may be particularly useful when a user needs to use one or both hands for a particular task (e.g. an electrician, technician, medical practitioner, or the like).
It will readily be appreciated by a skilled reader that in some embodiments, the system may be configured to project images from the headset (or similar), without the use of sensors per se. In such embodiments, the system may nevertheless be configured to project images associated with a user experience, or other information.
The skilled person will envisage other embodiments without departing from the scope of the claims.

Claims

CLAIMS:
1. A system comprising:
at least one sensor configured to be positioned at a user, and configured to sense data relating to one or more of a user's senses;
a processing unit configured to receive sensed data from the at least one sensor, and configured to process sensed data to produce processed data suitable for output; and
one of the at least one sensor and the processing unit being configured such that processed data is based on biometric data corresponding to the one or more of the user's senses.
2. A system according to claim 1, wherein the at least one sensor comprises at least one camera and the biometric data comprises data relating to the field of view of the user, and wherein the processed data is adapted such that the field of view of the at least one camera corresponds to data relating to the field of view of the user.
3. A system according to claim 2, wherein the at least one camera comprises two cameras configured to be positioned on the user such that a lens of each camera is positioned adjacent a corresponding eye of the user.
4. A system according to claim 3, further comprising a headset on which the cameras are supported, wherein the headset is configured such that, when worn by the user, the lens of each camera is positioned on a corresponding side of the user's head.
5. A system according to any preceding claim, wherein the at least one sensor comprises at least one audio sensor and the biometric data comprises data relating to an audio response of a user's ears, and wherein the processed data is adapted to correspond to the data relating to the audio response of the user.
6. A system according to claim 5, wherein the data relating to the audio response of the user's ears comprises data relating to the audio frequency response of the user's ears.
7. A system according to claim 5 or 6, wherein the data relating to the audio response of the user's ears comprises data relating to the audio level response of the user's ears.
8. A system according to any of claims 5 to 7, wherein the at least one audio sensor comprises two audio sensors, each configured to be positioned adjacent a corresponding ear of the user.
9. A system according to claim 8, further comprising a headset on which the audio sensors are supported, wherein the headset is configured such that, when worn by the user, each of the audio sensors is positioned on a corresponding side of the user's head.
10. A system according to claim 8 or 9, wherein the processing unit is configured to determine audio direction information based on the sensed data from the two audio sensors.
11. A system according to any preceding claim, wherein the at least one sensor is configured to transmit sensed data to the processing unit over a wireless telecommunications link.
12. A system according to any preceding claim, wherein the processing unit comprises a display means configured to display the processed data.
13. A system according to claim 12, wherein the processing unit is configured to display the processed data either in real time, or to play the processed data back at a later time.
14. A system according to any preceding claim, wherein the at least one sensor is configured to provide sensed data corresponding to the biometric data.
15. A system according to any preceding claim, wherein the processing unit is configured to process the sensed data such that the processed data corresponds to the biometric data.
16. A system according to any preceding claim, wherein the biometric data comprises data relating to an average of biometric data obtained from a plurality of subjects.
17. A system according to any preceding claim, wherein the biometric data comprises data relating to user specific biometric data obtained from the user.
18. A system according to any preceding claim, wherein the system is further configured to project an image from a user to a surface so as to be viewable by a user.
19. A headset comprising:
at least one sensor configured to sense data relating to one or more of the user's senses,
the headset configured such that, when worn by a user, the at least one sensor is positioned adjacent a corresponding sensory organ of a user.
20. A headset according to claim 19, wherein the at least one sensor is configured such that sensed data is based on biometric data corresponding to one or more of the user's senses.
21. A processing unit for providing data relating to a user's sensed experience and suitable for output by an output means,
the processing unit comprising a processor configured to process sensed data from at least one sensor and to provide processed data suitable for output, the processed data being based on biometric data corresponding to one or more of a user's senses.
22. A method for providing data relating to a user's sensed experience, the method comprising:
receiving, by a processing unit, sensed data;
processing, by the processing unit, the sensed data to produce processed data suitable for output by an output means,
the sensed data being sensed by at least one sensor positioned at a user and one of the at least one sensor and the processing unit being configured such that the processed data is based on biometric data corresponding to the one or more of the user's senses.
23. A computer program product comprising computer program code configured, when run on a computer, to undertake the method of claim 22.
24. A system substantially as herein described with reference to the accompanying drawings.
25. A headset substantially as herein described with reference to the accompanying drawings.
26. A processing unit substantially as herein described with reference to the accompanying drawings.
PCT/EP2013/071892 2012-10-19 2013-10-18 Sensing systems, associated methods and apparatus WO2014060598A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1218836.3 2012-10-19
GB1218836.3A GB2507111A (en) 2012-10-19 2012-10-19 User-based sensing with biometric data-based processing to assess an individual's experience

Publications (2)

Publication Number Publication Date
WO2014060598A2 true WO2014060598A2 (en) 2014-04-24
WO2014060598A3 WO2014060598A3 (en) 2014-06-12

Family

ID=47359171

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/071892 WO2014060598A2 (en) 2012-10-19 2013-10-18 Sensing systems, associated methods and apparatus

Country Status (2)

Country Link
GB (1) GB2507111A (en)
WO (1) WO2014060598A2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104182051B (en) * 2014-08-29 2018-03-09 百度在线网络技术(北京)有限公司 Head-wearing type intelligent equipment and the interactive system with the head-wearing type intelligent equipment
CN105991154A (en) * 2015-02-10 2016-10-05 北京艾沃信通讯技术有限公司 Head mounted multimedia collection device and system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050195277A1 (en) * 2004-03-04 2005-09-08 Olympus Corporation Image capturing apparatus
US20070201847A1 (en) * 2006-02-24 2007-08-30 Tianmo Lei Fully Automatic, Head Mounted, Hand and Eye Free Camera System And Photography

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9121707D0 (en) * 1991-10-12 1991-11-27 British Aerospace Improvements in computer-generated imagery
US6786860B2 (en) * 2001-10-03 2004-09-07 Advanced Bionics Corporation Hearing aid design
GB0419346D0 (en) * 2004-09-01 2004-09-29 Smyth Stephen M F Method and apparatus for improved headphone virtualisation
US9292973B2 (en) * 2010-11-08 2016-03-22 Microsoft Technology Licensing, Llc Automatic variable virtual focus for augmented reality displays

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050195277A1 (en) * 2004-03-04 2005-09-08 Olympus Corporation Image capturing apparatus
US20070201847A1 (en) * 2006-02-24 2007-08-30 Tianmo Lei Fully Automatic, Head Mounted, Hand and Eye Free Camera System And Photography

Also Published As

Publication number Publication date
GB2507111A (en) 2014-04-23
GB201218836D0 (en) 2012-12-05
WO2014060598A3 (en) 2014-06-12

Similar Documents

Publication Publication Date Title
US10142618B2 (en) Imaging apparatus and imaging method
US20180123813A1 (en) Augmented Reality Conferencing System and Method
JP5891131B2 (en) Image generating apparatus and image generating method
US20180124497A1 (en) Augmented Reality Sharing for Wearable Devices
JP2022531067A (en) Audio spatialization and enhancement between multiple headsets
CN108762496B (en) Information processing method and electronic equipment
WO2015186686A1 (en) Position determination apparatus, audio apparatus, position determination method, and program
CN113366863B (en) Compensating for head-related transfer function effects of a headset
JP2022549985A (en) Dynamic Customization of Head-Related Transfer Functions for Presentation of Audio Content
US20240042318A1 (en) Gaming with earpiece 3d audio
JP6580516B2 (en) Processing apparatus and image determination method
US10536666B1 (en) Systems and methods for transmitting aggregated video data
JP6538003B2 (en) Actuator device
US20200137488A1 (en) Virtual microphone
WO2014060598A2 (en) Sensing systems, associated methods and apparatus
WO2023147038A1 (en) Systems and methods for predictively downloading volumetric data
US10979733B1 (en) Systems and methods for measuring image quality based on an image quality metric
JP2022015647A (en) Information processing apparatus and image display method
JP2022022871A (en) Processing device and immersive degree derivation method
JP6600186B2 (en) Information processing apparatus, control method, and program
US11734905B1 (en) Systems and methods for lighting subjects for artificial reality scenes
US11638111B2 (en) Systems and methods for classifying beamformed signals for binaural audio playback
US11343567B1 (en) Systems and methods for providing a quality metric for media content
US11870852B1 (en) Systems and methods for local data transmission
US20230101693A1 (en) Sound processing apparatus, sound processing system, sound processing method, and non-transitory computer readable medium storing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13821452

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 13821452

Country of ref document: EP

Kind code of ref document: A2